KR20140017735A - Wearable electronic device and method for controlling the same - Google Patents

Wearable electronic device and method for controlling the same

Info

Publication number
KR20140017735A
Authority
KR
South Korea
Prior art keywords
electronic device
user
wearable electronic
information
image
Prior art date
Application number
KR1020120083810A
Other languages
Korean (ko)
Inventor
김준식
정승모
Original Assignee
인텔렉추얼디스커버리 주식회사
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 인텔렉추얼디스커버리 주식회사 filed Critical 인텔렉추얼디스커버리 주식회사
Priority to KR1020120083810A priority Critical patent/KR20140017735A/en
Priority to PCT/KR2013/006821 priority patent/WO2014021602A2/en
Priority to US14/413,802 priority patent/US20150156196A1/en
Publication of KR20140017735A publication Critical patent/KR20140017735A/en

Classifications

    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0101Head-up displays characterised by optical features
    • G02B27/0103Head-up displays characterised by optical features comprising holographic elements
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00Manipulating 3D models or images for computer graphics
    • G06T19/006Mixed reality
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/18Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • H04N7/183Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a single remote source
    • H04N7/185Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a single remote source from a mobile camera, e.g. for remote control
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/0002Remote monitoring of patients using telemetry, e.g. transmission of vital signals via a communication network
    • A61B5/0015Remote monitoring of patients using telemetry, e.g. transmission of vital signals via a communication network characterised by features of the telemetry system
    • A61B5/0024Remote monitoring of patients using telemetry, e.g. transmission of vital signals via a communication network characterised by features of the telemetry system for multiple sensor units attached to the patient, e.g. using a body or personal area network
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/017Head mounted
    • G02B2027/0178Eyeglass type

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Computer Graphics (AREA)
  • Computer Hardware Design (AREA)
  • General Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present invention relates to a wearable electronic device and a method of controlling the same. The device comprises: at least one lens and display means for displaying information on the lens; a camera that photographs at regular intervals to acquire an image; a sensing unit configured to detect the user's biometric information and motion information of the wearable electronic device; and a controller that controls the device to store or transmit the acquired image in synchronization with the information detected through the sensing unit.

Description

Wearable electronic device and method for controlling the same

The present invention relates to a method of controlling a wearable electronic device in the form of a pair of glasses or the like.

Augmented reality differs from virtual reality in that it supplements the real world by superimposing a virtual world on top of it and displaying the result to the user, which gives the user a better sense of reality than virtual reality alone.

Generally, augmented reality uses a display device such as an HMD (Head Mounted Display) or a HUD (Head Up Display) to display various information in front of the user's eyes. In addition, research is being actively conducted into manipulating augmented objects in augmented reality using gesture recognition.

An HMD is mounted on the user's head or another part of the body and presents independently projected images to the left and right eyes, so that the binocular parallax between the two images enables the user to perceive depth when viewing an object.

A HUD projects an image onto a transparent pane such as glass, so that the user can simultaneously see the information projected by the HUD and the external background through the transparent pane.

An object of the present invention is to provide a wearable electronic device capable of easily recording and managing a life log of a user.

A wearable electronic device according to an embodiment of the present invention includes at least one lens and display means for displaying information on the lens; a camera that photographs at regular intervals to acquire an image; a sensing unit configured to detect the user's biometric information and motion information of the wearable electronic device; and a controller that controls the device to store or transmit the acquired image in synchronization with the information detected by the sensing unit.

According to an embodiment of the present invention, the wearable electronic device can store and manage images photographed at a predetermined period in synchronization with the user's biometric information and motion information, making it possible to record and manage a lifelog without requiring the user's attention and making it easier to cope with dangerous situations.

FIGS. 1 and 2 are perspective views showing a configuration of a wearable electronic device according to an embodiment of the present invention.
FIG. 3 illustrates an example of a view seen by a user through the wearable electronic device.
FIGS. 4 and 5 are perspective views showing still other embodiments of the configuration of the wearable electronic device.
FIG. 6 is a block diagram illustrating an embodiment of a configuration of a wearable electronic device according to the present invention.
FIG. 7 is a flowchart showing an embodiment of a control method according to the present invention.
FIGS. 8 and 9 illustrate embodiments of a side configuration of the wearable electronic device.
FIG. 10 is a block diagram showing the configuration of a user risk detection system according to an embodiment of the present invention.
FIGS. 11 and 12 are diagrams illustrating an embodiment of a user interface for notifying removal of the wearable electronic device.
FIGS. 13 to 17 are diagrams illustrating embodiments of a user interface for notifying a user's state according to the user's risk level.
FIG. 18 is a diagram illustrating an embodiment of a method for setting weights for determining a risk level.
FIG. 19 is a diagram illustrating an embodiment of a method of providing a user's lifelog together with map information.
FIG. 20 is a diagram illustrating an embodiment of a method of expressing a user's lifelog.

Hereinafter, a wearable electronic device and a control method thereof according to an embodiment of the present invention will be described in detail with reference to the accompanying drawings.

Fig. 1 is a perspective view showing the construction of a wearable electronic device according to an embodiment of the present invention. The wearable electronic device 1 shown in Fig. 1 can be manufactured in the form of glasses so as to be positioned close to the user's eyes.

Referring to FIG. 1, which shows a front view of the wearable electronic device 1, the wearable electronic device 1 according to an embodiment of the present invention includes left and right lens frames 10 and 11, a frame connector 20, left and right side-arms (hooking portions) 30 and 31, and left and right lenses 50 and 51.

As shown in FIG. 1, a camera 110 may be disposed on the front surface of the frame connector 20, for example.

Accordingly, a user wearing the glasses-type wearable electronic device 1 can capture, store, or share pictures or videos using the camera 110 while moving.

In this case, the viewpoint of the image captured by the camera 110 may be very similar to the viewpoint of the scene recognized by the user.

In addition, a gesture such as a user's hand movements is recognized using the camera 110, and the operation or function of the wearable electronic device 1 can be controlled according to the recognized gesture.

The position or the number of the camera 110 may be changed as needed, and a special-purpose camera such as an infrared camera may be used.

Further, units for performing specific functions may be arranged in each of the left and right side-arms 30 and 31.

The right side-arm 31 may be equipped with user interface devices for receiving user input for controlling the functions of the wearable electronic device 1.

For example, a track ball 100 or a touch pad 101 for selecting or moving an object such as a cursor or a menu item on a screen may be disposed on the right side-arm 31.

The user interface devices provided in the wearable electronic device 1 are not limited to the track ball 100 and the touch pad 101; various other input devices, such as a keypad or a dome switch, may be provided.

Meanwhile, a microphone 120 may be mounted on the left side-arm 30, and the operation or function of the wearable electronic device 1 may be controlled using the user's voice recognized through the microphone 120.

A sensing unit 130 may be disposed on the left side-arm 30 to sense the current state of the wearable electronic device 1, such as its position, the presence or absence of user contact, its orientation, and its acceleration or deceleration, and to generate a sensing signal for controlling the operation of the wearable electronic device 1.

For example, the sensing unit 130 may include a motion sensor such as a gyroscope or an accelerometer, a position sensor such as a GPS device, a direction sensor such as a magnetometer or a theodolite, a temperature sensor, a humidity sensor, a wind direction / wind speed sensor, and the like. However, the present invention is not limited thereto, and sensors capable of detecting various other kinds of information may be included.

For example, the sensing unit 130 may further include an infrared sensor. The infrared sensor includes a light emitting unit that emits infrared rays and a light receiving unit that receives infrared rays, and may be used for infrared communication.

The wearable electronic device 1 according to an embodiment of the present invention may include a communication unit 140 for communication with an external device.

For example, the communication unit 140 may include a broadcast receiving module, a mobile communication module, a wireless Internet module, and a local communication module.

The broadcast receiving module receives broadcast signals and/or broadcast-related information from an external broadcast management server through a broadcast channel. The broadcast channel may include a satellite channel and a terrestrial channel. The broadcast management server may be a server that generates and transmits broadcast signals and/or broadcast-related information, or a server that receives previously generated broadcast signals and/or broadcast-related information and transmits them to a terminal. The broadcast-related information may mean information related to a broadcast channel, a broadcast program, or a broadcast service provider. The broadcast signal may include a TV broadcast signal, a radio broadcast signal, a data broadcast signal, and a broadcast signal in which a data broadcast signal is combined with a TV broadcast signal or a radio broadcast signal.

Meanwhile, the broadcast-related information may be provided through a mobile communication network, and in this case, it may be received by a mobile communication module.

The broadcast-related information may exist in various forms, for example as an EPG (Electronic Program Guide) of DMB (Digital Multimedia Broadcasting) or an ESG (Electronic Service Guide) of DVB-H (Digital Video Broadcast-Handheld).

The broadcast receiving module may receive digital broadcast signals using digital broadcasting systems such as DMB-T (Digital Multimedia Broadcasting-Terrestrial), DMB-S (Digital Multimedia Broadcasting-Satellite), MediaFLO (Media Forward Link Only), DVB-H (Digital Video Broadcast-Handheld), and ISDB-T (Integrated Services Digital Broadcast-Terrestrial). Of course, the broadcast receiving module may be configured to be suitable not only for the digital broadcasting systems described above but also for any other broadcasting system that provides broadcast signals.

The broadcast signal and / or broadcast related information received through the broadcast receiving module may be stored in a memory.

Meanwhile, the mobile communication module transmits and receives radio signals to and from at least one of a base station, an external terminal, and a server on a mobile communication network. The radio signals may include a voice call signal, a video call signal, or various types of data according to text/multimedia message transmission and reception.

The wireless internet module refers to a module for wireless internet access, and the wireless internet module may be internal or external. WLAN (Wi-Fi), Wibro (Wireless broadband), Wimax (World Interoperability for Microwave Access), HSDPA (High Speed Downlink Packet Access) and the like can be used as wireless Internet technologies.

The short-range communication module is a module for short-range communication. Bluetooth, radio frequency identification (RFID), infrared data association (IrDA), ultra wideband (UWB), ZigBee, and the like can be used as the short distance communication technology.

In addition, the wearable electronic device 1 according to an embodiment of the present invention may include a display device for displaying an image to transmit visual information to a user.

The display device may include a transparent or light-transmissive unit so that the user can see the scene in front of them through the information displayed by the display device.

For example, at least one of the left and right lenses 50 and 51 shown in FIG. 1 may function as such a transparent display, so that the user can see the text or images formed on the lens together with the foreground.

For this purpose, the wearable electronic device 1 can display various information in front of the user by using a display device such as an HMD (Head Mounted Display) or a HUD (Head Up Display).

The HMD includes a lens that enlarges an image to form a virtual image, and a display panel disposed within the focal length of the lens. When the HMD is mounted near the user's head, the user can visually recognize the virtual image by viewing the image displayed on the display panel through the lens.
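As a rough illustration (the relation below is not part of the disclosure), the thin-lens equation shows why placing the display panel inside the focal length of the lens yields an enlarged virtual image:

$$\frac{1}{d_i} = \frac{1}{f} - \frac{1}{d_o}, \qquad d_o < f \;\Rightarrow\; d_i < 0 \ (\text{virtual image}), \qquad m = \frac{|d_i|}{d_o} = \frac{f}{f - d_o} > 1,$$

where \(d_o\) is the panel-to-lens distance, \(d_i\) the image distance, \(f\) the focal length, and \(m\) the magnification.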

In the HUD, an image displayed on a display panel is enlarged through a lens, the enlarged image is reflected by a half mirror, and the reflected light is displayed to a user, thereby forming a virtual image. In addition, since the half mirror transmits external light, the user can see the foreground as well as the virtual image formed by the HUD by the light from the outside passing through the half mirror.

The display device may be implemented using various transparent display methods such as TOLED (Transparent OLED).

Hereinafter, an embodiment of the present invention will be described by exemplifying that the wearable electronic device 1 has the HUD, but the present invention is not limited thereto.

Referring to the rear surface of the wearable electronic device 1 shown in FIG. 2, HUDs 150 and 151, which function similarly to projectors, may be mounted on the rear surface of at least one of the left side-arm 30 and the right side-arm 31.

The images formed by the HUDs 150 and 151 are reflected by the left and right lenses 50 and 51 toward the user, so that an object 200 formed by the HUDs 150 and 151 appears to the user as if it were displayed on the left and right lenses 50 and 51.

In this case, as shown in FIG. 3, the object 200 displayed on the left and right lenses 50 and 51 by the HUDs 150 and 151 and the foreground 250 can be observed together in the user's field of view.

The object 200 displayed on the left and right lenses 50 and 51 by the HUDs 150 and 151 is not limited to the menu icon as shown in Fig. 3 and may be an image such as text, a picture, a moving picture, or the like.

With the configuration described with reference to FIGS. 1 and 2, the wearable electronic device 1 can provide functions such as photographing, telephone calls, messaging, social network services (SNS), navigation, and search.

The functions of the wearable electronic device 1 may be various functions in addition to the functions described above according to the provided modules.

For example, functions combining two or more of the above may be implemented, such as transmitting a video captured by the camera 110 to an SNS server through the communication unit 140 and sharing it with other users.

A 3D glasses function allowing the user to view stereoscopic images may also be implemented in the wearable electronic device 1.

For example, when an external display device alternately displays a left-eye image and a right-eye image on a frame-by-frame basis, the wearable electronic device 1 can selectively open or shut off the user's two eyes so that the user perceives a stereoscopic effect.

That is, the wearable electronic device 1 opens the shutter in front of the user's left eye when the display device displays the left-eye image, and opens the shutter in front of the right eye when the display device displays the right-eye image, so that the user can perceive a 3D image.
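As a minimal sketch (the synchronization signal and the interface names below are assumptions, not part of the disclosure), the shutter logic amounts to opening only the eye whose frame is currently shown on the external display:

```python
def update_shutters(device, current_frame_eye):
    """Open only the shutter for the eye whose frame the external 3D display
    is currently showing. 'device', its set_shutter() method, and the way
    current_frame_eye ('L' or 'R') is obtained are illustrative assumptions.
    """
    device.set_shutter(
        left_open=(current_frame_eye == "L"),    # left-eye frame: open left shutter
        right_open=(current_frame_eye == "R"),   # right-eye frame: open right shutter
    )
```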

Figures 4 and 5 show another embodiment of the construction of the wearable electronic device in perspective view.

As shown in FIG. 4, the wearable electronic device 1 may be provided with only one of the left and right lenses (for example, the right lens 51), so that the image produced by the device's internal display is presented to only one eye.

As shown in FIG. 5, the wearable electronic device 1 may also be embodied in a structure in which one of the user's eyes (e.g., the left eye) is left completely uncovered by a lens, and only the upper portion of the other eye (e.g., the right eye) is covered by the device.

The shape and configuration of the wearable electronic device 1 described above may be selected or changed according to various needs, such as the field of use, the main functions, and the main user base.

Hereinafter, a method of controlling a wearable electronic device according to an exemplary embodiment of the present invention will be described in detail with reference to FIGS. 6 to 9.

FIG. 6 is a block diagram of an embodiment of a wearable electronic device according to the present invention. The wearable electronic device 300 includes a control unit 310, a camera 320, a sensing unit 330, a display unit 340, a communication unit 350, and a storage unit 360.

Referring to FIG. 6, the control unit 310 typically controls the overall operation of the wearable electronic device 300 and may, for example, perform the control and processing associated with photographing, telephone calls, messaging, SNS, navigation, and the like. In addition, the controller 310 may include a multimedia module (not shown) for multimedia playback; the multimedia module may be implemented within the controller 310 or separately from it.

The control unit 310 may include one or more processors and a memory for performing the functions described above, and processes and analyzes the signals input from the camera 320, the sensing unit 330, the display unit 340, the communication unit 350, and the storage unit 360.

Meanwhile, the camera 320 processes image frames, such as still images or moving images, obtained by its image sensor in a video communication mode or a photographing mode, and the processed image frames can be displayed through the display unit 340.

According to an embodiment of the present disclosure, the camera 320 may acquire an image by taking a photograph at every predetermined period, for example every 10 minutes, under the control of the controller 310.

On the other hand, the image obtained at every predetermined period may be a still image such as a picture or a moving image captured for a predetermined time (for example, 5 seconds) according to a user's setting.

The image acquired by the camera 320 may be stored in the storage 360 or transmitted to the outside through the communication unit 350. The camera 320 may be provided at two or more different positions depending on the use environment.

The sensing unit 330 may acquire movement information of the wearable electronic device 300 using a gyro sensor, an acceleration sensor, and a position sensor; may acquire the user's biometric information, such as blood pressure, blood sugar, pulse, electrocardiogram, body temperature, exercise amount, face, iris, and fingerprint; and may acquire surrounding-situation information using a temperature sensor, a humidity sensor, a wind direction / wind speed sensor, and the like.

According to an embodiment of the present disclosure, the sensing unit 330 may detect, using the plurality of sensors described above, the motion information of the wearable electronic device 300, the user's biometric information, and the surrounding-situation information at the same predetermined period at which the camera 320 acquires an image.

Meanwhile, the controller 310 may synchronize the periodically obtained image and information with each other and store the same in the storage unit 360 or transmit the image and information to the external device 400 through the communication unit 350.
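The description above leaves the synchronization mechanism open; a minimal sketch, assuming timestamp-based pairing and hypothetical camera and sensing-unit interfaces, might look like this:

```python
import time
from dataclasses import dataclass, field


@dataclass
class LifelogEntry:
    """One periodic record: an image paired with sensor readings by timestamp."""
    timestamp: float                                   # capture time, used as the synchronization key
    image: bytes                                       # still frame or short clip from the camera
    biometrics: dict = field(default_factory=dict)     # e.g. pulse, blood pressure, body temperature
    motion: dict = field(default_factory=dict)         # e.g. acceleration, GPS position
    surroundings: dict = field(default_factory=dict)   # e.g. ambient temperature, noise level


def make_entry(camera, sensing_unit) -> LifelogEntry:
    """Capture an image and read the sensors at (approximately) the same instant.

    camera.capture() and the sensing_unit.read_*() calls are assumed interfaces
    standing in for the camera 320 and the sensing unit 330.
    """
    ts = time.time()
    return LifelogEntry(
        timestamp=ts,
        image=camera.capture(),
        biometrics=sensing_unit.read_biometrics(),
        motion=sensing_unit.read_motion(),
        surroundings=sensing_unit.read_surroundings(),
    )
```

Each entry could then be written to the storage unit 360 or handed to the communication unit 350 as a unit, so image and sensor data stay paired.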

Meanwhile, the storage unit 360 may store a program for the operation of the control unit 310 and temporarily store input / output data (e.g., a message, a still image, a moving picture, and the like).

The storage unit 360 may include at least one type of storage medium among a flash memory type, a hard disk type, a multimedia card micro type, a card-type memory (for example, SD or XD memory), a RAM (Random Access Memory), an SRAM (Static Random Access Memory), a ROM (Read-Only Memory), an EEPROM (Electrically Erasable Programmable Read-Only Memory), a PROM (Programmable Read-Only Memory), a magnetic disk, and an optical disk.

The wearable electronic device 300 may also operate in association with a web storage that performs storage functions of the storage 360 on the Internet.

The display unit 340 displays (outputs) information processed by the wearable electronic device 300. For example, when the wearable electronic device 300 is in a call mode, a UI (User Interface) or GUI (Graphic User Interface) associated with the call is displayed; in a video communication mode or photographing mode, a captured and/or received image, UI, or GUI can be displayed.

As described with reference to FIGS. 1 to 3, the display unit 340 may be implemented as an HMD, a HUD, or a transparent display such as a TOLED, so that the user can visually recognize an object displayed through the display unit 340 together with the foreground.

The communication unit 350 may include one or more communication modules enabling the wearable electronic device 300 to perform data communication with the external device 400, and may include, for example, a broadcast receiving module, a mobile communication module, a wireless Internet module, a short-range communication module, a location information module, and the like.

The wearable electronic device 300 may further include an interface unit (not shown) serving as a path for communication with all external devices connected to the wearable electronic device 300.

The interface unit receives data from an external device or supplies power to each component in the wearable electronic device 300 or allows data in the wearable electronic device 300 to be transmitted to an external device.

For example, a wired / wireless headset port, an external charger port, a wired / wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input / output (I / O) port, a video input / output (I / O) port, an earphone port, and the like may be included in the interface unit.

The identification module is a chip that stores various information for authenticating the usage rights of the wearable electronic device 300, and may include a User Identity Module (UIM), a Subscriber Identity Module (SIM), a Universal Subscriber Identity Module (USIM), and the like. A device with an identification module (hereinafter referred to as an "identification device") can be manufactured in a smart-card format and connected to the wearable electronic device 300 through the corresponding port.

In addition, when the wearable electronic device 300 is connected to an external cradle, the interface unit may serve as a path through which power from the cradle is supplied to the wearable electronic device 300, or through which various command signals input from the cradle by the user are transferred to the device. Such command signals or the supplied power may serve as a signal for recognizing that the device is correctly mounted on the cradle.

Meanwhile, the wearable electronic device 300 may further include a power supply unit (not shown) that receives external power and internal power under the control of the control unit 310 and supplies the power required for the operation of the respective components. The power supply unit may include a system capable of charging using solar energy.

The various embodiments described herein may be implemented in a recording medium readable by a computer or similar device using, for example, software, hardware, or a combination thereof. For a hardware implementation, the embodiments described herein may be implemented using at least one of application specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), processors, controllers, micro-controllers, microprocessors, and other electrical units for performing functions. In some cases, such embodiments may be implemented by the control unit 310 itself.

For a software implementation, embodiments such as procedures or functions may be implemented with separate software modules that each perform at least one function or operation. The software code may be implemented by a software application written in a suitable programming language, stored in the storage unit 360, and executed by the control unit 310.

FIG. 7 is a flowchart illustrating an embodiment of a control method according to the present invention. The control method shown in FIG. 7 will be described in conjunction with the block diagram of the wearable electronic device 300 shown in FIG. 6.

Referring to FIG. 7, when the preset time t elapses (step S500), the camera 320 of the wearable electronic device 300 captures an image (step S510), and at the same time the sensing unit 330 detects the user's biometric information and the motion information of the wearable electronic device 300 (step S520).

The biometric information of the user is information for confirming the current state of the user wearing the wearable electronic device 300.

For this purpose, the sensing unit 330 may include a blood pressure measuring sensor, a blood sugar measuring sensor, a pulse measuring sensor, an electrocardiogram measuring sensor, a body temperature measuring sensor, an exercise amount measuring sensor, a face recognition module or an iris recognition module, and the like. The biometric information measurement / recognition module may be mounted at a position where the biometric information may be most easily measured or recognized.

For example, the sensing unit 130 described above, which detects the movement, location, and surrounding-situation information (e.g., temperature, humidity, noise, wind direction, air volume) of the wearable electronic device 300, may be mounted on the outer surface 30a of the side-arm, as shown in FIG. 8.

In addition, a fingerprint recognition module 131 may be mounted on the outer surface 30a of the side-arm as shown in FIG. 8, so that when the user touches a finger to the corresponding position, the recognized fingerprint information can be delivered to the controller 310.

As shown in FIG. 9, a pulse measuring sensor 132 may be mounted on the portion 30b of the side-arm, more specifically at a position adjacent to the user's ear when the wearable electronic device 300 is worn, so that the user's pulse can be measured automatically while the device is worn and the information transmitted to the controller 310.

Meanwhile, the camera 320 may also perform part of the function of the sensing unit 330 by photographing the user's eyes, part of the face, the iris, and the like to capture the user's biometric information, or a surrounding dangerous situation may be recognized from the captured image.

In addition, the microphone (not shown) may perform part of the function of the sensing unit 330, so that surrounding-situation information such as ambient noise may be obtained.

The controller 310 stores or transmits the image photographed by the camera 320 in synchronization with the information detected by the sensing unit 330 (step S530).

For example, the controller 310 may synchronize and manage them based on the time at which the image is captured by the camera 320 and the time at which the information is detected by the sensing unit 330.

The controller 310 causes the work from step S510 to step S530 to be periodically performed until the end of the lifelog function (step S540).
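A minimal sketch of this loop (S500 to S540), with hypothetical component objects standing in for the camera 320, sensing unit 330, storage unit 360, and communication unit 350 (their method names are assumptions):

```python
import time


def run_lifelog(camera, sensing_unit, storage, comm,
                period_s=600, lifelog_active=lambda: True):
    """Periodic lifelog loop sketched from FIG. 7; all component objects and
    their methods are illustrative assumptions, not the actual device API."""
    while lifelog_active():                        # S540: repeat until the lifelog function ends
        time.sleep(period_s)                       # S500: wait for the preset period t
        image = camera.capture()                   # S510: capture an image
        sensed = {                                 # S520: detect biometric and motion information
            "bio": sensing_unit.read_biometrics(),
            "motion": sensing_unit.read_motion(),
        }
        record = {"time": time.time(), "image": image, **sensed}
        storage.save(record)                       # S530: store the synchronized record locally...
        if comm.is_connected():
            comm.send(record)                      # ...and/or transmit it to an external device
```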

Meanwhile, since the amount of information detected through the sensing unit 330 may be very large, the controller 310 may process the detected information into predetermined risk categories or risk levels, such as "Emergency! Pulse stopped" or "Emergency! Forced removal".

In detail, when the wearable electronic device 300 is set to capture a video every five minutes and periodically synchronize it with the related information, the controller 310 can use image-recognition search to classify and manage the actions the user performed during, for example, the past month according to the user's biometric information and surrounding information.

For example, information about what foods the user has eaten over the past month may be provided together with the relevant photos, so that the user can refer not only to the past month's memories but also to diet planning or the current meal menu selection.

In addition, when the controller 310 determines, based on the risk level, the user's interests, or a recording-and-transmission value and cost index as described above, that the captured image corresponds to an unusual situation, the image and the information measured through the sensing unit 330 may be stored in the storage unit 360 or transmitted to the external device 400 through the communication unit 350.

In the above, the control method of the wearable electronic device 300 according to an embodiment of the present invention has been described with reference to FIGS. 6 to 9, but the present invention is not limited thereto.

For example, the wearable electronic device 300 may include a proximity sensor (not shown) to recognize whether the user is wearing it, and the control unit 310 may control the wearable electronic device 300 to operate in a standby mode, in which most functions are inactive, until the user puts the device on.
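A minimal sketch of this wear-dependent power state, assuming a hypothetical is_worn() reading from the proximity sensor:

```python
def update_power_mode(device, proximity_sensor):
    """Switch between standby and active modes based on whether the device is worn.

    proximity_sensor.is_worn() and device.set_mode() are assumed interfaces;
    the disclosure only states that a proximity sensor detects wearing.
    """
    if proximity_sensor.is_worn():
        device.set_mode("active")     # lifelog and other functions enabled
    else:
        device.set_mode("standby")    # most functions kept inactive to save power
```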

The proximity sensor refers to a sensor that detects the presence or absence of an object approaching a predetermined detection surface or a nearby object without mechanical contact using the force of an electromagnetic field or infrared rays. The proximity sensor has a longer life span than the contact sensor and its utilization is also high.

Examples of proximity sensors include a transmission type photoelectric sensor, a direct reflection type photoelectric sensor, a mirror reflection type photoelectric sensor, a high frequency oscillation type proximity sensor, a capacitive proximity sensor, a magnetic proximity sensor, and an infrared proximity sensor. When the touch screen is capacitive, the touch screen is configured to detect the proximity of the pointer by the change of the electric field according to the proximity of the pointer. In this case, the touch screen (touch sensor) may be classified as a proximity sensor.

In addition, the wearable electronic device 300 may further include a haptic module (not shown) capable of generating various tactile effects that a user can feel.

A representative example of the haptic effect generated by the haptic module is vibration, and the intensity and pattern of vibration generated by the haptic module can be controlled. For example, different vibrations may be synthesized and output or sequentially output.

In addition to vibration, the haptic module can generate various tactile effects, such as stimulation by a pin array moving vertically against the skin surface, the blowing or suction force of air through a nozzle or inlet, grazing against the skin surface, contact of an electrode, electrostatic force, and the reproduction of a sensation of cold or heat using a heat-absorbing or heat-emitting element.

On the other hand, the haptic module can not only deliver the haptic effect through direct contact, but also can be implemented so that the user can feel the haptic effect through the muscle sense of the finger or arm. Two or more haptic modules may be provided according to a configuration aspect of the wearable electronic device 300.

According to an embodiment of the present disclosure, the haptic module may be controlled by the controller 310 to inform the user of information related to a function performed by the wearable electronic device 300. For example, the start or end of a specific function, a specific state, or the occurrence of an unusual situation as described above may be delivered to the user through a tactile effect.

FIG. 10 is a block diagram illustrating the configuration of a user risk detection system according to an embodiment of the present invention. The illustrated risk detection system may include a wearable electronic device 300, a server 410, a guardian terminal 420, and a public institution server 430.

Referring to FIG. 10, as described with reference to FIGS. 6 to 9, the wearable electronic device 300 can periodically capture an image and store it in synchronization with the user's biometric information, motion information, and surrounding-situation information at the corresponding point in time.

In addition, when the synchronized information satisfies a predetermined condition, for example when a dangerous or unusual situation is determined from the detected information, the wearable electronic device 300 may transmit the synchronized image and related information to the server 410.

Meanwhile, the server 410 may store and manage the image and related information received from the wearable electronic device 300, and may transmit at least one of the received image and related information to the guardian terminal 420.

Also, the server 410 may transmit at least one of the image and related information received from the wearable electronic device 300 to a public institution server 430, such as that of a police station or a hospital, to provide information about a dangerous situation.
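A minimal server-side sketch of this relay; the guardian and agency interfaces and the escalation threshold are assumptions, since the text only says that severe situations are forwarded to a police station or hospital:

```python
def relay_alert(record, risk_grade, guardian_api, agency_api, escalate_at=2):
    """Forward a synchronized record to the guardian terminal and, for severe
    cases, to a public-institution server (police station or hospital).

    Lower grade numbers are treated as more severe (grade 1 = most severe),
    following the risk grades described later in the text.
    """
    guardian_api.push(record)          # always forward to the guardian terminal 420
    if risk_grade <= escalate_at:      # severe enough: also notify the public institution server 430
        agency_api.push(record)
```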

Hereinafter, embodiments of a method of processing an image and information acquired by the wearable electronic device 300 will be described in detail with reference to FIGS. 11 to 17.

According to an embodiment of the present disclosure, when the user takes off the wearable electronic device 300, a user interface for notifying of the removal may be provided.

Referring to FIG. 11, when a predetermined allowable time elapses after the user takes off the glasses-type wearable electronic device 300 (for example, 10 minutes after removal), the device may use vibration or voice to inform the user that it should be put back on quickly.

At the same time, or after a further predetermined time elapses (for example, 15 minutes after removal), the images and related information synchronized by the controller 310 and stored in the storage 360 may be transmitted to the guardian terminal 420 through the server 410.
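A minimal sketch of this escalation, using the 10- and 15-minute example values from the text (the notifier and uploader interfaces are assumptions):

```python
def check_removal(removed_for_s, notifier, uploader,
                  warn_after_s=10 * 60, report_after_s=15 * 60):
    """Escalate after the device has been taken off: first prompt the user,
    then forward the stored lifelog to the guardian via the server."""
    if removed_for_s >= report_after_s:
        uploader.send_recent_lifelog()                     # images + synchronized info -> server -> guardian terminal
    elif removed_for_s >= warn_after_s:
        notifier.alert("Please put the device back on")    # vibration or voice prompt to the user
```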

Referring to FIG. 12, when 15 minutes have elapsed after the user took off the wearable electronic device 300, the corresponding information may be transmitted to the guardian terminal 420 through the server 410 and displayed on the screen of the guardian terminal 420.

In addition, the screen of the guardian terminal 420 may provide menu items 422 to 425 for checking detailed information on the situation at the time of removal.

For example, the guardian can select the 'watch video' item 422 to check the images periodically captured by the wearable electronic device 300 up to the time of removal, or select the 'bio information' item 423 to check the user's blood pressure, blood sugar, pulse, electrocardiogram, body temperature, exercise amount, face, and iris state synchronized in time with those images.

In addition, the guardian can select the 'location / movement' item 424 to check the user's movement or location information up to the time of removal, or the 'ambient situation' item 425 to check surrounding conditions such as temperature, humidity, air volume, and noise.

According to another embodiment of the present invention, the guardian may use the guardian terminal 420 to make the wearable electronic device 300 perform the wear-notification function shown in FIG. 11, via the server 410.

Meanwhile, the control operation of the wearable electronic device 300 described above may be performed differently for each risk level, for each of the user's interests, or depending on an index calculated in terms of transmission value and cost.

For example, if the synchronized image and related information are transmitted excessively in ordinary situations, the power of the wearable electronic device 300 may be consumed unnecessarily, making it harder to cope with an actual emergency or dangerous situation. Sensed values that are highly important but small in volume, such as the user's pulse and location, have a high overall value for their cost, while video and audio may have an overall value lower than their battery and data cost.

Accordingly, the control unit 310 of the wearable electronic device 300 may use past experience values recorded in the wearable electronic device 300, the server 410, and the terminal 420 to weigh transmission value against cost, and thereby determine whether to store the image and related information locally or to transmit them.
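A minimal sketch of such a value-for-cost comparison; the formula and weighting constants are illustrative assumptions, since the text only says that small, important readings such as pulse and location are worth transmitting while video and audio may cost more than they are worth:

```python
def transmission_value(importance, data_size_bytes, battery_level,
                       size_weight=1e-6, battery_penalty=2.0):
    """Return a rough value-for-cost score for one candidate item.

    importance: 0..1 judged importance of the item; battery_level: 0..1
    remaining charge. The weighting constants are arbitrary illustrative choices.
    """
    cost = data_size_bytes * size_weight + battery_penalty * (1.0 - battery_level)
    return importance / max(cost, 1e-9)


# e.g. a pulse sample (tiny, important) scores far higher than a video clip
pulse_score = transmission_value(importance=0.9, data_size_bytes=64, battery_level=0.4)
video_score = transmission_value(importance=0.6, data_size_bytes=5_000_000, battery_level=0.4)
```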

Hereinafter, embodiments of a user interface for notifying a user's state according to a user risk level will be described with reference to FIGS. 13 to 17.

First, an upper limit and a lower limit may be preset for information detected through the sensing unit 330 of the wearable electronic device 300, for example, an acceleration, a speed, a pulse rate, a heart rate, a blood pressure, a body temperature, and the like of a user.

Meanwhile, the user or the user's guardian may directly set the upper and lower limits, and may also set a safe location.

In this case, the risk level of the user may be determined by comparing the information detected through the sensing unit 330 with the set risk upper / lower limit value and the safety position.

For example, the risk level 'grade 4' may be when the user's pulse rate and instantaneous acceleration exceed the upper limit.

Meanwhile, when a risk situation occurs, the wearable electronic device 300 may preferentially send to the server 410 the information that is most important or smallest in data volume, and may increase the amount of data transmitted as the risk becomes more severe.
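A minimal sketch of this graded transmission policy, following the grade descriptions given below for FIGS. 13 to 17; the exact field groupings per grade are an illustrative assumption:

```python
def select_payload(risk_grade, record):
    """Choose what to transmit for a given risk grade (4 = mildest, 1 = most severe)."""
    payload = {"bio": record["bio"], "location": record["location"]}  # small, high-value data first
    if risk_grade <= 3:
        payload["image"] = record["image"]       # add the synchronized image from grade 3 down
    if risk_grade <= 2:
        payload["live_stream"] = True            # request real-time video from grade 2 down
    if risk_grade == 1:
        payload["notify_agencies"] = True        # trigger the emergency report to police / hospital
    return payload
```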

Referring to FIG. 13, when a situation corresponding to a risk level 'grade 4' occurs, the wearable electronic device 300 may notify a user of a dangerous situation using vibration or voice.

In addition, the controller 310 may transmit to the server 410, through the communication unit 350, the information stored in the storage unit 360 for a predetermined time leading up to the dangerous situation, for example the user's biometric information, location / motion information, and surrounding-situation information.

Referring to FIG. 14, when risk level 'grade 4' occurs, the guardian terminal 420 may receive the user's biometric information, location / motion information, and surrounding-situation information from the server 410 and display them on its screen.

Risk level 'grade 3' may correspond to a case where the user's position has been outside the preset safe location for a predetermined time and the pulse rate exceeds the upper limit.

When a situation corresponding to risk level 'grade 3' occurs, the wearable electronic device 300 may notify the user of the occurrence of risk level 'grade 3' using vibration or voice.

At the same time, the wearable electronic device 300 may transmit to the server 410 the user biometric information, location / movement information, and surrounding-situation information stored in the storage 360 for a predetermined time leading up to the dangerous situation, together with the synchronized image.

Referring to FIG. 15, when risk level 'grade 3' occurs, the guardian terminal 420 may receive the user's biometric information, location / movement information, surrounding-situation information, and images from the server 410 and display them on its screen.

Meanwhile, the occurrence of risk level 'grade 3' may be continuously signaled through vibration or voice on the guardian terminal 420 until the guardian recognizes the situation and takes a specific action.

Risk level 'grade 2' may correspond to a case where the user's position has been outside the preset safe location for a predetermined time and the pulse rate and instantaneous acceleration exceed their upper limits, or where the sound around the user is determined to be at a dangerous level.

When a situation corresponding to risk level 'grade 2' occurs, the wearable electronic device 300 may notify the user of the occurrence of risk level 'grade 2' using vibration or voice.

At the same time, the wearable electronic device 300 may transmit to the server 410 the user biometric information, location / movement information, and surrounding-situation information stored in the storage 360 for a predetermined time leading up to the dangerous situation, together with the synchronized image, and the camera may photograph the surrounding situation in real time and transmit the real-time image to the server 410.

Referring to FIG. 16, when risk level 'grade 2' occurs, the guardian terminal 420 may display on its screen the user biometric information, location / motion information, and surrounding-situation information received from the server 410 and, more importantly, the real-time image around the wearable electronic device 300 received from the server 410.

Meanwhile, the occurrence of risk level 'grade 2' may be continuously signaled through vibration or voice on the guardian terminal 420 until the guardian recognizes the situation and takes a specific action.

Risk level 'grade 1' may correspond to a case where the pulse rate is very weak or absent, below the lower limit, possibly indicating a heart attack or excessive bleeding, or where the wearable electronic device 300 may have been forcibly removed by a criminal.

Referring to FIG. 17, when risk level 'grade 1' occurs, the guardian terminal 420 may display on its screen a real-time image together with the user's biometric information, location / motion information, and surrounding-situation information from the server 410, and may allow the guardian to confirm an emergency report to the police or a hospital.

Meanwhile, the wearable electronic device 300 operates all sensors capable of recognizing a user's current state and surrounding conditions, and continuously transmits images and related information to the public institution server 430 in real time.

However, the operation according to the risk level described above may be adjusted to suit the periodically checked battery status of the wearable electronic device 300, and the shooting and sensing intervals may be shortened as the risk becomes more severe.

Meanwhile, when the guardian checks the user's state through the terminal 420 and determines that the user is safe, the wearable electronic device 300 may be returned to its normal state by remote operation.

In addition to the risk level grades described above, if the user is not moving but the body temperature is high, it may be determined that the user is likely to be ill, and only the body temperature information may be transmitted to the guardian terminal 420.

According to another embodiment of the present invention, the user log recording and transmitting operation as described above may be performed according to the interest set by the user in advance.

For example, a user may set the device to automatically record video and audio when visiting a specific place at a specific time, and to record the pulse or movement at that time in synchronization with the recording.

The weight for determining the risk level as described above may be changed by the user or the guardian of the user.

Referring to FIG. 18, a user may set weights for pulse, position, body temperature, image, and sound through the terminal 500; for example, specific information can be made more important in determining the risk level by increasing its weight, or relatively less important by lowering its weight.
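A minimal sketch of how such weights might be combined into a single score; the linear combination is an assumption, since the text only says each item's weight can be raised or lowered:

```python
def weighted_risk_score(readings, weights):
    """Combine normalized sensor deviations (0 = safe, 1 = at the set limit)
    with the weights configured through the terminal 500."""
    total_weight = sum(weights.values()) or 1.0
    return sum(weights.get(name, 0.0) * value for name, value in readings.items()) / total_weight


# e.g. emphasize pulse and position over sound when judging the risk level
score = weighted_risk_score(
    readings={"pulse": 0.8, "position": 0.6, "body_temp": 0.2, "sound": 0.4},
    weights={"pulse": 3, "position": 2, "body_temp": 1, "sound": 0.5},
)
```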

According to an embodiment of the present invention, the image and user-related information managed as described above may be expressed according to the movement of the user.

19 illustrates an embodiment of a method of providing a user's lifelog together with map information.

Referring to FIG. 19, a map 510 may be displayed on a screen of the terminal 500, and a moving path 511 of the user may be displayed on the map 510. Meanwhile, the movement route 511 displayed on the map 510 may be obtained through a GPS device provided in the wearable electronic device 300.

Meanwhile, points 512, 514, and 515 at which images and user-related information were synchronized may be displayed on the movement path 511 of the map 510, and time information 513 corresponding to each point may be displayed adjacent to it.

The user may select one of the points 512, 514, and 515 to check the image, biometric information, motion / location information, and surrounding-situation information acquired at the corresponding point in time.

Among the points 512, 514, and 515 displayed on the movement path 511 of the map 510, a starred point 514, for example, may indicate that the image and related information corresponding to that point have been uploaded to an SNS.

Meanwhile, a specific point among the points 512, 514, and 515 displayed on the movement path 511 of the map 510, for example a point 515 marked with a face, may indicate the point where the image and related information were most recently acquired.

The life log of the user recorded as described above may be automatically organized and managed for each event as shown in FIG. 20.

The method according to the present invention may be implemented as a program for execution on a computer and stored in a computer-readable recording medium. Examples of the computer-readable recording medium include a ROM, a RAM, a CD-ROM, a floppy disk, an optical data storage device, and the like, and the method may also be implemented in the form of a carrier wave (for example, transmission over the Internet).

The computer-readable recording medium may be distributed over a networked computer system so that computer-readable code can be stored and executed in a distributed manner. In addition, functional programs, codes, and code segments for implementing the above method can be easily inferred by programmers in the technical field to which the present invention belongs.

While the present invention has been particularly shown and described with reference to exemplary embodiments thereof, it will be understood by those skilled in the art that the invention is not limited to the disclosed embodiments and that various changes in form and detail may be made therein without departing from the spirit and scope of the present invention.

Claims (1)

A wearable electronic device comprising:
at least one lens and display means for displaying information on the lens;
a camera photographing at a predetermined period to obtain an image;
a sensing unit configured to detect user biometric information and motion information of the wearable electronic device; and
a controller configured to store or transmit the obtained image in synchronization with the information detected by the sensing unit.
KR1020120083810A 2012-07-31 2012-07-31 Wearable electronic device and method for controlling the same KR20140017735A (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
KR1020120083810A KR20140017735A (en) 2012-07-31 2012-07-31 Wearable electronic device and method for controlling the same
PCT/KR2013/006821 WO2014021602A2 (en) 2012-07-31 2013-07-30 Wearable electronic device and method for controlling same
US14/413,802 US20150156196A1 (en) 2012-07-31 2013-07-30 Wearable electronic device and method for controlling same

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020120083810A KR20140017735A (en) 2012-07-31 2012-07-31 Wearable electronic device and method for controlling the same

Publications (1)

Publication Number Publication Date
KR20140017735A true KR20140017735A (en) 2014-02-12

Family

ID=50266121

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020120083810A KR20140017735A (en) 2012-07-31 2012-07-31 Wearable electronic device and method for controlling the same

Country Status (1)

Country Link
KR (1) KR20140017735A (en)

Cited By (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2015123035A1 (en) * 2014-02-17 2015-08-20 General Electric Company Video system and method for data communication
WO2015157016A1 (en) * 2014-04-09 2015-10-15 3M Innovative Properties Company Head mounted display and low conspicuity pupil illuminator
EP2947968A3 (en) * 2014-05-02 2015-12-09 LG Electronics Inc. Lighting system and control method thereof
CN105398471A (en) * 2014-09-08 2016-03-16 通用电气公司 Optical Route Examination System And Method
KR20160074127A (en) * 2014-12-18 2016-06-28 숭실대학교산학협력단 Home network system and control method thereof, recording medium for performing the method
KR20170018930A (en) * 2014-06-14 2017-02-20 매직 립, 인코포레이티드 Methods and systems for creating virtual and augmented reality
WO2017204396A1 (en) * 2016-05-26 2017-11-30 삼성전자 주식회사 Electronic device and method for electronic device
US9873442B2 (en) 2002-06-04 2018-01-23 General Electric Company Aerial camera system and method for identifying route-related hazards
US9875414B2 (en) 2014-04-15 2018-01-23 General Electric Company Route damage prediction system and method
US9919723B2 (en) 2002-06-04 2018-03-20 General Electric Company Aerial camera system and method for determining size parameters of vehicle systems
US10049298B2 (en) 2014-02-17 2018-08-14 General Electric Company Vehicle image data management system and method
US10110795B2 (en) 2002-06-04 2018-10-23 General Electric Company Video system and method for data communication
US10798282B2 (en) 2002-06-04 2020-10-06 Ge Global Sourcing Llc Mining detection system and method
US11124207B2 (en) 2014-03-18 2021-09-21 Transportation Ip Holdings, Llc Optical route examination system and method

Cited By (23)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11039055B2 (en) 2002-06-04 2021-06-15 Transportation Ip Holdings, Llc Video system and method for data communication
US10110795B2 (en) 2002-06-04 2018-10-23 General Electric Company Video system and method for data communication
US9919723B2 (en) 2002-06-04 2018-03-20 General Electric Company Aerial camera system and method for determining size parameters of vehicle systems
US9873442B2 (en) 2002-06-04 2018-01-23 General Electric Company Aerial camera system and method for identifying route-related hazards
US10798282B2 (en) 2002-06-04 2020-10-06 Ge Global Sourcing Llc Mining detection system and method
US10049298B2 (en) 2014-02-17 2018-08-14 General Electric Company Vehicle image data management system and method
CN106537900A (en) * 2014-02-17 2017-03-22 通用电气公司 Video system and method for data communication
CN110545380B (en) * 2014-02-17 2021-08-06 通用电气全球采购有限责任公司 Video system and method for data communication
WO2015123035A1 (en) * 2014-02-17 2015-08-20 General Electric Company Video system and method for data communication
CN110545380A (en) * 2014-02-17 2019-12-06 通用电气全球采购有限责任公司 video system and method for data communication
AU2015217536B2 (en) * 2014-02-17 2019-05-30 Ge Global Sourcing Llc Video system and method for data communication
CN106537900B (en) * 2014-02-17 2019-10-01 通用电气全球采购有限责任公司 Video system and method for data communication
US11124207B2 (en) 2014-03-18 2021-09-21 Transportation Ip Holdings, Llc Optical route examination system and method
US10545340B2 (en) 2014-04-09 2020-01-28 3M Innovative Properties Company Head mounted display and low conspicuity pupil illuminator
US11675191B2 (en) 2014-04-09 2023-06-13 3M Innovative Properties Company Head mounted display and low conspicuity pupil illuminator
WO2015157016A1 (en) * 2014-04-09 2015-10-15 3M Innovative Properties Company Head mounted display and low conspicuity pupil illuminator
US9875414B2 (en) 2014-04-15 2018-01-23 General Electric Company Route damage prediction system and method
US9655212B2 (en) 2014-05-02 2017-05-16 Lg Electronics Inc. Lighting system having a plurality of lighting devices and an integrated control module
EP2947968A3 (en) * 2014-05-02 2015-12-09 LG Electronics Inc. Lighting system and control method thereof
KR20170018930A (en) * 2014-06-14 2017-02-20 매직 립, 인코포레이티드 Methods and systems for creating virtual and augmented reality
CN105398471A (en) * 2014-09-08 2016-03-16 通用电气公司 Optical Route Examination System And Method
KR20160074127A (en) * 2014-12-18 2016-06-28 숭실대학교산학협력단 Home network system and control method thereof, recording medium for performing the method
WO2017204396A1 (en) * 2016-05-26 2017-11-30 삼성전자 주식회사 Electronic device and method for electronic device

Similar Documents

Publication Publication Date Title
KR20140017735A (en) Wearable electronic device and method for controlling the same
US20150156196A1 (en) Wearable electronic device and method for controlling same
US10356398B2 (en) Method for capturing virtual space and electronic device using the same
KR101594428B1 (en) Method and system for providing security service using drone
EP3051463B1 (en) Image processing method and electronic device for supporting the same
KR20140130321A (en) Wearable electronic device and method for controlling the same
KR20170067058A (en) Mobile terminal and method for controlling the same
KR102091604B1 (en) Mobile terminal and method for controlling the same
KR20170055869A (en) Mobile terminal and method for controlling the same
KR20140017734A (en) Wearable electronic device and method for controlling the same
KR20150142516A (en) Glass type terminal and control method thereof
KR20180028211A (en) Head mounted display and method for controlling the same
CN106067833A (en) Mobile terminal and control method thereof
KR20140128489A (en) Smart glass using image recognition and touch interface and control method thereof
CN109379539A (en) A kind of screen light compensation method and terminal
US9958681B2 (en) Electronic device and control method thereof
KR20140130331A (en) Wearable electronic device and method for controlling the same
KR20150110053A (en) Method and apparatus for sharing information using wearable device
KR20170062376A (en) Electronic apparatus and method for displaying and generating panorama video
KR101582266B1 (en) Method for controlling information display at mobile device interworking with wearable device
KR20170055296A (en) Tethering type head mounted display and method for controlling the same
KR20140130332A (en) Wearable electronic device and method for controlling the same
US12072489B2 (en) Social connection through distributed and connected real-world objects
CN104809370A (en) Device and method for determining validity of authentication information of head-wearing intelligent device
EP3591514A1 (en) Electronic device and screen image display method for electronic device

Legal Events

Date Code Title Description
WITN Withdrawal due to no request for examination