WO2022158854A1 - Wearable electronic device receiving information from an external wearable electronic device, and operating method thereof - Google Patents
- Publication number
- WO2022158854A1 (PCT/KR2022/000998)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- electronic device
- wearable electronic
- external
- audio data
- display
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/16—Sound input; Sound output
- G06F3/167—Audio in a user interface, e.g. using voice commands for navigating, audio feedback
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/017—Head mounted
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/16—Constructional details or arrangements
- G06F1/1613—Constructional details or arrangements for portable computers
- G06F1/163—Wearable computers, e.g. on a belt
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F21/00—Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
- G06F21/30—Authentication, i.e. establishing the identity or authorisation of security principals
- G06F21/31—User authentication
- G06F21/32—User authentication using biometric data, e.g. fingerprints, iris scans or voiceprints
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/013—Eye tracking input arrangements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/015—Input arrangements based on nervous system activity detection, e.g. brain waves [EEG] detection, electromyograms [EMG] detection, electrodermal response detection
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/14—Digital output to display device ; Cooperation and interconnection of the display device with other functional units
-
- G—PHYSICS
- G10—MUSICAL INSTRUMENTS; ACOUSTICS
- G10L—SPEECH ANALYSIS TECHNIQUES OR SPEECH SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING TECHNIQUES; SPEECH OR AUDIO CODING OR DECODING
- G10L15/00—Speech recognition
- G10L15/26—Speech to text systems
Definitions
- The present disclosure relates to a wearable electronic device that receives information from an external wearable electronic device, and a method of operating the same.
- Augmented reality is a technology that superimposes a three-dimensional (or two-dimensional) virtual image on a real image or background and displays it as a single image.
- Augmented reality technology, in which the real environment and virtual objects are mixed, can provide a better sense of reality and additional information because the user continues to see the real environment. For example, the user may observe a virtual image together with the actual environment and check information about an object in the currently observed environment.
- the augmented reality device may be a wearable electronic device.
- an electronic device in the form of AR glasses that can be worn on the face like glasses is widespread.
- Speech-to-text (STT) is a technology that receives spoken language, converts the input speech into text form, and outputs the text.
- A user's ability to recognize voices in the surrounding environment may be reduced, so information about voices generated in the surrounding environment may be provided visually to the user of the wearable electronic device through the STT function.
- Because the wearable electronic device is miniaturized to be worn on a part of the user's body, it may acquire only data from its local environment, and thus the accuracy of the STT function may be limited.
- the wearable electronic device may receive, from the external wearable electronic device, state information based on a signal obtained from the external wearable electronic device, and provide the STT function in consideration of the state information.
- A wearable electronic device may include a display, a communication circuit, a voice input device, and at least one processor, wherein the at least one processor is configured to acquire audio data through the voice input device, confirm that the audio data satisfies a predetermined condition, receive, from an external wearable electronic device through the communication circuit, status information based on a signal obtained from the external wearable electronic device, and control the display to display visual information corresponding to the audio data based at least in part on the status information.
- A wearable electronic device may include a display, a communication circuit, a voice input device, and at least one processor, wherein the at least one processor is configured to acquire, through the voice input device, first audio data corresponding to an external event, receive, from an external wearable electronic device through the communication circuit, second audio data obtained in the external wearable electronic device and corresponding to the external event, identify an event direction corresponding to the external event based on the first audio data and the second audio data, and perform an operation corresponding to the event direction.
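The event-direction check described above can be illustrated with a small sketch. This is an assumption-laden example, not the patent's algorithm: it estimates the time difference of arrival (TDOA) between the audio captured by the two devices with a brute-force cross-correlation, and maps the sign of the lag to a direction (the left/right mapping is itself an assumption).

```python
# Hypothetical sketch (not the disclosed implementation): estimating the
# direction of an external event from two audio captures by comparing
# their time difference of arrival (TDOA) via cross-correlation.

def cross_correlation_lag(first, second, max_lag):
    """Return the lag (in samples) at which `second` best aligns with `first`."""
    best_lag, best_score = 0, float("-inf")
    for lag in range(-max_lag, max_lag + 1):
        score = 0.0
        for i, x in enumerate(first):
            j = i + lag
            if 0 <= j < len(second):
                score += x * second[j]
        if score > best_score:
            best_lag, best_score = lag, score
    return best_lag

def event_direction(first_audio, second_audio, max_lag=8):
    """Classify the event direction from the sign of the TDOA (mapping assumed)."""
    lag = cross_correlation_lag(first_audio, second_audio, max_lag)
    if lag > 0:
        return "left"   # sound reached the first device earlier (assumption)
    if lag < 0:
        return "right"
    return "front"
```

In practice the two captures would first be time-synchronized over the wireless link; that synchronization step is omitted here.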
- A method performed in a wearable electronic device may include obtaining audio data; confirming that the audio data satisfies a predetermined condition; receiving, from an external wearable electronic device, status information based on a signal obtained in the external wearable electronic device; and displaying visual information corresponding to the audio data based at least in part on the status information.
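The described flow (acquire audio, check a predetermined condition, fetch status information from the external device, then display visual information) can be sketched as follows. This is an illustrative outline only; `request_status`, `display_text`, and `transcribe` are hypothetical stand-ins, not APIs from the disclosure, and the level threshold is an assumed example of a "predetermined condition."

```python
# Hypothetical sketch of the described method flow; the callbacks are
# illustrative stand-ins, not APIs from the disclosure.

def audio_meets_condition(audio_data, threshold=0.5):
    """Assumed predetermined condition: average signal level exceeds a threshold."""
    if not audio_data:
        return False
    level = sum(abs(s) for s in audio_data) / len(audio_data)
    return level >= threshold

def provide_stt(audio_data, request_status, display_text, transcribe):
    """Display visual information for `audio_data` based on external status info."""
    if not audio_meets_condition(audio_data):
        return None
    status = request_status()      # e.g., worn/not-worn state from the earphone
    if status.get("worn"):         # provide STT only while the external device is worn
        text = transcribe(audio_data)
        display_text(text)
        return text
    return None
```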
- A wearable electronic device that receives information from an external wearable electronic device, and a method of operating the same, may be provided.
- the wearable electronic device may receive, from the external wearable electronic device, state information based on a signal obtained from the external wearable electronic device, and provide the STT function in consideration of the state information.
- The wearable electronic device may determine whether to provide the STT function in consideration of state information based on a signal obtained from the external wearable electronic device, and thus may accurately determine whether the user is in a situation that requires the STT function.
- Since the wearable electronic device according to embodiments provides the STT function based on state information derived from a signal obtained from the external wearable electronic device, the STT function may be provided with high accuracy.
- FIG. 1 illustrates a structure of a wearable electronic device according to various embodiments of the present disclosure.
- FIG. 2 illustrates a structure of a display of a wearable electronic device and an eye tracking camera, according to various embodiments of the present disclosure.
- FIG. 3 is a block diagram of a wearable electronic device according to various embodiments of the present disclosure.
- FIG. 4 is a block diagram of an external wearable electronic device according to various embodiments of the present disclosure.
- FIG. 5 illustrates communication between a wearable electronic device and an external wearable electronic device, according to various embodiments.
- FIG. 6 is a flowchart illustrating operations performed by a wearable electronic device according to various embodiments of the present disclosure.
- FIG. 7 is a flowchart illustrating operations performed by a wearable electronic device according to various embodiments of the present disclosure.
- FIG. 8 is a flowchart illustrating operations performed by a wearable electronic device according to various embodiments of the present disclosure.
- FIG. 9 is a flowchart illustrating operations performed by a wearable electronic device according to various embodiments of the present disclosure.
- FIG. 10 is a flowchart illustrating operations performed by a wearable electronic device according to various embodiments of the present disclosure.
- FIG. 11 is a flowchart illustrating operations performed by a wearable electronic device according to various embodiments of the present disclosure.
- FIG. 12 is a flowchart illustrating operations performed by a wearable electronic device according to various embodiments of the present disclosure.
- FIG. 13 is a flowchart illustrating operations performed by a wearable electronic device according to various embodiments of the present disclosure.
- FIG. 14 is a flowchart illustrating operations performed by a wearable electronic device according to various embodiments of the present disclosure.
- These blocks, which may be referred to herein as units, modules, or devices, may be physically implemented by analog or digital circuits such as logic gates, integrated circuits, microprocessors, microcontrollers, memory circuits, passive and active electronic components, optical components, and wiring circuits, and may be driven by firmware and software.
- the circuit may be implemented, for example, in one or more semiconductor chips, or on a substrate support such as a printed circuit board or the like.
- Circuitry included in a block may be implemented by dedicated hardware or by a processor (e.g., one or more programmed microprocessors and associated circuits), or by a combination of dedicated hardware that performs some functions of the block and a processor that performs other functions. Each block of an embodiment may be physically separated into two or more interacting individual blocks; likewise, blocks of an embodiment may be physically combined into more complex blocks.
- The wearable electronic device 100 may include a frame 105, a first support part 101, a second support part 102, a first hinge part 103 connecting the frame 105 and the first support part 101, and a second hinge part 104 connecting the frame 105 and the second support part 102.
- The frame 105 may include at least one camera, for example, first cameras 111-1 and 111-2, second cameras 112-1 and 112-2, and a third camera 113.
- The wearable electronic device 100 may include one or more first cameras 111-1 and 111-2, one or more second cameras 112-1 and 112-2, and one or more third cameras 113.
- An image acquired through the one or more first cameras 111-1 and 111-2 may be used for detecting a user's hand gesture, tracking the user's head, and recognizing the surrounding space.
- the one or more first cameras 111-1 and 111-2 may be global shutter (GS) cameras.
- the one or more first cameras 111-1 and 111-2 may perform a simultaneous localization and mapping (SLAM) operation through depth imaging.
- the one or more first cameras 111-1, 111-2 may perform spatial recognition for 6 degrees of freedom (DoF).
- an image acquired through one or more second cameras 112-1 and 112-2 may be used to detect and track a user's pupil.
- the one or more second cameras 112-1 and 112-2 may be GS cameras.
- the one or more second cameras 112-1 and 112-2 may correspond to the left eye and the right eye, respectively, and the performance of the one or more second cameras 112-1 and 112-2 may be the same.
- the one or more third cameras 113 may be high-resolution cameras. According to various embodiments, the one or more third cameras 113 may perform an auto-focusing (AF) function and a shake correction function. According to various embodiments, the one or more third cameras 113 may be a GS camera or a rolling shutter (RS) camera.
- the wearable electronic device 100 may include one or more light emitting devices 114 - 1 and 114 - 2 .
- The light emitting devices 114-1 and 114-2 may be different from a light source, described later, that emits light toward the screen output area of the display.
- The light emitting devices 114-1 and 114-2 may emit light to facilitate pupil detection when the user's pupils are detected and tracked through the one or more second cameras 112-1 and 112-2.
- each of the light emitting devices 114 - 1 and 114 - 2 may include an LED.
- The light emitting devices 114-1 and 114-2 may emit light in the infrared region. According to various embodiments, the light emitting devices 114-1 and 114-2 may be attached around the frame 105 of the wearable electronic device 100. According to various embodiments, the light emitting devices 114-1 and 114-2 may be positioned around the one or more first cameras 111-1 and 111-2 and, when the wearable electronic device 100 is used in a dark environment, may aid gesture detection, head tracking, and/or spatial recognition by the one or more first cameras 111-1 and 111-2.
- The light emitting devices 114-1 and 114-2 may be positioned around the one or more third cameras 113 and, when the wearable electronic device 100 is used in a dark environment, may assist image acquisition by the one or more third cameras 113.
- the wearable electronic device 100 includes a first display 151 , a second display 152 , one or more input optical members 153-1 and 153-2, and one It may include one or more transparent members 190-1 and 190-2, and one or more screen display parts 154-1 and 154-2.
- The first display 151 and the second display 152 may include, for example, a liquid crystal display (LCD), a digital mirror device (DMD), a liquid crystal on silicon (LCoS) device, an organic light emitting diode (OLED), or a micro light emitting diode (micro LED).
- The wearable electronic device 100 may include a light source that emits light toward the screen output area of the display.
- The wearable electronic device 100 may provide a virtual image of good quality to the user even when a separate light source is not included.
- the one or more transparent members 190-1 and 190-2 may be disposed to face the user's eyes when the user wears the wearable electronic device 100 .
- the one or more transparent members 190-1 and 190-2 may include at least one of a glass plate, a plastic plate, and a polymer.
- the one or more input optical members 153 - 1 and 153 - 2 may guide the light generated by the first display 151 and the second display 152 to the user's eyes.
- An image may be formed on the one or more screen display portions 154-1 and 154-2 of the one or more transparent members 190-1 and 190-2 based on the light generated by the first display 151 and the second display 152, and the user can see the image formed on the one or more screen display portions 154-1 and 154-2.
- the wearable electronic device 100 may include one or more optical waveguides.
- the optical waveguide may transmit the light generated by the first display 151 and the second display 152 to the user's eyes.
- The wearable electronic device 100 may include one optical waveguide for each of the left eye and the right eye.
- the optical waveguide may include at least one of glass, plastic, or polymer.
- the optical waveguide may include a nano-pattern, for example, a polygonal or curved grating structure formed on one surface inside or outside.
- the optical waveguide may include a free-form type prism, and in this case, the optical waveguide may provide incident light to a user through a reflection mirror.
- The optical waveguide may include at least one diffractive element (e.g., a diffractive optical element (DOE) or a holographic optical element (HOE)) or at least one reflective element (e.g., a reflective mirror), and the display light emitted from the light source may be guided to the user's eyes using the at least one diffractive element or reflective element included in the optical waveguide.
- the diffractive element may include an input/output optical member.
- the reflective element may include a member causing total reflection.
- The wearable electronic device 100 may include one or more voice input devices 162-1, 162-2, and 162-3, which may receive the user's voice or sounds generated in the vicinity of the wearable electronic device 100.
- The one or more voice input devices 162-1, 162-2, and 162-3 may receive sounds generated in the vicinity and transmit them to a processor (e.g., the processor 320 of FIG. 3) so that the wearable electronic device 100 can provide a speech-to-text (STT) function.
- The one or more support parts (e.g., the first support part 101 and the second support part 102) may include at least one printed circuit board (PCB) (e.g., a first PCB 170-1 and a second PCB 170-2), one or more audio output devices (e.g., an audio output device 163-1 and an audio output device 163-2), and one or more batteries (e.g., a battery 135-1 and a battery 135-2).
- The first PCB 170-1 and the second PCB 170-2 may transmit electrical signals to components of the wearable electronic device, such as a first camera 211, a second camera 212, a third camera 213, and a display module 250, which will be described later with reference to FIG. 2.
- each of the first PCB 170 - 1 and the second PCB 170 - 2 may be a flexible printed circuit board (FPCB).
- Each of the first PCB 170-1 and the second PCB 170-2 may include a first substrate, a second substrate, and an interposer disposed between the first substrate and the second substrate.
- the wearable electronic device 100 may include batteries 135 - 1 and 135 - 2 . The batteries 135 - 1 and 135 - 2 may store power for operating the remaining components of the wearable electronic device 100 .
- one or more audio output devices 163 - 1 and 163 - 2 may output audio data to a user. For example, feedback on a user's command (or input) may be provided, or information on a virtual object may be provided to the user through audio data.
- the wearable electronic device 100 may include one or more hinge parts (eg, the first hinge part 103 and the second hinge part 104 ).
- The first hinge part 103 may couple the first support part 101 to the frame 105 so that the first support part 101 is rotatable relative to the frame 105, and the second hinge part 104 may couple the second support part 102 to the frame 105 so that the second support part 102 is rotatable relative to the frame 105.
- The wearable electronic device 200 (e.g., the wearable electronic device 100 of FIG. 1) may include a display 221, an input optical member 222, a display optical waveguide 223, an output optical member 224, an eye tracking camera 210, a first splitter 241, an eye-tracking optical waveguide 242, and a second splitter 243.
- the display 221 may correspond to the first display 151 or the second display 152 illustrated in FIG. 1 .
- The light output from the display 221 may pass through the input optical member 222, which may correspond to the input optical members 153-1 and 153-2 of FIG. 1, be incident on the display optical waveguide 223, pass through the display optical waveguide 223, and be output through the output optical member 224.
- the light output from the output optical member 224 can be seen by the user's eye 230 .
- The expression "displaying an object on the display" may mean that the light output from the display 221 is output through the output optical member 224, and that the light output through the output optical member 224 causes the shape of the object to be seen by the user's eye 230.
- The expression "controlling the display to display the object" may mean that the display 221 is controlled so that the light output from the display 221 is output through the output optical member 224, and that the light output through the output optical member 224 causes the shape of the object to be seen by the user's eye 230.
- the light 235 reflected from the user's eye 230 passes through the first splitter 241 and is incident on the eye-tracking optical waveguide 242 , passes through the eye-tracking optical waveguide 242 and passes through the second splitter 243 . It may be output to the eye tracking camera 210 .
- The light 235 reflected from the user's eye 230 may correspond to light that was output from the light emitting devices 114-1 and 114-2 of FIG. 1 and then reflected from the user's eye 230.
- the eye tracking camera 210 may correspond to one or more second cameras 112-1 and 112-2 illustrated in FIG. 1 .
- The wearable electronic device 300 may include a first camera 311, a second camera 312, a third camera 313, a processor 320, a power management integrated circuit (PMIC) 330, a battery 335, a memory 340, a display 350, an audio interface 361, a voice input device 362, a voice output device 363, a communication circuit 370, and a sensor 380.
- The descriptions of the one or more first cameras 111-1 and 111-2, the one or more second cameras 112-1 and 112-2, and the one or more third cameras 113 given above with reference to FIG. 1 may equally apply to the first camera 311, the second camera 312, and the third camera 313, respectively.
- The wearable electronic device 300 may include a plurality of any of the first camera 311, the second camera 312, and the third camera 313.
- The processor 320 may control other components of the wearable electronic device 300, for example, the first camera 311, the second camera 312, the third camera 313, the PMIC 330, the memory 340, the display module 350, the audio interface 361, the communication circuit 370, and the sensor 380, and may perform various data processing or operations.
- The PMIC 330 may convert the power stored in the battery 335 to the current or voltage required by the other components of the wearable electronic device 300 and supply it to those components.
- the memory 340 may store various data used by at least one component (eg, the processor 320 or the sensor 380 ) of the wearable electronic device 300 .
- the display 350 may display a screen to be provided to the user.
- the display 350 includes the first display 151 , the second display 152 , one or more input optical members 153-1 and 153-2, and one or more transparent elements described above with reference to FIG. 1 . It may include members 190-1 and 190-2, and one or more screen display portions 154-1 and 154-2.
- The audio interface 361 may be connected to the voice input device 362 and the voice output device 363, and may convert data input through the voice input device 362 and convert data to be output through the voice output device 363.
- the voice input device 362 may include a microphone
- the voice output device 363 may include a speaker and an amplifier.
- the communication circuit 370 may support establishment of a wireless communication channel with an electronic device external to the wearable electronic device 300 and performing communication through the established communication channel.
- the sensor 380 may include a 6-axis sensor 381 , a magnetic sensor 382 , a proximity sensor 383 , and an optical sensor 384 .
- the sensor 380 may include a sensor for acquiring a biosignal for detecting whether the wearable electronic device 300 is being worn by a user.
- the sensor 380 may include at least one of a heart rate sensor, a skin sensor, and a temperature sensor.
- When the user activates the STT function, the processor 320 may generate text- and/or image-based data to be displayed on the display 350 based on the data received from the audio interface 361.
- the processor 320 may check the movement of the user wearing the wearable electronic device 300 through the 6-axis sensor 381 .
- The 6-axis sensor 381 may generate a sensor value by detecting a change in the direction the user faces (e.g., the direction the user views through the wearable electronic device 300), and may transmit the generated sensor value, or the amount of change in the sensor value, to the processor 320.
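As an illustrative, hypothetical sketch of how a processor might use such sensor values, the change between consecutive 6-axis samples can be summed per axis and compared against a threshold (the function names and the threshold are assumptions, not part of the disclosure):

```python
# Hypothetical sketch: detecting a change in the user's facing direction
# from consecutive 6-axis sensor samples (3 accel + 3 gyro values).

def sensor_delta(prev, curr):
    """Sum of absolute per-axis differences between two 6-axis samples."""
    return sum(abs(c - p) for p, c in zip(prev, curr))

def user_moved(prev, curr, threshold=1.0):
    """True when the change between samples suggests the user turned their head."""
    return sensor_delta(prev, curr) >= threshold
```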
- When the user activates the STT function, the audio interface 361 may receive sound generated in the vicinity of the wearable electronic device 300 (or the user) through the voice input device 362 and transmit the converted sound data to the processor 320.
- The communication circuit 370 may transmit data to and receive data from an external electronic device (e.g., a wearable electronic device such as an earphone, or an external electronic device such as a terminal).
- the wearable electronic device 300 may receive audio data received by the external wearable electronic device through the communication circuit 370 and transmit the received audio data to the processor 320 .
- the wearable electronic device 300 may output image data based on data received from the external electronic device through the communication circuit 370 through the display 350 .
- the external wearable electronic device 400 may be at least one of a wearable electronic device in the form of an earphone, a wearable electronic device in the form of a watch, and a wearable electronic device in the form of a necklace.
- the external wearable electronic device 400 may have a plurality of physically separated housings.
- The external wearable electronic device 400 may include a first housing to be worn on the left ear and a second housing to be worn on the right ear; in an embodiment, each of the components shown in FIG. 4 may be included in one or more of the plurality of housings.
- The external wearable electronic device 400 may include a processor 410, a memory 420, a communication circuit 430, an audio interface 440, a sensor 450, and a battery 460.
- The processor 410 may receive data from other components of the external wearable electronic device 400, for example, the memory 420, the communication circuit 430, the audio interface 440, the sensor 450, and the battery 460; perform operations based on the received data; and transmit control signals to the other components. According to various embodiments, the processor 410 may operate based on instructions stored in the memory 420.
- The memory 420 may store instructions that cause other components of the external wearable electronic device 400, for example, the processor 410, the communication circuit 430, the audio interface 440, the sensor 450, and the battery 460, to perform specified operations. According to various embodiments, the memory 420 may store audio data acquired through the audio interface 440.
- the communication circuit 430 may perform wireless communication with another electronic device (eg, the wearable electronic device 300 ). According to various embodiments, the communication circuit 430 may transmit information obtained from the external wearable electronic device 400 to the wearable electronic device 300 .
- the type of communication supported by the communication circuit 430 is not limited.
- the audio interface 440 may include a plurality of microphones and one or more speakers.
- the plurality of microphones may include a microphone directed toward the user's inner ear when the user wears the external wearable electronic device 400 and a microphone directed away from the user when the user wears the external wearable electronic device 400 .
- the audio interface 440 may obtain audio data through a plurality of microphones, respectively, and perform noise cancellation based on the audio data obtained through the plurality of microphones.
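As a purely illustrative sketch (not part of the disclosed embodiments), noise cancellation based on audio data obtained through a plurality of microphones could be modeled as subtracting the outward-facing microphone's signal, used as an ambient-noise reference, from the inward-facing microphone's signal; the helper names and the single-coefficient least-squares scaling below are assumptions:

```python
# Illustrative sketch only (not the disclosed implementation): a
# single-coefficient least-squares noise reduction in which the
# outward-facing microphone's signal is used as an ambient-noise
# reference and subtracted from the inward-facing microphone's signal.
# All names (inner, outer, reduce_noise) are hypothetical.

def reduce_noise(inner, outer):
    """Subtract the best scalar multiple of the reference (outer)
    from the inner-microphone signal."""
    denom = sum(o * o for o in outer)
    scale = sum(i * o for i, o in zip(inner, outer)) / denom if denom else 0.0
    return [i - scale * o for i, o in zip(inner, outer)]

# Example: the inner mic hears speech plus noise; the outer mic hears
# only the noise, so subtraction recovers the speech.
noise = [1.0, -1.0, 1.0, -1.0]
speech = [0.5, 0.5, 0.5, 0.5]
inner = [s + n for s, n in zip(speech, noise)]
cleaned = reduce_noise(inner, noise)
```

The scale factor is the least-squares optimal single coefficient, so the reference is neither over- nor under-subtracted when the two microphones capture the noise at different levels.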
- the sensor 450 may include a biosensor for detecting whether the user wears the external wearable electronic device 400 .
- the biosensor may include at least one of a heart rate sensor, a skin sensor, and a temperature sensor.
- the sensor 450 may include a geomagnetic sensor.
- the external wearable electronic device 400 may receive a data transmission request from the wearable electronic device 300 through the communication circuit 430 .
- the external wearable electronic device 400 may receive a request to transmit audio data received through the audio interface 440 .
- when a specified condition (eg, a specified time or a specified motion detection) is satisfied, the external wearable electronic device 400 may be requested to transmit information about audio data received through the audio interface 440 .
- the external wearable electronic device is an earphone-type wearable electronic device, and may include an external wearable electronic device (L) 530 to be worn on the left ear and an external wearable electronic device (R) 520 to be worn on the right ear.
- communication between the wearable electronic device 510 , the external wearable electronic device (R) 520 , and the external wearable electronic device (L) 530 is illustrated.
- one of the external wearable electronic device (R) 520 and the external wearable electronic device (L) 530 may operate as a master, and the other may operate as a slave.
- 5 illustrates an example in which the external wearable electronic device (R) 520 operates as a master and the external wearable electronic device (L) 530 operates as a slave.
- the wearable electronic device 510 and the external wearable electronic device (R) 520 may be connected to each other through a Bluetooth communication protocol, and the external wearable electronic device (R) 520 and the external wearable electronic device (L) 530 may be connected to each other through the Bluetooth communication protocol.
- after the external wearable electronic device (R) 520 establishes communication with the wearable electronic device 510 , the external wearable electronic device (L) 530 may receive, from the external wearable electronic device (R) 520 , information on the communication link between the wearable electronic device 510 and the external wearable electronic device (R) 520 .
- the information on the communication link between the wearable electronic device 510 and the external wearable electronic device (R) 520 may include address information, clock information, channel information, session description protocol (SDP) result information, information on supported functions, key information, or an extended inquiry response (EIR) packet.
- based on the received information on the communication link, the external wearable electronic device (L) 530 may monitor the communication channel between the wearable electronic device 510 and the external wearable electronic device (R) 520 .
- the external wearable electronic device (L) 530 may receive data transmitted and received between the wearable electronic device 510 and the external wearable electronic device (R) 520 through the communication channel between them.
- the external wearable electronic device (L) 530 may transmit data to the wearable electronic device 510 through a communication channel between the wearable electronic device 510 and the external wearable electronic device (R) 520 .
- the wearable electronic device 510 may request, from the external wearable electronic device (R) 520 , state information of the external wearable electronic device (R) 520 and the external wearable electronic device (L) 530 . Details of examples of state information will be described later with reference to FIG. 6 .
- the wearable electronic device 510 may receive state information from the external wearable electronic device (R) 520 .
- the state information obtained from the external wearable electronic device (L) 530 may be transmitted to the wearable electronic device 510 during the retransmission period between the external wearable electronic device (R) 520 and the wearable electronic device 510 , indicated by W1 and W2 in FIG. 5 .
- the wearable electronic device may receive an activation request for an STT function that receives audio data generated in the surroundings and provides corresponding text and/or an image.
- the wearable electronic device 300 may automatically activate the STT function.
- the wearable electronic device 300 may determine whether the user is outputting audio data through another wearable electronic device (eg, an earphone), and may activate the STT function based on the determination result.
- the processor (eg, the processor 320 ) of the wearable electronic device (eg, the wearable electronic device 300 ) may acquire audio data.
- the processor 320 may acquire audio data through the audio interface 361 of the wearable electronic device 300 .
- in operation 620 , the processor (eg, the processor 320 ) of the wearable electronic device may determine whether the audio data obtained in operation 610 satisfies a predetermined condition.
- the predetermined condition may be satisfied when the audio data indicates a situation in which the user of the wearable electronic device 300 may want to receive the STT function.
- the predetermined condition may include at least one of: the audio data including a voice related to a language, the audio data including a voice related to a preset word, or the audio data including a voice of a preset size or larger.
- 'preset' may mean 'predetermined', for example.
- the processor (eg, the processor 320 ) of the wearable electronic device may determine whether the acquired audio data satisfies a predetermined condition. In one example, if it is determined in operation 620 that the acquired audio data does not satisfy the predetermined condition, the processor (eg, the processor 320) of the wearable electronic device (eg, the wearable electronic device 300) may repeat operation 610 until audio data satisfying a predetermined condition is obtained.
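To make the repeat-until-satisfied logic of operations 610 and 620 concrete, the following hypothetical sketch checks two of the enumerated conditions (a voice of a preset size or larger, and a preset word); the threshold values and names are illustrative assumptions, not values from the disclosure:

```python
# Hypothetical sketch of the "predetermined condition" of operation 620:
# audio is considered STT-worthy if its level exceeds a preset threshold
# or its (already transcribed) text contains a preset word. The names
# and thresholds are illustrative, not taken from the disclosure.

PRESET_WORDS = {"hello", "help"}      # assumed trigger vocabulary
PRESET_LEVEL = 0.2                    # assumed RMS loudness threshold

def rms(samples):
    """Root-mean-square level of a list of audio samples."""
    return (sum(s * s for s in samples) / len(samples)) ** 0.5

def satisfies_condition(samples, transcript=""):
    loud_enough = rms(samples) >= PRESET_LEVEL
    has_keyword = any(w in transcript.lower().split() for w in PRESET_WORDS)
    return loud_enough or has_keyword
```

An implementation could loop, re-acquiring audio (operation 610) until `satisfies_condition` returns true before proceeding to operation 630.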
- in operation 630 , the processor (eg, the processor 320 ) of the wearable electronic device (eg, the wearable electronic device 300 ) may receive state information from an external wearable electronic device (eg, the external wearable electronic device 400 ) through the communication circuit 370 .
- the state information may be information based on a signal obtained from the external wearable electronic device 400 .
- the state information may indicate whether the external wearable electronic device 400 is being worn by the user of the external wearable electronic device 400 .
- the state information indicating whether the external wearable electronic device 400 is being worn by the user may be a biosignal obtained from the sensor 450 of the external wearable electronic device 400 , or may be information output by the processor 410 of the external wearable electronic device 400 , based on the biosignal obtained from the sensor 450 , indicating a determination result of whether the external wearable electronic device 400 is being worn by the user.
- the state information may indicate whether a voice is being output from the external wearable electronic device 400 .
- the state information may indicate the volume of a voice being output from the external wearable electronic device 400 .
- the state information may indicate which function of the noise canceling function and the ambient sound listening function is designated as having a higher priority.
- the state information may indicate whether an ambient sound listening function is being activated in the external wearable electronic device 400 by a user input to the external wearable electronic device 400 .
- the state information may include audio data acquired through the audio interface 440 of the external wearable electronic device 400 .
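The state information fields enumerated above can be summarized, purely for illustration, as one hypothetical record; the field names below are assumptions:

```python
# A minimal sketch of the state information enumerated above, as one
# hypothetical record the external device might report. Field names
# are assumptions, not identifiers from the disclosure.
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class StateInfo:
    worn: bool                              # device worn by the user?
    voice_playing: bool                     # is a voice being output?
    output_volume: Optional[float] = None   # volume of the output voice
    noise_canceling_priority: bool = True   # NC prioritized over ambient sound?
    ambient_sound_user_activated: bool = False  # ambient mode forced by user input
    audio_data: List[float] = field(default_factory=list)  # mic samples

state = StateInfo(worn=True, voice_playing=True, output_volume=0.7)
```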
- the processor (eg, the processor 320 ) of the wearable electronic device may control a display (eg, the display 350 ) to display visual information corresponding to the audio data based at least in part on the state information.
- the visual information may include at least one of text or an image.
- the fact that the processor 320 controls the display 350 to display visual information corresponding to the audio data based at least in part on the state information may mean that the processor 320 determines whether to provide the STT function based at least in part on the state information.
- when the state information indicates whether the external wearable electronic device 400 is being worn by the user of the external wearable electronic device 400 , the indication that the external wearable electronic device 400 is being worn may be included in at least one condition for the processor 320 to control the display 350 to display visual information corresponding to the audio data.
- at least one condition for the processor 320 to control the display to display visual information corresponding to the audio data may be referred to as “at least one visual information display condition”.
- the processor 320 may further use any of various conditions to be described later as conditions for providing the STT function.
- when the at least one visual information display condition is not satisfied, the processor 320 may not control the display 350 to display visual information corresponding to the audio data.
- when the state information is a biosignal obtained from the sensor 450 of the external wearable electronic device 400 , the processor 320 may obtain a biosignal through the sensor 380 of the wearable electronic device 300 , and by comparing the biosignal received from the external wearable electronic device 400 with the biosignal obtained through the sensor 380 of the wearable electronic device 300 , may check whether the user wearing the external wearable electronic device 400 and the user wearing the wearable electronic device 300 are the same person.
- the at least one visual information display condition may include a condition in which the state information indicates that the external wearable electronic device 400 is being worn, and a condition in which the user wearing the external wearable electronic device 400 and the user wearing the wearable electronic device 300 are the same person.
- otherwise, the processor 320 may not control the display 350 to display visual information corresponding to the audio data.
- the at least one visual information display condition may include a condition in which the state information indicates that a voice is being output from the external wearable electronic device 400 . According to various embodiments, when the state information indicates the volume of the voice being output from the external wearable electronic device 400 , the at least one visual information display condition may include a condition in which the volume of the voice being output from the external wearable electronic device 400 , indicated by the state information, is equal to or higher than a preset level.
- the at least one visual information display condition may include a condition in which the priority of the noise canceling function indicated by the state information is higher than the priority of the ambient sound listening function.
- the at least one visual information display condition may include a condition in which the ambient sound listening function is not activated by a user input to the external wearable electronic device 400 . In other words, when the user activates the ambient sound listening function through a direct input to the external wearable electronic device 400 , the wearable electronic device 300 may not provide the STT function.
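Taken together, the visual information display conditions described above amount to a conjunction of checks. The following sketch is a hypothetical combination (parameter names and the volume threshold are assumed, and an implementation may use any subset of these conditions):

```python
# Sketch combining the display conditions described above into one check.
# All parameter names and the threshold are illustrative assumptions.

PRESET_VOLUME = 0.3  # assumed minimum output volume for providing STT

def should_display_stt(worn, same_user, voice_playing, output_volume,
                       noise_canceling_priority, ambient_sound_user_activated):
    """Return True only when every visual-information display condition holds."""
    return (worn                          # device is being worn
            and same_user                 # worn by the same user as the glasses
            and voice_playing             # a voice is being output
            and output_volume >= PRESET_VOLUME
            and noise_canceling_priority  # NC ranked above ambient listening
            and not ambient_sound_user_activated)  # user did not force ambient mode
```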
- the fact that the processor 320 controls the display 350 to display visual information corresponding to the audio data based at least in part on the state information may mean that, when the processor 320 provides the STT function, the state information is considered in determining the visual information to be provided.
- when the state information includes audio data acquired through the audio interface 440 of the external wearable electronic device 400 , the processor 320 may obtain third audio data by processing the audio data obtained in operation 610 based on the audio data obtained through the audio interface 440 , and may control the display 350 to display visual information corresponding to the third audio data.
- the processing for obtaining the third audio data may be a noise canceling process for removing ambient noise other than voice dialogue.
- the visual characteristics of the visual information displayed on the display 350 may be adjusted according to the volume of the voice corresponding to the audio data obtained in operation 610 and/or the audio data obtained through the audio interface 440 of the external wearable electronic device 400 as at least a part of the state information.
- when the visual information is text, at least one of a font, a size, and a color of the text may be adjusted according to the volume of a voice corresponding to the audio data.
- when the visual information is an image, at least one of a size and a color of the image may be adjusted according to the volume of a voice corresponding to the audio data.
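For illustration, the adjustment of visual characteristics according to voice volume could be a simple mapping such as the following; the breakpoints and style values are assumptions:

```python
# Illustrative mapping (names and breakpoints assumed) from the volume of
# the captured voice to the visual characteristics of the displayed text.

def text_style_for_volume(volume):
    """Pick font size and color for STT text from a 0.0-1.0 voice volume."""
    if volume >= 0.8:
        return {"size": 32, "color": "red"}     # shouting: large, alerting
    if volume >= 0.4:
        return {"size": 24, "color": "white"}   # normal speech
    return {"size": 16, "color": "gray"}        # quiet speech: subdued
```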
- the wearable electronic device may communicate with an external wearable electronic device (eg, the external wearable electronic device 400 ).
- the wearable electronic device 300 may establish a communication connection (eg, a Bluetooth communication connection) with the external wearable electronic device (eg, the external wearable electronic device 400 ) through a communication circuit (eg, the communication circuit 370 ).
- in operation 710 , the processor (eg, the processor 320 ) of the wearable electronic device may receive state information from the external wearable electronic device (eg, the external wearable electronic device 400 ).
- the state information may indicate whether the external wearable electronic device 400 is being worn by the user and whether a voice is being output from the external wearable electronic device 400 .
- in operation 720 , the processor (eg, the processor 320 ) of the wearable electronic device may check whether the external wearable electronic device 400 is being worn by the user based on the state information. In one example, when it is confirmed that the external wearable electronic device 400 is not being worn by the user, the method may end.
- in operation 730 , the processor (eg, the processor 320 ) of the wearable electronic device (eg, the wearable electronic device 300 ) may check whether a voice is being output from the external wearable electronic device 400 based on the state information. In one example, when it is determined that the external wearable electronic device 400 is not outputting a voice, the method may end.
- in operation 740 , the processor (eg, the processor 320 ) of the wearable electronic device (eg, the wearable electronic device 300 ) may control the display 350 to display visual information corresponding to audio data acquired through the audio interface 361 of the wearable electronic device 300 .
- the visual information may include at least one of text or an image.
- operation 730 may be performed before operation 720 .
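The flow of operations 710 to 740 described above can be sketched as follows; the helper names are hypothetical, and per the preceding paragraph the two checks may also run in the opposite order:

```python
# A compact sketch of the FIG. 7 flow (operations 710-740) under assumed
# helper names: receive state, stop unless the external device is worn
# and outputting voice, otherwise show the captured audio as text.

def fig7_flow(state, capture_audio, show):
    """state: dict with 'worn'/'voice_playing'; capture_audio() -> str;
    show(text) displays it. Returns True if visual info was displayed."""
    if not state.get("worn"):            # operation 720: not worn -> end
        return False
    if not state.get("voice_playing"):   # operation 730: no voice -> end
        return False
    show(capture_audio())                # operation 740: display STT text
    return True

shown = []
ok = fig7_flow({"worn": True, "voice_playing": True},
               lambda: "transcribed speech", shown.append)
```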
- the processor (eg, the processor 320 ) of the wearable electronic device (eg, the wearable electronic device 300 ) may perform operations 610 and 620 shown in FIG. 6 before performing operation 710 , and may perform operation 710 in response to it being confirmed in operation 620 that the audio data satisfies a predetermined condition.
- the condition for providing the STT service (eg, the at least one visual information display condition) may include a condition related to audio data.
- in operation 810 , the processor (eg, the processor 320 ) of the wearable electronic device (eg, the wearable electronic device 300 ) may receive state information transmitted from the external wearable electronic device (eg, the external wearable electronic device 400 ).
- in operation 820 , the processor (eg, the processor 320 ) of the wearable electronic device (eg, the wearable electronic device 300 ) may check whether the external wearable electronic device 400 is being worn by the user based on the state information.
- in operation 830 , the processor (eg, the processor 320 ) of the wearable electronic device (eg, the wearable electronic device 300 ) may check whether a voice is being output from the external wearable electronic device 400 based on the state information.
- the details of operation 710, operation 720, and operation 730 described above with reference to FIG. 7 may be equally applied to operation 810, operation 820, and operation 830.
- in operation 840 , the processor (eg, the processor 320 ) of the wearable electronic device (eg, the wearable electronic device 300 ) may control the display 350 to display a visual indicator indicating that an STT service is available.
- the visual indicator may be a virtual object including text and/or an image indicating that there is information to be provided to the user based on the voice input device 362 included in the wearable electronic device 300 and/or the audio interface 440 included in the external wearable electronic device 400 .
- in operation 850 , the processor (eg, the processor 320 ) of the wearable electronic device may determine whether a reaction condition regarding the user of the wearable electronic device 300 is satisfied.
- the reaction condition may be a condition related to the user's reaction of the wearable electronic device 300 to the visual indicator displayed in operation 840 .
- the processor 320 may check the user's gaze through the second camera 312 , and may determine that the reaction condition is satisfied when it is confirmed that the user's gaze remains on the visual indicator for more than a preset first time.
- the processor 320 may analyze the user's utterance and, when a preset utterance for receiving the STT service is detected, may confirm that the reaction condition is satisfied.
- the processor 320 may store data related to the user's voice in the memory 340 and determine whether a preset utterance is the user's utterance based on the stored user's voice data.
- the processor 320 may detect a gesture through the first camera 311 , and when a preset gesture for receiving the STT service is detected, may confirm that the reaction condition is satisfied.
- the processor 320 may determine that the reaction condition is satisfied when any combination of examples of the above-described reaction conditions is satisfied.
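For illustration, the reaction condition examples above (gaze dwell, a preset utterance, a preset gesture, or any combination) could be combined as a disjunction; the thresholds and trigger sets below are assumptions:

```python
# Sketch of the reaction-condition check of operation 850: any of gaze
# dwell, a preset utterance, or a preset gesture counts as a reaction.
# Names, the 2-second dwell threshold, and the trigger sets are
# assumptions for illustration.

PRESET_FIRST_TIME = 2.0          # seconds of gaze dwell on the indicator
PRESET_UTTERANCES = {"read it"}  # assumed activation phrases
PRESET_GESTURES = {"nod"}        # assumed activation gestures

def reaction_satisfied(gaze_dwell_s=0.0, utterance="", gesture=""):
    return (gaze_dwell_s >= PRESET_FIRST_TIME
            or utterance.lower() in PRESET_UTTERANCES
            or gesture in PRESET_GESTURES)
```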
- the processor (eg, the processor 320 ) of the wearable electronic device (eg, the wearable electronic device 300 ) may repeat operation 850 until the reaction condition is satisfied.
- when the reaction condition is satisfied, in operation 860 , the processor 320 may control the display 350 to display visual information corresponding to the audio data acquired through the audio interface 361 .
- the visual information may include at least one of text or an image.
- the processor 320 may store the audio data collected before the reaction condition is satisfied in the memory 340 , and after confirming that the reaction condition is satisfied, may control the display 350 in operation 860 to display not only visual information corresponding to the audio data collected after the reaction condition is satisfied but also visual information corresponding to the audio data collected before the reaction condition is satisfied.
- the processor (eg, the processor 320 ) of the wearable electronic device (eg, the wearable electronic device 300 ) may perform operations 610 and 620 illustrated in FIG. 6 before performing operation 810 , and may perform operation 810 in response to it being confirmed in operation 620 that the audio data satisfies a predetermined condition.
- FIG. 9 is a flowchart illustrating operations performed by a wearable electronic device according to various embodiments of the present disclosure.
- in operation 910 , the processor (eg, the processor 320 ) of the wearable electronic device (eg, the wearable electronic device 300 ) may receive state information from the external wearable electronic device (eg, the external wearable electronic device 400 ).
- the state information may indicate which function of the noise canceling function and the ambient sound listening function is designated as having a higher priority in the external wearable electronic device 400 .
- in operation 920 , the processor (eg, the processor 320 ) of the wearable electronic device may check, based on the state information, whether the priority of the noise canceling function of the external wearable electronic device 400 is higher than the priority of the ambient sound listening function. For example, the fact that the priority of the ambient sound listening function of the external wearable electronic device 400 is higher than the priority of the noise canceling function may mean that the user wants to listen to the surrounding sound through the external wearable electronic device 400 rather than being provided with the STT-based function through the wearable electronic device 300 .
- the external wearable electronic device 400 may sense that the user activates the ambient sound listening function through a user input such as a touch, a tap, or a long press through the sensor 450 , a designated motion detection, or a detection of the user's utterance.
- the processor (eg, the processor 320 ) of the wearable electronic device (eg, the wearable electronic device 300 ) may, in operation 930 , control the display 350 to display visual information corresponding to audio data acquired through the audio interface 361 of the wearable electronic device 300 .
- the visual information may include at least one of text or an image.
- when it is determined in operation 920 that the priority of the ambient sound listening function of the external wearable electronic device 400 is higher than the priority of the noise canceling function, the processor (eg, the processor 320 ) of the wearable electronic device (eg, the wearable electronic device 300 ) may end the method.
- when it is determined in operation 920 that the priority of the noise canceling function of the external wearable electronic device 400 is higher than the priority of the ambient sound listening function, the processor (eg, the processor 320 ) of the wearable electronic device (eg, the wearable electronic device 300 ) may control the display 350 to display visual information corresponding to the audio data acquired through the audio interface 361 .
- the processor (eg, the processor 320 ) of the wearable electronic device (eg, the wearable electronic device 300 ) may perform operations 610 and 620 illustrated in FIG. 6 before performing operation 910 , and may perform operation 910 in response to it being confirmed in operation 620 that the audio data satisfies a predetermined condition.
- in operation 1010 , the processor (eg, the processor 320 ) of the wearable electronic device (eg, the wearable electronic device 300 ) may acquire first audio data.
- the processor 320 may acquire first audio data through the audio interface 361 of the wearable electronic device 300 .
- in operation 1020 , the processor (eg, the processor 320 ) of the wearable electronic device (eg, the wearable electronic device 300 ) may check whether the first audio data obtained in operation 1010 satisfies a predetermined condition. Details of the predetermined condition may be the same as described above with reference to operation 620 of FIG. 6 .
- the processor (eg, the processor 320 ) of the wearable electronic device (eg, the wearable electronic device 300 ) may repeat operations 1010 and 1020 until first audio data satisfying the predetermined condition is obtained.
- in operation 1030 , state information including second audio data obtained through the audio interface 440 of the external wearable electronic device 400 may be obtained from the external wearable electronic device (eg, the external wearable electronic device 400 ).
- the wearable electronic device 300 may transmit data including information about the first audio data (eg, reception time information and/or sampling data) to the external wearable electronic device 400 , and may receive, from the external wearable electronic device 400 , state information including second audio data based on the information about the first audio data.
- the wearable electronic device 300 may establish a communication connection with the external wearable electronic device 400 through the communication circuit 370 before determining whether the first audio data satisfies the predetermined condition, and may obtain state information including second audio data from the external wearable electronic device 400 .
- in operation 1040 , the processor (eg, the processor 320 ) of the wearable electronic device (eg, the wearable electronic device 300 ) may obtain third audio data by processing the first audio data based on the second audio data.
- the processor 320 may perform a noise canceling process for removing ambient noise excluding voice conversations from the first audio data.
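One hypothetical way to realize such processing of the first audio data based on the second audio data is to align the second capture to the first by the best integer lag and then subtract it as a noise reference; the helper names are assumptions, not the disclosed method:

```python
# A hedged sketch (helper names assumed, not from the disclosure) of one
# way operation 1040 could process the first audio data using the second
# audio data: align the external device's capture to the glasses' capture
# by the best integer lag, then subtract it as a noise reference.

def best_lag(x, ref, max_lag=4):
    """Integer lag (in samples) that best aligns ref with x."""
    def score(lag):
        return sum(x[i] * ref[i - lag] for i in range(len(x))
                   if 0 <= i - lag < len(ref))
    return max(range(-max_lag, max_lag + 1), key=score)

def cancel(first, second, max_lag=4):
    """Subtract the lag-aligned second signal from the first."""
    lag = best_lag(first, second, max_lag)
    out = []
    for i in range(len(first)):
        j = i - lag
        ref = second[j] if 0 <= j < len(second) else 0.0
        out.append(first[i] - ref)
    return out

# Example: the first capture is speech plus noise delayed by one sample;
# the second capture is the noise alone.
noise = [1.0, -1.0, 1.0, -1.0, 1.0, -1.0]
first = [0.3 + (noise[i - 1] if i > 0 else 0.0) for i in range(6)]
third = cancel(first, noise)
```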
- the processor (eg, the processor 320 ) of the wearable electronic device (eg, the wearable electronic device 300 ) may control the display 350 to display visual information corresponding to the third audio data obtained in operation 1040 .
- the visual information may include at least one of text or an image.
- the processor 320 may receive, in operation 1030 , state information indicating whether the external wearable electronic device 400 is being worn by the user and whether a voice is being output from the external wearable electronic device 400 , and instead of immediately performing operation 1040 , may perform operations 720 and 730 of FIG. 7 and perform operation 1040 when the conditions of operations 720 and 730 are satisfied.
- the processor 320 may receive, in operation 1030 , state information indicating whether the external wearable electronic device 400 is being worn by the user and whether a voice is being output from the external wearable electronic device 400 , and instead of immediately performing operation 1040 , may perform operations 820 to 850 of FIG. 8 and perform operation 1040 when the conditions of operations 820 , 830 , and 850 are satisfied.
- FIG. 11 is a flowchart illustrating operations performed by a wearable electronic device according to various embodiments of the present disclosure.
- in operation 1110 , the processor (eg, the processor 320 ) of the wearable electronic device (eg, the wearable electronic device 300 ) may control the display 350 to display visual information corresponding to audio data acquired through the audio interface 361 .
- the visual information may include at least one of text or an image.
- in operation 1120 , the processor (eg, the processor 320 ) of the wearable electronic device may determine whether a condition for stopping the provision of visual information is satisfied.
- the condition for stopping the provision of visual information may be a condition indicating that it is appropriate to stop providing the STT service.
- the condition for stopping the provision of visual information may include a condition in which a time during which the user's gaze is not on the visual information displayed in operation 1110 continues for a preset time or longer.
- the condition for stopping the provision of visual information may include a condition in which it is detected that the user makes a preset gesture requesting that the STT service be stopped.
- the condition for stopping the provision of visual information may include a condition in which the accuracy of sentences included in the visual information displayed in operation 1110 is less than or equal to a preset level.
- the accuracy of sentences included in the visual information may be determined based on the completeness of the sentence and/or the accuracy of the context.
- the condition for stopping the provision of visual information may include a condition in which the time for which the visual information displayed in operation 1110 has been displayed exceeds a specified time.
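The stop conditions of operation 1120 listed above can be sketched, purely for illustration, as a disjunction; all thresholds and parameter names are assumed:

```python
# Sketch combining the stop conditions of operation 1120. All thresholds
# and parameter names are illustrative assumptions.

GAZE_AWAY_LIMIT = 3.0     # seconds without gaze on the visual information
ACCURACY_FLOOR = 0.5      # minimum acceptable sentence accuracy
DISPLAY_LIMIT = 30.0      # maximum display time in seconds

def should_stop(gaze_away_s, stop_gesture, sentence_accuracy, displayed_s):
    """True when any condition for stopping the STT display holds."""
    return (gaze_away_s >= GAZE_AWAY_LIMIT      # user looked away too long
            or stop_gesture                     # preset stop gesture detected
            or sentence_accuracy <= ACCURACY_FLOOR  # transcript too inaccurate
            or displayed_s > DISPLAY_LIMIT)     # shown longer than allowed
```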
- when it is confirmed that the condition for stopping the provision of visual information is satisfied, in operation 1130 , the processor (eg, the processor 320 ) of the wearable electronic device (eg, the wearable electronic device 300 ) may control the display 350 not to display visual information corresponding to the audio data.
- the processor (eg, the processor 320 ) of the wearable electronic device (eg, the wearable electronic device 300 ) may control the display 350 to continuously display visual information corresponding to the audio data acquired through the audio interface 361 while repeating operation 1120 until it is confirmed that the condition for stopping the provision of visual information is satisfied.
- when the condition for stopping the provision of visual information includes a condition in which the accuracy of sentences included in the visual information displayed in operation 1110 is less than or equal to a preset level, and it is confirmed in operation 1120 that the accuracy of sentences included in the visual information is less than or equal to the preset level, the processor (eg, the processor 320 ) of the wearable electronic device (eg, the wearable electronic device 300 ) may transmit a signal for activating the ambient sound listening function to the external wearable electronic device 400 through the communication circuit 370 .
- FIG. 12 is a flowchart illustrating operations performed by the wearable electronic device according to various embodiments of the present disclosure.
- the processor (eg, the processor 320 ) of the wearable electronic device may obtain first audio data corresponding to an external event through the voice input device 362 of the wearable electronic device 300 .
- the external event may include an utterance by a person other than the user of the wearable electronic device 300 .
- the external event may include that a sound corresponding to a specified condition (eg, greater than or equal to a specified signal strength) is generated from the outside (eg, any outside of the wearable electronic device 100 , 200 , or 300 ).
- the processor (eg, the processor 320 ) of the wearable electronic device (eg, the wearable electronic device 300 ) may receive, from the external wearable electronic device (eg, the external wearable electronic device 400 ) through the communication circuit 370 , second audio data obtained by the external wearable electronic device 400 and corresponding to the external event.
- in operation 1230 , the processor (eg, the processor 320 ) of the wearable electronic device may check a direction corresponding to the external event based on the first audio data and the second audio data.
- the processor 320 of the wearable electronic device 300 may determine a direction corresponding to the external event based on the location of at least one voice input device 362 of the wearable electronic device 300 and the location of at least one audio interface 440 of the external wearable electronic device 400 .
- the processor 320 may determine a direction corresponding to the external event based on information about the time at which the first audio data was received and the time at which the second audio data was received.
- the direction corresponding to the external event may be the direction, relative to the wearable electronic device 300, of the location where the external event occurs.
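To make the time-difference idea above concrete, here is a minimal sketch (not from the patent; the function name, microphone spacing, and geometry are illustrative assumptions) that converts the difference between the arrival times at the glasses' microphone and at the earbud's microphone into a bearing along the axis joining the two microphones:

```python
import math

SPEED_OF_SOUND_M_S = 343.0  # approximate speed of sound in air


def direction_from_arrival_times(t_glasses_s, t_earbud_s, mic_spacing_m):
    """Estimate the bearing of a sound source from the difference between
    the arrival time at the glasses' microphone (first audio data) and at
    the earbud's microphone (second audio data).

    Returns an angle in degrees relative to the axis joining the two
    microphones: 0 degrees means the source lies toward the earbud side,
    180 degrees toward the glasses side, 90 degrees broadside.
    """
    delay_s = t_glasses_s - t_earbud_s
    # Path-length difference implied by the delay, clamped to the
    # physically possible range for the given microphone spacing.
    path_diff_m = max(-mic_spacing_m,
                      min(mic_spacing_m, delay_s * SPEED_OF_SOUND_M_S))
    return math.degrees(math.acos(path_diff_m / mic_spacing_m))
```

A source equidistant from both microphones (zero delay) maps to 90 degrees; a source that reaches the earbud first maps toward 0 degrees.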
- the processor (eg, the processor 320) of the wearable electronic device (eg, the wearable electronic device 300) may perform an operation corresponding to the identified direction.
- the processor 320 may identify the gaze direction of the user of the wearable electronic device 300 (or the direction in which the user looks through a transparent member of the wearable electronic device 300 (eg, the one or more transparent members 190-1 and 190-2 of FIG. 1)) based on data obtained through the sensor 380 and/or at least one camera (eg, the first camera 311 and the second camera 312) of the wearable electronic device 300, and may perform different operations depending on whether the identified gaze direction matches the direction identified in operation 1230. According to various embodiments, the processor 320 may identify the gaze direction of the user of the wearable electronic device 300 based on data obtained through the second camera 312.
- the processor 320 may identify the direction that the wearable electronic device 300 is facing based on data obtained through the sensor 380 of the wearable electronic device 300, and may use this as the user's gaze direction. According to various embodiments, when the user's gaze direction matches the direction corresponding to the external event identified in operation 1230, the processor 320 may transmit a signal activating the ambient sound listening function to the external wearable electronic device 400 through the communication circuit 370. According to various embodiments, when the user's gaze direction does not match the direction identified in operation 1230, the processor 320 may control the display 350 to display visual information corresponding to the external event based on at least one of the first audio data and the second audio data.
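The branch described above (enable ambient-sound listening when the user is already looking toward the event, otherwise fall back to on-display text) can be sketched as follows; the function name, angle convention, and tolerance are illustrative assumptions, not values from the patent:

```python
def on_external_event(event_direction_deg, gaze_direction_deg, tolerance_deg=15.0):
    """Decide which action to take when an external event (eg, someone
    speaking nearby) is detected: if the user's gaze already points
    toward the event, ask the earbuds to enable ambient-sound listening;
    otherwise show the event's speech as visual information.
    """
    # Smallest absolute angular difference, handling the 0/360 wraparound.
    diff = abs((event_direction_deg - gaze_direction_deg + 180.0) % 360.0 - 180.0)
    if diff <= tolerance_deg:
        return "activate_ambient_sound_listening"
    return "display_visual_information"
```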
- FIG. 13 is a flowchart illustrating operations performed by a wearable electronic device according to various embodiments of the present disclosure.
- the processor (eg, the processor 320) of the wearable electronic device (eg, the wearable electronic device 300) may determine whether the first audio data satisfies a predetermined condition. Details of the predetermined condition may be the same as those described above with reference to operation 620 of FIG. 6.
- the processor (eg, the processor 320) of the wearable electronic device (eg, the wearable electronic device 300) may repeat operations 1310 and 1320 until first audio data satisfying the predetermined condition is obtained.
- in operation 1330, the processor (eg, the processor 320) of the wearable electronic device (eg, the wearable electronic device 300) may receive state information from the external wearable electronic device (eg, the external wearable electronic device 400).
- the state information may indicate whether the external wearable electronic device 400 is being worn by the user and whether a voice is being output from the external wearable electronic device 400, and may include second audio data obtained through the audio interface 440 of the external wearable electronic device 400.
- the wearable electronic device 300 may establish a communication connection with the external wearable electronic device 400 through the communication circuit 370 before determining whether the first audio data satisfies the predetermined condition.
- the communication may include short-range communication such as Bluetooth or WiFi.
- the processor (eg, the processor 320) of the wearable electronic device may check, based on the state information, whether the external wearable electronic device 400 is being worn and whether the external wearable electronic device 400 is outputting a voice.
- the processor 320 may additionally check whether the user wearing the external wearable electronic device 400 and the user wearing the wearable electronic device 300 are the same person, and may perform operation 1350 when it is confirmed that the user wearing the external wearable electronic device 400 and the user wearing the wearable electronic device 300 are the same person.
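One plausible way to implement the same-person check above is to compare biometric samples taken over the same interval by the two devices. The sketch below assumes both devices report heart-rate samples (a hypothetical simplification; the patent specifies only that biometric sensor data from both devices is used):

```python
def same_wearer(hr_glasses, hr_earbuds, max_mean_diff_bpm=3.0):
    """Heuristic same-person check: compare heart-rate samples (bpm)
    taken over the same interval by the glasses' sensor and the
    earbuds' sensor. Two devices worn on the same body should report
    closely matching rates.
    """
    if len(hr_glasses) != len(hr_earbuds) or not hr_glasses:
        return False
    mean_diff = sum(abs(a - b) for a, b in zip(hr_glasses, hr_earbuds)) / len(hr_glasses)
    return mean_diff <= max_mean_diff_bpm
```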
- the processor 320 may end the method.
- the processor 320 of the wearable electronic device (eg, the wearable electronic device 300) may receive, from the external wearable electronic device (eg, the external wearable electronic device 400) through the communication circuit 370, second audio data obtained through the audio interface 440 of the external wearable electronic device 400.
- the processor (eg, the processor 320) of the wearable electronic device (eg, the wearable electronic device 300) may identify a direction corresponding to the external event based on the first audio data and the second audio data. The details of operation 1230 may be equally applied to operation 1360.
- the processor (eg, the processor 320) of the wearable electronic device (eg, the wearable electronic device 300) may identify the gaze direction of the user based on information obtained through at least one of the sensor 380 or the second camera 312.
- the processor 320 may identify the gaze direction of the user of the wearable electronic device 300 based on data acquired through the second camera 312 .
- the processor 320 may identify the direction that the wearable electronic device 300 is facing based on data obtained through the sensor 380 of the wearable electronic device 300, and may use this as the user's gaze direction.
- the processor (eg, the processor 320) of the wearable electronic device (eg, the wearable electronic device 300) may determine whether the direction corresponding to the external event matches the gaze direction of the user.
- in operation 1390, the processor (eg, the processor 320) of the wearable electronic device (eg, the wearable electronic device 300) may transmit a signal for activating the ambient sound listening function to the external wearable electronic device 400 through the communication circuit 370.
- the processor (eg, the processor 320) of the wearable electronic device (eg, the wearable electronic device 300) may control the display 350 to display visual information corresponding to the external event based on at least one of the first audio data and the second audio data.
- the visual information may include at least one of text or an image.
- the user may set the wearable electronic device 300 and the external wearable electronic device 400 to display visual information corresponding to the external event by activating the STT function.
- the processor (eg, the processor 320) of the wearable electronic device (eg, the wearable electronic device 300) may control the display 350 to display visual information corresponding to an external event.
- the processor 320 may display visual information based on third audio data, which is obtained based on the first audio data obtained by the wearable electronic device 300 and the second audio data received from the external wearable electronic device 400.
- the visual information may include at least one of text or an image.
- the processor (eg, the processor 320 ) of the wearable electronic device may identify the user's second gaze direction.
- a process for confirming the user's second gaze direction may be the same as operation 1370 of FIG. 13 .
- the term “second gaze direction” may mean a gaze direction of a user while visual information corresponding to an external event is displayed, that is, while an STT function is provided.
- the term “second gaze direction” may be different from the gaze direction in operation 1370 of FIG. 13 , that is, the gaze direction of the user before the STT function is provided (eg, the first gaze direction).
- the processor (eg, the processor 320) may check whether the user's second gaze direction matches the direction corresponding to the external event. According to various embodiments, similar to operation 1360 of FIG. 13, the processor 320 may identify the direction corresponding to the external event based on the first audio data obtained through the voice input device 362 and the second audio data received from the external wearable electronic device 400.
- the processor (eg, the processor 320) of the wearable electronic device (eg, the wearable electronic device 300) may continue to provide the STT function, repeating operations 1410 to 1430, until it is confirmed that the user's second gaze direction matches the direction corresponding to the external event.
- in operation 1440, the processor (eg, the processor 320) of the wearable electronic device (eg, the wearable electronic device 300) may control the display 350 to stop displaying the visual information corresponding to the external event, and may transmit a signal for activating the ambient sound listening function to the external wearable electronic device 400 through the communication circuit 370.
- the wearable electronic device (eg, the wearable electronic device 300) according to various embodiments includes a display (eg, the display 350), a communication circuit (eg, the communication circuit 370), a voice input device (eg, a voice input device 362), and at least one processor (eg, the processor 320) operatively coupled with the display, the communication circuit, and the voice input device. The at least one processor 320 may be configured to: obtain audio data through the voice input device 362; confirm that the audio data satisfies a predetermined condition; receive, from an external wearable electronic device (eg, the external wearable electronic device 400) through the communication circuit 370, state information based on a signal obtained by the external wearable electronic device 400; and control the display 350 to display visual information corresponding to the audio data based at least in part on the state information.
- the predetermined condition may include at least one of: a condition that the audio data includes a voice related to a language, a condition that the audio data includes a voice related to a preset word, or a condition that the audio data includes a voice having a volume greater than or equal to a preset volume.
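A minimal sketch of the three alternative trigger conditions, with a transcript string standing in for language/keyword detection and an RMS amplitude threshold standing in for the preset volume (all names and thresholds are illustrative assumptions, not from the patent):

```python
import math


def satisfies_predetermined_condition(samples, transcript,
                                      preset_words=("excuse me", "hello"),
                                      min_rms=0.05):
    """Check the alternative trigger conditions: the audio contains
    language-like speech (approximated here by a non-empty transcript),
    contains a preset word, or is loud enough (RMS amplitude of the
    normalized samples above a threshold).
    """
    contains_language = bool(transcript.strip())
    contains_preset_word = any(w in transcript.lower() for w in preset_words)
    rms = math.sqrt(sum(s * s for s in samples) / len(samples)) if samples else 0.0
    loud_enough = rms >= min_rms
    # Any one condition suffices to trigger the visual-information flow.
    return contains_language or contains_preset_word or loud_enough
```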
- the state information may indicate whether the external wearable electronic device 400 is being worn by the user.
- the state information includes first data obtained from a first biometric sensor (eg, the sensor 380) of the external wearable electronic device 400, the wearable electronic device 300 includes a second biometric sensor (eg, the sensor 380), and the at least one processor 320 may be configured to: obtain second data through the second biometric sensor; confirm, based on the first data and the second data, that the user wearing the external wearable electronic device 400 is wearing the wearable electronic device 300; and control the display 350 to display visual information corresponding to the audio data based on confirming that the user wearing the external wearable electronic device 400 is wearing the wearable electronic device 300.
- the state information indicates whether a voice is being output by the external wearable electronic device 400, and the at least one processor 320 may control the display 350 to display visual information corresponding to the audio data based on confirming that the external wearable electronic device 400 is being worn by the user and that a voice is being output from the external wearable electronic device 400.
- the at least one processor 320 may be configured to: control the display 350 to display a visual indicator indicating that a speech-to-text (STT) service is available, based on confirming that the external wearable electronic device 400 is being worn by the user and that a voice is being output from the external wearable electronic device 400; and, while the visual indicator is displayed on the display 350, control the display 350 to display visual information corresponding to the audio data in response to a reaction condition regarding the user being satisfied.
- the reaction condition may include at least one of: a condition that the user's gaze is directed at the visual indicator for a preset first time or longer, a condition that a preset utterance by the user is detected, or a condition that a preset first gesture by the user is detected.
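The reaction condition can be expressed as a simple predicate over the three alternatives; the dwell time and parameter names below are illustrative assumptions:

```python
def reaction_condition_met(gaze_on_indicator_s, utterance_detected,
                           gesture_detected, first_time_s=1.5):
    """Return True once any of the reaction conditions holds: the user
    dwells on the visual indicator for at least the preset first time,
    speaks a preset utterance, or performs the preset first gesture.
    """
    return (gaze_on_indicator_s >= first_time_s
            or utterance_detected
            or gesture_detected)
```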
- the at least one processor 320 may be configured to control the display 350 to further display visual information corresponding to the audio data obtained before the reaction condition was satisfied.
- the state information indicates that the priority of the noise canceling function of the external wearable electronic device 400 is higher than the priority of the ambient sound listening function of the external wearable electronic device 400, and the at least one processor 320 may be configured to control the display 350 to display visual information corresponding to the audio data based on it being confirmed that the priority of the noise canceling function is higher than the priority of the ambient sound listening function.
- the state information may include second audio data obtained from the external wearable electronic device 400 .
- the at least one processor 320 may be configured to obtain third audio data by processing the audio data based on the second audio data, and to control the display 350 to display visual information corresponding to the third audio data.
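One plausible reading of "processing the audio data based on the second audio data" is treating the earbud capture as a reference signal, for example to remove sound the earbuds are playing back, before the STT step. The sketch below uses a naive fixed-gain subtraction; a real implementation would use adaptive filtering, and all names and the gain value are illustrative assumptions:

```python
def derive_third_audio(first_audio, second_audio, leak_gain=1.0):
    """Naive reference subtraction: subtract the second audio data
    (captured by the earbuds), scaled by an assumed leakage gain, from
    the first audio data (captured by the glasses), leaving a cleaner
    signal to feed the STT conversion. Output is truncated to the
    shorter of the two sample sequences.
    """
    n = min(len(first_audio), len(second_audio))
    return [first_audio[i] - leak_gain * second_audio[i] for i in range(n)]
```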
- the at least one processor 320 may be configured to adjust a visual characteristic of the visual information displayed on the display 350 according to a volume of a voice corresponding to the audio data.
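As an illustration of tying a visual characteristic to voice volume, the following sketch maps an RMS level to a text size so that louder speech is rendered more prominently; the point sizes and thresholds are arbitrary assumptions, not values from the patent:

```python
def font_size_for_volume(rms, min_pt=14, max_pt=32,
                         quiet_rms=0.01, loud_rms=0.5):
    """Map the RMS volume of the captured voice to a font size in
    points, clamping values outside the quiet/loud range and
    interpolating linearly in between.
    """
    if rms <= quiet_rms:
        return min_pt
    if rms >= loud_rms:
        return max_pt
    frac = (rms - quiet_rms) / (loud_rms - quiet_rms)
    return round(min_pt + frac * (max_pt - min_pt))
```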
- the at least one processor 320 may check whether a condition for stopping the provision of visual information is satisfied while controlling the display 350 to display visual information corresponding to the audio data, and may control the display 350 not to display the visual information corresponding to the audio data based on the condition being satisfied. The condition for stopping the provision of visual information may include at least one of: a condition that the time during which the user's gaze is not on the visual information displayed on the display 350 continues for a preset second time or longer, a condition that a preset second gesture by the user is detected, or a condition that the accuracy of sentences included in the visual information corresponding to the audio data is less than or equal to a preset level.
- the wearable electronic device 300 may be communicatively connected to an external electronic device (eg, a smartphone) through the communication circuit 370, and the external wearable electronic device 400 may be communicatively connected to the external electronic device through the communication circuit 430.
- the wearable electronic device 300 may transmit, to the external electronic device through the communication circuit 370, data received through at least one camera (eg, the first cameras 111-1 and 111-2, the second cameras 112-1 and 112-2, and/or the third camera 113) or one or more voice input devices 162-1, 162-2, and 162-3.
- the wearable electronic device 300 may output visual information through at least one display (eg, the first display 151, the second display 152, or the display 350), or may output a voice through the at least one audio output device 363, based on data received from the external electronic device.
- the external electronic device may obtain audio data from the wearable electronic device 300 and/or the external wearable electronic device 400 and provide an STT function based on the obtained audio data.
- the external electronic device may include at least one voice input device.
- the external electronic device may acquire audio data corresponding to the external event through the voice input device.
- the external electronic device may request the wearable electronic device 300 and/or the external wearable electronic device 400 to transmit the audio data.
- the external electronic device may obtain audio data from the wearable electronic device 300 and/or the external wearable electronic device 400 and may provide an STT function based on the obtained audio data.
- the external electronic device may receive first audio data through the wearable electronic device 300 and may receive second audio data through the external wearable electronic device 400 .
- the external electronic device may generate third audio data based on the first audio data and the second audio data, and may transmit visual information based on the generated third audio data to the wearable electronic device 300 so that the visual information can be output.
- the wearable electronic device 300 may transmit, to the external electronic device, data obtained from at least one camera (eg, the first cameras 111-1 and 111-2, the second cameras 112-1 and 112-2, and/or the third camera 113) and/or from the sensor 380.
- the external wearable electronic device 400 may transmit data obtained from the sensor 450 to the external electronic device.
- the external electronic device may confirm that the user's gaze is changed to a direction corresponding to the external event based on the data received from the wearable electronic device 300 and/or the external wearable electronic device 400 .
- the external electronic device may request the wearable electronic device 300 to stop outputting visual information through the display 350, and may request the external wearable electronic device 400 to activate the ambient sound listening function.
- the wearable electronic device (eg, the wearable electronic device 300) according to various embodiments includes a display (eg, the display 350), a communication circuit (eg, the communication circuit 370), a voice input device (eg, a voice input device 362), and at least one processor (eg, the processor 320) operatively coupled to the display 350, the communication circuit 370, and the voice input device 362. The at least one processor 320 may be configured to: obtain first audio data corresponding to an external event through the voice input device 362; receive, from an external wearable electronic device (eg, the external wearable electronic device 400) through the communication circuit 370, second audio data obtained by the external wearable electronic device 400 and corresponding to the external event; identify a direction corresponding to the external event based on the first audio data and the second audio data; and perform an operation corresponding to the identified direction.
- the at least one processor 320 is configured to receive the second audio data based on the first audio data satisfying a first predetermined condition, and the first predetermined condition may include at least one of: a condition that the first audio data includes a voice related to a language, a condition that the first audio data includes a voice related to a preset word, or a condition that the first audio data includes a voice having a volume greater than or equal to a preset volume.
- the at least one processor 320 may be configured to receive state information from the external wearable electronic device 400 through the communication circuit 370, and to receive the second audio data based on the state information indicating that the external wearable electronic device 400 is being worn and is outputting a voice.
- the wearable electronic device 300 further includes a sensor (eg, the sensor 380), and the at least one processor 320 may be configured to: identify the gaze direction of the user of the wearable electronic device 300 using the sensor 380; check whether the direction corresponding to the external event matches the gaze direction; and, based on the direction corresponding to the external event matching the gaze direction, control the communication circuit 370 to transmit a signal for activating the ambient sound listening function to the external wearable electronic device 400.
- the at least one processor 320 may be configured to control the display 350 to display visual information corresponding to the external event, based on at least one of the first audio data and the second audio data, based on the direction corresponding to the external event not matching the gaze direction.
- the gaze direction includes a first gaze direction, and the at least one processor 320 may be configured to: identify a second gaze direction using the sensor while the visual information is displayed on the display 350; confirm that the direction corresponding to the external event matches the second gaze direction; and, based on the direction corresponding to the external event matching the second gaze direction, control the display 350 to stop displaying the visual information corresponding to the external event, and control the communication circuit 370 to transmit a signal activating the ambient sound listening function to the external wearable electronic device 400.
- a method performed in a wearable electronic device includes an operation of acquiring audio data, an operation of confirming that the audio data satisfies a predetermined condition, an operation of receiving, from an external wearable electronic device, state information based on a signal obtained by the external wearable electronic device, and an operation of controlling a display to display visual information corresponding to the audio data based at least in part on the state information.
- the terms "first", "second", or "first or second" may simply be used to distinguish an element from other elements, and do not limit the elements in other aspects (eg, importance or order). When one (eg, a first) component is referred to, with or without the terms "functionally" or "communicatively", as being "coupled" or "connected" to another (eg, a second) component, it means that the one component can be connected to the other component directly (eg, by wire), wirelessly, or through a third component.
- the term "module" used in various embodiments of this document may include a unit implemented in hardware, software, or firmware, and may be used interchangeably with terms such as logic, logic block, component, or circuit.
- a module may be an integrally formed part, or a minimum unit or a part thereof that performs one or more functions.
- the module may be implemented in the form of an application-specific integrated circuit (ASIC).
- Various embodiments of this document may be implemented as software (eg, the program 340) including one or more instructions stored in a storage medium (eg, the internal memory 336 or the external memory 338) readable by a machine (eg, the electronic device 301).
- the processor (eg, the processor 320) of the device (eg, the electronic device 301) may call at least one of the one or more instructions stored in the storage medium and execute it. This enables the device to perform at least one function according to the called at least one instruction.
- the one or more instructions may include code generated by a compiler or code executable by an interpreter.
- the device-readable storage medium may be provided in the form of a non-transitory storage medium.
- 'non-transitory' only means that the storage medium is a tangible device and does not contain a signal (eg, an electromagnetic wave); this term does not distinguish between the case where data is semi-permanently stored in the storage medium and the case where it is temporarily stored.
- the method according to various embodiments disclosed in this document may be provided as included in a computer program product.
- Computer program products may be traded between sellers and buyers as commodities.
- the computer program product may be distributed in the form of a machine-readable storage medium (eg, a compact disc read only memory (CD-ROM)), or may be distributed (eg, downloaded or uploaded) online through an application store (eg, Play Store™) or directly between two user devices (eg, smartphones).
- a part of the computer program product may be temporarily stored in, or temporarily created in, a machine-readable storage medium such as a memory of a manufacturer's server, an application store's server, or a relay server.
- each component (eg, a module or a program) of the above-described components may include a singular entity or a plurality of entities, and some of the plurality of entities may be separately disposed in other components.
- one or more components or operations among the above-described corresponding components may be omitted, or one or more other components or operations may be added.
- a plurality of components (eg, modules or programs) may be integrated into a single component. In such a case, the integrated component may perform one or more functions of each of the plurality of components identically or similarly to how they were performed by the corresponding component among the plurality of components prior to the integration.
- operations performed by a module, program, or other component may be executed sequentially, in parallel, repeatedly, or heuristically; one or more of the operations may be executed in a different order or omitted; or one or more other operations may be added.
Claims (15)
- A wearable electronic device comprising: a display; a communication circuit; a voice input device; and at least one processor operatively connected to the display, the communication circuit, and the voice input device, wherein the at least one processor is configured to: obtain audio data through the voice input device; confirm that the audio data satisfies a predetermined condition; receive, from an external wearable electronic device through the communication circuit, state information based on a signal obtained by the external wearable electronic device; and control the display to display visual information corresponding to the audio data based at least in part on the state information.
- The wearable electronic device of claim 1, wherein the predetermined condition includes at least one of: a condition that the audio data includes a voice related to a language; a condition that the audio data includes a voice related to a preset word; or a condition that the audio data includes a voice having a volume greater than or equal to a preset volume.
- The wearable electronic device of claim 1, wherein the state information indicates whether the external wearable electronic device is being worn by a user.
- The wearable electronic device of claim 3, wherein the state information includes first data obtained by a first biometric sensor of the external wearable electronic device, the wearable electronic device includes a second biometric sensor, and the at least one processor is further configured to: obtain second data through the second biometric sensor; confirm, based on the first data and the second data, that the user wearing the external wearable electronic device is wearing the wearable electronic device; and control the display to display the visual information corresponding to the audio data based on confirming that the user wearing the external wearable electronic device is wearing the wearable electronic device.
- The wearable electronic device of claim 3, wherein the state information indicates whether a voice is being output from the external wearable electronic device, and the at least one processor is further configured to control the display to display the visual information corresponding to the audio data based on confirming that the external wearable electronic device is being worn by the user and that a voice is being output from the external wearable electronic device.
- The wearable electronic device of claim 5, wherein the at least one processor is further configured to: control the display to display a visual indicator indicating that a speech-to-text (STT) service is available, based on confirming that the external wearable electronic device is being worn by the user and that a voice is being output from the external wearable electronic device; and, while the visual indicator is displayed on the display, control the display to display the visual information corresponding to the audio data in response to a reaction condition regarding the user being satisfied.
- The wearable electronic device of claim 6, wherein the reaction condition includes at least one of: a condition that the user's gaze is directed at the visual indicator for a preset first time or longer; a condition that a preset utterance by the user is detected; or a condition that a preset first gesture by the user is detected.
- The wearable electronic device of claim 6, wherein the at least one processor is further configured to control the display to further display visual information corresponding to the audio data obtained before the reaction condition was satisfied.
- The wearable electronic device of claim 1, wherein the state information indicates that a priority of a noise canceling function of the external wearable electronic device is higher than a priority of an ambient sound listening function of the external wearable electronic device, and the at least one processor is further configured to control the display to display the visual information corresponding to the audio data based on confirming that the priority of the noise canceling function is higher than the priority of the ambient sound listening function.
- A wearable electronic device comprising: a display; a communication circuit; a voice input device; and at least one processor operatively connected to the display, the communication circuit, and the voice input device, wherein the at least one processor is configured to: obtain first audio data corresponding to an external event through the voice input device; receive, from an external wearable electronic device through the communication circuit, second audio data obtained by the external wearable electronic device and corresponding to the external event; identify a direction corresponding to the external event based on the first audio data and the second audio data; and perform an operation corresponding to the identified direction.
- The wearable electronic device of claim 10, wherein the at least one processor is further configured to receive the second audio data based on the first audio data satisfying a first predetermined condition, and the first predetermined condition includes at least one of: a condition that the first audio data includes a voice related to a language; a condition that the first audio data includes a voice related to a preset word; or a condition that the first audio data includes a voice having a volume greater than or equal to a preset volume.
- The wearable electronic device of claim 10, wherein the at least one processor is further configured to: receive state information from the external wearable electronic device through the communication circuit; and receive the second audio data based on the state information indicating that the external wearable electronic device is being worn and is outputting a voice.
- The wearable electronic device of claim 10, further comprising a sensor, wherein the at least one processor is further configured to: identify a gaze direction of a user of the wearable electronic device using the sensor; check whether the direction corresponding to the external event matches the gaze direction; and control the communication circuit to transmit, to the external wearable electronic device, a signal for activating an ambient sound listening function based on the direction corresponding to the external event matching the gaze direction.
- The wearable electronic device of claim 13, wherein the at least one processor is further configured to control the display to display visual information corresponding to the external event, based on at least one of the first audio data or the second audio data, based on the direction corresponding to the external event not matching the gaze direction.
- The wearable electronic device of claim 14, wherein the gaze direction includes a first gaze direction, and the at least one processor is further configured to: identify a second gaze direction using the sensor while the visual information is displayed on the display; confirm that the direction corresponding to the external event matches the second gaze direction; and, based on the direction corresponding to the external event matching the second gaze direction, control the display to stop displaying the visual information corresponding to the external event and control the communication circuit to transmit, to the external wearable electronic device, a signal for activating the ambient sound listening function.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP22742830.7A EP4206901A4 (en) | 2021-01-21 | 2022-01-19 | PORTABLE ELECTRONIC DEVICE RECEIVING INFORMATION FROM AN EXTERNAL PORTABLE ELECTRONIC DEVICE AND OPERATING METHODS THEREOF |
CN202280008475.8A CN116670618A (zh) | 2021-01-21 | 2022-01-19 | 从外部可穿戴电子设备接收信息的可穿戴电子设备及其操作方法 |
US17/581,454 US20220230649A1 (en) | 2021-01-21 | 2022-01-21 | Wearable electronic device receiving information from external wearable electronic device and method for operating the same |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020210008797A | 2021-01-21 | 2021-01-21 | Wearable electronic device receiving information from external wearable electronic device and method for operating the same |
KR10-2021-0008797 | 2021-01-21 |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US17/581,454 Continuation US20220230649A1 (en) | 2021-01-21 | 2022-01-21 | Wearable electronic device receiving information from external wearable electronic device and method for operating the same |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2022158854A1 true WO2022158854A1 (ko) | 2022-07-28 |
Family
ID=82548876
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/KR2022/000998 WO2022158854A1 (ko) | 2021-01-21 | 2022-01-19 | Wearable electronic device receiving information from external wearable electronic device and method for operating the same |
Country Status (2)
Country | Link |
---|---|
KR (1) | KR20220105893A (ko) |
WO (1) | WO2022158854A1 (ko) |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20150029976A (ko) * | 2013-09-11 | 2015-03-19 | LG Electronics Inc. | Wearable computing device and user interface method |
KR101668165B1 (ko) * | 2011-11-14 | 2016-10-20 | Google Inc. | Displaying sound indications on a wearable computing system |
KR20190121720A (ko) * | 2019-10-07 | 2019-10-28 | LG Electronics Inc. | Wearable device and method for providing information in wearable device |
KR20190141696A (ko) * | 2017-04-19 | 2019-12-24 | Magic Leap, Inc. | Multimodal task execution and text editing for a wearable system |
KR20200026798A (ko) * | 2017-04-23 | 2020-03-11 | OrCam Technologies Ltd. | Wearable apparatus and method for analyzing images |
2021
- 2021-01-21 KR KR1020210008797A patent/KR20220105893A/ko active Search and Examination

2022
- 2022-01-19 WO PCT/KR2022/000998 patent/WO2022158854A1/ko active Application Filing
Also Published As
Publication number | Publication date |
---|---|
KR20220105893A (ko) | 2022-07-28 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2020032608A1 (en) | Feedback method and apparatus of electronic device for confirming user's intention | |
WO2020159288A1 (ko) | Electronic device and control method therefor | |
WO2020122677A1 (en) | Method of performing function of electronic device and electronic device using same | |
WO2015199288A1 (en) | Glass-type terminal and method of controling the same | |
WO2021060728A1 (ko) | Electronic device for processing user utterance and operation method thereof | |
WO2020091248A1 (ko) | Method for displaying content in response to voice command, and electronic device therefor | |
WO2017007101A1 (ko) | Smart device and method for controlling same | |
WO2020080635A1 (ko) | Electronic device for performing voice recognition using microphone selected based on operation state, and operation method thereof | |
WO2018143509A1 (ko) | Moving robot and control method therefor | |
WO2020171548A1 (ko) | Method for processing user input and electronic device supporting same | |
WO2021187901A1 (en) | Method for controlling external device based on voice and electronic device thereof | |
WO2020111727A1 (ko) | Electronic device and method for controlling electronic device | |
WO2022124829A1 (ko) | Electronic device for providing interaction based on user voice, and method therefor | |
WO2016190676A1 (ko) | Robot, smart block toy, and robot control system using same | |
WO2022158854A1 (ko) | Wearable electronic device receiving information from external wearable electronic device and method for operating same | |
WO2021118229A1 (en) | Information providing method and electronic device for supporting the same | |
WO2020101174A1 (ko) | Method and apparatus for generating personalized lip reading model | |
WO2021107200A1 (ko) | Mobile terminal and method for controlling mobile terminal | |
EP4206901A1 (en) | Wearable electronic device receiving information from external wearable electronic device and operation method thereof | |
EP3762819A1 (en) | Electronic device and method of controlling thereof | |
WO2021020727A1 (ko) | Electronic device and method for identifying language level of target | |
WO2023080296A1 (ko) | AR device and method for controlling AR device | |
WO2020235821A1 (en) | Electronic device for providing feedback corresponding to input for housing | |
WO2022220373A1 (ko) | Wearable electronic device for controlling noise cancelling of external wearable electronic device, and operation method thereof | |
WO2020076087A1 (ko) | Electronic device and operation method thereof | |
Legal Events
Code | Title | Description |
---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 22742830; Country of ref document: EP; Kind code of ref document: A1 |
ENP | Entry into the national phase | Ref document number: 2022742830; Country of ref document: EP; Effective date: 20230330 |
WWE | Wipo information: entry into national phase | Ref document number: 202280008475.8; Country of ref document: CN |
WWE | Wipo information: entry into national phase | Ref document number: 202317050773; Country of ref document: IN |
NENP | Non-entry into the national phase | Ref country code: DE |