WO2022035166A1 - Electronic devices comprising a camera module, and method for controlling the photographing direction thereof - Google Patents

Electronic devices comprising a camera module, and method for controlling the photographing direction thereof

Info

Publication number
WO2022035166A1
WO2022035166A1 (PCT/KR2021/010513)
Authority
WO
WIPO (PCT)
Prior art keywords
module
electronic device
camera module
processor
gaze
Prior art date
Application number
PCT/KR2021/010513
Other languages
English (en)
Korean (ko)
Inventor
최윤형
이영욱
전진아
이정근
Original Assignee
Samsung Electronics Co., Ltd. (삼성전자 주식회사)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Samsung Electronics Co., Ltd.
Publication of WO2022035166A1

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N 13/30 Image reproducers
    • H04N 13/327 Calibration thereof
    • H04N 13/366 Image reproducers using viewer tracking
    • H04N 13/383 Image reproducers using viewer tracking for tracking with gaze detection, i.e. detecting the lines of sight of the viewer's eyes
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/50 Constructional details
    • H04N 23/51 Housings
    • H04N 23/57 Mechanical or electrical details of cameras or camera modules specially adapted for being embedded in other devices
    • H04N 23/60 Control of cameras or camera modules
    • H04N 23/61 Control of cameras or camera modules based on recognised objects
    • H04N 23/611 Control of cameras or camera modules based on recognised objects where the recognised objects include parts of the human body
    • H04N 23/695 Control of camera direction for changing a field of view, e.g. pan, tilt or based on tracking of objects

Definitions

  • Various embodiments of the present document relate to an electronic device including a camera module and a method for controlling a photographing direction thereof.
  • Wearable electronic devices (eg, head mounted devices (HMDs), augmented reality (AR) devices, and virtual reality (VR) devices) may provide various functions.
  • For example, wearable electronic devices may obtain and provide an image of the real world in the direction the user is looking, through a camera module.
  • the wearable electronic devices may analyze the acquired image of the real world and provide an augmented reality object or a virtual reality object related to objects of the real world included in the image of the real world.
  • However, the camera module included in the wearable electronic device may be fixed to the wearable electronic device so that the photographing direction cannot be changed. For this reason, the camera module may acquire only an image in a specified direction (eg, in front of the user), and to photograph another direction the user may have to move the part of the body (eg, the head) on which the wearable electronic device is worn, which can be inconvenient.
  • In addition, the photographing direction of the camera module and the user's viewing direction may differ. In this case, the wearable electronic device may not acquire an image of the real world in the direction the user sees, and may not properly provide an augmented reality object and/or a virtual reality object related to the real world in that direction.
  • Various embodiments of the present disclosure may provide an electronic device including a camera module, capable of tracking a user's gaze and adjusting the shooting direction of the camera module based on the tracking result, and a method of controlling the shooting direction thereof.
  • Various embodiments of the present disclosure may provide an electronic device including a camera module, capable of adjusting the shooting direction of the camera module in response to detection of a designated gesture for changing the shooting direction, and a method of controlling the shooting direction thereof.
  • Various embodiments of the present disclosure may provide an electronic device including a camera module, capable of providing an image of a partial region including a specific point in response to the user gazing at the specific point, and a method of controlling the shooting direction thereof.
  • An electronic device according to various embodiments may include, for example: a housing; a gaze tracking module located in the housing; a camera module located in the housing and configured to photograph a subject; a driving module for controlling a photographing direction of the camera module; a processor operatively connected to the gaze tracking module, the camera module, and the driving module; and a memory operatively coupled to the processor. The memory may store instructions that, when executed, cause the processor to acquire an image using the camera module, track the user's gaze using the gaze tracking module while acquiring the image, and control the driving module to adjust the shooting direction of the camera module based on the tracked gaze.
  • A method of controlling a photographing direction of an electronic device according to various embodiments may include, for example: acquiring an image using a camera module; tracking a user's gaze using a gaze tracking module while acquiring the image; and adjusting the photographing direction of the camera module through a driving module based on the tracked gaze.
  • According to various embodiments, an image corresponding to (eg, matching) the direction in which the user is looking may be obtained by adjusting the shooting direction of the camera module in response to the movement of the user's gaze. Also, an image in a direction desired by the user can easily be acquired by changing the photographing direction of the camera module to a direction specified by the user. In addition, the user can examine a specific point in detail because a partial area corresponding to the point at which the user gazes is enlarged and displayed.
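The acquire-track-adjust method above can be sketched as one step of a simple control loop. This is an illustrative sketch, not the patent's implementation; the `Gaze` type, the angle units, and the dead-zone value are assumptions:

```python
from dataclasses import dataclass

@dataclass
class Gaze:
    yaw_deg: float    # horizontal gaze angle relative to the head
    pitch_deg: float  # vertical gaze angle relative to the head

def control_step(gaze: Gaze, cam_yaw: float, cam_pitch: float,
                 dead_zone_deg: float = 2.0) -> tuple[float, float]:
    """One iteration of the claimed loop: compare the tracked gaze with the
    camera's current shooting direction and return the new direction.
    The dead zone keeps the camera still for small eye jitter."""
    new_yaw, new_pitch = cam_yaw, cam_pitch
    if abs(gaze.yaw_deg - cam_yaw) > dead_zone_deg:
        new_yaw = gaze.yaw_deg
    if abs(gaze.pitch_deg - cam_pitch) > dead_zone_deg:
        new_pitch = gaze.pitch_deg
    return new_yaw, new_pitch
```

In a real device, each returned direction would be handed to the driving module while the camera keeps streaming frames.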
  • FIG. 1 is a block diagram of an electronic device in a network environment, according to various embodiments.
  • FIG. 2 is a block diagram of an electronic device according to an embodiment.
  • FIG. 3A is a flowchart illustrating a method of controlling a photographing direction of an electronic device based on a user's gaze, according to an embodiment.
  • FIG. 3B is an exemplary diagram of controlling a photographing direction of an electronic device based on a user's gaze, according to an embodiment.
  • FIG. 3C is another exemplary diagram of controlling a photographing direction of an electronic device based on a user's gaze, according to an embodiment.
  • FIG. 4 is a flowchart illustrating a method of controlling a photographing direction of an electronic device based on a user's gaze, according to an embodiment.
  • FIG. 5 is a flowchart illustrating a method of controlling a photographing direction of an electronic device based on a movement of the electronic device and a user's gaze, according to an embodiment.
  • FIG. 6A is a flowchart illustrating a method of controlling a photographing direction of an electronic device based on a user's gesture, according to an embodiment.
  • FIG. 6B is an exemplary diagram of controlling a photographing direction of an electronic device based on a user's gesture, according to an embodiment.
  • FIG. 7A is a flowchart illustrating a method of providing a captured image based on a user's gaze, according to an embodiment.
  • FIG. 7B is an exemplary diagram of providing an image captured based on a user's gaze, according to an embodiment.
  • FIG. 8A is a diagram illustrating an example in which a camera module is fastened to an electronic device, according to an embodiment.
  • FIG. 8B is a diagram illustrating an example of an assembly for fastening a camera module to a housing of an electronic device, according to an embodiment.
  • FIG. 8C is an exploded perspective view of the assembly of FIG. 8B.
  • FIG. 8D is a diagram illustrating an example in which an assembly is fastened to a housing of an electronic device, according to an embodiment.
  • FIG. 1 is a block diagram of an electronic device 101 in a network environment 100 according to various embodiments.
  • Referring to FIG. 1, an electronic device 101 may communicate with an electronic device 102 through a first network 198 (eg, a short-range wireless communication network), or with an electronic device 104 or a server 108 through a second network 199 (eg, a long-range wireless communication network). According to an embodiment, the electronic device 101 may communicate with the electronic device 104 through the server 108.
  • The electronic device 101 may include a processor 120, a memory 130, an input module 150, a sound output module 155, a display module 160, an audio module 170, a sensor module 176, an interface 177, a connection terminal 178, a haptic module 179, a camera module 180, a power management module 188, a battery 189, a communication module 190, a subscriber identification module 196, or an antenna module 197.
  • In some embodiments, at least one of these components (eg, the connection terminal 178) may be omitted, or one or more other components may be added to the electronic device 101. In some embodiments, some of these components may be integrated into one component (eg, the display module 160).
  • The processor 120 may, for example, execute software (eg, a program 140) to control at least one other component (eg, a hardware or software component) of the electronic device 101 connected to the processor 120, and may perform various data processing or operations. According to an embodiment, as at least part of the data processing or operations, the processor 120 may store commands or data received from another component (eg, the sensor module 176 or the communication module 190) in the volatile memory 132, process the commands or data stored in the volatile memory 132, and store the result data in the non-volatile memory 134.
  • According to an embodiment, the processor 120 may include a main processor 121 (eg, a central processing unit or an application processor) or an auxiliary processor 123 (eg, a graphics processing unit, a neural processing unit (NPU), an image signal processor, a sensor hub processor, or a communication processor).
  • The auxiliary processor 123 may, for example, control at least some of the functions or states related to at least one of the components of the electronic device 101 (eg, the display module 160, the sensor module 176, or the communication module 190), on behalf of the main processor 121 while the main processor 121 is in an inactive (eg, sleep) state, or together with the main processor 121 while the main processor 121 is in an active (eg, application-executing) state. According to an embodiment, the auxiliary processor 123 (eg, an image signal processor or a communication processor) may be implemented as part of another functionally related component (eg, the camera module 180 or the communication module 190).
  • the auxiliary processor 123 may include a hardware structure specialized for processing an artificial intelligence model.
  • Artificial intelligence models can be created through machine learning. Such learning may be performed, for example, in the electronic device 101 itself on which artificial intelligence is performed, or may be performed through a separate server (eg, the server 108).
  • The learning algorithm may include, for example, supervised learning, unsupervised learning, semi-supervised learning, or reinforcement learning, but is not limited to the above examples.
  • the artificial intelligence model may include a plurality of artificial neural network layers.
  • The artificial neural network may be one of a deep neural network (DNN), a convolutional neural network (CNN), a recurrent neural network (RNN), a restricted Boltzmann machine (RBM), a deep belief network (DBN), a bidirectional recurrent deep neural network (BRDNN), a deep Q-network, or a combination of two or more of the above, but is not limited to these examples.
  • The artificial intelligence model may, additionally or alternatively, include a software structure in addition to the hardware structure.
  • the memory 130 may store various data used by at least one component of the electronic device 101 (eg, the processor 120 or the sensor module 176 ).
  • the data may include, for example, input data or output data for software (eg, the program 140 ) and instructions related thereto.
  • the memory 130 may include a volatile memory 132 or a non-volatile memory 134 .
  • the program 140 may be stored as software in the memory 130 , and may include, for example, an operating system 142 , middleware 144 , or an application 146 .
  • the input module 150 may receive a command or data to be used in a component (eg, the processor 120 ) of the electronic device 101 from the outside (eg, a user) of the electronic device 101 .
  • the input module 150 may include, for example, a microphone, a mouse, a keyboard, a key (eg, a button), or a digital pen (eg, a stylus pen).
  • the sound output module 155 may output a sound signal to the outside of the electronic device 101 .
  • the sound output module 155 may include, for example, a speaker or a receiver.
  • the speaker can be used for general purposes such as multimedia playback or recording playback.
  • the receiver may be used to receive an incoming call. According to one embodiment, the receiver may be implemented separately from or as part of the speaker.
  • the display module 160 may visually provide information to the outside (eg, a user) of the electronic device 101 .
  • the display module 160 may include, for example, a control circuit for controlling a display, a hologram device, or a projector and a corresponding device.
  • the display module 160 may include a touch sensor configured to sense a touch or a pressure sensor configured to measure the intensity of a force generated by the touch.
  • The audio module 170 may convert a sound into an electric signal or, conversely, convert an electric signal into a sound. According to an embodiment, the audio module 170 may acquire a sound through the input module 150, or may output a sound through the sound output module 155 or an external electronic device (eg, the electronic device 102, such as a speaker or headphones) connected directly or wirelessly to the electronic device 101.
  • The sensor module 176 may detect an operating state (eg, power or temperature) of the electronic device 101 or an external environmental state (eg, a user state), and generate an electrical signal or data value corresponding to the detected state.
  • According to an embodiment, the sensor module 176 may include, for example, a gesture sensor, a gyro sensor, a barometric pressure sensor, a magnetic sensor, an acceleration sensor, a grip sensor, a proximity sensor, a color sensor, an infrared (IR) sensor, a biometric sensor, a temperature sensor, a humidity sensor, or an illuminance sensor.
  • the interface 177 may support one or more designated protocols that may be used by the electronic device 101 to directly or wirelessly connect with an external electronic device (eg, the electronic device 102 ).
  • the interface 177 may include, for example, a high definition multimedia interface (HDMI), a universal serial bus (USB) interface, an SD card interface, or an audio interface.
  • the connection terminal 178 may include a connector through which the electronic device 101 can be physically connected to an external electronic device (eg, the electronic device 102 ).
  • the connection terminal 178 may include, for example, an HDMI connector, a USB connector, an SD card connector, or an audio connector (eg, a headphone connector).
  • the haptic module 179 may convert an electrical signal into a mechanical stimulus (eg, vibration or movement) or an electrical stimulus that the user can perceive through tactile or kinesthetic sense.
  • the haptic module 179 may include, for example, a motor, a piezoelectric element, or an electrical stimulation device.
  • the camera module 180 may capture still images and moving images. According to an embodiment, the camera module 180 may include one or more lenses, image sensors, image signal processors, or flashes.
  • the power management module 188 may manage power supplied to the electronic device 101 .
  • the power management module 188 may be implemented as, for example, at least a part of a power management integrated circuit (PMIC).
  • the battery 189 may supply power to at least one component of the electronic device 101 .
  • battery 189 may include, for example, a non-rechargeable primary cell, a rechargeable secondary cell, or a fuel cell.
  • The communication module 190 may support establishment of a direct (eg, wired) communication channel or a wireless communication channel between the electronic device 101 and an external electronic device (eg, the electronic device 102, the electronic device 104, or the server 108), and may support communication through the established channel.
  • the communication module 190 may include one or more communication processors that operate independently of the processor 120 (eg, an application processor) and support direct (eg, wired) communication or wireless communication.
  • According to an embodiment, the communication module 190 may include a wireless communication module 192 (eg, a cellular communication module, a short-range wireless communication module, or a global navigation satellite system (GNSS) communication module) or a wired communication module 194 (eg, a local area network (LAN) communication module or a power line communication module).
  • A corresponding communication module among these communication modules may communicate with the external electronic device 104 through the first network 198 (eg, a short-range communication network such as Bluetooth, wireless fidelity (WiFi) Direct, or infrared data association (IrDA)) or the second network 199 (eg, a long-range communication network such as a legacy cellular network, a 5G network, a next-generation communication network, the Internet, or a computer network (eg, a LAN or WAN)).
  • The wireless communication module 192 may identify or authenticate the electronic device 101 within a communication network such as the first network 198 or the second network 199 using subscriber information (eg, an International Mobile Subscriber Identity (IMSI)) stored in the subscriber identification module 196.
  • The wireless communication module 192 may support a 5G network after a 4G network, and a next-generation communication technology, for example, new radio (NR) access technology.
  • The NR access technology may support high-speed transmission of high-capacity data (enhanced mobile broadband, eMBB), minimization of terminal power and access by multiple terminals (massive machine type communications, mMTC), or high reliability and low latency (ultra-reliable and low-latency communications, URLLC).
  • The wireless communication module 192 may support a high-frequency band (eg, a mmWave band), for example, to achieve a high data rate.
  • The wireless communication module 192 may support various technologies for securing performance in a high-frequency band, for example, beamforming, massive multiple-input and multiple-output (MIMO), full-dimensional MIMO (FD-MIMO), an array antenna, analog beamforming, or a large-scale antenna.
  • the wireless communication module 192 may support various requirements specified in the electronic device 101 , an external electronic device (eg, the electronic device 104 ), or a network system (eg, the second network 199 ).
  • The wireless communication module 192 may support a peak data rate (eg, 20 Gbps or more) for realizing eMBB, loss coverage (eg, 164 dB or less) for realizing mMTC, or U-plane latency (eg, 0.5 ms or less for each of downlink (DL) and uplink (UL), or a round trip of 1 ms or less) for realizing URLLC.
  • the antenna module 197 may transmit or receive a signal or power to the outside (eg, an external electronic device).
  • the antenna module 197 may include an antenna including a conductor formed on a substrate (eg, a PCB) or a radiator formed of a conductive pattern.
  • The antenna module 197 may include a plurality of antennas (eg, an array antenna). In this case, at least one antenna suitable for a communication method used in a communication network such as the first network 198 or the second network 199 may be selected from the plurality of antennas by, for example, the communication module 190. A signal or power may be transmitted or received between the communication module 190 and an external electronic device through the selected at least one antenna. According to some embodiments, other components (eg, a radio frequency integrated circuit (RFIC)) in addition to the radiator may be additionally formed as part of the antenna module 197.
  • the antenna module 197 may form a mmWave antenna module.
  • The mmWave antenna module may include: a printed circuit board; an RFIC disposed on or adjacent to a first surface (eg, the bottom surface) of the printed circuit board and capable of supporting a designated high-frequency band (eg, a mmWave band); and a plurality of antennas (eg, an array antenna) disposed on or adjacent to a second surface (eg, the top or side surface) of the printed circuit board and capable of transmitting or receiving signals of the designated high-frequency band.
  • At least some of the above components may be connected to each other through a communication method between peripheral devices (eg, a bus, general purpose input and output (GPIO), serial peripheral interface (SPI), or mobile industry processor interface (MIPI)) and may exchange signals (eg, commands or data) with each other.
  • the command or data may be transmitted or received between the electronic device 101 and the external electronic device 104 through the server 108 connected to the second network 199 .
  • Each of the external electronic devices 102 or 104 may be the same as or different from the electronic device 101 .
  • all or part of the operations performed by the electronic device 101 may be executed by one or more external electronic devices 102 , 104 , or 108 .
  • For example, when the electronic device 101 needs to perform a function or a service, the electronic device 101, instead of or in addition to executing the function or service itself, may request one or more external electronic devices to perform at least a part of the function or service.
  • One or more external electronic devices that have received the request may execute at least a part of the requested function or service, or an additional function or service related to the request, and transmit a result of the execution to the electronic device 101 .
  • The electronic device 101 may process the result, as is or additionally, and provide it as at least part of a response to the request.
  • To this end, for example, cloud computing, distributed computing, mobile edge computing (MEC), or client-server computing technology may be used.
  • the electronic device 101 may provide an ultra-low latency service using, for example, distributed computing or mobile edge computing.
  • the external electronic device 104 may include an Internet of things (IoT) device.
  • Server 108 may be an intelligent server using machine learning and/or neural networks.
  • the external electronic device 104 or the server 108 may be included in the second network 199 .
  • The electronic device 101 may be applied to intelligent services (eg, smart home, smart city, smart car, or healthcare) based on 5G communication technology and IoT-related technology.
  • FIG. 2 is a block diagram of an electronic device according to an exemplary embodiment.
  • an electronic device 201 (eg, the electronic device 101 of FIG. 1 ) according to an embodiment of the present invention may be a wearable electronic device that can be worn on a part of a user's body.
  • For example, the electronic device 201 may include a head mounted device (HMD), smart glasses (or augmented reality (AR) glasses), or a virtual reality (VR) device.
  • Referring to FIG. 2, the electronic device 201 may include a gaze tracking module 210, a driving module 215, a processor 220 (eg, the processor 120 of FIG. 1), a memory 230 (eg, the memory 130 of FIG. 1), a display module 260 (eg, the display module 160 of FIG. 1), a sensor module 276 (eg, the sensor module 176 of FIG. 1), a camera module 280 (eg, the camera module 180 of FIG. 1), and a communication module 290 (eg, the communication module 190 of FIG. 1).
  • the gaze tracking module 210 may track the gaze of the user.
  • the gaze tracking module 210 may track the gaze of the user by tracking the movement of the user's eyes (eg, pupils).
  • the eye tracking module 210 may transmit a tracking result to the processor 220 .
  • the eye tracking module 210 may periodically/aperiodically transmit the tracking result to the processor 220 .
  • According to an embodiment, when a gaze change is detected based on the tracking result (eg, the eyes move more than a reference value while the head is not moving), the gaze tracking module 210 may transmit information indicating that the gaze has changed to the processor 220.
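The gaze-change condition described above (the eyes moving more than a reference value while the head stays still) can be sketched as a simple predicate; the threshold values here are illustrative assumptions, not values from the patent:

```python
def gaze_changed(eye_delta_deg: float, head_delta_deg: float,
                 eye_threshold_deg: float = 5.0,
                 head_threshold_deg: float = 1.0) -> bool:
    """Report a gaze change only when the eyes moved beyond the reference
    value (eye_threshold_deg) while the head stayed nearly still
    (head motion below head_threshold_deg), as described above."""
    return (eye_delta_deg > eye_threshold_deg
            and head_delta_deg < head_threshold_deg)
```

A gaze tracking module built this way would notify the processor only on a `True` result, avoiding camera motion when the whole head turns.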
  • the eye tracking module 210 may include at least one of a camera, a vertical cavity surface emitting laser (VCSEL), an infrared sensor, and/or a photodiode.
  • the photodiode may include a PIN photodiode or an avalanche photodiode (APD).
  • the photodiode may be referred to as a photo detector or a photo sensor.
  • the driving module 215 may change (adjust) the shooting direction of the camera module 280 under the control of the processor 220 .
  • the driving module 215 may change (adjust) the shooting direction by rotating the camera module 280 upward, downward, left, or right under the control of the processor 220 .
  • According to an embodiment, the driving module 215 may include a first actuator (eg, a first motor) that rotates the camera module 280 in the vertical direction and a second actuator (eg, a second motor) that rotates the camera module 280 in the horizontal direction.
  • the driving module 215 may be used to compensate for vibration under the control of the processor 220 .
  • For example, when a movement (eg, shaking) of the electronic device 201 is detected, the driving module 215 may move the camera module 280 in a direction opposite to the detected movement, under the control of the processor 220.
  • the driving module 215 may include only one of the first actuator and the second actuator.
  • For example, the electronic device 201 may correct vertical shaking using the first actuator of the driving module 215 under the control of the processor 220, and may correct horizontal shaking by software (eg, a digital image stabilization (DIS) correction function).
  • Alternatively, the electronic device 201 may correct horizontal shaking using the second actuator of the driving module 215 under the control of the processor 220, and may correct vertical shaking by software (eg, the DIS correction function).
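The hybrid correction scheme above, in which one axis is corrected mechanically by an actuator and the other by a software DIS function, might be sketched as follows; the pixel-based interface and the sign convention are assumptions for illustration:

```python
def stabilize(shake_x_px: float, shake_y_px: float,
              use_vertical_actuator: bool):
    """Split shake correction between the actuator and software DIS:
    the actuator moves the camera opposite to the detected motion on its
    axis, and the remaining axis is corrected by shifting the image crop
    window (digital image stabilization)."""
    if use_vertical_actuator:
        actuator_move = -shake_y_px          # mechanical correction (vertical)
        crop_shift = (-shake_x_px, 0.0)      # DIS handles horizontal
    else:
        actuator_move = -shake_x_px          # mechanical correction (horizontal)
        crop_shift = (0.0, -shake_y_px)      # DIS handles vertical
    return actuator_move, crop_shift
```

Using only one actuator plus DIS, as the text notes, can reduce the weight and size of the driving module.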
  • Through this, the weight and/or size of the driving module 215 of the electronic device 201 may be reduced.
  • the driving module 215 may perform a 3-axis rotational motion as a rotational motion related to the movement of the camera module 280 under the control of the processor 220 .
  • the three-axis rotational motion may consist of a rolling motion, a pitching motion, or a yawing motion.
  • a motion about one rotation axis perpendicular to the optical axis (not shown) of the camera module 280 may be defined as a rolling motion, and a motion about another rotation axis perpendicular to the optical axis may be defined as a pitching motion.
  • a motion about a rotation axis substantially parallel to (or coincident with) the optical axis may be defined as a yawing motion.
  • the rotation axis parallel to the optical axis may be orthogonal to the plane formed by the rotation axes of the rolling motion and the pitching motion.
  • the driving module 215 may include a plurality of actuators (eg, motors) for 3-axis rotational driving.
  • the driving module 215 may include a magnet and a coil for 3-axis rotational driving.
  • the driving module 215 may rotate the camera module 280 using an electromagnetic force (eg, a Lorentz force) generated between the magnet and the coil.
  • the driving module 215 may perform 3-axis rotational driving using an actuator (eg, a motor), a magnet, and a coil under the control of the processor 220 .
  • the driving module 215 may perform a rolling motion and a pitching motion using a magnet and a coil, and may perform a yaw motion using an actuator.
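The mixed drive described above, with roll and pitch driven electromagnetically and yaw by a dedicated actuator, can be sketched as a simple routing of a desired 3-axis rotation to per-channel commands. The gain constants, units, and `DriveCommand` fields below are illustrative assumptions, not values from the disclosure.

```python
from dataclasses import dataclass

@dataclass
class DriveCommand:
    coil_roll_current: float   # coil current for the rolling motion (assumed units)
    coil_pitch_current: float  # coil current for the pitching motion
    motor_yaw_steps: int       # actuator steps for the yawing motion

# Assumed gains converting degrees to coil current / motor steps.
ROLL_GAIN = 0.05
PITCH_GAIN = 0.05
YAW_STEPS_PER_DEG = 10

def route_rotation(roll_deg: float, pitch_deg: float, yaw_deg: float) -> DriveCommand:
    """Route a desired 3-axis rotation to the magnet/coil pair (roll, pitch)
    and the actuator (yaw)."""
    return DriveCommand(
        coil_roll_current=roll_deg * ROLL_GAIN,
        coil_pitch_current=pitch_deg * PITCH_GAIN,
        motor_yaw_steps=round(yaw_deg * YAW_STEPS_PER_DEG),
    )
```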
  • the processor 220 may control the overall operation of the electronic device 201 .
  • the processor 220 may control each component of the electronic device 201 .
  • the processor 220 may control each component based on a command or execution of instructions stored in the memory 230 and perform various operations.
  • the processor 220 may control operations of adjusting (controlling) the shooting direction of the camera module 280 or enlarging a part of an image based on at least one of a user's gaze, a designated gesture, and a movement of the electronic device.
  • the processor 220 may obtain an image in the direction the user is looking through the camera module 280, and may output the obtained image through the display module 260 or transmit it through the communication module 290 to an external electronic device (eg, the electronic device 102, the electronic device 104 of FIG. 1, or the server 108).
  • the processor 220 may identify the user's viewing direction (hereinafter, the gaze direction) based on eye tracking through the gaze tracking module 210 in order to accurately obtain an image in the user's viewing direction.
  • the processor 220 may control the driving module 215 to adjust the photographing direction of the camera module 280 based on the gaze direction.
  • the processor 220 may adjust the shooting direction of the camera module 280 based on the specified gesture.
  • the processor 220 may control the driving module 215 to adjust the photographing direction of the camera module 280 based on the movement of the electronic device and/or the user's gaze.
  • the processor 220 may generate control data (eg, a motor output value) for controlling the driving module 215 (eg, rolling, pitching, or yawing) based on information received from at least one of the eye tracking module 210, the camera module 280, and the sensor module 276 (eg, a user's gaze, a designated gesture, or a sensor value related to a movement of the electronic device), and may transmit the generated control data to the driving module 215. The control data may be changed to another value (eg, a voltage) by a driver (not shown) of the driving module 215.
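As a loose illustration of the control-data path above, the sketch below fuses the three input sources into a per-axis motor output value, then models the driver's conversion to a drive voltage. The function names, the simple additive fusion, and all constants are assumptions for illustration only.

```python
def make_control_data(gaze_delta_deg, gesture_delta_deg, motion_delta_deg,
                      steps_per_deg: float = 8.0):
    """Combine (pan, tilt) deltas from gaze, gesture, and device motion into
    motor output values. Device motion is subtracted so the camera counter-
    rotates against it (shake compensation)."""
    pan = gaze_delta_deg[0] + gesture_delta_deg[0] - motion_delta_deg[0]
    tilt = gaze_delta_deg[1] + gesture_delta_deg[1] - motion_delta_deg[1]
    return {"pan_steps": round(pan * steps_per_deg),
            "tilt_steps": round(tilt * steps_per_deg)}

def driver_to_voltage(steps: int, volts_per_step: float = 0.01) -> float:
    """Model the driver converting a motor output value into another
    quantity such as a drive voltage."""
    return steps * volts_per_step
```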
  • the processor 220 may enlarge and provide a part of the image including the point at which the user is gazing. A detailed description thereof will be given later with reference to FIGS. 3A to 7B.
  • the processor 220 may provide a virtual reality object and/or an augmented reality object related to an image obtained through the camera module 280 .
  • the processor 220 may overlay a virtual reality object and/or an augmented reality object on the obtained image.
  • the memory 230 is located inside the housing of the electronic device 201 and may be operatively connected to the processor 220 .
  • the memory 230 stores various programs, and may store data generated while executing the various programs or downloaded data.
  • the memory 230 may store various commands and/or instructions for operating the processor 220 .
  • the memory 230 may store various program codes, commands, or instructions for adjusting (controlling) the photographing direction of the camera module 280 or enlarging and providing a part of an image based on at least one of the user's gaze (eg, gaze direction), a specified gesture, and/or a movement of the electronic device.
  • the memory 230 may store an initial sensed value of the sensor module 276 measured when the electronic device 201 is worn as reference information. The stored reference information may be used by the processor 220 to determine whether the electronic device 201 moves.
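The movement check described above can be sketched as a comparison of the current sensor reading against the reference value stored at wear time. The per-axis tuple shape and the threshold are illustrative assumptions.

```python
def device_moved(current, reference, threshold: float = 0.15) -> bool:
    """Compare the current sensed value (eg, a 3-axis reading) against the
    reference captured when the device was worn; report movement when any
    axis deviates beyond the threshold."""
    return any(abs(c - r) > threshold for c, r in zip(current, reference))
```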
  • the display module 260 may be exposed through a first surface (eg, a front surface) of the housing of the electronic device 201 and may provide an output function.
  • the display module 260 may display an image acquired through the camera module 280 and/or a virtual reality object or augmented reality object related to the acquired image.
  • the sensor module 276 may detect a movement of the electronic device 201 .
  • the sensor module 276 may detect a movement of the electronic device 201 due to a movement of a body (eg, head) of a user wearing the electronic device 201 .
  • the sensor module 276 may transmit a sensing value that is changed (measured) according to the movement of the electronic device 201 to the processor 220 .
  • the camera module 280 may be located in a housing of the electronic device 201 .
  • the camera module 280 may be positioned in the housing of the electronic device 201 to be rotatable up, down, left, and right by the driving module 215 .
  • the fastening structure of the camera module 280 will be described later with reference to FIGS. 8A to 8D .
  • the camera module 280 may be a wide-angle camera capable of photographing a subject and having a viewing angle substantially equal to or similar to a user's viewing angle. Also, the camera module 280 may support a zoom function.
  • the electronic device 201 may include a plurality of camera modules 280 each having different properties (eg, angle of view) or functions.
  • the camera module 280 may include a plurality of cameras, and each camera may include a lens having a different angle of view.
  • the electronic device 201 may control to change the angle of view of the camera module 280 based on the user's selection.
  • at least one of the plurality of camera modules 280 may be a wide-angle camera, and at least the other may be a telephoto camera.
  • at least one of the plurality of camera modules 280 may be a front camera, and at least the other may be a rear camera.
  • the plurality of camera modules 280 may include at least one of a wide-angle camera, a telephoto camera, and an IR (infrared) camera (eg, a time of flight (TOF) camera, a structured light camera).
  • the IR camera may be operated as at least a part of the sensor module 276 .
  • the TOF camera may be operated as at least a part of the sensor module 276 for detecting the distance to the subject.
  • the communication module 290 is located in the housing of the electronic device 201 and may perform wired communication and/or wireless communication.
  • the communication module 290 may include at least one wireless communication circuit (eg, mobile communication, WiFi, LiFi, or Bluetooth) and/or at least one wired communication circuit (eg, a high definition multimedia interface (HDMI), a display port (DP), or a universal serial bus (USB)).
  • the communication module 290 may transmit the image acquired by the camera module 280 to at least one external electronic device (eg, the electronic device 102 of FIG. 1, the electronic device 104, or the server 108).
  • FIGS. 3A to 3C are examples of controlling the photographing direction of an electronic device based on a user's gaze according to an embodiment.
  • a processor (eg, the processors 120 and 220 of FIGS. 1 and 2) of an electronic device (eg, the electronic devices 101 and 201 of FIGS. 1 and 2) according to an embodiment of the present document may acquire an image through the camera module 380 (eg, the camera modules 180 and 280 of FIGS. 1 and 2).
  • the processor may acquire an image corresponding to (eg, matching) a direction in which a user sees (hereinafter, a gaze direction) through the camera module 380 .
  • the processor may track the user's gaze through the gaze tracking module (eg, the gaze tracking module 210 of FIG. 2). For example, the processor may track (or monitor) the gaze to determine whether the gaze direction is changed. According to an embodiment, the processor may track the gaze in a state in which the electronic device does not move (eg, there is no movement of the body wearing the electronic device) and determine whether the gaze direction is changed. According to an embodiment, in operation 303, the processor may drive the camera module 380 and/or the gaze tracking module (eg, the gaze tracking module 210 of FIG. 2) in a low-power mode to analyze an image (eg, to track the user's eyes).
  • the camera module 380 and/or the eye tracking module may operate in a software standby mode when the camera does not operate (eg, operates in the background).
  • the camera module 380 and/or the eye tracking module may be driven only when the processor performs a specified function (eg, complex electronic circuit analysis or telemedicine).
  • the camera module 380 and/or the eye tracking module may be driven in a low-power mode using at least one of, for example, a method of reducing the number of frames per second (FPS) of input image data, a method using sub-sampling, or a method of reducing data bits.
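The three low-power reductions named above (FPS reduction, sub-sampling, data-bit reduction) can be sketched together on a toy grayscale frame stream. The frame representation (lists of 8-bit pixel rows) and all parameter defaults are illustrative assumptions.

```python
def to_low_power(frames, keep_every_n: int = 3, spatial_step: int = 2,
                 bit_shift: int = 2):
    """Apply the three reductions to a stream of grayscale frames
    (each a list of rows of 8-bit pixel values):
      1. drop frames to cut the effective FPS,
      2. spatially sub-sample rows and columns,
      3. shift out low-order bits to reduce the data width."""
    out = []
    for i, frame in enumerate(frames):
        if i % keep_every_n:                                  # 1. FPS reduction
            continue
        small = [row[::spatial_step] for row in frame[::spatial_step]]  # 2. sub-sampling
        out.append([[p >> bit_shift for p in row] for row in small])    # 3. bit reduction
    return out
```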
  • the processor may adjust the photographing direction of the camera module 380 based on the tracked gaze. For example, as shown in FIG. 3C, when it is confirmed that the user's gaze direction has changed based on the tracked gaze, the processor may adjust the shooting direction of the camera module 380 (eg, rotate the camera module 380 up, down, left, or right) by using the driving module (eg, the driving module 215 of FIG. 2) to obtain an image corresponding to (eg, matching) the changed gaze direction.
  • the processor may change the field of view of the camera module 380 from the first field of view 31a to the second field of view 31b.
  • the processor may compare the identified gaze direction with the photographing direction to determine whether they are different, and if they are different, may drive the driving module (eg, the driving module 215 of FIG. 2) to adjust the shooting direction of the camera module 380 (eg, rotate the camera module 380 up, down, left, or right) so as to obtain an image corresponding to (eg, matching) the user's gaze direction.
  • when the tracked gaze deviates from a reference gaze (eg, the gaze when the user looks at the front, that is, when the pupil is centered) by more than a reference value, the processor may adjust the photographing direction of the camera module 380 through the driving module (eg, rotate the camera module 380 in the up, down, left, or right direction) to obtain an image corresponding to (eg, matching) the user's viewing direction.
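The deviation-from-reference test above can be sketched as an angle comparison between the tracked gaze and the front-facing reference gaze. The vector representation and the 5-degree default threshold are illustrative assumptions.

```python
import math

def needs_adjustment(gaze_dir, reference_dir, threshold_deg: float = 5.0) -> bool:
    """Return True when the tracked gaze (a 3-D direction vector) deviates
    from the reference (front-facing) gaze by more than the threshold angle,
    ie, when the camera's shooting direction should be adjusted."""
    dot = sum(g * r for g, r in zip(gaze_dir, reference_dir))
    norm = (math.sqrt(sum(g * g for g in gaze_dir)) *
            math.sqrt(sum(r * r for r in reference_dir)))
    angle = math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))
    return angle > threshold_deg
```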
  • although FIG. 3C shows an example in which the gaze direction is changed to the bottom, the user may change the gaze to the top, left, or right side, and the processor may control the driving module to change the shooting direction of the camera module 380 to the top, left, or right based on the changed gaze direction.
  • FIG. 4 is a flowchart illustrating a method of controlling a photographing direction of an electronic device based on a user's gaze, according to an exemplary embodiment.
  • a processor (eg, the processors 120 and 220 of FIGS. 1 and 2) of the electronic device (eg, the electronic devices 101 and 201 of FIGS. 1 and 2) according to an embodiment of the present document may activate (eg, turn on) a camera module (eg, the camera modules 180, 280, and 380 of FIGS. 1 to 3C) in operation 401.
  • the processor may activate the camera module in response to a user's request or execution of a specified function (or application).
  • the processor may perform calibration of the photographing direction of the camera module. For example, the processor may correct the photographing direction of the camera module to correspond to (eg, match) a reference direction (eg, a direction when a user looks at the front). This is because the photographing direction of the camera module may be different depending on the wearing state (eg, worn at a position higher than the reference position or worn at a lower position). For example, the processor may output a guide message inducing the user to look in a reference direction (eg, the front).
  • the processor may acquire an image through the camera module.
  • the processor may provide a virtual reality (VR) object and/or an augmented reality (AR) object.
  • the processor may extract at least one object from the obtained image, and provide a virtual reality object and/or augmented reality object related to the extracted object (eg, displayed by overlaying the image).
  • the processor may control the communication module (eg, the communication modules 190 and 290 of FIGS. 1 and 2 ) to transmit the acquired image to an external electronic device.
  • the processor may track the user's gaze through the gaze tracking module (eg, the gaze tracking module 210 of FIG. 2 ).
  • the gaze tracking module eg, the gaze tracking module 210 of FIG. 2 .
  • the processor may determine whether the gaze direction is changed. For example, the processor may determine whether the gaze direction is changed in a state in which the electronic device does not move (eg, there is no movement of a body wearing the electronic device).
  • the processor may proceed to operation 415, which will be described later.
  • the processor may adjust the photographing direction of the camera module based on the changed gaze direction in operation 413 .
  • the processor may use a driving module (eg, the driving module 215 of FIG. 2) to adjust the shooting direction of the camera module (eg, rotate the camera module in the up, down, left, or right direction) to obtain an image corresponding to (eg, matching) the changed gaze direction.
  • the processor may determine whether the camera module is deactivated. If the camera module is not deactivated as a result of checking operation 415, the processor may return to operation 409 to perform the above-described operations.
  • the processor may end control of the photographing direction of the camera module.
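The flow of FIG. 4 (activate, calibrate, then repeatedly acquire, track, and adjust until the camera module is deactivated) can be sketched as a control loop. The `camera`, `tracker`, and `driver` objects and their method names are hypothetical stand-ins for the camera module, gaze tracking module, and driving module.

```python
def run_gaze_loop(camera, tracker, driver, max_iterations: int = 100):
    """Mirror FIG. 4: activate and calibrate, then loop over acquire /
    track / adjust until the camera module is deactivated."""
    camera.activate()                            # operation 401: activate camera
    driver.calibrate(camera)                     # calibrate shooting direction
    for _ in range(max_iterations):
        if not camera.is_active():               # operation 415: deactivated?
            break
        camera.acquire()                         # acquire an image
        gaze = tracker.track()                   # track the user's gaze
        if gaze != driver.current_direction():   # gaze direction changed?
            driver.point_at(gaze)                # operation 413: adjust direction
    return driver.current_direction()
```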
  • FIG. 5 is a flowchart illustrating a method of controlling a photographing direction of an electronic device based on a movement of the electronic device and a user's gaze, according to an exemplary embodiment.
  • a processor (eg, the processors 120 and 220 of FIGS. 1 and 2) of an electronic device (eg, the electronic devices 101 and 201 of FIGS. 1 and 2) according to an embodiment of the present document may activate a camera module (eg, the camera modules 180, 280, and 380 of FIGS. 1 to 3C) in operation 501.
  • the processor may activate the camera module in response to a user's request or execution of a specified function (or application).
  • the processor may perform calibration of a sensor module (eg, the sensor modules 176 and 276 of FIGS. 1 and 2 ).
  • the processor may calibrate the sensor module by receiving a sensed value detected from at least one sensor (eg, an acceleration sensor, a geomagnetic sensor, or a gyro sensor) that detects a movement of the electronic device when the electronic device is worn on the user's body (eg, head), and storing the sensed value as reference information in a memory (eg, the memories 130 and 230 of FIGS. 1 and 2).
  • correction of the gaze tracking module (eg, the gaze tracking module 210 of FIG. 2 ) may be performed together.
  • the processor may acquire an image through the camera module.
  • the processor may provide a virtual reality (VR) object and/or an augmented reality (AR) object.
  • the processor may extract at least one object from the acquired image, and provide a virtual reality object and/or augmented reality object related to the extracted object (eg, displayed by overlaying the acquired image).
  • the processor may control the communication module (eg, the communication modules 190 and 290 of FIGS. 1 and 2 ) to transmit the acquired image to an external electronic device.
  • the processor may track the movement of the electronic device and the gaze of the user.
  • the processor may track the movement of the electronic device through the sensor module, and may track the user's gaze through the eye tracking module (eg, the eye tracking module 210 of FIG. 2 ).
  • the processor may adjust the photographing direction of the camera module based on the tracking result. For example, when the user's gaze direction and the camera's photographing direction differ due to a movement of the electronic device and/or a movement of the gaze, the processor may use the driving module (eg, the driving module 215 of FIG. 2) to adjust the photographing direction of the camera module (eg, rotate the camera module in the up, down, left, or right direction) to obtain an image corresponding to (eg, matching) the user's gaze direction. According to an embodiment, when the user's gaze direction and the photographing direction of the camera module are not different, the processor may maintain acquisition of the image through the camera module.
  • the processor may perform shake compensation (eg, optical image stabilization (OIS)) by moving the camera module in a direction opposite to the detected movement.
  • the processor may perform an auto focus (AF) function that adjusts the focal length of the camera module.
  • the movement of the electronic device may be determined based on information obtained by using a sensor module (eg, the sensor modules 176 and 276 of FIGS. 1 and 2, or a TOF camera), and based on the determined movement, the auto focus function may be performed by controlling the lens of the camera module.
  • FIG. 6A is a flowchart illustrating a method of controlling a photographing direction of an electronic device based on a user's gesture according to an embodiment
  • FIG. 6B is an exemplary diagram of controlling a photographing direction of an electronic device based on a user's gesture according to an embodiment.
  • a processor (eg, the processors 120 and 220 of FIGS. 1 and 2) of an electronic device (eg, the electronic devices 101 and 201 of FIGS. 1 and 2) according to an embodiment of the present document may determine whether a designated gesture is detected in operation 601.
  • the designated gesture may be a gesture set to change the shooting direction of the camera module (eg, the camera modules 180, 280, 380 of FIGS. 1 to 3C).
  • when the designated gesture is not detected, the processor may maintain operation 601.
  • the processor may adjust a photographing direction of the camera module based on the detected gesture in operation 603 .
  • the processor may adjust the shooting direction of the camera module by using the driving module (eg, the driving module 215 of FIG. 2) to obtain an image corresponding to (eg, matching) the direction in which the specified gesture is detected (eg, rotate the camera module in an upward, downward, left, and/or right direction).
  • when a gesture indicating a specific direction is detected, the processor may control the driving module to adjust the shooting direction of the camera module to the specific direction.
  • the processor may detect the user's hand from the image acquired through the camera module, track the detected hand, and, when the hand disappears from the image, adjust the shooting direction of the camera module to the direction in which the hand disappeared. Through this, the user of the electronic device may easily check the real world in a different direction through the camera module without looking away while having a conversation with a counterpart facing the user.
  • when the specified gesture is detected, the processor may check whether the user's gaze moves (changes) in the direction in which the gesture was detected, and may adjust the shooting direction of the camera module when the user's gaze moves (changes) in that direction.
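The hand-tracking variant above, in which the camera is redirected toward the edge where the tracked hand left the frame, can be sketched as follows. The frame dimensions, the one-third edge bands, and the pan/tilt labels are illustrative assumptions.

```python
def track_hand_and_redirect(hand_positions, frame_width: int = 640,
                            frame_height: int = 480):
    """Follow the detected hand across frames; when it disappears, return a
    (pan, tilt) hint for the direction in which it left the frame.

    hand_positions: per-frame (x, y) hand centers, with None once the hand
    is no longer detected. Returns None while the hand stays visible or was
    never seen."""
    last = None
    for pos in hand_positions:
        if pos is None:
            if last is None:
                return None                   # hand was never detected
            x, y = last
            pan = ("left" if x < frame_width / 3
                   else "right" if x > 2 * frame_width / 3 else "center")
            tilt = ("up" if y < frame_height / 3
                    else "down" if y > 2 * frame_height / 3 else "center")
            return (pan, tilt)                # rotate the camera toward this edge
        last = pos
    return None
```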
  • the example of controlling the shooting direction based on the gesture of FIGS. 6A and 6B may be combined with at least one of the embodiments of FIGS. 3A to 5 described above.
  • FIG. 7A is a flowchart illustrating a method of providing an image photographed based on a user's gaze according to an embodiment
  • FIG. 7B is an exemplary diagram of providing an image photographed based on a user's gaze according to an embodiment.
  • a processor (eg, the processors 120 and 220 of FIGS. 1 and 2) of an electronic device (eg, the electronic devices 101 and 201 of FIGS. 1 and 2) according to an embodiment of the present document may determine whether the user gazes at a specific point in operation 701.
  • the processor may track the user's gaze through the gaze tracking module (eg, the gaze tracking module 210 of FIG. 2 ) to determine whether the user gazes at a specific point.
  • when the user does not gaze at a specific point, the processor may maintain operation 701.
  • the processor may enlarge and provide some images including the specific point among the images acquired through the camera module in operation 703 .
  • the processor may magnify a partial image 713 including the specific point 711 and display the enlarged image 715 on a display (eg, the display modules 160 and 260 of FIGS. 1 and 2), or transmit the enlarged image 715 to an external electronic device (eg, the electronic devices 102 and 104 of FIG. 1) or a server (eg, the server 108).
  • when the camera module supports a zoom function (eg, zoom-in or zoom-out), the processor may zoom in on the specific point and provide a zoomed-in image.
  • for example, when the user gazes at a specific point of a complex electronic circuit including small-sized parts, the processor may enlarge and provide a part of the image including the specific point, or may provide a zoomed-in image using the zoom function of the camera module.
  • as another example, in response to a request from a remote doctor to show a specific part of a patient in detail during diagnosis, when a doctor wearing the electronic device gazes at the specific part, the specific part may be enlarged or zoomed in and transmitted to an external electronic device that the remote doctor is viewing.
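The digital enlargement around a gaze point can be sketched as a crop-and-upscale over a toy 2-D image. The list-of-lists image representation, the crop radius, and the nearest-neighbour upscale are illustrative assumptions standing in for the enlargement described above.

```python
def magnify_at(image, point, crop_half: int = 1, scale: int = 2):
    """Crop a window of the 2-D image (list of pixel rows) around the gaze
    point (x, y) and enlarge it by pixel repetition."""
    px, py = point
    h, w = len(image), len(image[0])
    y0, y1 = max(0, py - crop_half), min(h, py + crop_half + 1)
    x0, x1 = max(0, px - crop_half), min(w, px + crop_half + 1)
    crop = [row[x0:x1] for row in image[y0:y1]]
    # nearest-neighbour upscale: repeat every pixel and every row `scale` times
    return [[p for p in row for _ in range(scale)]
            for row in crop for _ in range(scale)]
```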
  • FIGS. 7A and 7B may be combined with at least one of the embodiments of FIGS. 3A to 6B .
  • FIG. 8A is a diagram illustrating an example in which a camera module is fastened to an electronic device according to an embodiment
  • FIG. 8B is a diagram illustrating an example of an assembly for fastening the camera module to a housing of the electronic device according to an embodiment
  • FIG. 8C is an exploded perspective view of the assembly of FIG. 8B
  • FIG. 8D is a diagram illustrating an example in which the assembly is fastened to a housing of an electronic device according to an embodiment.
  • the camera module 880 (eg, the camera modules 180, 280, and 380 of FIGS. 1 to 3B) according to an embodiment of the present document may be fastened to one side of the housing 810 through the assembly 800 so as to be rotatable (eg, rotatable in up, down, left, and right directions).
  • the camera module 880 may be fastened to one side (eg, an upper surface) of the housing 810 by the assembly 800 in a suspended form, and may be located in a hole area of the housing 810 .
  • the assembly 800 may include a bracket 801 , a damper 802 , a damper bar 803 , and a protection window 804 .
  • the bracket 801 may be coupled to the camera module 880 .
  • the bracket 801 may be formed of an injection molding material.
  • the bracket 801 may be moved vertically (eg, rotated) by a driving module (eg, the driving module 215 ).
  • the damper 802 may be fastened to at least one protrusion 801a formed on a side surface of the bracket 801 .
  • the damper 802 may be formed of a rubber material in order to minimize transmission of vibration generated by the user's movement to the camera module 880.
  • the damper bar 803 (eg, the first damper bar 803-1 and the second damper bar 803-2) may be connected to the at least one damper 802 and fastened to the housing 810.
  • the damper bar 803 may include a first connection part 803a to which the damper 802 is connected and a second connection part 803b to be connected to the housing 810 .
  • the first connecting portion 803a and the second connecting portion 803b of the damper bar 803 may be formed to have a step difference.
  • the damper bar 803 may be fastened to the housing 810 by a fastening member (eg, a screw) 805 , as shown in FIG. 8D .
  • when the damper bar 803 is fastened to the housing 810, the second connection part 803b may be adjacent to (or in contact with) the housing 810, and the first connection part 803a may be spaced apart from the housing 810.
  • the damper bar 803 may be formed of an injection-molded material or a steel material.
  • the damper bar 803 may include a first damper bar 803-1 to which the first damper 802a and the second damper 802b are connected, and/or a second damper bar 803-2 to which the third damper 802c and the fourth damper 802d are connected.
  • the damper bar 803 may include a first damper bar to which the first damper 802a is connected, a second damper bar to which the second damper 802b is connected, a third damper bar to which the third damper 802c is connected, and/or a fourth damper bar to which the fourth damper 802d is connected.
  • the protection window 804 may protect the camera module 880 (eg, a lens).
  • the protective window 804 may be formed of a transparent material (eg, glass).
  • although dampers 802 are present on both sides of the bracket 801 in FIGS. 8A to 8D, the present invention is not limited thereto.
  • one damper 802 may exist on both sides of the bracket 801 .
  • the horizontal (X-axis) dimension 81 and the vertical (Y-axis) dimension 82 of the assembly 800 including the camera module 880 may be about 10 mm, and the height (Z-axis) 83 may be about 13 mm, but are not limited thereto.
  • An electronic device (eg, the electronic devices 101 and 201 of FIGS. 1 and 2) according to an embodiment of the present document includes: a housing (eg, the housing 810 of FIG. 8A); a gaze tracking module (eg, the gaze tracking module 210 of FIG. 2) located in the housing; a camera module (eg, the camera module 180 of FIG. 1, the camera module 280 of FIG. 2, or the camera module 380 of FIGS. 3B and 3C) located in the housing and photographing a subject; a driving module (eg, the driving module 215 of FIG. 2) for controlling the shooting direction of the camera module; a processor (eg, the processors 120 and 220 of FIGS. 1 and 2); and a memory.
  • the memory may store instructions for acquiring an image using the camera module, tracking a user's gaze using the gaze tracking module while acquiring the image, and controlling the driving module to adjust the shooting direction of the camera module based on the tracked gaze of the user.
  • the memory may further store instructions for providing at least one of an augmented reality object and a virtual reality object corresponding to at least one object included in the acquired image.
  • the memory may further store instructions for calibrating a photographing direction of the camera module to match a reference direction before acquiring the image.
  • the electronic device may further include a sensor module (eg, the sensor module 176 of FIG. 1 , the sensor module 276 of FIG. 2 ) for detecting the movement of the electronic device, and the memory may further store instructions for controlling the driving module to perform shake correction based on the sensed movement of the electronic device.
  • the memory may further store instructions for checking whether the movement of the electronic device sensed through the sensor module is within a specified range, and controlling the driving module to perform the shake compensation when the movement of the electronic device is within the specified range.
  • the electronic device may further include a communication module (eg, the communication module 190 of FIG. 1 and the communication module 290 of FIG. 2), and the memory may further store instructions for transmitting the acquired image to an external electronic device through the communication module.
  • the eye tracking module may include at least one of another camera for eye tracking, a vertical-cavity surface-emitting laser (VCSEL), an infrared sensor, or a photodiode.
  • the driving module may include: a first actuator configured to rotate the camera module in a vertical direction; and a second actuator that rotates the camera module in a left and right direction.
  • the memory may further store instructions for, in response to detecting that the user gazes at a specific point, magnifying and providing a partial area of the image including the specific point.
  • the memory may further store instructions for controlling the driving module to adjust the shooting direction of the camera module in response to detection of a preset gesture to change the shooting direction of the camera module.
  • a method of controlling a photographing direction of an electronic device includes: acquiring an image using a camera module (eg, the camera module 180 of FIG. 1 , the camera module 280 of FIG. 2 , and the camera module 380 of FIGS. 3B and 3C); tracking the user's gaze using a gaze tracking module (eg, the gaze tracking module 210 of FIG. 2 ) while acquiring the image; and adjusting the photographing direction of the camera module through a driving module (eg, the driving module 215 of FIG. 2 ) based on the tracked user's gaze.
  • the method may further include providing at least one of an augmented reality object or a virtual reality object corresponding to at least one object included in the acquired image.
  • the method may further include calibrating the photographing direction of the camera module to match a reference direction before acquiring the image.
  • the method may further include: detecting a movement of the electronic device through a sensor module (eg, the sensor module 176 of FIG. 1 and the sensor module 276 of FIG. 2 ); and performing shake compensation by controlling the driving module based on the detected movement of the electronic device.
  • the method may further include checking whether the movement of the electronic device sensed through the sensor module is within a specified range.
  • the method may further include transmitting the obtained image to an external electronic device through a communication module (eg, the communication module 190 of FIG. 1 , the communication module 290 of FIG. 2 ).
  • the operation of tracking the user's gaze may include tracking the user's gaze using at least one of another camera for gaze tracking, a vertical-cavity surface-emitting laser (VCSEL), an infrared sensor, or a photodiode.
  • the operation of adjusting the photographing direction of the camera module through the driving module may include at least one of: adjusting the camera module in the vertical direction through a first actuator that rotates the camera module in the vertical direction; or adjusting the camera module in the left-right direction through a second actuator that rotates the camera module in the left-right direction.
  • the method may further include, in response to detecting that the user gazes at a specific point, enlarging and providing a partial region of the image that includes the specific point.
  • the method may further include: checking whether a preset gesture for changing the shooting direction of the camera module is detected; and adjusting the photographing direction of the camera module through the driving module based on the detected gesture.
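The gaze-triggered magnification described in the method embodiments can be illustrated with a small geometry helper that chooses the crop rectangle around the gazed-at pixel. A minimal sketch under assumptions: the function name, the 2x default zoom, and the frame dimensions are hypothetical; a real device would upscale the returned crop back to the full frame size.

```python
def gaze_zoom_rect(frame_w, frame_h, gaze_x, gaze_y, zoom=2.0):
    """Return the (left, top, width, height) crop rectangle that magnifies
    the area around the gazed-at pixel by `zoom`.

    The rectangle is centered on the gaze point where possible, and clamped
    so that it always stays inside the frame bounds.
    """
    crop_w = int(frame_w / zoom)
    crop_h = int(frame_h / zoom)
    # center on the gaze point, then clamp to keep the crop inside the frame
    left = min(max(gaze_x - crop_w // 2, 0), frame_w - crop_w)
    top = min(max(gaze_y - crop_h // 2, 0), frame_h - crop_h)
    return left, top, crop_w, crop_h
```

Scaling the returned region up to the display resolution yields the enlarged partial region; the clamping ensures that a gaze point near the frame edge still produces a full-sized crop rather than one that spills outside the image.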
  • Electronic devices may be devices of various types.
  • the electronic device may include, for example, a portable communication device (eg, a smart phone), a computer device, a portable multimedia device, a portable medical device, a camera, a wearable device, or a home appliance device.
  • the electronic device according to embodiments of this document is not limited to the above-described devices.
  • terms such as “first” and “second” may be used simply to distinguish a component from other such components, and do not limit the components in other aspects (eg, importance or order). When one (eg, first) component is referred to as being “coupled” or “connected” to another (eg, second) component, with or without the terms “functionally” or “communicatively”, it means that the one component can be connected to the other component directly (eg, by wire), wirelessly, or through a third component.
  • the term “module” used in various embodiments of this document may include a unit implemented in hardware, software, or firmware, and may be used interchangeably with terms such as logic, logic block, component, or circuit.
  • a module may be an integrally formed part, a minimum unit that performs one or more functions, or a part thereof.
  • the module may be implemented in the form of an application-specific integrated circuit (ASIC).
  • various embodiments of this document may be implemented as software (eg, the program 140) including one or more instructions stored in a storage medium readable by a device (eg, the electronic device 101). For example, the processor (eg, the processor 120) of the device may call at least one of the one or more instructions stored in the storage medium and execute it.
  • the one or more instructions may include code generated by a compiler or code executable by an interpreter.
  • the device-readable storage medium may be provided in the form of a non-transitory storage medium.
  • 'non-transitory' only means that the storage medium is a tangible device and does not contain a signal (eg, an electromagnetic wave); the term does not distinguish between a case where data is stored semi-permanently in the storage medium and a case where data is stored temporarily.
  • the method according to various embodiments disclosed in this document may be provided as included in a computer program product.
  • Computer program products may be traded between sellers and buyers as commodities.
  • the computer program product may be distributed in the form of a machine-readable storage medium (eg, a compact disc read only memory (CD-ROM)), or distributed online (eg, downloaded or uploaded) through an application store (eg, Play Store™) or directly between two user devices (eg, smartphones).
  • in the case of online distribution, at least a part of the computer program product may be temporarily stored or temporarily generated in a machine-readable storage medium such as a memory of a manufacturer's server, a server of an application store, or a relay server.
  • each component (eg, module or program) of the above-described components may include a singular or a plurality of entities, and some of the plurality of entities may be separately disposed in other components.
  • one or more components or operations among the above-described corresponding components may be omitted, or one or more other components or operations may be added.
  • according to various embodiments, a plurality of components (eg, modules or programs) may be integrated into a single component. In such a case, the integrated component may perform one or more functions of each of the plurality of components identically or similarly to those performed by the corresponding component among the plurality of components prior to the integration.
  • operations performed by a module, program, or other component may be executed sequentially, in parallel, repeatedly, or heuristically; one or more of the operations may be executed in a different order or omitted; or one or more other operations may be added.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

An embodiment of the present disclosure relates to an electronic device comprising a camera module and a method for controlling a photographing direction thereof. The electronic device comprises: a housing; a gaze tracking module located in the housing; a camera module, located in the housing, for photographing an object; a driving module for controlling a photographing direction of the camera module; a processor operatively connected to the gaze tracking module, the camera module, and the driving module; and a memory operatively connected to the processor, wherein the memory may store instructions that, when executed by the processor, cause the processor to: acquire an image by means of the camera module; track the user's gaze by means of the gaze tracking module while acquiring the image; and control the driving module to adjust the photographing direction of the camera module on the basis of the tracked user's gaze. Various other embodiments are possible.
PCT/KR2021/010513 2020-08-11 2021-08-09 Electronic devices comprising a camera module, and method for controlling photographing direction thereof WO2022035166A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR1020200100312A KR20220019963A (ko) 2020-08-11 2020-08-11 Electronic device including camera module and method for controlling photographing direction thereof
KR10-2020-0100312 2020-08-11

Publications (1)

Publication Number Publication Date
WO2022035166A1 true WO2022035166A1 (fr) 2022-02-17

Family

ID=80248001

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2021/010513 WO2022035166A1 (fr) Electronic devices comprising a camera module, and method for controlling photographing direction thereof

Country Status (2)

Country Link
KR (1) KR20220019963A (fr)
WO (1) WO2022035166A1 (fr)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP4375946A1 * 2022-09-28 2024-05-29 Samsung Electronics Co., Ltd. Augmented reality device and method of identifying an object in an image

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002032212A * 2000-07-14 2002-01-31 Toshiba Corp Computer system and headset-type display device
KR20060115656A * 2005-05-05 2006-11-09 Avago Technologies General IP (Singapore) Pte. Ltd. Motion sensor, imaging device, and method for compensating for motion of an imaging device
KR20180002534A * 2016-06-29 2018-01-08 FOVE, Inc. External imaging system, external imaging method, and external imaging program
JP2019040555A * 2017-08-29 2019-03-14 Sony Corporation Information processing device, information processing method, and program
KR20190048241A * 2017-10-31 2019-05-09 Dongguk University Industry-Academic Cooperation Foundation Wearable device and image processing method thereof


Also Published As

Publication number Publication date
KR20220019963A (ko) 2022-02-18

Similar Documents

Publication Publication Date Title
WO2022145956A1 Electronic device comprising heat dissipation structure
WO2022169255A1 Electronic device and method thereof for tracking user gaze and providing augmented reality service
WO2022215895A1 Wearable electronic device comprising plurality of cameras
WO2022124734A1 Electronic device comprising flexible display, operating method therefor, and storage medium
WO2022035166A1 Electronic devices comprising camera module, and method for controlling photographing direction thereof
WO2023106895A1 Electronic device for using virtual input device, and operating method in electronic device
WO2022050638A1 Method for changing display settings, and electronic device
WO2023027276A1 Electronic device for executing plurality of functions using stylus, and operating method thereof
WO2022231162A1 Augmented reality device and electronic device interacting with augmented reality device
WO2024096485A1 Wearable device that adjusts light transmittance according to illuminance of external light source, and control method thereof
WO2024034811A1 Electronic device for identifying gaze direction, and operating method thereof
WO2022114918A1 Augmented reality device, electronic device interacting with augmented reality device, and control method therefor
WO2024096460A1 Electronic device for acquiring distance information, and operating method thereof
WO2024049110A1 Electronic device and control method for correcting virtual object by using depth information of real object
WO2024080770A1 Wearable device for detecting iris information, and control method therefor
WO2023149671A1 Augmented reality device for switching input mode, and method therefor
WO2024144158A1 Wearable device for controlling at least one virtual object according to attributes of at least one virtual object, and control method therefor
WO2024096560A1 Wearable electronic device and operating method therefor
WO2022154338A1 Wearable electronic device comprising miniature camera
WO2024058434A1 Control device for capturing external environment of user, operating method therefor, and head-mounted electronic device connected to control device
WO2024123041A1 Wearable device for adjusting display state of virtual object according to movement of wearable device, and control method therefor
WO2024117452A1 Electronic device and method for performing rendering on basis of tracking information on user movement
WO2023128622A1 Wearable electronic device
WO2022197036A1 Measurement method using AR, and electronic device
WO2024075977A1 Camera module and electronic device comprising camera module

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21856172

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21856172

Country of ref document: EP

Kind code of ref document: A1