WO2022035267A1 - Electronic device comprising a camera module

Electronic device comprising a camera module

Info

Publication number
WO2022035267A1
Authority
WO
WIPO (PCT)
Prior art keywords
electronic device
area
display
image
module
Application number
PCT/KR2021/010765
Other languages
English (en)
Korean (ko)
Inventor
김현제
김창근
배재철
Original Assignee
삼성전자 주식회사 (Samsung Electronics Co., Ltd.)
Application filed by 삼성전자 주식회사 (Samsung Electronics Co., Ltd.)
Publication of WO2022035267A1


Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04M: TELEPHONIC COMMUNICATION
    • H04M 1/00: Substation equipment, e.g. for use by subscribers
    • H04M 1/02: Constructional features of telephone sets
    • H04M 1/0202: Portable telephone sets, e.g. cordless phones, mobile phones or bar type handsets
    • H04M 1/026: Details of the structure or mounting of specific components
    • H04M 1/0264: Details of the structure or mounting of specific components for a camera module assembly
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/45: Cameras or camera modules for generating image signals from two or more image sensors being of different type or operating in different modes, e.g. with a CMOS sensor for moving images in combination with a charge-coupled device [CCD] for still images
    • H04N 23/50: Constructional details
    • H04N 23/55: Optical parts specially adapted for electronic image sensors; Mounting thereof
    • H04N 23/57: Mechanical or electrical details of cameras or camera modules specially adapted for being embedded in other devices

Definitions

  • Various embodiments of the present document relate to an electronic device including a camera module and a method for processing an image using the same.
  • More specifically, it relates to an electronic device having a camera module disposed under (e.g., on the rear surface of) a display.
  • Known techniques for arranging a camera under a display include widening the pixel pattern of the display above the camera, making the size of the optical sensor similar to the size of a pixel, forming the display-related wiring with transparent wiring, and arranging the display wiring so that it does not overlap the camera disposed below.
  • With such arrangements, however, the image acquired through the camera may look hazy overall, and there is a problem in that the image quality is deteriorated.
  • Various embodiments may provide an electronic device configured to improve image quality by using at least one camera (e.g., an under-display camera) disposed under (e.g., on the rear surface of) the display of the electronic device. In addition, various embodiments may provide an electronic device that configures the opening area (e.g., opening size) of a plurality of camera modules by using the pixel arrangement of the display associated with the plurality of camera modules.
  • Various embodiments of the present document may provide an electronic device including a display and a camera module.
  • An electronic device according to various embodiments includes a display, a plurality of camera modules including a plurality of image sensors disposed under the display, and a processor electrically connected to the display and the plurality of camera modules. The display includes a first area having a first pixel density and a second area having a second pixel density and corresponding to an arrangement area of the plurality of image sensors. The processor obtains first image data having a first resolution by using a first image sensor among the plurality of image sensors, obtains second image data having a second resolution by using a second image sensor among the plurality of image sensors, and generates a result image based on at least the first image data and the second image data.
  • A method of operating an electronic device according to various embodiments includes, using a processor electrically connected to a plurality of camera modules including a plurality of image sensors disposed under a display: acquiring first image data having a first resolution by using a first image sensor among the plurality of image sensors, acquiring second image data having a second resolution by using a second image sensor among the plurality of image sensors, and generating a result image based on at least the first image data and the second image data. The display includes a first area having a first pixel density and a second area having a second pixel density and corresponding to an arrangement area of the plurality of image sensors. One possible shape of this flow is sketched in code below.
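  • The following is a minimal sketch, in Python with NumPy, of the acquire-and-combine flow described above. The function name generate_result_image, the nearest-neighbour upscaling, and the fixed-weight blend are hypothetical illustrations only; the document does not specify how the first and second image data are fused into the result image.

```python
import numpy as np

def generate_result_image(first_image: np.ndarray, second_image: np.ndarray) -> np.ndarray:
    """Combine first image data (first resolution) and second image data
    (second resolution) into one result image.

    Illustrative only: a real pipeline would perform registration,
    diffraction/blur compensation, and detail transfer rather than a
    fixed-weight blend.
    """
    # Bring the lower-resolution image up to the size of the higher-resolution
    # one by nearest-neighbour repetition (assumes integer size ratios).
    if first_image.shape != second_image.shape:
        ratio_h = first_image.shape[0] // second_image.shape[0]
        ratio_w = first_image.shape[1] // second_image.shape[1]
        second_image = np.kron(second_image, np.ones((ratio_h, ratio_w, 1)))

    # Simple weighted blend of the two exposures.
    result = 0.5 * first_image.astype(np.float32) + 0.5 * second_image.astype(np.float32)
    return np.clip(result, 0, 255).astype(np.uint8)

# Example usage with synthetic data standing in for the two image sensors.
first = np.random.randint(0, 256, (480, 640, 3), dtype=np.uint8)   # first image data
second = np.random.randint(0, 256, (240, 320, 3), dtype=np.uint8)  # second image data
combined = generate_result_image(first, second)
```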
  • According to various embodiments, by configuring the opening (aperture) of the camera to be small in consideration of the pixel density or pixel pattern of the display, the area shielded by the pattern of the display can be relatively reduced, thereby preventing deterioration of low-frequency resolving power.
  • an increase in resolving power at a low frequency and/or an improvement in a modulation transfer function (MTF) may be achieved.
  • FIG. 1 is a block diagram of an electronic device in a network environment, according to an embodiment.
  • FIG. 2 is a block diagram of a display device according to an exemplary embodiment.
  • FIG. 3A is a perspective view of the front side of a mobile electronic device according to an exemplary embodiment.
  • FIG. 3B is a perspective view of the rear surface of an electronic device according to an exemplary embodiment.
  • FIG. 4 is an exploded perspective view of an electronic device according to an embodiment.
  • FIG. 5 is a diagram illustrating a shape according to a size of a camera opening of an electronic device according to an exemplary embodiment.
  • FIG. 6 is a diagram illustrating the shape of an LSF graph according to a size of a camera opening of an electronic device according to an exemplary embodiment.
  • FIG. 7 is a diagram illustrating the shape of an MTF graph according to a size of a camera opening of an electronic device according to an exemplary embodiment.
  • FIG. 8 is a diagram illustrating a result of comparing low-frequency resolution according to a size of a camera opening of an electronic device according to an exemplary embodiment.
  • FIG. 9A is a diagram illustrating an arrangement of a plurality of cameras under the display in a photographing area of an electronic device according to an exemplary embodiment.
  • FIG. 9B is a diagram illustrating an arrangement of a plurality of cameras under the display in a photographing area of an electronic device according to an exemplary embodiment.
  • FIG. 9C is a diagram illustrating an arrangement of a plurality of cameras under the display in a photographing area of an electronic device according to an exemplary embodiment.
  • FIG. 9D is a diagram illustrating an arrangement of a plurality of cameras under the display in a photographing area of an electronic device according to an exemplary embodiment.
  • FIG. 9E is a diagram illustrating an arrangement of a plurality of cameras under the display in a photographing area of an electronic device according to an exemplary embodiment.
  • FIG. 10 is a diagram illustrating a flow in which an electronic device performs image synthesis based on a plurality of cameras, according to an embodiment.
  • FIG. 11 is a diagram illustrating a pixel arrangement under the display in a photographing area of an electronic device according to an exemplary embodiment.
  • FIG. 12 is a diagram illustrating a pixel arrangement under the display in a photographing area of an electronic device according to an exemplary embodiment.
  • FIG. 13 is a diagram illustrating an outline of an MTF graph according to the pixel arrangement under the display in a photographing area of an electronic device according to an exemplary embodiment.
  • FIG. 1 is a block diagram of an electronic device 101 in a network environment 100, according to an embodiment.
  • Referring to FIG. 1, the electronic device 101 may communicate with the electronic device 102 through a first network 198 (e.g., a short-range wireless communication network), or may communicate with the electronic device 104 or the server 108 through a second network 199 (e.g., a long-distance wireless communication network). According to an embodiment, the electronic device 101 may communicate with the electronic device 104 through the server 108.
  • According to an embodiment, the electronic device 101 may include a processor 120, a memory 130, an input module 150, a sound output module 155, a display module 160, an audio module 170, a sensor module 176, an interface 177, a connection terminal 178, a haptic module 179, a camera module 180, a power management module 188, a battery 189, a communication module 190, a subscriber identification module 196, or an antenna module 197.
  • In some embodiments, at least one of these components (e.g., the connection terminal 178) may be omitted, or one or more other components may be added to the electronic device 101. In some embodiments, some of these components may be integrated into one component (e.g., the display module 160).
  • The processor 120 may, for example, execute software (e.g., a program 140) to control at least one other component (e.g., a hardware or software component) of the electronic device 101 connected to the processor 120, and may perform various data processing or operations. According to an embodiment, as at least part of the data processing or operation, the processor 120 may store a command or data received from another component (e.g., the sensor module 176 or the communication module 190) in the volatile memory 132, process the command or data stored in the volatile memory 132, and store the result data in the non-volatile memory 134.
  • According to an embodiment, the processor 120 may include a main processor 121 (e.g., a central processing unit or an application processor) or an auxiliary processor 123 (e.g., a graphics processing unit, a neural processing unit (NPU), an image signal processor, a sensor hub processor, or a communication processor) that can operate independently of or together with the main processor 121.
  • The auxiliary processor 123 may, for example, control at least some of the functions or states related to at least one of the components of the electronic device 101 (e.g., the display module 160, the sensor module 176, or the communication module 190) on behalf of the main processor 121 while the main processor 121 is in an inactive (e.g., sleep) state, or together with the main processor 121 while the main processor 121 is in an active (e.g., application-executing) state.
  • According to an embodiment, the auxiliary processor 123 (e.g., an image signal processor or a communication processor) may be implemented as a part of another functionally related component (e.g., the camera module 180 or the communication module 190).
  • the auxiliary processor 123 may include a hardware structure specialized for processing an artificial intelligence model.
  • Artificial intelligence models can be created through machine learning. Such learning may be performed, for example, in the electronic device 101 itself on which artificial intelligence is performed, or may be performed through a separate server (eg, the server 108).
  • The learning algorithm may include, for example, supervised learning, unsupervised learning, semi-supervised learning, or reinforcement learning, but is not limited to the above examples.
  • the artificial intelligence model may include a plurality of artificial neural network layers.
  • Artificial neural networks include deep neural networks (DNNs), convolutional neural networks (CNNs), recurrent neural networks (RNNs), restricted boltzmann machines (RBMs), deep belief networks (DBNs), bidirectional recurrent deep neural networks (BRDNNs), It may be one of deep Q-networks or a combination of two or more of the above, but is not limited to the above example.
  • The artificial intelligence model may, additionally or alternatively, include a software structure in addition to the hardware structure.
  • the memory 130 may store various data used by at least one component of the electronic device 101 (eg, the processor 120 or the sensor module 176 ).
  • the data may include, for example, input data or output data for software (eg, the program 140 ) and instructions related thereto.
  • the memory 130 may include a volatile memory 132 or a non-volatile memory 134 .
  • the program 140 may be stored as software in the memory 130 , and may include, for example, an operating system 142 , middleware 144 , or an application 146 .
  • the input module 150 may receive a command or data to be used in a component (eg, the processor 120 ) of the electronic device 101 from the outside (eg, a user) of the electronic device 101 .
  • the input module 150 may include, for example, a microphone, a mouse, a keyboard, a key (eg, a button), or a digital pen (eg, a stylus pen).
  • the sound output module 155 may output a sound signal to the outside of the electronic device 101 .
  • the sound output module 155 may include, for example, a speaker or a receiver.
  • the speaker can be used for general purposes such as multimedia playback or recording playback.
  • the receiver may be used to receive an incoming call. According to an embodiment, the receiver may be implemented separately from or as a part of the speaker.
  • the display module 160 may visually provide information to the outside (eg, a user) of the electronic device 101 .
  • The display module 160 may include, for example, a display, a hologram device, or a projector, and a control circuit for controlling the corresponding device.
  • the display module 160 may include a touch sensor configured to sense a touch or a pressure sensor configured to measure the intensity of a force generated by the touch.
  • The audio module 170 may convert a sound into an electric signal or, conversely, convert an electric signal into a sound. According to an embodiment, the audio module 170 may acquire a sound through the input module 150, or may output a sound through the sound output module 155 or an external electronic device (e.g., the electronic device 102, such as a speaker or headphones) directly or wirelessly connected to the electronic device 101.
  • The sensor module 176 may detect an operating state (e.g., power or temperature) of the electronic device 101 or an external environmental state (e.g., a user state), and may generate an electrical signal or data value corresponding to the detected state.
  • According to an embodiment, the sensor module 176 may include, for example, a gesture sensor, a gyro sensor, a barometric pressure sensor, a magnetic sensor, an acceleration sensor, a grip sensor, a proximity sensor, a color sensor, an infrared (IR) sensor, a biometric sensor, a temperature sensor, a humidity sensor, or an illuminance sensor.
  • the interface 177 may support one or more designated protocols that may be used by the electronic device 101 to directly or wirelessly connect with an external electronic device (eg, the electronic device 102 ).
  • the interface 177 may include, for example, a high definition multimedia interface (HDMI), a universal serial bus (USB) interface, an SD card interface, or an audio interface.
  • the connection terminal 178 may include a connector through which the electronic device 101 can be physically connected to an external electronic device (eg, the electronic device 102 ).
  • the connection terminal 178 may include, for example, an HDMI connector, a USB connector, an SD card connector, or an audio connector (eg, a headphone connector).
  • the haptic module 179 may convert an electrical signal into a mechanical stimulus (eg, vibration or movement) or an electrical stimulus that the user can perceive through tactile or kinesthetic sense.
  • the haptic module 179 may include, for example, a motor, a piezoelectric element, or an electrical stimulation device.
  • the camera module 180 may capture still images and moving images. According to an embodiment, the camera module 180 may include one or more lenses, image sensors, image signal processors, or flashes.
  • the power management module 188 may manage power supplied to the electronic device 101 .
  • the power management module 188 may be implemented as, for example, at least a part of a power management integrated circuit (PMIC).
  • the battery 189 may supply power to at least one component of the electronic device 101 .
  • the battery 189 may include, for example, a non-rechargeable primary cell, a rechargeable secondary cell, or a fuel cell.
  • The communication module 190 may support establishment of a direct (e.g., wired) communication channel or a wireless communication channel between the electronic device 101 and an external electronic device (e.g., the electronic device 102, the electronic device 104, or the server 108), and may support communication through the established communication channel.
  • the communication module 190 may include one or more communication processors that operate independently of the processor 120 (eg, an application processor) and support direct (eg, wired) communication or wireless communication.
  • According to an embodiment, the communication module 190 may include a wireless communication module 192 (e.g., a cellular communication module, a short-range wireless communication module, or a global navigation satellite system (GNSS) communication module) or a wired communication module 194 (e.g., a local area network (LAN) communication module or a power line communication module).
  • A corresponding communication module among these communication modules may communicate with the external electronic device 104 through the first network 198 (e.g., a short-range communication network such as Bluetooth, wireless fidelity (WiFi) direct, or infrared data association (IrDA)) or the second network 199 (e.g., a long-range communication network such as a legacy cellular network, a 5G network, a next-generation communication network, the Internet, or a computer network such as a LAN or a WAN).
  • The wireless communication module 192 may identify or authenticate the electronic device 101 in a communication network, such as the first network 198 or the second network 199, using subscriber information (e.g., an International Mobile Subscriber Identity (IMSI)) stored in the subscriber identification module 196.
  • The wireless communication module 192 may support a 5G network following a 4G network and next-generation communication technology, for example, new radio (NR) access technology.
  • The NR access technology may support high-speed transmission of high-capacity data (enhanced mobile broadband, eMBB), minimization of terminal power and access by multiple terminals (massive machine type communications, mMTC), or high reliability and low latency (ultra-reliable and low-latency communications, URLLC).
  • The wireless communication module 192 may support a high-frequency band (e.g., the mmWave band), for example, to achieve a high data rate.
  • The wireless communication module 192 may support various technologies for securing performance in a high-frequency band, for example, beamforming, massive multiple-input and multiple-output (MIMO), full-dimensional MIMO (FD-MIMO), array antennas, analog beamforming, or large-scale antennas.
  • the wireless communication module 192 may support various requirements specified in the electronic device 101 , an external electronic device (eg, the electronic device 104 ), or a network system (eg, the second network 199 ).
  • The wireless communication module 192 may support a peak data rate for realizing eMBB (e.g., 20 Gbps or more), loss coverage for realizing mMTC (e.g., 164 dB or less), or U-plane latency for realizing URLLC (e.g., 0.5 ms or less for each of downlink (DL) and uplink (UL), or a round trip of 1 ms or less).
  • the antenna module 197 may transmit or receive a signal or power to the outside (eg, an external electronic device).
  • the antenna module 197 may include an antenna including a conductor formed on a substrate (eg, a PCB) or a radiator formed of a conductive pattern.
  • According to an embodiment, the antenna module 197 may include a plurality of antennas (e.g., an array antenna). In this case, at least one antenna suitable for the communication method used in a communication network, such as the first network 198 or the second network 199, may be selected from the plurality of antennas by, for example, the communication module 190. A signal or power may be transmitted or received between the communication module 190 and an external electronic device through the selected at least one antenna.
  • According to some embodiments, other components (e.g., a radio frequency integrated circuit (RFIC)) in addition to the radiator may be additionally formed as a part of the antenna module 197.
  • the antenna module 197 may form a mmWave antenna module.
  • According to an embodiment, the mmWave antenna module may include a printed circuit board, an RFIC disposed on or adjacent to a first surface (e.g., the bottom surface) of the printed circuit board and capable of supporting a designated high-frequency band (e.g., the mmWave band), and a plurality of antennas (e.g., an array antenna) disposed on or adjacent to a second surface (e.g., the top or side surface) of the printed circuit board and capable of transmitting or receiving signals in the designated high-frequency band.
  • At least some of the above components may be connected to each other through a communication method between peripheral devices (e.g., a bus, general purpose input and output (GPIO), serial peripheral interface (SPI), or mobile industry processor interface (MIPI)) and may exchange signals (e.g., commands or data) with each other.
  • the command or data may be transmitted or received between the electronic device 101 and the external electronic device 104 through the server 108 connected to the second network 199 .
  • Each of the external electronic devices 102 or 104 may be the same as or different from the electronic device 101 .
  • all or part of the operations performed by the electronic device 101 may be executed by one or more external electronic devices 102 , 104 , or 108 .
  • For example, when the electronic device 101 needs to perform a function or service, the electronic device 101 may, instead of or in addition to executing the function or service itself, request one or more external electronic devices to perform at least a part of the function or service.
  • One or more external electronic devices that have received the request may execute at least a part of the requested function or service, or an additional function or service related to the request, and transmit a result of the execution to the electronic device 101 .
  • The electronic device 101 may process the result as it is or additionally, and may provide it as at least a part of a response to the request.
  • To this end, for example, cloud computing, distributed computing, mobile edge computing (MEC), or client-server computing technology may be used.
  • the electronic device 101 may provide an ultra-low latency service using, for example, distributed computing or mobile edge computing.
  • the external electronic device 104 may include an Internet of things (IoT) device.
  • Server 108 may be an intelligent server using machine learning and/or neural networks.
  • the external electronic device 104 or the server 108 may be included in the second network 199 .
  • the electronic device 101 may be applied to an intelligent service (eg, smart home, smart city, smart car, or health care) based on 5G communication technology and IoT-related technology.
  • FIG. 2 is a block diagram 200 of a display module 160, according to an embodiment.
  • the display module 160 may include a display 210 and a display driver integrated circuit (DDI) 230 for controlling the display 210 .
  • the DDI 230 may include an interface module 231 , a memory 233 (eg, a buffer memory), an image processing module 235 , or a mapping module 237 .
  • The DDI 230 may receive, for example, image information including image data, or an image control signal corresponding to a command for controlling the image data, from other components of the electronic device (e.g., the electronic device 101 of FIG. 1) through the interface module 231.
  • For example, the image information may be received from the processor 120 (e.g., the main processor 121, such as an application processor) or the auxiliary processor 123 (e.g., a graphics processing unit) that operates independently of the main processor 121.
  • the DDI 230 may communicate with the touch circuit 250 or the sensor module 176 through the interface module 231.
  • The DDI 230 may store at least a portion of the received image information in the memory 233, for example, in units of frames.
  • For example, the memory 233 may include a register (not shown), and the register may store a setting value for performing a function of the electronic device 101.
  • The image processing module 235 may, for example, perform pre-processing or post-processing (e.g., adjustment of resolution, brightness, or size) on at least a portion of the image data based at least on a characteristic of the image data or a characteristic of the display 210.
  • the mapping module 237 may generate a voltage value or a current value corresponding to the image data pre-processed or post-processed by the image processing module 235 .
  • The generation of the voltage value or the current value may be performed based at least in part on, for example, properties of the pixels of the display 210 (e.g., the arrangement of the pixels (an RGB stripe or PenTile structure), or the size of each sub-pixel).
  • At least some pixels of the display 210 may be driven based at least in part on the voltage value or the current value, so that visual information (e.g., text, an image, or an icon) corresponding to the image data may be displayed through the display 210.
  • the display module 160 may further include a touch circuit 250 .
  • the touch circuit 250 may include a touch sensor 251 and a touch sensor IC 253 for controlling the touch sensor 251 .
  • the touch sensor IC 253 may control the touch sensor 251 to sense a touch input or a hovering input for a specific position of the display 210 , for example.
  • the touch sensor IC 253 may detect a touch input or a hovering input by measuring a change in a signal (eg, voltage, light amount, resistance, or electric charge amount) for a specific position of the display 210 .
  • the touch sensor IC 253 may provide information (eg, location, area, pressure, or time) regarding the sensed touch input or hovering input to the processor 120 .
  • At least a part of the touch circuit 250 (e.g., the touch sensor IC 253) may be included as a part of the display driver IC 230 or the display 210, or as a part of another component (e.g., the auxiliary processor 123) disposed outside the display module 160.
  • the display module 160 may further include at least one sensor (eg, a fingerprint sensor, an iris sensor, a pressure sensor, or an illuminance sensor) of the sensor module 176 , or a control circuit therefor.
  • the at least one sensor or a control circuit therefor may be embedded in a part of the display module 160 (eg, the display 210 or the DDI 230 ) or a part of the touch circuit 250 .
  • For example, when the sensor module 176 embedded in the display module 160 includes a biometric sensor (e.g., a fingerprint sensor), the biometric sensor may acquire biometric information (e.g., a fingerprint image) related to a touch input through a partial area of the display 210.
  • As another example, when the sensor module 176 includes a pressure sensor, the pressure sensor may acquire pressure information related to a touch input through a part or the entire area of the display 210.
  • the touch sensor 251 or the sensor module 176 may be disposed between pixels of the pixel layer of the display 210 or above or below the pixel layer.
  • FIG. 3A is a perspective view of the front surface of an electronic device according to an embodiment of the present disclosure.
  • FIG. 3B is a perspective view of the rear surface of an electronic device according to an embodiment of the present disclosure.
  • the electronic device 300 may correspond to the electronic device 101 of FIG. 1 .
  • Referring to FIGS. 3A and 3B, the electronic device 300 according to an embodiment may include a housing 310 including a first surface (or front surface) 310A, a second surface (or rear surface) 310B, and a side surface 310C surrounding the space between the first surface 310A and the second surface 310B.
  • In another embodiment, the housing may refer to a structure forming a part of the first surface 310A, the second surface 310B, and the side surface 310C of FIG. 3A.
  • the first surface 310A may be formed by the front plate 302 (eg, a glass plate including various coating layers or a polymer plate), at least a portion of which is substantially transparent.
  • the second surface 310B may be formed by a substantially opaque back plate 311 .
  • the back plate 311 is formed by, for example, coated or colored glass, ceramic, polymer, metal (eg, aluminum, stainless steel (STS), or magnesium), or a combination of at least two of the above materials.
  • the side surface 310C is coupled to the front plate 302 and the rear plate 311 and may be formed by a side bezel structure 318 (or “side member”) including a metal and/or a polymer.
  • the back plate 311 and the side bezel structure 318 are integrally formed and may include the same material (eg, a metal material such as aluminum).
  • In the illustrated embodiment, the front plate 302 may include, at both ends of its long edges, two first regions 310D that extend seamlessly by bending from the first surface 310A toward the rear plate 311.
  • In the illustrated embodiment, the rear plate 311 may include, at both ends of its long edges, two second regions 310E that extend seamlessly by bending from the second surface 310B toward the front plate 302.
  • the front plate 302 (or the back plate 311 ) may include only one of the first regions 310D (or the second regions 310E). In some embodiments, some of the first regions 310D or the second regions 310E may not be included.
  • In the above embodiments, when viewed from the side of the electronic device 300, the side bezel structure 318 may have a first thickness (or width) on a side that does not include the first regions 310D or the second regions 310E, and may have a second thickness thinner than the first thickness on a side that includes the first regions 310D or the second regions 310E.
  • According to an embodiment, the electronic device 300 may include at least one of a display 301, an input device 303, sound output devices 307 and 314, sensor modules 304 and 319, camera modules 305 and 312, a key input device 317, an indicator (not shown), and connectors 308 and 309.
  • The display 301 may correspond to the display module 160 of FIG. 1, the input device 303 to the input module 150 of FIG. 1, the sound output devices 307 and 314 to the sound output module 155 of FIG. 1, the sensor modules 304 and 319 to the sensor module 176 of FIG. 1, and the camera modules 305 and 312 to the camera module 180 of FIG. 1.
  • In addition, components that perform the same or similar functions as those included in the electronic device 101 of FIG. 1 may correspond to the corresponding configurations of the electronic device 101 of FIG. 1.
  • the electronic device 300 may omit at least one of the components (eg, the key input device 317 or an indicator) or additionally include other components.
  • the display 301 can be seen through, for example, a top portion of the front plate 302 . In some embodiments, at least a portion of the display 301 may be visible through the front plate 302 forming the first area 310D of the first surface 310A and the side surface 310C.
  • the display 301 may be coupled to or disposed adjacent to a touch sensing circuit, a pressure sensor capable of measuring the intensity (pressure) of a touch, and/or a digitizer that detects a magnetic field type stylus pen.
  • In some embodiments, at least a portion of the sensor modules 304 and 319 and/or at least a portion of the key input device 317 may be disposed in the first regions 310D and/or the second regions 310E.
  • At least one of an audio module 314 , a sensor module 304 , a camera module 305 , and a fingerprint sensor may be included on the rear surface of the screen display area of the display 301 .
  • the input device 303 may include a microphone. In some embodiments, the input device 303 may include a plurality of microphones disposed to detect the direction of sound.
  • the sound output devices 307 and 314 may include speakers 307 and 314 .
  • the speakers 307 and 314 may include an external speaker 307 and a receiver 314 for a call.
  • In an embodiment, the input device 303 (e.g., a microphone), the speakers 307 and 314, and the connectors 308 and 309 may be disposed in the internal space of the electronic device 300 and exposed to the external environment through at least one hole formed in the housing 310.
  • a hole formed in the housing 310 may be commonly used for an input device 303 (eg, a microphone) and speakers 307 and 314 .
  • In some embodiments, the sound output devices 307 and 314 may include a speaker (e.g., a piezo speaker) that operates without a hole formed in the housing 310.
  • the sensor modules 304 and 319 may generate an electrical signal or data value corresponding to an internal operating state of the electronic device 300 or an external environmental state.
  • The sensor modules 304 and 319 may include, for example, a first sensor module 304 (e.g., a proximity sensor) and/or a second sensor module (not shown) (e.g., a fingerprint sensor) disposed on the first surface 310A of the housing 310, and/or a third sensor module 319 (e.g., an HRM sensor) disposed on the second surface 310B of the housing 310.
  • the fingerprint sensor may be disposed on the second surface 310B as well as the first surface 310A (eg, the display 301 ) of the housing 310 .
  • The electronic device 300 may further include at least one of sensor modules not shown, for example, a gesture sensor, a gyro sensor, a barometric pressure sensor, a magnetic sensor, an acceleration sensor, a grip sensor, a color sensor, an infrared (IR) sensor, a biometric sensor, a temperature sensor, a humidity sensor, and an illuminance sensor.
  • The camera modules 305 and 312 may include a first camera module 305 disposed on the first surface 310A of the electronic device 300 and/or a second camera module 312 disposed on the second surface 310B of the electronic device 300.
  • the camera modules 305 and 312 may include one or more lenses, an image sensor, and/or an image signal processor.
  • the camera modules 305 and 312 may include a flash 313 .
  • the flash 313 may include, for example, a light emitting diode or a xenon lamp.
  • the first camera module 305 may be disposed under the display panel in an under display camera (UDC) method.
  • two or more lenses (wide-angle and telephoto lenses) and image sensors may be disposed on one surface of the electronic device 300 .
  • a plurality of first camera modules 305 may be disposed on a first surface (eg, a surface on which a screen is displayed) of the electronic device 300 in an under display camera (UDC) manner.
  • The electronic device 300 may include a plurality of camera modules (e.g., a dual camera or a triple camera) each having different properties (e.g., angle of view) or functions.
  • a plurality of camera modules 305 and 312 including lenses having different angles of view may be configured.
  • The electronic device 300 may perform control so as to use the camera modules 305 and 312 having the angle of view corresponding to the user's selection.
  • at least one of the plurality of camera modules 305 and 312 may be a wide-angle camera, and at least the other may be a telephoto camera.
  • At least one of the plurality of camera modules 305 and 312 may be a front camera facing the front of the electronic device 300 in the z-axis direction, and at least another may be a rear camera facing the rear of the electronic device 300, opposite to the front.
  • The plurality of camera modules 305 and 312 may include at least one of a wide-angle camera, a telephoto camera, a color camera, a monochrome camera, or an infrared (IR) camera (e.g., a time-of-flight (TOF) camera or a structured light camera).
  • the IR camera may be operated as at least a part of the sensor modules 304 and 319 .
  • the TOF camera may be operated as at least a part of the sensor modules 304 and 319 for detecting the distance to the subject.
  • the key input device 317 may be disposed on the side surface 310C of the housing 310 .
  • In some embodiments, the electronic device 300 may not include some or all of the above-mentioned key input devices 317, and a key input device 317 that is not included may be implemented in another form, such as a soft key displayed on the display 301. In some embodiments, the key input device 317 may be implemented using a pressure sensor included in the display 301.
  • the indicator may be disposed, for example, on the first surface 310A of the housing 310 .
  • the indicator may provide, for example, state information of the electronic device 300 in the form of a light (eg, a light emitting device).
  • the indicator may provide, for example, a light source that is interlocked with the operation of the camera module 305 .
  • Indicators may include, for example, LEDs, IR LEDs and xenon lamps.
  • The connector holes 308 and 309 may include a first connector hole 308 capable of accommodating a connector (e.g., a USB connector) for transmitting and receiving power and/or data to and from an external electronic device, and/or a second connector hole 309 (or an earphone jack) capable of accommodating a connector for transmitting and receiving an audio signal to and from an external electronic device.
  • Some of the camera modules 305 among the camera modules 305 and 312, some of the sensor modules 304 among the sensor modules 304 and 319, or the indicator may be arranged to be visible through the display 301.
  • the camera module 305 may be disposed to overlap the display area, and may also display a screen in the display area corresponding to the camera module 305 .
  • Some sensor modules 304 may be arranged to perform their functions without being visually exposed through the front plate 302 in the internal space of the electronic device.
  • FIG. 4 is an exploded perspective view of an electronic device 400 according to an embodiment of the present disclosure.
  • Referring to FIG. 4, the electronic device 400 (e.g., the electronic device 300 of FIG. 3A) may include a side member 410 (e.g., the side bezel structure 318 of FIG. 3A), a first support member 411 (e.g., a bracket or support structure), a front plate 420 (e.g., a front cover) (e.g., the front plate 302 of FIG. 3A), a display 430 (e.g., the display 301 of FIG. 3A), a printed circuit board 440 (e.g., a printed circuit board (PCB), a flexible PCB (FPCB), or a rigid-flexible PCB (RFPCB)), a battery 450, a second support member 460 (e.g., a rear case), an antenna 470, and a rear plate 480 (e.g., a rear cover).
  • In some embodiments, the electronic device 400 may omit at least one of the components (e.g., the first support member 411 or the second support member 460) or may additionally include other components.
  • At least one of the components of the electronic device 400 may be the same as or similar to at least one of the components of the electronic device 101 of FIG. 1 , and overlapping descriptions will be omitted below.
  • the first support member 411 may be disposed inside the electronic device 400 and connected to the side member 410 , or may be integrally formed with the side member 410 .
  • the first support member 411 may be formed of, for example, a metallic material and/or a non-metallic (eg, polymer) material.
  • the first support member 411 may have a display 430 coupled to one surface and a printed circuit board 440 coupled to the other surface.
  • the printed circuit board 440 may be equipped with a processor, memory, and/or an interface.
  • the processor may include, for example, one or more of a central processing unit, an application processor, a graphics processing unit, an image signal processor, a sensor hub processor, or a communication processor.
  • According to an embodiment, the printed circuit board 440 may include a plurality of printed circuit boards spaced apart from each other, and these may be electrically connected to each other using an electrical connection member (not shown).
  • For example, the electrical connection member may include at least one of a flexible printed circuit board (FPCB), a coaxial cable, and a board-to-board (B-to-B) connector, but is not limited thereto.
  • the memory may include, for example, the volatile memory 132 of FIG. 1 or the non-volatile memory 134 of FIG. 1 .
  • the interface may include, for example, a high definition multimedia interface (HDMI), a universal serial bus (USB) interface, an SD card interface, and/or an audio interface.
  • the interface may, for example, electrically or physically connect the electronic device 400 to an external electronic device, and may include a USB connector, an SD card/MMC connector, or an audio connector.
  • The battery 450 is a device for supplying power to at least one component of the electronic device 400 and may include, for example, a non-rechargeable primary battery, a rechargeable secondary battery, or a fuel cell. At least a portion of the battery 450 may be disposed substantially on the same plane as the printed circuit board 440. The battery 450 may be integrally disposed inside the electronic device 400, or, in another embodiment, may be disposed so as to be detachable from the electronic device 400.
  • the antenna 470 may be disposed between the rear plate 480 and the battery 450 .
  • the antenna 470 may include, for example, a near field communication (NFC) antenna, a wireless charging antenna, and/or a magnetic secure transmission (MST) antenna.
  • the antenna 470 may, for example, perform short-range communication with an external device or wirelessly transmit/receive power required for charging.
  • the antenna structure may be formed by a part of the side bezel structure 410 and/or the first support member 411 or a combination thereof.
  • According to an embodiment, the first support member 411 of the side member 410 may include a first surface 410-1 facing the front plate 420 and a second surface 410-2 facing the direction opposite to the first surface 410-1 (e.g., the rear plate direction).
  • the camera module 180 (eg, the camera module 180 of FIG. 1 ) may be disposed between the first support member 411 and the rear plate 480 .
  • The camera module 180 may protrude or be visible in the direction of the front plate 420 through a through-hole 401 connecting the first surface 410-1 and the second surface 410-2 of the first support member 411.
  • The portion of the camera module 180 protruding through the through-hole 401 may be disposed so as to detect the external environment at a corresponding position of the display 430.
  • the through hole 401 may be unnecessary.
  • the display 430 may include a display that is arranged to be slidable and provides a screen (eg, a display screen).
  • The display area of the electronic device 400 is an area that is visually exposed to output an image, and the electronic device 400 may adjust the display area according to the movement of a sliding plate (not shown) or the movement of the display.
  • According to an embodiment, at least a portion (e.g., the housing) of the electronic device (e.g., the electronic device 101) may be configured to operate at least partially in a sliding manner, so that the electronic device is a rollable electronic device whose display area can be selectively expanded.
  • the display 430 may be referred to as a slide-out display or an expandable display.
  • According to various embodiments, the area (display area) corresponding to the plurality of camera modules may differ according to a state change (e.g., a folded/reduced state or an unfolded/expanded state) of the electronic device 400.
  • According to an embodiment, the pixel arrangement of the area (display area) corresponding to the plurality of camera modules may be different from the pixel arrangement of the area (display area) that does not correspond to the plurality of camera modules.
  • FIG. 5 illustrates a size of a camera opening of an electronic device according to an exemplary embodiment.
  • the electronic device 300 may include a camera module 180 (eg, corresponding to the first camera module 305 of FIG. 3A ).
  • the camera module 180 may be disposed under the display 301 .
  • According to an embodiment, the area of the display 301 corresponding to the area in which the camera module 180 of the electronic device 300 is disposed may have a pixel density or pixel pattern different from that of the area of the display 301 in which the camera module 180 is not disposed. For example, the pixel density of the area of the display 301 corresponding to the area in which the camera module 180 is disposed may be lower than the pixel density of the area of the display 301 in which the camera module 180 is not disposed.
  • For example, when the pixel density of the area of the display 301 in which the camera module 180 is not disposed is taken as 1, the pixel density of the area of the display 301 corresponding to the area in which the camera module 180 is disposed may be less than 1 and higher than 1/16. The value of 1/16 is not limiting and may be varied according to the design.
  • the camera module 180 may include a lens, and the size (eg, diameter, area) of the lens may be implemented in various ways.
  • the camera module 180 may include a plurality of lenses.
  • the lens may have a lens size corresponding to the first area 510 , the second area 520 , or the third area 530 .
  • For example, the lens size corresponding to the first area 510 may be a reference (standard) lens size, and the lens sizes corresponding to the second area 520 and the third area 530 may be expressed by comparison with the lens size corresponding to the first area 510.
  • For example, the lens corresponding to the second area 520 may have an area 0.5 times that of the lens corresponding to the first area 510.
  • The lens corresponding to the third area 530 may have an area 0.125 times that of the lens corresponding to the first area 510.
  • The areas 510, 520, and 530 correspond to the area in which the camera module 180 of the electronic device 300 is disposed, and may be areas included in the area of the display 301.
  • the pixel density of the image sensor (not shown) of the camera module 180 disposed below the area of the display 301 of the electronic device 300 may not be limited.
  • an image sensor (not shown) of the camera module 180 may have a pixel density of 6 million pixels.
  • the image sensor (not shown) of the camera module 180 may have a pixel density of 12 million pixels.
  • In the image sensor (not shown) of the camera module 180 having 6 million pixels, the gap between pixels may be wider than in the image sensor (not shown) of the camera module 180 having 12 million pixels. A rough check of this is given below.
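  • Under the assumption, not stated in this document, that both image sensors have the same active area, the pixel pitch p scales with the inverse square root of the pixel count N:

```latex
p \propto \frac{1}{\sqrt{N}}
\qquad\Rightarrow\qquad
\frac{p_{6\mathrm{M}}}{p_{12\mathrm{M}}}
  = \sqrt{\frac{12\times 10^{6}}{6\times 10^{6}}}
  = \sqrt{2} \approx 1.41
```

  • That is, for an equal sensor area, the gap between pixels of the 6-megapixel sensor would be roughly 1.4 times wider than that of the 12-megapixel sensor.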
  • The line spread function (LSF) graph may indicate the degree of diffusion (or intensity distribution) of an image formed in an imaging optical system or on a photosensitive material by a line light source.
  • When the relative intensity (e.g., the y-axis value) in the LSF graph is concentrated near the image center (e.g., the zero point on the x-axis), the degree of diffusion may be small; when the relative intensity extends farther from the image center, the degree of diffusion may be large. A simple model of this relationship is given below.
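  • To make the "degree of diffusion" concrete, a commonly used idealization (an assumption for illustration, not a model given in this document) is a Gaussian LSF whose width parameter σ controls how far the intensity spreads from the image center; its Fourier transform gives the corresponding MTF:

```latex
\mathrm{LSF}(x) = \frac{1}{\sigma\sqrt{2\pi}}\exp\!\left(-\frac{x^{2}}{2\sigma^{2}}\right),
\qquad
\mathrm{MTF}(f) = \left|\mathcal{F}\{\mathrm{LSF}\}(f)\right| = \exp\!\left(-2\pi^{2}\sigma^{2}f^{2}\right)
```

  • A smaller σ (intensity gathered near x = 0) means less diffusion and a higher MTF at every spatial frequency f, which is consistent with the behaviour described for FIGS. 6 and 7.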
  • The curves of the LSF graph (curve 610, curve 620, and curve 630) may be curves corresponding, respectively, to the areas described in FIG. 5 (e.g., the first area 510, the second area 520, and the third area 530).
  • the first curve 610 may be a curve of the LSF graph corresponding to 45 degrees of the first area 510 of FIG. 5 .
  • the second curve 620 may be a curve of the LSF graph corresponding to 45 degrees of the second area 520 of FIG. 5
  • The third curve 630 may be a curve of the LSF graph corresponding to 45 degrees of the third area 530 of FIG. 5.
  • Zero on the x-axis of the LSF graph may denote the center of the areas (e.g., the first area 510, the second area 520, and the third area 530), and moving from 0 in the negative or positive direction along the x-axis may mean moving farther from the center.
  • The y-axis of the LSF graph may denote the intensity of the formed image.
  • The LSF graph of FIG. 6 may represent the degree of diffusion of the image formed for each of the areas of FIG. 5 (e.g., the first area 510, the second area 520, and the third area 530).
  • As described with reference to FIG. 5, the lens sizes corresponding to the first area 510, the second area 520, and the third area 530 may have an area ratio of 1:0.5:0.125. This may equivalently be expressed as a ratio of entrance pupil sizes or as a ratio of f-numbers (fno), since the aperture area is proportional to the square of the entrance pupil diameter and, for a given focal length, the f-number is inversely proportional to that diameter; one reading of these ratios is given below.
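  • Under the standard relations A ∝ D² (aperture area A, entrance pupil diameter D) and N = f/D (f-number N, focal length f), the stated area ratio implies the following diameter and f-number ratios. Because the original numerical values are elided in the text above, this is only an illustrative reconstruction, not the values given in the document:

```latex
D_{1}:D_{2}:D_{3} = 1 : \sqrt{0.5} : \sqrt{0.125} \approx 1 : 0.71 : 0.35,
\qquad
N_{1}:N_{2}:N_{3} = 1 : \frac{1}{\sqrt{0.5}} : \frac{1}{\sqrt{0.125}} \approx 1 : 1.41 : 2.83
```

  • In other words, halving the aperture area raises the f-number by a factor of about 1.4, and reducing it to 0.125 of the reference raises it by about 2.8, assuming the focal length is unchanged.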
  • According to an embodiment, the first curve 610 corresponding to the first area 510 may be more diffuse, with respect to the image center (e.g., the zero point of the x-axis), than the second curve 620 corresponding to the second area 520.
  • The second curve 620 corresponding to the second area 520 may be more diffuse, with respect to the image center (e.g., the zero point of the x-axis), than the third curve 630 corresponding to the third area 530.
  • As the curve of the LSF graph gathers more closely around the image center (e.g., the zero point of the x-axis), the degree of diffusion of the formed image may decrease, and as the degree of diffusion decreases, a sharper or less lossy image may be formed.
  • A modulation transfer function (MTF) graph may indicate the resolving power and contrast of a camera lens.
  • Alternating black and white lines may be called a line pair, and the spatial frequency may be defined as the number of line pairs per millimeter (lp/mm).
  • For example, the MTF characteristic of a camera (e.g., a UDC) may be expressed as a curve quantifying the change in resolving power according to the distance from the center point of the image.
  • Resolving power may mean the ability to optically distinguish different objects from each other, and may be related to contrast, clarity, or sharpness. The MTF can be computed numerically from the LSF, as sketched below.
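  • The link between the LSF of FIG. 6 and the MTF of FIG. 7 can be illustrated numerically: the MTF is the magnitude of the Fourier transform of the LSF, normalized to 1 at zero frequency. The sketch below uses synthetic Gaussian LSFs with hypothetical widths standing in for the curves of FIG. 6; it is not the measured data of the figures.

```python
import numpy as np

def mtf_from_lsf(lsf: np.ndarray, dx_mm: float):
    """Return spatial frequencies (lp/mm) and the normalized MTF,
    computed as the magnitude of the Fourier transform of the LSF."""
    spectrum = np.abs(np.fft.rfft(lsf))
    mtf = spectrum / spectrum[0]                  # normalize so MTF(0) = 1
    freqs = np.fft.rfftfreq(len(lsf), d=dx_mm)    # cycles (line pairs) per mm
    return freqs, mtf

# Synthetic example: three Gaussian LSFs of decreasing width, loosely
# standing in for the first, second, and third areas of FIG. 5.
x = np.arange(-0.05, 0.05, 0.0002)                # position across the image, in mm
for sigma_mm in (0.003, 0.002, 0.001):            # hypothetical LSF widths
    lsf = np.exp(-x**2 / (2 * sigma_mm**2))
    freqs, mtf = mtf_from_lsf(lsf, dx_mm=0.0002)
    # A narrower LSF (less diffusion) yields a higher MTF at a given frequency.
    print(f"sigma = {sigma_mm * 1000:.0f} um -> MTF at 100 lp/mm ~ {np.interp(100, freqs, mtf):.2f}")
```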
  • The curves of the MTF graph of FIG. 7 (curve 700, curve 710, curve 720, and curve 730) may show the resolving power according to the size of the camera opening (e.g., lens) of the electronic device.
  • According to an embodiment, the curve of the MTF graph of the camera module (e.g., the camera module 180 of FIG. 5) may indicate that camera performance having a modulation (e.g., a contrast input/output ratio) greater than or equal to a specified value is exhibited in the low-frequency band.
  • For example, as the size of the camera opening decreases, the curve of the MTF graph may change in shape from curve 700 to curve 710, from curve 710 to curve 720, and from curve 720 to curve 730.
  • the MTF graph of FIG. 7 is a graph showing changes in curves in one graph as the size of the camera aperture (eg, lens) decreases.
  • The MTF characteristic of the lens itself (e.g., the curve labeled "original" in FIG. 7), that is, the curve 700, may have a form close to linear, without a separate ripple.
  • According to an embodiment, diffraction or scattering with various frequencies may occur in the external light entering the camera module 180, depending on the pixels (e.g., the pixel pattern) of the display 301.
  • As a result, a phenomenon may appear in which the MTF for each frequency (e.g., spatial frequency) of the light, instead of varying linearly, fluctuates in a curved shape (e.g., a ripple or sine-wave shape) and is generally lowered; this phenomenon may deteriorate the image quality (e.g., cause distortion or image-quality degradation) of the camera module 180.
  • the low-frequency resolution may increase as the size of the camera aperture (eg, lens) decreases.
  • the resolution of the low frequency (eg, 200 lp/mm or less) section may increase as the size of the lens decreases.
  • the shielding area recognized by the camera module 180 (eg, the area shielded by the pattern of the display) may vary with the opening size; from the curve 710 to the curve 730, as the shielding area becomes relatively small, the resolution/resolving power in the low frequency (eg, 200 lp/mm or less) section may increase by about 20%.
  • FIG. 8 illustrates a comparison result of low-frequency resolution according to a size of a camera opening of an electronic device according to an exemplary embodiment. Specifically, FIG. 8 shows an effect of applying an under display camera (UDC) and a 5M lens to an electronic device.
  • the first graph 810 represents a contrast transfer function (CTF) graph of a 5M lens in the case where the camera is not an under display camera (UDC).
  • the second graph 820 according to an embodiment represents a CTF graph of a 5M lens in the case of UDC.
  • the third graph 830 represents a CTF graph of a 32M lens in the case where the camera is not a UDC.
  • a fourth graph 840 according to an embodiment represents a CTF graph of a 32M lens in the case of UDC.
  • the CTF graph can represent to what extent the lens preserves contrast (the lens contrast response) when, for example, a square pattern is imaged.
  • the first graph 810 has a resolution of 89%
  • the second graph 820 has a resolution of 57%
  • the third graph 830 has a resolution of 89%
  • the fourth graph 840 can have a resolution of 49%.
  • low-frequency resolution may be improved according to a change in the size of the camera opening.
  • the resolution may be increased.
  • with the second lens (eg, the 5M lens), which has an area ratio of about 0.5 relative to the first lens (eg, the 32M lens), the resolution may be improved by about 8 to 10% at low frequencies (the short calculation below re-derives this from the percentages above).
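As a quick check on that figure, the sketch below re-derives the comparison purely from the CTF percentages quoted above for graphs 810 to 840.

```python
# Re-derive the FIG. 8 comparison from the CTF percentages quoted above
# (89%/57% for the 5M lens without/with UDC, 89%/49% for the 32M lens).
ctf = {
    "5M lens":  {"non_udc": 89, "udc": 57},
    "32M lens": {"non_udc": 89, "udc": 49},
}

for lens, vals in ctf.items():
    drop = vals["non_udc"] - vals["udc"]
    print(f"{lens}: placing it under the display costs {drop} points ({vals['udc']}% remains)")

gain = ctf["5M lens"]["udc"] - ctf["32M lens"]["udc"]
print(f"Under the display, the smaller 5M lens keeps about {gain} points more "
      f"low-frequency contrast than the 32M lens, in line with the ~8-10% stated above.")
```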
  • FIGS. 9A to 9E illustrate arrangements of a plurality of cameras under the display in a photographing area of an electronic device, according to an exemplary embodiment.
  • a plurality of camera openings may be disposed in an area (eg, a photographing area) in which the camera module 180 of the electronic device 300 is disposed.
  • a plurality of lenses may be disposed under the display 301 having a pixel pattern.
  • the arrangement area, which will be described later, may be an area of the display 301 corresponding to an area where the camera module is disposed, and may be described as a photographing area in some cases.
  • a plurality of lenses may be arranged to share the same photographing area (eg, a first lens arrangement).
  • the two lenses may share the same imaging area by being disposed such that an overlapping area exists within the arrangement area 910 .
  • the size of the area where the two lenses overlap within the arrangement area 910 may not be limited.
  • the sizes of the two lenses in the placement area 910 may be the same or different.
  • two lenses may have the same size.
  • the two lenses may be lenses of different sizes, and the lenses of different sizes may share the same photographing area by being disposed such that an overlapping area exists.
  • a plurality of lenses may be arranged so as not to share the same photographing area (eg, a second lens arrangement).
  • the two lenses may have different photographing areas by being disposed so that there is no overlapping area within the arrangement area 920 .
  • the sizes of the two lenses in the placement area 920 may be the same or different.
  • the two lenses may have the same size.
  • the two lenses may be lenses of different sizes, and the photographing areas may be different by being disposed so that there is no overlapping area of the lenses of different sizes.
  • a plurality of lenses may be disposed in an arrangement area (eg, the arrangement area 930 , the arrangement area 940 , and the arrangement area 950 ).
  • the number of the plurality of lenses disposed in the arrangement area may not be limited to the above-described number of lenses or the number of lenses to be described later.
  • compared to an electronic device in which one camera module (eg, the camera module 180 of FIG. 5) is disposed, the arrangement may be configured by disposing a plurality of camera modules 180 each having a photographing area that is a specified multiple of that of the single camera module.
  • four lenses may be arranged (eg, a third lens arrangement) as in the arrangement area 930 of FIG. 9C .
  • four lenses of substantially the same size may be arranged at regular intervals (eg, arranged in a square shape) in the arrangement area 930 .
  • four substantially identically sized lenses can be placed without overlapping areas.
  • the four lenses disposed in the disposition area 930 may have different sizes, and the intervals at which they are arranged may not be uniform.
  • the lens disposed in the arrangement area 930 may include a lens having a photographing area of about 0.25 times that of one camera module 180 .
  • seven lenses may be arranged as in the arrangement area 940 of FIG. 9D .
  • lenses of substantially the same size may be disposed in the arrangement area 940 in a shape surrounding the lenses disposed in the center at regular intervals (eg, disposed in the shape of a regular hexagon).
  • seven substantially identically sized lenses can be placed without overlapping regions.
  • the seven lenses disposed in the arrangement area 940 may have different sizes, and the arrangement intervals may not be uniform.
  • the lenses disposed in the arrangement area 940 may include lenses each having a photographing area of about 0.125 times that of one camera module 180, and they may be arranged so that the sum of the photographing areas of the central lens and the surrounding lenses is similar to the photographing area of one camera module 180.
  • nine lenses may be arranged (eg, a fifth lens arrangement).
  • nine lenses of substantially the same size may be arranged at regular intervals (eg, arranged in a square shape) in the arrangement area 950 .
  • nine identically sized lenses can be placed without overlapping areas.
  • the nine lenses disposed in the disposition area 950 may have different sizes, and the intervals at which they are arranged may not be uniform.
  • the lenses disposed in the arrangement area 950 may include lenses each having a photographing area of about 0.125 times that of one camera module 180, and they may be arranged so that the sum of the photographing areas of the nine lenses is larger than the photographing area of one camera module 180.
  • in the placement areas described with reference to FIGS. 9C to 9E (eg, the placement area 930, the placement area 940, and the placement area 950), regions in which a plurality of lenses overlap may also exist; in that case the lenses may be arranged to share the same imaging area.
  • the plurality of lenses described with reference to FIGS. 9A to 9E may be arranged adjacent to each other in the placement areas (eg, the placement area 910, the placement area 920, the placement area 930, the placement area 940, and the placement area 950).
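To keep the photographing-area multiples quoted for FIGS. 9C to 9E straight, the short sketch below sums them for each arrangement, taking the photographing area of a single camera module 180 as 1.0 and using only the per-lens fractions stated above.

```python
# Sum the per-lens photographing areas quoted above for the FIG. 9 arrangements,
# relative to one camera module 180 whose photographing area is taken as 1.0.
arrangements = {
    "arrangement area 930 (4 lenses x ~0.25)":  4 * 0.25,
    "arrangement area 940 (7 lenses x ~0.125)": 7 * 0.125,
    "arrangement area 950 (9 lenses x ~0.125)": 9 * 0.125,
}

for name, total in arrangements.items():
    if total > 1.0:
        relation = "larger than"
    elif abs(total - 1.0) < 0.15:
        relation = "similar to"
    else:
        relation = "smaller than"
    print(f"{name}: total ~ {total:.3f}x -> {relation} one camera module")
```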
  • FIG. 10 is a flowchart illustrating an electronic device performing image processing (eg, synthesizing) based on a plurality of camera modules, according to an embodiment.
  • the electronic device may correspond to the electronic device 101 of FIG. 1 , and may correspond to the electronic device 300 of FIG. 3 and the electronic device 400 of FIG. 4 .
  • the electronic device 101 will be described as an example.
  • the electronic device 101 may perform operations 1010 to 1040, which will be described later, under the control of a processor (eg, the processor 120 of FIG. 1 ).
  • the electronic device 101 may take a picture with a plurality of camera modules (eg, N cameras).
  • the camera module 180 of the electronic device 101 may include a plurality of camera modules, and each of the plurality of camera modules may include a lens.
  • the electronic device 101 may perform photographing with a plurality of camera modules (eg, N camera modules) under the control of the processor 120 .
  • the electronic device 101 may acquire at least N images.
  • the camera module 180 of the electronic device 101 may include a plurality of camera modules (eg, N camera modules), and the plurality of camera modules may include a plurality of image sensors (eg, N image sensors).
  • the electronic device 101 may acquire at least N image data (or N images) by using the N image sensors under the control of the processor 120 .
  • the electronic device 101 may acquire at least N image data (or N images) by using a common image sensor and/or N camera modules under the control of the processor 120 .
  • the at least N image data (or N images) acquired by the electronic device 101 may have different resolutions according to a characteristic of the camera module 180 (eg, the size of an opening (eg, a lens)) or a characteristic of the display (eg, the display module 160), such as the pixel density, pixel arrangement, or pixel pattern of the imaging area.
  • the electronic device 101 may synthesize at least N images.
  • the electronic device 101 may perform synthesis of at least N pieces of acquired image data (or N images) under the control of the processor 120 .
  • the electronic device 101 may synthesize at least N images having different resolutions under the control of the processor 120 .
  • the electronic device 101 may generate a result image.
  • the electronic device 101 may generate a result image by processing (eg, synthesizing) the acquired at least N image data (or N images) under the control of the processor 120 .
  • the electronic device 101 may perform at least a part of operations 1010 to 1040 using an image signal processor (ISP) (not shown) included in the camera module 180 .
  • the image signal processor may be configured as at least a part of the processor 120 , or as a separate processor operated independently of the processor 120 .
  • the processor 120 of the electronic device may perform one or more image processing operations on an image acquired using an image sensor (not shown) of the camera module 180 or on an image stored in a memory (eg, the memory 130 of FIG. 1).
  • the one or more image processes may include, for example, depth map generation, 3D modeling, panorama generation, feature point extraction, image synthesis, or image compensation (eg, noise reduction, resolution adjustment, brightness adjustment, blurring, sharpening, or softening); a sketch of the capture-and-synthesis flow of operations 1010 to 1040 follows below.
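The following sketch mirrors the flow of operations 1010 to 1040 described above: capture with N camera modules, acquire N images, synthesize them, and produce a result image. The function names and the averaging step are hypothetical placeholders for illustration, not APIs or algorithms taken from this document.

```python
import numpy as np

# Hedged sketch of the FIG. 10 flow: capture with N camera modules, acquire N
# images, synthesize them, and produce a result image. capture_image() and
# synthesize() are hypothetical placeholders, not functions from this document.
def capture_image(sensor_index, height=480, width=640):
    # Stand-in for reading one image sensor of the camera module 180.
    rng = np.random.default_rng(sensor_index)
    return rng.random((height, width), dtype=np.float32)

def synthesize(images):
    # Minimal fusion: average co-registered images. The synthesis described
    # above (planar rectification plus MTF-aware fusion) would replace this.
    return np.mean(np.stack(images, axis=0), axis=0)

def generate_result_image(num_sensors):
    images = [capture_image(i) for i in range(num_sensors)]   # operations 1010-1020
    return synthesize(images)                                 # operations 1030-1040

result = generate_result_image(num_sensors=4)
print("result image shape:", result.shape)
```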
  • FIG. 11 illustrates a pixel density (or pixel arrangement) under the display in a photographing area of an electronic device according to an exemplary embodiment.
  • the electronic device 300 may include a camera module 180, and a photographing area (eg, a photographing area 1110 or a photographing area 1120) may be a display area (eg, an area of the display 301 of FIG. 3) corresponding to the area in which the camera module 180 is disposed.
  • the electronic device 300 may have, in the area where the camera module 180 is disposed, a pixel density (or pixel arrangement) such as that of the photographing area 1110.
  • pixels in a general display area (eg, non-photography area) excluding the photographing area (eg, photographing area 1110) may be arranged in an RGB Bayer pattern, and the arrangement of RGBG forms one package.
  • the pixel arrangement of the photographing area 1110 may be an arrangement in which the pixels of a general RGB Bayer pattern are reduced to 1/4 (eg, in density).
  • the photographing area 1110 may have a pixel arrangement in which single packages of the general RGB Bayer pattern, reduced to 1/4, are arranged at regular intervals.
  • the pixel density of the photographing area 1110 may be 1/4 times the pixel density of a general RGB Bayer pattern.
  • the electronic device 300 may instead have, in the area where the camera module 180 is disposed, a pixel arrangement such as that of the photographing area 1120.
  • pixels in a general display area (eg, non-photography area) excluding the imaging area (eg, imaging area 1120) may be arranged in an RGB Bayer pattern, and the arrangement of RGBG may form one package.
  • the pixel arrangement of the photographing area 1120 may likewise be an arrangement in which the pixels of the general RGB Bayer pattern are reduced to 1/4 (eg, in density).
  • the photographing area 1120 may have a pixel arrangement in which groups of four packages of the general RGB Bayer pattern, reduced to 1/4, are gathered together and arranged at regular intervals.
  • the pixel density of the photographing area 1120 may be 1/4 times the pixel density of a general RGB Bayer pattern.
  • the pixel density of the photographing area 1120 may be the same as the pixel density of the photographing area 1110, and there may be differences in pixel patterns (eg, pixel spacing, pixel size) according to the arrangement of pixels.
  • the aperture area of the photographing area (eg, the photographing area 1110 and the photographing area 1120) is widened by reducing the pixels of the general RGB Bayer pattern to 1/4, and the low frequency band resolution can thereby be increased.
  • FIG. 12 is a diagram illustrating a pixel arrangement under a display of a photographing area of an electronic device according to an exemplary embodiment.
  • the photographing area 1200 of the display may have a different aperture ratio for each area.
  • the aperture ratio may vary according to the pixel density (or pixel arrangement) of the photographing area 1200 of the display. For example, as described with reference to FIG. 11, if the aperture area of the photographing area (eg, the photographing area 1110 and the photographing area 1120) is widened by changing the density of pixels of the general RGB Bayer pattern, the aperture ratio can also increase. As another example, as the distance between packages having the RGBG arrangement increases, the aperture area may increase, thereby increasing the aperture ratio.
  • the imaging area 1200 may include an area 1210 having a first aperture ratio and an area 1220 having a second aperture ratio.
  • the area 1210 having the first aperture ratio may be disposed in a central area of the imaging area 1200
  • the area 1220 having the second aperture ratio may be disposed in a peripheral area of the central area.
  • the first aperture ratio of the area 1210 corresponding to the central area of the imaging area 1200 may be greater than the second aperture ratio of the area 1220 corresponding to the peripheral area.
  • the first aperture ratio may be twice the second aperture ratio.
  • the relationship (eg, twice) between the first aperture ratio and the second aperture ratio in an embodiment may not be limited to the above-described example, and may be implemented in various ways according to pixel density (or pixel arrangement, pixel pattern).
  • the central region may be a region corresponding to within about 25% of the total area of the imaging region 1200 from the center of the imaging region 1200 .
  • the central region which is the region 1210 having the first aperture ratio, may not be limited to a region corresponding to the above-mentioned about 25%.
  • the aperture ratio of the photographing area 1200 of the display may vary according to a direction from the center.
  • the first aperture ratio of the area 1210 corresponding to the central area of the photographing area 1200 and the second aperture ratio of the area 1220 corresponding to the peripheral area may vary according to a direction from the center.
  • for example, the aperture ratio in the x-axis or y-axis direction may be higher than the aperture ratio in a direction at 45 degrees to the x-axis or y-axis.
  • display characteristics (eg, the pixel density, pixel arrangement, or pixel pattern of the imaging area) may be different between the region 1210 (eg, a first region) corresponding to the central region of the imaging area 1200 and the region 1220 (eg, a second region) corresponding to the peripheral region.
  • lens MTF characteristics in the first region 1210 and the second region 1220 may be different from each other.
  • the first region 1210 may be formed as an opening having a first shape (eg, an octagonal shape), and the second region 1220 may be formed as an opening having a second shape (eg, a cross-shaped polygon).
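Purely to visualize how differently shaped openings can pass different amounts of light, the sketch below rasterizes an octagonal opening and a cross-shaped opening on a small grid and compares their open fractions; the proportions are illustrative assumptions, not dimensions from FIG. 12.

```python
import numpy as np

# Hedged sketch: compare the open fraction of the two opening shapes named above
# (an octagonal opening for the first region 1210, a cross-shaped polygon for
# the second region 1220) on a small binary grid. The dimensions are illustrative.
n = 201
y, x = np.mgrid[-1:1:n * 1j, -1:1:n * 1j]

# Octagon: intersection of an axis-aligned square and a 45-degree-rotated square.
octagon = (np.maximum(np.abs(x), np.abs(y)) <= 0.9) & ((np.abs(x) + np.abs(y)) <= 1.2)

# Cross-shaped polygon: union of a horizontal bar and a vertical bar.
cross = (np.abs(x) <= 0.3) | (np.abs(y) <= 0.3)
cross &= np.maximum(np.abs(x), np.abs(y)) <= 0.9

for name, mask in [("octagonal opening (region 1210)", octagon),
                   ("cross-shaped opening (region 1220)", cross)]:
    print(f"{name}: open fraction ~ {mask.mean():.2f}")
```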
  • the electronic device 101 may use a plurality of camera modules 180 or a plurality of camera openings (eg, lenses) to obtain a plurality of image data (eg, first image data and second image data).
  • the plurality of image data may include, for example, an image acquired through the first region 1210 and an image acquired through the second region 1220 .
  • the plurality of image data may include an image acquired through a first lens (eg, one of the lenses included in the arrangement area 910 of FIG. 9A) and an image acquired through a second lens (eg, the other of the lenses included in the arrangement area 910 of FIG. 9A) among the plurality of camera openings (eg, lenses).
  • the electronic device 101 may perform image processing (eg, operation 1030 and operation 1040 of FIG. 10) using the plurality of image data (eg, the first image data and the second image data).
  • the electronic device 101 may convert the plurality of image data (eg, the first image data and the second image data) so that they are projected onto one common plane (eg, planar rectification), and may synthesize (eg, fuse) the converted image data into one image.
  • the electronic device 101 may synthesize the first image data and the second image data into one image by reflecting the MTF characteristics of each of the plurality of camera modules 180 or the plurality of camera openings (eg, lenses).
  • the electronic device 101 may compensate (eg, restore) the attenuated signal component in consideration of each MTF characteristic.
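One common way to compensate attenuated signal components in consideration of an MTF characteristic is a frequency-domain gain that approximately inverts the MTF. The sketch below shows a generic Wiener-style version of that idea on a 1-D test pattern; it is offered as an assumption-laden illustration, not the specific restoration method of this document.

```python
import numpy as np

# Hedged sketch: Wiener-style compensation of MTF attenuation on a 1-D signal.
# mtf_model() is an assumed Gaussian fall-off standing in for the measured MTF
# of one camera opening; the actual characteristic would replace it.
n, dx = 1024, 0.001                            # samples and spacing (mm)
freqs = np.fft.rfftfreq(n, d=dx)               # spatial frequency in lp/mm

def mtf_model(freqs, f50=150.0):
    return np.exp(-np.log(2.0) * (freqs / f50) ** 2)   # assumed: MTF = 0.5 at 150 lp/mm

def compensate(signal_1d, mtf, nsr=0.01):
    # Frequency-domain gain that approximately inverts the MTF while limiting noise gain.
    gain = mtf / (mtf ** 2 + nsr)
    return np.fft.irfft(np.fft.rfft(signal_1d) * gain, n=signal_1d.size)

x = np.linspace(0.0, n * dx, n, endpoint=False)
pattern = (np.sin(2 * np.pi * 125.0 * x) > 0).astype(float)   # 125 lp/mm square pattern
mtf = mtf_model(freqs)
attenuated = np.fft.irfft(np.fft.rfft(pattern) * mtf, n=n)    # simulate MTF attenuation
restored = compensate(attenuated, mtf)

def modulation(s, target=125.0):
    spec = np.abs(np.fft.rfft(s)) / s.size
    return spec[np.argmin(np.abs(freqs - target))]

for name, s in [("original", pattern), ("attenuated", attenuated), ("restored", restored)]:
    print(f"{name:10s} modulation at 125 lp/mm: {modulation(s):.3f}")
```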
  • FIG. 13 is a diagram illustrating an MTF graph according to pixel arrangement under a display of a photographing area of an electronic device according to an exemplary embodiment.
  • the MTF graph of FIG. 13 shows a change in resolution when the density of pixels is changed by densely arranging pixels.
  • the correlation between the number of pixels and the pixel density may be as shown in Table 1 below.
  • Table 1 columns: pixel area, pixel length, and pixel density (ppi).
  • in the pixel arrangement under the display of the photographing area of the electronic device, the area occupied by the pixels within the pixel area may decrease as the size of the plurality of unit pixels is reduced (see the sketch below).
  • the aperture ratio may increase.
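The sketch below ties the Table 1 quantities to the remark above: the pixel length (pitch) sets the pixel density (ppi is 25.4 mm divided by the pitch in millimetres), and shrinking the emissive pixel inside a fixed unit cell raises the open, or aperture, fraction of that cell. The pitch and pixel sizes are illustrative assumptions, not values from Table 1.

```python
# Hedged illustration of the Table 1 quantities and the aperture-ratio remark above.
# The pitch and emissive-pixel sizes are assumptions, not values from Table 1.
MM_PER_INCH = 25.4

def pixel_density_ppi(pitch_mm):
    return MM_PER_INCH / pitch_mm                    # pixels per inch along one axis

def aperture_ratio(pitch_mm, pixel_side_mm):
    # Open fraction of a square unit cell (pitch x pitch) not covered by the pixel.
    return 1.0 - (pixel_side_mm ** 2) / (pitch_mm ** 2)

pitch = 0.045                                        # assumed unit-cell pitch (mm)
print(f"pixel density ~ {pixel_density_ppi(pitch):.0f} ppi for a {pitch * 1000:.0f} um pitch")
for pixel_side in (0.035, 0.025, 0.018):             # assumed emissive pixel sizes, shrinking
    ar = aperture_ratio(pitch, pixel_side)
    print(f"pixel side {pixel_side * 1000:4.1f} um -> aperture ratio ~ {ar:.2f}")
```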
  • the curve 1310 of the MTF graph may be an MTF graph curve for a case where the pixels of a general RGB Bayer pattern are reduced to 1/4 and the size of an opening (eg, a lens) is reduced to an area ratio of 0.25.
  • the curve 1330 of the MTF graph may be an MTF graph curve for a case where only the size of an opening (eg, a lens) is reduced to an area ratio of 0.25, without reducing the pixels of the general RGB Bayer pattern to 1/n.
  • when the size of the opening (eg, lens) of the photographing area (eg, the photographing area 1110) is kept the same and the pixels of the RGB Bayer pattern are arranged at 1/4 of the usual density, the aperture ratio increases, and the resolution of the low frequency band may thereby be increased.
  • accordingly, the case corresponding to the curve 1310 may have a higher resolution in the low frequency band (eg, 200 lp/mm or less) than the curve 1330.
  • the resolution of the low frequency band of the curve 1310 may be increased by about 10% or more compared to that of the curve 1330, and by about 35% or more compared to the low frequency band of the curve 710 (eg, opening 1) of FIG. 7.
  • the resolution of the low frequency band may increase by about 5% to 10%.
  • the electronic device (eg, the electronic device 101) according to an embodiment includes a display (eg, the display module 160, the display 301), a plurality of camera modules including a plurality of image sensors disposed below the display, and a processor (eg, the processor 120) electrically connected to the display and the plurality of camera modules; the display includes a first area having a first pixel density and a second area having a second pixel density and corresponding to an arrangement area of the plurality of image sensors, and the processor may obtain first image data having a first resolution using a first image sensor among the plurality of image sensors, obtain second image data having a second resolution using a second image sensor among the plurality of image sensors, and generate a result image based on at least the first image data and the second image data.
  • the resolution of the result image may be higher than the first resolution and the second resolution.
  • the first resolution may be the same resolution as the second resolution.
  • the second pixel density may be higher than the first pixel density.
  • the plurality of camera modules include a plurality of lenses, and the plurality of lenses may be arranged to correspond to the second area.
  • the plurality of lenses may be symmetrically disposed in the second area.
  • the plurality of lenses may be disposed adjacent to each other in the second area.
  • when generating the result image, the processor (eg, the processor 120) may generate the result image by synthesizing at least the first image data and the second image data.
  • the aperture ratio of the second area may be higher than the aperture ratio of the first area.
  • the first area may include a region having a first aperture ratio and a region having a second aperture ratio.
  • the plurality of camera modules include a plurality of lenses, and the first resolution or the second resolution may be determined according to the lens sizes of the plurality of lenses.
  • the lens sizes of the plurality of lenses may be the same size.
  • the lens sizes of the plurality of lenses may have different sizes.
  • the plurality of lenses may be disposed without overlapping regions.
  • the plurality of lenses may be disposed such that an overlapping area exists.
  • a method of operating an electronic device according to an embodiment includes, using a processor (eg, the processor 120) electrically connected to a plurality of camera modules including a plurality of image sensors disposed below a display (eg, the display module 160, the display 301): acquiring first image data having a first resolution using a first image sensor among the plurality of image sensors, acquiring second image data having a second resolution using a second image sensor among the plurality of image sensors, and generating a result image based on at least the first image data and the second image data.
  • the display (eg, the display module 160 or the display 301) may include a first area having a first pixel density, and a second area having a second pixel density and corresponding to the arrangement area of the plurality of image sensors.
  • the resolution of the result image is higher than the first resolution and the second resolution.
  • the first resolution may be the same resolution as the second resolution.
  • the electronic device (eg, the electronic device 101 or the electronic device 300) may, in generating the result image, generate the result image by synthesizing at least the first image data and the second image data.
  • the plurality of camera modules include a plurality of lenses, and the method may include determining the first resolution or the second resolution according to the lens sizes of the plurality of lenses.
  • the electronic device may be a device of various types.
  • the electronic device may include, for example, a portable communication device (eg, a smart phone), a computer device, a portable multimedia device, a portable medical device, a camera, a wearable device, or a home appliance device.
  • terms such as “first” and “second” may simply be used to distinguish a component from other components in question, and do not limit the components in other aspects (eg, importance or order). When it is said that one (eg, first) component is “coupled” or “connected” to another (eg, second) component, with or without the terms “functionally” or “communicatively”, it means that the one component can be connected to the other component directly (eg, by wire), wirelessly, or through a third component.
  • the term “module” used in various embodiments of this document may include a unit implemented in hardware, software, or firmware, and may be used interchangeably with terms such as logic, logic block, component, or circuit.
  • a module may be an integrally formed part or a minimum unit or a part of the part that performs one or more functions.
  • the module may be implemented in the form of an application-specific integrated circuit (ASIC).
  • various embodiments of this document may be implemented as software (eg, the program 140) including one or more instructions stored in a storage medium readable by a machine (eg, the electronic device 101); for example, a processor (eg, the processor 120) of the device (eg, the electronic device 101) may call at least one of the stored instructions and execute it.
  • the one or more instructions may include code generated by a compiler or code executable by an interpreter.
  • the device-readable storage medium may be provided in the form of a non-transitory storage medium.
  • 'non-transitory' only means that the storage medium is a tangible device and does not include a signal (eg, an electromagnetic wave); this term does not distinguish between cases where data is stored semi-permanently in the storage medium and cases where it is stored temporarily.
  • the method according to various embodiments disclosed in this document may be provided as included in a computer program product.
  • Computer program products may be traded between sellers and buyers as commodities.
  • the computer program product may be distributed in the form of a machine-readable storage medium (eg, compact disc read only memory (CD-ROM)), or distributed online (eg, downloaded or uploaded) via an application store (eg, Play Store™) or directly between two user devices (eg, smartphones).
  • a part of the computer program product may be temporarily stored or temporarily generated in a machine-readable storage medium such as a memory of a server of a manufacturer, a server of an application store, or a relay server.
  • each component (eg, a module or a program) of the above-described components may include a singular or a plurality of entities, and some of the plurality of entities may be separately disposed in another component.
  • one or more components or operations among the above-described corresponding components may be omitted, or one or more other components or operations may be added.
  • a plurality of components (eg, modules or programs) may be integrated into one component; in this case, the integrated component may perform one or more functions of each of the plurality of components identically or similarly to the way they were performed by the corresponding component among the plurality of components prior to the integration.
  • operations performed by a module, program, or other component may be executed sequentially, in parallel, repeatedly, or heuristically; one or more of the operations may be executed in a different order or omitted, or one or more other operations may be added.

Landscapes

  • Engineering & Computer Science (AREA)
  • Signal Processing (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • Controls And Circuits For Display Device (AREA)
  • Telephone Function (AREA)

Abstract

An electronic device according to an embodiment of the present document comprises: a display; a plurality of camera modules comprising a plurality of image sensors disposed under the display; and a processor electrically connected to the display and the plurality of camera modules, wherein the display comprises a first area having a first pixel density, and a second area having a second pixel density and corresponding to an area in which the plurality of image sensors are disposed, and the processor can obtain first image data having a first resolution by using a first image sensor among the plurality of image sensors, obtain second image data having a second resolution by using a second image sensor among the plurality of image sensors, and generate a result image on the basis of at least the first image data and the second image data. According to various embodiments of the present document, the resolution at low frequencies can be increased and the modulation transfer function (MTF) can be improved by using a plurality of small optical systems adjusted in size, or by relatively reducing an area covered by a pattern of the display by reducing an opening of a camera.
PCT/KR2021/010765 2020-08-14 2021-08-12 Electronic device comprising camera module WO2022035267A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR1020200102641A KR20220021728A (ko) 2020-08-14 2020-08-14 Electronic device including camera module
KR10-2020-0102641 2020-08-14

Publications (1)

Publication Number Publication Date
WO2022035267A1 true WO2022035267A1 (fr) 2022-02-17

Family

ID=80247227

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2021/010765 WO2022035267A1 (fr) Electronic device comprising camera module

Country Status (2)

Country Link
KR (1) KR20220021728A (fr)
WO (1) WO2022035267A1 (fr)



Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2016197878A (ja) * 2008-05-20 2016-11-24 Pelican Imaging Corporation Image capturing and processing using a monolithic camera array having different types of imaging devices
KR20140062801A (ko) * 2012-11-15 2014-05-26 LG Electronics Inc. Array camera, mobile terminal, and method for operating the same
KR20160012743A (ko) * 2014-07-25 2016-02-03 Samsung Electronics Co., Ltd. Photographing apparatus and photographing method thereof
JP2017069926A (ja) * 2015-10-02 2017-04-06 Sony Corporation Image processing device, image processing method, and program
KR20190084397A (ko) * 2018-01-08 2019-07-17 Samsung Electronics Co., Ltd. Electronic device including sensor for generating image data using light incident through opening formed in display

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20220311925A1 (en) * 2021-03-26 2022-09-29 Lenovo (Beijing) Limited Information processing method and electronic apparatus
US11838643B2 (en) * 2021-03-26 2023-12-05 Lenovo (Beijing) Limited Information processing method and electronic apparatus

Also Published As

Publication number Publication date
KR20220021728A (ko) 2022-02-22

Similar Documents

Publication Publication Date Title
WO2020171448A1 (fr) Structure d'agencement de composant électronique et dispositif électronique comprenant une telle structure
WO2020101344A1 (fr) Dispositif électronique et procédé de détection d'éclairement basé sur des informations d'affichage sur un dispositif électronique
WO2019039714A1 (fr) Ensemble caméra et dispositif électronique comprenant ce dernier
WO2019240365A1 (fr) Dispositif électronique comprenant une structure capacitive
WO2021141454A1 (fr) Module de caméra et dispositif électronique le comprenant
EP3729794A1 (fr) Dispositif électronique comprenant un élément conducteur couplé électriquement à l'ouverture d'un support pour ajuster la résonance générée par l'ouverture
WO2022139376A1 (fr) Dispositif électronique comprenant une antenne à cadre
WO2022035267A1 (fr) Dispositif électronique comprenant un module de caméra
WO2023008854A1 (fr) Dispositif électronique comprenant un capteur optique intégré dans une unité d'affichage
WO2022203285A1 (fr) Module de caméra comprenant un ensemble de stabilisation d'image, et dispositif électronique le comprenant
WO2022203336A1 (fr) Dispositif électronique comprenant un trou de microphone ayant une forme de fente
WO2022149954A1 (fr) Dispositif électronique ayant un écran souple et procédé de fourniture d'un panneau de commande en fonction d'un changement de mode de celui-ci
WO2022045579A1 (fr) Dispositif électronique pour corriger la position d'un dispositif externe et son procédé de fonctionnement
WO2022050620A1 (fr) Circuit de détection et dispositif électronique le comprenant
WO2023121068A1 (fr) Dispositif électronique comprenant un numériseur et procédé de fonctionnement associé
WO2022114789A1 (fr) Dispositif électronique et procédé pour obtenir une quantité de lumière
WO2023085650A1 (fr) Module d'appareil photo et dispositif électronique le comprenant
WO2022050627A1 (fr) Dispositif électronique comprenant un affichage souple et procédé de fonctionnement de celui-ci
WO2022098001A1 (fr) Dispositif électronique ayant un écran souple et procédé de commande de module de caméra correspondant
WO2023214653A1 (fr) Dispositif électronique comprenant une structure pour supporter une carte de circuit imprimé souple
WO2024111882A1 (fr) Dispositif électronique comprenant un module de caméra
WO2024071730A1 (fr) Procédé de synchronisation de données entre des dispositifs, et dispositif électronique associé
WO2023013904A1 (fr) Procédé et dispositif de commande d'écran
WO2022154164A1 (fr) Dispositif électronique apte à régler un angle de vue et procédé de fonctionnement associé
WO2023204419A1 (fr) Dispositif électronique, et procédé pour régler une zone d'affichage d'une unité d'affichage sur la base d'une intensité de lumière émise par une unité d'affichage

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21856273

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21856273

Country of ref document: EP

Kind code of ref document: A1