WO2022220444A1 - Scanning method during image capture with a camera, and associated electronic device

Info

Publication number
WO2022220444A1
Authority
WO
WIPO (PCT)
Prior art keywords
reflector
image
electronic device
image frame
processor
Prior art date
Application number
PCT/KR2022/004334
Other languages
English (en)
Korean (ko)
Inventor
김용관
허민
곽호근
양경동
Original Assignee
Samsung Electronics Co., Ltd. (삼성전자 주식회사)
Priority date
Filing date
Publication date
Application filed by Samsung Electronics Co., Ltd. (삼성전자 주식회사)
Publication of WO2022220444A1

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N1/00Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
    • H04N1/40Picture signal circuits
    • H04N1/401Compensating positionally unequal response of the pick-up or reproducing head
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00Image enhancement or restoration
    • G06T5/90Dynamic range modification of images or parts thereof
    • G06T5/94Dynamic range modification of images or parts thereof based on local image properties, e.g. for local contrast enhancement
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N1/00Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N1/00Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
    • H04N1/00127Connection or combination of a still picture apparatus with another apparatus, e.g. for storage, processing or transmission of still picture signals or of information associated with a still picture
    • H04N1/00281Connection or combination of a still picture apparatus with another apparatus, e.g. for storage, processing or transmission of still picture signals or of information associated with a still picture with a telecommunication apparatus, e.g. a switched network of teleprinters for the distribution of text-based information, a selective call terminal
    • H04N1/00307Connection or combination of a still picture apparatus with another apparatus, e.g. for storage, processing or transmission of still picture signals or of information associated with a still picture with a telecommunication apparatus, e.g. a switched network of teleprinters for the distribution of text-based information, a selective call terminal with a mobile telephone apparatus
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N1/00Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
    • H04N1/04Scanning arrangements, i.e. arrangements for the displacement of active reading or reproducing elements relative to the original or reproducing medium, or vice versa
    • H04N1/19Scanning arrangements, i.e. arrangements for the displacement of active reading or reproducing elements relative to the original or reproducing medium, or vice versa using multi-element arrays
    • H04N1/191Scanning arrangements, i.e. arrangements for the displacement of active reading or reproducing elements relative to the original or reproducing medium, or vice versa using multi-element arrays the array comprising a one-dimensional array, or a combination of one-dimensional arrays, or a substantially one-dimensional array, e.g. an array of staggered elements
    • H04N1/192Simultaneously or substantially simultaneously scanning picture elements on one main scanning line
    • H04N1/193Simultaneously or substantially simultaneously scanning picture elements on one main scanning line using electrically scanned linear arrays, e.g. linear CCD arrays
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N1/00Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
    • H04N1/04Scanning arrangements, i.e. arrangements for the displacement of active reading or reproducing elements relative to the original or reproducing medium, or vice versa
    • H04N1/19Scanning arrangements, i.e. arrangements for the displacement of active reading or reproducing elements relative to the original or reproducing medium, or vice versa using multi-element arrays
    • H04N1/191Scanning arrangements, i.e. arrangements for the displacement of active reading or reproducing elements relative to the original or reproducing medium, or vice versa using multi-element arrays the array comprising a one-dimensional array, or a combination of one-dimensional arrays, or a substantially one-dimensional array, e.g. an array of staggered elements
    • H04N1/192Simultaneously or substantially simultaneously scanning picture elements on one main scanning line
    • H04N1/193Simultaneously or substantially simultaneously scanning picture elements on one main scanning line using electrically scanned linear arrays, e.g. linear CCD arrays
    • H04N1/1935Optical means for mapping the whole or part of a scanned line onto the array
    • H04N1/1937Optical means for mapping the whole or part of a scanned line onto the array using a reflecting element, e.g. a mirror or a prism
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/50Constructional details
    • H04N23/54Mounting of pick-up tubes, electronic image sensors, deviation or focusing coils

Definitions

  • Embodiments disclosed in this document relate to a scanning method used when photographing with a camera, and to an electronic device therefor.
  • An electronic device such as a smart phone may include a camera module in which the digital camera is miniaturized.
  • a camera module mounted in an electronic device such as a smart phone is miniaturized in order to satisfy the user's needs.
  • a periscope camera may be employed.
  • a periscope-type camera may include a reflector, such as a prism, capable of redirecting light.
  • the electronic device may perform object tracking (eg, a scan function) through a camera module.
  • Lens shading characteristics may deteriorate, such that different image positions are captured with different brightness even for a uniform subject under constant lighting conditions without a change in contrast.
  • the electronic device may perform lens shading correction (LSC) on the image so that the image acquired by the camera module has uniform brightness.
  • Conventionally, since the movement range of the prism included in the camera module is limited in the pitch and yaw directions, it was difficult to provide a scan function using the camera module. One image was obtained by photographing a light source with the prism at a single fixed position, and one shading correction value extracted from this image was applied.
  • The scan function is now used by actively rotating the prism. When the prism needs to move over a wide range (eg, about 20° to 25°) for the scan function, it is difficult to accurately correct lens shading if a single shading correction value is applied to images acquired at various prism positions. Accordingly, there is a need for a technique for effectively correcting changes in shading characteristics by accurately detecting optical changes in real time.
  • Various embodiments of the present disclosure can provide an electronic device that presets lens shading correction values corresponding to various positions of a prism and performs lens shading correction according to the prism position, which changes while the scan function is performed, as well as a method of controlling the electronic device.
  • An electronic device according to an embodiment includes a camera module including at least one lens, a reflector that changes the field of view (FOV) by redirecting incident light within a first range about the central axis of the at least one lens, at least one Hall sensor for checking position information of the reflector, and a non-volatile memory storing Hall data corresponding to the position information.
  • the electronic device may include at least one processor operatively connected to the camera module.
  • The at least one processor of the electronic device may obtain data for correcting lens shading according to the position of the reflector from the non-volatile memory, drive the camera module to obtain a first image frame, identify a first position of the reflector corresponding to the first image frame through the at least one Hall sensor, and, in response to identifying the first position, perform lens shading correction on the first image frame based on the first position and the obtained data.
  • An operating method of an electronic device according to an embodiment includes obtaining data for correcting lens shading according to a position of a reflector from a non-volatile memory, driving a camera module to acquire a first image frame, identifying a first position of the reflector corresponding to the first image frame through at least one Hall sensor, and, in response to identifying the first position, performing lens shading correction on the first image frame based on the first position and the obtained data.
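  • To make this operating method concrete, the following is a minimal Python sketch, assuming the correction data is a lookup table of per-position gain maps keyed by calibrated (pitch, yaw) Hall-code pairs; the function and variable names are illustrative and are not part of the disclosure.

      import numpy as np

      def correct_frame(frame, pitch_code, yaw_code, lut):
          # frame: HxW image as a float array.
          # pitch_code, yaw_code: 12-bit Hall codes (0..4095) identified for
          # this frame through the Hall sensors.
          # lut: dict mapping calibrated (pitch_code, yaw_code) pairs to HxW
          # gain maps read from the camera module's non-volatile memory.
          key = min(lut, key=lambda k: (k[0] - pitch_code) ** 2
                    + (k[1] - yaw_code) ** 2)  # nearest calibrated position
          return frame * lut[key]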
  • a shading correction value suitable for the position of the prism may be applied.
  • the brightness of the image may be uniform.
  • the shading correction value may be stored in the non-volatile memory of the camera module and applied to various electronic devices employing the corresponding camera module.
  • FIG. 1 illustrates an electronic device according to an embodiment.
  • FIG. 2 shows a camera module according to an embodiment.
  • FIG. 3 is an exploded perspective view of a camera module according to an embodiment.
  • FIG. 4 shows a bottom view and a side view of a camera module according to an embodiment.
  • FIG. 5 is a flowchart illustrating a process of performing lens shading correction according to a position of a prism in an electronic device according to an exemplary embodiment.
  • FIG. 6 is a flowchart illustrating a process of setting a camera module according to an embodiment through a camera module process.
  • FIG. 7 is a flowchart illustrating a process of generating a lookup table for lens shading correction in an electronic device according to an exemplary embodiment.
  • FIG. 8 is a graph illustrating a brightness distribution for each image area when a light source image is captured by the electronic device according to an exemplary embodiment.
  • FIG. 9 illustrates images taken for each position of a prism within a scannable range of an electronic device according to an exemplary embodiment.
  • FIG. 10 is a diagram illustrating a brightness distribution for each area of a photographed image when an image is photographed according to various positions of a prism in an electronic device according to an exemplary embodiment.
  • FIG. 11 is a graph illustrating a result of performing correction by applying one lens shading correction value to images captured according to various positions of a prism in an electronic device according to an embodiment.
  • FIG. 12 is a graph illustrating brightness values of images captured according to various positions of a prism in an electronic device according to an embodiment.
  • FIG. 13 is a block diagram of an electronic device in a network environment according to various embodiments of the present disclosure.
  • FIG. 14 is a block diagram illustrating a camera module according to various embodiments.
  • FIG. 1 schematically illustrates the structure of an electronic device 100 (eg, the electronic device 1301 of FIG. 13) equipped with a camera module 180 (eg, the camera module 1380 of FIG. 13), and the camera module 180 itself, according to an embodiment.
  • Although FIG. 1 is illustrated and described on the premise of a mobile device, in particular a smartphone, it will be clearly understood by those skilled in the art that the disclosure can be applied to various electronic devices, including mobile devices equipped with a camera.
  • a display 110 (eg, the display module 1360 of FIG. 13 ) may be disposed on the front surface of the electronic device 100 according to an embodiment.
  • the display 110 may occupy most of the front surface of the electronic device 100 .
  • a display 110 and a bezel 190 region surrounding at least some edges of the display 110 may be disposed on the front surface of the electronic device 100 .
  • the display 110 may include a flat area and a curved area extending from the flat area toward the side of the electronic device 100 .
  • the electronic device 100 illustrated in FIG. 1 is an example, and various embodiments are possible.
  • the display 110 of the electronic device 100 may include only a flat area without a curved area, or may include a curved area only at one edge instead of both sides.
  • the curved area may extend toward the rear surface of the electronic device, so that the electronic device 100 may include an additional planar area.
  • the electronic device 100 may additionally include a speaker, a receiver, a front camera, a proximity sensor, or a home key.
  • In an embodiment, the electronic device 100 may be provided with the rear cover 150 integrated with the main body of the electronic device.
  • the rear cover 150 may be separated from the main body of the electronic device 100 to have a form in which the battery can be replaced.
  • The rear cover 150 may be referred to as a battery cover or a back cover.
  • a fingerprint sensor 171 for recognizing a user's fingerprint may be included in one area 170 of the display 110 . Since the fingerprint sensor 171 is disposed on a layer below the display 110 , the fingerprint sensor 171 may not be recognized by the user or may be difficult to recognize.
  • a sensor for additional user/biometric authentication in addition to the fingerprint sensor 171 may be disposed in a portion of the display 110 .
  • a sensor for user/biometric authentication may be disposed on an area of the bezel 190 . For example, the IR sensor for iris authentication may be exposed through one area of the display 110 or exposed through one area of the bezel 190 .
  • the front camera 161 may be disposed in one area 160 of the front of the electronic device 100 .
  • the front camera 161 is shown to be exposed through one area of the display 110 , but in another embodiment, the front camera 161 may be exposed through the bezel 190 .
  • the electronic device 100 may include one or more front cameras 161 .
  • the electronic device 100 may include two front cameras, such as a first front camera and a second front camera.
  • the first front camera and the second front camera may be cameras of the same type having the same specifications (eg, pixels), but the first front camera and the second front camera may be implemented as cameras of different specifications.
  • the electronic device 100 may support a function related to a dual camera (eg, 3D imaging, auto focus, etc.) through two front cameras. The above-mentioned description of the front camera may be equally or similarly applied to the rear camera of the electronic device 100 .
  • the front camera 161 may be disposed on the rear surface (eg, the side facing the -z direction) of the one region 160 of the display 110 to face the one region 160 .
  • The front camera 161 may not be visually exposed in the one area 160 and may be implemented as a hidden under-display camera (UDC).
  • an area 160 of the display 110 that at least partially faces the front camera 161 is a part of an area for displaying content, and may be formed as a transmissive area having a specified transmittance.
  • the transmissive region may be formed to have a transmittance in a range of about 5% to about 20%.
  • In an embodiment, the transmissive area may include an area overlapping the effective area of the front camera 161 (eg, the field of view (FOV)) through which light for generating an image with the image sensor (eg, the image sensor 1330 of FIG. 13) passes.
  • the transmissive area of the display 110 may include an area having a lower pixel density and/or wiring density than the surrounding area.
  • Various hardware or sensors 163 that assist photographing, such as a flash, may be additionally disposed. In an embodiment, a distance sensor (eg, a time-of-flight (TOF) sensor) may be included.
  • the distance sensor may be applied to both a front camera and/or a rear camera.
  • the distance sensor may be separately disposed or included and disposed on the front camera and/or the rear camera.
  • At least one physical key may be disposed on a side portion of the electronic device 100 .
  • the first function key 151 for turning on/off the display 110 or turning on/off the power of the electronic device 100 may be disposed on the right edge with respect to the front of the electronic device 100 .
  • the second function key 152 for controlling the volume or screen brightness of the electronic device 100 may be disposed on the left edge with respect to the front surface of the electronic device 100 .
  • additional buttons or keys may be disposed on the front or rear of the electronic device 100 .
  • a physical button or a touch button mapped to a specific function may be disposed in a lower region of the front bezel 190 .
  • In an embodiment, the electronic device 100 may include at least one camera module, a flash, and sensors such as a distance sensor on the rear surface (eg, the surface facing the +z direction) of the electronic device 100.
  • In an embodiment, the camera module 180 includes a lens assembly 182 (eg, the lens assembly 1410 of FIG. 14), an AF/OIS carrier assembly 183, an infrared cut filter 184, an image sensor 185 (eg, the image sensor 1430 of FIG. 14), and an image signal processor (ISP) 187 (eg, the image signal processor 1460 of FIG. 14).
  • In an embodiment, the lens assembly 182, the AF/OIS carrier assembly 183, and the image sensor 185 are disposed not parallel but substantially perpendicular to the direction of light incident on the electronic device 100. For example, the thickness of the electronic device 100 may be reduced by disposing them substantially perpendicular to the direction 101 (eg, the -z direction) of light incident on the electronic device 100.
  • The prism 181 included in the camera module 180 serves as a reflector, changing the direction 101 of incident light to the direction 102 so that the reflected light travels toward the lens assembly 182 and the AF/OIS carrier assembly 183.
  • the prism 181 may reflect light incident in the -z direction to change the direction of the light to the +x direction. Due to this structural characteristic, the electronic device 100 may perform shake correction by controlling the movement of the prism 181 .
  • the prism 181 may be replaced with a mirror.
  • the lens assembly 182 may have different numbers, arrangements, and/or types of lenses depending on the front camera and the rear camera.
  • the front camera and the rear camera may have different characteristics (eg, focal length, maximum magnification, etc.).
  • the lens may move forward and backward along the optical axis, and may operate to change a focal length so that a target object, which is a subject, can be clearly captured.
  • In an embodiment, the camera module 180 may include a lens assembly 182 in which at least one lens is aligned on an optical axis, and an AF/OIS carrier assembly 183 on which at least one coil, at least partially surrounding the periphery of the lens assembly 182 about the optical axis, is mounted.
  • the infrared cut filter 184 may be disposed on the upper surface (eg, -x direction) of the image sensor 185 .
  • the image of the subject passing through the lens may be partially filtered by the infrared cut filter 184 and then detected by the image sensor 185 .
  • the image sensor 185 may be a complementary metal oxide semiconductor (CMOS) sensor or a charged coupled device (CCD) sensor.
  • a plurality of individual pixels are integrated in the image sensor 185 , and each individual pixel may include a micro lens, a color filter, and a photodiode.
  • Each individual pixel is a kind of photodetector that can convert incoming light into an electrical signal. Photodetectors generally cannot detect the wavelength of the captured light by themselves and cannot determine color information.
  • the photodetector may include a photodiode.
  • the image sensor 185 may be electrically connected to the image signal processor 187 connected to the printed circuit board 188 by a connector 186 .
  • a flexible printed circuit board (FPCB) or a cable may be used as the connector 186 .
  • the light information of the subject incident through the lens assembly 182 may be converted into an electrical signal by the image sensor 185 and input to the image signal processor 187 .
  • the image signal processor 187 may be disposed independently of the processor 120 in the electronic device 100 or may be driven as a part of the processor 120 .
  • the camera module 180 may be disposed on the front side as well as the rear side of the electronic device 100 .
  • the electronic device 100 may include a plurality of camera modules 180 as well as one camera module 180 to improve camera performance.
  • the electronic device 100 may further include a front camera 161 for video call or self-camera photography.
  • the front camera 161 may support a relatively low number of pixels compared to the rear camera module.
  • the front camera may be relatively smaller than the rear camera module.
  • the electronic device 100 illustrated in FIG. 1 corresponds to one example, and the form of the device to which the technical idea disclosed in this document is applied is not limited.
  • the technical idea disclosed in this document is applicable to various user devices including a first camera module facing a first direction and a second camera module facing a direction different from the first direction.
  • the technical idea disclosed in this document may be applied to a foldable electronic device that can be folded in a horizontal direction or a foldable electronic device in a vertical direction, a tablet, or a notebook computer.
  • the present technical idea can be applied even when it is possible to arrange the first camera module and the second camera module facing the same direction to face different directions through rotation, folding, deformation, etc. of the device.
  • While the electronic device 100 of the illustrated example has a bar-type or plate-type appearance, various embodiments of the present disclosure are not limited thereto.
  • the illustrated electronic device may be a part of a rollable electronic device.
  • a rollable electronic device is an electronic device capable of bending and deforming the display 110 , so that at least a portion of the display 110 is wound or rolled or accommodated inside the electronic device 100 .
  • the rollable electronic device can be used by expanding the screen display area by unfolding the display 110 or exposing a larger area of the display 110 to the outside according to a user's needs.
  • the display 110 may also be referred to as a slide-out display or an expandable display.
  • FIG. 2 illustrates a camera module 180 according to an embodiment.
  • FIG. 3 is an exploded perspective view of the camera module 180 according to an embodiment.
  • In an embodiment, the camera module 180 may include a cover 310, a housing assembly 320, a prism module 210, a lens assembly 330 (eg, the lens assembly 182 of FIG. 1), and an image sensor 360 (eg, the image sensor 185 of FIG. 1).
  • In an embodiment, the prism module 210 may include a prism 211 (eg, the prism 181 of FIG. 1), a prism connection part 213, a prism holder 215, a first ball set 217, and a second ball set 219.
  • the prism connection part 213 may connect the prism 211 and the prism holder 215 .
  • The prism holder 215 may have a 'U' shape or a 'C' shape, and the space within the prism holder 215 may accommodate the prism 211 and the prism connection part 213.
  • the prism holder 215 is physically coupled to the prism 211 and may move integrally with the prism 211 .
  • the prism holder 215 may protect the prism 211 from external impact.
  • the prism 211 may rotate along a panning axis (eg, a z-axis) and a tilting axis (eg, a y-axis).
  • the prism 211 may rotate based on a tilting axis (eg, a y-axis) based on the first ball set 217 .
  • the prism 211 may rotate based on the panning axis (eg, the z-axis) based on the second ball set 219 .
  • Rotating about the tilting axis (eg, the y-axis) may be understood as rotating in a first direction (eg, a pitch direction), and rotating about the panning axis (eg, the z-axis) may be understood as rotating in a second direction (eg, a yaw direction).
  • the prism module 210 may be referred to as a prism assembly, and/or a prism structure.
  • In an embodiment, the prism 211 included in the prism module 210 is disposed in front of the lens assembly 330 (eg, in the -x-axis direction), and the prism 211 may reflect light incident along one axis (eg, the -z direction) toward the lens assembly 330.
  • the prism 211 may convert the light entering the rear surface (eg, -z direction) of the electronic device 100 by about 90° to direct the light toward the lens assembly 330 .
  • the camera module 180 may include a prism 211 and a prism holder 215 surrounding the prism connector 213 .
  • In an embodiment, the camera module 180 may include at least two magnetic bodies fixed to the prism module 210 and/or the lens assembly 330 (eg, the first magnet 216 of FIG. 2 and/or the second magnet 331 of FIG. 3).
  • the camera module 180 may include at least two driving coils 350 interacting with the at least two or more magnetic bodies.
  • The at least two magnetic bodies move integrally with the prism 211 and/or the lens assembly 330, and the electromagnetic force generated by the at least two driving coils 350 is transmitted through them to the prism 211 and/or the lens assembly 330.
  • the camera module 180 may include at least two Hall sensors 390 .
  • The at least two Hall sensors 390 are described below with reference to FIG. 4, and a detailed description thereof is therefore omitted here.
  • the camera module 180 may include a prism module 210 and a housing assembly 320 in which the lens assembly 330 may be mounted.
  • the housing assembly 320 may be covered by the cover 310 .
  • the camera module 180 includes a plurality of bearings, and the plurality of bearings may support rotation and/or linear motion of the lens assembly 330 .
  • In an embodiment, the image sensor 360 may be disposed on and connected to a printed circuit board 370 (eg, a printed circuit board (PCB), a printed board assembly (PBA), a flexible PCB (FPCB), or a rigid-flex PCB (RFPCB)).
  • The image sensor 360 is disposed at the rear of the lens assembly 330 (eg, in the +x direction) and may collect light passing through the lens assembly 330.
  • the printed circuit board 370 may be electrically connected to an auto focus (AF) driver and an optical image stabilization (OIS) driver.
  • In an embodiment, at least one processor included in the electronic device 100 (eg, the processor 120 of FIG. 1) may generate an OIS control value, and the electronic device 100 may perform shake correction by transmitting an electric signal corresponding to the OIS control value to at least one coil of the OIS driver.
  • In an embodiment, the at least one processor (eg, the processor 120 of FIG. 1) may generate an AF control value to adjust the focal length between the subject and the camera, and the electronic device 100 may implement AF by transmitting an electric signal corresponding to the AF control value to at least one coil of the AF driver.
  • the camera module 180 may include a non-volatile memory 380 .
  • the non-volatile memory 380 may be an electrically erasable and programmable read only memory (EEPROM).
  • the non-volatile memory 380 may be configured as separate hardware from the main memory.
  • For example, the main memory may be configured as separate hardware not included in the camera module 180, while the nonvolatile memory 380 may be included in the camera module 180 and configured as one module with it.
  • the non-volatile memory 380 may be disposed in a part of the camera module.
  • the non-volatile memory 380 may be attached to one surface of the housing assembly 320 .
  • the non-volatile memory 380 may be disposed outside and/or inside the housing assembly 320 .
  • the non-volatile memory 380 may be electrically and/or operatively coupled to the processor 120 .
  • the non-volatile memory 380 may be manufactured together in a module process of the camera module 180 .
  • the non-volatile memory 380 may store optical characteristics of the lens assembly 330 .
  • In an embodiment, the non-volatile memory 380 may store a Hall value according to the position of the prism 211 and an image profile value of an image photographed at that position of the prism 211.
  • FIG. 4 shows the camera module 180 in a bottom view and in a side view, according to an embodiment.
  • the prism 211 may be physically coupled to the first magnet (eg, the first magnet 216 of FIG. 2 ) 410 .
  • the prism 211 may move integrally with the physically coupled first magnet 410 .
  • In an embodiment, since the first magnet 410 may be coupled to the prism holder (eg, the prism holder 215 of FIG. 2), the prism 211 can move integrally with the first magnet 410.
  • the camera module 180 may include a first driving coil 421 , a second driving coil 423 , and a third driving coil 425 .
  • The first driving coil 421 may interact with the first Hall sensor 431, the second driving coil 423 may interact with the second Hall sensor 433, and the third driving coil 425 may interact with the third Hall sensor 435.
  • At least one hall sensor configured to detect the position of the first magnet 410 may be disposed in the center of the at least one driving coil.
  • the first Hall sensor 431 may detect a position of the first magnet 410 in a pitch direction.
  • the second Hall sensor 433 and the third Hall sensor 435 may detect the position of the first magnet 410 in the yaw direction.
  • In an embodiment, the at least one Hall sensor 431, 433, 435 may measure the position of the magnet relative to the Hall sensor through interaction with the opposing magnet (eg, the first magnet 410).
  • In an embodiment, the at least one Hall sensor 431, 433, 435 is disposed at the center of the at least one driving coil 421, 423, 425 and may measure position information of the at least one magnet (eg, position information in the pitch and yaw directions).
  • In an embodiment, the first magnet 410 may be composed of at least two magnets: one may be attached to one side of the prism connection part 213 to detect the position in the pitch direction, and the other may be attached to one side of the prism holder 215 to detect the position in the yaw direction.
  • Lens shading refers to a phenomenon in which, due to the curvature of the lens, rays passing through the center of the lens assembly (eg, the lens assembly 182 of FIG. 1) reach the image sensor (eg, the image sensor 185 of FIG. 1) more than rays passing through the outer part, so that the output image becomes darker toward its edges.
  • the curved shape of the lens and/or the internal diaphragm of the lens can cause lens shading.
  • the processor 120 may obtain data for correcting lens shading according to the position of the reflector (eg, the prism 181 of FIG. 1 ) from the nonvolatile memory.
  • the processor 120 may obtain a lookup table for correcting lens shading from the nonvolatile memory.
  • the lookup table may include a lens shading compensation value corresponding to each position information of the prism 181 .
  • the data for correcting the lens shading and/or the lookup table may be data previously stored in a nonvolatile memory (eg, the nonvolatile memory 380 of FIG. 3 ) included in the camera module 180 .
  • Data stored in the non-volatile memory 380 may be input through a camera module process.
  • Data for correcting lens shading may include a correction function described below with reference to FIG. 7 and/or a correction lookup table.
  • the non-volatile memory may be an electrically erasable programmable read-only memory (EEPROM).
  • the reflector may be a prism or a mirror.
  • the processor 120 may obtain the first image frame by driving the camera module 180 .
  • the processor 120 may obtain an input for executing a camera application (eg, a user input) to drive the camera module 180 .
  • the input may include at least one of a user pressing a function key, touching a camera icon displayed on the display 110, or a user's voice input.
  • the processor 120 may sequentially acquire the second image frame.
  • the processor 120 may identify a first position of a reflector (eg, the prism 181 of FIG. 1 ) corresponding to the first image frame through at least one Hall sensor.
  • The at least one Hall sensor is configured to acquire position information of the reflector, and the processor 120 may acquire the position information of the reflector through the Hall sensor's interaction with a magnet coupled to the reflector.
  • the processor 120 may identify the first position of the reflector corresponding to the time point at which the first image frame is acquired.
  • The time point at which the first image frame is acquired may be at least one of the time point at which the first image frame is captured by the image sensor 185, the time point at which the first image frame is read out from the image sensor 185, and/or the time point at which the first image frame is output from the image sensor 185.
  • According to an embodiment, the processor 120 may identify the position of the prism 181 in a first direction (eg, a pitch direction) through at least one Hall sensor.
  • For example, the processor 120 may acquire a Hall value (Hall code) in the first direction (eg, the pitch direction) through at least one Hall sensor and obtain position information of the prism 181 in the first direction based on the obtained Hall value.
  • According to an embodiment, the processor 120 may identify the position of the prism 181 in a second direction (eg, a yaw direction) through at least one Hall sensor.
  • For example, the processor 120 may acquire a Hall value (Hall code) in the second direction (eg, the yaw direction) through at least one Hall sensor and obtain position information of the prism 181 in the second direction based on the obtained Hall value.
  • According to an embodiment, the processor 120 may confirm the position of the prism 181 based on the Hall value (Hall code) in the first direction (eg, the pitch direction) and the Hall value (Hall code) in the second direction (eg, the yaw direction).
  • the position information of the prism 181 may be previously stored or set to correspond to a hall code obtained from a hall sensor. This may be further described with reference to FIG. 6 or FIG. 9 .
  • the processor 120 may perform lens shading correction on the first image frame based on the first position and the acquired data.
  • According to an embodiment, the camera module 180 may output a darker image toward the edge of the image. Accordingly, in an embodiment, the processor 120 may perform lens shading correction by increasing the brightness values of low-brightness regions of the obtained image frame so that the entire image frame has a uniform brightness value. In an embodiment, lens shading correction may be performed on the first image frame based on the first position of the prism 181 and a lens shading correction value corresponding to the first position.
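  • As a sketch of how a correction value for one prism position can equalize brightness, assuming the stored image profile is a flat-field brightness map (the function name is illustrative):

      import numpy as np

      def gain_map_from_profile(profile):
          # profile: HxW brightness of a uniform light source photographed at
          # one prism position (the image profile stored in the EEPROM).
          # Dividing the peak brightness by each pixel's brightness yields a
          # per-pixel gain greater than 1 for darker regions, so the corrected
          # frame has uniform brightness across its entire area.
          return profile.max() / np.maximum(profile, 1e-6)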
  • the processor 120 may output the first image frame on which the lens shading correction is performed as a preview through a display (eg, the display 110 of FIG. 1 ).
  • FIG. 6 is a flowchart illustrating a process of setting the camera module 180 according to an embodiment through a camera module process.
  • In an embodiment, image capturing may be performed for each Hall code in the camera module process.
  • In an embodiment, image capturing may be performed for each position of a reflector (eg, the prism 181 of FIG. 1). The image capturing may be performed so that an image corresponding to each Hall code is determined in the camera module process.
  • In an embodiment, a Hall value and an image profile value may be stored in the non-volatile memory.
  • For example, image capturing is performed for each position of the reflector (eg, the prism 181 of FIG. 1), and a Hall value indicating the position and an image profile value of the image photographed at that position can be obtained.
  • the image profile value may indicate a brightness value according to a pixel position of the photographed image.
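  • The disclosure does not fix a record format for these stored values; one plausible layout, with assumed field names, is sketched below.

      from dataclasses import dataclass
      import numpy as np

      @dataclass
      class CalibrationSample:
          pitch_code: int            # 12-bit Hall code, pitch direction
          yaw_code: int              # 12-bit Hall code, yaw direction
          image_profile: np.ndarray  # brightness per image region of a
                                     # flat-field shot at this prism position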
  • FIG. 7 is a flowchart illustrating a process of generating a lookup table for lens shading correction in the electronic device 100 according to an exemplary embodiment. Operation 710 of FIG. 7 may be performed after operation 620 of FIG. 6 .
  • the processor 120 may boot the camera module. In an embodiment, when the power of the electronic device 100 is turned from an off state to an on state, the processor 120 may boot the camera module 180 . In an embodiment, when the camera application is executed while the electronic device 100 is powered on, the processor 120 may boot the camera module.
  • In an embodiment, the processor 120 may read a Hall code and an image profile value from the non-volatile memory (eg, EEPROM).
  • In an embodiment, the non-volatile memory may be the non-volatile memory 380 of FIG. 3 physically coupled to the camera module 180.
  • In an embodiment, the Hall value (Hall code) may be a value obtained by converting the Hall voltage according to the resolution of the sensor.
  • In an embodiment, the processor 120 may read a Hall code and an image profile value from the non-volatile memory.
  • For example, the processor 120 may read a Hall code and an image profile value from the non-volatile memory whenever the camera module 180 is booted.
  • According to an embodiment, the processor 120 may generate correction functions for a first direction (eg, a pitch direction) and a second direction (eg, a yaw direction) based on the Hall codes and image profile values read from the non-volatile memory.
  • For example, the processor 120 may generate a first polynomial for the pitch direction based on a Hall value (Hall code) and an image profile value in the first direction (eg, the pitch direction) read from the non-volatile memory (eg, EEPROM).
  • Similarly, the processor 120 may generate a second polynomial for the yaw direction based on a Hall value and an image profile value in the yaw direction read from the non-volatile memory (eg, EEPROM).
  • According to an embodiment, the processor 120 may generate a correction lookup table for the first direction (eg, the pitch direction) and the second direction (eg, the yaw direction) based on the Hall values and image profile values read from the non-volatile memory (eg, EEPROM). In other words, the processor 120 may generate a correction lookup table according to the position of the prism 181.
  • In an embodiment, the processor 120 may generate the correction lookup table based on the correction functions generated from the Hall codes and the image profile values.
  • the correction lookup table may be a table for shading correction values corresponding to position information of a reflector (eg, the prism 181 of FIG. 1 ).
  • For example, the lookup table may be formed so that a first shading correction value is applied when the reflector (eg, the prism 181 of FIG. 1) is at a first position, and a second shading correction value is applied when the reflector is at a second position.
  • In an embodiment, the correction lookup table may be a table of shading correction values corresponding to a Hall code related to the position of the reflector.
  • For example, the lookup table may be formed so that a first shading correction value is applied when the Hall value (Hall code) is a first Hall value, and a second shading correction value is applied when the Hall value is a second Hall value (Hall code). A sketch of generating such a table at boot time follows below.
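  • A minimal sketch of this boot-time table generation, assuming per-region brightness is fitted against the Hall code along one axis with a low-order polynomial and then inverted into gains (the polynomial degree and sampling grid are assumptions):

      import numpy as np

      def fit_region_polynomials(samples, degree=2):
          # samples: list of (hall_code, profile) calibration pairs along one
          # axis (pitch or yaw); profile is a 1-D array of per-region
          # brightness values read from the non-volatile memory.
          codes = np.array([c for c, _ in samples], dtype=float)
          profiles = np.stack([p for _, p in samples])
          return [np.polynomial.Polynomial.fit(codes, profiles[:, r], degree)
                  for r in range(profiles.shape[1])]

      def build_correction_lut(polys, code_grid):
          # Evaluate the fitted brightness model at each Hall code on the grid
          # and invert it into shading gains (peak / predicted brightness).
          lut = {}
          for code in code_grid:
              predicted = np.array([p(code) for p in polys])
              lut[int(code)] = predicted.max() / np.maximum(predicted, 1e-6)
          return lut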
  • In FIG. 8, the horizontal axes of the first graph 803 and the second graph 805 indicate positions corresponding to the numbers displayed on the horizontal axis of the light source image 801, and the vertical axes indicate brightness values.
  • The light source image 801 may be an image obtained by photographing a light source. Since at least one lens of the lens assembly (eg, the lens assembly 182 of FIG. 1) has a curvature, the amount of light reaching the image sensor (eg, the image sensor 185 of FIG. 1) is largest near the center and decreases with distance from the center, according to the curvature.
  • the numbers 1 to 7 included in the light source image 801 may mean first to seventh areas in the first graph 803 and the second graph 805 .
  • the first graph 803 may be a graph showing the brightness distribution for each area of the image before lens shading correction is performed on the light source image 801 .
  • A plurality of graphs are shown; they present multiple results obtained by performing the same process along the vertical axis of the light source image 801.
  • the brightness value is highest in the fourth region that is the center of the light source image 801 , and the brightness value decreases as the distance from the fourth region increases.
  • the second graph 805 may be a graph showing a brightness distribution for each image area after lens shading correction is performed on the light source image 801 .
  • the lens shading correction value may be a value for correcting each value of pixels of an original image.
  • FIG. 9 illustrates images taken for each position of a prism (eg, the prism 181 of FIG. 1 ) within the scannable range 920 of the electronic device 100 according to an exemplary embodiment.
  • the scannable range 920 may be understood as the maximum movement range of the prism 181 or the angle of view range of images captured within the maximum movement range of the prism 181 .
  • the scannable range 920 represents a scannable range for object tracking.
  • For example, the scannable range 920 may be -10° to +10° in the pitch direction and -25° to +25° in the yaw direction.
  • In an embodiment, the range of the Hall value or Hall code may be 0 to 4095 codes on a 12-bit basis.
  • For example, the processor 120 may control the prism 181 in units of approximately 0.005 deg/code (≈ 20 deg / 4096 codes) within -10° to +10° in the pitch direction, and in units of approximately 0.0125 deg/code (≈ 50 deg / 4096 codes) within -25° to +25° in the yaw direction.
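  • Using these figures, the Hall-code-to-angle conversion can be written as below; mapping code 0 to the lower end of each range is an assumption, and only the ranges and step sizes come from the description above.

      CODES = 4096  # 12-bit Hall code range, 0..4095

      def pitch_code_to_deg(code):
          # about 0.005 deg per code (20 deg of travel / 4096 codes)
          return -10.0 + code * (20.0 / CODES)

      def yaw_code_to_deg(code):
          # about 0.0125 deg per code (50 deg of travel / 4096 codes)
          return -25.0 + code * (50.0 / CODES)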
  • the processor 120 may perform a scan based on the acquired image data.
  • the processor 120 may detect an object through the obtained image data and perform a scan to track the detected object.
  • the processor 120 may detect the motion of the detected object based on image data acquired through the camera module 180 .
  • In an embodiment, when the processor 120 detects that the detected object has moved by a first angle, the processor 120 may control the prism to track the moved object. For example, when the processor 120 detects that the object to be photographed has moved by about 2° in the yaw direction with respect to the center of the lens, the processor 120 may rotate or move the prism by an amount corresponding to that 2° to track the object.
  • In an embodiment, the first image 901 to the tenth image 910 may be images captured at respective positions of the prism 181 within the scannable range 920.
  • In an embodiment, the first image 901 to the tenth image 910 may be images taken according to the angle of the prism 181 in the yaw direction and the angle of the prism 181 in the pitch direction.
  • In an embodiment, the first image 901 to the tenth image 910 may be determined by the Hall value (code) corresponding to the angle in each direction (yaw, pitch).
  • For example, an image may correspond to a Hall value (Hall code) of 50 in the yaw direction and a Hall code of 50 in the pitch direction; for the sixth image 906, the Hall value (Hall code) in the yaw direction may be 4050 and the Hall value (Hall code) in the pitch direction may be 1850.
  • the first image 901 to the tenth image 910 may be referred to or understood as a first region of interest (ROI) (eg, ROI 1 ) to a tenth ROI (eg, ROI 10 ).
  • FIG. 10 illustrates a brightness distribution for each area of each photographed image when images are photographed according to various positions of a prism (eg, the prism 181 of FIG. 1 ) in the electronic device 100 according to an embodiment.
  • The horizontal axis of the graphs of FIG. 10 indicates a position corresponding to a number displayed on the horizontal axis of an image (eg, the light source image 801 of FIG. 8), and the vertical axis indicates a brightness value.
  • For each ROI, a plurality of graphs are shown; they present multiple results obtained by performing the same process along the vertical axis of the image (eg, the light source image 801 of FIG. 8).
  • Referring to FIG. 10, it can be seen that the distribution of brightness values for each area of the photographed image differs according to the various positions of the prism 181.
  • a plurality of images may be photographed within the scan range 920 of FIG. 9 , and brightness distribution of the plurality of photographed images may be represented for each area in a horizontal direction (eg, a row).
  • FIG. 10 shows the distribution of brightness values for each region (eg, row) within each of the first image (eg, first ROI) to the tenth image (eg, tenth ROI) shown in FIG. 9.
  • If, as in the prior art, the lens shading correction value applied to one image is applied to another image, the brightness of the image may become non-uniform at each position.
  • FIG. 11 is a graph illustrating the result of performing correction by applying one lens shading correction value to images photographed according to various positions of a prism (eg, the prism 181 of FIG. 1) in the electronic device 100 according to an embodiment.
  • The graph 1101 of FIG. 11 shows the brightness values of the images corresponding to the remaining positions (eg, the first to fourth ROIs and the sixth to tenth ROIs of FIG. 9) corrected based on the lens shading correction value applied to the image corresponding to the reference position (eg, center) of the prism 181 (eg, the fifth ROI of FIG. 9).
  • In an embodiment, the reference position of the prism 181 may be understood as the position of the prism that directs light into the center of the lens assembly.
  • the image corresponding to the reference position of the prism 181 may be understood as an image photographed at the reference position of the prism 181 .
  • The reference image 1103 of FIG. 11 shows the division of the photographed image into areas. To assist the description of the graph 1101, the areas are labeled upper-left, upper, upper-right, left, center, right, lower-left, lower, and lower-right.
  • For example, the graph 1101 shows the results obtained when the fourth image (eg, fourth ROI) and the sixth image (eg, sixth ROI), as well as the first image (eg, first ROI) to the third image (eg, third ROI) and the seventh image (eg, seventh ROI) to the tenth image (eg, tenth ROI), are corrected with the lens shading correction value set to correct the fifth image (eg, fifth ROI).
  • FIG. 12 is a graph illustrating brightness values of images taken according to various positions of a prism 181 in the electronic device 100 according to an exemplary embodiment.
  • In an embodiment, the graph 1201 represents a change in image brightness according to a change in the Hall code in the pitch direction (eg, a change in the position of the prism in the pitch direction).
  • In an embodiment, the brightness change according to the Hall value can be modeled with a polynomial, and the modeling coefficients can be obtained from the Hall value and the image profile stored in the non-volatile memory (eg, EEPROM).
  • the graph 1201 may include a graph modeled based on brightness values obtained from ROI 2 , ROI 5 , and ROI 8 of FIG. 9 .
  • the graph 1201 may be a graph obtained by modeling with Equation 1 below, based on brightness values obtained from ROIs 2, 5, and 8 of FIG. 9 . Equation 1 is an example, and various embodiments that can be implemented by those skilled in the art are possible.
  • In an embodiment, the graph 1203 represents a change in image brightness according to a change in the Hall code in the yaw direction (eg, a change in the position of the prism in the yaw direction).
  • In an embodiment, the brightness change according to the Hall value can be modeled using a polynomial, and the modeling coefficients can be obtained from the Hall code and the image profile value stored in the non-volatile memory (eg, EEPROM).
  • In an embodiment, the graph 1203 may include a graph obtained by modeling the brightness values obtained from ROI 1, ROI 2, and ROI 3 of FIG. 9 with Equation 2 below, a graph obtained by modeling the brightness values obtained from ROI 4, ROI 5, and ROI 6 of FIG. 9 with Equation 3 below, and a graph obtained by modeling the brightness values obtained from ROI 7, ROI 8, ROI 9, and ROI 10 of FIG. 9 with Equation 4 below.
  • Equations 2 to 4 are examples, and various embodiments that can be implemented by those skilled in the art are possible.
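  • Since Equations 1 to 4 are not reproduced here, the following sketch illustrates the modeling idea with a generic least-squares polynomial fit; the code/brightness sample values are illustrative and are not data from the disclosure.

      import numpy as np

      # Illustrative calibration points along the pitch axis: Hall code vs.
      # normalized region brightness. A quadratic is the lowest order that
      # captures the center-peaked shading profile.
      codes = np.array([50.0, 2048.0, 4050.0])
      brightness = np.array([0.82, 1.00, 0.85])

      model = np.poly1d(np.polyfit(codes, brightness, deg=2))
      print(model(1850))  # predicted brightness at pitch Hall code 1850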
  • FIG. 13 is a block diagram of an electronic device 1301 in a network environment 1300 according to various embodiments of the present disclosure.
  • In the network environment 1300, the electronic device 1301 may communicate with the electronic device 1302 through a first network 1398 (eg, a short-range wireless communication network), or with the electronic device 1304 or the server 1308 through a second network 1399 (eg, a long-range wireless communication network). According to an embodiment, the electronic device 1301 may communicate with the electronic device 1304 through the server 1308.
  • The electronic device 1301 includes a processor 1320, a memory 1330, an input module 1350, a sound output module 1355, a display module 1360, an audio module 1370, a sensor module 1376, an interface 1377, a connection terminal 1378, a haptic module 1379, a camera module 1380, a power management module 1388, a battery 1389, a communication module 1390, a subscriber identification module 1396, or an antenna module 1397.
  • In some embodiments, at least one of these components (eg, the connection terminal 1378) may be omitted from the electronic device 1301, or some of these components may be integrated into one component (eg, the display module 1360).
  • The processor 1320 may, for example, execute software (eg, a program 1340) to control at least one other component (eg, a hardware or software component) of the electronic device 1301 connected to the processor 1320, and may perform various data processing or operations. According to an embodiment, as at least part of the data processing or operation, the processor 1320 may store commands or data received from other components (eg, the sensor module 1376 or the communication module 1390) in the volatile memory 1332, process the commands or data stored in the volatile memory 1332, and store the resulting data in the non-volatile memory 1334.
  • The processor 1320 may include a main processor 1321 (eg, a central processing unit or an application processor) or an auxiliary processor 1323 (eg, a graphics processing unit, a neural processing unit (NPU), an image signal processor, a sensor hub processor, or a communication processor).
  • The auxiliary processor 1323 may be set to use less power than the main processor 1321 or to be specialized for a specified function.
  • The auxiliary processor 1323 may be implemented separately from, or as part of, the main processor 1321.
  • The auxiliary processor 1323 may, for example, control at least some of the functions or states related to at least one of the components of the electronic device 1301 (eg, the display module 1360, the sensor module 1376, or the communication module 1390), on behalf of the main processor 1321 while the main processor 1321 is in an inactive (eg, sleep) state, or together with the main processor 1321 while the main processor 1321 is in an active (eg, application-executing) state.
  • According to an embodiment, the auxiliary processor 1323 (eg, an image signal processor or a communication processor) may be implemented as part of another functionally related component (eg, the camera module 1380 or the communication module 1390).
  • the auxiliary processor 1323 may include a hardware structure specialized for processing an artificial intelligence model.
  • Artificial intelligence models can be created through machine learning. Such learning may be performed, for example, in the electronic device 1301 itself on which artificial intelligence is performed, or may be performed through a separate server (eg, the server 1308).
  • The learning algorithm may include, for example, supervised learning, unsupervised learning, semi-supervised learning, or reinforcement learning, but is not limited to the above examples.
  • the artificial intelligence model may include a plurality of artificial neural network layers.
  • The artificial neural network may be one of a deep neural network (DNN), a convolutional neural network (CNN), a recurrent neural network (RNN), a restricted Boltzmann machine (RBM), a deep belief network (DBN), a bidirectional recurrent deep neural network (BRDNN), a deep Q-network, or a combination of two or more thereof, but is not limited to the above examples.
  • the artificial intelligence model may, additionally or alternatively, include a software structure in addition to the hardware structure.
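As a purely illustrative aside (the layer sizes, random weights, and activation below are arbitrary assumptions, not part of the disclosure), the plural-layer structure described above amounts to a chain of weighted transformations:

```python
import numpy as np

def relu(x: np.ndarray) -> np.ndarray:
    # A common per-layer nonlinearity.
    return np.maximum(0.0, x)

def forward(x: np.ndarray, layers) -> np.ndarray:
    """Pass an input through a stack of (weight, bias) layers in sequence."""
    for weight, bias in layers:
        x = relu(weight @ x + bias)
    return x

# Hypothetical two-layer network: 8 inputs -> 16 hidden units -> 4 outputs.
rng = np.random.default_rng(0)
layers = [(rng.standard_normal((16, 8)), np.zeros(16)),
          (rng.standard_normal((4, 16)), np.zeros(4))]
print(forward(rng.standard_normal(8), layers))  # 4-element output vector
```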
  • the memory 1330 may store various data used by at least one component of the electronic device 1301 (eg, the processor 1320 or the sensor module 1376 ).
  • the data may include, for example, input data or output data for software (eg, the program 1340 ) and commands related thereto.
  • the memory 1330 may include a volatile memory 1332 or a non-volatile memory 1334 .
  • the program 1340 may be stored as software in the memory 1330 , and may include, for example, an operating system 1342 , middleware 1344 , or an application 1346 .
  • the input module 1350 may receive a command or data to be used in a component (eg, the processor 1320 ) of the electronic device 1301 from the outside (eg, a user) of the electronic device 1301 .
  • the input module 1350 may include, for example, a microphone, a mouse, a keyboard, a key (eg, a button), or a digital pen (eg, a stylus pen).
  • the sound output module 1355 may output a sound signal to the outside of the electronic device 1301 .
  • the sound output module 1355 may include, for example, a speaker or a receiver.
  • the speaker can be used for general purposes such as multimedia playback or recording playback.
  • the receiver can be used to receive incoming calls. According to an embodiment, the receiver may be implemented separately from or as a part of the speaker.
  • the display module 1360 may visually provide information to the outside (eg, a user) of the electronic device 1301 .
  • the display module 1360 may include, for example, a display, a hologram device, or a projector and a control circuit for controlling the corresponding device.
  • the display module 1360 may include a touch sensor configured to sense a touch or a pressure sensor configured to measure the intensity of a force generated by the touch.
  • the audio module 1370 may convert a sound into an electric signal or, conversely, convert an electric signal into a sound. According to an embodiment, the audio module 1370 may acquire a sound through the input module 1350, or output a sound through the sound output module 1355 or an external electronic device (eg, the electronic device 1302, such as a speaker or headphones) directly or wirelessly connected to the electronic device 1301.
  • the sensor module 1376 may detect an operating state (eg, power or temperature) of the electronic device 1301 or an external environmental state (eg, a user state), and generate an electrical signal or data value corresponding to the detected state.
  • the sensor module 1376 may include, for example, a gesture sensor, a gyro sensor, a barometric pressure sensor, a magnetic sensor, an acceleration sensor, a grip sensor, a proximity sensor, a color sensor, an infrared (IR) sensor, a biometric sensor, a temperature sensor, a humidity sensor, or an illuminance sensor.
  • the interface 1377 may support one or more specified protocols that may be used for the electronic device 1301 to connect directly or wirelessly with an external electronic device (eg, the electronic device 1302 ).
  • the interface 1377 may include, for example, a high definition multimedia interface (HDMI), a universal serial bus (USB) interface, an SD card interface, or an audio interface.
  • the connection terminal 1378 may include a connector through which the electronic device 1301 can be physically connected to an external electronic device (eg, the electronic device 1302).
  • the connection terminal 1378 may include, for example, an HDMI connector, a USB connector, an SD card connector, or an audio connector (eg, a headphone connector).
  • the haptic module 1379 may convert an electrical signal into a mechanical stimulus (eg, vibration or movement) or an electrical stimulus that the user can perceive through tactile or kinesthetic sense.
  • the haptic module 1379 may include, for example, a motor, a piezoelectric element, or an electrical stimulation device.
  • the camera module 1380 may capture still images and moving images. According to an embodiment, the camera module 1380 may include one or more lenses, image sensors, image signal processors, or flashes.
  • the power management module 1388 may manage power supplied to the electronic device 1301 .
  • the power management module 1388 may be implemented as, for example, at least a part of a power management integrated circuit (PMIC).
  • the battery 1389 may supply power to at least one component of the electronic device 1301 .
  • the battery 1389 may include, for example, a non-rechargeable primary cell, a rechargeable secondary cell, or a fuel cell.
  • the communication module 1390 may support establishment of a direct (eg, wired) communication channel or a wireless communication channel between the electronic device 1301 and an external electronic device (eg, the electronic device 1302, the electronic device 1304, or the server 1308), and communication through the established communication channel.
  • the communication module 1390 may operate independently of the processor 1320 (eg, an application processor) and may include one or more communication processors that support direct (eg, wired) communication or wireless communication.
  • the communication module 1390 may include a wireless communication module 1392 (eg, a cellular communication module, a short-range wireless communication module, or a global navigation satellite system (GNSS) communication module) or a wired communication module 1394 (eg, a local area network (LAN) communication module or a power line communication module).
  • A corresponding communication module among these communication modules may communicate with an external electronic device through the first network 1398 (eg, a short-range communication network such as Bluetooth, wireless fidelity (WiFi) Direct, or infrared data association (IrDA)) or the second network 1399 (eg, a long-range communication network such as a legacy cellular network, a 5G network, a next-generation communication network, the Internet, or a computer network).
  • the wireless communication module 1392 may identify or authenticate the electronic device 1301 within a communication network, such as the first network 1398 or the second network 1399, using subscriber information (eg, an International Mobile Subscriber Identity (IMSI)) stored in the subscriber identification module 1396.
  • the wireless communication module 1392 may support a 5G network that succeeds a 4G network, and a next-generation communication technology, for example, new radio (NR) access technology.
  • The NR access technology may support high-speed transmission of high-capacity data (enhanced mobile broadband (eMBB)), minimization of terminal power and access by multiple terminals (massive machine type communications (mMTC)), or high reliability and low latency (ultra-reliable and low-latency communications (URLLC)).
  • the wireless communication module 1392 may support a high frequency band (eg, the mmWave band), for example, to achieve a high data rate.
  • the wireless communication module 1392 may use various techniques for securing performance in a high frequency band, for example, beamforming, massive multiple-input and multiple-output (MIMO), full-dimensional MIMO (FD-MIMO), an array antenna, analog beamforming, or a large-scale antenna.
  • the wireless communication module 1392 may support various requirements specified in the electronic device 1301 , an external electronic device (eg, the electronic device 1304 ), or a network system (eg, the second network 1399 ).
  • the wireless communication module 1392 may support a peak data rate (eg, 20 Gbps or more) for realizing eMBB, loss coverage (eg, 164 dB or less) for realizing mMTC, or U-plane latency (eg, 0.5 ms or less for each of downlink (DL) and uplink (UL), or a round trip of 1 ms or less) for realizing URLLC.
  • the antenna module 1397 may transmit or receive a signal or power to the outside (eg, an external electronic device).
  • the antenna module 1397 may include an antenna including a conductor formed on a substrate (eg, a PCB) or a radiator formed of a conductive pattern.
  • the antenna module 1397 may include a plurality of antennas (eg, an array antenna). In this case, at least one antenna suitable for a communication scheme used in a communication network, such as the first network 1398 or the second network 1399, may be selected from the plurality of antennas by, for example, the communication module 1390. A signal or power may be transmitted or received between the communication module 1390 and an external electronic device through the selected at least one antenna.
  • According to an embodiment, another component (eg, a radio frequency integrated circuit (RFIC)) other than the radiator may be additionally formed as a part of the antenna module 1397.
  • the antenna module 1397 may form a mmWave antenna module.
  • the mmWave antenna module may include a printed circuit board, an RFIC disposed on or adjacent to a first side (eg, the bottom side) of the printed circuit board and capable of supporting a specified high frequency band (eg, the mmWave band), and a plurality of antennas (eg, an array antenna) disposed on or adjacent to a second side (eg, the top or a side) of the printed circuit board and capable of transmitting or receiving signals of the specified high frequency band.
  • At least some of the above-described components may be connected to each other and exchange signals (eg, commands or data) with each other through a communication scheme between peripheral devices (eg, a bus, general purpose input and output (GPIO), serial peripheral interface (SPI), or mobile industry processor interface (MIPI)).
  • the command or data may be transmitted or received between the electronic device 1301 and the external electronic device 1304 through the server 1308 connected to the second network 1399 .
  • Each of the external electronic devices 1302 or 1304 may be a device of the same type as, or a different type from, the electronic device 1301.
  • all or a part of operations executed by the electronic device 1301 may be executed by one or more external electronic devices 1302 , 1304 , or 1308 .
  • For example, when the electronic device 1301 needs to perform a function or service automatically, or in response to a request from a user or another device, the electronic device 1301 may request one or more external electronic devices to perform at least a part of the function or service, instead of, or in addition to, executing the function or service itself.
  • One or more external electronic devices that have received the request may execute at least a part of the requested function or service, or an additional function or service related to the request, and transmit a result of the execution to the electronic device 1301 .
  • the electronic device 1301 may process the result as it is, or additionally process it, and provide it as at least a part of a response to the request.
  • cloud computing, distributed computing, mobile edge computing (MEC), or client-server computing technology may be used.
  • the electronic device 1301 may provide an ultra-low latency service using, for example, distributed computing or mobile edge computing.
  • the external electronic device 1304 may include an Internet of things (IoT) device.
  • the server 1308 may be an intelligent server using machine learning and/or neural networks.
  • the external electronic device 1304 or the server 1308 may be included in the second network 1399 .
  • the electronic device 1301 may be applied to an intelligent service (eg, smart home, smart city, smart car, or health care) based on 5G communication technology and IoT-related technology.
  • FIG. 14 is a block diagram 1400 illustrating a camera module 1380 according to various embodiments.
  • the camera module 1380 may include a lens assembly 1410, a flash 1420, an image sensor 1430, an image stabilizer 1440, a memory 1450 (eg, a buffer memory), or an image signal processor 1460.
  • the lens assembly 1410 may collect light emitted from a subject whose image is to be captured.
  • the lens assembly 1410 may include one or more lenses.
  • the camera module 1380 may include a plurality of lens assemblies 1410 . In this case, the camera module 1380 may form, for example, a dual camera, a 360 degree camera, or a spherical camera.
  • Some of the plurality of lens assemblies 1410 may have the same lens properties (eg, angle of view, focal length, auto focus, f-number, or optical zoom), or at least one lens assembly may have one or more lens properties different from those of the other lens assemblies.
  • the lens assembly 1410 may include, for example, a wide-angle lens or a telephoto lens.
  • the flash 1420 may emit light used to enhance light emitted or reflected from the subject.
  • the flash 1420 may include one or more light emitting diodes (eg, a red-green-blue (RGB) LED, a white LED, an infrared LED, or an ultraviolet LED), or a xenon lamp.
  • the image sensor 1430 may acquire an image corresponding to the subject by converting light emitted or reflected from the subject and transmitted through the lens assembly 1410 into an electrical signal.
  • the image sensor 1430 may include, for example, one image sensor selected from among image sensors having different properties, such as an RGB sensor, a black and white (BW) sensor, an IR sensor, or a UV sensor, a plurality of image sensors having the same properties, or a plurality of image sensors having different properties.
  • Each image sensor included in the image sensor 1430 may be implemented using, for example, a charge-coupled device (CCD) sensor or a complementary metal oxide semiconductor (CMOS) sensor.
  • In response to a movement of the camera module 1380 or the electronic device 1301 including it, the image stabilizer 1440 may move at least one lens included in the lens assembly 1410 or the image sensor 1430 in a specific direction, or may control operation characteristics of the image sensor 1430 (eg, adjust the read-out timing). This makes it possible to compensate for at least some of the negative effects of the movement on the image being captured.
  • According to an embodiment, the image stabilizer 1440 may detect such a movement of the camera module 1380 or the electronic device 1301 using a gyro sensor (not shown) or an acceleration sensor (not shown) disposed inside or outside the camera module 1380.
  • the image stabilizer 1440 may be implemented as, for example, an optical image stabilizer.
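A rough, self-contained sketch of this idea (the sensor interface, scale factors, and small-angle control law below are assumptions for illustration, not the disclosed implementation):

```python
import math

def stabilize_step(gyro_rate_dps: float, dt_s: float, focal_length_px: float) -> float:
    """Return the lens/sensor shift, in pixels, that cancels the rotation
    measured by the gyro during one sample interval.

    gyro_rate_dps: angular rate reported by the gyro sensor, degrees/second.
    dt_s: time since the previous sample, seconds.
    focal_length_px: focal length expressed in pixels.
    """
    angle_rad = math.radians(gyro_rate_dps * dt_s)   # rotation during dt
    # Small-angle model: image shift is approximately f * tan(theta).
    return -focal_length_px * math.tan(angle_rad)    # negative sign compensates

# Example: 5 deg/s hand shake sampled at 1 kHz with a 2800 px focal length
# yields a compensating shift of roughly -0.24 px per sample.
print(stabilize_step(5.0, 0.001, 2800.0))
```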
  • the memory 1450 may temporarily store at least a portion of the image acquired through the image sensor 1430 for a next image processing operation. For example, when image acquisition is delayed by the shutter, or when a plurality of images are acquired at high speed, the acquired original image (eg, a Bayer-patterned image or a high-resolution image) may be stored in the memory 1450, and a copy image corresponding thereto (eg, a low-resolution image) may be previewed through the display module 1360.
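A minimal sketch of that buffer-and-preview flow (illustrative only; the frame format, ring-buffer size, and naive subsampling are assumptions):

```python
from collections import deque
import numpy as np

class FrameBuffer:
    """Keep full-resolution originals for later processing while handing
    out cheap low-resolution copies for the preview display."""

    def __init__(self, max_frames: int = 8, preview_step: int = 4):
        self.originals = deque(maxlen=max_frames)  # ring buffer of originals
        self.preview_step = preview_step           # naive downscale factor

    def push(self, raw: np.ndarray) -> np.ndarray:
        self.originals.append(raw)                 # store the original frame
        # Plain subsampling stands in for a real resizer in this sketch.
        return raw[::self.preview_step, ::self.preview_step]

buf = FrameBuffer()
frame = np.random.randint(0, 1023, (3000, 4000), dtype=np.uint16)  # mock raw frame
preview = buf.push(frame)
print(preview.shape)  # (750, 1000): the low-resolution copy for preview
```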
  • the memory 1450 may be configured as at least a part of the memory 1330 or as a separate memory operated independently of the memory 1330 .
  • the image signal processor 1460 may perform one or more image processing on an image acquired through the image sensor 1430 or an image stored in the memory 1450 .
  • The one or more image processes may include, for example, depth map generation, three-dimensional modeling, panorama generation, feature point extraction, image synthesis, or image compensation (eg, noise reduction, resolution adjustment, brightness adjustment, blurring, sharpening, or softening).
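For orientation only (the stage order and the toy operations below are generic assumptions, not the patent's pipeline), such processing can be composed as a chain of stages over a frame:

```python
import numpy as np

def denoise(img: np.ndarray) -> np.ndarray:
    # 3x3 box blur as a stand-in for real noise reduction.
    padded = np.pad(img, 1, mode="edge")
    h, w = img.shape
    return sum(padded[dy:dy + h, dx:dx + w]
               for dy in range(3) for dx in range(3)) / 9.0

def brighten(img: np.ndarray, gain: float = 1.1) -> np.ndarray:
    # Simple brightness adjustment with clipping.
    return np.clip(img * gain, 0.0, 1.0)

def sharpen(img: np.ndarray, amount: float = 0.5) -> np.ndarray:
    # Unsharp masking: add back the high-frequency residual.
    return np.clip(img + amount * (img - denoise(img)), 0.0, 1.0)

def pipeline(img: np.ndarray) -> np.ndarray:
    for stage in (denoise, brighten, sharpen):
        img = stage(img)
    return img

frame = np.random.rand(120, 160)   # mock normalized frame
print(pipeline(frame).shape)       # (120, 160)
```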
  • Additionally or alternatively, the image signal processor 1460 may perform control (eg, exposure time control or read-out timing control) of at least one of the components included in the camera module 1380 (eg, the image sensor 1430).
  • the image processed by the image signal processor 1460 may be stored back in the memory 1450 for further processing.
  • the image signal processor 1460 may be configured as at least a part of the processor 1320 or as a separate processor operated independently of the processor 1320.
  • When the image signal processor 1460 is configured as a separate processor from the processor 1320, at least one image processed by the image signal processor 1460 may be displayed through the display module 1360 as it is, or after additional image processing is performed by the processor 1320.
  • the electronic device 1301 may include a plurality of camera modules 1380 each having different properties or functions.
  • at least one of the plurality of camera modules 1380 may be a wide-angle camera, and at least the other may be a telephoto camera.
  • at least one of the plurality of camera modules 1380 may be a front camera, and at least the other may be a rear camera.
  • According to an embodiment, the electronic device 100 (eg, the electronic device 1301 of FIG. 13) may include a camera module 180 including at least one lens (eg, the lens assembly 182 of FIG. 1, the lens assembly 330 of FIG. 3), a reflector (eg, the prism 181 of FIG. 1, the prism 211 of FIGS. 2 to 4), at least one Hall sensor 390 (eg, the first Hall sensor 431, the second Hall sensor 433, and the third Hall sensor 435 of FIG. 4) for checking position information of the reflector, and a non-volatile memory 380 for storing Hall data corresponding to the position information; and at least one processor 120 (eg, the processor 1320 of FIG. 13) operatively connected to the camera module 180.
  • the at least one processor 120 may obtain data for correcting lens shading according to the position of the reflector from the non-volatile memory 380 .
  • the at least one processor 120 may obtain a first image frame by driving the camera module 180 .
  • the at least one processor may identify a first position of the reflector corresponding to the first image frame through the at least one Hall sensor 390 .
  • the at least one processor may perform lens shading correction on the first image frame based on the first position and the acquired data in response to identifying the first position.
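The per-frame flow in the preceding bullets can be made concrete with a minimal sketch (an illustration only, not the disclosed implementation; the gain-map representation, the nearest-position lookup, and names such as `correction_luts` are hypothetical):

```python
import numpy as np

# Hypothetical calibration data read once from non-volatile memory:
# reflector position (pitch, yaw) -> per-pixel shading gain map.
correction_luts = {
    (p, y): np.ones((600, 800)) for p in range(-2, 3) for y in range(-2, 3)
}

def nearest_position(pos):
    """Snap a measured Hall position to the closest calibrated position."""
    return min(correction_luts,
               key=lambda k: (k[0] - pos[0]) ** 2 + (k[1] - pos[1]) ** 2)

def lens_shading_correct(frame, pos):
    """Apply the gain map that matches the reflector position of this frame."""
    gain = correction_luts[nearest_position(pos)]
    return np.clip(frame * gain, 0, 1023)

# Per-frame flow: acquire a frame, read the Hall position captured with it,
# then correct the frame with the data for that position.
frame = np.random.randint(0, 1023, (600, 800)).astype(np.float32)
hall_pos = (0.3, -1.2)   # hypothetical pitch/yaw reading from the Hall sensors
corrected = lens_shading_correct(frame, hall_pos)
```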
  • the at least one processor 120 may check the angle of the reflector (eg, the prism 181 of FIG. 1, the prism 211 of FIGS. 2 to 4) in a first direction through the first Hall sensor 431, and check the angle of the reflector in a second direction through the second Hall sensor 433.
  • the first direction may be a pitch direction of the reflector, and the second direction may be a yaw direction of the reflector.
  • the at least one processor 120 may check a first Hall code in the first direction of the reflector (eg, the prism 181 of FIG. 1, the prism 211 of FIGS. 2 to 4) through the first Hall sensor 431, and check a second Hall code in the second direction of the reflector through the second Hall sensor 433.
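A sketch of turning raw Hall codes into pitch and yaw angles (the linear model and its constants are illustrative assumptions; an actual device would rely on per-unit calibration stored in the non-volatile memory):

```python
def hall_code_to_angle(code: int, code_at_zero: int = 2048,
                       degrees_per_code: float = 0.01) -> float:
    """Convert a raw Hall sensor code to a reflector angle in degrees,
    assuming a linear sensor response around the mechanical center."""
    return (code - code_at_zero) * degrees_per_code

pitch_code, yaw_code = 2213, 1907       # hypothetical readings, sensors 431/433
pitch = hall_code_to_angle(pitch_code)  # 1.65 degrees in the pitch direction
yaw = hall_code_to_angle(yaw_code)      # -1.41 degrees in the yaw direction
print(pitch, yaw)
```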
  • the at least one processor 120 may acquire at least one image frame by driving the camera module 180 .
  • the at least one processor 120 may detect an object based on the acquired image frame.
  • based on the detected object, the at least one processor 120 may control the movement of the reflector (eg, the prism 181 of FIG. 1, the prism 211 of FIGS. 2 to 4) within a first range.
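One way to picture this object-driven steering (purely illustrative; the pixel-to-angle gain and the range limits below are assumptions):

```python
def clamp(value: float, low: float, high: float) -> float:
    return max(low, min(high, value))

def steer_toward(offset_px, current_angle,
                 gain_deg_per_px: float = 0.002,
                 pitch_range=(-3.0, 3.0), yaw_range=(-3.0, 3.0)):
    """Nudge the reflector toward an object detected off-center, keeping
    the commanded pitch/yaw inside the allowed first range."""
    dx, dy = offset_px            # object offset from frame center, pixels
    pitch, yaw = current_angle
    pitch = clamp(pitch + dy * gain_deg_per_px, *pitch_range)
    yaw = clamp(yaw + dx * gain_deg_per_px, *yaw_range)
    return pitch, yaw

# Object detected 120 px right of and 40 px above the frame center.
print(steer_toward((120.0, -40.0), (0.0, 0.0)))  # small pitch/yaw correction
```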
  • the non-volatile memory 380 may store at least one of a shading correction function or a correction lookup table corresponding to a position of the reflector (eg, the prism 181 of FIG. 1, the prism 211 of FIGS. 2 to 4).
  • the data for correcting the lens shading may include at least one of a shading correction function or a correction lookup table.
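In general terms, a position-dependent shading correction can be written as a per-pixel gain (this parametric form is an illustrative assumption, not the disclosed function):

```latex
% Corrected frame: the gain G depends on the reflector position p.
I_{\mathrm{corr}}(x, y) = G_{p}(x, y)\, I(x, y),
\qquad
G_{p}(x, y) = \frac{1}{S_{p}(x, y)}
```

Here S_p(x, y) denotes the normalized shading profile measured with the reflector at position p, so applying G_p flattens the brightness falloff for frames captured at that position.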
  • the data for correcting the lens shading may be calculated based on the Hall data and an image profile value indicating a brightness distribution of an image.
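A calibration-time sketch of that computation (assuming one flat-field capture per reflector position and a simple reciprocal gain; the procedure is illustrative, not the patented method):

```python
import numpy as np

def build_correction_lut(flat_field: np.ndarray) -> np.ndarray:
    """Derive a shading-correction gain map from a flat-field frame captured
    at one reflector position (identified by its Hall data).

    The image profile (brightness distribution) is normalized to the frame
    maximum; its reciprocal is the gain that flattens the shading."""
    profile = flat_field.astype(np.float64)
    profile /= profile.max()                    # normalized brightness profile
    return 1.0 / np.clip(profile, 1e-3, None)   # guard against tiny values

# Mock flat-field with radial falloff, as seen through the reflector.
yy, xx = np.mgrid[-1:1:600j, -1:1:800j]
flat = 900.0 * (1.0 - 0.4 * (xx ** 2 + yy ** 2))
lut = build_correction_lut(flat)
print(lut.min(), lut.max())   # gain 1.0 at center, larger toward the corners
```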
  • the at least one processor 120 may acquire a second image frame that is a next frame of the first image frame.
  • the at least one processor 120 may check, through the at least one Hall sensor 390, a second position of the reflector (eg, the prism 181 of FIG. 1, the prism 211 of FIGS. 2 to 4) corresponding to the second image frame.
  • the at least one processor 120 may perform lens shading correction on the second image frame based on the second position and the acquired data.
  • the at least one processor 120 may apply a first lens shading correction value corresponding to the first position to the first image frame.
  • the at least one processor 120 may perform the lens shading correction by applying a second lens shading correction value corresponding to the second position to the second image frame.
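Continuing in the same illustrative vein, consecutive frames simply look up their own correction values (the names and the parabolic gain model below are hypothetical):

```python
import numpy as np

def correction_value_for(position):
    """Stand-in for reading, from the calibrated data, the lens shading
    correction value that matches a given reflector position."""
    strength = 0.1 * (abs(position[0]) + abs(position[1]))
    yy, xx = np.mgrid[-1:1:600j, -1:1:800j]
    return 1.0 + strength * (xx ** 2 + yy ** 2)

frame1, pos1 = np.random.rand(600, 800), (0.0, 0.0)    # first image frame
frame2, pos2 = np.random.rand(600, 800), (1.0, -1.0)   # reflector moved between frames
out1 = frame1 * correction_value_for(pos1)  # first correction value applied
out2 = frame2 * correction_value_for(pos2)  # second correction value applied
```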
  • the reflector (eg, the prism 181 of FIG. 1 and the prism 211 of FIGS. 2 to 4 ) may include at least one of a prism and a mirror.
  • According to an embodiment, a method of operating the electronic device 100 may include an operation of acquiring, from the non-volatile memory 380, data for correcting lens shading according to positions of the reflector (eg, the prism 181 of FIG. 1, the prism 211 of FIGS. 2 to 4), an operation of acquiring a first image frame by driving the camera module 180, an operation of checking a first position of the reflector corresponding to the first image frame through the at least one Hall sensor 390, and an operation of performing lens shading correction on the first image frame based on the first position and the acquired data.
  • The method of operating the electronic device 100 may include an operation of checking the angle of the reflector (eg, the prism 181 of FIG. 1, the prism 211 of FIGS. 2 to 4) in a first direction through the first Hall sensor 431, and an operation of checking the angle of the reflector in a second direction through the second Hall sensor 433.
  • The method of operating the electronic device 100 may include an operation of checking a first Hall code in the first direction of the reflector (eg, the prism 181 of FIG. 1, the prism 211 of FIGS. 2 to 4) through the first Hall sensor 431, and an operation of checking a second Hall code in the second direction of the reflector through the second Hall sensor 433.
  • The method of operating the electronic device 100 may include an operation of acquiring at least one image frame by driving the camera module 180, an operation of detecting an object based on the acquired image frame, and an operation of controlling, based on the detected object, the movement of the reflector (eg, the prism 181 of FIG. 1, the prism 211 of FIGS. 2 to 4) within a first range.
  • The method of operating the electronic device 100 may include an operation of acquiring a second image frame that is the frame next to the first image frame, an operation of checking, through the at least one Hall sensor 390, a second position of the reflector (eg, the prism 181 of FIG. 1, the prism 211 of FIGS. 2 to 4) corresponding to the second image frame, and an operation of performing the lens shading correction on the second image frame based on the second position and the acquired data.
  • The method of operating the electronic device 100 may include an operation of applying a first lens shading correction value corresponding to the first position to the first image frame, and an operation of performing the lens shading correction by applying a second lens shading correction value corresponding to the second position to the second image frame.
  • the electronic device may be a device of various types.
  • the electronic device may include, for example, a portable communication device (eg, a smart phone), a computer device, a portable multimedia device, a portable medical device, a camera, a wearable device, or a home appliance device.
  • Terms such as “first” and “second” may simply be used to distinguish a component from other such components, and do not limit the components in other aspects (eg, importance or order). When one (eg, first) component is referred to as being “coupled” or “connected” to another (eg, second) component, with or without the terms “functionally” or “communicatively”, it means that the one component can be connected to the other component directly (eg, by wire), wirelessly, or through a third component.
  • The term “module” used in various embodiments of this document may include a unit implemented in hardware, software, or firmware, and may be used interchangeably with terms such as logic, logic block, component, or circuit.
  • A module may be an integrally formed part, or a minimum unit of the part or a portion thereof, that performs one or more functions.
  • the module may be implemented in the form of an application-specific integrated circuit (ASIC).
  • Various embodiments of this document may be implemented as software (eg, a program 1340) including one or more instructions stored in a storage medium readable by a machine (eg, the electronic device 1301). For example, a processor (eg, the processor 1320) of the device (eg, the electronic device 1301) may call at least one of the one or more instructions stored in the storage medium and execute it. This allows the device to be operated to perform at least one function according to the at least one called instruction.
  • the one or more instructions may include code generated by a compiler or code executable by an interpreter.
  • the device-readable storage medium may be provided in the form of a non-transitory storage medium.
  • 'Non-transitory' only means that the storage medium is a tangible device and does not contain a signal (eg, an electromagnetic wave); this term does not distinguish between a case where data is semi-permanently stored in the storage medium and a case where it is temporarily stored.
  • the method according to various embodiments disclosed in this document may be provided by being included in a computer program product.
  • Computer program products may be traded between sellers and buyers as commodities.
  • The computer program product may be distributed in the form of a machine-readable storage medium (eg, a compact disc read only memory (CD-ROM)), or may be distributed (eg, downloaded or uploaded) online, either through an application store (eg, Play Store™) or directly between two user devices (eg, smartphones).
  • a portion of the computer program product may be temporarily stored or temporarily generated in a machine-readable storage medium such as a memory of a server of a manufacturer, a server of an application store, or a memory of a relay server.
  • each component (eg, module or program) of the above-described components may include a singular or a plurality of entities, and some of the plurality of entities may be separately disposed in other components.
  • one or more components or operations among the above-described corresponding components may be omitted, or one or more other components or operations may be added.
  • According to various embodiments, a plurality of components (eg, modules or programs) may be integrated into a single component. In this case, the integrated component may perform one or more functions of each of the plurality of components identically or similarly to the way they were performed by the corresponding component among the plurality of components prior to the integration.
  • According to various embodiments, operations performed by a module, program, or other component may be executed sequentially, in parallel, repeatedly, or heuristically; one or more of the operations may be executed in a different order or omitted, or one or more other operations may be added.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Studio Devices (AREA)

Abstract

At least one processor of an electronic device may: obtain, from a non-volatile memory, data for performing lens shading correction according to the position of a reflector; obtain a first image frame by driving a camera module; identify a first position of the reflector corresponding to the first image frame through at least one Hall sensor; and, in response to identifying the first position, perform lens shading correction on the first image frame based on the first position and the obtained data. The disclosure may also relate to various other embodiments.
PCT/KR2022/004334 2021-04-14 2022-03-28 Procédé de balayage lors d'une prise de vue avec un appareil photo et appareil électronique associé WO2022220444A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR1020210048662A KR20220142205A (ko) 2021-04-14 2021-04-14 카메라 촬영 시 스캔하는 방법 및 그 전자 장치
KR10-2021-0048662 2021-04-14

Publications (1)

Publication Number Publication Date
WO2022220444A1 true WO2022220444A1 (fr) 2022-10-20

Family

ID=83639823

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2022/004334 WO2022220444A1 (fr) 2021-04-14 2022-03-28 Procédé de balayage lors d'une prise de vue avec un appareil photo et appareil électronique associé

Country Status (2)

Country Link
KR (1) KR20220142205A (fr)
WO (1) WO2022220444A1 (fr)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH05328123A (ja) * 1992-05-26 1993-12-10 Canon Inc 画像読み取り装置
JPH11305278A (ja) * 1998-04-20 1999-11-05 Asahi Optical Co Ltd 像振れ補正カメラ及び像振れ補正カメラの制御方法
JP2001298656A (ja) * 2000-04-13 2001-10-26 Nec Corp 撮像装置
KR20090027338A (ko) * 2007-09-12 2009-03-17 삼성전자주식회사 굴절면으로 구성된 굴곡 프리즘의 구동을 통한 광경로 보정시스템 및 방법
JP2021027431A (ja) * 2019-08-01 2021-02-22 日本電産サンキョー株式会社 光学ユニット


Also Published As

Publication number Publication date
KR20220142205A (ko) 2022-10-21

Similar Documents

Publication Publication Date Title
WO2022124642A1 (fr) Module de prise de vues pour prendre en charge un zoom optique, et dispositif électronique le comprenant
WO2022039424A1 (fr) Procédé de stabilisation d'images et dispositif électronique associé
EP3931635A1 (fr) Caméra pliée et dispositif électronique le comprenant
WO2022092706A1 (fr) Procédé de prise de photographie à l'aide d'une pluralité de caméras, et dispositif associé
WO2022149654A1 (fr) Dispositif électronique pour réaliser une stabilisation d'image, et son procédé de fonctionnement
WO2022119218A1 (fr) Procédé et dispositif électronique pour corriger un tremblement de caméra
WO2022235043A1 (fr) Dispositif électronique comprenant une pluralité de caméras et son procédé de fonctionnement
WO2022245129A1 (fr) Procédé de suivi d'objet et appareil électronique associé
WO2022196993A1 (fr) Dispositif électronique et procédé de capture d'image au moyen d'un angle de vue d'un module d'appareil de prise de vues
WO2023277298A1 (fr) Procédé de stabilisation d'image et dispositif électronique associé
WO2022139391A1 (fr) Module de caméra et dispositif électronique le comprenant
WO2022220444A1 (fr) Procédé de balayage lors d'une prise de vue avec un appareil photo et appareil électronique associé
WO2021235747A1 (fr) Dispositif électronique comprenant une caméra et un microphone, et son procédé de fonctionnement
WO2022220621A1 (fr) Dispositif électronique comprenant un réflecteur et un ensemble objectif
WO2021251631A1 (fr) Dispositif électronique comprenant une fonction de réglage de mise au point, et procédé associé
WO2022270870A1 (fr) Procédé pour effectuer une photographie de manière multidirectionnelle par l'intermédiaire d'une caméra et dispositif électronique associé
WO2021230507A1 (fr) Procédé et dispositif pour fournir un guidage en imagerie
WO2022211304A1 (fr) Dispositif électronique comprenant un module de caméra
WO2022231251A1 (fr) Procédé de traitement d'images et dispositif électronique pour le prendre en charge
WO2022240186A1 (fr) Procédé de correction de distorsion d'image et dispositif électronique associé
WO2022225248A1 (fr) Procédé et dispositif électronique de stabilisation d'image pendant la capture d'image
WO2023043132A1 (fr) Dispositif électronique d'application d'effet bokeh sur une image et son procédé de fonctionnement
WO2022203355A1 (fr) Dispositif électronique comprenant une pluralité de caméras
WO2022154164A1 (fr) Dispositif électronique apte à régler un angle de vue et procédé de fonctionnement associé
WO2021230567A1 (fr) Procédé de capture d'image faisant intervenir une pluralité d'appareils de prise de vues et dispositif associé

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22788300

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 22788300

Country of ref document: EP

Kind code of ref document: A1