WO2020075953A1 - Method for generating depth information by means of structured light projected onto an external object, and electronic device using same


Info

Publication number: WO2020075953A1
Authority: WO (WIPO, PCT)
Prior art keywords: polygons, electronic device, structured light, pattern, specified
Application number: PCT/KR2019/008220
Other languages: English (en), Korean (ko)
Inventors: 볼로치뉴크안드레이, 모로조프코스티아틴, 계용찬, 벗안드레이, 리모노프알렉산더, 박병훈, 최지환, 홍태화, 야쿠비악안토니, 키즈파벨
Original Assignee: 삼성전자 주식회사 (Samsung Electronics Co., Ltd.)
Application filed by 삼성전자 주식회사
Publication of WO2020075953A1

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T17/00: Three dimensional [3D] modelling, e.g. data description of 3D objects
    • G06T7/00: Image analysis
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00: Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20: Image signal generators
    • H04N13/204: Image signal generators using stereoscopic image cameras
    • H04N13/254: Image signal generators using stereoscopic image cameras in combination with electromagnetic radiation sources for illuminating objects
    • H04N13/271: Image signal generators wherein the generated image signals comprise depth maps or disparity maps
    • H04N13/275: Image signal generators from 3D object models, e.g. computer-generated stereoscopic image signals

Definitions

  • Various embodiments of the present invention relate to an image processing method and an electronic device using the same.
  • Depth information and 3D modeling may be applied, for example, to augmented reality (AR) and virtual reality (VR).
  • the present invention can provide a specific method of obtaining depth information to perform 3D modeling.
  • An electronic device may comprise: a camera; a light emitting module including one or more light emitters; a memory; and a processor, wherein the processor projects structured light corresponding to a specified structured light pattern onto an external object using the light emitting module, and the specified structured light pattern comprises a plurality of first polygons distributed according to a specified rule.
  • A method of an electronic device may comprise: projecting, using a light emitting module, structured light corresponding to a specified structured light pattern onto an external object; obtaining, using a camera, an image in which the structured light is reflected by the external object; identifying a specified repeating grid pattern and at least some of the first polygons based on the acquired image; restoring the other, unidentified first polygons based on the identified repeating grid pattern and the at least some identified first polygons; and generating depth information for the external object based on the positions of the at least some identified first polygons and the restored first polygons, wherein the specified structured light pattern includes a plurality of first polygons distributed according to a specified rule and a plurality of second polygons connecting at least a portion of the plurality of first polygons based on the specified repeating grid pattern.
  • the electronic device may acquire depth information about an external object, and perform 3D modeling of the external object based on the depth information.
  • FIG. 1A is a block diagram of an electronic device in a network environment according to various embodiments of the present disclosure.
  • FIG. 1B is a block diagram of a camera module according to various embodiments of the present invention.
  • FIG. 2 is a diagram schematically illustrating a 3D modeling system of an electronic device according to various embodiments of the present disclosure.
  • FIG. 3 is a block diagram schematically illustrating a depth sensing system of an electronic device according to various embodiments of the present disclosure.
  • FIG. 4 is a flowchart illustrating a method of sensing a depth of an external object of an electronic device according to various embodiments of the present disclosure.
  • FIG. 5 is a diagram illustrating a logical reference pattern of an electronic device according to various embodiments of the present disclosure.
  • FIG. 6 is a diagram illustrating a first structured light pattern of an electronic device according to various embodiments of the present disclosure.
  • FIG. 7 is a view illustrating a composite structured light pattern in which a first structured light pattern and a second structured light pattern of an electronic device are combined according to various embodiments of the present disclosure.
  • FIGS. 8A to 8C are diagrams illustrating a method for generating a composite structured light pattern of an electronic device according to various embodiments of the present disclosure.
  • FIGS. 9A to 9C are diagrams illustrating a method of using a cellular automata to generate a logical reference pattern of an electronic device according to various embodiments of the present disclosure.
  • FIG. 10A is a diagram illustrating an image, obtained by projecting a composite structured light pattern of an electronic device, that is reflected by an external object, according to various embodiments of the present disclosure.
  • FIG. 10B is a diagram illustrating a physical shape corresponding to at least a portion of an external object, obtained from an image reflected by the external object of the electronic device, according to various embodiments of the present disclosure.
  • FIG. 10C is a diagram illustrating a logical shape generated based on an acquired physical shape of an electronic device according to various embodiments of the present disclosure.
  • FIGS. 11A to 11E are diagrams illustrating a method of restoring at least a portion of a previously generated logical reference pattern from an image reflected by an external object, utilizing the cellular automata of an electronic device according to various embodiments of the present disclosure.
  • FIG. 12 is a diagram illustrating a method of generating a lookup map from a previously generated logical reference pattern of an electronic device according to various embodiments of the present disclosure.
  • FIG. 13 is a diagram illustrating a method of acquiring a connected component through at least some of the recovered logical reference patterns of an electronic device and acquiring at least one coding word included in the connected component according to various embodiments of the present disclosure.
  • FIG. 14 is a diagram illustrating a method of generating depth information of an external object through a look-up map of an electronic device according to various embodiments of the present disclosure.
  • FIG. 15 is a flowchart illustrating a method of obtaining a connected component from an image reflected by an external object of an electronic device according to various embodiments of the present disclosure.
  • FIG. 16 is a flowchart illustrating a method of generating depth information corresponding to a connected component based on coding words included in the connected component of an electronic device according to various embodiments of the present disclosure.
  • FIG. 17 is a diagram illustrating 3D modeling using connected components of an electronic device according to various embodiments of the present disclosure.
  • FIGS. 18A and 18B are diagrams illustrating 3D modeling and self-correction using a composite structured light pattern of an electronic device according to various embodiments of the present disclosure.
  • FIG. 1A is a block diagram of an electronic device 101 in a network environment 100 according to various embodiments.
  • the electronic device 101 may communicate with the electronic device 102 through the first network 198 (eg, a short-range wireless communication network), or with the electronic device 104 or the server 108 through the second network 199 (eg, a long-range wireless communication network).
  • the electronic device 101 may communicate with the electronic device 104 through the server 108.
  • the electronic device 101 may include a processor 120, a memory 130, an input device 150, an audio output device 155, a display device 160, an audio module 170, a sensor module 176, an interface 177, a haptic module 179, a camera module 180, a power management module 188, a battery 189, a communication module 190, a subscriber identification module 196, or an antenna module 197.
  • In some embodiments, at least one of these components (eg, the display device 160 or the camera module 180) may be omitted from the electronic device 101, or one or more other components may be added. In some embodiments, some of these components may be implemented as a single integrated circuit. For example, the sensor module 176 (eg, a fingerprint sensor, an iris sensor, or an illuminance sensor) may be implemented embedded in the display device 160 (eg, a display).
  • the processor 120 may execute, for example, software (eg, the program 140) to control at least one other component (eg, a hardware or software component) of the electronic device 101 connected to the processor 120, and may perform various data processing or operations. According to one embodiment, as at least part of the data processing or operations, the processor 120 may load instructions or data received from another component (eg, the sensor module 176 or the communication module 190) into the volatile memory 132, process the instructions or data stored in the volatile memory 132, and store the resulting data in the non-volatile memory 134.
  • the processor 120 may include a main processor 121 (eg, a central processing unit or an application processor) and an auxiliary processor 123 (eg, a graphics processing unit, an image signal processor, a sensor hub processor, or a communication processor) that can operate independently of, or together with, the main processor. Additionally or alternatively, the auxiliary processor 123 may be set to use less power than the main processor 121, or to be specialized for a designated function. The auxiliary processor 123 may be implemented separately from the main processor 121 or as part of it.
  • the auxiliary processor 123 may control at least some of the functions or states associated with at least one of the components of the electronic device 101 (eg, the display device 160, the sensor module 176, or the communication module 190), for example, in place of the main processor 121 while the main processor 121 is in an inactive (eg, sleep) state, or together with the main processor 121 while the main processor 121 is in an active (eg, application execution) state. According to an embodiment, the auxiliary processor 123 (eg, an image signal processor or a communication processor) may be implemented as part of another functionally related component (eg, the camera module 180 or the communication module 190).
  • the memory 130 may store various data used by at least one component of the electronic device 101 (eg, the processor 120 or the sensor module 176).
  • the data may include, for example, software (eg, the program 140) and input data or output data for commands related thereto.
  • the memory 130 may include a volatile memory 132 or a non-volatile memory 134.
  • the program 140 may be stored as software in the memory 130, and may include, for example, an operating system 142, middleware 144, or an application 146.
  • the input device 150 may receive commands or data to be used for components (eg, the processor 120) of the electronic device 101 from outside (eg, a user) of the electronic device 101.
  • the input device 150 may include, for example, a microphone, mouse, keyboard, or digital pen (eg, a stylus pen).
  • the audio output device 155 may output an audio signal to the outside of the electronic device 101.
  • the audio output device 155 may include, for example, a speaker or a receiver.
  • the speaker can be used for general purposes such as multimedia playback or recording playback, and the receiver can be used to receive an incoming call.
  • the receiver may be implemented separately from, or as part of, a speaker.
  • the display device 160 may visually provide information to the outside of the electronic device 101 (eg, a user).
  • the display device 160 may include, for example, a display, a hologram device, or a projector and a control circuit for controlling the device.
  • the display device 160 may include touch circuitry configured to sense a touch, or sensor circuitry (eg, a pressure sensor) configured to measure the strength of a force generated by a touch.
  • the audio module 170 may convert sound into an electrical signal, or vice versa. According to an embodiment, the audio module 170 may acquire sound through the input device 150, or may output sound through the audio output device 155, or through an external electronic device (eg, the electronic device 102, such as speakers or headphones) directly or wirelessly connected to the electronic device 101.
  • the sensor module 176 may detect an operating state (eg, power or temperature) of the electronic device 101 or an external environmental state (eg, a user state), and generate an electrical signal or data value corresponding to the detected state.
  • the sensor module 176 may include, for example, a gesture sensor, a gyro sensor, a barometric pressure sensor, a magnetic sensor, an acceleration sensor, a grip sensor, a proximity sensor, a color sensor, an infrared (IR) sensor, a biometric sensor, a temperature sensor, a humidity sensor, or an illuminance sensor.
  • the interface 177 may support one or more designated protocols that can be used for the electronic device 101 to be directly or wirelessly connected to an external electronic device (eg, the electronic device 102).
  • the interface 177 may include, for example, a high definition multimedia interface (HDMI), a universal serial bus (USB) interface, an SD card interface, or an audio interface.
  • connection terminal 178 may include a connector through which the electronic device 101 can be physically connected to an external electronic device (eg, the electronic device 102).
  • the connection terminal 178 may include, for example, an HDMI connector, a USB connector, an SD card connector, or an audio connector (eg, a headphone connector).
  • the haptic module 179 may convert electrical signals into mechanical stimuli (eg, vibration or movement) or electrical stimuli that the user can perceive through tactile or motor sensations.
  • the haptic module 179 may include, for example, a motor, a piezoelectric element, or an electrical stimulation device.
  • the camera module 180 may capture still images and videos. According to one embodiment, the camera module 180 may include one or more lenses, image sensors, image signal processors, or flashes.
  • the power management module 188 may manage power supplied to the electronic device 101.
  • the power management module 188 may be implemented, for example, as at least part of a power management integrated circuit (PMIC).
  • the battery 189 may supply power to at least one component of the electronic device 101.
  • the battery 189 may include, for example, a non-rechargeable primary cell, a rechargeable secondary cell, or a fuel cell.
  • the communication module 190 may support establishing a direct (eg, wired) communication channel or a wireless communication channel between the electronic device 101 and an external electronic device (eg, the electronic device 102, the electronic device 104, or the server 108), and performing communication through the established channel.
  • the communication module 190 operates independently of the processor 120 (eg, an application processor), and may include one or more communication processors supporting direct (eg, wired) communication or wireless communication.
  • the communication module 190 may include a wireless communication module 192 (eg, a cellular communication module, a short-range wireless communication module, or a global navigation satellite system (GNSS) communication module) or a wired communication module 194 (eg, a local area network (LAN) communication module or a power line communication module).
  • A corresponding one of these communication modules may communicate with external electronic devices through the first network 198 (eg, a short-range communication network such as Bluetooth, WiFi direct, or infrared data association (IrDA)) or the second network 199 (eg, a long-range communication network such as a cellular network, the Internet, or a computer network such as a LAN or WAN).
  • the wireless communication module 192 may identify and authenticate the electronic device 101 in a communication network, such as the first network 198 or the second network 199, using subscriber information (eg, an international mobile subscriber identity (IMSI)) stored in the subscriber identification module 196.
  • the antenna module 197 may transmit a signal or power to the outside (eg, an external electronic device) or receive it from the outside.
  • the antenna module may include a single antenna including a conductor formed on a substrate (eg, a PCB) or a radiator made of a conductive pattern.
  • the antenna module 197 may include a plurality of antennas. In this case, at least one antenna suitable for the communication method used in a communication network, such as the first network 198 or the second network 199, may be selected from the plurality of antennas by, for example, the communication module 190.
  • the signal or power may be transmitted or received between the communication module 190 and an external electronic device through the at least one selected antenna.
  • According to some embodiments, components other than the radiator (eg, an RFIC) may be additionally formed as part of the antenna module 197.
  • At least some of the above components may be connected to each other through a communication method between peripheral devices (eg, a bus, general purpose input and output (GPIO), serial peripheral interface (SPI), or mobile industry processor interface (MIPI)) and may exchange signals (eg, commands or data) with each other.
  • commands or data may be transmitted or received between the electronic device 101 and the external electronic device 104 through the server 108 connected to the second network 199.
  • Each of the electronic devices 102 and 104 may be the same or a different type of device from the electronic device 101.
  • all or some of the operations performed on the electronic device 101 may be performed on one or more external devices of the external electronic devices 102, 104, or 108.
  • when the electronic device 101 needs to perform a certain function or service automatically, or in response to a request from a user or another device, the electronic device 101 may, instead of or in addition to executing the function or service itself, request one or more external electronic devices to perform at least a portion of the function or service.
  • the one or more external electronic devices receiving the request may execute at least a part of the requested function or service, or an additional function or service related to the request, and deliver the result of the execution to the electronic device 101.
  • the electronic device 101 may process the result, as it is or additionally, and provide it as at least part of a response to the request.
  • cloud computing, distributed computing, or client-server computing technology can be used.
  • the electronic device may be various types of devices.
  • the electronic device may include, for example, a portable communication device (eg, a smart phone), a computer device, a portable multimedia device, a portable medical device, a camera, a wearable device, or a home appliance device.
  • the camera module 180 may include a lens assembly 181, a flash 182, an image sensor 183, an image stabilizer 184, a memory 185 (eg, a buffer memory), or an image signal processor 186.
  • the lens assembly 181 may collect light emitted from a subject that is an object of image capture.
  • the lens assembly 181 may include one or more lenses.
  • the camera module 180 may include a plurality of lens assemblies 181. In this case, the camera module 180 may be, for example, a dual camera, a 360 degree camera, or a spherical camera.
  • the plurality of lens assemblies 181 may have the same lens properties (eg, angle of view, focal length, autofocus, f number, or optical zoom), or at least one lens assembly may have lens properties different from those of at least one other lens assembly.
  • the lens assembly 181 may include, for example, a wide-angle lens or a telephoto lens.
  • the flash 182 may emit light used to enhance the light coming from a subject.
  • the flash 182 may include one or more light emitting diodes (eg, red-green-blue (RGB) LED, white LED, infrared LED, or ultraviolet LED), or xenon lamp.
  • the image sensor 183 may obtain an image corresponding to the subject by converting light transmitted from the subject through the lens assembly 181 into an electrical signal.
  • the image sensor 183 may include one image sensor selected from among image sensors having different attributes, such as an RGB sensor, a black and white (BW) sensor, an IR sensor, or a UV sensor, a plurality of image sensors having the same attribute, or a plurality of image sensors having different attributes.
  • Each image sensor included in the image sensor 183 may be implemented as, for example, a charged coupled device (CCD) sensor or a complementary metal oxide semiconductor (CMOS) sensor.
  • the image stabilizer 184 may, in response to a movement of the camera module 180 or the electronic device 101 including it, move at least one lens included in the lens assembly 181 or the image sensor 183 in a specific direction, or control it (eg, adjust the read-out timing), to compensate for at least some negative effects (eg, image shaking) of the movement on the captured image. According to an embodiment, the image stabilizer 184 may be implemented as, for example, an optical image stabilizer, and may detect the movement using a gyro sensor (not shown) or an acceleration sensor (not shown) disposed inside or outside the camera module 180.
  • the memory 185 may temporarily store at least a part of an image acquired through the image sensor 183 for a subsequent image processing operation. For example, when image acquisition is delayed by the shutter, or when a plurality of images are acquired at high speed, the acquired original image (eg, a high-resolution image) may be stored in the memory 185, and a corresponding copy image (eg, a low-resolution image) may be previewed through the display device 160. Thereafter, when a specified condition is satisfied (eg, a user input or a system command), at least a part of the original image stored in the memory 185 may be obtained and processed, for example, by the image signal processor 186.
  • the memory 185 may be configured as at least a part of the memory 130 or a separate memory operated independently of the memory 130.
  • the image signal processor 186 may perform image processing (eg, depth map creation, 3D modeling, panorama generation, feature point extraction, image synthesis, or image compensation such as noise reduction, resolution adjustment, brightness adjustment, blurring, sharpening, or softening) on images obtained through the image sensor 183 or images stored in the memory 185.
  • Additionally or alternatively, the image signal processor 186 may perform control (eg, exposure time control or read-out timing control) of at least one of the components included in the camera module 180 (eg, the image sensor 183).
  • Images processed by the image signal processor 186 may be stored again in the memory 185 for further processing, or provided to a component outside the camera module 180 (eg, the memory 130, the display device 160, the electronic device 102, the electronic device 104, or the server 108).
  • According to an embodiment, the image signal processor 186 may be configured as at least a part of the processor 120, or as a separate processor operated independently of the processor 120. When configured as a separate processor, images processed by the image signal processor 186 may be displayed through the display device 160 as they are, or after additional image processing by the processor 120.
  • the electronic device 101 may include two or more camera modules 180 each having different attributes or functions.
  • the at least one camera module 180 may be a wide-angle camera or a front camera
  • the at least one other camera module may be a telephoto camera or a rear camera.
  • When any (eg, first) component is referred to as being "coupled" or "connected" to another (eg, second) component, with or without the term "functionally" or "communicatively", it means that the component can be connected to the other component directly (eg, by wire), wirelessly, or through a third component.
  • The term "module" may include a unit implemented in hardware, software, or firmware, and may be used interchangeably with terms such as logic, logic block, component, or circuit.
  • the module may be an integrally configured component or a minimum unit of the component or a part thereof performing one or more functions.
  • the module may be implemented in the form of an application-specific integrated circuit (ASIC).
  • Various embodiments of the present document may be implemented as software (eg, the program 140) including one or more instructions stored in a storage medium (eg, the internal memory 136 or the external memory 138) readable by a machine (eg, the electronic device 101). For example, a processor (eg, the processor 120) of the machine may call at least one of the one or more stored instructions from the storage medium and execute it.
  • the one or more instructions may include code generated by a compiler or code executable by an interpreter.
  • the storage medium readable by the device may be provided in the form of a non-transitory storage medium.
  • Here, 'non-transitory' only means that the storage medium is a tangible device and does not include a signal (eg, an electromagnetic wave); the term does not distinguish between cases where data is stored semi-permanently in the storage medium and cases where it is stored temporarily.
  • a method according to various embodiments disclosed in this document may be provided as being included in a computer program product.
  • Computer program products are commodities that can be traded between sellers and buyers.
  • the computer program product may be distributed in the form of a device-readable storage medium (eg, compact disc read only memory (CD-ROM)), or distributed (eg, downloaded or uploaded) online through an application store (eg, Play Store™) or directly between two user devices (eg, smartphones).
  • In the case of online distribution, at least a portion of the computer program product may be temporarily stored, or temporarily generated, in a device-readable storage medium such as the memory of a manufacturer's server, an application store's server, or a relay server.
  • each component (eg, module or program) of the above-described components may include a singular or a plurality of entities.
  • one or more components or operations of the above-described corresponding components may be omitted, or one or more other components or operations may be added.
  • According to various embodiments, a plurality of components (eg, modules or programs) may be integrated into a single component.
  • In this case, the integrated component may perform one or more functions of each of the plurality of components in the same or a similar manner as they were performed by the corresponding component prior to the integration.
  • operations performed by a module, program, or other component may be executed sequentially, in parallel, repeatedly, or heuristically, or one or more of the operations may be executed in a different order or omitted, or one or more other operations may be added.
  • FIG. 2 is a diagram schematically illustrating a 3D modeling system of an electronic device according to various embodiments of the present disclosure.
  • the 3D modeling system of the electronic device 101 may include a projector 210, structured light pattern generators 215 and 216, and a camera 220.
  • the projector 210 may project light onto the external objects 231 and 232, using electromagnetic waves or light as a source.
  • the structured light pattern generators 215 and 216 may include a first structured light pattern generator 215 and a second structured light pattern generator 216.
  • the structured light pattern generators 215 and 216 may generate a structured light pattern using an electromagnetic field filter or mask.
  • the first structured light pattern 215 may include a logical reference pattern generated by using a cellular automata (CA).
  • the first structured light pattern 215 may include a logical reference pattern itself or a pattern in which some modifications are made to the logical reference pattern.
  • the first structured light pattern 215 may be combined with the second structured light pattern 216 to be used to generate a composite structured light pattern.
  • the second structured light pattern 216 may include a grid pattern repeatedly arranged.
  • the second structured light pattern 216 may include at least two types of polygons, and the at least two types of polygons may have different shapes or colors.
  • the second structured light pattern 216 may be combined with the first structured light pattern 215 to be used to generate a composite structured light pattern.
  • the light projected from the projector 210 of the electronic device 101 may include the composite structured light pattern generated from the first structured light pattern 215 and the second structured light pattern 216.
  • the camera 220 of the electronic device 101 may acquire or capture an image 225 in which the projected composite structured light pattern is reflected by the objects 231 and 232.
  • the light projected from the projector 210 of the electronic device 101 may be reflected by the first object 231 or the second object 232, and the optical paths 211 and 212 may differ because the distances from the electronic device 101 to the first object 231 and to the second object 232 differ. Due to the difference in the optical paths 211 and 212, a parallax shift 240 may occur, and based on the parallax shift 240, the electronic device 101 may generate depth information between itself and the objects 231 and 232.
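The patent does not spell out the geometry, but the standard projector-camera triangulation relation shows how a parallax shift maps to depth. In the minimal sketch below, the focal length, baseline, and disparity values are hypothetical illustrations, not values from the patent.

```python
def depth_from_disparity(disparity_px: float,
                         focal_px: float,
                         baseline_m: float) -> float:
    """Standard triangulation: depth Z = f * B / d.

    disparity_px: parallax shift of a pattern feature, in pixels
    focal_px:     camera focal length, in pixels
    baseline_m:   projector-to-camera baseline, in meters
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

# Hypothetical numbers: a feature shifted by 24 px, with an 800 px focal
# length and a 5 cm baseline, lies at about 1.67 m.
print(depth_from_disparity(24, 800, 0.05))
```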
  • FIG. 3 is a block diagram schematically illustrating a depth sensing system of an electronic device according to various embodiments of the present disclosure.
  • the electronic device 101 may generate depth information of an external object using the depth detection system 300.
  • the depth detection system 300 may include a structured light pattern projection module 310 and a depth information generation module 320.
  • the structured light pattern projection module 310 may project a predetermined or previously generated structured light pattern onto an external object. For example, since generating a new structured light pattern each time the electronic device 101 performs 3D modeling would continuously load the processor (eg, the processor 120 of FIG. 1), the structured light pattern to be projected may be generated in advance by the manufacturer and stored in the memory of the electronic device 101 (eg, the memory 130 of FIG. 1).
  • the structured light pattern projection module 310 may project light based on a logical reference pattern generated by a cellular automata. Alternatively, the structured light pattern projection module 310 may project light that has modified at least a part of the logical reference pattern generated by the cellular automata.
  • the structured light pattern projection module 310 may project at least a part of a logical reference pattern generated by the cellular automata, a structured light pattern generated based on the logical reference pattern, or a structured light pattern in which the logical reference pattern is modified. Since these structured light patterns are all based on a logical reference pattern, they may include structures conforming to the rules of the cellular automata.
  • the depth information generation module 320 may obtain an image in which the light projected by the structured light pattern projection module 310 is reflected by an external object.
  • the depth information generation module 320 may check the disparity of the parallax shift to generate depth information for the external object.
  • when the external object is a uniform plane, depth information may be generated simply by considering the parallax shift of the structured light pattern; however, when at least a part of the external object is not uniform, the structured light pattern may be distorted, and in the case of severe distortion, data loss may occur.
  • the depth information generation module 320 may recover a pre-generated structured light pattern or a pre-generated logical reference pattern from the acquired image.
  • the depth information generation module 320 may recover at least a part of the pre-generated logical reference pattern from at least a part of the acquired image, using the cellular automata rule used for the pre-generated logical reference pattern.
  • the depth information generation module 320 may acquire or confirm a connected component representing the same level of depth in at least some of the recovered logical reference patterns.
  • the depth information generation module 320 may acquire or confirm at least one coding word included in the acquired connected component.
  • the depth information generation module 320 may determine, from the coding words included in a connected component, which part of the previously generated logical reference pattern the connected component corresponds to. For example, the coding words corresponding to each cell of the generated logical reference pattern may be stored as a lookup map. Accordingly, by comparing the coding words included in the connected component with the pre-stored lookup map, it is possible to determine at which position in the logical reference pattern the connected component was generated.
  • the depth information generation module 320 may determine how much parallax shift has occurred relative to that position in the previously generated logical reference pattern, and thereby generate depth information for at least the part of the acquired image corresponding to the connected component.
  • the functions performed by the depth sensing system 300 have been described in general, and specific embodiments will be described with reference to FIGS. 4 to 18 below.
  • FIG. 4 is a flowchart illustrating a method of sensing a depth of an external object of an electronic device according to various embodiments of the present disclosure.
  • the electronic device 101 may project a structured light pattern to an external object.
  • the electronic device 101 may include a projector, and may project a structured light pattern to an external object by using electromagnetic waves or light as a source through the projector.
  • the structured light pattern projected by the electronic device 101 may include a logical reference pattern generated by using a cellular automata (CA).
  • the electronic device 101 may acquire an image in which the structured light pattern is reflected by an external object.
  • the electronic device 101 may include a camera, and acquire an image in which a structured light pattern projected from a projector is reflected by an external object.
  • the electronic device 101 may sense the depth of the external object based on the acquired image.
  • the distances from the electronic device 101 to the various regions constituting the external object may differ.
  • Accordingly, the optical paths may also differ, and the parallax shift can be identified in the acquired image due to this difference in optical paths.
  • the electronic device 101 may generate depth information of the external object based on the identified parallax movement.
  • FIG. 5 is a diagram illustrating a logical reference pattern of an electronic device according to various embodiments of the present disclosure.
  • the electronic device 101 may project a structured light pattern based on the logical reference pattern 500 to an external object.
  • the electronic device 101 may use an electromagnetic field filter or mask to project a structured light pattern based on the logical reference pattern 500.
  • the electronic device 101 may generate a logical reference pattern 500 whenever it performs 3D modeling, but may instead use a previously generated and stored logical reference pattern 500 in order to use the limited resources of the processor efficiently.
  • the logical reference pattern 500 of the electronic device 101 may include a logical reference pattern 500 generated by using a cellular automata (CA).
  • Cellular automata are a method of analyzing dynamic systems: by treating space and time discretely and automatically updating the state of each cell through local interactions, specific patterns can be derived. The detailed method of generating the logical reference pattern 500 disclosed in FIG. 5 is described in FIGS. 9A to 9C.
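In formula form (notation ours; the document states the rule only in words), a one-dimensional cellular automaton updates every cell from a local neighborhood:

$$s_i^{t+1} = f\left(s_{i-1}^{t},\, s_i^{t},\, s_{i+1}^{t}\right), \qquad s_i^{t} \in \{0, 1\}$$

where f is a fixed local rule applied to every cell at every step. In the variant used in FIGS. 9A to 9C, f depends on the current state s = s_i and the neighbor sum k = s_{i-1} + s_{i+1}, and the update runs across the columns of the pattern rather than in time.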
  • FIG. 6 is a diagram illustrating a first structured light pattern of an electronic device according to various embodiments of the present disclosure.
  • the electronic device 101 may generate the first structured light pattern 600 based on the logical reference pattern 500 disclosed in FIG. 5.
  • the first structured light pattern 600 of FIG. 6 may emit the logical reference pattern 500 as it is, and may correspond to an enlarged view of a part of the logical reference pattern 500 to help in understanding FIG. 7.
  • the first structured light pattern 600 may be generated by spacing the cells included in the logical reference pattern 500 apart by a predetermined interval. That is, the electronic device 101 may emit the pre-generated logical reference pattern 500 as it is, or may emit a first structured light pattern 600 with some modifications to the logical reference pattern 500.
  • FIG. 7 is a view illustrating a composite structured light pattern in which a first structured light pattern and a second structured light pattern of an electronic device are combined according to various embodiments of the present disclosure.
  • the electronic device 101 may project onto an external object the composite structured light pattern 700 in which the first structured light pattern 600 is overlaid with the second structured light pattern.
  • the second structured light pattern may include a grid pattern.
  • the lattice pattern may be regularly arranged and repeated in all regions of the composite structured light pattern 700.
  • the electronic device 101 may distribute the first polygons so as to correspond to the cells of the logical reference pattern 500 generated by the cellular automata, while using second polygons smaller than the spacing between the cells. A separation space therefore naturally occurs between the first polygons, and the second polygons may fill that space.
  • the first polygons may be distinguished from one another based on a difference in logical value (eg, 0 or 1), for example as black or white.
  • the second polygons can also be divided in the same way as the first polygons.
  • the electronic device 101 may use the composite structured light pattern 700 to set a grid that can serve as a reference point even in a non-repetitive logical reference pattern.
  • In this way, the electronic device 101 can find the rule even in an irregular state, and through this, depth information can be generated from the image reflected by the external object.
  • a specific embodiment for generating the composite structured light pattern 700 will be described in detail in FIGS. 8A to 8C.
  • FIGS. 8A to 8C are diagrams illustrating a method for generating a composite structured light pattern of an electronic device according to various embodiments of the present disclosure.
  • part of the logical reference pattern may include nine different cells 801 to 809.
  • the cells 801, 803, and 806 shown in black may contain a value of 0, and the cells 802, 804, 805, 807, 808 and 809 shown in white may contain a value of 1.
  • part of the logical reference pattern may be displayed with its cells spaced apart from each other by a predetermined distance. Although FIG. 8A shows no distance between cells, if the logical reference pattern is enlarged, a space may appear between each cell.
  • in FIG. 8B, the grids 811 to 819 may fill the empty spaces between cells using polygons smaller than the distance between the cells; in other words, the first structured light pattern is tessellated and framed by the second structured light pattern. Since it is difficult, using the logical reference pattern alone, to determine accurately in which region the image reflected by the external object falls and how much distortion has occurred, it may be necessary to mix in a second structured light pattern that can serve as a reference point or frame. Accordingly, the electronic device 101 may project a composite structured light pattern, in which the first structured light pattern and the second structured light pattern are mixed, onto an external object.
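For illustration only, the following runnable sketch renders the tessellation idea in text form: first polygons (one per logical cell, B for a 0/black cell and W for a 1/white cell, following FIG. 8A) are spaced apart, and a repeating grid of second polygons (+) fills the gaps to give the decoder a regular frame. The symbols and layout are our own rendering, not the patent's actual mask geometry.

```python
# Toy composite pattern: first polygons carry the logical values, second
# polygons form the repeating reference grid between them.
cells = [
    [0, 1, 0],
    [1, 1, 0],
    [1, 1, 1],
]

FIRST = {0: "B", 1: "W"}  # first polygons, distinguished by logical value
GRID = "+"                # second polygons forming the repeating grid

def render_composite(cells):
    width = len(cells[0])
    grid_line = GRID * (2 * width + 1)   # a full row of grid polygons
    lines = [grid_line]
    for row in cells:
        # interleave grid polygons between and around the spaced cells
        lines.append(GRID + GRID.join(FIRST[v] for v in row) + GRID)
        lines.append(grid_line)
    return "\n".join(lines)

print(render_composite(cells))
```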
  • FIGS. 9A to 9C are diagrams illustrating a method of using a cellular automata to generate a logical reference pattern of an electronic device according to various embodiments of the present disclosure.
  • a logical reference pattern of the electronic device 101 may be formed by a combination of a plurality of cells. Each cell can be filled with a value of 0 or 1.
  • regularity exists in that the state of a cell is determined by the states (eg, values) of neighboring cells. For example, the state of the cell 921 adjacent to the right side of the cell 911 may be determined based on the state of the cell 911 and the states of the cells 910 and 912 neighboring it above and below. See FIG. 9B for the cellular automata generation rules.
  • a logical reference pattern of the electronic device 101 may be generated using Rule #25 (920) of the cellular automata.
  • s, corresponding to the columns s0 and s1 of Rule #25 (920), may indicate the state of the current cell (eg, 911, Cell(i)).
  • k, corresponding to the rows k0, k1, and k2 of Rule #25, may mean the sum of the states of the cell neighboring the current cell 911 above (eg, 910, Cell(i-1)) and the cell neighboring it below (eg, 912, Cell(i+1)).
  • For each combination of s and k, Rule #25 assigns the state of the cell 921 adjacent to the right side of the current cell 911 as 0 or 1; for example, when s is 1 and k is 2, the state of the cell 921 may be determined as 0.
  • a logical reference pattern may be generated using Rule #25 (920).
  • Rule #25 (920) may be used to determine the state of the cell 941 adjacent to the right of the current cell 931. Referring to FIG. 9C, the state of the current cell 931 is 1, and the sum of the states of the cells 930 and 932 adjacent to it is 0. Accordingly, since s is 1 and k is 0, applying Rule #25, the state of the cell 941 neighboring the current cell 931 on the right may be 0.
  • Rule #25 (920) of the cellular automata disclosed in FIGS. 9A to 9C is only one embodiment, and it may be possible to generate a logical reference pattern using other rules.
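The column-wise update can be sketched as runnable code. The table below maps (s, k), the current cell state and the sum of its upper and lower neighbors, to the state of the cell to the right. Only the entries (s=1, k=0) -> 0 and (s=1, k=2) -> 0 are fixed by the text above, so the other four values are illustrative placeholders rather than the actual Rule #25 encoding; treating out-of-range neighbors as 0 is likewise an assumption.

```python
# Column-wise cellular automaton in the style of FIGS. 9A-9C: the cell to
# the right of Cell(i) is determined by s (Cell(i)'s state) and
# k (Cell(i-1) + Cell(i+1), the vertical neighbors).
RULE = {
    (0, 0): 1, (0, 1): 0, (0, 2): 1,  # placeholders (illustrative only)
    (1, 0): 0,                        # documented above: s=1, k=0 -> 0
    (1, 1): 1,                        # placeholder (illustrative only)
    (1, 2): 0,                        # documented above: s=1, k=2 -> 0
}

def next_column(col):
    out = []
    for i, s in enumerate(col):
        up = col[i - 1] if i > 0 else 0            # assume 0 off the edge
        down = col[i + 1] if i < len(col) - 1 else 0
        out.append(RULE[(s, up + down)])
    return out

def reference_pattern(seed_col, n_cols):
    """Iterate the rule column by column from a seed column."""
    cols = [seed_col]
    for _ in range(n_cols - 1):
        cols.append(next_column(cols[-1]))
    return cols

for col in reference_pattern([1, 0, 1, 1, 0, 0, 1, 0], 6):
    print(col)
```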
  • FIG. 10A is a diagram illustrating an image, obtained by projecting a composite structured light pattern of an electronic device, that is reflected by an external object, according to various embodiments of the present disclosure.
  • the electronic device 101 may project a composite structured light pattern in which the first structured light pattern and the second structured light pattern are mixed to an external object.
  • the electronic device 101 may obtain an image 1000 in which an external object reflects the composite structured light pattern.
  • the composite structured light pattern is a two dimensional (2D) image, and the image 1000 acquired by the electronic device 101 may also be a 2D image.
  • the electronic device 101 may perform three dimensional (3D) modeling of an external object based on the acquired image.
  • FIG. 10B is a diagram illustrating a physical shape corresponding to at least a portion of an external object, obtained from an image reflected by the external object of the electronic device, according to various embodiments of the present disclosure.
  • an image that the electronic device 101 obtains by reflection from an external object may differ from the uniform grid pattern (eg, the second structured light pattern) of the logical reference pattern.
  • the electronic device 101 may acquire an image having a distorted grid pattern, which may be defined as a physical shape 1010.
  • FIG. 10C is a diagram illustrating a logical shape generated based on an acquired physical shape of an electronic device according to various embodiments of the present disclosure.
  • the electronic device 101 may move the physical shape 1010 obtained from the captured image into a computation space.
  • the electronic device 101 may generate a logical shape 1020 by moving the lattice portion serving as a reference point from the physical shape 1010 into the computation region.
  • the electronic device 101 may recover at least a part of the previously generated logical reference pattern from the acquired image, based on the logical shape 1020.
  • the electronic device 101 may verify the cells included in the logical shape 1020 by performing the cellular automata again based on the state of each cell included in the logical shape 1020. Since the initially generated logical reference pattern was generated using a specific rule of the cellular automata (for example, Rule #25), each cell included in the logical shape 1020 generated from the reflected image should also satisfy that rule (eg, Rule #25). Cells that pass this verification can be defined as a connected component of the previously generated logical reference pattern.
  • a connected component may mean cells that are continuously connected in the previously generated logical reference pattern, and the corresponding part can be assumed to have a depth uniform enough that no distortion occurs. Accordingly, the depth of each cell included in a connected component may be defined as uniform.
  • the electronic device 101 may calculate parallax movement from the logical reference pattern based on the at least one connected component, and based on this, generate depth information corresponding to the connected component. For detailed methods of generating depth information corresponding to connected components, refer to FIGS. 12 to 14.
  • FIGS. 11A to 11E are diagrams illustrating a method of restoring at least a portion of a previously generated logical reference pattern from an image reflected by an external object, utilizing the cellular automata of an electronic device according to various embodiments of the present disclosure.
  • the operation column 1120, corresponding to the second column, may show the values obtained by performing the cellular automata on the first column.
  • the electronic device 101 may compare the values included in the second column 1110 of the logical shape with the values of the operation column 1120 obtained through the cellular automata. For example, the state of the cell 1111 corresponding to the first row of the second column 1110 of the logical shape is 0, and the state of the cell 1121 corresponding to the first row of the operation column 1120 obtained through the cellular automata may also be 0. In this case, the electronic device 101 may set the value of the cell 1131, corresponding to the first row of the first column 1130 serving as a reference column, to 0. If the comparison results do not match, the value of the cell 1131 corresponding to the first row of the first column 1130 may be set to 1.
  • the electronic device 101 may continue comparing the values included in the second column 1110 of the logical shape with the values of the operation column 1120 obtained through the cellular automata. For example, the state of the cell 1112 corresponding to the second row of the second column 1110 of the logical shape is 1, and the state of the cell 1122 corresponding to the second row of the operation column 1120 obtained through the cellular automata may also be 1. In this case, the electronic device 101 may set the value of the cell 1132, corresponding to the second row of the first column 1130 serving as the reference column, to 0. If the comparison results do not match, the value of the cell 1132 corresponding to the second row of the first column 1130 may be set to 1.
  • the values included in the generated logical shape and the values obtained through the cellular automata may be shown up to the eighth column 1140 of the image acquired by the electronic device 101.
  • the electronic device 101 may perform this comparison continuously over the entire area of the acquired image. Thereafter, the electronic device 101 may check the areas where the comparison value is 0 and connected, and determine that those areas are connected.
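A minimal sketch of this verification pass, reusing next_column() from the Rule #25 sketch above. The 0-for-match, 1-for-mismatch convention follows the description of the reference column 1130; the run-extraction helper is our own addition, not a function named by the patent.

```python
# Predict each column of the observed logical shape from the previous one
# with the cellular automaton and compare with what was actually observed.
# Agreement is recorded as 0, disagreement as 1.
def check_map(observed_cols):
    result = []
    for prev, cur in zip(observed_cols, observed_cols[1:]):
        predicted = next_column(prev)   # CA step from the sketch above
        result.append([0 if p == c else 1 for p, c in zip(predicted, cur)])
    return result

def zero_runs(column):
    """(start, end) ranges of consecutive 0 cells: connected candidates."""
    runs, start = [], None
    for i, v in enumerate(column + [1]):   # sentinel closes the last run
        if v == 0 and start is None:
            start = i
        elif v != 0 and start is not None:
            runs.append((start, i))
            start = None
    return runs
```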
  • FIG. 12 is a diagram illustrating a method of generating a lookup map from a previously generated logical reference pattern of an electronic device according to various embodiments of the present disclosure.
  • the electronic device 101 may record a coding word corresponding to each cell from the previously generated logical reference pattern 1200.
  • the electronic device 101 may store a coding word corresponding to each cell of the logical reference pattern, which may be defined as a look-up map 1210.
  • the electronic device 101 may identify a plurality of crossing patterns that intersect at right angles, each including a specific cell of the previously generated logical reference pattern.
  • each crossing pattern may have a length of 8 along the vertical axis; the unfilled portions of the crossing pattern may be stored as 0 in the coding word.
  • for the first crossing pattern, since it is 111 vertically, the vertical code may be 00000111, and the horizontal code may be 00100000. Accordingly, the first coding word 1210 corresponding to the first crossing pattern may be 0000011100100000.
  • for the second crossing pattern, it may be 01111111 vertically and 00110111 horizontally.
  • the second coding word 1220 corresponding to the second crossing pattern may be 0111111100110111.
  • for the third crossing pattern, it may be 01111110 vertically and 00010000 horizontally.
  • the third coding word 1230 corresponding to the third crossing pattern may be 0111111000010000.
  • the electronic device 101 may map and store, in each cell included in the lookup map 1210, the coding words corresponding to that cell. For example, 36 is recorded in the cell 1201 of the lookup map 1210, which may mean that 36 different coding words are mapped to it.
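A runnable sketch of the lookup-map idea: build a 16-bit coding word for every cell from the vertical and horizontal runs crossing at that cell, then index words to positions. The text does not fully specify how the 8-cell runs are anchored on the cell, so the centered windowing below, and padding out-of-range cells with 0, are assumptions.

```python
# Lookup-map sketch after FIG. 12: one 16-bit coding word per cell, built
# from an 8-cell vertical run and an 8-cell horizontal run crossing at
# that cell, with unfilled positions stored as 0.
def bit(pattern, r, c):
    rows, cols = len(pattern), len(pattern[0])
    return pattern[r][c] if 0 <= r < rows and 0 <= c < cols else 0

def coding_word(pattern, r, c):
    vertical = "".join(str(bit(pattern, r + d, c)) for d in range(-4, 4))
    horizontal = "".join(str(bit(pattern, r, c + d)) for d in range(-4, 4))
    return vertical + horizontal          # 16 bits, as in FIG. 12

def build_lookup_map(pattern):
    lookup = {}
    for r in range(len(pattern)):
        for c in range(len(pattern[0])):
            lookup.setdefault(coding_word(pattern, r, c), []).append((r, c))
    return lookup

# Tiny made-up reference pattern just to exercise the functions:
pattern = [[0, 1, 0, 1], [1, 1, 0, 0], [0, 1, 1, 1], [1, 0, 1, 0]]
lookup = build_lookup_map(pattern)
# Matching a connected component then reduces to dictionary lookups:
# lookup.get(word, []) lists candidate reference positions for each of the
# component's coding words, and a position consistent across the words
# fixes where in the reference pattern the component came from.
```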
  • FIG. 13 is a diagram illustrating a method of acquiring a connected component through at least some of the recovered logical reference patterns of an electronic device and acquiring at least one coding word included in the connected component according to various embodiments of the present disclosure.
  • the electronic device 101 may identify the connected component 1310 in the logical shape 1300 through the comparison method disclosed in FIG. 11.
  • the electronic device 101 may generate the crossing patterns included in the connected component 1310.
  • the electronic device 101 may generate the first coding word 1321, 0000111100011001, and the second coding word 1322, 0000000000000100, through the first crossing pattern.
  • the electronic device 101 may store the coding words 1320 that can be generated in the connected component.
  • the electronic device 101 may compare the coding words 1320 that can be generated in the connected component with the lookup map corresponding to the previously generated logical reference pattern, to determine which part of the previously generated logical reference pattern the connected component corresponds to.
  • FIG. 14 is a diagram illustrating a method of generating depth information of an external object through a look-up map of an electronic device according to various embodiments of the present disclosure.
  • the electronic device 101 may compare the part of the logical reference pattern recovered from the connected component with the previously generated logical reference pattern 1420, and determine by how much the part of the object corresponding to the connected component has been parallax-shifted.
  • the electronic device 101 may compare the coding words and confirm that the first cell 1401 of the first connected component corresponds to the cell 1431 of the lookup map 1430.
  • the electronic device 101 may check which part of the previously generated logical reference pattern 1420 the cell 1431 of the lookup map 1430 corresponds to, and may check how far that position deviates from the position of the connected component.
  • the electronic device 101 may determine the depth difference between connected components by using the respective parallax shifts of the different connected components.
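The shift-to-depth step itself is the standard structured-light triangulation relation, depth = focal length × baseline / disparity; the document does not spell this formula out, so the sketch below uses it as the conventional model:

```python
def depth_from_shift(shift_px, focal_px, baseline_m):
    """Triangulate depth from the parallax shift of a connected component.

    shift_px is the component's shift against the reference pattern in
    pixels, focal_px the focal length in pixels, baseline_m the
    projector-to-camera distance in meters.
    """
    if shift_px <= 0:
        raise ValueError("shift must be positive for a finite depth")
    return focal_px * baseline_m / shift_px
```

For example, with a 1400-pixel focal length and a 5 cm projector-camera baseline, a 12-pixel shift corresponds to a depth of roughly 5.8 m.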
  • the electronic device 101 may perform 3D modeling by utilizing the plurality of connected components and their respective depth differences. The 3D modeling of an external object by the electronic device 101 is illustrated in FIGS. 17 and 18.
  • FIG. 15 is a flowchart illustrating a method of obtaining a connected component from an image reflected by an external object of an electronic device according to various embodiments of the present disclosure.
  • the electronic device 101 may obtain a physical shape based on an image in which the structured light pattern is reflected by an external object.
  • the electronic device 101 may arrange or convert the obtained physical shape into a logical shape.
  • the electronic device 101 may recover, from the logical shape, at least a part of the previously generated logical reference pattern.
  • the electronic device 101 may acquire a connected component based on the recovered at least some logical reference patterns.
  • the electronic device 101 may continue, in FIG. 16, with the operations that follow those described in FIG. 15.
  • FIG. 16 is a flowchart illustrating a method of generating depth information corresponding to a connected component based on a coding word included in the connected component of an electronic device according to various embodiments of the present disclosure.
  • the electronic device 101 may obtain coding words included in the connected component.
  • the electronic device 101 may compare the obtained coding words with a stored lookup map based on a previously generated logical reference pattern.
  • the electronic device 101 may determine, based on the comparison, from which part of the logical reference pattern the connected component originated.
  • the electronic device 101 may generate depth information corresponding to the connected component based on the parallax movement of the connected component.
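Composing the sketches above gives the overall flow of FIGS. 15 and 16 (again illustrative only; component_words_by_label stands in for the coding-word extraction step shown earlier):

```python
def generate_depth_map(match_map, component_words_by_label, lookup_map,
                       focal_px, baseline_m):
    """Map each connected component to a depth, per the FIG. 15-16 flow."""
    labels, count = find_connected_components(match_map)
    depths = {}
    for label in range(1, count + 1):
        words = component_words_by_label.get(label, [])
        offset = locate_component(words, lookup_map)
        if offset is None:
            continue                          # component could not be located
        shift = abs(offset[1])                # horizontal parallax shift
        if shift:
            depths[label] = depth_from_shift(shift, focal_px, baseline_m)
    return depths
```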
  • FIG. 17 is a diagram illustrating 3D modeling using connected components of an electronic device according to various embodiments of the present disclosure.
  • the electronic device 101 may perform 3D modeling 1700 by combining a plurality of different connected components.
  • the connected components may be arranged at different depths. For example, referring to FIG. 17, each connected component is rendered in a different color, and 3D modeling may be performed using the relative depth information of each connected component.
  • FIGS. 18A and 18B are diagrams illustrating an electronic device performing 3D modeling and self-correction using a complex structured light pattern, according to various embodiments of the present disclosure.
  • the electronic device 101 projects a structured light pattern onto the external object 1810 through the projector 210 and acquires the image reflected by the external object 1810 through the camera 220.
  • referring to FIG. 18A, a drawing of a system for performing the 3D modeling 1800 is disclosed.
  • referring to FIG. 18B, a diagram is disclosed in which the electronic device 101 performs self-correction using a cellular automaton.
  • the electronic device 101 may perform 3D modeling through the connected component and perform self-correction on the missing areas in the 3D modeling.
  • the electronic device 101 may use a cellular automaton to infer the expected state even in areas that are not accurately recovered due to data loss, and may use this inference for self-correction of the 3D modeling.
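A toy version of this self-correction pass is sketched below; the real inference would reuse the designated cellular automaton that generated the pattern, but here a simple neighbour-majority rule stands in for it to show the mechanism only:

```python
import numpy as np

def self_correct(cells, known, steps=10):
    """Re-derive lost cells of a binary grid from their neighbours.

    cells is the recovered binary grid and known a boolean mask of the
    reliably recovered cells; unknown cells are iteratively replaced by
    the majority value of their 3x3 neighbourhood.
    """
    h, w = cells.shape
    out = cells.copy()
    for _ in range(steps):
        nxt = out.copy()
        for r in range(h):
            for c in range(w):
                if known[r, c]:
                    continue                  # trusted cells stay fixed
                neigh = out[max(r - 1, 0):r + 2, max(c - 1, 0):c + 2]
                nxt[r, c] = 1 if 2 * neigh.sum() > neigh.size else 0
        out = nxt
    return out
```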
  • An electronic device comprising: a camera; a light emitting module including one or more light emitters; a memory; and a processor, wherein the processor is configured to project, using the light emitting module, structured light corresponding to a designated structured light pattern onto an external object, the designated structured light pattern including a plurality of first polygons distributed according to a specified rule and a plurality of second polygons connecting at least a portion of the plurality of first polygons based on a specified repeating grid pattern; obtain, using the camera, an image in which the structured light is reflected by the external object; identify the specified repeating grid pattern and at least some of the first polygons based on the acquired image; restore another, unidentified portion of the first polygons based on the identified specified repeating grid pattern and the identified first polygons; and generate depth information for the external object based on the positions of the identified first polygons and the restored other portion of the first polygons.
  • the light emitting module may include at least one of a mask, a filter, or a metasurface formed to correspond to the designated structured light pattern.
  • the plurality of first polygons may be distributed based on a pattern generated through a designated cellular automaton.
  • the plurality of first polygons may be distributed so as to correspond to cells of the pattern generated through the designated cellular automaton and to be spaced apart from one another.
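The document does not disclose which cellular automaton is designated, so the sketch below uses elementary Rule 30 purely to illustrate how such a cell distribution could be produced:

```python
import numpy as np

def ca_pattern(width, height, rule=30, seed=1):
    """Generate a candidate cell distribution with an elementary CA.

    Cells equal to 1 mark candidate positions for the first polygons;
    rule and seed are illustrative parameters, not values taken from
    the patent.
    """
    rng = np.random.default_rng(seed)
    row = rng.integers(0, 2, width, dtype=np.uint8)  # random initial row
    grid = np.empty((height, width), dtype=np.uint8)
    for r in range(height):
        grid[r] = row
        left, right = np.roll(row, 1), np.roll(row, -1)
        idx = (left << 2) | (row << 1) | right       # 3-cell neighbourhood code
        row = ((rule >> idx) & 1).astype(np.uint8)   # apply the rule table
    return grid
```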
  • the designated repeating grid pattern may be formed by filling the plurality of second polygons in an empty space between adjacent first polygons among the plurality of first polygons.
  • the processor may be configured to identify a connected component that matches at least a portion of the specified structured light pattern based on the identified specified repeating grid pattern and at least a portion of the first polygons.
  • the processor may be configured to identify a coding word included in the connected component.
  • the processor may be set to identify a position of the connected component in the designated structured light pattern.
  • the processor may be set to compare a coding word included in the connected component with a lookup map stored corresponding to the designated structured light pattern to identify a location in the lookup map that includes the coding word, and, based on the identified location in the lookup map, to confirm the location of the connected component in the designated structured light pattern.
  • the processor may be set to check a position shift between the position of the connected component in the acquired image and the position of a region corresponding to the connected component in the designated structured light pattern, and to generate the depth information based on the position shift.
  • A method of an electronic device comprising: projecting, using a light emitting module, structured light corresponding to a designated structured light pattern onto an external object; obtaining, using a camera, an image in which the structured light is reflected by the external object; identifying a specified repeating grid pattern and at least some first polygons based on the acquired image; restoring other, unidentified portions of the first polygons based on the identified specified repeating grid pattern and the at least some first polygons; and generating depth information for the external object based on the positions of the at least some first polygons and the restored other portions of the first polygons, wherein the designated structured light pattern may include a plurality of first polygons distributed according to a specified rule and a plurality of second polygons connecting at least a portion of the plurality of first polygons based on the specified repeating grid pattern.
  • the light emitting module may include at least one of a mask, a filter, or a metasurface formed to correspond to the designated structured light pattern.
  • the plurality of first polygons may be distributed based on a pattern generated through a designated cellular automaton.
  • the plurality of first polygons may be distributed so as to correspond to cells of the pattern generated through the designated cellular automaton and to be spaced apart from one another.
  • the designated repeating grid pattern may be formed by filling the plurality of second polygons in an empty space between adjacent first polygons among the plurality of first polygons.
  • The method may further include an operation of checking a coding word included in the connected component.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Graphics (AREA)
  • Geometry (AREA)
  • Software Systems (AREA)
  • Electromagnetism (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

Disclosed is an electronic apparatus comprising: a camera; a light emitting module including at least one light emitter; a memory; and a processor. The processor may be configured to: project structured light, corresponding to a predetermined structured light pattern, onto an external object by means of the light emitting module, the predetermined structured light pattern comprising a plurality of first polygons distributed according to a predetermined rule and a plurality of second polygons connecting at least a portion of the plurality of first polygons on the basis of a predetermined repeating partition grid; use the camera to obtain an image formed by the structured light reflected by the external object; identify the predetermined repeating partition grid and the portion of the first polygons on the basis of the obtained image; restore another, unidentified portion of the first polygons on the basis of the identified predetermined repeating partition grid and the identified portion of the first polygons; and generate depth information about the external object on the basis of the positions of the portion of the first polygons and the restored other portion of the first polygons. Other embodiments are also possible.
PCT/KR2019/008220 2018-10-08 2019-07-04 Method for generating depth information by means of structured light projected onto an external object, and electronic apparatus using the same WO2020075953A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR1020180119883A KR102551261B1 (ko) 2018-10-08 2018-10-08 Method for generating depth information using structured light projected onto an external object, and electronic device using the same
KR10-2018-0119883 2018-10-08

Publications (1)

Publication Number Publication Date
WO2020075953A1 (fr)

Family

ID=70163951

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2019/008220 WO2020075953A1 (fr) 2018-10-08 2019-07-04 Method for generating depth information by means of structured light projected onto an external object, and electronic apparatus using the same

Country Status (2)

Country Link
KR (1) KR102551261B1 (fr)
WO (1) WO2020075953A1 (fr)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112562059B (zh) * 2020-11-24 2023-12-08 革点科技(深圳)有限公司 An automated structured light pattern design method

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100118123A1 (en) * 2007-04-02 2010-05-13 Prime Sense Ltd Depth mapping using projected patterns
KR20130037152A (ko) * 2011-10-05 2013-04-15 한국전자통신연구원 패턴 광을 이용한 깊이 정보 획득 장치 및 방법
KR20140125145A (ko) * 2013-04-18 2014-10-28 한국전자통신연구원 3차원 깊이 정보 획득 장치 및 이를 이용한 3차원 깊이 정보 획득 방법
KR20150111197A (ko) * 2014-03-25 2015-10-05 삼성전자주식회사 깊이 카메라 장치, 그것을 구비한 3d 영상 디스플레이 시스템 및 그 제어방법
EP3176601A1 (fr) * 2012-08-14 2017-06-07 Microsoft Technology Licensing, LLC Projection de lumiere d'eclairage pour une camera de profondeur

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9773155B2 (en) 2014-10-14 2017-09-26 Microsoft Technology Licensing, Llc Depth from time of flight camera
KR101733228B1 (ko) 2016-04-28 2017-05-08 주식회사 메디트 구조광을 이용한 3차원 스캐닝 장치


Also Published As

Publication number Publication date
KR20200040342A (ko) 2020-04-20
KR102551261B1 (ko) 2023-07-05

Similar Documents

Publication Publication Date Title
WO2020171583A1 Electronic device for stabilizing an image and operating method thereof
WO2019156308A1 Apparatus and method for estimating optical image stabilization motion
WO2019164185A1 Electronic device and method for correcting an image corrected in accordance with a first image processing scheme, in accordance with a second image processing scheme in an external electronic device
WO2019203579A1 Method for generating depth information and electronic device supporting the same
WO2020091262A1 Method for processing an image using an artificial neural network, and electronic device supporting the same
WO2020159149A1 Electronic device and method for processing an image
WO2020032497A1 Method and apparatus for embedding a noise pattern into an image on which blur processing has been performed
WO2019142997A1 Apparatus and method for compensating for an image change caused by optical image stabilization (OIS) motion
WO2020032383A1 Electronic device for providing a recognition result for an external object by using recognition information about an image, similar recognition information related to the recognition information, and hierarchy information, and method for using the same
WO2021133025A1 Electronic device comprising an image sensor and operating method thereof
WO2020116844A1 Electronic device and method for acquiring depth information by using at least one of cameras or a depth sensor
WO2022080869A1 Method for updating a three-dimensional map by means of an image, and electronic device supporting the same
WO2019172723A1 Interface connected to an image sensor, and electronic device comprising interfaces connected among a plurality of processors
WO2020171450A1 Electronic device and method for generating a depth map
WO2019066370A1 Electronic device for controlling a camera on the basis of external light, and control method therefor
WO2019160237A1 Electronic device and method for controlling the display of images
WO2021112500A1 Electronic device and method for correcting an image during camera switching
WO2021162353A1 Electronic device including a camera and operating method thereof
WO2019103420A1 Electronic device and method for sharing an image with an external device by using image link information
WO2021080307A1 Method for controlling a camera, and electronic device therefor
WO2020190030A1 Electronic device for generating a composite image, and method therefor
WO2020075953A1 Method for generating depth information by means of structured light projected onto an external object, and electronic apparatus using the same
WO2020171558A1 Method for providing augmented reality content, and electronic device therefor
WO2019054610A1 Electronic device and method for controlling a plurality of image sensors
WO2020190008A1 Electronic device for an auto-focus function, and control method therefor

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19871777

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19871777

Country of ref document: EP

Kind code of ref document: A1