WO2024071653A1 - Distance map calculation method based on parallax analysis, and system therefor - Google Patents

Distance map calculation method based on parallax analysis, and system therefor

Info

Publication number
WO2024071653A1
Authority
WO
WIPO (PCT)
Prior art keywords
pixel
color image
point
movement
distance
Prior art date
Application number
PCT/KR2023/011660
Other languages
English (en)
Korean (ko)
Inventor
문재영
Original Assignee
문재영
Priority date
Filing date
Publication date
Application filed by 문재영
Publication of WO2024071653A1

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 - Image analysis
    • G06T7/50 - Depth or shape recovery
    • G06T7/55 - Depth or shape recovery from multiple images
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 - Image analysis
    • G06T7/20 - Analysis of motion
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 - Image analysis
    • G06T7/70 - Determining position or orientation of objects or cameras
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 - Image analysis
    • G06T7/90 - Determination of colour characteristics
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T2207/10 - Image acquisition modality
    • G06T2207/10028 - Range image; Depth image; 3D point clouds

Definitions

  • The present invention relates to a method for calculating a distance map based on parallax analysis and a system therefor, and more specifically, to a method for calculating a real-time camera-referenced spatial coordinate map based on convolutional parallax analysis, and a system therefor.
  • Distance information about the area in front of the image capture point, that is, a distance map, is very important.
  • A distance map is an image in which each pixel carries xyz coordinate information in place of color.
  • A distance map can be used in technologies that virtualize real space, such as the Metaverse, and is applied in various fields such as distance measurement for autonomous driving.
  • In real-time image compositing, which is essential in AR and VR, it can be used to apply a composite image onto a surface by locating the relevant pixels and calculating the normal vector of the subject's surface.
  • The chroma-key technique currently used in film can be replaced at low cost by a masking technique based on a distance map, and composited video can be produced directly on site during shooting.
  • High-resolution distance maps have high utility value in various industrial fields through highly efficient real-time 3D scanning. Since the technique is fundamentally a way of measuring distance with optical cameras, it is also used as a core technology in sensors for autonomous driving, robotics, and drones, and has high utility in heavy industry.
  • One technical aspect of the present invention is to solve the problems of the prior art described above by calculating movement information between corresponding pixels based on color images taken at different locations and extracting distance information at each pixel based on this.
  • The aim is to provide a distance map calculation method based on parallax analysis, and a system therefor, that can generate a camera-referenced spatial coordinate map from a color image without a separate sensor.
  • Another technical aspect of the present invention provides a distance map calculation method based on parallax analysis, and a system therefor, in which movement information between pixels can be calculated quickly and with efficient use of resources by applying a pixel similarity evaluation function and a parallax analysis function.
  • The distance map calculation method is performed by a service server that receives a color image and generates a distance map image based on the color image. It may include obtaining a first color image taken at a first point and a second color image taken at a second point adjacent to the first point; calculating pixel similarity between the first color image and the second color image; calculating, based on the calculated pixel similarity, a pixel movement amount on the image of each pixel from the first color image to the second color image; and calculating a distance at each pixel based on point movement information between the first point and the second point and the pixel movement amount.
  • the distance map calculation method may include generating a distance map image for the first color image based on the calculated distance from each pixel.
  • movement from the first point to the second point may correspond to parallel movement.
  • Calculating the pixel movement amount on the image of each pixel may include generating pixel similarity information by applying a pixel similarity evaluation function to the first color image and the second color image, calculating a reference direction vector and a movement range based on the point movement information, and calculating the pixel movement amount by applying the reference direction vector within the movement range to the pixel similarity information.
  • The pixel similarity information may be obtained by treating the first color image and the second color image as scalar functions in UV coordinates and calculating a similarity value between two pixels, defined by the values of the derivative functions over UV, and may be stored as a 3-dimensional float map array.
  • The distance map calculation system based on parallax analysis may include a photographing terminal that acquires a first color image taken at a first point and a second color image taken at a second point adjacent to the first point, and a service server that calculates pixel similarity between the first color image and the second color image and calculates the distance at each pixel based on the calculated pixel similarity.
  • The service server may calculate a pixel movement amount on the image of each pixel from the first color image to the second color image based on the calculated pixel similarity, and calculate the distance at each pixel based on the point movement information between the first point and the second point and the pixel movement amount.
  • the service server may generate a distance map image for the first color image based on the calculated distance from each pixel.
  • movement from the first point to the second point may correspond to parallel movement.
  • When calculating the amount of pixel movement on the image of each pixel, the service server may generate pixel similarity information by applying a pixel similarity evaluation function to the first color image and the second color image, calculate the reference direction vector and movement range based on the point movement information, and then calculate the pixel movement amount by applying the reference direction vector within the movement range to the pixel similarity information.
  • The pixel similarity information may be obtained by treating the first color image and the second color image as scalar functions in UV coordinates and calculating a similarity value between two pixels, defined by the values of the derivative functions over UV, and may be stored as a 3-dimensional float map array.
  • According to the present invention, movement information between corresponding pixels is calculated based on color images taken at different locations, and distance information at each pixel is extracted based on this, which has the effect of generating a distance map from a color image without a separate sensor.
  • FIG. 1 is a diagram illustrating an application example of a distance map calculation system based on parallax analysis according to an embodiment of the present invention.
  • FIG. 2 is a diagram illustrating an exemplary computing operating environment of the service server shown in FIG. 1.
  • FIG. 3 is a block diagram illustrating an embodiment of the service server shown in FIG. 1.
  • Figure 4 is a flowchart explaining a distance map calculation method based on parallax analysis according to an embodiment of the present invention.
  • Figures 5 and 6 are reference drawings for explaining a distance map calculation method based on parallax analysis.
  • FIG. 7 is a block diagram illustrating an embodiment of the pixel movement amount calculation unit shown in FIG. 3.
  • FIG. 8 is a flowchart explaining an operation of an embodiment of the pixel movement amount calculation unit shown in FIG. 7.
  • Figure 9 is a conceptual diagram for explaining a distance calculation algorithm according to an embodiment of the present invention.
  • Figure 10 is a conceptual diagram for explaining a disparity analysis algorithm according to an embodiment of the present invention.
  • each element may be implemented as an electronic configuration to perform the corresponding function, or may be implemented as software itself that can be run in an electronic system, or as a functional element of such software. Alternatively, it may be implemented with an electronic configuration and corresponding driving software.
  • each function executed in the system of the present invention may be configured in module units and may be recorded in one physical memory, or may be distributed and recorded between two or more memories and recording media.
  • FIG. 1 is a diagram illustrating an application example of a distance map calculation system based on parallax analysis according to an embodiment of the present invention.
  • a distance map calculation system based on parallax analysis may include a photographing terminal 100 and a service server 300.
  • the capturing terminal 100 may capture an image and provide it to the service server 300.
  • the image captured by the photographing terminal 100 is an image displaying color, and is hereinafter referred to as a 'color image'.
  • A color image refers to an image expressed in color, as opposed to a distance map image, and can therefore be expressed in various forms such as RGB images and CYAN images.
  • the photographing terminal 100 may photograph a first color image at a first point, photograph a second color image at a second point adjacent to the first point, and provide the image to the service server 300.
  • the first color image captured at the first point and the second color image captured at the second point have at least some areas in common.
  • The service server 300 may calculate the pixel movement between the first color image and the second color image, and calculate the distance at each pixel, that is, the vertical distance (the distance shown in FIG. 1), based on this pixel movement.
  • the service server 300 may calculate the pixel similarity between the first color image and the second color image, and calculate the distance at each pixel based on the calculated pixel similarity.
  • the service server 300 may calculate the amount of pixel movement on the image of each pixel from the first color image to the second color image based on the calculated pixel similarity. Thereafter, the service server 300 may calculate the distance at each pixel based on the point movement information and pixel movement amount between the first point and the second point. Additionally, the service server 300 may generate a distance map image for the first color image based on the calculated distance from each pixel.
  • FIG. 2 is a diagram illustrating an exemplary computing operating environment of the service server according to an embodiment of the present invention.
  • FIG. 2 is intended to provide a general, simplified description of a suitable computing environment in which embodiments of the service server 300 may be implemented; referring to FIG. 2, a computing device is shown as an example of the service server 300.
  • the computing device may include at least a processing unit 303 and a system memory 301.
  • a computing device may include multiple processing units that cooperate in executing programs.
  • system memory 301 may be volatile (e.g., RAM), non-volatile (e.g., ROM, flash memory, etc.), or a combination thereof.
  • the system memory 301 includes a suitable operating system 302 for controlling the operation of the platform, which may be, for example, the WINDOWS operating system from Microsoft or Linux.
  • System memory 301 may include one or more software applications, such as program modules and applications.
  • the computing device may include additional data storage devices 304, such as magnetic disks, optical disks, or tape.
  • This additional storage device may be a removable storage device and/or a stationary storage device.
  • Computer-readable storage media may include volatile and non-volatile, removable and non-removable media implemented in any method or technique for storage of information such as computer-readable instructions, data structures, program modules, or other data.
  • the system memory 301 and the storage device 304 are all examples of computer-readable storage media.
  • Computer-readable storage media may include, but are not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, DVD or other optical storage, magnetic tape, magnetic disk storage or other magnetic storage, or any other medium that stores the desired information and can be accessed by the computing device 300.
  • Input devices 305 of the computing device may include a keyboard, mouse, pen, voice input device, touch input device, and comparable input devices.
  • Output devices 306 may include, for example, displays, speakers, printers, and other types of output devices. Since these devices are widely known in the art, detailed descriptions are omitted.
  • The computing device may include a communication device 307 that allows it to communicate with other devices, for example in a distributed computing environment, over wired and wireless networks, satellite links, cellular links, local area networks, and comparable mechanisms.
  • Communication device 307 is one example of communication media, which may contain computer-readable instructions, data structures, program modules, or other data therein.
  • communication media includes wired media such as a wired network or direct-wired connection, and wireless media such as acoustic, RF, infrared and other wireless media.
  • The service server 300 can be described as a functional configuration implemented in this computing environment. This will be explained with reference to FIG. 3.
  • FIG. 3 is a block diagram illustrating an embodiment of the service server shown in FIG. 1.
  • FIG. 4 is a flowchart illustrating a distance map calculation method based on parallax analysis according to an embodiment of the present invention.
  • The service server 300 may include a pixel movement amount calculation unit 310, a distance calculation unit 320, and a distance map generator 330.
  • the pixel movement amount calculation unit 310 may acquire a first color image taken at a first point and a second color image taken at a second point adjacent to the first point (S410).
  • Figure 5 shows an example in which a photographing terminal captures a first color image and a second color image at a first point (P1) and a second point (P2), respectively; figure (a) in Figure 6 shows an example of the first color image, and figure (b) shows an example of the second color image.
  • Because the first color image and the second color image are captured from two adjacent points, the majority of their pixels cover the same area.
  • the point movement amount (Dx) from the first point (P1) to the second point (P2) causes movement within the image taken with respect to the same subject point, that is, the pixel movement amount (D_px).
  • the pixel movement amount calculation unit 310 compares the first color image and the second color image to calculate the position transformation of the pixel representing the same photographic point in the color image, that is, the pixel movement amount (D_px) (S420).
  • the pixel movement amount calculator 310 may calculate pixel similarity between the first color image and the second color image. Thereafter, the pixel movement amount calculation unit 310 may calculate the pixel movement amount on the image of each pixel from the first color image to the second color image based on the calculated pixel similarity. This embodiment will be described in more detail below with reference to FIGS. 7 and 8.
  • the distance calculation unit 320 may calculate the distance value at each pixel based on the movement amount of the photographing device - that is, point movement information from the first point to the second point - and the pixel movement amount (S430).
  • the distance calculation unit 320 receives the amount of movement of the pixel compared to the amount of movement of the imaging device, and applies a calculation formula based on the perspective effect to calculate the distance from the camera to the corresponding pixel, that is, the camera reference spatial coordinates.
  • Figure 9 is a conceptual diagram for explaining a distance calculation algorithm according to an embodiment of the present invention.
  • xDepth = 1 + log(Abs(deltaPixel) / d * S) / log(h).
  • The y and z coordinates are given by CenterAlignedScreenSpaceUV * resolution * deltaPixel / (S * d).
  • a 3D point cloud map can be created using the camera reference spatial coordinates xyz calculated in this way.
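  • As an illustration only, the formulas above can be written out in code. The following Python sketch assumes interpretations for symbols the text does not define: d is read as the camera (point) movement distance, S as a scale factor, h as the logarithm base, and CenterAlignedScreenSpaceUV as the pixel's UV coordinate measured from the image center; these readings, and the function name, are assumptions rather than definitions from this document.

```python
import numpy as np

def camera_space_coords(delta_pixel, uv_centered, resolution, d, S, h):
    """Sketch of the depth/coordinate formulas described above.

    delta_pixel : per-pixel movement amount (deltaPixel), array of shape (H, W)
    uv_centered : center-aligned screen-space UV, array of shape (H, W, 2)
    resolution  : image resolution scale (assumed scalar here)
    d, S, h     : camera movement distance, scale factor and log base (assumed meanings)
    """
    # xDepth = 1 + log(Abs(deltaPixel) / d * S) / log(h)
    x_depth = 1.0 + np.log(np.abs(delta_pixel) / d * S) / np.log(h)

    # (y, z) = CenterAlignedScreenSpaceUV * resolution * deltaPixel / (S * d)
    yz = uv_centered * resolution * delta_pixel[..., None] / (S * d)

    # Stack into a per-pixel xyz coordinate map in the camera reference frame
    return np.concatenate([x_depth[..., None], yz], axis=-1)

# Toy usage: a 4x4 pixel-movement map and its center-aligned UV grid
H, W = 4, 4
delta_pixel = np.full((H, W), 2.0)
v, u = np.meshgrid(np.linspace(-0.5, 0.5, H), np.linspace(-0.5, 0.5, W), indexing="ij")
uv_centered = np.stack([u, v], axis=-1)
xyz = camera_space_coords(delta_pixel, uv_centered, resolution=W, d=0.1, S=1.0, h=10.0)
print(xyz.shape)  # (4, 4, 3)
```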
  • the distance map generator 330 may generate a distance map image based on the distance of each pixel calculated by the distance calculator 320 (S440).
  • The distance map image generated by the distance map generator 330 may be a distance map image corresponding to the first color image. That is, since the movement of each pixel in the area of the first color image that overlaps the second color image (the overlapping pixel area) can be calculated with the first color image as the reference, a distance map image containing the distance value of each pixel in the overlapping pixel area can be generated.
  • The distance map generator 330 may also calculate distance information for the pixel area of the first color image that does not overlap the second color image (the non-overlapping pixel area). For example, the distance map generator 330 may predict distance information for each pixel in the non-overlapping pixel area using the edges of image objects and the distance values of pixels in the overlapping pixel area. As an example, the distance map generator 330 may compute the edges of the color image and identify objects in the color image based on them. Afterwards, for pixels belonging to the same object, the distance value of a pixel in the overlapping pixel area can be assigned to pixels of that object in the non-overlapping pixel area.
  • When the distance values of a plurality of pixels in the overlapping pixel area that belong to the same column or row of the same object differ from one another, a gradient of those distance values is calculated; by extending the column or row into the non-overlapping pixel area and applying that gradient to the extended portion, the distance of pixels in the non-overlapping pixel area can be calculated.
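  • A minimal sketch of this row-wise idea follows, assuming the distances of the overlapping part of a row are known and the remaining pixels of the same object in that row are filled by continuing the average gradient; the function name and the simple linear model are illustrative assumptions, not the exact procedure described above.

```python
import numpy as np

def extend_row_by_gradient(known_distances, n_missing):
    """Extrapolate distance values along one row (or column) of an object.

    known_distances : 1-D array of distances for the overlapping pixels of the row
    n_missing       : number of non-overlapping pixels of the same object to fill
    """
    # Average change of the known distance values along the row
    gradient = np.mean(np.diff(known_distances))
    last = known_distances[-1]
    # Continue the row into the non-overlapping area with the same gradient
    extended = last + gradient * np.arange(1, n_missing + 1)
    return np.concatenate([known_distances, extended])

# Toy usage: a row whose known part increases by roughly 0.5 per pixel
row = np.array([2.0, 2.5, 3.1, 3.6])
print(extend_row_by_gradient(row, 3))  # approx. [2.0 2.5 3.1 3.6 4.13 4.67 5.2]
```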
  • Point cloud scanning data can be created using the generated distance map. This can be implemented by recording the pixel coordinates output along screen coordinates in the form of an array. In the case of video, if information about the camera's pose is expressed as four vectors about the origin (the camera position coordinates and the three camera-space basis vectors), the obtained coordinate map is linearly transformed by the 3x3 matrix composed of the basis vectors; after adding the camera position coordinates, the pixel coordinate map is easily converted to absolute coordinates. If this is updated at a specific cycle in an initially empty point cloud space, 3D scanning data can be obtained even from moving images.
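  • The pose conversion just described can be sketched as below, assuming the "four vectors" are the camera position plus three camera-space basis vectors arranged as the columns of a 3x3 matrix; this is an illustrative reading, not code from this document.

```python
import numpy as np

def to_absolute_coords(camera_coord_map, basis, camera_position):
    """Convert a camera-reference xyz coordinate map to absolute (world) coordinates.

    camera_coord_map : (H, W, 3) array of camera-space xyz per pixel
    basis            : (3, 3) matrix whose columns are the camera-space basis vectors
    camera_position  : (3,) camera position in absolute coordinates
    """
    # Linear transform by the basis matrix, then translate by the camera position
    return camera_coord_map @ basis.T + camera_position

# Toy usage: with an identity basis, a camera at (1, 0, 0) simply shifts every point
coord_map = np.zeros((2, 2, 3))
print(to_absolute_coords(coord_map, np.eye(3), np.array([1.0, 0.0, 0.0])))
```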
  • FIG. 7 is a block diagram explaining an embodiment of the pixel movement amount calculation unit shown in FIG. 3, and FIG. 8 is a flowchart explaining an operation of an embodiment of the pixel movement amount calculation unit shown in FIG. 7.
  • the pixel movement amount calculation unit 310 may include a pixel similarity evaluation module 311, a parallax analysis module 312, and a pixel movement amount calculation module 313.
  • the pixel similarity evaluation module 311 may apply a pixel similarity evaluation function to the first color image and the second color image to generate pixel similarity information for the two color images (S810).
  • the pixel similarity evaluation module 311 may use a Taylor function or a Laplacian function as a pixel similarity evaluation function.
  • The pixel similarity evaluation module 311 may perform similarity evaluation on the principle that if the derivative values at a point in the first color image and a point in the second color image, that is, at any two points, are similar, then the shape of the image function near those two points is also similar.
  • The pixel similarity information may be stored as a 3-dimensional float map array obtained by treating the first color image and the second color image as scalar functions in UV coordinates and calculating the similarity value between two pixels.
  • The pixel similarity evaluation module 311 may treat the first color image and the second color image as scalar functions in UV coordinates, compare their Laplacian derivative functions, and thereby estimate how similar the neighborhoods of a point in the first color image and a point in the second color image, that is, the vicinity of the two points, are.
  • The pixel similarity evaluation module 311 can sum these estimated values and store them as a 3-dimensional float map array. Since each pixel value in this 3-dimensional float map array takes a unique value with respect to the limited movement of the camera, it can be used to track the amount of movement of the pixels.
  • For this purpose, a convolution operation can be introduced. Because the Laplacian of an image is simply defined as a convolution, the programming logic remains simple. Additionally, the accuracy of the mapping can be improved by manipulating the convolution kernel matrix so that similarity is quantified along a specific direction.
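  • A small numpy sketch of the Laplacian-as-convolution idea follows; the 3x3 kernel shown is the standard discrete Laplacian, and the horizontal kernel is only one example of manipulating the kernel matrix to weight similarity along a specific direction.

```python
import numpy as np

def convolve3x3(image, kernel):
    """Naive 3x3 convolution with edge padding (sufficient for a sketch)."""
    padded = np.pad(image, 1, mode="edge")
    out = np.zeros_like(image, dtype=float)
    for dy in range(3):
        for dx in range(3):
            out += kernel[dy, dx] * padded[dy:dy + image.shape[0], dx:dx + image.shape[1]]
    return out

# Standard discrete Laplacian kernel
laplacian = np.array([[0.0,  1.0, 0.0],
                      [1.0, -4.0, 1.0],
                      [0.0,  1.0, 0.0]])

# Example of a kernel biased toward the horizontal direction (illustrative only),
# i.e. "manipulating the convolution kernel matrix" to emphasize one axis
horizontal = np.array([[0.0,  0.0, 0.0],
                       [1.0, -2.0, 1.0],
                       [0.0,  0.0, 0.0]])

image = np.random.rand(8, 8)
lap = convolve3x3(image, laplacian)   # second-derivative response of the image
hor = convolve3x3(image, horizontal)  # curvature along the row direction only
print(lap.shape, hor.shape)
```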
  • The row-shifted first color image (A) and second color image (B) are fed through a 3x3 convolution kernel matrix to form a convolution layer. A recursive loop operation is then applied with the convolution kernel to generate the image arrays A[LayerNumber] and B[LayerNumber].
  • A secondary layer array B[LayerNumber][deltaPixel] is created according to the delta shift value, which is the parallel shift distance.
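  • The layer-array construction can be sketched as below: the image is convolved recursively to build A[LayerNumber] (and likewise B[LayerNumber]), and each B layer is additionally shifted by every candidate deltaPixel to form B[LayerNumber][deltaPixel]. The use of the Laplacian as the recursive kernel, the wrap-around shifting, and the choice of the row direction are simplifying assumptions for illustration.

```python
import numpy as np

def laplacian(image):
    """Discrete Laplacian via shifted copies (a 3x3 convolution with wrap-around edges)."""
    return (np.roll(image, 1, 0) + np.roll(image, -1, 0) +
            np.roll(image, 1, 1) + np.roll(image, -1, 1) - 4 * image)

def build_layers(image, n_layers):
    """Recursively apply the kernel: layer k is the Laplacian of layer k-1."""
    layers = [image.astype(float)]
    for _ in range(n_layers - 1):
        layers.append(laplacian(layers[-1]))
    return np.stack(layers)                       # shape (LayerNumber, H, W)

def build_shifted_layers(layers, shifts, axis=0):
    """Secondary array B[LayerNumber][deltaPixel]: every layer shifted by each delta.

    axis is the in-image axis along which the parallel shift is applied (0 = rows).
    """
    return np.stack([np.roll(layers, s, axis=axis + 1) for s in shifts], axis=1)

A = np.random.rand(16, 16)
B = np.random.rand(16, 16)
A_layers = build_layers(A, n_layers=3)            # A[LayerNumber]
B_layers = build_layers(B, n_layers=3)            # B[LayerNumber]
deltas = range(-4, 5)                             # candidate parallel shifts
B_shifted = build_shifted_layers(B_layers, deltas)  # B[LayerNumber][deltaPixel]
print(A_layers.shape, B_shifted.shape)            # (3, 16, 16) (3, 9, 16, 16)
```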
  • The parallax analysis module 312 is provided with the pixel similarity information generated by the pixel similarity evaluation module 311 - for example, the 3-dimensional float map array obtained by calculating the similarity value between two pixels - and can track pixel movement using this pixel similarity information together with the point movement information describing the movement of the shooting point.
  • the parallax analysis module 312 can calculate a reference direction vector indicating the direction in which pixels move and a movement range a using the movement distance of the camera, that is, point movement information about the movement of the shooting point (S820).
  • the movement range is the range in which the pixel moves along the reference direction vector, and may satisfy, for example, the range from -a to a.
  • The parallax analysis module 312 may shift the unit image from -a to a along the reference direction vector, in steps corresponding to the pixel resolution, and subtract the first and second images processed into 3-dimensional float map arrays.
  • The unit image can be set by dividing the movement range by a predetermined resolution smaller than the movement range, and the subtraction can be performed on this unit image while moving in the direction of the reference direction vector. Through this operation, each pixel approaches its unique value, so pixels whose surroundings match at a given movement distance are expressed as values close to 0.
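  • A compact sketch of this subtraction sweep follows, assuming wrap-around shifting along the row direction: the second image's layer stack is shifted step by step over the range [-a, a] and subtracted from the first, and pixels whose neighborhoods match at a given shift come out close to zero.

```python
import numpy as np

def parallax_sweep(A_layers, B_layers, a, axis=0):
    """Shift B over [-a, a] along the reference direction and subtract it from A.

    A_layers, B_layers : (LayerNumber, H, W) float arrays (e.g. Laplacian layer stacks)
    a                  : movement range in pixels
    Returns the shifts and a (2a+1, LayerNumber, H, W) array of differences; small
    values mark pixels whose surroundings match for that shift.
    """
    shifts = np.arange(-a, a + 1)
    diffs = np.stack([A_layers - np.roll(B_layers, s, axis=axis + 1) for s in shifts])
    return shifts, diffs

A_layers = np.random.rand(3, 16, 16)
B_layers = np.roll(A_layers, 2, axis=1)  # B is A shifted by 2 pixels along the row axis
shifts, diffs = parallax_sweep(A_layers, B_layers, a=4)
# The overall difference magnitude is smallest at the shift that undoes the displacement
best = shifts[np.abs(diffs).sum(axis=(1, 2, 3)).argmin()]
print(best)  # expected: -2 (shifting B back by 2 pixels recovers A)
```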
  • the pixel movement amount calculation module 313 can map the amount of movement of each pixel on the screen according to the movement of the camera based on the result of this subtraction operation performed by the parallax analysis module 312.
  • the parallax analysis module 312 may perform mapping along with processing such as noise removal (S830).
  • Figure 10 is a conceptual diagram for explaining a disparity analysis algorithm according to an embodiment of the present invention.
  • The parallax analysis module 312 can calculate how far a pixel has actually moved based on the pixel similarity information. In other words, by shifting the images in batches according to the movement distance delta and comparing them, the movement distance deltaPixel of each pixel can be calculated collectively through image processing. To this end, the parallax analysis module 312 applies the convolution process performed by the pixel similarity evaluation module 311 to create the arrays A1[LayerNumber][deltaPixel] and A2[LayerNumber][deltaPixel].
  • For each LayerNumber, the deltaPixel at which A1^2 + A2^2 takes its minimum value is found at each pixel and recorded there, creating an image array C[LayerNumber]; the average of the elements of C is then calculated to create a distance map (deltaPixelMap).
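  • Putting the last two steps together, the sketch below finds, for each layer and each pixel, the deltaPixel index minimizing A1^2 + A2^2 and averages the result over the layers to obtain deltaPixelMap. Treating A1 and A2 as two response stacks indexed by [LayerNumber][deltaPixel] is an assumed reading of the description above.

```python
import numpy as np

def delta_pixel_map(A1, A2):
    """Per-pixel deltaPixel from two response stacks.

    A1, A2 : arrays of shape (LayerNumber, nDelta, H, W), assumed to hold the
             convolution responses/differences for every candidate deltaPixel.
    For each layer and pixel, pick the deltaPixel index minimizing A1^2 + A2^2
    (image array C[LayerNumber]), then average C over the layers.
    """
    energy = A1 ** 2 + A2 ** 2     # (LayerNumber, nDelta, H, W)
    C = energy.argmin(axis=1)      # per-layer, per-pixel best delta index
    # Indices are returned here; map them to actual shift values (e.g. index - a) as needed
    return C.mean(axis=0)          # deltaPixelMap, shape (H, W)

# Toy usage: the best index is 3 everywhere for layer 0 and 5 everywhere for layer 1
L, D, H, W = 2, 9, 4, 4
A1 = np.ones((L, D, H, W))
A2 = np.ones((L, D, H, W))
A1[0, 3], A1[1, 5] = 0.0, 0.0
A2[0, 3], A2[1, 5] = 0.0, 0.0
print(delta_pixel_map(A1, A2))  # 4.0 everywhere (average of indices 3 and 5)
```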
  • 303: processing unit  304: storage device
  • 311: pixel similarity evaluation module  312: parallax analysis module

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Image Analysis (AREA)
  • Image Processing (AREA)

Abstract

A distance map calculation method according to one technical aspect of the present invention, performed by a service server that is provided with a color image and generates a distance map image based on the color image, may comprise the steps of: obtaining a first color image captured at a first point and a second color image captured at a second point adjacent to the first point; calculating a pixel similarity between the first color image and the second color image, and calculating a pixel shift amount on the image of each pixel from the first color image to the second color image on the basis of the calculated pixel similarity; and calculating a distance of each pixel on the basis of the pixel shift amount and point movement information between the first point and the second point.
PCT/KR2023/011660 2022-09-28 2023-08-08 Distance map calculation method based on parallax analysis, and system therefor WO2024071653A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR10-2022-0123640 2022-09-28
KR1020220123640A KR20240044174A (ko) 2022-09-28 2022-09-28 Distance map calculation method based on parallax analysis, and system therefor

Publications (1)

Publication Number Publication Date
WO2024071653A1 true WO2024071653A1 (fr) 2024-04-04

Family

ID=90478355

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2023/011660 WO2024071653A1 (fr) 2022-09-28 2023-08-08 Distance map calculation method based on parallax analysis, and system therefor

Country Status (2)

Country Link
KR (1) KR20240044174A (fr)
WO (1) WO2024071653A1 (fr)

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102094506B1 (ko) 2013-10-14 2020-03-27 삼성전자주식회사 피사체 추적 기법을 이용한 카메라와 피사체 사이의 거리 변화 측정방법 상기 방법을 기록한 컴퓨터 판독 가능 저장매체 및 거리 변화 측정 장치.
KR102263064B1 (ko) 2014-08-25 2021-06-10 삼성전자주식회사 피사체의 움직임을 인식하는 장치 및 그 동작 방법

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20130085030A (ko) * 2010-07-08 2013-07-26 스피넬라 아이피 홀딩스, 인코포레이티드 비디오 시퀀스에서의 샷 변화 검출을 위한 시스템 및 방법
JP5751117B2 (ja) * 2011-09-29 2015-07-22 大日本印刷株式会社 画像生成装置、画像生成方法、画像生成装置用プログラム
KR101370785B1 (ko) * 2012-11-06 2014-03-06 한국과학기술원 입체 영상의 깊이 맵 생성 방법 및 장치
KR20200090391A (ko) * 2019-01-21 2020-07-29 엘지이노텍 주식회사 카메라 모듈 및 이의 이미지 생성 방법
KR102278536B1 (ko) * 2020-11-09 2021-07-19 대한민국 영상 이미지를 이용하여 거리를 측정하기 위한 장치 및 방법

Also Published As

Publication number Publication date
KR20240044174A (ko) 2024-04-04

Similar Documents

Publication Publication Date Title
WO2015005577A1 (fr) Appareil et procédé d'estimation de pose d'appareil photo
WO2016095192A1 (fr) Système et procédé d'imagerie de flux optique utilisant la détection à ultrasons de profondeur
CN110070598B (zh) 用于3d扫描重建的移动终端及其进行3d扫描重建方法
WO2017150878A1 (fr) Utilisation de repères multiples pour un classement d'objets à grains fins
WO2013015549A2 (fr) Système de réalité augmentée sans repère à caractéristique de plan et son procédé de fonctionnement
WO2016053067A1 (fr) Génération de modèle tridimensionnel à l'aide de bords
WO2014185710A1 (fr) Procédé de correction d'image 3d dans un dispositif d'affichage mosaïque, et appareil correspondant
WO2013151270A1 (fr) Appareil et procédé de reconstruction d'image tridimensionnelle à haute densité
US20110091131A1 (en) System and method for stabilization of fisheye video imagery
WO2021075772A1 (fr) Procédé et dispositif de détection d'objet au moyen d'une détection de plusieurs zones
WO2021025364A1 (fr) Procédé et système utilisant un lidar et une caméra pour améliorer des informations de profondeur concernant un point caractéristique d'image
Eichhardt et al. Affine correspondences between central cameras for rapid relative pose estimation
WO2017195984A1 (fr) Dispositif et procédé de numérisation 3d
WO2023169281A1 (fr) Procédé et appareil d'enregistrement d'image, support de stockage, et dispositif électronique
WO2015056826A1 (fr) Appareil et procédé de traitement des images d'un appareil de prise de vues
WO2023027268A1 (fr) Dispositif et procédé d'étalonnage de caméra-lidar
WO2024071653A1 (fr) Procédé de calcul de carte de distance basé sur une analyse de parallaxe, et système associé
WO2011078430A1 (fr) Procédé de recherche séquentielle pour reconnaître une pluralité de marqueurs à base de points de caractéristique et procédé de mise d'oeuvre de réalité augmentée utilisant ce procédé
KR100362171B1 (ko) 영상 특징점 매칭을 이용한 변환 행렬 추정 장치, 방법 및 그 기록매체와 그를 이용한 모자이크 영상 생성 장치, 방법 및 그 기록매체
JPH08242469A (ja) 撮像カメラ装置
Hui et al. Determination of line scan camera parameters via the direct linear transformation
WO2016159613A1 (fr) Procédé et système de suivi d'objet pour synthèse d'image
WO2022203464A2 (fr) Procédé de mise en correspondance stéréo omnidirectionnelle en temps réel à l'aide d'objectifs fisheye à vues multiples et système associé
WO2021182793A1 (fr) Procédé et appareil d'étalonnage de différents types de capteurs à l'aide d'un damier unique
WO2020256517A2 (fr) Procédé et système de traitement de mappage de phase automatique basés sur des informations d'image omnidirectionnelle

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23872776

Country of ref document: EP

Kind code of ref document: A1