WO2023074999A1 - Real-time 3D AR object image acquisition system, and operating method therefor - Google Patents

Real-time 3D AR object image acquisition system, and operating method therefor

Info

Publication number
WO2023074999A1
WO2023074999A1 PCT/KR2021/017527 KR2021017527W
Authority
WO
WIPO (PCT)
Prior art keywords
depth
rgb
image
partial data
cameras
Prior art date
Application number
PCT/KR2021/017527
Other languages
English (en)
Korean (ko)
Inventor
장준환
박우출
양진욱
윤상필
최민수
이준석
송수호
구본재
Original Assignee
한국전자기술연구원
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 한국전자기술연구원 filed Critical 한국전자기술연구원
Publication of WO2023074999A1

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/10 Processing, recording or transmission of stereoscopic or multi-view image signals
    • H04N13/106 Processing image signals
    • H04N13/139 Format conversion, e.g. of frame-rate or size
    • H04N13/167 Synchronising or controlling image signals
    • H04N13/189 Recording image signals; Reproducing recorded image signals
    • H04N13/20 Image signal generators
    • H04N13/204 Image signal generators using stereoscopic image cameras
    • H04N13/239 Image signal generators using stereoscopic image cameras using two 2D image sensors having a relative position equal to or related to the interocular distance
    • H04N13/243 Image signal generators using stereoscopic image cameras using three or more 2D image sensors
    • H04N13/257 Colour aspects
    • H04N13/271 Image signal generators wherein the generated image signals comprise depth maps or disparity maps

Definitions

  • the present invention relates to 3D AR object acquisition and, more particularly, to a system that supports acquiring 3D AR object images in real time and a method of operating the same.
  • in augmented reality, the augmented image is processed on the basis of an image captured by a camera, so the amount of data to be processed is very large. Implementing augmented-reality-based content therefore requires either high-specification hardware or a very limited environment, which has kept the use of augmented reality content low. In addition, because conventional augmented reality content involves heavy data processing, the real-time performance of the content is poor.
  • the present invention provides a 3D AR object image real-time acquisition system and its operating method capable of acquiring a specific AR object image in real time.
  • the object of the present invention is not limited to the above object, and other objects not mentioned will be clearly understood from the description below.
  • to achieve the above object, the present invention includes a subject; a plurality of cameras that each acquire an RGB image and a depth image of the subject; and a 3D AR object image collecting device that generates a 3D AR object image from the RGB image and the depth image acquired in real time by the plurality of cameras. The 3D AR object image collecting device filters, from the RGB image and the depth image, RGB partial data corresponding to at least some objects of the subject and depth partial data corresponding to those objects, and generates and stores 3D point cloud data based on the filtered RGB partial data and depth partial data.
  • the 3D AR object image collection device may include a camera interface that supports connection of the plurality of cameras, checks their connection state and on/off state, and controls their driving, and a data acquisition module that acquires the RGB image and the depth image from the plurality of cameras in real time.
  • the camera interface may output error information indicating an inactive or unconnected state of a camera providing the RGB image or the depth image among the plurality of cameras.
  • the 3D AR object image collection device may include a correction module that filters the RGB partial data corresponding to a specific object selected from the RGB image and the depth partial data corresponding to that object, or filters the depth partial data of an object located at a specific depth in the depth image and the RGB partial data corresponding to that object.
  • the 3D AR object image collection device may include a data sync module that converts the resolution of the RGB partial data based on the resolution of the depth partial data, so that the resolution of the RGB partial data matches that of the depth partial data.
  • a method for operating a 3D AR object image collection device supporting real-time acquisition of 3D AR object images includes: obtaining an RGB image and a depth image of a subject from a plurality of cameras; filtering, from the RGB image and the depth image, RGB partial data corresponding to at least some objects of the subject and depth partial data corresponding to those objects; and generating and storing 3D point cloud data based on the filtered partial data.
  • the method may further include checking for a deactivated or disconnected camera among the plurality of cameras providing the RGB image or the depth image, and outputting error information corresponding to that deactivated or disconnected state.
  • the filtering may include filtering the RGB partial data corresponding to a specific object selected from the RGB image and the depth partial data corresponding to that object, or filtering the depth partial data corresponding to an object located at a specific depth in the depth image and the RGB partial data corresponding to that object.
  • the method may further include converting the resolution of the RGB partial data based on the resolution of the depth partial data so that the resolution of the depth partial data matches the resolution of the RGB partial data.
  • a 3D object can be easily created by combining color data and depth data.
  • the present invention can provide a real-time 3D AR object that can be applied to various additional services, such as E-Sports relay, home fitness and patient management, and smart factory operation, by solving physical limitations.
  • FIG. 1 is a diagram showing an example of a 3D AR object image collection environment capable of acquiring 3D AR object images in real time according to an embodiment of the present invention.
  • FIG. 2 is a diagram showing an example of a camera interface of a 3D AR object image collection device according to an embodiment of the present invention.
  • FIG. 3 is a diagram showing an example of data change between a data acquisition module and a correction module in a 3D AR object image collection device according to an embodiment of the present invention.
  • FIG. 4 is a diagram showing an example of a configuration of a data sync module in a 3D AR object image collection device according to an embodiment of the present invention.
  • FIG. 5 is a diagram showing an example of a data generation module in a 3D AR object image collection device according to an embodiment of the present invention.
  • FIG. 6 is a diagram illustrating an example of a method for obtaining a 3D AR object image in real time according to an embodiment of the present invention.
  • the terms first and second are used to describe various components, but only for the purpose of distinguishing one component from another; they are not used to limit the components. For example, a second element may be termed a first element, and similarly, a first element may be termed a second element, without departing from the scope of the present invention.
  • embodiments within the scope of the present invention include computer-readable media having or conveying computer-executable instructions or data structures stored thereon.
  • Such computer readable media can be any available media that can be accessed by a general purpose or special purpose computer system.
  • such computer readable media may comprise physical storage media such as RAM, ROM, EPROM, CD-ROM or other optical disk storage, magnetic disk storage or other magnetic storage, or any other medium that can be used to store or convey program code means in the form of computer-executable instructions, computer-readable instructions or data structures and that can be accessed by a general purpose or special purpose computer system.
  • FIG. 1 is a diagram showing an example of a 3D AR object image collection environment capable of acquiring 3D AR object images in real time according to an embodiment of the present invention.
  • a 3D AR object image collection environment 10 may include a subject 11, a plurality of cameras 100, and a 3D AR object image collection device 200.
  • the plurality of cameras 100 may be included as one component of the 3D AR object image collecting device 200 .
  • the subject 11 may be a target to be made into a 3D AR object.
  • the subject 11 may include various targets such as people, animals, objects, tools, and backgrounds.
  • the subject 11 may include at least a part of a photographing environment captured by the plurality of cameras 100 .
  • the plurality of cameras 100 may be disposed at positions capable of photographing the subject 11 .
  • the plurality of cameras 100 may include at least one RGB camera capable of obtaining an RGB image of the subject 11 and at least one depth camera capable of obtaining depth data of the subject 11.
  • the at least one RGB camera and the at least one depth camera are disposed to capture the same subject 11 and may have the same or similar shooting distance and shooting angle within a specified range.
  • the 3D AR object image collection device 200 acquires the RGB images and depth images obtained by the plurality of cameras 100 in real time, performs filtering for a target object on the obtained real-time images, and then generates and provides point cloud data through data synchronization.
  • the 3D AR object image collection device 200 may include a camera interface 210, a data acquisition module 220, a correction module 230, a data sync module 240, and a data generation module 250.
  • the plurality of cameras 100 are activated in response to control by the 3D AR object image collection device 200 or to manipulation by the user operating the 3D AR object image collection device 200; the activated cameras 100 may collect images of the subject 11 and provide the collected images to the 3D AR object image collection device 200.
  • FIG. 2 is a diagram showing an example of a camera interface of a 3D AR object image collection device according to an embodiment of the present invention.
  • the camera interface 210 may include a camera connection unit 211 , a driving unit 212 and an error handling unit 213 .
  • the camera connection unit 211 may include a configuration for connecting the plurality of cameras 100 and the 3D AR object image collection device 200.
  • the camera connection part 211 may include a wired cable inserted into a connector formed in the plurality of cameras 100 and a connection pin or device connector connected to the wired cable.
  • the camera connection unit 211 may include a wireless communication interface capable of forming a wireless communication channel with the plurality of cameras 100.
  • the camera connection unit 211 may include a wired communication interface (eg, connection through a cable) that can be connected to at least one camera or a wireless communication interface that can form a wireless communication channel with at least one camera.
  • the camera connection part 211 may further include a wire capable of supplying power to the plurality of cameras 100 .
  • the driving unit 212 may be responsible for setting the number of cameras used, configuring setting values, and controlling on/off. In this regard, when the plurality of cameras 100 are connected through the camera connection unit 211, the driving unit 212 exchanges initial data with the connected cameras and can collect the identification information, type, and specification information of each camera. The driving unit 212 may also run an application for driving the plurality of cameras 100, perform their initial setup according to predefined settings, and adjust the setting values of each camera according to user manipulation.
  • the driving unit 212 controls turn-on of the plurality of cameras 100 according to user manipulation or execution of a specific application (e.g., an application supporting a 3D AR object image providing function), and controls turn-off of the plurality of cameras 100 in response to termination of that application.
  • the error handling unit 213 may process an error occurring in physical connection with the plurality of cameras 100 or in setting a setting value.
  • the camera interface 210 may include a display or audio device capable of displaying information related to connection of the plurality of cameras 100 .
  • the error handling unit 213 outputs a list of the cameras connected among the plurality of cameras 100, and can output the status of each camera (e.g., initialization completed).
  • the error handling unit 213 may also output identification information and error information about a camera that does not operate normally among the plurality of cameras 100 .
  • the error information may include a problem of a camera in which an error has occurred.
  • FIG. 3 is a diagram showing an example of data change between a data acquisition module and a correction module in a 3D AR object image collection device according to an embodiment of the present invention.
  • the data acquisition module 220 may acquire at least an RGB image and a depth image from the plurality of cameras 100 according to user manipulation or execution of a specific application. In this process, the data acquisition module 220 checks the information on the cameras that are connected and activated through the camera interface 210, and can verify whether the plurality of cameras 100 include a camera capable of acquiring RGB images and a camera capable of acquiring depth images. When the plurality of cameras 100 do not include both, the data acquisition module 220 transmits an error message to the error handling unit 213 of the camera interface 210.
  • the error handling unit 213, on receiving the error message, may output error information requesting connection or turn-on of the camera type that is not currently connected (e.g., an RGB camera or a depth camera).
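The camera availability check described above can be sketched in a few lines. This is only an illustration of the error-handling logic, not the patented implementation; the `CameraInfo` structure, field names, and message strings are assumptions introduced here:

```python
from dataclasses import dataclass

@dataclass
class CameraInfo:
    cam_id: str       # identifier reported by the camera
    kind: str         # "rgb" or "depth" (assumed two-type setup)
    connected: bool
    powered_on: bool

def check_camera_setup(cameras):
    """Return a list of error strings; an empty list means at least one
    RGB camera and one depth camera are connected and powered on."""
    errors = []
    # the setup needs both an RGB source and a depth source
    for kind in ("rgb", "depth"):
        ok = any(c.kind == kind and c.connected and c.powered_on for c in cameras)
        if not ok:
            errors.append(f"no active {kind} camera: connect or turn on a {kind} camera")
    # also report individual cameras that are present but unusable
    for c in cameras:
        if not c.connected:
            errors.append(f"camera {c.cam_id}: not connected")
        elif not c.powered_on:
            errors.append(f"camera {c.cam_id}: powered off")
    return errors
```

An error handler like unit 213 would then display these strings, while an empty list lets acquisition proceed.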
  • the data acquisition module 220 may transmit the acquired RGB image and depth image to the correction module 230.
  • the RGB image and the depth image delivered by the data acquisition module 220 may include images of the entire shooting environment including the subject 11 .
  • the correction module 230 may perform filtering to acquire only the desired data from the RGB image and the depth image delivered by the data acquisition module 220. In this process, the correction module 230 may filter subjects at a certain depth using the depth values, and may separately extract the RGB image of a subject at that depth. In this regard, the correction module 230 may perform object classification based on boundary lines in the RGB image and selectively filter objects at a certain depth based on the depth information. Alternatively, the correction module 230 may filter an object containing a specific pattern (e.g., a human face pattern or a specific animal pattern) in the RGB image based on its depth information. That is, the correction module 230 may filter and obtain only the RGB data and depth data of at least some objects of the subject 11.
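The depth-based filtering step can be illustrated with a minimal sketch. This is an assumption-laden illustration, not the patented method: the function name, the near/far threshold interface, and the convention of zeroing rejected pixels are all choices made here for clarity, and the RGB and depth frames are assumed to be already aligned at the same resolution:

```python
import numpy as np

def filter_by_depth(rgb, depth, near, far):
    """Keep only pixels whose depth lies in [near, far]; all other pixels
    are zeroed out. rgb: (H, W, 3), depth: (H, W), both already aligned."""
    mask = (depth >= near) & (depth <= far)
    rgb_part = np.where(mask[..., None], rgb, 0)   # RGB partial data
    depth_part = np.where(mask, depth, 0)          # depth partial data
    return rgb_part, depth_part, mask
```

The returned mask could also be intersected with an object-detection mask (e.g., a face region found in the RGB image) to implement the pattern-based variant described above.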
  • FIG. 4 is a diagram showing an example of a configuration of a data sync module in a 3D AR object image collection device according to an embodiment of the present invention.
  • the data sync module 240 may include an info handler 241 , a pixel matching unit 242 and a first data buffer 243 .
  • the info handler 241 may obtain depth data and RGB data from the data transmitted by the correction module 230. In this process, the info handler 241 may also obtain the resolution information of the depth data and of the RGB data from the correction module 230. Alternatively, the info handler 241 may obtain the resolution information of the RGB camera that provided the RGB image and of the depth camera that provided the depth image from the camera interface 210.
  • the pixel matching unit 242 can change the resolution of the RGB data. That is, to generate a point cloud, the pixel matching unit 242 converts the size of the RGB information so that RGB information and depth information of different resolutions (e.g., RGB at 1920*1080 and depth at 1024*1024) are matched to the depth information.
  • the pixel matching unit 242 may store changed information in the first data buffer 243 .
  • the first data buffer 243 may store RGB data and depth data matched by the pixel matching unit 242 .
  • the RGB data stored here may be information converted to suit the resolution of depth data.
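The resolution-matching step performed by the pixel matching unit can be sketched as a nearest-neighbour resample. This is a simplified stand-in (the document does not specify the resampling method; the function name and the choice of nearest-neighbour interpolation are assumptions here):

```python
import numpy as np

def match_rgb_to_depth(rgb, depth_shape):
    """Nearest-neighbour resample of an RGB frame so its resolution matches
    the depth frame (e.g. 1920x1080 RGB down to a 1024x1024 depth map)."""
    src_h, src_w = rgb.shape[:2]
    dst_h, dst_w = depth_shape
    # map each destination pixel back to its nearest source pixel
    rows = np.arange(dst_h) * src_h // dst_h
    cols = np.arange(dst_w) * src_w // dst_w
    return rgb[rows[:, None], cols[None, :]]
```

A real pipeline might instead use a calibrated reprojection between the two camera frames; this sketch only shows the size matching that makes per-pixel RGB/depth pairing possible.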
  • FIG. 5 is a diagram showing an example of a data generation module in a 3D AR object image collection device according to an embodiment of the present invention.
  • the data generation module 250 of the present invention includes a second data buffer 251, a point cloud generator 252, a point cloud data buffer 253, a video output generator 254, a data transmitter 255, and a PLY file manager 256.
  • the second data buffer 251 may receive, from the first data buffer 243, the depth data and the RGB data converted by the data sync module 240 based on the depth data, and temporarily store them.
  • the second data buffer 251 may transmit stored data (eg, resolution-converted RGB data and depth data) to the point cloud generator 252 .
  • the second data buffer 251 may convert the data received from the first data buffer 243 into a specified data format for point cloud generation (e.g., a format defined for point cloud generation or for the purpose for which the data is used) and store it.
  • the point cloud generator 252 may generate point cloud data based on data stored in the second data buffer 251 .
  • the point cloud generator 252 may allocate coordinates in a 3D space to data stored in the second data buffer 251 and process a collection of points for the allocated coordinates.
  • the point cloud generator 252 may perform an overlay process on the point cloud to build 3D data.
  • the point cloud data buffer 253 may store point cloud data generated by the point cloud generator 252 .
  • Data stored in the point cloud data buffer 253 may include collections of coordinates (or points) corresponding to the 3D AR object image.
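The step of allocating 3D coordinates to the RGB/depth pairs can be sketched as a standard pinhole back-projection. The document does not state the camera model; the pinhole intrinsics (fx, fy, cx, cy) and the convention of skipping zero-depth (filtered-out) pixels are assumptions made for this illustration:

```python
import numpy as np

def depth_to_point_cloud(depth, rgb, fx, fy, cx, cy):
    """Back-project an aligned depth map into 3D points with per-point
    colour, using pinhole intrinsics. Zero-depth pixels are skipped."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))  # pixel coordinates
    valid = depth > 0
    z = depth[valid]
    x = (u[valid] - cx) * z / fx
    y = (v[valid] - cy) * z / fy
    points = np.stack([x, y, z], axis=1)  # (N, 3) coordinates
    colors = rgb[valid]                   # (N, 3) RGB per point
    return points, colors
```

The resulting (points, colors) pairs are exactly the kind of coordinate collection a point cloud buffer would store; overlaying clouds from successive frames then builds up the 3D data.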
  • the video output generator 254 may generate a video file corresponding to the point cloud data stored in the point cloud data buffer 253, and store the generated video file or transmit it to a designated device.
  • the video generated by the video output generator 254 may include a real-time image of the 3D AR object image.
  • the video file generated by the video output generator 254 may include real-time video images of 3D AR object images of some objects based on the subject 11 .
  • the data transmitter 255 may transmit the point cloud data stored in the point cloud data buffer 253 to a designated device. For example, when a 3D AR object image supporting function is performed based on a program such as a video conference or a game, the data transmitter 255 can deliver the point cloud data corresponding to a specific object to the other electronic devices participating in that video conference or game.
  • the PLY file manager 256 may create a 3D polygon file based on the point cloud data.
  • the PLY file manager 256 may define a polygonal file format corresponding to at least some objects of the subject 11 based on the point cloud data.
  • the PLY file manager 256 may store various properties of an object corresponding to at least a part of the subject 11, including color, transparency, surface normals, texture coordinates, and data reliability values. That is, the PLY file manager 256 may perform PLY data conversion corresponding to the point cloud data.
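The PLY conversion step can be illustrated with a minimal ASCII PLY writer. This sketch covers only the common xyz-plus-colour case; the patent's file manager also handles normals, transparency, and reliability values, which are omitted here, and the function name is an assumption:

```python
def write_ascii_ply(path, points, colors):
    """Write a point cloud as an ASCII PLY file: float xyz plus uchar RGB,
    following the standard PLY header layout."""
    points = list(points)
    header = [
        "ply",
        "format ascii 1.0",
        f"element vertex {len(points)}",
        "property float x",
        "property float y",
        "property float z",
        "property uchar red",
        "property uchar green",
        "property uchar blue",
        "end_header",
    ]
    with open(path, "w") as f:
        f.write("\n".join(header) + "\n")
        # one vertex per line: x y z r g b
        for (x, y, z), (r, g, b) in zip(points, colors):
            f.write(f"{x} {y} {z} {r} {g} {b}\n")
```

Additional `property` lines in the header (e.g. `property float nx` for normals) are how PLY declares the extra per-vertex attributes the text mentions.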
  • FIG. 6 is a diagram illustrating an example of a method for obtaining a 3D AR object image in real time according to an embodiment of the present invention.
  • the 3D AR object image collection device 200 may activate a plurality of cameras 100 including an RGB camera and a depth camera. During this process, the 3D AR object image collection device 200 may activate the plurality of cameras 100 according to user manipulation or execution of an application designed for a 3D AR object image support function.
  • the 3D AR object image collection device 200 may acquire RGB data and depth data in real time. In this operation, the 3D AR object image collection device 200 may obtain time information on RGB data and depth data together, and maintain time synchronization for each data.
  • the 3D AR object image collection device 200 may perform filtering on the acquired data. For example, it may filter only the depth data of an object located at a specific depth and the RGB data corresponding to that object, based on the depth data. Alternatively, it may detect a specific object in the RGB image and filter the RGB partial data and depth partial data corresponding to that object. After a specific object is selected, the 3D AR object image collection device 200 may filter that object's data (e.g., RGB partial data and depth partial data) from the data obtained in real time.
  • the 3D AR object image collection device 200 may process data sync.
  • the 3D AR object image collection device 200 may convert the resolution of the RGB data to match the resolution of the depth data. That is, the resolution of the RGB image acquired by the RGB camera and that of the depth image acquired by the depth camera may differ depending on the characteristics of the cameras. Accordingly, the 3D AR object image collection device 200 may process a conversion task that matches the resolution of the RGB data to the resolution of the depth data. At this time, it may process the resolution conversion (a synchronization or matching operation) of the filtered RGB partial data based on the filtered depth partial data.
  • the 3D AR object image collection device 200 may generate and store point cloud data.
  • the 3D AR object image collection device 200 includes a point cloud generator 252, and using the point cloud generator 252, it can convert the RGB partial data and the depth partial data into points corresponding to 3D coordinates and process the converted points into a collection.
  • the point cloud generator 252 may generate a specific file (e.g., a video file or a PLY file) by overlaying the point clouds corresponding to the RGB partial data and depth partial data obtained in real time.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Studio Devices (AREA)
  • Image Processing (AREA)

Abstract

Disclosed are a method for operating a 3D AR object image collection apparatus supporting real-time acquisition of 3D AR object images, and a system to which the method is applied, the method comprising the steps of: acquiring, by the 3D AR object image collection apparatus, an RGB image and a depth image of a subject from a plurality of cameras; filtering, from the RGB image and the depth image, RGB partial data corresponding to at least some objects of the subject and depth partial data corresponding to those objects; and generating 3D point cloud data based on the filtered RGB partial data and depth partial data and storing the 3D point cloud data.
PCT/KR2021/017527 2021-10-27 2021-11-25 Real-time 3D AR object image acquisition system, and operating method therefor WO2023074999A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR1020210144555A KR102539827B1 (ko) 2021-10-27 2021-10-27 Real-time 3D AR object image acquisition system and operating method thereof
KR10-2021-0144555 2021-10-27

Publications (1)

Publication Number Publication Date
WO2023074999A1 (fr)

Family

ID=86160051

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2021/017527 WO2023074999A1 (fr) 2021-10-27 2021-11-25 Système d'acquisition d'image d'objet ar 3d en temps réel, et son procédé de fonctionnement

Country Status (2)

Country Link
KR (1) KR102539827B1 (fr)
WO (1) WO2023074999A1 (fr)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002157606A (ja) * 2000-11-17 2002-05-31 Canon Inc Image display control device, mixed reality presentation system, image display control method, and medium for providing a processing program
KR20190105458A (ko) * 2018-03-05 2019-09-17 한국전자통신연구원 Method for generating a hologram
KR20190133867A (ko) * 2018-05-24 2019-12-04 (주)네모랩스 Augmented reality service providing system and method for generating a 360-degree rotating image file therefor
KR102113812B1 (ko) * 2014-09-19 2020-05-22 한국전자통신연구원 Apparatus and method for implementing realistic augmented reality using RGB-D images
KR20200136432A (ko) * 2018-05-18 2020-12-07 이베이 인크. Physical object boundary detection techniques and systems


Also Published As

Publication number Publication date
KR102539827B1 (ko) 2023-06-08
KR20230060596A (ko) 2023-05-08

Similar Documents

Publication Publication Date Title
TW378159B (en) Input position detection device and entertainment system
WO2016027930A1 (fr) Portable device and method for controlling same
CN104583982B (zh) Medical support system and method therefor
WO2016048020A1 (fr) Image generation apparatus and method for generating 3D panoramic images
WO2020197115A2 (fr) Intraoral scanner cradle having a processor integrated therein, and intraoral scanner system comprising same
WO2020197070A1 (fr) Electronic device performing a function according to a gesture input, and operating method therefor
WO2015102126A1 (fr) Method and system for managing an electronic album using face recognition technology
WO2023074999A1 (fr) Real-time 3D AR object image acquisition system, and operating method therefor
WO2014003509A1 (fr) Apparatus and method for displaying augmented reality
WO2021137555A1 (fr) Electronic device comprising an image sensor, and method for operating same
WO2011071313A2 (fr) Method and apparatus for extracting a texture image and a depth image
WO2018110810A1 (fr) Virtual reality content creation system
WO2019103193A1 (fr) System and method for acquiring a 360° VR image in a game using a distributed virtual camera
WO2017213335A1 (fr) Method for combining images in real time
WO2024080438A1 (fr) Device, system, and method for real-time 3D AR object acquisition
WO2017086522A1 (fr) Method for synthesizing a chroma-key image without a background screen
WO2020085571A1 (fr) Image conversion apparatus and system for generating a 360° VR image in real time
WO2023085492A1 (fr) Method for supporting a 3D space map for AR content using Kinect devices, and electronic device supporting same
KR20000042653A (ko) Snapshot processing apparatus and processing method for a USB camera
WO2011007970A1 (fr) Method and apparatus for processing an image
WO2017209468A1 (fr) Chroma-key synthesis system and method for providing three-dimensional stereoscopic effects
CN207753797U (zh) Video transmission system and cable
WO2023224169A1 (fr) Three-dimensional skeleton estimation system and three-dimensional skeleton estimation method
CN113630592A (zh) Augmented reality remote co-performance system
KR101853694B1 (ko) Single modularization device using a carrier board

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21962622

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE