WO2023189690A1 - Real-time communication support system and method, mobile terminal, server, and computer-readable medium - Google Patents

Real-time communication support system and method, mobile terminal, server, and computer-readable medium

Info

Publication number
WO2023189690A1
WO2023189690A1 (PCT/JP2023/010480)
Authority
WO
WIPO (PCT)
Prior art keywords
point cloud
cloud data
target
landscape
mesh screen
Prior art date
Application number
PCT/JP2023/010480
Other languages
English (en)
Japanese (ja)
Inventor
ゆり 安達
真則 高岡
悟己 上野
教之 青木
研二 河野
Original Assignee
日本電気通信システム株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 日本電気通信システム株式会社
Publication of WO2023189690A1


Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00: Manipulating 3D models or images for computer graphics

Definitions

  • the present disclosure relates to a real-time communication support system, a real-time communication support method, a mobile terminal, a server, and a computer-readable medium.
  • Patent Document 1 discloses that a distance sensor measures the distance to each point within a predetermined area including an object.
  • Patent Document 1 discloses that three-dimensional object recognition means converts measurement data of each point into mesh data, and groups this mesh data into a plurality of mesh groups. Further, Patent Document 1 discloses that a projection plane perpendicular to the normal vector of each mesh group is created, and projection data is obtained by projecting each mesh group onto the corresponding projection plane. Furthermore, Patent Document 1 discloses that contour data of projection data is extracted and the contour data is compared with two-dimensional shape data of the target object to recognize the position and orientation of the target object.
  • When 3D point cloud data is converted into a mesh, CAD (Computer Aided Design) data, or polygons, structures are represented as objects, the data volume becomes smaller, and drawing and interaction can be processed faster.
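As a rough illustration of the data-volume reduction described above, the sketch below compares the storage footprint of a raw point cloud against a decimated mesh of the same scene. All counts and per-element byte sizes are illustrative assumptions, not figures from the disclosure.

```python
# Back-of-envelope comparison: raw 3D point cloud vs. a decimated mesh.
# Byte sizes and counts are assumed for illustration only.

def point_cloud_bytes(num_points: int) -> int:
    """XYZ as 3 x 4-byte floats plus RGB as 3 bytes per point."""
    return num_points * (3 * 4 + 3)

def mesh_bytes(num_vertices: int, num_triangles: int) -> int:
    """Vertices as 3 x 4-byte floats, triangles as 3 x 4-byte indices."""
    return num_vertices * 3 * 4 + num_triangles * 3 * 4

# Hypothetical scene: 10 million scanned points reduced to a 50k-triangle mesh.
raw = point_cloud_bytes(10_000_000)
meshed = mesh_bytes(30_000, 50_000)
print(f"raw point cloud: {raw / 1e6:.1f} MB, mesh: {meshed / 1e6:.3f} MB")
```

Even with generous mesh sizes, the meshed background is orders of magnitude smaller than the raw cloud, which is what makes real-time drawing and sharing feasible.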
  • meshed objects have lower accuracy than the 3D point cloud itself, and are therefore not suitable for measuring the details of an object with high precision.
  • The real-time communication support system disclosed herein comprises: means for photographing a target; means for acquiring 3D point cloud data from the photographed target; means for acquiring a mesh screen of a landscape in which parts other than the target are meshed; means for storing the 3D point cloud data of the target and the mesh screen of the landscape; and means for simultaneously displaying the 3D point cloud data of the target and the mesh screen of the landscape.
  • The present disclosure also provides a mobile terminal comprising means for photographing a target, and means for simultaneously displaying 3D point cloud data of the target and a mesh screen of a landscape obtained by meshing areas other than the target.
  • The server of the present disclosure is a server comprising: means for acquiring 3D point cloud data from a photographed target; means for acquiring a mesh screen of a landscape in which parts other than the target are meshed; and means for storing the 3D point cloud data of the target and the mesh screen of the landscape.
  • The real-time communication support method disclosed herein includes the steps of: photographing a target; acquiring 3D point cloud data from the photographed target; acquiring a mesh screen of a landscape in which parts other than the target are meshed; storing the 3D point cloud data of the target and the mesh screen of the landscape; and simultaneously displaying the 3D point cloud data of the target and the mesh screen of the landscape.
  • The program of this disclosure is a program that causes a mobile terminal to execute the steps of photographing a target and simultaneously displaying 3D point cloud data of the target and a mesh screen of a landscape in which areas other than the target are meshed.
  • FIG. 3 is a diagram illustrating distance measurement during infrastructure inspection, etc. according to the embodiment.
  • FIG. 3 is a diagram illustrating a first meshing method according to an embodiment.
  • FIG. 7 is a diagram showing a second meshing method according to the embodiment.
  • FIG. 1 is a diagram showing the configuration of a system according to an embodiment.
  • FIG. 1 is a diagram showing the configuration of a mobile terminal according to an embodiment.
  • FIG. 1 is a diagram showing the configuration of a server (cloud) according to an embodiment.
  • FIG. 7 is a flowchart of the processing of the mobile terminal according to the embodiment.
  • FIG. 8 is a flowchart of the processing of the server (cloud) according to the embodiment.
  • FIG. 3 is a diagram showing application of meshing processing according to an embodiment to simulation.
  • 3D point cloud data can be checked only on the measuring device itself, such as a tablet held by a worker at height. Therefore, a supervisor located away from the measurement site, for example in a distant office, cannot check the 3D point cloud data. Because 3D point cloud data is large and imposes a high processing load, it is difficult to transfer it to other workers' terminals and display and share it in real time. Furthermore, if everything displayed is 3D point cloud data, it is difficult for other workers to grasp the situation at the site.
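To make the real-time-sharing problem concrete, here is a hedged estimate of per-frame transfer time. The payload sizes and link speed are illustrative assumptions, not measurements from the disclosure.

```python
# Idealized per-frame transfer time over a network link,
# ignoring protocol overhead and latency. All numbers are assumptions.

def transfer_seconds(payload_bytes: int, link_mbps: float) -> float:
    """Seconds needed to move payload_bytes over a link_mbps link."""
    return payload_bytes * 8 / (link_mbps * 1_000_000)

full_cloud = 150_000_000   # ~150 MB raw point cloud frame (assumed)
mesh_frame = 1_000_000     # ~1 MB meshed background plus small target cloud (assumed)

print(f"full cloud:  {transfer_seconds(full_cloud, 100):.2f} s per frame")
print(f"mesh+target: {transfer_seconds(mesh_frame, 100):.2f} s per frame")
```

At an assumed 100 Mbps, a raw cloud takes seconds per frame while the meshed representation moves in a fraction of a second, which is the gap the system exploits.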
  • FIG. 2 is a diagram showing a first meshing method according to the embodiment. As shown in FIG. 2, a pre-prepared building mesh other than the target is obtained, and this mesh and the photographed 3D point cloud can be displayed overlappingly.
  • The building mesh prepared in advance may use external data, such as city data like PLATEAU.
  • The mobile terminal 200 is a personal computer, a tablet, or a smartphone. The mobile terminal 200 may also be a glasses-type device that realizes VR (Virtual Reality), AR (Augmented Reality), or MR (Mixed Reality), or a 3D display. The mobile terminal 200 may be carried by the user, or it may be a system that exists at the site and exchanges information with users, for example one that presents information to the user with a projector and obtains information from the user through a sensor or voice. The mobile terminal 200 acquires photographic data 300 using a 3D sensor, and transmits and receives data to and from other mobile terminals 200 in real time through the server (cloud) 100. Like the supervisor's terminal, the mobile terminal 200 may also simply obtain data from the server (cloud) 100 instead of acquiring the photographic data 300 with a 3D sensor.
  • FIG. 5 is a diagram showing the configuration of the mobile terminal according to the embodiment. The configuration of the mobile terminal according to the embodiment will be described with reference to FIG. 5.
  • the mobile terminal 200 includes a data collection section 210, an information display section 211, an information provision section 212, and a data transmission section 213.
  • The data collection unit 210 is a part that has the function of acquiring photographic data 300 captured by a 3D sensor such as LiDAR, a ToF camera, or a stereo sensor, by an RGB camera, or by other sensors, as well as data stored in the server (cloud) 100.
  • the 3D sensor may take pictures while the photographer is moving or moving the 3D sensor. Further, a plurality of 3D sensors may be used.
  • the information display unit 211 is a display device such as a liquid crystal display device or an organic EL (Electro Luminescence) display device.
  • the information display section 211 is connected to the data collection section and has a function of displaying information acquired by the data collection section 210.
  • the information display unit 211 displays 3D point cloud data and mesh screen data.
  • The information display unit 211 may render the data in a virtual space or display it in an easy-to-understand manner using a user's avatar, but the expression method is not limited to these. The information display unit 211 may not only display the 3D point cloud data and the mesh screen data, but may also superimpose (overlay) them to make them easier to see. It may also switch between the 3D point cloud data and the meshed screen data, adjust the degree of overlap, or let the user select which of the two to display.
  • The information providing unit 212 is a part that is connected to the information display unit 211 and has the function of adding information to the data displayed on the information display unit 211.
  • The information providing unit 212 may update and edit the data acquired by the data collection unit 210.
  • the information providing unit 212 generates information from other functions of the mobile terminal 200 (camera, GPS (Global Positioning System), etc.) and generates information unique to the user.
  • the information providing unit 212 provides information such as positional information such as the latitude and longitude of the location where the image was taken, the direction in which the image was taken, and information that identifies the object to be displayed in the 3D point cloud.
  • The information providing unit 212 also adds information such as comments regarding the target.
  • the information providing section 212 cooperates with a touch sensor, touch pen, etc. installed on the information display section 211.
  • FIG. 6 is a diagram showing the configuration of a server (cloud) according to the embodiment.
  • the configuration of the server (cloud) according to the embodiment will be described with reference to FIG. 6.
  • the server (cloud) 100 includes a location information acquisition section 110, a data collection section 111, an information management section 112, and a data transmission section 115.
  • the information management section 112 includes a mesh processing section 113 and an information holding section 114.
  • the information management unit 112 is connected to the data collection unit 111 and receives mesh data or 3D point cloud data.
  • the mesh processing unit 113 of the information management unit 112 is a part that has a function of converting mesh data or meshing a point group other than the target.
  • The mesh processing unit 113 processes mesh data acquired from the outside so that it can be applied to this system.
  • the mesh processing unit 113 performs meshing, polygonization, or conversion of the 3D point group into CAD data.
  • the mesh processing unit 113 may perform meshing processing in cooperation with an external system.
  • the external systems include a CAD system, a finished drawing management system, a drawing management system, and a GIS (Geographic Information System).
  • The target of meshing is not limited by the distinction between target and non-target. In the meshing process, whether to mesh and the granularity of meshing may be determined based on processing capacity, the number of points, density, accuracy, error, and the like. The granularity of meshing refers to precision in terms of the number of polygons and the number of operations. Furthermore, in the meshing process, the target of meshing and the granularity of meshing may be determined based on instructions from a person or in cooperation with external data.
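The bullet above lists several signals (processing capacity, point count, density) that may drive the meshing decision. The function below is a hypothetical policy illustrating one way such a decision could be coded; the function name and all thresholds are invented for the example.

```python
def choose_meshing(num_points: int, points_per_m3: float, is_target: bool) -> str:
    """Pick a representation for a region of the scan.

    Returns one of: "point_cloud", "fine_mesh", "coarse_mesh".
    Thresholds are illustrative assumptions, not values from the patent.
    """
    if is_target:
        # The inspection target keeps full point-cloud precision.
        return "point_cloud"
    if num_points > 1_000_000 or points_per_m3 < 100:
        # Very large or sparse background regions get a cheap coarse mesh.
        return "coarse_mesh"
    return "fine_mesh"

print(choose_meshing(5_000, 500.0, True))        # target stays a point cloud
print(choose_meshing(2_000_000, 500.0, False))   # heavy background, coarse mesh
print(choose_meshing(50_000, 500.0, False))      # modest background, fine mesh
```

A real system would also weigh the human instructions and external-data cooperation the bullet mentions; those inputs would simply become further parameters of the same policy function.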
  • The information holding unit 114 of the information management unit 112 is, for example, a magnetic recording medium (for example, a flexible disk, magnetic tape, or hard disk drive), a magneto-optical recording medium (for example, a magneto-optical disk), a CD-ROM (Read Only Memory), a CD-R, a CD-R/W, or a semiconductor memory (for example, a mask ROM, a PROM (programmable ROM), an EPROM (erasable PROM), a flash ROM, or a RAM (random access memory)).
  • the information storage unit 114 of the information management unit 112 is a part that has a function of storing 3D point cloud data of the target and data of scenery other than the target that has been meshed by the mesh processing unit 113.
  • the information holding unit 114 may hold a temporal history of the data and perform meshing processing based on the history and changes over time. Further, the information holding unit 114 may hold meshed data and data that is not to be meshed, and use the stored data to determine whether meshing is to be performed. Further, the information holding unit 114 may hold the data and use it for object detection (object search and matching, etc.) and object tracking. These processes make it possible to reduce and simplify point cloud processing.
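One way to use the held history, as the bullet above suggests, is to compare voxel occupancy between an older and a newer frame: voxels that did not change are stable candidates for meshing (and their mesh can be cached), while changed voxels stay as live point cloud. The sketch below illustrates that idea under assumed voxel sizes and synthetic points.

```python
def occupied_voxels(points, voxel_size=0.5):
    """Quantize 3D points into a set of integer voxel indices."""
    return {tuple(int(c // voxel_size) for c in p) for p in points}

def split_by_change(prev_points, curr_points, voxel_size=0.5):
    """Return (stable_voxels, changed_voxels) between two frames."""
    prev_v = occupied_voxels(prev_points, voxel_size)
    curr_v = occupied_voxels(curr_points, voxel_size)
    stable = prev_v & curr_v    # unchanged regions: mesh once and reuse
    changed = prev_v ^ curr_v   # appeared or vanished: keep as point cloud
    return stable, changed

# Synthetic two-frame example: the second point moved between frames.
prev = [(0.1, 0.1, 0.1), (2.1, 0.1, 0.1)]
curr = [(0.2, 0.2, 0.1), (5.1, 0.1, 0.1)]
stable, changed = split_by_change(prev, curr)
print("stable:", stable, "changed:", changed)
```

This reduces point cloud processing to the changed regions only, matching the bullet's goal of simplifying processing via the stored history.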
  • the mobile terminal 200 may include the mesh processing section 113. The mobile terminal 200 may transmit the 3D point cloud data, the mesh, or both to the server (cloud) 100.
  • the data transmission section 115 is connected to the information management section 112 and to the mobile terminal 200.
  • The data transmission unit 115 is the part that has the function of transmitting the 3D point cloud data of the target stored in the information management unit 112, together with the meshed landscape data produced by the mesh processing unit 113, to each mobile terminal 200 in response to a request received from that terminal.
  • the data transmitter 115 connects to a wireless LAN, wired LAN, Internet line, mobile phone line, and the like.
  • FIG. 7 is a flowchart of the processing of the mobile terminal according to the embodiment. The flowchart of the mobile terminal according to this embodiment will be described with reference to FIG. 7.
  • When displaying data stored in the server (cloud) 100 instead of data acquired by the 3D sensor, step A1 may be omitted.
  • the data collection unit 210 may perform point cloud processing, meshing processing, or preprocessing for point cloud processing and meshing processing.
  • The information providing unit 212 of the mobile terminal 200 determines a target from the photographed landscape as needed, and adds information such as comments (step A4).
  • the mobile terminal 200 transmits the data to the server (cloud) 100 (step A5).
  • FIG. 8 is a flowchart of the server processing according to the embodiment. The flowchart of the server processing of this embodiment will be described with reference to FIG. 8.
  • the server (cloud) 100 uses the location information acquisition unit 110 to acquire location information (shooting position, shooting direction, shooting settings and situation, and additional information) of the terminal and shooting data (step B1).
  • the data collection unit 111 of the server (cloud) 100 acquires 3D point cloud data of the photographed object or the object and scenery (step B2).
  • In step B2, the point cloud data captured by the sensors may be combined, and data from a plurality of sensors may be synthesized (registration). Alignment of the point cloud data using position information, external data, and the like may be performed in step B2 or after meshing.
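As a hedged sketch of the merge step above, the helpers below transform each sensor's points into a common frame using a known yaw and translation per sensor (for example from position information) and concatenate them. Real registration (for example ICP refinement) is more involved; this only illustrates the coordinate-frame merge, with all poses assumed.

```python
import math

def apply_pose(points, yaw_deg, translation):
    """Rotate points about the z-axis by yaw_deg, then translate."""
    c, s = math.cos(math.radians(yaw_deg)), math.sin(math.radians(yaw_deg))
    tx, ty, tz = translation
    return [(c * x - s * y + tx, s * x + c * y + ty, z + tz) for x, y, z in points]

def merge_clouds(clouds_with_poses):
    """clouds_with_poses: list of (points, yaw_deg, translation) per sensor."""
    merged = []
    for points, yaw, t in clouds_with_poses:
        merged.extend(apply_pose(points, yaw, t))
    return merged

# Two sensors observing the same scene from different (assumed) poses.
merged = merge_clouds([
    ([(1.0, 0.0, 0.0)], 90.0, (0.0, 0.0, 0.0)),
    ([(0.0, 0.0, 0.0)], 0.0, (2.0, 0.0, 0.0)),
])
print(merged)
```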
  • the server (cloud) 100 uses the mesh processing unit 113 to obtain a mesh screen, or creates a mesh screen by meshing the 3D point cloud data of the scenery other than the determined target (step B3). Meshing may be performed not on the server (cloud) 100 but on the mobile terminal 200 at the site.
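The step-B3 split can be sketched as: select target points with a bounding box (for example from user input or detection), keep them as a point cloud, and collapse everything else into coarse grid cells that stand in for mesh faces. The box coordinates, grid size, and scene points below are assumptions for the example.

```python
def split_target(points, bbox_min, bbox_max):
    """Partition points into (target, background) using an axis-aligned box."""
    target, background = [], []
    for p in points:
        if all(lo <= v <= hi for v, lo, hi in zip(p, bbox_min, bbox_max)):
            target.append(p)
        else:
            background.append(p)
    return target, background

def coarse_cells(points, cell=1.0):
    """Collapse background points to occupied grid cells (mesh-face stand-ins)."""
    return {tuple(int(c // cell) for c in p) for p in points}

scene = [(0.5, 0.5, 0.5), (0.6, 0.4, 0.5),   # inside the target box
         (3.2, 0.1, 0.0), (3.4, 0.2, 0.0),   # background
         (7.9, 2.1, 0.0)]                    # background
target, background = split_target(scene, (0.0, 0.0, 0.0), (1.0, 1.0, 1.0))
print(len(target), "target points kept;",
      len(coarse_cells(background)), "background cells")
```

Note how two nearby background points fall into one cell: that many-to-one collapse is the source of the data reduction, while the target keeps every point.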
  • the 3D mesh may be created before photographing the object with a 3D sensor and acquiring 3D point cloud data, or may be created after photographing the 3D point cloud data.
  • The created mesh screen may be matched against a mesh (polygon, CAD information, model, etc.) held externally or in the information holding unit 114, and the mesh screen may be aligned and identified against it; data matched in this step are treated as data to be meshed.
  • the point cloud processing and meshing processing may include pre-processing and post-processing such as noise removal and correction.
  • the server (cloud) 100 stores necessary or all processing data in the information holding unit 114 (step B4).
  • the server (cloud) 100 stores 3D point cloud data of the target and a mesh screen of the scenery.
  • Real-time information is displayed by adjusting the mesh granularity and precision based on instructions from a person, cooperation with external systems, real-time internal processing, and the securing of resources (step B5).
  • the server (cloud) 100 adds additional information as necessary, such as the information added using the mobile terminal 200 in step A4 (step B6).
  • the server transmits the target 3D point cloud data and data such as a landscape mesh screen to each mobile terminal 200 (step B7).
  • the information display unit 211 simultaneously displays the target 3D point cloud data sent to each mobile terminal 200 and the landscape mesh screen.
  • In this embodiment, when acquiring a point cloud with a 3D sensor to measure the distance between utility poles and electric wires, it is possible to mesh everything other than the target and display an image in which only the target is a point cloud. This makes the operation screen easier to understand, reduces the data volume and processing load, and allows images to be displayed in real time and shared with other terminals.
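The pole-to-wire separation check described above reduces, in the simplest case, to the minimum distance between two point sets. A brute-force sketch with synthetic coordinates is shown below; at realistic cloud sizes a spatial index such as a k-d tree would replace the double loop.

```python
import math

def min_separation(cloud_a, cloud_b):
    """Brute-force minimum point-to-point distance between two clouds."""
    return min(math.dist(a, b) for a in cloud_a for b in cloud_b)

# Toy clouds: two points along a pole and two along a wire (assumed values).
pole = [(0.0, 0.0, 0.0), (0.0, 0.0, 5.0)]
wire = [(3.0, 4.0, 5.0), (6.0, 8.0, 5.0)]
print(min_separation(pole, wire))
```

Keeping the target as a full-precision point cloud is exactly what makes this distance trustworthy, since meshing would smooth away the detail the measurement depends on.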
  • FIG. 9 is a diagram showing application of the meshing process according to the embodiment to simulation. Application of the meshing process according to this embodiment to simulation will be explained with reference to FIG.
  • objects created by clustering (segmenting) or meshing point cloud data are used for simulation. For example, when measuring the separation of power lines or communication lines, if the captured 3D data violates the separation, it is possible to try moving the existing 3D point group or model of the electric wire. By doing this, it is possible to search for a place where the separation violation can be resolved. Further, a function may be provided to automatically search for a place where the separation violation can be resolved.
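The "try moving the wire model" simulation above can be sketched as a search over candidate displacements until the required separation is restored. The clearance value, candidate offsets, and point coordinates are illustrative assumptions.

```python
import math

def min_separation(cloud_a, cloud_b):
    """Brute-force minimum point-to-point distance between two clouds."""
    return min(math.dist(a, b) for a in cloud_a for b in cloud_b)

def find_resolving_offset(wire, obstacle, required, candidates):
    """Return the first candidate offset that restores the required separation."""
    for dx, dy, dz in candidates:
        moved = [(x + dx, y + dy, z + dz) for x, y, z in wire]
        if min_separation(moved, obstacle) >= required:
            return (dx, dy, dz)
    return None  # no candidate resolves the violation

# Toy case: the wire is 0.5 m from an obstacle but needs 2.0 m of clearance.
wire = [(0.0, 0.0, 5.0)]
tree = [(0.5, 0.0, 5.0)]
offset = find_resolving_offset(wire, tree, 2.0,
                               [(0.0, 0.0, 1.0), (-2.0, 0.0, 0.0)])
print(offset)
```

The automatic-search function the bullet mentions would generate the candidate list itself, for example by sampling feasible attachment positions, instead of taking it as input.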
  • FIG. 10 is a diagram showing application of the meshing process according to the embodiment to physical distribution. Application of the mesh processing according to this embodiment to logistics will be described with reference to FIG. 10.
  • the mesh processing according to this embodiment can be applied to logistics and warehouse operation management.
  • The meshing process according to this embodiment is used in situations such as measuring the size of cargo or checking for abnormalities on a belt conveyor in a factory or warehouse. If all the 3D point cloud data of the cargo in a factory or warehouse were measured with a 3D sensor, the data volume would be large, the processing load high, and, because the cargo moves, real-time display would not be possible.
  • By meshing everything other than the target cargo, the amount of data is reduced, and the cargo in the factory or warehouse can be displayed in real time and the data shared.
  • an event camera may be used in combination to display moving objects as a 3D point cloud and stationary objects as a mesh.
  • FIG. 11 is a diagram showing application of the mesh processing according to the embodiment to animal breeding. Application of the mesh processing according to this embodiment to animal breeding will be described with reference to FIG. 11.
  • the meshing process according to this embodiment can be applied to a situation where construction machinery at a construction site is remotely controlled.
  • the processing load increases if you try to display the entire vast site as a 3D point cloud.
  • The program may also be provided to the computer on various types of transitory computer-readable media.
  • Examples of transitory computer-readable media include electrical signals, optical signals, and electromagnetic waves.
  • The transitory computer-readable medium can provide the program to the computer via a wired communication channel, such as electric wires or optical fiber, or via a wireless communication channel.
  • (Appendix 1) A real-time communication support system comprising: means for photographing a target; means for acquiring 3D point cloud data from the photographed target; means for acquiring a mesh screen of a landscape in which parts other than the target are meshed; means for storing the 3D point cloud data of the target and the mesh screen of the landscape; and means for simultaneously displaying the 3D point cloud data of the target and the mesh screen of the landscape.
  • (Appendix 2) The real-time communication support system according to Appendix 1, comprising: a mobile terminal comprising means for photographing the target and means for simultaneously displaying the 3D point cloud data of the target and the mesh screen of the landscape; and a server comprising means for acquiring 3D point cloud data from the photographed target, means for acquiring the mesh screen of the landscape in which areas other than the target are meshed, and means for storing the 3D point cloud data of the target and the mesh screen of the landscape.
  • (Appendix 3) A real-time communication support system comprising means for simultaneously displaying the 3D point cloud data of the target and the mesh screen of the landscape.
  • (Appendix 4) The real-time communication support system according to Appendix 3, comprising: a mobile terminal comprising means for photographing a landscape including the target, means for determining the target from the photographed landscape, and means for simultaneously displaying the 3D point cloud data of the target and the mesh screen of the landscape; and a server comprising means for acquiring 3D point cloud data from the photographed landscape, means for meshing the 3D point cloud data other than the determined target to create the mesh screen of the landscape, and means for storing the 3D point cloud data of the target and the mesh screen of the landscape.
  • a mobile terminal comprising means for photographing an object, and means for simultaneously displaying 3D point cloud data of the object and a mesh screen of a landscape obtained by meshing areas other than the object.
  • (Appendix 8) A server comprising: means for acquiring 3D point cloud data from a photographed target; means for acquiring a mesh screen of a landscape in which parts other than the target are meshed; and means for storing the 3D point cloud data of the target and the mesh screen of the landscape.
  • (Appendix 9) A real-time communication support method comprising the steps of: photographing a target; acquiring 3D point cloud data from the photographed target; acquiring a mesh screen of a landscape in which parts other than the target are meshed; storing the 3D point cloud data of the target and the mesh screen of the landscape; and simultaneously displaying the 3D point cloud data of the target and the mesh screen of the landscape.
  • (Appendix 10) A non-transitory computer-readable medium recording a program that causes a mobile terminal to execute the steps of photographing a target and simultaneously displaying 3D point cloud data of the target and a mesh screen of a landscape in which parts other than the target are meshed.
  • (Appendix 11) The steps of: acquiring 3D point cloud data from the photographed target; acquiring a mesh screen of a landscape in which parts other than the target are meshed; and storing the 3D point cloud data of the target and the mesh screen of the landscape.
  • (Appendix 12) means for photographing a landscape including a target; means for determining the target from the photographed landscape; and means for simultaneously displaying 3D point cloud data of the target and a mesh screen of the landscape other than the target.
  • (Appendix 13) Means for acquiring 3D point cloud data from a landscape including a photographed target; means for creating a mesh screen of the landscape by meshing the 3D point cloud data other than the target from the 3D point cloud data of the landscape; and means for storing the 3D point cloud data of the target and the mesh screen of the landscape.
  • Real-time communication support system; 11 means for photographing a target; 12 means for acquiring 3D point cloud data; 13 means for acquiring a mesh screen; 14 means for storing the 3D point cloud data and the mesh screen; 15 means for simultaneously displaying the 3D point cloud data and the mesh screen; 100 server (cloud); 110 location information acquisition unit; 111 data collection unit; 112 information management unit; 113 mesh processing unit; 114 information holding unit; 115 data transmission unit; 200 mobile terminal; 210 data collection unit; 211 information display unit; 212 information providing unit; 213 data transmission unit

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Graphics (AREA)
  • Computer Hardware Design (AREA)
  • General Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Processing Or Creating Images (AREA)

Abstract

This disclosure provides a real-time communication support system for simultaneously displaying fine 3D point cloud data and small-sized mesh data. The real-time communication support system comprises: a means (11) that photographs a target; a means (12) that acquires 3D point cloud data from the photographed target; a means (13) that acquires a landscape mesh screen obtained by converting objects other than the target into meshes; a means (14) that stores the 3D point cloud data of the target and the landscape mesh screen; and a means (15) that simultaneously displays the 3D point cloud data of the target and the landscape mesh screen.
PCT/JP2023/010480 2022-03-28 2023-03-16 Système d'aide à la communication en temps réel et procédé associé, terminal mobile, serveur et support lisible par ordinateur WO2023189690A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2022051437 2022-03-28
JP2022-051437 2022-03-28

Publications (1)

Publication Number Publication Date
WO2023189690A1 (fr)

Family

ID=88201685

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2023/010480 WO2023189690A1 (fr) 2022-03-28 2023-03-16 Système d'aide à la communication en temps réel et procédé associé, terminal mobile, serveur et support lisible par ordinateur

Country Status (1)

Country Link
WO (1) WO2023189690A1 (fr)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002074323A (ja) * 2000-09-01 2002-03-15 Kokusai Kogyo Co Ltd 三次元市街地空間モデル作成方法およびシステム
WO2020110164A1 (fr) * 2018-11-26 2020-06-04 三菱電機株式会社 Dispositif, procédé et programme de génération de données d'affichage
JP2020201863A (ja) * 2019-06-13 2020-12-17 キヤノン株式会社 情報処理装置、情報処理方法、およびプログラム
JP2022043539A (ja) * 2020-09-04 2022-03-16 キヤノン株式会社 画像処理装置、画像処理システム、画像処理方法、及びプログラム



Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23779708

Country of ref document: EP

Kind code of ref document: A1