CN113419630B - Projection AR-based adaptive occlusion elimination method - Google Patents

Projection AR-based adaptive occlusion elimination method

Info

Publication number
CN113419630B
CN113419630B (application CN202110717326.6A)
Authority
CN
China
Prior art keywords
projection
projector
information
scene
fastener
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202110717326.6A
Other languages
Chinese (zh)
Other versions
CN113419630A (en)
Inventor
吕昊
王淑侠
Current Assignee
Northwestern Polytechnical University
Original Assignee
Northwestern Polytechnical University
Priority date
Filing date
Publication date
Application filed by Northwestern Polytechnical University filed Critical Northwestern Polytechnical University
Priority to CN202110717326.6A priority Critical patent/CN113419630B/en
Publication of CN113419630A publication Critical patent/CN113419630A/en
Application granted granted Critical
Publication of CN113419630B publication Critical patent/CN113419630B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/0482Interaction with lists of selectable items, e.g. menus
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T17/00Three dimensional [3D] modelling, e.g. data description of 3D objects
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00Manipulating 3D models or images for computer graphics
    • G06T19/20Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02PCLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/30Computing systems specially adapted for manufacturing

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Graphics (AREA)
  • Human Computer Interaction (AREA)
  • Software Systems (AREA)
  • Geometry (AREA)
  • Architecture (AREA)
  • Computer Hardware Design (AREA)
  • Projection Apparatus (AREA)
  • Transforming Electric Information Into Light Information (AREA)
  • Controls And Circuits For Display Device (AREA)

Abstract

The invention discloses a projection AR-based adaptive occlusion elimination method that senses the real-time state of an assembly scene through the cooperative operation of a camera and a projector, detects, tracks, and locates in real time both the occluding target and the occluded projection information in the projection light path, and compensates the occluded information in real time through the cooperative operation of multiple projectors. The method improves the generality of the occluding-target detection algorithm, determines separately whether each occluding target is moving, and improves the efficiency and precision of occluding-target detection.

Description

Projection AR-based adaptive occlusion elimination method
Technical Field
The invention belongs to the technical field of augmented reality, and particularly relates to an adaptive occlusion elimination method.
Background
In the aerospace field, aircraft products are characterized by complex structures, numerous part types and quantities, and high precision requirements. Traditional manual assembly is therefore difficult and inefficient, and places high demands on the overall competence of workers.
Augmented reality (AR) technology has begun to be applied to various mechanical assembly scenarios. AR can overlay virtual information onto a real scene, and the constructed virtual environment provides more concentrated and comprehensive assembly information. An AR-based assembly assistance system can manage the large amount of information required for assembly in a unified, centralized way, greatly shortening the assembly lead time, reducing resource waste, and lowering cost. Matching and fusing the virtual environment with the real environment broadens the channels and range through which workers acquire information, strengthens their perception of important information, and lowers the demands on their spatial visualization ability; for the assembly work itself, it can greatly reduce assembly errors, shorten assembly time, and improve assembly efficiency and quality.
Projection-based AR is one mode of AR visualization. The operator need not wear any equipment, so the technology imposes no constraints on the operator; moreover, projection AR integrates real objects, virtual objects, and the interactive environment, making its interaction more natural and diverse; finally, projection AR can extend its display range almost without limit as the number and placement of projectors change. Owing to these advantages, projection AR is widely applied to assembly assistance for complex products such as aerospace vehicles.
However, projection-based augmented reality (PBAR) is often hampered by limited mobility. Because projector positions are relatively fixed, the operator's hands, the mechanical parts being assembled, and other objects frequently occlude the guidance information during assisted assembly. Research on the projection occlusion problem remains sparse and has serious shortcomings.
According to the current state of research at home and abroad, the projection occlusion problem in projection AR is typically addressed by multi-projector shadow elimination: after the occluded region is identified and detected, another projector projects onto it from a different angle as supplementary projection. Research so far has focused mainly on detecting, identifying, and locating occluders and shadow regions, while multi-projector cooperation has been studied only superficially, with the following shortcomings:
1) The methods for identifying and detecting occluders and shadow regions are not accurate enough; most build a skeleton model targeted at human-body occlusion, so they lack generality and universality. They also fail to establish a direct link between the occluder and the occluded shadow region: the shadow region on the projection surface must be mapped through a spatial-information transformation after a depth camera has identified the occluder.
2) The supplementary projection is not positioned and aligned with the main projector's image, so the supplementary image may be spatially misplaced and produce visual interference such as ghosting.
3) The supplementary projection image is not processed. It must be segmented before supplementary projection so that only the occluded information is compensated; otherwise the re-projection of unoccluded content by multiple projectors causes visual discomfort.
Disclosure of Invention
To overcome the shortcomings of the prior art, the invention provides a projection AR-based adaptive occlusion elimination method that senses the real-time state of an assembly scene through the cooperative operation of a camera and a projector, detects, tracks, and locates in real time both the occluding target and the occluded projection information in the projection light path, and compensates the occluded information in real time through the cooperative operation of multiple projectors. The method improves the generality of the occluding-target detection algorithm, determines separately whether each occluding target is moving, and improves the efficiency and precision of occluding-target detection.
The technical solution adopted by the invention to solve this problem comprises the following steps:
Step 1: constructing a fastener AR-guided assembly system comprising an image acquisition client for scene state perception, a projection client for AR visual display, and a virtual-real registration client for positioning the multi-projector images;
establishing a three-dimensional model of the fastener, designing a fastener guided-installation visualization scheme according to the process information, and overlaying the process information onto the hole positions to be fitted in the form of videos and patterns, providing installation guidance for workers;
Step 2: establishing a data transmission communication framework among the three clients of the fastener AR-guided assembly system, with data exchanged between clients via Socket communication over the TCP network protocol;
Step 3: starting the Basler industrial camera and the projector, and jointly calibrating them by Zhang Zhengyou's calibration method; solving the intrinsic and extrinsic parameters of the camera, then calculating the intrinsic and extrinsic parameters of the projector, unifying the virtual coordinate system, the camera coordinate system, and the projector coordinate system;
Step 4: placing four artificial identification codes based on elliptical features in the area to be assembled, and positioning and aligning the multi-projector images through the camera-projector joint calibration technique, realizing the virtual-real registration function;
Step 5: establishing the coordinate relationship between the projected interactive menu of the fastener AR-guided assembly system and the menus in the Unity scene; the main projector projects interactive menus for the different fastener types, the operator clicks an interactive menu with a finger, the system responds by projecting the process information of the corresponding fastener type, and the operator installs the fastener according to the guidance information;
Step 6: monitoring the assembly scene in real time with the Basler industrial camera; the user selects the scene range to be monitored by setting an ROI (region of interest), improving the system's response speed and detection precision; the Basler industrial camera both monitors whether the user clicks an interactive menu and judges in real time whether the projection information is occluded, tracking and locating the occluding target in real time;
Step 7: if an occluding target appears in the scene, the system segments and extracts the occluded guidance information image by image processing, according to the coordinates of the occluding target and of the shadow region on the projection surface; by detecting and tracking the occluding target and the shadow region, the system automatically selects an unobstructed supplementary projector to project from another angle, compensating the guidance information and adaptively eliminating the occlusion;
Step 8: because the occluder casts a shadow on the projection surface, the brightness of the supplementarily projected image is lower than that of the unoccluded areas, so the system applies photometric compensation to the projected image;
knowing the pixel coordinates of the shadow region, the system adjusts the contrast and brightness of the shadow region image through image processing so that the projected picture remains consistent in brightness and color;
Step 9: the assembly operator clicks the different projected interactive menus in sequence according to the guidance information, completing the fastener installation step by step.
The invention has the following beneficial effects:
1. By controlling the positions of the artificial markers, the invention can adjust and constrain the projector's projection position; this supports accurate projection from a moving projector and prevents problems such as projection jitter.
2. Through the cooperative operation of the camera and projectors, the invention effectively solves the occlusion of projection information during projector operation, improving the visual effect of the projection and the efficiency of information guidance.
Drawings
FIG. 1 is a schematic view of the assembly system of the present invention.
FIG. 2 is a diagram illustrating adaptive elimination of projection occlusion according to the present invention.
FIG. 3 is a diagram illustrating positioning of multiple projection frames according to the present invention.
FIG. 4 is a flowchart of an algorithm for detecting and tracking an occluded target according to the present invention.
FIG. 5 is an interactive interface and visualization form of the present invention.
FIG. 6 is a comparison graph of the effects before and after occlusion compensation according to the present invention.
FIG. 7 is a comparison graph of the effects before and after the photometric compensation according to the present invention.
FIG. 8 shows fastener installation before occlusion compensation according to the present invention.
FIG. 9 shows fastener installation after occlusion compensation according to the present invention.
Detailed Description
The invention is further illustrated with reference to the following figures and examples.
To address the problems in the prior art, the invention provides a projection AR-based adaptive occlusion elimination method that mainly realizes guided fastener installation assistance, adaptive elimination of occluded projection guidance information, and human-machine interaction via projected menus. The specific steps are as follows:
Step 1: constructing a fastener AR-guided assembly system comprising an image acquisition client for scene state perception, a projection client for AR visual display, and a virtual-real registration client for positioning the multi-projector images;
establishing a three-dimensional model of the fastener, designing a fastener guided-installation visualization scheme according to the process information, and overlaying the process information onto the hole positions to be fitted in the form of videos and patterns, providing installation guidance for workers;
Step 2: establishing a data transmission communication framework among the three clients of the fastener AR-guided assembly system, with data exchanged between clients via Socket communication over the TCP network protocol;
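The patent specifies Socket communication over TCP between the three clients but gives no wire format. A minimal Python sketch of length-prefixed framing that such clients could use to exchange messages reliably over a TCP byte stream (the framing scheme and the example message are assumptions, not taken from the patent):

```python
import socket
import struct

def send_msg(sock: socket.socket, payload: bytes) -> None:
    """Prefix each message with its 4-byte big-endian length so the
    receiver can split messages on the TCP byte stream."""
    sock.sendall(struct.pack("!I", len(payload)) + payload)

def _recv_exact(sock: socket.socket, n: int) -> bytes:
    """Read exactly n bytes, looping over short reads."""
    buf = b""
    while len(buf) < n:
        chunk = sock.recv(n - len(buf))
        if not chunk:
            raise ConnectionError("peer closed the connection")
        buf += chunk
    return buf

def recv_msg(sock: socket.socket) -> bytes:
    """Read one length-prefixed message."""
    (length,) = struct.unpack("!I", _recv_exact(sock, 4))
    return _recv_exact(sock, length)
```

For example, the image acquisition client could send a detected shadow bounding box such as `b"OCCLUDED 30 20 20 20"` to the projection client, which replies with an acknowledgement.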
Step 3: starting the Basler industrial camera and the projector, and jointly calibrating them by Zhang Zhengyou's calibration method; solving the intrinsic and extrinsic parameters of the camera, then calculating the intrinsic and extrinsic parameters of the projector, unifying the virtual coordinate system, the camera coordinate system, and the projector coordinate system;
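Zhang Zhengyou's calibration recovers the camera intrinsics from homographies between a planar target and its images; in practice one would simply call OpenCV's `cv2.calibrateCamera`. As a self-contained illustration, a NumPy sketch of the closed-form intrinsics recovery at the core of the method, assuming noise-free plane homographies (the projector can then be calibrated analogously by treating it as an inverse camera; that step is not shown):

```python
import numpy as np

def _v(H, i, j):
    """Row of Zhang's constraint matrix built from columns i and j of H."""
    hi, hj = H[:, i], H[:, j]
    return np.array([
        hi[0] * hj[0],
        hi[0] * hj[1] + hi[1] * hj[0],
        hi[1] * hj[1],
        hi[2] * hj[0] + hi[0] * hj[2],
        hi[2] * hj[1] + hi[1] * hj[2],
        hi[2] * hj[2],
    ])

def intrinsics_from_homographies(Hs):
    """Closed-form step of Zhang's method: recover the 3x3 intrinsic
    matrix K from three or more plane-to-image homographies."""
    V = []
    for H in Hs:
        V.append(_v(H, 0, 1))                # h1' B h2 = 0
        V.append(_v(H, 0, 0) - _v(H, 1, 1))  # h1' B h1 = h2' B h2
    # b = (B11, B12, B22, B13, B23, B33) spans the null space of V
    b = np.linalg.svd(np.asarray(V))[2][-1]
    if b[0] < 0:
        b = -b  # B = K^-T K^-1 is positive definite up to a positive scale
    B11, B12, B22, B13, B23, B33 = b
    v0 = (B12 * B13 - B11 * B23) / (B11 * B22 - B12**2)
    lam = B33 - (B13**2 + v0 * (B12 * B13 - B11 * B23)) / B11
    alpha = np.sqrt(lam / B11)                        # focal length (x)
    beta = np.sqrt(lam * B11 / (B11 * B22 - B12**2))  # focal length (y)
    gamma = -B12 * alpha**2 * beta / lam              # skew
    u0 = gamma * v0 / beta - B13 * alpha**2 / lam     # principal point x
    return np.array([[alpha, gamma, u0],
                     [0.0,   beta,  v0],
                     [0.0,   0.0,  1.0]])
```

Given K, the extrinsics of each view follow from the same homography columns, which is how the camera and projector coordinate systems are tied together.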
Step 4: placing four artificial identification codes based on elliptical features in the area to be assembled, and positioning and aligning the multi-projector images through the camera-projector joint calibration technique, realizing the virtual-real registration function;
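The four elliptical markers yield four point correspondences between each projector image and the camera view of the work surface, which determine a plane-to-plane homography for aligning the projections. A direct linear transform (DLT) sketch of that estimation; in a real system one would typically call `cv2.findHomography` on the detected ellipse centers:

```python
import numpy as np

def homography_from_points(src, dst):
    """Direct Linear Transform: estimate the 3x3 homography H with
    dst ~ H @ src from four or more point correspondences."""
    A = []
    for (x, y), (u, v) in zip(src, dst):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y, -u])
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y, -v])
    # The homography vector is the null vector of A (smallest singular value)
    h = np.linalg.svd(np.asarray(A, dtype=float))[2][-1]
    return h.reshape(3, 3) / h[8]

def map_point(H, p):
    """Apply H to a 2-D point via homogeneous coordinates."""
    q = H @ np.array([p[0], p[1], 1.0])
    return q[:2] / q[2]
```

With one such homography per projector, every projector image can be warped into a common registered frame before display.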
Step 5: establishing the coordinate relationship between the projected interactive menu of the fastener AR-guided assembly system and the menus in the Unity scene; the main projector projects interactive menus for the different fastener types, the operator clicks an interactive menu with a finger, the system responds by projecting the process information of the corresponding fastener type, and the operator installs the fastener according to the guidance information;
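Once the camera-detected fingertip is mapped into projection-plane coordinates, responding to a menu click reduces to a point-in-rectangle test. A minimal sketch; the menu names and layout below are illustrative assumptions, not from the patent:

```python
def menu_hit(point, menus):
    """Return the id of the projected menu rectangle containing the clicked
    point (in projection-plane pixels), or None if the click missed."""
    px, py = point
    for name, (x, y, w, h) in menus.items():
        if x <= px < x + w and y <= py < y + h:
            return name
    return None

# Hypothetical menu layout in projector pixel coordinates: name -> (x, y, w, h)
FASTENER_MENUS = {
    "hex_bolt": (50, 40, 160, 60),
    "rivet": (50, 120, 160, 60),
    "anchor_nut": (50, 200, 160, 60),
}
```

The returned id would then select which fastener's process information the Unity scene projects.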
Step 6: monitoring the assembly scene in real time with the Basler industrial camera; the user selects the scene range to be monitored by setting an ROI (region of interest), improving the system's response speed and detection precision; the Basler industrial camera both monitors whether the user clicks an interactive menu and judges in real time whether the projection information is occluded, tracking and locating the occluding target in real time;
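Occlusion detection within the ROI can be sketched as differencing the live camera frame against a reference view of the unoccluded projection: the bounding box of changed pixels localizes the occluder, and comparing boxes across frames gives the moving/static judgment the method calls for. The threshold and motion tolerance below are illustrative assumptions:

```python
import numpy as np

def detect_occlusion(reference, current, roi, thresh=30):
    """Difference the live frame against a reference of the unoccluded scene
    inside the ROI; return the occluder's bounding box (x, y, w, h) or None."""
    rx, ry, rw, rh = roi
    ref = reference[ry:ry + rh, rx:rx + rw].astype(np.int16)
    cur = current[ry:ry + rh, rx:rx + rw].astype(np.int16)
    mask = np.abs(cur - ref) > thresh
    if not mask.any():
        return None
    ys, xs = np.nonzero(mask)
    return (rx + int(xs.min()), ry + int(ys.min()),
            int(xs.max() - xs.min()) + 1, int(ys.max() - ys.min()) + 1)

def occluder_moved(prev_box, box, tol=5):
    """Crude motion judgment: did the occluder's top-left corner shift by
    more than tol pixels between consecutive frames?"""
    if prev_box is None or box is None:
        return prev_box != box
    return any(abs(a - b) > tol for a, b in zip(prev_box[:2], box[:2]))
```

A static occluder lets the system compute the compensation once, while a moving one requires re-segmentation every frame.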
Step 7: if an occluding target appears in the scene, the system segments and extracts the occluded guidance information image by image processing, according to the coordinates of the occluding target and of the shadow region on the projection surface; by detecting and tracking the occluding target and the shadow region, the system automatically selects an unobstructed supplementary projector to project from another angle, compensating the guidance information and adaptively eliminating the occlusion;
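Segmenting the occluded guidance information so that only it is re-projected can be sketched as masking: the supplementary projector renders only the shadow region, while the main projector blanks that region (the blanking is a simplifying assumption; its light is blocked there in any case). A grayscale-image sketch:

```python
import numpy as np

def compensation_mask(shape, shadow_box):
    """Binary mask covering only the occluded (shadow) region."""
    mask = np.zeros(shape, dtype=np.uint8)
    x, y, w, h = shadow_box
    mask[y:y + h, x:x + w] = 1
    return mask

def split_frames(guidance, shadow_box):
    """Main projector keeps everything except the shadow region; the
    supplementary projector fills only that region, so unoccluded content
    is never projected twice."""
    mask = compensation_mask(guidance.shape, shadow_box)
    return guidance * (1 - mask), guidance * mask
```

The supplementary frame would additionally be warped by the step-4 homography into the supplementary projector's own coordinates before display.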
Step 8: because the occluder casts a shadow on the projection surface, the brightness of the supplementarily projected image is lower than that of the unoccluded areas, so the system applies photometric compensation to the projected image;
knowing the pixel coordinates of the shadow region, the system adjusts the contrast and brightness of the shadow region image through image processing so that the projected picture remains consistent in brightness and color;
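Photometric compensation over the known shadow pixels can be sketched as a linear gain/offset remap that matches the region's statistics to the unoccluded surroundings. The linear model is an assumption; the patent only states that contrast and brightness are adjusted:

```python
import numpy as np

def photometric_compensate(frame, shadow_box, target_mean, target_std=None):
    """Linearly remap the shadow region (out = gain * in + offset) so its
    mean (and optionally spread) matches the unoccluded surroundings."""
    x, y, w, h = shadow_box
    region = frame[y:y + h, x:x + w].astype(np.float64)
    gain = 1.0
    if target_std is not None and region.std() > 0:
        gain = target_std / region.std()  # contrast correction
    offset = target_mean - gain * region.mean()  # brightness correction
    out = frame.astype(np.float64)  # astype returns a copy
    out[y:y + h, x:x + w] = gain * region + offset
    return np.clip(out, 0, 255).astype(np.uint8)
```

`target_mean`/`target_std` would be measured from the unoccluded pixels bordering the shadow region.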
Step 9: the assembly operator clicks the different projected interactive menus in sequence according to the guidance information, completing the fastener installation step by step.
The invention effectively solves the following problems in existing projection-occlusion techniques:
1. It improves the generality of the occluding-target detection algorithm, determines separately whether each occluding target is moving, and improves the efficiency and precision of occluding-target detection.
2. The coordinate position of the shadow region on the projection surface is obtained directly, without computation through spatial coordinate transformation.
3. The multi-projector images are positioned and aligned, and the supplementary and source projection images are preprocessed, reducing visual interference such as ghosting; photometric compensation of the image resolves the inconsistent brightness of the projected picture caused by shadows.

Claims (1)

1. A projection AR-based adaptive occlusion elimination method, characterized by comprising the following steps:
Step 1: constructing a fastener AR-guided assembly system comprising an image acquisition client for scene state perception, a projection client for AR visual display, and a virtual-real registration client for positioning the multi-projector images;
establishing a three-dimensional model of the fastener, designing a fastener guided-installation visualization scheme according to the process information, and overlaying the process information onto the hole positions to be fitted in the form of videos and patterns, providing installation guidance for workers;
Step 2: establishing a data transmission communication framework among the three clients of the fastener AR-guided assembly system, with data exchanged between clients via Socket communication over the TCP network protocol;
Step 3: starting the Basler industrial camera and the projector, and jointly calibrating them by Zhang Zhengyou's calibration method; solving the intrinsic and extrinsic parameters of the camera, then calculating the intrinsic and extrinsic parameters of the projector, unifying the virtual coordinate system, the camera coordinate system, and the projector coordinate system;
Step 4: placing four artificial identification codes based on elliptical features in the area to be assembled, and positioning and aligning the multi-projector images through the camera-projector joint calibration technique, realizing the virtual-real registration function;
Step 5: establishing the coordinate relationship between the projected interactive menu of the AR-guided assembly system and the menus in the Unity scene; the main projector projects interactive menus for the different fastener types, the operator clicks an interactive menu with a finger, the system responds by projecting the process information of the corresponding fastener type, and the operator installs the fastener according to the guidance information;
Step 6: monitoring the assembly scene in real time with the Basler industrial camera; the user selects the scene range to be monitored by setting an ROI (region of interest), improving the system's response speed and detection precision; the Basler industrial camera both monitors whether the user clicks an interactive menu and judges in real time whether the projection information is occluded, tracking and locating the occluding target in real time;
Step 7: if an occluding target appears in the scene, segmenting and extracting the occluded guidance information image by image processing, according to the coordinates of the occluding target and of the shadow region on the projection surface; by detecting and tracking the occluding target and the shadow region, the system autonomously selects an unobstructed supplementary projector to project from another angle, compensating the guidance information and adaptively eliminating the occlusion;
Step 8: because the occluder casts a shadow on the projection surface, the brightness of the supplementarily projected image is lower than that of the unoccluded areas, so the system applies photometric compensation to the projected image;
knowing the pixel coordinates of the shadow region, the system adjusts the contrast and brightness of the shadow region image through image processing so that the projected picture remains consistent in brightness and color;
Step 9: the assembly operator clicks the different projected interactive menus in sequence according to the guidance information, completing the fastener installation step by step.
CN202110717326.6A 2021-06-28 2021-06-28 Projection AR-based adaptive occlusion elimination method Active CN113419630B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110717326.6A CN113419630B (en) 2021-06-28 2021-06-28 Projection AR-based adaptive occlusion elimination method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110717326.6A CN113419630B (en) 2021-06-28 2021-06-28 Projection AR-based adaptive occlusion elimination method

Publications (2)

Publication Number Publication Date
CN113419630A CN113419630A (en) 2021-09-21
CN113419630B true CN113419630B (en) 2022-12-13

Family

ID=77717784

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110717326.6A Active CN113419630B (en) 2021-06-28 2021-06-28 Projection AR-based adaptive occlusion elimination method

Country Status (1)

Country Link
CN (1) CN113419630B (en)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2010113568A (en) * 2008-11-07 2010-05-20 Toyota Infotechnology Center Co Ltd Motion detector and operation system using the same
CN102129708A (en) * 2010-12-10 2011-07-20 北京邮电大学 Fast multilevel imagination and reality occlusion method at actuality enhancement environment
CN102509343A (en) * 2011-09-30 2012-06-20 北京航空航天大学 Binocular image and object contour-based virtual and actual sheltering treatment method
CN103489214A (en) * 2013-09-10 2014-01-01 北京邮电大学 Virtual reality occlusion handling method, based on virtual model pretreatment, in augmented reality system
CN104537695A (en) * 2015-01-23 2015-04-22 贵州现代物流工程技术研究有限责任公司 Anti-shadow and anti-covering method for detecting and tracing multiple moving targets
CN107292965A (en) * 2017-08-03 2017-10-24 北京航空航天大学青岛研究院 A kind of mutual occlusion processing method based on depth image data stream
CN110076277A (en) * 2019-05-07 2019-08-02 清华大学 Match nail method based on augmented reality
CN110825234A (en) * 2019-11-11 2020-02-21 江南大学 Projection type augmented reality tracking display method and system for industrial scene

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4270264B2 (en) * 2006-11-01 2009-05-27 セイコーエプソン株式会社 Image correction apparatus, projection system, image correction method, image correction program, and recording medium
US10134198B2 (en) * 2016-04-19 2018-11-20 Adobe Systems Incorporated Image compensation for an occluding direct-view augmented reality system


Non-Patent Citations (6)

* Cited by examiner, † Cited by third party
Title
Dream-Experiment: A MR User Interface with Natural Multi-channel Interaction for Virtual Experiments; Tianren Luo et al.; IEEE Xplore; 2020-12-31 *
Shadow-occluded road extraction from remote sensing imagery based on luminance compensation; He Huixin et al.; Journal of Geo-Information Science; 2020-02-25; vol. 22, no. 02 *
Multi-level occlusion algorithm in augmented reality; Zhang Jinling et al.; Journal of Hunan University (Natural Sciences); 2009-05-25; vol. 36, no. 05 *
A survey of human-computer interaction techniques in spatial augmented reality; Yuan Qingshu et al.; Journal of Computer-Aided Design & Computer Graphics; 2021-03-31; vol. 33, no. 3 *
Solving the occlusion problem with dual-camera structured-light 3D measurement; Peng Quan et al.; Tool Engineering; 2018-03-20; vol. 52, no. 03 *
Research on augmented reality virtual-real fusion technology for product assembly guidance; Wang Yue; China Doctoral Dissertations Full-text Database, Information Science & Technology; 2020-02-15; vol. 2020, no. 2 *

Also Published As

Publication number Publication date
CN113419630A (en) 2021-09-21

Similar Documents

Publication Publication Date Title
Sukthankar et al. Dynamic shadow elimination for multi-projector displays
CN101916175B (en) Intelligent projecting method capable of adapting to projection surface automatically
US10506207B2 (en) Projection system, method for controlling projection system, and projector
US20080136976A1 (en) Geometric Correction Method in Multi-Projection System
CN107580204B (en) Realize the projecting method of automatic adjusument
AU738375B2 (en) Methods and apparatus for changing a color of an image
CN105554447B (en) A kind of coal working face real-time video splicing system based on image processing techniques
CN106534817B (en) Curved surface projection automatic geometric correction method based on image recognition
CA3001430C (en) Image processing method and device for led display screen
CN206819048U (en) A kind of ball curtain projection system
US6320578B1 (en) Image shadow elimination method, image processing apparatus, and recording medium
CN113419630B (en) Projection AR-based adaptive occlusion elimination method
CN101483742A (en) Forward projection displaying method for combined large screen and control apparatus
CN104516482A (en) Shadowless projection system and method
CN112465959B (en) Transformer substation three-dimensional live-action model inspection method based on local scene updating
CN117278856A (en) Method for realizing personnel space positioning by using monitoring camera in digital twin technology
Xiang et al. Towards mobile projective AR for construction co-robots
JP2000081950A (en) Image processor, image processing method, presentation medium, and presentation system
CN111064947A (en) Panoramic-based video fusion method, system, device and storage medium
CN112040596B (en) Virtual space light control method, computer readable storage medium and system
Leubner et al. Computer-vision-based human-computer interaction with a back projection wall using arm gestures
CN107610043A (en) A kind of polyphaser topology connected relation edit methods based on web
JPH0746582A (en) Video segmenting method
Hiroi et al. Low-Latency Beaming Display: Implementation of Wearable, 133 µs Motion-to-Photon Latency Near-eye Display
JPH0675617A (en) Camera view point change system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant