CN109587464A - Outdoor large-format interactive projection method - Google Patents

Outdoor large-format interactive projection method

Info

Publication number
CN109587464A
CN109587464A
Authority
CN
China
Prior art keywords
projection
interaction
projector
building
interactive medium
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201811578472.XA
Other languages
Chinese (zh)
Inventor
孙晓颖
李翔
燕学智
陈建
佴威至
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Jilin University
Original Assignee
Jilin University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Jilin University filed Critical Jilin University
Priority to CN201811578472.XA priority Critical patent/CN109587464A/en
Publication of CN109587464A publication Critical patent/CN109587464A/en
Pending legal-status Critical Current

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00Details of colour television systems
    • H04N9/12Picture reproducers
    • H04N9/31Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
    • H04N9/3179Video signal processing therefor
    • H04N9/3185Geometric adjustment, e.g. keystone or convergence
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T3/00Geometric image transformation in the plane of the image
    • G06T3/40Scaling the whole image or part thereof
    • G06T3/4038Scaling the whole image or part thereof for image mosaicing, i.e. plane images composed of plane sub-images
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00Details of colour television systems
    • H04N9/12Picture reproducers
    • H04N9/31Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
    • H04N9/3179Video signal processing therefor

Abstract

The present invention relates to an outdoor large-format interactive projection method, belonging to the fields of multi-projector tiled display and human-computer interaction. The exterior surface of a building serves as the projection plane and multiple projectors serve as the display device. The projected images are stitched by a manual or automatic projection-stitching method, and the content of each projector is aligned with the key points of the building surface, so that the multi-projector tiled display achieves a 3D effect by exploiting those key points. Because the invention projects onto a rough building surface, the concave and convex features of the surface are used to produce interactive media with a stereoscopic effect; at the same time, modern human-computer interaction techniques allow users to interact with media produced by the method. The result is a striking visual effect that blends with the building and gives viewers a stereoscopic impression.

Description

Outdoor large-format interactive projection method
Technical field
The present invention relates to multi-projector tiled display technology and modern human-computer interaction technology, and in particular to an interactive building-surface projection method.
Background technique
Traditional multi-projector tiled displays all use a planar or curved projection screen as the display surface. With the progress of projector technology, the ever-rising demands on visual experience, and the development of new media art, people need a novel projection display technology with stronger visual impact. Moreover, with the development of interaction technology, people are no longer content merely to watch a film; they want to participate in the whole event, which calls for modern human-computer interaction technology.
Summary of the invention
The present invention provides an outdoor large-format interactive projection method that can project onto rough building surfaces and control the projected content in real time through various sensors or hand-held mobile terminals, producing interactive effects.
The outdoor large-format interactive projection method of the invention uses the exterior surface of a building as the projection plane and multiple projectors as the display device. The projected images are stitched by a manual or automatic projection-stitching method, and the content of each projector is aligned with the key points of the building surface, so that the multi-projector tiled display achieves a 3D effect by exploiting those key points. A key point is an intersection line formed by the discontinuous faces of the building's outer surface, or an intersection point of such lines. A game development engine and human-computer interaction devices are provided: the interaction devices collect the user's actions, and the engine exposes a port that receives the action information sent back by the interaction devices; the user's actions are combined with the projected images to form interactive media.
The method specifically includes the following steps:
Step 1: determine the projection surface and the viewer position, and photograph the projection surface from the viewer position with a camera.
Step 2: determine the projector parameters, the number of projectors, and the projector placement according to the material of the projection surface, the on-site lighting conditions, and the project budget.
Step 3: determine the effective projection region, either manually or automatically:
The manual method selects, by hand, a rectangle that encloses the entire building region as the effective projection region.
The automatic method extracts the building region from the photo of step 1 by image processing, based on the brightness difference between the building region and the background, and takes the minimum bounding rectangle of that region as the effective projection region.
Step 4: determine the interaction scheme. The interaction technique is chosen according to the project requirements and site conditions; the candidate techniques include motion-sensing interaction, laser interaction, voice interaction, and mobile-terminal interaction.
Step 5: produce the interactive media with 3D effects:
Using the photo from step 1 as a template, produce materials with video production software to obtain continuous picture materials. When an effect tied to key-point positions is required, build the 3D special effects along the key-point positions on the building surface in the photo and the straight lines connecting them. Set up the game development engine to composite the picture materials, and reserve a port to receive the user action information sent back by the interaction devices; the picture materials are combined with the user actions to form the interactive media.
Step 6: install on site. Set up the projectors and adjust each projector's position and angle so that the projections fill the entire effective projection region; install the human-computer interaction devices on site and tune their parameters.
Step 7: perform multi-projector geometric correction with a manual or automatic projection-stitching method.
The manual stitching method works as follows: by dragging a grid by hand, align the key-point positions of the projection surface with the projected video, adjust straight lines in the image space so that they also appear straight from the viewer's viewpoint, and perform color correction and blending-region brightness attenuation.
The automatic stitching method is an automatic multi-projector stitching method based on structured-light scanning, with the following steps:
Step 7.1: set up a camera at the same position as in step 1, with the same camera parameters as in step 1, such that the camera can capture the projection surface.
Step 7.2: from the coordinates of the four corners of the effective projection region in the camera image coordinate system and the coordinates of the four corners of the region to be projected in the display image coordinate system, compute the correspondence H_D between the camera image space and the display image space using a homography matrix.
Step 7.3: for the building surface region covered by each projector, establish the correspondence M_Ci between that projector's image space and the camera image using coded structured-light scanning, where i = 1, 2, ..., N and N is the number of projectors.
Step 7.4: combine the correspondence H_D between camera image space and display image space with the correspondence M_Ci between each projector's image space and the camera image to obtain the correspondence between the display image space and each projector's image space.
Step 7.5: apply color correction and blending-region brightness attenuation to the image to be displayed.
Step 8: collect user interaction data with the human-computer interaction devices and send it to the interactive media in real time over the network; on receiving the information, the interactive media follows the program steps preset during media production, calling the corresponding materials and special effects in real time to complete the rendering of the interactive media.
Step 9: using the correspondence between the display image space and each projector's image space obtained in step 7, map each point of the display image space to its corresponding position in the projector image space; completing this mapping for all points of every projector completes the stitched projection rendering of the interactive media, and all projectors simultaneously play the interactive media obtained in step 8 over the network.
Beneficial effects of the invention:
Unlike traditional interaction that uses the ground, a wall, or a conventional tiled projection as the display surface, the present invention projects onto rough building surfaces and uses the concave and convex features of the surface to produce interactive media with a stereoscopic effect. At the same time, by using modern human-computer interaction techniques (such as motion-sensing interaction, laser interaction, voice interaction, or mobile-terminal interaction) to interact with media produced by the proposed method, it creates a striking visual effect that blends with the building and gives viewers a stereoscopic impression.
Detailed description of the invention
Fig. 1 is a schematic layout of a projection scheme in which the manual stitching and blending method of the invention performs an interactive architectural projection display.
In the figure: 101 - exterior building surface to be projected on; 102 - projector; 103 - blending server; 104 - rendering control server; 106 - motion-sensing recognition server; 107 - motion sensor; 108 - user participating in the interaction; 106, 107, and 108 together form the motion-sensing interaction part 105.
Specific embodiment
The specific implementation steps are as follows:
Step 1: determine the projection surface and the viewer position.
Determine the surface to be projected on and choose the viewer position according to the requirements. Photograph the projection surface with a camera from the viewer position (the photo must contain the entire building surface to be projected on), and record the position, the height and angle of the camera, and the focal length. If the camera lens shows obvious distortion, perform distortion correction on the camera in advance (typically with the camera calibration method proposed by Zhengyou Zhang: Zhang Z. A flexible new technique for camera calibration [J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2000, 22(11): 1330-1334).
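The distortion correction mentioned in step 1 can be sketched with the two-term radial model that underlies Zhang-style calibration. The function below is an illustrative pure-Python sketch, not the patent's implementation; the intrinsics (fx, fy, cx, cy) and radial coefficients (k1, k2) are assumed to come from a prior calibration.

```python
def undistort_point(x, y, k1, k2, cx, cy, fx, fy):
    """Map a distorted pixel toward its ideal position using the two-term
    radial model from Zhang-style calibration (illustrative sketch)."""
    # normalize to the camera plane
    xn = (x - cx) / fx
    yn = (y - cy) / fy
    # fixed-point iteration of the inverse radial model
    xd, yd = xn, yn
    for _ in range(10):
        r2 = xd * xd + yd * yd
        factor = 1.0 + k1 * r2 + k2 * r2 * r2
        xd = xn / factor
        yd = yn / factor
    # back to pixel coordinates
    return xd * fx + cx, yd * fy + cy
```

With k1 = k2 = 0 the mapping is the identity, as expected for a distortion-free lens.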
Step 2: determine the projection scheme.
Determine the projector type and model, the projector lens model, the number of projectors, and the projector mounting scheme according to the material of the building projection surface, the on-site lighting conditions, and other factors such as the project budget.
Step 3: determine the effective projection region.
In general the effective projection region is the whole building region in the photo, and it can be determined either manually or automatically. The automatic method extracts the building region by image processing (for example image binarization) based on the brightness difference between the building region and the background, and takes the minimum bounding rectangle of that region as the effective projection region. The manual method selects, by hand, a rectangle enclosing the whole building region as the effective projection region; usually the minimum bounding rectangle of the building region is chosen.
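A minimal sketch of the automatic variant of step 3: threshold the photo and take the bounding rectangle of the bright pixels. This is a deliberate simplification — the image is represented as a plain list of grayscale rows, and the rectangle is axis-aligned rather than a true minimum circumscribed (possibly rotated) rectangle.

```python
def effective_projection_region(image, threshold):
    """Binarize a grayscale image (list of rows of pixel values) and return
    the bounding rectangle (x0, y0, x1, y1) of the bright building region."""
    xs, ys = [], []
    for y, row in enumerate(image):
        for x, v in enumerate(row):
            if v >= threshold:  # pixel belongs to the building region
                xs.append(x)
                ys.append(y)
    if not xs:
        return None  # nothing brighter than the background
    return min(xs), min(ys), max(xs), max(ys)
```

For example, a 4x4 image with a bright L-shaped patch yields the rectangle enclosing that patch.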
Step 4: determine the interaction scheme.
The interaction technique is chosen according to the project requirements and the site conditions (such as lighting and space). The human-computer interaction techniques generally applicable to building-projection interaction include motion-sensing interaction, laser interaction, voice interaction, and mobile-terminal interaction.
Motion-sensing interaction: a motion-sensing input device (such as the Microsoft Kinect camera, the ASUS Xtion camera, or the PhaseSpace optical motion-capture system) captures the user's position; computer-vision algorithms recognize the actions the user performs, and the recognized actions are transmitted to the rendering server to control changes in the interactive output.
Laser interaction: the user points a laser emitter (such as a laser pointer) at a specific position on the building surface; a video camera captures the position of the laser spot, computer-vision blob detection and tracking algorithms determine the position indicated by the laser device, and that action is transmitted to the rendering server to control changes in the interactive media.
Voice interaction: a sound-capture device records the user's speech content or the ambient sound level; after processing, the result is transmitted to the rendering server to control changes in the interactive media.
Mobile-terminal interaction: a mobile input device (such as a phone or tablet) is used; the interaction server receives the information the user sends (such as SMS, Weibo, or WeChat messages), and after processing, the rendering server changes the media effects accordingly to achieve interaction.
Step 5: produce the interactive media with 3D effects.
The photo obtained in step 1 serves as the template of the interactive media. The specific steps are:
Step 5.1, material production: using the photo as the production template, produce materials directly with video production software (such as After Effects) to obtain continuous picture materials. (A key point is an intersection line formed by the discontinuous faces of the building projection surface or an intersection point of such lines, for example the corner of a building window.) If the effect to be shown is unrelated to key-point positions, the building surface is simply treated as a flat screen when producing the material; if the effect depends on key-point positions, special effects must be built along the key-point positions on the building surface in the photo and the straight lines connecting them, achieving multimedia video production with a 3D effect.
Step 5.2, interactive media composition: composite the above picture materials with a game development engine (such as Flash or Unity3D), and add a network interface (such as TCP/IP) for receiving the user interaction information supplied by the chosen interaction technique; the picture materials are combined with the user actions to form the interactive media.
Step 6: install the projectors and the human-computer interaction devices on site.
Set up the projectors and adjust each projector's position and angle so that the projections fill the entire effective projection region; install the human-computer interaction devices on site and tune their parameters (such as sensitivity).
Step 7: multi-projector stitching and blending.
Complete the building-surface projection stitching with the manual or automatic projection-stitching method, so that the multi-projector blending regions are aligned and the projected video is aligned with the key-point positions of the building surface. In general, projections onto buildings with simple surface shapes are adjusted by manual stitching and blending; for surfaces whose shape is so complex that manual stitching is impractical, the automatic stitching method based on structured-light scanning performs the multi-projector geometric correction.
The manual building-surface stitching method is similar to existing manual stitching and blending for planes and curved surfaces: by dragging a grid by hand, align the key-point positions of the projection surface with the projected video, adjust straight lines in the image space so that they also appear straight from the viewer's viewpoint, and perform color correction and blending-region brightness attenuation.
The automatic building-surface stitching method performs multi-projector geometric correction with an automatic projection-stitching method based on structured-light scanning. The specific steps are:
Step 7.1: set up a camera at the same position as in step 1, with the angle, focal length, and other parameters identical to the data recorded in step 1, such that the camera can capture the projection surface.
Step 7.2: from the positions of the four corners of the effective projection region in the camera image coordinate system and the positions of the four corners of the region to be projected in the display image coordinate system, compute the correspondence H_D between the camera image space and the display image space using a homography matrix.
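The homography of step 7.2 is determined by the four corner correspondences. Below is a self-contained sketch of the standard four-point solve (with h33 fixed to 1, reducing the problem to an 8x8 linear system); the patent does not specify a solver, so the Gaussian elimination here is only illustrative.

```python
def homography_from_points(src, dst):
    """Solve for the 3x3 homography H mapping 4 source points to 4
    destination points, with H[2][2] fixed to 1 (8 unknowns)."""
    A, b = [], []
    for (x, y), (u, v) in zip(src, dst):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y]); b.append(u)
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y]); b.append(v)
    h = solve(A, b)
    return [[h[0], h[1], h[2]], [h[3], h[4], h[5]], [h[6], h[7], 1.0]]

def solve(A, b):
    """Gaussian elimination with partial pivoting for a small dense system."""
    n = len(b)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def apply_homography(H, x, y):
    """Apply H to a point in homogeneous coordinates and dehomogenize."""
    w = H[2][0] * x + H[2][1] * y + H[2][2]
    return ((H[0][0] * x + H[0][1] * y + H[0][2]) / w,
            (H[1][0] * x + H[1][1] * y + H[1][2]) / w)
```

Mapping the unit square onto a square of side 2, for instance, recovers a pure scaling, and the center (0.5, 0.5) maps to (1, 1).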
Step 7.3: for the building surface region covered by each projector, establish the correspondence M_Ci between the projector's image space and the camera image using coded structured-light scanning, where i = 1, 2, ..., N and N is the number of projectors. For example, using a Gray-code plus phase-shifting structured-light scan, the correspondence M_Ci between the i-th projector's image space and the camera image space is obtained as follows:
Step 7.3.1, obtain the valid coding region: the projector projects one all-white and one all-black image in turn; image differencing and binarization yield the valid coding region. That is, the pixel-wise difference of the two photos is taken and the difference image is binarized: the white part is the valid coding region and the black part is the invalid region.
Step 7.3.2: apply vertical coding and horizontal coding to the effective projection region of the i-th projector.
Step 7.3.3: remove coding errors using methods such as the random sample consensus (RANSAC) algorithm.
Step 7.3.4: divide the camera image space into an m x n grid, find the corresponding points of the grid intersections in the projector image space through the above coding, and compute the corresponding positions of the points inside each grid cell in the projector image space by bilinear interpolation.
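Two ingredients of steps 7.3.2 and 7.3.4 fit in a few lines: binary-reflected Gray-code stripe indices (used because adjacent stripe codes differ in exactly one bit, making decoding robust to boundary errors) and bilinear interpolation of projector-space coordinates inside a camera-grid cell. This is a sketch of the standard techniques, not the patent's code.

```python
def gray_encode(n):
    """Binary-reflected Gray code of a stripe index: neighbours differ in
    exactly one bit, which localizes single-bit decoding errors."""
    return n ^ (n >> 1)

def gray_decode(g):
    """Recover the stripe index from its Gray code."""
    n = 0
    while g:
        n ^= g
        g >>= 1
    return n

def bilinear(p00, p10, p01, p11, fx, fy):
    """Bilinearly interpolate the projector-space point corresponding to a
    camera pixel at fractional position (fx, fy) inside a grid cell whose
    corners map to p00 (top-left), p10, p01, p11."""
    top = (p00[0] + (p10[0] - p00[0]) * fx, p00[1] + (p10[1] - p00[1]) * fx)
    bot = (p01[0] + (p11[0] - p01[0]) * fx, p01[1] + (p11[1] - p01[1]) * fx)
    return (top[0] + (bot[0] - top[0]) * fy, top[1] + (bot[1] - top[1]) * fy)
```

Decoding inverts encoding exactly, and the center of a cell interpolates to the average of its corners.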
Step 7.4: combine the correspondence H_D between camera image space and display image space with the correspondence M_Ci between each projector's image space and the camera image to obtain the point-wise correspondence between the display image space and each projector's image space, i.e. the position that each pixel of the image to be displayed should occupy in each projector.
Step 7.5: perform color correction and blending-region brightness attenuation.
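The brightness attenuation of step 7.5 is commonly a ramp from 0 to 1 across the overlap between neighbouring projectors, raised to 1/gamma so that the summed perceived brightness stays constant. The function below is a hedged sketch under that common convention; the `overlap` and `gamma` parameters and the purely horizontal overlap are assumptions, not details given in the patent.

```python
def blend_weight(x, width, overlap, gamma=2.2):
    """Brightness weight for a pixel at horizontal position x in a projector
    image of the given width, with `overlap` pixels shared with each
    neighbouring projector (gamma-compensated linear ramp)."""
    if x < overlap:              # left blending region: ramp up
        t = x / overlap
    elif x >= width - overlap:   # right blending region: ramp down
        t = (width - 1 - x) / overlap
    else:                        # interior: full brightness
        t = 1.0
    t = max(0.0, min(1.0, t))
    return t ** (1.0 / gamma)    # compensate for display gamma
```

Interior pixels keep full brightness, edge pixels fall to zero, and pixels inside the overlap get an intermediate weight.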
Step 8: render the interactive media.
The human-computer interaction devices acquire the raw user interaction data (for example the depth images in motion-sensing interaction, the visible-light images in laser interaction, the sound in voice interaction, and the data sent by terminals in mobile-terminal interaction). According to the data type, the user interaction data is obtained with the processing methods described in step 5 and sent to the interactive media in real time over the TCP/IP network protocol. On receiving the information, the interactive media follows the program steps preset for the corresponding media during production, calling the corresponding materials and special effects in real time, thereby completing the interactive media rendering and generating the intended projected image in real time.
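Step 8 only specifies that interaction data travels over TCP/IP; the length-prefixed JSON framing below is a hypothetical wire format chosen for illustration, not the patent's protocol.

```python
import json
import socket

def send_interaction(sock, event):
    """Serialize one interaction event as length-prefixed JSON and send it
    (hypothetical wire format; the patent only specifies TCP/IP)."""
    payload = json.dumps(event).encode("utf-8")
    sock.sendall(len(payload).to_bytes(4, "big") + payload)

def _recv_exact(sock, n):
    """Read exactly n bytes, since TCP may deliver them in pieces."""
    data = b""
    while len(data) < n:
        chunk = sock.recv(n - len(data))
        if not chunk:
            raise ConnectionError("socket closed")
        data += chunk
    return data

def recv_interaction(sock):
    """Receive one length-prefixed JSON interaction event."""
    length = int.from_bytes(_recv_exact(sock, 4), "big")
    return json.loads(_recv_exact(sock, length).decode("utf-8"))
```

A loopback socket pair demonstrates the round trip: an event sent on one end is recovered intact on the other.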
Step 9: the blending server reads the correction data generated by the multi-projector geometric correction of step 7. According to the correspondence between the display image space and each projector's image space obtained in step 7, each point of the display image space is mapped to its corresponding position in the projector image space; completing this mapping for all points in every projector completes the stitched projection rendering of the interactive media. At the same time, the rendering control server coordinates all blending servers over the TCP/IP protocol so that every projector simultaneously plays the interactive media obtained in step 8.
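The composed mapping of step 9 — display pixel to camera pixel via the inverse of H_D, then camera pixel to projector pixel via the structured-light correspondence M_Ci — can be sketched as follows. The dict-based lookup table and nearest-pixel rounding are assumptions made for illustration.

```python
def invert_3x3(H):
    """Invert a 3x3 homography via the transposed cofactor (adjugate) matrix."""
    a, b, c = H[0]; d, e, f = H[1]; g, h, i = H[2]
    det = a * (e * i - f * h) - b * (d * i - f * g) + c * (d * h - e * g)
    adj = [[ (e * i - f * h), -(b * i - c * h),  (b * f - c * e)],
           [-(d * i - f * g),  (a * i - c * g), -(a * f - c * d)],
           [ (d * h - e * g), -(a * h - b * g),  (a * e - b * d)]]
    return [[adj[r][k] / det for k in range(3)] for r in range(3)]

def display_to_projector(Hd_inv, Mci, u, v):
    """Map a display pixel (u, v) to projector space: display -> camera via
    the inverse of H_D, then camera -> projector via the structured-light
    lookup table M_Ci (here a dict keyed by integer camera pixels)."""
    w = Hd_inv[2][0] * u + Hd_inv[2][1] * v + Hd_inv[2][2]
    cx = (Hd_inv[0][0] * u + Hd_inv[0][1] * v + Hd_inv[0][2]) / w
    cy = (Hd_inv[1][0] * u + Hd_inv[1][1] * v + Hd_inv[1][2]) / w
    return Mci.get((round(cx), round(cy)))  # None if the pixel was not scanned
```

With an identity H_D, a display pixel looks up its projector coordinates directly in the table.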
In conclusion, the above are merely preferred embodiments of the present invention and are not intended to limit its protection scope. Any modification, equivalent replacement, or improvement made within the spirit and principles of the present invention shall be included within the protection scope of the present invention.

Claims (2)

1. An outdoor large-format interactive projection method, characterized in that: the exterior surface of a building serves as the projection plane and multiple projectors serve as the display device; the projected images are stitched by a manual or automatic projection-stitching method; the content of each projector is aligned with the key points of the building surface, so that the multi-projector tiled display achieves a 3D effect by exploiting those key points, a key point being an intersection line formed by the discontinuous faces of the building's outer surface or an intersection point of such lines; and a game development engine and human-computer interaction devices are provided, wherein the interaction devices collect the user's actions and the engine exposes a port that receives the action information sent back by the interaction devices, the user's actions being combined with the projected images to form interactive media.
2. The outdoor large-format interactive projection method according to claim 1, characterized in that it comprises the following steps:
Step 1: determine the projection surface and the viewer position, and photograph the projection surface from the viewer position with a camera.
Step 2: determine the projector parameters, the number of projectors, and the projector placement according to the material of the projection surface, the on-site lighting conditions, and the project budget.
Step 3: determine the effective projection region, either manually or automatically:
the manual method selects, by hand, a rectangle that encloses the entire building region as the effective projection region;
the automatic method extracts the building region from the photo of step 1 by image processing, based on the brightness difference between the building region and the background, and takes the minimum bounding rectangle of that region as the effective projection region.
Step 4: determine the interaction scheme. The interaction technique is chosen according to the project requirements and site conditions; the candidate techniques include motion-sensing interaction, laser interaction, voice interaction, and mobile-terminal interaction.
Step 5: produce the interactive media with 3D effects. Using the photo from step 1 as a template, produce materials with video production software to obtain continuous picture materials; when an effect tied to key-point positions is required, build 3D special effects along the key-point positions on the building surface in the photo and the straight lines connecting them; set up the game development engine to composite the picture materials and reserve a port to receive the user action information sent back by the interaction devices, the picture materials being combined with the user actions to form the interactive media.
Step 6: install on site. Set up the projectors and adjust each projector's position and angle so that the projections fill the entire effective projection region; install the human-computer interaction devices on site and tune their parameters.
Step 7: perform multi-projector geometric correction with a manual or automatic projection-stitching method.
The manual stitching method works as follows: by dragging a grid by hand, align the key-point positions of the projection surface with the projected video, adjust straight lines in the image space so that they also appear straight from the viewer's viewpoint, and perform color correction and blending-region brightness attenuation.
The automatic stitching method is an automatic multi-projector stitching method based on structured-light scanning, with the following steps:
Step 7.1: set up a camera at the same position as in step 1, with the same camera parameters as in step 1, such that the camera can capture the projection surface.
Step 7.2: from the coordinates of the four corners of the effective projection region in the camera image coordinate system and the coordinates of the four corners of the region to be projected in the display image coordinate system, compute the correspondence H_D between the camera image space and the display image space using a homography matrix.
Step 7.3: for the building surface region covered by each projector, establish the correspondence M_Ci between that projector's image space and the camera image using coded structured-light scanning, where i = 1, 2, ..., N and N is the number of projectors.
Step 7.4: combine the correspondence H_D between camera image space and display image space with the correspondence M_Ci between each projector's image space and the camera image to obtain the correspondence between the display image space and each projector's image space.
Step 7.5: apply color correction and blending-region brightness attenuation to the image to be displayed.
Step 8: collect user interaction data with the human-computer interaction devices and send it to the interactive media in real time over the network; on receiving the information, the interactive media follows the program steps preset during media production, calling the corresponding materials and special effects in real time to complete the rendering of the interactive media.
Step 9: using the correspondence between the display image space and each projector's image space obtained in step 7, map each point of the display image space to its corresponding position in the projector image space; completing this mapping for all points of every projector completes the stitched projection rendering of the interactive media, and all projectors simultaneously play the interactive media obtained in step 8 over the network.
CN201811578472.XA 2018-12-23 2018-12-23 Outdoor large-format interactive projection method Pending CN109587464A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811578472.XA CN109587464A (en) 2018-12-23 2018-12-23 Outdoor large-format interactive projection method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201811578472.XA CN109587464A (en) 2018-12-23 2018-12-23 Outdoor large-format interactive projection method

Publications (1)

Publication Number Publication Date
CN109587464A (en) 2019-04-05

Family

ID=65930699

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811578472.XA Pending CN109587464A (en) 2018-12-23 2018-12-23 Outdoor large-format interactive projection method

Country Status (1)

Country Link
CN (1) CN109587464A (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110469155A (en) * 2019-08-23 2019-11-19 湖南融视文化创意有限公司 A kind of KTV box with the huge curtain of panorama
CN111182282A (en) * 2019-12-30 2020-05-19 成都极米科技股份有限公司 Method and device for detecting projection focusing area and projector
CN115648847A (en) * 2022-10-26 2023-01-31 东莞市皓龙激光科技有限公司 Laser decoration method, system and storage medium

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103500438A (en) * 2013-10-21 2014-01-08 北京理工大学 Interactive building surface projection method

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103500438A (en) * 2013-10-21 2014-01-08 北京理工大学 Interactive building surface projection method

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110469155A (en) * 2019-08-23 2019-11-19 湖南融视文化创意有限公司 A kind of KTV box with the huge curtain of panorama
CN111182282A (en) * 2019-12-30 2020-05-19 成都极米科技股份有限公司 Method and device for detecting projection focusing area and projector
CN111182282B (en) * 2019-12-30 2022-03-29 成都极米科技股份有限公司 Method and device for detecting projection focusing area and projector
CN115648847A (en) * 2022-10-26 2023-01-31 东莞市皓龙激光科技有限公司 Laser decoration method, system and storage medium

Similar Documents

Publication Publication Date Title
CN103500438A (en) Interactive building surface projection method
KR101566543B1 (en) Method and system for mutual interaction using space information argumentation
CN109587464A (en) A kind of open air large format interaction method
US11488348B1 (en) Computing virtual screen imagery based on a stage environment, camera position, and/or camera settings
WO2010001756A1 (en) Portable type game device and method for controlling portable type game device
CN103533318A (en) Building outer surface projection method
CN111770326B (en) Indoor three-dimensional monitoring method for panoramic video projection
JP2015506030A (en) System for shooting video movies
US11176716B2 (en) Multi-source image data synchronization
TW201943259A (en) Window system based on video communication
CN110324554A (en) Video communication device and method
US11615755B1 (en) Increasing resolution and luminance of a display
CN115118880A (en) XR virtual shooting system based on immersive video terminal is built
CN114845147B (en) Screen rendering method, display screen synthesizing method and device and intelligent terminal
CN101807311A (en) Making method of digital spherical screen stereoscopic film
CN109996048A (en) A kind of projection correction's method and its system based on structure light
CN104883561A (en) Three-dimensional panoramic display method and head-mounted display device
CN104759094A (en) Multi-person free shooting identifying system facing to 7D shooting cinema and multi-person free shooting identifying method
Zhou et al. Light field projection for lighting reproduction
CN103533278B (en) A kind of large format Free Surface many projections method for automatically split-jointing
KR101398252B1 (en) Method for Recognizing Reality and Reducing Observational Error Using infrared camera
TWI515691B (en) Composition video producing method by reconstruction the dynamic situation of the capture spot
JP3956543B2 (en) Wide viewing angle display device with automatic correction mechanism
CN213426345U (en) Digital sand table interactive item exhibition device based on oblique photography
CN113507599B (en) Education cloud service platform based on big data analysis

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20190405