CN109961520A - VR/MR classroom based on third-person-view technology and construction method thereof - Google Patents

VR/MR classroom based on third-person-view technology and construction method thereof

Info

Publication number
CN109961520A
CN109961520A (application CN201910086586.0A)
Authority
CN
China
Prior art keywords
classroom
space
visual angle
equipment
experience
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201910086586.0A
Other languages
Chinese (zh)
Other versions
CN109961520B (en)
Inventor
蔡铁峰
陈锐浩
王瑛
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Polytechnic
Original Assignee
Shenzhen Polytechnic
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Polytechnic filed Critical Shenzhen Polytechnic
Priority to CN201910086586.0A
Publication of CN109961520A
Application granted
Publication of CN109961520B
Legal status: Active
Anticipated expiration


Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00: Manipulating 3D models or images for computer graphics
    • G06T19/006: Mixed reality
    • G: PHYSICS
    • G09: EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B: EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B5/00: Electrically-operated educational appliances
    • G09B5/02: Electrically-operated educational appliances with visual presentation of the material to be studied, e.g. using film strip
    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02: TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02D: CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D10/00: Energy efficient computing, e.g. low power processors, power management or thermal management

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Business, Economics & Management (AREA)
  • Educational Administration (AREA)
  • Educational Technology (AREA)
  • Computer Graphics (AREA)
  • Computer Hardware Design (AREA)
  • General Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The invention provides a VR/MR classroom based on third-person-view technology and a construction method thereof, belonging to the technical field of virtual reality. The VR/MR classroom comprises VR/MR experience equipment, third-person-view capture equipment, third-person-view display equipment, a console, and a server; a construction method is also provided. With the VR/MR classroom constructed by the invention, a small number of students carry out the VR/MR experience of the course content while the teacher controls the teaching progress with the console, and the other students achieve good observational learning by watching and imitating the experiencer's process. Students can freely choose to enter the VR third-person view, the MR third-person view, or the VR/MR hybrid third-person view and take part in the interaction, so that knowledge is grasped accurately and firmly, which helps improve teaching effectiveness.

Description

VR/MR classroom based on third-person-view technology and construction method thereof
Technical field
The invention belongs to the technical field of virtual reality, and in particular relates to a VR/MR classroom based on third-person-view technology and a construction method thereof.
Background
A VR/MR classroom for teaching can be built with VR and MR technology; presenting the course content in VR/MR form can effectively improve teaching effectiveness. The term "VR/MR" here means that the classroom is compatible with both the VR mode and the MR mode. VR can construct a fully virtual teaching environment, which favors a highly immersive experience of the course content, whereas MR constructs a teaching environment that blends the virtual and the real; this may reduce the immersion of the course-content experience, but it better prevents students from colliding with one another in the classroom.
Observational learning, in which students watch and imitate other people's practice on the spot in order to grasp knowledge accurately and firmly, is an important teaching method. Third-person-view technology images the space the experiencer has entered (for example the VR space or the MR space) from a viewpoint different from the experiencer's own; because the image from this viewpoint does not shake with the experiencer's movements, the people nearby can easily watch and imitate the experiencer's interaction with the virtual objects. Third-person-view technology for the VR space often further composites the experiencer's live image into the virtual image.
Summary of the invention
The invention provides a VR/MR classroom based on third-person-view technology and a construction method thereof, so that students, by watching and imitating other people's practice on the spot, grasp knowledge accurately and firmly, which helps improve teaching effectiveness.
The working principle of the invention is as follows:
The invention establishes a unified coordinate system based on the real classroom space and, in this coordinate system, constructs a VR space and an MR space at the same time; the two spaces are parallel to the real classroom space and coincide with it. The VR space is used by students for the VR experience of the course content, while the MR space is mainly used for real-time interaction between the teacher or other students and the experiencer. The VR space can also be torn open, so that the MR space combines with part of the VR space to form a hybrid space that is more convenient for teaching: under the unified classroom coordinate system, selected regions of the VR space are replaced as needed by the MR regions at the same positions, fusing the VR experience and the MR experience into the same space. The course content is reflected in all three spaces simultaneously, and the pose parameters of any virtual object embodying the course content are identical in the real classroom and in the three constructed spaces. Because the unified coordinate system is built on the real classroom space, arranging virtual objects under this coordinate system amounts to arranging virtual objects in the real classroom, intuitively and conveniently.
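To make the shared-pose idea concrete, the sketch below is an illustrative data model, not the patent's implementation; the class and field names are assumptions. It registers each virtual object in the VR, MR, and hybrid spaces at once, so a pose set or changed through any one space is, by construction, the pose seen in the other two.

```python
# Minimal sketch (assumed data model): one classroom-wide coordinate system,
# three parallel spaces (VR, MR, hybrid), and virtual objects whose poses are
# kept identical in all three spaces.
from dataclasses import dataclass

@dataclass
class Pose:
    position: tuple   # (x, y, z) in the unified classroom coordinate system, metres
    attitude: tuple   # (roll, pitch, yaw) in radians

@dataclass
class VirtualObject:
    name: str
    pose: Pose

class Classroom:
    def __init__(self):
        # The same object instances are registered in every space, so a pose
        # update made through any space is automatically seen by the others.
        self.spaces = {"VR": {}, "MR": {}, "hybrid": {}}

    def place(self, obj: VirtualObject):
        for space in self.spaces.values():
            space[obj.name] = obj

    def move(self, name: str, new_pose: Pose):
        # Updating via one space updates the shared object, i.e. all spaces.
        self.spaces["VR"][name].pose = new_pose

classroom = Classroom()
classroom.place(VirtualObject("engine_model", Pose((1.5, 2.0, 0.8), (0.0, 0.0, 0.0))))
classroom.move("engine_model", Pose((1.5, 2.5, 0.8), (0.0, 0.0, 1.57)))
```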
The invention provides two student roles, experiencers and observers: only a small number of students act as experiencers and carry out the VR/MR experience of the course content in the classroom experience area, while most students follow the course experience from their seats through the third-person view.
The invention provides third-person-view imaging for the VR space, the MR space, and the VR/MR hybrid space. To generate a third-person-view image, a virtual camera is paired with each real camera: the real camera captures the image of the real world, and the virtual camera, whose pose and field of view are identical to those of the real camera, captures the image of the virtual world or of the virtual objects; the real image and the virtual image are then composited in a defined way into the third-person-view image.
The invention provides diversified interaction-input and display channels to make teaching as convenient as possible.
The invention provides several interaction-input channels, mainly the following four: VR-device interaction, MR-glasses interaction, touch-screen input on handheld mobile devices (including mobile phones and tablets), and console operation. VR-device interaction and MR-glasses interaction are mainly used for the VR/MR experience of the course; the teacher operates the console to place virtual objects at any specified position in the classroom and further to control the VR/MR teaching process of the course; the observing students interact in simple ways with the VR/MR experiencer through handheld mobile devices. In any one of the VR space, the MR space, and the VR/MR hybrid space, an interactive operation on any virtual object embodying the course content is synchronously reflected in the other two spaces.
The invention provides several display channels, mainly the following five: VR head-mounted displays, MR glasses, large display screens/projectors, handheld mobile devices, and the console screen. Each display channel can freely choose to display the VR space, the MR space, or the VR/MR hybrid space. The large display screens/projectors, the handheld mobile devices, and the console screen are used to display third-person-view pictures. Moreover, the multiple cameras in the classroom provide third-person-view images from multiple camera positions, and the teacher can switch camera positions to adjust the third-person-view image shown on the large display screen/projector according to how the VR/MR experience of the course content is going; a teacher or student can change the viewpoint of the third-person-view image on a handheld mobile device by adjusting the device's pose, and can further cast the screen wirelessly to the large display screen or projector as needed.
The technical solution of the invention is as follows:
A VR/MR classroom based on third-person-view technology comprises a system and a classroom. The system comprises VR/MR experience equipment, third-person-view capture equipment, third-person-view display equipment, a console, and a server; the VR/MR experience equipment, the third-person-view capture equipment, the third-person-view display equipment, and the console are all arranged in the classroom, wherein:
the VR/MR experience equipment localizes itself, acquires its interaction input and sends it to the server, and renders and displays the VR or MR picture according to the state parameters of each virtual object sent by the server;
the third-person-view capture equipment includes multiple cameras and handheld mobile devices; the cameras capture images from different viewpoints and send them to the server, which generates third-person-view images for those viewpoints; a handheld mobile device captures images with its own camera and, using the virtual-object information sent by the server, generates the third-person-view image on the device itself and displays it directly on its built-in screen;
the third-person-view display equipment includes at least one large display screen/projector in the classroom, the console screen, and the screens of the handheld mobile devices, wherein the large display screen/projector and the console screen receive the third-person-view images sent by the server;
the console is used by the teacher to intuitively set the poses of virtual objects in the classroom, control scene switching, the appearance of virtual objects, and the triggering of event scripts, and on this basis control the VR/MR teaching process;
the server receives the interaction input and localization information, computes the state information of the virtual objects according to the course script, and sends the computed virtual-object state information to the handheld mobile devices and the VR/MR experience equipment; it also collects the images sent by the cameras, generates third-person-view images from them, and sends these to the large display screen/projector and the console.
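As a rough illustration of the server's role described above, the sketch below loops over input collection, script-driven state update, state distribution, and third-person-view compositing for each fixed camera. The method and attribute names (poll_input, course_script.step, virtual_twin, compositor, and so on) are assumed placeholders, not interfaces defined by the patent.

```python
# Illustrative server main loop under assumed interfaces.
import time

def server_loop(course_script, cameras, experience_devices, handhelds, displays):
    while True:
        # 1. Collect interaction input and localization data from all devices.
        inputs = [d.poll_input() for d in experience_devices + handhelds]
        poses  = [d.poll_pose()  for d in experience_devices + handhelds]

        # 2. Update virtual-object states (pose, appearance, script events).
        object_states = course_script.step(inputs, poses)

        # 3. Push states to the rendering clients (HMDs, MR glasses, handhelds).
        for device in experience_devices + handhelds:
            device.send_states(object_states)

        # 4. Build the third-person-view image for each fixed camera and send it
        #    to the large screen/projector and the console.
        for cam in cameras:
            real = cam.grab_frame()
            virtual = cam.virtual_twin.render(object_states)   # paired virtual camera
            frame = cam.compositor.blend(real, virtual)
            for display in displays:
                display.show(cam.id, frame)

        time.sleep(1 / 30)   # target roughly 30 frames per second
```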
As a further improvement of the invention, the classroom includes a large-screen display/projection area, a teacher's podium, a VR/MR experience area, and a student seating area, wherein:
the large-screen display/projection area is used to deploy the large display screen/projector;
the teacher's podium is used to deploy the console;
the VR/MR experience area is used to deploy the VR/MR experience equipment, where the teacher and/or students carry out the VR/MR experience;
the student seating area is where students carry out third-person-view observational learning;
the teacher or the students in the VR/MR experience area carry out the VR/MR experience; the students in the seating area follow the experiencer's course experience by watching the images on the third-person-view display equipment, and can join in simple interaction from their seats using VR/MR experience equipment or handheld mobile devices; the teacher can control the VR/MR teaching process with the console.
As a further improvement of the invention, the multiple cameras in the classroom provide third-person-view images from multiple camera positions; the teacher can switch camera positions to adjust the third-person-view image shown on the large display screen/projector according to the VR/MR experience of the course content; a teacher or student can change the viewpoint of the third-person-view image on a handheld mobile device by adjusting the device's pose, and can further cast the screen wirelessly to the large display screen or projector as needed.
A construction method for the VR/MR classroom based on third-person-view technology comprises the following steps:
S1: establish the unified classroom coordinate system; this coordinate system is built on the real classroom and simultaneously covers the VR space, the MR space, and the VR/MR hybrid space;
S2: read in the VR/MR resources of the course and arrange them under the unified classroom coordinate system; since the coordinate system is built on the real classroom, this step effectively arranges the virtual objects in the real classroom;
S3: localize the equipment in the classroom, mainly the VR/MR experience equipment and the handheld mobile devices, whose localization is provided by their built-in positioning hardware or software; the positions of the fixed equipment in the classroom can be surveyed in advance;
S4: pair a virtual camera with each real camera, setting the virtual camera's pose parameters and field of view to the corresponding values of the paired real camera;
S5: the server computes, in real time, the state information of each virtual object (including its pose parameters) from the interaction input and localization information, and sends the data to the VR/MR experience equipment and the handheld mobile devices;
S6: the real cameras capture images of the real world, the virtual cameras capture images of the virtual world or of the virtual objects, and distortion correction is applied to the images captured by the real cameras;
S7: for each third-person-view capture device, the classroom system generates a third-person-view mask image according to the selected view (VR, MR, or VR/MR hybrid third-person view) and the camera imaging content; the mask image determines exactly how the real image and the virtual image are composited into the third-person-view image;
S8: for each third-person-view capture device, the system composites the third-person-view image from the real image, the virtual image, and the mask image, and displays it on the corresponding third-person-view display equipment.
As a further improvement of the invention, in S1 the classroom simultaneously constructs the VR space, the MR space, and the VR/MR hybrid space; the virtual objects corresponding to the course content appear simultaneously in all three spaces, with identical positions relative to the real classroom, and all three spaces can be imaged from the third-person view.
As a further improvement of the invention, the interaction input modes of the VR/MR classroom include, but are not limited to: VR-device interaction, MR-glasses interaction, touch-screen input on handheld mobile devices (including mobile phones and tablets), and console operation; the display output channels include, but are not limited to: VR head-mounted displays, MR glasses, handheld mobile devices (including mobile phones and tablets), large display screens/projectors, and the console screen.
As a further improvement of the invention, the MR and VR experience equipment, the handheld mobile devices, and the large display screen/projector can all freely choose to display the VR space, the MR space, or the VR/MR hybrid space.
As a further improvement of the invention, the VR/MR hybrid space is built on the VR space and the MR space constructed in the classroom: under the unified classroom coordinate system, selected regions of the VR space are replaced as needed by the MR regions at the same positions, fusing the VR experience and the MR experience into the same space.
As a further improvement of the invention, a unified coordinate system covering the real classroom, the VR space, the MR space, and the VR/MR hybrid space is established; the pose parameters of any virtual object embodying the course content are identical in the real classroom and in the three constructed spaces.
As a further improvement of the invention, in any one of the VR space, the MR space, and the VR/MR hybrid space, an operation on any virtual object embodying the course content is synchronously reflected in the other two spaces.
Compared with the prior art, the beneficial effects of the invention are:
(1) With the VR/MR classroom constructed by the invention, the teacher can control the teaching progress with the console, and the other students achieve good observational learning by watching and imitating the experiencer's interaction.
(2) Students can freely choose to enter the VR third-person view, the MR third-person view, or the VR/MR hybrid third-person view, and take part in the interaction. Content that depends on a virtual scene is better experienced in VR, content that does not is better experienced in MR, and VR and MR can also be experienced at the same time with interaction between the two. Knowledge is thereby grasped accurately and firmly, which helps improve teaching effectiveness.
Brief description of the drawings
Fig. 1 is a schematic diagram of the system composition of a VR/MR classroom based on third-person-view technology.
Fig. 2 is a schematic diagram of the classroom functional areas of a VR/MR classroom based on third-person-view technology.
Fig. 3 is a schematic diagram of a VR/MR hybrid third-person view according to the invention.
Fig. 4 is a flow chart of the construction method of a VR/MR classroom based on third-person-view technology.
Fig. 5 is a photograph of a VR/MR classroom based on third-person-view technology according to the invention.
Description of reference numerals: 1: server; 2: large display screen or projector; 3: console; 31: controller; 32: console screen; 4: VR/MR experience equipment; 41: first rendering and processing unit; 42: VR/MR display unit; 43: interaction input unit; 44: positioning unit; 5: camera; 6: handheld mobile device; 61: second rendering and processing unit; 62: flat display unit; 63: mobile camera; 10: large-screen display/projection area; 20: teacher's podium; 30: VR/MR experience area; 40: student seating area.
Specific embodiments
To make the technical solutions in the embodiments of the present application clearer, the invention is further described below with reference to the accompanying drawings and specific embodiments.
Embodiment 1: system composition
As shown in Fig. 1, the system of the invention mainly consists of VR/MR experience equipment 4, third-person-view capture equipment (including mobile phones or tablets, which can also cast their screens wirelessly to the large display screen/projector), third-person-view display equipment, a console 3, and a server 1.
(1) The VR/MR experience equipment 4 provides localization, stereoscopic display, and interaction (for example gesture recognition or handheld controllers). Such equipment can be an all-in-one device or a kit of several components. For example, Microsoft's HoloLens MR glasses are an all-in-one device, whereas the HTC VIVE is a kit comprising a headset, trackers, controllers, etc., and additionally requires a host computer.
Specifically, as shown in Fig. 1, the VR/MR experience equipment includes a positioning unit 44, an interaction input unit 43, a first rendering and processing unit 41, and a VR/MR display unit 42. The interaction input unit 43 accepts input via at least one of gesture recognition, keyboard, mouse, or handheld controller. The state parameters of each virtual object sent by the server 1 are first rendered and processed by the first rendering and processing unit 41 and then passed to the VR/MR display unit 42 for VR/MR display.
(2) The third-person-view capture equipment includes multiple cameras 5 and at least one handheld mobile device 6 with a camera (for example a mobile phone or tablet). The cameras 5 capture images from different viewpoints and send them to the server 1, which generates the third-person-view images; the cameras 5 are generally fixed in the classroom. After a handheld mobile device 6 (mobile phone, tablet, etc.) captures an image and obtains its own localization, it can generate the third-person-view image directly on the device, and can further cast it wirelessly to the large display screen or projector 2 in the classroom.
Specifically, as shown in Fig. 1, the handheld mobile device 6 includes a mobile camera 63, a second rendering and processing unit 61, and a flat display unit 62. The mobile camera 63 captures images; the second rendering and processing unit 61 localizes the handheld mobile device 6 from the captured images using computer vision, determines from the localization the pose parameters of the paired virtual camera, renders the imaging of this virtual camera according to the virtual-object state information sent by the server 1, composites the image captured by the mobile camera 63 with the image rendered by the virtual camera into the third-person-view image, and passes it to the flat display unit 62 for display.
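A minimal sketch of that handheld pipeline follows. The helper routines localize_by_vision and render_virtual_view are hypothetical names standing in for the device's vision-based localization (a PnP solve against surveyed markers is one plausible choice) and for the virtual camera rendering; they are assumptions, not parts of the patent.

```python
# Sketch of the handheld-device third-person-view pipeline (assumed helpers).
import numpy as np

def handheld_third_view_frame(camera_frame, intrinsics, object_states, markers_3d):
    # 1. Localize the device in the unified classroom coordinate system from known
    #    identification points seen in the camera frame (hypothetical helper).
    R, t = localize_by_vision(camera_frame, intrinsics, markers_3d)

    # 2. The paired virtual camera takes exactly the same pose and field of view
    #    and renders the virtual objects plus a coverage mask (hypothetical helper).
    virtual_frame, virtual_mask = render_virtual_view(R, t, intrinsics, object_states)

    # 3. Composite: keep the virtual pixels where virtual content is drawn,
    #    keep the real camera pixels elsewhere.
    alpha = virtual_mask[..., None].astype(np.float32)      # H x W x 1 in [0, 1]
    blended = alpha * virtual_frame + (1.0 - alpha) * camera_frame
    return blended.astype(np.uint8)
```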
(3) The third-person-view display equipment includes one or more large display screens or projectors 2, the screen 32 of the console 3, and the screens 62 of the students' handheld mobile devices.
(4) The console 3 includes a controller 31 and a console screen 32. The controller 31 is used to control scene switching, the appearance of virtual objects, changes of virtual-object state parameters such as pose, and the triggering of event scripts, and allows the teacher to control the VR/MR teaching process.
(5) The server 1 collects the images sent by each camera 5, receives the control signals from the console 3 and the interaction input from the handheld mobile devices 6 and the VR/MR experience equipment 4, further computes the state information of each virtual object according to the VR/MR script, sends the states to the VR/MR experience equipment 4 and the handheld mobile devices 6, and generates third-person-view images which it sends to the large display screen or projector 2 and the console 3.
Embodiment 2: classroom functional areas
As shown in Fig. 2, the classroom can be simply divided into a large-screen display/projection area 10, a teacher's podium 20, a VR/MR experience area 30, and a student seating area 40, wherein:
the large-screen display/projection area 10 is used to deploy the large display screen or projector 2;
the teacher's podium 20 is used to deploy the console 3;
the VR/MR experience area 30 is used to deploy the VR/MR experience equipment 4, where the teacher and/or students carry out the VR/MR experience;
the student seating area 40 is where students carry out third-person-view observational learning;
the VR/MR experience area 30 may overlap with the student seating area 40; the server 1 can be installed in any convenient corner of the classroom, and the cameras 5 can be fixed on the classroom walls or ceiling.
In this classroom, the teacher or a small number of students in the VR/MR experience area 30 carry out the VR/MR experience; the students in the student seating area 40 watch and imitate the experiencer's interaction through the images on the third-person-view display equipment, and can join in simple interaction using handheld mobile devices or VR/MR equipment; the teacher can control the VR/MR teaching process with the console 3.
Embodiment 3: procedure and algorithms of the construction method
As shown in Fig. 4, the procedure includes:
S1: establish the unified classroom coordinate system; this coordinate system is built on the real classroom and simultaneously covers the VR space, the MR space, and the VR/MR hybrid space;
S2: read in the VR/MR resources of the course and arrange them under the unified classroom coordinate system; since the coordinate system is built on the real classroom, this step effectively arranges the virtual objects in the real classroom;
S3: localize the equipment in the classroom, mainly the VR/MR experience equipment and the handheld mobile devices, whose localization is provided by their built-in positioning hardware or software; the positions of the fixed equipment in the classroom can be surveyed in advance;
S4: pair a virtual camera with each real camera, setting the virtual camera's pose (position and attitude) parameters and field of view to the corresponding values of the paired real camera;
S5: the server computes, in real time, the state information of each virtual object (such as its pose parameters) from the interaction input, the localization information, and so on, and sends the data to the VR/MR experience equipment, the handheld mobile devices, etc.;
S6: the real cameras capture images of the real world, the virtual cameras capture images of the virtual world or of the virtual objects, and distortion correction is applied to the images captured by the real cameras;
S7: for each third-person-view capture device, the classroom system generates a third-person-view mask image according to factors such as the selected view (VR, MR, or VR/MR hybrid third-person view) and the camera imaging content; the mask image determines exactly how the real image and the virtual image are composited into the third-person-view image;
S8: for each third-person-view capture device, the system composites the third-person-view image from the real image, the virtual image, and the mask image, and displays it on the corresponding equipment.
More specifically, the algorithms of the construction method are as follows:
(1) Establish the unified classroom coordinate system Ψ. For example, the classroom coordinate system can be anchored at the front-left corner of the classroom: the intersection point of the two walls and the floor is the origin, the two lines where those walls meet the floor are the x-axis and the y-axis, and the line where the two walls meet is the z-axis. At the same time, construct the VR space S_v, the MR space S_m, and the hybrid space S_h. The set of objects in S_v is written Γ_v = {v_0, v_1, v_2, v_3, ...}, the set in S_m is Γ_m = {m_0, m_1, m_2, m_3, ...}, and the set in S_h is Γ_h = {h_0, h_1, h_2, h_3, ...}, where all objects in S_v are virtual, while some objects in S_m and S_h are real objects in the classroom. The pose parameters of all virtual objects involved in this VR/MR classroom system and of all devices in the real classroom are defined in this unified coordinate system Ψ.
(2) Read in the VR/MR resources of the course; the set of virtual objects they contain is P = {ρ_0, ρ_1, ρ_2, ρ_3, ...}. The virtual objects in P are arranged at designated positions in the classroom as needed. For any virtual object ρ_j used to embody the course content, let its initially specified pose (position and attitude angles) in the classroom be [r_j, t_j]; then in principle it must satisfy ρ_j ∈ Γ_v, ρ_j ∈ Γ_m, and ρ_j ∈ Γ_h, i.e. the virtual object must appear simultaneously in the VR space, the MR space, and the VR/MR hybrid space, and its initial pose parameters in the three spaces are identical, all equal to [r_j, t_j].
(3) The VR/MR experience equipment has built-in localization, and a handheld mobile device with a camera can obtain localization through computer vision, so both can obtain their own pose parameters in real time. However, these pose parameters are expressed not in the unified classroom coordinate system but in each device's own coordinate system, and must be transformed into the unified classroom coordinate system; this requires solving the rotation-translation relation from the device coordinate system to the unified classroom coordinate system. To obtain this relation, several identification points can be fixed in the classroom and their positions in the unified classroom coordinate system surveyed in advance with a ruler, a laser tracker, or similar means; the VR/MR experience equipment then measures the positions of these identification points, and from the correspondence between their positions in the device coordinate system and in the unified classroom coordinate system the rotation-translation relation from the device coordinate system to the unified classroom coordinate system can be solved. For the localization of the cameras fixed in the classroom, the intrinsic parameters of all cameras must first be calibrated with a standard camera-calibration method. On that basis, several identification points are placed in the camera's field of view (or points in the classroom with salient features are used as identification points); their coordinates in the classroom coordinate system Ψ can be measured directly with a ruler or laser tracker, and from the imaging positions of the identification points in the camera the camera's translation and rotation in Ψ can be computed. In addition, the poses of the console, seats, and other fixed equipment in the classroom can be measured directly with a ruler or laser tracker.
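The patent only states that the rotation-translation relation can be solved from the identification-point correspondences; one standard way to do so is the Kabsch/Umeyama SVD method, sketched below (the example coordinates are made up for illustration). For the fixed cameras, where the identification points are observed only as image positions, a perspective-n-point solver such as OpenCV's cv2.solvePnP is the analogous standard tool.

```python
# Hedged sketch: device-to-classroom rotation-translation from corresponding
# identification points via the Kabsch/Umeyama SVD method.
import numpy as np

def rigid_transform(points_device, points_classroom):
    """points_device, points_classroom: N x 3 arrays of the same identification
    points expressed in the device frame and in the classroom frame Ψ.
    Returns R (3x3) and t (3,) such that p_classroom ≈ R @ p_device + t."""
    A = np.asarray(points_device, dtype=float)
    B = np.asarray(points_classroom, dtype=float)
    ca, cb = A.mean(axis=0), B.mean(axis=0)
    H = (A - ca).T @ (B - cb)                       # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # avoid reflections
    R = Vt.T @ D @ U.T
    t = cb - R @ ca
    return R, t

# Example with three surveyed identification points (coordinates invented):
dev = [[0.0, 0.0, 0.0], [1.0, 0.0, 0.0], [0.0, 1.0, 0.0]]
cls = [[2.0, 3.0, 0.0], [2.0, 4.0, 0.0], [1.0, 3.0, 0.0]]
R, t = rigid_transform(dev, cls)   # a 90-degree rotation about z plus a translation
```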
(4) A virtual camera is defined for each camera; the virtual camera's field of view and its pose in the classroom are identical to those of the corresponding real camera, and the virtual objects are imaged in the virtual camera. For example, for any camera C_i in the classroom, denote its pose by the rotation-translation pair [R_i, T_i] and its horizontal and vertical fields of view by θ_i^h and θ_i^v; the classroom system pairs it with a virtual camera C'_i whose field of view and pose parameters are identical, i.e. the pose of C'_i is likewise [R_i, T_i] and its field of view is likewise (θ_i^h, θ_i^v).
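Under common rendering conventions (an assumption, since the patent does not fix a graphics API), pairing the virtual camera C'_i with the real camera amounts to building a view matrix from [R_i, T_i] and a perspective projection from (θ_i^h, θ_i^v), for example:

```python
# Sketch: view and projection matrices of the virtual camera C'_i, assuming
# OpenGL-style conventions (camera looks down -z, column-vector matrices).
import numpy as np

def view_matrix(R, T):
    """R (3x3), T (3,) map camera coordinates to classroom coordinates Ψ.
    The view matrix is the inverse transform (classroom -> camera)."""
    V = np.eye(4)
    V[:3, :3] = R.T
    V[:3, 3] = -R.T @ T
    return V

def projection_matrix(theta_h, theta_v, near=0.1, far=100.0):
    """Perspective projection with the same horizontal/vertical field of view
    (in radians) as the paired real camera."""
    P = np.zeros((4, 4))
    P[0, 0] = 1.0 / np.tan(theta_h / 2.0)
    P[1, 1] = 1.0 / np.tan(theta_v / 2.0)
    P[2, 2] = -(far + near) / (far - near)
    P[2, 3] = -2.0 * far * near / (far - near)
    P[3, 2] = -1.0
    return P
```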
(5) The server computes, in real time, the state information of each virtual object (such as its pose parameters) from the interaction input, the localization information, and so on, and sends the data to the VR/MR experience equipment, the handheld mobile devices, etc.
(6) The real cameras capture images of the real world, the virtual cameras capture images of the virtual world or of the virtual objects, and distortion correction is applied to the images captured by the real cameras. For example, the image captured by the real camera C_i after distortion correction is G_i, and the image captured by the virtual camera C'_i is G'_i.
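Distortion correction of the real camera image is a standard operation; with OpenCV (one possible choice, not mandated by the patent), it is a single call, using the camera matrix and distortion coefficients obtained from the calibration in step (3).

```python
import cv2
# K: 3x3 camera matrix; dist: distortion coefficients from camera calibration;
# frame: raw image from real camera C_i. G_i is the distortion-corrected image.
G_i = cv2.undistort(frame, K, dist)
```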
(7) According to the space entered, the third-person view can be a VR third-person view, an MR third-person view, or a VR/MR hybrid third-person view. The VR third-person view fully retains the virtual scene background, the MR third-person view retains none of it, and the hybrid third-person view retains part of it. As shown in Fig. 3, (a) is the real classroom, (b) is the virtual scene, (c) is the virtual scene placed in the classroom, and (d) retains only part of the virtual scene (even a large virtual object may be only partly retained); tearing open the VR scene in this way makes it easier for the students at their seats to join the teaching experience in MR mode. Generating the third-person-view image requires superimposing the real-world image and the virtual-object/virtual-scene image, and the superposition can be driven by a mask image. If occlusion is not considered, the mask of the VR third-person view is simply the matte that cuts out the experiencer; the mask of the MR third-person view is given by the virtual-object image with the virtual scene background removed; and the mask of the VR/MR hybrid third-person view cuts out the experiencer where there is virtual scene background and follows the virtual-object image where there is none. If occlusion is to be considered as well, an occlusion mask must be generated from object 3D information measured in advance or from data captured in real time by a depth camera. For any camera C_i, the mask image generated at time t is Θ_i, whose pixel values lie between 0 and 1.
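A hedged sketch of the three mask types, with occlusion ignored, is given below. It assumes the convention of the compositing formula in step (8), where Θ_i weights the real image G_i, and takes as inputs a renderer-provided coverage map for virtual objects, a coverage map for the retained virtual scene background, and a segmentation-derived experiencer matte; all of these inputs and the function name are assumptions for illustration.

```python
import numpy as np

def third_view_mask(mode, object_coverage, scene_background, experiencer_matte):
    """Return Θ_i in [0, 1], the per-pixel weight of the REAL image G_i
    (matching the compositing formula in step (8)).
    object_coverage:   1 where virtual objects are rendered
    scene_background:  1 where the (retained) virtual scene background is rendered
    experiencer_matte: 1 on the experiencer's pixels in the real image"""
    if mode == "VR":
        # Virtual scene everywhere, except the experiencer is kept from the real image.
        return experiencer_matte
    if mode == "MR":
        # Real classroom everywhere, except where virtual objects are drawn.
        return 1.0 - object_coverage
    if mode == "hybrid":
        # Where scene background is retained behave like VR, elsewhere like MR.
        vr_weight = experiencer_matte
        mr_weight = 1.0 - object_coverage
        return np.where(scene_background > 0, vr_weight, mr_weight)
    raise ValueError("mode must be 'VR', 'MR', or 'hybrid'")
```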
(8) According to the mask, the virtual image and the real image are composited to generate the third-person-view image P_i, where
P_i(x, y) = Θ_i(x, y) · G_i(x, y) + (1 - Θ_i(x, y)) · G'_i(x, y).
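The formula translates directly into per-pixel alpha blending; a sketch assuming 8-bit RGB images and a floating-point mask:

```python
# Direct implementation of
#   P_i(x, y) = Θ_i(x, y) · G_i(x, y) + (1 - Θ_i(x, y)) · G'_i(x, y)
# with G_i the distortion-corrected real image, G'_i the virtual camera image,
# and Θ_i the mask from step (7).
import numpy as np

def composite_third_view(G_real, G_virtual, theta):
    """G_real, G_virtual: H x W x 3 uint8 images; theta: H x W mask in [0, 1]."""
    theta = theta[..., None].astype(np.float32)          # broadcast over channels
    P = theta * G_real.astype(np.float32) + (1.0 - theta) * G_virtual.astype(np.float32)
    return np.clip(P, 0, 255).astype(np.uint8)
```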
Embodiment 4: usage procedure of the VR/MR classroom
Fig. 5 shows a photograph of the VR/MR classroom based on third-person-view technology. The main points of the classroom's usage procedure are:
A. When the course content depends on a virtual scene, VR is used; otherwise MR is used; if part of the content depends on the scene and part does not, the course content can be presented with VR combined with MR. Besides the experiencer, when other students also want to take part, a copy of the experience can be replicated for them, or they can enter the experience area and participate directly. The third-person view can also enter a pure virtual-simulation mode that displays only virtual content, unconstrained by the viewpoints of the real cameras.
B. The teacher can control the teaching progress with the console, and the other students achieve good observational learning by watching and imitating the experiencer's interaction. Content that depends on a virtual scene is better experienced in VR, content that does not is better experienced in MR, and VR and MR can also be experienced at the same time with interaction between the two.
The specific teaching procedure is as follows:
(1) Start the VR/MR classroom; the VR/MR resources representing the course content are arranged in the classroom according to presets, or the teacher can arrange the positions of the virtual objects in the classroom on the spot.
(2) The VR/MR classroom system performs third-person-view capture and display in real time.
(3) The teacher explains the knowledge points of the course with the third-person-view picture, and exercises a degree of control over the virtual objects with the console so that the knowledge points are better presented. Students can watch the large screen/projection, or watch the virtual objects from the third-person view with handheld mobile devices.
(4) When the course reaches a key or difficult point, a student who has grasped the knowledge point well (or the teacher) uses the VR/MR experience equipment to carry out a practical operation on it. During the practical operation, other students can also join the interaction; for example, if the student currently in the VR/MR experience is operating incorrectly, or another student has a better method, they can join in. When the experiencer is in a VR experience, a newly joining student can go to the experience area and put on VR/MR experience equipment, or stay at their own seat and participate in the interaction in various ways, such as with a handheld mobile device.
(5) During the experience, the teacher controls the experience process through the console, selects as appropriate the camera position for the third-person-view picture on the large display screen or projector, and may also select a student's handheld-device third-person-view picture to be cast to it wirelessly.
The above are merely preferred embodiments of the invention and are not intended to limit it; any modification, equivalent replacement, improvement, etc. made within the spirit and principles of the invention shall be included in the scope of protection of the invention.

Claims (10)

1. A VR/MR classroom based on third-person-view technology, characterized by comprising a system and a classroom; the system comprises VR/MR experience equipment, third-person-view capture equipment, third-person-view display equipment, a console, and a server; the VR/MR experience equipment, the third-person-view capture equipment, the third-person-view display equipment, and the console are all arranged in the classroom, wherein:
the VR/MR experience equipment localizes itself, acquires its interaction input and sends it to the server, and renders and displays the VR or MR picture according to the state parameters of each virtual object sent by the server;
the third-person-view capture equipment includes multiple cameras and handheld mobile devices; the cameras capture images from different viewpoints and send them to the server, which generates third-person-view images for those viewpoints; a handheld mobile device captures images with its own camera and, using the virtual-object information sent by the server, generates the third-person-view image on the device itself and displays it directly on its built-in screen;
the third-person-view display equipment includes at least one large display screen/projector in the classroom, the console screen, and the screens of the handheld mobile devices, wherein the large display screen/projector and the console screen receive the third-person-view images sent by the server;
the console is used by the teacher to intuitively set the poses of virtual objects in the classroom, control scene switching, the appearance of virtual objects, and the triggering of event scripts, and on this basis control the VR/MR teaching process;
the server receives the interaction input and localization information, computes the state information of the virtual objects according to the course script, and sends the computed virtual-object state information to the handheld mobile devices and the VR/MR experience equipment; it also collects the images sent by the cameras, generates third-person-view images from them, and sends these to the large display screen/projector and the console.
2. The VR/MR classroom based on third-person-view technology according to claim 1, characterized in that the classroom includes a large-screen display/projection area, a teacher's podium, a VR/MR experience area, and a student seating area, wherein:
the large-screen display/projection area is used to deploy the large display screen/projector;
the teacher's podium is used to deploy the console;
the VR/MR experience area is used to deploy the VR/MR experience equipment, where the teacher and/or students carry out the VR/MR experience;
the student seating area is where students carry out third-person-view observational learning;
the teacher or the students in the VR/MR experience area carry out the VR/MR experience; the students in the seating area follow the experiencer's course experience by watching the images on the third-person-view display equipment, and can join in simple interaction from their seats using VR/MR experience equipment or handheld mobile devices; the teacher can control the VR/MR teaching process with the console.
3. The VR/MR classroom based on third-person-view technology according to claim 1 or 2, characterized in that the multiple cameras in the classroom provide third-person-view images from multiple camera positions; the teacher can switch camera positions to adjust the third-person-view image shown on the large display screen/projector according to the VR/MR experience of the course content; a teacher or student can change the viewpoint of the third-person-view image on a handheld mobile device by adjusting the device's pose, and can cast the screen wirelessly to the large display screen or projector as needed.
4. A construction method for the VR/MR classroom based on third-person-view technology according to any one of claims 1-3, characterized by comprising the following steps:
S1: establish the unified classroom coordinate system; this coordinate system is built on the real classroom and simultaneously covers the VR space, the MR space, and the VR/MR hybrid space;
S2: read in the VR/MR resources of the course and arrange them under the unified classroom coordinate system;
S3: localize the equipment in the classroom, including the VR/MR experience equipment and the handheld mobile devices; the positions of the fixed equipment in the classroom can be surveyed in advance;
S4: pair a virtual camera with each real camera, setting the virtual camera's pose parameters and field of view to the corresponding values of the paired real camera;
S5: the server computes, in real time, the state information of each virtual object (including its pose parameters) from the interaction input and localization information, and sends the data to the VR/MR experience equipment and the handheld mobile devices;
S6: the real cameras capture images of the real world, the virtual cameras capture images of the virtual world or of the virtual objects, and distortion correction is applied to the images captured by the real cameras;
S7: for each third-person-view capture device, the classroom system generates a third-person-view mask image according to the selected view (VR, MR, or VR/MR hybrid third-person view) and the camera imaging content; the mask image determines exactly how the real image and the virtual image are composited into the third-person-view image;
S8: for each third-person-view capture device, the system composites the third-person-view image from the real image, the virtual image, and the mask image, and displays it on the corresponding third-person-view display equipment.
5. The construction method for the VR/MR classroom based on third-person-view technology according to claim 4, characterized in that in S1 the classroom simultaneously constructs the VR space, the MR space, and the VR/MR hybrid space; the virtual objects corresponding to the course content appear simultaneously in all three spaces, with identical positions relative to the real classroom; and all three spaces can be imaged from the third-person view.
6. The construction method for the VR/MR classroom based on third-person-view technology according to claim 4, characterized in that the interaction input modes of the VR/MR classroom include, but are not limited to: VR-device interaction, MR-glasses interaction, touch-screen input on handheld mobile devices (including mobile phones and tablets), and console operation; and the display output channels include, but are not limited to: VR head-mounted displays, MR glasses, handheld mobile devices (including mobile phones and tablets), large display screens/projectors, and the console screen.
7. The VR/MR classroom based on third-person-view technology according to claim 6, characterized in that the MR and VR experience equipment, the handheld mobile devices, and the large display screen/projector can all freely choose to display the VR space, the MR space, or the VR/MR hybrid space.
8. The VR/MR classroom based on third-person-view technology according to claim 4, characterized in that the VR/MR hybrid space is built on the VR space and the MR space constructed in the classroom: under the unified classroom coordinate system, selected regions of the VR space are replaced as needed by the MR regions at the same positions, fusing the VR experience and the MR experience into the same space.
9. The VR/MR classroom based on third-person-view technology according to claim 4, characterized in that, in the unified coordinate system, the pose parameters of any virtual object embodying the course content are identical in the real classroom and in the three constructed spaces.
10. The VR/MR classroom based on third-person-view technology according to claim 4, characterized in that, in any one of the VR space, the MR space, and the VR/MR hybrid space, an operation on any virtual object embodying the course content is synchronously reflected in the other two spaces.
CN201910086586.0A 2019-01-29 2019-01-29 VR/MR classroom based on third view angle technology and construction method thereof Active CN109961520B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910086586.0A CN109961520B (en) 2019-01-29 2019-01-29 VR/MR classroom based on third view angle technology and construction method thereof

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910086586.0A CN109961520B (en) 2019-01-29 2019-01-29 VR/MR classroom based on third view angle technology and construction method thereof

Publications (2)

Publication Number Publication Date
CN109961520A true CN109961520A (en) 2019-07-02
CN109961520B CN109961520B (en) 2023-05-09

Family

ID=67023441

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910086586.0A Active CN109961520B (en) 2019-01-29 2019-01-29 VR/MR classroom based on third view angle technology and construction method thereof

Country Status (1)

Country Link
CN (1) CN109961520B (en)

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103760981A (en) * 2014-01-21 2014-04-30 北京师范大学 Magnetic field visualization and interaction method
CN105425955A (en) * 2015-11-06 2016-03-23 中国矿业大学 Multi-user immersive full-interactive virtual reality engineering training system
CN206400816U (en) * 2016-07-07 2017-08-11 南京凌越铭盛信息工程有限公司 Wisdom classroom system
CN107731016A (en) * 2017-10-10 2018-02-23 东莞华南设计创新院 A kind of Chemical Engineering Training system based on virtual reality
CN107976811A (en) * 2017-12-25 2018-05-01 河南新汉普影视技术有限公司 A kind of simulation laboratory and its emulation mode based on virtual reality mixing
CN108288419A (en) * 2017-12-31 2018-07-17 广州市坤腾软件技术有限公司 A kind of vocational education craftsman's platform based on AR/VR technologies
CN108389249A (en) * 2018-03-06 2018-08-10 深圳职业技术学院 A kind of spaces the VR/AR classroom of multiple compatibility and its construction method
CN109044374A (en) * 2018-07-19 2018-12-21 杭州心景科技有限公司 It integrates audiovisual and continuously performs test method, apparatus and system

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
徐菊红 (Xu Juhong): "混合现实虚拟智能教室的方案设计" [Design of a mixed-reality virtual smart classroom], 《武汉工程大学学报》 (Journal of Wuhan Institute of Technology) *

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110688005A (en) * 2019-09-11 2020-01-14 塔普翊海(上海)智能科技有限公司 Mixed reality teaching environment, teacher and teaching aid interaction system and interaction method
CN110517546A (en) * 2019-09-12 2019-11-29 成都泰盟软件有限公司 A kind of virtual reality tutoring system and interactive teaching and learning method
CN110971678A (en) * 2019-11-21 2020-04-07 深圳职业技术学院 Immersive visual campus system based on 5G network
CN110971678B (en) * 2019-11-21 2022-08-12 深圳职业技术学院 Immersive visual campus system based on 5G network
CN110992744A (en) * 2019-12-16 2020-04-10 武汉鑫科信科技有限公司 Sight teaching system based on VR technique
CN111899590A (en) * 2020-08-25 2020-11-06 成都合纵连横数字科技有限公司 Mixed reality observation method for simulation operation training process
CN111899590B (en) * 2020-08-25 2022-03-11 成都合纵连横数字科技有限公司 Mixed reality observation method for simulation operation training process
CN115379278A (en) * 2022-03-31 2022-11-22 深圳职业技术学院 XR technology-based immersive micro-class recording method and system
CN115379278B (en) * 2022-03-31 2023-09-05 深圳职业技术学院 Recording method and system for immersion type micro lessons based on augmented reality (XR) technology

Also Published As

Publication number Publication date
CN109961520B (en) 2023-05-09

Similar Documents

Publication Publication Date Title
CN109961520A (en) A kind of classroom VR/MR and its construction method based on third visual angle technology
US10701344B2 (en) Information processing device, information processing system, control method of an information processing device, and parameter setting method
Azuma Overview of augmented reality
US20050264559A1 (en) Multi-plane horizontal perspective hands-on simulator
US20050219240A1 (en) Horizontal perspective hands-on simulator
CN110969905A (en) Remote teaching interaction and teaching aid interaction system for mixed reality and interaction method thereof
CN106600709A (en) Decoration information model-based VR virtual decoration method
CN106293087B (en) A kind of information interacting method and electronic equipment
CN103543827B (en) Based on the implementation method of the immersion outdoor activities interaction platform of single camera
EP1740998A2 (en) Horizontal perspective hand-on simulator
CN103246673B (en) A kind of picture switching method and electronic equipment
CN110688005A (en) Mixed reality teaching environment, teacher and teaching aid interaction system and interaction method
WO2013119475A1 (en) Integrated interactive space
CN109901713B (en) Multi-person cooperative assembly system and method
CN109358754B (en) Mixed reality head-mounted display system
CN110568923A (en) unity 3D-based virtual reality interaction method, device, equipment and storage medium
CN108389249A (en) A kind of spaces the VR/AR classroom of multiple compatibility and its construction method
CN109191983A (en) Distribution network live line work emulation training method, apparatus and system based on VR
KR20130052769A (en) Apparatus and method fot providing mixed reality contents for virtual experience based on story
US20050248566A1 (en) Horizontal perspective hands-on simulator
JP2010257081A (en) Image procession method and image processing system
WO2020177318A1 (en) Virtual reality-based craft-beer saccharification operation system and method
JP2005341060A (en) Camera control apparatus
CN113941138A (en) AR interaction control system, device and application
WO2022047768A1 (en) Virtual experience system and method combining hololens and cave

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant