CN111866489A - Method for realizing immersive panoramic teaching - Google Patents

Method for realizing immersive panoramic teaching

Info

Publication number
CN111866489A
Authority
CN
China
Prior art keywords
video
teaching
panoramic
vector
live
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201910353336.9A
Other languages
Chinese (zh)
Inventor
张玉峰 (Zhang Yufeng)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhejiang Kaiqi Technology Co ltd
Original Assignee
Zhejiang Kaiqi Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhejiang Kaiqi Technology Co ltd filed Critical Zhejiang Kaiqi Technology Co ltd
Priority to CN201910353336.9A
Publication of CN111866489A
Legal status: Pending

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20Image signal generators
    • H04N13/261Image signal generators with monoscopic-to-stereoscopic image conversion
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09BEDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B5/00Electrically-operated educational appliances
    • G09B5/02Electrically-operated educational appliances with visual presentation of the material to be studied, e.g. using film strip
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20Image signal generators
    • H04N13/257Colour aspects
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20Image signal generators
    • H04N13/275Image signal generators from 3D object models, e.g. computer-generated stereoscopic image signals

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Business, Economics & Management (AREA)
  • Physics & Mathematics (AREA)
  • Educational Administration (AREA)
  • Educational Technology (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The invention discloses a method for realizing immersive panoramic teaching, and belongs to the technical field of computer simulation. With existing teaching tools, the introduction of a scenic spot can only be passively listened to or watched. The invention first carries out panoramic photography at the actual location of the scenic spot in the teaching content to obtain a live-action panoramic video, builds 3D models of the scenic spot's related elements through 3D modeling, uses the Unity engine to create a 3D character and a camera control system with which walking and viewing angles can be controlled, and then fuses the live-action panoramic video with the 3D materials by means of a computer program. During fusion, the illumination information of the video is extracted with a video light-information extraction method, and the 3D materials are lit according to this illumination information, so that the live-action panoramic video and the 3D materials blend naturally. By operating the character, the user can walk to any point and choose any angle from which to observe the scenic spot involved in the teaching content, making the teaching content intuitive and realistic and realizing immersive panoramic teaching.

Description

Method for realizing immersive panoramic teaching
Technical Field
The invention belongs to the technical field of computer simulation, and particularly relates to a method for realizing immersive panoramic teaching.
Background
At present, the introduction in teaching of scenic spots such as natural scenery, places of scenic and historic interest, famous buildings, and museums relies on wall maps or images. Learners can only listen or watch passively and cannot experience the scene in person, so such teaching lacks intuitiveness and interest.
Disclosure of Invention
The invention aims to solve the problem that with existing teaching tools the introduction of a scenic spot can only be passively listened to or watched, and provides a method for realizing immersive panoramic teaching.
To this end, the invention adopts the following technical scheme. In a method for realizing immersive panoramic teaching, panoramic photography is carried out at the actual location of the scenic spot in the teaching content to obtain a live-action panoramic video; 3D models of the scenic spot's related elements are built through 3D modeling; the Unity engine is used to create a 3D character and a camera control system with which walking and viewing angles can be controlled; and a computer program is then used to fuse the live-action panoramic video with the 3D materials, namely the 3D models and the 3D character;
during fusion, the illumination information of the video is extracted with a video light-information extraction method, and the 3D materials are lit according to this illumination information; the video light-information extraction method determines the incident light vector from the camera (eye) position, calculates the reflected light vector using the surface normal vector, lets this vector hit a point on the video sphere, and computes the color in a computer program using the vector reflection formula;
after the color at the point where the reflected light vector hits the video sphere is obtained, blurring is applied first, then alpha transparency, and finally the result is blended with the object's texture color to form the illumination color.
In addition to the above technical solutions, the present invention also includes the following technical features.
The vector reflection formula is R = I - 2N(N·I), the computer program is written in the Cg shader language, and the corresponding function in the computer program is as follows:
float3 reflect(float3 I, float3 N)
{
    return I - 2.0 * N * dot(N, I);
}
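The reflection formula can be checked numerically. The sketch below mirrors the patent's Cg reflect function in plain Python; the vector values are illustrative examples, not taken from the patent.

```python
def reflect(i, n):
    """Reflect incident vector i about unit surface normal n: R = I - 2N(N·I)."""
    d = sum(a * b for a, b in zip(n, i))          # dot(N, I)
    return [a - 2.0 * b * d for a, b in zip(i, n)]

# A ray travelling straight down onto an upward-facing surface bounces straight up.
print(reflect([0.0, 0.0, -1.0], [0.0, 0.0, 1.0]))  # → [0.0, 0.0, 1.0]
```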
The invention can achieve the following beneficial effects. The live-action panoramic video of the scenic spot involved in the teaching content serves as the three-dimensional simulated scene of the teaching video; the specially developed video light-information extraction method extracts the illumination information of the video, and the 3D materials are lit according to this information so that the live action and the 3D materials blend naturally. By operating the character, the user can walk to any point and choose any angle from which to observe the scenes and objects involved in the teaching content. The teaching content thus becomes intuitive and realistic, immersive panoramic teaching and learning is realized, the participation of learners is enhanced, and their interest in learning is enriched.
Detailed Description
The following detailed description of the present invention is provided as illustrative and explanatory only and is not to be construed as limiting the invention.
The method first carries out panoramic photography at the actual locations of the scenic spots in the teaching content to obtain a live-action panoramic video, builds 3D models of the scenic spots' related elements through 3D modeling, uses the Unity engine to create a 3D character and a camera control system with which movement and viewing angles can be controlled, and then uses a computer program to fuse the live-action panoramic video with the 3D materials, namely the 3D models and the 3D character.
During fusion, the illumination information of the video is extracted with a video light-information extraction method, and the 3D materials are lit according to this illumination information. The video light-information extraction method determines the incident light vector from the camera (eye) position, calculates the reflected light vector using the surface normal vector, lets this vector hit a point on the video sphere, and computes the color using the vector reflection formula R = I - 2N(N·I) in a computer program written in the Cg shader language, in which the corresponding function is as follows:
float3 reflect(float3 I, float3 N)
{
    return I - 2.0 * N * dot(N, I);
}
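The patent does not spell out how the reflected vector "hits a point on the video sphere". A common way to sample an equirectangular panoramic video with a direction vector is the longitude/latitude mapping sketched below; this mapping convention is an assumption for illustration, not taken from the patent.

```python
import math

def direction_to_uv(d):
    """Map a unit direction vector to (u, v) texture coordinates on an
    equirectangular panorama: u from longitude, v from latitude."""
    x, y, z = d
    u = 0.5 + math.atan2(x, z) / (2.0 * math.pi)            # longitude → [0, 1]
    v = 0.5 - math.asin(max(-1.0, min(1.0, y))) / math.pi   # latitude  → [0, 1]
    return u, v

# Looking straight along +z lands at the centre of the panorama.
print(direction_to_uv((0.0, 0.0, 1.0)))  # → (0.5, 0.5)
```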
After the color at the point where the reflected light vector hits the video sphere is obtained, blurring is applied first, then alpha transparency, and finally the result is blended with the object's texture color to form the illumination color.
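One plausible reading of the blur, alpha-transparency, and texture-blend steps is sketched per colour channel below; the box-blur averaging and the alpha factor are illustrative assumptions, as the patent gives no kernel size or blend values.

```python
def box_blur(samples):
    """Fuzzy processing: average several colours sampled around the hit point."""
    n = len(samples)
    return [sum(c[k] for c in samples) / n for k in range(3)]

def illumination_color(env_samples, tex_color, alpha):
    """Blur the video-sphere samples, then alpha-blend with the object's
    texture (map) colour to form the final illumination colour."""
    blurred = box_blur(env_samples)
    return [alpha * e + (1.0 - alpha) * t for e, t in zip(blurred, tex_color)]

# Illustrative RGB values in [0, 1]: blurred environment light blended
# half-and-half with a dark grey texture colour.
samples = [[1.0, 0.5, 0.0], [0.5, 0.5, 0.5]]
print(illumination_color(samples, [0.2, 0.2, 0.2], 0.5))
```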

Claims (2)

1. A method for realizing immersive panoramic teaching, characterized in that: panoramic photography is carried out at the actual location of the scenic spot in the teaching content to obtain a live-action panoramic video; 3D models of the scenic spot's related elements are built through 3D modeling; the Unity engine is used to create a 3D character and a camera control system with which walking and viewing angles can be controlled; and a computer program is then used to fuse the live-action panoramic video with the 3D materials, namely the 3D models and the 3D character;
during fusion, the illumination information of the video is extracted with a video light-information extraction method, and the 3D materials are lit according to this illumination information; the video light-information extraction method determines the incident light vector from the camera (eye) position, calculates the reflected light vector using the surface normal vector, lets this vector hit a point on the video sphere, and computes the color in a computer program using the vector reflection formula;
after the color at the point where the reflected light vector hits the video sphere is obtained, blurring is applied first, then alpha transparency, and finally the result is blended with the object's texture color to form the illumination color.
2. The method of claim 1, characterized in that: the vector reflection formula is R = I - 2N(N·I), the computer program is written in the Cg shader language, and the corresponding function in the computer program is as follows:
float3 reflect(float3 I, float3 N)
{
    return I - 2.0 * N * dot(N, I);
}
CN201910353336.9A 2019-04-29 2019-04-29 Method for realizing immersive panoramic teaching Pending CN111866489A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910353336.9A CN111866489A (en) 2019-04-29 2019-04-29 Method for realizing immersive panoramic teaching


Publications (1)

Publication Number Publication Date
CN111866489A 2020-10-30

Family

ID=72966293

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910353336.9A Pending CN111866489A (en) 2019-04-29 2019-04-29 Method for realizing immersive panoramic teaching

Country Status (1)

Country Link
CN (1) CN111866489A (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101246600A (en) * 2008-03-03 2008-08-20 北京航空航天大学 Method for real-time generating reinforced reality surroundings by spherical surface panoramic camera
CN106157359A (en) * 2015-04-23 2016-11-23 中国科学院宁波材料技术与工程研究所 A kind of method for designing of virtual scene experiencing system
CN106940897A (en) * 2017-03-02 2017-07-11 苏州蜗牛数字科技股份有限公司 A kind of method that real shadow is intervened in AR scenes
CN107871339A (en) * 2017-11-08 2018-04-03 太平洋未来科技(深圳)有限公司 The rendering intent and device of virtual objects color effect in video
CN109214979A (en) * 2017-07-04 2019-01-15 北京京东尚科信息技术有限公司 Method and apparatus for merging object in panoramic video



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20201030