CN110493581A - Control method supporting a multi-person immersive experience - Google Patents

Control method supporting a multi-person immersive experience

Info

Publication number
CN110493581A
Authority
CN
China
Prior art keywords
sound
people
environment sound
supporting
control method
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201910745565.5A
Other languages
Chinese (zh)
Inventor
张树坤
段冰
林伟
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Suzhou Zhaojian Intelligent Technology Co Ltd
Original Assignee
Suzhou Zhaojian Intelligent Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Suzhou Zhaojian Intelligent Technology Co Ltd
Priority to CN201910745565.5A priority Critical patent/CN110493581A/en
Publication of CN110493581A publication Critical patent/CN110493581A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/24Classification techniques
    • G06F18/241Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
    • G06F18/2413Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches based on distances to training or reference patterns
    • G06F18/24133Distances to prototypes
    • G06F18/24143Distances to neighbourhood prototypes, e.g. restricted Coulomb energy networks [RCEN]
    • GPHYSICS
    • G10MUSICAL INSTRUMENTS; ACOUSTICS
    • G10LSPEECH ANALYSIS OR SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING; SPEECH OR AUDIO CODING OR DECODING
    • G10L25/00Speech or voice analysis techniques not restricted to a single one of groups G10L15/00 - G10L21/00
    • G10L25/48Speech or voice analysis techniques not restricted to a single one of groups G10L15/00 - G10L21/00 specially adapted for particular use
    • G10L25/51Speech or voice analysis techniques not restricted to a single one of groups G10L15/00 - G10L21/00 specially adapted for particular use for comparison or discrimination
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30Image reproducers
    • H04N13/363Image reproducers using image projection screens
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30Image reproducers
    • H04N13/398Synchronisation thereof; Control thereof
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00Details of colour television systems
    • H04N9/12Picture reproducers
    • H04N9/31Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
    • H04N9/3141Constructional details thereof
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00Details of colour television systems
    • H04N9/12Picture reproducers
    • H04N9/31Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
    • H04N9/3141Constructional details thereof
    • H04N9/3147Multi-projection systems
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00Details of colour television systems
    • H04N9/12Picture reproducers
    • H04N9/31Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
    • H04N9/3179Video signal processing therefor

Abstract

The invention discloses a control method supporting a multi-person immersive experience, intended to solve the poor matching between picture and sound in existing immersive experiences. The method comprises the following steps. Step 1: obtain in real time the positions of each participant, each projector, the light-converting member and the projection screen, build a model, and determine the angle at which each projector is aimed at the light-converting member. Step 2: receive a first environment sound and a second environment sound, determine the target sound, start projection according to the start information, and switch the projection direction according to the current location of the target sound; a stereoscopic video in which picture and sound are matched is thus obtained, and multiple people can all enjoy the immersive experience. The invention is rationally designed: the angle at which each projector is aimed at the light-converting member is determined from the positions of all components, and the projection direction is then switched according to the environment sound, so that picture and sound are well matched.

Description

Control method supporting a multi-person immersive experience
Technical field
The present invention relates to the field of projection display, and in particular to a control method supporting a multi-person immersive experience.
Background technique
With the development of science and technology and the continuous improvement of living standards, people's requirements for visual perception keep rising. On the one hand, the pursuit of human-machine-interface display devices increasingly tends toward miniaturization, large screens and high resolution; on the other hand, in terms of display effect, people tend to pursue 3D perception, augmented reality and an immersive visual experience. Interactive projection has therefore emerged and is becoming increasingly common in everyday life.
As demand grows further, immersive experiences are favored by more and more people, and the control method is a decisive factor in the quality of an immersive experience. Although existing control methods achieve a certain level of experience, the matching between picture and sound is poor, so the effect falls short of expectations, and research in this field is still ongoing.
Summary of the invention
The purpose of the embodiments of the present invention is to provide a control method supporting a multi-person immersive experience, so as to solve the problems raised in the background art above.
To achieve the above purpose, the embodiments of the present invention provide the following technical solution:
A control method supporting a multi-person immersive experience, comprising the following specific steps:
Step 1: obtain in real time the positions of each participant, each projector, the light-converting member and the projection screen, build a model, and determine the angle at which each projector is aimed at the light-converting member, so as to obtain the best projected image.
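For illustration only, a minimal Python sketch of how the aiming angle of a projector toward the light-converting member could be derived from the measured positions; the patent only states that the angle is determined from positions, so the function name, coordinate convention and example values are assumptions.

```python
import numpy as np

def aiming_angles(projector_pos, converter_pos):
    """Return (azimuth, elevation) in degrees from a projector position toward the
    light-converting member. Hypothetical helper: the patent only states that the
    angle is determined from the measured positions, not how."""
    d = np.asarray(converter_pos, dtype=float) - np.asarray(projector_pos, dtype=float)
    azimuth = np.degrees(np.arctan2(d[1], d[0]))                      # rotation in the horizontal plane
    elevation = np.degrees(np.arctan2(d[2], np.hypot(d[0], d[1])))    # tilt above the horizontal
    return azimuth, elevation

# Example with made-up positions in metres
print(aiming_angles((0.0, 0.0, 2.5), (3.0, 1.0, 2.0)))
```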
Step 2: receive a first environment sound and a second environment sound, and judge whether the characteristic parameters of the first environment sound contain start information. If the first environment sound contains start information, determine the first environment sound as the target sound and start projection according to the start information. If the first environment sound does not contain start information, judge whether the characteristic parameters of the second environment sound contain start information; if they do, determine the second environment sound as the target sound and start projection according to the start information. Switch the projection direction according to the current location of the target sound; a stereoscopic video in which picture and sound are matched is obtained, and multiple people can all enjoy the immersive experience.
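The decision flow of step 2 can be summarised in a short Python sketch; `contains_start_info`, `locate`, `start_projection` and `aim_at` are hypothetical callbacks standing in for whatever detection and actuation a deployed system would provide, and are not named in the patent.

```python
def choose_target_sound(first_env, second_env, contains_start_info):
    """Pick the target sound as in step 2: prefer the first environment sound when its
    characteristic parameters carry start information, otherwise try the second."""
    if contains_start_info(first_env):
        return first_env
    if contains_start_info(second_env):
        return second_env
    return None  # neither sound carries start information

def control_step(first_env, second_env, contains_start_info, locate, start_projection, aim_at):
    """Run one pass of the step-2 logic with caller-supplied callbacks."""
    target = choose_target_sound(first_env, second_env, contains_start_info)
    if target is None:
        return False                  # nothing to start
    start_projection(target)          # start projection according to the start information
    aim_at(locate(target))            # switch the projection direction to the target's current location
    return True
```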
As a further embodiment of the present invention: the light-converting member includes a reflecting component, a refracting component and a polarizing component, so that the light projected by the projectors can be reflected, refracted and polarized, meeting users' needs.
As a further embodiment of the present invention: the modeling in step 1 uses the KNN algorithm, one of the simplest methods in data-mining classification. "K nearest neighbors" means the k closest neighbors: each sample can be represented by its k closest neighbors, and for anomaly detection the distances are iterated over to find the points with the fewest nearby neighbors, which are defined as anomalous points.
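As a concrete illustration of the KNN-style anomaly screening described above (a plain-NumPy sketch under assumed parameters, not the patent's exact modeling procedure), points whose k-th nearest neighbor lies farthest away are treated as having "the fewest neighbors" and flagged:

```python
import numpy as np

def knn_outliers(points, k=5, n_outliers=1):
    """Flag as anomalous the points whose k-th nearest neighbour lies farthest away,
    i.e. the points with the fewest close neighbours (sketch of the KNN idea only)."""
    pts = np.asarray(points, dtype=float)
    dists = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)  # pairwise distances
    np.fill_diagonal(dists, np.inf)                                     # ignore self-distance
    kth = np.sort(dists, axis=1)[:, k - 1]                              # distance to the k-th neighbour
    return np.argsort(kth)[-n_outliers:]                                # indices of the loneliest points

# Example: 50 clustered measurements plus one far-away point
rng = np.random.default_rng(0)
pts = np.vstack([rng.normal(size=(50, 3)), [[10.0, 10.0, 10.0]]])
print(knn_outliers(pts, k=5, n_outliers=1))   # prints the index of the far-away point
```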
As a further embodiment of the present invention: the specific steps of the modeling in step 1 are as follows: divide the projection screen into multiple rectangular frames that are staggered and adjacently spliced to form a lattice pattern; capture images of the lattice pattern from multiple different angles with the projector and calibrate the projector from them; then apply a rotation matrix and a translation matrix for the transformation, followed by the corresponding correction.
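A hedged sketch of such a lattice-based calibration, using OpenCV's checkerboard routines as a stand-in for the staggered rectangular-frame lattice described above; the pattern size, square size and variable names are assumptions, and the returned rotations and translations correspond to the rotation and translation matrices mentioned in the text.

```python
import cv2
import numpy as np

def calibrate_from_lattice(images, pattern_size=(9, 6), square_size=0.05):
    """Calibrate from images of a lattice captured at different angles, using OpenCV's
    checkerboard routines as a stand-in for the patent's lattice procedure."""
    # 3D coordinates of the lattice corners in the screen plane (z = 0)
    objp = np.zeros((pattern_size[0] * pattern_size[1], 3), np.float32)
    objp[:, :2] = np.mgrid[0:pattern_size[0], 0:pattern_size[1]].T.reshape(-1, 2) * square_size

    obj_points, img_points, image_size = [], [], None
    for img in images:                                   # views of the lattice from different angles
        gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
        image_size = gray.shape[::-1]
        found, corners = cv2.findChessboardCorners(gray, pattern_size)
        if found:
            obj_points.append(objp)
            img_points.append(corners)
    if not obj_points:
        raise ValueError("lattice pattern not found in any image")

    # Intrinsics plus one rotation vector and translation vector per view
    rms, camera_matrix, dist_coeffs, rvecs, tvecs = cv2.calibrateCamera(
        obj_points, img_points, image_size, None, None)
    rotations = [cv2.Rodrigues(r)[0] for r in rvecs]     # 3x3 rotation matrices for the transformation
    return camera_matrix, dist_coeffs, rotations, tvecs
```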
As a further embodiment of the present invention: step 2 further includes adding special effects of a specified category to the stereoscopic video, which gives higher fidelity.
As a further embodiment of the present invention: the special effects of the specified category include sound zoom, image enlargement, color change, movement, bringing to the front, adding a light beam, Jia Jiejie, glowing, flashing and animals jumping out, which makes the experience better.
As a further embodiment of the present invention: the characteristic parameters of the first environment sound and of the second environment sound include sound amplitude and sound arrival time, so that the current location of the target sound can be determined more accurately.
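To illustrate how arrival time and amplitude can help localise the target sound (the patent states that these parameters are used but gives no localisation formula), a minimal two-microphone sketch; the microphone spacing, speed of sound and sign convention are assumptions.

```python
import numpy as np

def tdoa_bearing(arrival_time_1, arrival_time_2, mic_spacing=0.3, speed_of_sound=343.0):
    """Estimate the bearing of the target sound from the arrival-time difference at two
    microphones a known distance apart (sketch only)."""
    tdoa = arrival_time_1 - arrival_time_2                              # seconds
    ratio = np.clip(tdoa * speed_of_sound / mic_spacing, -1.0, 1.0)     # path-length difference, clipped
    return float(np.degrees(np.arcsin(ratio)))                          # angle off the array broadside

def louder_side(amplitude_1, amplitude_2):
    """Use the sound amplitudes as a coarse cue for which microphone the target is nearer to."""
    return "mic 1" if amplitude_1 >= amplitude_2 else "mic 2"

# Example: the sound reaches mic 1 about 0.4 ms before mic 2 and is louder there
print(tdoa_bearing(0.0000, 0.0004), louder_side(0.8, 0.5))
```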
Compared with the prior art, the beneficial effects of the embodiments of the present invention are:
The invention is rationally designed. The angle at which each projector is aimed at the light-converting member is determined from the positions of all components, and the projection direction is then switched according to the environment sound, so that picture and sound are well matched, the immersive experience is better, and the range of application is wide.
Specific embodiment
The technical solution of this patent will be explained in further detail below with reference to specific embodiments.
Embodiment 1
A control method supporting a multi-person immersive experience, comprising the following specific steps:
Step 1: obtain in real time the positions of each participant, each projector, the light-converting member and the projection screen, and build a model using the KNN algorithm. The KNN algorithm is one of the simplest methods in data-mining classification: "K nearest neighbors" means the k closest neighbors, each sample can be represented by its k closest neighbors, and for anomaly detection the distances are iterated over to find the points with the fewest nearby neighbors, which are defined as anomalous points; the modeling effect is good. The light-converting member includes a reflecting component, a refracting component and a polarizing component, so that the light projected by the projectors can be reflected, refracted and polarized, meeting users' needs. Determine the angle at which each projector is aimed at the light-converting member, so as to obtain the best projected image.
Step 2: receive a first environment sound and a second environment sound, and judge whether the characteristic parameters of the first environment sound contain start information. If the first environment sound contains start information, determine the first environment sound as the target sound and start projection according to the start information. If the first environment sound does not contain start information, judge whether the characteristic parameters of the second environment sound contain start information; if they do, determine the second environment sound as the target sound and start projection according to the start information. The characteristic parameters of the first and second environment sounds include sound amplitude and sound arrival time, so the current location of the target sound can be determined more accurately. Switch the projection direction according to the current location of the target sound; a stereoscopic video in which picture and sound are matched is obtained, and multiple people can all enjoy the immersive experience.
Embodiment 2
A control method supporting a multi-person immersive experience, comprising the following specific steps:
Step 1: obtain in real time the positions of each participant, each projector, the light-converting member and the projection screen; divide the projection screen into multiple rectangular frames that are staggered and adjacently spliced to form a lattice pattern; capture images of the lattice pattern from multiple different angles with the projector and calibrate the projector from them; then apply a rotation matrix and a translation matrix for the transformation, followed by the corresponding correction; determine the angle at which each projector is aimed at the light-converting member, so as to obtain the best projected image.
Step 2: receive a first environment sound and a second environment sound, and judge whether the characteristic parameters of the first environment sound contain start information. If the first environment sound contains start information, determine the first environment sound as the target sound and start projection according to the start information. If the first environment sound does not contain start information, judge whether the characteristic parameters of the second environment sound contain start information; if they do, determine the second environment sound as the target sound and start projection according to the start information. Switch the projection direction according to the current location of the target sound, then add the special effects of the specified category, which include sound zoom, image enlargement, color change, movement, bringing to the front, adding a light beam, Jia Jiejie, glowing, flashing and animals jumping out. A stereoscopic video in which picture and sound are matched is obtained, with better fidelity and experience, and multiple people can all enjoy the immersive experience.
The above describes only preferred embodiments of the present invention and is not intended to limit the invention. Any modification, equivalent replacement or improvement made within the spirit and principles of the present invention shall be included in the protection scope of the present invention.
In addition, it should be understood that although this specification is described in terms of embodiments, not every embodiment contains only one independent technical solution. The specification is written this way merely for clarity; those skilled in the art should regard the specification as a whole, and the technical solutions in the various embodiments may be suitably combined to form other embodiments that can be understood by those skilled in the art.

Claims (7)

1. A control method supporting a multi-person immersive experience, characterized in that the specific steps are as follows: step 1: obtain in real time the positions of each participant, each projector, the light-converting member and the projection screen, build a model, and determine the angle at which each projector is aimed at the light-converting member;
step 2: receive a first environment sound and a second environment sound, and judge whether the characteristic parameters of the first environment sound contain start information; if the first environment sound contains start information, determine the first environment sound as the target sound and start projection according to the start information; if the first environment sound does not contain start information, judge whether the characteristic parameters of the second environment sound contain start information; if the second environment sound contains start information, determine the second environment sound as the target sound and start projection according to the start information; and switch the projection direction according to the current location of the target sound.
2. The control method supporting a multi-person immersive experience according to claim 1, characterized in that the light-converting member includes a reflecting component, a refracting component and a polarizing component.
3. The control method supporting a multi-person immersive experience according to claim 1 or 2, characterized in that the modeling in step 1 uses the KNN algorithm.
4. The control method supporting a multi-person immersive experience according to claim 1, characterized in that the specific steps of the modeling in step 1 are as follows: divide the projection screen into multiple rectangular frames that are staggered and adjacently spliced to form a lattice pattern, capture images of the lattice pattern from multiple different angles with the projector and calibrate the projector, then apply a rotation matrix and a translation matrix for the transformation, followed by the corresponding correction.
5. The control method supporting a multi-person immersive experience according to claim 1, characterized in that step 2 further includes adding special effects of a specified category to the stereoscopic video.
6. The control method supporting a multi-person immersive experience according to claim 5, characterized in that the special effects of the specified category include sound zoom, image enlargement, color change, movement, bringing to the front, adding a light beam, Jia Jiejie, glowing, flashing and animals jumping out.
7. The control method supporting a multi-person immersive experience according to claim 1, characterized in that the characteristic parameters of the first environment sound and of the second environment sound include sound amplitude and sound arrival time.
CN201910745565.5A 2019-08-13 2019-08-13 Control method supporting a multi-person immersive experience Pending CN110493581A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910745565.5A CN110493581A (en) 2019-08-13 2019-08-13 Control method supporting a multi-person immersive experience

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910745565.5A CN110493581A (en) 2019-08-13 2019-08-13 Control method supporting a multi-person immersive experience

Publications (1)

Publication Number Publication Date
CN110493581A (en) 2019-11-22

Family

ID=68550936

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910745565.5A Pending Control method supporting a multi-person immersive experience

Country Status (1)

Country Link
CN (1) CN110493581A (en)

Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1988674A (en) * 2005-12-21 2007-06-27 国际商业机器公司 Method and device for three-dimensional projection
CN102323713A (en) * 2010-11-09 2012-01-18 浙江工业大学 Portable 360-DEG circular-screen 3D theatre system
CN102497572A (en) * 2011-09-07 2012-06-13 航天科工仿真技术有限责任公司 Desktop type space stereoscopic imaging apparatus
CN103076714A (en) * 2013-02-06 2013-05-01 保利影业投资有限公司 L-shaped double-machine projection system
CN103327276A (en) * 2013-06-04 2013-09-25 天津朗合数字科技有限公司 Nonlinear geometric correction technology of audio and video synchronization playing system and device thereof
CN203337993U (en) * 2013-03-25 2013-12-11 上海科斗电子科技有限公司 Raster-type 3D projection system
CN104714361A (en) * 2014-12-02 2015-06-17 上海理鑫光学科技有限公司 Multi-viewpoint 3D display device
CN105827338A (en) * 2016-03-14 2016-08-03 中国人民解放军国防科学技术大学 Indoor environment content identification method based on Wi-Fi signal and mobile phone
CN106646338A (en) * 2016-12-07 2017-05-10 华南理工大学 Rapidly accurate indoor location method
CN108680174A (en) * 2018-05-10 2018-10-19 长安大学 A method of map match abnormal point is improved based on machine learning algorithm
CN109309827A (en) * 2018-10-26 2019-02-05 浙江大学 More people's apparatus for real time tracking and method for 360 ° of suspension light field three-dimensional display systems
CN109982054A (en) * 2017-12-27 2019-07-05 广景视睿科技(深圳)有限公司 A kind of projecting method based on location tracking, device, projector and optical projection system



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication
Application publication date: 20191122