CN111083462A - Stereo rendering method based on double viewpoints - Google Patents

Stereo rendering method based on double viewpoints

Info

Publication number
CN111083462A
Authority
CN
China
Prior art keywords
data
virtual
acquisition systems
actual
tracker
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201911406078.2A
Other languages
Chinese (zh)
Inventor
Zhang Zijing (张子敬)
Li Shuai (李帅)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Zhenjing Technology Co Ltd
Original Assignee
Beijing Zhenjing Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Zhenjing Technology Co Ltd filed Critical Beijing Zhenjing Technology Co Ltd
Priority to CN201911406078.2A
Publication of CN111083462A
Legal status: Pending

Classifications

    • H ELECTRICITY
      • H04 ELECTRIC COMMUNICATION TECHNIQUE
        • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
          • H04N13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
            • H04N13/10 Processing, recording or transmission of stereoscopic or multi-view image signals
              • H04N13/106 Processing image signals
                • H04N13/122 Improving the 3D impression of stereoscopic images by modifying image signal contents, e.g. by filtering or adding monoscopic depth cues
            • H04N13/20 Image signal generators
              • H04N13/204 Image signal generators using stereoscopic image cameras
            • H04N13/30 Image reproducers
              • H04N13/349 Multi-view displays for displaying three or more geometrical viewpoints without viewer tracking
                • H04N13/351 Multi-view displays for displaying three or more geometrical viewpoints without viewer tracking, for displaying simultaneously
              • H04N13/363 Image reproducers using image projection screens

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The invention provides a stereoscopic rendering method based on double viewpoints. One or more groups of ART acquisition systems are set up according to the actual environment of the equipment site, with their acquisition ranges made complementary; the movement area of the actual tracker of each group of ART acquisition systems is determined, and an actual coordinate system is created for the data acquired by each group. Each group of ART acquisition systems transmits its acquired data to a virtual scene establishing system, which establishes a corresponding virtual coordinate system and a corresponding virtual tracker from the actual coordinate system of each group's data and distributes the data of each virtual scene to a corresponding projector. The data is then converted by the corresponding projector and output to the dual-viewpoint display device, which performs multi-viewpoint display.

Description

Stereo rendering method based on double viewpoints
Technical Field
The invention relates to the technical field of three-dimensional rendering, in particular to a three-dimensional rendering method based on double viewpoints.
Background
At present, an ART tracking system can convert only one tracking node into viewpoint information and output it to a display screen; viewpoint information collected by several ART tracking systems cannot be displayed on a single screen.
Disclosure of Invention
The object of the present invention is to overcome at least one of the technical drawbacks described above.
To that end, the invention provides a stereoscopic rendering method based on double viewpoints that converts two or more tracking nodes into two or more sets of tracking-viewpoint information and applies them to the same virtual scene, so that two or more pictures are rendered simultaneously and different content is seen on one screen at two or more different display frequencies.
In order to achieve the above object, the present invention provides a stereoscopic rendering method based on two viewpoints, comprising the steps of:
In step S1, one or more groups of ART acquisition systems are set up according to the actual environment of the equipment site, with their acquisition ranges made complementary; the movement area of the actual tracker of each group of ART acquisition systems is determined, and an actual coordinate system is established for the data acquired by each group.
In step S2, each group of ART acquisition systems transmits its acquired data to a virtual scene establishing system. The virtual scene establishing system establishes a corresponding virtual coordinate system from the actual coordinate system of each group's data, and also creates a virtual tracker corresponding to each group of ART acquisition systems; each virtual tracker receives the actual-tracker data from its corresponding ART acquisition system, processes it, and supplies it to the engine.
In step S3, the virtual scene establishing system establishes several groups of virtual scenes from the data acquired by the actual trackers of the ART acquisition systems and distributes the data of each virtual scene to the corresponding projector; the projector then converts the data and outputs it to the dual-viewpoint display device.
In step S4, the dual-viewpoint display device receives the data output by all the projectors and performs multi-viewpoint display.
In any of the above aspects, preferably, the ART acquisition system comprises one or more cameras and an ART server. The cameras are installed according to the site range of the actual environment of the equipment site, their acquisition ranges are complementary, and they transmit the acquired data to the ART server.
In any of the above schemes, preferably, the ART server acquires data from the actual tracker hardware and transmits the actual tracker's data to the virtual scene establishing system.
In any of the above schemes, preferably, the ART server is also responsible for configuring, calibrating, monitoring, and running the actual tracker, and synchronously sends the actual tracker's data to hosts in the same local area network so that third-party software can use it.
In any of the above schemes, preferably, the virtual tracker receives actual-tracker data from its corresponding ART acquisition system, performs internal processing on it, retains the resulting effective data, and transmits the effective data to the corresponding projector.
In any of the above schemes, preferably, the projector renders an image from the motion trajectory acquired by its corresponding virtual tracker and then transmits the image to the dual-viewpoint display device.
In any of the above schemes, preferably, when the number of ART acquisition systems is 2, the 2 groups of ART acquisition systems respectively transmit their acquired data to the virtual scene establishing system; the virtual scene establishing system establishes a virtual coordinate system corresponding to the 2 groups and 2 corresponding virtual trackers, and the 2 virtual trackers respectively receive the actual-tracker data of the 2 groups, process it, and supply it to the engine.
In any of the above schemes, preferably, the 2 virtual trackers are further bound to 2 projectors, respectively, and send their processed data to the projectors to which they are bound; the 2 projectors render images from the motion trajectories acquired by their corresponding virtual trackers and transmit the images to the dual-viewpoint display device.
In any of the above schemes, preferably, the dual-viewpoint display device outputs the motion-trajectory rendering images from the 2 projectors at different display frequencies.
In any of the above schemes, preferably, the dual-viewpoint display device outputs the rendering image from one projector in the 0-120 Hz display-frequency band and the rendering image from the other projector in the 120-240 Hz band.
The stereoscopic rendering method based on double viewpoints has the following beneficial effects:
1. The invention converts two or more tracking nodes into two or more sets of tracking-viewpoint information and applies them to the same virtual scene, so that two or more pictures are rendered simultaneously and different content is seen on one screen at two or more different display frequencies.
2. Making the acquisition ranges of the ART acquisition systems complementary helps capture data covering the complete actual environment of the equipment site, which in turn helps establish a complete actual coordinate system and enhances the rendering effect.
Additional aspects and advantages of the invention will be set forth in part in the description which follows and, in part, will be obvious from the description, or may be learned by practice of the invention.
Drawings
The above and/or additional aspects and advantages of the present invention will become apparent and readily appreciated from the following description of the embodiments, taken in conjunction with the accompanying drawings of which:
FIG. 1 is a flow chart of the present invention;
FIG. 2 is a block diagram of the architecture of the present invention;
FIG. 3 is a block diagram of the structure of the ART acquisition system of the present invention;
FIG. 4 is a schematic view of the invention when the number of ART acquisition systems is 2.
Detailed Description
Reference will now be made in detail to embodiments of the present invention, examples of which are illustrated in the accompanying drawings, wherein like or similar reference numerals refer to the same or similar elements or elements having the same or similar function throughout. The embodiments described below with reference to the drawings are illustrative and intended to be illustrative of the invention and are not to be construed as limiting the invention.
Example 1
The invention provides a stereo rendering method based on double viewpoints, as shown in FIGS. 1-3, comprising the following steps:
In step S1, one or more groups of ART acquisition systems are set up according to the actual environment of the equipment site, with their acquisition ranges made complementary; the movement area of the actual tracker of each group of ART acquisition systems is determined, and an actual coordinate system is established for the data acquired by each group.
as shown in fig. 3, the ART acquisition system 1 includes 1 or more cameras 12, an ART server 11; the 1 or more cameras 12 are installed according to the field range of the actual environment of the field of the equipment, the acquisition ranges of the 1 or more cameras 12 are complementary, and the 1 or more cameras 12 transmit the acquired data to the ART server 11.
The ART server acquires data from the actual tracker hardware and transmits the actual tracker's data to the virtual scene establishing system. The ART server is also responsible for configuring, calibrating, monitoring, and running the actual tracker, and synchronously sends the actual tracker's data to hosts in the same local area network so that third-party software can use it.
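The patent does not specify how a virtual coordinate system is derived from each actual coordinate system. One common choice, assumed here purely for illustration, is a per-group similarity transform (scale, rotation, translation) calibrated once per ART acquisition system; the function names and parameter values below are hypothetical.

```python
import math

# Illustrative only: the patent does not state the mapping. This assumes
# each ART group gets one calibrated similarity transform from actual
# tracker coordinates (metres) into virtual scene units.

def make_actual_to_virtual(scale, yaw_deg, offset):
    """Return a function mapping an actual (x, y) point to virtual units."""
    yaw = math.radians(yaw_deg)
    cos_y, sin_y = math.cos(yaw), math.sin(yaw)
    ox, oy = offset

    def transform(x, y):
        # Rotate, scale, then translate into the virtual coordinate system.
        vx = scale * (x * cos_y - y * sin_y) + ox
        vy = scale * (x * sin_y + y * cos_y) + oy
        return (vx, vy)

    return transform

# One transform per group of ART acquisition systems.
to_virtual_a = make_actual_to_virtual(scale=2.0, yaw_deg=0.0, offset=(10.0, 0.0))
print(to_virtual_a(1.0, 1.0))  # -> (12.0, 2.0)
```

A second group would get its own `make_actual_to_virtual(...)` with different calibration values, which is how complementary acquisition ranges can land in non-overlapping regions of the same virtual scene.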
Step S2, each group of ART acquisition systems transmits the acquired data to a virtual scene establishing system, the virtual scene establishing system establishes a corresponding virtual coordinate system according to the actual coordinate system of the data acquired by each group of ART acquisition systems, the virtual scene establishing system also establishes a virtual tracker corresponding to each group of ART acquisition systems, and the virtual tracker receives the actual tracker data from the corresponding ART acquisition systems, processes the data and uses the data for an engine;
In step S3, the virtual scene establishing system establishes several groups of virtual scenes from the data acquired by the actual trackers of the ART acquisition systems and distributes the data of each virtual scene to the corresponding projector. The data is then converted by the corresponding projector, which renders an image from the motion trajectory acquired by its corresponding virtual tracker and transmits the image to the dual-viewpoint display device. The virtual tracker receives actual-tracker data from its corresponding ART acquisition system, performs internal processing, retains the resulting effective data, and transmits the effective data to the corresponding projector.
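The "internal processing" that yields effective data is not detailed in the patent. The sketch below assumes it means keeping only samples that the ART server flagged as tracked and whose position lies inside the movement area determined in step S1; the sample layout and the `filter_effective` name are illustrative assumptions.

```python
# Assumed processing step: a sample is (tracked_flag, x, y); it is kept
# only when tracked and inside the movement area from step S1.

def filter_effective(samples, area):
    """Keep tracked samples whose (x, y) falls inside the movement area."""
    (xmin, ymin), (xmax, ymax) = area
    return [
        (x, y)
        for tracked, x, y in samples
        if tracked and xmin <= x <= xmax and ymin <= y <= ymax
    ]

raw = [(True, 1.0, 1.0), (False, 2.0, 2.0), (True, 9.0, 9.0)]
effective = filter_effective(raw, area=((0.0, 0.0), (5.0, 5.0)))
print(effective)  # -> [(1.0, 1.0)]
```

Only the surviving samples would be forwarded to the bound projector, so occlusion dropouts and out-of-area readings never reach the renderer.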
In step S4, the dual-viewpoint display device receives the data output by all the projectors and performs multi-viewpoint display.
In the virtual scene establishing system of the invention, each virtual tracker receives real-time data from the actual tracker of its corresponding ART acquisition system, processes it, and outputs it to the corresponding projector, which in turn outputs it to the dual-viewpoint display device; this greatly reduces the data processing time required of the dual-viewpoint display device.
Example 2
Example 2 differs from Example 1 in the following: when the number of ART acquisition systems is 2, the 2 groups of ART acquisition systems respectively transmit their acquired data to the virtual scene establishing system; using TMAX3D software, the virtual scene establishing system establishes a virtual coordinate system corresponding to the 2 groups and 2 corresponding virtual trackers, and the 2 virtual trackers respectively receive the actual-tracker data of the 2 groups, process it, and supply it to the engine.
The 2 virtual trackers are also bound to 2 projectors, respectively, and send their processed data to the projectors to which they are bound; the 2 projectors render images from the motion trajectories acquired by their corresponding virtual trackers and transmit the images to the dual-viewpoint display device. The dual-viewpoint display device outputs the motion-trajectory rendering images from the 2 projectors at different display frequencies.
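The one-to-one binding above can be pictured as each virtual tracker holding a fixed reference to its projector and forwarding every processed pose only there. A minimal sketch, with class names that are illustrative rather than taken from the patent:

```python
# Hypothetical model of Example 2's tracker-to-projector binding.

class Projector:
    def __init__(self, name):
        self.name = name
        self.frames = []

    def render(self, pose):
        # Render an image from the tracker's motion trajectory (stubbed
        # here as a string so the dataflow is visible).
        self.frames.append(f"{self.name}@{pose}")


class VirtualTracker:
    def __init__(self, projector):
        self.projector = projector  # fixed binding, as in Example 2

    def on_sample(self, pose):
        # Processed data goes only to the bound projector.
        self.projector.render(pose)


projectors = [Projector("proj1"), Projector("proj2")]
trackers = [VirtualTracker(p) for p in projectors]
trackers[0].on_sample((0, 0))
trackers[1].on_sample((3, 1))
```

Because the binding is fixed, each projector's frame stream depends on exactly one tracked viewpoint, which is what lets the display device keep the two streams separate.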
Example 3
Example 3 differs from Example 2 in the following: the dual-viewpoint display device outputs the rendering image from one projector in the 0-120 Hz display-frequency band and the rendering image from the other projector in the 120-240 Hz band.
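The patent states only that the two images occupy the 0-120 Hz and 120-240 Hz display-frequency bands. One plausible realization, assumed here and not stated in the patent, is a 240 Hz panel that interleaves the two projectors' frames so each viewpoint receives an effective 120 Hz stream:

```python
# Assumed realization of the frequency split: a 240 Hz panel alternates
# frames between the two projectors, 120 Hz effective per viewpoint.

PANEL_HZ = 240

def source_for_frame(n):
    """Even panel frames show projector 1; odd frames show projector 2."""
    return "projector1" if n % 2 == 0 else "projector2"

schedule = [source_for_frame(n) for n in range(4)]
print(schedule)  # -> ['projector1', 'projector2', 'projector1', 'projector2']

# Each viewpoint is refreshed PANEL_HZ / 2 = 120 times per second.
```

Paired with shutter glasses or similar per-viewer filtering synchronized to the even/odd frames, each viewer would see only their own projector's content on the shared screen.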
The invention converts two or more tracking nodes into two or more sets of tracking-viewpoint information and applies them to the same virtual scene, so that two or more pictures are rendered simultaneously and different content is seen on one screen at two or more different display frequencies.
Although embodiments of the present invention have been shown and described above, it is understood that the above embodiments are exemplary and should not be construed as limiting the present invention, and that variations, modifications, substitutions and alterations can be made in the above embodiments by those of ordinary skill in the art without departing from the principle and spirit of the present invention. The scope of the invention is defined by the appended claims and equivalents thereof.

Claims (10)

1. A stereoscopic rendering method based on double viewpoints, characterized by comprising the following steps:
step S1, setting up one or more groups of ART acquisition systems according to the actual environment of the equipment site, with their acquisition ranges made complementary, determining the movement area of the actual tracker of each group of ART acquisition systems, and establishing an actual coordinate system for the data acquired by each group;
step S2, each group of ART acquisition systems transmitting its acquired data to a virtual scene establishing system, the virtual scene establishing system establishing a corresponding virtual coordinate system from the actual coordinate system of each group's data and also creating a virtual tracker corresponding to each group of ART acquisition systems, each virtual tracker receiving the actual-tracker data from its corresponding ART acquisition system, processing it, and supplying it to the engine;
step S3, the virtual scene establishing system establishing several groups of virtual scenes from the data acquired by the actual trackers of the ART acquisition systems and distributing the data of each virtual scene to the corresponding projector, the projector then converting the data and outputting it to the dual-viewpoint display device;
step S4, the dual-viewpoint display device receiving the data output by all the projectors and performing multi-viewpoint display.
2. The stereoscopic rendering method based on double viewpoints according to claim 1, characterized in that the ART acquisition system comprises one or more cameras and an ART server; the cameras are installed according to the site range of the actual environment of the equipment site, their acquisition ranges are complementary, and they transmit the acquired data to the ART server.
3. The stereoscopic rendering method based on double viewpoints according to claim 2, characterized in that the ART server acquires data from the actual tracker hardware and transmits the actual tracker's data to the virtual scene establishing system.
4. The stereoscopic rendering method based on double viewpoints according to claim 2 or 3, characterized in that the ART server is also responsible for configuring, calibrating, monitoring, and running the actual tracker, and synchronously sends the actual tracker's data to hosts in the same local area network so that third-party software can use it.
5. The stereoscopic rendering method based on double viewpoints according to claim 1, characterized in that the virtual tracker receives actual-tracker data from its corresponding ART acquisition system, performs internal processing on it, retains the resulting effective data, and transmits the effective data to the corresponding projector.
6. The stereoscopic rendering method based on double viewpoints according to claim 5, characterized in that the projector renders an image from the motion trajectory acquired by its corresponding virtual tracker and then transmits the image to the dual-viewpoint display device.
7. The stereoscopic rendering method based on double viewpoints according to claim 1, characterized in that when the number of ART acquisition systems is 2, the 2 groups of ART acquisition systems respectively transmit their acquired data to the virtual scene establishing system; the virtual scene establishing system establishes a virtual coordinate system corresponding to the 2 groups and 2 corresponding virtual trackers, and the 2 virtual trackers respectively receive the actual-tracker data of the 2 groups, process it, and supply it to the engine.
8. The stereoscopic rendering method based on double viewpoints according to claim 7, characterized in that the 2 virtual trackers are further bound to 2 projectors, respectively, and send their processed data to the projectors to which they are bound; the 2 projectors render images from the motion trajectories acquired by their corresponding virtual trackers and transmit the images to the dual-viewpoint display device.
9. The stereoscopic rendering method based on double viewpoints according to claim 8, characterized in that the dual-viewpoint display device outputs the motion-trajectory rendering images from the 2 projectors at different display frequencies.
10. The stereoscopic rendering method based on double viewpoints according to claim 8, characterized in that the dual-viewpoint display device outputs the rendering image from one projector in the 0-120 Hz display-frequency band and the rendering image from the other projector in the 120-240 Hz band.
CN201911406078.2A 2019-12-31 2019-12-31 Stereo rendering method based on double viewpoints Pending CN111083462A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911406078.2A CN111083462A (en) 2019-12-31 2019-12-31 Stereo rendering method based on double viewpoints


Publications (1)

Publication Number Publication Date
CN111083462A true CN111083462A (en) 2020-04-28

Family

ID=70320351

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911406078.2A Pending CN111083462A (en) 2019-12-31 2019-12-31 Stereo rendering method based on double viewpoints

Country Status (1)

Country Link
CN (1) CN111083462A (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101253538A * 2005-07-01 2008-08-27 Sony Pictures Entertainment Inc. Mobile motion capture cameras
CN102822869A * 2010-01-22 2012-12-12 Sony Computer Entertainment America LLC Capturing views and movements of actors performing within generated scenes
CN104331901A * 2014-11-26 2015-02-04 Beijing University of Posts and Telecommunications TLD-based multi-view target tracking device and method
CN104394400A * 2014-12-09 2015-03-04 Shandong University Virtual simulation system and method of antagonistic event with net based on three-dimensional multi-image display
CN105427338A * 2015-11-02 2016-03-23 Zhejiang Uniview Technologies Co., Ltd. Moving object tracking method and device



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20200428