CN110149505B - Different-surface perspective correction fusion system based on CAVE - Google Patents

Different-surface perspective correction fusion system based on CAVE

Info

Publication number
CN110149505B
CN110149505B (application CN201910396778.1A)
Authority
CN
China
Prior art keywords: image, unit, processing, projector, server
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201910396778.1A
Other languages
Chinese (zh)
Other versions
CN110149505A (en)
Inventor
王翔 (Wang Xiang)
师欣欣 (Shi Xinxin)
许梦捷 (Xu Mengjie)
蒋萍 (Jiang Ping)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Demagy International Exhibition Co ltd
Original Assignee
Demagy International Exhibition Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Demagy International Exhibition Co ltd filed Critical Demagy International Exhibition Co ltd
Priority to CN201910396778.1A priority Critical patent/CN110149505B/en
Publication of CN110149505A publication Critical patent/CN110149505A/en
Application granted granted Critical
Publication of CN110149505B publication Critical patent/CN110149505B/en

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 9/00: Details of colour television systems
    • H04N 9/12: Picture reproducers
    • H04N 9/31: Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
    • H04N 9/3147: Multi-projection systems
    • H04N 9/3185: Geometric adjustment, e.g. keystone or convergence

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • Geometry (AREA)
  • Projection Apparatus (AREA)
  • Controls And Circuits For Display Device (AREA)
  • Transforming Electric Information Into Light Information (AREA)

Abstract

The invention discloses a CAVE-based different-surface perspective correction fusion system in the technical field of image processing. The system comprises a server, an operation end, a plurality of processing computers, and a plurality of projectors; the processing computers and projectors are connected to each other, the server is connected to the processing computers through the Internet, and the processing computers are interconnected through synchronization cards. Each processing computer is responsible for the image information of a different projection surface and performs perspective correction on it. The operation end is connected to the server and serves as the user's command input terminal for inputting control commands governing the server's operation. The invention has the advantages that perspective-corrected images show no deformation or distortion, vertical lines in the image match the image content, the investment is small, operation is simple, deployment is easy, and practicability is strong.

Description

Different-surface perspective correction fusion system based on CAVE
Technical Field
The invention relates to the technical field of image processing, in particular to a CAVE-based different-surface perspective correction fusion system.
Background
CAVE systems are typified by the VR-Platform CAVE virtual reality display system. A CAVE is a projection-based immersive virtual reality display system characterized by high resolution, strong immersion, and good interactivity. Its principle is complex: grounded in computer graphics, it integrates high-resolution stereoscopic projection display technology, multi-channel visual synchronization technology, sound technology, sensor technology, and more, thereby generating a fully immersive virtual environment surrounded by three-dimensional stereoscopic projection images for use by multiple people.
Because the rear-projection rigid-screen structure of a multi-channel CAVE system cannot guarantee that the screens are perfectly vertical, the projection screens exhibit a certain deformation. When a projector projects directly onto the four projection screens, the image is distorted: owing to the screen deformation, the upper and lower edges on the right side of the image extend beyond the projection area, and vertical lines in the image are visibly bent.
Existing perspective correction methods fall mainly into two categories. One is the optical method, which alters the projection light path with a special projector but requires a large investment. The other is the geometric method, which performs perspective correction of the image through a mapping relationship, but is complex to operate, technically difficult, and hard to apply.
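The "mapping relationship" behind the geometric method can be sketched as a planar homography fitted to four corner correspondences. This is an illustrative reconstruction, not code from the patent; the corner coordinates below are made-up example values.

```python
# Sketch of the geometric correction method: a planar homography maps each
# source pixel to its corrected position on the deformed screen.

def solve_homography(src, dst):
    """Solve the 8 homography unknowns (h22 = 1) from 4 point pairs src -> dst."""
    A, b = [], []
    for (x, y), (u, v) in zip(src, dst):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y]); b.append(u)
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y]); b.append(v)
    # Gaussian elimination with partial pivoting on the 8x8 system A h = b.
    n = len(b)
    M = [row + [rhs] for row, rhs in zip(A, b)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            M[r] = [a - f * c for a, c in zip(M[r], M[col])]
    h = [0.0] * n
    for r in range(n - 1, -1, -1):
        h[r] = (M[r][n] - sum(M[r][c] * h[c] for c in range(r + 1, n))) / M[r][r]
    return h + [1.0]  # h00..h21, plus fixed h22

def warp_point(h, x, y):
    """Apply the homography to one pixel coordinate."""
    w = h[6] * x + h[7] * y + h[8]
    return ((h[0] * x + h[1] * y + h[2]) / w,
            (h[3] * x + h[4] * y + h[5]) / w)

# Corners of a 100x100 source image mapped onto a slightly keystoned screen
# (illustrative numbers only).
src = [(0, 0), (100, 0), (100, 100), (0, 100)]
dst = [(5, 0), (95, 8), (100, 100), (0, 92)]
H = solve_homography(src, dst)
```

In practice a library routine (for example OpenCV's `getPerspectiveTransform` and `warpPerspective`) would replace the hand-rolled solver; the point is only that four correspondences fully determine the mapping.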
Disclosure of Invention
In order to overcome the above defects in the prior art, embodiments of the present invention provide a CAVE-based non-coplanar perspective correction fusion system. A graphics processor converts the image into an editable image; a point correction unit drags control points in the editable image so that lines in the image coincide with or run parallel to the image content, thereby correcting the image. A virtual coordinate unit generates corresponding virtual-space projector coordinates from the image content and the projector's orientation, and a three-dimensional image unit generates, in real time, a stereoscopic image of the corresponding projection surface from those coordinates. The number of processing computers and graphics processors equals the number of projection surfaces, and the processing computers are linked by synchronization cards, so the system can issue instructions within a minimal response time. As a result, the perspective-corrected image shows no deformation or distortion, vertical lines in the image match the image content, the investment is small, operation is simple, deployment is easy, and practicability is strong.
In order to achieve this purpose, the invention provides the following technical scheme: a CAVE-based different-surface perspective correction fusion system comprising a server, an operation end, a plurality of processing computers, and a plurality of projectors, the processing computers and projectors being connected to each other;
the processing computers are each responsible for the image information of a different projection surface and perform perspective correction on that image information;
the operation end is connected to the server; it is the user's command input terminal, used for inputting control commands governing the server's operation and for transmitting image information requiring perspective correction into the server. The server is responsible for interactive control: it sends processing commands to the processing computers, controls their operating states, and transmits the image information;
the processing computer is internally provided with a graphics processor. A light processing unit, an image output unit, and an image conversion unit are arranged at the graphics processor's connection end; a correction unit comprising a point correction unit, a virtual coordinate unit, and a three-dimensional image unit is arranged at the output end of the image conversion unit; an image integration unit is arranged at the output end of the correction unit, and its output end is connected to the image output unit;
the light processing unit comprises a light tracking unit and a light adjusting unit. Supported by the graphics processor, the light tracking unit captures the light information within a single image and identifies the light intensity and angle in the image; the light adjusting unit, likewise supported by the graphics processor, adjusts the light intensity and angle of the image information according to the orientation and angle of the projection surface for which the processing computer is responsible, matching them to that surface;
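As a hedged illustration of what a light adjusting unit might compute (the patent does not give its algorithm), one simple model scales pixel intensity by the inverse cosine of the surface's incidence angle, driving obliquely lit walls harder, with a gain cap. The cosine falloff model and the gain cap are assumptions.

```python
import math

def adjust_intensity(pixel, surface_angle_deg, max_gain=2.0):
    """Scale a 0-255 intensity by the inverse cosine of the incidence angle,
    clamped to max_gain, so off-axis surfaces are compensated for falloff."""
    gain = min(max_gain, 1.0 / max(1e-6, math.cos(math.radians(surface_angle_deg))))
    return min(255, round(pixel * gain))

# A pixel on a wall 45 degrees off the projector axis gets roughly 1.41x gain.
```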
the image conversion unit is also connected to the light processing unit, and the image information processed by the light processing unit is passed into it. Upon receiving the image information, the image conversion unit reads it and converts the image into an editable image; the point correction unit drags control points in the editable image so that lines in the image coincide with or run parallel to the image content, correcting the image; the virtual coordinate unit generates corresponding virtual-space projector coordinates from the image content and the projector's orientation; and the three-dimensional image unit generates, in real time, a stereoscopic image of the corresponding projection surface from those coordinates;
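One plausible reading of the virtual coordinate unit, offered purely as an assumption since the patent gives no formulas, is that it places a virtual projector on each wall's normal and derives an orthonormal viewing basis for the three-dimensional image unit to render from. The wall layout and throw distance below are invented for illustration.

```python
def cross(a, b):
    """Right-handed cross product of two 3-vectors."""
    return (a[1]*b[2] - a[2]*b[1], a[2]*b[0] - a[0]*b[2], a[0]*b[1] - a[1]*b[0])

def normalize(v):
    n = sum(c * c for c in v) ** 0.5
    return tuple(c / n for c in v)

def virtual_projector(wall_center, wall_normal, throw_distance):
    """Position a virtual projector along the wall normal, facing the wall."""
    pos = tuple(c + n * throw_distance for c, n in zip(wall_center, wall_normal))
    forward = normalize(tuple(-n for n in wall_normal))   # look back at the wall
    right = normalize(cross(forward, (0.0, 1.0, 0.0)))    # assume world up = +Y
    up = cross(right, forward)
    return {"position": pos, "forward": forward, "right": right, "up": up}

# Front wall of a hypothetical 3 m CAVE: center on the plane z = -1.5, normal +Z.
front = virtual_projector((0.0, 1.5, -1.5), (0.0, 0.0, 1.0), 2.0)
```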
the image integration unit receives the stereoscopic images generated by the three-dimensional image unit, integrates the multiple image segments, merges them into complete, smooth video information, and transmits it to the image output unit;
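The patent does not specify how the segments are merged; a common technique in multi-projector fusion, shown here purely as an illustration, is a linear cross-fade across the overlap region between two adjacent channels.

```python
def blend_weight(x, overlap_start, overlap_end):
    """Weight of the left channel at column x; the right channel gets 1 - w."""
    if x <= overlap_start:
        return 1.0
    if x >= overlap_end:
        return 0.0
    return (overlap_end - x) / (overlap_end - overlap_start)

def blend_pixel(left, right, x, overlap_start, overlap_end):
    """Cross-fade one pixel of the two channels inside the overlap."""
    w = blend_weight(x, overlap_start, overlap_end)
    return round(left * w + right * (1.0 - w))

# Midway through a 20-column overlap, both channels contribute equally.
```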
the image output unit is also connected to a G-SYNC unit, which processes the video information received by the image output unit and adapts it to the projector's output, so that the number of frames displayed per second in the video and the number of frames output per second by the projector are integrated and synchronized;
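G-SYNC itself is NVIDIA display hardware; the bookkeeping it handles can still be sketched in software. The cadence function below, an illustration only and not the patent's mechanism, maps each projector refresh tick to the nearest source frame, repeating or dropping frames as needed.

```python
def schedule_frames(video_fps, projector_hz, seconds):
    """Map each projector refresh tick to a source frame index,
    repeating frames when the projector is faster than the video
    and dropping them when it is slower."""
    ticks = int(projector_hz * seconds)
    total = int(video_fps * seconds)
    return [min(total - 1, round(t * video_fps / projector_hz)) for t in range(ticks)]

# 24 fps content on a 60 Hz projector repeats frames in a 3:2-style cadence.
cadence = schedule_frames(24, 60, 1)
```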
the output end of the image output unit is connected to a projector, which projects and displays the synchronized, integrated video image information.
In a preferred embodiment, the server, the operation end, the processing computers, and the projectors are connected in a master-slave distributed network structure.
In a preferred embodiment, the graphics processor supports the DisplayPort 1.2 standard.
In a preferred embodiment, the number of processing computers and projectors is customized based on the number of display surfaces of CAVE.
In a preferred embodiment, multiple graphics processors may be provided in the processing computer; specifically, the processing computer is configured for multi-card CrossFire (multi-GPU) operation.
The invention has the technical effects and advantages that:
1. The graphics processor converts the image into an editable image; the point correction unit drags control points in the editable image so that lines in the image coincide with or run parallel to the image content, correcting the image; the virtual coordinate unit generates corresponding virtual-space projector coordinates from the image content and the projector's orientation; and the three-dimensional image unit generates, in real time, a stereoscopic image of the corresponding projection surface from those coordinates. The number of processing computers and graphics processors equals the number of projection surfaces, and the processing computers are linked by synchronization cards, so the system can issue instructions within a minimal response time. As a result, perspective-corrected images show no deformation or distortion, vertical lines in the image match the image content, the investment is small, operation is simple, deployment is easy, and practicability is strong;
2. By adopting a graphics processor supporting the DisplayPort 1.2 standard, the G-SYNC unit can operate. The G-SYNC unit processes the video information received by the image output unit and matches it to the projector's output, so that the frames displayed per second in the video and the frames output per second by the projector are integrated and synchronized, and the corrected video and image information is less prone to tearing, stuttering, and delay when output and displayed.
Drawings
FIG. 1 is a schematic diagram of the system of the present invention.
FIG. 2 is a schematic diagram of an internal system of a processing computer according to the present invention.
The reference signs are: server 1, operation end 2, processing computer 3, projector 4, graphics processor 5, light processing unit 6, image output unit 7, image conversion unit 8, correction unit 9, image integration unit 10, G-SYNC unit 11.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Example 1
As shown in figs. 1-2, the CAVE-based non-coplanar perspective correction fusion system includes a server 1, an operation end 2, a plurality of processing computers 3, and a plurality of projectors 4. The processing computers 3 and projectors 4 are connected to each other; the server 1 is connected to the processing computers 3 through the Internet, and the processing computers 3 are interconnected through synchronization cards;
the processing computers 3 are each responsible for the image information of a different projection surface and perform perspective correction on that image information;
the operation end 2 is connected to the server 1; it is the user's command input terminal, used for inputting control commands governing the operation of the server 1 and for transmitting image information requiring perspective correction into the server 1. The server 1 is responsible for interactive control: it sends processing commands to the processing computers 3, controls their operating states, and transmits the image information;
a graphic processor 5 is arranged in the processing computer 3, a light processing unit 6, an image output unit 7 and an image conversion unit 8 are arranged at the connecting end of the graphic processor 5, a correction unit 9 is arranged at the output end of the image conversion unit 8, the correction unit 9 comprises a point correction unit, a virtual coordinate unit and a three-dimensional image unit, an image integration unit 10 is arranged at the output end of the correction unit 9, and the output end of the image integration unit 10 is connected with the image output unit 7;
the light processing unit 6 comprises a light tracking unit and a light adjusting unit. Supported by the graphics processor 5, the light tracking unit captures the light information within a single image and identifies the light intensity and angle in the image; the light adjusting unit, likewise supported by the graphics processor 5, adjusts the light intensity and angle of the image information according to the orientation and angle of the projection surface for which the processing computer 3 is responsible, matching them to that surface;
the image conversion unit 8 is also connected to the light processing unit 6, and the image information processed by the light processing unit 6 is passed into it. Upon receiving the image information, the image conversion unit 8 reads it and converts the image into an editable image; the point correction unit drags control points in the editable image so that lines in the image coincide with or run parallel to the image content, correcting the image; the virtual coordinate unit generates corresponding virtual-space projector coordinates from the image content and the orientation of the projector 4; and the three-dimensional image unit generates, in real time, a stereoscopic image of the corresponding projection surface from the projector 4 coordinates;
the image integration unit 10 receives the three-dimensional images generated by the three-dimensional image unit, integrates a plurality of sections of three-dimensional images, converges the three-dimensional images into complete and smooth video information, and transmits the video information to the image output unit 7;
the image output unit 7 is further connected with a G-SYNC unit 11, and the G-SYNC unit 11 processes the video information received by the image output unit 7, adapts to the output of the projector 4, and integrates and synchronizes the frame number displayed per second in the video and the frame number output per second by the projector 4;
the output end of the image output unit 7 is connected with the projector 4, and the projector 4 performs projection display on the video image information after synchronous integration;
in the above CAVE-based different-plane perspective correction fusion system, the connection structure of the server 1, the operation end 2, the processing computer 3, and the projector 4 is a master-slave distributed network structure, and the graphics processor 5 supports the DisplayPort 1.2 standard.
The implementation is specifically as follows. In operation, the user controls the server 1 through the operation end 2 and transmits the image information requiring perspective correction into the server 1. The server 1 performs interactive control and directs the processing computers 3 to work. A processing computer 3 receives the image information to be corrected; supported by the graphics processor 5, the light tracking unit in the processing computer 3 captures the light information within a single image and identifies the light intensity and angle in the image, and the light adjusting unit adjusts the light intensity and angle of the image information according to the orientation and angle of the projection surface for which the processing computer 3 is responsible, matching them to that surface. The image information processed by the light processing unit 6 is passed into the image conversion unit 8, which reads it and converts the image into an editable image; the point correction unit drags control points in the editable image so that lines in the image coincide with or run parallel to the image content, correcting the image; the virtual coordinate unit generates corresponding virtual-space projector coordinates from the image content and the orientation of the projector 4; and the three-dimensional image unit generates, in real time, a stereoscopic image of the corresponding projection surface from the projector 4 coordinates. The image integration unit 10 receives the stereoscopic images generated by the three-dimensional image unit, integrates the multiple image segments into complete, smooth video information, and transmits it to the image output unit 7. The G-SYNC unit 11 processes the video information received by the image output unit 7 and adapts it to the output of the projector 4, so that the frames displayed per second in the video and the frames output per second by the projector 4 are integrated and synchronized; the projector 4 then completes the projection display.
Example 2
As shown in figs. 1-2, in the CAVE-based non-coplanar perspective correction fusion system, the number of processing computers 3 and projectors 4 is customized according to the number of CAVE display surfaces;
when display surfaces are added to the CAVE system, processing computers 3 and projectors 4 are added correspondingly and synchronized with it: a newly added processing computer 3 is connected to the previous processing computer 3 through a synchronization card and to the server 1 through the Internet, so that every display surface in the CAVE system has an independent non-coplanar image perspective correction module and no single picture tears or deforms during display;
the processing computers 3 are respectively responsible for the image information in different projection surfaces and finish perspective correction of the image information in the different projection surfaces.
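A hypothetical provisioning sketch for this example (the hostnames and the `Channel` record are invented for illustration, not taken from the patent): one processing computer and one projector per display surface, extended simply by provisioning more channels as surfaces are added.

```python
from dataclasses import dataclass

@dataclass
class Channel:
    surface: str    # which CAVE display surface this channel drives
    computer: str   # processing computer responsible for the surface
    projector: str  # projector attached to that computer

def provision(surfaces):
    """Allocate one channel (computer + projector) per display surface."""
    return [Channel(s, f"node-{i:02d}", f"proj-{i:02d}")
            for i, s in enumerate(surfaces)]

# A four-wall CAVE; a fifth surface would simply extend the list.
channels = provision(["front", "left", "right", "floor"])
```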
Example 3
As shown in figs. 1-2, multiple graphics processors 5 may be provided in the processing computer 3; specifically, it is configured for multi-card CrossFire operation.
The number of graphics processors 5 is increased through the expansion interfaces on the mainboard of the processing computer 3, and multi-card CrossFire is performed, so that the image conversion unit 8, the light processing unit 6, and the correction unit 9 obtain greater graphics-performance support in operation and non-coplanar perspective correction of the image completes faster. The image conversion unit 8 reads the image information after receiving it and converts the image into an editable image; the point correction unit drags control points in the editable image so that lines in the image coincide with or run parallel to the image content, correcting the image; the virtual coordinate unit generates corresponding virtual-space projector coordinates from the image content and the orientation of the projector 4; and the three-dimensional image unit generates, in real time, a stereoscopic image of the corresponding projection surface from the coordinates of the projector 4.
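Example 3's multi-GPU idea can be sketched, under the assumption of tile-parallel correction (the patent does not describe how work is split across cards), as round-robin assignment of frame tiles to the installed graphics processors.

```python
def assign_tiles(num_tiles, num_gpus):
    """Round-robin work split: tile i is corrected by GPU (i mod num_gpus)."""
    buckets = [[] for _ in range(num_gpus)]
    for i in range(num_tiles):
        buckets[i % num_gpus].append(i)
    return buckets

# 8 frame tiles alternated across 2 GPUs, CrossFire-style.
work = assign_tiles(8, 2)
```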
And finally: the above description is only for the purpose of illustrating the preferred embodiments of the present invention and is not to be construed as limiting the invention, and any modifications, equivalents, improvements and the like that are within the spirit and principle of the present invention are intended to be included in the scope of the present invention.

Claims (5)

1. A CAVE-based different-surface perspective correction fusion system, characterized by comprising a server (1), an operation end (2), a plurality of processing computers (3) and a plurality of projectors (4), wherein the processing computers (3) and the projectors (4) are connected to each other, the server (1) is connected to the processing computers (3) through the Internet, and the processing computers (3) are interconnected through synchronization cards;
the processing computers (3) are respectively responsible for the image information in different projection surfaces to finish perspective correction of the image information in the different projection surfaces;
the operation end (2) is connected with the server (1), the operation end (2) is a command input end of a user and is used for inputting a control command to the operation of the server (1) and transmitting image information needing perspective correction to the inside of the server (1), the server (1) is responsible for interactive control, sending a processing command to the processing computers (3), controlling the operation states of the processing computers (3) and transmitting the image information;
the image processing system is characterized in that a graphic processor (5) is arranged in the processing computer (3), a light processing unit (6), an image output unit (7) and an image conversion unit (8) are arranged at the connecting end of the graphic processor (5), a correction unit (9) is arranged at the output end of the image conversion unit (8), the correction unit (9) comprises a point correction unit, a virtual coordinate unit and a three-dimensional image unit, an image integration unit (10) is arranged at the output end of the correction unit (9), and the output end of the image integration unit (10) is connected with the image output unit (7);
the light processing unit (6) comprises a light tracking unit and a light adjusting unit, the light tracking unit captures light information in single image content based on performance support provided by the graphic processor (5), and identifies light intensity and angle in the image, the light adjusting unit adjusts the light intensity and angle in the image information based on the performance support provided by the graphic processor (5) according to the direction and angle of the projection surface responsible by the processing computer (3), and the light intensity and angle are matched with the direction and angle of the projection surface;
the image conversion unit (8) is further connected with the light ray processing unit (6), the image information processed by the light ray processing unit (6) is transmitted into the image conversion unit (8), the image conversion unit (8) reads the image information after receiving the image information, converts the image into an editable image, pulls a control point in the editable image by using the point correction unit to enable a line in the image to be coincident or parallel to the image content, corrects the image, the virtual coordinate unit generates a corresponding virtual space projector coordinate according to the content of the image and the direction of the projector (4), and the three-dimensional image unit generates a three-dimensional image of a corresponding projection surface in real time according to the projector (4) coordinate;
the image integration unit (10) receives the three-dimensional images generated by the three-dimensional image unit, integrates multiple sections of three-dimensional images, converges the three-dimensional images into complete and smooth video information, and transmits the video information to the image output unit (7);
the image output unit (7) is also connected with a G-SYNC unit (11), the G-SYNC unit (11) processes the video information received by the image output unit (7) and adapts to the output of the projector (4), so that the frame number displayed per second in the video and the frame number output per second by the projector (4) are integrated and synchronized;
the output end of the image output unit (7) is connected with the projector (4), and the projector (4) performs projection display on the synchronously integrated video image information.
2. The CAVE-based different-surface perspective correction fusion system of claim 1, wherein: the server (1), the operation end (2), the processing computers (3), and the projectors (4) are connected in a master-slave distributed network structure.
3. The CAVE-based different-surface perspective correction fusion system of claim 1, wherein: the graphics processor (5) supports the DisplayPort 1.2 standard.
4. The CAVE-based different-surface perspective correction fusion system of claim 1, wherein: the number of processing computers (3) and projectors (4) is customized according to the number of CAVE display surfaces.
5. The CAVE-based different-surface perspective correction fusion system of claim 1, wherein: multiple graphics processors (5) may be provided in the processing computer (3); specifically, they are configured for multi-card CrossFire operation.
CN201910396778.1A 2019-05-14 2019-05-14 Different-surface perspective correction fusion system based on CAVE Active CN110149505B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910396778.1A CN110149505B (en) 2019-05-14 2019-05-14 Different-surface perspective correction fusion system based on CAVE


Publications (2)

Publication Number Publication Date
CN110149505A CN110149505A (en) 2019-08-20
CN110149505B true CN110149505B (en) 2021-01-08

Family

ID=67595273

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910396778.1A Active CN110149505B (en) 2019-05-14 2019-05-14 Different-surface perspective correction fusion system based on CAVE

Country Status (1)

Country Link
CN (1) CN110149505B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111901578B (en) * 2020-06-28 2021-10-22 成都威爱新经济技术研究院有限公司 Multi-channel cave type projection method

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CA2573157C (en) * 2004-07-08 2014-08-26 Imax Corporation Equipment and methods for the display of high resolution images using multiple projection displays
US8267531B2 (en) * 2010-03-26 2012-09-18 Seiko Epson Corporation Correcting row and column shifts in a calibration matrix for a projection system
CN104767988A (en) * 2014-01-07 2015-07-08 四川爱特尔科技有限公司 Stereoscopic image display system
CN107635120B (en) * 2017-09-19 2019-05-07 南京乐飞航空技术有限公司 A kind of method of multiple channel ball curtain Geometry rectification and Fusion Edges



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant