CN111176593A - Projection method and system for extended picture - Google Patents

Projection method and system for extended picture

Info

Publication number
CN111176593A
Authority
CN
China
Prior art keywords
azimuth
virtual
picture
angle
virtual scene
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201811333530.2A
Other languages
Chinese (zh)
Inventor
王珏
王琦琛
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai Yunshen Intelligent Technology Co ltd
Original Assignee
Shanghai Yunshen Intelligent Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanghai Yunshen Intelligent Technology Co ltd filed Critical Shanghai Yunshen Intelligent Technology Co ltd
Priority to CN201811333530.2A priority Critical patent/CN111176593A/en
Publication of CN111176593A publication Critical patent/CN111176593A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/14Digital output to display device ; Cooperation and interconnection of the display device with other functional units
    • G06F3/1454Digital output to display device ; Cooperation and interconnection of the display device with other functional units involving copying of the display data of a local workstation or window to a remote workstation or window so that an actual copy of the data is displayed simultaneously on two or more displays, e.g. teledisplay

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The invention provides a projection method and system for extended pictures, wherein the method comprises the following steps: acquiring position reference information corresponding to a viewing position; calculating an azimuth viewing angle in at least one azimuth by combining the position reference information; and generating a virtual extension picture corresponding to the virtual scene according to the azimuth viewing angle in each azimuth, wherein the virtual scene is formed by simulating a real scene and each virtual object in the virtual scene is at a 1:1 ratio to the corresponding real object in the real scene; and projecting the virtual extension picture on the projection interface. The invention can adjust the viewing angle of the extended picture of the virtual scene according to the position of the viewer, so that the projected picture matches the viewer's viewing angle and the sense of reality of the extended picture is improved.

Description

Projection method and system for extended picture
Technical Field
The present invention relates to the field of projection technologies, and in particular, to a method and a system for projecting an extended picture.
Background
The extended picture is an image with a visual extension effect, widely used in artworks and in merchants' promotional images for products; viewing an extended picture gives the human eye an immersive feeling.
For example, an extended picture of a road can make the road appear to stretch endlessly, and in an extended picture of a gallery the end of the gallery cannot be seen. In this way a stereoscopic, realistic experience is achieved.
Some merchants make their product images into extended pictures for projection display. In the prior art, however, many merchants only make an extended picture with a single viewing angle: when a user directly faces the extended picture it is experienced well, but when the user views it from the side, the viewing angle no longer matches and the stereoscopic impression and sense of reality brought to the user are much weaker.
In order to bring better viewing experience of the extended picture to users, the invention provides a projection method and system of the extended picture.
Disclosure of Invention
The invention aims to provide a method and a system for projecting an extended picture, which can adjust the visual angle of the extended picture of a virtual scene according to the position of a viewer, so that the projected picture conforms to the visual angle of the viewer, and the reality of the extended picture is improved.
The technical scheme provided by the invention is as follows:
the invention provides a projection method for an extended picture, which comprises the following steps: acquiring position reference information corresponding to a viewing position; calculating an azimuth viewing angle in at least one azimuth by combining the position reference information; generating a virtual extension picture corresponding to the virtual scene according to the azimuth viewing angle in each azimuth, wherein the virtual scene is formed by simulating a real scene, and each virtual object in the virtual scene is at a 1:1 ratio to the corresponding real object in the real scene; and projecting the virtual extension picture on the projection interface.
Preferably, the generating of the virtual extension picture corresponding to the virtual scene according to the azimuth viewing angle in each azimuth specifically includes: intercepting the corresponding virtual extension picture at the same viewing position in the virtual scene according to the azimuth viewing angle in each azimuth; and making the virtual extension pictures corresponding to the azimuth viewing angles into a virtual scene video, and projecting the virtual extension picture in the virtual scene video on the projection interface, so that the virtual extension picture of the virtual scene is displayed in 360 degrees.
Preferably, the generating a virtual extension picture corresponding to the virtual scene according to the azimuth viewing angle in each azimuth specifically includes: and generating a virtual extended picture according to the virtual pictures in a plurality of directions.
Preferably, the generating a virtual screen corresponding to each azimuth according to the azimuth viewing angle of each azimuth specifically includes:
when, among the plurality of azimuth viewing angles, the azimuth viewing angles of the front and rear azimuths are equal and the azimuth viewing angles of the left and right azimuths are not equal, virtual pictures corresponding to the left azimuth viewing angle and/or the right azimuth viewing angle are cut out of the virtual scene;
and/or;
and calculating cutting areas corresponding to the front view angle and/or the rear view angle and/or the upper view angle and/or the lower view angle respectively, and cutting out corresponding virtual pictures in the virtual scene according to the cutting areas and the view angles corresponding to the cutting areas.
Preferably, the generating a virtual screen corresponding to each azimuth according to the azimuth viewing angle of each azimuth specifically includes:
when the azimuth angles of the front azimuth and the rear azimuth in the azimuth angles are not equal, and the azimuth angles of the left azimuth and the right azimuth are equal, virtual pictures corresponding to the front azimuth angle and/or the rear azimuth angle are cut in the virtual scene;
and/or;
and calculating cutting areas corresponding to the left visual angle and/or the right visual angle and/or the upper visual angle and/or the lower visual angle respectively, and cutting out a corresponding virtual picture in the virtual scene according to the cutting areas and the visual angles corresponding to the cutting areas.
Preferably, the generating a virtual screen corresponding to each azimuth according to the azimuth viewing angle of each azimuth specifically includes: when the azimuth visual angles of the left azimuth and the right azimuth are not equal, and the azimuth visual angles of the front azimuth and the rear azimuth are not equal, respectively calculating a cutting area corresponding to each azimuth visual angle; and cutting out a corresponding virtual picture in the virtual scene according to each azimuth visual angle and the cutting area.
Preferably, the generating a virtual screen corresponding to each azimuth according to the azimuth viewing angle of each azimuth specifically includes:
when the X coordinate information in the position reference information is on the X-axis center line and the Y coordinate information in the position reference information is not on the Y-axis center line, a corresponding virtual picture is cut out of the virtual scene according to the azimuth viewing angle corresponding to the X axis;
and/or;
and respectively calculating cutting areas corresponding to the orientation visual angles corresponding to the coordinate information on the residual axes in the position reference information, and cutting out corresponding virtual pictures in the virtual scene according to the cutting areas and the orientation visual angles corresponding to the cutting areas.
Preferably, the generating a virtual screen corresponding to each azimuth according to the azimuth viewing angle of each azimuth specifically includes: when the X coordinate information in the position reference information is not on the X-axis center line and the Y coordinate information in the position reference information is on the Y-axis center line, a corresponding virtual picture is cut out of the virtual scene according to the azimuth viewing angle corresponding to the Y axis;
and/or;
and respectively calculating cutting areas corresponding to the orientation visual angles corresponding to the coordinate information on the residual axes in the position reference information, and cutting out corresponding virtual pictures in the virtual scene according to the cutting areas and the orientation visual angles corresponding to the cutting areas.
Preferably, the generating a virtual screen corresponding to each azimuth according to the azimuth viewing angle of each azimuth specifically includes: when the X coordinate information is not on the X-axis central line and the Y coordinate information is not on the Y-axis central line, respectively calculating cutting areas corresponding to all the azimuth viewing angles; and cutting out a corresponding virtual picture in the virtual scene according to each azimuth visual angle and the cutting area.
The invention also provides a system applying the above projection method for an extended picture, comprising a smart device and a projection device. The smart device includes: an acquisition module for acquiring position reference information corresponding to the viewing position; a calculation module, electrically connected with the acquisition module, for calculating azimuth viewing angles in a plurality of azimuths by combining the position reference information; and a picture generation module, electrically connected with the calculation module, for generating virtual extension pictures according to the azimuth viewing angles in the plurality of azimuths. The projection device projects the virtual extension picture on the projection interface according to the size of the projection interface.
The extended picture projection method and the extended picture projection system provided by the invention can bring at least one of the following beneficial effects:
the invention can adjust the virtual extension picture of the virtual scene according to the watching position of the user, so that the projected virtual extension picture accords with the watching visual angle of the user, and the sense of reality and the stereoscopic impression of the virtual extension picture are increased.
The virtual extension picture for each azimuth viewing angle can be made into a virtual scene video and displayed for naked-eye 360-degree viewing, so that a user can experience the virtual scene from multiple azimuths, improving the viewing experience.
Drawings
The foregoing features, technical features, advantages and implementations of a method and system for extended screen projection will be further described in the following detailed description of preferred embodiments in a clearly understandable manner with reference to the accompanying drawings.
FIG. 1 is a flow chart of one embodiment of a method for projecting an extended picture according to the present invention;
FIG. 2 is a flow chart of another embodiment of a method for projecting an extended picture according to the present invention;
FIG. 3 is a schematic view of the viewing angle at various orientations of a viewpoint/viewing position in accordance with the present invention;
FIG. 4 is a schematic view of the viewing angle at various orientations of another viewpoint/viewing position in accordance with the present invention;
FIG. 5 is a schematic view of a perspective of another viewpoint/viewing position in various orientations of the present invention;
FIG. 6 is a schematic diagram of cropping in a direction in front of a viewpoint/viewing position in accordance with the present invention;
FIG. 7 is a schematic view of cropping at a view point/viewing position rear orientation in accordance with the present invention;
FIG. 8 is a schematic diagram of cropping in the left-hand side of a viewpoint/viewing position in accordance with the present invention;
FIG. 9 is a schematic diagram of cropping in the right side orientation of a viewpoint/viewing position in the present invention;
FIG. 10 is a schematic diagram of an embodiment of a projection system for extending a picture.
The reference numbers illustrate:
11 - acquisition module; 12 - calculation module; 13 - picture generation module.
Detailed Description
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the following description will be made with reference to the accompanying drawings. It is obvious that the drawings in the following description are only some examples of the invention, and that for a person skilled in the art, other drawings and embodiments can be derived from them without inventive effort.
For the sake of simplicity, the drawings only schematically show the parts relevant to the present invention, and they do not represent the actual structure as a product. In addition, in order to make the drawings concise and understandable, components having the same structure or function in some of the drawings are only schematically illustrated or only labeled. In this document, "one" means not only "only one" but also a case of "more than one".
As shown in fig. 1, the present invention provides an embodiment of a projection method for extending a picture, including:
s101, acquiring position reference information corresponding to a viewing position;
specifically, after a viewer enters a viewing space, a viewing position of the viewer is acquired by using a mobile terminal carried by the viewer; the mobile terminal can complete indoor positioning. The mobile terminal can be a mobile phone, a tablet personal computer, an intelligent bracelet and the like, and integrates an indoor positioning function on equipment frequently used by a viewer at ordinary times; or a hand-held terminal and the like can be specially produced, and the indoor positioning function is integrated.
S102, calculating an azimuth viewing angle in at least one azimuth by combining the position reference information;
specifically, at different positions, the perspective view of a person may also be different at each orientation; if at different positions, the pictures presented by watching the same object at the same direction are different; the different pictures are seen because the perspective view angle changes when the object is viewed.
The position information of the watching position comprises X-axis coordinate information, Y-axis coordinate information and Z-axis coordinate information, and a plurality of azimuth viewing angles can be calculated through the position information of the watching position; for example: an azimuth view right ahead, an azimuth view right behind, an azimuth view left, an azimuth view right above, and an azimuth view right below.
S103, generating a virtual extension picture corresponding to the virtual scene according to the azimuth visual angle in each azimuth; the virtual scene is formed by simulating a real scene, and the ratio of each virtual object in the virtual scene to a real object in the real scene is 1: 1;
specifically, the directions can be understood as six directions, i.e., up, down, left, right, front and back directions, and the present embodiment is exemplified by a classroom, and the classroom can be divided into six directions, i.e., up, down, left, right, front and back directions, with the direction facing the platform as the front. When the platform is in the front, people can see the opposite side of the platform, but when people look at the platform on the side, people can see the side of the platform, so that the visual images in different directions are different. After the azimuth viewing angle is calculated again, the virtual scene in each azimuth can be confirmed according to the azimuth viewing angle in each range, that is, the virtual scene in the left position is confirmed according to the left viewing angle, the virtual scene in the right position is confirmed according to the right viewing angle, the virtual scene in the front position is confirmed according to the front viewing angle, the virtual scene in the rear position is confirmed according to the rear viewing angle, the virtual scene in the upper position is confirmed according to the upper viewing angle, and the virtual scene in the lower position is confirmed according to the lower viewing angle. After the virtual scenes of each orientation are confirmed, the scenes of each orientation can be combined together to generate a corresponding virtual extended picture.
Unlike the prior art, in which the virtual extension picture is fixed at a certain viewing angle, in this scheme the virtual extension picture conforms to the real viewing angle of the user no matter from which direction it is viewed, giving it better stereoscopic impression and sense of reality.
S104, projecting the virtual extension picture on the projection interface.
The projection interface can be a wall in a room or a specially made projection screen, and after the virtual extension picture is projected on the projection interface, a user can view the virtual extension picture. By the aid of the projection method for the extended picture, the virtual extended picture can be adjusted according to the direction of the user, and stereoscopic impression and sense of reality of the user on the virtual extended picture are improved.
As shown in fig. 2, the present invention further provides another embodiment of a method for projecting an extended picture, including the steps of:
s201, acquiring position reference information corresponding to a viewing position;
s202, calculating an azimuth viewing angle in at least one azimuth by combining the position reference information;
specifically, the method specifically comprises the following steps:
A. acquiring position reference information corresponding to a viewing position; calculating an azimuth viewing angle in one azimuth of the viewing position by combining the position reference information;
specifically, when a plurality of azimuth viewing angles need to be calculated, for example, the azimuth viewing angles of the front, rear, left and right azimuths; the front azimuth viewing angle can be calculated by utilizing a viewing angle calculation formula of the front azimuth viewing angle; as shown in fig. 6, the forward azimuthal view is FOV; (ii) a Where L1 is the width of the viewing space, s is an offset value from the center position of the viewing space, and y is the viewing distance directly in front within the viewing space.
B. Calculating azimuth viewing angles of adjacent azimuths of the rest azimuths in the plurality of azimuths according to the calculated azimuth viewing angles and the angle relation between the adjacent azimuths;
specifically, the azimuth angle between the front azimuth viewing angle and the left or right azimuth viewing angle is a fixed angle of 180 degrees, and after the front azimuth viewing angle is calculated, the front azimuth viewing angle is subtracted from the fixed angle of 180 degrees, so that the azimuth viewing angle of the left or right azimuth can be obtained.
as shown in fig. 6, the azimuth angle between the forward azimuth viewing angle and the right azimuth viewing angle is a fixed angle of 180 °, the azimuth viewing angle of the right azimuth is equal to 180 ° minus the forward azimuth viewing angle, the forward and backward azimuth viewing angles are equal, the circumferential angle of the viewpoint o is 360 °, and the left azimuth viewing angle can be calculated when the right azimuth viewing angle, ∠ aob, is known.
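As a minimal sketch of steps A and B (angles in degrees; the function names are hypothetical, and the forward formula FOV = 2θ with tan θ = (L1/2 + s)/y is taken from the text):

```python
import math

def forward_fov(width, s, y):
    # Step A: forward azimuth viewing angle per the text,
    # FOV = 2*theta with tan(theta) = (width/2 + s) / y, in degrees.
    return 2 * math.degrees(math.atan((width / 2 + s) / y))

def other_fovs(front_fov):
    # Step B: the front and right viewing angles sum to a fixed 180 degrees;
    # front and rear are taken as equal; the four horizontal viewing angles
    # together make up the 360-degree circumferential angle at viewpoint o.
    right = 180.0 - front_fov
    rear = front_fov
    left = 360.0 - front_fov - rear - right
    return rear, left, right
```

For a viewer at the center of a 2 m-wide room, 1 m from the front wall, the forward viewing angle is 90° and the remaining three azimuths are 90° each, so the four angles close the 360° circumference.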
Or C, combining the position reference information and the visual angle calculation formulas of all the directions to respectively calculate a plurality of directional visual angles of the watching positions.
Specifically, when a plurality of azimuth viewing angles need to be calculated, for example, the azimuth viewing angles of the front, rear, left and right azimuths; the front azimuth viewing angle can be calculated by utilizing a viewing angle calculation formula of the front azimuth viewing angle; the rear azimuth viewing angle can be calculated by utilizing a viewing angle calculation formula of the rear azimuth viewing angle; the left-side azimuth viewing angle can be calculated by using a viewing angle calculation formula of the left-side azimuth viewing angle; the right-side azimuth viewing angle can be calculated by using a viewing angle calculation formula of the right-side azimuth viewing angle.
As shown in fig. 6, the forward azimuth viewing angle is the FOV, where FOV = 2θ and tan θ = (L1/2 + s)/y; L1 is the width of the viewing space, s is the lateral offset from the center position of the viewing space, and y is the viewing distance directly in front within the viewing space.
As shown in fig. 7, the rear azimuth viewing angle is the FOV, where FOV = 2θ and tan θ = (L1/2 + s)/(L2 − y); L2 is the length of the viewing space, s is the lateral offset from the center position of the viewing space, and y is the viewing distance directly in front within the viewing space.
When the position information of the viewpoint o is known, the azimuth viewing angles of all azimuths can be calculated, and the azimuth viewing angles corresponding to the left and right sides of the viewpoint o can be calculated by a formula, which is not described herein again.
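A hedged sketch of the direct calculation in step C: the front and rear formulas follow the text; the left/right formulas are omitted there ("not described herein again"), so mirroring the same construction by symmetry is an assumption of this sketch, as are the function and key names.

```python
import math

def fov(half_extent, offset, distance):
    # FOV = 2*theta with tan(theta) = (half_extent + offset) / distance,
    # matching the front/rear formulas quoted in the text (degrees).
    return 2 * math.degrees(math.atan((half_extent + offset) / distance))

def horizontal_fovs(L1, L2, s, y):
    # L1: width of the viewing space, L2: its length, s: lateral offset
    # from the center, y: viewing distance to the front wall.
    t = L2 / 2 - y  # lengthwise offset from the center (assumed analogue of s)
    return {
        "front": fov(L1 / 2, s, y),
        "rear": fov(L1 / 2, s, L2 - y),
        "left": fov(L2 / 2, t, L1 / 2 + s),   # by symmetry (assumption)
        "right": fov(L2 / 2, t, L1 / 2 - s),  # by symmetry (assumption)
    }
```

At the exact center of a square room all four horizontal viewing angles come out equal, which matches the "normal picture" case described for the central viewing position.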
S203, intercepting a corresponding virtual extension picture at the same watching position in the virtual scene according to the azimuth viewing angle in each azimuth;
s204, making the virtual extension picture corresponding to each azimuth visual angle into a virtual scene video;
that is, as described in the above embodiment, the classroom is divided into six directions, i.e., front, back, left, right, up, down, and the front direction is taken as the forward direction, and a virtual extended screen can be confirmed according to the viewing position of the user. Similarly, a virtual extended picture can be identified by taking the left direction as the positive direction. Thus, a virtual extended picture of each orientation in 360 degrees can be obtained. The virtual extended pictures in each direction are combined to generate a virtual scene video.
S205, projecting the virtual extension picture in the virtual scene video on the projection interface, so that the virtual extension picture in the virtual scene is displayed in 360 degrees.
The virtual scene video is played in a scrolling manner, that is, the virtual extension picture of each azimuth is displayed through 360 degrees, much like scrolling through a panorama shot on a mobile phone.
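A toy sketch of arranging the per-azimuth pictures into the circular order in which the video scrolls through 360 degrees (the azimuth names and the clockwise order are assumptions, not taken from the text):

```python
def scroll_sequence(pictures, order=("front", "right", "rear", "left")):
    # pictures: mapping of azimuth name -> that azimuth's extension picture
    # (any object, e.g. a frame or file path). Returns the pictures in the
    # fixed circular order used for the scrolling 360-degree playback.
    return [pictures[a] for a in order if a in pictures]
```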
Preferably, the generating a virtual extension picture corresponding to the virtual scene according to the azimuth viewing angle in each azimuth specifically includes: and generating a virtual extended picture according to the virtual pictures in a plurality of directions.
The present invention also provides another embodiment of a method for projecting an extended picture, including:
s301, acquiring position reference information corresponding to the viewing position;
s302, calculating an azimuth viewing angle in at least one azimuth by combining the position reference information;
s303, generating a virtual picture corresponding to each azimuth according to the azimuth viewing angle of each azimuth, and generating a virtual extension picture according to the virtual pictures in a plurality of azimuths; the virtual scene is formed by simulating a real scene, and the ratio of each virtual object in the virtual scene to a real object in the real scene is 1: 1;
according to the scheme, after the azimuth visual angle of each azimuth is determined, the intelligent device can determine the virtual picture in the azimuth through the azimuth visual angle, and then the virtual extended picture is formed according to the virtual pictures in a plurality of azimuths in a combined mode. For example, the user can stand in the center of a classroom and face the platform, so that the user can confirm that the virtual picture in the front direction is the virtual picture of the platform and the blackboard, the left direction is the virtual picture of the window, the right direction is the virtual picture of the front door and the back door, the upper direction is the virtual picture of the electric lamp, and the lower direction is the virtual picture of the floor.
S304, projecting the virtual extended picture on the projection interface.
Further preferably, the generating a virtual screen corresponding to each azimuth according to the azimuth angle of each azimuth specifically includes: when the azimuth visual angles of the front azimuth and the rear azimuth in the plurality of azimuth visual angles are equal and the azimuth visual angles of the left azimuth and the right azimuth are not equal, virtual scene pictures corresponding to the left azimuth visual angle and/or the right azimuth visual angle are cut in the virtual scene; and/or; and calculating cutting areas corresponding to the front view angle and/or the rear view angle and/or the upper view angle and/or the lower view angle respectively, and cutting out corresponding virtual scene pictures in the virtual scene according to the cutting areas and the view angles corresponding to the cutting areas.
Specifically, after the plurality of azimuth viewing angles are calculated, whether two equal azimuth viewing angles exist in the plurality of azimuth viewing angles is analyzed, and if two equal azimuth viewing angles exist, whether two azimuths corresponding to the two equal azimuth viewing angles are opposite azimuths is analyzed.
When the azimuth viewing angles of the opposite front and rear azimuths are analyzed to be equal, as shown in fig. 6 and 7, then depending on the actual display requirements, a virtual scene picture corresponding to the left azimuth viewing angle can be cut out of the virtual scene, or one corresponding to the right azimuth viewing angle, or both; alternatively, no cutting is performed and the normal left and right pictures of the virtual scene are used.
Specifically, the viewing position is a central position, and as shown in fig. 3, when the azimuth viewing angles of all the two opposite azimuths are equal, the virtual scene picture cut from the virtual scene at the central position in each azimuth is a normal picture.
Further preferably, the generating a virtual screen corresponding to each azimuth according to the azimuth viewing angle of each azimuth specifically includes: when the azimuth angles of the front azimuth and the rear azimuth in the azimuth angles are not equal and the azimuth angles of the left azimuth and the right azimuth are equal, virtual scene pictures corresponding to the front azimuth angle and/or the rear azimuth angle are cut out from the virtual scene;
specifically, after the plurality of azimuth viewing angles are calculated, whether two equal azimuth viewing angles exist in the plurality of azimuth viewing angles is analyzed, and if two equal azimuth viewing angles exist, whether two azimuths corresponding to the two equal azimuth viewing angles are opposite azimuths is analyzed.
When the azimuth viewing angles of the opposite left and right azimuths are analyzed to be equal, as shown in figs. 4 and 5 and in figs. 8 and 9, then depending on the actual display requirements, a virtual scene picture corresponding to the front azimuth viewing angle can be cut out of the virtual scene, or one corresponding to the rear azimuth viewing angle, or both; alternatively, no cutting is performed and the normal front and rear pictures of the virtual scene are used.
Further preferably, the generating a virtual screen corresponding to each azimuth according to the azimuth angle of each azimuth specifically includes: and calculating cutting areas corresponding to the left visual angle and/or the right visual angle and/or the upper visual angle and/or the lower visual angle respectively, and cutting out a corresponding virtual scene picture in the virtual scene according to the cutting areas and the visual angles corresponding to the cutting areas.
Specifically, when the left and right azimuth viewing angles are analyzed to be equal, the pictures corresponding to the left azimuth viewing angle, the right azimuth viewing angle, the upper azimuth viewing angle and the lower azimuth viewing angle are no longer normal pictures, and the normal pictures need to be cut.
And according to the requirement of the actual display condition, cutting out the virtual scene picture corresponding to each direction after selecting a plurality of directions from the left direction view angle, the right direction view angle, the upper direction view angle and the lower direction view angle.
Further preferably, the generating a virtual extension picture corresponding to the virtual scene according to the azimuth viewing angle in each azimuth specifically includes: when the azimuth visual angles of the left azimuth and the right azimuth are not equal, and the azimuth visual angles of the front azimuth and the rear azimuth are not equal, respectively calculating a cutting area corresponding to each azimuth visual angle; and cutting out a corresponding virtual scene picture in the virtual scene according to each azimuth visual angle and the cutting area.
Specifically, when the left and right azimuth viewing angles are determined to be unequal, and the front and rear azimuth viewing angles are also unequal, the pictures corresponding to the front, rear, left, right, upper and lower azimuth viewing angles are no longer normal pictures, and the normal pictures need to be cut. According to the requirements of the actual display conditions, one or more azimuths are selected from these six azimuth viewing angles, and the virtual scene picture corresponding to each selected azimuth is cut out.
Further preferably, the generating a virtual extension picture corresponding to the virtual scene according to the azimuth viewing angle in each azimuth specifically includes: when the X coordinate information in the position reference information is on the X-axis central line and the Y coordinate information in the position reference information is not on the Y-axis central line, cutting out the corresponding virtual scene picture from the virtual scene according to the azimuth viewing angle corresponding to the X axis; and/or respectively calculating the cutting areas corresponding to the azimuth viewing angles associated with the coordinate information on the remaining axes in the position reference information, and cutting out the corresponding virtual scene pictures from the virtual scene according to each cutting area and its corresponding azimuth viewing angle.
Specifically, the X-axis central line is the straight line at 1/2 of the width of the viewing space, parallel to the Y axis. If the viewing space is 4 m long and 2 m wide, the X-axis central line is the straight line at a width of 1 m, parallel to the Y axis. When the viewing space is expressed in pixels, with a specification of 800 dp long and 400 dp wide, the X-axis central line is the straight line at a width of 200 dp, parallel to the Y axis.
When the X coordinate information in the position reference information is 1 m or 200 dp, and the X axis corresponds to the front and rear azimuths, then, according to the actual display conditions, a virtual scene picture corresponding to the front azimuth viewing angle may be cut out of the virtual scene, a virtual scene picture corresponding to the rear azimuth viewing angle may be cut out, virtual scene pictures corresponding to both the front and rear azimuth viewing angles may be cut out, or no cutting may be performed and the normal front and rear pictures of the virtual scene retained.
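The centre-line test that drives this case analysis can be sketched as follows; this is a hedged illustration, and the function name and tolerance are assumptions, not from the patent:

```python
# Sketch of the centre-line test used in the case analysis above.
# The function name and the tolerance are illustrative assumptions.
def on_center_line(coord, extent, tol=1e-6):
    """Return True when `coord` lies on the centre line of the viewing
    space along one axis, where `extent` is the space's size on that axis."""
    return abs(coord - extent / 2) <= tol

# Example from the text: a viewing space 2 m wide has its X-axis centre
# line at 1 m; in pixels, a 400 dp width puts it at 200 dp.
print(on_center_line(1.0, 2.0))    # True
print(on_center_line(200, 400))    # True
print(on_center_line(1.5, 2.0))    # False
```

A position off the centre line on an axis is what triggers the clipping of the pictures in the azimuths of that axis.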
Further preferably, when the X coordinate information in the position reference information is on the X-axis central line and the Y coordinate information in the position reference information is not on the Y-axis central line, the cutting areas corresponding to the azimuth viewing angles associated with the coordinate information on the remaining axes are calculated respectively, and the corresponding virtual scene pictures are cut out of the virtual scene according to each cutting area and its corresponding azimuth viewing angle.
Specifically, when the position reference information includes Y coordinate information and Z coordinate information, if the Y axis corresponds to the left and right azimuths, the Z axis corresponds to the upper and lower azimuths.
The frames corresponding to the left visual angle, the right visual angle, the upper visual angle and the lower visual angle are no longer normal frames, and the normal frames need to be cut.
And according to the requirement of the actual display condition, cutting out the virtual scene picture corresponding to each direction after selecting a plurality of directions from the left direction view angle, the right direction view angle, the upper direction view angle and the lower direction view angle.
Further preferably, when the X coordinate information in the position reference information is not on the X-axis central line and the Y coordinate information in the position reference information is on the Y-axis central line, the corresponding virtual scene picture is cut out of the virtual scene according to the azimuth viewing angle corresponding to the Y axis.
When the Y coordinate information in the position reference information is 2 m or 400 dp, and the Y axis corresponds to the left and right azimuths, then, according to the actual display conditions, a virtual scene picture corresponding to the left azimuth viewing angle may be cut out of the virtual scene, a virtual scene picture corresponding to the right azimuth viewing angle may be cut out, virtual scene pictures corresponding to both the left and right azimuth viewing angles may be cut out, or no cutting may be performed and the normal left and right pictures of the virtual scene retained.
When the X coordinate information in the position reference information is not on the X-axis central line and the Y coordinate information in the position reference information is on the Y-axis central line, the cutting areas corresponding to the azimuth viewing angles associated with the coordinate information on the remaining axes are calculated respectively, and the corresponding virtual scene pictures are cut out of the virtual scene according to each cutting area and its corresponding azimuth viewing angle.
Specifically, when the position reference information includes X coordinate information and Z coordinate information, if the X axis corresponds to the front and rear two directions, the Z axis corresponds to the upper and lower two directions.
The pictures corresponding to the front view angle, the rear view angle, the upper view angle and the lower view angle are no longer normal pictures, and the normal pictures need to be cut.
According to the requirement of an actual display condition, after selecting a plurality of azimuths from a front azimuth visual angle, a rear azimuth visual angle, an upper azimuth visual angle and a lower azimuth visual angle, cutting out virtual scene pictures corresponding to all azimuths.
When the X coordinate information is not on the X-axis central line and the Y coordinate information is not on the Y-axis central line, respectively calculating cutting areas corresponding to all the azimuth viewing angles; and cutting out a corresponding virtual scene picture in the virtual scene according to each azimuth visual angle and the cutting area.
Specifically, when the X coordinate information in the position reference information is not on the X-axis central line and the Y coordinate information in the position reference information is not on the Y-axis central line, the frames corresponding to the front view angle, the rear view angle, the left view angle, the right view angle, the upper view angle, and the lower view angle are no longer normal frames, and the normal frames need to be cut.
According to the requirement of an actual display condition, after selecting a plurality of azimuths from a front visual angle, a rear visual angle, a left visual angle, a right visual angle, an upper visual angle and a lower visual angle, cutting out virtual scene pictures corresponding to all azimuths.
When the cutting area corresponding to each azimuth viewing angle is calculated, two calculation schemes are provided:
the first calculation scheme is as follows:
calculating a view angle picture parameter corresponding to each azimuth according to the azimuth view angle and the position reference information corresponding to each azimuth;
specifically, under the condition that the azimuth viewing angle is known, the position reference information contains the viewing distance; a view angle picture width at each azimuth at the viewing position can be calculated, for example, the view angle picture width is 600 dp; the view frame width is used as a view frame parameter.
And calculating the cutting area corresponding to each direction according to the visual angle picture parameter and the viewing space parameter corresponding to each direction.
Specifically, after the view angle picture width (600dp) corresponding to each azimuth is calculated, the picture view width (400dp) of the view space in each azimuth is fixed, and the cut region corresponding to each azimuth is obtained by subtracting the picture view width (400dp) from the view angle picture width (600 dp).
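The first calculation scheme can be sketched in Python; this is a hedged illustration in which the function names are assumptions, while the 600 dp / 400 dp values echo the example in the text:

```python
import math

def view_frame_width(fov_deg, viewing_distance):
    """'View angle picture width': the width spanned on a wall by the
    azimuth viewing angle at the given viewing distance."""
    return 2 * viewing_distance * math.tan(math.radians(fov_deg) / 2)

def cutting_area_width(view_width, wall_width):
    """Cutting area = view-angle picture width minus the fixed picture
    view width of the viewing space in that azimuth."""
    return max(0.0, view_width - wall_width)

# A 90-degree azimuth viewing angle 1 m from the wall spans ~2 m.
print(view_frame_width(90, 1.0))
# Using the dp figures from the text: 600 dp - 400 dp = 200 dp to cut.
print(cutting_area_width(600, 400))  # 200.0
```

The subtraction step is exactly the one described above: the fixed picture view width of the space is removed from the wider view-angle picture, leaving the region to be clipped.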
The second calculation scheme is as follows:
and analyzing the position deviation information of the position reference information relative to the preset position information, and calculating the corresponding cutting area by combining the position deviation information.
Specifically, as shown in fig. 6, the virtual picture corresponding to the front azimuth viewing angle has a width to be clipped of 2s; the front azimuth viewing angle is the FOV, where FOV = 2θ and tan θ = (L1/2 + s)/y. Here L1 is the width of the viewing space, s is the lateral offset from the center position of the viewing space, and y is the viewing distance directly to the front within the viewing space.
As shown in fig. 7, the virtual picture corresponding to the rear azimuth viewing angle likewise has a width to be clipped of 2s; the rear azimuth viewing angle is the FOV, where FOV = 2θ and tan θ = (L1/2 + s)/(L2 − y). Here L2 is the length of the viewing space, and s and y are as defined above.
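Under the stated definitions (L1, L2, s, y as above), the two FOV formulas can be sketched as follows; the function names are assumptions for illustration:

```python
import math

# Reconstructed from the formulas above: FOV = 2*theta, with
#   tan(theta) = (L1/2 + s) / y          for the front azimuth, and
#   tan(theta) = (L1/2 + s) / (L2 - y)   for the rear azimuth.
# L1: width of the viewing space, L2: its length, s: lateral offset from
# the centre position, y: viewing distance to the front wall.
def front_fov(L1, s, y):
    return math.degrees(2 * math.atan((L1 / 2 + s) / y))

def rear_fov(L1, L2, s, y):
    return math.degrees(2 * math.atan((L1 / 2 + s) / (L2 - y)))

# Viewer in a 4 m long, 2 m wide space, 1 m from the front wall,
# standing on the centre line (s = 0):
print(round(front_fov(2.0, 0.0, 1.0)))      # 90
print(round(rear_fov(2.0, 4.0, 0.0, 1.0)))  # 37
```

Note how the two expressions differ only in the denominator: the distance to the front wall for the front azimuth, and the remaining distance to the rear wall for the rear azimuth.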
The present invention also provides an embodiment of a projection system for extending a picture, as shown in fig. 10, including a smart device and a projection device:
the smart device includes:
an obtaining module 11, configured to obtain position reference information corresponding to a viewing position;
specifically, after a viewer enters a viewing space, a viewing position of the viewer is acquired by using a mobile terminal carried by the viewer; the mobile terminal can complete indoor positioning. The mobile terminal can be a mobile phone, a tablet personal computer, an intelligent bracelet and the like, and integrates an indoor positioning function on equipment frequently used by a viewer at ordinary times; or a hand-held terminal and the like can be specially produced, and the indoor positioning function is integrated.
A calculating module 12 electrically connected to the acquiring module 11, and configured to calculate an azimuth viewing angle in at least one azimuth by combining the position reference information;
Specifically, at different positions, a person's perspective viewing angle differs in each azimuth; at different positions, the pictures presented by the same object viewed in the same direction are different, because the perspective viewing angle changes with the viewing position.
The position information of the viewing position comprises X-axis coordinate information, Y-axis coordinate information and Z-axis coordinate information, from which several azimuth viewing angles can be calculated; for example, the azimuth viewing angles directly ahead, directly behind, to the left, directly above and directly below.
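A minimal sketch of how an azimuth viewing angle could be computed from a viewing position, assuming a box-shaped viewing space; the model and names are assumptions, not spelled out in the patent:

```python
import math

# Assumed geometric model: the viewing space is an axis-aligned box and
# the azimuth viewing angle toward a wall is the angle that wall
# subtends at the viewer's position.
def wall_fov(half_left, half_right, distance):
    """Full angle (degrees) subtended by a wall whose edges lie
    `half_left` and `half_right` to either side of the viewer's line of
    sight, at perpendicular `distance` from the viewer."""
    return math.degrees(math.atan(half_left / distance) +
                        math.atan(half_right / distance))

# A viewer 1 m from the front wall of a 2 m wide space, on the centre
# line, sees that wall under a 90-degree azimuth viewing angle.
print(round(wall_fov(1.0, 1.0, 1.0)))  # 90
```

The same function applies to each of the six azimuths by substituting the corresponding wall extents and the viewer's distance to that wall.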
The picture generation module 13 is electrically connected with the calculation module 12 and is used for generating a virtual extension picture corresponding to the virtual scene according to the azimuth viewing angle in each azimuth; the virtual scene is formed by simulating a real scene, and the ratio of each virtual object in the virtual scene to a real object in the real scene is 1: 1;
specifically, the directions can be understood as six directions, i.e., up, down, left, right, front and back directions, and the present embodiment is exemplified by a classroom, and the classroom can be divided into six directions, i.e., up, down, left, right, front and back directions, with the direction facing the platform as the front. When the platform is in the front, people can see the opposite side of the platform, but when people look at the platform on the side, people can see the side of the platform, so that the visual images in different directions are different. After the azimuth viewing angle is calculated again, the virtual scene in each azimuth can be confirmed according to the azimuth viewing angle in each range, that is, the virtual scene in the left position is confirmed according to the left viewing angle, the virtual scene in the right position is confirmed according to the right viewing angle, the virtual scene in the front position is confirmed according to the front viewing angle, the virtual scene in the rear position is confirmed according to the rear viewing angle, the virtual scene in the upper position is confirmed according to the upper viewing angle, and the virtual scene in the lower position is confirmed according to the lower viewing angle. After the virtual scenes of each orientation are confirmed, the scenes of each orientation can be combined together to generate a corresponding virtual extended picture.
The projection device is used for projecting the virtual extension picture on the projection interface.
With the above projection system for an extended picture, the virtual extended picture can be adjusted according to the user's position and orientation, improving the stereoscopic impression and sense of reality the user perceives in the virtual extended picture. In addition, the projection system for an extended picture also provides the other functions described in the method embodiments above, which are not repeated here.
It should be noted that the above embodiments can be freely combined as needed. The foregoing is only a preferred embodiment of the present invention; for those skilled in the art, various modifications and refinements can be made without departing from the principle of the present invention, and such modifications and refinements should also be regarded as falling within the protection scope of the present invention.

Claims (10)

1. A method for projecting an extended picture, comprising the steps of:
acquiring position reference information corresponding to a viewing position;
calculating an azimuth viewing angle in at least one azimuth by combining the position reference information;
generating a virtual extension picture corresponding to the virtual scene according to the azimuth visual angle in each azimuth; the virtual scene is formed by simulating a real scene, and the ratio of each virtual object in the virtual scene to a real object in the real scene is 1: 1;
and projecting the virtual extension picture on the projection interface.
2. The method for projecting an extended picture according to claim 1, wherein the generating of the virtual extended picture corresponding to the virtual scene according to the azimuth viewing angle in each azimuth specifically comprises:
intercepting, in the virtual scene, a corresponding virtual extension picture at the same viewing position according to the azimuth viewing angle in each azimuth;
and making the virtual extension picture corresponding to each azimuth visual angle into a virtual scene video, and projecting the virtual extension picture in the virtual scene video on the projection interface, so that the virtual extension picture in the virtual scene is displayed at 360 degrees.
3. The method for projecting an extended picture according to claim 1, wherein the generating a virtual extended picture corresponding to a virtual scene according to an azimuth viewing angle in each azimuth specifically comprises:
and generating a virtual extended picture according to the virtual pictures in a plurality of directions.
4. The extended picture projection method according to claim 3, wherein the generating of the virtual picture corresponding to each orientation according to the orientation view angle of each orientation specifically comprises:
when the azimuth viewing angles of the front and rear azimuths among the plurality of azimuth viewing angles are equal and the azimuth viewing angles of the left and right azimuths are not equal, cutting out, in the virtual scene, the virtual pictures respectively corresponding to the left azimuth viewing angle and/or the right azimuth viewing angle;
and/or;
and calculating cutting areas corresponding to the front view angle and/or the rear view angle and/or the upper view angle and/or the lower view angle respectively, and cutting out corresponding virtual pictures in the virtual scene according to the cutting areas and the view angles corresponding to the cutting areas.
5. The extended picture projection method according to claim 3, wherein the generating of the virtual picture corresponding to each orientation according to the orientation view angle of each orientation specifically comprises:
when the azimuth viewing angles of the front and rear azimuths among the azimuth viewing angles are not equal and the azimuth viewing angles of the left and right azimuths are equal, cutting out, in the virtual scene, the virtual pictures respectively corresponding to the front azimuth viewing angle and/or the rear azimuth viewing angle;
and/or;
and calculating cutting areas corresponding to the left visual angle and/or the right visual angle and/or the upper visual angle and/or the lower visual angle respectively, and cutting out a corresponding virtual picture in the virtual scene according to the cutting areas and the visual angles corresponding to the cutting areas.
6. The extended picture projection method according to claim 3, wherein the generating of the virtual picture corresponding to each orientation according to the orientation view angle of each orientation specifically comprises:
when the azimuth visual angles of the left azimuth and the right azimuth are not equal, and the azimuth visual angles of the front azimuth and the rear azimuth are not equal, respectively calculating a cutting area corresponding to each azimuth visual angle;
and cutting out a corresponding virtual picture in the virtual scene according to each azimuth visual angle and the cutting area.
7. The extended picture projection method according to claim 3, wherein the generating of the virtual picture corresponding to each orientation according to the orientation view angle of each orientation specifically comprises:
when the X coordinate information in the position reference information is on the X-axis central line and the Y coordinate information in the position reference information is not on the Y-axis central line, cutting out, from the virtual scene, the corresponding virtual picture according to the azimuth viewing angle corresponding to the X axis;
and/or;
and respectively calculating cutting areas corresponding to the orientation visual angles corresponding to the coordinate information on the residual axes in the position reference information, and cutting out corresponding virtual pictures in the virtual scene according to the cutting areas and the orientation visual angles corresponding to the cutting areas.
8. The extended picture projection method according to claim 3, wherein the generating of the virtual picture corresponding to each orientation according to the orientation view angle of each orientation specifically comprises:
when the X coordinate information in the position reference information is not on the X-axis central line and the Y coordinate information in the position reference information is on the Y-axis central line, cutting out, from the virtual scene, the corresponding virtual picture according to the azimuth viewing angle corresponding to the Y axis;
and/or;
and respectively calculating cutting areas corresponding to the orientation visual angles corresponding to the coordinate information on the residual axes in the position reference information, and cutting out corresponding virtual pictures in the virtual scene according to the cutting areas and the orientation visual angles corresponding to the cutting areas.
9. The extended picture projection method according to claim 3, wherein the generating of the virtual picture corresponding to each orientation according to the orientation view angle of each orientation specifically comprises:
when the X coordinate information is not on the X-axis central line and the Y coordinate information is not on the Y-axis central line, respectively calculating cutting areas corresponding to all the azimuth viewing angles;
and cutting out a corresponding virtual picture in the virtual scene according to each azimuth visual angle and the cutting area.
10. A system applied to the projection method of extended picture according to any one of claims 1 to 9, characterized by comprising an intelligent device and a projection device:
the smart device includes:
the acquisition module is used for acquiring position reference information corresponding to the watching position;
the calculation module is electrically connected with the acquisition module and used for calculating an azimuth viewing angle in at least one azimuth by combining the position reference information;
the picture generation module is electrically connected with the calculation module and used for generating a virtual extension picture corresponding to the virtual scene according to the azimuth visual angle in each azimuth; the virtual scene is formed by simulating a real scene, and the ratio of each virtual object in the virtual scene to a real object in the real scene is 1: 1;
the projection device is used for projecting the virtual extension picture on the projection interface.
CN201811333530.2A 2018-11-09 2018-11-09 Projection method and system for extended picture Pending CN111176593A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811333530.2A CN111176593A (en) 2018-11-09 2018-11-09 Projection method and system for extended picture

Publications (1)

Publication Number Publication Date
CN111176593A true CN111176593A (en) 2020-05-19


Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2365694A2 (en) * 2008-11-18 2011-09-14 LG Electronics Inc. Method and apparatus for processing image signal
CN106251403A (en) * 2016-06-12 2016-12-21 深圳超多维光电子有限公司 A kind of methods, devices and systems of virtual three-dimensional Scene realization
US20170195664A1 (en) * 2015-12-31 2017-07-06 Beijing Pico Technology Co., Ltd. Three-dimensional viewing angle selecting method and apparatus
CN106954061A (en) * 2017-03-08 2017-07-14 山东大学 Many equipment interaction display control systems and method based on Arduino
CN107193372A (en) * 2017-05-15 2017-09-22 杭州隅千象科技有限公司 From multiple optional position rectangle planes to the projecting method of variable projection centre
WO2018019256A1 (en) * 2016-07-26 2018-02-01 北京小鸟看看科技有限公司 Virtual reality system, and method and device for adjusting visual angle thereof
US20180061071A1 (en) * 2016-09-01 2018-03-01 Canon Kabushiki Kaisha Display control apparatus, display control method, and storage medium
WO2018197743A1 (en) * 2017-04-27 2018-11-01 Nokia Technologies Oy Virtual reality viewport adaption

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20200519