GB2258979A - Method and device for the synthesis of three-dimensional animated map-type images


Info

Publication number
GB2258979A
GB2258979A GB8705032A GB8705032A GB2258979A GB 2258979 A GB2258979 A GB 2258979A GB 8705032 A GB8705032 A GB 8705032A GB 8705032 A GB8705032 A GB 8705032A GB 2258979 A GB2258979 A GB 2258979A
Authority
GB
United Kingdom
Prior art keywords
scene
points
observer
trihedron
plane
Prior art date
Legal status
Granted
Application number
GB8705032A
Other versions
GB8705032D0 (en)
GB2258979B (en)
Inventor
Daniel Garnier
Current Assignee
Thales SA
Original Assignee
Thomson CSF SA
Priority date
Filing date
Publication date
Application filed by Thomson CSF SA filed Critical Thomson CSF SA
Publication of GB8705032D0
Publication of GB2258979A
Application granted
Publication of GB2258979B
Anticipated expiration
Current status: Expired - Fee Related


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/00 3D [Three Dimensional] image rendering
    • G06T15/10Geometric effects
    • G06T15/40Hidden part removal

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Geometry (AREA)
  • Computer Graphics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The device for the implementation of the process according to the invention comprises: a main memory (1) for storing the geographical coordinates of the scene, means (2, 3) for computing the coordinates of the perspective on the screen, transcoding means (10) for bestowing an aspect attribute or colour attribute on each point of the perspective, and means of sequential analysis (4, 5) of the points of the scene by successive planes perpendicular to the direction of motion of the observer and coupled to storage means (7, 8) which store only the attributes of the points of the scene which remain in the superimposition of the successive planes visible to the observer. Application: devices for aiding navigation.

Description

Method and device for the synthesis of three-dimensional animated map-type images

Background of the invention

The present invention relates to a method and a device for the synthesis of three-dimensional animated map-type images allowing the representation in perspective, on a display screen, of the scene seen by an observer moving over a relief.
It applies in particular to the development of navigation aid devices for aircraft, surface ships and submarines.
Two stages are generally necessary to create a synthetic image perceptible with an impression of relief by an observer looking at a display screen.
The first stage consists in computing, from numerical information representing the geographic coordinates of a scene, the perspective of each of the points of the scene as a function of the position of the observer in the observed scene.
The second stage consists in eliminating the masked portions not seen by the observer to give the impression of relief.
The algorithms that allow the suppression of the masked portions are of two types.
A first type of algorithm processes in the real space of the scene, as defined for instance by geographic maps, and establishes whether a given portion of the scene is or is not visible to the observer placed over the scene. The computations are carried out with accuracies at least equal to those with which the scene is defined. However, the cost of such algorithms increases with the complexity of the observed scene.
A second type of algorithm processes in the space of the screen the observer is looking at. These algorithms attempt to determine whether a given point on the screen belongs to the image of the visible portion of the scene, with a relatively coarse accuracy that depends on the accuracy with which the image is represented on the display screen.
Whichever method is employed, the implementation of such algorithms requires very long computation times with serial computing architectures and, conversely, powerful hardware when parallel architectures are used to increase speed.
Summary of the invention

The purpose of the present invention is to remedy the above-mentioned deficiencies.
To this end, an object of the present invention is to provide a method for the synthesis of three-dimensional animated map-type images allowing the representation in perspective, on a display screen, of the scene seen by an observer moving above a relief, comprising the steps of:
- storing the horizontal and vertical geographic coordinates of the points of the scene defined in a local trihedron occupying a fixed position with respect to the scene,
- referencing the position of the observer in the local trihedron,
- defining, with respect to the local trihedron, a moving trihedron tied to the observer, the top P of this trihedron materializing the position of the observer in the local trihedron and a side of this trihedron figuring the axis of displacement of the moving trihedron,
- considering, in the direction of displacement of the moving trihedron, the points of the scene situated in planes parallel to each other and perpendicular to the axis of displacement of the moving trihedron, these planes intersecting the axis of displacement of the moving trihedron at points distant from the top P of the moving trihedron by a distance greater than a minimum distance d,
- defining in the moving space a plane of projection perpendicular to the axis of displacement of the moving trihedron and intersecting the axis of displacement of the moving trihedron at a point distant from the top P of the moving trihedron by the minimum distance d,
- determining in the plane of projection the homothetic points of the scene belonging to each parallel plane, beginning with the points of the plane farthest from the top P of the moving trihedron and going on successively through the points of the other planes up to the plane of projection, the center of each similarity being coincident with the top P of the moving trihedron,
- storing, as they are determined, the coordinates of the homothetic points of the various points of the scene on the display screen, defined with respect to two orthonormal axes of this plane, and assigning an item of aspect or shade information to the projected points of each plane, each item of information relative to a point of a new projected plane replacing the information relative to a projected point of the previous plane with the same coordinates in the plane of projection, so that upon completion of the analysis of the image of the scene, only the portions of the scene that are seen by the observer remain stored in the memory.
An object of the invention is also to provide a device for the implementation of this method.
The present invention has the advantage of allowing the user to modify at will the portion of the observed scene and to modify the various parameters, in particular the minimum distance d used to build the perspective, so that the image displayed in real time is the image the observer would see directly if he were looking at the scene with the naked eye. The effect of animation, which makes the scene move in front of the observer who is moving with respect to the scene, is obtained by displaying a sequence of static images at a rate above a minimum rate, and any ambiguity in the depth of the image can be resolved by judiciously selecting the aspect or the shade of the pixels of each of the projected planes.
The present invention also has the advantage of allowing the suppression of the masked portions in an automatic manner, without using the sorting algorithms of the prior art, by successively superimposing the various projected planes of the scene that are perpendicular to the axis of displacement of the moving trihedron, beginning in each sequence with the plane at the rear of the scene, i.e. the farthest from the observer, and storing the results in an image memory, exploiting the fact that in an image memory, writing at a given address erases the information previously contained at this address. Compared with conventional methods, this approach saves time and gives the capability of real-time animation with manageable circuitry.
Brief description of the drawings

Other features and advantages of the present invention will become apparent from the following description given with reference to the accompanying drawings, in which:
- Figure 1 shows the relative position of the moving trihedron with respect to the local trihedron;
- Figure 2 shows the process of transforming the coordinates of the points of the image of the scene located in the moving space into the plane of projection;
- Figure 3 represents an embodiment of a device for the implementation of the method according to the present invention;
- Figure 4 represents an embodiment of a coordinate transforming circuit of the device represented in Figure 3, which transforms the coordinates of the scene known in the local space into corresponding coordinates in the moving trihedron;
- Figure 5 shows an embodiment of a coordinate transforming circuit of the device shown in Figure 3 for the transformation from the moving trihedron into the display screen space;
- Figure 6 shows a mode of surfaced representation of an image of a scene in perspective represented on a display screen, obtained with the device according to the present invention;
- Figure 7 shows a mode of representation in dotted lines of an image in relief obtained on a display screen with the device according to the present invention.
Detailed description of the invention

To put into perspective a scene whose geographic coordinates are referenced, point by point, with respect to a local trihedron O(X,Y,Z), the method according to the present invention consists in referencing in this system of axes the position P of the observer. It then consists in considering, in the manner shown in Figure 1, a trihedron P(X,Y,Z) derived from the local trihedron O(X,Y,Z) by translation of the vector OP, and the trihedron P(XM,YM,ZM) derived from the trihedron P(X,Y,Z) by a rotation of an angle α about the axis PZ, representing the reference trihedron of the moving body.
In Figure 1, x and y designate the horizontal coordinates and z the vertical coordinate, in the local trihedron O(X,Y,Z), of the points S(x,y,z) of the observed scene, and xm, ym and zm designate the corresponding coordinates of the same points of the observed scene in the moving trihedron P(XM,YM,ZM). In this figure, xp, yp and zp designate the coordinates of the position P of the observer in the local trihedron O(X,Y,Z). In the system of coordinates so defined, the coordinates x, y, z of each of the points S of the scene verify the relations:
x = xm.cos α - ym.sin α + xp (1)
y = xm.sin α + ym.cos α + yp (2)
z = zm + zp (3)
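For illustration only, relations (1) to (3) amount to a rotation by the heading angle α about the vertical axis followed by a translation to the observer's position; a minimal Python sketch (the function name and argument order are illustrative, not taken from the patent) could read:

```python
import math

def moving_to_local(xm, ym, zm, xp, yp, zp, alpha):
    """Relations (1)-(3): coordinates of a scene point known in the moving
    trihedron P(XM,YM,ZM), expressed in the local trihedron O(X,Y,Z)."""
    x = xm * math.cos(alpha) - ym * math.sin(alpha) + xp   # relation (1)
    y = xm * math.sin(alpha) + ym * math.cos(alpha) + yp   # relation (2)
    z = zm + zp                                            # relation (3)
    return x, y, z
```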
Based on these relations, which take into account both the altitude and the direction of displacement of the observer in the local space, the following phases of the method consist in computing the projection coordinates of each of the points S(xm,ym,zm) of the moving space onto a plane of projection that is situated in the moving trihedron P(XM,YM,ZM) used as a reference for the moving space, at a given distance d from the top P of this trihedron and perpendicular to the side PXM that is oriented in the direction of displacement of the moving trihedron, in the manner shown in Figure 2. This plane of projection also represents the plane of the display screen on which the observed scene is displayed.
In Figure 2, the coordinates xe and ye are those of a point Se and represent the projection of a point S(xm,ym,zm) of the scene onto the plane of projection along the straight line SP. The plane of projection is perpendicular to the axis PXM and is situated at the distance d from the point P. The equations for passing from the coordinates of the points S(xm,ym,zm) of the moving space, seen in projection on the display screen, to the plane of projection are automatically derived from the geometric figure represented in Figure 2.
The coordinates xe and ye verify the relations:
xe = (d/xm).zm = (d/xm).(z - zp) (4)
ye = (d/xm).ym (5)
These equations reflect the fact that each point S(xm,ym,zm) situated in a plane parallel to the plane of projection is derived from the point Se, projected onto the plane of projection, by a similarity with center P and ratio xm/d. Conversely, each point Se projected onto the plane of projection can correspond to a multitude of points S(xm,ym,zm) situated in planes parallel to the plane of projection and homothetic of the plane of projection in the similarity with center P and ratio d/xm, with xm representing the distance of each of the parallel planes of the moving space from the center of similarity P.
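A corresponding sketch of relations (4) and (5), again purely illustrative, shows that the projection only needs the ratio d/xm, the altitude z read from the main memory and the altitude zp of the observer:

```python
def project_to_screen(xm, ym, z, zp, d):
    """Relations (4)-(5): homothety of center P and ratio d/xm that maps a
    point of a sectional plane at distance xm onto the plane of projection."""
    ratio = d / xm            # ratio of the similarity (d <= xm for visible planes)
    xe = ratio * (z - zp)     # relation (4): vertical coordinate on the screen
    ye = ratio * ym           # relation (5): horizontal coordinate on the screen
    return xe, ye
```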
The coordinates xe, ye of each projected point Se are used for addressing an image memory that stores, in the corresponding locations, the shade or the aspect chosen for each of the pixels of a projected plane. In order to allow the suppression of the masked portions in an entirely automatic manner, each new item of information relative to a pixel automatically takes the place of the information relative to a pixel of a previous plane present in the memory at the same address. In this way, only the portions of the scene that are seen by the observer remain stored in the image memory.
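The hidden-part removal therefore reduces to the order in which the image memory is written; as a hedged sketch (the array representation and bounds check are assumptions), writing the planes from the farthest to the nearest into an ordinary two-dimensional array is enough:

```python
def write_plane(image, plane_pixels, attribute):
    """Write every projected pixel (row, col) of one sectional plane.
    Because nearer planes are written later, they simply overwrite the
    farther ones at the same address: no depth comparison is needed."""
    rows, cols = len(image), len(image[0])
    for row, col in plane_pixels:
        if 0 <= row < rows and 0 <= col < cols:
            image[row][col] = attribute   # overwrite hides the masked point
```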
A device for the implementation of the method according to the present invention is shown in Figure 3. The device represented comprises a main memory 1, a coordinate transforming circuit 2 for computing the coordinates x and y of each point of the scene in the local space as a function of the coordinates xp, yp and the heading α of the moving body, a coordinate transforming circuit 3 for computing the coordinates of the points of the scene projected onto the plane of projection, a counter 4 for the coordinate ym, a down counter 5 for the coordinate xm, a multiplexing circuit 6 coupled to an even image memory 7 and an odd image memory 8, a divide-by-two circuit 9 to control the multiplexing circuit 6, a transcoding table 10, a multiplexing circuit 11, and a display screen 12.
The altitudes z of the points S(x,y,z) of the scene are stored in the main memory 1 via the data line D. Each address x,y of this memory contains the altitude z(x,y) of the corresponding point of the scene. Each point with an address x,y in the main memory 1 consequently represents a segment of a straight line perpendicular to the plane XOY, with the coordinates x,y and the altitude z.
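In software terms the main memory 1 behaves like a height map: a two-dimensional array addressed by the horizontal coordinates and returning the altitude. A minimal stand-in (the class name, array type and integer truncation are assumptions) might look like this:

```python
class TerrainMemory:
    """Stand-in for the main memory 1: address (x, y) -> altitude z(x, y)."""
    def __init__(self, altitudes):
        self.altitudes = altitudes          # 2D list indexed as [x][y]

    def read(self, x, y):
        # Each address represents a vertical segment of height z(x, y)
        # standing on the plane XOY at (x, y); no bounds checking here,
        # illustration only.
        return self.altitudes[int(x)][int(y)]
```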
As indicated previously, the scene is analyzed in the moving trihedron P(XM,YM,ZM) in successive sectional planes perpendicular to the axis of displacement PXM of the moving body, beginning with the plane farthest at the rear of the scene with respect to the top P of the moving trihedron P(XM,YM,ZM).
At the beginning of the analysis, the down counter 5 is loaded with a binary number whose value corresponds to the maximum distance xm(max) that separates the plane situated at the rear of the scene from the top P, and the counter 4 increments the coordinate ym along the axis PYM. The outputs of the counter 4 and the down counter 5 are connected to the inputs of the coordinate transforming circuit 2, so that to each address xm,ym there correspond coordinates x and y in the local space, allowing the main memory 1 to be addressed and the altitude z(x,y) of the observed point to be read at the corresponding address.
This altitude z(x,y) is applied by the main memory 1 to an input of the coordinate transforming circuit 3, which computes, as a function of the altitude zp of the moving body and of the distance d separating the plane of projection from the top P, the coordinates of projection xe and ye defined by the above relations (4) and (5). The coordinates xe and ye are written, through the multiplexing circuit 6, alternately into the image memories 7 and 8 to save processing time. The image memories are then alternately read, one of them being written into while the other one is read by the multiplexing circuit 11, which furnishes the coordinates xe and ye to the display screen 12.
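The alternation between the even and odd image memories 7 and 8 is the classic double-buffering scheme. A sketch under the assumption that rendering and display run as two steps of the same loop (the 512 x 512 resolution and the render_into/display callables are assumptions):

```python
def animate(frames, render_into, display):
    """Write one image memory while the other is being displayed, then swap,
    as the multiplexers 6 and 11 do with memories 7 and 8."""
    even = [[0] * 512 for _ in range(512)]
    odd = [[0] * 512 for _ in range(512)]
    back, front = even, odd
    for frame in range(frames):
        render_into(back, frame)   # write side: through multiplexer 6
        display(front)             # read side: through multiplexer 11
        back, front = front, back  # swap roles for the next image
```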
The coordinate transforming circuit 2 is shown in Figure 4. This circuit comprises a programmable read-only memory 13, a group of four multiplier circuits 14, 15, 16 and 17 and a group of two adder-subtracter circuits 18 and 19. The coordinate transforming circuit 2 computes the coordinates x and y of the points situated in the local trihedron from the coordinates xp, yp, xm and ym of the same points of the image referenced in the moving trihedron. The memory 13 contains a table of cosines and a table of sines of the angle of rotation α and is addressed by this angle. The corresponding values of cos α are applied simultaneously to a first operand input of the multiplier circuits 14 and 15. The second operand input of the multiplier circuit 14 receives the coordinate ym and the second operand input of the multiplier circuit 15 receives the coordinate xm. Under these conditions, the multiplier circuits 14 and 15 furnish the products ym.cos α and xm.cos α, respectively.
In a similar way, the multiplier circuits 16 and 17 receive on a first operand input the value of sin α furnished by the programmable memory 13 and on a second operand input the values xm and ym, respectively, so that they furnish at their outputs the products xm.sin α and ym.sin α. The adder-subtracter circuit 18 receives on a first operand input the value xp representing the x coordinate of the moving trihedron in the local space, on a second operand input, marked "-", the value of the product ym.sin α furnished by the multiplier circuit 17, and on a third operand input, marked "+", the value of the product xm.cos α furnished by the multiplier circuit 15. The result of the operation performed by the adder-subtracter circuit 18 is, under these conditions, equal to x = xm.cos α - ym.sin α + xp according to the above relation (1).
The adder circuit 19 has three inputs, of which two are connected respectively to the output of the multiplier circuit 16 and to the output of the multiplier circuit 14, the third input receiving the coordinate yp of the moving trihedron referenced in the local trihedron. The result of the addition performed by the adder circuit 19 is equal, under these conditions, to y = xm.sin α + ym.cos α + yp according to the above relation (2).
An embodiment of the circuit 3, which transforms the coordinates of the points of the scene referenced in the moving space into the space of the display screen, is shown in Figure 5. This circuit comprises a divider circuit 20, a first multiplier circuit 21, a second multiplier circuit 22, a subtracter circuit 23 and a down counter 24. The divider circuit 20 has two operand inputs to which are applied respectively the coordinates xm of each of the points of the scene referenced in the moving trihedron and the distance d that separates the plane of projection from the top P of the moving trihedron.
The divider circuit 20 furnishes at its output the result of the division of d by xm, and this result is applied to a first operand input of the multiplier circuit 21 and to a first operand input of the multiplier circuit 22.
The multiplier circuit 21 has a second input to which are applied the coordinates ym of each point of the scene referenced in the moving trihedron, and gives the product of ym by the result of the division d/xm furnished by the divider circuit 20. This result verifies the relation (5): ye = (d/xm).ym. For each pair of values xm,ym, the altitude z(x,y) of each point S(x,y,z) of the scene, furnished by the main memory 1 of Figure 3, is loaded into the down counter 24 to allow the computation of the coordinates xe and ye and the display of a corresponding altitude segment in the space of the display screen.
This operation always begins at the value xm(max), referencing the plane of the scene that is the farthest, and goes on through decreasing values of xm down to the value xm = d that references the plane of projection. At each pair of values (xm, ym), a fast clock, not represented, controls the counting-down of the down counter 24 from the value z(x,y) down to the value z = 0. The content of the down counter 24 is transmitted to an operand input marked "+" of the subtracter circuit 23, whose other operand input, marked "-", receives the altitude zp of the moving body. The result of the subtraction performed by the subtracter circuit 23 is transmitted to the second operand input of the multiplier circuit 22, which performs the operation xe = (d/xm).(z - zp).
The passage through zero of the down counter 24 is signalled by a signal FSD, which indicates that the transformation of the coordinates of the segment of straight line z(x,y) is completed; this signal is transmitted to the clock input of the counter 4 to obtain the next value ym and to restart the transformation of the coordinates of the segment z(x,y) read in the main memory 1. When the counter 4 has reached its maximum capacity, it delivers a count-down pulse to the clock input H of the down counter 5 to allow the computation of the new coordinates x, y, z of the points of the next plane, addressed by the new value xm, and the display of the corresponding altitude segment on the display screen. In the process that has just been described, the time of analysis of the scene is, of course, a function of the relief and of the position of the observer.
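Putting the counters together, the scan performed by the counter 4, the down counter 5 and the down counter 24 corresponds to three nested loops. The sketch below is an assumption-laden illustration: the loop ranges, screen size and the helper names moving_to_local, project_to_screen, TerrainMemory and shade_of are carried over from the earlier sketches, not from the patent.

```python
def scan_scene(terrain, xm_max, d, xp, yp, zp, alpha, width, height, shade_of):
    """Back-to-front scan: down counter 5 steps xm from xm_max to d,
    counter 4 steps ym across the plane, down counter 24 steps the altitude
    segment from z(x, y) down to 0 so the whole vertical segment is drawn."""
    image = [[0] * width for _ in range(height)]
    for xm in range(xm_max, int(d) - 1, -1):            # down counter 5
        for ym in range(-width // 2, width // 2):       # counter 4
            x, y, _ = moving_to_local(xm, ym, 0, xp, yp, zp, alpha)
            z_top = terrain.read(x, y)                  # main memory 1
            for z in range(int(z_top), -1, -1):         # down counter 24
                xe, ye = project_to_screen(xm, ym, z, zp, d)
                row = height // 2 - int(xe)
                col = width // 2 + int(ye)
                if 0 <= row < height and 0 <= col < width:
                    image[row][col] = shade_of(xm)      # transcoding table 10
    return image
```

Nearer planes are visited last, so each write into image simply replaces whatever a farther plane had left at the same address, which is exactly the automatic masking described above.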
Returning to Figure 3, the transcoding table 10 contains, for each parallel plane of the moving trihedron addressed by the coordinate xm furnished by the down counter 5, values of pixel attributes that designate each sectional plane by a color or a particular surface aspect, in order to resolve the depth ambiguity of the image and to give an impression of correct relief. Depending on this transcoding table, it is possible to obtain either surfaced representations such as those shown in Figure 6, or representations in dotted lines such as those shown in Figure 7.
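The transcoding table 10 is essentially a look-up table from the plane index xm to a display attribute. One possible depth-cueing choice, entirely an assumption since the patent leaves the attribute values free, is to let the shade fade with distance:

```python
def make_transcoding_table(xm_max, d, near_shade=255, far_shade=32):
    """One attribute per sectional plane, dimmer as the distance xm grows."""
    table = {}
    for xm in range(int(d), xm_max + 1):
        t = (xm - d) / max(xm_max - d, 1)              # 0 near ... 1 far
        table[xm] = int(near_shade + t * (far_shade - near_shade))
    return table

# shade_of for the scan sketch above could then be: lambda xm: table[xm]
```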

Claims (10)

Claims
1. A method for the synthesis of three-dimensional animated map-type images, allowing the representation in perspective, on a display screen, of the scene seen by an observer moving over a relief, comprising the steps of:
- storing the horizontal and vertical geographic coordinates of the points of the scene defined in a local trihedron O(X,Y,Z) occupying a fixed position with respect to the scene,
- referencing the position of the observer in said local trihedron,
- defining, with respect to said local trihedron, a moving trihedron P(XM,YM,ZM) tied to the observer, the top P of this moving trihedron materializing the position of the observer in said local trihedron and a side of said moving trihedron figuring the axis PXM of displacement of said moving trihedron,
- considering, in the direction of displacement of said moving trihedron, the points of the scene situated in planes parallel to each other and perpendicular to said axis of displacement of said moving trihedron, these planes intersecting said axis of displacement at points distant from said top P of said moving trihedron by a distance greater than a minimum distance d,
- defining in the moving space a plane of projection perpendicular to said axis of displacement of said moving trihedron and intersecting said axis of displacement of said moving trihedron at a point distant from said top P of said moving trihedron by said minimum distance d,
- determining in said plane of projection the homothetic points of the scene belonging to each parallel plane, beginning with the points of the plane farthest from said top P of said moving trihedron and going on successively through the points of the other planes up to said plane of projection, the center of each similarity being coincident with said top P of said moving trihedron,
- storing, as they are determined, the coordinates of said homothetic points of the various points of the scene on said display screen, defined with respect to the orthonormal axes of this plane, and assigning an item of aspect or shade information to the projected points of each plane, each item of information relative to a point of a new projected plane replacing the information relative to a projected point of the previous plane with the same coordinates in said plane of projection, so that upon completion of the analysis of the image of the scene, only the portions of the scene that are seen by the observer remain stored in the memory.
2. A device for the synthesis of three-dimensional animated map-type images, allowing the representation in perspective, on a display screen, of the scene seen by an observer moving over a relief, comprising:
- a main memory to store the geographic coordinates of the scene,
- computation means to furnish to said display screen the coordinates of the perspective of each point of the scene based on said geographic coordinates of the scene, the position of the observer over the scene and the direction of displacement of said observer,
- transcoding means to assign to each computed coordinate of a point of the perspective an aspect or color attribute,
- means for sequential analysis of the points of the scene through successive planes perpendicular to the direction of displacement of the observer, beginning with the plane farthest from the position of said observer, coupled to means for storing said attributes of each corresponding point of the perspective, furnished by said transcoding means as the computation of the coordinates of the points of the perspective on said display screen proceeds, each new attribute of a point having the same coordinates on said display screen as those of a point of the previous plane and replacing, in said storage means, the attribute of the previous point situated at the same address, so that upon completion of the analysis of the image, only the portions of the scene that are seen by the observer remain stored in said storage means.
3. A device according to claim 2, wherein said main memory is addressed by the horizontal geographic coordinates of the points of the analyzed scene and stores the altitude of these points at the corresponding addresses.
4. A device according to claims 2 and 3, wherein said computing means comprise a first circuit transforming said coordinates of the points of the scene furnished by said means of sequential analysis to compute the addressing coordinates of said main memory as a function of the position of the observer.
5. A device according to any of claims 2 to 4, wherein said computing means comprise a second coordinate transforming circuit to compute, as a function of the altitude of each point stored in said main memory and as a function of the altitude of the observer, the coordinates of the points of the perspective that are referenced in a plane of projection perpendicular to said direction of dispacement of the observer and placed in front of the scene observed at a predetermined distance d from the observer, the coordinates of the points of the perspective so computed representing the coordinates of these same points on said display screen.
6. A device according to any of claims 2 to 5, wherein said means of sequential analysis comprise a down counter for selecting each of the planes perpendicular to the direction of displacement, and a counter to select in this plane a predetermined number of points of the straight line parallel to the horizontal plane of the scene and passing at the altitude of the observer, said down counter being decremented each time the totality of the points of the straight line have been selected.
7. A device according to any of claims 2 to 6, wherein said transcoding means allow the generation of surfaced images.
8. A device according to any of claims 2 to 7, wherein said transcoding means allow the generation of images in dotted lines.
9. A method for the synthesis of three-dimensional animated map-type images, enabling the representation in perspective on a display screen, of the scene seen by an observer moving over a relief, substantially as described hereinbefore with reference to the accompanying drawings.
10. A device for the synthesis of three-dimensional animated map-type images, enabling the representation in perspective on a display screen, of the scene seen by an observer moving over a relief, substantially as described hereinbefore with reference to the accompanying drawings and as illustrated in Figures 3 to 5 of those drawings.
Abstract

Method and device for the synthesis of three-dimensional animated map-type images

A device comprising:
- a main memory (1) to store the geographic coordinates of the scene,
- means (3) for computing the coordinates of the perspective on the display screen (12),
- transcoding means (10) to give an aspect or color attribute to each point of the perspective,
- and means (6) for sequential analysis of the scene points through successive planes perpendicular to the direction of displacement of the observer, coupled to storage means (7 and 8) retaining only the attributes of the scene points that remain visible to the observer in the superimposition of the successive planes.
Figure 3

Claims

1. A method for the synthesis of three-dimensional animated map-type images, allowing the representation in perspective, on a display screen, of the scene seen by an observer moving over a relief on the surface of the earth, comprising the steps of:
- storing the horizontal and vertical geographic coordinates of the points of the scene defined in a local trihedron O(X,Y,Z) occupying a fixed position with respect to the scene,
- referencing the position of the observer in said local trihedron,
- defining, with respect to said local trihedron, a moving trihedron of top P tied to the observer, the top P of the moving trihedron being situated at the position of the observer in said local trihedron and a side of said moving trihedron being common to the direction of displacement of the moving trihedron in said local trihedron,
- considering, in the direction of displacement of said moving trihedron, the points of the relief situated in parallel sectional planes of the relief perpendicular to said axis of displacement of said moving trihedron, these planes crossing said axis of displacement at points spaced from said top P of said moving trihedron by a distance greater than a minimum distance d,
- defining in the moving space a plane of projection perpendicular to said axis of displacement of said moving trihedron and intersecting said axis of displacement of said moving trihedron at a point distant from said top P of said moving trihedron by said minimum distance d,
- determining in said plane of projection, by homothetic transformations, the homothetic points of the relief belonging to each sectional parallel plane, beginning with the points of the plane farthest from said top P of said moving trihedron and going on successively through the points of the other planes up to said plane of projection, the center of each homothetic transformation being coincident with said top P of said moving trihedron,
- storing, as they are determined, the coordinates of said homothetic points of the various points of each sectional plane of the relief on said display screen, and assigning information relating to the aspect or shade of the projected points of each plane, each item of information relative to a point of a new projected plane replacing the information relative to a projected point of the previous plane with the same coordinates in said plane of projection, so that upon completion of the analysis of the image of the scene, only the portions of the scene that are seen by the observer remain stored in the memory.
2. A device for the synthesis of three-dimensional animated map-type images, allowing the representation in perspective, on a display screen, of the scene seen by an observer moving over a relief on the surface of the earth, comprising:
- a main memory to store the geographic coordinates of the scene,
- computation means to furnish to said display screen the coordinates of the perspective of each point of the scene based on said geographic coordinates of the scene, the position of the observer over the scene and the direction of displacement of said observer,
- transcoding means to assign to each computed coordinate of a point on the screen an aspect or color attribute,
- means for sequential analysis of the points of the scene through successive planes perpendicular to the direction of displacement of the observer, beginning with the plane farthest from the position of said observer, coupled to means for storing said attributes of each corresponding point of the perspective, furnished by said transcoding means as the computation of the coordinates of the points of the perspective on said display screen proceeds, each new attribute of a point having the same coordinates on said display screen as those of a point of the previous plane and replacing, in said storage means, the attribute of the previous point situated at the same address, so that upon completion of the analysis of the image, only the portions of the scene that are seen by the observer remain stored in said storage means.
3. A device according to claim 2, wherein said main memory is addressed by the horizontal geographic coordinates of the points of the analyzed scene and stores the altitude of these points at the corresponding addresses.
4. A device according to claims 2 and 3, wherein said computing means comprise a first circuit transforming said coordinates of the points of the scene furnished by said means of sequential analysis to compute the addressing coordinates of said main memory as a function of the position of the observer.
5. A device according to any of claims 2 to 4, wherein said computing means comprise a second coordinate transforming circuit to compute, as a function of the altitude of each point stored in said main memory and as a function of the altitude of the observer, the coordinates of the points of the perspective that are referenced in a plane of projection perpendicular to said direction of displacement of the observer and placed in front of the scene observed at a predetermined distance d from the observer, the coordinates of the points of the perspective so computed representing the coordinates of these same points on said display screen.
6. A device according to any of claims 2 to 5, wherein said means of sequential analysis comprise a down counter for selecting each of the planes perpendicular to the direction of displacement, and a counter to select in this plane a predetermined number of points of the straight line parallel to the horizontal plane of the scene and passing at the altitude of the observer, said down counter being decremented each time the totality of the points of the straight line have been selected.
7. A device according to any of claims 2 to 6, wherein said transcoding means allow the generation of surfaced images.
8. A device according to any of claims 2 to 7, wherein said transcoding means allow the generation of images in dotted lines.
9. A method for the synthesis of three-dimensional animated map-type images, enabling the representation in perspective on a display screen, of the scene seen by an observer moving over a relief, substantially as described hereinbefore with reference to the accompanying drawings.
GB8705032A 1986-03-04 1987-03-04 Method and device for the synthesis of three-dimensional animated map-type images Expired - Fee Related GB2258979B (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
FR8603039A FR2674652A1 (en) 1986-03-04 1986-03-04 Process and device for synthesising three-dimensional moving map images

Publications (3)

Publication Number Publication Date
GB8705032D0 GB8705032D0 (en) 1992-09-16
GB2258979A (en) 1993-02-24
GB2258979B GB2258979B (en) 1993-07-07

Family

ID=9332747

Family Applications (1)

Application Number Title Priority Date Filing Date
GB8705032A Expired - Fee Related GB2258979B (en) 1986-03-04 1987-03-04 Method and device for the synthesis of three-dimensional animated map-type images

Country Status (4)

Country Link
DE (1) DE3706456A1 (en)
FR (1) FR2674652A1 (en)
GB (1) GB2258979B (en)
IT (1) IT1235575B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6611753B1 (en) 1998-04-17 2003-08-26 Magellan Dis, Inc. 3-dimensional intersection display for vehicle navigation system
FI106772B (en) 1999-01-19 2001-04-12 Jani Petteri Martikainen Chopsticks
US8554475B2 (en) 2007-10-01 2013-10-08 Mitac International Corporation Static and dynamic contours

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB1359504A (en) * 1971-05-13 1974-07-10 Plessey Co Ltd Radar simulation apparatus
GB1548031A (en) * 1975-08-22 1979-07-04 Gen Electric Multisensor digital image generator
GB2045568A (en) * 1979-03-21 1980-10-29 Solartron Electronic Group Digital simulation apparatus
EP0137107A1 (en) * 1981-05-22 1985-04-17 The Marconi Company Limited A computer generated imagery system

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
FR2524177B1 (en) * 1982-03-25 1987-10-30 Dassault Electronique METHOD AND DEVICE FOR PROVIDING FROM DIGITAL DATA A DYNAMIC IMAGE OF A SURFACE, SUCH AS THE GROUND, VIEWED FROM A MOBILE OBSERVATION POINT

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB1359504A (en) * 1971-05-13 1974-07-10 Plessey Co Ltd Radar simulation apparatus
GB1548031A (en) * 1975-08-22 1979-07-04 Gen Electric Multisensor digital image generator
GB2045568A (en) * 1979-03-21 1980-10-29 Solartron Electronic Group Digital simulation apparatus
EP0137107A1 (en) * 1981-05-22 1985-04-17 The Marconi Company Limited A computer generated imagery system

Also Published As

Publication number Publication date
GB8705032D0 (en) 1992-09-16
GB2258979B (en) 1993-07-07
DE3706456A1 (en) 1993-02-04
IT1235575B (en) 1992-09-11
IT8767098A0 (en) 1987-02-16
FR2674652A1 (en) 1992-10-02


Legal Events

PCNP: Patent ceased through non-payment of renewal fee (effective date: 19931007)