WO2005098760A1 - Drawing method, drawing program, and drawing device - Google Patents
- Publication number
- WO2005098760A1 (PCT/JP2005/004492; JP2005004492W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- viewpoint coordinates
- drawn
- viewpoint
- input
- Prior art date
Links
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T15/00—3D [Three Dimensional] image rendering
- G06T15/10—Geometric effects
- G06T15/40—Hidden part removal
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09B—EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
- G09B29/00—Maps; Plans; Charts; Diagrams, e.g. route diagram
Definitions
- the present invention relates to a drawing method, a drawing program, and a drawing device.
- the use of the present invention is not limited to the above-described drawing method, drawing program, and drawing apparatus.
Background art
- a map search device capable of displaying a map three-dimensionally by processing with a small calculation load.
- the map retrieval device reads, from the map storage device, map data near the current position of the vehicle calculated by the position calculation device or map data in a range of a map to be displayed specified by the input device.
- the arithmetic processing unit performs perspective transformation of the four vertices of the read map data based on the viewpoint and gazing point coordinates input from the input device, maps the map data to the transformed coordinates, and further performs clipping.
- the map after the mapping is displayed on the output device (for example, see Patent Document 1 below).
- Patent Document 1: JP-A-9-138136
- the drawing method according to the first aspect of the present invention comprises: an input step of inputting arbitrary viewpoint coordinates in a three-dimensional coordinate system; a first drawing step of drawing an image of one object viewed from the viewpoint coordinates input in the input step; a changing step of changing the depth information of the drawn image of the one object to information on the distance from a position nearer to the viewpoint coordinates than the one object to the viewpoint coordinates; and a second drawing step of drawing, based on the changed depth information, an image of another object different from the one object so as to overlap the image of the one object.
- a drawing program according to the invention of claim 7 causes a computer to execute the drawing method according to any one of claims 1 to 5.
- a drawing apparatus according to the present invention comprises: an input unit that receives input of arbitrary viewpoint coordinates in a three-dimensional coordinate system; a first drawing unit that draws an image of one object viewed from the viewpoint coordinates input by the input unit; a changing unit that changes the depth information of the image of the one object drawn by the first drawing unit to information on the distance from a position nearer to the viewpoint coordinates than the one object to the viewpoint coordinates; and a second drawing unit that draws, based on the depth information changed by the changing unit, an image of another object different from the one object so as to overlap the image of the one object.
- FIG. 1 is a block diagram showing a hardware configuration of a drawing apparatus according to an embodiment of the present invention.
- FIG. 2 is a block diagram illustrating a functional configuration of a drawing apparatus according to an embodiment of the present invention.
- FIG. 3 is an explanatory diagram schematically showing information stored in the map information database shown in FIG. 2.
- FIG. 4 is a perspective view showing an object generated by the generation unit.
- FIG. 5 is a flowchart of a drawing processing procedure according to the first embodiment.
- FIG. 6 is a flowchart showing a specific processing procedure of the tunnel drawing processing according to the first embodiment.
- FIG. 7 is an explanatory diagram (part 1) showing the drawing contents in the tunnel drawing processing.
- FIG. 8 is an explanatory diagram (part 2) showing the drawing contents in the tunnel drawing processing.
- FIG. 9 is an explanatory diagram (part 3) showing the drawing contents in the tunnel drawing processing.
- FIG. 10 is a flowchart of a tunnel drawing processing procedure according to the second embodiment.
- FIG. 11 is an explanatory diagram (part 1) showing the drawing contents in the tunnel drawing processing.
- FIG. 12 is an explanatory diagram (part 2) showing the drawing contents in the tunnel drawing processing.
- FIG. 13 is an explanatory diagram (part 3) showing the drawing contents in the tunnel drawing processing.
- FIG. 14 is an explanatory diagram (part 4) showing the drawing contents in the tunnel drawing processing.
- FIG. 15 is an explanatory diagram showing another example of the drawing contents in the tunnel drawing processing.
Explanation of symbols
- a drawing method, a drawing program, and a drawing device according to embodiments of the present invention make it possible to support safe driving by drawing a simple yet realistic image.
- the drawing method, drawing program, and drawing device according to the embodiments of the present invention use, for example, the Z-buffer method as the hidden surface elimination method.
- FIG. 1 is a block diagram illustrating a hardware configuration of a drawing apparatus according to an embodiment of the present invention.
- the drawing apparatus includes a CPU 101, a ROM 102, a RAM 103, an HDD (hard disk drive) 104, an HD (hard disk) 105, a CD/DVD drive 106, a CD/DVD 107 as an example of a removable recording medium, a video/audio I/F (interface) 108, a display 109, a speaker (headphone) 110, an input I/F (interface) 111, a remote controller 112, input keys 113, a communication I/F (interface) 114, a GPS (Global Positioning System) receiver 115, an angular velocity sensor 116, a mileage sensor 117, a tilt sensor 118, a graphics memory 119, and a graphics processor 130.
- the components 101 to 119 and 130 are connected by a bus 100.
- the CPU 101 controls the entire drawing apparatus.
- the ROM 102 stores programs such as a boot program.
- the RAM 103 is used as a work area of the CPU 101.
- the HDD 104 controls reading/writing of data from/to the HD 105 under the control of the CPU 101.
- the HD 105 stores data written under the control of the HDD 104.
- the CD/DVD drive 106 controls reading/writing of data from/to the CD/DVD 107 under the control of the CPU 101.
- the CD/DVD 107 is a removable recording medium from which data is read under the control of the CD/DVD drive 106.
- a writable recording medium can also be used.
- the removable recording medium may be, instead of the CD/DVD 107, a CD-ROM (CD-R, CD-RW), an MO, a memory card, or the like.
- the video/audio I/F 108 is connected to the display 109 for video display and the speaker (headphone) 110 for audio output.
- the display 109 displays icons, cursors, menus, windows, or various data such as characters and images.
- as the display 109, a CRT, a TFT liquid crystal display, a plasma display, or the like can be employed.
- the input I/F 111 inputs data transmitted from a remote controller 112 provided with a plurality of keys for inputting characters, numerical values, various instructions, and the like, and from input keys (including a keyboard and a mouse) 113.
- an output I/F can be provided as necessary, and a scanner for optically reading characters and images and a printer for printing characters and images can be connected via the output I/F.
- the communication I/F 114 is connected to a network wirelessly or via a communication cable, and functions as an interface between the network and the CPU 101.
- the network includes a LAN, a WAN, a public line network, a mobile phone network, and the like.
- the communication I/F 114 inputs various data output from the GPS receiver 115, the angular velocity sensor 116, the mileage sensor 117, and the inclination sensor 118.
- the GPS receiver 115 receives radio waves from GPS satellites to obtain the geometric position relative to the GPS satellites, and can perform positioning anywhere on the earth.
- the radio wave is the L1 carrier wave of 1,575.42 MHz, which carries the C/A (Coarse/Acquisition) code and the navigation message.
- the navigation message has a bit rate of 50 bps, and each subframe has a length of 300 bits.
- the angular velocity sensor 116 detects the angular velocity at the time of rotation of the own vehicle, and outputs angular velocity data and relative azimuth data.
- the mileage sensor 117 counts the pulses of a pulse signal of a predetermined cycle output as the wheels rotate, calculates the number of pulses per wheel rotation, and outputs travel distance data based on the number of pulses per rotation.
- the inclination sensor 118 detects the inclination angle of the road surface and outputs inclination angle data.
- the graphics memory 119 includes a frame buffer 120 and a Z buffer 121.
- the frame buffer 120 stores the color data of the drawn image for each pixel.
- the Z buffer 121 stores a Z value indicating the depth of the drawn image for each pixel.
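As an illustrative sketch (not code from the patent), the per-pixel bookkeeping of the frame buffer 120 and the Z buffer 121 can be modeled as follows, assuming the common convention that a smaller Z value means nearer to the viewpoint; the helper name `draw_pixel` is hypothetical:

```python
import numpy as np

W, H = 4, 4
frame_buffer = np.zeros((H, W, 3), dtype=np.uint8)   # color data for each pixel
z_buffer = np.full((H, W), np.inf)                   # Z value (depth) for each pixel

def draw_pixel(x, y, z, color):
    """Draw a pixel only if it is nearer than what the Z buffer already holds."""
    if z < z_buffer[y, x]:
        z_buffer[y, x] = z
        frame_buffer[y, x] = color

draw_pixel(1, 1, 10.0, (255, 0, 0))   # far red pixel is drawn first
draw_pixel(1, 1, 5.0, (0, 255, 0))    # nearer green pixel overwrites it
draw_pixel(1, 1, 8.0, (0, 0, 255))    # farther blue pixel fails the depth test
```

This compare-and-write per pixel is the hidden surface elimination that the rest of the description builds on.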
- the graphics memory 119 may be configured by providing a graphics area inside the RAM 103 described above. Also, the graphics processor 130 controls graphics-related processing, for example, drawing and display of map information.
- FIG. 2 is a block diagram showing a functional configuration of a drawing apparatus 200 according to an embodiment of the present invention.
- the drawing device 200 includes a map information database 201, a storage unit 202, an input unit 203, an extraction unit 204, a generation unit 205, a drawing unit 206, a change unit 207, and a detection unit 208.
- the map information database 201 stores map information.
- FIG. 3 is an explanatory diagram schematically showing information stored in the map information database 201 shown in FIG.
- a map information database 201 uses a three-dimensional coordinate system including an X axis, a Y axis, and a Z axis.
- the ground object 301 is, for example, mesh data in which a plurality of triangular polygons are combined, and each vertex of the polygon has a coordinate value using a three-dimensional coordinate system.
- the map information database 201 also stores road network data 311 using the two-dimensional coordinate system of the X axis and the Y axis.
- the road network data 311 is, specifically, data in which a plurality of links are connected by nodes.
- Each link has road width information such as the number of lanes and tunnel information for identifying whether the road is inside a tunnel.
- each node has height information in the Z-axis direction and tunnel information for identifying the start position, an intermediate position, the end position, and the like of the tunnel.
- a link indicated by a solid line (for example, link 321) is road data drawn on the ground object 301, and a link 322 indicated by a dotted line is road data in a tunnel having the tunnel information described above.
- one node 331 of the link 322 has height information in the Z-axis direction and tunnel information identifying that one end of the tunnel is open.
- the other node 332 of the link 322 has height information in the Z-axis direction and tunnel information identifying that the other end of the tunnel is open.
- the functions of the map information database 201 are realized by a recording medium such as the ROM 102, the RAM 103, the HD 105, or the CD/DVD 107 shown in FIG. 1.
- the storage unit 202 stores color information (color data) and depth information (Z value) of the image drawn by the drawing unit 206 for each pixel.
- the function of the storage unit 202 is realized by, for example, the graphics memory 119 shown in FIG. 1.
- the input unit 203 inputs arbitrary viewpoint coordinates in the above-described three-dimensional coordinate system. Specifically, the user inputs the viewpoint coordinates using the remote controller 112 or the input keys 113 shown in FIG. 1. Alternatively, the current position information can be obtained using the GPS receiver 115, the angular velocity sensor 116, the mileage sensor 117, and the inclination sensor 118 shown in FIG. 1, and the viewpoint coordinates can be obtained from the obtained current position information.
- the extraction unit 204 extracts, from the map information database 201, map information existing in the field of view from the viewpoint coordinates input by the input unit 203. Specifically, a view frustum representing the field of view from the viewpoint coordinates is set, and objects whose coordinate positions are included in the frustum are extracted.
- the generation unit 205 generates various objects based on the map information extracted by the extraction unit 204. Specifically, for example, when a link or a node having tunnel information is extracted by the extraction unit 204, a cylindrical object corresponding to the length and width of the link is generated. Further, a front end surface object that covers the front end opening of the cylindrical object and a rear end surface object that covers the rear end opening are generated. Here, the objects generated by the generation unit 205 will be described.
- FIG. 4 is a perspective view showing an object generated by the generation unit 205.
- the length L in the longitudinal direction of the cylindrical object 400 is the length of the tunnel to be generated, that is, the length of the link 322 corresponding to the road data in the tunnel shown in FIG. 3.
- the width W of the cylindrical object 400 corresponds to the road width information of the link 322 corresponding to the road data in the tunnel.
- the bottom surface 401 inside the cylindrical object 400 is drawn with a road texture.
- the front end opening 411 of the cylindrical object 400 corresponds to the entrance of the tunnel.
- a lid-shaped front end surface object 421 that covers the front end opening 411 is generated at the front end opening 411.
- the front end surface object 421 is an object having only a shape, that is, a colorless (transparent) object.
- the rear end opening 412 of the cylindrical object 400 corresponds to the exit of the tunnel.
- similarly, a lid-shaped rear end surface object 422 that covers the rear end opening 412 is generated. If objects viewed through the exit of the tunnel are not to be drawn, the rear end surface object 422 is colored; if they are to be drawn, it is set to colorless.
- the drawing unit 206 includes first to fourth drawing units 211 to 214.
- the first drawing unit 211 draws an image of one object when the one object is viewed from the viewpoint coordinates input by the input unit 203.
- the one object is, for example, the cylindrical object 400 generated by the generation unit 205.
- in this case, an image in which the front end opening 411 and the inner peripheral wall surfaces 401 to 403 of the cylindrical object 400 can be seen is drawn. More specifically, by recording the color data of this image in the frame buffer 120 shown in FIG. 1, an image in which the opening of the cylindrical object 400 and the inner peripheral wall surfaces 401 to 403 can be seen is drawn.
- the second drawing unit 212 draws, when another object different from the one object is viewed from the viewpoint coordinates, an image of the other object so as to overlap the image of the one object. More specifically, if the one object is the cylindrical object 400 and the other object is the ground object 301 extracted from the map information, the image of the cylindrical object 400 viewed from the viewpoint coordinates and the image of the ground object 301 viewed from the viewpoint coordinates are drawn so as to overlap.
- in this drawing, the depth information changed by the changing unit 207 is used for the image of the cylindrical object 400.
- for the image of the ground object 301, information on the distance from the viewpoint coordinates to the coordinate position of the ground object in the three-dimensional coordinate system is used.
- by comparing the Z value serving as the depth information of the image of the cylindrical object 400 recorded in the Z buffer 121 shown in FIG. 1 with the Z value serving as the depth information of the drawn image of the ground object 301 viewed from the viewpoint coordinates, the drawn image of the overlapping portion can be selected.
- the third drawing unit 213 draws an image of a transparent object having a transparent color that is located nearer to the viewpoint coordinates than the one object. Specifically, when the one object is the cylindrical object 400 and the transparent object is the transparent (colorless) front end surface object 421 that covers the front end opening 411 of the cylindrical object 400, the image of the front end surface object 421 is drawn after the image of the cylindrical object 400 is drawn and before the other object, the ground object 301, is drawn.
- since the image information of the front end surface object 421 has no color information but only the Z value serving as depth information, the drawing state of the image of the cylindrical object 400 is maintained in the overlapping portion between the image of the cylindrical object 400 and the image of the front end surface object 421, and only the Z value is rewritten.
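A minimal sketch of this colorless-object behavior, under the assumption of a single covered pixel: the cap contributes no color data, so only the Z value is rewritten and the already-drawn interior image is maintained (the helper `draw_colorless` is hypothetical, not from the patent):

```python
import numpy as np

frame_buffer = np.array([[[50, 50, 50]]], dtype=np.uint8)  # interior already drawn
z_buffer = np.array([[30.0]])                              # depth of the interior wall

def draw_colorless(x, y, z):
    """A transparent object passes the depth test but writes no color,
    so the frame buffer value is left untouched."""
    if z < z_buffer[y, x]:
        z_buffer[y, x] = z     # only the depth information is replaced

draw_colorless(0, 0, 4.0)      # cap placed near the tunnel entrance
```

After this call the pixel still shows the tunnel interior, but any later object farther than the cap will fail the depth test.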
- the fourth drawing unit 214 draws an image of an object located behind the one object and the other object as viewed from the viewpoint coordinates. Specifically, when the one object is the cylindrical object 400 and the other object is the ground object 301, an image of another ground object located behind them is drawn. After the drawing by the fourth drawing unit 214, by clearing the Z values of the drawn image, the drawn image can be given depth information at infinity.
- the changing unit 207 changes the depth information of the image of the one object drawn by the first drawing unit 211 to information on the distance from a position nearer to the viewpoint coordinates than the one object to the viewpoint coordinates. Specifically, if the one object is the cylindrical object 400, for example, the depth information of the image of the cylindrical object 400 is changed to the depth information of a position between the viewpoint coordinates and the front end opening of the cylindrical object 400. More specifically, the changing unit 207 changes the depth information of the image of the cylindrical object 400 to the depth information of the image of the transparent object.
- the detection unit 208 detects whether the viewpoint coordinates are inside the cylindrical object 400, that is, inside the tunnel. Specifically, detection is performed using the XY coordinate values of the viewpoint coordinates, the XY coordinate values of the link 322 corresponding to the tunnel, and the height information of the nodes 331 and 332. For example, if the XY coordinate values of the viewpoint coordinates match the XY coordinate values of the link 322 and the Z coordinate value of the viewpoint coordinates is smaller than the height information of the nodes 331 and 332 connecting the link 322, the viewpoint coordinates can be detected as coordinates inside the tunnel. Also, since the XY coordinate values of the link are expanded according to the width information of the link, the viewpoint coordinates can be detected as coordinates inside the tunnel even if they are XY coordinate values within the expanded range.
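A hedged sketch of this detection, assuming for simplicity a straight link running along the X axis between two nodes; the function name, parameters, and sample coordinates are hypothetical illustrations, not values from the patent:

```python
def in_tunnel(viewpoint, node_a, node_b, road_width):
    """Return True if the viewpoint lies within the link (expanded by the
    road width in Y) and below the height of the nodes at both ends."""
    vx, vy, vz = viewpoint
    (ax, ay, az), (bx, by, bz) = node_a, node_b
    on_link_x = min(ax, bx) <= vx <= max(ax, bx)      # along the link
    within_width = abs(vy - ay) <= road_width / 2.0   # link expanded by width
    below_ceiling = vz < min(az, bz)                  # under the node heights
    return on_link_x and within_width and below_ceiling

# hypothetical tunnel: entrance node at x=0, exit node at x=100, height 10
node_a = (0.0, 0.0, 10.0)
node_b = (100.0, 0.0, 10.0)
```

A viewpoint at (50, 1, 5) would then count as inside the tunnel, while one at (50, 0, 15) sits above the node heights and would not.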
- the functions of the input unit 203, the extraction unit 204, the generation unit 205, the drawing unit 206, the change unit 207, and the detection unit 208 described above are realized by causing the CPU 101 or the graphics processor 130 to execute a program recorded on a recording medium such as the ROM 102, the RAM 103, the HD 105, or the CD/DVD 107 shown in FIG. 1, or by the input I/F 111.
- FIG. 5 is a flowchart illustrating a drawing processing procedure according to the first embodiment.
- first, when viewpoint coordinates are input (step S501: Yes), the map information existing in the view frustum representing the field of view from the viewpoint coordinates, that is, the ground object 301 and the road network data 311, is extracted (step S502).
- next, it is detected whether or not the road network data 311 in the frustum includes tunnel information (step S503). If tunnel information is not included (step S503: No), normal drawing processing is performed (step S504). Specifically, an image of each object visible from the viewpoint coordinates is drawn, and the Z values of the images of the objects are compared using a hidden surface elimination method such as the Z-buffer method, thereby performing pseudo three-dimensional drawing.
- on the other hand, if tunnel information is included (step S503: Yes), it is detected whether the viewpoint coordinates are coordinates inside the tunnel (step S505). When it is detected that the viewpoint coordinates are inside the tunnel (step S505: Yes), the process proceeds to step S504. If the viewpoint coordinates are not coordinates inside the tunnel (step S505: No), tunnel drawing processing is performed (step S506). Here, a specific processing procedure of the tunnel drawing processing (step S506) will be described.
- FIG. 6 is a flowchart showing a specific processing procedure of the tunnel drawing processing.
- This tunnel drawing processing procedure is a processing procedure when an object viewed from the exit of the tunnel is not drawn.
- FIG. 7 to FIG. 9 are explanatory diagrams showing drawing contents in the tunnel drawing processing.
- first, a cylindrical object 400 whose inner peripheral wall surfaces 401 to 403 can be seen from the viewpoint coordinates, a front end surface object 421, and a rear end surface object 422 are generated (step S601).
- the front end surface object 421 is colorless, and the rear end surface object 422 is colored, for example, black.
- next, the distance from the viewpoint coordinates to the cylindrical object 400, the distance from the viewpoint coordinates to the front end surface object 421, and the distance from the viewpoint coordinates to the rear end surface object 422 are calculated (step S602).
- then, the image of the cylindrical object 400 viewed from the viewpoint coordinates, that is, the images of the inner peripheral wall surfaces 401 to 403, and the image of the rear end surface object 422 are drawn (step S603).
- the color data of the images of the inner peripheral wall surfaces 401 to 403 and the image of the rear end surface object 422 are recorded in the frame buffer 120 shown in FIG.
- FIG. 7 is an explanatory diagram showing the drawing contents in step S603.
- a cylindrical object 400 composed of a bottom surface 401, a side wall surface 402, and a ceiling surface 403 exists in a view frustum 700 representing the field of view from the viewpoint coordinates V.
- the rear end surface object 422 is located at the rear end opening 412 of the cylindrical object 400.
- a road object 701 is formed in front of the front end opening 411.
- the drawing image at this stage is indicated by reference numeral 710 in the figure.
- the drawing image 710 includes a drawing image 711 of the bottom surface 401, a drawing image 712 of the side wall surface 402, a drawing image 713 of the ceiling surface 403 (hereinafter referred to as the "inner peripheral wall surface images 711 to 713"), a drawing image 714 of the rear end surface object 422 (hereinafter referred to as the "rear end surface image 714"), and a drawing image 715 of the road object 701.
- next, depth information of the inner peripheral wall surface images 711 to 713 and the rear end surface image 714 is recorded (step S604). Specifically, a value corresponding to the distance from the viewpoint coordinates V calculated in step S602 to each point of the cylindrical object 400 is recorded in the Z buffer 121 for each pixel of the inner peripheral wall surface images 711 to 713. Further, a value corresponding to the distance from the viewpoint coordinates V to the rear end surface object 422 is recorded in the Z buffer 121 for each pixel of the rear end surface image 714.
- next, a colorless image of the front end surface object 421 is drawn (step S605). Specifically, since the front end surface object 421 is colorless, the values in the frame buffer 120 do not change, and the inner peripheral wall surface images 711 to 713 and the rear end surface image 714 remain drawn.
- the drawing contents in step S605 will be described.
- FIG. 8 is an explanatory diagram showing the drawing contents in step S605.
- the front end surface object 421 is arranged at the front end opening 411 of the cylindrical object 400. Since the front end surface object 421 is colorless, the values in the frame buffer 120 for the inner peripheral wall surface images 711 to 713 and the rear end surface image 714 overlapping the drawing range of the front end surface object 421 are not updated, and the drawing of the inner peripheral wall surface images 711 to 713 and the rear end surface image 714 is maintained. On the other hand, by the drawing in step S605, the depth information (Z values) of the inner peripheral wall surface images 711 to 713 and the rear end surface image 714 is updated from the values recorded in step S604 to values corresponding to the distance from the viewpoint coordinates V to the front end surface object 421 (step S606).
- next, the distance from the viewpoint coordinates V to the ground object 301 viewed from the viewpoint coordinates V is calculated (step S607). Then, an image of the ground object 301 viewed from the viewpoint coordinates V is drawn (step S608). In this drawing, the Z values serving as the depth information updated in step S606 are compared with values corresponding to the distance calculated in step S607.
- here, the drawing contents in step S608 will be described.
- FIG. 9 is an explanatory diagram showing the drawing contents in step S608.
- the ground object 301 is arranged so as to overlap the cylindrical object 400.
- the hem data 301a and 301b of the ground object 301 are arranged inside the cylindrical object 400. Since the hem data 301a on the front end opening 411 side is located behind the front end surface object 421 as viewed from the viewpoint coordinates V, the image of the hem data 301a on the front end opening 411 side viewed from the viewpoint coordinates V is hidden-surface removed by the Z values of the transparent drawing image of the front end surface object 421.
- similarly, since the hem data 301b on the rear end opening 412 side is located behind the front end surface object 421 as viewed from the viewpoint coordinates V, the image of the hem data 301b viewed from the viewpoint coordinates V is also hidden-surface removed by the Z values of the transparent drawing image of the front end surface object 421.
- further, since the hem data 301b corresponds to the back face of the polygons of the ground object 301 as viewed from the viewpoint coordinates V, it is not drawn by the back-face culling process either.
- the ground object 301c outside the cylindrical object 400 is drawn as a drawing image 716.
- in this way, the Z values of the internal images 711 to 714 of the cylindrical object 400 representing the tunnel are replaced with the Z values of the colorless front end surface object 421. Then, by drawing the ground object 301 behind the front end surface object 421, the images of the ground object (hem data 301a and 301b) overlapping the internal images 711 to 714 of the cylindrical object 400 can be hidden-surface removed while the drawing state of the internal images 711 to 714 is maintained. As a result, it is possible to draw as if a tunnel were formed in the ground object 301, and the user can intuitively recognize that the actual scenery and the drawn image match.
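The sequence above (steps S603 to S608) can be sketched on a 3-pixel scanline; the depths, pixel positions, and the helper `draw` are illustrative assumptions, not values from the patent. Pixel 0 lies outside the tunnel; pixels 1 and 2 show the tunnel interior:

```python
import numpy as np

INTERIOR, TERRAIN = 1, 2                 # symbolic "colors" for the demo
frame = np.zeros(3, dtype=np.int64)      # frame buffer: color per pixel
zbuf = np.full(3, np.inf)                # Z buffer: depth per pixel

def draw(x, z, color):
    """Standard depth test; color=None models a colorless (transparent) object."""
    if z < zbuf[x]:
        zbuf[x] = z                      # depth is always rewritten on success
        if color is not None:
            frame[x] = color             # color is written only if the object has one

# S603: draw the tunnel interior (inner peripheral wall images) at depth 30
draw(1, 30.0, INTERIOR)
draw(2, 30.0, INTERIOR)
# S605/S606: draw the colorless front end surface object at depth 4;
# the interior color is maintained, only the Z values are replaced
draw(1, 4.0, None)
draw(2, 4.0, None)
# S608: draw the ground object; the hem inside the tunnel (depth 10) fails the
# depth test against 4 and is hidden-surface removed, while the part outside
# the tunnel (pixel 0) is drawn normally
draw(0, 10.0, TERRAIN)
draw(1, 10.0, TERRAIN)
```

Pixel 1 keeps the interior color even though the ground hem is geometrically nearer than the tunnel wall (10 < 30), which is exactly the effect the transparent cap achieves.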
- FIG. 10 is a flowchart of a tunnel drawing process procedure according to the second embodiment.
- This tunnel drawing processing procedure is a specific drawing processing in step S506 shown in FIG.
- FIGS. 11 to 14 are explanatory diagrams showing the drawing contents in this tunnel drawing processing. Note that the drawing processing procedure shown in FIG. 5 also applies to the second embodiment, and a description thereof will be omitted.
- first, the near plane N1 of the view frustum is moved to the exit position of the tunnel (step S1001).
- the near plane after the movement is referred to as near plane N2.
- next, an image of the ground object 341 and an image of the road object 342 in the view frustum 1100A visible from the viewpoint coordinates V are drawn (step S1002).
- the drawing image at this stage is indicated by reference numeral 1100 in the figure. Then, the depth information of the drawn images 1111 and 1112 is cleared (step S1003).
- thereby, the ground object 341 and the road object 342 can be regarded as objects located at infinity as viewed from the viewpoint coordinates V.
- next, the near plane N2 is returned to its original position to become near plane N1 again (step S1004).
- a cylindrical object 400, a colorless front end object 421, and a colorless rear end surface object 422 existing in the viewing frustum 1100B indicating the field of view from the viewpoint coordinates V are generated (step S1005).
- next, the distance from the viewpoint coordinates V to the cylindrical object 400, the distance from the viewpoint coordinates V to the front end surface object 421, and the distance from the viewpoint coordinates V to the rear end surface object 422 are calculated (step S1006).
- then, the image of the cylindrical object 400 viewed from the viewpoint coordinates V, that is, the inner peripheral wall surface images 711 to 713, is drawn (step S1007).
- specifically, the color data of the inner peripheral wall surface images 711 to 713 is recorded in the frame buffer 120.
- next, depth information of the inner peripheral wall surface images 711 to 713 is recorded (step S1008). Specifically, a value corresponding to the distance from the viewpoint coordinates V calculated in step S1006 to each point of the cylindrical object 400 is recorded in the Z buffer 121 for each pixel of the inner peripheral wall surface images 711 to 713.
- next, a colorless image of the front end surface object 421 and a colorless image of the rear end surface object 422 are drawn (step S1009). Specifically, since the front end surface object 421 and the rear end surface object 422 are colorless, the values in the frame buffer 120 do not change, and the inner peripheral wall surface images 711 to 713 remain drawn. On the other hand, by this drawing, the depth information (Z values) is updated from the values recorded in step S1008 to values corresponding to the distance from the viewpoint coordinates V to the front end surface object 421 (step S1010).
- next, the distance from the viewpoint coordinates V to the ground object 301 is calculated (step S1011). Then, as shown in FIG. 14, an image 716 of the ground object 301 is drawn (step S1012). In this drawing, the Z values serving as the depth information updated in step S1010 are compared with values corresponding to the distance calculated in step S1011.
- the ground object 301 is arranged so as to overlap the cylindrical object 400. Inside the cylindrical object 400, the hem data 301a and 301b of the ground object 301 are arranged. Since the hem data 301a on the front end opening 411 side is located behind the front end surface object 421 as viewed from the viewpoint coordinates V, the image of the hem data 301a on the front end opening 411 side viewed from the viewpoint coordinates V is hidden-surface removed according to the Z values of the transparent drawing image of the front end surface object 421.
- Similarly, since the skirt data 301b on the rear end opening 412 side is located behind the front end face object 421 as seen from the viewpoint coordinates V, the image of the skirt data 301b seen from the viewpoint coordinates V is erased as a hidden surface according to the Z value of the transparent drawing image of the front end face object 421.
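The hidden-surface removal described above can be illustrated with the same kind of toy Z buffer. This is a sketch; the pixel coordinates, distances, and the `draw_ground_pixel` helper are invented for illustration:

```python
import numpy as np

zbuf = np.full((4, 4), np.inf)            # Z buffer
frame = np.zeros((4, 4, 3), np.uint8)     # frame buffer 120 (sketch)

# steps S1009-S1010: the colorless front end face object 421 has already
# written z = 2.0 at pixel (1, 1), depth only, no color
zbuf[1, 1] = 2.0

def draw_ground_pixel(x, y, z, color):
    """Step S1012: a ground pixel is drawn only where it is nearer
    than what the Z buffer already holds."""
    if z < zbuf[y, x]:
        zbuf[y, x] = z
        frame[y, x] = color

draw_ground_pixel(1, 1, 3.0, (0, 128, 0))  # skirt data behind the end face: rejected
draw_ground_pixel(2, 2, 4.0, (0, 128, 0))  # ground outside the tube: drawn
```

The rejected pixel leaves the frame buffer untouched, which is exactly how the skirt data inside the tube is "erased" while the wall images already drawn there survive.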
- In addition, since the skirt data 301b corresponds to the back side of the polygons of the ground object 301 as seen from the viewpoint coordinates V, it is also not drawn, owing to the back-face culling process.
- Accordingly, only the ground object 301c outside the cylindrical object 400 is drawn as the drawn image 716.
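The back-face culling mentioned above usually amounts to a sign test on the face normal against the view direction. A minimal sketch, assuming counter-clockwise front-face winding (`is_back_facing` is an illustrative name, not from the patent):

```python
def is_back_facing(v0, v1, v2, eye):
    """True when triangle (v0, v1, v2), wound counter-clockwise as seen
    from its front side, faces away from the eye point and may be culled."""
    e1 = tuple(b - a for a, b in zip(v0, v1))    # edge v0 -> v1
    e2 = tuple(b - a for a, b in zip(v0, v2))    # edge v0 -> v2
    # face normal n = e1 x e2
    n = (e1[1] * e2[2] - e1[2] * e2[1],
         e1[2] * e2[0] - e1[0] * e2[2],
         e1[0] * e2[1] - e1[1] * e2[0])
    view = tuple(a - b for a, b in zip(v0, eye))  # eye -> face
    return sum(a * b for a, b in zip(n, view)) >= 0

# CCW triangle in the z=0 plane seen from +z: front-facing, kept
print(is_back_facing((0, 0, 0), (1, 0, 0), (0, 1, 0), (0, 0, 5)))   # -> False
# the same triangle seen from -z: back-facing, culled
print(is_back_facing((0, 0, 0), (1, 0, 0), (0, 1, 0), (0, 0, -5)))  # -> True
```

Seen from inside the tube, the far side of the ground polygon fails this test, so it never reaches the Z-buffer stage at all.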
- In this manner, the images 1111 and 1112 of the ground object 341 and the road object 342 behind the tunnel exit are drawn first, and then the internal images of the cylindrical object 400 are drawn.
- The images of the ground object (skirt data 301a, 301b) overlapping the internal images 711-713 of the cylindrical object 400 can thus be erased while the drawn state of the internal images 711-713 of the cylindrical object 400 is maintained.
- Furthermore, since the images 1111 and 1112 of the ground object 341 and the road object 342 behind the tunnel exit are also drawn, a more realistic drawn image can be obtained.
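The sequence of passes walked through above can be summarized as a sketch; `render_tunnel_scene` and its `draw` callback are hypothetical names introduced only to show the ordering:

```python
def render_tunnel_scene(draw):
    """Drawing order of the passes described in steps S1008-S1012
    (sketch; `draw` is a hypothetical callback, not an API from the patent)."""
    draw("exit scenery: ground 341 / road 342 (images 1111, 1112)")  # farthest first
    draw("inner peripheral wall images 711-713")                     # S1008: color + Z recorded
    draw("colorless end face objects 421, 422")                      # S1009-S1010: Z only
    draw("ground object 301 (image 716)")                            # S1012: Z-tested

calls = []
render_tunnel_scene(calls.append)
print(calls)  # the four passes, in order
```

The key design point is that the colorless end faces are drawn after the walls but before the ground, so their Z values can veto the skirt data without overwriting the wall pixels.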
- In the example above, the skirt data 301a and 301b of the ground object 301 are located inside the cylindrical object 400. Alternatively, as shown in FIG., the skirt data may be positioned in front of the front end opening 411 of the cylindrical object 400. In this case, a colorless front end face object 421 is formed in front of the skirt data 301a as seen from the viewpoint coordinates V, and the front end face object 421 is drawn before the ground object 301 is drawn. As a result, the image of the skirt data 301a just in front of the front end opening 411 can be erased with the transparent image of the front end face object 421.
- Similarly, a colorless rear end face object 422 is formed in front of the skirt data 301b, and the rear end face object 422 is drawn before the ground object 301 is drawn.
- As a result, the image of the skirt data 301b behind the rear end opening 412 can be erased with the transparent image of the rear end face object 422.
- As described above, according to the drawing method, the drawing program, and the drawing apparatus 200 of the embodiment of the present invention, the tunnel data is formed from the ground data, so the amount of data can be reduced, and realistic drawing that matches the actual scenery viewed can be realized by simple, high-speed processing.
- the drawing method described in the present embodiment can be realized by executing a prepared program on a computer such as a personal computer or a workstation.
- This program is recorded on a computer-readable recording medium such as a hard disk, a flexible disk, a CD-ROM, an MO, or a DVD, and is executed by being read from the recording medium by the computer.
- The program may also be a transmission medium that can be distributed via a network such as the Internet.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP05726661A EP1732043A4 (en) | 2004-03-31 | 2005-03-15 | TRACING METHOD, TRACING PROGRAM, AND TRACING EQUIPMENT |
JP2006511943A JP4388064B2 (ja) | 2004-03-31 | 2005-03-15 | Tunnel image drawing method, tunnel image drawing program, and tunnel image drawing device |
US10/594,320 US7633500B2 (en) | 2004-03-31 | 2005-03-15 | Plotting method, plotting program, and plotting equipment |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2004108249 | 2004-03-31 | ||
JP2004-108249 | 2004-03-31 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2005098760A1 true WO2005098760A1 (ja) | 2005-10-20 |
Family
ID=35125292
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2005/004492 WO2005098760A1 (ja) | 2005-03-15 | Drawing method, drawing program, and drawing device |
Country Status (5)
Country | Link |
---|---|
US (1) | US7633500B2 (ja) |
EP (1) | EP1732043A4 (ja) |
JP (1) | JP4388064B2 (ja) |
CN (1) | CN1942903A (ja) |
WO (1) | WO2005098760A1 (ja) |
Families Citing this family (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8130222B1 (en) * | 2004-10-19 | 2012-03-06 | Rockwell Collins Simulation And Training Solutions Llc | System and method for resolving visual priority among coincident primitives |
JP4397372B2 (ja) * | 2005-12-28 | 2010-01-13 | トヨタ自動車株式会社 | 3次元形状データの作成方法、3次元形状データの作成装置、及び、3次元形状データの作成プログラム |
JP4738550B2 (ja) * | 2009-02-16 | 2011-08-03 | 三菱電機株式会社 | 地図情報処理装置 |
DE112009004047B4 (de) * | 2009-02-17 | 2015-02-12 | Mitsubishi Electric Corp. | Karteninformations -Verarbeitungsvorrichtung |
WO2012060114A1 (ja) * | 2010-11-01 | 2012-05-10 | 三菱電機株式会社 | 描画装置及び描画方法 |
US20150029214A1 (en) * | 2012-01-19 | 2015-01-29 | Pioneer Corporation | Display device, control method, program and storage medium |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH0970481A (ja) * | 1995-09-05 | 1997-03-18 | Namco Ltd | 三次元ゲーム装置及び画像合成方法 |
JP2002222431A (ja) * | 2001-01-29 | 2002-08-09 | Namco Ltd | 画像生成システム、プログラム及び情報記憶媒体 |
JP2003216967A (ja) * | 2002-01-25 | 2003-07-31 | Namco Ltd | 画像生成システム、プログラム及び情報記憶媒体 |
Family Cites Families (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP3266236B2 (ja) | 1995-09-11 | 2002-03-18 | 松下電器産業株式会社 | 車載用ナビゲーション装置 |
JPH09161096A (ja) * | 1995-12-06 | 1997-06-20 | Matsushita Electric Ind Co Ltd | 3次元画像表示制御装置 |
US5877768A (en) * | 1996-06-19 | 1999-03-02 | Object Technology Licensing Corp. | Method and system using a sorting table to order 2D shapes and 2D projections of 3D shapes for rendering a composite drawing |
JP2938845B1 (ja) | 1998-03-13 | 1999-08-25 | 三菱電機株式会社 | 3次元cg実写映像融合装置 |
JP3668019B2 (ja) * | 1998-10-27 | 2005-07-06 | 株式会社ソニー・コンピュータエンタテインメント | 記録媒体、画像処理装置および画像処理方法 |
GB2354416B (en) * | 1999-09-17 | 2004-04-21 | Technologies Limit Imagination | Depth based blending for 3D graphics systems |
- 2005
- 2005-03-15 EP EP05726661A patent/EP1732043A4/en not_active Withdrawn
- 2005-03-15 CN CNA2005800108778A patent/CN1942903A/zh active Pending
- 2005-03-15 US US10/594,320 patent/US7633500B2/en not_active Expired - Fee Related
- 2005-03-15 JP JP2006511943A patent/JP4388064B2/ja not_active Expired - Fee Related
- 2005-03-15 WO PCT/JP2005/004492 patent/WO2005098760A1/ja active Application Filing
Non-Patent Citations (1)
Title |
---|
See also references of EP1732043A4 * |
Also Published As
Publication number | Publication date |
---|---|
CN1942903A (zh) | 2007-04-04 |
US7633500B2 (en) | 2009-12-15 |
JPWO2005098760A1 (ja) | 2008-02-28 |
EP1732043A1 (en) | 2006-12-13 |
US20070176928A1 (en) | 2007-08-02 |
EP1732043A4 (en) | 2010-02-10 |
JP4388064B2 (ja) | 2009-12-24 |