WO2003027959A1 - Computer program and image creation apparatus - Google Patents
Computer program and image creation apparatus
- Publication number
- WO2003027959A1 (PCT/JP2002/009845)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- images
- information
- layers
- coordinate conversion
- Prior art date
Links
- 238000004590 computer program Methods 0.000 title claims description 14
- 239000000872 buffer Substances 0.000 claims abstract description 139
- 238000006243 chemical reaction Methods 0.000 claims description 81
- 238000012545 processing Methods 0.000 claims description 57
- 238000000034 method Methods 0.000 claims description 37
- 230000006870 function Effects 0.000 claims description 23
- 230000008030 elimination Effects 0.000 claims description 16
- 238000003379 elimination reaction Methods 0.000 claims description 16
- 238000009877 rendering Methods 0.000 claims description 8
- 230000008859 change Effects 0.000 claims description 4
- 239000010410 layer Substances 0.000 description 133
- 230000009466 transformation Effects 0.000 description 16
- 238000010586 diagram Methods 0.000 description 9
- 230000001133 acceleration Effects 0.000 description 4
- 230000008569 process Effects 0.000 description 4
- 238000004891 communication Methods 0.000 description 3
- 238000005516 engineering process Methods 0.000 description 3
- 230000004044 response Effects 0.000 description 3
- 230000000694 effects Effects 0.000 description 2
- 238000005259 measurement Methods 0.000 description 2
- 238000004088 simulation Methods 0.000 description 2
- 238000012549 training Methods 0.000 description 2
- 238000005286 illumination Methods 0.000 description 1
- 230000010354 integration Effects 0.000 description 1
- 239000004973 liquid crystal related substance Substances 0.000 description 1
- 230000010287 polarization Effects 0.000 description 1
- 230000003252 repetitive effect Effects 0.000 description 1
- 238000012827 research and development Methods 0.000 description 1
- 239000002356 single layer Substances 0.000 description 1
- 230000005236 sound signal Effects 0.000 description 1
- 238000001228 spectrum Methods 0.000 description 1
- 230000000007 visual effect Effects 0.000 description 1
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T15/00—3D [Three Dimensional] image rendering
- G06T15/10—Geometric effects
- G06T15/40—Hidden part removal
- G06T15/405—Hidden part removal using Z-buffer
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2200/00—Indexing scheme for image data processing or generation, in general
- G06T2200/28—Indexing scheme for image data processing or generation, in general involving image processing hardware
Definitions
- the present invention relates to an image creation apparatus and method for creating a 3D image based on three-dimensional coordinate information, to an electronic device such as a navigation system equipped with such an image creation apparatus, and to a computer program.
- the image forming apparatus and method are preferably used for, for example, an in-vehicle navigation device, and are capable of displaying a multi-layer 3D image formed by superimposing a plurality of layers of images including 3D images.
- This navigation system has various databases and a basic configuration for displaying map information, current-location information, and various kinds of guidance information on a display device; searching for a drive route to a destination is a common function.
- the system is configured so that the searched drive route and the current position based on GPS (Global Positioning System) positioning or self-contained positioning are displayed on a map to provide guidance to the destination (navigation).
- the display of a navigation system mounted on a vehicle for guidance to the destination shows the scenery ahead of the current driving point, plus the direction of the driving route, the turning direction at intersections, the distance to the intersection, the distance to the destination, the estimated time of arrival, and so on.
- the scenery ahead is represented by an image using a technique of giving a three-dimensional visual expression on a plane based on the driver's line of sight, that is, an image using perspective (referred to as a so-called “3D image” in this specification).
- a drawing engine having a hidden surface removal function by the Z-buffer method may be used.
- the partial images that make up one frame of the 3D image (for example, for a frame showing one landscape: a road image, one building image, another building image, a sky image, and so on) are compared at each drawing position.
- the foremost image part at each drawing position is stored, and finally a one-frame 3D image consisting of only the foremost image parts at all drawing positions can be drawn.
- with the hidden surface elimination function using the Z-buffer method, it becomes possible to draw not only landscape images but also 3D images in which multiple texts, marks, patterns, designs, drawings, backgrounds, and the like are three-dimensionally superimposed.
- an on-board navigation device displays a multi-layer image in which various image information other than a map, related text information, various icons and marks, map information of different scales, etc. are superimposed on the basic map information.
- this technology displays a multi-layer image by providing a plurality of frame buffers corresponding to the total number of layers of the multi-layer image and superimposing the layer images stored in the plurality of frame buffers.
- it is contemplated that a complex, advanced multi-layer image obtained by superimposing multiple layers of images including a 3D image (referred to herein as a "multi-layer 3D image") may be displayed.
- in that case, a pair of a frame buffer and a Z-buffer is required for each layer of the multi-layer image; that is, a number of frame buffers equal to the total number of layers and a number of Z-buffers equal to the total number of layers are required, and the required memory capacity becomes enormous as a whole.
- the present invention has been made in view of the above-described problems, and it is an object of the present invention to provide an image creating apparatus and method capable of displaying a multi-layer image such as a multi-layer 3D image while keeping the overall required memory capacity relatively low, as well as electronic equipment provided with such an apparatus and computer programs.
- an image creating apparatus of the present invention comprises: a graphics memory having a plurality of frame buffers for separately storing a plurality of layers of images including a 3D image and a single Z-buffer provided in common for the plurality of frame buffers; and a drawing device that sequentially generates the images of the plurality of layers while performing hidden surface elimination using the single Z-buffer in a time-division manner, sequentially stores them in the plurality of frame buffers, and creates a multi-layer 3D image by superimposing the images of the plurality of layers stored in the plurality of frame buffers after the hidden surface elimination has been executed.
- according to the image creating apparatus of the present invention, the drawing device sequentially generates the images of the plurality of layers including the 3D image while performing hidden surface elimination using the one Z-buffer in a time-division manner, and sequentially stores them in the plurality of frame buffers.
- thereby, each of the plurality of frame buffers is set to a state in which a 3D image or the like that has undergone hidden surface elimination by the Z-buffer method is stored.
- then, the drawing device creates a multi-layer 3D image by superimposing the plurality of layers of images, including the 3D image, stored in the plurality of frame buffers and having been subjected to hidden surface elimination.
- the image forming apparatus of the present invention is suitable for applications in which it is very important in practice to reduce the memory capacity for graphics, such as a car navigation system.
- the Z-buffer is cleared each time the generation of each of the images of the plurality of layers by the drawing device is completed.
- since the Z-buffer is cleared each time the generation of each of the images of the plurality of layers is completed, one Z-buffer can be used in a time-division manner to perform hidden surface removal for the images of the plurality of layers without any problem.
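The time-division reuse of a single Z-buffer described above can be sketched as a toy rasterizer. This is a minimal illustration only: the 4×3 buffer size, the function names, and the string "colors" are assumptions for clarity, not the patent's implementation.

```python
# Two frame buffers share one Z-buffer, which is cleared between layers
# (time-division use). All names and sizes are illustrative.
W, H = 4, 3
INF = float("inf")

frame_buffers = [[[None] * W for _ in range(H)] for _ in range(2)]
z_buffer = [[INF] * W for _ in range(H)]   # the single shared Z-buffer

def draw_pixel(layer, x, y, depth, color):
    """Standard Z-buffer test: keep only the nearest fragment."""
    if depth < z_buffer[y][x]:
        z_buffer[y][x] = depth
        frame_buffers[layer][y][x] = color

def clear_z_buffer():
    """Reset depths so the next layer's hidden-surface test starts fresh."""
    for row in z_buffer:
        for x in range(W):
            row[x] = INF

# Layer 0: two fragments compete at (1, 1); the nearer one wins.
draw_pixel(0, 1, 1, 5.0, "road")
draw_pixel(0, 1, 1, 2.0, "building")   # nearer, so it overwrites "road"
clear_z_buffer()                        # reuse the same Z-buffer for layer 1
draw_pixel(1, 1, 1, 9.0, "icon")        # drawn despite depth 9: buffer was cleared
```

Without the `clear_z_buffer()` call, the depth 2.0 left over from layer 0 would wrongly suppress layer 1's fragment, which is why the clear between layers is essential.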
- the drawing device draws an image in accordance with the perspective for at least one of the images of the plurality of layers.
- for at least one of the multilayer 3D images, for example, the scene that the driver sees from the driver's seat is displayed as a 3D image by a perspective drawing method, making the image easy to recognize because it matches the scenery the driver actually sees.
- at least one of the plurality of layers of images is drawn for each object based on three-dimensional coordinate information of an object included in the 3D image.
- a drawing application processing unit having drawing object information generating means for generating the drawing object information in a single coordinate system for each predetermined information unit, and coordinate conversion information generating means for generating coordinate conversion information defining at least a viewpoint and a field of view when drawing the 3D image; drawing object information storing means that stores the generated drawing object information; and coordinate conversion information storing means that stores the generated coordinate conversion information.
- the image creating apparatus further comprises a graphics library, and the drawing device sequentially generates the images of the plurality of layers by causing the stored coordinate conversion information to act on the stored drawing object information for each of the predetermined information units.
- the drawing application processing unit and the graphics library generate at least one, and preferably a plurality, of the images of the plurality of layers.
- the data for drawing the image is separated into coordinate conversion information and drawing object information, and each is generated, stored, and managed by its own information generating means.
- based on the data thus stored and managed, the drawing device generates a 3D image as a display image and outputs it to the display device.
- the drawing speed by the drawing device can be improved by separately and independently handling the coordinate conversion information and the drawing object information, and executing the drawing processing based on these separately.
- the drawing object information generating means is described as “generating drawing object information in a single coordinate system for each predetermined information unit”.
- the predetermined information unit is, for example, a unit of a display list.
- the coordinate conversion information is made to correspond to the unit that acts on the drawing object information.
- the coordinate system is unified within the same predetermined information unit such as the same display list, but it is not necessary to unify the coordinate system between different predetermined information units.
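The separation described above — drawing object information kept in a single local coordinate system per information unit, with coordinate conversion applied only when the unit is executed — can be sketched as follows. The dictionary-based registry, the identifiers, and the use of a simple 2D translation in place of a full 3D transform are all illustrative assumptions.

```python
# Drawing object information (a "display list" in one local coordinate
# system) is stored untransformed; coordinate conversion information (a
# "scene object") is applied only at execution time. Names are assumed.
display_lists = {}   # display-list id -> vertices in a single coordinate system
scene_objects = {}   # scene-object identifier -> (dx, dy) conversion parameters

def create_display_list(list_id, vertices):
    display_lists[list_id] = list(vertices)

def set_scene_object(ident, dx, dy):
    scene_objects[ident] = (dx, dy)

def execute_display_list(list_id, ident):
    """Apply the scene object's conversion to the stored object data."""
    dx, dy = scene_objects[ident]
    return [(x + dx, y + dy) for (x, y) in display_lists[list_id]]

create_display_list("roads", [(0, 0), (1, 0)])
set_scene_object("morning", 10, 0)
set_scene_object("evening", -10, 0)
# The same object data is redrawn under two different conversions:
print(execute_display_list("roads", "morning"))   # [(10, 0), (11, 0)]
```

Because the object data never changes, switching viewpoints only requires regenerating the small conversion record, which is the source of the drawing-speed benefit the text describes.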
- for at least one of the plurality of layers of images that changes successively with time, the coordinate conversion information generating means may regenerate the coordinate conversion information while keeping the drawing object information fixed.
- the drawing application processing unit and the graphics library may be configured to generate and accumulate the drawing object information and the coordinate conversion information for the images of the plurality of layers by a multitask.
- for at least one of the images of the plurality of layers, a plurality of pieces of coordinate conversion information generated by the coordinate conversion information generating means may be stored in the coordinate conversion information storage means, and a plurality of pieces of drawing object information generated by the drawing object information generating means may be stored in the drawing object information storage means, with the drawing device configured to draw using any combination of the coordinate conversion information and the drawing object information.
- in this case, the stored pieces of coordinate conversion information and drawing object information can be drawn by the drawing device in any combination, and the displayed image can be changed quickly into various images according to the user's request.
- the drawing application processing unit includes a list creation program that creates the list of the drawing object information, a clear program that clears the Z-buffer, and an execution instruction program that instructs the drawing device to execute drawing.
- one Z-buffer can be shared by a plurality of frame buffers in a time-division manner.
- the drawing device can thus draw multi-layer 3D images more easily.
- the graphics library may be configured to have a function of managing the drawing object information, a function of controlling the graphics memory, and a function of controlling the drawing device. With this configuration, these functions of the graphics library allow the multi-layer 3D image to be actually drawn by the drawing device.
- the drawing application processing unit may be configured to have a function of supplying, for at least one of the plurality of layers of images, map information including three-dimensional coordinate information from a map database of a navigation system.
- at least one of the images of the plurality of layers is based on the map information from the map database, together with position information of the moving body from a GPS positioning device or the like included in the navigation system and route information input by the driver.
- an image based on map information including three-dimensional coordinate information, subjected to predetermined coordinate conversion, is displayed as one layer constituting the multilayer 3D image.
- guidance information that does not involve coordinate conversion can be superimposed on the map information as another layer constituting the multi-layer 3D image and provided for the driver's convenience.
- the coordinate conversion information may be configured to include, for at least one of the images of the plurality of layers, information specifying a light source in addition to the information defining the viewpoint and the field of view.
- in this case, the drawing object information is converted based on the light source information in addition to the viewpoint and field-of-view information as coordinate conversion information for at least one of the plurality of layers of images, and a more realistic image can be displayed. Moreover, even if the drawing device is replaced, the coordinate conversion information such as the viewpoint, field-of-view, and light source information can be applied as it is, so that a similarly realistic image can be obtained while portability is secured.
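The role of viewpoint, field of view, and light source as coordinate conversion information can be illustrated with a minimal pinhole projection and diffuse-lighting term. This is a generic textbook sketch, not the patent's method: the function names, the camera model (looking along +z from a point on the z axis), and simple Lambert shading are assumptions.

```python
import math

# Illustrative only: "coordinate conversion information" = viewpoint + FOV
# (projection) and optionally a light direction (shading). Names assumed.
def project(point, eye_z, fov_deg):
    """Map a 3D point to normalized screen coordinates for a pinhole
    camera at (0, 0, eye_z) looking along +z."""
    x, y, z = point
    f = 1.0 / math.tan(math.radians(fov_deg) / 2)   # focal factor from FOV
    zc = z - eye_z                                   # depth in camera space
    return (f * x / zc, f * y / zc)

def lambert(normal, light_dir):
    """Diffuse intensity from the light-source part of the conversion info."""
    dot = sum(n * l for n, l in zip(normal, light_dir))
    return max(dot, 0.0)

# The same drawing object under a given viewpoint/FOV and light:
print(project((1.0, 0.0, 2.0), eye_z=0.0, fov_deg=90.0))   # ≈ (0.5, 0.0)
print(lambert((0.0, 0.0, 1.0), (0.0, 0.0, 1.0)))
```

Changing `eye_z`, `fov_deg`, or `light_dir` re-renders the same stored object data, which is the portability property the text claims for keeping conversion information separate from the drawing object information.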
- the viewpoint may be set, for at least one of the plurality of layers of images, based on the viewpoint of the operator of the mobile body.
- in that case, for at least one of the multi-layer images, the scenery seen through the driver's eyes is displayed as a 3D image, making it easier for the driver to recognize the image as a three-dimensional image that matches the actual scenery.
- the viewpoint may be set manually.
- the field of view is set based on the field of view of the mobile operator for at least one of the plurality of layers of images.
- At least one of the plurality of layers of images is displayed as a 3D image of a scene that is in the driver's field of view.
- the field of view can be set manually.
- an electronic apparatus of the present invention includes the image creating apparatus (including its various aspects) of the present invention described above, and a display device that outputs the multilayer 3D image drawn by the drawing device.
- since the electronic apparatus of the present invention includes the above-described image creating apparatus, various electronic devices can be realized: a navigation system such as an in-vehicle navigation system capable of displaying multi-layer 3D images, a game device such as an arcade game machine or a video game console, a computer such as a personal computer that can display multi-layer 3D images, and so on.
- the computer program of the present invention causes a computer to function as the above-described image creating apparatus of the present invention (including its various aspects). More specifically, the computer is caused to function as the graphics memory, the drawing device, the drawing application processing unit, the graphics library, and the like of the present invention, including the drawing object information generating means, the coordinate conversion information generating means, the drawing object information storage means, and the coordinate conversion information storage means.
- the computer program is stored in a storage medium such as a CD-ROM (Compact Disc Read Only Memory), a DVD-ROM (DVD Read Only Memory), or a hard disk.
- an image creation method of the present invention uses a graphics memory having a plurality of frame buffers for separately storing a plurality of layers of images including a 3D image and a single Z-buffer provided in common for the plurality of frame buffers, together with a drawing device for creating a multi-layer 3D image formed by superimposing the images of the plurality of layers. The method comprises: sequentially generating the images and sequentially storing them in the plurality of frame buffers while performing hidden surface removal using the one Z-buffer in a time-division manner; and superimposing the images of the plurality of layers stored in the plurality of frame buffers after the hidden surface removal has been executed.
- a plurality of layers of images including 3D images, each formed using the Z-buffer method, can be superimposed and displayed together on the same screen.
- since the graphics memory has one Z-buffer provided in common for the plurality of frame buffers, the memory capacity can be significantly reduced compared with a case where a Z-buffer is provided separately for each of the plurality of layers of images.
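The scale of the saving is easy to estimate with back-of-the-envelope arithmetic. The display size, depth format, and layer count below are illustrative assumptions, not figures from the patent:

```python
# With N layers, the shared design needs 1 Z-buffer instead of N.
W, H, BYTES_PER_Z, LAYERS = 800, 480, 2, 4      # assumed figures

z_per_layer = W * H * BYTES_PER_Z               # one Z-buffer: 768,000 bytes
separate_total = LAYERS * z_per_layer           # a Z-buffer per layer
shared_total = z_per_layer                      # single shared Z-buffer
print(separate_total - shared_total)            # depth-memory bytes saved
```

Under these assumptions the shared design saves about 2.3 MB of depth memory for four layers, while the frame-buffer cost (one per layer) is unchanged in both designs.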
- the Z-buffer is cleared each time generation of each of the images of the plurality of layers by the drawing device is completed.
- since the Z-buffer is cleared each time the generation of each of the images of the plurality of layers is completed, one Z-buffer can be used in a time-division manner to perform hidden surface removal for the images of the plurality of layers without any problem.
- At least one of the plurality of layers of the image is drawn for each object based on the three-dimensional coordinate information of the object included in the 3D image.
- the data for drawing the image is separated into coordinate conversion information and drawing object information, each of which is generated, stored, and managed by its own information generating means.
- a 3D image is generated as a display image by a drawing device based on the stored and managed data, and output to a display device.
- the drawing speed by the drawing device can be improved by treating the coordinate conversion information and the drawing object information separately and independently, and executing the drawing processing based on these separately.
- each 3D image can be generated in a burst manner, so that the hidden surface elimination using the Z-buffer when drawing each 3D image is performed in a very short time. Therefore, using one Z-buffer in a time-sharing manner, hidden surface elimination of the image of each layer can be executed without any problem.
- a single Z buffer provided in common for a plurality of frame buffers for separately storing multiple layers of 3D images is used in a time-division manner. Since hidden surface elimination is performed, it is possible to display a multi-layer 3D image while reducing the memory capacity required for the graphics memory.
- FIG. 1 is a block diagram showing a basic configuration of an image forming apparatus according to a first embodiment of the present invention.
- FIG. 2 is a diagram for explaining an internal configuration of a graphics library of the image creating apparatus.
- FIG. 3 is a diagram for explaining management of a scene object of the image creating apparatus.
- FIG. 4 is a flowchart showing the operation flow of the graphics library.
- FIG. 5 is a flowchart showing the operation flow of the drawing device of the image creating apparatus.
- FIG. 6 is a flowchart showing the flow of the operation of the drawing application processing unit.
- FIG. 7 is a sequence chart showing the operation of the image creating apparatus.
- FIG. 8 is a diagram illustrating an example of drawing.
- FIG. 9 is a diagram showing a configuration of a navigation system applied to the image creating apparatus of the present invention as a second embodiment.
- the image creating apparatus of the present invention is described below as applied to an in-vehicle navigation system, but the present invention is not limited to this embodiment; it is also suitable for use in creating images on a personal computer and for video games.
- FIG. 1 is a block diagram showing a basic configuration of the image creating apparatus according to the first embodiment
- FIG. 2 is a diagram for explaining an internal configuration of a graphics library constituting the image creating apparatus.
- FIG. 3 is a diagram for explaining the management of scene objects related to drawing
- FIG. 4 is a flowchart showing an operation flow of a graphics library constituting the image creating device
- FIG. 5 is a flowchart showing the operation flow of the drawing device constituting the image creating apparatus, and
- FIG. 6 is a flowchart showing the operation flow of the drawing application processing unit.
- FIG. 7 is a sequence chart showing the operation of the image creating apparatus of the present embodiment
- FIG. 8 is a diagram showing an example of drawing.
- the image creating apparatus 1 comprises a drawing application processing unit 11, a graphics library 12, a drawing device 13, a graphics memory 16, and a superposition unit 17.
- coordinate conversion information 14 such as the viewpoint, field of view, and light source, and information such as roads, buildings, and map data are input to the drawing application processing unit 11.
- the graphics library 12 and the drawing device 13 form a system as a single unit, and can be arbitrarily replaced by the drawing application processing unit 11.
- the image creating apparatus 1 is configured to create a multilayer 3D image.
- in the graphics memory 16, a first frame buffer 16a is provided for drawing the 3D image of the first layer, and a second frame buffer 16b is provided for drawing the 3D image of the second layer.
- one shared Z buffer 16c is provided for these two frame buffers. The Z buffer 16c is used for the first frame buffer 16a and the second frame buffer 16b in a time-division manner, as described later.
- a display device 19 such as a liquid crystal display or a CRT display is configured to display and output the result as one multilayer 3D image.
- a case will be described in which a multi-layer 3D image composed of two-layer 3D images is drawn.
- alternatively, three or more frame buffers may be provided in the graphics memory 16 to draw a multi-layer 3D image composed of three or more layers of 3D images.
- the Z buffer 16c may be configured to be shared by all the frame buffers by using time division.
- in any case, the remarkable function and effect of the present embodiment is exhibited: a multi-layer 3D image can be drawn while suppressing an increase in the memory capacity of the graphics memory 16.
- the drawing application processing section 11 is provided with a coordinate conversion parameter creation routine 111 and a display list creation routine 112.
- the coordinate transformation parameter creation routine 111 creates coordinate transformation data for each of the multi-layer 3D images based on coordinate transformation information such as a viewpoint, a field of view, and a light source input as the coordinate transformation information 14.
- this data is managed as a scene object in the graphics library 12. Setting parameters for the scene object and applying them to drawing is performed by operating on the identifier of the scene object.
- the display list creation routine 112 creates drawing object information from roads, buildings, map information, and the like for each of the multi-layer 3D images, and inputs the drawing object information to the graphics library 12.
- This drawing object information does not include coordinate conversion information.
- the coordinate conversion information is separately set as the scene object described above. In this way, coordinate conversion information and drawing object information are individually created for each of the multi-layer 3D images, and at drawing time the coordinate conversion information is applied to the drawing object information; this realizes the replaceability of the system described above and an increase in drawing speed.
- the graphics library 12 has a scene object setting unit 121, a display list creation unit 122, and a display list execution unit 123.
- the scene object setting unit 121 stores the coordinate conversion information created by the coordinate conversion parameter creation routine 111 of the drawing application processing unit 11 in the scene object corresponding to the identifier specified by the drawing application, and saves and manages it.
- the display list creation unit 122 configures and manages the display list created by the display list creation routine 112 of the drawing application processing unit 11, for each of the multi-layer 3D images, so that the drawing device 13 can execute it directly and in a batch, enabling high-speed drawing.
- the display list execution unit 123 controls the drawing device 13 for each of the multi-layer 3D images; it sends to the drawing device 13 the display list and the coordinate conversion parameters, processed by the scene object setting unit 121 and the display list creation unit 122 so as to enable batch execution of the coordinate conversion information and the drawing object information, and causes the drawing operation to be executed.
- in this way, the coordinate conversion information and the drawing object information are stored separately for each of the multi-layer 3D images. At drawing time, the drawing device 13 applies the coordinate conversion information to the drawing object information, transforming the coordinates according to the viewpoint, field of view, light source, and other conditions set in the scene object, and draws the 3D image of each layer. Therefore, for each of the multiple layers of 3D images, the drawing object information stored in a single display list is expressed in a single coordinate system independent of the viewpoint and the field of view.
- the 3D image of the first layer created by the drawing device 13 is held in the corresponding first frame buffer 16a, and hidden surface elimination is performed using the Z buffer 16c.
- the drawing device 13 draws the first layer 3D image by applying the coordinate conversion information to the drawing object information, so that the drawing of the first layer 3D image is performed in a burst manner.
- the second layer 3D image created by the drawing device 13 is held in the corresponding second frame buffer 16b, and hidden surface elimination is performed using the Z buffer 16c.
- the 3D image of the second layer is also drawn in a burst manner.
- the two layers of 3D images stored in the first frame buffer 16a and the second frame buffer 16b are superimposed by the superimposing unit 17.
- a multi-layer 3D image is input to the display device 19 and displayed.
- since the Z-buffer 16c can be used in a time-sharing manner as described above, it is also possible to superimpose 3D images of three or more layers while performing hidden surface elimination.
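The superposition performed by the superposition unit 17 can be sketched as simple compositing of the per-layer frame buffers. The list-of-lists representation and the use of `None` as "nothing drawn here" are assumptions for illustration, not the patent's pixel format:

```python
# Sketch of the superposition unit: the upper layer's pixels override the
# lower layer wherever the upper layer drew something (None = transparent).
def superimpose(lower, upper):
    return [
        [u if u is not None else l for l, u in zip(lrow, urow)]
        for lrow, urow in zip(lower, upper)
    ]

base = [["map", "map"], ["map", "map"]]       # first-layer frame buffer
overlay = [[None, "icon"], [None, None]]      # second-layer frame buffer
print(superimpose(base, overlay))             # [['map', 'icon'], ['map', 'map']]
```

Extending to three or more layers is just a fold of `superimpose` over the frame buffers in back-to-front order, which matches the idea that each layer has already had its own hidden surfaces removed before compositing.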
- a display list is created as a procedure (1) for the 3D image of the first layer.
- the created display list is stored as the object display list (1) in the graphics library 12.
- coordinate conversion information is set as step (2).
- the coordinate conversion information is stored in the scene object (1).
- drawing is instructed as procedure (3). The graphics library 12 operates on the object display list (1) and the scene object (1), inputs the data to the drawing device 13, and draws the 3D image of the first layer into the frame buffer 16a while executing hidden surface removal using the Z buffer 16c.
- the drawing device 13 has a 3D coordinate conversion function and, based on the coordinate conversion parameters specified by the identifier (for example, the driver's viewpoint, field of view, and light source while the vehicle is running) and the drawing object information, draws and displays in 3D, for example, the scenery the driver actually sees while driving.
- a large number of display lists or scene objects for the objects may be created and stored in advance, and each of them may be combined appropriately and drawn.
- when there is an operation input from the state of waiting for an operation input from the drawing application processing unit 11 (step S101), the type of the operation is determined (step S102).
- the types of operations of the graphics library 12 include creation of a display list, setting of a scene object, clearing of the Z-buffer, and execution of a display list.
- when a display list creation instruction is issued, a display list is created based on drawing object information such as roads and buildings (step S103). After creating the display list, the process returns to step S101 and waits for the next operation input.
- if the result of determination in step S102 is that the operation input is the setting of a scene object, the coordinate conversion information based on a viewpoint, a field of view, a light source, and the like is set in the scene object specified by its identifier (step S104).
- when the setting of the scene object is completed, the process returns to step S101 and waits for the next operation input.
- if the result of determination in step S102 is that the operation input is execution of a display list, the scene object specified by the identifier is set in the drawing device 13 (step S105), and the drawing device 13 is requested to execute the display list (step S106).
- if the result of determination in step S102 is that the operation input is clearing of the Z buffer, the Z buffer 16c in the graphics memory 16 is cleared (step S107).
- the operation input for clearing the Z buffer is issued when a further 3D image is to be drawn after the execution of step S106.
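The dispatch of steps S101 to S107 above can be sketched as a loop that branches on the operation type. This is a hedged illustration; the operation names and state layout are assumptions of this sketch, not the patent's interface.

```python
def graphics_library_step(op, state):
    """Handle one operation input (steps S102-S107 above)."""
    kind = op["type"]
    if kind == "create_display_list":          # step S103
        state["display_lists"][op["id"]] = op["objects"]
    elif kind == "set_scene_object":           # step S104
        state["scene_objects"][op["id"]] = op["conversion_info"]
    elif kind == "execute_display_list":       # steps S105-S106
        state["drawing_requests"].append((op["id"], op["scene_id"]))
    elif kind == "clear_z_buffer":             # clearing the Z buffer
        state["z_buffer"] = [float("inf")] * len(state["z_buffer"])
    return state


state = {"display_lists": {}, "scene_objects": {},
         "drawing_requests": [], "z_buffer": [float("inf")] * 4}
graphics_library_step({"type": "create_display_list", "id": 1, "objects": ["road"]}, state)
graphics_library_step({"type": "execute_display_list", "id": 1, "scene_id": 1}, state)
```

After each operation the loop would return to the waiting state (step S101), as described above.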
- the drawing device 13 executes the display list in a batch to perform drawing. The execution procedure, described with reference to Fig. 3, is in accordance with the procedure described above.
- when there is an operation input from the graphics library 12 while waiting (step S201), the type of the operation is determined (step S202). Operation types include setting of a scene object and execution of a display list.
- if the operation input is the setting of a scene object, the drawing device 13 sets the coordinate conversion parameters based on the identifier (step S203).
- the flow returns to step S201 again and waits for the next operation input.
- if the result of determination in step S202 is that the operation input is execution of a display list, drawing is executed based on the coordinate conversion parameters and the display list.
- the rendered image is output from the rendering device 13 and stored in the corresponding frame buffer 16a or 16b. At this time, hidden surface elimination using the Z buffer 16c is performed.
- a display list is created by the display list creation routine 112 for each of the first-layer and second-layer 3D images, and the corresponding coordinate conversion parameters are created by the coordinate conversion parameter creation routine 111 (step S501).
- a command to clear the Z buffer 16c is issued via the graphics library 12, whereupon the Z buffer 16c is cleared (step S502).
- the 3D image of the first layer is drawn by applying the corresponding coordinate conversion parameters, also created in step S501, to the part of the display list created in step S501 that relates to the first-layer 3D image.
- hidden surface elimination is performed using the Z buffer 16c cleared in step S502 (step S503). More specifically, when one frame image of a landscape is formed from, for example, a road image, images of individual buildings, and an image of the sky, the plurality of partial images constituting the 3D image are compared sequentially at each drawing position, and the image part located nearest the viewer at each drawing position is retained. Finally, a 3D image of the frame consisting of the foremost image portions at all drawing positions is drawn.
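The per-position comparison described above is the classic Z-buffer depth test. A minimal one-dimensional sketch (real hardware works per pixel; the fragment format here is an assumption of this illustration):

```python
def draw_with_z_buffer(frame, z_buffer, fragments):
    """Keep, at each drawing position, only the nearest fragment.

    fragments: list of (position, depth, color); smaller depth = nearer.
    """
    for pos, depth, color in fragments:
        if depth < z_buffer[pos]:       # nearer than what is stored?
            z_buffer[pos] = depth       # remember the new nearest depth
            frame[pos] = color          # and its image part
    return frame


frame = [None] * 4
zbuf = [float("inf")] * 4               # a cleared Z buffer: everything far
# A "building" fragment in front of a "sky" fragment at position 1:
draw_with_z_buffer(frame, zbuf, [(1, 100.0, "sky"), (1, 10.0, "building")])
```

Regardless of the order in which the partial images arrive, only the foremost part survives at each position, which is why clearing the buffer to "infinitely far" between layers is essential.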
- the Z buffer 16c is cleared again (step S504).
- the 3D image of the second layer is drawn by applying the corresponding coordinate conversion parameters, also created in step S501, to the part of the display list created in step S501 that relates to the second-layer 3D image.
- hidden surface elimination is performed using the Z buffer 16c cleared in step S504 (step S505).
- one Z-buffer 16c is used in a time-division manner when rendering the 3D image of each layer, and can thus be shared by a plurality of frame buffers.
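Steps S501 to S505 above, with a single Z-buffer cleared between layers and a separate frame buffer per layer, can be sketched as follows (an illustrative model; the fragment representation is assumed, as before):

```python
def clear(z_buffer):
    """Reset the shared Z buffer (steps S502 / S504)."""
    for i in range(len(z_buffer)):
        z_buffer[i] = float("inf")


def render_layers(layers, width):
    """Draw each layer into its own frame buffer, time-sharing one Z buffer."""
    z_buffer = [float("inf")] * width
    frame_buffers = []
    for fragments in layers:              # one entry per layer's display list
        clear(z_buffer)                   # steps S502 / S504
        frame = [None] * width
        for pos, depth, color in fragments:   # steps S503 / S505
            if depth < z_buffer[pos]:
                z_buffer[pos] = depth
                frame[pos] = color
        frame_buffers.append(frame)
    return frame_buffers


buffers = render_layers([[(0, 5.0, "road"), (0, 2.0, "building")],
                         [(1, 1.0, "message")]], width=2)
```

Only one Z-buffer-sized memory region is needed no matter how many layers are drawn, which matches the memory-saving aim stated later in the description.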
- this sequence chart shows the interrelationship among the drawing application processing unit 11, the graphics library 12, the drawing device 13, and the graphics memory 16 over time: the horizontal direction shows the interrelationship among the units, and the vertical direction indicates the passage of time from top to bottom.
- the drawing application processing section 11 creates a display list for the 3D image of the first layer and inputs it to the graphics library 12 (step S601).
- the drawing application processing unit 11 sets a coordinate conversion parameter, that is, an identifier, and instructs the graphics library 12 to set the scene object to which it is attached.
- the drawing application processing section 11 creates a display list for the 3D image of the second layer and inputs it to the graphics library 12 (step S602).
- a coordinate conversion parameter, that is, an identifier, is set in the drawing application processing unit 11, and the graphics library 12 is instructed to set the scene object to which the coordinate conversion parameter is attached.
- the Z buffer 16c is cleared by the drawing application processing unit 11 via the graphics library 12 (step S603).
- the drawing application processing unit 11 instructs the graphics library 12 to draw by applying the set scene object to the created display list of the first-layer 3D image.
- the graphics library 12 supplies the drawing device 13 with the display list of the first-layer 3D image together with the corresponding scene object, and instructs execution of the display list (step S606).
- the drawing device 13 draws the display list based on the coordinate transformation parameters of the corresponding scene object, drawing the 3D image of the first layer in the first frame buffer 16a while executing hidden surface elimination using the Z buffer 16c (step S607). When drawing into the first frame buffer 16a is completed, the completion is notified to the graphics library 12 and the drawing application processing unit 11 (step S608).
- the drawing application processing unit 11, having received this notification, again clears the Z buffer 16c via the graphics library 12 (step S609). The drawing application processing unit 11 then instructs the graphics library 12 to draw by applying the set scene object to the created display list of the second-layer 3D image (step S610). In response, the graphics library 12 supplies the drawing device 13 with the display list of the second-layer 3D image together with the corresponding scene object and instructs execution of the display list (step S611).
- the drawing device 13 draws the display list based on the coordinate transformation parameters of the corresponding scene object, and draws the 3D image of the second layer in the second frame buffer 16b while performing hidden surface elimination using the Z buffer 16c (step S612).
- the drawing completion is transmitted to the graphics library 12 and the drawing application processing unit 11 (step S613), and the drawing completion processing is performed.
- the 3D images of the first and second layers stored in the first frame buffer 16a and the second frame buffer 16b, respectively, are superimposed by the superimposing unit 17, and the multi-layer 3D image is output to and displayed on the display device 19.
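The superimposing unit 17 described above can be sketched as a simple overlay: wherever the second-layer frame buffer was left undrawn, the first-layer image shows through. This is an assumed model of the compositing rule (the patent does not spell out a transparency convention), using `None` for undrawn positions:

```python
def superimpose(lower, upper):
    """Overlay the upper-layer image on the lower-layer image.

    Upper-layer pixels win wherever they were drawn (non-None);
    undrawn positions are transparent.
    """
    return [u if u is not None else l for l, u in zip(lower, upper)]


# First layer: the main landscape; second layer: a message sub-image.
merged = superimpose(["road", "road", "sky"], [None, "message", None])
```

This keeps the layers independent right up to display time, so either frame buffer can be redrawn without touching the other.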
- the drawing application processing unit 11 may determine, for either 3D image, whether the current display list can cover the next field of view. If it can, the same display list can be reused to draw the next 3D image.
- the single Z-buffer 16c is used to draw the 3D image of the first layer in period T1 and the 3D image of the second layer in period T2.
- Fig. 8 shows an example of the display of the multi-layer 3D image created as described above.
- the main image, which is the first-layer 3D image, shows the scene seen from the driver's line of sight while the vehicle is driving on a road in the city.
- the light source 21, the viewpoint 22, the field of view 23, and the like are coordinate conversion information represented by identifiers embedded in the scene object, while the buildings 24a, 24b, 24c, ... and the road 25 correspond to the drawing object information.
- the light source 21 is, for example, the sun in the daytime and streetlights at night, and their position and illumination direction are parameters.
- the viewpoint 22 can be a point corresponding to the driver's line of sight, so that the scene appears as if it were the actual environment in which the vehicle is traveling.
- the field of view 23 defines a predetermined image range, set to a range suitable for the driver.
- Buildings 24a, 24b, 24c, ..., roads 25, etc. correspond to the drawing object information.
- the display lists for these are created in a form that can be directly executed by the drawing device.
- for this drawing object information, data supplied from the map information database or the like of the navigation system can be used.
- as drawing object information, the data are represented in a single coordinate system containing no coordinate conversion information.
- the light source 21 (here, the sun) is ahead, so the sides of the buildings 24a, 24b, 24c, ... facing the viewer are dark and in shadow.
- the viewpoint 22 is above the road 25, and, following the rules of perspective, coordinate transformation is performed so that drawing objects such as the buildings 24a, 24b, 24c, ... and the road 25 converge in perspective.
- the main image is the first-layer 3D image from which hidden surfaces have been removed using the Z-buffer as described above.
- the sub-image 28 is drawn superimposed on the main image.
- the sub-image 28 is a 3D image depicting the state of a tollgate, likewise a 3D image from which hidden surfaces have been eliminated using the Z buffer as described above.
- the sub-image 29, which displays character information, is drawn in the lower right part of the main image.
- hidden surfaces in the sub-image 29 are also eliminated using the Z buffer, and it may be drawn as a 3D image (for example, three-dimensional characters).
- the image creating apparatus of the present embodiment is suitable for applications where it is important to reduce the memory capacity of the image forming apparatus.
- by separating the drawing object information and the coordinate conversion information, the image can be converted and drawn at high speed.
- the same drawing object can easily be drawn at different coordinates.
- a mode in which this apparatus is applied to a navigation system for a mobile body will now be described.
- the various functions of the navigation system are closely related to the image forming device and are configured as a single unit. This point will be described in detail. Note that the configuration and operation of the image creating apparatus itself are the same as those described above, so that a repetitive description will be omitted here, and the above description will be referred to as needed.
- the navigation system comprises a self-contained positioning device 30, a GPS receiver 38, a system controller 40, an input/output (I/O) circuit 41, a CD-ROM drive 51, a DVD-ROM drive 52, a hard disk device (HDD) 56, a wireless communication device 58, a display unit 60, an audio output unit 70, an input device 80, and an external interface (I/F) unit 81, all connected to a bus line 50 for transferring control data and processing data.
- the self-contained positioning device 30 includes an acceleration sensor 31, an angular velocity sensor 32, and a speed sensor 33.
- the acceleration sensor 31 is composed of, for example, a piezoelectric element, and outputs acceleration data obtained by detecting the acceleration of a vehicle.
- the angular velocity sensor 32 is constituted by, for example, a vibrating gyroscope, and outputs angular velocity data and relative azimuth data obtained by detecting the angular velocity of the vehicle at the time of changing the direction of the vehicle.
- the speed sensor 33 mechanically, magnetically, or optically detects the rotation of the axle of the vehicle, and outputs a signal of a pulse number corresponding to the vehicle speed for each rotation of the axle at a predetermined angle.
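The speed sensor's pulse output described above translates into vehicle speed via the pulses per axle revolution and the wheel circumference. A small illustrative calculation; the constants (4 pulses per revolution, 2 m circumference) are assumptions for this sketch, not values from the patent:

```python
def speed_m_per_s(pulse_count, interval_s,
                  pulses_per_rev=4, wheel_circumference_m=2.0):
    """Vehicle speed from axle pulses counted over a time interval."""
    revolutions = pulse_count / pulses_per_rev      # pulses -> axle turns
    distance_m = revolutions * wheel_circumference_m  # turns -> distance
    return distance_m / interval_s                  # distance -> speed


# 8 pulses in one second -> 2 revolutions -> 4 m travelled -> 4 m/s
v = speed_m_per_s(8, 1.0)
```

In practice the sensor emits a pulse at each predetermined angle of axle rotation, so the pulse rate is directly proportional to speed.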
- the GPS receiver 38 is a known device equipped with a digital signal processor (DSP) or microprocessor (MPU), V-RAM, memory, and the like, together with a planar polarized omnidirectional receiving antenna and a high-frequency reception processing unit. The GPS receiver 38 receives radio waves from at least three GPS satellites, performs spectrum despreading, distance measurement, Doppler measurement, and orbit data processing, and calculates position and speed. The absolute position information of the receiving point (the vehicle's position) is then continuously output through the I/O circuit 41 to the bus line 50, captured by the system controller 40, and used for display on the roads of the map.
- the system controller 40 is composed of a CPU 42, a nonvolatile solid-state ROM 43, and a working RAM 44, and exchanges data with each unit connected to the bus line 50.
- processing control through this data exchange is executed by a boot program and a control program stored in the ROM 43.
- the RAM 44 temporarily stores, in particular, setting information by which the user changes the map display from the input device 80 (for example, switching to a whole-area or district map display).
- the CD-ROM drive 51 and the DVD-ROM drive 52 read and output map database information stored in the CD-ROM 53 and the DVD-ROM 54, respectively, including various road data such as the number of lanes and road width.
- the hard disk drive 56 stores the map (image) data read by the CD-ROM drive 51 or the DVD-ROM drive 52, and can read out the data at any time after the storage.
- the hard disk unit 56 can further store audio data and video data read from the CD-ROM drive 51 or the DVD-ROM drive 52.
- while the map data on the CD-ROM 53 or the DVD-ROM 54 is read for the navigation operation, the audio data and video data stored in the hard disk device 56 can be read for audio and video output.
- the display unit 60 displays various processing data on the screen under the control of the system controller 40.
- each section of the display unit 60 is controlled by the internal graphic controller 61 based on control data transferred from the CPU 42 over the bus line 50.
- a buffer memory 62 using a V-RAM or the like temporarily stores image information that can be displayed immediately.
- the display control section 63 performs display control, and displays image data output from the graphic controller 61 on the display 64.
- the display 64 is arranged, for example, near the front panel in the vehicle.
- in the audio output section 70, the digital audio signal transferred from the system controller 40 over the bus line 50 is converted by the D/A converter 71 into an analog signal, which is variably amplified by a variable amplifier (AMP) 72 and output from the speaker 73.
- the input device 80 includes keys, switches, buttons, a remote controller, a voice input device, and the like for inputting various commands and data.
- the input device 80 is arranged around a front panel or a display 64 of a main body of the vehicle-mounted electronic system mounted in the vehicle.
- when the image creation device according to the present invention is introduced into a navigation system, it is required to appropriately display images that match the drive route.
- the image viewed from the driver's line of sight on the road on which the vehicle is currently running is displayed in 3D; displaying in 3D the image of a vehicle turning at an intersection ahead, or the scenery ahead at a place with poor visibility, is also useful for informing the driver and improving safety. It is also necessary for various messages to be superimposed on those images.
- by building an integrated system using the various devices and functions of the navigation system, the image creating apparatus can form an extremely effective navigation system.
- the image creating apparatus separates coordinate conversion information such as the viewpoint, field of view, and light source from information on drawing objects such as roads and buildings in the drawing application processing unit 11; each is created as drawing data, stored and managed by the graphics library 12, and the image is formed from the individual pieces of information by the drawing device 13 as described above.
- the drawing object information is map information including roads and buildings, and utilizes the map database of the navigation system.
- the map information is stored in the CD-ROM 53 and the DVD-ROM 54, and is read by the CD-ROM drive 51 and the DVD-ROM drive 52. Further, map information can be obtained from a predetermined site via the communication device 58 and stored in the hard disk device 56 for use.
- the map information of the drive route read by the CD-ROM drive 51 or the DVD-ROM drive 52 can be stored in the hard disk device 56 and read out at any time afterwards. This may be done when creating a drive plan.
- the map information is divided into a number of areas, and is represented by a coordinate system for each area.
- the display list creation routine 112 of the drawing application processing unit 11 converts the data into a single position-independent coordinate system and issues an instruction to the display list creation unit 122 of the graphics library 12, where the display list is formed, stored, and managed.
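The conversion from per-area map coordinates into a single position-independent coordinate system can be sketched as a simple translation by each area's origin. The area origin values and point format here are assumptions made for illustration:

```python
def to_world(area_origin, local_points):
    """Translate area-local map coordinates into the single world system."""
    ox, oy = area_origin
    return [(ox + x, oy + y) for x, y in local_points]


# A road segment stored in the area whose origin is (1000, 2000),
# given in area-local coordinates:
world = to_world((1000, 2000), [(0, 0), (50, 10)])
```

Once every area's data is expressed in the same coordinate system, a single display list can reference roads and buildings from many areas without further per-area bookkeeping.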
- the information such as the viewpoint, field of view, and light source, which is the coordinate conversion information required by the image creation device, can be obtained using the functions of the navigation system; the viewpoint and the field of view may be set at a predetermined position or range, or may be set manually.
- the direction of the sun can be specified in consideration of seasonal factors, and its position is determined for use as the light source.
- the position and time of arrival can be specified to determine the direction of the sun there, and therefore the effect of the light source position at the arrival time can be seen.
- by applying the coordinate transformation information to the drawing object information under the condition of the momentarily changing time, it is possible to confirm the change in the shadows of the landscape from sunrise to sunset, and also to change the 3D image sequentially.
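Driving the light source from the time of day, as described above, could be sketched with a deliberately crude model: sun elevation rising from sunrise to noon and falling to sunset. The 06:00/18:00 sunrise and sunset times and the sinusoidal shape are simplifying assumptions; a real system would account for date, latitude, and longitude as the text suggests:

```python
import math

def sun_elevation_deg(hour):
    """Toy sun elevation in degrees for a given hour of the day.

    Assumes sunrise at 06:00 and sunset at 18:00; returns 0 at night.
    """
    if hour < 6 or hour > 18:
        return 0.0                      # below the horizon: night
    return 90.0 * math.sin(math.pi * (hour - 6) / 12)


# Elevation at sunrise, noon, and sunset:
elevations = [sun_elevation_deg(h) for h in (6, 12, 18)]
```

Feeding each hour's sun position into the scene object, while the display list stays fixed, produces the sunrise-to-sunset shadow animation described above.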
- when displaying 3D images corresponding to the scenery of continuous travel on the same road, the coordinate transformation information is changed as the vehicle travels while the drawing object information is fixed as described above, so that 3D images can be displayed continuously and efficiently.
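The continuous-travel case above reuses one display list frame after frame while only the scene object changes. A minimal sketch of that update loop (the scene-object fields are the illustrative ones used earlier, not the patent's exact structure):

```python
def frames_along_road(display_list, viewpoints):
    """Pair the fixed display list with a fresh scene object per frame."""
    frames = []
    for vp in viewpoints:
        scene = {"viewpoint": vp, "field_of_view": 60, "light_source": "sun"}
        frames.append((display_list, scene))   # same list, new scene each frame
    return frames


# The vehicle advances 10 units per frame along the same road:
frames = frames_along_road(["road", "building"], [(0, 0), (0, 10), (0, 20)])
```

Because the drawing object information never changes, no display list is rebuilt between frames; only the cheap coordinate conversion information is updated.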
- a scene object serving as coordinate conversion information can be determined using the functions of the navigation system, and drawing object information can be determined using the map information. A 3D image can therefore be drawn from the independent coordinate conversion information and drawing object information; the image is sent to the display unit 60 of the navigation system, stored by the graphic controller 61 in the buffer memory 62 (V-RAM), read out therefrom, and displayed on the display 64 via the display control unit 63.
- the present invention is not limited to this mode. It is also suitable for image creation in video games, mobile games, and the like, and in driving simulation or training equipment for various moving objects such as cars, motorcycles, airplanes, helicopters, rockets, and ships.
- the present invention can be used for image creation in navigation systems, personal computers, mobile phones, and the like; image creation in video games, mobile games, and the like; and image formation in steering simulation or training devices for various moving objects such as automobiles, airplanes, and ships.
Landscapes
- Physics & Mathematics (AREA)
- Engineering & Computer Science (AREA)
- Geometry (AREA)
- Computer Graphics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Image Generation (AREA)
- Navigation (AREA)
- Processing Or Creating Images (AREA)
Description
Claims
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP02775231A EP1435590A1 (en) | 2001-09-26 | 2002-09-25 | Image creation apparatus and computer program |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2001-295167 | 2001-09-26 | ||
JP2001295167A JP2003109032A (ja) | 2001-09-26 | 2001-09-26 | 画像作成装置及びコンピュータプログラム |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2003027959A1 true WO2003027959A1 (fr) | 2003-04-03 |
Family
ID=19116645
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2002/009845 WO2003027959A1 (fr) | 2001-09-26 | 2002-09-25 | Programme informatique et appareil de création d'image |
Country Status (5)
Country | Link |
---|---|
US (1) | US20030080958A1 (ja) |
EP (1) | EP1435590A1 (ja) |
JP (1) | JP2003109032A (ja) |
CN (1) | CN1559054A (ja) |
WO (1) | WO2003027959A1 (ja) |
Families Citing this family (25)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR100543701B1 (ko) * | 2003-06-17 | 2006-01-20 | 삼성전자주식회사 | 공간형 입력 장치 및 방법 |
CN1922879A (zh) * | 2004-02-23 | 2007-02-28 | 松下电器产业株式会社 | 显示处理装置 |
DE102005018080A1 (de) * | 2005-04-19 | 2006-10-26 | Robert Bosch Gmbh | Verfahren zur dreidimensionalen Darstellung einer digitalen Straßenkarte |
EP1746542A1 (en) * | 2005-07-19 | 2007-01-24 | Spark Vision Ab | System and method for managing digital image layers |
JP2007026201A (ja) * | 2005-07-19 | 2007-02-01 | Sega Corp | 画像処理装置、道路画像描画方法および道路画像描画プログラム |
US7557736B1 (en) * | 2005-08-31 | 2009-07-07 | Hrl Laboratories, Llc | Handheld virtual overlay system |
US20080046175A1 (en) * | 2006-07-31 | 2008-02-21 | Nissan Technical Center North America, Inc. | Vehicle navigation system |
US8970680B2 (en) * | 2006-08-01 | 2015-03-03 | Qualcomm Incorporated | Real-time capturing and generating stereo images and videos with a monoscopic low power mobile device |
US9721420B2 (en) | 2008-01-08 | 2017-08-01 | Bally Gaming, Inc. | Video switcher and touch router method for multi-layer displays |
US8217934B2 (en) * | 2008-01-23 | 2012-07-10 | Adobe Systems Incorporated | System and methods for rendering transparent surfaces in high depth complexity scenes using hybrid and coherent layer peeling |
AU2008355643A1 (en) * | 2008-05-02 | 2009-11-05 | Tomtom International B.V. | A navigation device and method for displaying a static image of an upcoming location along a route of travel |
CN102194043B (zh) * | 2010-03-15 | 2013-08-14 | 北京乐升科技有限公司 | 地图图像与虚拟地图的产生方法及虚拟地图产生器 |
CN102012924B (zh) * | 2010-11-29 | 2013-07-03 | 深圳市融创天下科技股份有限公司 | 一种地图显示的方法、系统和移动终端 |
FR2964775A1 (fr) | 2011-02-18 | 2012-03-16 | Thomson Licensing | Procede d'estimation de l'occultation dans un environnement virtuel |
KR101508409B1 (ko) * | 2011-12-27 | 2015-04-08 | 삼성전기주식회사 | Mrlc를 이용한 오버레이 구현 장치 및 방법 |
CN102750665B (zh) * | 2012-06-01 | 2014-09-24 | 上海鼎为电子科技(集团)有限公司 | 一种图形处理方法、图形处理装置及移动终端 |
JP5883817B2 (ja) * | 2013-03-21 | 2016-03-15 | 株式会社ジオ技術研究所 | 描画データ生成装置および描画装置 |
JP2015158728A (ja) * | 2014-02-21 | 2015-09-03 | 東芝テック株式会社 | 情報閲覧装置、及び、情報閲覧プログラム |
GB2524120B (en) * | 2014-06-17 | 2016-03-02 | Imagination Tech Ltd | Assigning primitives to tiles in a graphics processing system |
GB2524121B (en) | 2014-06-17 | 2016-03-02 | Imagination Tech Ltd | Assigning primitives to tiles in a graphics processing system |
JP6413846B2 (ja) * | 2015-03-02 | 2018-10-31 | 株式会社リコー | 画像処理システム、画像処理システムの制御方法、画像処理システムの制御プログラム、画像形成出力制御装置 |
US10078884B2 (en) * | 2015-12-21 | 2018-09-18 | Siemens Aktiengesellschaft | System and method for processing geographical information with a central window and frame |
US10032299B2 (en) * | 2016-04-04 | 2018-07-24 | Samsung Electronics Co., Ltd. | Portable image device for generating application images |
CN107656961B (zh) * | 2017-08-04 | 2020-03-27 | 阿里巴巴集团控股有限公司 | 一种信息显示方法及装置 |
US10915230B1 (en) * | 2020-01-15 | 2021-02-09 | Citelum Sa | Layer superimposition in a user interface for a lighting plan |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH09292830A (ja) * | 1996-04-25 | 1997-11-11 | Hitachi Ltd | 電子地図表示方法及び電子地図表示装置 |
JPH1021420A (ja) * | 1996-07-05 | 1998-01-23 | Namco Ltd | 画像合成装置及び画像合成方法 |
JPH11174952A (ja) * | 1997-12-11 | 1999-07-02 | Fumio Mizoguchi | 地図データの先読み方法及び地図スクロール方法 |
JPH11232484A (ja) * | 1997-12-05 | 1999-08-27 | Wall:Kk | 3次元都市データ生成方法、3次元都市データ生成装置及び3次元都市データ調査用測定装置 |
JP2000293705A (ja) * | 1999-04-01 | 2000-10-20 | Mitsubishi Electric Corp | 3次元グラフィックス用描画装置、3次元グラフィックス用描画方法および3次元グラフィックス用描画プログラムを記録した媒体 |
GB2350993A (en) * | 1998-12-15 | 2000-12-13 | Ibm | Generation of mixed semitransparent and opaque objects on a computer display screen. |
JP2001109911A (ja) * | 1999-10-14 | 2001-04-20 | Hitachi Ltd | 3次元図形表示方法および装置 |
Family Cites Families (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
GB2267203B (en) * | 1992-05-15 | 1997-03-19 | Fujitsu Ltd | Three-dimensional graphics drawing apparatus, and a memory apparatus to be used in texture mapping |
US5864342A (en) * | 1995-08-04 | 1999-01-26 | Microsoft Corporation | Method and system for rendering graphical objects to image chunks |
JP3375258B2 (ja) * | 1996-11-07 | 2003-02-10 | 株式会社日立製作所 | 地図表示方法及び装置並びにその装置を備えたナビゲーション装置 |
JPH10255081A (ja) * | 1997-03-10 | 1998-09-25 | Canon Inc | 画像処理方法及び画像処理装置 |
US6104406A (en) * | 1997-04-04 | 2000-08-15 | International Business Machines Corporation | Back away navigation from three-dimensional objects in three-dimensional workspace interactive displays |
JPH10334269A (ja) * | 1997-06-03 | 1998-12-18 | Sega Enterp Ltd | 画像処理装置、画像処理方法、及び画像処理プログラムを記録した記録媒体 |
US6611753B1 (en) * | 1998-04-17 | 2003-08-26 | Magellan Dis, Inc. | 3-dimensional intersection display for vehicle navigation system |
GB9915012D0 (en) * | 1999-06-29 | 1999-08-25 | Koninkl Philips Electronics Nv | Z-buffering graphics system |
-
2001
- 2001-09-26 JP JP2001295167A patent/JP2003109032A/ja active Pending
-
2002
- 2002-09-25 CN CNA028189426A patent/CN1559054A/zh active Pending
- 2002-09-25 EP EP02775231A patent/EP1435590A1/en not_active Withdrawn
- 2002-09-25 WO PCT/JP2002/009845 patent/WO2003027959A1/ja not_active Application Discontinuation
- 2002-09-26 US US10/254,928 patent/US20030080958A1/en not_active Abandoned
Also Published As
Publication number | Publication date |
---|---|
CN1559054A (zh) | 2004-12-29 |
US20030080958A1 (en) | 2003-05-01 |
JP2003109032A (ja) | 2003-04-11 |
EP1435590A1 (en) | 2004-07-07 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP4807693B2 (ja) | 画像作成装置及びその方法、電子機器並びにコンピュータプログラム | |
WO2003027959A1 (fr) | Programme informatique et appareil de création d'image | |
JP7374313B2 (ja) | 仮想環境における乗り物の運転方法、装置、端末及びプログラム | |
EP1434174B1 (en) | Image creation apparatus and method, and computer program | |
US8862392B2 (en) | Digital map landmarking system | |
WO2008059586A1 (fr) | Dispositif de navigation, procédé d'affichage de carte et programme d'affichage de carte | |
JP3568357B2 (ja) | ナビゲーション装置における地図情報表示装置及び地図情報表示方法並びにナビゲーション装置における地図情報表示制御プログラムが記録されたコンピュータ読み取り可能な記録媒体 | |
JP4807691B2 (ja) | 画像作成装置及び方法、電子機器並びにコンピュータプログラム | |
JP2004117830A (ja) | ナビゲーション装置 | |
JPH10332396A (ja) | ナビゲーション装置 | |
JP2000009480A (ja) | 位置情報表示装置 | |
JP3019299B1 (ja) | 実写画像データを記録した記録媒体および画像読み出し装置 | |
JP5165851B2 (ja) | 立体地図表示方法、立体地図表示装置およびナビゲーション装置 | |
JP4054242B2 (ja) | ナビゲーション装置、方法及びプログラム | |
JP2006268550A (ja) | ナビゲーション装置 | |
JP4906272B2 (ja) | 地図表示システム、情報配信サーバ、地図表示装置およびプログラム | |
JP3655738B2 (ja) | ナビゲーション装置 | |
KR101061363B1 (ko) | 내비게이션 시스템에 특화된 3차원 제어시스템 및 그 방법 | |
JP4621346B2 (ja) | 航空機用電子地図表示方法及びその装置 | |
JP5348806B2 (ja) | 地図表示システム | |
JP2021181914A (ja) | 地図表示システム、地図表示プログラム | |
JP2005345366A (ja) | ナビゲーション装置 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AK | Designated states |
Kind code of ref document: A1 Designated state(s): AE AG AL AM AT AU AZ BA BB BG BY BZ CA CH CN CO CR CU CZ DE DM DZ EC EE ES FI GB GD GE GH HR HU ID IL IN IS KE KG KP KR KZ LK LR LS LT LU LV MA MD MG MK MW MX MZ NO NZ OM PH PL PT RO SD SE SG SI SK SL TJ TM TN TR TT UA UG US UZ VN YU ZA ZM |
|
AL | Designated countries for regional patents |
Kind code of ref document: A1 Designated state(s): GH GM KE LS MW MZ SD SL SZ UG ZM ZW AM AZ BY KG KZ RU TJ TM AT BE BG CH CY CZ DK EE ES FI FR GB GR IE IT LU MC PT SE SK TR BF BJ CF CG CI GA GN GQ GW ML MR NE SN TD TG |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | ||
DFPE | Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101) | ||
DFPE | Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101) |
Free format text: EXCEPT/SAUF US, EP (AT, BE, BG, CH, CY, CZ, DE, DK, EE, ES, FI, FR, GB, GR, IE, IT, LU, MC, NL, PT,SE, SK, TR) |
|
WWE | Wipo information: entry into national phase |
Ref document number: 20028189426 Country of ref document: CN |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2002775231 Country of ref document: EP |
|
WWP | Wipo information: published in national office |
Ref document number: 2002775231 Country of ref document: EP |
|
WWW | Wipo information: withdrawn in national office |
Ref document number: 2002775231 Country of ref document: EP |