EP2075762A2 - Three-dimensional data processing device, three-dimensional image generating device, navigation device, and three-dimensional data processing program - Google Patents

Three-dimensional data processing device, three-dimensional image generating device, navigation device, and three-dimensional data processing program

Info

Publication number
EP2075762A2
EP2075762A2 (Application EP08021360A)
Authority
EP
European Patent Office
Prior art keywords
solid body
bounding volume
quadrangular frame
volume
data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
EP08021360A
Other languages
German (de)
English (en)
Other versions
EP2075762B1 (fr)
EP2075762A3 (fr)
Inventor
Koichi Ushida
Kazuyoshi Yamamoto
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Aisin AW Co Ltd
Original Assignee
Aisin AW Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Aisin AW Co Ltd filed Critical Aisin AW Co Ltd
Publication of EP2075762A2 publication Critical patent/EP2075762A2/fr
Publication of EP2075762A3 publication Critical patent/EP2075762A3/fr
Application granted granted Critical
Publication of EP2075762B1 publication Critical patent/EP2075762B1/fr
Expired - Fee Related legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/00 3D [Three Dimensional] image rendering
    • G06T15/10 Geometric effects
    • G06T15/40 Hidden part removal
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T17/00 Three dimensional [3D] modelling, e.g. data description of 3D objects
    • G06T17/05 Geographic models

Definitions

  • the present invention relates to a three-dimensional data processing technique for two-dimensionally displaying a three-dimensional space using stereographic data pertaining to a solid body, a three-dimensional image generating device using this three-dimensional data processing technique, and a navigation device incorporating this three-dimensional image generating device.
  • Computational processing by a computer plays a central role in two-dimensionally displaying a three-dimensional space using stereographic data pertaining to a solid body, namely, in rendering three-dimensional graphics.
  • Computers with a graphic processing function have achieved higher performance and become more integrated (on one chip), which has made the application of a three-dimensional graphics rendering function possible in various technical fields, such as a navigation device mounted in an automobile or the like, for example, thus realizing a three-dimensional map display.
  • a ray is traced from a viewpoint toward a solid body. If the ray and the solid body intersect, the state of light of the object at the intersection point is found, and a two-dimensional image is created by projecting this onto an image screen.
  • an intersection point calculation to find the intersection of the ray and the object takes an extremely long time and a faster method is needed. For example, excluding unnecessary solid bodies within a view volume, i.e., a three-dimensional space targeted for two-dimensional display, from a rendering target in advance can reduce the rendering process (three-dimensional image drawing process) load.
  • Such an exclusion process is called a culling process, and there is a method that utilizes a bounding volume which can further speed up the culling process.
  • a conventional bounding volume has a simple geometry such as a sphere (bounding sphere) or rectangular solid shape (bounding box), and encloses one or a plurality of solid bodies.
  • the bounding volume is a simple geometric body capable of covering an entire solid body.
  • a drawing process can be performed faster by determining whether the bounding volume and the ray intersect, and then sending only a solid body enclosed by the bounding volume where the bounding volume and the ray intersect to a final drawing process, thereby eliminating a computation to find intersections of the ray with unnecessary solid bodies.
  • Various methods for setting such a bounding volume are known (see Japanese Patent Application Publication No. JP-A-H06-168340 , paragraphs 0001 to 0017, and Japanese Patent Application Publication No. JP-A-2003-271988 , paragraphs 0002 to 0022, for examples).
  • the bounding volume is set so as to enclose the solid body. Therefore the bounding volume has a size larger than the solid body in all directions. Accordingly, a forward bounding volume located in front of the viewpoint and represented with an excessive size hides a rearward bounding volume that is located therebehind, and the rearward bounding volume is thus culled.
  • a bounding volume used in a map navigation system or the like, in particular, is typically a bounding box whose sides are parallel to a reference longitudinal axis and a reference lateral axis, commonly the east-longitude and north-latitude coordinate axes.
  • the bounding box is considerably larger than the actual solid body (a polygon of the solid body), thus exacerbating the problem.
  • a visual axis extending from the viewpoint to the solid body becomes inclined with respect to the reference axes, and a region masked by the bounding box parallel to the reference axes becomes significantly larger than a region masked by the actual solid body, thus further exacerbating the above problem.
  • the present invention was devised in light of the foregoing problem, and it is an object of the present invention to provide a three-dimensional data processing technique for generating a geometric object simplified into a bounding volume while also making a three-dimensional image ultimately rendered appear more natural, a three-dimensional image generating device using such a three-dimensional data processing technique, and a navigation device incorporating the three-dimensional image generating device.
  • a three-dimensional data processing device for two-dimensionally displaying a three-dimensional space using stereographic data pertaining to a solid body has a structure characterized by including: a bounding volume generating unit that generates a bounding volume of the solid body using the stereographic data of the solid body; a view volume setting unit that sets a view volume based on a set viewpoint; and a culling processing unit that determines a solid body subject to drawing within the view volume using the generated bounding volume, wherein the bounding volume generating unit generates a bounding volume of the solid body using, as a reference plane, an inner quadrangular frame inscribed on an outer quadrangular frame that encloses a plane outer profile of the solid body subject to processing.
  • a bounding volume is set by using an inner quadrangular frame inscribed on an outer quadrangular frame that encloses a plane outer profile of a solid body as a reference plane, i.e., as a cross section (a bottom plane at a ground level) and extending the bounding volume therefrom to a height of the solid body.
  • because the reference plane is the inner quadrangular frame inscribed on the outer quadrangular frame, the width of the inner quadrangular frame is naturally equal to or less than the width of the outer quadrangular frame in any direction when the two are compared.
  • the inner quadrangular frame acting as the reference plane of the bounding volume according to the present invention is inscribed on the outer quadrangular frame, and can therefore help keep the bounding volume from being expressed excessively smaller than the actual solid body as well.
  • the inner quadrangular frame is preferably computed as a quadrangle obtained by linking the center points of each side of the outer quadrangular frame. Accordingly, if the outer quadrangular frame is a rectangle, then the inner quadrangular frame is a diamond whose corners lie on the sides of the outer quadrangular frame. The shape of the inner quadrangular frame thus generally covers the shape of the solid body while having half the area of the outer quadrangular frame. It is thus possible to considerably suppress the past problem where the generated bounding volume is significantly larger than the region masked by the actual solid body. And since its sides are equal in length, the geometrical calculation load is also less than that for a general polygon.
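The midpoint construction and its half-area property can be checked with a short sketch. This is an illustration, not the patent's implementation; the names `inner_diamond` and `polygon_area` are invented here.

```python
def inner_diamond(xmin, ymin, xmax, ymax):
    """Link the midpoints of the sides of an axis-aligned rectangle."""
    cx, cy = (xmin + xmax) / 2.0, (ymin + ymax) / 2.0
    return [(cx, ymin), (xmax, cy), (cx, ymax), (xmin, cy)]

def polygon_area(pts):
    """Shoelace formula for a simple polygon."""
    s = 0.0
    n = len(pts)
    for i in range(n):
        x0, y0 = pts[i]
        x1, y1 = pts[(i + 1) % n]
        s += x0 * y1 - x1 * y0
    return abs(s) / 2.0

rect = [(0, 0), (4, 0), (4, 2), (0, 2)]
diamond = inner_diamond(0, 0, 4, 2)
# The diamond's area is exactly half the rectangle's area,
# so the culling reference plane shrinks by a factor of two.
```

The diamond's corners sit on the rectangle's sides, which is exactly the "inscribed" relationship the text describes.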
  • the outer quadrangular frame is preferably computed so as to have sides that are parallel to a set reference lateral axis and a set reference longitudinal axis. All the bounding volumes are thus rectangles with the same orientation, which facilitates their use in computations.
  • the culling process can make a determination with nearly complete accuracy regarding a view volume with a visual axis that is inclined with respect to the reference lateral axis and the reference longitudinal axis. Note that in consideration of the application of the three-dimensional data processing device to a map information system such as a navigation device, longitude and latitude can be conveniently adopted for the reference lateral axis and the reference longitudinal axis.
  • the three-dimensional data processing device is preferably configured such that the bounding volume generating unit finds a plurality of inner quadrangular frames from the plane outer profile obtained for every one of a plurality of height levels of a solid body subject to processing, and generates a bounding volume of the solid body using the plurality of inner quadrangular frames as a reference plane.
  • the bounding volume generating unit computes a plurality of inner quadrangular frames for each predetermined height level to generate a sub bounding volume, and the sub bounding volumes are stacked to generate a final bounding volume. Therefore, even in cases involving a solid body whose lateral cross section size considerably differs depending on the height level, an accurate culling determination can be achieved.
  • the lateral cross sections of buildings and the like are often different at upper and lower floors, which makes the three-dimensional data processing device suited for drawing a three-dimensional map image where the solid bodies are buildings in practice.
  • the culling processing unit is configured so as to determine whether a solid body is drawn based on an occlusion culling method.
  • the scope of the present invention also includes a three-dimensional image generating device incorporating the three-dimensional data processing device described above.
  • a three-dimensional image generating device is provided with a three-dimensional image drawing unit that generates a three-dimensional image in the view volume with reference to a determination result made by the culling processing unit, as well as various characteristic configurations of the three-dimensional image generating device according to the present invention as described above.
  • the scope of the present invention also includes a navigation device incorporating the above three-dimensional image generating device.
  • a navigation device is provided with structural elements required for an ordinary navigation device, as well as various characteristic configurations of the three-dimensional image generating device according to the present invention as described above.
  • the navigation device is preferably configured such that the viewpoint is determined based on a host vehicle position detected by a host vehicle position information detecting unit, and a visual axis from the viewpoint is determined so as to follow a guidance route.
  • the guidance route can be displayed and guide a user as a three-dimensional image drawn by the three-dimensional image generating device having the characteristics described above.
  • the scope of the present invention also includes a three-dimensional data processing program for two-dimensionally displaying a three-dimensional space using stereographic data pertaining to a solid body.
  • a three-dimensional data processing program causes a computer to perform the functions of the three-dimensional data processing device described above.
  • FIG. 1 is a block diagram schematically showing a configuration of essential elements of an automobile navigation device according to the present embodiment.
  • the navigation device is capable of displaying a three-dimensional map display, and to achieve this, a three-dimensional image generating device 20 and a three-dimensional data processing device 10, which are the subjects of the present invention, are incorporated in a hierarchical manner therein.
  • a map database 30 includes a three-dimensional city database 31 that stores stereographic data for solid bodies and serves as a stereographic database, and a two-dimensional road database 32 that stores two-dimensional road data.
  • the map database 30 is structured from a high-capacity storage medium such as a DVD or a hard disk.
  • a rewritable storage medium is employed, fresh map data can be downloaded via data communications as appropriate.
  • the three-dimensional data processing device 10 includes a viewpoint setting unit 11, a view volume setting unit 12, a bounding volume setting unit (a bounding volume generating unit) 13, a culling processing unit 14, and a data processing control device (a data processing control unit) 15.
  • For three-dimensional computer graphics, as schematically shown in FIG. 2, in the case of a perspective projection where an arbitrary position within a three-dimensional space is set as the center of projection and a plane including an arbitrary point is set as a projection plane, the center of projection corresponds to the position of an observer's eyes, i.e., a viewpoint 1.
  • the viewpoint 1 is set by the viewpoint setting unit 11.
  • a forward clip plane 2 and a rearward clip plane 3 are defined which are planes orthogonal to a visual axis of the viewpoint 1.
  • a quadrangular pyramid in FIG. 2 formed by connecting vertices respectively corresponding to the forward clip plane 2 and the rearward clip plane 3 is a view volume 4.
  • the view volume 4 is set by the view volume setting unit 12.
  • the bounding volume generating unit 13 has a function for generating bounding volumes of solid bodies using the stereographic data read out from the three-dimensional city database 31.
  • the bounding volume generating unit 13 according to the present invention generates a bounding volume of the solid body wherein an inner quadrangular frame inscribed on an outer quadrangular frame, which encloses a plane outer profile of the solid body subject to processing, is used as a reference plane.
  • the bounding volume generating unit 13 will be described in more detail later.
  • the culling processing unit 14 determines whether the solid bodies located within the view volume 4 are drawing subjects (rendering subjects) within the view volume using bounding volumes thereof. In this embodiment, the culling processing unit 14 employs an occlusion culling method as the culling method.
  • a solid body whose bounding volume is completely masked by other bounding volumes is not considered a subject for drawing.
  • a method is employed that successively sets rays from the viewpoint 1 and determines whether the rays reach the bounding volume subject to processing without becoming masked by another bounding volume.
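The ray-based occlusion test described above can be sketched in two dimensions: a ray is cast from the viewpoint toward the target bounding volume, and the target is culled if another volume is hit strictly first. The slab-method intersection and the function names are illustrative assumptions, not the patent's method as specified.

```python
def ray_box_t(origin, direction, box):
    """Slab-method ray/AABB intersection; return entry parameter t, or None on miss."""
    (ox, oy), (dx, dy) = origin, direction
    xmin, ymin, xmax, ymax = box
    tmin, tmax = 0.0, float("inf")
    for o, d, lo, hi in ((ox, dx, xmin, xmax), (oy, dy, ymin, ymax)):
        if abs(d) < 1e-12:
            if o < lo or o > hi:  # ray parallel to slab and outside it
                return None
        else:
            t0, t1 = (lo - o) / d, (hi - o) / d
            if t0 > t1:
                t0, t1 = t1, t0
            tmin, tmax = max(tmin, t0), min(tmax, t1)
            if tmin > tmax:
                return None
    return tmin

def is_occluded(viewpoint, direction, target, occluders):
    """True if some occluder is hit strictly before the target bounding volume."""
    t_target = ray_box_t(viewpoint, direction, target)
    if t_target is None:
        return False  # ray misses the target entirely
    return any(
        (t := ray_box_t(viewpoint, direction, box)) is not None and t < t_target
        for box in occluders
    )
```

In practice several rays per target would be cast, and a target reached by any ray survives the culling determination.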
  • the data processing control device 15 manages control operations of the viewpoint setting unit 11, the view volume setting unit 12, the bounding volume setting unit 13, and the culling processing unit 14, and also records determination results determined by the culling processing unit 14 in a list.
  • the culling determination result is a code identifying the solid body subject to drawing among the solid bodies located within the view volume 4 or the data of the solid body.
  • the temporarily stored culling determination result is utilized for three-dimensional image drawing (rendering).
  • the three-dimensional image generating device 20 includes the three-dimensional data processing device 10 as well as a three-dimensional image drawing unit 21 that employs a well known rendering method or the like to generate a three-dimensional image in the view volume 4 using stereographic data of the solid body subject to drawing.
  • the automobile navigation device incorporates the three-dimensional image generating device 20. It therefore has a function for setting the viewpoint 1 based on a host vehicle position detected by a host vehicle position information detecting unit 41, and a function for displaying a three-dimensional map centered on a guidance route that the vehicle is due to travel, by determining the visual axis from the viewpoint 1 so as to follow the guidance route. Naturally, the automobile navigation device also has a function for displaying an ordinary two-dimensional road map, and is therefore additionally provided with a two-dimensional image drawing unit 42.
  • a route searching unit 44 searches for an optimum guidance route that links a current position and a point (such as a destination) specified by a user through an operation input portion 43, and the searched guidance route is set by the guidance route storage portion 45. Based on the set guidance route and the host vehicle position, the route guidance unit 46 performs route guidance for the user via a speaker 47 and a monitor 48. At such time, an image generated by the two-dimensional image drawing unit 42 or the three-dimensional image drawing unit 21 or both is sent to a display processing portion 49, and then displayed on the monitor 48 after undergoing processing to superimpose various information and symbols as necessary.
  • the operation input portion 43 has a function for accepting an instruction from the user, and is structured from a touch panel, a mechanical operation button, a software button, and the like.
  • the touch panel and the software buttons, operating in association with the display processing portion 49, constitute a graphical user interface.
  • the operation input portion 43 can directly specify the viewpoint 1, including the visual axis, to the three-dimensional data processing device 10, and is thus capable of displaying on the monitor 48 a three-dimensional image based on the viewpoint and visual axis desired by the user.
  • the host vehicle position information detecting unit 41 obtains the host vehicle position information specifying the host vehicle position, i.e., the current position of the host vehicle.
  • the host vehicle position information detecting unit 41 is connected with a GPS receiver 51, an orientation sensor 52, and a distance sensor 53.
  • the GPS receiver 51 is a device that receives a GPS signal from a GPS satellite. The GPS signal is normally received every second, and output to the host vehicle position information detecting unit 41.
  • the signal received by the GPS receiver 51 from the GPS satellite can be analyzed to obtain the current position (latitude and longitude), direction of travel, speed of movement, and the like of the host vehicle.
  • the orientation sensor 52 detects the traveling direction of the host vehicle and changes in the direction of travel.
  • the orientation sensor 52 is structured from a gyro sensor, a geomagnetic sensor, an optical rotation sensor or rotation-type potentiometer attached to a rotating portion of the steering wheel, an angular sensor attached to a vehicle wheel portion, or the like, and outputs a detection result thereof to the host vehicle position information detecting unit 41.
  • the distance sensor 53 detects a vehicle speed and a movement distance of the host vehicle.
  • the distance sensor 53 is structured from a vehicle speed pulse sensor that outputs a pulse signal every time a drive shaft, wheel, or the like of the host vehicle rotates a certain amount, a yaw/G sensor that detects an acceleration of the host vehicle, and a circuit that integrates the detected acceleration, for example. Also, the distance sensor 53 outputs information regarding the vehicle speed and the movement distance as a detection result thereof to the host vehicle position information detecting unit 41.
  • Based on the output from the GPS receiver 51, the orientation sensor 52, and the distance sensor 53, the host vehicle position information detecting unit 41 performs a computation according to a known method to specify the host vehicle position. In addition, the host vehicle position information detecting unit 41 obtains the two-dimensional road data around the host vehicle position from the two-dimensional road database 32. By performing known map matching based thereon, the host vehicle position information detecting unit 41 also corrects the host vehicle position to match the road indicated in the two-dimensional road data. In this manner, the host vehicle position information detecting unit 41 obtains host vehicle position information that includes information regarding the current position of the host vehicle expressed in latitude and longitude, and information regarding the traveling direction of the host vehicle.
  • FIG. 3 shows a block diagram illustrating functional elements of the bounding volume generating unit 13.
  • the bounding volume generating unit 13 includes a stereographic data obtaining portion 91, a stereographic plane data computing portion 92, a reference axis-parallel outer quadrangular frame computing portion 93, an inner quadrangular frame computing portion 94, a bounding volume geometry data determining portion 95, and a bounding volume geometry data output portion 96.
  • the stereographic data obtaining portion 91 stores stereographic data of a solid body subject to processing, which was read from the three-dimensional city database 31, in a working memory, and identifies and controls the stereographic data via a solid body ID.
  • the stereographic plane data computing portion 92 computes from a polygon data group, namely, stereographic data of the solid body subject to processing, plane data for setting a plane outer profile that is a lateral cross section of the solid body.
  • the plane data is bottom plane data, and only the bottom plane data may be extracted from the stereographic data in cases where the bottom plane data is included in the stereographic data read from the three-dimensional city database 31. If the bottom plane data is not included, then the bottom plane data is computed from the polygon data group. Furthermore, in cases where a maximum plane outer profile of the solid body is employed as the plane data or the like, such plane data is computed from the polygon data group.
  • the reference axis-parallel outer quadrangular frame computing portion 93 has a function for computing the outer quadrangular frame enclosing the plane outer profile, which is set by the plane data computed by the stereographic plane data computing portion 92.
  • a circumscribed quadrangle tangent to the plane outer profile is employed as the outer quadrangular frame.
  • an outer quadrangular frame may also be used wherein at least one side thereof has a predetermined clearance with the plane outer profile.
  • the computed outer quadrangular frame is set so as to have sides that are parallel to a set reference lateral axis (e.g. latitude) and a set reference longitudinal axis (e.g. longitude). Therefore, it is convenient to transform the coordinates of the plane data into a coordinate system based on the reference lateral axis and the reference longitudinal axis before computing the outer quadrangular frame.
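Once the bottom-plane polygon is expressed in the longitude/latitude coordinate system, the axis-parallel circumscribed rectangle reduces to taking the minimum and maximum vertex coordinates along each axis. A minimal sketch, with an invented function name:

```python
def outer_frame(points):
    """Axis-aligned circumscribed rectangle of a polygon, as (xmin, ymin, xmax, ymax).

    points: vertex list of the plane outer profile, already transformed into the
    reference-axis (e.g. longitude/latitude) coordinate system.
    """
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    return (min(xs), min(ys), max(xs), max(ys))

# Example: an L-shaped building footprint.
footprint = [(0, 0), (3, 0), (3, 1), (1, 1), (1, 2), (0, 2)]
frame = outer_frame(footprint)  # tangent to the profile on all four sides
```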
  • the inner quadrangular frame computing portion 94 has a function for computing an inner quadrangular frame (inscribed quadrangle) inscribed on the outer quadrangular frame computed by the reference axis-parallel outer quadrangular frame computing portion 93.
  • the computed inner quadrangular frame is set as an inscribed quadrangle obtained by linking the center points of each side of the outer quadrangular frame.
  • other forms of inscribed quadrangles may be employed as the inner quadrangular frame.
  • the bounding volume geometry data determining portion 95 has a function for determining geometry data (including coordinate position information regarding the coordinate system of the reference axes) required for generating a bounding volume of the solid body subject to processing, using the inner quadrangular frame as a reference plane.
  • the inner quadrangular frame computed by the inner quadrangular frame computing portion 94 is employed as the bottom plane, and a quadrangular prism geometry extending up to a height portion of the solid body subject to processing is set as the bounding volume.
  • Bounding volume geometry data created by the bounding volume geometry data determining portion 95 is then transferred to the culling processing unit 14 by the bounding volume geometry data output portion 96.
  • FIG. 4 schematically shows the bounding volume generation procedure.
  • stereographic data for an instructed solid body is found among stereographic data (polygon data) of solid bodies stored in the working memory (#01).
  • the bottom plane data (bottom plane polygon) is read out or computed as the plane data representing the plane outer profile of the instructed solid body (#02).
  • the coordinates of the obtained bottom plane data are transformed into the coordinate system formed from the mutually orthogonal reference longitudinal axis (e.g. longitude) and reference lateral axis (e.g. latitude).
  • An outer quadrangular frame is computed such that two sides thereof are parallel to the reference longitudinal axis and the reference lateral axis and a polygon set by the bottom plane data is circumscribed in the coordinate system (#03).
  • an inscribed quadrangle (diamond) created by linking the center points on each side of the computed outer quadrangular frame is computed as the inner quadrangular frame (#04).
  • a quadrangular prism geometry is then created by using the inner quadrangular frame as the bottom plane and extending the quadrangular prism geometry up to the height of the solid body instructed in step #01. The quadrangular prism geometry is thus computed as the bounding volume of the solid body (#05).
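Steps #01 to #05 above can be sketched end to end. The routine below is an illustrative reading of the procedure, not the patent's code; in the device the bottom polygon would come from the three-dimensional city database.

```python
def bounding_prism(bottom_polygon, height):
    """Steps #03-#05: outer frame, midpoint diamond, extrusion to a prism."""
    # #03: circumscribed rectangle with sides parallel to the reference axes
    xs = [p[0] for p in bottom_polygon]
    ys = [p[1] for p in bottom_polygon]
    xmin, xmax, ymin, ymax = min(xs), max(xs), min(ys), max(ys)
    # #04: inner quadrangular frame = diamond linking the side midpoints
    cx, cy = (xmin + xmax) / 2.0, (ymin + ymax) / 2.0
    diamond = [(cx, ymin), (xmax, cy), (cx, ymax), (xmin, cy)]
    # #05: extrude the diamond from the ground up to the solid body's height
    bottom = [(x, y, 0.0) for x, y in diamond]
    top = [(x, y, float(height)) for x, y in diamond]
    return bottom + top  # eight vertices of the quadrangular prism
```

The returned prism is the bounding volume handed to the culling processing unit in place of the full polygon data of the solid body.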
  • the culling processing unit 14 performs an occlusion culling process and determines the bounding volumes to be used for drawing, namely those located within the view volume 4 and not masked by other bounding volumes.
  • the solid body (actually, a polygon set by the stereographic data of the solid body) enclosed by each such bounding volume is then subject to drawing.
  • Drawing is performed by the three-dimensional image drawing unit 21, and an example of a three-dimensional image (three-dimensional map image) displayed on the monitor 48 is shown in FIG. 5B .
  • the bounding volume generating unit 13 may calculate a plurality of inner quadrangular frames for each predetermined height level to generate sub bounding volumes, and generate a final bounding volume by stacking such sub bounding volumes.
  • stereographic data for an instructed solid body is found among stereographic data (polygon data) of solid bodies stored in the working memory (#01).
  • the respective plane data representing the plane outer profile of the solid body are computed for a predetermined plurality of height levels, which in this case are two height levels: a ground level (bottom plane level), and a height level (called an intermediate level) at which the lateral cross-sectional shape of the solid body considerably changes (#12 and #22).
  • the coordinates of the plane data (bottom plane data) obtained at the ground level are transformed into the coordinate system formed from the mutually orthogonal reference longitudinal axis (e.g. longitude) and reference lateral axis (e.g. latitude).
  • a first outer quadrangular frame is computed such that two sides thereof are parallel to the reference longitudinal axis and the reference lateral axis and a polygon set by the bottom plane data is circumscribed in the coordinate system (#13).
  • the coordinates of the plane data obtained at the intermediate level are similarly transformed into the coordinate system, and a second outer quadrangular frame is computed such that a polygon set by the plane data is circumscribed in the coordinate system (#23).
  • an inscribed quadrangle (diamond) created by linking the center points on each side of the first outer quadrangular frame computed at step #13 is computed as a first inner quadrangular frame (#14).
  • an inscribed quadrangle (diamond) created by linking the center points on each side of the second outer quadrangular frame computed at step #23 is computed as a second inner quadrangular frame (#24).
  • a first sub bounding volume is then generated by using the first inner quadrangular frame as the bottom plane and then extending the first sub bounding volume therefrom up to the height of the intermediate level.
  • a second sub bounding volume is then generated by using the second inner quadrangular frame as the bottom plane and then extending the second sub bounding volume from the intermediate level up to the top of the solid body.
  • a final bounding volume is generated by stacking the second sub bounding volume on top of the first sub bounding volume (#30).
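Steps #12 through #30 can be sketched as a stacking routine: one sub bounding volume is built per height level, and the final bounding volume is their stack. The level list format and the function names are assumptions made for illustration.

```python
def sub_volume(polygon, z_bottom, z_top):
    """One sub bounding volume: midpoint diamond of the level's outer frame,
    extruded between two height levels."""
    xs = [p[0] for p in polygon]
    ys = [p[1] for p in polygon]
    xmin, xmax, ymin, ymax = min(xs), max(xs), min(ys), max(ys)
    cx, cy = (xmin + xmax) / 2.0, (ymin + ymax) / 2.0
    diamond = [(cx, ymin), (xmax, cy), (cx, ymax), (xmin, cy)]
    return {"bottom": z_bottom, "top": z_top, "frame": diamond}

def stacked_bounding_volume(levels, total_height):
    """levels: (z, cross-section polygon) pairs sorted by ascending z.
    Each sub volume extends up to the next level (or the solid body's top)."""
    volumes = []
    for i, (z, poly) in enumerate(levels):
        z_top = levels[i + 1][0] if i + 1 < len(levels) else total_height
        volumes.append(sub_volume(poly, z, z_top))
    return volumes
```

With two levels, the ground-level footprint yields the first sub volume up to the intermediate level, and the narrower upper footprint yields the second sub volume up to the top, matching the two-level example in the text.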
  • one bounding volume is generated with respect to one solid body.
  • one bounding volume may be generated with respect to a plurality of solid bodies.
  • allocating one bounding volume with respect to the plurality of solid bodies is convenient because the processing speed can be increased without lowering the culling precision in practice.
  • the inner quadrangular frame computing portion 94 in the above embodiment computes an inscribed quadrangle, which is obtained by linking the center points on each side of the outer quadrangular frame computed by the reference axis-parallel outer quadrangular frame computing portion 93, as the inner quadrangular frame.
  • an inscribed quadrangle obtained by linking n equally-spaced points, such as three equally-spaced points on each side of the outer quadrangular frame can be computed as the inner quadrangular frame.
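One possible reading of this variant, offered as an assumption: take the k-th of n equally-spaced interior points on each side of the outer frame and link the four chosen points. With n = 1 (only the midpoint available) this reduces to the diamond construction of the embodiment.

```python
def inscribed_quadrangle(xmin, ymin, xmax, ymax, k=1, n=1):
    """Inscribed quadrangle linking the k-th of n equally-spaced points per side.

    k/n is a hypothetical parameterization; k = n = 1 gives the midpoint diamond.
    """
    f = k / (n + 1.0)  # fraction along each side, 0 < f < 1
    w, h = xmax - xmin, ymax - ymin
    return [
        (xmin + f * w, ymin),  # point on the bottom side
        (xmax, ymin + f * h),  # point on the right side
        (xmax - f * w, ymax),  # point on the top side
        (xmin, ymax - f * h),  # point on the left side
    ]
```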
  • the inner quadrangular frame computing portion 94 computes an inner quadrangular frame inscribed on an outer quadrangular frame computed by the reference axis-parallel outer quadrangular frame computing portion 93.
  • the expression "inscribed" here does not demand strict mathematical precision, and should instead be construed as also including an inner quadrangular frame that is a quadrangle somewhat smaller than, and preferably resembling, the inscribed quadrangle actually inscribed on the outer quadrangular frame.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Geometry (AREA)
  • Software Systems (AREA)
  • Computer Graphics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Remote Sensing (AREA)
  • Navigation (AREA)
  • Processing Or Creating Images (AREA)
  • Instructional Devices (AREA)
  • Image Generation (AREA)
  • Traffic Control Systems (AREA)
EP08021360.6A 2007-12-26 2008-12-09 Dispositif de traitement de données tridimensionnelles, dispositif de génération d'image tridimensionnelle, dispositif de navigation, et programme de traitement de données tridimensionnelles Expired - Fee Related EP2075762B1 (fr)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP2007334362A JP4947376B2 (ja) 2007-12-26 2007-12-26 3次元データ処理装置、3次元画像生成装置、ナビゲーション装置及び3次元データ処理プログラム

Publications (3)

Publication Number Publication Date
EP2075762A2 true EP2075762A2 (fr) 2009-07-01
EP2075762A3 EP2075762A3 (fr) 2011-08-10
EP2075762B1 EP2075762B1 (fr) 2013-08-21

Family

ID=40473692

Family Applications (1)

Application Number Title Priority Date Filing Date
EP08021360.6A Expired - Fee Related EP2075762B1 (fr) 2007-12-26 2008-12-09 Dispositif de traitement de données tridimensionnelles, dispositif de génération d'image tridimensionnelle, dispositif de navigation, et programme de traitement de données tridimensionnelles

Country Status (4)

Country Link
US (1) US8441479B2 (fr)
EP (1) EP2075762B1 (fr)
JP (1) JP4947376B2 (fr)
CN (1) CN101470902B (fr)

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8723987B2 (en) * 2009-10-30 2014-05-13 Honeywell International Inc. Uncertainty estimation of planar features
US8660365B2 (en) 2010-07-29 2014-02-25 Honeywell International Inc. Systems and methods for processing extracted plane features
JP5915335B2 (ja) * 2012-03-30 2016-05-11 富士通株式会社 情報管理方法及び情報管理装置
US9858709B2 (en) * 2012-11-29 2018-01-02 Samsung Electronics Co., Ltd. Apparatus and method for processing primitive in three-dimensional (3D) graphics rendering system
KR102166426B1 (ko) * 2014-07-07 2020-10-16 삼성전자주식회사 렌더링 시스템 및 이의 렌더링 방법
CN111161416B (zh) * 2019-12-11 2023-08-29 北京互时科技股份有限公司 根据模型形状信息精确调整模型显示优先级的方法和系统
CN113779162A (zh) * 2020-02-27 2021-12-10 异起(上海)智能科技有限公司 一种场景标记的方法和系统
CN114820979B (zh) * 2022-04-22 2023-03-24 如你所视(北京)科技有限公司 三维网格模型的处理方法、装置和存储介质

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH06168340A (ja) 1992-11-30 1994-06-14 Fujitsu Ltd 3dグラフィック表示装置
JP2003271988A (ja) 2002-03-15 2003-09-26 Denso Corp 画像生成装置及びプログラム

Family Cites Families (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5687307A (en) * 1993-09-21 1997-11-11 Canon Kabushiki Kaisha Computer graphic animation in which texture animation is independently performed on a plurality of objects in three-dimensional space
JP2972175B2 (ja) * 1998-03-09 1999-11-08 核燃料サイクル開発機構 3次元描画処理におけるモデル詳細度切替距離の決定方法
US6359623B1 (en) * 1998-11-12 2002-03-19 Hewlett-Packard Company Method and apparatus for performing scan conversion in a computer graphics display system
JP3664119B2 (ja) * 1999-05-12 2005-06-22 株式会社デンソー 地図表示装置
FR2852128A1 (fr) * 2003-03-07 2004-09-10 France Telecom Procede pour la gestion de la representation d'au moins une scene 3d modelisee.
WO2006003268A1 (fr) * 2004-06-03 2006-01-12 France Telecom Precede general de determination de liste d’elements potentiellement visibles par region pour des scenes 3d de tres grandes tailles representant des villes virtuelles d’altitudes variables.
WO2006003267A1 (fr) * 2004-06-03 2006-01-12 France Telecom Procede de determination de liste d’elements potentiellement visibles par region pour des scenes 3d de tres grandes tailles representant des villes virtuelles
JP4870079B2 (ja) * 2004-08-31 2012-02-08 フランス・テレコム 可視性データ圧縮方法、圧縮解除方法、圧縮システムおよびデコーダ
JP4500632B2 (ja) * 2004-09-07 2010-07-14 キヤノン株式会社 仮想現実感提示装置および情報処理方法
JP4116648B2 (ja) * 2006-05-22 2008-07-09 株式会社ソニー・コンピュータエンタテインメント オクルージョンカリング方法および描画処理装置
CN101013454A (zh) * 2007-02-02 2007-08-08 郑州机械研究所 Cae软件系统网格剖分的智能化方法

Also Published As

Publication number Publication date
CN101470902A (zh) 2009-07-01
JP4947376B2 (ja) 2012-06-06
JP2009157591A (ja) 2009-07-16
CN101470902B (zh) 2013-03-20
EP2075762B1 (fr) 2013-08-21
US20090167759A1 (en) 2009-07-02
EP2075762A3 (fr) 2011-08-10
US8441479B2 (en) 2013-05-14

Similar Documents

Publication Publication Date Title
EP2075762B1 (fr) Dispositif de traitement de données tridimensionnelles, dispositif de génération d'image tridimensionnelle, dispositif de navigation, et programme de traitement de données tridimensionnelles
US11320836B2 (en) Algorithm and infrastructure for robust and efficient vehicle localization
KR100520708B1 (ko) 3차원 지도의 표시방법
US10380890B2 (en) Autonomous vehicle localization based on walsh kernel projection technique
US9330504B2 (en) 3D building model construction tools
EP2075543A2 (fr) Dispositif de navigation à affichage de carte tridimensionnelle, système d'affichage de carte tridimensionnelle, et programme d'affichage de carte tridimensionnelle
KR20180088149A (ko) 차량 경로 가이드 방법 및 장치
JP3568357B2 (ja) ナビゲーション装置における地図情報表示装置及び地図情報表示方法並びにナビゲーション装置における地図情報表示制御プログラムが記録されたコンピュータ読み取り可能な記録媒体
JPH09138136A (ja) 車載用ナビゲーション装置
US9250093B2 (en) Navigation device, method of predicting a visibility of a triangular face in an electronic map view, and method for generating a database
JPH10207351A (ja) ナビゲーションシステム及びそれに用いるナビゲーションプログラムを記憶した媒体
JPH11161159A (ja) 3次元地図表示装置
WO2018180285A1 (fr) Dispositif de génération de données tridimensionnelles, procédé de génération de données tridimensionnelles, programme de génération de données tridimensionnelles et support d'enregistrement lisible par ordinateur, sur lequel est enregistré un programme de génération de données tridimensionnelles
JP4468076B2 (ja) 地図表示装置
JP3655738B2 (ja) ナビゲーション装置
JP4311659B2 (ja) 3次元景観表示装置
JP4358123B2 (ja) ナビゲーションシステム
JP2002056400A (ja) 地図表示装置、地図表示方法、地図表示装置において用いられるコンピュータプログラム、及びプログラム記録媒体
JP2020129370A (ja) オフスクリーン関心地点(Points of Interest)を示すためのグラフィカル・ユーザ・インターフェース
JP4551355B2 (ja) 3次元地図表示装置
JP2818992B2 (ja) 地形情報解析装置および地形情報解析方法
JP2004333155A (ja) 情報提示装置及び情報提示方法、並びにコンピュータ・プログラム
KR100523514B1 (ko) 3차원 지도 내에서의 2차원 지명 표시방법
JP3365313B2 (ja) 立体地形表示装置
JP2022018015A (ja) 情報処理装置及びプログラム

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

AK Designated contracting states

Kind code of ref document: A2

Designated state(s): AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MT NL NO PL PT RO SE SI SK TR

AX Request for extension of the european patent

Extension state: AL BA MK RS

PUAL Search report despatched

Free format text: ORIGINAL CODE: 0009013

AK Designated contracting states

Kind code of ref document: A3

Designated state(s): AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MT NL NO PL PT RO SE SI SK TR

AX Request for extension of the european patent

Extension state: AL BA MK RS

RIC1 Information provided on ipc code assigned before grant

Ipc: G06T 17/05 20110101ALI20110711BHEP

Ipc: G06T 15/30 20110101AFI20110711BHEP

Ipc: G06T 15/40 20110101ALI20110711BHEP

17P Request for examination filed

Effective date: 20120203

AKX Designation fees paid

Designated state(s): DE FR GB

REG Reference to a national code

Ref country code: DE

Ref legal event code: R079

Ref document number: 602008026898

Country of ref document: DE

Free format text: PREVIOUS MAIN CLASS: G06T0015300000

Ipc: G06T0015400000

RIC1 Information provided on ipc code assigned before grant

Ipc: G06T 15/40 20110101AFI20130109BHEP

Ipc: G06T 17/05 20110101ALI20130109BHEP

GRAP Despatch of communication of intention to grant a patent

Free format text: ORIGINAL CODE: EPIDOSNIGR1

GRAS Grant fee paid

Free format text: ORIGINAL CODE: EPIDOSNIGR3

GRAA (expected) grant

Free format text: ORIGINAL CODE: 0009210

AK Designated contracting states

Kind code of ref document: B1

Designated state(s): DE FR GB

REG Reference to a national code

Ref country code: GB

Ref legal event code: FG4D

REG Reference to a national code

Ref country code: DE

Ref legal event code: R096

Ref document number: 602008026898

Country of ref document: DE

Effective date: 20131017

PLBE No opposition filed within time limit

Free format text: ORIGINAL CODE: 0009261

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: NO OPPOSITION FILED WITHIN TIME LIMIT

26N No opposition filed

Effective date: 20140522

REG Reference to a national code

Ref country code: DE

Ref legal event code: R097

Ref document number: 602008026898

Country of ref document: DE

Effective date: 20140522

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: GB

Payment date: 20141203

Year of fee payment: 7

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: FR

Payment date: 20141208

Year of fee payment: 7

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: DE

Payment date: 20151201

Year of fee payment: 8

GBPC Gb: european patent ceased through non-payment of renewal fee

Effective date: 20151209

REG Reference to a national code

Ref country code: FR

Ref legal event code: ST

Effective date: 20160831

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: GB

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20151209

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: FR

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20151231

REG Reference to a national code

Ref country code: DE

Ref legal event code: R119

Ref document number: 602008026898

Country of ref document: DE

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: DE

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20170701