WO1998009212A1 - Procede et appareil pour l'integration et l'affichage d'informations spatiales - Google Patents

Procede et appareil pour l'integration et l'affichage d'informations spatiales Download PDF

Info

Publication number
WO1998009212A1
WO1998009212A1, PCT/JP1996/002447, JP9602447W
Authority
WO
WIPO (PCT)
Prior art keywords
spatial information
information
display method
processing
integrated
Prior art date
Application number
PCT/JP1996/002447
Other languages
English (en)
Japanese (ja)
Inventor
Minoru Tokunaga
Chigusa Hamada
Kazunori Nakano
Yoichi Seto
Morimi Kuroda
Original Assignee
Hitachi, Ltd.
Hitachi Engineering Co., Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hitachi, Ltd., Hitachi Engineering Co., Ltd. filed Critical Hitachi, Ltd.
Priority to PCT/JP1996/002447 priority Critical patent/WO1998009212A1/fr
Publication of WO1998009212A1 publication Critical patent/WO1998009212A1/fr

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T11/00 - 2D [Two Dimensional] image generation

Definitions

  • the present invention relates to a method for displaying spatial information such as map data using a computer, and to a spatial information integrated display method for integrating and displaying different spatial information.
  • the present invention relates to a spatial information integrated display method for integrally displaying output results of multiple processes for generating spatial information such as maps, images, and analysis data.
  • Japanese Patent Application No. 8-57220 “Method of displaying map simulation results on a digital map”, filed by the present applicant, relates to the integrated display of multiple pieces of spatial information.
  • it describes a method of obtaining detailed information around the reachable range and grasping the size and shape of the reachable range, and also describes how to combine and display the result without hiding the original map information.
  • the handling of maps has changed from a paper map base to a digital map base, and needs have emerged such as the display of simulation results on maps, the layered display of underground shopping centers and high-rise buildings, and highlighted masking display on maps. An integrated display method for heterogeneous information is therefore needed.
  • an object of the present invention is to provide a method for performing integrated display, that is, a method for acquiring spatial information such as maps, images, and analysis data from a plurality of processes or acquiring from a plurality of applications and displaying the information in an integrated manner.
  • the present invention has the following configuration.
  • Interprocess communication is used as a method of passing the output results of multiple processes that generate spatial information to the integration process.
  • alternatively, inter-client communication via a window system, shared memory, or a file used as a medium is employed. This enables integrated display of spatial information generated from a plurality of applications.
  • FIG. 1 is a system configuration diagram of a city planning support system using the present invention.
  • FIG. 2 shows a screen image of the city planning support system.
  • FIG. 3 is a processing flowchart of the city planning support system.
  • FIG. 4 is a flowchart of the interactive transparency processing.
  • FIG. 5 is a block diagram of the spatial information integrated display method in the case of a single application.
  • FIG. 6 is a block diagram of a spatial information integrated display method using inter-process communication.
  • FIG. 7 is a block diagram of a spatial information integrated display method using communication between clients.
  • FIG. 8 is a block diagram of a spatial information integrated display method using a shared memory.
  • FIG. 9 is a block diagram of a spatial information integrated display method using a file as a medium.
  • FIG. 10 is a diagram showing a screen image of the underground shopping mall guidance system.
  • FIG. 11 is a diagram showing a screen image of the map editing system.
  • This embodiment is a system that, when a city plan is drawn up, simulates on a map the traffic convenience of a facility to be constructed, and supports planning by arranging the facility at the optimal position.
  • FIG. 1 shows a device configuration for realizing this embodiment.
  • a computer 101 to which a display 102, a storage device 103, a keyboard 104, and a mouse 105 are connected may be used.
  • the storage device 103 has a sufficient capacity for executing programs and storing map data.
  • as the storage device, a magnetic storage device, an optical disk, or a semiconductor memory may be used.
  • another input device such as a digitizer may be used, or any one of these devices or a combination of them may be used; that is, it suffices that the user can input data, positions on the display 102, and the like through an input device.
  • even if this device is a dedicated device manufactured to carry out the present invention, it suffices that the peripheral functions described above are provided.
  • FIG. 2 shows a screen image of this embodiment. In this embodiment, a designated area of the image map stored in the storage device 103, together with the range that can be reached by car within the designated time 202 from the planned construction position 201 (hereinafter, the reachable range 203), is displayed on the display 102.
  • the user inputs the planned construction location 201 of the new facility and the designated time 202 from the input device such as the keyboard 104 and the mouse 105 on the screen shown in Fig. 2 (a).
  • the computer 101 obtains the reachable range 203 by the route simulation and displays it in the window 204. At this time, when the transparency of the reachable range 203 is changed by the transparency adjusting knob 205, the reachable range 203 is transparently displayed as shown in FIG. 2 (b).
  • the processing flow of the present embodiment will be described in detail with reference to FIG.
  • the user inputs the position of the construction facility in the planned construction position 201.
  • the address may be input from the keyboard 104 or the construction position on the window 204 displaying the map may be input with the mouse 105 by clicking.
  • the designated time 202 is likewise input manually.
  • in the route simulation processing 302, the coordinates of nodes that can be reached by car within the designated time 202 from the planned construction position 201, or the point coordinates of a contour describing the reachable positions, are obtained.
  • the obtained data is referred to as the contour coordinate data 303.
  • Transparency processing is performed interactively by user input.
  • the processing flow of the interactive transparency processing 304 will be described in detail with reference to FIG.
  • In the image map cut-out processing 401, an area centered on the planned construction position 201 is cut out from the image map data 402 stored in the storage device 103. The data thus cut out is referred to as the cut-out image map data 403.
  • in the conversion processing to image data 404, the region bounded by the curve obtained by spline interpolation of the contour coordinate data 303 from the route simulation processing 302 is converted into image data.
  • the image data structure after conversion is, for example, an RGB value described for each pixel.
  • the converted image data is defined as reachable range image data 405.
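  • As a concrete illustration of this conversion, the following sketch rasterizes a closed contour (here approximated as a polygon rather than a spline-interpolated curve) into per-pixel RGB data using an even-odd inside test; the function names, raster size, and fill colour are illustrative assumptions rather than the procedure of the embodiment.

      #include <stdio.h>
      #include <stdlib.h>

      typedef struct { unsigned char r, g, b; } Rgb;

      /* Even-odd test: is point (x, y) inside the closed contour given by
         the n vertices (cx[i], cy[i])? */
      static int inside(double x, double y, const double *cx, const double *cy, int n)
      {
          int i, j, in = 0;
          for (i = 0, j = n - 1; i < n; j = i++) {
              if (((cy[i] > y) != (cy[j] > y)) &&
                  (x < (cx[j] - cx[i]) * (y - cy[i]) / (cy[j] - cy[i]) + cx[i]))
                  in = !in;
          }
          return in;
      }

      /* Fill every pixel lying inside the contour with a single colour;
         pixels outside the contour stay black (all zero). */
      Rgb *rasterize_contour(const double *cx, const double *cy, int n,
                             int width, int height, Rgb fill)
      {
          Rgb *img = calloc((size_t)width * height, sizeof(Rgb));
          if (!img) return NULL;
          for (int y = 0; y < height; y++)
              for (int x = 0; x < width; x++)
                  if (inside(x + 0.5, y + 0.5, cx, cy, n))
                      img[y * width + x] = fill;
          return img;
      }

      int main(void)
      {
          /* A small triangular "reachable range" in a 64 x 64 raster. */
          double cx[] = { 8.0, 56.0, 32.0 };
          double cy[] = { 8.0, 16.0, 56.0 };
          Rgb red = { 255, 0, 0 };
          Rgb *img = rasterize_contour(cx, cy, 3, 64, 64, red);
          if (!img) return 1;
          int filled = 0;
          for (int i = 0; i < 64 * 64; i++)
              if (img[i].r) filled++;
          printf("filled pixels: %d\n", filled);
          free(img);
          return 0;
      }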
  • the user inputs the transparency 407 using the transparency adjustment knob 205 shown in FIG. 2. The transparency 407 takes a value from 0 to 100.
  • when the transparency 407 is 0, the reachable range 203 is not displayed.
  • when the transparency 407 is 100, the reachable range 203 is overwritten opaquely on the image map.
  • FIG. 2A shows a display example when the transparency 407 is 100
  • FIG. 2B shows a display example when the transparency 407 is 50.
  • In the image composition processing 408, the cut-out image map data 403 generated in the image map cut-out processing 401 and the reachable range image data 405 generated in the conversion processing to image data 404 are composited according to the transparency 407 determined in the transparency input processing.
  • the RGB values C(rC, gC, bC) at each coordinate of the composite image data are obtained as a weighted average, according to the transparency 407, of the RGB values of the cut-out image map data 403 and of the reachable range image data 405 at the same coordinates.
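  • As an illustration, a per-pixel weighted average consistent with the behaviour described above (transparency 0 hides the reachable range, transparency 100 overwrites the map) can be written as in the following sketch; the exact equation of the embodiment may differ, and treating all-zero pixels of the reachable range image as background is an added assumption.

      #include <stdio.h>

      typedef struct { unsigned char r, g, b; } Rgb;

      /* Per-pixel weighted average of the cut-out image map (a_map) and the
         reachable-range image (b_range) for a transparency T in [0, 100].
         T = 0 leaves the map unchanged and T = 100 overwrites it with the
         reachable range, matching the behaviour described for FIG. 2. */
      void composite(const Rgb *a_map, const Rgb *b_range, Rgb *out,
                     int npixels, int transparency)
      {
          double w = transparency / 100.0;    /* weight of the range image */
          for (int i = 0; i < npixels; i++) {
              if (b_range[i].r == 0 && b_range[i].g == 0 && b_range[i].b == 0) {
                  out[i] = a_map[i];          /* background: keep the map pixel */
              } else {
                  out[i].r = (unsigned char)(w * b_range[i].r + (1.0 - w) * a_map[i].r + 0.5);
                  out[i].g = (unsigned char)(w * b_range[i].g + (1.0 - w) * a_map[i].g + 0.5);
                  out[i].b = (unsigned char)(w * b_range[i].b + (1.0 - w) * a_map[i].b + 0.5);
              }
          }
      }

      int main(void)
      {
          Rgb map[2]   = { {200, 180, 150}, {200, 180, 150} };  /* map pixels   */
          Rgb range[2] = { {  0,   0,   0}, {255,   0,   0} };  /* range pixels */
          Rgb out[2];
          composite(map, range, out, 2, 50);                    /* transparency 50 */
          printf("pixel 1: %d %d %d\n", out[1].r, out[1].g, out[1].b);
          return 0;
      }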
  • the composite image data 409 generated in the image compositing process 408 is output to the display 102.
  • the image map in the present embodiment may be a vector map.
  • a composite display of the image map and the vector map may be used.
  • the above is an example of the case where the interactive transparency processing 304 is realized by one application.
  • either of the image map cut-out processing 401 and the conversion processing to image data 404 may be performed first.
  • in general, the processing that generates the two pieces of spatial information to be integrated is referred to as processing A and processing B, and the processing that integrates the two pieces of spatial information is referred to as the integration processing.
  • the configuration is shown in the block diagram of FIG. 5, in which one of the image map cut-out processing 401 and the conversion processing to image data 404 in this embodiment corresponds to the processing A 501 and the other corresponds to the processing B 502.
  • the cut-out image map data 403 corresponds to the result data A 503 when the image map cut-out processing 401 corresponds to the processing A 501, and to the result data B 504 when it corresponds to the processing B 502.
  • likewise, the reachable range image data 405 corresponds to the result data A 503 when the conversion processing to image data 404 corresponds to the processing A 501, and to the result data B 504 when it corresponds to the processing B 502.
  • the image composition processing 408 corresponds to the integration processing 505
  • the composite image data 409 corresponds to the integrated data 506
  • the display processing 410 corresponds to the display processing 507.
  • the above configuration is realized with only one application, application A508.
  • the integrated display method with this configuration not only has a small development load, but also has a high processing speed.
  • FIG. 6 shows the configuration of the integrated display method in this case.
  • processing A 602 generates result data A 603.
  • processing B 605 generates result data B 606.
  • the integration processing 608 in application C 607 uses the inter-process communication 609 to bring the result data A 610 and the result data B 611 into the memory area managed by the application C 607.
  • the result data A 603 and the result data A 610 have different storage areas in the memory but have the same contents. The same applies to the result data B 606 and the result data B 611.
  • the integration processing 608 integrates the result data A 610 and the result data B 611 to generate integrated data 612.
  • the display processing 613 outputs the integrated data 612 to the output device.
  • the integrated display method having this configuration has a low degree of dependence between processes.
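  • A minimal sketch of this configuration using a Unix pipe between a generating process and the integration process follows; the choice of pipes, the data format, and the buffer size are illustrative assumptions, since the text does not prescribe a particular inter-process communication mechanism.

      #include <stdio.h>
      #include <unistd.h>
      #include <sys/wait.h>

      /* A generating process (standing in for processing A) writes its result
         data into a pipe; the integration process reads it into a buffer of
         its own, so the two sides keep separate copies with identical
         contents, as the text describes. */
      int main(void)
      {
          int fd[2];
          if (pipe(fd) == -1) { perror("pipe"); return 1; }

          pid_t pid = fork();
          if (pid == -1) { perror("fork"); return 1; }

          if (pid == 0) {                       /* child: processing A */
              const char result_a[] = "result data A (illustrative)";
              close(fd[0]);
              write(fd[1], result_a, sizeof(result_a));
              close(fd[1]);
              _exit(0);
          }

          /* parent: integration process of application C */
          char buf[256] = {0};
          close(fd[1]);
          read(fd[0], buf, sizeof(buf) - 1);    /* copy into C's own memory area */
          close(fd[0]);
          wait(NULL);

          printf("integration process received: %s\n", buf);
          /* ...here the data would be combined with result data B and displayed */
          return 0;
      }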
  • FIG. 7 shows an overview of the integrated display method in this case.
  • the result data A 703 generated by the processing A 702 in the application A 701 and the result data B 706 generated by the processing B 705 in the application B 704 are integrated by the integration processing 708 in the application C 707.
  • using the inter-client communication 710, the result data A 703 and the result data B 706 are transferred to the memory area managed by the application C 707, for example by passing the data through properties in the window system.
  • a property is a data item that can be added to a window.
  • the integration processing 708 generates integrated data 713 from the result data A 711 and the result data B 712 in the application C 707.
  • the integrated data 713 is output to the output device.
  • the integrated display method having this configuration has a low degree of dependence between processes.
  • data may also be transferred using a selection in the window system; a selection is a single token passed between applications.
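  • A sketch of transferring result data through a window property with Xlib follows (compile with -lX11); for brevity the property is written and read back within a single client, whereas in the configuration of FIG. 7 the writer and the reader would be different applications (typically coordinated by a PropertyNotify event or a selection exchange), and the property name and payload are illustrative.

      #include <stdio.h>
      #include <X11/Xlib.h>
      #include <X11/Xatom.h>

      int main(void)
      {
          Display *dpy = XOpenDisplay(NULL);
          if (!dpy) { fprintf(stderr, "cannot open display\n"); return 1; }

          Window win = XCreateSimpleWindow(dpy, DefaultRootWindow(dpy),
                                           0, 0, 100, 100, 0, 0, 0);
          Atom prop = XInternAtom(dpy, "RESULT_DATA_A", False);

          /* Writer side: attach the result data to the window as a property. */
          char payload[] = "result data A (illustrative)";
          XChangeProperty(dpy, win, prop, XA_STRING, 8, PropModeReplace,
                          (unsigned char *)payload, (int)sizeof(payload));

          /* Reader side: fetch the property into this client's own memory. */
          Atom type; int format; unsigned long nitems, left;
          unsigned char *data = NULL;
          if (XGetWindowProperty(dpy, win, prop, 0, 1024, False, XA_STRING,
                                 &type, &format, &nitems, &left, &data) == Success
              && data) {
              printf("read back: %s\n", (const char *)data);
              XFree(data);
          }

          XDestroyWindow(dpy, win);
          XCloseDisplay(dpy);
          return 0;
      }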
  • Fig. 8 shows the configuration of the integrated display method in this case.
  • the shared memory 804 is secured in one of the application A 801, the application B 802, and the application C 803.
  • shared memory 804 can be allocated using the shmget function.
  • the shared memory 804 is allocated with enough space to store the result data A 806 generated by the processing A 805 in the application A 801 and the result data B 808 generated by the processing B 807 in the application B 802.
  • process A 805 generates result data A 806 and stores it in shared memory 804.
  • process B 807 generates result data B 808 and stores it in shared memory 804.
  • shared memory 804 can be accessed using the shmat function.
  • in the integration processing 809 in the application C 803, the result data A 810 and the result data B 811 are copied from the shared memory 804 into the memory area managed by the application C 803.
  • the result data A806 and the result data A810 have different storage areas in the memory but have the same contents.
  • the integration process 809 generates integrated data 812 from the result data A810 and the result data B811.
  • the display processing 813 outputs the integrated data 812 to the output device.
  • the integrated display method with this configuration not only has a small development load, but also has a high processing speed. When synchronization is required, inter-process communication, for example, may be used.
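  • A sketch using the shmget and shmat functions mentioned above follows; creating, writing, and reading the segment within a single process, as well as the key and segment size, are simplifications for illustration, whereas in the configuration of FIG. 8 the segment would be created by one application and attached by the others.

      #include <stdio.h>
      #include <string.h>
      #include <sys/ipc.h>
      #include <sys/shm.h>

      int main(void)
      {
          const size_t size = 4096;            /* room for result data A and B */
          int shmid = shmget(IPC_PRIVATE, size, IPC_CREAT | 0600);
          if (shmid == -1) { perror("shmget"); return 1; }

          char *mem = shmat(shmid, NULL, 0);   /* attach to this process */
          if (mem == (void *)-1) { perror("shmat"); return 1; }

          /* Processing A / B would store their result data here... */
          strcpy(mem, "result data A (illustrative)");

          /* ...and the integration processing of application C would copy it
             out into its own memory area before combining and displaying. */
          char local_copy[64];
          strncpy(local_copy, mem, sizeof(local_copy) - 1);
          local_copy[sizeof(local_copy) - 1] = '\0';
          printf("integration process read: %s\n", local_copy);

          shmdt(mem);                          /* detach */
          shmctl(shmid, IPC_RMID, NULL);       /* release the segment */
          return 0;
      }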
  • FIG. 9 shows an overview of the integrated display method in this case.
  • processing A 902 generates result data A 903 and stores it in file A 904.
  • processing B 906 in application B 905 generates result data B 907 and stores it in file B 908.
  • in the application C 909, the result data A 911 and the result data B 912 are read from the file A 904 and the file B 908, respectively, into the memory area managed by the application C 909.
  • the result data A 903 and the result data A 911 have different storage areas in the memory but have the same contents. The same applies to the result data B 907 and the result data B 912.
  • the integration processing 910 generates integrated data 913 from the result data A 911 and the result data B 912.
  • the display process 914 outputs the integrated data 913 to the output device.
  • the above is performed, for example, by batch processing.
  • the integrated display method with this configuration has a small development load and low dependence between processes.
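  • A minimal sketch of the file-mediated exchange follows; the file name, data format, and single-process structure are illustrative simplifications of the batch-style flow described above.

      #include <stdio.h>

      int main(void)
      {
          /* Processing A: write result data A to file A. */
          FILE *out = fopen("result_a.dat", "w");
          if (!out) { perror("fopen"); return 1; }
          fputs("result data A (illustrative)\n", out);
          fclose(out);

          /* Integration process: read file A back into its own memory area. */
          char buf[128] = {0};
          FILE *in = fopen("result_a.dat", "r");
          if (!in) { perror("fopen"); return 1; }
          fgets(buf, sizeof(buf), in);
          fclose(in);

          printf("integration process read: %s", buf);
          /* ...result data B would be read from file B in the same way,
             then the two would be integrated and displayed. */
          return 0;
      }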
  • the configurations of the integrated display method shown in FIG. 6 to FIG. 9 consider the case where there are only two pieces of spatial information to be integrated, but they can basically be extended to a larger number.
  • according to the present invention, when the reachable range is displayed, it is displayed translucently over the background map. As a result, it is possible to read necessary information such as place names and structures described within the reachable range.
  • the present invention may be used as a means for displaying an underground shopping mall map or a structure inside a building in a layered manner, or may be used as a means for performing highlighted masking display on a map. Examples of applications include an underground shopping mall information system that can simultaneously display a ground map and an underground shopping mall map and provide underground shopping mall information, and a map editing system that can draw various figures on a map.
  • FIG. 10 shows a screen image of the underground shopping mall guidance system.
  • FIG. 11 shows a screen image of the map editing system. Using menus such as the circle 1101 and the polygon 1102, figures such as a circle 1104 and a polygon 1105 are drawn on the map displayed in the window 1103. By using the transparency adjustment knob 1106, the figures shown in Fig. 11(a) can be displayed transparently as shown in Fig. 11(b), so that the map is given a highlighted masking display without the background map being hidden.
  • the present invention is widely applicable not only to the display of maps and figures, but also to the display of satellite images and analysis results. It can also be used where applications handling CAD drawings and the like coexist, and for image map publishing.
  • Industrial applicability: according to the present invention, multiple pieces of spatial information such as maps, images, and analysis data can be effectively integrated and displayed, and the output results of multiple processes or different applications can be integrated and displayed on the same screen.
  • the present invention is capable of superimposing and displaying a plurality of pieces of information in a map information system.

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The invention relates to a method for integrating and displaying spatial information generated by processes or applications. The method is carried out as explained below. (1) Within one application are managed the processes that each generate spatial information, the spatial information itself, an integration process that integrates the spatial information and generates integrated information, the integrated information, and a display process that sends the integrated information to an output unit. In this way, the spatial information generated by the processes is integrated and displayed by a single application. (2) To transfer the output results of a spatial-information-generating process to the integration process, inter-process communication, inter-client communication via a window system, or shared memory is used. Alternatively, a file is used as the medium. In this way, the spatial information generated by the applications is integrated and displayed. (3) The spatial information is displayed superimposed in transparent mode. This makes it possible to grasp at a glance the spatial information generated by the processes or applications.
PCT/JP1996/002447 1996-08-30 1996-08-30 Procede et appareil pour l'integration et l'affichage d'informations spatiales WO1998009212A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP1996/002447 WO1998009212A1 (fr) 1996-08-30 1996-08-30 Procede et appareil pour l'integration et l'affichage d'informations spatiales

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP1996/002447 WO1998009212A1 (fr) 1996-08-30 1996-08-30 Procede et appareil pour l'integration et l'affichage d'informations spatiales

Publications (1)

Publication Number Publication Date
WO1998009212A1 true WO1998009212A1 (fr) 1998-03-05

Family

ID=14153733

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP1996/002447 WO1998009212A1 (fr) 1996-08-30 1996-08-30 Procede et appareil pour l'integration et l'affichage d'informations spatiales

Country Status (1)

Country Link
WO (1) WO1998009212A1 (fr)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2008532181A (ja) * 2005-03-01 2008-08-14 Irm Llc Method and system for accessing, annotating and sharing enterprise data
US8015494B1 (en) 2000-03-22 2011-09-06 Ricoh Co., Ltd. Melded user interfaces
US8793589B2 (en) 2000-03-22 2014-07-29 Ricoh Co., Ltd. Melded user interfaces

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS61269194A (ja) * 1985-05-24 1986-11-28 Tokyo Electric Power Co Inc Graphic display device
JPH01220005A (ja) * 1988-02-29 1989-09-01 Nissin Electric Co Ltd Method of creating monitoring and control screens for instrumentation
JPH05151339A (ja) * 1991-11-27 1993-06-18 Hitachi Ltd Object recognition method
JPH05281952A (ja) * 1992-01-14 1993-10-29 Nec Home Electron Ltd Display control device
JPH07104724A (ja) * 1993-09-29 1995-04-21 Hitachi Ltd Multi-window display method and apparatus

Similar Documents

Publication Publication Date Title
US6304271B1 (en) Apparatus and method for cropping an image in a zooming graphical user interface
US7023452B2 (en) Image generation system, image generating method, and storage medium storing image generation program
JP3212113B2 (ja) 地図情報の表示方法及び装置
EP0860787A2 (fr) Système de représentation de cartes
JPH0757117A (ja) テクスチャマップへの索引を生成する方法及びコンピュータ制御表示システム
JP3278623B2 (ja) 地図3次元化システム,地図3次元化方法および地図3次元化用プログラムを記憶した記憶媒体
CN112256790A (zh) 基于WebGL的三维可视化展现系统及数据可视化方法
JP2824454B2 (ja) 三次元図形入力方式
KR100514944B1 (ko) 인터넷을 이용한 지리정보 데이터의 주문검색 서비스 방법
WO1998009212A1 (fr) Procede et appareil pour l'integration et l'affichage d'informations spatiales
JP3599198B2 (ja) 図形編集装置
CN115546349A (zh) 基于Openlayer实现地图背景图的比例和位置切换的方法
JP2712789B2 (ja) 画像表示装置
Chapman et al. Panoramic imaging and virtual reality—filling the gaps between the lines
JP2004233711A (ja) 地図データ構築装置
Lodha et al. Consistent visualization and querying of GIS databases by a location-aware mobile agent
JP6575221B2 (ja) 表示制御方法、情報処理装置及び表示制御プログラム
JPH10153949A (ja) 地理情報システム
JP3691105B2 (ja) 三次元画像処理システム
JPH10334217A (ja) 空間情報データ表示方法
JPH11149571A (ja) 仮想3次元空間の生成方法、仮想3次元空間生成プログラムを記録したコンピュータで読みとり可能な記録媒体、サーバから端末装置に3次元構造データを送信する方法、及びサーバと通信して端末装置において仮想3次元空間を生成する方法
JPH0736437A (ja) 画像データ処理装置
JP3930450B2 (ja) 解析結果可視化処理プログラム
CN118071614A (zh) 多元素对象生成方法、装置、设备、存储介质及程序产品
JPS6023889A (ja) マルチ・カ−ソルの表示方法

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): JP KR SG US

DFPE Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101)