WO2024070238A1 - Living environment evaluation system and living environment evaluation method - Google Patents

Info

Publication number
WO2024070238A1
Authority
WO
WIPO (PCT)
Prior art keywords
analysis
living environment
index
unit
information
Prior art date
Application number
PCT/JP2023/028695
Other languages
French (fr)
Japanese (ja)
Inventor
浩也 松葉
ゆう 趙
イレネ ラフマワン
Original Assignee
株式会社日立製作所 (Hitachi, Ltd.)
Priority date
Filing date
Publication date
Application filed by 株式会社日立製作所 (Hitachi, Ltd.)
Publication of WO2024070238A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q 50/00 Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
    • G06Q 50/10 Services
    • G06Q 50/16 Real estate

Definitions

  • The present invention relates to a living environment evaluation system and a living environment evaluation method, and is suitably applied to a living environment evaluation system and method that evaluate a living environment and visually output the evaluation result.
  • Patent Document 1 discloses a real estate information output device that outputs real estate information based on an individual's values.
  • Patent Document 2 discloses a living environment evaluation device that evaluates living environments based on an individual's values.
  • Patent Document 1 only evaluates individual real estate (houses), and does not evaluate the livability of a given area or utilize the results of that evaluation.
  • Patent Document 2 evaluates the living environment using a model that analyzes open data and calculates indicators related to the living environment of a "region"; however, since open data is used, the resolution of the area referred to as the "region" is presumed to be relatively low.
  • The smallest unit of area that can be analyzed using open data is an administrative district such as a city, ward, town, or village.
  • The present invention has been made in consideration of the above points, and aims to propose a living environment evaluation system and method that can visually provide an evaluation of the living environment that reflects individual preferences at a higher resolution than conventional technology.
  • The present invention provides a living environment evaluation system that evaluates a living environment with respect to an index specified by a user, the living environment evaluation system comprising: user preference information that holds preference coefficients indicating the user's preferences for one or more living environment items that may affect the index; analysis instruction information that indicates an analysis range specified by the user as the range for evaluating the living environment; a high-resolution data analysis unit that performs a wavelength-based analysis using an image corresponding to the analysis range to analyze the characteristics of the analysis range for each of a plurality of areas into which the analysis range is divided; an index calculation unit that calculates an index value of the index in the analysis range for each of the areas based on the analysis results of the characteristics of the analysis range and the user preference information; and a visualization unit that displays the evaluation results based on the index values calculated by the index calculation unit by superimposing them, for each of the areas, on a map indicating the analysis range.
  • The present invention also provides a method for evaluating a living environment using a living environment evaluation system that evaluates a living environment with respect to an index specified by a user, the living environment evaluation system having user preference information that holds preference coefficients indicating the user's preferences for one or more living environment items that may affect the index, and analysis instruction information that indicates an analysis range specified by the user as the range for evaluating the living environment, the method comprising: a high-resolution data analysis step in which the living environment evaluation system performs a wavelength-based analysis using an image corresponding to the analysis range to analyze the characteristics of the analysis range for each of a plurality of areas into which the analysis range is divided; an index calculation step in which the living environment evaluation system calculates an index value of the index in the analysis range for each of the areas based on the analysis results of the characteristics of the analysis range in the high-resolution data analysis step and the user preference information; and a visualization step in which the living environment evaluation system displays the evaluation results based on the index values calculated in the index calculation step by superimposing them, for each of the areas, on a map indicating the analysis range.
  • The present invention makes it possible to visually present an evaluation of the living environment that reflects individual preferences at a higher resolution than conventional technology.
  • FIG. 1 is a block diagram showing an example of the configuration of a living environment evaluation system 1 according to a first embodiment of the present invention.
  • FIG. 2 is a flowchart showing an example of the processing procedure of the overall process in the first embodiment.
  • FIG. 3 is a diagram showing an example of analysis instruction information 18.
  • FIG. 4 is a flowchart illustrating an example of the processing procedure of the high-resolution data analysis process.
  • FIG. 5 is a flowchart illustrating an example of the processing procedure of the feature classification process.
  • FIG. 6 is a diagram showing an example of color identification information 113.
  • FIG. 7 is a diagram showing an example of feature information 19.
  • FIG. 8 is a flowchart (part 1) illustrating an example of the processing procedure of the low-resolution data analysis process.
  • FIG. 9 is a flowchart (part 2) illustrating an example of the processing procedure of the low-resolution data analysis process.
  • FIG. 10 is a diagram showing an example of regional information 122.
  • FIG. 11 is a flowchart illustrating an example of the processing procedure of the index calculation process.
  • FIG. 12 is a diagram showing an example of user preference information 132.
  • FIG. 13 is a diagram showing an example of association information 133.
  • FIG. 14 is a diagram showing an example of index value information 20.
  • FIG. 16 is a diagram showing an example of a living environment evaluation screen 300.
  • FIG. 17 is a block diagram showing an example of the configuration of a living environment evaluation system 10 according to a second embodiment.
  • FIG. 18 is a flowchart illustrating an example of the processing procedure for updating association information.
  • FIG. 19 is a flowchart showing an example of the processing procedure for past information display processing.
  • In the following description, when elements of the same type are described without distinction, common reference signs (or the common number within the reference signs) will be used; when elements of the same type are described with distinction, the full reference signs of those elements, or IDs assigned to those elements, will be used instead.
  • In the following description, processing performed by executing a program may be described; since the program is executed by at least one processor (e.g., a CPU) to perform a predetermined process using a storage resource (e.g., a memory) and/or an interface device (e.g., a communication port) as appropriate, the subject of the processing may be the processor.
  • the subject of the processing performed by executing a program may be a controller, device, system, computer, node, storage system, storage device, server, management computer, client, or host having a processor.
  • the subject of the processing performed by executing a program (e.g., a processor) may include a hardware circuit that performs part or all of the processing.
  • the subject of the processing performed by executing a program may include a hardware circuit that performs encryption and decryption, or compression and decompression.
  • the processor operates as a functional unit that realizes a specified function by operating according to the program.
  • Devices and systems that include a processor are devices and systems that include these functional units.
  • the program may be installed in a device such as a computer from a program source.
  • the program source may be, for example, a program distribution server or a non-transitory storage medium readable by a computer.
  • the program distribution server includes a processor (e.g., a CPU) and a non-transitory storage resource, and the storage resource may further store a distribution program and a program to be distributed. Then, the processor of the program distribution server may execute the distribution program, thereby distributing the program to be distributed to other computers.
  • two or more programs may be realized as one program, and one program may be realized as two or more programs.
  • FIG. 1 is a block diagram showing an example of the configuration of a living environment evaluation system 1 according to a first embodiment of the present invention.
  • the living environment evaluation system 1 is a computer system that evaluates living environments within an area specified by a user (an individual searching for a house, a real estate agent, a local government official, etc.) taking into consideration the user's preferences, and is connected to a disk 2, a network 3, and an input/output device 4.
  • Disk 2 is a storage device that stores specific data used in the processing in the living environment evaluation system 1.
  • Disk 2 is, for example, a storage device such as a HDD (Hard Disk Drive) or SSD (Solid State Drive), but may also be an external storage or cloud connected via the network 3.
  • Specific examples of the specific data stored on disk 2 include the original data of satellite images 114, map information 115, and regional information 122 in memory 17 described below. Note that these data may be configured to be obtained from an external device connected via the network 3.
  • The network 3 is a network that communicatively connects the living environment evaluation system 1 to the outside, and is, for example, a LAN (Local Area Network) or a WAN (Wide Area Network). As mentioned above, instead of storing the specified data on the disk 2, the system may be configured to acquire the specified data from the Internet or the like via the network 3.
  • the input/output device 4 includes an input device such as a keyboard and mouse that the user uses to operate the living environment evaluation system 1, and an output device such as a display or printer that outputs data output from the living environment evaluation system 1.
  • the living environment evaluation system 1 is configured with a processor 15, an input/output device 16, and a memory 17.
  • the processor 15 is a processor that controls the entire living environment evaluation system 1, and is specifically, for example, a CPU (Central Processing Unit).
  • the input/output device 16 is an input/output interface that enables input and output of signals (data) between the living environment evaluation system 1 and its external environment (disk 2, network 3, input/output device 4).
  • The memory 17 is a storage device that stores programs and data used by the computer of the living environment evaluation system 1, and is, for example, a RAM (Random Access Memory).
  • the high-resolution data analysis unit 11, low-resolution data analysis unit 12, index calculation unit 13, and visualization unit 14 shown in memory 17 are functional units that realize various functions by the processor 15 executing programs stored in or read from memory 17.
  • Analysis instruction information 18, feature information 19, and index value information 20 shown in memory 17 are data stored in memory 17.
  • The high-resolution data analysis unit 11 has a function of analyzing the characteristics of an area (range) specified by the user, using "high-resolution data" with a higher resolution than the "low-resolution data" represented by the regional information 122 described below.
  • the high-resolution data analysis unit 11 has a land feature extraction unit 111 and a classification unit 112 as functional units that execute programs, and has color identification information 113, satellite imagery 114, and map information 115 as data used for processing by each of these functional units.
  • the color identification information 113 is data for identifying the analysis target by color, and a specific example is shown in FIG. 6, which will be described later.
  • the satellite image 114 and map information 115 are data with a higher resolution than the regional information 122, which will be described later, and the details of each are explained in the feature classification process shown in FIG. 5.
  • The low-resolution data analysis unit 12 has a function of analyzing the characteristics of a region (area) specified by the user using "low-resolution data."
  • The low-resolution data analysis unit 12 has a data mapping unit 121 as a functional unit that executes a program, and has regional information 122 as data used for processing by the data mapping unit 121.
  • Regional information 122 is low-resolution data for relatively broad "regions" such as administrative districts, and holds data values for each region for a given indicator.
  • Regional information may be any data that indicates the characteristics of a broad "region," and may include, for example, the results of surveys of residents in each region, rankings by region, and so on.
  • A specific example of regional information 122 is shown in FIG. 10, which will be described later.
  • the index calculation unit 13 has a function of integrating the analysis results by the high-resolution data analysis unit 11 and the low-resolution data analysis unit 12 to calculate an index value of the living environment that reflects the user's preferences.
  • the index calculation unit 13 has a data integration evaluation unit 131 as a functional unit that executes a program, and has user preference information 132 and association information 133 as data used in processing by the data integration evaluation unit 131.
  • User preference information 132 is data that holds preference coefficients that indicate the user's preferences for a specific item, and a specific example is shown in FIG. 12, which will be described later.
  • Association information 133 is data that holds association (weighting) coefficients used when calculating the feature quantities of a specific living environment item from the feature quantities held in feature information 19, and a specific example is shown in FIG. 13, which will be described later.
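The patent defers the exact formula to the index calculation process (FIG. 11), but one plausible reading of this two-stage weighting is: the association (weighting) coefficients turn the analyzed feature quantities into living environment item values, and the preference coefficients combine those item values into the user's index. The following sketch illustrates that reading only; the function name, data layout, and all numbers are assumptions, not the patented implementation.

```python
def calculate_index(features, association, preference):
    """Hypothetical two-stage weighted sum for one mesh.

    features:    {feature_name: analyzed feature quantity for this mesh}
                 (high- and low-resolution analysis results could both enter here)
    association: {item_name: {feature_name: association coefficient}}  # cf. 133
    preference:  {item_name: user's preference coefficient}            # cf. 132
    """
    index = 0.0
    for item, weights in association.items():
        # Feature quantity of one living environment item (association info 133).
        item_value = sum(w * features.get(f, 0.0) for f, w in weights.items())
        # Weight the item by the user's preference (user preference info 132).
        index += preference.get(item, 0.0) * item_value
    return index

# Illustrative values only: a mesh with much greenery and little concrete
# scores well on a hypothetical "quietness" item the user cares about.
features = {"greenery": 0.6, "concrete": 0.3}
association = {"quietness": {"greenery": 0.8, "concrete": -0.5}}
preference = {"quietness": 1.0}
# calculate_index(features, association, preference) → 0.8*0.6 - 0.5*0.3 = 0.33
```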
  • the visualization unit 14 has a function of visualizing the living environment index values calculated by the index calculation unit 13.
  • the visualization unit 14 has a map display unit 141 as a functional unit that provides functions by executing a program.
  • the living environment evaluation screen 300 shown in FIG. 16, which will be described later, is a display screen that the living environment evaluation system 1 provides to the user. The processing results by the map display unit 141 are also reflected in this living environment evaluation screen 300.
  • Analysis instruction information 18 is data that indicates the analysis range of the living environment specified by the user, and a specific example is shown in FIG. 3, which will be described later.
  • Feature information 19 is data that holds the analysis results by the high-resolution data analysis unit 11 and the low-resolution data analysis unit 12, and a specific example is shown in FIG. 7, which will be described later.
  • Index value information 20 is data that records the index value of the living environment calculated by the index calculation unit 13, and a specific example is shown in FIG. 14, which will be described later.
  • the overall processing shows an overview of the living environment evaluation processing executed by the living environment evaluation system 1.
  • The user launches the software of the living environment evaluation system 1 and operates a user interface (UI) provided by the launched software to specify the range in which the living environment is to be analyzed (specifying the analysis range).
  • Specifically, the user performs an operation to display the desired analysis range in the map display area 302 before pressing the evaluation execution button 304, thereby specifying the analysis range.
  • In addition to specifying the analysis range, the user also selects which items about the living environment he or she places importance on, and to what extent (specifying user preferences). Specifically, on the living environment evaluation screen 300 in FIG. 16 described below, the user specifies user preferences by performing input operations according to his or her preferences in the questionnaire input area 301 before pressing the evaluation execution button 304.
  • analysis instruction information 18 is generated and stored in memory 17 when the evaluation execution button 304 on the living environment evaluation screen 300 is pressed.
  • In this embodiment, the "evaluation specification screen" for specifying the analysis range and user preferences and the "evaluation result screen" for displaying the evaluation results of the living environment are displayed together on a single "living environment evaluation screen 300," but the evaluation specification screen and the evaluation result screen may be realized as separate UIs.
  • Figure 16 which will be described later, shows a specific example of the living environment evaluation screen 300.
  • FIG. 3 is a diagram showing an example of analysis instruction information 18.
  • starting latitude 1801, starting longitude 1802, north-south distance 1803, and east-west distance 1804 indicate the analysis range of the living environment specified by the user.
  • The specified latitude and longitude (starting latitude 1801, starting longitude 1802) are set as the upper-left vertex of a rectangle, and the analysis range is the rectangle extending from that vertex by the north-south distance 1803 and the east-west distance 1804.
  • This analysis range corresponds to the entire range of the map displayed in map display area 302 of living environment evaluation screen 300 shown in FIG. 16.
  • Mesh width 1805 indicates the mesh width when dividing the analysis range into meshes.
  • the analysis range is divided into a plurality of areas (meshes) and the characteristics of each area are analyzed.
  • the mesh is a square, and the length of one side of the square is specified by the mesh width 1805.
  • If the mesh width 1805 is set to 100 m as shown in FIG. 3, the analysis range is divided into squares of 100 m on each side.
  • In this embodiment, the mesh width 1805 is set to a fixed value by the system, but it may be made changeable, for example by the user, within a specified range or in specified units.
  • The mesh shape is also not limited to a square, and may be another specific shape.
  • The analysis target 1806 indicates the types of features to be analyzed in each mesh in the analysis range.
  • In this embodiment, the analysis target 1806 is specified by the system, and color identification information 113 is prepared for each analysis target.
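As a concrete illustration, the analysis instruction information 18 (FIG. 3) could be represented as the following record. The field names and the example coordinate and distance values are assumptions for illustration; only the field meanings (1801 to 1806) and the 100 m mesh width come from the text.

```python
from dataclasses import dataclass

@dataclass
class AnalysisInstruction:
    """Hypothetical in-memory form of analysis instruction information 18."""
    start_latitude: float            # 1801: latitude of the upper-left vertex
    start_longitude: float           # 1802: longitude of the upper-left vertex
    north_south_distance_m: float    # 1803: vertical extent of the rectangle
    east_west_distance_m: float      # 1804: horizontal extent of the rectangle
    mesh_width_m: float = 100.0      # 1805: side length of each square mesh
    analysis_targets: tuple = ("concrete",)  # 1806: feature types to analyze

# Example values are invented; the 100 m mesh width matches FIG. 3.
instruction = AnalysisInstruction(
    start_latitude=35.68, start_longitude=139.76,
    north_south_distance_m=250.0, east_west_distance_m=420.0,
)
```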
  • the high-resolution data analysis unit 11 activates the land feature extraction unit 111 to execute the high-resolution data analysis process (step S101). Details of the high-resolution data analysis process will be described later with reference to FIG. 4 etc. By executing the high-resolution data analysis process, the characteristics of the analysis range are analyzed using high-resolution data.
  • the low-resolution data analysis unit 12 activates the data mapping unit 121 to execute the low-resolution data analysis process (step S102). Details of the low-resolution data analysis process will be described later with reference to FIG. 8 etc. By executing the low-resolution data analysis process, the characteristics of the analysis range are analyzed using the low-resolution data.
  • the index calculation unit 13 activates the data integration evaluation unit 131 to execute the index calculation process (step S103). Details of the index calculation process will be described later with reference to FIG. 11 etc.
  • an index value of the living environment is calculated based on the analysis results of the high-resolution data analysis process and the low-resolution data analysis process and the user preferences, and is recorded in the index value information 20.
  • the visualization unit 14 starts the map display unit 141 to execute the visualization process (step S104). Details of the visualization process are shown in FIG. 15, which will be described later. By executing the visualization process, the evaluation results of the living environment in the analysis range are visually presented to the user.
  • FIG. 4 is a flowchart showing an example of the processing procedure of the high-resolution data analysis processing.
  • the high-resolution data analysis processing shown in Fig. 4 corresponds to the processing of step S101 in Fig. 2, and is executed when the land feature extraction unit 111 is started by the high-resolution data analysis unit 11.
  • the land feature extraction unit 111 acquires the analysis instruction information 18 generated based on the operation by the user (step S201).
  • This "acquire" operation may refer to any of receiving, referencing, reading, or copying the target data, and the same applies to the following explanation.
  • Next, the land feature extraction unit 111 divides the analysis range specified by the analysis instruction information 18 acquired in step S201 into meshes (step S202). Specifically, the land feature extraction unit 111 divides the analysis range, which is a rectangular area with the starting latitude 1801 and starting longitude 1802 of the analysis instruction information 18 as its upper-left vertex, the north-south distance 1803 as its vertical extent, and the east-west distance 1804 as its horizontal extent, into square areas (meshes) whose predetermined size is the mesh width 1805. At this time, the north-south and east-west distances of the analysis range are rounded up to an integer multiple of the mesh width 1805, so that the analysis range is divided into multiple complete meshes.
  • the process of dividing the analysis range into meshes is also performed in other processes described later, but all of these are similar to the process in step S202, so detailed explanations will not be repeated.
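The division rule of step S202 (round the north-south and east-west distances up to an integer multiple of the mesh width, then tile the rectangle with square meshes) can be sketched as follows. This is a minimal sketch; the function name and the (row, column) return format are assumptions.

```python
import math

def divide_into_meshes(north_south_m: float, east_west_m: float,
                       mesh_width_m: float = 100.0):
    """Divide the rectangular analysis range into square meshes (step S202).

    The north-south and east-west distances are rounded up to an integer
    multiple of the mesh width, as described in the text, so the whole
    range is covered by complete meshes. Returns (row, column) index pairs.
    """
    rows = math.ceil(north_south_m / mesh_width_m)
    cols = math.ceil(east_west_m / mesh_width_m)
    return [(r, c) for r in range(rows) for c in range(cols)]

# 250 m rounds up to 3 rows and 420 m rounds up to 5 columns: 15 meshes.
meshes = divide_into_meshes(250.0, 420.0, 100.0)
```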
  • In step S203, the land feature extraction unit 111 checks whether the processing of steps S204 to S206 has been performed on all meshes generated in step S202. If all meshes have been processed (YES in step S203), the high-resolution data analysis process ends. If unprocessed meshes remain (NO in step S203), the process proceeds to step S204. On the first pass, no mesh has yet been processed, so the process proceeds to step S204 via NO in step S203.
  • In step S204, the land feature extraction unit 111 selects one of the unprocessed meshes.
  • the land feature extraction unit 111 activates the classification unit 112 to execute feature classification processing while selecting one by one the analysis targets specified in the analysis target 1806 of the analysis instruction information 18 for the mesh selected in step S204 (step S205). That is, in step S205, the classification unit 112 repeatedly executes feature classification processing while changing the selection of the analysis target specified in the analysis target 1806 for the mesh selected in step S204. Details of the feature classification processing will be described later with reference to FIG. 5.
  • the land feature extraction unit 111 receives the results of the feature classification process executed in step S205 from the classification unit 112 and writes them into the feature information 19 (step S206). After the process in step S206, the process returns to step S203.
  • FIG. 5 is a flowchart showing an example of the processing procedure for feature classification processing.
  • the feature classification processing shown in FIG. 5 is executed when the classification unit 112 is started in step S205 of FIG. 4.
  • the classification unit 112 obtains the mesh to be processed and the selected analysis target 1806 from the land feature extraction unit 111 (step S301).
  • the classification unit 112 calculates the range of the mesh obtained in step S301, reads the satellite image and map information including that range from the disk 2 (or from outside via the network 3), and stores them in the memory 17 as the satellite image 114 and map information 115 (step S302).
  • Satellite image 114 is high-resolution data taken from an artificial satellite of the area for which the living environment is to be evaluated, and may be an aerial image or the like as long as the image has a high resolution.
  • the satellite image or aerial image that is the source of satellite image 114 is not limited to images taken with visible light, and may be imaging data using other wavelengths (e.g., infrared light).
  • By using high-resolution data such as satellite images, it is possible to identify real-world features that are difficult to recognize or infer from a map (map information 115), such as street trees and green spaces within factories.
  • Map information 115 is general map data that indicates feature information for coordinates, and is used to prevent erroneous detections due to feature classification using satellite imagery 114.
  • The map information on which map information 115 is based is not limited to a specific type of map, and may be any data that indicates a "use" for any coordinate within a specified range. Specifically, this "use" may be, for example, a house, commercial facility, road, or factory, and preferably corresponds to the content of the high possibility area 1135 of the color identification information 113 shown in FIG. 6, described below.
  • the classification unit 112 initializes a pixel counter to 0 (step S303).
  • the pixel counter is a counter used to accumulate the feature amount for each pixel.
  • the classification unit 112 checks whether the processing of steps S305 to S307 has been performed for all pixels included in the mesh acquired in step S301 (step S304). If all pixels have been processed (YES in step S304), the process proceeds to step S308, which will be described later. If unprocessed pixels remain (NO in step S304), the process proceeds to step S305.
  • In step S305, the classification unit 112 selects the color identification information 113 corresponding to the analysis target acquired in step S301. Specifically, from among the color identification information prepared for the various analysis targets and stored in the memory 17, the classification unit 112 selects the entry whose analysis target 1131 (see FIG. 6) matches the analysis target acquired in step S301. Note that the color identification information for the various analysis targets may be stored not in the memory 17 but externally, on the disk 2 or via the network 3. In this case, the classification unit 112 reads the color identification information corresponding to the analysis target acquired in step S301 from the disk 2 or via the network 3 and stores it in the memory 17 as the color identification information 113.
  • FIG. 6 is a diagram showing an example of color identification information 113.
  • Color identification information 113 is data for identifying an analysis target by color, and is used to determine whether a certain point (pixel) can be the analysis target.
  • the color identification information 113 shown in FIG. 6 has the items of analysis target 1131, red range 1132, green range 1133, blue range 1134, and high possibility area 1135.
  • Analysis target 1131 indicates the name of the analysis target that can be determined. Specifically, in the case of FIG. 6, analysis target 1131 is "concrete," which corresponds to the "concrete" included in the analysis target 1806 of the analysis instruction information 18 in FIG. 3.
  • The red range 1132, green range 1133, and blue range 1134 are identification conditions that define the characteristics of the analysis target 1131, indicating, as RGB values, the range of colors that the analysis target 1131 can take. Specifically, in the case of FIG. 6, if the R, G, and B values of the color of a certain point (pixel) are all between "110 and 150," it is determined that the point is likely to be "concrete."
  • High possibility area 1135 indicates the use estimated for a point (pixel) that has the characteristics of the analysis target 1131. Specifically, in the case of FIG. 6, if the color of a certain point (pixel) is determined to be within the above RGB range, and the point is therefore likely to be concrete, the point is estimated to have a high possibility of being used as one of the following: "residence," "commercial facility," "road," or "factory."
  • Color identification information 113 is set in advance for various analysis targets, including each of the analysis targets 1806 in the analysis instruction information 18.
  • The set information may be changed, mainly by an administrator, during operation of the living environment evaluation system 1.
  • Although the color identification information 113 shown in FIG. 6 performs the analysis using the RGB values of each pixel's color, the color analysis method is not limited to this, and various known analysis methods can be used, such as converting each pixel's color to an HSV color space before making the judgment.
  • Although this embodiment uses color identification information 113 that analyzes by "color," this is just one example; any identification information that analyzes by "wavelength" can be used as identification information for determining whether a certain point (pixel) corresponds to the analysis target in this invention. Therefore, the point (pixel) information to be compared with such identification information (satellite image 114) is not limited to an image using visible light, and may be, for example, an infrared image.
  • Next, the classification unit 112 determines whether the color of each pixel in the area image corresponding to the mesh acquired in step S301 within the satellite image 114 acquired in step S302 satisfies the identification condition for the analysis target based on the color identification information 113 acquired in step S305, and increments the pixel counter according to the determination result (step S306).
  • Specifically, the classification unit 112 determines that the color of a pixel extracted from the satellite image 114 satisfies the identification condition of the analysis target when the color falls within all of the red range 1132, green range 1133, and blue range 1134 of the color identification information 113. For each pixel determined to satisfy the identification condition, the classification unit 112 adds "0.7" to the pixel counter. For pixels determined not to satisfy the identification condition, nothing is added to the pixel counter.
  • Next, the classification unit 112 determines whether the use at the pixel's position shown in the map information 115 acquired in step S302 is included in the uses indicated in the color identification information 113, and further increments the pixel counter according to the determination result (step S307).
  • Specifically, in step S307, for each pixel whose pixel counter was incremented in step S306, the classification unit 112 extracts the use at the pixel's position from the map information 115 and determines whether it matches any of the uses indicated in the high possibility area 1135 of the color identification information 113. If the use matches, the classification unit 112 further increments the pixel counter by "0.3"; if it does not match, no additional increment is made.
  • After processing in step S307, the process returns to step S304.
  • When all pixels have been processed (YES in step S304), the classification unit 112 divides the final value of the pixel counter by the total number of pixels contained in the mesh, returns the calculated value as the result of the feature classification process to step S206 of the land feature extraction unit 111 (step S308), and the feature classification process ends.
  • In the feature classification process described above, the classification unit 112 uses the satellite image 114, which is high-resolution information, to compare the color of each pixel of the satellite image 114 included in the mesh against the identification conditions of the color identification information 113, thereby accurately quantifying the extent to which the mesh has the characteristics of the specified analysis target. Furthermore, considering that a color match alone may lead to an erroneous determination, the feature classification process also compares the "use" taken from the high-resolution map information 115 and reflects it in the analysis value, enabling the mesh to be analyzed with higher accuracy.
  • In this embodiment, the pixel counter addition value in step S306 is set to "0.7" and that in step S307 to "0.3", so that "1.0" is added per pixel when both apply; however, the ratio of the addition values may be set arbitrarily depending on the relative importance placed on the color determination and the use determination.
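The per-pixel scoring of steps S306 to S308 can be sketched as follows. This is a minimal sketch: the 0.7/0.3 addition values and the final normalization follow the description above, while the function name, data layout, and sample values are all hypothetical.

```python
# Sketch of the feature classification process (steps S306-S308): a pixel
# scores 0.7 if its RGB color falls within all three ranges of the color
# identification information, plus 0.3 if the land use at its position is a
# "high possibility" use; the mesh value is the pixel-count-normalized sum.

def classify_mesh(pixels, color_id, use_at):
    """pixels: list of (r, g, b, position); color_id: dict with 'red',
    'green', 'blue' (lo, hi) ranges and a 'high_possibility_uses' set;
    use_at: maps a position to its land use in the map information."""
    counter = 0.0
    for r, g, b, pos in pixels:
        color_match = (color_id['red'][0] <= r <= color_id['red'][1]
                       and color_id['green'][0] <= g <= color_id['green'][1]
                       and color_id['blue'][0] <= b <= color_id['blue'][1])
        if color_match:
            counter += 0.7                                  # step S306
            if use_at(pos) in color_id['high_possibility_uses']:
                counter += 0.3                              # step S307
    return counter / len(pixels)                            # step S308

# A 4-pixel mesh: two pixels match a "greening" color, one is also in a park.
color_id = {'red': (0, 100), 'green': (100, 255), 'blue': (0, 100),
            'high_possibility_uses': {'park', 'forest'}}
uses = {(0, 0): 'park', (0, 1): 'road', (1, 0): 'road', (1, 1): 'road'}
pixels = [(50, 200, 50, (0, 0)), (50, 180, 60, (0, 1)),
          (200, 30, 30, (1, 0)), (220, 220, 220, (1, 1))]
print(round(classify_mesh(pixels, color_id, uses.get), 4))  # 0.425
```

Because a pixel can contribute at most 1.0 with these weights, the returned value stays within 0 to 1 and expresses what fraction of the mesh exhibits the target feature.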
  • FIG. 7 is a diagram showing an example of feature information 19.
  • Feature information 19 is data that holds the overall analysis results, more specifically, the final analysis results by the high-resolution data analysis unit 11 and the low-resolution data analysis unit 12.
  • Feature information 19 records the analysis results of the analysis target or index for each mesh into which the analysis range is divided.
  • The feature information 19 shown in FIG. 7 has items of vertical position 1901, horizontal position 1902, first analysis item 1903, and second analysis item 1904.
  • Vertical position 1901 indicates the vertical position of the mesh, and horizontal position 1902 indicates the horizontal position of the mesh.
  • The combination of vertical position 1901 and horizontal position 1902 identifies the position of the mesh within the analysis range.
  • The first analysis item 1903 indicates, as the analysis result of the high-resolution data analysis process, the feature amount for each analysis target in the analysis target 1806 of the analysis instruction information 18.
  • "Concrete," "greening," and "water" are exemplified as the first analysis item 1903; they correspond to the analysis targets indicated in the analysis target 1806 of the analysis instruction information 18 in FIG. 3.
  • In step S206 of FIG. 4 described above, the result of the feature classification process by the classification unit 112 is written into one of the cells of the first analysis item 1903.
  • The second analysis item 1904 indicates, as the analysis result of the low-resolution data analysis process, the feature amount for each index in the regional information 122.
  • "Quiet environment" and "convenient shopping area" are shown as examples of the second analysis item 1904; they correspond to the indexes shown in the index 1221 of the regional information 122 in FIG. 10, which will be described later.
  • In step S414 of FIG. 9, described later, the analysis result is written into one of the cells of the second analysis item 1904.
  • Each of the first analysis item 1903 and the second analysis item 1904 is merely an example, and items with arbitrary names may be configured according to the analysis target 1806 of the analysis instruction information 18 and the index 1221 of the regional information 122.
  • FIG. 8 and FIG. 9 are flowcharts (part 1 and part 2) showing an example of the processing procedure of the low-resolution data analysis process.
  • The low-resolution data analysis process shown in FIG. 8 and FIG. 9 corresponds to the processing of step S102 in FIG. 2, and is executed when the data mapping unit 121 is started by the low-resolution data analysis unit 12.
  • First, the data mapping unit 121 acquires the analysis instruction information 18 generated based on the user's operations (step S401), and divides the analysis range specified by the analysis instruction information 18 into meshes (step S402).
  • In step S403, the data mapping unit 121 checks whether the processing from step S404 onwards has been performed on all meshes generated in step S402. If all meshes have been processed (YES in step S403), the low-resolution data analysis process ends. If unprocessed meshes remain (NO in step S403), the process proceeds to step S404. On the first pass, the processing from step S404 onwards has not yet been performed, so the process proceeds to step S404 via NO in step S403.
  • In step S404, the data mapping unit 121 selects one of the unprocessed meshes.
  • In step S405, the data mapping unit 121 checks whether the processing from step S406 onwards has been performed for the information relating to all indicators included in the regional information 122. If the information relating to all indicators in the regional information 122 has been processed (YES in step S405), the process returns to step S403. If unprocessed indicators remain (NO in step S405), the process proceeds to step S406.
  • FIG. 10 is a diagram showing an example of regional information 122.
  • Regional information 122 is low-resolution data in units of relatively wide-area "regions" such as administrative districts, and holds data values for each region for a given indicator.
  • Regional information 122 is registered in the system in advance.
  • The regional information 122 includes, as an example, information on the indicators "quiet environment" and "convenient shopping district."
  • The information on each indicator has the items of indicator 1221, target 1222, data name 1223, and data value 1224.
  • Indicator 1221 indicates the name of the indicator.
  • Target 1222 indicates the conditions of the target to which the indicator indicated by indicator 1221 (hereinafter, the indicator) is applied.
  • Data name 1223 indicates the specific area name of the area where an index value (data value 1224) has been obtained for the indicator.
  • Data value 1224 indicates the index value of the indicator in the area indicated by data name 1223. In this example, the larger the value of data value 1224, the stronger the characteristics of the indicator.
  • In step S406, the data mapping unit 121 selects one piece of regional information related to an unprocessed indicator from the regional information 122. Specifically, for example, it selects the data related to "quiet environment," one of the indicators in the regional information 122 shown in FIG. 10.
  • In step S407, the data mapping unit 121 checks whether the processing from step S408 onwards has been performed on all columns in the regional information data selected in step S406.
  • Here, "columns" refers to the columns under data name 1223 (equivalently, the columns of data values 1224), such as "Suginami-ku," "Setagaya-ku," and "Nakano-ku" in the "quiet environment" data of the regional information 122 in FIG. 10. If all columns have been processed (YES in step S407), the process returns to step S405. If unprocessed columns remain (NO in step S407), the process proceeds to step S408 in FIG. 9.
  • In step S408, the data mapping unit 121 selects one unprocessed column from the data of the regional information selected in step S406. Using the regional information 122 in FIG. 10 as a specific example, the column "Suginami-ku" is selected.
  • Next, the data mapping unit 121 identifies the geographical range covered by the column selected in step S408, based on the column's data name 1223 and target 1222 (step S409).
  • In the above example, the target range is identified as the "entire area" of "Suginami-ku" (or simply "Suginami-ku").
  • In step S410, the data mapping unit 121 checks whether the majority of the mesh selected in step S404 falls within the range identified in step S409. In the above specific example, it checks whether the majority of the mesh lies within "Suginami-ku." If the majority of the mesh is within the identified range (YES in step S410), the mesh is considered to belong to that range and the process proceeds to step S411. If not (NO in step S410), the process returns to step S407 in FIG. 8.
  • The condition in step S410 is "the majority of the mesh," but any method that can determine the range corresponding to the mesh will do, and other conditions may also be used, such as "the range identified in step S409 occupies the largest part of the mesh selected in step S404."
  • In step S411, the data mapping unit 121 determines the data value 1224 of the column selected in step S408 as the value of the analysis result of the low-resolution data analysis for the indicator (the one selected in step S406) in that mesh.
  • In the above example, the data value "0.72" corresponding to "Suginami-ku" is determined as the value of the analysis result.
  • Next, the data mapping unit 121 stores the value of the analysis result determined in step S411 in the corresponding cell of the feature information 19.
  • Specifically, the data mapping unit 121 first refers to the feature information 19 and selects the column that corresponds to the mesh (step S412).
  • Next, the data mapping unit 121 selects, from the column selected in step S412, the cell that corresponds to the index 1221 of the regional information selected in step S406 (step S413).
  • Then, the data mapping unit 121 enters the value determined in step S411 into the cell of the feature information 19 selected in step S413 (step S414).
  • After processing in step S414, the process returns to step S407 in FIG. 8. If there are unprocessed columns in the regional information for the index selected in step S406 (NO in step S407), the processing from step S408 onwards is repeated for the remaining unprocessed columns.
  • In this way, the low-resolution data (regional information 122) is assigned to the fine meshes used in the high-resolution data analysis, and ultimately, for all meshes in the analysis range, the extent to which they possess the characteristics of each index held in the regional information 122 is quantified and recorded in the feature information 19.
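The assignment of region-level values to meshes (steps S404 to S414) can be sketched as follows. The majority test of step S410 is approximated here by counting sample points per region inside a mesh; all names, data, and the sampling scheme are hypothetical.

```python
# Sketch of the low-resolution data mapping: each mesh receives, for every
# indicator, the data value of the region that contains the majority of it.

from collections import Counter

def map_regional_data(meshes, region_of, regional_data):
    """meshes: {mesh id: list of sample points}; region_of: maps a point to
    a region name; regional_data: {indicator: {region name: data value}}.
    Returns {mesh id: {indicator: value}}, i.e. the low-resolution part of
    the feature information."""
    features = {}
    for mesh_id, points in meshes.items():
        # Step S410: which region covers the majority of the mesh?
        region, count = Counter(region_of(p) for p in points).most_common(1)[0]
        if count * 2 <= len(points):
            continue  # no region holds a majority; mesh left unassigned
        # Step S411: adopt that region's data value for each indicator.
        features[mesh_id] = {ind: values[region]
                             for ind, values in regional_data.items()
                             if region in values}
    return features

regional_data = {'quiet environment': {'Suginami-ku': 0.72,
                                       'Setagaya-ku': 0.65}}
meshes = {'m1': [(0, 0), (0, 1), (1, 0)], 'm2': [(5, 5), (5, 6), (6, 5)]}
region_of = lambda p: 'Suginami-ku' if p[0] < 3 else 'Setagaya-ku'
print(map_regional_data(meshes, region_of, regional_data))
# {'m1': {'quiet environment': 0.72}, 'm2': {'quiet environment': 0.65}}
```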
  • FIG. 11 is a flowchart showing an example of the procedure of the index calculation process.
  • The index calculation process shown in FIG. 11 corresponds to the process of step S103 in FIG. 2, and is executed when the data integration evaluation unit 131 is started by the index calculation unit 13.
  • First, the data integration evaluation unit 131 acquires the user preference information 132 generated based on the user's operations (step S501).
  • The user preference information 132 is automatically generated according to the contents of the user's input operations when the user enters his or her preferences (for example, selects the importance level for each item) in the questionnaire input area 301 of the living environment evaluation screen 300 and then presses the evaluation execution button 304.
  • FIG. 12 is a diagram showing an example of user preference information 132.
  • The user preference information 132 shown in FIG. 12 has items of living environment item 1321 and importance 1322.
  • The living environment item 1321 indicates an item of the living environment that may affect the index, and is set in advance by the service provider, the system administrator, or the like.
  • The selection items in the questionnaire input area 301 correspond to the living environment items 1321 of the user preference information 132.
  • The importance 1322 indicates the degree of preference (preference coefficient) selected by the user for the item indicated by the living environment item 1321. In this example, the importance 1322 is selected from five levels, 1 to 5, and the selected value is set as the value of the preference coefficient.
  • Next, the data integration evaluation unit 131 refers to the analysis instruction information 18 and divides the analysis range specified in the analysis instruction information 18 into meshes (step S502).
  • The analysis range specified in the analysis instruction information 18 corresponds to the area displayed in the map display area 302 of the living environment evaluation screen 300 shown in FIG. 16.
  • In step S503, the data integration evaluation unit 131 checks whether the processing from step S504 onwards has been performed on all meshes generated in step S502. If all meshes have been processed (YES in step S503), the index calculation process ends. If unprocessed meshes remain (NO in step S503), the process proceeds to step S504.
  • In step S504, the data integration evaluation unit 131 selects one of the unprocessed meshes and obtains its feature amounts from the feature information 19.
  • In step S505, the data integration evaluation unit 131 checks whether the associations for all indices included in the association information 133 have been processed for the mesh selected in step S504. If all indices have been processed (YES in step S505), the process returns to step S503. If unprocessed indices remain (NO in step S505), the process proceeds to step S506.
  • The association information 133 is data that holds association (weighting) coefficients used when calculating the feature amount of a given living environment item from the feature amounts held in the feature information 19.
  • The association information 133 shown in FIG. 13 has items of living environment item 1331 and analysis item 1332 for each index.
  • The living environment item 1331 corresponds to the living environment item 1321 of the user preference information 132 shown in FIG. 12.
  • The analysis item 1332 corresponds to the first analysis item 1903 and the second analysis item 1904 of the feature information 19 shown in FIG. 7.
  • The association information 133 holds an association (weighting) coefficient in each cell for the combination of the living environment item 1331 and the analysis item 1332.
  • In step S506, the data integration evaluation unit 131 selects one piece of association information related to an unprocessed index from the association information 133, and internally creates a table in which the importance 1322 of the user preference information 132 serves as the column headings and the feature amounts obtained from the feature information 19 in step S504 serve as the row headings.
  • Next, the data integration evaluation unit 131 calculates each cell of the table created in step S506 by multiplying the product of the corresponding header values by the coefficient at the same location in the association information (step S507).
  • Then, the data integration evaluation unit 131 writes the sum of the cell values calculated in step S507 into the index value information 20 as the index calculation result (index value) of that index for the mesh (step S508).
  • In this way, the feature values of the living environment items that affect the index are calculated by weighting the feature values of each analysis item with the coefficients of the association information 133; the user's degree of preference (the importance in the user preference information 132) is reflected in this calculation; and the resulting feature values of the living environment items are summed and stored in the index value information 20 as the index value of the index.
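The weighted-sum calculation of steps S506 to S508 can be sketched as follows. The formula follows the description above, but the concrete coefficients and values are illustrative only, since the text gives no numerical association coefficients.

```python
# Sketch of the index calculation (steps S506-S508): the index value of a
# mesh is the sum over all (living environment item, analysis item) pairs of
#   importance[living env item] * feature[analysis item] * association coeff.

def index_value(importance, features, association):
    """importance: {living environment item: preference coefficient (1-5)},
    features: {analysis item: feature amount from the feature information 19},
    association: {living environment item: {analysis item: coefficient}}."""
    return sum(importance[item] * features[analysis] * coeff
               for item, row in association.items()
               for analysis, coeff in row.items())

# Illustrative values only; the real coefficients live in the association
# information 133 and are not given numerically in the text.
importance = {'greenery': 5, 'quietness': 3}
features = {'greening': 0.5, 'quiet environment': 0.25}
association = {'greenery': {'greening': 1.0},
               'quietness': {'quiet environment': 0.5}}
print(index_value(importance, features, association))
# 5*0.5*1.0 + 3*0.25*0.5 = 2.5 + 0.375 = 2.875
```

The importance values act as user-controlled weights on the living environment items, while the association coefficients encode how strongly each analysis item contributes to each item.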
  • FIG. 14 is a diagram showing an example of index value information 20.
  • Index value information 20 is data that holds index values of each index related to the living environment for each mesh, and in the case of FIG. 14, is composed of items of vertical position 2001, horizontal position 2002, and index item 2003.
  • Vertical position 2001 and horizontal position 2002 indicate the position of the mesh, and index item 2003 indicates the index value for each indicator related to the living environment.
  • After step S508, the process returns to step S505 and steps S506 to S508 are repeated until no unprocessed indicators remain.
  • In this way, index values are calculated for each indicator related to the living environment and recorded in the index value information 20.
  • FIG. 15 is a flowchart showing an example of the processing procedure of the visualization process.
  • The visualization process shown in FIG. 15 corresponds to the processing of step S104 in FIG. 2, and is executed when the map display unit 141 is started by the visualization unit 14.
  • FIG. 16 is a diagram showing an example of a living environment evaluation screen 300. This is a display screen provided to the user by the living environment evaluation system 1, and in this explanation, it is continuously displayed from when the user inputs preferences before the evaluation of the living environment begins, to when the results are displayed after the evaluation of the living environment is completed.
  • The living environment evaluation screen 300 has a display configuration of a questionnaire input area 301, a map display area 302, an index selection box 303, and an evaluation execution button 304.
  • The questionnaire input area 301 is an area that displays a questionnaire that the user is asked to answer before starting the evaluation.
  • The questionnaire input area 301 displays multiple living environment items that affect the index to be evaluated, and the importance of each living environment item is selected according to the user's preferences.
  • User preference information 132 is generated according to the contents of the input to this questionnaire input area 301.
  • The map display area 302 is the area where the map is displayed.
  • The map area displayed in the map display area 302 by the user's operation before the evaluation begins becomes the analysis range of the living environment evaluation.
  • After the evaluation, the map area displayed in the map display area 302 is divided into meshes, and each mesh is displayed based on the index values of the index value information 20 as the output of the evaluation result for the index selected in the index selection box 303.
  • The index selection box 303 is a checkbox that allows the user to select an index for evaluating the living environment.
  • The evaluation execution button 304 is a button for instructing the living environment evaluation system 1 to execute a living environment evaluation. When the user checks the desired indexes among those displayed in the index selection box 303 ("nature" and "living convenience" in the case of FIG. 16) and presses the evaluation execution button 304, a living environment evaluation for those indexes is executed over the analysis range displayed in the map display area 302.
  • First, the map display unit 141 displays a map of the analysis range in the map display area 302 of the living environment evaluation screen 300 based on the analysis instruction information 18 (step S601).
  • On the living environment evaluation screen 300 described with reference to FIG. 16, the analysis range has already been selected by the user before the evaluation begins, so it does not need to be displayed again in step S601.
  • Next, the map display unit 141 acquires the index selected by the user in the index selection box 303 of the living environment evaluation screen 300 (step S602).
  • Next, the map display unit 141 divides the map displayed in step S601 into meshes according to the mesh width 1805 of the analysis instruction information 18 (step S603).
  • In step S604, the map display unit 141 checks whether the processing from step S605 onwards has been performed on all meshes generated in step S603. If all meshes have been processed (YES in step S604), the visualization process ends. If unprocessed meshes remain (NO in step S604), the process proceeds to step S605.
  • In step S605, the map display unit 141 selects one of the unprocessed meshes.
  • Next, the map display unit 141 refers to the index value information 20 and obtains the index value corresponding to the combination of the index acquired in step S602 and the mesh selected in step S605 (step S606).
  • Then, the map display unit 141 depicts the index value acquired in step S606 on the corresponding mesh of the map display area 302 in a semi-transparent color whose density corresponds to the index value (step S607).
  • In this example, a mesh of the map display area 302 is depicted in a color that becomes denser as the index value increases.
  • After step S607, the process returns to step S604, and the map display unit 141 repeats the processing for each mesh; when the rendering is completed for all meshes (YES in step S604), the visualization process ends.
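The rendering rule of step S607 (a semi-transparent color whose density grows with the index value) can be sketched as follows, assuming index values normalized to the range 0 to 1; the base color and the alpha scale are arbitrary choices, not values from the description.

```python
# Sketch of step S607: turn a mesh's index value (assumed normalized to the
# range 0..1) into a semi-transparent RGBA fill whose opacity (density)
# grows with the index value, so the underlying map remains visible.

def mesh_color(index_value, base_rgb=(0, 128, 0), max_alpha=200):
    """Return an (R, G, B, A) tuple for drawing one mesh."""
    v = min(max(index_value, 0.0), 1.0)  # clamp to the assumed 0..1 range
    return (*base_rgb, round(v * max_alpha))

print(mesh_color(0.72))  # (0, 128, 0, 144) - high index, dense overlay
print(mesh_color(0.10))  # (0, 128, 0, 20)  - low index, faint overlay
```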
  • The map on which the evaluation results are superimposed in this way in the map display area 302 of the living environment evaluation screen 300 can show the degree of the living environment index in a geographically continuous form, because the meshes are divided without regard to the boundaries of administrative division units.
  • The map on which the evaluation results are superimposed is not limited to a general map showing geographic information. For example, when a hazard map is used as an example of a map specialized for a specific purpose, the risk information shown by the hazard map and the evaluation results of the living environment can be presented simultaneously.
  • As described above, the living environment evaluation system 1 according to this embodiment can analyze the features of each mesh area, into which the analysis range specified by the user is divided, based on the low-resolution data (regional information 122) and the high-resolution data (satellite image 114), and can calculate and display the index values of the living environment indexes specified by the user based on those features while reflecting the user's preferences.
  • Furthermore, when analyzing the features of each mesh area, the living environment evaluation system 1 according to this embodiment can prevent erroneous detection by also using the map information 115, which is high-resolution data.
  • In this way, the living environment evaluation system 1 uses high-resolution data such as the satellite image 114, making it possible to evaluate the living environment in more detail and at a higher resolution than conventional technology while reflecting individual preferences. By using such evaluation results, it is possible to obtain information that cannot be known from simple map information alone, such as the amount of greenery and the possibility of noise from roads and factories, and to estimate the living environment quantitatively.
  • Second embodiment: FIG. 17 is a block diagram showing an example of the configuration of a living environment evaluation system 10 according to a second embodiment of the present invention.
  • The living environment evaluation system 10 according to the second embodiment has a configuration in which a correction unit 21 and an analysis unit 22 are added to the living environment evaluation system 1 according to the first embodiment shown in FIG. 1, and a description of the configuration and processing common to the living environment evaluation system 1 will be omitted.
  • The correction unit 21 has a function of updating the association information 133 based on intuitive indications from the user.
  • The correction unit 21 has an association information update unit 211 as a functional unit that executes a program.
  • When the living environment evaluation system 10 displays the evaluation results for a certain index, the user may feel that the results deviate from his or her own sense.
  • Therefore, the living environment evaluation system 10 is configured to allow the user to specify, through a predetermined user interface (UI), the index that deviates from his or her sense and the nature of the deviation.
  • As such a UI, for example, an area in which an index can be selected and an area in which information indicating the nature of the deviation for the selected index (e.g., too high, too low) can be entered may be added to the living environment evaluation screen 300; specific illustrations are omitted.
  • When such a user indication is made, the correction unit 21 starts the association information update unit 211 and executes the association information update process.
  • The details of the association information update process will be described later with reference to FIG. 18.
  • The analysis unit 22 has a function of displaying past living environment evaluation results in response to a request from the user.
  • The analysis unit 22 has a past information display unit 221 as a functional unit that executes a program.
  • When the user requests the display of past information, the analysis unit 22 starts the past information display unit 221 and executes the past information display process. Details of the past information display process will be described later with reference to FIG. 19.
  • FIG. 18 is a flowchart showing an example of the procedure of the association information update process, which is executed when the association information update unit 211 is started.
  • First, the association information update unit 211 acquires the index input by the user as deviating from the user's sense, together with information indicating the nature of the deviation (too high or too low) (step S701).
  • Next, the association information update unit 211 refers to the association information 133 and selects the index acquired in step S701 (step S702).
  • In step S703, the association information update unit 211 checks whether the deviation acquired in step S701 is "too high" relative to the user's own sense. If the deviation is "too high" (YES in step S703), the process proceeds to step S704; if the deviation is "too low" (NO in step S703), the process proceeds to step S705.
  • In step S704, the association information update unit 211 decreases all of the coefficients of the index selected in step S702 by a predetermined amount (5% in this example).
  • In step S705, the association information update unit 211 increases all of the coefficients of the index selected in step S702 by a predetermined amount (5% in this example).
  • Then, the association information update unit 211 starts the data integration evaluation unit 131 and the map display unit 141, and executes the index calculation process shown in FIG. 11 and the visualization process shown in FIG. 15 again (step S706).
  • As a result, the index value of the indicated index is recalculated using the association information 133 updated to better approximate the user's intuition (sense), and the evaluation result of that index is displayed based on the recalculated index value.
  • Because the redisplayed evaluation result is closer to the user's intuition, the living environment evaluation system 10 according to the second embodiment can increase the user's satisfaction.
  • In the above example, the association information update unit 211 changes the association coefficients of the index by a predetermined amount (for example, 5%) according to the deviation input by the user (too high or too low), but various variations in the amount of change can be adopted.
  • For example, the user may input not only the direction of the deviation (too high or too low) but also its degree, and the amount of change in the association coefficients may be varied according to that degree.
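The coefficient adjustment of steps S703 to S705 can be sketched as follows. The 5% rate follows the example above, while the data layout and all names are hypothetical simplifications.

```python
# Sketch of the association information update (steps S703-S705): when the
# user reports that an index's evaluation feels "too high," every association
# coefficient of that index is decreased by a fixed rate (5% here), and
# increased by the same rate when it feels "too low."

def update_association(association, index_item, deviation, rate=0.05):
    """association: {index: {analysis item: coefficient}} (hypothetical
    layout); deviation: 'too high' or 'too low'."""
    factor = (1 - rate) if deviation == 'too high' else (1 + rate)
    for analysis_item in association[index_item]:
        association[index_item][analysis_item] *= factor

assoc = {'nature': {'greening': 1.0, 'water': 0.6}}
update_association(assoc, 'nature', 'too high')
print({k: round(v, 4) for k, v in assoc['nature'].items()})
# {'greening': 0.95, 'water': 0.57}
```

After the update, the index calculation and visualization are simply re-run with the adjusted coefficients, as in step S706.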
  • FIG. 19 is a flowchart showing an example of the procedure of the past information display process, which is executed when the past information display unit 221 is started.
  • First, the past information display unit 221 obtains the past date input by the user in the operation requesting the display of past information (step S801).
  • Next, the past information display unit 221 updates the satellite image 114, the map information 115, and the regional information 122 to the data closest to the date obtained in step S801 (step S802). To enable updating to past data in step S802, it is advisable to timestamp the data on which the satellite image 114 is based, the data on which the map information 115 is based, and the regional information 122, and to store the past data for a predetermined period on the disk 2 or the like.
  • Then, the past information display unit 221 re-executes the living environment evaluation process in the processing order shown in FIG. 2 using the data updated in step S802 (step S803).
  • As a result, the living environment evaluation results for the time closest to the past date specified by the user are displayed on the living environment evaluation screen 300, allowing the user to check chronological changes in the living environment evaluation of the same analysis range (differences between day and night, differences from 10 years ago, and so on).
  • At this time, the current living environment evaluation results and the past living environment evaluation results may be displayed side by side.
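The snapshot selection suggested for step S802 (choosing, for each timestamped data source, the stored data closest to the requested past date) can be sketched as follows; the timestamps and dataset labels are hypothetical.

```python
# Sketch of step S802: pick, for a timestamped data source, the stored
# snapshot whose date is closest to the past date requested by the user.

from datetime import date

def nearest_snapshot(snapshots, target):
    """snapshots: {date: dataset}; target: the requested past date."""
    return snapshots[min(snapshots, key=lambda d: abs(d - target))]

satellite_snapshots = {date(2014, 6, 1): 'satellite-2014',
                       date(2019, 6, 1): 'satellite-2019',
                       date(2024, 6, 1): 'satellite-2024'}
print(nearest_snapshot(satellite_snapshots, date(2013, 9, 1)))
# satellite-2014
```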
  • Otherwise, the living environment evaluation system 10 according to the second embodiment has the same configuration and functions as the living environment evaluation system 1 according to the first embodiment, and can therefore achieve the same effects as the first embodiment.


Abstract

A living environment evaluation system 1 for evaluating a living environment with respect to an indicator specified by a user comprises: a high-resolution data analysis unit 11 which, by using user preference information 132 that holds preference coefficients indicating the user's preference for one or more living environment items that may affect the indicator, analysis instruction information 18 that indicates an analysis range specified by the user, and images that correspond to the analysis range, performs wavelength-based analysis and thereby analyzes characteristics of each of a plurality of regions (meshes) obtained by dividing the analysis range; an indicator calculation unit 13 which calculates an indicator value for the indicator for each region of the analysis range on the basis of the analysis results of the characteristics of the analysis range and the user preference information 132; and a visualization unit 14 which displays and overlays, on a map indicating each region of the analysis range, evaluation results based on the indicator values calculated by the indicator calculation unit 13.

Description

Living environment evaluation system and living environment evaluation method

 The present invention relates to a living environment evaluation system and a living environment evaluation method, and is suitably applied to a living environment evaluation system and method that evaluate a living environment and visually output the evaluation result.

 There has conventionally been demand for technology that quantitatively evaluates living environments. By referring to a quantitative evaluation of a living environment based on an individual's values, for example, a customer searching for a house can more easily narrow down areas that match the desired living environment, and a real estate agent can shorten the time required to conclude a rental or purchase contract. Furthermore, a local government can grasp the current state of an area from a quantitative evaluation of the living environment based on a given perspective, which is useful in planning facilities and the like.

 For example, Patent Document 1 discloses a real estate information output device that outputs real estate information based on an individual's values. Patent Document 2 discloses a living environment evaluation device that evaluates a living environment based on an individual's values.
Patent Document 1: JP 2019-145100 A
Patent Document 2: JP 2017-91422 A
 However, the technology disclosed in Patent Document 1 evaluates only individual real estate properties (houses); it neither evaluates the livability of an area of a given extent nor makes use of such an evaluation result.

 On the other hand, the technology disclosed in Patent Document 2 evaluates the living environment using a model that analyzes open data and calculates indices related to the living environment of a "region"; however, since open data is used, the resolution of the range referred to as a "region" is presumed to be relatively low. In general, the smallest unit that can be analyzed using open data is an administrative district such as a city, ward, town, or village.

 However, an actual living environment cannot be expressed by partitioning it into administrative districts; rather, it is considered to exhibit continuous properties or features near the boundaries of administrative districts. In other words, expressing a more realistic index of the living environment requires analysis at a resolution higher than administrative district units such as cities, wards, towns, and villages. With the technology disclosed in Patent Document 2, however, the resolution of the evaluable range is low, and there is a risk that a living environment evaluation that fully meets users' needs cannot be realized.

 The present invention has been made in consideration of the above points, and aims to propose a living environment evaluation system and a living environment evaluation method that can realize, and visually provide, an evaluation of the living environment reflecting individual preferences at a higher resolution than conventional technology.

 To solve this problem, the present invention provides a living environment evaluation system that evaluates a living environment with respect to an index specified by a user, the system comprising: user preference information that holds preference coefficients indicating the user's preferences for one or more living environment items that may affect the index; analysis instruction information that indicates an analysis range specified by the user as the range over which the living environment is to be evaluated; a high-resolution data analysis unit that performs wavelength-based analysis using images corresponding to the analysis range, thereby analyzing the characteristics of the analysis range for each of a plurality of regions into which the analysis range is divided; an index calculation unit that calculates an index value of the index in the analysis range for each region based on the analysis results of the characteristics of the analysis range and the user preference information; and a visualization unit that displays evaluation results based on the index values calculated by the index calculation unit, superimposed for each region on a map showing the analysis range.

 To solve this problem, the present invention also provides a living environment evaluation method performed by a living environment evaluation system that evaluates a living environment with respect to an index specified by a user, the living environment evaluation system having user preference information that holds preference coefficients indicating the user's preferences for one or more living environment items that may affect the index, and analysis instruction information that indicates an analysis range specified by the user as the range over which the living environment is to be evaluated, the method comprising: a high-resolution data analysis step in which the living environment evaluation system performs wavelength-based analysis using images corresponding to the analysis range, thereby analyzing the characteristics of the analysis range for each of a plurality of regions into which the analysis range is divided; an index calculation step in which the living environment evaluation system calculates an index value of the index in the analysis range for each region based on the analysis results of the characteristics of the analysis range in the high-resolution data analysis step and the user preference information; and a visualization step in which the living environment evaluation system displays evaluation results based on the index values calculated in the index calculation step, superimposed for each region on a map showing the analysis range.

 According to the present invention, an evaluation of the living environment reflecting individual preferences can be realized and visually provided at a higher resolution than conventional technology.
FIG. 1 is a block diagram showing a configuration example of a living environment evaluation system 1 according to a first embodiment of the present invention.
FIG. 2 is a flowchart showing an example of the procedure of the overall processing in the first embodiment.
FIG. 3 is a diagram showing an example of analysis instruction information 18.
FIG. 4 is a flowchart showing an example of the procedure of high-resolution data analysis processing.
FIG. 5 is a flowchart showing an example of the procedure of feature classification processing.
FIG. 6 is a diagram showing an example of color identification information 113.
FIG. 7 is a diagram showing an example of feature information 19.
FIG. 8 is a flowchart (part 1) showing an example of the procedure of low-resolution data analysis processing.
FIG. 9 is a flowchart (part 2) showing an example of the procedure of low-resolution data analysis processing.
FIG. 10 is a diagram showing an example of area information 122.
FIG. 11 is a flowchart showing an example of the procedure of index calculation processing.
FIG. 12 is a diagram showing an example of user preference information 132.
FIG. 13 is a diagram showing an example of association information 133.
FIG. 14 is a diagram showing an example of index value information 20.
FIG. 15 is a flowchart showing an example of the procedure of visualization processing.
FIG. 16 is a diagram showing an example of a living environment evaluation screen 300.
FIG. 17 is a block diagram showing a configuration example of a living environment evaluation system 10 according to a second embodiment of the present invention.
FIG. 18 is a flowchart showing an example of the procedure of association information update processing.
FIG. 19 is a flowchart showing an example of the procedure of past information display processing.
 Hereinafter, embodiments of the present invention will be described in detail with reference to the drawings.

 The following description and drawings are examples for explaining the present invention, with omissions and simplifications made as appropriate for clarity. Not all combinations of the features described in the embodiments are necessarily essential to the solution of the invention. The present invention is not limited to the embodiments, and every application example that conforms to the concept of the present invention falls within its technical scope. Those skilled in the art can make various additions, modifications, and the like within the scope of the present invention, and the present invention can also be implemented in various other forms. Unless otherwise limited, each component may be singular or plural.

 In the following description, various kinds of information may be described using expressions such as "table," "list," and "queue," but the various kinds of information may also be expressed in other data structures. To indicate independence from data structure, an "XX table," "XX list," or the like may be called "XX information." Expressions such as "identification information," "identifier," "name," "ID," and "number" are used when explaining the content of each piece of information, and these are mutually interchangeable.

 In the following description, when elements of the same type are described without distinction, a reference sign or a common number within reference signs is used; when elements of the same type are described with distinction, the element's reference sign may be used, or an ID assigned to the element may be used instead of the reference sign.

 In the following description, processing may be described as being performed by executing a program. Since a program, when executed by at least one processor (e.g., a CPU), performs predetermined processing while appropriately using storage resources (e.g., memory) and/or interface devices (e.g., communication ports), the subject of the processing may be regarded as the processor. Similarly, the subject of processing performed by executing a program may be a controller, device, system, computer, node, storage system, storage device, server, management computer, client, or host having a processor. The subject of processing performed by executing a program (e.g., a processor) may include a hardware circuit that performs part or all of the processing, for example a hardware circuit that executes encryption and decryption or compression and decompression. By operating according to a program, a processor operates as a functional unit that realizes a predetermined function; devices and systems including a processor are devices and systems including such functional units.

 A program may be installed in a device such as a computer from a program source. The program source may be, for example, a program distribution server or a computer-readable non-transitory storage medium. When the program source is a program distribution server, the program distribution server includes a processor (e.g., a CPU) and non-transitory storage resources, and the storage resources may further store a distribution program and a program to be distributed; by executing the distribution program, the processor of the program distribution server may distribute the program to be distributed to other computers. In the following description, two or more programs may be realized as one program, and one program may be realized as two or more programs.
(1) First embodiment
(1-1) System configuration
 FIG. 1 is a block diagram showing a configuration example of a living environment evaluation system 1 according to the first embodiment of the present invention. The living environment evaluation system 1 is a computer system that evaluates the living environment within a range specified by a user (an individual searching for a house, a real estate agent, a local government official, etc.), taking the user's preferences into consideration, and is connected to a disk 2, a network 3, and an input/output device 4.
 The disk 2 is a storage device that stores predetermined data used in processing by the living environment evaluation system 1. The disk 2 is, for example, a storage device such as an HDD (Hard Disk Drive) or an SSD (Solid State Drive), but may also be external storage, a cloud, or the like connected via the network 3. Specific examples of the predetermined data stored on the disk 2 include the source data of the satellite images 114, the map information 115, and the area information 122 in the memory 17, described later. Note that these data may instead be acquired from an external source connected via the network 3.

 The network 3 is a network that communicatively connects the living environment evaluation system 1 to the outside, and is, for example, a LAN (Local Area Network) or a WAN (Wide Area Network). As described above, instead of the disk 2 storing the predetermined data, the system may be configured to acquire the predetermined data from the Internet or the like connected via the network 3.

 The input/output device 4 includes input devices, such as a keyboard and a mouse, used by the user to operate the living environment evaluation system 1, and output devices, such as a display or a printer, that output data from the living environment evaluation system 1.

 The living environment evaluation system 1 comprises a processor 15, an input/output device 16, and a memory 17.

 The processor 15 is a processor that controls the living environment evaluation system 1 as a whole, and is specifically, for example, a CPU (Central Processing Unit). The input/output device 16 is an input/output interface that enables input and output of signals (data) between the living environment evaluation system 1 and its exterior (the disk 2, the network 3, and the input/output device 4).

 The memory 17 is a storage device that stores programs and data used in the computer of the living environment evaluation system 1, and is, for example, a RAM (Random Access Memory). The high-resolution data analysis unit 11, the low-resolution data analysis unit 12, the index calculation unit 13, and the visualization unit 14 shown in the memory 17 are functional units that realize various functions when the processor 15 executes programs stored in or loaded into the memory 17. The analysis instruction information 18, the feature information 19, and the index value information 20 shown in the memory 17 are data stored in the memory 17.

 The high-resolution data analysis unit 11 has a function of analyzing the characteristics of an area (range) specified by the user, using "high-resolution data" whose resolution is higher than that of "low-resolution data" typified by the area information 122 described later. The high-resolution data analysis unit 11 has a land feature extraction unit 111 and a classification unit 112 as functional units that execute programs, and has color identification information 113, satellite images 114, and map information 115 as data used in the processing by these functional units.
 The color identification information 113 is data for identifying analysis targets by color; a specific example is shown in FIG. 6, described later. The satellite images 114 and the map information 115 are data of higher resolution than the area information 122 described later; their details are explained in the feature classification processing shown in FIG. 5.
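 Although the embodiment defers the concrete format of the color identification information 113 to FIG. 6, its role can be illustrated with a minimal sketch: each analysis target is associated with a range of RGB values, and the pixels of a satellite image tile for one mesh are counted per target. The target names, RGB thresholds, and function names below are hypothetical, not the claimed method.

```python
# Minimal sketch of wavelength (color) based classification.
# The table maps each analysis target to a (min RGB, max RGB) range;
# all values here are hypothetical examples.
COLOR_ID = {
    "greenery": ((0, 80, 0), (100, 255, 100)),
    "water":    ((0, 0, 80), (100, 100, 255)),
}

def classify_pixel(rgb):
    """Return the first analysis target whose RGB range contains the pixel."""
    for target, (lo, hi) in COLOR_ID.items():
        if all(l <= c <= h for c, l, h in zip(rgb, lo, hi)):
            return target
    return "other"

def feature_ratios(pixels):
    """Fraction of pixels per analysis target within one mesh's image tile."""
    counts = {}
    for rgb in pixels:
        t = classify_pixel(rgb)
        counts[t] = counts.get(t, 0) + 1
    total = len(pixels)
    return {t: n / total for t, n in counts.items()}

tile = [(10, 200, 30), (20, 150, 40), (50, 60, 200), (220, 220, 220)]
print(feature_ratios(tile))  # {'greenery': 0.5, 'water': 0.25, 'other': 0.25}
```

 The per-target ratios computed this way correspond to the kind of per-mesh feature values that could be recorded in the feature information 19.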
 The low-resolution data analysis unit 12 has a function of analyzing the characteristics of an area (range) specified by the user using "low-resolution data." The low-resolution data analysis unit 12 has a data mapping unit 121 as a functional unit that executes a program, and has area information 122 as data used in the processing by the data mapping unit 121.

 The area information 122 is low-resolution data in units of relatively wide "areas" such as administrative districts, and holds a data value for each area with respect to a predetermined index. The area information may be any data indicating the characteristics of a wide "area," and includes, for example, the results of resident questionnaires for each area, area-by-area rankings, and the like. A specific example of the area information 122 is shown in FIG. 10, described later.

 The index calculation unit 13 has a function of integrating the analysis results from the high-resolution data analysis unit 11 and the low-resolution data analysis unit 12 and calculating index values of the living environment that reflect the user's preferences. The index calculation unit 13 has a data integration evaluation unit 131 as a functional unit that executes a program, and has user preference information 132 and association information 133 as data used in the processing by the data integration evaluation unit 131.

 The user preference information 132 is data that holds preference coefficients indicating the user's preferences for predetermined items; a specific example is shown in FIG. 12, described later. The association information 133 is data that holds association (weighting) coefficients used when calculating the feature value of a predetermined living environment item from the feature values held in the feature information 19; a specific example is shown in FIG. 13, described later.
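 As a rough illustration of how these two tables could combine (the concrete formula is deferred to the embodiment figures), one plausible reading is a two-stage weighted sum: the association coefficients map raw per-mesh feature values to living environment items, and the preference coefficients then weight those items into a single index value per mesh. All item names and coefficient values below are hypothetical.

```python
# Hypothetical two-stage weighted sum per mesh: features -> items -> index.

# Association info 133: item -> {feature: weight} (hypothetical coefficients).
ASSOCIATION = {
    "greenery_rich": {"greenery_ratio": 1.0},
    "quietness":     {"road_ratio": -0.5, "greenery_ratio": 0.5},
}

# User preference info 132: item -> preference coefficient (hypothetical).
PREFERENCE = {"greenery_rich": 0.7, "quietness": 0.3}

def index_value(features):
    """Combine one mesh's feature values into one preference-weighted index."""
    total = 0.0
    for item, pref in PREFERENCE.items():
        item_value = sum(w * features.get(f, 0.0)
                         for f, w in ASSOCIATION[item].items())
        total += pref * item_value
    return total

mesh_features = {"greenery_ratio": 0.6, "road_ratio": 0.2}
print(round(index_value(mesh_features), 3))  # 0.48
```

 Under this reading, recomputing the index for a different user requires only swapping the preference coefficients; the per-mesh feature values from the analysis stages are reusable.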
 The visualization unit 14 has a function of visualizing the index values of the living environment calculated by the index calculation unit 13. The visualization unit 14 has a map display unit 141 as a functional unit that provides its function through program execution. Note that the living environment evaluation screen 300 shown in FIG. 16, described later, is a display screen that the living environment evaluation system 1 provides to the user. The processing results of the map display unit 141 are also reflected on this living environment evaluation screen 300.

 The analysis instruction information 18 is data indicating the analysis range of the living environment specified by the user; a specific example is shown in FIG. 3, described later. The feature information 19 is data holding the analysis results of the high-resolution data analysis unit 11 and the low-resolution data analysis unit 12; a specific example is shown in FIG. 7, described later. The index value information 20 is data recording the index values of the living environment calculated by the index calculation unit 13; a specific example is shown in FIG. 14, described later.
(1-2) Processing
 FIG. 2 is a flowchart showing an example of the procedure of the overall processing in the first embodiment. The overall processing gives an overview of the living environment evaluation processing executed by the living environment evaluation system 1.
 Before the overall processing of FIG. 2 starts, the user launches the software of the living environment evaluation system 1 and operates the user interface (UI) provided by the launched software to specify the range over which the living environment is to be analyzed (specification of the analysis range). Specifically, on the living environment evaluation screen 300 of FIG. 16, described later, the analysis range is specified by the user performing an operation to display the desired analysis range in the map display area 302 before pressing the evaluation execution button 304.

 In addition to specifying the analysis range, the user also selects which of the predetermined items relating to the living environment he or she emphasizes and to what degree (specification of user preferences). Specifically, on the living environment evaluation screen 300 of FIG. 16, described later, user preferences are specified by the user performing input operations according to his or her preferences in the questionnaire input area 301 before pressing the evaluation execution button 304.

 Then, based on the results of these user operations, the analysis instruction information 18 is generated and stored in the memory 17, triggered by the press of the evaluation execution button 304 on the living environment evaluation screen 300.

 In the following description, the "evaluation specification screen" for specifying the analysis range and user preferences and the "evaluation result screen" on which the evaluation results of the living environment are displayed are presented together on a single "living environment evaluation screen 300," but the evaluation specification screen and the evaluation result screen may be realized as separate UIs. FIG. 16, described later, shows a specific example of the living environment evaluation screen 300.

 FIG. 3 is a diagram showing an example of the analysis instruction information 18. In the analysis instruction information 18 shown in FIG. 3, the starting latitude 1801, starting longitude 1802, north-south distance 1803, and east-west distance 1804 indicate the analysis range of the living environment specified by the user. In this example, the specified latitude and longitude (starting latitude 1801, starting longitude 1802) form the upper-left vertex of a rectangle, and the rectangle extending from that vertex over the north-south distance 1803 and the east-west distance 1804 is taken as the analysis range. This analysis range corresponds to the entire range of the map displayed in the map display area 302 of the living environment evaluation screen 300 shown in FIG. 16.

 The mesh width 1805 indicates the mesh width used when dividing the analysis range into meshes. The living environment evaluation system 1 divides the analysis range into a plurality of areas (meshes) and analyzes the characteristics of each area. In this example, each mesh is a square whose side length is specified by the mesh width 1805. Specifically, for example, when the mesh width 1805 is 100 m as in FIG. 3, the analysis range is divided into 100 m squares. The mesh width 1805 is set to a fixed value by the system, but may be made changeable within a predetermined range or in predetermined units, for example by user specification. The mesh shape is also not limited to a square and may be another specific shape.

 The analysis targets 1806 indicate the types of features to be analyzed in each mesh of the analysis range. The analysis targets 1806 are specified by the system, and color identification information 113 is prepared for each analysis target.
 図2の説明に戻る。図2によればまず、高解像度データ分析部11が、土地特徴抽出部111を起動して高解像度データ分析処理を実行する(ステップS101)。高解像度データ分析処理の詳細は、図4等を参照しながら後述する。高解像度データ分析処理が実行されることにより、解析範囲の特徴が高解像度データを用いて分析される。 Returning to the explanation of FIG. 2, according to FIG. 2, first, the high-resolution data analysis unit 11 activates the land feature extraction unit 111 to execute the high-resolution data analysis process (step S101). Details of the high-resolution data analysis process will be described later with reference to FIG. 4 etc. By executing the high-resolution data analysis process, the characteristics of the analysis range are analyzed using high-resolution data.
 次に、低解像度データ分析部12が、データマッピング部121を起動して低解像度データ分析処理を実行する(ステップS102)。低解像度データ分析処理の詳細は、図8等を参照しながら後述する。低解像度データ分析処理が実行されることにより、解析範囲の特徴が低解像度データを用いて分析される。 Next, the low-resolution data analysis unit 12 activates the data mapping unit 121 to execute the low-resolution data analysis process (step S102). Details of the low-resolution data analysis process will be described later with reference to FIG. 8 etc. By executing the low-resolution data analysis process, the characteristics of the analysis range are analyzed using the low-resolution data.
 次に、指標計算部13が、データ統合評価部131を起動して指標計算処理を実行する(ステップS103)。指標計算処理の詳細は、図11等を参照しながら後述する。指標計算処理が実行されることにより、高解像度データ分析処理及び低解像度データ分析処理の分析結果とユーザ嗜好とに基づいて住環境の指標値が算出され、指標値情報20に記録される。 Next, the index calculation unit 13 activates the data integration evaluation unit 131 to execute the index calculation process (step S103). Details of the index calculation process will be described later with reference to FIG. 11 etc. By executing the index calculation process, an index value of the living environment is calculated based on the analysis results of the high-resolution data analysis process and the low-resolution data analysis process and the user preferences, and is recorded in the index value information 20.
 そして最後に、可視化部14が、地図表示部141を起動して可視化処理を実行する(ステップS104)。可視化処理の詳細は、後述する図15に示される。可視化処理が実行されることにより、解析範囲の住環境の評価結果がユーザに視覚的に提示される。 Finally, the visualization unit 14 starts the map display unit 141 to execute the visualization process (step S104). Details of the visualization process are shown in FIG. 15, which will be described later. By executing the visualization process, the evaluation results of the living environment in the analysis range are visually presented to the user.
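The four steps above (S101 to S104) form a simple pipeline. As a rough illustration only, the flow can be sketched as below, with each stage passed in as a function; all function and parameter names are hypothetical, since the patent does not define a programming interface.

```python
def evaluate_living_environment(instruction, preference,
                                analyze_high_res, analyze_low_res,
                                calc_indices, render):
    """Sketch of the overall flow of FIG. 2.

    instruction corresponds to the analysis instruction information 18,
    preference to the user preference information 132; each stage is an
    injected function standing in for the corresponding processing unit.
    """
    high = analyze_high_res(instruction)           # step S101: high-resolution data analysis
    low = analyze_low_res(instruction)             # step S102: low-resolution data analysis
    indices = calc_indices(high, low, preference)  # step S103: index calculation
    return render(indices)                         # step S104: visualization
```
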
 以下では、上記のような全体処理の各処理について、より詳しく説明する。 Below, we will explain each step of the overall process described above in more detail.
(1-2-1)高解像度データ分析処理
 図4は、高解像度データ分析処理の処理手順例を示すフローチャートである。図4に示す高解像度データ分析処理は、図2のステップS101の処理に相当し、高解像度データ分析部11によって土地特徴抽出部111が起動された場合に実行される。
(1-2-1) High-resolution data analysis processing Fig. 4 is a flowchart showing an example of the processing procedure of the high-resolution data analysis processing. The high-resolution data analysis processing shown in Fig. 4 corresponds to the processing of step S101 in Fig. 2, and is executed when the land feature extraction unit 111 is started by the high-resolution data analysis unit 11.
 図4によればまず、土地特徴抽出部111は、利用者(ユーザ)による操作に基づいて生成された解析指示情報18を取得する(ステップS201)。この「取得」という動作は、対象データの受信、参照、読込、または複製等の何れを指すものであってもよく、以降の説明でも同様である。 According to FIG. 4, first, the land feature extraction unit 111 acquires the analysis instruction information 18 generated based on the operation by the user (step S201). This "acquire" operation may refer to any of receiving, referencing, reading, or copying the target data, and the same applies to the following explanation.
 次に、土地特徴抽出部111は、ステップS201で取得した解析指示情報18で指定される解析範囲をメッシュに区切る(ステップS202)。具体的には、土地特徴抽出部111は、解析指示情報18の起点緯度1801及び起点経度1802を左上の頂点とし、南北距離1803を縦幅、東西距離1804を横幅とする矩形領域を解析範囲とし、この解析範囲をメッシュ幅1805を所定サイズとする正方形領域(メッシュ)で区切る。このとき、解析範囲の南北距離及び東西距離は、メッシュ幅1805の整数倍に切り上げることにより、解析範囲は複数個のメッシュに分割される。なお、解析範囲をメッシュに分割する処理は、後述する他の処理でも行われるが、これらは全てステップS202の処理と同様であり、詳細な説明の繰り返しを省略する。 Next, the land feature extraction unit 111 divides the analysis range specified by the analysis instruction information 18 acquired in step S201 into meshes (step S202). Specifically, the land feature extraction unit 111 defines the analysis range as a rectangular area whose upper-left vertex is the point given by the starting latitude 1801 and starting longitude 1802 of the analysis instruction information 18, whose vertical extent is the north-south distance 1803, and whose horizontal extent is the east-west distance 1804, and divides this analysis range into square areas (meshes) whose side length is the mesh width 1805. At this time, the north-south and east-west distances of the analysis range are rounded up to integer multiples of the mesh width 1805, so that the analysis range is divided into a whole number of meshes. The process of dividing the analysis range into meshes also appears in other processes described later; these are all the same as the process in step S202, so their detailed explanation will not be repeated.
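As a concrete sketch of step S202, the mesh grid size follows directly from rounding each distance up to an integer multiple of the mesh width. The function below is illustrative and not part of the patent.

```python
import math

def divide_into_meshes(ns_distance_m, ew_distance_m, mesh_width_m):
    """Step S202 sketch: round the north-south and east-west distances up
    to integer multiples of the mesh width, i.e. count the rows and
    columns of square meshes needed to cover the analysis range."""
    rows = math.ceil(ns_distance_m / mesh_width_m)  # north-south direction
    cols = math.ceil(ew_distance_m / mesh_width_m)  # east-west direction
    return rows, cols
```

For example, a 2,300 m by 900 m analysis range with a 500 m mesh width yields a 5-by-2 grid, which covers 2,500 m by 1,000 m after rounding up.
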
 次に、土地特徴抽出部111は、ステップS202で生成した全てのメッシュに対してステップS204~S206の処理を行ったか否かを確認する(ステップS203)。全てのメッシュが処理済みである場合(ステップS203のYES)、高解像度データ分析処理を終了する。未処理のメッシュが残っている場合は(ステップS203のNO)、ステップS204に進む。初回の場合は、まだステップS204以降の処理を行っていないため、ステップS203のNOを経てステップS204に進む。 Next, the land feature extraction unit 111 checks whether the processing of steps S204 to S206 has been performed for all meshes generated in step S202 (step S203). If all meshes have been processed (YES in step S203), the high-resolution data analysis process ends. If unprocessed meshes remain (NO in step S203), the process proceeds to step S204. On the first pass, the processing of step S204 onward has not yet been performed, so the process proceeds to step S204 via NO in step S203.
 ステップS204において、土地特徴抽出部111は、未処理のメッシュのうちから1つを選択する。 In step S204, the land feature extraction unit 111 selects one of the unprocessed meshes.
 次に、土地特徴抽出部111は、ステップS204で選択したメッシュについて、解析指示情報18の解析対象1806に指定された解析対象を1つずつ選択しながら、分類部112を起動して特徴分類処理を実行する(ステップS205)。すなわち、ステップS205では、ステップS204で選択したメッシュについて、解析対象1806に指定された解析対象の選択を変えながら、分類部112による特徴分類処理が繰り返し実行される。特徴分類処理の詳細は、図5を参照しながら後述する。 Next, the land feature extraction unit 111 activates the classification unit 112 to execute feature classification processing while selecting one by one the analysis targets specified in the analysis target 1806 of the analysis instruction information 18 for the mesh selected in step S204 (step S205). That is, in step S205, the classification unit 112 repeatedly executes feature classification processing while changing the selection of the analysis target specified in the analysis target 1806 for the mesh selected in step S204. Details of the feature classification processing will be described later with reference to FIG. 5.
 次に、土地特徴抽出部111は、ステップS205で実行された特徴分類処理の結果を分類部112から受け取り、特徴情報19に書き込む(ステップS206)。ステップS206の処理後はステップS203に戻る。 Next, the land feature extraction unit 111 receives the results of the feature classification process executed in step S205 from the classification unit 112 and writes them into the feature information 19 (step S206). After the process in step S206, the process returns to step S203.
 以上のようにして未処理のメッシュについてステップS204~S206の処理を繰り返し実行することにより、最終的には解析範囲の全てのメッシュが処理済みとなり、ステップS203のYESを経て、高解像度データ分析処理が終了する。この高解像度データ分析処理の結果、高解像度データ(衛星画像114及び地図情報115)に基づいて、「解析範囲」を分割した各メッシュが「解析対象」の特徴をどの程度有しているかを示す特徴量が、数値化されて特徴情報19に記録される。 By repeatedly executing the processing of steps S204 to S206 for the unprocessed meshes in this manner, all meshes in the analysis range are eventually processed, and the high-resolution data analysis process ends via YES in step S203. As a result of this high-resolution data analysis process, a feature quantity indicating the degree to which each mesh into which the "analysis range" is divided possesses the characteristics of each "analysis target" is computed, based on the high-resolution data (the satellite image 114 and the map information 115), and recorded as a numerical value in the feature information 19.
 図5は、特徴分類処理の処理手順例を示すフローチャートである。図5に示す特徴分類処理は、図4のステップS205において分類部112が起動された場合に実行される。 FIG. 5 is a flowchart showing an example of the processing procedure for feature classification processing. The feature classification processing shown in FIG. 5 is executed when the classification unit 112 is started in step S205 of FIG. 4.
 図5によればまず、分類部112は、土地特徴抽出部111から、処理対象のメッシュと選択された解析対象1806とを取得する(ステップS301)。 According to FIG. 5, first, the classification unit 112 obtains the mesh to be processed and the selected analysis target 1806 from the land feature extraction unit 111 (step S301).
 次に、分類部112は、ステップS301で取得したメッシュの範囲を計算し、当該範囲を含む衛星画像と地図情報とをディスク2から(またはネットワーク3を介して外部から)読み込み、衛星画像114及び地図情報115としてメモリ17に記憶する(ステップS302)。 Next, the classification unit 112 calculates the range of the mesh obtained in step S301, reads the satellite image and map information including that range from the disk 2 (or from outside via the network 3), and stores them in the memory 17 as the satellite image 114 and map information 115 (step S302).
 衛星画像114は、住環境を評価する地域を人工衛星から撮影した高解像度データであって、解像度が高い画像であれば航空画像等であってもよい。衛星画像114の元となる衛星画像や航空画像は、可視光で撮影された画像に限定されるものではなく、他の波長(例えば赤外線)による撮像データ等であってもよい。衛星画像等の高解像度データを用いることにより、街路樹や工場内の緑地のような、地図(地図情報115)からは認識または推測し難い現実的な事象を識別することができる。 Satellite image 114 is high-resolution data taken from an artificial satellite of the area for which the living environment is to be evaluated, and may be an aerial image or the like as long as the image has a high resolution. The satellite image or aerial image that is the source of satellite image 114 is not limited to images taken with visible light, and may be imaging data using other wavelengths (e.g., infrared light). By using high-resolution data such as satellite images, it is possible to identify real-world phenomena that are difficult to recognize or infer from a map (map information 115), such as street trees and green spaces within factories.
 地図情報115は、座標に対する特徴情報を示す一般的な地図データであって、衛星画像114を用いた特徴分類による誤検出を防止するために利用される。地図情報115の元となる地図情報は、特定の種類の地図に限定されるものではなく、所定範囲内の任意の座標に対して「用途」が示されるデータであればよい。この「用途」は、具体的には例えば、住宅、商業施設、道路、または工場等が想定され、後述する図6に示す色識別情報113の高可能性地域1135の内容に対応することが好ましい。 Map information 115 is general map data that indicates feature information for coordinates, and is used to prevent erroneous detections due to feature classification using satellite imagery 114. The map information on which map information 115 is based is not limited to a specific type of map, but may be any data that indicates "use" for any coordinate within a specified range. Specifically, this "use" may be, for example, a house, commercial facility, road, or factory, and preferably corresponds to the content of high possibility area 1135 of color identification information 113 shown in Figure 6 described below.
 次に、分類部112は、ピクセルカウンタを0で初期化する(ステップS303)。ピクセルカウンタは、各ピクセルにおける特徴量を累積するために用いられるカウンタである。 Next, the classification unit 112 initializes a pixel counter to 0 (step S303). The pixel counter is a counter used to accumulate the feature amount for each pixel.
 次に、分類部112は、ステップS301で取得したメッシュに含まれる全てのピクセルについてステップS305~S307の処理を実行したか否かを確認する(ステップS304)。全てのピクセルが処理済みである場合(ステップS304のYES)、後述するステップS308に進む。未処理のピクセルが残っている場合は(ステップS304のNO)、ステップS305に進む。 Next, the classification unit 112 checks whether the processing of steps S305 to S307 has been performed for all pixels included in the mesh acquired in step S301 (step S304). If all pixels have been processed (YES in step S304), the process proceeds to step S308, which will be described later. If unprocessed pixels remain (NO in step S304), the process proceeds to step S305.
 ステップS305において、分類部112は、ステップS301で取得した解析対象に対応する色識別情報113を選択する。具体的には、分類部112は、メモリ17に保持されている、様々な解析対象に関する色識別情報のうちから、解析対象1131(図6参照)の名称がステップS301で取得した解析対象と一致するものを選択する。なお、様々な解析対象に関する色識別情報は、メモリ17に保持されずにディスク2またはネットワーク3を介した外部に保持されていてもよい。その場合、分類部112は、ディスク2またはネットワーク3を参照して、これらの色識別情報のうちからステップS301で取得した解析対象に対応する色識別情報を読み込み、色識別情報113としてメモリ17に記憶すればよい。 In step S305, the classification unit 112 selects the color identification information 113 corresponding to the analysis target acquired in step S301. Specifically, from among the color identification information on the various analysis targets held in the memory 17, the classification unit 112 selects the entry whose analysis target 1131 (see FIG. 6) matches, by name, the analysis target acquired in step S301. Note that the color identification information on the various analysis targets need not be held in the memory 17 and may instead be held on the disk 2 or externally via the network 3. In that case, the classification unit 112 refers to the disk 2 or the network 3, reads from this color identification information the entry corresponding to the analysis target acquired in step S301, and stores it in the memory 17 as the color identification information 113.
 図6は、色識別情報113の一例を示す図である。色識別情報113は、解析対象を色で識別するためのデータであって、ある地点(ピクセル)が当該解析対象に該当し得る否かを判定するために用いられる。図6に示した色識別情報113は、解析対象1131、赤範囲1132、緑範囲1133、青範囲1134、及び高可能性地域1135の項目を有する。 FIG. 6 is a diagram showing an example of color identification information 113. Color identification information 113 is data for identifying an analysis target by color, and is used to determine whether a certain point (pixel) can be the analysis target. The color identification information 113 shown in FIG. 6 has the items of analysis target 1131, red range 1132, green range 1133, blue range 1134, and high possibility area 1135.
 解析対象1131は、判定可能な解析対象の名称を示す。具体的には図6の場合、解析対象1131は「コンクリート」となっており、これは、図3の解析指示情報18の解析対象1806に含まれる「コンクリート」に対応している。 Analysis target 1131 indicates the name of the analysis target that can be determined. Specifically, in the case of FIG. 6, analysis target 1131 is "concrete," which corresponds to "concrete" included in analysis target 1806 of analysis instruction information 18 in FIG. 3.
 赤範囲1132、緑範囲1133、及び青範囲1134は、解析対象1131の特徴を定義した識別条件であって、解析対象1131が取り得る色の範囲がRGB値で示される。具体的には図6の場合、ある地点(ピクセル)の色のRGB値が何れも「110-150」の間であれば、当該地点が「コンクリート」である可能性があると判定される。 The red range 1132, green range 1133, and blue range 1134 are identification conditions that define the characteristics of the analysis object 1131, and the range of colors that the analysis object 1131 can take is indicated by RGB values. Specifically, in the case of Figure 6, if the RGB values of the color of a certain point (pixel) are all between "110-150," it is determined that the point is likely to be "concrete."
 高可能性地域1135は、解析対象1131の特徴を有する地点(ピクセル)に対して推定される用途を示す。具体的には図6の場合、ある地点(ピクセル)の色が上記のRGB値の範囲内にあってコンクリートである可能性があると判定された場合、当該地点は、「住宅」、「商業施設」、「道路」、または「工場」の何れかの用途に用いられている可能性が高いと推定される。 High possibility area 1135 indicates the use estimated for a point (pixel) that has the characteristics of analysis target 1131. Specifically, in the case of FIG. 6, if the color of a certain point (pixel) is determined to be within the above RGB value range and is likely to be concrete, the point is estimated to have a high possibility of being used for one of the following purposes: "residence," "commercial facility," "road," or "factory."
 色識別情報113は、解析指示情報18の解析対象1806の各対象を含む、様々な解析対象について、事前に情報が設定される。設定された情報は、住環境評価システム1の運用中に主に管理者によって変更可能としてもよい。 In the color identification information 113, information is set in advance for various analysis targets, including each of the analysis targets 1806 in the analysis instruction information 18. The set information may be changeable mainly by an administrator during operation of the living environment evaluation system 1.
 なお、図6に示した色識別情報113は、各ピクセルの色のRGB値を用いて解析するものであるが、色の解析手法はこれに限定されるものではなく、例えば、各ピクセルの色空間をHSV色空間に直したうえで判定する等、既知の様々な解析手法を利用可能である。また、本実施形態では「色」で分析する色識別情報113を用いているが、これは一例であって、本発明においてある地点(ピクセル)が解析対象に該当するかを判定するための識別情報には、「波長」を用いて分析する任意の識別情報を採用可能である。したがって、このような識別情報と比較する地点(ピクセル)の情報(衛星画像114)も、可視光による画像に限定されるものでもなく、例えば赤外線画像等であってもよい。 Note that while the color identification information 113 shown in FIG. 6 is analyzed using the RGB values of the color of each pixel, the color analysis method is not limited to this, and various known analysis methods can be used, such as converting the color space of each pixel to an HSV color space before making a judgment. Also, while this embodiment uses color identification information 113 analyzed by "color," this is just one example, and any identification information analyzed by "wavelength" can be used as identification information for determining whether a certain point (pixel) corresponds to the analysis target in this invention. Therefore, the information (satellite image 114) of the point (pixel) to be compared with such identification information is not limited to an image using visible light, and may be, for example, an infrared image, etc.
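As an illustration of the identification condition of FIG. 6, the RGB-range check can be written as below. The dictionary layout and function name are assumptions made for this sketch, not the patent's data format.

```python
def satisfies_color_condition(rgb, color_id):
    """Return True when all three channels of a pixel's color fall within
    the red/green/blue ranges of one color identification entry."""
    ranges = (color_id["red"], color_id["green"], color_id["blue"])
    return all(lo <= channel <= hi for channel, (lo, hi) in zip(rgb, ranges))

# Hypothetical entry corresponding to the "concrete" example of FIG. 6.
concrete = {
    "red": (110, 150), "green": (110, 150), "blue": (110, 150),
    "high_possibility": {"residence", "commercial facility", "road", "factory"},
}
```

A pixel such as (120, 130, 149) satisfies the condition and is judged possibly concrete; (90, 130, 140) fails because its red channel is below 110.
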
 図5の説明に戻る。ステップS305の次に、分類部112は、ステップS305で取得した色識別情報113に基づいて、ステップS302で取得した衛星画像114のうちステップS301で取得したメッシュに対応する領域画像の各ピクセルの色が、解析対象の識別条件を満たすか判定し、判定結果に応じてピクセルカウンタの値を加算する(ステップS306)。 Returning to the explanation of FIG. 5, after step S305, the classification unit 112 determines, based on the color identification information 113 acquired in step S305, whether the color of each pixel of the area image corresponding to the mesh acquired in step S301, out of the satellite image 114 acquired in step S302, satisfies the identification condition of the analysis target, and adds to the pixel counter value according to the determination result (step S306).
 具体的には、ステップS306において分類部112は、衛星画像114から抽出した各ピクセルの色について、色識別情報113における赤範囲1132、緑範囲1133、及び青範囲1134を全て満足する場合に、当該解析対象の識別条件を満たすと判定する。そして、解析対象の識別条件を満たすと判定した場合、分類部112は、識別条件を満たした1ピクセルごとに、ピクセルカウンタの値を「0.7」加算する。一方、分類部112は、解析対象の識別条件を満たさないと判定したピクセルについては、ピクセルカウンタの値を加算しない。 Specifically, in step S306, the classification unit 112 determines that the color of each pixel extracted from the satellite image 114 satisfies the identification condition of the analysis target when the color satisfies all of the red range 1132, green range 1133, and blue range 1134 in the color identification information 113. If it is determined that the identification condition of the analysis target is satisfied, the classification unit 112 adds "0.7" to the pixel counter value for each pixel that satisfies the identification condition. On the other hand, the classification unit 112 does not add to the pixel counter value for pixels that are determined not to satisfy the identification condition of the analysis target.
 次に、分類部112は、ステップS306でピクセルカウンタを加算したピクセルについて、ステップS302で取得した地図情報115に示される当該ピクセルの位置の用途が、色識別情報113に示された用途に含まれるかを判定し、判定結果に応じてピクセルカウンタの値をさらに加算する(ステップS307)。 Next, for the pixel whose pixel counter was incremented in step S306, the classification unit 112 determines whether the use of the pixel's position shown in the map information 115 acquired in step S302 is included in the uses indicated in the color identification information 113, and further increments the pixel counter value according to the determination result (step S307).
 具体的には、ステップS307において分類部112は、ステップS306でピクセルカウンタを加算したピクセルについて、地図情報115から当該ピクセルの位置の用途を抽出し、色識別情報113の高可能性地域1135に示された用途の何れかと一致するかを判定する。そして用途が一致した場合、分類部112は、ピクセルカウンタの値をさらに「0.3」加算し、用途が一致しなかった場合は、追加の加算を行わない。 Specifically, in step S307, for the pixel whose pixel counter was incremented in step S306, the classification unit 112 extracts the use of the pixel's position from the map information 115, and determines whether it matches any of the uses indicated in the high possibility area 1135 of the color identification information 113. If the use matches, the classification unit 112 further increments the pixel counter value by "0.3", and if the use does not match, no additional increment is made.
 ステップS307の処理後は、ステップS304に戻る。 After processing in step S307, the process returns to step S304.
 その後、メッシュに含まれる全てのピクセルが処理済みになると、ステップS304でYESと判定される。このとき、分類部112は、ピクセルカウンタの最終的な値をメッシュに含まれるピクセルの総数で除し、算出値を特徴分類処理の結果として土地特徴抽出部111のステップS206に返し(ステップS308)、特徴分類処理を終了する。 After that, when all pixels contained in the mesh have been processed, a YES determination is made in step S304. At this time, the classification unit 112 divides the final value of the pixel counter by the total number of pixels contained in the mesh, and returns the calculated value as the result of the feature classification process to step S206 of the land feature extraction unit 111 (step S308), and the feature classification process ends.
 以上のような特徴分類処理によれば、分類部112は、高解像度情報である衛星画像114を用いて、メッシュに含まれる衛星画像114の各ピクセルの色を、色識別情報113の識別条件と比較することにより、当該メッシュが指定された解析対象の特徴をどの程度有するかを、精度よく数値化することができる。さらに、特徴分類処理では、色の一致判定だけでは誤った判定となる可能性を考慮して、高解像度情報である地図情報115を利用して「用途」についても比較を行って解析値に反映することにより、当該メッシュの解析をより高精度に実施することができる。なお、上記説明では、ステップS306におけるピクセルカウンタの加算値を「0.7」とし、ステップS307におけるピクセルカウンタの加算値を「0.3」として、両方で加算された場合に1ピクセルにつき「1.0」が加算されるとしたが、それぞれの加算値の割合は、色判定と用途判定の重視の度合い等に応じて、任意に設定されてもよい。 According to the above feature classification process, the classification unit 112 uses the satellite image 114, which is high-resolution information, to compare the color of each pixel of the satellite image 114 included in the mesh with the identification conditions of the color identification information 113, thereby enabling accurate numerical quantification of the extent to which the mesh has the characteristics of the specified analysis target. Furthermore, in the feature classification process, taking into consideration the possibility that a color match determination alone may result in an erroneous determination, the high-resolution map information 115 is used to compare the "use" as well, and this is reflected in the analysis value, thereby enabling the mesh to be analyzed with higher accuracy. Note that in the above explanation, the pixel counter addition value in step S306 is set to "0.7", and the pixel counter addition value in step S307 is set to "0.3", and "1.0" is added per pixel when both are added, but the ratio of each addition value may be set arbitrarily depending on the degree of importance placed on the color determination and the use determination, etc.
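Putting steps S303 to S308 together, the per-mesh score can be sketched as below. The data layout and function name are assumptions for illustration; the 0.7/0.3 weights follow the example in the text and, as noted, may be set arbitrarily.

```python
def classify_mesh(pixels, uses, color_id, color_weight=0.7, use_weight=0.3):
    """Feature classification for one mesh (steps S303-S308 sketch).

    pixels: (r, g, b) tuples from the satellite image region of the mesh.
    uses:   land-use strings from the map information, aligned with pixels.
    """
    ranges = (color_id["red"], color_id["green"], color_id["blue"])
    counter = 0.0                                  # step S303: initialize pixel counter
    for rgb, use in zip(pixels, uses):
        if all(lo <= c <= hi for c, (lo, hi) in zip(rgb, ranges)):
            counter += color_weight                # step S306: color condition met
            if use in color_id["high_possibility"]:
                counter += use_weight              # step S307: map use also matches
    return counter / len(pixels)                   # step S308: normalize by pixel count
```

A pixel that passes both checks thus contributes 1.0, one that passes only the color check contributes 0.7, and the final score lies in [0, 1].
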
 図7は、特徴情報19の一例を示す図である。特徴情報19は、全体的な分析結果、より具体的には、高解像度データ分析部11及び低解像度データ分析部12によるそれぞれの最終的な分析結果を保持するデータである。特徴情報19は、解析範囲を分割したメッシュごとに、解析対象または指標の分析結果を記録する。具体的には、図7に示した特徴情報19は、縦位置1901、横位置1902、第1の解析項目1903、及び第2の解析項目1904の項目を有する。 FIG. 7 is a diagram showing an example of feature information 19. Feature information 19 is data that holds the overall analysis results, more specifically, the final analysis results by the high-resolution data analysis unit 11 and the low-resolution data analysis unit 12. Feature information 19 records the analysis results of the analysis target or index for each mesh into which the analysis range is divided. Specifically, feature information 19 shown in FIG. 7 has items of vertical position 1901, horizontal position 1902, first analysis item 1903, and second analysis item 1904.
 縦位置1901はメッシュの縦位置を示し、横位置1902はメッシュの横位置を示す。縦位置1901と横位置1902との組合せにより、解析対象におけるメッシュの位置が特定される。 Vertical position 1901 indicates the vertical position of the mesh, and horizontal position 1902 indicates the horizontal position of the mesh. The combination of vertical position 1901 and horizontal position 1902 identifies the position of the mesh in the analysis target.
 第1の解析項目1903は、解析指示情報18の解析対象1806における各解析対象についての高解像度データ分析処理の分析結果として、解析項目ごとの特徴量を示す。図7の場合、第1の解析項目1903として「コンクリート」、「緑化」、及び「水」が例示されているが、これらは、図3の解析指示情報18の解析対象1806に示された解析対象に対応している。前述した図4のステップS206では、第1の解析項目1903の何れかのセルに、分類部112による特徴分類処理の結果が書き込まれる。 The first analysis item 1903 indicates the feature amount for each analysis item as the analysis result of the high-resolution data analysis process for each analysis target in the analysis target 1806 of the analysis instruction information 18. In the case of FIG. 7, "concrete", "greening", and "water" are exemplified as the first analysis item 1903, which correspond to the analysis targets indicated in the analysis target 1806 of the analysis instruction information 18 in FIG. 3. In step S206 of FIG. 4 described above, the result of the feature classification process by the classification unit 112 is written into one of the cells of the first analysis item 1903.
 第2の解析項目1904は、地域情報122における各指標についての低解像度データ分析処理の分析結果として、解析項目ごとの特徴量を示す。図7の場合、第2の解析項目1904として「静かな環境」及び「便利な商店街」が例示されているが、これらは、後述する図10の地域情報122の指標1221に示された指標に対応している。後述する図9のステップS414では、第2の解析項目1904の何れかのセルに、分析結果が書き込まれる。 The second analysis item 1904 shows the feature amount for each analysis item as the analysis result of the low-resolution data analysis process for each index in the local information 122. In the case of FIG. 7, "quiet environment" and "convenient shopping area" are shown as examples of the second analysis item 1904, which correspond to the indexes shown in the index 1221 of the local information 122 in FIG. 10, which will be described later. In step S414 in FIG. 9, which will be described later, the analysis result is written into one of the cells of the second analysis item 1904.
 なお、第1の解析項目1903及び第2の解析項目1904の各項目は、上述したように一例に過ぎず、解析指示情報18の解析対象1806、及び地域情報122の指標1221に応じて、任意の名称の項目が構成される。 Note that, as described above, each of the first analysis item 1903 and the second analysis item 1904 is merely an example, and items with arbitrary names may be configured according to the analysis target 1806 of the analysis instruction information 18 and the index 1221 of the area information 122.
(1-2-2)低解像度データ分析処理
 図8及び図9は、低解像度データ分析処理の処理手順例を示すフローチャート(その1,その2)である。図8及び図9に示す低解像度データ分析処理は、図2のステップS102の処理に相当し、低解像度データ分析部12によってデータマッピング部121が起動された場合に実行される。
(1-2-2) Low-resolution data analysis processing Fig. 8 and Fig. 9 are flowcharts (part 1, part 2) showing an example of a processing procedure of the low-resolution data analysis processing. The low-resolution data analysis processing shown in Fig. 8 and Fig. 9 corresponds to the processing of step S102 in Fig. 2, and is executed when the data mapping unit 121 is started by the low-resolution data analysis unit 12.
 図8及び図9によればまず、データマッピング部121は、利用者(ユーザ)による操作に基づいて生成された解析指示情報18を取得し(ステップS401)、解析指示情報18で指定される解析範囲をメッシュに区切る(ステップS402)。 According to Figures 8 and 9, first, the data mapping unit 121 acquires the analysis instruction information 18 generated based on the operation by the user (step S401), and divides the analysis range specified by the analysis instruction information 18 into meshes (step S402).
 次に、データマッピング部121は、ステップS402で生成した全てのメッシュに対してステップS404以降の処理を行ったか否かを確認する(ステップS403)。全てのメッシュが処理済みである場合(ステップS403のYES)、低解像度データ分析処理を終了する。未処理のメッシュが残っている場合は(ステップS403のNO)、ステップS404に進む。初回の場合は、まだステップS404以降の処理を行っていないため、ステップS403のNOを経てステップS404に進む。 Then, the data mapping unit 121 checks whether the processing from step S404 onwards has been performed on all meshes generated in step S402 (step S403). If all meshes have been processed (YES in step S403), the low-resolution data analysis process ends. If unprocessed meshes remain (NO in step S403), the process proceeds to step S404. If this is the first time, the processing from step S404 onwards has not yet been performed, so the process proceeds to step S404 via NO in step S403.
 ステップS404において、データマッピング部121は、未処理のメッシュのうちから1つを選択する。 In step S404, the data mapping unit 121 selects one of the unprocessed meshes.
 次に、データマッピング部121は、地域情報122に含まれる全ての指標に関する情報について、ステップS406以降の処理を行ったか否かを確認する(ステップS405)。地域情報122の全ての指標に関する情報について処理済みである場合は(ステップS405のYES)、ステップS403に戻る。未処理の指標が残っている場合は(ステップS405のNO)、ステップS406に進む。 Then, the data mapping unit 121 checks whether the processing from step S406 onwards has been performed for the information relating to all indicators included in the regional information 122 (step S405). If the information relating to all indicators in the regional information 122 has been processed (YES in step S405), the process returns to step S403. If unprocessed indicators remain (NO in step S405), the process proceeds to step S406.
 図10は、地域情報122の一例を示す図である。地域情報122は、行政区域等の比較的広範囲な「地域」を単位とした低解像度のデータであって、所定の指標について地域ごとのデータ値を保持する。地域情報122は、予めシステムに登録される。 FIG. 10 is a diagram showing an example of regional information 122. Regional information 122 is low-resolution data in units of relatively wide-area "regions" such as administrative districts, and holds data values for each region for a given indicator. Regional information 122 is registered in the system in advance.
 図10の場合、地域情報122には、一例として「静かな環境」及び「便利な商店街」という指標に関する情報が含まれている。それぞれの指標に関する情報は、指標1221、対象1222、データ名称1223、及びデータ値1224の項目を有する。 In the case of FIG. 10, the area information 122 includes, as an example, information on the indicators "quiet environment" and "convenient shopping district." The information on each indicator has the items of indicator 1221, target 1222, data name 1223, and data value 1224.
 指標1221は、指標の名称を示す。対象1222は、指標1221が示す指標(以後、当該指標)が適用される対象の条件を示す。データ名称1223は、当該指標に関して指標値(データ値1224)が得られている地域の具体的な地域名を示す。データ値1224は、データ名称1223が示す地域における当該指標の指標値を示す。本例の場合、データ値1224の値が大きいほど、当該指標の特徴が強いことを意味する。 Indicator 1221 indicates the name of the indicator. Target 1222 indicates the conditions of the target to which the indicator indicated by indicator 1221 (hereinafter, the indicator) is applied. Data name 1223 indicates the specific area name of the area where an index value (data value 1224) has been obtained for the indicator. Data value 1224 indicates the index value of the indicator in the area indicated by data name 1223. In this example, the larger the value of data value 1224, the stronger the characteristics of the indicator.
 図8及び図9の説明に戻る。ステップS406では、データマッピング部121は、地域情報122のうちから、未処理の指標に関する地域情報を1つ選択する。具体的には例えば、ステップS406においてデータマッピング部121は、図10に示した地域情報122から、指標の1つである「静かな環境」に関するデータを選択する。 Returning to the explanation of Figures 8 and 9, in step S406, the data mapping unit 121 selects one piece of regional information related to an unprocessed indicator from the regional information 122. Specifically, for example, in step S406, the data mapping unit 121 selects data related to "quiet environment," which is one of the indicators, from the regional information 122 shown in Figure 10.
 次に、データマッピング部121は、ステップS406で選択した地域情報のデータにおいて、全ての列に対してステップS408以降の処理を行ったか否かを確認する(ステップS407)。ここでの「列」とは、図10の地域情報122における「静かな環境」に関するデータを例に言うと、「杉並区」、「世田谷区」、「中野区」といったデータ名称1223の各列(データ値1224の各列と考えてもよい)を意味する。全ての列に対して処理済みである場合は(ステップS407のYES)、ステップS405に戻る。未処理の列が残っている場合は(ステップS407のNO)、図9のステップS408に進む。 Next, the data mapping unit 121 checks whether the processing from step S408 onwards has been performed on all columns in the regional information data selected in step S406 (step S407). In this case, "columns" refers to each column of data names 1223 (which may be considered as each column of data values 1224) such as "Suginami-ku," "Setagaya-ku," and "Nakano-ku" in the example data related to "quiet environment" in regional information 122 in FIG. 10. If all columns have been processed (YES in step S407), the process returns to step S405. If unprocessed columns remain (NO in step S407), the process proceeds to step S408 in FIG. 9.
 ステップS408では、データマッピング部121は、ステップS406で選択した地域情報のデータから、未処理の列を1つ選択する。図10の地域情報122を用いて具体的に言うと、例えば「杉並区」の列を選択したとする。 In step S408, the data mapping unit 121 selects one unprocessed column from the data of the regional information selected in step S406. To be more specific, using the regional information 122 in FIG. 10, for example, the column "Suginami-ku" is selected.
 次に、データマッピング部121は、ステップS408で選択した列のデータ名称1223と対象1222とに基づいて、当該列が対象とする地理的な範囲を特定する(ステップS409)。先の具体例で言うと、「杉並区」の「全域」(すなわち単に「杉並区」とも言える)が対象とする範囲に特定される。 Then, the data mapping unit 121 identifies the geographical range that is the target of the column selected in step S408, based on the data name 1223 and target 1222 of the column (step S409). In the previous specific example, the target range is identified as the "entire area" of "Suginami Ward" (or simply "Suginami Ward").
 次に、データマッピング部121は、ステップS404で選択したメッシュの過半が、ステップS409で特定した範囲に入っているか否かを確認する(ステップS410)。先の具体例で言うと、メッシュの過半が「杉並区」であるか否かを確認する。メッシュの過半が特定した範囲に入っている場合(ステップS410のYES)、当該メッシュは当該範囲に属すると見なし、ステップS411に進む。メッシュの過半が特定した範囲に入っていない場合は(ステップS410のNO)、図8のステップS407に戻る。 Next, the data mapping unit 121 checks whether the majority of the meshes selected in step S404 are within the range specified in step S409 (step S410). In the previous specific example, it checks whether the majority of the meshes are "Suginami-ku". If the majority of the meshes are within the specified range (YES in step S410), the mesh is considered to belong to that range and the process proceeds to step S411. If the majority of the meshes are not within the specified range (NO in step S410), the process returns to step S407 in Figure 8.
 なお、本説明では、ステップS410において「メッシュの過半」を条件としたが、当該メッシュに対応する範囲を決定できる方法であればよく、他にも例えば、「ステップS404で選択したメッシュにおいて、ステップS409で特定した範囲が最も大きい範囲を占める」こと等を条件としてもよい。 In this explanation, the condition in step S410 is "the majority of the meshes," but any method that can determine the range corresponding to the meshes will do, and other conditions may also be used, such as "the range identified in step S409 occupies the largest range in the mesh selected in step S404."
 ステップS411では、データマッピング部121は、ステップS408で選択した列のデータ値1224を、当該メッシュにおける当該指標(ステップS406で選択した指標)に関する低解像度データ分析の分析結果の値に決定する。先の具体例で言えば、「杉並区」に対応するデータ値「0.72」が分析結果の値に決定される。 In step S411, the data mapping unit 121 determines the data value 1224 of the column selected in step S408 as the value of the analysis result of the low-resolution data analysis for the indicator in that mesh (the indicator selected in step S406). In the previous specific example, the data value "0.72" corresponding to "Suginami-ku" is determined as the value of the analysis result.
 そして、以降のステップS412~S414では、データマッピング部121が、ステップS411で決定した分析結果の値を、特徴情報19の対応するセルに記憶させる。詳しく説明すると、データマッピング部121は、まず、特徴情報19を参照し、当該メッシュに該当する列を選択する(ステップS412)。次いでデータマッピング部121は、ステップS412で選択した列のなかで、ステップS406で選択した地域情報の指標1221に対応するセルを選択する(ステップS413)。最後にデータマッピング部121は、ステップS413で選択した特徴情報19のセルに、ステップS411で決定した値を記入する。 Then, in the subsequent steps S412 to S414, the data mapping unit 121 stores the value of the analysis result determined in step S411 in the corresponding cell of the feature information 19. To explain in more detail, the data mapping unit 121 first refers to the feature information 19 and selects a column that corresponds to the mesh (step S412). Next, the data mapping unit 121 selects a cell from the column selected in step S412 that corresponds to the index 1221 of the regional information selected in step S406 (step S413). Finally, the data mapping unit 121 enters the value determined in step S411 into the cell of the feature information 19 selected in step S413.
 ステップS414の処理後は、図8のステップS407に戻り、ステップS406で選択した指標の地域情報における未処理の列が存在する場合は(ステップS407のNO)、未処理の他の列についてステップS408以降の処理が繰り返される。 After processing in step S414, the process returns to step S407 in FIG. 8. If there are unprocessed columns in the regional information for the index selected in step S406 (NO in step S407), the process from step S408 onwards is repeated for the other unprocessed columns.
By performing steps S401 to S414 as described above, the low-resolution data (regional information 122) is assigned to the fine meshes used in the high-resolution data analysis; ultimately, for every mesh in the analysis range, the degree to which the mesh exhibits the characteristics of each index held in the regional information 122 is quantified and recorded in the feature information 19.
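As a concrete illustration of the mapping just summarized, the following Python sketch assigns each mesh the region-level value of whichever region covers the majority of that mesh. The data shapes and names are illustrative assumptions, not part of the embodiment:

```python
def map_region_values_to_meshes(mesh_overlap, region_values):
    """Steps S408-S414 in miniature: assign each mesh the low-resolution
    value of the region (e.g. a ward) that covers the majority of the mesh.

    mesh_overlap:  {mesh_id: {region_name: covered_fraction}}  (assumed shape)
    region_values: {region_name: data_value}  (one index of regional info 122)
    """
    feature = {}
    for mesh_id, fractions in mesh_overlap.items():
        for region, frac in fractions.items():
            if frac > 0.5:  # the "majority of the mesh" condition of step S410
                feature[mesh_id] = region_values[region]
                break
    return feature
```

With the earlier example, a mesh lying mostly within "Suginami-ku" would receive that ward's value 0.72.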
(1-2-3) Index Calculation Process
FIG. 11 is a flowchart showing an example procedure of the index calculation process. The index calculation process shown in FIG. 11 corresponds to step S103 in FIG. 2 and is executed when the data integration evaluation unit 131 is started by the index calculation unit 13.
As shown in FIG. 11, the data integration evaluation unit 131 first acquires the user preference information 132 generated from the user's operations (step S501). The user preference information 132 is generated automatically from the contents of the user's input when the user enters his or her preferences in the questionnaire input area 301 of the living environment evaluation screen 300 (for example, by selecting an importance level for each item) and then presses the evaluation execution button 304.
FIG. 12 is a diagram showing an example of the user preference information 132. The user preference information 132 shown in FIG. 12 has the items living environment item 1321 and importance 1322. The living environment item 1321 indicates an item of the living environment that may affect an index, and is set in advance by the service provider, the system administrator, or the like. Referring to the living environment evaluation screen 300 shown in FIG. 16 (described later), it can be seen that the selection items in the questionnaire input area 301 correspond to the living environment items 1321 of the user preference information 132. The importance 1322 indicates the degree of preference (preference coefficient) that the user selected for the item indicated by the living environment item 1321. In this example, the importance 1322 is selected from five levels, 1 to 5, and the selected value is used as the preference coefficient.
Returning to FIG. 11: following step S501, the data integration evaluation unit 131 refers to the analysis instruction information 18 and divides the analysis range specified therein into meshes (step S502). The analysis range specified in the analysis instruction information 18 corresponds to the area displayed in the map display area 302 of the living environment evaluation screen 300 shown in FIG. 16.
Next, the data integration evaluation unit 131 checks whether the processing from step S504 onward has been performed for all meshes generated in step S502 (step S503). If all meshes have been processed (YES in step S503), the index calculation process ends. If an unprocessed mesh remains (NO in step S503), the process proceeds to step S504.
In step S504, the data integration evaluation unit 131 selects one of the unprocessed meshes and acquires its features from the feature information 19.
Next, the data integration evaluation unit 131 checks whether, for the mesh selected in step S504, the associations for all indices included in the association information 133 have been processed (step S505). If all indices have been processed (YES in step S505), the process returns to step S503. If an unprocessed index remains (NO in step S505), the process proceeds to step S506.
FIG. 13 is a diagram showing an example of the association information 133. The association information 133 holds the association (weighting) coefficients used when calculating the feature amount of a given living environment item from the feature amounts held in the feature information 19. The association information 133 shown in FIG. 13 has, for each index, the items living environment item 1331 and analysis item 1332. The living environment item 1331 corresponds to the living environment item 1321 of the user preference information 132 shown in FIG. 12. The analysis item 1332 corresponds to the first analysis item 1903 and the second analysis item 1904 of the feature information 19 shown in FIG. 7. For each combination of a living environment item 1331 and an analysis item 1332, the corresponding cell of the association information 133 holds an association (weighting) coefficient.
Returning to FIG. 11: in step S506, the data integration evaluation unit 131 selects, from the association information 133, the association information for one unprocessed index, and internally creates a table whose column headings are the importance values 1322 of the user preference information 132 and whose row headings are the features acquired from the feature information 19 in step S504.
Next, the data integration evaluation unit 131 computes each cell of the table created in step S506 as the product of its row heading and column heading, further multiplied by the coefficient at the same position in the association information (step S507).
Then, the data integration evaluation unit 131 writes the sum of the cell values calculated in step S507 into the index value information 20 as the index calculation result (index value) of that index for that mesh (step S508).
In other words, in steps S506 to S508, the index value of a given index is calculated from the feature amounts of the analysis items produced by the high-resolution and low-resolution data analysis processes (the feature amounts in the feature information 19) as follows. The feature amounts of the multiple living environment items that affect the index are computed by weighting the analysis-item feature amounts with the coefficients of the association information 133; the user's degree of preference (the importance in the user preference information 132) is also reflected in this computation; and the resulting feature amounts of the living environment items are summed and stored in the index value information 20 as the index value of the index.
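The computation of steps S506 to S508 amounts to a double-weighted sum over living environment items and analysis items. The sketch below is a minimal illustration under assumed data shapes (dictionaries keyed by item names); it is not the embodiment's implementation:

```python
def index_value(importance, features, coeff):
    """Steps S506-S508 as a double-weighted sum for one mesh and one index.

    importance: {living_env_item: 1..5}        (user preference info 132)
    features:   {analysis_item: float}         (feature info 19, one mesh)
    coeff:      {(item, analysis_item): float} (association info 133, one index)
    """
    return sum(
        importance[item] * feat * coeff[(item, a)]
        for item in importance
        for a, feat in features.items()
    )
```

Each term multiplies a user importance (1 to 5), an analysis-item feature amount, and an association coefficient; the sum over all terms is the index value written in step S508.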
FIG. 14 is a diagram showing an example of the index value information 20. The index value information 20 holds, for each mesh, the index value of each living environment index; in the case of FIG. 14, it is composed of the items vertical position 2001, horizontal position 2002, and index item 2003. The vertical position 2001 and horizontal position 2002 indicate the position of the mesh, and the index item 2003 indicates, for each living environment index, the index value of that index.
After step S508, the process returns to step S505, and steps S506 to S508 are repeated until no unprocessed index remains.
By performing steps S501 to S508 as described above, an index value is calculated for each living environment index and recorded in the index value information 20.
(1-2-4) Visualization Process
FIG. 15 is a flowchart showing an example procedure of the visualization process. The visualization process shown in FIG. 15 corresponds to step S104 in FIG. 2 and is executed when the map display unit 141 is started by the visualization unit 14.
Before explaining FIG. 15, the configuration of the living environment evaluation screen 300 displayed to the user will be described.
FIG. 16 is a diagram showing an example of the living environment evaluation screen 300. This is the display screen that the living environment evaluation system 1 provides to the user; in this description, it remains displayed continuously from the user's preference input before the evaluation starts through the display of the results after the evaluation completes.
The living environment evaluation screen 300 comprises a questionnaire input area 301, a map display area 302, an index selection box 303, and an evaluation execution button 304.
The questionnaire input area 301 is an area that displays a questionnaire the user is asked to answer before the evaluation starts. It displays the multiple living environment items that affect the index to be evaluated, and for each item the user selects an importance according to his or her preference. The user preference information 132 is generated from the contents entered in this area.
The map display area 302 is the area in which a map is displayed. The map area displayed in the map display area 302 through the user's operations before the evaluation starts becomes the analysis range of the living environment evaluation. After the evaluation is executed, the map area displayed in the map display area 302 is divided into meshes, and as the output of the evaluation result for the index selected in the index selection box 303, each mesh is rendered based on its index value in the index value information 20.
The index selection box 303 provides checkboxes with which the user selects the index used to evaluate the living environment. The evaluation execution button 304 is a button for instructing the living environment evaluation system 1 to execute the evaluation. When the user checks the index he or she wishes to examine among those shown in the index selection box 303 ("nature" or "living convenience" in the case of FIG. 16) and presses the evaluation execution button 304, the living environment evaluation for that index is executed over the analysis range displayed in the map display area 302.
Returning to FIG. 15, the visualization process will now be described. First, the map display unit 141 displays a map of the analysis range in the map display area 302 of the living environment evaluation screen 300, based on the analysis instruction information 18 (step S601). In the case of the living environment evaluation screen 300 described with FIG. 16, the analysis range has already been selected by the user before the evaluation starts, so it need not be displayed again in step S601.
Next, the map display unit 141 acquires the index that the user selected in the index selection box 303 of the living environment evaluation screen 300 (step S602).
Next, the map display unit 141 divides the map displayed in step S601 into meshes of the mesh width 1805 specified in the analysis instruction information 18 (step S603).
Next, the map display unit 141 checks whether the processing from step S605 onward has been performed for all meshes generated in step S603 (step S604). If all meshes have been processed (YES in step S604), the visualization process ends. If an unprocessed mesh remains (NO in step S604), the process proceeds to step S605.
In step S605, the map display unit 141 selects one of the unprocessed meshes.
Next, the map display unit 141 refers to the index value information 20 and acquires the index value corresponding to the combination of the index acquired in step S602 and the mesh selected in step S605 (step S606).
Next, the map display unit 141 renders the index value acquired in step S606 on the corresponding mesh of the map display area 302, in a semi-transparent color whose density depends on the index value (step S607). In the case of FIG. 16, the meshes of the map display area 302 are rendered in a color whose density increases with the index value.
After step S607, the process returns to step S604; the map display unit 141 repeats the processing for each mesh, and when the rendering has been completed for all meshes (YES in step S604), the visualization process ends.
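The rendering of step S607 can be sketched as a simple mapping from an index value to a semi-transparent overlay color; the base color and normalization below are illustrative assumptions, not details of the embodiment:

```python
def mesh_color(value, v_max, base_rgb=(0, 100, 255)):
    """Step S607 in miniature: produce an RGBA overlay color whose
    opacity (density) grows with the index value, clamped to [0, 1]."""
    alpha = max(0.0, min(1.0, value / v_max))
    return (*base_rgb, round(alpha * 255))
```

A mesh with the maximum index value is drawn fully opaque, while a mesh with value zero is fully transparent, so the underlying map remains visible.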
The map on which the evaluation results are thus superimposed in the map display area 302 of the living environment evaluation screen 300 can express the degree of each living environment index in a geographically continuous form, using meshes drawn independently of the boundaries of administrative divisions. The map on which the evaluation results are superimposed is not limited to a general map of geographic information. For example, if a hazard map is used as an example of a map specialized for a given purpose, the risk information of the hazard map and the living environment evaluation results can be presented simultaneously.
As described above, the living environment evaluation system 1 according to this embodiment can analyze, based on the low-resolution data (regional information 122) and the high-resolution data (satellite images 114), the feature amounts of each mesh region into which the user-specified analysis range is divided, and can calculate and display, based on those feature amounts, the index values of the user-specified living environment indices while reflecting the user's preferences.
Furthermore, when analyzing the feature amounts of each mesh region, the living environment evaluation system 1 according to this embodiment can use the map information 115, which is high-resolution data, to prevent erroneous detection.
In addition, by using high-resolution data such as the satellite images 114, the living environment evaluation system 1 according to this embodiment can realize an evaluation of the living environment that reflects individual preferences in greater detail and at a higher resolution than the prior art. By using such evaluation results, information that cannot be obtained from simple map information alone can be acquired, such as the amount of greenery or the likelihood of noise from roads or factories. The living environment can also be estimated quantitatively.
(2) Second Embodiment
FIG. 17 is a block diagram showing a configuration example of a living environment evaluation system 10 according to a second embodiment of the present invention. The living environment evaluation system 10 according to the second embodiment adds a correction unit 21 and an analysis unit 22 to the living environment evaluation system 1 according to the first embodiment shown in FIG. 1; descriptions of the configuration and processing it shares with the living environment evaluation system 1 are omitted.
The correction unit 21 has a function of updating the association information 133 based on the user's intuitive feedback. The correction unit 21 has an association information update unit 211 as a functional unit that executes a program.
When the evaluation result of the living environment produced by the living environment evaluation system 10 is displayed for a certain index, the user may feel that the result deviates from his or her own sense. To handle such situations, the living environment evaluation system 10 is configured so that the user can, through a predetermined user interface (UI), specify the index that deviates from his or her sense and the nature of the deviation. To enable this, for example, an area for selecting an index and an area for entering information indicating the nature of the deviation for the selected index (for example, too high or too low) may be added to the living environment evaluation screen 300; a specific illustration is omitted.
When the user specifies, through the above UI, an index that deviates from his or her sense and the nature of the deviation, the correction unit 21 starts the association information update unit 211 to execute the association information update process. The details of the association information update process are described later with reference to FIG. 18.
The analysis unit 22 has a function of displaying past living environment evaluation results in response to a request from the user. The analysis unit 22 has a past information display unit 221 as a functional unit that executes a program.
When the user performs, on a predetermined UI, an operation requesting the display of past information together with a past date, the analysis unit 22 starts the past information display unit 221 to execute the past information display process. The details of the past information display process are described later with reference to FIG. 19.
(2-1) Association Information Update Process
FIG. 18 is a flowchart showing an example procedure of the association information update process. The association information update process is executed when the association information update unit 211 is started.
As shown in FIG. 18, the association information update unit 211 first acquires, as input by the user, the index that deviates from the user's sense and information indicating the nature of the deviation (too high or too low) (step S701).
Next, the association information update unit 211 refers to the association information 133 and selects the index acquired in step S701 (step S702).
Next, the association information update unit 211 checks whether the deviation acquired in step S701 is "too high" relative to the user's sense (step S703). If the deviation is "too high" (YES in step S703), the process proceeds to step S704; if the deviation is "too low" (NO in step S703), the process proceeds to step S705.
In step S704, the association information update unit 211 decreases all coefficients of the index selected in step S702 by a predetermined amount (5% in this example). Conversely, in step S705, the association information update unit 211 increases all coefficients of the index selected in step S702 by the predetermined amount (5% in this example).
After step S704 or step S705, the association information update unit 211 starts the data integration evaluation unit 131 and the map display unit 141, and executes the index calculation process shown in FIG. 11 and the visualization process shown in FIG. 15 again (step S706).
By performing steps S701 to S706 as described above, the index value of the flagged index is recalculated using the association information 133 updated to come closer to the user's intuition (sense), and the evaluation result of that index is displayed based on the recalculated index value. The redisplayed evaluation result is thus closer to the user's intuition, and the living environment evaluation system 10 according to the second embodiment can increase the user's confidence in the results.
In the association information update process described above, the living environment evaluation system 10 (association information update unit 211) changes the association coefficients of the index by a predetermined amount (for example, 5%) according to the nature of the deviation input by the user (too high or too low); however, various schemes may be adopted for the amount of change in the association coefficients. For example, the user may be allowed to input not only the nature of the deviation (too high or too low) but also its degree, and the amount of change in the association coefficients may be varied according to that degree.
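Steps S703 to S705 can be sketched as a uniform scaling of the flagged index's coefficients; the 5% rate follows the example above, while the dictionary data shape is an assumption for illustration:

```python
def update_coefficients(coeffs, too_high, rate=0.05):
    """Steps S703-S705: scale every coefficient of the flagged index
    down (evaluation felt too high) or up (felt too low) by a fixed rate."""
    factor = 1 - rate if too_high else 1 + rate
    return {key: c * factor for key, c in coeffs.items()}
```

The variation mentioned above, in which the user also inputs a degree of deviation, would simply pass a different `rate` per request.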
(2-2) Past Information Display Process
FIG. 19 is a flowchart showing an example procedure of the past information display process. The past information display process is executed when the past information display unit 221 is started.
As shown in FIG. 19, the past information display unit 221 first acquires the past date input in the user's operation requesting the display of past information (step S801).
Next, the past information display unit 221 updates the satellite image 114, the map information 115, and the regional information 122 to the versions closest to the date acquired in step S801 (step S802). To make the update to past data in step S802 possible, the data underlying the satellite image 114, the data underlying the map information 115, and the regional information 122 should be timestamped, and past data covering a predetermined period should be kept on the disk 2 or the like.
Next, the past information display unit 221 re-executes the living environment evaluation using the data updated in step S802, in the processing order shown in FIG. 2 (step S803).
By performing steps S801 to S803 as described above, the evaluation result of the living environment at the time closest to the past date specified by the user is displayed on the living environment evaluation screen 300, so the user can examine chronological changes in the living environment evaluation over the same analysis range (differences between day and night, differences from ten years ago, and so on). The current evaluation result and a past evaluation result may also be displayed side by side. Thus, with the living environment evaluation system 10 according to the second embodiment, a user such as a local government official can quantitatively evaluate the characteristics of an area and obtain a basis for planning policies on matters such as the placement of public facilities and parks.
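The snapshot selection of step S802 can be sketched as choosing, among timestamped datasets, the one whose date is closest to the requested past date; the dictionary-of-snapshots shape is an assumption for illustration:

```python
from datetime import date

def nearest_snapshot(snapshots, target):
    """Step S802 in miniature: pick the timestamped snapshot closest to the
    requested past date. Applies equally to satellite images, map info,
    and regional info kept on disk for a predetermined period.

    snapshots: {date: payload}"""
    return min(snapshots, key=lambda d: abs((d - target).days))
```

The same lookup is performed once per data source before the evaluation is re-executed in step S803.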
Furthermore, since the living environment evaluation system 10 according to the second embodiment has the same configuration and functions as the living environment evaluation system 1 according to the first embodiment, it also provides the same effects as the first embodiment.
REFERENCE SIGNS LIST
1, 10 Living environment evaluation system
2 Disk
3 Network
4 Input/output device
11 High-resolution data analysis unit
12 Low-resolution data analysis unit
13 Index calculation unit
14 Visualization unit
15 Processor
16 Input/output device
17 Memory
18 Analysis instruction information
19 Feature information
20 Index value information
21 Correction unit
22 Analysis unit
111 Land feature extraction unit
112 Classification unit
113 Color identification information
114 Satellite image
115 Map information
121 Data mapping unit
122 Regional information
131 Data integration evaluation unit
132 User preference information
133 Association information
141 Map display unit
211 Association information update unit
221 Past information display unit
300 Living environment evaluation screen

Claims (9)

1. A living environment evaluation system that evaluates a living environment with respect to an index specified by a user, the system comprising:
user preference information that holds preference coefficients indicating the user's preferences for one or more living environment items that may affect the index;
analysis instruction information indicating an analysis range specified by the user as the range over which the living environment is to be evaluated;
a high-resolution data analysis unit that performs a wavelength-based analysis using an image corresponding to the analysis range, thereby analyzing the characteristics of the analysis range for each of a plurality of regions into which the analysis range is divided;
an index calculation unit that calculates, for each of the regions, an index value of the index in the analysis range based on the analysis result of the characteristics of the analysis range and the user preference information; and
a visualization unit that displays, for each of the regions, an evaluation result based on the index value calculated by the index calculation unit, superimposed on a map showing the analysis range.
  2.  The living environment evaluation system according to claim 1, further comprising:
     regional information that holds, for each of the indices, a feature amount of each area, in units of areas wider than the regions into which the analysis range is divided; and
     a low-resolution data analysis unit that analyzes, for each of the regions, the characteristics of the analysis range based on the feature amount indicated by the regional information for the area to which the region belongs, and adds the analysis result to the analysis result of the characteristics of the analysis range produced by the high-resolution data analysis unit.
  3.  The living environment evaluation system according to claim 2, wherein:
     the high-resolution data analysis unit calculates, as the analysis result of the characteristics of the analysis range and by analysis using the image, feature amounts of a plurality of first analysis items that may affect the characteristics of the analysis range, for each of the regions;
     the low-resolution data analysis unit calculates, as the analysis result of the characteristics of the analysis range and by analysis using the regional information, feature amounts of one or more second analysis items that may affect the characteristics of the analysis range, for each of the regions; and
     the index calculation unit
     has association information that holds, for each of the indices, weighting coefficients for calculating the feature amount of each of the living environment items from the feature amounts of the first and second analysis items,
     calculates, for each of the regions, the feature amount of each of the living environment items related to the index by applying the weighting coefficients and the preference coefficients to the feature amounts of the analysis items, and
     calculates, for each of the regions, the index value of the index in that region from the calculated feature amounts of the living environment items.
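The calculation recited in claim 3 (analysis-item feature amounts, scaled by per-index weighting coefficients and per-item preference coefficients, then aggregated into an index value) can be illustrated with a minimal sketch. This is not part of the claimed subject matter; the item names, matrix shapes, and the final summation are assumptions introduced only for illustration.

```python
import numpy as np

def index_value_per_region(analysis_features, weights, preferences):
    """Illustrative index calculation for one region.

    analysis_features: feature amounts of the first/second analysis
        items for this region, shape (A,)
    weights: association-information weighting coefficients mapping
        the A analysis items to E living environment items, shape (E, A)
    preferences: user preference coefficients for the E living
        environment items, shape (E,)
    """
    # Feature amount of each living environment item: weighted sum of
    # the analysis-item feature amounts, scaled by the user's preference.
    item_features = (weights @ analysis_features) * preferences
    # Index value of the region: here assumed to be the simple sum.
    return float(item_features.sum())

# Hypothetical example: 3 analysis items, 2 living environment items.
features = np.array([0.8, 0.2, 0.5])   # per-region analysis results
weights = np.array([[0.7, 0.3, 0.0],   # "quietness" item
                    [0.0, 0.5, 0.5]])  # "convenience" item
prefs = np.array([1.5, 0.5])           # this user favors quietness
print(index_value_per_region(features, weights, prefs))
```

Because the preference coefficients enter per living environment item, two users with the same analysis results can receive different index values for the same region, which is the behavior the claim describes.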
  4.  The living environment evaluation system according to claim 3, wherein the high-resolution data analysis unit
     holds map information that indicates a land use for any coordinate within the region, and,
     in the analysis using the image, performs wavelength analysis on the image corresponding to the region on a pixel-by-pixel basis, increases the final analysis value of the wavelength analysis of a pixel when the result of the wavelength analysis matches the use indicated by the map information, and calculates the feature amount of the first analysis item in the region based on the final analysis value of each pixel.
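The per-pixel procedure of claim 4 can be sketched as follows. The color test, the "park" land-use label, and the boost amount are hypothetical stand-ins for the wavelength analysis and map information of the specification.

```python
# Sketch of claim 4: each pixel is classified by a crude color test
# (standing in for wavelength analysis), and its analysis value is
# boosted when the classification matches the map-information land use.

def pixel_score(rgb, map_use, boost=0.5):
    r, g, b = rgb
    looks_green = g > r and g > b            # crude color/wavelength test
    score = 1.0 if looks_green else 0.0
    if looks_green and map_use == "park":    # map information agrees
        score += boost                       # raise the final analysis value
    return score

def greenery_feature(pixels, uses):
    """Feature amount of a hypothetical 'greenery' first analysis item
    for one region: mean of the final per-pixel analysis values."""
    scores = [pixel_score(p, u) for p, u in zip(pixels, uses)]
    return sum(scores) / len(scores)

pixels = [(30, 120, 40), (200, 180, 170), (20, 90, 35)]
uses = ["park", "road", "park"]
print(greenery_feature(pixels, uses))
```

The boost means that a green-looking pixel sitting on land the map records as a park counts for more than an equally green pixel elsewhere, which is how the claim combines image evidence with map evidence.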
  5.  The living environment evaluation system according to claim 3, further comprising:
     a user interface that allows the user to input the index and the content of the deviation when the evaluation result displayed by the visualization unit for the index specified by the user deviates from the user's own impression; and
     a correction unit that acquires the index and the content of the deviation input through the user interface and corrects the value of the weighting coefficient for the index held in the association information in accordance with the content of the deviation.
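One way the correction unit of claim 5 could act on the association information is sketched below. The data layout, the signed deviation encoding, and the proportional update rule are assumptions for illustration only; the specification does not prescribe a particular update formula.

```python
# Sketch of the correction unit: when the user reports that the result
# for an index deviated from their impression (e.g. "greenery rated too
# high"), the weighting coefficient of the offending analysis item for
# that index is nudged in the opposite direction.

association = {   # association information: index -> per-item weights
    "greenery": {"vegetation_pixels": 0.8, "park_area": 0.2},
}

def correct_weights(index, item, deviation, rate=0.1):
    """deviation > 0: user felt the result was too high, so shrink the
    weight; deviation < 0: too low, so grow it. Kept non-negative."""
    w = association[index][item]
    association[index][item] = max(0.0, w - rate * deviation)

correct_weights("greenery", "vegetation_pixels", deviation=+1.0)
print(association["greenery"]["vegetation_pixels"])
```

Because the correction rewrites the stored association information, later runs of the index calculation unit for the same index reflect the user's feedback without retraining anything else.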
  6.  The living environment evaluation system according to claim 1, further comprising:
     a storage unit that stores, in time-series order, the images and the maps that include the analysis range; and
     a past information display unit that, when a past point in time is designated by the user, causes the high-resolution data analysis unit, the index calculation unit, and the visualization unit to re-execute their processing using the image and the map stored in the storage unit that are closest to the designated point in time.
  7.  The living environment evaluation system according to claim 1, wherein the wavelength analysis using the image by the high-resolution data analysis unit is image analysis that identifies colors.
  8.  The living environment evaluation system according to claim 1, wherein the image is a satellite image or an aerial image.
  9.  A living environment evaluation method performed by a living environment evaluation system that evaluates a living environment with respect to an index specified by a user, wherein
     the living environment evaluation system has:
     user preference information that holds preference coefficients indicating the user's preferences for one or more living environment items that may affect the index; and
     analysis instruction information indicating an analysis range designated by the user as the range over which the living environment is to be evaluated,
     the method comprising:
     a high-resolution data analysis step in which the living environment evaluation system performs wavelength-based analysis using an image corresponding to the analysis range, thereby analyzing the characteristics of the analysis range for each of a plurality of regions into which the analysis range is divided;
     an index calculation step in which the living environment evaluation system calculates, for each of the regions, an index value of the index in the analysis range based on the analysis result of the characteristics of the analysis range obtained in the high-resolution data analysis step and the user preference information; and
     a visualization step in which the living environment evaluation system displays, for each of the regions, an evaluation result based on the index value calculated in the index calculation step, superimposed on a map showing the analysis range.
PCT/JP2023/028695 2022-09-29 2023-08-07 Living environment evaluation system and living environment evaluation method WO2024070238A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2022-156892 2022-09-29
JP2022156892A JP2024050195A (en) 2022-09-29 2022-09-29 Living environment evaluation system and living environment evaluation method

Publications (1)

Publication Number Publication Date
WO2024070238A1 true WO2024070238A1 (en) 2024-04-04

Family

ID=90477156

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2023/028695 WO2024070238A1 (en) 2022-09-29 2023-08-07 Living environment evaluation system and living environment evaluation method

Country Status (2)

Country Link
JP (1) JP2024050195A (en)
WO (1) WO2024070238A1 (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003196380A (en) * 2001-12-28 2003-07-11 Fujitsu Fip Corp Method, device, and program for providing life environmental information, and recording medium
JP2004020292A (en) * 2002-06-14 2004-01-22 Kenwood Corp Map information updating system
JP2004294361A (en) * 2003-03-28 2004-10-21 Hitachi Ltd Multi-spectral pick-up image analyzer
JP2017091422A (en) * 2015-11-16 2017-05-25 大和ハウス工業株式会社 Living environment evaluation device, living environment evaluation system, living environment evaluation method and program
WO2022114082A1 (en) * 2020-11-25 2022-06-02 プロパー ピーティーイー リミテッド Information processing system, information processing method, and program

Also Published As

Publication number Publication date
JP2024050195A (en) 2024-04-10

Similar Documents

Publication Publication Date Title
Goodman et al. GeoQuery: Integrating HPC systems and public web-based geospatial data tools
Aubrecht et al. Integrating earth observation and GIScience for high resolution spatial and functional modeling of urban land use
Linard et al. Assessing the use of global land cover data for guiding large area population distribution modelling
US8294710B2 (en) Extensible map with pluggable modes
Surabuddin Mondal et al. Modeling of spatio-temporal dynamics of land use and land cover in a part of Brahmaputra River basin using Geoinformatic techniques
Deng et al. Improving the housing-unit method for small-area population estimation using remote-sensing and GIS information
Ge et al. Vectorial boundary-based sub-pixel mapping method for remote-sensing imagery
Kim et al. Pycnophylactic interpolation revisited: integration with the dasymetric-mapping method
US8341156B1 (en) System and method for identifying erroneous business listings
Li et al. HomeSeeker: A visual analytics system of real estate data
Najafi et al. A user-centred virtual city information model for inclusive community design: State-of-art
Ghaemi et al. Design and implementation of a web-based platform to support interactive environmental planning
Wendel et al. Development of a Web-browser based interface for 3D data—A case study of a plug-in free approach for visualizing energy modelling results
Shahid et al. Towards progressive geospatial information processing on web systems: a case study for watershed analysis in Iowa
WO2024070238A1 (en) Living environment evaluation system and living environment evaluation method
JP7422682B2 (en) Improved geographic indexing
Su Spatial continuity and self-similarity in super-resolution mapping: Self-similar pixel swapping
Lidouh et al. GAIA map: a tool for visual ranking analysis in spatial multicriteria problems
US20170083183A1 (en) Multiple resolution non-linear terrain mapping system
Herbreteau et al. GeoHealth and QuickOSM, two QGIS plugins for health applications
Mao et al. Population spatialization at building scale based on residential population index—A case study of Qingdao city
Li et al. Gridded datasets for Japan: total, male, and female populations from 2001–2020
Toomanian et al. Automatic symbolisation methods for geoportals
Colston et al. Using Geospatial Analysis to Inform Decision Making In Targeting Health Facility-Based Programs
Bahri et al. Development of gis database and facility management system: asset and space in ukm

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23871480

Country of ref document: EP

Kind code of ref document: A1