WO2007052458A1 - Information Display Device - Google Patents
Information Display Device
- Publication number
- WO2007052458A1 (PCT/JP2006/320414)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- map
- photographed image
- displayed
- information display
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/26—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
- G01C21/34—Route searching; Route guidance
- G01C21/36—Input/output arrangements for on-board computers
- G01C21/3626—Details of the output of route guidance instructions
- G01C21/3647—Guidance involving output of stored or live camera images or video streams
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/26—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
- G01C21/34—Route searching; Route guidance
- G01C21/36—Input/output arrangements for on-board computers
- G01C21/3602—Input other than that of destination using image analysis, e.g. detection of road signs, lanes, buildings, real preceding vehicles using a camera
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/26—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
- G01C21/34—Route searching; Route guidance
- G01C21/36—Input/output arrangements for on-board computers
- G01C21/3667—Display of a road map
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/26—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
- G01C21/34—Route searching; Route guidance
- G01C21/36—Input/output arrangements for on-board computers
- G01C21/3679—Retrieval, searching and output of POI information, e.g. hotels, restaurants, shops, filling stations, parking facilities
Definitions
- The present invention relates to an information display device that presents geographical information to a user by displaying the correspondence between a photographed image and a map image.
- The search-type landscape labeling apparatus acquires shooting condition information such as the camera position, angle, focal length, and image size. Based on the acquired shooting condition information, it creates, in a three-dimensional map space on a computer, a CG image as seen from the position, angle, and focal length used when the real scene was photographed. Structures in the landscape image are then matched with structures in the CG image, and geographical information is attached to the structures in the landscape image.
- Patent Document 1: Japanese Patent Application Laid-Open No. 11-66350
- The object of the present invention is to provide an information display device that presents to the user the correspondence between a position in a real landscape and a map image.
- To achieve this object, the present invention provides: photographed image acquiring means for acquiring a photographed image; map information storing means for storing map information; means for generating a map image based on the map information; image display means for displaying the photographed image and the map image; input means for specifying, based on a user operation, at least one structure displayed on either the photographed image or the map image; and structure correspondence acquiring means for acquiring the correspondence between structures displayed on the photographed image and structures displayed on the map image.
- When the user specifies a structure on one of the images, the image display means highlights the corresponding structure on the other image.
- The user can easily understand the correspondence between the structure on the photographed image obtained by imaging the real landscape and the structure on the map image, that is, the correspondence between the real landscape and the map.
- FIG. 1 is a block diagram showing a configuration of an information display apparatus according to a first embodiment of the present invention.
- FIG. 2 is an explanatory view showing an example of a map image and a photographed image displayed on an image display unit in the information display device shown in FIG.
- FIG. 3 is an explanatory view showing how the corresponding structure is highlighted on the map image IM when a structure displayed on the photographed image is specified in the information display device shown in FIG. 1.
- FIG. 4 is a flowchart showing the operation of highlighting the corresponding structure on the map image IM in accordance with the designation of a structure displayed on the photographed image in the information display device shown in FIG. 1.
- FIG. 5 is an explanatory view of the operation in the first structure specifying routine shown in FIG.
- FIG. 6 is a flowchart showing the highlighting operation of a structure in the first modification of the information display device shown in FIG. 1.
- FIG. 7 is a flowchart showing the highlighting operation of a structure in the second modification of the information display device shown in FIG. 1.
- FIG. 8 is an explanatory view showing an example of a map image and a photographed image displayed on the image display unit in the third modified example of the information display device shown in FIG. 1.
- FIG. 9 is an explanatory view showing how the corresponding structure is highlighted on the photographed image when a structure displayed on the map image is designated in the information display device of the second embodiment of the present invention.
- FIG. 10 is a flowchart showing a highlighting operation of a structure in the information display apparatus according to the second embodiment of the present invention.
- FIG. 11 is an explanatory view showing, in a first modified example of the information display device according to the second embodiment of the present invention, how the corresponding structure is highlighted on the photographed image when a structure displayed on the map image is designated.
- FIG. 12 is a flowchart showing a highlighting operation of a structure in the information display apparatus according to the third embodiment of the present invention.
- FIG. 13 is a flowchart showing the highlighting operation of the structure in the information display device according to the fourth embodiment of the present invention.
- The information display device assists the user in easily understanding the relationship between the actual landscape and the map by presenting the correspondence between an arbitrary part designated on a photographed image actually captured by a camera or the like and the corresponding part on a map image prepared in advance.
- The arbitrary part to be designated may be a structure such as a building or a road, or the position of such a part.
- As shown in FIG. 1, the information display apparatus IDA includes a photographed image acquisition unit 1, a position information acquisition unit 2, a camera attribute information acquisition unit 3, a map information storage unit 4, a structure correspondence acquisition unit 5, an input unit 6, an image display unit 7, and a control unit 8. The photographed image acquisition unit 1 includes a camera that captures a real scene and generates photographed image data DIc.
- The captured image may be a still image or a moving image.
- It is assumed that the camera is mounted so as to image the front of the car.
- The actual scenery image photographed by the camera is output as photographed image data DIc.
- The landscape image represented by the photographed image data DIc is referred to as a photographed image IC.
- The position information acquisition unit 2 is, for example, a GPS receiver that acquires the camera position when the photographed image acquisition unit 1 acquires a photographed image.
- The camera position is the position of the camera mounted on a car that travels or stops on a road or the like, and is output as camera position information IPc. Note that the positional information obtained by the GPS receiver may be corrected based on the positional relationship between the GPS receiver and the camera, and used as the camera position information IPc.
- The camera attribute information acquisition unit 3 acquires camera attribute information IAc, which is a set of parameters for determining the imaging direction and imaging range of the actual landscape captured by the photographed image acquisition unit 1.
- Camera attribute information IAc includes camera angle (horizontal angle, elevation angle), focal length, and image size.
- The camera attribute information IAc may instead be expressed by other parameters, such as the angle of view, as long as the imaging direction and imaging range can be determined. The camera attribute information IAc may be acquired from values set in the camera, or from a three-dimensional compass attached to the camera.
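Since these parameterizations are interchangeable, the conversion can be made concrete with a short sketch. The following Python is illustrative only (the class and helper names are our assumptions, using the standard pinhole-camera relation), not part of the patent:

```python
import math
from dataclasses import dataclass

# Hypothetical container for the camera attribute information IAc described
# above: camera angle (horizontal angle, elevation), focal length, image size.
@dataclass
class CameraAttributes:
    horizontal_angle_deg: float  # yaw of the optical axis
    elevation_deg: float         # pitch of the optical axis
    focal_length: float          # same unit as image_width / image_height
    image_width: float
    image_height: float

    def horizontal_fov_deg(self) -> float:
        # Pinhole relation: fov = 2 * atan(w / (2 * f)).
        return math.degrees(2.0 * math.atan(self.image_width / (2.0 * self.focal_length)))

    @staticmethod
    def focal_length_from_fov(fov_deg: float, image_width: float) -> float:
        # Inverse relation, so an angle-of-view parameterization can be
        # converted back to the focal-length form used in this description.
        return image_width / (2.0 * math.tan(math.radians(fov_deg) / 2.0))
```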
- The map information storage unit 4 stores, in a medium such as an HDD or a DVD, geographical information IG prepared in advance that represents, for example, the positions and names of roads and facilities.
- The geographical information IG also includes three-dimensional map data DMt, which contains height information of structures.
- The structure correspondence acquisition unit 5 receives the camera position information IPc, the camera attribute information IAc, the geographical information IG, and the user instruction information IIu, which are input through the control unit 8, and associates structures in the photographed image with structures in the map image to generate structure correspondence information ICb.
- The operation of the structure correspondence acquisition unit 5 will be described in detail later.
- The input unit 6 includes, for example, a touch panel or a remote control that receives the user's operations.
- The input unit 6 generates user instruction information IIu representing the user's input instruction based on the received user operation.
- The image display unit 7 is, for example, a display that displays the map image IM and the photographed image IC based on the image data DI supplied via the control unit 8.
- The map image IM displayed by the image display unit 7 may be a 2D (two-dimensional) map or a 3D (three-dimensional) map.
- The input unit 6 described above is configured as a touch panel provided on the display surface of the image display unit 7.
- The screen of the image display unit 7 is preferably divided into two areas, in which the map image IM and the photographed image IC are respectively displayed.
- The photographed image IC is the real landscape ahead in the traveling direction, photographed by the photographed image acquisition unit 1 of the information display device IDA mounted on the car in which the user rides. In the lower left part of the photographed image IC, a T-junction with a road Rc extending to the left with respect to the traveling direction can be seen.
- In the map image IM, the symbol C indicates the vehicle equipped with the information display device IDA, and the symbol Rg indicates the road corresponding to the road Rc in the photographed image IC.
- The map image IM also displays structures and the like that are hidden behind buildings and other foreground objects in the photographed image IC.
- The control unit 8 controls the overall operation of the information display device IDA based on the photographed image data DIc, the camera position information IPc, the camera attribute information IAc, the geographical information IG, the structure correspondence information ICb, and the user instruction information IIu exchanged with the photographed image acquisition unit 1, the position information acquisition unit 2, the camera attribute information acquisition unit 3, the map information storage unit 4, the structure correspondence acquisition unit 5, and the input unit 6.
- The control unit 8 includes, for example, a CPU.
- As shown in FIG. 3, when the user touches the road Rc on the photographed image IC, the corresponding part in the map image IM is selected as a target to be highlighted. Then, according to the selection, the road Rg on the map image IM is indicated by blinking or the like.
- In FIG. 3, the road Rg is shown in black for convenience of drawing.
- That is, when the user touches the image of the road Rc on the photographed image IC, the structure correspondence acquisition unit 5 identifies the structure (road Rc) and highlights the corresponding road Rg on the map image IM.
- As a result, the user can easily recognize the correspondence between a structure in the photographed image IC and a structure in the map image IM (that is, which structure on the map a structure shown in the photographed image corresponds to).
- The highlighting process starts when the user selects an arbitrary part on the photographed image IC by touching it.
- Although a touch panel is used as the input unit 6 here, any means, such as a remote control, may be used as long as the user can specify an arbitrary position on the photographed image IC.
- In step S2, user instruction information IIu is output from the input unit 6 to the control unit 8 in response to the user's touch of a specific part on the photographed image IC on the touch panel (input unit 6). Then, control proceeds to the next step S4.
- In step S4, based on the user instruction information IIu, the camera attribute information IAc (camera angle, focal length, and image size), and the camera position information IPc input from the control unit 8, the structure correspondence acquisition unit 5 determines, in the three-dimensional map space defined by the geographical information IG, the camera position and the direction corresponding to the point designated by the user (hereinafter referred to as the "pointing direction vector"). Then, control proceeds to the next step S6.
- In step S6, the structure correspondence acquisition unit 5 specifies the structure indicated by the user on the photographed image IC and generates the structure correspondence information ICb. Steps S4 and S6 described above constitute the first structure identification routine #10A. Then, control proceeds to the next step S8.
- In step S8, the control unit 8 generates, based on the structure correspondence information ICb, image data DI for highlighting the portion representing the specified structure on the map image IM.
- In step S10, based on the image data DI, the image display unit 7 blinks the portion on the map image IM corresponding to the structure specified by the user on the photographed image IC. Then, control ends.
- The highlighting is not limited to blinking; any method that draws the user's attention and lets the user recognize the structure corresponding to the one specified on the photographed image IC may be used, such as changing the display color or brightness, enhancing the outline, superimposing the name of the structure, inversion, coloring, or increasing or decreasing the illuminance.
- In FIG. 5, the symbol Q indicates the real structure (a road in this example) designated by the user, the symbol Sc indicates the camera screen of the photographed image acquisition unit 1, and the symbol E indicates the viewpoint, that is, the camera position. In the 3D map space shown in the figure, a point F is found by advancing from the viewpoint E by the focal distance f in the camera angle direction, and a plane corresponding to the image size (the camera screen Sc) is placed there, perpendicular to the reference vector V connecting the viewpoint E and the point F. The point designated by the user on the photographed image IC corresponds to a point on the camera screen Sc, and the half line from the viewpoint E through that point gives the pointing direction vector; the structure in the 3D map space that this half line first reaches is identified as the designated structure.
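The geometry of FIG. 5 can be sketched as follows. This is a minimal illustration under a pinhole-camera assumption; the helper names and the z-up coordinate convention are ours, not the patent's:

```python
import numpy as np

def camera_screen_point(E, yaw_deg, pitch_deg, f, px, py):
    """Return the 3D point on the camera screen Sc for image position (px, py).

    E         : viewpoint (camera position) in the 3D map space
    yaw/pitch : camera angle (horizontal angle, elevation)
    f         : focal distance, in the same unit as the image size
    (px, py)  : designated position, measured from the image centre
    """
    yaw, pitch = np.radians(yaw_deg), np.radians(pitch_deg)
    # Reference vector V: unit vector from the viewpoint E toward the point F.
    v = np.array([np.cos(pitch) * np.cos(yaw),
                  np.cos(pitch) * np.sin(yaw),
                  np.sin(pitch)])
    F = np.asarray(E, dtype=float) + f * v     # point F at focal distance f
    # Axes spanning the screen plane, perpendicular to the reference vector V.
    right = np.array([-np.sin(yaw), np.cos(yaw), 0.0])
    up = np.cross(v, right)
    return F + px * right + py * up            # point on the camera screen Sc

def pointing_direction(E, screen_point):
    # The pointing direction vector: the half line from the viewpoint E
    # through the designated point on Sc.  Intersecting this ray with the
    # structures in the 3D map space (step S6) identifies the structure
    # that the user designated.
    d = np.asarray(screen_point, dtype=float) - np.asarray(E, dtype=float)
    return d / np.linalg.norm(d)
```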
- Next, a first modification of the information display device will be described with reference to FIG. 6. In this modification, steps S4 and S6 in the flowchart shown in FIG. 4 are replaced with steps S3, S5, S7, and S16; these four steps constitute the second structure identification routine #10B.
- The second structure identification routine #10B is performed as follows. In step S3, the three-dimensional space bounded by the half lines connecting the viewpoint E to the four corners of the camera screen shown in FIG. 5 is set as the visual field space. Although this space theoretically extends from the viewpoint to infinity, it may be cut off at an appropriate distance from the viewpoint. Control then proceeds to the next step S5.
- In step S5, the structures present in the visual field space are determined in the three-dimensional map space.
- In step S7, processing for projecting the obtained structures onto the camera screen Sc shown in FIG. 5 is performed.
- In the projection processing, hidden surface processing is performed in consideration of the overlap of structures as viewed from the viewpoint E, classifying the parts visible from the viewpoint E and the parts blocked by other structures. Well-known techniques such as the Z-buffer method, the scan line method, and ray tracing can be used for the hidden surface processing. Note that the projection processing need not be performed on the entire visual field space; it may be performed only in the vicinity of the position designated by the user. Control then proceeds to the next step S16.
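As an illustration of the hidden surface step, the following sketch classifies visibility with a simple Z-buffer, one of the well-known techniques named above. The point-sampled representation of structures and all names are assumptions made for the illustration:

```python
import numpy as np

def z_buffer_projection(points_by_structure, project, width, height):
    """Classify which structure is visible at each camera-screen pixel.

    points_by_structure : {structure_id: iterable of 3D points sampled from
                           the structure's surfaces}
    project             : maps a 3D map point to (pixel_x, pixel_y, depth)
    Returns an id map: visible[y, x] is the structure seen at that pixel,
    or -1 where none is (this id map is the basis of the CG image).
    """
    depth = np.full((height, width), np.inf)
    visible = np.full((height, width), -1, dtype=int)
    for sid, points in points_by_structure.items():
        for p in points:
            x, y, z = project(p)
            xi, yi = int(round(x)), int(round(y))
            if 0 <= xi < width and 0 <= yi < height and z < depth[yi, xi]:
                depth[yi, xi] = z        # a nearer surface hides farther ones
                visible[yi, xi] = sid    # structure visible at this pixel
    return visible
```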
- In step S16, matching processing is performed to obtain the correspondence between the structures displayed in the photographed image IC and the structures on the image obtained by projecting the structures in the visual field space onto the camera screen Sc (hereinafter referred to as the "CG image").
- In the matching processing, region division processing is first performed on each of the photographed image IC and the CG image, dividing each image into regions corresponding to individual structures.
- The segmentation of the photographed image IC can be performed using known image processing technologies such as edge detection and labeling processing.
- Since the CG image is created based on the geographical information IG, that is, the three-dimensional map data stored in the map information storage unit 4, it is already known which structure each region of the CG image corresponds to. Therefore, by matching the structures in the photographed image IC with the structures in the CG image, the structures in the photographed image IC can be identified.
- The structure designated by the user is then specified by determining to which region the position designated on the photographed image IC belongs, and the structure correspondence information ICb is generated.
- In the above description, region division processing is performed on both the photographed image IC and the CG image; however, to reduce the processing load, the result of the region division performed on the CG image may be applied as-is to the photographed image IC.
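A matching step in the spirit of step S16 might then look like the sketch below. It assumes the photographed image IC has already been segmented (for example by edge detection and labeling) into an integer label map, and that the CG side comes from a projection such as the Z-buffer sketch above; the function names are ours:

```python
import numpy as np

def match_regions(photo_labels, cg_structure_ids):
    """Associate each photo region with the CG structure it overlaps most.

    photo_labels     : int array (H, W) from segmenting the photographed image IC
    cg_structure_ids : int array (H, W) of per-pixel structure ids in the CG
                       image, -1 where no structure is visible
    Returns {photo_region_label: structure_id}, i.e. the content of the
    structure correspondence information ICb for this frame.
    """
    correspondence = {}
    for region in np.unique(photo_labels):
        mask = photo_labels == region
        ids, counts = np.unique(cg_structure_ids[mask], return_counts=True)
        keep = ids != -1                    # ignore pixels showing no structure
        if keep.any():
            correspondence[int(region)] = int(ids[keep][np.argmax(counts[keep])])
    return correspondence

def structure_at(correspondence, photo_labels, x, y):
    # The outcome of steps S2/S16: which map structure did the user touch?
    return correspondence.get(int(photo_labels[y, x]))
```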
- In the second modification, shown in FIG. 7, step S2 in the flowchart shown in FIG. 6 is moved to immediately before step S10, step S8 is deleted, and a step S17 for superimposing an auxiliary display on the photographed image is added.
- In this modification, the second structure identification routine #10B is executed before the user specifies an object on the photographed image IC (step S2). At the end of the routine, the structures that can be specified on the photographed image IC have been identified. Control then proceeds to step S17.
- In step S17, an auxiliary display, such as an emphasized outline, is superimposed on each identified structure on the photographed image IC to show that it can be specified.
- This makes it easy for the user to perform the operation of specifying a structure on the photographed image IC. Since the regions where structures exist are already identified before the user makes a selection, the user is not limited to specifying a point on the screen as in the above embodiment; for example, a number may be assigned to each non-contiguous region, and the user may designate that number or select a region with a remote control or the like, as in the sketch below. Control then proceeds to the next step S2.
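The numbered-region selection mentioned above could be realized roughly as follows (an illustrative sketch; the numbering scheme and names are assumptions):

```python
def assign_selection_numbers(correspondence):
    """Give each selectable photo region a small number the user can pick.

    correspondence : {photo_region_label: structure_id}, as produced by the
                     matching sketch above.
    Returns {number: photo_region_label}, so that a digit entered on a
    remote control can be resolved back to a structure.
    """
    return {n: region
            for n, region in enumerate(sorted(correspondence), start=1)}

# e.g. the display overlays "1", "2", ... on the outlined regions, and the
# digit the user enters selects number_to_region[digit].
```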
- In step S2, when the user designates a structure on the photographed image IC, the correspondence between the structures has already been obtained, so the structure correspondence information generation step S8 is not required; in the following step S10, the corresponding structure on the map image IM is highlighted.
- For a structure that is partly or wholly concealed by another structure in the photographed image IC, an auxiliary indication may likewise be superimposed on the photographed image IC to show that the structure is present there.
- In this case, the same processing as in the second structure identification routine #10B (steps S3 to S16) is performed before the user designates a structure, and an auxiliary display such as a skeleton display is used to indicate on the photographed image IC where the corresponding structure exists.
- A button for specifying the corresponding structure may be displayed in association with the auxiliary display so that the user can select it.
- Obstacle detection means for detecting objects not stored in the three-dimensional map, such as a forward vehicle, may also be provided. The direction, distance, and shape of the obstacle are detected by image recognition or other known obstacle detection means, and the result is projected onto the camera screen to determine the range occupied by the obstacle on the photographed image IC. Note that the obstacle detection may be performed using the image acquired by the photographed image acquisition unit 1; in this case, the process of projecting the detected obstacle onto the camera screen is unnecessary.
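One way the detected obstacle range might be used is sketched below (hypothetical names; the patent leaves the detection method and the exact geometry open):

```python
def obstacle_screen_range(corners_3d, project):
    """Project an obstacle's 3D outline points to a screen-space box.

    corners_3d : iterable of 3D points outlining the detected obstacle
    project    : the same 3D -> (x, y, depth) mapping used for structures
    """
    pts = [project(c)[:2] for c in corners_3d]
    xs, ys = zip(*pts)
    return min(xs), min(ys), max(xs), max(ys)

def needs_skeleton(structure_box, obstacle_box):
    # If the structure's on-screen region lies inside the obstacle's range,
    # the structure is hidden there and is drawn as a skeleton instead.
    sx0, sy0, sx1, sy1 = structure_box
    ox0, oy0, ox1, oy1 = obstacle_box
    return sx0 >= ox0 and sy0 >= oy0 and sx1 <= ox1 and sy1 <= oy1
```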
- Next, a third modified example of the information display device according to the first embodiment will be described.
- In this modified example, a structure that is concealed by an obstacle and cannot actually be seen is indicated on the photographed image IC by an auxiliary display such as a skeleton display, which shows the existence of the corresponding structure and makes it easy for the user to recognize. For this purpose, means for detecting obstacles are provided.
- As shown in FIG. 8, the map image IM and the photographed image IC are each displayed on the screen of the image display unit 7.
- In the photographed image IC, the building Bc, the structure in front of it, and the road ahead, which could originally be seen, cannot be displayed because they are blocked by a vehicle Vc traveling in front. Therefore, as shown by the dotted lines, the building Bc, the structure in front of it, and the road ahead are displayed as a skeleton within the vehicle Vc. As a result, the user can recognize the existence of the building blocked by the vehicle Vc.
- When the user selects such a skeleton-displayed structure on the photographed image IC, the corresponding building Bg on the map image IM is highlighted. In this way, even a structure that is concealed by an obstacle not stored in the map, such as a forward vehicle, can easily be selected by the user on the photographed image IC.
- An information display apparatus according to a second embodiment of the present invention will be described below with reference to FIGS. 9 and 10.
- In the first embodiment, the corresponding structure on the map image IM is highlighted in response to the designation of a structure on the photographed image IC displayed on the image display unit 7. In contrast, in the present embodiment, the corresponding structure on the photographed image IC is highlighted in response to the user designating a structure on the map image IM.
- The information display device IDAr according to this embodiment is basically configured in the same manner as the information display device IDA.
- The user, that is, the car equipped with the information display device IDAr, is basically in the same situation as shown in FIG. 2 (and FIG. 3) described above. Therefore, the map image IM and the photographed image IC displayed on the image display unit 7 are also the same as in FIG. 2.
- As shown in FIG. 9, when the user touches the part of the road Rg on the map image IM, the corresponding part in the photographed image IC is selected as a target to be highlighted. Then, according to the selection, the corresponding road Rc on the photographed image IC is indicated by blinking or the like.
- In FIG. 9, for convenience of drawing, the highlighting is represented by three diagonal lines on each of the upper and lower sides of the road Rc.
- That is, the structure correspondence acquisition unit 5 specifies the structure (road Rg) and highlights the corresponding road Rc on the photographed image IC.
- As a result, the user can easily recognize the correspondence between a structure in the map image IM and a structure in the photographed image IC (that is, which structure shown in the photographed image IC a structure on the map corresponds to).
- Next, the highlighting processing of the corresponding structure in the information display device IDAr will be described with reference to FIG. 10.
- The flowchart shown in FIG. 10 is the same as the flowchart shown in FIG. 6 described above, except that step S2, the second structure identification routine #10B, and steps S8 and S10 are replaced with step S102, the third structure identification routine #10C, and step S108, respectively.
- The third structure identification routine #10C is the same as the second structure identification routine #10B shown in FIG. 6, except that step S16 is replaced with step S116.
- In step S102, user instruction information IIu is output from the input unit 6 to the control unit 8 in response to the user touching a specific part on the map image IM on the touch panel (input unit 6). Then, control proceeds to the third structure identification routine #10C.
- In steps S3, S5, and S7, the same processing as described above is performed. In step S116, basically the same processing as in step S16 described above is performed; here, however, the matching processing performs the region association for the structure specified by the user on the map image IM, and it is determined where on the photographed image IC that structure is present.
- In this way, the structure correspondence acquisition unit 5 specifies the structure indicated by the user on the map image IM and generates the structure correspondence information ICb. Then, control proceeds to the next step S108.
- In step S108, the control unit 8 generates, based on the structure correspondence information ICb, image data DI for highlighting the portion representing the specified structure on the photographed image IC.
- Based on the image data DI, the image display unit 7 blinks the part on the photographed image IC corresponding to the structure specified by the user on the map image IM. Then, control ends.
- A first modified example of the information display device IDAr according to the second embodiment will be described with reference to FIG. 11.
- In this modified example, when a structure designated on the map image IM is concealed in the photographed image IC, an auxiliary display, for example a skeleton display, is superimposed on the photographed image IC.
- The user, that is, the car equipped with the information display device IDAr, is basically in the same situation as that shown in FIG. 9 described above. That is, in the photographed image IC displayed on the image display unit 7, the building Bc and the road Rc ahead, which could originally be seen, cannot be displayed because they are blocked by the vehicle Vc; as shown by the dotted line, the building Bc is displayed as a skeleton within the vehicle Vc.
- In this modified example, the information display device IDAr is provided with obstacle detection means for detecting an object that is not stored in the three-dimensional map, such as a forward vehicle.
- Therefore, even when the structure designated by the user on the map image IM is concealed by an obstacle in the photographed image IC, it is possible to indicate by a skeleton display or the like that the corresponding structure exists on the photographed image IC. That is, when the user touches the part of the building Bg on the map image IM, the structure correspondence acquisition unit 5 specifies where it is present on the photographed image IC.
- The structure correspondence acquisition unit 5 then detects that the position of the building Bg on the photographed image IC (the building Bc) is within the range of the obstacle (the forward vehicle Vc) detected by the obstacle detection means, and displays the building Bc as a skeleton on the photographed image IC.
- As a result, the user can grasp the position in the photographed image IC even of a structure that is hidden by an obstacle such as a forward vehicle and cannot be seen.
- Whether or not the structure specified by the user on the map image IM is hidden by another structure in the photographed image IC can be calculated by the projection processing (hidden surface processing) in step S7 of the flowchart shown in FIG. 10. As a result, the user can grasp the position in the photographed image IC even of a structure that is hidden from view by another structure.
- The above description concerned the case where a structure is concealed by an object not stored in the three-dimensional map, such as a forward vehicle; however, even when a structure is hidden by another structure that is stored in the three-dimensional map, the position of the hidden structure in the photographed image IC can be grasped by a similar skeleton display.
- An auxiliary display may also be superimposed on the map image IM to indicate the structures selectable by the user.
- In this case, the structures existing in the visual field space of the camera are determined in the three-dimensional map space, as in the processing in the third structure identification routine #10C described above.
- The projection processing and the matching processing are then performed to associate the structures in the photographed image IC with the structures in the map image IM.
- In this way, the structures that can be designated by the user on the map image IM are identified. By indicating on the map image IM that these structures can be specified, for example by emphasizing their outlines, the user can easily perform the operation of specifying a structure on the map image IM.
- Next, an information display apparatus IDArr (not shown) according to a third embodiment of the present invention will be described.
- In the embodiments described above, when the user designates a structure in either the photographed image IC or the map image IM, the corresponding structure in the other image is highlighted, so that the user can easily understand the correspondence between the structures on the photographed image IC, obtained by imaging the real landscape, and the structures on the map image IM. In the present embodiment, which is effective both when the user specifies structures in the photographed image IC and when the user specifies structures in the map image IM, the way structures are highlighted is changed according to the order in which the user specified them.
- The flowchart of this embodiment, shown in FIG. 12, is configured in the same manner as the flowchart shown in FIG. 10, except that step S102, the third structure identification routine #10C, and step S108 are changed to step S202, a fourth structure identification routine #10D, and step S208, respectively.
- The fourth structure identification routine #10D is configured in the same manner as the third structure identification routine #10C, except that step S116 is changed to step S216.
- In step S202, user instruction information IIu is output from the input unit 6 to the control unit 8 in response to the user touching specific parts on the photographed image IC on the touch panel (input unit 6). That is, when the user selects a plurality of structures displayed on the photographed image IC whose positions on the map image IM are to be confirmed, or a plurality of structures displayed on the map image IM whose positions on the photographed image IC are to be confirmed, user instruction information IIu is generated accordingly. Then, control proceeds to the fourth structure identification routine #10D.
- In step S216, the correspondence between the structures on the photographed image IC and the structures on the map image IM is obtained. The plurality of structures specified by the user on the photographed image IC are then located on the map image IM, and the structure correspondence information ICb is generated; alternatively, the plurality of structures designated by the user on the map image IM are located on the photographed image IC to generate the structure correspondence information ICb.
- The structure correspondence information ICb includes information on the order in which the user specified the plurality of structures. Then, control proceeds to the next step S208.
- In step S208, the image display unit 7 highlights the plurality of structures designated by the user, in the order in which they were designated, on the map image IM or the photographed image IC.
- For example, the display color is changed according to the specified order: red for the first specified structure, yellow for the second, and blue for the third.
- This makes it easy for the user to understand the correspondence between the structures on the photographed image IC and the structures on the map image IM.
- The highlighting method is not limited to association by color; character information, or changes in the size or shape of the highlighting, may also be used.
- The corresponding structures on the photographed image IC and the map image IM may also be displayed connected by lines or the like.
- The method of emphasizing a structure may also be changed according to its attribute. In this case too, especially when the user designates a plurality of structures, it is easy for the user to understand the correspondence between the structures on the photographed image IC and the structures on the map image IM. In addition, when a plurality of structures are specified, it is useful to include processing such as not highlighting structures beyond a certain number, or terminating the highlighting by user input, in order to prevent the visibility from deteriorating as the number of highlighted structures on the screen increases; see the sketch below.
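The order-dependent highlighting could be implemented along these lines (a sketch using the example colors and the cap suggested above; the function name is an assumption):

```python
ORDER_COLORS = ["red", "yellow", "blue"]   # 1st, 2nd, 3rd specified structure
MAX_HIGHLIGHTS = 3                         # cap to keep the screen readable

def highlight_styles(specified_ids):
    """Map structures to highlight colors in the order the user picked them.

    specified_ids : structure ids in the order recorded in ICb.
    Structures beyond MAX_HIGHLIGHTS are left unhighlighted, as suggested
    above, to preserve visibility.
    """
    return {sid: ORDER_COLORS[i % len(ORDER_COLORS)]
            for i, sid in enumerate(specified_ids[:MAX_HIGHLIGHTS])}

# e.g. highlight_styles([17, 42, 5]) -> {17: "red", 42: "yellow", 5: "blue"};
# applying the same style to a structure in both the photographed image IC
# and the map image IM lets the pair read as one.
```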
- The map display range may be set so that the structures present in the visual field space of the photographed image IC are displayed on the map image IM. For example, when the user designates a structure on the photographed image IC, the camera view space and the map display range may be misaligned so that the corresponding structure is not displayed on the map image IM, or the map display range may be so large that the structure cannot be displayed on the map image IM at an appropriate size; in such cases, the map is first set to an appropriate scale and display range, and the corresponding structure is then highlighted.
- Likewise, before the user specifies a structure on the map image IM, the map may be set to an appropriate scale and display range. This allows the user to easily specify, on the map screen, a structure within the camera view space, and also improves the visibility when the structure specified by the user on the photographed image IC is highlighted on the map screen.
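Such a map-range fit might be sketched as follows (assumed names; it simply bounds the view-space structures and pads the box so highlights are not clipped):

```python
def fit_map_range(structure_positions, margin=0.1):
    """Choose a map display range covering the view-space structures.

    structure_positions : [(x, y), ...] map coordinates of the structures
                          found in the camera's visual field space
    margin              : fractional padding around the bounding box
    Returns ((min_x, min_y), (max_x, max_y)) for the map image IM.
    """
    xs = [p[0] for p in structure_positions]
    ys = [p[1] for p in structure_positions]
    pad_x = (max(xs) - min(xs)) * margin or 1.0   # avoid a zero-size range
    pad_y = (max(ys) - min(ys)) * margin or 1.0
    return ((min(xs) - pad_x, min(ys) - pad_y),
            (max(xs) + pad_x, max(ys) + pad_y))
```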
- The input method may also differ depending on the attribute of the structure: for example, a road is specified by an operation that traces it, while a building is specified by a long press on it. Changing the input method according to the attribute of the structure can reduce the possibility of the user making an input error.
- When the photographed image IC and the map image IM are displayed on two screens, the corresponding structure in one image may be highlighted only while the user is designating the structure in the other image. For example, the corresponding structure in the map image IM is emphasized while the user keeps touching the structure in the photographed image IC, and the emphasis ends when the user stops touching it. Since this tightens the link between the user's operation and the display, it makes it easier for the user to understand the correspondence between the structures on the photographed image IC and the structures on the map image IM.
- In the fourth embodiment of the present invention, the imaging direction and imaging magnification of the camera are configured to be changeable.
- The flowchart shown in FIG. 13 is configured by adding steps S103 and S104 between step S102 and the third structure identification routine #10C in the flowchart shown in FIG. 10.
- When the user's designation of a structure on the map image IM is detected in step S102, the structure correspondence acquisition unit 5 determines in step S103 whether, in the three-dimensional map space, the structure specified by the user is within the visual field space of the photographed image IC.
- If it is not, in step S104 the imaging direction of the camera is changed so that the structure specified by the user falls within the visual field space.
- Control then proceeds to the third structure identification routine #10C, the processing described above is performed, and the selected structure is highlighted on the photographed image IC.
- In this way, the imaging direction of the camera is changed in accordance with the specification of a structure on the map. Alternatively, the user may first change the imaging direction of the camera to bring the structure into the camera's field of view and then specify it. The imaging magnification may also be changed instead of the imaging direction.
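The checks of steps S103 and S104 might be sketched as follows, in a two-dimensional simplification (hypothetical names; the patent does not prescribe the math):

```python
import math

def bearing_deg(camera_pos, target_pos):
    # Map-plane bearing from the camera to the structure, in degrees.
    dx, dy = target_pos[0] - camera_pos[0], target_pos[1] - camera_pos[1]
    return math.degrees(math.atan2(dy, dx))

def in_view(camera_pos, heading_deg, hfov_deg, target_pos):
    # Step S103 (2D simplification): is the designated structure inside the
    # horizontal field of view of the photographed image IC?
    off = (bearing_deg(camera_pos, target_pos) - heading_deg + 180) % 360 - 180
    return abs(off) <= hfov_deg / 2

def aim_camera(camera_pos, heading_deg, hfov_deg, target_pos):
    # Step S104: if the structure is outside the view space, turn the
    # camera so that it falls within it; otherwise leave it alone.
    if in_view(camera_pos, heading_deg, hfov_deg, target_pos):
        return heading_deg
    return bearing_deg(camera_pos, target_pos)
```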
- The image display unit 7 may display the photographed image IC and the map image IM on two screens of one display, or on two separate displays. The photographed image IC and the map image IM also need not be displayed at the same time. In that case, for example, only the map image IM is initially displayed, and when the user designates a structure, the screen is switched to the photographed image IC and the corresponding structure is highlighted; alternatively, after the user designates a structure, the display may be switched to a two-image display of the map image IM and the photographed image IC. The same applies when the photographed image IC is displayed first and a structure is specified from there.
- The camera is not limited to one attached to a car; it may be attached to a mobile body such as a mobile phone, a ship, or an airplane, or to a building or the like. The camera and the display may also be remote from each other.
- The information display device of the present invention is useful as a car navigation device installed in a vehicle, as an in-vehicle information terminal, and as an image display device such as a display. It can also be used as a navigation device for mobile phones carried by pedestrians and for mobile bodies such as ships and aircraft.
Landscapes
- Engineering & Computer Science (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Automation & Control Theory (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Multimedia (AREA)
- Instructional Devices (AREA)
- Navigation (AREA)
- Processing Or Creating Images (AREA)
- Traffic Control Systems (AREA)
- Image Processing (AREA)
- Information Retrieval, Db Structures And Fs Structures Therefor (AREA)
Priority Applications (3)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US12/090,724 US20090262145A1 (en) | 2005-11-01 | 2006-10-12 | Information display device |
| CN200680040831.5A CN101300459B (zh) | 2005-11-01 | 2006-10-12 | 信息显示装置 |
| EP06811702.7A EP1953500A4 (en) | 2005-11-01 | 2006-10-12 | INFORMATION DISPLAY DEVICE |
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2005318341A JP4246195B2 (ja) | 2005-11-01 | 2005-11-01 | カーナビゲーションシステム |
| JP2005-318341 | 2005-11-01 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2007052458A1 (ja) | 2007-05-10 |
Family
ID=38005611
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2006/320414 Ceased WO2007052458A1 (ja) | 2005-11-01 | 2006-10-12 | 情報表示装置 |
Country Status (5)
| Country | Link |
|---|---|
| US (1) | US20090262145A1 (en) |
| EP (1) | EP1953500A4 (en) |
| JP (1) | JP4246195B2 (ja) |
| CN (1) | CN101300459B (zh) |
| WO (1) | WO2007052458A1 (ja) |
Families Citing this family (31)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2011052960A (ja) * | 2007-12-28 | 2011-03-17 | Mitsubishi Electric Corp | ナビゲーション装置 |
| US9310992B2 (en) | 2008-08-22 | 2016-04-12 | Google Inc. | Panning in a three dimensional environment on a mobile device |
| US9204050B2 (en) * | 2008-12-25 | 2015-12-01 | Panasonic Intellectual Property Management Co., Ltd. | Information displaying apparatus and information displaying method |
| JP5281421B2 (ja) * | 2009-01-23 | 2013-09-04 | アルパイン株式会社 | 車載システム |
| JP5221580B2 (ja) * | 2010-03-12 | 2013-06-26 | 株式会社日立情報制御ソリューションズ | 画像表示システム、携帯情報端末及び画像表示プログラム |
| US20110279446A1 (en) * | 2010-05-16 | 2011-11-17 | Nokia Corporation | Method and apparatus for rendering a perspective view of objects and content related thereto for location-based services on mobile device |
| US9582166B2 (en) * | 2010-05-16 | 2017-02-28 | Nokia Technologies Oy | Method and apparatus for rendering user interface for location-based service having main view portion and preview portion |
| US8730319B2 (en) * | 2010-07-09 | 2014-05-20 | Kabushiki Kaisha Toshiba | Display device, image data generating device, image data generating program, and display method |
| US8277055B2 (en) * | 2010-07-21 | 2012-10-02 | Delphi Technologies, Inc. | Multiple view display system using a single projector and method of operating the same |
| JP2012048523A (ja) * | 2010-08-27 | 2012-03-08 | Toshiba Corp | 対応付け装置 |
| JP5652097B2 (ja) * | 2010-10-01 | 2015-01-14 | ソニー株式会社 | 画像処理装置、プログラム及び画像処理方法 |
| US9026359B2 (en) * | 2010-11-01 | 2015-05-05 | Nokia Corporation | Visually representing a three-dimensional environment |
| WO2012132257A1 (ja) | 2011-03-28 | 2012-10-04 | パナソニック株式会社 | 画像表示装置 |
| US8164599B1 (en) * | 2011-06-01 | 2012-04-24 | Google Inc. | Systems and methods for collecting and providing map images |
| CN103028252B (zh) * | 2011-09-29 | 2014-12-31 | 泉阳兴业株式会社 | 观览车 |
| JP6064544B2 (ja) * | 2012-11-27 | 2017-01-25 | ソニー株式会社 | 画像処理装置、画像処理方法、プログラム及び端末装置 |
| JP5745497B2 (ja) * | 2012-12-04 | 2015-07-08 | 任天堂株式会社 | 表示システム、表示制御装置、情報処理プログラム及び表示方法 |
| JP6047197B2 (ja) * | 2015-05-01 | 2016-12-21 | 任天堂株式会社 | 表示システム、表示制御装置、情報処理プログラム及び表示方法 |
| EP3848509A1 (en) * | 2015-07-21 | 2021-07-14 | Kabushiki Kaisha Toshiba | Crack analysis device, crack analysis method, and crack analysis program |
| WO2017119202A1 (ja) | 2016-01-06 | 2017-07-13 | 富士フイルム株式会社 | 構造物の部材特定装置及び方法 |
| CN106996785B (zh) * | 2016-01-25 | 2019-12-10 | 北京四维图新科技股份有限公司 | 一种对导航数据进行更新的方法及装置 |
| US10810443B2 (en) * | 2016-09-22 | 2020-10-20 | Apple Inc. | Vehicle video system |
| JP6460420B2 (ja) * | 2016-11-08 | 2019-01-30 | 本田技研工業株式会社 | 情報表示装置、情報表示方法、および情報表示プログラム |
| WO2019164514A1 (en) * | 2018-02-23 | 2019-08-29 | Google Llc | Transitioning between map view and augmented reality view |
| JP7048357B2 (ja) * | 2018-03-06 | 2022-04-05 | 東芝インフラシステムズ株式会社 | 撮影画像事前確認システム及び撮影画像事前確認方法 |
| JP2020042370A (ja) * | 2018-09-06 | 2020-03-19 | アイシン精機株式会社 | 表示制御装置 |
| JP7384014B2 (ja) * | 2019-12-06 | 2023-11-21 | トヨタ自動車株式会社 | 表示システム |
| CN112885087A (zh) * | 2021-01-22 | 2021-06-01 | 北京嘀嘀无限科技发展有限公司 | 确定路况信息的方法、装置、设备和介质和程序产品 |
| JP7707643B2 (ja) * | 2021-05-17 | 2025-07-15 | 富士通株式会社 | 同定プログラム、同定方法、および情報処理装置 |
| KR102605696B1 (ko) * | 2022-12-12 | 2023-11-30 | 한국도로공사 | 정밀도로지도기반의 cctv 카메라 자세 및 객체 좌표 추정 방법 및 시스템 |
| WO2025158899A1 (ja) * | 2024-01-25 | 2025-07-31 | Necソリューションイノベータ株式会社 | 座標算出装置、座標算出方法、及びコンピュータ読み取り可能な記録媒体 |
Family Cites Families (12)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US5602564A (en) * | 1991-11-14 | 1997-02-11 | Hitachi, Ltd. | Graphic data processing system |
| US6282362B1 (en) * | 1995-11-07 | 2001-08-28 | Trimble Navigation Limited | Geographical position/image digital recording and display system |
| JP3658659B2 (ja) * | 1995-11-15 | 2005-06-08 | カシオ計算機株式会社 | 画像処理装置 |
| US7116341B2 (en) * | 2002-04-25 | 2006-10-03 | Sony Corporation | Information presentation apparatus and method in three-dimensional virtual space and computer program therefor |
| US7827507B2 (en) * | 2002-05-03 | 2010-11-02 | Pixearth Corporation | System to navigate within images spatially referenced to a computed space |
| US7636901B2 (en) * | 2003-06-27 | 2009-12-22 | Cds Business Mapping, Llc | System for increasing accuracy of geocode data |
| US7155336B2 (en) * | 2004-03-24 | 2006-12-26 | A9.Com, Inc. | System and method for automatically collecting images of objects at geographic locations and displaying same in online directories |
| US7746376B2 (en) * | 2004-06-16 | 2010-06-29 | Felipe Mendoza | Method and apparatus for accessing multi-dimensional mapping and information |
| JP4696248B2 (ja) * | 2004-09-28 | 2011-06-08 | 国立大学法人 熊本大学 | 移動体ナビゲート情報表示方法および移動体ナビゲート情報表示装置 |
| JP4964762B2 (ja) * | 2005-03-02 | 2012-07-04 | 株式会社ナビタイムジャパン | 地図表示装置および地図表示方法 |
| US7676325B2 (en) * | 2005-03-15 | 2010-03-09 | Pioneer Corporation | Road landscape map producing apparatus, method and program |
| TWI358647B (en) * | 2007-12-28 | 2012-02-21 | Ind Tech Res Inst | Data classification system and method for building |
-
2005
- 2005-11-01 JP JP2005318341A patent/JP4246195B2/ja not_active Expired - Fee Related
-
2006
- 2006-10-12 US US12/090,724 patent/US20090262145A1/en not_active Abandoned
- 2006-10-12 WO PCT/JP2006/320414 patent/WO2007052458A1/ja not_active Ceased
- 2006-10-12 EP EP06811702.7A patent/EP1953500A4/en not_active Withdrawn
- 2006-10-12 CN CN200680040831.5A patent/CN101300459B/zh active Active
Patent Citations (11)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JPH1166350A (ja) | 1997-08-12 | 1999-03-09 | Nippon Telegr & Teleph Corp <Ntt> | 検索型景観ラベリング装置およびシステム |
| JPH11331831A (ja) * | 1998-05-15 | 1999-11-30 | Mitsubishi Electric Corp | 画像上の位置判読装置 |
| JP2002117494A (ja) * | 2000-10-11 | 2002-04-19 | Honda Motor Co Ltd | 周辺情報表示装置 |
| JP2002372427A (ja) * | 2001-06-15 | 2002-12-26 | Alpine Electronics Inc | ナビゲーション装置 |
| JP2003106853A (ja) * | 2001-09-28 | 2003-04-09 | Toshiba Corp | 運転支援装置 |
| JP2003132334A (ja) * | 2001-10-22 | 2003-05-09 | Asia Air Survey Co Ltd | イメージマップ配信システム |
| JP2003132068A (ja) * | 2001-10-22 | 2003-05-09 | Nec Corp | ナビゲーションシステム及びナビゲーション端末 |
| JP2003317116A (ja) * | 2002-04-25 | 2003-11-07 | Sony Corp | 3次元仮想空間における情報提示装置及び情報提示方法、並びにコンピュータ・プログラム |
| JP2004234457A (ja) * | 2003-01-31 | 2004-08-19 | Canon Inc | 情報処理装置、方法、プログラムおよび記憶媒体 |
| JP2005056075A (ja) * | 2003-08-01 | 2005-03-03 | Sony Corp | 地図表示システム、地図データ加工装置、地図表示装置及び地図表示方法 |
| JP2005069776A (ja) * | 2003-08-21 | 2005-03-17 | Denso Corp | 車両用表示方法、車両用表示装置 |
Cited By (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2012033095A1 (ja) * | 2010-09-06 | 2012-03-15 | 国立大学法人東京大学 | 車両システム |
| JP5804571B2 (ja) * | 2010-09-06 | 2015-11-04 | 国立大学法人 東京大学 | 車両システム |
| CN105675003A (zh) * | 2016-01-29 | 2016-06-15 | 广州华多网络科技有限公司 | 线路生成及线路分享、路点添加、线路导航方法及装置 |
Also Published As
| Publication number | Publication date |
|---|---|
| EP1953500A1 (en) | 2008-08-06 |
| EP1953500A4 (en) | 2014-06-11 |
| CN101300459A (zh) | 2008-11-05 |
| CN101300459B (zh) | 2011-09-07 |
| US20090262145A1 (en) | 2009-10-22 |
| JP2007127437A (ja) | 2007-05-24 |
| JP4246195B2 (ja) | 2009-04-02 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| WO2007052458A1 (ja) | 情報表示装置 | |
| US10217288B2 (en) | Method for representing points of interest in a view of a real environment on a mobile device and mobile device therefor | |
| JP5798392B2 (ja) | 駐車支援装置 | |
| US8773534B2 (en) | Image processing apparatus, medium recording image processing program, and image processing method | |
| EP2672459B1 (en) | Apparatus and method for providing augmented reality information using three dimension map | |
| US9269007B2 (en) | In-vehicle display apparatus and program product | |
| CN104359487B (zh) | 一种实景导航系统 | |
| US11511627B2 (en) | Display device and computer program | |
| WO2021197190A1 (zh) | 基于增强现实的信息显示方法、系统、装置及投影设备 | |
| US20130185673A1 (en) | Electronic Device, Displaying Method And File Saving Method | |
| JPWO2013161028A1 (ja) | 画像表示装置、画像表示方法、画像表示プログラム、及び記録媒体 | |
| JP2007080060A (ja) | 対象物特定装置 | |
| JPH0763572A (ja) | 車載用ナビゲーション装置の走行案内画像表示方法 | |
| WO2018134897A1 (ja) | 位置姿勢検出装置、ar表示装置、位置姿勢検出方法およびar表示方法 | |
| CN102141869B (zh) | 一种信息识别、提示方法及一种移动终端 | |
| JP2008128827A (ja) | ナビゲーション装置およびナビゲーション方法ならびにそのプログラム | |
| JP2017032436A (ja) | 移動案内システム、移動案内方法及びコンピュータプログラム | |
| JP2009250714A (ja) | 道路案内表示装置および道路案内表示方法 | |
| JP5353780B2 (ja) | 表示制御装置、方法およびプログラム | |
| JPH08115493A (ja) | ナビゲーション装置 | |
| JP5071033B2 (ja) | 経路案内装置及び経路案内方法 | |
| JP2008002965A (ja) | ナビゲーション装置およびその方法 | |
| JP5217860B2 (ja) | 死角画像表示装置および死角画像表示方法 | |
| JP2019064422A (ja) | ヘッドアップディスプレイ装置 | |
| JP2024122059A (ja) | 車両の地点指定方法および装置 |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | WWE | Wipo information: entry into national phase | Ref document number: 200680040831.5; Country of ref document: CN |
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | |
| | WWE | Wipo information: entry into national phase | Ref document number: 12090724; Country of ref document: US |
| | WWE | Wipo information: entry into national phase | Ref document number: 2006811702; Country of ref document: EP |
| | NENP | Non-entry into the national phase | Ref country code: DE |