WO2020003548A1 - Image display system and method
- Publication number
- WO2020003548A1 (application PCT/JP2018/031364)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- image display
- group
- images
- computer system
- Prior art date
Classifications
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N21/00—Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
- G01N21/84—Systems specially adapted for particular applications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T1/00—General purpose image data processing
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G5/00—Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G5/00—Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
- G09G5/36—Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators characterised by the display of a graphic pattern, e.g. using an all-points-addressable [APA] memory
- G09G5/38—Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators characterised by the display of a graphic pattern, e.g. using an all-points-addressable [APA] memory with means for controlling the display position
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
Definitions
- the present invention relates to a technique for processing and displaying an image group. Further, the present invention relates to a technique which is useful when applied to, for example, support for structural deterioration inspection.
- a user takes a group of images (moving images or a plurality of still images) using a camera and inputs them to a computer such as a PC.
- the user displays an image group processed by the computer and individual images selected from the image group on a display screen.
- the user confirms a group of images and individual images on the display screen, and performs an operation according to the application.
- the work includes, for example, a general photographic image editing work.
- a system or function for inspecting or diagnosing a state of deterioration of the surface of a structure from an image obtained by photographing the surface of the structure (in some cases, described as a structure deterioration inspection or the like).
- a user (a worker who performs maintenance and inspection work) captures the surface of a structure to be maintained and inspected with a camera, manually or using a drone or a robot.
- the structures include various buildings and infrastructure facilities.
- the user inputs a group of images captured by the camera to a computer, and displays a group of images processed by the computer on a display screen.
- the user selects an individual image to be inspected from the image group on the display screen and displays the selected image on the display screen.
- the user performs, for example, visual inspection work.
- the user looks at individual images on the display screen and visually confirms and detects the presence or absence and location of a state such as cracking, corrosion, or peeling (which may be collectively referred to as deterioration).
- a three-dimensional model of the structure is generated and acquired from images obtained by photographing the surface of the structure, based on a known process such as SFM (Structure from Motion).
- a system and a function for displaying the three-dimensional model on the display screen are also included.
- Patent Document 1 discloses a panoramic image synthesizing apparatus in which a digital camera captures a subject as a plurality of partially overlapping images and combines the captured images into a panoramic image. It describes a step of generating the panoramic image, a step of simultaneously displaying an entire preview and a partial preview of the panoramic image in the same window, and the like.
- Patent Document 2 discloses an inspection apparatus for a structure in which an inspection object is divided into a plurality of parts and photographed, and image data obtained by adding corresponding position information and imaging information to the plurality of captured images is stored. It describes that, when an inspection position on the inspection object is designated, image data is selected, and the inspection position together with a plurality of surrounding images is displayed as a connected image.
- An object of the present invention is to provide an image display technology with which, when selecting an image for a predetermined application from a group of images taken by a camera, a suitable image can be easily selected and the user's work is reduced.
- An image display system according to one aspect is an image display system configured by a computer system. The computer system inputs an image group including a plurality of images having different shooting dates and times, positions, and directions, and displays a list of the image group on a list screen. Based on a user operation, it displays a first image selected from the image group on an individual screen. For each candidate image spatially surrounding the first image, it determines an adjacent image with respect to the first image based on a determination of the spatial positional relationship in the set of the first image and the candidate image and a determination of the overlapping state of their shooting ranges. Based on the user's operation on the individual screen, the adjacent image to the first image is selected as a second image, and the second image is displayed as a new first image.
- According to the representative embodiment of the present invention, with respect to an image display technique, it is possible to easily select a suitable image when selecting an image for a predetermined use from a group of images taken by a camera.
- the work of the user can be reduced.
- FIG. 1 is a diagram showing a configuration of a structure deterioration inspection support system including an image display system according to an embodiment of the present invention.
- FIG. 2 is a diagram illustrating a configuration of cooperation between a drone and a computer system in the embodiment.
- FIG. 3 is a diagram illustrating a configuration of the image display system according to the embodiment.
- FIG. 4 is a diagram illustrating a processing flow of the image display system according to the embodiment.
- FIG. 5 is a diagram illustrating a first example of an aerial photographing method of a structure in the embodiment.
- FIG. 6 is a diagram illustrating a second example of an aerial photographing method of a structure in the embodiment.
- FIG. 7 is a diagram illustrating a shooting range in the embodiment.
- FIG. 8 is a diagram illustrating overlap between images in the aerial photographing traveling direction in the embodiment.
- FIG. 9 is a diagram illustrating overlap between images in an image plan view in the embodiment.
- FIG. 10 is a diagram illustrating an example of an adjacent image and an overlap in the embodiment.
- FIG. 11 is a diagram illustrating an overlapping state in a shooting range in the embodiment.
- FIG. 12 is a diagram illustrating an overlapping state in adjacent images in the embodiment.
- FIG. 13 is a diagram illustrating SFM in the embodiment.
- FIG. 14 is a diagram illustrating a first example of a list screen in the embodiment.
- FIG. 15 is a diagram illustrating a first example of an individual screen in the embodiment.
- FIG. 16 is a diagram illustrating a second example of an individual screen in the embodiment.
- FIG. 17 is a diagram illustrating a third example of an individual screen in the embodiment.
- FIG. 18 is a diagram illustrating a fourth example of an individual screen in the embodiment.
- FIG. 19 is a diagram illustrating a fifth example of an individual screen in the embodiment.
- FIG. 20 is a diagram illustrating a sixth example of an individual screen in the embodiment.
- FIG. 21 is a diagram illustrating an example of an individual screen (deterioration inspection screen) in the embodiment.
- FIG. 22 is a diagram illustrating an example of a structure three-dimensional model screen in the embodiment.
- FIG. 23 is a diagram illustrating a second example of a list screen in the embodiment.
- FIG. 24 is a diagram illustrating a third example of a list screen in the embodiment.
- FIG. 25 is a diagram illustrating an example of an image relationship between side surfaces of a structure in the embodiment.
- FIG. 26 is a diagram illustrating image selection between side surfaces of a structure in the embodiment.
- FIG. 27 is a diagram illustrating a third example of an aerial photographing method of a structure in the embodiment.
- the user brings back the data of the captured image group to an office or the like and inputs the data to a computer such as a PC.
- the user selects an individual image to be an inspection target image from a list of image groups on the display screen, and displays the image on the display screen.
- the user visually searches the image for the presence or absence and location of a deteriorated state such as a crack for each image.
- the use of a drone or the like can reduce photographing work and the like.
- the computer generates a three-dimensional model of the structure from the input image group based on the SFM processing.
- a three-dimensional model can be obtained and displayed on a display screen, and can be used for structure management and the like.
- the image group obtained by the above-mentioned manual operation or drone is an image group in which the surface area of the target object is comprehensively photographed, for example, and often has an overlapping area between the images.
- a partial area of the same portion in the image content overlaps.
- images are taken with a margin so that partial areas overlap and no areas are omitted from imaging.
- an image group having a sufficient overlap between images is required as a condition necessary for applying the known SFM processing. For example, 80% or more overlap is required in the horizontal direction of an image corresponding to the direction of camera movement, and 60% or more overlap is required in the vertical direction.
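As a rough numerical illustration of these overlap requirements (not part of the patent), the spacing between shots that yields an overlap ratio o for a shooting range of width W is W(1 − o). A minimal sketch:

```python
def shot_spacing(footprint_width_m, overlap):
    """Camera-to-camera spacing that yields the given overlap ratio.

    Two consecutive shooting ranges of width W spaced d apart share a
    strip of width W - d, so the overlap ratio is (W - d) / W and the
    required spacing is d = W * (1 - overlap).
    """
    return footprint_width_m * (1.0 - overlap)

# Example: a 10 m wide shooting range with 80% overlap in the travel
# direction needs a shot every 2 m; 60% overlap in the vertical
# direction needs a 4 m line pitch.
print(shot_spacing(10.0, 0.80))  # 2.0
print(shot_spacing(10.0, 0.60))  # 4.0
```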
- the user wants to visually inspect an area in one image and then visually inspect an area adjacent to the area of the image.
- it is difficult for the user to select an image of an adjacent area, because the image contents are so similar that differences between images are hard to recognize, and because the positional relationship between images is hard to grasp.
- the selected image is not necessarily an image of an adjacent region, and may be an image of a distant position.
- the user selects the next image from a certain image (first image) on the display screen by himself/herself determining the image content and the spatial positional relationship, for example, by judging that the same object appears in two images. Even in such a case, the next image often includes an area already inspected in the first image as an overlapping area. Overlapping areas occur between the images of the image group depending on the use or case, such as when the surface area of the structure is comprehensively and finely photographed or when a three-dimensional model is generated. Therefore, the overlapping area in the next image is included again as a target of the visual inspection, and the burden on the user increases. Since there are overlapping areas in each of a large number of images, work such as deterioration inspection becomes inefficient.
- in the conventional image display system, in the case of, for example, support for structural deterioration inspection or generation of a three-dimensional model, it is difficult and troublesome to select individual images from an image group. That is, the conventional system has a problem in terms of support of the user's work and efficiency.
- the image display system according to the embodiment allows the user to easily select a suitable image when selecting an individual image from a group of images on the display screen according to the application, and saves time and effort.
- the mechanism includes a graphical user interface (GUI).
- in the present image display system, when selecting the next target image from a certain image in the image group, there is a mechanism for navigating and linking to the next image in consideration of the overlapping state between the images.
- the present image display system automatically determines, for example, an image (adjacent image) with the minimum degree of overlap, and presents a link image that allows the image to be easily selected.
- the image display system receives, for example, data of a group of images captured so that the surface area of the structure is partially overlapped between the images.
- This image group may include a plurality of images having different shooting dates and times, positions, directions, and the like.
- the image display system displays a list of image groups on a GUI screen. When an individual image (first image) is selected from a screen by a user operation, the image display system displays the selected first image on the screen.
- the image display system determines, for each image (candidate image) spatially surrounding the first image, the spatial positional relationship between the images (the set of the first image and the candidate image), and determines the overlapping state of the photographing ranges between the images. In particular, the image display system calculates the overlapping state as an overlapping rate.
- the image display system determines a spatial positional relationship between images by using positional information of the images.
- the image display system determines an image such as an adjacent image to the first image based on the overlapping ratio, the positional relationship, and the like.
- the present image display system selects, as an adjacent image, the candidate image whose overlap ratio with the first image is the minimum at or below a predetermined overlap ratio threshold (a minimal sketch of this selection rule is given after this overview).
- the image display system presents an image such as an adjacent image as an image (next image) which is a candidate to be selected next as a target by the user.
- the present image display system displays, on the screen, a link image indicating the presence or positional relationship of the next image with respect to the first image.
- the present image display system displays the next image (second image) associated with the link image as an individual image (new first image).
- the image display system described above allows the user to easily select a suitable image with few duplications from a certain image, and to efficiently perform operations such as deterioration inspection.
- the first image and the candidate images may have different spatial positions and different planes or directions on which the imaging ranges are arranged. Even in this case, the user can select and display the second image in a different direction from the first image by operating the link image on the screen.
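For illustration only, the selection rule described above (smallest positive overlap ratio at or below the threshold, with a fallback when nothing qualifies) can be sketched as follows; the function and data shapes are assumptions, not the patent's implementation:

```python
def select_adjacent(overlaps, tol):
    """Pick the candidate with the smallest positive overlap ratio <= tol.

    `overlaps` maps a candidate image ID to its overlap ratio with the
    first image (0.0 to 1.0). Returns None when no candidate qualifies,
    in which case the system falls back to the nearest image (for
    example, one on another surface).
    """
    qualifying = {k: v for k, v in overlaps.items() if 0.0 < v <= tol}
    if not qualifying:
        return None
    return min(qualifying, key=qualifying.get)

# With the ratios used in the example described later (first image g1,
# threshold 20%), the image g6 is selected with 5% overlap:
print(select_adjacent({"g2": 0.85, "g3": 0.65, "g4": 0.45,
                       "g5": 0.25, "g6": 0.05}, tol=0.20))  # g6
```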
- FIG. 1 shows the overall configuration of a structural deterioration inspection support system including the image display system according to the embodiment.
- the image display system according to the embodiment is configured by a computer system 1.
- the deterioration inspection support system in FIG. 1 includes a computer system 1 as an image display system, a drone 10 as a flying object, and a structure 5 as an object for deterioration inspection or the like.
- the computer system 1 and the drone 10 are connected by wireless communication.
- the computer system 1 is, for example, configured as a client-server system having a PC 2 and a server 3.
- the PC 2 and the server 3 are connected via a communication network 6.
- a user operates the PC 2 to use the system.
- the user can input an instruction to the system, set a user, and the like on the GUI screen 22 of the PC 2, and can confirm a work state, a result, and the like.
- the structure deterioration inspection support system of FIG. 1 has a deterioration inspection support function, and the deterioration inspection support function includes a three-dimensional model generation function.
- the deterioration inspection support function is realized by the deterioration inspection support software 100 or the like.
- the deterioration inspection support function is a function of providing a GUI screen for assisting a user (operator) in performing a deterioration inspection operation of the structure 5.
- This deterioration inspection includes a visual inspection of the deterioration state of the image captured by the camera 4 by the user.
- This deterioration inspection may further include automatic diagnosis by a computer.
- the three-dimensional model generation function is a function of generating a three-dimensional model of the structure 5 from the image group of the camera 4 based on the SFM processing and providing a screen for displaying the three-dimensional model.
- the PC 2 is a computer having the drone control function 21 and the like, and is a client terminal device used by each user (operator).
- the server 3 is, for example, a server device in a business cloud computing system, a data center, or the like, and performs calculation processing in cooperation with the PC 2.
- a plurality of PCs 2 of a plurality of users may be similarly connected to the server 3.
- the PC 2 performs navigation control of the drone 10 and imaging control of the camera 4 by wireless communication.
- the PC 2 transmits navigation control information, photographing setting information, and the like to the drone 10.
- the structure 5 is an object to be inspected for deterioration and to generate a three-dimensional model, and is an object photographed by the camera 4 of the drone 10.
- the structure 5 is, for example, a building or an infrastructure facility.
- the buildings include general buildings, houses, public buildings, and the like.
- Infrastructure facilities include, for example, power facilities, road traffic facilities, communication facilities, bridges, and the like.
- a predetermined area on the surface of the structure 5 is a target area.
- the drone 10 is an air vehicle that performs autonomous navigation based on remote control by wireless communication from the PC 2.
- the drone 10 autonomously sails on the set route in the space around the structure 5.
- control information on autonomous navigation and aerial photography of the drone 10 is created and set as aerial photography setting information based on the photography plan data d2 so that the target area of the structure 5 can be aerially photographed.
- the aerial photography setting information includes the navigation route of the drone 10, the imaging timing (imaging date and time), the imaging setting information of the camera 4, and the like.
- a mode in which the user controls the navigation of the drone 10 from the PC 2 or the like may be adopted.
- the drone 10 is equipped with the camera 4 and various sensors.
- the drone 10 takes an aerial image of the structure 5 with the camera 4 when traveling on the route.
- the drone 10 transmits image data and sensor data at the time of shooting to the PC 2 by wireless communication.
- the known sensor group of the drone 10 can detect the position, direction, speed, acceleration, and the like of the drone 10 as sensor data.
- the position includes three-dimensional coordinate information and can be obtained in the form of latitude, longitude, altitude (elevation from the ground) or the like based on, for example, a GPS, an altitude sensor, or another positioning system. When GPS is used, sufficient positioning accuracy is assumed.
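Since the positional relationship between images is later evaluated as distances, positions given as latitude, longitude, and altitude must be comparable in meters. One conventional way (an assumption for illustration; the patent does not specify the conversion) is an equirectangular projection into a local frame:

```python
import math

EARTH_RADIUS_M = 6_371_000.0

def to_local_enu(lat_deg, lon_deg, alt_m, origin):
    """Convert (latitude, longitude, altitude) to east/north/up meters.

    Equirectangular approximation around `origin` = (lat0, lon0, alt0):
    adequate over the tens or hundreds of meters spanned by a single
    structure, not over long distances.
    """
    lat0, lon0, alt0 = origin
    east = math.radians(lon_deg - lon0) * EARTH_RADIUS_M * math.cos(math.radians(lat0))
    north = math.radians(lat_deg - lat0) * EARTH_RADIUS_M
    up = alt_m - alt0
    return east, north, up
```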
- the shooting setting information of the camera 4 includes a camera direction (shooting direction), shooting timing, shooting conditions, and the like.
- the camera direction is a direction facing a location on the surface of the object 5 with reference to the camera position.
- the shooting timing is a timing at which a plurality of continuous images (still images) are shot, and is defined by a shooting interval or the like.
- the shooting conditions are defined by known parameters such as exposure.
- the PC 2 has a drone control function 21, a client program 20, a GUI screen 22, and the like.
- the drone control function 21 is a function for controlling navigation of the drone 10 and shooting of the camera 4.
- the client program 20 is a client program of the deterioration inspection support software 100.
- the client program 20 performs processing in cooperation with the server program 30 through client-server communication.
- the client program 20 controls a drone control function 21, an image display function, a GUI screen 22, and the like.
- Various data and information used for processing by the client program 20 and the like are stored in the memory of the PC 2.
- the memory stores control information of the drone 10, image data and sensor data acquired from the drone 10, various data acquired from the server 3, and the like.
- the server 3 has a server program 30, a DB 31, and the like.
- the server program 30 is a server program of the deterioration inspection support software 100.
- the server program 30 is in charge of processing with a high calculation processing load, such as the deterioration inspection support processing and the three-dimensional model generation processing.
- the server program 30 executes a predetermined process in response to a request from the client program 20, and responds with process result information.
- the DB 31 stores various data and information used by the server program 30 and the client program 20 for processing.
- the DB 31 may be realized by a DB server or the like.
- the DB 31 stores, for example, structure data d1, imaging plan data d2, image data d3, inspection data d4, and the like.
- the structure data d1 includes basic information and management information on the structure 5, design drawing data, three-dimensional data, and the like.
- the three-dimensional data includes three-dimensional model data generated by the three-dimensional model generation function.
- the three-dimensional data is not limited to this, and if there is data created by an existing three-dimensional CAD system or the like, it can be used.
- the photographing plan data d2 is data of a plan relating to deterioration inspection and aerial photography. Based on the shooting plan data, aerial shooting setting information by the actual drone 10 is created and set.
- the image data d3 is data of an image group captured by the camera 4.
- the inspection data d4 is data representing the state and the result of the deterioration inspection.
- the mounting configuration of the computer system 1 is not limited to the above example; various configurations are possible.
- the PC 2 and the server 3 may be integrated as one computer, or may be separated into a plurality of devices for each function.
- a form having a drone control device separately from the PC 2 may be used.
- as the drone 10, a general-purpose drone can be applied, but a dedicated drone having dedicated functions for the present image display system may also be applied.
- FIG. 2 shows a schematic configuration of the drone 10 and the drone control function 21 in the computer system 1.
- the drone 10 includes a propeller driving unit 11, a navigation control unit 12, a sensor 13, a gimbal 14, a camera 4, an image storage unit 15, a wireless communication unit 16, a battery 17, and the like.
- the propeller driving unit 11 drives a plurality of propellers.
- the navigation control unit 12 controls navigation of the drone 10 according to navigation control information from the navigation control unit 102 of the PC 2. For that purpose, the navigation control unit 12 drives and controls the propeller driving unit 11 while using the information of the sensor 13.
- the sensor 13 is a group of sensors such as a known GPS receiver, an electronic compass, a gyro sensor, and an acceleration sensor, and outputs predetermined sensor data.
- the gimbal 14 is a known mechanism that holds the camera 4, and is a mechanism that automatically maintains the camera 4 in a certain state without blurring during navigation.
- Each direction and each position of the drone 10, the gimbal 14, and the camera 4 are basically independent parameters.
- the camera 4 aerially shoots the object 5 in accordance with the shooting control information and the shooting setting information from the shooting control unit 104 of the PC 2, and outputs image data.
- the image storage unit 15 stores image data and the like.
- the wireless communication unit 16 includes a wireless communication interface device, and performs wireless communication with the computer system 1 using a predetermined wireless communication interface.
- the battery 17 supplies power to each unit.
- the computer system 1 has a processor 111, a memory 112, a communication device 113, an input device 114, a display device 115, and the like, which are interconnected by a bus or the like.
- the processor 111 executes a process according to the program read into the memory 112. Thereby, each function of the deterioration inspection support software 100 and each unit such as the GUI unit 101 are realized.
- the computer system 1 includes a GUI unit 101, an aerial photography setting unit 107, a navigation control unit 102, an imaging control unit 104, a sensor data storage unit 103, an image data storage unit 105, a wireless communication unit 106, a deterioration inspection support unit 121, a three-dimensional model generation unit 122, an image display control unit 123, and the like.
- the GUI unit 101 configures a GUI screen 22 for a user based on an application such as an OS, middleware, and a Web browser, and displays the GUI screen 22 on the display device 115 (for example, a touch panel).
- the user can perform user settings and instruction input operations on the GUI screen 22, and can check the state and result of each function.
- as user settings, it is possible to set whether or not to use each function of the image display system and to set a control threshold for each function.
- the GUI unit 101 generates screen data corresponding to a screen such as a list screen and an individual screen described later.
- the aerial photography setting unit 107 creates aerial photography setting information based on the user's operation and the imaging plan data d2, and sets it in the computer system 1 and the drone 10.
- the navigation control unit 102 controls navigation of the drone 10 based on the aerial photography setting information.
- the navigation control unit 102 transmits navigation control information d101 to the drone 10 through wireless communication, and receives sensor data d102 and the like indicating the navigation state from the drone 10.
- the shooting control unit 104 controls shooting by the camera 4 based on the aerial shooting setting information.
- the imaging control unit 104 transmits the imaging control information d103 to the drone 10 through wireless communication, and receives image data d104 and the like from the drone 10.
- the image data storage unit 105 stores image data obtained from the drone 10.
- the image data is managed in association with the shooting date and time, sensor data, shooting setting information, and the like as appropriate.
- the wireless communication unit 106 includes a wireless communication interface device, and performs wireless communication with the drone 10 using a predetermined wireless communication interface.
- the deterioration inspection support unit 121 performs a process corresponding to the deterioration inspection support function.
- the deterioration inspection support unit 121 performs, for example, visual inspection support processing or automatic diagnosis processing.
- the three-dimensional model generation unit 122 performs a process corresponding to the three-dimensional model generation function.
- the image display control unit 123 determines the spatial positional relationship of the image group and performs control related to image selection.
- FIG. 3 shows a configuration of the image display system (computer system 1) according to the embodiment.
- the image display system has, as inputs (including storage and the like), an image group 201 and image information 202 based on the image data d3, structure data d1, photographing plan data d2, and the like.
- the input image data d3 (image group 201) is mainly obtained by the camera 4 of the drone 10, but is not limited thereto, and may be an image group photographed manually by an operator.
- a combination of image groups obtained by the above means may also be used.
- the image group 201 is data of the main body of each image.
- the image information 202 is metadata or property information associated with each image in a format such as, for example, a known Exif.
- the image information 202 includes, for example, information such as ID (identification information), shooting date and time, camera position, camera direction, camera model, focal length, angle of view, image size (number of pixels), and the like. Note that the camera position and the camera direction generated by the SFM processing unit 122A are used.
- the image display system includes a list display unit 301, an individual display unit 302, an image display control unit 123, a deterioration inspection support unit 121, a three-dimensional model generation unit 122, and the like as processes (corresponding processing units).
- the image display control unit 123 includes an adjacent image determination unit 303 and an overlap ratio calculation unit 304.
- the image display system outputs screens such as a list screen 401 and an individual screen 402 as outputs (including display, storage, and the like).
- the list display unit 301 displays the list screen 401 based on the image group 201 and the image information 202 (for example, ID, shooting date and time). On the list screen 401, a plurality of images of the image group 201 are arranged and displayed in parallel in an order using IDs and the like.
- the individual display unit 302 displays the individual screen 402 based on the image group 201 and the first image selection information 321 on the list screen 401.
- the individual display unit 302 displays the first image 322 selected by the user on the individual screen 402. Further, the individual display unit 302 displays the second image 324 associated with the link image based on the link image selection information 323 in the individual screen 402.
- the image display control unit 123 uses the adjacent image determination unit 303 and the like to grasp the spatial positional relationship between images and the degree of overlap, and controls the link between images and selection of the next image.
- the adjacent image determination unit 303 determines a candidate image, an adjacent image, and the like around the first image.
- Adjacent images are images that are adjacent with a minimum degree of overlap.
- the adjacent image determination unit 303 determines a positional relationship between images using, in particular, position information (camera position and the like) and direction information (camera direction and the like) of the image information 202.
- the adjacent image determination unit 303 determines an adjacent image using the information on the overlap ratio calculated by the overlap ratio calculation unit 304.
- the adjacent image determination unit 303 determines the distance between the position of a certain image (first image) and other images in its vicinity, and determines the direction of each image (the corresponding camera direction, or the direction of the surface on which the imaging range is arranged).
- the overlap ratio calculation unit 304 calculates the overlap ratio between images based on the first image selection information 321 and the like and the candidate image information determined by the adjacent image determination unit 303.
- the image display control unit 123 displays the link image 325 corresponding to the image such as the adjacent image determined by the adjacent image determination unit 303 on the individual screen 402.
- the link image 325 is a link image (GUI component) for navigating an image (next image) that is a candidate to be selected next to the first image.
- the deterioration inspection support unit 121 performs a visual inspection support process or the like based on a user operation, using the individual screen 402 as a deterioration inspection screen, displays the processing state and result on the deterioration inspection screen, and creates and stores the inspection data d4 representing them.
- the three-dimensional model generation unit 122 includes an SFM processing unit 122A.
- the SFM processing unit 122A generates a three-dimensional model of the structure 5 and information on the camera direction and the camera position by performing the SFM processing with the image group 201 as an input, and creates the corresponding three-dimensional model data d5.
- the three-dimensional model generation unit 122 displays a structure three-dimensional model screen based on the three-dimensional model data d5.
- the three-dimensional model generation unit 122 describes the obtained camera direction and camera position information in the image information 202 or manages the information in association with the image data d3.
- the user performs an operation (first image selection operation) OP1 on the list screen 401 to select a desired image from the image group as the first image.
- the user performs the deterioration inspection operation OP3 on the image of the individual screen 402 (deterioration inspection screen).
- FIG. 4 shows a processing flow of the structure deterioration inspection support system including the image display system according to the embodiment.
- the flow in FIG. 4 has steps S1 to S13. Hereinafter, the steps will be described in order.
- the computer system 1 creates and sets aerial photography setting information of the drone 10 based on a user operation, the photography plan data d2 of the structure 5, and the like. Aerial photography by the drone 10 is performed based on the aerial photography setting information.
- image data or the like obtained by photographing with the camera 4 is transmitted to the PC 2.
- the PC 2 transmits image data and the like to the server 3.
- the server 3 stores the image data d3 and the like in the DB 31.
- the computer system 1 reads out the image data d3 (image group 201, image information 202) and the like from the DB 31 and the like and stores them in the memory for processing.
- the computer system 1 (especially the list display unit 301) displays a list of image groups on the list screen 401 (FIG. 14) based on the image data d3 (image group 201, image information 202).
- the computer system 1 receives the user operation on the list screen 401 and, when receiving a predetermined operation (first image selection operation OP1), obtains the first image selection information 321 and selects the first image.
- the first image selection operation OP1 is, for example, a click or a tap of one image.
- the computer system 1 (especially the individual display unit 302) displays the selected first image 322 on the individual screen 402 (FIG. 15) based on the operation in S4 and the first image selection information 321.
- the user can perform the deterioration inspection work using the individual screen 402 as the deterioration inspection screen (FIG. 21).
- the computer system 1 (especially the adjacent image determination unit 303) searches, based on the image information 202 and the first image selection information 321 (or the link image selection information 323), for each image (candidate image) in the vicinity of the first image. At this time, the adjacent image determination unit 303 determines the positional relationship between the images using information such as the camera position and the camera direction of the images. In particular, the adjacent image determination unit 303 calculates the distance between the positions of the first image and each candidate image, and picks up candidate images for the first image in ascending order of that distance. At that time, the adjacent image determination unit 303 narrows down the candidate images to a range in which the distance is within a preset distance threshold. At this point, the adjacent image has not yet been determined.
- when picking up candidate images, the adjacent image determination unit 303 first extracts images whose camera direction is substantially the same as that of the first image, for example, images arranged in the same plane, and picks them up as candidate images.
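Step S6 thus amounts to a distance filter plus a same-direction (same-plane) filter, sorted by distance. A minimal sketch under those assumptions (the record layout and the angle tolerance are illustrative, not from the patent):

```python
import math

def pick_candidates(first, images, dist_threshold_m, angle_tol_deg=10.0):
    """Return images near `first`, closest first, facing roughly the same way.

    Each image record is assumed to carry `position` (x, y, z in meters)
    and `direction` (unit camera-direction vector) taken from the image
    information; the 10-degree tolerance is an illustrative value.
    """
    def dist(a, b):
        return math.dist(a["position"], b["position"])

    def same_direction(a, b):
        dot = sum(p * q for p, q in zip(a["direction"], b["direction"]))
        angle = math.degrees(math.acos(max(-1.0, min(1.0, dot))))
        return angle <= angle_tol_deg

    near = [im for im in images
            if im is not first
            and dist(first, im) <= dist_threshold_m
            and same_direction(first, im)]
    return sorted(near, key=lambda im: dist(first, im))
```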
- the computer system 1 calculates the overlap ratio (OL) between the images for each of the candidate images obtained in S6, in order, as a set of the first image and the candidate image.
- the overlap ratio calculation unit 304 calculates a photographing range for each image (FIG. 7), and calculates the overlap ratio OL using the photographing ranges and the like (FIG. 12 and the like), as described later.
- the computer system 1 (especially the adjacent image determination unit 303) determines the overlap state by comparing the overlap rate OL obtained in S7 with a predetermined overlap rate threshold value (hereinafter referred to as TOL).
- the adjacent image determination unit 303 determines whether the OL value of the candidate image is equal to or less than the TOL value, is greater than 0, and is minimum.
- the computer system 1 selects a candidate image that satisfies the condition as an adjacent image (an image with the minimum overlap and an adjacent image).
- the computer system 1 similarly determines an adjacent image for each direction such as up, down, left, and right in the plane.
- the overlap rate threshold value TOL is preset in the program of the deterioration inspection support software 100.
- the present invention is not limited to this, and the TOL value may be set by the user.
- the computer system 1 confirms, based on the result of S8, whether an adjacent image that satisfies the condition of the overlapping state has been found. If an adjacent image has been found (Y), the process proceeds to S11; if not (N), the process proceeds to S10.
- the case where no adjacent image is found corresponds to the case where none is found among the images whose camera direction is substantially the same, for example, the images arranged in the same plane.
- in S10, the computer system 1 selects, as the next image, an image other than the already searched candidate images that has the smallest distance between image positions.
- this next image is, for example, an image arranged on another surface (side surface or the like) having a different camera direction from the first image.
- alternatively, this next image may be an image located at a distance, without any overlap, in the same plane as the first image.
- the computer system 1 (the image display control unit 123) displays, in the individual screen 402, the link image 325 (the link image 403 in FIG. 15, the link image 404 in FIG. 16) for the image (next image) selected in S8 or S10.
- the image display control unit 123 displays the link image 325 according to the type of the next image in each direction with respect to the first image.
- the computer system 1 receives a user's predetermined operation (second image selection operation OP2) on the link image 325 on the individual screen 402.
- the second image selection operation OP2 is a click, a tap, or the like on the link image.
- the computer system 1 receives another operation such as an operation for returning to the list screen 401 and an end operation on the individual screen 402.
- the computer system 1 proceeds to S13 when receiving a selection operation (second image selection operation OP2) for the link image 325 (Y), and returns to S3 when receiving an operation for returning to the list screen 401.
- in S13, the computer system 1 (especially the individual display unit 302) displays on the individual screen 402, based on the link image selection information (second image selection information) 324 from S12, the next image (second image) associated with the selected link image 325 as a new first image. After S13, the process returns to S6, and the same processing is repeated. The user can perform the deterioration inspection work by viewing the new first image on the individual screen 402 (deterioration inspection screen). Thereafter, selection and display can similarly transition from one image to another in the image group.
- FIG. 5 shows a first example of an aerial photographing method of the structure 5 using the drone 10.
- the method of aerial photography is not limited to this.
- as the aerial photography method, a suitable method is selected in consideration of the processing and characteristics of the image display system.
- in the drone control function 21 of the computer system 1, the route of aerial photographing by the drone 10, the photographing settings of the camera 4, and the like are set as aerial photography setting information based on the photographing plan data d2 and the like.
- the aerial photography setting information is held in the DB 31 or the like.
- FIG. 5 shows a bird's-eye view configuration in the Z direction when the structure 5 is a roughly rectangular solid.
- This structure 5 has four side surfaces A1 to A4 standing vertically.
- the image display system comprehensively captures, for example, the entire surface area of these side surfaces A1 to A4 and sets the image as a deterioration inspection target.
- the structure 5 has corners, such as corners 501, in which side surfaces in different directions are connected to each other.
- aerial photography is performed separately for each side surface of the object 5 as a group (side surface group).
- four routes R1 to R4 are set corresponding to four groups for the four side surfaces A1 to A4, and aerial photography is performed four times.
- the route of each group is basically a straight route (indicated by a dashed arrow), and is controlled so that the camera direction (indicated by an alternate long and short dash line) and the target distance D0 are substantially constant.
- (B) of FIG. 5 corresponds to (A) and shows the structure 5 and the aerial photography route in a perspective view.
- a route R1 for aerial photographing the group of the side surface A1 is shown.
- a method similar to a so-called line-sequential scanning method is used to comprehensively capture the entire area of the side surface A1 (XZ plane).
- the drone 10 is linearly moved from one end of the side surface A1 to the other end in the X direction, which is a horizontal direction, as in main scanning. During the horizontal movement, the direction of the camera 4 and the target distance D0 are controlled so as to be substantially constant.
- after the drone 10 reaches the other end of the side surface A1, it is moved linearly in the vertical direction (Z direction) as in sub-scanning. Next, it is similarly moved linearly from the other end back to the one end of the side surface A1 as in main scanning, folding back in the opposite direction along the X axis. Thereafter, the main scanning and sub-scanning are similarly repeated.
- the description is mainly directed to the side surface that stands vertically, but the horizontal surface such as the roof of the structure 5 can be similarly processed as the target.
- the direction of the camera 4 during aerial photography is, for example, downward in the Z direction.
- the route of the drone 10, the photographing timing (interval) of the camera 4, and the like are set in advance in consideration of the SFM processing of the three-dimensional model generation function so that the overlapping state between the images is suitable.
- the overlapping state in the traveling direction of the drone 10 and the camera 4 is set to be about 80%.
- the SFM processing can be reliably performed by using this image group as an input, and a highly accurate three-dimensional model can be obtained.
- the present image display system provides a mechanism such as a GUI that can support and increase the efficiency of the image selection.
- FIG. 6 similarly shows a second example of a method of aerial photographing of the structure 5 using the drone 10.
- Such an aerial photographing method is appropriately selected according to the shape of the structure 5 and the like.
- the structure 5 in (A) of FIG. 6 differs from the rectangular parallelepiped structure 5 in FIG. 5 in that a part on the right side (direction X2) of the side surface A1 protrudes toward the near side (direction Y2).
- This structure 5 has six side surfaces a1 to a6, and it is assumed that all surface areas thereof are to be inspected.
- the structure 5 has corners such as a corner 601, a corner 602, and a corner 603.
- the corner 601 is a portion where the side surface a1 extending in the X direction and the side surface a2 extending in the Y direction are connected, and is a corner that is concave when viewed from the drone 10.
- the corner portion 602 is a portion where the side surface a2 extending in the Y direction and the side surface a3 extending in the X direction are connected, and is a convex portion viewed from the drone 10.
- the corner portion 603 is a portion where the side surface a3 extending in the X direction and the side surface a4 extending in the Y direction are connected, and is a convex corner portion.
- similarly, aerial photography is performed separately for each side surface of the object 5 as a group.
- the six side faces a1 to a6 are divided into six groups, six routes r1 to r6 are set correspondingly, and aerial photography is performed six times.
- the route of each group is a straight route, and is controlled so that the camera direction and the target distance D0 are substantially constant.
- camera directions are divided into four types: a direction Y1, a direction Y2, a direction X1, and a direction X2.
- for the route r1 of the side surface a1, the camera direction is the Y direction (direction Y1).
- for the route r2 of the side surface a2, the camera direction is the X direction (direction X2).
- for the route r3 of the side surface a3, the camera direction is the Y direction (direction Y1).
- for the route r4 of the side surface a4, the camera direction is the X direction (direction X1). The same applies to the routes of the other side surfaces.
- (B) of FIG. 6 corresponds to (A) and shows the structure 5 and the route of aerial photography in a perspective view.
- the route r1 of the side surface a1 and the route r3 of the side surface a3 will be described as examples.
- as in (B) of FIG. 5, the same method as the line-sequential scanning method is used for each side surface.
- FIG. 7 is a schematic diagram illustrating an image capturing range and the like.
- the image display control unit 123 of the image display system calculates the image capturing range as described below. This imaging range is used for calculating the overlap rate.
- the shooting range is a size in the real world defined by the length and width of the image and the like.
- for calculating the photographing range, a focal length (F), a sensor size (SS), a target distance (D), and the like are used.
- the bird's-eye view in the Z direction shows the wall surface 701 (for example, the side surface A1) of the structure 5, the traveling direction K1 (for example, the X direction) in a part of the aerial photography route of the drone 10, the position of the camera 4 (referred to as the camera position C), the direction of the camera 4 (referred to as the camera direction V), the sensor 702 of the camera 4, and the image capturing range (referred to as SA; here, particularly, the width in the X direction).
- the sensor 702 is an image sensor of the camera 4, and FIG. 7 shows a portion having a width in the X direction.
- the sensor 702 has a predetermined sensor size SS (particularly, a width in the X direction).
- the sensor size SS depends on the camera model, and is defined by a width, a height, a diagonal distance, and the like.
- the sensor size SS is included in the image information 202 or can be grasped from the camera model and other information even if it is not included.
- the center position of the sensor 702 is shown as the camera position C. From the camera position C, the camera direction V (indicated by an alternate long and short dash line arrow) extends, for example, in the Y direction.
- the position where the camera direction V intersects the wall surface 701 is indicated by a position Q.
- the position Q corresponds to the center position of the image and the shooting range SA.
- the focal length F is obtained from the image information 202.
- the target distance D is a distance to the target.
- the target distance D is obtained based on the shooting plan data d2. For example, the target distance D is obtained from the target distance D0 in FIG.
- the computer system 1 calculates the shooting range SA using the sensor size SS, the focal length F, the target distance D, and the like.
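Although the patent does not spell out the formula, the standard pinhole-camera relation behind this calculation is SA = SS × D / F (similar triangles between the sensor and the surface), applied to each dimension separately. A minimal sketch:

```python
def shooting_range_m(sensor_size_mm, focal_length_mm, target_distance_m):
    """Width (or height) of the area imaged on the structure surface.

    Pinhole similar triangles: SA / D = SS / F, hence SA = SS * D / F.
    Sensor size and focal length share a unit (mm here), so it cancels.
    """
    return sensor_size_mm * target_distance_m / focal_length_mm

# Example: a 36 mm wide sensor behind a 24 mm lens at a target distance
# of 5 m images a strip 7.5 m wide.
print(shooting_range_m(36.0, 24.0, 5.0))  # 7.5
```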
- the computer system 1 similarly calculates the shooting range SA of each image in the image group.
- the computer system 1 (overlap ratio calculation unit 304) calculates the overlap ratio between images using the shooting range SA of each image.
- the computer system 1 (adjacent image determining unit 303) determines the positional relationship between images using information such as the camera position C, the camera direction V, and the shooting range SA.
- the computer system 1 uses information on the camera position C and the camera direction V for processing.
- Information on the camera position C and the camera direction V can be obtained by, for example, the following methods.
- in a first method, information on the camera position C and the camera direction V detected on the drone 10 side is transmitted to the computer system 1.
- the camera 4 itself may include a sensor that can detect the camera position C and the camera direction V.
- the information can be referred to by associating the information on the detected camera position C and the camera direction V with the image data of the image captured by the camera 4.
- in a second method, the computer system 1 calculates the camera position C and the camera direction V from other information. For example, the computer system 1 obtains information on the position and direction of the drone 10 and of the gimbal 14 based on the sensor data at the time of aerial photography, and obtains the camera position C and the camera direction V by calculation using that information. The camera position C and the like can be calculated by composing the relative positional relationship of the gimbal 14 installed on the drone 10 with the relative positional relationship of the camera 4 installed on the gimbal 14 (a sketch of this composition is given after the list of methods).
- in a third method, the computer system 1 obtains the camera position C and the camera direction V from an image group (for example, three continuous images) by SFM processing. This method is used in the image display system of the embodiment.
- the SFM processing unit 122A of the three-dimensional model generation unit 122 obtains the camera position C and the camera direction V by performing the SFM processing on the input image group.
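As an illustration of the second method, composing the drone pose with the gimbal and camera mounting offsets is a chain of rigid transforms. A minimal sketch (the coordinate conventions, names, and optical-axis choice are assumptions, not from the patent):

```python
import numpy as np

def camera_pose(drone_pos, R_drone, gimbal_offset, R_gimbal, cam_offset):
    """Camera position C and direction V from drone and gimbal data.

    R_drone rotates drone-body coordinates into world coordinates,
    R_gimbal rotates camera coordinates into drone-body coordinates,
    and the offsets are the fixed mounting positions in meters. The
    camera is assumed to look along its local +Z (optical) axis.
    """
    cam_pos = np.asarray(drone_pos) + R_drone @ (np.asarray(gimbal_offset)
                                                 + R_gimbal @ np.asarray(cam_offset))
    cam_dir = R_drone @ R_gimbal @ np.array([0.0, 0.0, 1.0])
    return cam_pos, cam_dir
```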
- FIG. 8 shows the overlap between images and the photographing range in the aerial photographing traveling direction.
- a part of the route to the wall surface 701 is shown as a bird's-eye view in the Z direction.
- the traveling direction K1 of the drone 10 during the aerial photography is, for example, the main scanning direction X2 of the route R1 in FIG. 5 or the main scanning direction X2 of the route r1 in FIG. 6.
- the camera direction V is the direction Y1.
- positions P1 to P7 are indicated by dots.
- positions C1 to C7 are indicated by dots as examples of the camera position C.
- the target distance D0 at each position is constant.
- the positions P1 to P7 and the positions C1 to C7 have a chronological order.
- the center positions of the images and photographing ranges are indicated by points Q1 to Q7.
- the images g1 to g7 have the shooting ranges SA1 to SA7, respectively.
- the imaging range here indicates the width in the X direction.
- the shooting range of the image g1 shot at the position P1 (camera position C1) is the shooting range SA1, and its center position is the position Q1.
- FIG. 8 shows a case where the photographing ranges SA1 to SA7 in the images g1 to g7 overlap by about 80% between the images in the traveling direction K1.
- FIG. 9 shows the overlap between images in a plan view (XZ plane, for example, side surface A1), corresponding to the example of FIG. 8.
- FIG. 9 illustrates a case where the images g1 to g7 are slightly shifted in the Z direction (the traveling side direction K2) so that the state of the overlap can be easily understood.
- the traveling direction K1 indicates the case of the direction X2 as in FIG. 8, and the traveling side direction K2 is a vertical direction perpendicular to the traveling direction K1.
- the traveling direction K3 is a direction when a displacement (movement component) of the traveling side direction K2 is added to the traveling direction K1.
- an overlapping area 901 indicated by a diagonal line pattern indicates an overlapping area in a set of an image g1 (imaging area SA1) and an image g2 (imaging area SA2).
- the overlapping area 902 indicates an overlapping area in the set of the image g1 and the image g6 (the imaging range SA6).
- the images g1 to g7 are a group of images photographed so as to secure an overlapping rate (for example, about 80%) necessary for generating a three-dimensional model.
- for the image g1, the overlap ratio OL with the image g2 is 80% or more (for example, 85%).
- the overlap ratio OL for the image g1 is 65% for the image g3, 45% for the image g4, 25% for the image g5, and 5% for the image g6.
- the overlap ratio threshold TOL set for the deterioration inspection is, for example, 20%.
- the image g6 is selected as the image that satisfies the above-described condition of the overlap rate (S8 in FIG. 4).
- the present image display system performs the same processing for the traveling side direction K2 using a predetermined overlap ratio threshold.
- FIG. 10 shows an example of an adjacent image in a plan view (XZ plane, for example, side surface A1), corresponding to the example of FIG. 9. It is assumed that the first image selected by the user is, for example, the image g1.
- the present image display system examines three-dimensional spatially peripheral images of the image g1 as candidate images.
- the image g6 is selected as the image (adjacent image) with the minimum overlap, as shown in FIG. 9.
- the image g6, which is the adjacent image in the right direction, is also denoted as an image ga.
- the position Qa of the image ga is the position Q6 of the image g6.
- the image display system displays a link image representing the image ga on the individual screen 402.
- the user can select a desired next image (second image) from the link images (corresponding adjacent images) displayed in each direction with respect to the first image on the individual screen 402.
- regarding an image in an oblique direction with respect to a certain first image (for example, the image g1), two methods are possible.
- in the first method, an image in the oblique direction is searched for directly around the image g1 at the position Q1, the overlap ratio between the image g1 and the oblique image is calculated, and the result is compared with a predetermined overlap ratio threshold for the oblique direction.
- in the second method, the left/right or top/bottom adjacent images are first determined for the image g1 at the position Q1, and the adjacent images obtained by that determination (for example, the image ga and the image gd) are processed further to obtain the oblique adjacent image.
- the calculation of the overlap rate will be described with reference to FIGS. 11 and 12.
- the basic definition of the overlap ratio is as follows: ideally, the state of overlap is considered in the surface area of the structure 5 (the corresponding imaging range SA).
- the computer system 1 considers the region where the images overlap within the surface region of each image centered on its position Q.
- it computes the sizes of the images and of the overlapping region.
- the ratio indicating the degree of overlap of the overlapping region between the images is defined as the overlap rate.
- the computer system 1 calculates the overlap rate from the size of the image and the size of the overlapping area.
- when the distance between the camera 4 and the surface of the structure 5 at the time of photographing (the target distance D0) is constant, the size of the region shown in each image corresponds almost directly to the size of the object in the real world. In that case, a sufficient effect can therefore be obtained by calculating the overlap ratio using the sizes of the image areas.
- FIG. 11 shows the overlapping state of two images in the photographing range SA, drawn as a bird's-eye view in the Z direction.
- An image G1 taken from the camera position C1 and an image G2 taken from the camera position C2 are shown for the side surface A1 and the like.
- the position Q1 of the image G1 and the position Q2 of the image G2 are shown.
- Each of the position C1, the position Q1, and the like has three-dimensional position coordinates.
- the width W1 in the X direction of the photographing range SA of the image G1 and the width W2 in the X direction of the photographing range SA of the image G2 are shown.
- the width of the overlapping area in the set of the image G1 and the image G2 is indicated by a width W12.
- the overlapping rate in the X direction between the image G1 and the image G2 can be calculated using the width W1 of the image G1 and the width W12 of the overlapping area.
- since the drone 10 travels along the route and is buffeted by the wind, the position of the drone 10, the position of the camera 4, the direction of the camera, and so on may shake or shift; that is, the camera direction C may differ from image to image. Even in such a case, the photographing range SA on the surface of the structure 5 and the overlap ratio can be calculated in the same manner.
- FIG. 12 shows an overlapping state in an image in a plan view (XZ plane) corresponding to FIG. Further, in this example, a case where the image size is different between the image G1 at the position Q1 and the image G2 at the position Q2 is shown.
- An overlapping area 1201 (a hatched pattern) in a set of an image G1 at a position Q1 and an image G2 at a position Q2 is shown.
- the image G1 has a width W1 as a width W in the X direction and a height H1 as a height H in the Z direction.
- Image G2 has width W2 and height H2.
- for the pair of the image G1 and the image G2, the overlapping area 1201 has a width W12 and a height H12.
- the width NW1 and height NH1 of the non-overlapping part of the image G1, and the width NW2 and height NH2 of the non-overlapping part of the image G2, are also shown.
- as the overlap rate OL, the overlap rate OLW in the horizontal direction (X direction) of the image and the overlap rate OLH in the vertical direction (Z direction) of the image are considered here.
- the overlap rate OL is defined as the ratio of the length of the overlapping portion to the original length of the first image: the horizontal (X direction) overlap rate is OLW = W12 / W1, and the vertical (Z direction) overlap rate is OLH = H12 / H1.
- the three-dimensional position coordinates (X, Y, Z) are appropriately converted into two-dimensional position coordinates (x, y) in the processing of the two-dimensional image and processed.
- the coordinates of the position Q1 of the image G1 are (X1, Y1, Z1), and the coordinates of the position Q2 of the image G2 are (X2, Y2, Z2).
- for the image G1, the left end position in the X direction is X3 and the right end position is X4; for the image G2, the left end is X5 and the right end is X6.
- for the image G1, the upper end position in the Z direction is Z3 and the lower end is Z4; for the image G2, the upper end is Z5 and the lower end is Z6.
- the positions in the Y direction are the same.
- the positions of the upper, lower, left, and right ends (sides) of the image G1 are obtained from the center position Q1, the width W1, and the height H1; for example, X3 = X1 - W1/2 and X4 = X1 + W1/2, and, taking the Z axis as pointing upward, Z3 = Z1 + H1/2 and Z4 = Z1 - H1/2.
- similarly, the positions (X5, X6, Z5, Z6) of the ends of the image G2 can be obtained from Q2, W2, and H2.
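- the geometry above can be written compactly as code. The following is a minimal sketch (in Python, chosen here only for illustration) that derives the end positions of each image from its center position and size and returns OLW = W12/W1 and OLH = H12/H1; it assumes the constant target distance D0 discussed above, so image sizes can be compared directly.

```python
def axis_overlap(c1, e1, c2, e2):
    # Length of the overlap of two 1-D segments given their centers
    # (c1, c2) and extents (e1, e2), e.g. X3 = X1 - W1/2, X4 = X1 + W1/2.
    lo = max(c1 - e1 / 2.0, c2 - e2 / 2.0)
    hi = min(c1 + e1 / 2.0, c2 + e2 / 2.0)
    return max(0.0, hi - lo)

def overlap_ratios(q1, wh1, q2, wh2):
    """OLW = W12 / W1 and OLH = H12 / H1 for images G1 and G2.

    q1, q2: in-plane center positions (x, z) of G1 and G2;
    wh1, wh2: their sizes (width, height).
    """
    w12 = axis_overlap(q1[0], wh1[0], q2[0], wh2[0])
    h12 = axis_overlap(q1[1], wh1[1], q2[1], wh2[1])
    return w12 / wh1[0], h12 / wh1[1]

# G2 is shifted right and up relative to G1 so that only the ends overlap.
print(overlap_ratios((0.0, 0.0), (4.0, 3.0), (3.2, 2.0), (4.0, 3.0)))
# -> (0.2, 0.3333...): 20% horizontal and about 33% vertical overlap
```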
- the calculation of the overlap rate is not limited to the above method; for example, it may be processed by the following method.
- the computer system 1 determines the position of the same thing (represented by a feature point or the like) in each image, and determines the overlapping area between the images based on the position of the same thing.
- the computer system 1 calculates an overlap rate from the size of the overlap area (represented by the number of pixels).
- FIG. 13 is an explanatory diagram of the known SFM (Structure from Motion) method.
- SFM estimates, from a plurality of images obtained by photographing the same object (represented by a plurality of feature points) from a plurality of viewpoints (camera positions), the three-dimensional coordinates of the object together with the camera direction and the camera position (camera parameters). For example, this information can be obtained using three images from three camera positions.
- the three-dimensional coordinates are estimated and calculated from how much the same feature point of the target object is displaced between the images, that is, from the parallax produced by the movement of the camera.
- the calculated three-dimensional coordinates represent the structure and shape of the object.
- in the example of FIG. 13, a feature point group (for example, feature points A and B) is extracted from each image, and the three-dimensional coordinates of the feature point group are obtained by a factorization method or the like; for example, the coordinates (Xa, Ya, Za) of the feature point A and the coordinates (Xb, Yb, Zb) of the feature point B are obtained.
- the camera direction (e.g., directions J1 to J3) and the camera position (e.g., positions E1 to E3) are also estimated.
- various methods derived from the SFM method, as well as other methods, can be applied in the same way.
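- for reference, the two-view core of an SFM pipeline can be sketched as follows. This is not the implementation used by the embodiment (no library is named in the text); it is a minimal illustration in Python with OpenCV, assuming a known camera intrinsic matrix K. Feature points are matched between two images, the relative camera pose (direction and position, up to scale) is recovered from the essential matrix, and the matched points are triangulated into three-dimensional coordinates.

```python
import cv2
import numpy as np

def two_view_sfm(img1, img2, K):
    """Minimal two-view structure-from-motion sketch.

    img1, img2: grayscale views of the same surface from two camera
    positions; K: 3x3 intrinsic matrix. Returns the relative rotation R,
    the translation direction t, and the triangulated 3-D feature points.
    """
    orb = cv2.ORB_create()                      # extract feature points
    k1, d1 = orb.detectAndCompute(img1, None)
    k2, d2 = orb.detectAndCompute(img2, None)
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = matcher.match(d1, d2)             # same points in both views
    pts1 = np.float32([k1[m.queryIdx].pt for m in matches])
    pts2 = np.float32([k2[m.trainIdx].pt for m in matches])
    E, _ = cv2.findEssentialMat(pts1, pts2, K, method=cv2.RANSAC)
    _, R, t, _ = cv2.recoverPose(E, pts1, pts2, K)  # pose from parallax
    P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
    P2 = K @ np.hstack([R, t])
    pts4d = cv2.triangulatePoints(P1, P2, pts1.T, pts2.T)
    return R, t, (pts4d[:3] / pts4d[3]).T       # e.g. (Xa, Ya, Za), ...
```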
- FIG. 14 shows a first example of the list screen 401 in the embodiment.
- a list of image groups is displayed.
- a plurality of images of the image group 201 (for example, # 1 to # 24) are arranged vertically and horizontally as thumbnails, in an order based on the ID and the shooting date and time in the image information 202.
- These image groups are, for example, image groups taken in consideration of a predetermined overlapping rate for generating a three-dimensional model.
- not only the thumbnail but also items of the image information 202 may be displayed for each image.
- FIG. 15 shows a first example of the individual screen 402.
- the first image selected by the user from the list screen 401 is displayed in a sufficiently large main area frame 1501.
- when the user selects a link image on the individual screen 402, the corresponding second image is selected and displayed in the frame 1501.
- the user can return to the list screen 401 by performing a predetermined operation such as pressing a “return to list” button 1502 in the individual screen 402.
- a link image 403 is displayed around the first image.
- the link image 403 is, in other words, a navigation image.
- the link images 403 are displayed outside the first image (frame 1501) at the top, bottom, left, right, and diagonally (upper right, lower right, upper left, lower left) positions.
- the link image 403 is shown as having an arrow shape, but is not limited thereto, and may be a GUI component, an icon image, or the like having a predetermined shape.
- Another example of the link image 403 may be a thumbnail image.
- these link images 403 are GUI components for selecting, as the next image, the adjacent image in the direction indicated by the arrow within a certain plane. The user selects a link image 403 by an operation such as tapping.
- in the GUI example of FIG. 15, all of these link images 403 are displayed in each direction regardless of the presence or absence of an actual adjacent image in each direction.
- the user selects and operates a desired link image 403 (for example, the right arrow). If there is an adjacent image in the direction represented by the selected link image 403, that adjacent image is selected as the second image; the screen display state changes so that the second image is displayed in the frame 1501 of the individual screen 402 instead of the first image. The second image is then treated as the new first image, and the link images 403 are displayed in the same way. If there is no adjacent image in the direction represented by the link image 403 selected by the user, no transition is made.
- the search is made in the direction represented by the link image 403 within the same plane (for example, the side A1) where the first image is located; if there is an adjacent image there, it is selected. If there is no adjacent image in that direction in the same plane, but there is an image in another plane in the back or front direction across the aforementioned corner, that image is selected.
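- the transition behavior just described can be summarized in a few lines. In the sketch below (Python, with illustrative names only), find_adjacent is assumed to implement the search described above, first within the same plane and then across the corner, and frame stands for the frame 1501.

```python
def on_link_selected(current_image, direction, find_adjacent, frame):
    # Look up the adjacent image in the direction of the selected link
    # image 403 (e.g. "right", "lower_left", "right_back").
    nxt = find_adjacent(current_image, direction)
    if nxt is None:
        return current_image   # no adjacent image: no transition occurs
    frame.display(nxt)         # the second image replaces the first image
    return nxt                 # and is treated as the new first image
```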
- the link image 403 may be displayed so as to overlap the image in the frame 1501.
- the adjacent image may be selected according to a predetermined operation specified in advance by user setting or the like without displaying the link image 403.
- a function as a link in each direction may be assigned to each key of the keyboard.
- an adjacent image in the direction corresponding to a side of the frame 1501 may be selected by a predetermined operation near that side. For example, a selection operation near the right side of the frame 1501 performs the same function as the selection operation of the right arrow.
- a well-known operation such as dragging and swiping may be used instead of the link image 403. For example, when a swipe operation from right to left is performed within the frame 1501, the same function as the selection operation of the right arrow is performed.
- FIG. 16 shows a second example of the individual screen.
- in this example, the type and direction of the displayed link image are changed according to the situation, and a specific link image 404 is displayed.
- the link image 404 has a different shape from the link image 403. For example, if there is no adjacent image on the right side within the same side surface as the first image, but there is an image on another side surface near or across the corner, a link image 404 indicating that fact is displayed.
- when there is an adjacent image in the right back direction with respect to the first image (for example, image # 21 of side surface a3 with respect to image # 3 of side surface a2 near corner 602 in FIG. 25), a link image 404 representing the right back direction is displayed.
- similarly, when there is an adjacent image in the right front direction, a link image 404 indicating the right front direction is displayed.
- a link image 405 is shown as an example of another link image. If there is an adjacent image above the first image in the same plane in the Z direction (direction Z1), a link image 403 of an arrow indicating the upward direction is displayed. If there is no such adjacent image but an image corresponding to a horizontal plane such as the roof of the structure 5 exists, a link image 405 indicating that fact is displayed.
- the link image 405 has a shape (for example, a shape that is obliquely bent) representing the upper back direction. By the selection operation of the link image 405, the image located in the upper depth direction can be selected as the next image.
- a link image 406 is shown as an example of another link image.
- the link image 406 is displayed when, in a certain direction in the same plane, there is no adjacent image overlapping the first image, but there is an image at a distance without overlap.
- the link image 406 has a different shape from the link image 403. For example, when there is no adjacent image below the first image (direction Z2) but there is an image at a distant position, a link image 406 representing the distant position in the downward direction is displayed. By selecting the link image 406, the image at that distant position can be selected.
- FIG. 17 shows a third example of the individual screen.
- the link image 403 is limitedly displayed according to the presence or absence of the adjacent image.
- adjacent images exist at the left, lower, and lower left positions as adjacent images in the same plane with respect to the first image, and do not exist at other positions.
- a link image 403 of three arrows indicating left, lower, and lower left is displayed.
- a link image 404 of an arrow indicating the right back direction is also displayed, as shown in the figure.
- the amount of display information is limited, so that the user can work more easily.
- FIG. 18 shows a fourth example of the individual screen.
- in this example, the image content of the overlapping area with the adjacent image is not displayed as it is; the overlapping area is deleted (in other words, masked). More specifically, the overlapping area is displayed blacked out or hatched, for example.
- the first image (or the second image) has an overlapping area 408.
- for example, when the image to the right of the previously displayed image is selected as the next image and displayed in the frame 1501, the region near its left side overlaps the vicinity of the right side of the previous image; that region is the overlapping area 408 (indicated by a broken line for the sake of explanation).
- the user does not need to look at the image content of the overlapping area 408 in the selected image, and the amount of visual information is reduced, so the work is easier to perform.
- since the user knows that the overlapping area 408 of this image was already subjected to the deterioration inspection in the previous image, the overlapping area 408 need not be inspected again when inspecting this image. Therefore, the efficiency of the deterioration inspection work can be improved.
- the overlapping area 408 may be further provided with information indicating that the deterioration has been inspected.
- the overlapping area 408 may be displayed in a frame line expression.
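- a masking step of this kind is straightforward; the sketch below (Python with NumPy, under the assumed representation that the overlap is an axis-aligned pixel box) blacks out the already-inspected overlapping area before display. A hatched or frame-line expression could be drawn instead.

```python
import numpy as np

def mask_overlap(image, box):
    """Black out the already-inspected overlapping area (area 408).

    image: H x W x 3 uint8 array shown in frame 1501;
    box: (x0, y0, x1, y1) pixel bounds of the overlap with the
    previously inspected image.
    """
    out = image.copy()
    x0, y0, x1, y1 = box
    out[y0:y1, x0:x1] = 0   # blacked-out expression of the overlap area
    return out
```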
- FIG. 19 shows a fifth example of the individual screen.
- the unphotographed area (missing area) is displayed in a predetermined expression so that the user can understand.
- An unphotographed area (missing area) is an area where a corresponding image does not exist.
- FIG. 19A shows an example of the relationship between two images. The case where the image 191 and the image 192 overlap and the overlapping rate is large is shown.
- the unphotographed area 193 is an area outside these two images, and is an example of an area that is not photographed in the image group including these two images. Among the areas such as the side surfaces of the structure 5, the unphotographed area 193 is one for which no image exists.
- this unphotographed area 193 cannot be a target of deterioration inspection or the like.
- on the individual screen 402, such a portion of the unphotographed area 193 is displayed so that the user can recognize it. This allows the user to know the location of the unphotographed area 193, recognize a location where the deterioration inspection has been omitted, and take another action such as separately photographing the location and performing the deterioration inspection.
- the portion marked (2) shows another example, an unphotographed area 193b.
- the unphotographed area 193b is an area where there is a gap between the image 192 and the image 194. Basically, the images are photographed so as to be overlapped between the images. However, for some reason (for example, an error during aerial photographing), it is conceivable that such a gap is generated and the unphotographed area 193b is generated. In this case as well, the image is displayed so that the existence of the unphotographed area 193b can be recognized.
- parts (B) and (C) of FIG. 19 show display examples of the unphotographed area 193 on the individual screen 402 corresponding to (A).
- the example of (B) shows an example in which the selected image 191 is displayed in the frame 1501 at the image size of the image 191 as it is.
- the overlapping image 192 is also indicated by a frame line.
- an unphotographed area 193 exists outside the frame 1501.
- an image representing the unphotographed area 193 may be displayed in a predetermined expression (for example, a dot pattern).
- the example of (C) shows a case in which the image 191 is shifted (translated) and displayed according to a predetermined operation of the user from the state of (B).
- the image display system also has a function of shifting an image displayed in the frame 1501 on the individual screen 402, a function of enlarging / reducing the image, and the like.
- this shift display is enabled by an operation such as dragging or swiping on the first image, and enlargement / reduction display is enabled by a pinch operation or the like on the first image.
- the center position (position Q1a) of the image 191 has been moved, for example, to a slightly upper-left position Q1b by the shift operation from the state of (B). Due to this shift, a part of the image 191, a part 193c of the unphotographed area 193, and a part 192c of the image 192 enter the frame 1501. On the individual screen 402, the image portion included in the frame 1501 is displayed, and the part 193c of the unphotographed area 193 is also displayed in a predetermined expression.
- FIG. 20 shows a sixth example of the individual screen.
- the image is displayed by a frame line or the like so that the user can recognize the existence of the adjacent image.
- an outline image 409 (for example, a broken line) representing the outline (sides) of the adjacent image is displayed. This allows the user to know, for example, that there is an adjacent image on the right side of the first image, and also to see the degree of overlap.
- if the user wants to select the adjacent image, the user operates the corresponding link image 403 or the outline image 409. As a result, the adjacent image is selected and displayed in the frame 1501.
- a GUI component 1503 (an overlap ratio threshold setting field) for user setting of the overlap ratio threshold TOL is displayed in the individual screen 402.
- the GUI component 1503 is provided with a slide bar or the like so as to display the currently set overlapping rate threshold TOL and allow the user to change the TOL value.
- the user can check the state of the TOL value on the individual screen 402.
- the user can change the setting of the TOL value by operating the GUI component 1503 according to the application, the working state, and the like.
- the actual overlap ratio between the first image and the adjacent image may be displayed.
- the threshold may be set not only as a single value but also as a range defined by two values.
- for example, when the range of the overlap rate is set to 40% or more and 60% or less, an image whose overlap rate falls within that range can be selected.
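- with a range set in this way, the candidate filtering reduces to a one-line check; a sketch follows, assuming a hypothetical overlap(first, candidate) helper that returns the overlap ratio between two images.

```python
def images_in_overlap_range(first, candidates, overlap, lo=0.40, hi=0.60):
    # Candidates whose overlap ratio with the first image falls within
    # the range set via GUI component 1503 (e.g. 40% <= OL <= 60%).
    return [c for c in candidates if lo <= overlap(first, c) <= hi]
```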
- FIG. 21 shows an example of the deterioration inspection screen.
- in this example, the individual screen 402 serves directly as the deterioration inspection screen.
- the user can perform the deterioration inspection work on the image as it is.
- the selected image is displayed as a deterioration inspection target image in a frame 1501.
- the deterioration inspection includes, for example, a visual inspection by a person and an automatic diagnosis by a computer.
- the present image display system supports both of these functions.
- the user performs a visual inspection on the image in the deterioration inspection screen.
- the user visually checks this image and determines whether or not there is any deterioration and the location. For example, a case where there is a deteriorated portion 2101 such as a crack in an image is shown.
- as an example of deterioration inspection support processing, a deteriorated spot found by the user through visual inspection can be marked and displayed on the deterioration inspection screen.
- the user performs marking on the deteriorated portion 2101 by a predetermined operation, for example.
- An example of a marking operation is an operation of enclosing a frame by dragging or the like.
- a red frame 2102 is displayed as an example of the marking image.
- image information and structure information may be displayed outside the frame 1501. Also outside the frame 1501, a check item for the deterioration inspection (where an inspected/uninspected state can be input), an item for the presence or absence of deterioration, a comment input field, and the like may be provided.
- the computer system 1 estimates a deteriorated portion by using a target image as an input.
- the image display system displays automatic diagnosis result information by the computer system 1 on a deterioration inspection screen. For example, the estimated deteriorated portion is displayed in a predetermined expression. The user confirms or makes a final decision by looking at the result information.
- a deterioration inspection screen may be provided separately from the individual screen 402, and transition between the screens may be made based on a predetermined operation (for example, double tap).
- FIG. 22 shows an example of a structure three-dimensional model screen.
- This image display system displays a structure three-dimensional model screen when a predetermined operation is received on the list screen 401 or the like.
- the present image display system displays a structure three-dimensional model in an area within this screen based on the three-dimensional model data d5, and also displays structure information and the like. In this area, for example, a three-dimensional structure is superimposed on a map (eg, a site) and displayed in a perspective view. According to a predetermined operation on this screen, the direction and position of viewing the three-dimensional structure model can be changed.
- a region such as a side surface may be selected and designated by a predetermined operation of the user with respect to the three-dimensional structure model.
- the image display system displays an image group associated with the aspect on the list screen 401.
- an image associated with the position may be displayed on the individual screen 402.
- FIG. 23 shows a second example of the list screen.
- a group of images to be subjected to deterioration inspection is displayed in a list.
- the user selects a group of images to be subjected to deterioration inspection through the individual screen 402 described above.
- the image display system displays a list of the images (selected images) to be subjected to the deterioration inspection on the list screen 401.
- the selected image group may be displayed by a predetermined operation (for example, pressing a selected image group display button) in the list screen 401.
- deterioration inspection information (for example, inspection completed, presence or absence of deterioration, etc.) may also be displayed for each image.
- FIG. 24 shows a third example of the list screen.
- This list screen 401 shows, as a modification example of the list screen 401 of FIG. 23, another method for displaying a list of the image group to be subjected to deterioration inspection.
- the image group to be subjected to the deterioration inspection is arranged and displayed in a positional relationship along the shape of the surface area of the structure 5 (or the generated structure three-dimensional model).
- an image group (adjacent image group) to be subjected to deterioration inspection is displayed for each selected side group.
- an area 2401 corresponding to the side surface of the selected group (that is, a partially planar area) is displayed.
- a plurality of adjacent images (images # 1 to # 12) are arranged and displayed according to the actual positional relationship.
- each image may be displayed in a predetermined expression so that its state, such as inspection completed or the presence or absence of deterioration, can be understood.
- when the user selects an image from this list, the image is displayed on the individual screen 402.
- a link image 2402 for selecting another side group may be displayed.
- when this link image is operated, the group of the side surface a3 is selected and displayed in the area 2401 in the same manner.
- in this example, right and left arrows are used, but the present invention is not limited to this, and various expressions can be used.
- the link image 2402 may have an arrow shape indicating that the link image changes in a three-dimensional space in a depth direction or a front direction.
- a link image 2402 for a link from the side surface a2 to the side surface a3 is displayed in a curved arrow shape representing the right back.
- the present image display system determines the positional relationship between the images using information on the position and direction of each image (for example, the camera position C and the camera direction V), and has the function of determining the next image for a certain image.
- for efficiency, the set of candidate images is narrowed using the positional information of the images.
- FIG. 25 shows an example of an image relationship between the side surfaces of the structure 5 corresponding to the example of FIG.
- examples of adjacent images on the side surface a2 and the side surface a3 of the structure 5 are shown.
- the adjacent image indicates an image group selected as an image suitable for deterioration inspection.
- although each image is indicated by a broken-line frame, in reality the ends of the images partially overlap.
- on the side surface a2, the corner 601 is on the left side in the plane (the far side in the Y direction, direction Y1), and the corner 602 is on the right side in the plane (the near side in the Y direction, direction Y2).
- on the side surface a3, the corner 602 is on the left side in the plane (direction X1), and the corner 603 is on the right side in the plane (direction X2).
- on the side surface a2, an image group is captured by aerial photography with the scanning described above. Examples of the adjacent image group selected so as to minimize the overlap among that image group are shown as images # 1 to # 12.
- on the side surface a3, an image group is likewise captured by aerial photography with the scanning described above.
- examples of the adjacent image group selected so as to minimize the overlap among that image group are shown as images # 21 to # 32. These image numbers also correspond to the order of the shooting date and time. Images that are not selected because of a high overlap rate are not shown.
- the image display system displays one image selected by the user on the individual screen 402. For example, assume that image # 3 is selected as the first image. In this case, considering only the YZ plane of the side surface a2, there are images # 2, # 4, and # 5 as adjacent images. Here, in a three-dimensional space, as shown, at the corner 602, the left end of the side surface a3 is connected to the right end of the side surface a2. When the user looks at the image # 3 on the individual screen 402, there is no structure 5 on the right side (direction Y2) of the image # 3, so there is no adjacent image.
- in applications such as deterioration inspection, it is useful for the present image display system to be able to select and display across these images even in a positional relationship such as that between the image # 3 and the image # 21.
- the three-dimensional positional relationship between the images is determined using the information on the position and direction of the image (FIG. 26), the adjacent images between the side surfaces are also determined, and a link image is determined.
- in this case, not only the link images indicating the left and the bottom but also a link image indicating the right back (the link image 404 in FIG. 17) is displayed for the image # 3.
- when determining the adjacent image between the side surfaces, the present image display system mainly determines the three-dimensional positional relationship and the distance, not the overlap ratio in the same plane (S10 in FIG. 4 and the like).
- FIG. 26 shows an example of image selection determination between the side surfaces of the structure 5 corresponding to FIG.
- the side surface a1, the side surface a2, the side surface a3, the corner 601 and the corner 602 are shown in a bird's-eye view.
- Positions c1, c2, and c3 are shown as examples of the camera position C at the time of shooting along the route r1 of the side surface a1. These positions are positions near the corner 601 of the side surface a1.
- at any position, the camera direction is the direction Y1 and the target distance D0 is constant. For example, an image g11 (in particular, the width of its photographing range) photographed at the position c3 is shown.
- positions c4, c5, c6, and the like are shown as examples of the camera position C at the time of shooting along the route r2 of the side surface a2. These positions are positions near the corner 601 of the side surface a2. At any position, the camera direction is the direction X2, and the target distance D0 is constant. For example, it shows an image g12 (particularly the width of the photographing range) photographed at a position c4 (it is assumed to be almost the same position as the position c3).
- the image display system determines not only the adjacent image in the same side but also the image (image g12) of the different side as the adjacent image to the image g11 of the side a1. At this time, the present image display system calculates and determines the distance between the camera positions C using the position information of the camera position C of each image.
- the image display system determines, for example, an image in which the distance between the camera positions C is the smallest with respect to the position c3 of the image g11, except for images in the same side surface (images of the positions c1, c2, etc.).
- the position c4 has the smallest distance with respect to the position c3. Therefore, the image g12 at the position c4 is picked up as an adjacent image between the side surfaces.
- the image display system associates the image g12 with the image g11 as an adjacent image between the side surfaces (between different side surfaces, it is called an adjacent image even if there is no overlap), and displays a corresponding link image.
- the present image display system may determine the above distance approximately using information on the drone position P instead of using the camera position C. Further, as another processing example, the present image display system may determine the distance using the information of the corresponding image and the position Q of the center of the shooting range instead of the camera position C. For example, it is assumed that the center position of the image g11 is the position q3 and the center position of the image g12 is the position q4. The image display system determines an image in which the distance between the center positions of the images is the smallest other than the images in the same side surface. In this example, the position q4 has the smallest distance with respect to the position q3. Therefore, the image g12 at the position q4 is picked up as an adjacent image between the side surfaces.
- the adjacent image can be determined also in the relationship between the side surface a2 and the side surface a3 with the corner 602 interposed therebetween, as described above.
- similarly, when the image g13 at the position c13 (or the position q13) on the side surface a2 is the first image, the image g14 at the position c14 (or the position q14) on the side surface a3 can be picked up as its adjacent image.
- Such a transition of the display image via the link image corresponds to a change in the user's line-of-sight direction corresponding to the change in the camera direction C.
- the GUI of the present image display system enables image selection and display image transition in a direction along the surface shape of such a three-dimensional structure 5. Thereby, the user can more efficiently perform the work according to the use such as the deterioration inspection.
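- the distance-based determination of FIG. 26 can be sketched as follows (Python; the per-image data layout is an assumption made for illustration). Images on the same side surface are excluded, and among the remaining images the one whose camera position C (or center position Q) is closest to that of the first image is picked up as the adjacent image between side surfaces, e.g. the position c4 for the position c3.

```python
import math

def nearest_cross_side_image(first, images):
    """Pick the adjacent image on another side surface.

    first and each entry of images are dicts with a position 'c'
    as (x, y, z) (the camera position C or the center position Q)
    and a side-surface label 'side'.
    """
    others = [im for im in images if im["side"] != first["side"]]
    if not others:
        return None
    return min(others, key=lambda im: math.dist(im["c"], first["c"]))
```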
- the present invention is not limited to the above-described embodiment in which the positional relationship between images is determined using information on the position and direction of the image.
- the positional relationship of the images may be determined based on the determination of the feature points and the like extracted from within the image by the image processing.
- FIG. 27 shows a third example of an aerial photographing method of another structure 5 as an application example.
- the structure 5 of this example has a columnar shape.
- the following aerial photography method is used.
- FIG. 27A shows a perspective view, and FIG. 27B shows an overhead view in the Z direction.
- the drone 10 and the camera 4 are moved, for example, horizontally and circumferentially along the curved side surface A30 so that the target distance D0 is substantially constant, following a route R30.
- the camera is controlled so as to face, for example, perpendicularly to the side surface A30 of the curved surface.
- an image g31 at the camera position c31 on the route R30, an image g32 at the camera position c32, an image g33 at the camera position c33, and the like are shown.
- the processing of the image display system can be similarly applied to an image group obtained by such an aerial photography method, and substantially the same effect can be obtained.
- the image display system similarly determines the overlap ratio, adjacent images, and the like for a plurality of images (imaging ranges) formed along the side surface A30. In this case, on the individual screen 402, each image can be selected in a direction along the side surface A30.
- this aerial photography method in particular, by setting the target distance D0 to be constant, it is possible to easily perform the deterioration inspection and to ensure the accuracy of the SFM processing.
- the aerial photographing method is not limited to this; a method of dividing the route into a plurality of linear routes, as described above, may also be used.
- as shown in the flow of FIG. 4 (S6), the computer system 1 executes the overlap ratio processing and the adjacent image determination each time a certain first image is selected by a user operation.
- however, the processing timing is not limited to this; other timings are possible.
- for example, the computer system 1 (for example, the server 3) may execute the overlap ratio calculation and the adjacent image determination for the image group as a batch process, at the timing when the data of the image group is input or when execution of the batch processing is designated by the user in advance.
- the target image group can be collectively selected with respect to the original image group according to the use such as the deterioration inspection.
- a group of adjacent images having a positional relationship in a three-dimensional space can be configured.
- the computer system 1 manages the information of the adjacent image group formed by this processing together with the three-dimensional model data of the structure 5 and the like. Then, the computer system 1 can display information of the adjacent image group on a GUI screen (for example, FIGS. 23 and 24) in accordance with a user operation.
- the input image group may be changed. For example, some images may be added later.
- for example, a second round of photography may be performed targeting an unphotographed area.
- in that case, the computer system 1 performs the same processing on the added images. That is, the computer system 1 re-executes the overlap ratio calculation and the adjacent image determination process using each additional image as the first image and the processed image group as the candidate images. Thereby, an adjacent image group is formed for the additional image, and the relationships within the image group are reconstructed and updated.
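- the incremental update just described can be outlined as follows (a Python sketch; the adjacency mapping and compute_links are illustrative assumptions standing in for the overlap-ratio calculation and the adjacent-image determination).

```python
def update_adjacency(adjacency, processed, new_images, compute_links):
    """Link additional images into the already-processed image group.

    adjacency: image id -> set of adjacent image ids computed so far;
    processed: list of already-processed images (ids present as keys);
    new_images: images added later (e.g. a second shoot of an
    unphotographed area);
    compute_links(img, others): returns the adjacent images of img.
    """
    for img in new_images:
        neighbors = compute_links(img, processed)
        adjacency[img.id] = {n.id for n in neighbors}
        for n in neighbors:               # rebuild the relation from the
            adjacency[n.id].add(img.id)   # other side as well
        processed.append(img)
    return adjacency
```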
- with the image display system of the embodiment, when selecting an image for a use such as deterioration inspection from a group of images taken by the camera 4 of the drone 10, a suitable image can be selected easily, which reduces the user's work. Depending on the application, the user often checks and works on a certain image (first image), then selects from the images surrounding it, and checks and works on that image (second image). In such cases, the present image display system realizes selection and operation of the image group efficiently. In the embodiment, the user can efficiently perform the visual inspection for the purpose of the deterioration inspection. Even when automatic diagnosis is performed by the computer, the computer can efficiently process the selected image group.
- the image display system has a mechanism for judging an overlapping state of an object in an image with respect to a plurality of two-dimensional images having a positional relationship in a three-dimensional space, It has a mechanism such as a GUI for navigating selection and transition between images according to the overlapping state.
- when the user selects the first image and displays it on the individual screen 402, the system searches for images around the first image, determines the adjacent images based on the overlap rate, and presents the corresponding link images.
- since this processing is performed efficiently, the user's waiting time can be reduced.
- the image display system is not limited to use for deterioration inspection and three-dimensional model generation, but can be applied to other uses. Even when applied to other uses, a suitable image can be selected according to the overlap rate according to the use between the image group and the image, and work and processing according to the use can be efficiently performed.
- in the embodiment, the input image group is obtained by hand-held photography or by using the drone 10.
- the present invention is not limited to this, and the input image group may be obtained by any means.
- for the present image display system, a case has been described in which an image group used both for the purpose of the deterioration inspection and for the generation of the three-dimensional model is handled.
- a case has been described in which a partial image group suitable for deterioration inspection is selected from an image group photographed so that a three-dimensional model can be generated.
- the present invention is not limited to this, and the present image display system is effective when a second image group for another second use is selected from a first image group taken for a certain first use.
Landscapes
- Engineering & Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Computer Hardware Design (AREA)
- Immunology (AREA)
- Signal Processing (AREA)
- Biochemistry (AREA)
- Analytical Chemistry (AREA)
- Pathology (AREA)
- Chemical & Material Sciences (AREA)
- Multimedia (AREA)
- General Health & Medical Sciences (AREA)
- Life Sciences & Earth Sciences (AREA)
- Health & Medical Sciences (AREA)
- Processing Or Creating Images (AREA)
- Investigating Materials By The Use Of Optical Means Adapted For Particular Applications (AREA)
- Closed-Circuit Television Systems (AREA)
- Controls And Circuits For Display Device (AREA)
Abstract
The invention provides a technique that allows a suitable image to be selected easily when selecting an image for a predetermined use from a group of images captured by a camera, thereby reducing the user's workload. A computer system (1) of an image display system according to the invention inputs a captured image group (201), displays a list of the image group (201) on a list screen (401), and displays on an individual screen (402) a first image selected from the image group (201) according to a user operation. The computer system (1) determines an image adjacent to the first image based on a determination of the spatial positional relationship in a set consisting of the first image and surrounding candidate images, together with a determination of the overlapping state of the imaging ranges, and, according to a user operation, selects the image adjacent to the first image and displays it on the individual screen (402).
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
SG11201808735TA SG11201808735TA (en) | 2018-06-29 | 2018-08-24 | Image display system and method |
CN201880001267.9A CN110915201B (zh) | 2018-06-29 | 2018-08-24 | Image display system and method |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2018124978A JP7051616B2 (ja) | 2018-06-29 | 2018-06-29 | Image display system and method |
JP2018-124978 | 2018-06-29 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2020003548A1 true WO2020003548A1 (fr) | 2020-01-02 |
Family
ID=68985372
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2018/031364 WO2020003548A1 (fr) | Image display system and method | 2018-06-29 | 2018-08-24 |
Country Status (4)
Country | Link |
---|---|
JP (1) | JP7051616B2 (fr) |
CN (1) | CN110915201B (fr) |
SG (1) | SG11201808735TA (fr) |
WO (1) | WO2020003548A1 (fr) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2023188510A1 (fr) * | 2022-03-29 | 2023-10-05 | FUJIFILM Corporation | Image processing device, image processing method, and program |
Families Citing this family (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111510678B (zh) * | 2020-04-21 | 2021-12-24 | Shanghai Goertek Robot Co., Ltd. | Unmanned aerial vehicle image transmission control method, device, and system |
JP7003352B1 (ja) | 2021-04-12 | 2022-01-20 | Mitsui E&S Machinery Co., Ltd. | Structure inspection data management system |
US20240249486A1 (en) * | 2021-05-20 | 2024-07-25 | Nec Corporation | Measurement condition optimization system, three-dimensional data measurement system, measurement condition optimization method, and non-transitory computer-readable medium |
CN114778558B (zh) * | 2022-06-07 | 2022-09-09 | Chengdu Zongheng Tongda Information Engineering Co., Ltd. | Bridge monitoring device, system, and method based on video images |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2000090232A (ja) * | 1998-09-08 | 2000-03-31 | Olympus Optical Co Ltd | Panoramic image synthesis apparatus and recording medium storing a panoramic image synthesis program |
JP2006099497A (ja) * | 2004-09-30 | 2006-04-13 | Seiko Epson Corp | Panoramic image synthesis |
JP2006098256A (ja) * | 2004-09-30 | 2006-04-13 | Ricoh Co Ltd | Three-dimensional surface model creation system, image processing system, program, and information recording medium |
JP2013058124A (ja) * | 2011-09-09 | 2013-03-28 | Sony Corp | Information processing apparatus, information processing method, and program |
JP2018074757A (ja) * | 2016-10-28 | 2018-05-10 | Toshiba Corporation | Patrol inspection system, information processing apparatus, and patrol inspection control program |
Family Cites Families (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102819752B (zh) * | 2012-08-16 | 2015-04-22 | Beijing Institute of Technology | Outdoor large-scale object recognition method and system based on distributed and inverted files |
JP6335395B2 (ja) * | 2015-09-30 | 2018-05-30 | FUJIFILM Corporation | Image processing apparatus, imaging apparatus, image processing method, and program |
2018
- 2018-06-29 JP JP2018124978A patent/JP7051616B2/ja active Active
- 2018-08-24 SG SG11201808735TA patent/SG11201808735TA/en unknown
- 2018-08-24 CN CN201880001267.9A patent/CN110915201B/zh active Active
- 2018-08-24 WO PCT/JP2018/031364 patent/WO2020003548A1/fr active Application Filing
Also Published As
Publication number | Publication date |
---|---|
JP7051616B2 (ja) | 2022-04-11 |
JP2020005186A (ja) | 2020-01-09 |
CN110915201B (zh) | 2021-09-28 |
SG11201808735TA (en) | 2020-01-30 |
CN110915201A (zh) | 2020-03-24 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2020003548A1 (fr) | Image display system and method | |
US11783543B2 (en) | Method and system for displaying and navigating an optimal multi-dimensional building model | |
US10818099B2 (en) | Image processing method, display device, and inspection system | |
JP6918672B2 (ja) | Deterioration diagnosis system | |
JP5538667B2 (ja) | Position and orientation measurement apparatus and control method thereof | |
JP5248806B2 (ja) | Information processing apparatus and information processing method | |
JP4375320B2 (ja) | Mobile robot | |
EP3683647B1 (fr) | Method and apparatus for planning sample points for surveying and mapping | |
JP7332353B2 (ja) | Inspection system and inspection method | |
US20150062123A1 (en) | Augmented reality (ar) annotation computer system and computer-readable medium and method for creating an annotated 3d graphics model | |
US20200175753A1 (en) | Apparatus and method of three-dimensional reverse modeling of building structure by using photographic images | |
JP2018004541A (ja) | Information processing apparatus, information processing method, and program | |
JP5229733B2 (ja) | Stereo matching processing apparatus, stereo matching processing method, and program | |
JP2017182695A (ja) | Information processing program, information processing method, and information processing apparatus | |
US20210201522A1 (en) | System and method of selecting a complementary image from a plurality of images for 3d geometry extraction | |
US11395102B2 (en) | Field cooperation system and management device | |
JP2020021465A (ja) | Inspection system and inspection method | |
JP2020060907A (ja) | Lightning protection range generation system and program | |
CN111581322B (zh) | Method, apparatus, and device for displaying a region of interest of a video within a map window | |
KR200488998Y1 (ko) | Indoor map construction apparatus | |
JP2020021466A (ja) | Inspection system and inspection method | |
WO2020234912A1 (fr) | Mobile device, position display method, and position display program | |
CN114270405A (zh) | Image processing method and image processing apparatus for generating three-dimensional content using two-dimensional images | |
KR101756313B1 (ko) | Member position guidance apparatus | |
TWI813480B (zh) | Point cloud data synthesis device, method, system, and computer-readable recording medium | |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 18924439; Country of ref document: EP; Kind code of ref document: A1 |
NENP | Non-entry into the national phase | Ref country code: DE |
122 | Ep: pct application non-entry in european phase | Ref document number: 18924439; Country of ref document: EP; Kind code of ref document: A1 |