KR101354688B1 - System and method for supervision of construction site - Google Patents

System and method for supervision of construction site Download PDF

Info

Publication number
KR101354688B1
KR101354688B1 (Application KR1020130078864A)
Authority
KR
South Korea
Prior art keywords
information
construction site
image
terminal
detecting
Prior art date
Application number
KR1020130078864A
Other languages
Korean (ko)
Inventor
김균태
임명구
김구택
Original Assignee
한국건설기술연구원
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 한국건설기술연구원 filed Critical 한국건설기술연구원
Priority to KR1020130078864A priority Critical patent/KR101354688B1/en
Application granted granted Critical
Publication of KR101354688B1 publication Critical patent/KR101354688B1/en

Links

Images

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING; COUNTING
    • G06QDATA PROCESSING SYSTEMS OR METHODS, SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL, SUPERVISORY OR FORECASTING PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL, SUPERVISORY OR FORECASTING PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00Systems or methods specially adapted for specific business sectors, e.g. utilities or tourism
    • G06Q50/08Construction
    • GPHYSICS
    • G06COMPUTING; CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/0416Control or interface arrangements specially adapted for digitisers
    • GPHYSICS
    • G06COMPUTING; CALCULATING; COUNTING
    • G06QDATA PROCESSING SYSTEMS OR METHODS, SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL, SUPERVISORY OR FORECASTING PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL, SUPERVISORY OR FORECASTING PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00Administration; Management
    • G06Q10/06Resources, workflows, human or project management, e.g. organising, planning, scheduling or allocating time, human or machine resources; Enterprise planning; Organisational models
    • GPHYSICS
    • G06COMPUTING; CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T17/00Three dimensional [3D] modelling, e.g. data description of 3D objects

Abstract

A construction site supervision method according to the present invention relates to supervising a construction site where construction is performed using a three-dimensional virtual space modeling system, and includes: photographing actual images of the construction site through a camera of a terminal; detecting vector photographing information of the photographed construction site; transmitting the photographed actual images and the detected vector photographing information to a management server; detecting, by the management server, a three-dimensional model coinciding with the vector photographing information in a database of the three-dimensional virtual space modeling system and detecting each individual object in the detected three-dimensional model; generating matching information by matching each object on the actual image with each individual object of the three-dimensional model; delivering an attribute value of the detected three-dimensional model and the matching information to the terminal; activating, by the terminal, each object on the actual image by combining the transmitted matching information with the actual image; and outputting the activated actual image to the terminal screen and, when a user input touching one of the objects is detected, displaying the attribute information of the corresponding object. According to the present invention, the user can manage and supervise the construction state of the construction site more easily, since the attribute value of each object can be read on the screen simply by photographing the construction site and touching the photographed image. The method also enables a user supervising the construction site to perform supervision work easily with a single portable terminal, without additional advance preparation.
[Reference numerals] (100) Terminal; (200) Managing server; (AA) YES; (S10) Photograph construction field; (S100) Display corresponding attribute value if user select specific object in construction field; (S20) Detect vector photographing information of building at photographing spot; (S30) Transmit data (photographed image & vector photographing information); (S40) Detect data matched (equal or similar) with transmitted data by accessing DB of BIMs; (S50) Transmit identification value of detected information; (S60) Display transmitted identification values on screen; (S70) Does user select identification value of corresponding construction field?; (S80) Request total attribute value; (S90) Transmit total attribute value (including image matching information)
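The message flow summarized by reference numerals S10 to S100 above can be sketched as a minimal client-server simulation. The database contents, the location-matching rule, and all identifiers below are illustrative assumptions and do not appear in the patent:

```python
# Hypothetical simulation of the S10-S100 flow; DB contents and the
# location-matching tolerance are invented for illustration only.
SERVER_DB = {
    "site-A": {"location": (37.570, 126.980),
               "objects": {"wall-1": {"spec": "RC 200mm", "manager": "Kim"}}},
}

def find_matches(vector_info):
    """S40: find models whose stored location is near the shooting location."""
    lat, lon = vector_info["location"]
    return [mid for mid, model in SERVER_DB.items()
            if abs(model["location"][0] - lat) < 0.01
            and abs(model["location"][1] - lon) < 0.01]

def get_attributes(model_id, object_id):
    """S80-S90: return the full attribute values of one detected object."""
    return SERVER_DB[model_id]["objects"][object_id]

vector_info = {"location": (37.571, 126.979)}   # S20: detected at the shot
matches = find_matches(vector_info)             # S30-S50: server-side matching
attrs = get_attributes(matches[0], "wall-1")    # S80-S100: shown on touch
```

In the real system the match would of course use the full vector photographing information (azimuth, tilts, distance), not location alone.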

Description

Construction site supervision system and supervision method {SYSTEM AND METHOD FOR SUPERVISION OF CONSTRUCTION SITE}

The present invention relates to a construction site supervision system and method, and more particularly, to a construction site supervision system and a method using a mobile terminal.

Recently, in the process of designing and constructing a building or structure, a three-dimensional virtual space modeling system (or module) is generally used to generate numerical data for the objects constituting the building and to view three-dimensional display effects. In particular, in projects planning large buildings or structures, planning and designing with such a 3D virtual space modeling system (or module) is considered essential; in the construction field this is called 'BIM'.

Building Information Modeling (BIM) is a three-dimensional virtual space modeling system (or module) that combines geometry with information: facilities are virtually modeled in a multidimensional virtual space, together with their property information, across planning, design, engineering (structure, equipment, electricity, etc.), construction, and maintenance and disposal.

BIM refers to digital models, and the procedures for creating them, that represent the physical or functional characteristics of a facility and provide a reliable basis for making decisions throughout its life cycle, in all areas of construction including architecture, civil engineering, and plants.

BIM is based on the objects of the entire construction field and information on those objects; each object includes shape information (geometry) and attribute information, and is formed with interconnections to other objects. Through this, it is possible to provide visualized information in two or three dimensions and to extract information for calculating quantities such as the dimensions and areas of objects.

BIM data includes the objects and properties of the walls, slabs, windows, doors, roofs, stairs, etc. that make up a building. Since each object has a defined relationship with the others, related elements can be automatically reflected in drawings when a design change occurs. BIM technology also enables integrated management of information at all stages by sharing all data generated per building, per project, and per process, regardless of whether the target building is standard or free-form.
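The idea that BIM objects carry attributes and defined relationships, so that a design change flags related elements, can be illustrated with a toy object graph. The object names, attributes, and propagation rule are invented for demonstration, not part of the patent:

```python
# Illustrative BIM-style object graph: each object holds attribute values and
# a list of related elements that must be re-checked after a design change.
bim_objects = {
    "wall-1":   {"attrs": {"height_mm": 2400}, "related": ["door-1", "window-1"]},
    "door-1":   {"attrs": {"height_mm": 2100}, "related": []},
    "window-1": {"attrs": {"sill_mm": 900},    "related": []},
}

def apply_design_change(object_id, attr, value):
    """Change one attribute and return the related elements to re-check."""
    bim_objects[object_id]["attrs"][attr] = value
    return bim_objects[object_id]["related"]
```

A real BIM kernel would propagate geometric constraints as well, but the attribute-plus-relationship structure is the same.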

In the process of designing and constructing a building or structure, 3D virtual space modeling expresses the walls, slabs, windows, doors, roofs, stairs, and other objects that make up the structure, their relationships with each other, and the changing elements of the building design. However, the model exists only in virtual space and is used mainly for planning and design work; it is rarely used at construction sites for tasks such as construction and supervision.

In projects planning large buildings or structures constructed using a three-dimensional virtual space modeling system, construction work is very complex, and very complex and diverse information must be prepared before supervision. If construction and supervision work at the construction site could be linked online with the 3D virtual space model and its attribute information, the workplace would be safer and workers would be able to perform their work more conveniently and effectively.

SUMMARY OF THE INVENTION An object of the present invention is to provide a construction site supervision system and method for supervising and managing a construction status using a portable terminal interoperating with a 3D virtual space modeling system.

Additional features and advantages of the invention will be set forth in the description which follows, and in part will be apparent from the description, or may be learned by practice of the invention. The objectives and other advantages of the present invention will be realized and attained by the structure particularly pointed out in the claims, as well as the following description and the annexed drawings.

The construction site supervision system according to the present invention detects the vector photographing information (e.g., location information, bearing information, vertical tilt information, horizontal tilt information, and distance information) of a photographed image when a user photographs a construction site, and links it to the 3D virtual space model of the construction site stored in the management server. The user according to the present invention can thus view the attribute values of each object on the photographed image with a simple screen touch at the construction site.

The present invention enables the user to easily view the property values of each object on the screen simply by photographing the construction site and touching the photographed image, so that the construction status of the site can be managed and supervised more easily. A user supervising the construction site can perform the supervision work with just one mobile terminal, without any separate preparation for the site.

In addition, in projects planning huge buildings or structures constructed using a three-dimensional virtual space modeling system, where supervision normally requires prior work such as preparing very complex and diverse information, the supervision work is expected to become more convenient and simple by using the construction site supervision system and mobile terminal according to the present invention.

1 is a block diagram of a mobile terminal according to the present invention.
2 illustrates the concept of an azimuth angle.
3 illustrates the concept of a vertical tilt.
4 illustrates the concept of a horizontal tilt.
5 is an operation flowchart of a supervision system according to the present invention.
6 is a diagram illustrating image capturing according to the present invention.
7 is an operation example showing a live-action image combined with matching information and a user's touch input according to the present invention.
8 is an operation example of the present invention displaying the property value of an object in accordance with the user's touch.

In order to achieve the above object, the construction site supervision method according to the present invention, being a supervision method for a construction site where construction proceeds using a three-dimensional virtual space modeling system, includes:

a process of photographing a live-action image of the construction site through a camera of a terminal; a process of detecting vector photographing information of the photographed construction site; a process of transmitting the photographed live-action image and the detected vector photographing information to a management server; a process of detecting, by the management server, a three-dimensional model that matches the vector photographing information from a database of a three-dimensional virtual space modeling system, and detecting each individual object in the detected three-dimensional model; a process of generating matching information by matching each object on the live-action image with each individual object of the 3D model; a process of transmitting the attribute values and matching information of the detected 3D model to the terminal; a process in which the terminal activates each object on the live-action image by combining the transmitted matching information with the live-action image; and a process of outputting the live-action image with each object activated to the terminal screen and, when a user input touching one of the objects is detected, displaying the property information of the corresponding object.

Preferably, the process of detecting each individual object in the three-dimensional model,

includes detecting a 3D model that matches the vector photographing information from a database of the 3D virtual space modeling system; extracting from the 3D model a modeling image identical to the real image, at the same viewpoint as the terminal camera, with reference to the photographing position information, bearing information, shooting posture information (horizontal tilt and vertical tilt), lens angle information, and photographing frame size and ratio information included in the vector photographing information; and detecting each individual object from the extracted modeling image.
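One way to realize this viewpoint extraction is to project 3D model geometry into the photo frame using the camera pose recovered from the vector photographing information. The sketch below uses a simple pinhole model and assumes a level camera; the axis conventions and parameter names are illustrative assumptions, not the patent's method:

```python
import math

def project(point, cam_pos, azimuth_deg, frame_w, frame_h, hfov_deg):
    """Project a world point (x east, y north, z up) into pixel coordinates
    for a level camera at cam_pos whose lens axis points along azimuth_deg
    (clockwise from true north). Returns None if the point is behind the lens."""
    az = math.radians(azimuth_deg)
    dx, dy, dz = (p - c for p, c in zip(point, cam_pos))
    forward = dx * math.sin(az) + dy * math.cos(az)   # along the lens axis
    right = dx * math.cos(az) - dy * math.sin(az)     # to the camera's right
    if forward <= 0:
        return None                                   # behind the camera
    # focal length in pixels from the horizontal lens angle of view
    f = (frame_w / 2) / math.tan(math.radians(hfov_deg) / 2)
    u = frame_w / 2 + f * right / forward
    v = frame_h / 2 - f * dz / forward
    return (u, v)
```

Projecting every vertex of a model object this way yields its footprint in the photographed frame, which is what the matching information has to encode. The vertical and horizontal tilts would add two more rotations omitted here for brevity.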

In order to achieve the above object, the portable terminal for construction site supervision according to the present invention, for a construction site where construction proceeds using a three-dimensional virtual space modeling system, includes:

a camera unit for photographing a photorealistic image of the construction site through a camera of the terminal; a vector photographing information detection unit for detecting vector photographing information of the photographed construction site; a wireless communication unit which transmits the photographed photorealistic image and the detected vector photographing information to a management server under the control of a controller; the controller, which, when the attribute values and matching information of a 3D model are received from the management server, activates each object on the photorealistic image by combining the transmitted matching information with the photorealistic image; and a display unit for outputting the photorealistic image with the objects activated on the terminal screen, wherein, when a user input touching one of the objects on the terminal screen is detected, the controller displays the property information of the corresponding object.

Preferably, the vector photographing information detector includes a position detector for detecting the location of the photographing point of the terminal, an azimuth detector for detecting the azimuth angle toward which the camera is directed, a vertical tilt detector for detecting the vertical tilt angle of the camera, a horizontal tilt detector for detecting the horizontal tilt angle of the camera, and a distance detector for detecting the distance between the terminal and the subject.

In order to achieve the above object, the construction site supervision system according to the present invention, for a construction site where construction proceeds using a three-dimensional virtual space modeling system, includes:

a mobile terminal for taking a photorealistic image of a construction site and detecting vector photographing information of the photographed construction site; and a management server which receives the photographed photorealistic image and the detected vector photographing information from the portable terminal, detects a three-dimensional model that matches the vector photographing information from the database of a 3D virtual space modeling system, detects each individual object in the detected three-dimensional model, and generates matching information by matching each object on the photorealistic image with each individual object of the three-dimensional model,

wherein, when the portable terminal receives the attribute values of the detected 3D model and the matching information from the management server, it combines the transmitted matching information with the live-action image to activate each object on the image, outputs the live-action image with each object activated to the terminal screen, and, when a user input touching one of the objects is detected, displays the property information of the corresponding object.

An object of the present invention is to implement a construction site supervision system and supervision method that allow the user to view the property value of each object on a photographed image with only a screen touch, once the construction site has been photographed using a portable terminal linked with the 3D virtual space modeling system.

Hereinafter, preferred embodiments of the present invention will be described with reference to the drawings.

1 is a block diagram of a portable terminal according to the present invention.

The mobile terminal 100 according to the present invention may be a mobile device such as a mobile phone, a smart phone, a tablet computer, a notebook computer, a digital broadcasting terminal, a personal digital assistant (PDA), or a portable multimedia player (PMP). In the following description, it is assumed that the mobile terminal 100 is a smart phone.

Referring to Figure 1, the mobile terminal according to the present invention will be described.

The mobile terminal 100 according to the present invention may include a wireless communication unit 110, an A/V input unit 120, a user input unit 130, a sensing unit 140, an output unit 150, a memory 160, an interface unit 170, a controller 180, a power supply unit 190, and a vector photographing information detector 300.

The wireless communication unit 110 may include one or more components enabling wireless communication between the mobile terminal 100 and a wireless communication (or mobile communication) system, or between the mobile terminal 100 and a network in which the mobile terminal 100 is located. For example, the wireless communication unit 110 may include at least one of a broadcast reception module 111, a mobile communication module 112, a wireless Internet module 113, a short-range communication module 114, and a location information module 115.

The wireless Internet module 113 (or mobile communication module 112) is a communication module for communicating with the management server 200. Applicable wireless Internet technologies include Wireless LAN (Wi-Fi), Wireless Broadband (WiBro), World Interoperability for Microwave Access (WiMAX), and High Speed Downlink Packet Access (HSDPA).

The A/V (Audio/Video) input unit 120 is for inputting an audio signal or a video signal, and may include a camera 121 and a microphone 122. The camera 121 generates a photorealistic image by photographing a construction site (e.g., a building structure of the site) that requires supervision, according to the user's manipulation. Image frames of still images or moving images obtained by the image sensor are processed in a video call mode or an imaging mode, and the processed image frames can be displayed on the display unit 151.

The image frame processed by the camera 121 may be stored in the memory 160 or transmitted to the outside through the wireless communication unit 110. The camera 121 may include two or more cameras according to the configuration of the terminal.

The camera 121 includes a lens unit for forming an image of a subject, and an imaging device for converting an image formed by the lens unit into an image signal as an electrical signal. The imaging device is a device that photoelectrically converts an image of an imaged object, and a charge coupled device (CCD) imaging device or a complementary MOS (CMOS) imaging device may be used.

The microphone 122 receives an external sound signal by a microphone in a communication mode, a recording mode, a voice recognition mode, and the like and processes it as electrical voice data.

The user input unit 130 generates input data for a user to control the operation of the terminal. The user input unit 130 may include a key pad, a dome switch, a touch pad (resistive/capacitive), a jog wheel, a jog switch, and the like. In particular, when the touch pad forms a mutual layer structure with the display unit 151 described later, it can be called a touch screen.

The sensing unit 140 generates a sensing signal for controlling the operation of the mobile terminal 100 by detecting the current state of the mobile terminal 100, such as the open/closed state of the mobile terminal 100, the presence or absence of user contact, and the acceleration/deceleration of the mobile terminal. For example, when the mobile terminal 100 is in the form of a slide phone, whether the slide phone is opened or closed may be sensed. The sensing unit 140 is also responsible for sensing functions such as whether the power supply unit 190 is powered on and whether the interface unit 170 is connected to an external device.

In addition, the interface unit 170 may serve as a passage for supplying power from an external cradle to the mobile terminal 100 when the mobile terminal 100 is connected to the cradle, or as a passage through which command signals input by the user on the cradle are transmitted to the portable terminal 100. Such command signals or power input from the cradle may operate as signals for recognizing that the mobile terminal is correctly mounted on the cradle.

The output unit 150 is used to generate an output related to visual, auditory or tactile senses, and may include a display unit 151, a sound output module 152, an alarm unit 153, and the like.

The display unit 151 displays and outputs information processed by the mobile terminal 100, as well as information related to a call. For example, it displays the vector photographing information detected by the mobile terminal 100 or the processing data of the management server 200, and displays a UI (User Interface) or GUI (Graphic User Interface) related to a call in the call mode.

The display unit 151 may be a liquid crystal display (LCD), a thin film transistor-liquid crystal display (TFT-LCD), an organic light-emitting diode (OLED) display, a flexible display, or a three-dimensional (3D) display.

In addition, two or more display units 151 may exist according to the implementation form of the mobile terminal 100. For example, the plurality of display units may be spaced apart or integrally disposed on one surface of the mobile terminal 100, or may be disposed on different surfaces.

When the display unit 151 and a sensor for detecting a touch operation (hereinafter, a 'touch sensor') form a mutual layer structure (hereinafter, a 'touch screen'), the display unit 151 can be used as an input device in addition to an output device. The touch sensor may have the form of, for example, a touch film, a touch sheet, or a touch pad.

The sound output module 152 outputs audio data received from the wireless communication unit 110 or stored in the memory 160 in a call signal reception mode, a call mode, a recording mode, a voice recognition mode, or a broadcast reception mode. In addition, the sound output module 152 outputs sound signals related to functions performed by the mobile terminal 100 (for example, a call signal reception sound or a message reception sound). The sound output module 152 may include a receiver, a speaker, a buzzer, and the like.

The alarm unit 153 outputs a signal for notifying the occurrence of an event of the mobile terminal 100. Examples of such events include call signal reception, message reception, key signal input, and touch input. The alarm unit 153 may output a signal notifying the occurrence of an event in a form other than an audio or video signal, for example, by vibration. Since the video signal or the audio signal may also be output through the display unit 151 or the sound output module 152, the units 151 and 152 may be classified as part of the alarm unit 153.

The memory 160 may store a program for the processing and control of the controller 180, and may also temporarily store input/output data (e.g., a phone book, messages, still images, and moving images).

The memory 160 may include at least one storage medium of a flash memory type, a hard disk type, a multimedia card micro type, a card type memory (for example, SD or XD memory), RAM (Random Access Memory), SRAM (Static Random Access Memory), ROM (Read Only Memory), EEPROM (Electrically Erasable Programmable Read-Only Memory), PROM (Programmable Read-Only Memory), a magnetic memory, a magnetic disk, or an optical disk. In addition, the mobile terminal 100 may operate a web storage that performs the storage function of the memory 160 on the Internet.

The controller 180 typically controls the overall operation of the mobile terminal. For example, it performs the related control and processing for detecting vector photographing information, voice or video calls, data communication, and the like. The controller 180 also transmits and receives various information (e.g., live-action images, vector photographing information, identification values, attribute values, etc.) to and from the management server 200 through the wireless communication unit 110.

When the controller 180 receives the entire property values and the matching information from the management server 200, the controller 180 combines the transmitted matching information with the photographed live-action image so that the outline of each object on the live-action image is highlighted (e.g., as a wireframe outline), activates the objects, and outputs the result to the screen of the display unit 151. When a user input touching one of the activated objects on the live-action image is detected, the attribute information of the corresponding object is output on the screen of the display unit 151.
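The touch lookup described here can be sketched by modeling the matching information as per-object bounding boxes in image coordinates and hit-testing the touch point against them. The box representation and object identifiers are assumptions for illustration; the patent does not specify the matching-information format:

```python
# Matching information modeled as per-object bounding boxes (x0, y0, x1, y1)
# in pixel coordinates of the live-action image; ids are invented.
matching_info = {
    "window-2F-03": (100, 80, 220, 180),
    "wall-east":    (0, 0, 640, 480),
}

def object_at(touch, boxes):
    """Return the id of the first object whose box contains the touch point,
    or None if the touch falls outside every box."""
    x, y = touch
    for object_id, (x0, y0, x1, y1) in boxes.items():
        if x0 <= x <= x1 and y0 <= y <= y1:
            return object_id
    return None
```

Smaller objects should be listed before enclosing ones (here the window before the wall) so that the most specific object wins the hit test.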

The power supply unit 190 receives an external power source and an internal power source under the control of the controller 180 to supply power for operation of each component.

The vector photographing information detector 300 detects the vector photographing information of a photographing target (for example, each object (subject) of a building or building site), i.e., the vector photographing information between the camera 121 and the subject. The vector photographing information is information that can specify the photographing target, that is, the subject photographed by the camera 121, and includes time information at the time of shooting, shooting position information for the three-dimensional position of the camera 121, orientation information on the direction toward which the lens axis is directed, vertical tilt information indicating how the lens axis of the camera 121 is inclined with respect to the direction of gravity, and horizontal tilt information indicating how the horizontal axis of the captured image is inclined with respect to the horizon. The vector photographing information may further include distance information (the distance from the camera 121 to the subject), lens angle of view information, shooting frame size and ratio information, user information of the mobile terminal (e.g., the serial number or model number of the mobile terminal, telephone number, etc.), and meta file information of the photographed image. Each piece of information included in the vector photographing information is measured at the specific shooting time. The lens angle of view information and the photographing frame size and ratio information may be detected by the controller 180. The meta file information is information included in the header of a photographed image file; time information, photographing position information, azimuth information, and so on at the photographing time may be detected from the meta file.
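The fields enumerated above suggest a simple record type. The following container is one possible layout; the field names and units are illustrative, not defined by the patent:

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class VectorShootingInfo:
    """One possible container for the vector photographing information
    (field names invented for illustration)."""
    time: str                                     # time at the shooting point
    position: Tuple[float, float, float]          # longitude, latitude, altitude
    azimuth_deg: float                            # lens axis, clockwise from true north
    vertical_tilt_deg: float                      # lens axis vs. gravity direction
    horizontal_tilt_deg: float                    # image horizontal axis vs. horizon
    distance_m: Optional[float] = None            # camera-to-subject distance
    lens_angle_deg: Optional[float] = None        # lens angle of view
    frame_size: Optional[Tuple[int, int]] = None  # shooting frame (width, height)
```

The mandatory fields mirror the information the server needs to select the matching 3D model; the optional fields refine the viewpoint extraction.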

As shown in FIG. 1, the vector photographing information detector 300 according to the present invention includes a position detector 310, an azimuth detector 320, a vertical tilt detector 330, a horizontal tilt detector 340, and a distance detector 350.

The location detector 310 provides location information of the photographing point, that is, three-dimensional location information of the camera 121, which contains the longitude, latitude, and altitude of the location where the camera 121 is located.

The location detector 310 may generate location information using various methods, such as location information through a sensor network or through mobile communication, in addition to location information from a global positioning system (GPS). When GPS signals are used, the location detector 310 includes a GPS chip that receives GPS signals and computes location information from them. Alternatively, the position detector 310 may store only the GPS signal without performing the position calculation and acquire position information later through a PC or other external device. On the other hand, it may be difficult to obtain the location information of the terminal 100 inside a building or wherever GPS signals are not received smoothly; in this case, a tag reader of the terminal 100 can be used. A plurality of wireless tags may be installed at predetermined points in a building or indoors, and location information may be obtained by reading the installed tags. The position detection unit 310 may also track the position using inertial navigation technology.

As shown in FIG. 2, the azimuth detection unit 320 acquires azimuth information indicating the azimuth angle of the lens axis of the camera 121, and preferably includes a geomagnetic sensor (azimuth angle measuring sensor) to obtain the azimuth information.

The azimuth angle α denotes the horizontal angle, measured clockwise from true north, formed by the lens axis of the camera 121 directed toward the subject 7. Therefore, the azimuth angle α may have a range of 0° to 360°. 2 is a diagram illustrating the concept of an azimuth angle.
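The clockwise-from-north convention can be computed from the horizontal direction components a geomagnetic sensor yields. This is a simplified sketch that ignores magnetic declination and tilt compensation, which a real implementation would need:

```python
import math

def azimuth_deg(east, north):
    """Clockwise angle from true north, in [0, 360), of the horizontal
    direction vector (east, north). Note atan2(east, north), not the usual
    atan2(y, x), because azimuth is measured from north, clockwise."""
    return math.degrees(math.atan2(east, north)) % 360.0
```

For example, a lens axis pointing due east gives 90° and due west gives 270°, matching the 0°-360° range stated above.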

The vertical tilt detection unit 330 measures a vertical tilt, and includes a tilt sensor, a gravity sensor, an acceleration sensor, and the like.

As shown in FIG. 3, the vertical tilt refers to the angle β between the lens axis of the camera 121 directed toward the subject 7 and the direction of gravity. For example, when photographing the ground from the air in a direction perpendicular to the ground, the vertical tilt is 0 degrees, and when photographing the sky in a direction perpendicular to the ground, the vertical tilt is 180 degrees. 3 is a diagram illustrating the concept of a vertical tilt.

The horizontal tilt detection unit 340 measures a horizontal tilt, and includes a tilt sensor, a gravity sensor, an acceleration sensor, and the like.

As shown in FIG. 4, the horizontal tilt refers to the angle γ between the horizontal axis of the image photographed by the camera 121 and the horizon. For example, in FIG. 4, the mobile terminal 100 photographing the subject 7 is inclined counterclockwise by γ°. FIG. 4 is a diagram illustrating the concept of the horizontal tilt.

The horizontal tilt and the vertical tilt may be detected in a single process, producing the shooting posture information at once, by utilizing a modern sensor such as a three-axis (3D) gravity sensor.

As shown in FIG. 3, the distance detector 350 provides photographing distance information by measuring the distance l from the camera 121 to the subject. Various measuring instruments, such as an infrared distance measuring sensor, an ultrasonic distance measuring sensor, or a laser range finder, can be used to measure the distance to the subject.
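With an ultrasonic sensor, for instance, the distance l follows from the echo's round-trip time. The helper below is an illustrative sketch only; the speed-of-sound constant assumes dry air at about 20 °C.

```python
SPEED_OF_SOUND_M_S = 343.0  # dry air at ~20 °C (assumed operating condition)

def ultrasonic_distance_m(round_trip_s: float) -> float:
    """Subject distance l from an ultrasonic echo's round-trip time.

    The pulse travels to the subject and back, so the path is halved.
    """
    return SPEED_OF_SOUND_M_S * round_trip_s / 2.0
```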

Hereinafter, the operation method of the supervision system according to the present invention will be described in detail.

Through the terminal 100, the present invention photographs a subject that requires supervision at a construction site (for example, a building structure to be checked for agreement between the drawings and the actual construction), downloads attribute information about the photographed structure from the management server 200, and displays it on the terminal screen. A user at the construction site (for example, a supervisor, construction manager, or inspector) selects a desired object on the terminal screen on which the photographed image is displayed (by a screen touch) and can then view the attribute information of that object (e.g., ID information, manufacturing information, person in charge of construction, specifications, process deadline, etc.).

FIG. 5 is a flowchart illustrating the operation of the supervision system according to the present invention, and FIG. 6 is a diagram illustrating image capturing according to the present invention.

Referring to FIGS. 5 and 6, the user (e.g., a construction supervisor) first drives the camera 121 of the terminal 100, aims it at the subject to be supervised (e.g., a building structure on site), and creates an image of the subject. (S10)

The terminal 100 then detects vector photographing information about the photographed image of the subject (hereinafter referred to as the 'real image' or live image). (S20) As the vector photographing information, the terminal 100 detects the time information at the moment of shooting, shooting position information on the three-dimensional position of the camera 121, azimuth information on the orientation of the lens axis of the camera 121, vertical tilt information indicating how the lens axis of the camera 121 is inclined with respect to the direction of gravity, horizontal tilt information indicating how the horizontal axis of the captured image is inclined with respect to the horizon, and distance information, i.e., the distance from the terminal 100 to the subject. In addition, lens angle-of-view information and shooting frame size and ratio information can be detected as part of the vector photographing information.
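The quantities listed above could be bundled and serialized for the transmission in step S30 roughly as follows. The field names and the JSON encoding are illustrative assumptions — the patent specifies the quantities themselves but no data schema.

```python
import json
from dataclasses import dataclass, asdict

@dataclass
class VectorShootingInfo:
    """Bundle of the vector photographing information (field names assumed)."""
    shot_time: float            # time at the moment of shooting (epoch seconds)
    position: tuple             # 3D shooting position of the camera (x, y, z)
    azimuth_deg: float          # lens-axis azimuth, clockwise from true north
    vertical_tilt_deg: float    # lens axis vs. the gravity direction
    horizontal_tilt_deg: float  # image horizontal axis vs. the horizon
    distance_m: float           # terminal-to-subject distance
    lens_fov_deg: float         # lens angle of view
    frame_size: tuple           # shooting frame size (width, height); ratio follows

    def to_json(self) -> str:
        """Serialize for transmission to the management server (step S30)."""
        return json.dumps(asdict(self))
```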

Thereafter, the terminal 100 transmits the photographed photorealistic image and the detected vector photographing information to the management server 200. (S30)

After receiving the photorealistic image of a building or building site and the corresponding vector photographing information from the terminal 100, the management server 200 refers to the received vector photographing information and detects, in the database of the 3D virtual space modeling system (hereinafter, the BIM (Building Information Modeling) module), the building structures corresponding to the received vector photographing information. (S40)

Referring first to the photographing location information and azimuth information included in the vector photographing information, together with the lens angle-of-view information and the shooting frame size and ratio information, the management server 200 determines which region the building site is in and which floor of which building (three-dimensional model) on that site was photographed. It then detects the three-dimensional coordinates of the terminal camera 121.

The management server 200 also refers to the vertical tilt information, the horizontal tilt information, the distance information, and the three-dimensional coordinates of the terminal camera included in the vector photographing information, and extracts a modeling image identical to the live image received from the terminal 100. Through this process, the present invention extracts a modeling image having the same shooting distance and the same horizontal and vertical angles of view as the real image. The extracted modeling image is a two-dimensional image.
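One plausible core of that extraction step is a pinhole projection of the BIM model's points, posed with the camera's position and orientation, into a 2D image. The sketch below assumes the model points have already been transformed into the camera frame using the vector photographing information; the function and parameter names are hypothetical.

```python
import math

def project_to_image(points_cam, fov_deg, width, height):
    """Pinhole projection of 3D model points into pixel coordinates.

    points_cam: iterable of (x, y, z) in the camera frame, z pointing forward
    along the lens axis (an assumed convention). Returns (u, v) pixel pairs
    for points in front of the camera.
    """
    # Focal length in pixels from the horizontal angle of view.
    f = (width / 2.0) / math.tan(math.radians(fov_deg) / 2.0)
    pixels = []
    for x, y, z in points_cam:
        if z <= 0:  # behind the camera: not visible in the modeling image
            continue
        u = width / 2.0 + f * x / z
        v = height / 2.0 - f * y / z
        pixels.append((u, v))
    return pixels
```

Rendering the matched 3D model this way, at the same shooting distance and angles of view, yields the two-dimensional modeling image the passage describes.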

Then, matching information is generated by matching each object on the live image (e.g., a window frame, a door, a small window frame set into the door, etc.) with the corresponding object on the extracted modeling image. The matching information maps the coordinates of the objects on the modeling image to the coordinates of the corresponding objects on the actual image.
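The patent does not spell out how the correspondence is computed; one plausible sketch is nearest-centroid matching between objects detected on the live image and objects rendered in the modeling image. The names and the pixel threshold below are illustrative assumptions.

```python
def match_objects(live_objects, model_objects, max_px=25.0):
    """Pair each live-image object with the nearest modeling-image object.

    Each argument maps an object id to an (x, y) centroid in pixels.
    Pairs farther apart than max_px pixels are left unmatched
    (threshold chosen for illustration only).
    """
    matches = {}
    for live_id, (lx, ly) in live_objects.items():
        best_id, best_d2 = None, max_px ** 2
        for model_id, (mx, my) in model_objects.items():
            d2 = (lx - mx) ** 2 + (ly - my) ** 2
            if d2 <= best_d2:
                best_id, best_d2 = model_id, d2
        if best_id is not None:
            matches[live_id] = best_id  # live-image object -> BIM object id
    return matches
```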

When the generated matching information is transmitted to the terminal 100, the terminal 100 combines it with the live image so that each object on the live image (e.g., a window frame, a door, a small window frame set into the door, etc.) is activated, for example by highlighting its outline as a wireframe. Once each object on the photorealistic image is activated, the user simply selects (touches) an object on the screen, and the attribute information of that object (e.g., ID information, manufacturing information, person in charge of construction, specifications, process deadline, etc.) is displayed on the terminal screen, as shown in FIG. 8.

In the case of a small construction site (a small construction project), a single matching result may easily be retrieved in the detection process (S40). However, a construction project that employs BIM (Building Information Modeling) is usually a large-scale building project, so when the vector photographing information provided from the terminal 100 is applied, at least two or more similar results are typically found.

When a plurality of similar results are found as described above, the management server 200 first provides the detected search results (e.g., 3D models) to the terminal 100 so that the user can choose one of them. In this case, the management server 200 transmits only the identification values of the results in order to minimize the amount of data transmitted.

When the identification values of the results are received from the management server 200, the terminal 100 displays them on the screen and waits for the user's selection (S60). If the user then selects the identification value of the construction site (for example, S3-A014-106D6F603), the terminal 100 requests the entire set of attribute values of that construction site (S3-A014-106D6F603) from the management server 200. (S70–S80)

When the management server 200 receives the message requesting the entire set of attribute values of the construction site (S3-A014-106D6F603, a three-dimensional model) from the terminal 100, it extracts, as described above, a modeling image corresponding to the actual image from the database of the BIM module with reference to the vector photographing information previously received from the terminal 100. It then generates matching information that matches each object on the live image (e.g., a window frame, a door, a small window frame set into the door, etc.) with the corresponding object on the extracted modeling image.

The management server 200 transmits the entire set of attribute values of the construction site S3-A014-106D6F603 requested by the terminal to the terminal 100 together with the matching information. (S90)

The terminal 100, having received the attribute values and the matching information, combines the matching information with the live image to activate each object on the live image (e.g., a window frame, a door, a small window frame set into the door), as shown in FIG. 7. (S100) FIG. 7 is a diagram illustrating a live-action image combined with matching information and a user's touch input according to the present invention.

When the terminal 100 detects a user input touching one of the activated objects on the live-action image, it displays the attribute information of that object (e.g., of the window frame S3-A014-106D6F603-0711: ID information, manufacturing information, person in charge of construction, specifications, process deadline, etc.) on the terminal screen, as shown in FIG. 8. FIG. 8 illustrates the operation of the present invention that displays the attribute values of an object in response to the user's touch. FIG. 8(b) illustrates the attribute value table of the manufacturing information, displayed when the manufacturing information is selected from among the attribute information (e.g., ID information, manufacturing information, person in charge of construction, specifications, process deadline, etc.) of the window frame S3-A014-106D6F603-0711.

To explain the construction site identification value (S3-A014-106D6F603): S3 corresponds to the name of the region (for example, region 3 of Seoul, e.g., Seocho-dong), A014 corresponds to the building site (for example, apartment construction district 014 in that region), and 106D6F603 means unit 603 on the 6th floor of building 106 of apartment district 014. The identification value S3-A014-106D6F603-0711 refers to a window frame being installed in unit 603, 6th floor, building 106, at the construction site of apartment district 014 in Seocho-dong, Seoul, Korea.
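The identifier structure described above can be parsed mechanically. The regular expression below is an assumption inferred from the single example given in the text, not a format defined by the patent.

```python
import re

def parse_site_id(identifier: str) -> dict:
    """Split an identification value like 'S3-A014-106D6F603' (optionally
    with a trailing object code such as '-0711') into its parts.

    The pattern — region 'S<n>', site 'A<n>', then '<dong>D<floor>F<unit>' —
    is inferred from the one example in the text (an assumption).
    """
    m = re.fullmatch(
        r"(?P<region>S\d+)-(?P<site>A\d+)-"
        r"(?P<dong>\d+)D(?P<floor>\d+)F(?P<unit>\d+)"
        r"(?:-(?P<object>\d+))?",
        identifier,
    )
    if m is None:
        raise ValueError(f"unrecognized identifier: {identifier}")
    # Drop the optional object group when it is absent.
    return {k: v for k, v in m.groupdict().items() if v is not None}
```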

According to the present invention, when a user (for example, a construction supervisor) photographs the construction site with his or her terminal 100 (for example, unit 603, 6th floor, building 106, at the apartment district 014 construction site located in Seocho-dong, Seoul, Korea), the 3D virtual space modeling system (management server 200) and the terminal 100 perform the interworking procedure (S20–S90), and the user can then view the attribute information of any object (e.g., ID information, manufacturing information, person in charge of construction, specifications, process deadline, etc.) simply by selecting (touching) that object on the screen.

As described above, the controller 180 and the 3D virtual space modeling system according to the present invention can be embodied as computer readable codes on a medium in which a program is recorded. The computer-readable medium includes all kinds of recording devices in which data that can be read by a computer system is stored.

Examples of the computer-readable medium include a ROM, a RAM, a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, and the like; it may also be implemented in the form of a carrier wave (for example, transmission over the Internet). The computer may include the controller 180 and the 3D virtual space modeling system.

The examples of the present invention have been described assuming that the mobile terminal 100 is a smartphone. If the portable terminal 100 is a terminal with sufficient storage and CPU resources, such as a tablet computer or a notebook computer, it can embed the BIM system and thus need not be connected to the remote management server 200; the portable terminal 100 (for example, a tablet computer or a laptop computer) can then replace the functions and operation of the management server 200 by itself.

While the present invention has been particularly shown and described with reference to exemplary embodiments thereof, the invention is not limited to the disclosed embodiments; on the contrary, the embodiments may be used selectively or in combination. Accordingly, the true scope of the present invention should be determined by the technical idea of the appended claims.

As described above, in the construction site supervision system according to the present invention, when the user photographs the construction site, the vector photographing information of the photographed image (e.g., location information, azimuth information, vertical tilt information, horizontal tilt information, and distance information) is linked to the 3D virtual space model of the construction site stored in the remote management server. The user can therefore view the attribute values of each object on the photographed image with a simple screen touch at the construction site.

The present invention enables the user to view the attribute values of each object on the screen simply by photographing the construction site and touching the photographed image, so that the construction status of the site can be managed and supervised more easily. A user supervising the construction site can perform the supervision work with just one mobile terminal, without any separate preparation for the site.

In addition, in a project in which a huge building or structure is planned and constructed using a three-dimensional virtual space modeling system, the supervision work ordinarily requires preparing very complex and diverse information in advance. By using the construction site supervision system and a mobile terminal according to the present invention, it is expected that the supervision work can be performed more conveniently and simply.

100: mobile terminal
110: wireless communication unit 120: AV input unit
130: user input unit 140: sensing unit
150: output unit 160: memory
170: interface unit 180: control unit
190: power supply 200: management server
300: vector shooting information detection unit 310: position detection unit
320: orientation detection unit 330: vertical tilt detection unit
340: horizontal tilt detection unit 350: distance detection unit

Claims (11)

  1. A method of supervising a construction site using a three-dimensional virtual space modeling system, the method comprising:
    Photographing a live-action image of a construction site through a camera of the terminal 100;
    Detecting vector photographing information of the photographed construction site;
    Transmitting the photographed live image and the detected vector photographing information to a management server 200;
    Detecting, by the management server (200), a three-dimensional model that matches the vector photographing information in a database of a three-dimensional virtual space modeling system and detecting individual objects in the detected three-dimensional model;
    Generating matching information by matching respective objects on the live image with individual objects of the 3D model;
    Transmitting attribute information and matching information of individual objects of the detected 3D model to the terminal (100);
    Activating, by the terminal (100), the objects on the live image to be touchable by combining the transmitted matching information with the live image;
    and displaying the attribute information and the attribute value table of an object as supervision information of the construction site when, after the live image with each activated object is output on the terminal screen, a user input of the terminal 100 touching that object is detected,
    wherein the terminal 100 comprises:
    A camera unit 121 for shooting a live-action image of a construction site through a built-in camera;
    A vector shooting information detection unit 300 for detecting vector shooting information of the photographed construction site;
    A wireless communication unit 110 for transmitting the photographed photorealistic image and the detected vector photographing information to a management server 200 under the control of the controller 180;
    A control unit 180 for activating each object on the live image by combining the transmitted matching information with the live image when the attribute value and matching information of the 3D model are received from the management server 200;
    It is configured to include a display unit 151 for outputting the live-action image activated by each object on the terminal screen,
    and wherein the control unit 180 displays the attribute information and the attribute value table of the object as construction site supervision information when a user input touching one of the objects on the terminal screen is detected.
  2. The method of claim 1, wherein the detecting of the vector shooting information comprises:
    Detecting, by the position detection unit 310, a location of a photographing point of the terminal 100;
    Detecting, in the azimuth detection unit 320, the azimuth angle directed by the camera 121;
    Detecting, at the vertical tilt detector 330, the vertical tilt angle of the camera 121;
    Construction site supervision method comprising the step of detecting the horizontal tilt angle of the camera 121 in the horizontal tilt detection unit (340).
  3. The method of claim 1, wherein the detecting of each individual object in the three-dimensional model comprises:
    Detecting a 3D model matching the vector photographing information from a database of a 3D virtual space modeling system;
    Detecting three-dimensional coordinates of the terminal camera in the three-dimensional model by referring to the shooting position information, the orientation information, the lens angle information, and the shooting frame size and ratio information included in the vector shooting information;
    Extracting the same modeling image as the live image by referring to the vertical gradient information, the horizontal gradient information, and the 3D coordinates of the terminal camera included in the vector photographing information;
    Construction site supervision method comprising the step of detecting each individual object in the extracted modeling image.
  4. A mobile terminal 100 used for supervising a construction site that proceeds using a three-dimensional virtual space modeling system, the mobile terminal comprising:
    A camera unit 121 for photographing a live-action image of a construction site through a camera of the terminal 100;
    A vector shooting information detection unit 300 for detecting vector shooting information of the photographed construction site;
    A wireless communication unit 110 for transmitting the photographed photorealistic image and the detected vector photographing information to a management server 200 under the control of the controller 180;
    A controller 180 for activating each object on the live image to be touchable by combining the transmitted matching information with the live image, when the attribute value and matching information of the 3D model are received from the management server 200;
    It is configured to include a display unit 151 for outputting the live-action image activated by each object on the terminal screen,
    The controller 180 displays the attribute information and the attribute value table of the object as supervision information of the construction site when a user input of touching the objects on the terminal screen is detected.
    The management server (200)
    When the photographed photorealistic image and the detected vector photographing information are received from the terminal 100, the 3D model and the individual objects on the 3D model that match the vector photographing information are detected in a database of a 3D virtual space modeling system. ,
    The mobile terminal for construction site supervision, characterized in that for generating matching information by matching each object on the live-action image and the individual objects of the three-dimensional model.
  5. delete
  6. The method of claim 4, wherein in the process of detecting each individual object in the 3D model, the management server 200,
    Detecting a 3D model matching the vector photographing information from a database of a 3D virtual space modeling system;
    Detecting three-dimensional coordinates of the terminal camera in the three-dimensional model by referring to the shooting position information, the orientation information, the lens angle information, and the shooting frame size and ratio information included in the vector shooting information;
    Extracting the same modeling image as the live image by referring to the vertical gradient information, the horizontal gradient information, and the 3D coordinates of the terminal camera included in the vector photographing information;
    And a process of detecting each individual object from the extracted modeling image.
  7. The method of claim 4, wherein the vector shooting information detector 300
    A position detector 310 for detecting a position of a photographing point of the terminal 100;
    An azimuth detector 320 for detecting an azimuth angle directed by the camera 121;
    A vertical tilt detector 330 for detecting a vertical tilt angle of the camera 121;
    Mobile terminal for construction site supervision comprising a horizontal tilt detection unit for detecting the horizontal tilt angle of the camera (121).
  8. A construction site supervision system provided to a construction site using a three-dimensional virtual space modeling system, the system comprising:
    A mobile terminal 100 for photographing a live-action image of a construction site and detecting vector photographing information of the photographed construction site;
    When the photographed live image and the detected vector photographing information are received from the mobile terminal 100, a 3D model matching the vector photographing information is detected in a database of a 3D virtual space modeling system, and each of the detected 3D models is detected. And a management server 200 for detecting individual objects of the object and matching each object on the photorealistic image with individual objects of the 3D model to generate matching information.
    The mobile terminal 100
    When the attribute value of the detected 3D model and matching information are received from the management server 200, the transmitted matching information is combined with the live image to activate each object on the live image to be touchable, and each object is activated. After outputting the live-action image on the terminal screen and detecting the input of the user of the terminal 100 touching the objects, the attribute information and attribute value table of the object are displayed as supervision information of the construction site,
    The mobile terminal 100
    A camera unit 121 for shooting a live-action image of a construction site through a built-in camera;
    A vector shooting information detection unit 300 for detecting vector shooting information of the photographed construction site;
    A wireless communication unit 110 for transmitting the photographed photorealistic image and the detected vector photographing information to a management server 200 under the control of the controller 180;
    A control unit 180 for activating each object on the live image by combining the transmitted matching information with the live image when the attribute value and matching information of the 3D model are received from the management server 200;
    It is configured to include a display unit 151 for outputting the live-action image activated by each object on the terminal screen,
    and wherein the control unit 180 displays the attribute information of the object as construction site supervision information when a user input touching the object on the terminal screen is detected.
  9. delete
  10. The method of claim 8, wherein the vector shooting information detector 300
    A position detector 310 for detecting a position of a photographing point of the terminal 100;
    An azimuth detector 320 for detecting an azimuth angle directed by the camera 121;
    A vertical tilt detector 330 for detecting a vertical tilt angle of the camera 121;
    A horizontal tilt detector 340 for detecting a horizontal tilt angle of the camera 121;
    Construction site supervision system, characterized in that it comprises a distance detector for detecting the distance between the terminal (100) and the subject.
  11. The method of claim 8, wherein the management server 200
    In order to detect each individual object in the three-dimensional model, the terminal camera in the three-dimensional model with reference to the shooting position information, orientation information, lens angle information, shooting frame size and ratio information included in the vector shooting information Detect the three-dimensional coordinates of,
    The same modeling image as the real image is extracted by referring to the vertical tilt information, the horizontal tilt information, and the three-dimensional coordinates of the terminal camera included in the vector photographing information.
    Construction site supervision system, characterized in that for detecting each individual object in the extracted modeling image.
KR1020130078864A 2013-07-05 2013-07-05 System and method for supervision of construction site KR101354688B1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020130078864A KR101354688B1 (en) 2013-07-05 2013-07-05 System and method for supervision of construction site

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020130078864A KR101354688B1 (en) 2013-07-05 2013-07-05 System and method for supervision of construction site

Publications (1)

Publication Number Publication Date
KR101354688B1 true KR101354688B1 (en) 2014-01-27

Family

ID=50146264

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020130078864A KR101354688B1 (en) 2013-07-05 2013-07-05 System and method for supervision of construction site

Country Status (1)

Country Link
KR (1) KR101354688B1 (en)

Cited By (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101482380B1 (en) 2013-02-28 2015-01-14 한국남부발전 주식회사 System for construction information creation and method for construction information creation using the same
KR101583723B1 (en) * 2015-01-16 2016-01-08 단국대학교 산학협력단 Interactive synchronizing system of BIM digital model and Real construction site
KR101595243B1 (en) * 2015-05-04 2016-02-19 주식회사 두올테크 Automatic modeling system for facility in field based building information modeling and method thereof
KR20160118899A (en) * 2015-04-03 2016-10-12 주식회사 씨엠엑스건축사사무소 Method, appratus and computer-readable recording medium for supporting field survey regarding building
EP3223221A1 (en) * 2016-03-22 2017-09-27 Hexagon Technology Center GmbH Construction management
WO2017176502A1 (en) * 2016-04-05 2017-10-12 Lynch & Associates - Engineering Consultants, LLC Electronic project management system
KR101798097B1 (en) * 2016-08-16 2017-11-16 김영대 Method for integrated management including building construction and maintenance of based on video
KR101844718B1 (en) 2017-08-31 2018-04-02 김용옥 Mobile terminal for construction supervision
KR101844726B1 (en) * 2017-12-11 2018-04-02 이태영 Drone for construction suprvision and the method of supervision using the same
KR101844533B1 (en) * 2016-04-22 2018-04-03 한국토지주택공사 Construction Monitering System for monitering progress of construction and Method thereof
KR101914386B1 (en) * 2017-08-31 2018-11-01 김용옥 Mobile terminal for construction supervision
KR101933652B1 (en) * 2017-03-23 2018-12-28 신승연 Indoor location based intelligent photo generation method and system
KR20190002834A (en) * 2017-06-30 2019-01-09 강동민 Home styling server and a system comprising the same, and a method for processing image for the home styling
KR101958635B1 (en) * 2018-09-13 2019-04-04 반석기초이앤씨(주) Mobile application system using reinforced coupler and management method of supervision of reinforcing steel construction using it
KR101988356B1 (en) * 2018-03-30 2019-09-30 (주)대우건설 Smart field management system through 3d digitization of construction site and analysis of virtual construction image
KR102033570B1 (en) * 2019-07-16 2019-10-17 한국교통대학교산학협력단 A Camera System For Remote Safety Management of Construction Site
KR102039334B1 (en) 2019-04-25 2019-11-27 김용진 A Remote Safety Management System for Construction Site
CN110663274A (en) * 2017-02-22 2020-01-07 米德查特有限责任公司 Improved building model with as built feature virtual capture and target performance tracking
KR20200015963A (en) 2018-08-06 2020-02-14 현대건설주식회사 Automatic Indoor Costruction progress inspection system and methodology using 360 degree Camera

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH11266487A (en) * 1998-03-18 1999-09-28 Toshiba Corp Intelligent remote supervisory system and recording medium
KR20110000843A (en) * 2009-06-29 2011-01-06 주식회사 아이디폰 System for scene mornitoring using mobile terminal and method therefore
KR101195446B1 (en) * 2012-01-26 2012-12-24 이에스이 주식회사 A portable terminal providing maintenance guide based on augmented reality and the method of providing maintenance guide using the same
KR20120137739A (en) * 2011-06-13 2012-12-24 연세대학교 산학협력단 Location-based construction project management method and system


Cited By (27)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101482380B1 (en) 2013-02-28 2015-01-14 한국남부발전 주식회사 System for construction information creation and method for construction information creation using the same
KR101583723B1 (en) * 2015-01-16 2016-01-08 단국대학교 산학협력단 Interactive synchronizing system of BIM digital model and Real construction site
KR20160118899A (en) * 2015-04-03 2016-10-12 주식회사 씨엠엑스건축사사무소 Method, appratus and computer-readable recording medium for supporting field survey regarding building
KR101671235B1 (en) * 2015-04-03 2016-11-01 주식회사 씨엠엑스건축사사무소 Method, appratus and computer-readable recording medium for supporting field survey regarding building
CN107636239A (en) * 2015-05-04 2018-01-26 Doall科技株式会社 Field type facility automation modeling system and method based on BIM
KR101595243B1 (en) * 2015-05-04 2016-02-19 주식회사 두올테크 Automatic modeling system for facility in field based building information modeling and method thereof
WO2016178453A1 (en) * 2015-05-04 2016-11-10 주식회사 두올테크 Bim-based on-site facility automation modeling system and method
EP3223221A1 (en) * 2016-03-22 2017-09-27 Hexagon Technology Center GmbH Construction management
US10664781B2 (en) 2016-03-22 2020-05-26 Hexagon Technology Center Gmbh Construction management system and method for linking data to a building information model
WO2017176502A1 (en) * 2016-04-05 2017-10-12 Lynch & Associates - Engineering Consultants, LLC Electronic project management system
KR101844533B1 (en) * 2016-04-22 2018-04-03 한국토지주택공사 Construction Monitering System for monitering progress of construction and Method thereof
WO2018034470A1 (en) * 2016-08-16 2018-02-22 대성산업 주식회사 Video-based integrated management method for building construction and maintenance
KR101798097B1 (en) * 2016-08-16 2017-11-16 김영대 Method for integrated management including building construction and maintenance of based on video
CN110663274A (en) * 2017-02-22 2020-01-07 米德查特有限责任公司 Improved building model with as built feature virtual capture and target performance tracking
EP3586327A4 (en) * 2017-02-22 2020-03-11 Middle Chart Llc Improved building model with capture of as built features and experiential data
EP3586553A4 (en) * 2017-02-22 2020-03-11 Middle Chart Llc Improved building model with virtual capture of as built features and objective performance tracking
KR101933652B1 (en) * 2017-03-23 2018-12-28 신승연 Indoor location based intelligent photo generation method and system
KR102019299B1 (en) 2017-06-30 2019-09-06 강동민 Home styling server and a system comprising the same, and a method for processing image for the home styling
KR20190002834A (en) * 2017-06-30 2019-01-09 강동민 Home styling server and a system comprising the same, and a method for processing image for the home styling
KR101844718B1 (en) 2017-08-31 2018-04-02 김용옥 Mobile terminal for construction supervision
KR101914386B1 (en) * 2017-08-31 2018-11-01 김용옥 Mobile terminal for construction supervision
KR101844726B1 (en) * 2017-12-11 2018-04-02 이태영 Drone for construction suprvision and the method of supervision using the same
KR101988356B1 (en) * 2018-03-30 2019-09-30 (주)대우건설 Smart field management system through 3d digitization of construction site and analysis of virtual construction image
KR20200015963A (en) 2018-08-06 2020-02-14 현대건설주식회사 Automatic Indoor Costruction progress inspection system and methodology using 360 degree Camera
KR101958635B1 (en) * 2018-09-13 2019-04-04 반석기초이앤씨(주) Mobile application system using reinforced coupler and management method of supervision of reinforcing steel construction using it
KR102039334B1 (en) 2019-04-25 2019-11-27 김용진 A Remote Safety Management System for Construction Site
KR102033570B1 (en) * 2019-07-16 2019-10-17 한국교통대학교산학협력단 A Camera System For Remote Safety Management of Construction Site

Similar Documents

Publication Publication Date Title
US10445933B2 (en) Systems and methods for presenting building information
US9740962B2 (en) Apparatus and method for spatially referencing images
US10140769B2 (en) Electronic device and method for providing map service
US9805065B2 (en) Computer-vision-assisted location accuracy augmentation
CN105138126B (en) Filming control method and device, the electronic equipment of unmanned plane
US9830337B2 (en) Computer-vision-assisted location check-in
US9646384B2 (en) 3D feature descriptors with camera pose information
US10354452B2 (en) Directional and x-ray view techniques for navigation using a mobile device
Wu et al. Smartphoto: a resource-aware crowdsourcing approach for image sensing with smartphones
US10540804B2 (en) Selecting time-distributed panoramic images for display
US10068373B2 (en) Electronic device for providing map information
JP5871976B2 (en) Mobile imaging device as navigator
CN104913763B (en) Method and hand-held range unit for creating spatial model
US9661214B2 (en) Depth determination using camera focus
US9516281B1 (en) Systems and methods for automated cloud-based analytics for security surveillance systems with mobile input capture devices
CN103471580B (en) For providing method, mobile terminal and the server of navigation information
US9584694B2 (en) Predetermined-area management system, communication method, and computer program product
Zollmann et al. Augmented reality for construction site monitoring and documentation
US9424371B2 (en) Click to accept as built modeling
US9838485B2 (en) System and method for generating three-dimensional geofeeds, orientation-based geofeeds, and geofeeds based on ambient conditions based on content provided by social media content providers
US9514370B1 (en) Systems and methods for automated 3-dimensional (3D) cloud-based analytics for security surveillance in operation areas
US9699375B2 (en) Method and apparatus for determining camera location information and/or camera pose information according to a global coordinate system
KR101433305B1 (en) Mobile device based content mapping for augmented reality environment
US9516280B1 (en) Systems and methods for automated cloud-based analytics and 3-dimensional (3D) display for surveillance systems in retail stores
US9874454B2 (en) Community-based data for mapping systems

Legal Events

Date Code Title Description
A201 Request for examination
A302 Request for accelerated examination
E902 Notification of reason for refusal
E701 Decision to grant or registration of patent right
GRNT Written decision to grant
FPAY Annual fee payment (Payment date: 20170103; Year of fee payment: 4)

FPAY Annual fee payment (Payment date: 20171226; Year of fee payment: 5)

FPAY Annual fee payment (Payment date: 20190107; Year of fee payment: 6)