GB2498177A - Apparatus for determining a floor plan of a building - Google Patents

Apparatus for determining a floor plan of a building

Info

Publication number
GB2498177A
GB2498177A (application GB201122131A)
Authority
GB
United Kingdom
Prior art keywords
spatial position
feature
building
image detector
spatial
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
GB201122131A
Other versions
GB201122131D0 (en)
Inventor
Max Christian
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual filed Critical Individual
Priority to GB201122131A priority Critical patent/GB2498177A/en
Publication of GB201122131D0 publication Critical patent/GB201122131D0/en
Publication of GB2498177A publication Critical patent/GB2498177A/en

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
        • G01C11/00: Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
        • G01C11/04: Interpretation of pictures
        • G01C11/06: Interpretation of pictures by comparison of two or more pictures of the same area
        • G01C11/08: Interpretation of pictures by comparison of two or more pictures of the same area, the pictures not being supported in the same relative position as when they were taken
        • G01C11/30: Interpretation of pictures by triangulation
        • G01C15/00: Surveying instruments or accessories not provided for in groups G01C1/00 - G01C13/00
        • G01C25/00: Manufacturing, calibrating, cleaning, or repairing instruments or devices referred to in the other groups of this subclass
        • G01C25/005: Initial alignment, calibration or starting-up of inertial devices
    • G01P: MEASURING LINEAR OR ANGULAR SPEED, ACCELERATION, DECELERATION, OR SHOCK; INDICATING PRESENCE, ABSENCE, OR DIRECTION, OF MOVEMENT
        • G01P21/00: Testing or calibrating of apparatus or devices covered by the preceding groups

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Multimedia (AREA)
  • Length Measuring Devices With Unspecified Measuring Means (AREA)
  • Navigation (AREA)
  • Manufacturing & Machinery (AREA)

Abstract

 An apparatus 100 is provided for determining a layout of a building. The apparatus comprises: an inertial measurement unit (IMU) 108 adapted to track the spatial position of the apparatus 100 relative to an initial spatial position; an image detector 104 adapted to detect a plurality of images, each image being associated with a feature in a building; and a processor 124 and associated storage 106, the storage being adapted to store a plurality of spatial positions, each spatial position being associated with a respective image, wherein the spatial position is the spatial position of the image detector when detecting an image, and wherein the processor 124 is adapted to determine the relative spatial position of each said imaged feature utilising the plurality of images, and the plurality of associated spatial positions. The IMU 108 may comprise at least one gyroscope 112,114,116 and at least one accelerometer 118,120,122. The features in the building may include corners where walls meet, doors, windows, fixed items of furniture and sanitary ware etc. A further aspect relates to a method of calibration of portable electronic device accelerometers.

Description

Apparatus and Method for Position Determination
Field of the Invention
The present invention relates to an apparatus for determining a layout of a building.
In particular, the present invention relates to an apparatus enabled to track its movement around a building in order to determine the relative spatial position of features within the building. The invention also relates to an associated method of determining a layout of a building. Furthermore, the present invention relates to a method of calibrating portable electronic device accelerometers.
Background of the Invention
In the construction, marketing and conservation of buildings, a common requirement is the production of a floor plan diagram that accurately records the layout and dimensions of an existing building. This is usually performed by measuring the distance between some of the walls using a tape measure, laser distance measure or other device capable of measuring the linear distance between two surfaces. In addition, the operator typically produces a hand-drawn sketch while taking the measurements at the property, and the sketch is later re-drawn using computer software as a scale diagram. Alternatively, drawing software running on a handheld device is occasionally used at the same time as taking the measurements, in order to bypass the need for a hand-drawn sketch.
Alternative systems have been developed that enable a user to automatically generate a plan of a single room by capturing images of the room while determining the orientation of the image capturing device. The resulting images and orientations are then processed to determine the plan of the room.
The present systems may result in the requirement of the very time-consuming process of recording the size of every wall, doorway and window and the locations of these features. Once determined, a scale drawing of the floor plan must be generated in an additional process, which increases the time taken to produce the floor plan.
More importantly, even if the dimensions of all the component parts of the floor plan are measured accurately, it is not usually possible to combine these parts into a scale floor plan diagram without inconsistencies. For example, a room that appears to be rectangular may actually be an irregular shape, making it impossible to tessellate all the rooms into a coherent scale diagram whilst respecting all of the recorded measurements.
Furthermore, using even the most advanced present systems, some measurements can be difficult or impossible to obtain using available measuring devices, such as the thickness of internal and external walls. These thicknesses can vary widely within a single building, again leading to inconsistencies in the floor plan.
The present invention seeks to mitigate or alleviate at least the above problems.
Summary of the Invention
According to one aspect of the present invention, there is provided an apparatus for determining a layout of a building. The apparatus comprises an inertial measurement unit adapted to track the spatial position of the apparatus relative to an initial spatial position.
The apparatus also comprises an image detector adapted to detect a plurality of images, and each image is associated with a feature in a building. A processor and associated storage is also provided. The storage is adapted to store a plurality of spatial positions.
Each spatial position is associated with a respective image, and is the spatial position of the image detector when detecting an image. The processor is adapted to determine the relative spatial position of each imaged feature utilising the plurality of images, and the plurality of associated spatial positions.
Advantageously, by providing such apparatus, a more accurate layout of a building can be produced since the requirement of drawing the layout by hand is eliminated.
Furthermore, by determining the relative spatial positions of the features within the building, advantageously, the apparatus provides details, such as wall thickness and divergence, that are not available by using existing techniques and systems.
As used herein, the term "feature" connotes any feature of a building such as, but not limited to, the corner of a room, an edge of a doorway, an edge of a window, an edge of built-in furniture, sanitary ware or any other item located within a building.
The inertial measurement unit system preferably uses a processor, linear motion sensors (accelerometers) and rotation sensors (gyroscopes) to continuously calculate, via dead reckoning, the position, orientation, and velocity (direction and speed of movement) of the apparatus without the need for external references.
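By way of illustration, the dead-reckoning calculation can be sketched as a double integration of world-frame accelerations. The following is a minimal Euler-integration example, not the exact implementation; a fixed sampling period and two horizontal axes are assumed:

```python
def dead_reckon(accels, dt, v0=(0.0, 0.0), p0=(0.0, 0.0)):
    """Track a 2D position by dead reckoning: integrate acceleration to
    velocity, then velocity to position, one fixed time step at a time."""
    (vx, vy), (px, py) = v0, p0
    path = []
    for ax, ay in accels:
        vx += ax * dt          # acceleration -> velocity
        vy += ay * dt
        px += vx * dt          # velocity -> position
        py += vy * dt
        path.append((px, py))
    return path
```

Because errors in the acceleration readings are integrated twice, the position estimate drifts without bound unless the unit is periodically corrected, as discussed below.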
The inertial measurement unit preferably comprises at least one gyroscope, and at least one accelerometer. The inertial measurement unit may further comprise a processor adapted to calculate the relative spatial position of the apparatus. Preferably, the inertial measurement unit is adapted to utilise the processor of the apparatus to process the inputs from the at least one gyroscope and the at least one accelerometer to track the spatial position of the apparatus.
The inertial measurement unit may be adapted to track the spatial position of the apparatus relative to an initial starting position. Alternatively, the inertial measurement unit may be adapted to track the spatial position of the apparatus relative to an inputted starting position. By enabling the tracking relative to an inputted starting position a user is provided
with the advantage of being able to restart the layout determining process after an interruption.
Preferably, the inertial measurement unit is adjusted periodically utilising the image detector and/or the at least one gyroscope. Preferably, the adjustment comprises resetting the inertial measurement unit with a zero velocity input. By adjusting the inertial measurement unit periodically the apparatus may be tracked more accurately. The inertial measurement unit is preferably adjusted by determining when the apparatus has a substantially zero velocity, the zero velocity being utilised to update the inertial measurement unit. The inertial measurement unit may utilise the zero velocity determination as a constraint when processing the accelerometer outputs for a series of apparatus motions. In this way, a series of accelerometer outputs, bounded at each end by a zero velocity constraint, can be adjusted such that they sum to zero. The zero velocity determination may also be utilised to adjust the gyroscope outputs in a similar way.
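The adjustment of a run of accelerometer outputs bounded by two zero-velocity constraints can be sketched as follows. The simplest correction consistent with the "sum to zero" description is removal of a constant bias over the run; the actual algorithm may differ:

```python
def zupt_adjust(segment):
    """Adjust a run of single-axis accelerometer samples bounded by two
    zero-velocity instants so that they integrate back to zero velocity.

    With a fixed sampling period, zero net velocity change means the
    samples must sum to zero; subtracting the segment mean is the
    simplest correction satisfying that constraint."""
    bias = sum(segment) / len(segment)
    return [a - bias for a in segment]
```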
Preferably, the outputs from the accelerometers and gyroscopes are initially translated from the inertial measurement unit frame of reference to an alternative frame of reference. The alternative frame of reference preferably corresponds to the real world frame of reference; i.e. the degree of freedom up and down is translated to being parallel to gravity, and the remaining degrees of freedom are translated on that basis.
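The translation into a gravity-aligned frame can be illustrated as follows: the gravity direction measured while the apparatus is at rest defines a rotation onto the world "down" axis, which is then applied to subsequent sensor readings. This is a sketch using Rodrigues' rotation formula, not necessarily the method used in practice:

```python
import math

def rotation_to_vertical(g_meas):
    """Rotation matrix taking the gravity direction measured at rest onto
    world 'down' (0, 0, -1), via Rodrigues' formula.  Assumes the device
    is not exactly upside-down (measured gravity not exactly +z)."""
    n = math.sqrt(sum(x * x for x in g_meas))
    a = [x / n for x in g_meas]                      # measured gravity direction
    b = [0.0, 0.0, -1.0]                             # world 'down'
    v = [a[1] * b[2] - a[2] * b[1],                  # v = a x b
         a[2] * b[0] - a[0] * b[2],
         a[0] * b[1] - a[1] * b[0]]
    c = sum(ai * bi for ai, bi in zip(a, b))         # cos of angle between a, b
    K = [[0.0, -v[2], v[1]], [v[2], 0.0, -v[0]], [-v[1], v[0], 0.0]]
    # R = I + K + K^2 / (1 + c)
    return [[(1.0 if i == j else 0.0) + K[i][j]
             + sum(K[i][m] * K[m][j] for m in range(3)) / (1.0 + c)
             for j in range(3)] for i in range(3)]

def to_world(R, reading):
    """Rotate one sensor reading into the gravity-aligned frame."""
    return [sum(R[i][j] * reading[j] for j in range(3)) for i in range(3)]
```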
Preferably, the zero velocity is determined by calculating the optical flow of a series of images detected by said image detector. By utilising the image detector to provide an input to the inertial measurement unit, the spatial position of the apparatus may be tracked more accurately. The optical flow analysis is preferably performed substantially continuously, such that the inertial measuring unit is calibrated each time the apparatus has zero velocity as determined by the optical flow analysis.
Optical flow analysis determines the pattern of the apparent motion of objects, surfaces, and edges in a visual scene caused by the relative motion between the image detector and the scene. The zero velocity optical flow analysis correlates successive images to determine when the apparatus has zero velocity.
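The correlation test can be sketched as follows, treating each frame as a flat list of grayscale intensities. This is an illustrative stand-in for a full optical-flow computation, and the threshold value is an assumption:

```python
import math

def frames_correlate(f1, f2):
    """Normalised cross-correlation of two equal-sized grayscale frames,
    each given as a flat list of pixel intensities."""
    m1 = sum(f1) / len(f1)
    m2 = sum(f2) / len(f2)
    num = sum((a - m1) * (b - m2) for a, b in zip(f1, f2))
    den = math.sqrt(sum((a - m1) ** 2 for a in f1)
                    * sum((b - m2) ** 2 for b in f2))
    return num / den if den else 1.0

def is_stationary(f1, f2, threshold=0.98):
    """Declare zero velocity when successive frames correlate strongly."""
    return frames_correlate(f1, f2) >= threshold
```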
Preferably, the inertial measuring unit comprises three gyroscopes, and three accelerometers.
Preferably, the inertial measuring unit is adapted to track the spatial position of the apparatus in at least four degrees of freedom. More preferably, the at least four degrees of freedom are: translational displacement in a first direction (left/right), translational displacement in a second direction orthogonal to the first direction (forward/backward), angular displacement about a first axis of rotation (pitch), and angular displacement about a second axis of rotation orthogonal to the first axis of rotation (yaw). Yet more preferably, the inertial measuring unit is adapted to track the spatial position of the apparatus in at least five degrees of freedom: the above four degrees of freedom, and angular displacement about a third axis of rotation orthogonal to the first and second axes of rotation (roll).
The processor may be adapted to utilise at least two spatial positions of the image detector to triangulate the spatial position of said feature.
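Using two detector positions, the triangulation can be sketched in the horizontal plane as the intersection of two bearing rays. The positions and bearings below are illustrative inputs, assumed to come from the inertial measurement unit:

```python
import math

def triangulate(p1, theta1, p2, theta2):
    """Intersect two horizontal bearing rays to locate a feature.

    p1, p2 are (x, y) detector positions; theta1, theta2 the bearings
    (radians) towards the feature from each position."""
    d1 = (math.cos(theta1), math.sin(theta1))
    d2 = (math.cos(theta2), math.sin(theta2))
    # Solve p1 + t*d1 = p2 + s*d2 for t using Cramer's rule.
    det = d1[0] * (-d2[1]) - d1[1] * (-d2[0])
    rx, ry = p2[0] - p1[0], p2[1] - p1[1]
    t = (rx * (-d2[1]) - ry * (-d2[0])) / det
    return (p1[0] + t * d1[0], p1[1] + t * d1[1])
```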
The processor may be adapted to determine the spatial position of each imaged feature relative to the initial spatial position. Preferably, the processor is adapted to determine the spatial position of each feature in two dimensions. Preferably, the two dimensions are left/right and forward/backward; i.e. x and y, where z is parallel to gravity.
Alternatively, the processor may be adapted to determine the spatial position of each imaged feature relative to at least one other spatial position of another feature.
The image detector may be a charge-coupled device (CCD) or a complementary metal-oxide semiconductor (CMOS) sensor, or any other image detector suitable for providing a continuous, video, output.
The apparatus may further comprise a display adapted to display the output of the image detector continuously, and wherein a reference marker is provided on the display to enable the user to align the apparatus with an imaged feature when inputting a spatial position of the apparatus when imaging the feature.
The apparatus preferably further comprises an output adapted to output the relative spatial positions of each said imaged feature in the form of a layout, preferably in the form of a floor plan of the building being measured. The output may be at least one of: a display; an electronic file generator; and a printer.
The apparatus may further comprise a global positioning system receiver adapted to determine the absolute position of the apparatus. The absolute position of the apparatus may be utilised to determine the position of the building being measured on a map. In addition, the apparatus may also comprise a compass, preferably a digital compass, adapted to output the absolute orientation of the apparatus. The orientation of the device as measured by the compass may be utilised to determine orientation of the building so that it can be placed accurately on a map, or the like.
In addition, the GPS receiver and/or compass may be adapted to provide an indication of the relative positions and/or orientations of a plurality of sets of building feature spatial positions; preferably, each set of feature spatial positions corresponds to a room in the building. Thus a more accurate representation of the building may be provided.
The apparatus may be a portable electronic device, preferably a cellular telephone, more preferably a smartphone. As used herein, the term smartphone connotes a mobile cellular telephone that includes advanced functionality beyond making phone calls and
sending text messages, in particular the term smartphone connotes a mobile cellular telephone capable of installing and running third party applications. Alternatively, the apparatus may be a tablet computing device, or any other appropriate device that is provided with the required features of the apparatus.
According to another aspect of the present invention, there is provided a method of determining a layout of a building. The method comprises detecting a plurality of images of a plurality of features in a building; tracking, utilising an inertial measurement unit, the spatial position of the image detector as the image detector moves through the building; determining a plurality of spatial positions, wherein each spatial position is associated with a respective image; and determining from the plurality of images and the plurality of associated spatial positions, the relative spatial position of each feature.
Advantageously, by providing such a method, a more accurate representation of a building's layout may be provided.
Preferably, at least two spatial positions of the image detector are utilised to determine by triangulation the spatial position of each feature.
A best fit analysis may be conducted to determine the spatial position of the feature when more than two images are utilised.
Preferably, the method further comprises aligning the feature to be imaged with a reference marker on a display on the apparatus. By aligning the feature with a reference marker, the feature's spatial position relative to the image detector may be determined more accurately.
Preferably, the spatial position of each said feature is determined relative to an initial spatial position.
In order to enable the relative spatial position of the feature to be determined, each spatial position associated with a respective image is preferably the position of the image detector.
The method preferably further comprises calibrating the inertial measurement unit by determining when the image detector has a substantially zero velocity relative to the features having their spatial positions determined. The zero velocity may be determined by calculating the optical flow of a series of images detected by the image detector.
Alternatively, or in addition, the inertial measurement unit may be used to determine zero velocity; for example, the gyroscopes can be used to determine when there is zero velocity.
The optical flow analysis is preferably performed continuously, such that the inertial measuring unit is calibrated each time the image detector has zero velocity as determined by the optical flow analysis.
The zero velocity calibration may be fine-tuned utilising a rolling mean of the output values from the accelerometers and/or gyroscopes. The rolling mean is preferably calculated starting at the beginning of the zero velocity period, and is calculated towards the next period of motion. When the rolling mean deviates by more than a threshold amount from the mean during the zero velocity period, the data making up the rolling mean at that point are then preferably examined one-by-one. The single datum that first deviated by more than the threshold amount from the zero velocity mean is taken as the start of the next motion. This is preferably repeated per axis.
Preferably, the process is repeated backwards in time from the end of zero velocity period towards the end of the previous motion. Thus, the end point of the previous motion can be determined more accurately.
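The forward pass of this refinement can be sketched as follows. The window length and threshold below are illustrative parameters, not values given in the description:

```python
def motion_start(samples, rest_mean, window=5, threshold=0.05):
    """Find the index where motion resumes after a zero-velocity period.

    Slide a mean of `window` samples forward; once it deviates from the
    rest-period mean by more than `threshold`, examine that window's
    samples individually and return the first one that itself exceeds
    the threshold."""
    for i in range(len(samples) - window + 1):
        mean = sum(samples[i:i + window]) / window
        if abs(mean - rest_mean) > threshold:
            for j in range(i, i + window):
                if abs(samples[j] - rest_mean) > threshold:
                    return j
            return i  # fall back to the window start
    return None
```

The backward pass described above is the same procedure run over the reversed sample sequence.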
Preferably, the spatial position of the image detector is tracked in at least four degrees of freedom. Preferably, the four degrees of freedom are translational displacement in a first direction (left/right), translational displacement in a second direction orthogonal to the first direction (forward/backward), angular displacement about a first axis of rotation (pitch), and angular displacement about a second axis of rotation orthogonal to the first axis of rotation (yaw). More preferably, a further, fifth degree of freedom is utilised, the fifth degree of freedom being an angular displacement about a third axis of rotation orthogonal to the first and second axes of rotation.
The method may further comprise plotting each spatial position of each feature to provide the layout of the features of the building. In addition, the method may also comprise outputting the layout to at least one of: a display; an electronic file generator; and a printer. For example, an electronic file may be generated in a standard format, such as PDF, JPEG, PNG, DXF, or the like.
Preferably, the method further comprises determining an initial spatial position of the image detector in relation to at least two features, each with a known spatial position. By determining an initial spatial position in relation to at least two such features, the method extends to enabling the determination of the layout of the building to be continued from an existing series of measurements. The method preferably further comprises receiving an approximate indication of the initial spatial position in relation to the at least two features.
An image is preferably detected of the at least two known features, and preferably triangulation is utilised to determine the position of the image detector. More preferably, the image detector remains in a substantially stationary spatial position to image each feature.
By determining the spatial position of the image detector in relation to known features, the process of determining the layout can be continued from a known spatial position, and thus the relative spatial position of new features to existing features can be determined.
According to a further aspect of the present invention, there is provided a method of calibration for portable electronic device accelerometers. The method comprises placing the portable electronic device on a substantially flat surface; taking a first set of readings from two substantially orthogonal accelerometers; rotating the portable electronic device by approximately 90 degrees; taking a second set of readings from the two substantially orthogonal accelerometers; determining the actual rotation angle of the portable electronic device; and determining a calibration constant for each of the accelerometers utilising the first and second readings and the rotation angle.
Preferably, the portable electronic device is a cellular telephone.
Preferably, the two substantially orthogonal accelerometers are oriented to measure accelerations in the plane of said surface.
The portable electronic device preferably comprises at least one gyroscope, and the at least one gyroscope reading is utilised to determine the actual rotation angle.
Where the portable electronic device is a cellular telephone, preferably the cellular telephone is a smartphone or any other such device as described herein.
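One plausible reading of this calibration can be sketched as follows. The model is an assumption not spelled out above: the surface is very slightly tilted, so a small, fixed gravity component lies in its plane; each in-plane accelerometer reports that component scaled by an unknown gain; and the rotation between the two readings is measured, for example by the gyroscope. The ratio of the two gains then follows from the two readings and the rotation angle:

```python
import math

def gain_ratio(first, second, theta):
    """Estimate the ratio of the two in-plane accelerometer gains.

    `first` and `second` are (ax, ay) readings before and after rotating
    the device by `theta` radians about the vertical.  Assumed model:
    ax = sx*gx, ay = sy*gy, where (gx, gy) is the fixed in-plane gravity
    component and sx, sy the unknown gains; after the rotation the
    gravity component seen by the device is rotated by -theta."""
    ax1, ay1 = first
    ax2, ay2 = second
    c, s = math.cos(theta), math.sin(theta)
    # ax2 = sx*(gx*c + gy*s) = ax1*c + (sx/sy)*ay1*s
    # ay2 = sy*(gy*c - gx*s) = ay1*c - (sy/sx)*ax1*s
    r_from_x = (ax2 - ax1 * c) / (ay1 * s)
    r_from_y = (ax1 * s) / (ay1 * c - ay2)
    return 0.5 * (r_from_x + r_from_y)   # average the two estimates
```

Note that only the relative gain of the two axes is recoverable from this pair of readings; an absolute scale would require an additional known reference.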
The invention extends to methods and apparatus substantially as herein described with reference to the accompanying drawings.
As used herein, means plus function features may be expressed alternatively in terms of their corresponding structure.
Any feature in one aspect of the invention may be applied to other aspects of the invention, in any appropriate combination. In particular, method aspects may be applied to apparatus aspects, and vice versa. Furthermore, any, some and/or all features in one aspect can be applied to any, some and/or all features in any other aspect, in any appropriate combination.
It should also be appreciated that particular combinations of the various features described and defined in any aspects of the invention can be implemented and/or supplied and/or used independently.
Description of preferred embodiments
These and other aspects of the present invention will become apparent from the following exemplary embodiments that are described with reference to the following figures, in which:
Figure 1 shows a schematic representation of an apparatus according to the present invention;
Figure 2 shows a schematic representation of the process of determining a building layout;
Figure 3 illustrates the apparatus according to the present invention in use; and
Figure 4 shows a flow diagram of a method of calibrating accelerometers.
As shown in Figure 1, the apparatus 100 comprises a processor 102 in communication with an image detector 104, memory 106, an inertial measurement unit 108, an output 110, and an input 111. The image detector 104 is a digital camera capable of outputting continuous images in the form of a video stream. For example, the digital camera may be a charge-coupled device (CCD) or a CMOS type device. The inertial measurement unit 108 comprises three gyroscopes 112, 114 and 116, and three accelerometers 118, 120 and 122. The inertial measurement unit may comprise an additional processor 124 adapted to process the outputs of the gyroscopes and accelerometers to track the spatial position of the apparatus. However, in one embodiment, the processor 102 is adapted to perform the functions of the processor 124, and as such there is no requirement for processor 124.
The inertial measurement unit is adapted to measure readings from the gyroscopes and accelerometers at a frequency of approximately 100 Hz. However, each sensor is polled in series and therefore the measurements are not time-aligned with one another. Therefore, the readings taken from each sensor are interpolated to provide a set of readings equivalent to polling all the sensors every 10 ms. Those interpolated readings are then utilised by the processor to track the position of the apparatus.
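The interpolation onto a common time base can be sketched as follows; this is a pure-Python linear interpolation, with the 10 ms grid assumed from the nominal 100 Hz rate:

```python
def resample(times, values, period=0.01):
    """Linearly interpolate one sensor's staggered readings onto a
    uniform grid, so that all sensors can share common sample instants.

    `times` are the actual poll instants (seconds, ascending); returns
    (grid_times, grid_values) covering the span at `period` spacing."""
    grid, out = [], []
    t, i = times[0], 0
    while t <= times[-1]:
        while times[i + 1] < t:
            i += 1
        # linear interpolation between the bracketing samples
        f = (t - times[i]) / (times[i + 1] - times[i])
        grid.append(t)
        out.append(values[i] + f * (values[i + 1] - values[i]))
        t = round(t + period, 9)
    return grid, out
```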
In use, the user carries the apparatus 100 through the building to have its layout determined. The user directs the video camera towards features within the building to be included within the layout, such as the corners where walls meet, doors and windows, fixed items of furniture, sanitary ware etc. The apparatus 100 displays the video taken by the camera on a display on the apparatus with an overlaid vertical line in the centre of the image. The user aligns the overlaid line with the feature visible in the captured image and, while holding the device substantially stationary, activates the input 111, such as a button. The memory 106 is adapted to store the tracking information output by the inertial measurement unit, together with a marker associated with each feature; the marker indicates that a particular spatial location is associated with a feature. This process is repeated at least once for each feature, with the user orienting the device towards the feature while standing in a different location at each repetition.
Utilising the method as described herein, the apparatus estimates the spatial position of each feature in the two horizontal dimensions. For each pair of captured corners for example, the device calculates the length, position and orientation of one surface of a wall (and similarly for doors and windows). By continuing this process throughout the building to be measured, an accurate floor plan diagram is produced. The layout of the building is calculated off-line once all of the features have been imaged at least twice. In an alternative embodiment, the relative spatial position of each feature is calculated as the user carries the apparatus through the building. The process of determining the layout of the building is described in further detail with reference to Figure 2.
Figure 2 shows a schematic representation of the process of determining a building layout. As can be seen, the path 200, represented by the solid line, shows the course that the apparatus 100 has taken through the building 202. As described above, the apparatus 100 is portable, and is carried through the building by the user. In this example, at position P1 the user has initiated the building layout generation process. The user then moved to position P2, and the course taken is tracked by the apparatus's inertial measurement unit 108. At point P2, the user utilises the image detector 104 to image the features F1, F2, F3, F4, F5, F6, F7 and F8. As described above, the user aligns a reference line overlaid on the apparatus display with the feature, and indicates the alignment via the input button 111. A marker is stored in memory when the user indicates that a feature is aligned with the overlaid reference line.
The user then moves to position P3 and repeats the above process to image the features F1, F2, F3, F4, F5, F6, F7 and F8 once more. It is not necessary to image the features in the same order in each position. By imaging the features from at least two different positions, the apparatus is provided with a series of measurements that enable the relative spatial position of each feature to be determined utilising triangulation. This process is described in further detail below.
The user then moves to positions P4, P5 and P6, located in an adjoining room in this example, and images the features F9, F10, F11, F12, F13 and F14. Again, each feature is imaged from at least two different spatial positions.
Each time the user activates the button 111, the estimated current spatial position of the apparatus is combined with the estimated orientation of the device to form a half-plane in three-dimensional space (a half-plane being the planar region consisting of all points on one side of an infinite straight line, and no points on the other side). The infinite line defining the edge of the half-plane passes vertically through the estimated spatial position of the apparatus. The half-plane is oriented in the direction of the feature with respect to the device at the point of activation of the button 111, so that the closed half-plane contains the locations of both the device and the feature. At this point, the distance from the device to the feature is unknown.
The repetition from different locations of the alignment of the device with a given feature results in a set of half-planes. The best-fit intersection of these half-planes results in a single infinite vertical line representing an estimate of the spatial position of the feature.
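Projected onto the horizontal plane, each half-plane reduces to a bearing line, and the best-fit intersection can be sketched as a small least-squares problem. This is an illustration with assumed inputs; the formulation above works with half-planes in three dimensions:

```python
import math

def feature_position(observations):
    """Least-squares intersection of bearing lines in the plane.

    Each observation is ((px, py), theta): a device position and the
    horizontal bearing (radians) towards the feature.  Each bearing
    defines a line with unit normal n = (-sin(theta), cos(theta)); we
    minimise the summed squared perpendicular distance to all lines by
    solving the 2x2 normal equations."""
    a11 = a12 = a22 = b1 = b2 = 0.0
    for (px, py), theta in observations:
        nx, ny = -math.sin(theta), math.cos(theta)
        d = nx * px + ny * py            # signed offset of the line
        a11 += nx * nx; a12 += nx * ny; a22 += ny * ny
        b1 += nx * d;   b2 += ny * d
    det = a11 * a22 - a12 * a12
    return ((a22 * b1 - a12 * b2) / det,
            (a11 * b2 - a12 * b1) / det)
```

With only two observations this reduces to the exact two-ray triangulation; with more, disagreements between the bearings are averaged out.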
Since a floor plan is a representation of the structure of a floor of a building as it intersects with a horizontal plane, this vertical line pinpoints the location of the feature on the floor plan.
As can be seen from Figure 2, features such as diverging walls can be determined using the apparatus because the relative spatial positions of all features, in all rooms being measured, can be determined.
For the above method to estimate the location of features sufficiently accurately to result in an acceptable floor plan, the inertial measurement unit must provide a sufficiently accurate estimate of the motion of the device as the user carries it through the building.
However, the inertial measurement unit can only deduce the relative spatial position of the device through a double integration of the accelerations measured by the accelerometer in conjunction with the gyroscope. This results in a phenomenon known as drift, whereby the position estimate's error grows over time in an unbounded manner. In the case of lightweight accelerometers and gyroscopes of the kind suitable for use in a consumer handheld electronic device, errors intrinsic to the sensors accumulate to cause at least 10 cm of position error per minute.
In order to improve the accuracy of the inertial measurement unit, the image detector 104 is also utilised to provide zero velocity updates. A zero velocity update is an indication to an inertial measurement unit that the device (in this case, the apparatus 100) is stationary at a given moment. The present system utilises a calculation of the optical flow of the image detector to determine such zero velocity moments. The optical flow technique correlates successive images in a series of images (i.e. within a video stream), and when the correlation is above a threshold value, the image detector, and hence the device, is determined to be stationary.
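A minimal whole-frame version of this correlation test might look as follows (illustrative; the function name and the 0.98 threshold are assumptions, and a production implementation would compute true optical flow rather than a global frame correlation):

```python
import numpy as np

def is_stationary(prev_frame, frame, threshold=0.98):
    """Declare a zero-velocity moment when two successive frames are
    nearly identical, measured by normalised cross-correlation."""
    a = prev_frame.astype(float).ravel()
    b = frame.astype(float).ravel()
    a -= a.mean()
    b -= b.mean()
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    if denom == 0:              # uniform frames: no detectable motion
        return True
    return float(a @ b) / denom > threshold
```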
By obtaining frequent such zero velocity updates, either when the user is aligning the apparatus with a feature or at any other time the device is stationary, the accuracy of the output from the inertial measurement unit becomes sufficient to produce an accurate floor plan.
To provide a yet more accurate determination of the spatial position of the apparatus at any given moment, the zero velocity updates are refined utilising the following process. The zero velocity period is estimated utilising the above optical flow method. The outputs of each gyroscope are then analysed to refine the data that is utilised to update the inertial measurement unit. A rolling mean is calculated, starting at the beginning of the zero velocity period and moving forwards in time towards the next period of motion. When the rolling mean deviates by more than a threshold amount from the mean during the zero velocity period, the samples making up the rolling mean at that point are examined one by one. The single sample that first moved more than the threshold amount from the mean during the zero velocity period is taken as the start of the next motion. This is repeated per axis.
The whole process is then repeated backwards in time, from the end of the zero velocity period following the motion towards the end of the previous motion, to identify the sample to take as the end of the previous motion.
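The forward scan can be sketched as follows (illustrative code; the window size, the threshold rule of three standard deviations, and the function name are assumptions — the patent specifies only a threshold on deviation from the zero-velocity mean):

```python
import numpy as np

def find_motion_start(gyro, zupt_len, window=10, k=3.0):
    """Scan forward from inside a zero-velocity period of length `zupt_len`
    samples to find the first sample that departs from the stationary mean.
    Returns the index taken as the start of the next motion, or None."""
    baseline = gyro[:zupt_len].mean()
    tol = k * gyro[:zupt_len].std() + 1e-12   # deviation threshold
    for i in range(zupt_len, len(gyro) - window + 1):
        # rolling mean over the current window
        if abs(gyro[i:i + window].mean() - baseline) > tol:
            # window deviates: examine its samples one by one
            for j in range(i, i + window):
                if abs(gyro[j] - baseline) > tol:
                    return j      # first offending sample
    return None
```

The same scan, run backwards from the end of the following zero-velocity period, yields the end of the previous motion, and the procedure is applied per axis.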
In addition, a Kalman smoother is utilised on the output data from the gyroscopes and accelerometers of the inertial measurement unit. The Kalman smoother operates on the stored data, and as such is able to operate in a similar way to a Kalman filter, but both forwards and backwards in time. The Kalman filter is an efficient recursive filter that estimates the internal state of a linear dynamic system from a series of noisy measurements. In this case, the series of noisy measurements are the outputs from the accelerometers and gyroscopes.
The calculations are not done in real-time so that when each zero velocity event occurs, the algorithm is able to operate on a recording of all the accelerations and rotations since the last zero velocity event. Without correction, the accelerations measured by the unit, when transformed into real-world coordinates, would not sum to zero between zero velocity events due to measurement errors. The Kalman smoother is used to estimate the true acceleration between the two zero velocity events, taking into account that the actual velocity of the apparatus is known to be zero not only at the start but also at the end of a period of motion. In this way the apparatus can be tracked more accurately.
In detail, the Rauch-Tung-Striebel (RTS) variation of a fixed-interval Kalman smoother is utilised. The RTS smoother is configured with a state vector comprising the velocity and acceleration of the device in the two horizontal dimensions (left/right and forward/backward). The measurement vector has the same components. Alternatively, the velocity and acceleration in the third (up/down) direction may be included. The RTS smoother is a two-pass algorithm that utilises the standard Kalman filter algorithm on the forward pass, and the following algorithm on the backward pass:
x̂(k|n) = C_k · x̂(k+1|n) + K_k · x̂(k+1|k)

where

C_k = F_k⁻¹ (I − Q_k · P(k+1|k)⁻¹)  and  K_k = F_k⁻¹ · Q_k · P(k+1|k)⁻¹

Different measurement noise vectors (Q) are used when the device is stationary (as determined by the optical flow and fine-tuning algorithms, as described above) and when it is known to be moving. In motion, the noise on the velocity measurement is essentially infinite. When stationary, a relatively low noise value for the velocity reflects the relative certainty of the zero velocity values in the measurement vector. The accelerometer error components also vary between the stationary and moving states according to the relative accuracy of the accelerometer hardware in use and the optical flow's estimation of zero velocity (and hence zero acceleration).
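A compact per-axis sketch of the forward filter with stationary/moving noise switching, followed by the RTS backward pass, might look as follows (all tuning constants and the function name are illustrative assumptions; the backward gain uses the textbook form C_k = P(k|k)·Fᵀ·P(k+1|k)⁻¹, which is algebraically equivalent to the F⁻¹-based form in the patent):

```python
import numpy as np

def rts_smooth_axis(accel, zupt, dt=0.01, q=1e-3, r_acc=1e-2,
                    r_vel_zupt=1e-6, r_vel_move=1e9):
    """Fixed-interval RTS smoothing of one horizontal axis.

    State x = [velocity, acceleration]; the measurement vector pairs the
    accelerometer reading with a zero-velocity pseudo-measurement whose
    noise is tiny when zupt[k] is True and effectively infinite otherwise.
    """
    n = len(accel)
    F = np.array([[1.0, dt], [0.0, 1.0]])   # v' = v + a*dt, a' = a
    Q = q * np.eye(2)
    x, P = np.zeros(2), np.eye(2)
    x_f, P_f, x_p, P_p = [], [], [], []
    for k in range(n):
        x = F @ x                           # predict
        P = F @ P @ F.T + Q
        x_p.append(x); P_p.append(P)
        r_vel = r_vel_zupt if zupt[k] else r_vel_move
        R = np.diag([r_vel, r_acc])
        z = np.array([0.0, accel[k]])       # [zero velocity, accelerometer]
        K = P @ np.linalg.inv(P + R)        # H = I simplifies the gain
        x = x + K @ (z - x)                 # update
        P = (np.eye(2) - K) @ P
        x_f.append(x); P_f.append(P)
    out = [None] * n                        # backward (RTS) pass
    out[-1] = x_f[-1]
    for k in range(n - 2, -1, -1):
        C = P_f[k] @ F.T @ np.linalg.inv(P_p[k + 1])
        out[k] = x_f[k] + C @ (out[k + 1] - x_p[k + 1])
    return np.array(out)
```

With zero-velocity pseudo-measurements at both ends of a motion, the smoothed velocity trace is pulled to zero at both boundaries, which is the behaviour the patent relies on.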
During a zero velocity period, accelerometer readings are gathered and a "trimmed mean" is applied. The trimmed mean operates by sorting the data and calculating the mean of the middle third of the data. These stationary readings (one for each of the three accelerometers) are deducted from the accelerometer readings before being fed into the main algorithm, to adjust for drift. This process may also be conducted on each of the three gyroscopes.
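The middle-third trimmed mean is straightforward to sketch (illustrative code; the function name is an assumption):

```python
import numpy as np

def middle_third_mean(samples):
    """Trimmed mean as described: sort the samples and average the middle
    third, discarding the lowest third and the highest third."""
    s = np.sort(np.asarray(samples, dtype=float))
    n = len(s)
    lo, hi = n // 3, n - n // 3
    return float(s[lo:hi].mean())
```

Discarding the extremes makes the stationary estimate robust to occasional spikes in the sensor output.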
In addition, the readings from the gyroscopes are transformed from the apparatus's present orientation into a reference orientation. The reference orientation is the real-world frame of reference, and corresponds to the up/down axis (the z-axis) being parallel to gravity. During the transformation process, any misalignment of the gyroscopes from the three orthogonal axes is corrected utilising standard methods. A similar process is used to correct any misalignment of the accelerometers from the three orthogonal axes.
Furthermore, compensation is applied for coning and sculling effects. Compensation for coning effects is necessary to take into account any rotation of the apparatus during the measurement interval. Compensation for sculling effects is necessary to take into account the fact that the accelerometers are not collocated, and therefore a lever arm effect will be generated. In addition, compensation is applied to the gyroscope outputs to take into account the effect of acceleration (that is, linear acceleration) on the output of the gyroscope. A similar compensation is applied to the outputs of the accelerometers to take into account the effect of rotation on the accelerometer outputs. Finally, compensation is applied to correct for scale factor errors in both the accelerometers and the gyroscopes.
The above compensation techniques are well-known.
Figure 3 illustrates the apparatus 100 in use. As described above in relation to Figure 2, the image detector 104 is utilised to image a series of features. As shown in Figure 3, the user 300 orients the apparatus in the direction of a feature 302, in this example a corner between wall A and wall B, and then aligns the reference marker 304 with the feature 302 and presses the button 306. On pressing the button, the spatial position of the apparatus is associated with the feature to enable further, offline, processing to determine the relative spatial positions of all such features, as described above.
Figure 4 shows a flow diagram of the method of calibrating accelerometers in a cellular telephone, such as a smartphone. The method comprises initiating the calibration process at step 400, and placing the smartphone on a flat, but not necessarily level, surface oriented such that at least two of the accelerometers in the smartphone measure acceleration in the plane of the flat surface, step 402. The two accelerometers are orthogonal to each other. A set of readings is then taken from the two accelerometers that are oriented to measure acceleration in the plane of the flat surface, step 404. The smartphone is then rotated by approximately 90 degrees about an axis normal to the flat surface, step 406. A second set of readings is then taken from the two accelerometers, step 408. The actual angle through which the smartphone is rotated is then calculated or measured. Where the smartphone comprises a gyroscope, this may be accomplished by taking readings from the gyroscope as the smartphone is rotated, from which the actual angle can be calculated.
The following formula is utilised to determine the accelerometer biases in step 412:

b_x = ( a_x1 + a_x2 − (a_x1 + a_x2)·cos r + (a_y2 − a_y1)·sin r ) / ( sin²r + cos²r − 2·cos r + 1 )

b_y = ( a_y1 + a_y2 − (a_y1 + a_y2)·cos r + (a_x1 − a_x2)·sin r ) / ( sin²r + cos²r − 2·cos r + 1 )

where:

b_x, b_y are the biases of the two accelerometers;
a_x1, a_y1 are the readings from the two accelerometers prior to the rotation;
a_x2, a_y2 are the readings from the two accelerometers following the rotation;
r is the rotation around the axis perpendicular to the surface.

The determined biases are then stored in memory, step 414. The stored biases are utilised to ensure that more accurate accelerometer readings can be taken. In practice, the biases are subtracted from the actual accelerometer readings before further processing is conducted on the accelerometer readings. For example, the accelerometers may be utilised in an inertial measurement unit. In this example, the biases are subtracted from the accelerometer readings, and then those adjusted readings are fed into the inertial measurement unit.
As will be appreciated, by taking two sets of readings from the accelerometers in two positions separated by approximately 90 degrees of rotation, the readings from the two orthogonal accelerometers can be isolated such that the bias of each accelerometer can be determined. If the smartphone, or the like, comprises a third accelerometer, the process can be repeated with the smartphone in a second orientation.
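The two-reading bias recovery can be sketched in vector form (illustrative code derived from the geometry of the method, not copied from the patent; the function name and the sign convention for the rotation are assumptions — with the opposite convention the sine terms change sign):

```python
import numpy as np

def accelerometer_biases(a1, a2, r):
    """Recover in-plane accelerometer biases b from readings taken before
    (a1) and after (a2) rotating the device by r radians about the surface
    normal.  Model: a1 = g + b and a2 = R(-r) @ g + b, where g is the
    in-plane gravity component seen by the sensors."""
    c, s = np.cos(r), np.sin(r)
    R_neg = np.array([[c, s], [-s, c]])      # gravity in the rotated frame
    # a1 - a2 = (I - R(-r)) @ g : solve for g, then b = a1 - g
    g = np.linalg.solve(np.eye(2) - R_neg,
                        np.asarray(a1, float) - np.asarray(a2, float))
    return np.asarray(a1, float) - g
```

The 2x2 system is singular at r = 0 and best conditioned near r = 90°, which is consistent with the method's quarter-turn rotation.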
When utilised in an inertial measurement unit, such as that described herein, the calibration process is conducted before starting the building layout generation process.
Alternatively, the calibration process is conducted only once for each apparatus.
An additional method is provided to enable a user to restart a measurement process after pausing or stopping. This allows the user to add features to a previously created layout of the building. The existing layout might be the result of a previous application of the above apparatus, or it may be a layout generated from an entirely separate source such as an architect's drawing.
In this case, the user begins by invoking a resynchronisation mode. The incomplete floor plan is shown on the display on the apparatus. The user inputs a location within that plan that approximately coincides with the user's initial position in the building. The location can be input via a touch screen or other method.
The user views the video image from the image detector 104 and identifies at least two features from the existing floor plan that are simultaneously visible in the image. The user inputs the location of each of these features on the existing floor plan in turn. For each feature, the user is also required to rotate the device to align with the feature as shown in Figure 3 and previously described. The user rotates but does not translate the device during this process.
This method results in two or more vertical planes defined by the locations on the existing floor plan of the selected features (already known) and the orientation of the device on alignment with those features. The best-fit intersection between these planes results in a vertical line that provides an estimate of the apparatus's spatial position in relation to the existing layout. The method then proceeds as described above, with the inertial measurement unit beginning with the initial location estimated via this resynchronisation method.

Claims (1)

  1. <claim-text>CLAIMS 1. Apparatus for determining a layout of a building, comprising: an inertial measurement unit adapted to track the spatial position of the apparatus relative to an initial spatial position; an image detector adapted to detect a plurality of images, each image being associated with a feature in a building; and a processor and associated storage, the storage being adapted to store a plurality of spatial positions, each spatial position being associated with a respective image, wherein the spatial position is the spatial position of the image detector when detecting an image, wherein the processor is adapted to determine the relative spatial position of each said imaged feature utilising the plurality of images and the plurality of associated spatial positions.</claim-text> <claim-text>2. Apparatus according to Claim 1, wherein the inertial measurement unit comprises at least one gyroscope, and at least one accelerometer.</claim-text> <claim-text>3. Apparatus according to Claim 2, wherein the inertial measurement unit is adapted to utilise said processor to process the inputs from the at least one gyroscope and the at least one accelerometer to track the spatial position of the apparatus. 4. Apparatus according to Claim 2 or 3, wherein the inertial measurement unit is adjusted periodically utilising the image detector and/or the at least one gyroscope. 5. Apparatus according to Claim 4, wherein the inertial measurement unit is adjusted by determining when the apparatus has a substantially zero velocity, said zero velocity being utilised to update the inertial measurement unit. 6. Apparatus according to Claim 5, wherein zero velocity is determined by calculating the optical flow of a series of images detected by said image detector. 7. 
Apparatus according to Claim 6, wherein the optical flow analysis is performed continuously, such that the inertial measuring unit is calibrated each time the apparatus has zero velocity as determined by the optical flow analysis. 8. Apparatus according to any of Claims 2 to 7, wherein the inertial measuring unit comprises three gyroscopes, and three accelerometers. 9. Apparatus according to any of the preceding claims, wherein the processor is adapted to utilise at least two spatial positions of the image detector to triangulate the spatial position of said feature. 10. Apparatus according to any of the preceding claims, wherein the processor is adapted to determine the spatial position of each said imaged feature relative to said initial spatial position. 11. Apparatus according to any of the preceding claims, wherein the inertial measuring unit is adapted to track the spatial position of the apparatus in at least five degrees of freedom. 12. Apparatus according to Claim 11, wherein the at least five degrees of freedom are: translational displacement in a first direction, translational displacement in a second direction orthogonal to the first direction, angular displacement about a first axis of rotation, angular displacement about a second axis of rotation orthogonal to the first axis of rotation, and angular displacement about a third axis of rotation orthogonal to the first and second axes of rotation. 13. Apparatus according to any of the preceding claims, wherein the image detector is a charge-coupled device (CCD) or a complementary metal-oxide semiconductor (CMOS) sensor. 14. Apparatus according to any of the preceding claims, further comprising an output adapted to output the relative spatial positions of each said imaged feature in the form of a layout, preferably in the form of a floor plan. 15. Apparatus according to Claim 14, wherein the output is at least one of: a display; an electronic file generator; and a printer. 16. 
Apparatus according to any of the preceding claims, further comprising a global positioning system receiver adapted to determine the absolute position of the apparatus. 17. Apparatus according to any of the preceding claims, further comprising a compass, preferably a digital compass, adapted to output the absolute orientation of the apparatus. 18. Apparatus according to Claim 16 or 17, wherein said receiver and/or compass is adapted to provide an indication of the relative positions and/or orientations of a plurality of sets of feature spatial positions, preferably wherein each set of feature spatial positions corresponds to a room in the building. 19. Apparatus according to any of the preceding claims, wherein the apparatus is a portable electronic device, preferably a cellular telephone, more preferably a smartphone. 20. A method of determining a layout of a building, comprising: detecting a plurality of images of a plurality of features in a building; tracking, utilising an inertial measurement unit, the spatial position of the image detector as the image detector moves through the building; determining a plurality of spatial positions, wherein each spatial position is the position of the image detector when imaging the feature; and determining, from the plurality of spatial positions, the relative spatial position of each feature. 21. A method according to Claim 20, wherein at least two images are detected of each said feature, each image being detected in a different spatial position. 22. A method according to Claim 21, wherein the spatial position of said feature is determined by triangulation utilising the spatial position of the image detector when detecting each corresponding image. 23. A method according to Claim 21 or 22, wherein a best fit analysis is conducted to determine the spatial position of said feature when more than two spatial positions are utilised. 24. 
A method according to any of Claims 20 to 23, wherein the spatial position of each said feature is determined relative to an initial spatial position. 25. A method according to any of Claims 20 to 24, further comprising adjusting the inertial measurement unit by determining when the image detector has a substantially zero velocity. 26. A method according to Claim 25, wherein zero velocity is determined by calculating the optical flow of a series of images detected by the image detector. 27. A method according to Claim 26, wherein the optical flow analysis is performed continuously, such that the inertial measuring unit is adjusted each time the image detector has zero velocity as determined by the optical flow analysis. 28. A method according to any of Claims 25 to 27, wherein the zero velocity adjusting is fine-tuned utilising a rolling mean of the output values from the inertial measuring unit. 29. A method according to any of Claims 20 to 28, wherein the spatial position of the image detector is tracked in at least five degrees of freedom. 30. A method according to any of Claims 20 to 29, further comprising plotting each spatial position of each feature to provide said layout of the features of the building. 31. A method according to Claim 30, further comprising outputting said layout to at least one of: a display; an electronic file generator; and a printer. 32. A method of calibration for portable electronic device accelerometers, comprising: placing the portable electronic device on a substantially flat surface; taking a first set of readings from two substantially orthogonal accelerometers; rotating the portable electronic device by approximately 90 degrees; taking a second set of readings from the two substantially orthogonal accelerometers; determining the actual rotation angle of the portable electronic device; and determining a calibration constant for each of the accelerometers utilising the first and second readings and the rotation angle. 33. 
A method according to Claim 32, wherein the two substantially orthogonal accelerometers are oriented to measure accelerations in the plane of said surface.34. A method according to Claim 32 or 33, wherein the portable electronic device comprises at least one gyroscope, and the at least one gyroscope reading is utilised to determine the actual rotation angle.35. A method according to any of Claims 32 to 34, wherein the portable electronic device is a smartphone.36. Apparatus for determining a layout for a building substantially as herein described with reference to the accompanying figures.37. A method of determining a layout of a building substantially as herein described with reference to the accompanying figures.38. A method of calibration for portable electronic device accelerometers substantially as herein described.</claim-text>
GB201122131A 2011-12-21 2011-12-21 Apparatus for determining a floor plan of a building Withdrawn GB2498177A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
GB201122131A GB2498177A (en) 2011-12-21 2011-12-21 Apparatus for determining a floor plan of a building

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
GB201122131A GB2498177A (en) 2011-12-21 2011-12-21 Apparatus for determining a floor plan of a building

Publications (2)

Publication Number Publication Date
GB201122131D0 GB201122131D0 (en) 2012-02-01
GB2498177A true GB2498177A (en) 2013-07-10

Family

ID=45572887

Family Applications (1)

Application Number Title Priority Date Filing Date
GB201122131A Withdrawn GB2498177A (en) 2011-12-21 2011-12-21 Apparatus for determining a floor plan of a building

Country Status (1)

Country Link
GB (1) GB2498177A (en)

Cited By (32)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103557874A (en) * 2013-11-10 2014-02-05 西安费斯达自动化工程有限公司 Precession movement elimination and compensation method for vertical gyro in attitude measuring system
US8868375B1 (en) 2014-05-21 2014-10-21 Locometric Ltd Generation of a floor plan
US10268782B1 (en) 2017-02-22 2019-04-23 Middle Chart, LLC System for conducting a service call with orienteering
US10433112B2 (en) 2017-02-22 2019-10-01 Middle Chart, LLC Methods and apparatus for orienteering
US10480943B2 (en) 2016-02-04 2019-11-19 Honeywell International Inc. Systems and methods for noise and drift calibration using dithered calibration
CN110869700A (en) * 2017-07-28 2020-03-06 高通股份有限公司 System and method for determining vehicle position
US10620084B2 (en) 2017-02-22 2020-04-14 Middle Chart, LLC System for hierarchical actions based upon monitored building conditions
US10628617B1 (en) 2017-02-22 2020-04-21 Middle Chart, LLC Method and apparatus for wireless determination of position and orientation of a smart device
US10671767B2 (en) 2017-02-22 2020-06-02 Middle Chart, LLC Smart construction with automated detection of adverse structure conditions and remediation
US10733334B2 (en) 2017-02-22 2020-08-04 Middle Chart, LLC Building vital conditions monitoring
US10740502B2 (en) 2017-02-22 2020-08-11 Middle Chart, LLC Method and apparatus for position based query with augmented reality headgear
US10740503B1 (en) 2019-01-17 2020-08-11 Middle Chart, LLC Spatial self-verifying array of nodes
US10762251B2 (en) 2017-02-22 2020-09-01 Middle Chart, LLC System for conducting a service call with orienteering
US10776529B2 (en) 2017-02-22 2020-09-15 Middle Chart, LLC Method and apparatus for enhanced automated wireless orienteering
US10824774B2 (en) 2019-01-17 2020-11-03 Middle Chart, LLC Methods and apparatus for healthcare facility optimization
US10831945B2 (en) 2017-02-22 2020-11-10 Middle Chart, LLC Apparatus for operation of connected infrastructure
US10872179B2 (en) 2017-02-22 2020-12-22 Middle Chart, LLC Method and apparatus for automated site augmentation
US10902160B2 (en) 2017-02-22 2021-01-26 Middle Chart, LLC Cold storage environmental control and product tracking
US10949579B2 (en) 2017-02-22 2021-03-16 Middle Chart, LLC Method and apparatus for enhanced position and orientation determination
US10984146B2 (en) 2017-02-22 2021-04-20 Middle Chart, LLC Tracking safety conditions of an area
US11054335B2 (en) 2017-02-22 2021-07-06 Middle Chart, LLC Method and apparatus for augmented virtual models and orienteering
US11194938B2 (en) 2020-01-28 2021-12-07 Middle Chart, LLC Methods and apparatus for persistent location based digital content
US11269060B1 (en) 2021-07-09 2022-03-08 Locometric Limited Determination of whether a boundary includes an interruption
US11436389B2 (en) 2017-02-22 2022-09-06 Middle Chart, LLC Artificial intelligence based exchange of geospatial related digital content
US11468209B2 (en) 2017-02-22 2022-10-11 Middle Chart, LLC Method and apparatus for display of digital content associated with a location in a wireless communications area
US11475177B2 (en) 2017-02-22 2022-10-18 Middle Chart, LLC Method and apparatus for improved position and orientation based information display
US11481527B2 (en) 2017-02-22 2022-10-25 Middle Chart, LLC Apparatus for displaying information about an item of equipment in a direction of interest
US11507714B2 (en) 2020-01-28 2022-11-22 Middle Chart, LLC Methods and apparatus for secure persistent location based digital content
US11625510B2 (en) 2017-02-22 2023-04-11 Middle Chart, LLC Method and apparatus for presentation of digital content
US11640486B2 (en) 2021-03-01 2023-05-02 Middle Chart, LLC Architectural drawing based exchange of geospatial related digital content
US11900021B2 (en) 2017-02-22 2024-02-13 Middle Chart, LLC Provision of digital content via a wearable eye covering
US11900022B2 (en) 2017-02-22 2024-02-13 Middle Chart, LLC Apparatus for determining a position relative to a reference transceiver

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11199410B2 (en) * 2019-04-30 2021-12-14 Stmicroelectronics, Inc. Dead reckoning by determining misalignment angle between movement direction and sensor heading direction

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1336949A1 (en) * 2002-02-18 2003-08-20 Purple Labs Device with rectangular display
US20090138224A1 (en) * 2007-09-29 2009-05-28 Ruey-Der Lou Methods for improving accuracy of measurement and calibration of accelerometer parameters
WO2011091552A1 (en) * 2010-02-01 2011-08-04 Intel Corporation Extracting and mapping three dimensional features from geo-referenced images
EP2434256A2 (en) * 2010-09-24 2012-03-28 Honeywell International Inc. Camera and inertial measurement unit integration with navigation data feedback for feature tracking


Cited By (62)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103557874B (en) * 2013-11-10 2016-06-08 西安费斯达自动化工程有限公司 In a kind of attitude measurement system, the precessional motion of vertical gyro eliminates and compensation method
CN103557874A (en) * 2013-11-10 2014-02-05 西安费斯达自动化工程有限公司 Precession movement elimination and compensation method for vertical gyro in attitude measuring system
US8868375B1 (en) 2014-05-21 2014-10-21 Locometric Ltd Generation of a floor plan
US10480943B2 (en) 2016-02-04 2019-11-19 Honeywell International Inc. Systems and methods for noise and drift calibration using dithered calibration
US11054335B2 (en) 2017-02-22 2021-07-06 Middle Chart, LLC Method and apparatus for augmented virtual models and orienteering
US10983026B2 (en) 2017-02-22 2021-04-20 Middle Chart, LLC Methods of updating data in a virtual model of a structure
US11080439B2 (en) 2017-02-22 2021-08-03 Middle Chart, LLC Method and apparatus for interacting with a tag in a cold storage area
US10620084B2 (en) 2017-02-22 2020-04-14 Middle Chart, LLC System for hierarchical actions based upon monitored building conditions
US11087039B2 (en) 2017-02-22 2021-08-10 Middle Chart, LLC Headset apparatus for display of location and direction based content
US10671767B2 (en) 2017-02-22 2020-06-02 Middle Chart, LLC Smart construction with automated detection of adverse structure conditions and remediation
US10726167B2 (en) 2017-02-22 2020-07-28 Middle Chart, LLC Method and apparatus for determining a direction of interest
US10733334B2 (en) 2017-02-22 2020-08-04 Middle Chart, LLC Building vital conditions monitoring
US10740502B2 (en) 2017-02-22 2020-08-11 Middle Chart, LLC Method and apparatus for position based query with augmented reality headgear
US11900023B2 (en) 2017-02-22 2024-02-13 Middle Chart, LLC Agent supportable device for pointing towards an item of interest
US10760991B2 (en) 2017-02-22 2020-09-01 Middle Chart, LLC Hierarchical actions based upon monitored building conditions
US10762251B2 (en) 2017-02-22 2020-09-01 Middle Chart, LLC System for conducting a service call with orienteering
US10776529B2 (en) 2017-02-22 2020-09-15 Middle Chart, LLC Method and apparatus for enhanced automated wireless orienteering
US11900022B2 (en) 2017-02-22 2024-02-13 Middle Chart, LLC Apparatus for determining a position relative to a reference transceiver
US10831945B2 (en) 2017-02-22 2020-11-10 Middle Chart, LLC Apparatus for operation of connected infrastructure
US10831943B2 (en) 2017-02-22 2020-11-10 Middle Chart, LLC Orienteering system for responding to an emergency in a structure
US10866157B2 (en) 2017-02-22 2020-12-15 Middle Chart, LLC Monitoring a condition within a structure
US10872179B2 (en) 2017-02-22 2020-12-22 Middle Chart, LLC Method and apparatus for automated site augmentation
US10902160B2 (en) 2017-02-22 2021-01-26 Middle Chart, LLC Cold storage environmental control and product tracking
US11900021B2 (en) 2017-02-22 2024-02-13 Middle Chart, LLC Provision of digital content via a wearable eye covering
US10949579B2 (en) 2017-02-22 2021-03-16 Middle Chart, LLC Method and apparatus for enhanced position and orientation determination
US10984148B2 (en) 2017-02-22 2021-04-20 Middle Chart, LLC Methods for generating a user interface based upon orientation of a smart device
US10984147B2 (en) 2017-02-22 2021-04-20 Middle Chart, LLC Conducting a service call in a structure
US10984146B2 (en) 2017-02-22 2021-04-20 Middle Chart, LLC Tracking safety conditions of an area
US11100260B2 (en) 2017-02-22 2021-08-24 Middle Chart, LLC Method and apparatus for interacting with a tag in a wireless communication area
US11010501B2 (en) 2017-02-22 2021-05-18 Middle Chart, LLC Monitoring users and conditions in a structure
US11893317B2 (en) 2017-02-22 2024-02-06 Middle Chart, LLC Method and apparatus for associating digital content with wireless transmission nodes in a wireless communication area
US10268782B1 (en) 2017-02-22 2019-04-23 Middle Chart, LLC System for conducting a service call with orienteering
US11625510B2 (en) 2017-02-22 2023-04-11 Middle Chart, LLC Method and apparatus for presentation of digital content
US10628617B1 (en) 2017-02-22 2020-04-21 Middle Chart, LLC Method and apparatus for wireless determination of position and orientation of a smart device
US10433112B2 (en) 2017-02-22 2019-10-01 Middle Chart, LLC Methods and apparatus for orienteering
US11610033B2 (en) 2017-02-22 2023-03-21 Middle Chart, LLC Method and apparatus for augmented reality display of digital content associated with a location
US11106837B2 (en) 2017-02-22 2021-08-31 Middle Chart, LLC Method and apparatus for enhanced position and orientation based information display
US11120172B2 (en) 2017-02-22 2021-09-14 Middle Chart, LLC Apparatus for determining an item of equipment in a direction of interest
US11188686B2 (en) 2017-02-22 2021-11-30 Middle Chart, LLC Method and apparatus for holographic display based upon position and direction
US11610032B2 (en) 2017-02-22 2023-03-21 Middle Chart, LLC Headset apparatus for display of location and direction based content
US11514207B2 (en) 2017-02-22 2022-11-29 Middle Chart, LLC Tracking safety conditions of an area
US11481527B2 (en) 2017-02-22 2022-10-25 Middle Chart, LLC Apparatus for displaying information about an item of equipment in a direction of interest
US11429761B2 (en) 2017-02-22 2022-08-30 Middle Chart, LLC Method and apparatus for interacting with a node in a storage area
US11436389B2 (en) 2017-02-22 2022-09-06 Middle Chart, LLC Artificial intelligence based exchange of geospatial related digital content
US11475177B2 (en) 2017-02-22 2022-10-18 Middle Chart, LLC Method and apparatus for improved position and orientation based information display
US11468209B2 (en) 2017-02-22 2022-10-11 Middle Chart, LLC Method and apparatus for display of digital content associated with a location in a wireless communications area
CN110869700A (en) * 2017-07-28 2020-03-06 高通股份有限公司 System and method for determining vehicle position
US11042672B2 (en) 2019-01-17 2021-06-22 Middle Chart, LLC Methods and apparatus for healthcare procedure tracking
US11861269B2 (en) 2019-01-17 2024-01-02 Middle Chart, LLC Methods of determining location with self-verifying array of nodes
US11636236B2 (en) 2019-01-17 2023-04-25 Middle Chart, LLC Methods and apparatus for procedure tracking
US11593536B2 (en) 2019-01-17 2023-02-28 Middle Chart, LLC Methods and apparatus for communicating geolocated data
US10740503B1 (en) 2019-01-17 2020-08-11 Middle Chart, LLC Spatial self-verifying array of nodes
US10824774B2 (en) 2019-01-17 2020-11-03 Middle Chart, LLC Methods and apparatus for healthcare facility optimization
US10943034B2 (en) 2019-01-17 2021-03-09 Middle Chart, LLC Method of wireless determination of a position of a node
US11100261B2 (en) 2019-01-17 2021-08-24 Middle Chart, LLC Method of wireless geolocated information communication in self-verifying arrays
US11361122B2 (en) 2019-01-17 2022-06-14 Middle Chart, LLC Methods of communicating geolocated data based upon a self-verifying array of nodes
US11436388B2 (en) 2019-01-17 2022-09-06 Middle Chart, LLC Methods and apparatus for procedure tracking
US11507714B2 (en) 2020-01-28 2022-11-22 Middle Chart, LLC Methods and apparatus for secure persistent location based digital content
US11194938B2 (en) 2020-01-28 2021-12-07 Middle Chart, LLC Methods and apparatus for persistent location based digital content
US11809787B2 (en) 2021-03-01 2023-11-07 Middle Chart, LLC Architectural drawing aspect based exchange of geospatial related digital content
US11640486B2 (en) 2021-03-01 2023-05-02 Middle Chart, LLC Architectural drawing based exchange of geospatial related digital content
US11269060B1 (en) 2021-07-09 2022-03-08 Locometric Limited Determination of whether a boundary includes an interruption

Also Published As

Publication number Publication date
GB201122131D0 (en) 2012-02-01

Similar Documents

Publication Publication Date Title
GB2498177A (en) Apparatus for determining a floor plan of a building
CN105865451B (en) Method and apparatus for mobile robot indoor positioning
CN107194969B (en) Sensor calibration and position estimation based on vanishing point determination
US20160260250A1 (en) Method and system for 3d capture based on structure from motion with pose detection tool
EP2807629B1 (en) Mobile device configured to compute 3d models based on motion sensor data
WO2015134795A2 (en) Method and system for 3d capture based on structure from motion with pose detection tool
JP5027747B2 (en) Position measurement method, position measurement device, and program
EP3168571B1 (en) Utilizing camera to assist with indoor pedestrian navigation
JP5027746B2 (en) Position measurement method, position measurement device, and program
WO2010001940A1 (en) Position measurement method, position measurement device, and program
US20080144925A1 (en) Stereo-Based Visual Odometry Method and System
Ruotsalainen et al. Visual-aided two-dimensional pedestrian indoor navigation with a smartphone
RU2572637C2 (en) Parallel or serial reconstructions in online and offline modes for 3D measurements of rooms
JP2008058264A (en) Device, method, and program for observing flow velocity in an actual river
Brunetto et al. Fusion of inertial and visual measurements for RGB-D SLAM on mobile devices
TWM560099U (en) Indoor precise navigation system using augmented reality technology
Cheng et al. AR-based positioning for mobile devices
US20170343355A1 (en) Method And System For Estimating Relative Angle Between Headings
JP4649192B2 (en) Stereo image creation method and three-dimensional data creation apparatus
Qian et al. Optical flow based step length estimation for indoor pedestrian navigation on a smartphone
CN115540854A (en) Active positioning method, equipment and medium based on UWB assistance
Hol et al. A new algorithm for calibrating a combined camera and IMU sensor unit
CN105874352B (en) The method and apparatus of the dislocation between equipment and ship are determined using radius of turn
JPH08261719A (en) Device and method for calculating amount of relative movement
CN108801248B (en) Planar vision inertial navigation method based on UKF

Legal Events

Date Code Title Description
WAP Application withdrawn, taken to be withdrawn or refused ** after publication under section 16(1)