GB2534190A - A method for measuring distance and areas by mobile devices combined with light beam projectors - Google Patents

A method for measuring distance and areas by mobile devices combined with light beam projectors Download PDF

Info

Publication number
GB2534190A
GB2534190A GB1500712.3A GB201500712A GB2534190A GB 2534190 A GB2534190 A GB 2534190A GB 201500712 A GB201500712 A GB 201500712A GB 2534190 A GB2534190 A GB 2534190A
Authority
GB
United Kingdom
Prior art keywords
distance
light beam
module
measuring point
mcu
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
GB1500712.3A
Other versions
GB201500712D0 (en)
Inventor
Lin Wen-Wei
Yen Hsien-Cheng
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Conary Enterprise Co Ltd
Original Assignee
Conary Enterprise Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Conary Enterprise Co Ltd filed Critical Conary Enterprise Co Ltd
Priority to GB1500712.3A priority Critical patent/GB2534190A/en
Publication of GB201500712D0 publication Critical patent/GB201500712D0/en
Publication of GB2534190A publication Critical patent/GB2534190A/en
Withdrawn legal-status Critical Current

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C3/00 Measuring distances in line of sight; Optical rangefinders
    • G01C3/02 Details
    • G01C3/06 Use of electric means to obtain final indication
    • G01C3/08 Use of electric radiation detectors
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C15/00 Surveying instruments or accessories not provided for in groups G01C1/00 - G01C13/00
    • G01C15/002 Active optical surveying means
    • G01C15/008 Active optical surveying means combined with inclination sensor
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C3/00 Measuring distances in line of sight; Optical rangefinders
    • G01C3/02 Details
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/02 Systems using the reflection of electromagnetic waves other than radio waves
    • G01S17/06 Systems determining position data of a target
    • G01S17/42 Simultaneous measurement of distance and other co-ordinates
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/481 Constructional features, e.g. arrangements of optical elements

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Electromagnetism (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Optical Radar Systems And Details Thereof (AREA)
  • Measurement Of Optical Distance (AREA)

Abstract

A method for measuring distance and area by a mobile device 50 combined with a light beam projector, the projector projecting light beams in the same direction as that of the image retrieved by a photodetector module of the mobile device, so as to project a first measuring point A and a second measuring point B and produce a first distance OA and a second distance OB. The coordinates (X1, Y1, Z1), (X2, Y2, Z2) of the first and second measuring points A, B are calculated from first azimuth data and second azimuth data detected by an azimuth sensor of the mobile device. The method can thus calculate the coordinate distance AB between the first and second measuring points A, B and the area OAB. Since the coordinate distance between any two points, and the measure of the area they enclose with the point of origin, can be calculated, the measuring process is made convenient and more effective.

Description

TITLE: A METHOD FOR MEASURING DISTANCE AND AREAS BY MOBILE DEVICES COMBINED WITH LIGHT BEAM PROJECTORS
BACKGROUND OF THE INVENTION
1. Field of the Invention:
The present invention relates to a method for measuring distance and areas by mobile devices combined with light beam projectors, and especially to one that can conveniently calculate the distance between two points and can further calculate the measure of areas, making the measuring process convenient and more effective.
2. Description of the Related Art
FIGS. 1-4 illustrate a distance measurement system and method disclosed in Taiwan Publication No. 1289196. It calculates the actual distance between a target object and an image capturing device from the proportional relationship between pixel values and the actual distance to the target object. FIG. 1 is a functional block diagram thereof. A laser light source 20 projects a laser beam onto the surface of the target object 30 and a digital camera 10 retrieves image information; a computing unit 40 then calculates the pixel values of the image so as to detect the actual length of the target object 30, or the distance between it and the digital camera 10, from the proportional relationship between the pixel values and the distance to the target object 30.
FIG. 2 is a schematic diagram illustrating distance measurement by pixel values according to this system and method. The digital camera 10 individually retrieves image information on the lines CD and EF, onto each of which the laser light source 20 projects a point, where O_P denotes the optical point of origin of the digital camera 10; P_D and P_F denote the points projected by the laser light source 20 onto the planes CD and EF respectively; O denotes the center of the scanned plane captured by the digital camera 10; H_D denotes the distance between the plane CD and the digital camera 10; H_F denotes the distance between the plane EF and the digital camera 10; h_s denotes the distance between the point O_P and the digital camera 10; D_D and D_F denote the maximal lengths that can be captured on the planes CD and EF respectively by the digital camera 10; D_r denotes the distance between P_D (or P_F) and O; 2θ_max denotes the maximal angle of view of the digital camera 10; N_max denotes the maximal pixel value of a single scanning line of the digital camera 10; and N_D and N_F denote the pixel values of the distances between P_D, P_F and O respectively.
Referring to the projected perspective view of an image retrieved by the digital camera 10 as shown in FIG. 3, the axis Z is the direction of image retrieval from the optical point of origin O_P, along which the digital camera 10 retrieves the image and the image information of the scanned plane including a point A and a point B. The direction of the axis Z is also the direction of the normal line of the scanned plane, and the point on the scanned plane through which the axis Z passes is the center of the plane, O. The line between points C and D and the line between points E and F are the scanning lines crossing the point O on the scanned plane; the crossing point O on each line lies at the N_max/2 pixel position.
Referring to FIG. 2 again, the laser beam projected by the laser light source 20 is parallel to the direction of the image retrieved by the digital camera 10. As a result, the laser beam is perpendicular to any plane scanned by the digital camera 10, and the points P_D, P_F projected by the laser beam onto the scanned planes have the same distance D_r from the point O on their respective planes.
With each projected point P_D, P_F at the same distance D_r from the point O on its plane, a horizontal distance D_r can be obtained with a single laser light source projecting onto any plane, instead of two laser light sources. In addition, the time for scanning the image information captured by the digital camera 10 and the actual distance between the digital camera 10 and the target object are in linear proportion, so the computing unit 40 can express the distances in pixel values with the following formulas: D_D = (N_max / N_D) × D_r and D_F = (N_max / N_F) × D_r.
On the other hand, the following formulas come from the Triangle Theorem: H_D = (1/2) D_D cot θ_max - h_s and H_F = (1/2) D_F cot θ_max - h_s. Substituting the formulas above gives: H_D = (1/2)(N_max / N_D × D_r) cot θ_max - h_s and H_F = (1/2)(N_max / N_F × D_r) cot θ_max - h_s. The cot θ_max and h_s in the inferred formulas can be calculated in advance by a calculation model, and the computing unit 40 is then able to find the values of N_D and N_F, and further those of H_D and H_F, with the formulas.
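For clarity, the prior-art relations above can be restated as a short computational sketch. The following Python snippet is illustrative only; the function name and the example numbers are assumptions, not taken from the patent, and merely show how D and H follow from the pixel count, D_r, θ_max and h_s.

```python
import math

def plane_distance(n_pixels, n_max, d_r, theta_max, h_s):
    """Prior-art pixel-based range estimate (illustrative sketch).

    n_pixels  : pixel count N between the projected laser point and the image centre O
    n_max     : maximal pixel count N_max of a single scanning line
    d_r       : fixed lateral offset D_r between the laser spot and O
    theta_max : half of the camera's maximal viewing angle, in radians
    h_s       : offset h_s between the optical origin O_P and the camera body
    """
    d_plane = (n_max / n_pixels) * d_r              # D = (N_max / N) * D_r
    h = 0.5 * d_plane / math.tan(theta_max) - h_s   # H = 1/2 * D * cot(theta_max) - h_s
    return d_plane, h

# Example with made-up numbers: 640-pixel line, spot 80 px from centre,
# D_r = 0.05 m, theta_max = 30 degrees, h_s = 0.02 m.
print(plane_distance(80, 640, 0.05, math.radians(30), 0.02))
```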
In FIG. 4, a structure diagram of a calculation model for calculating parameter values according to the prior art, the calculation model includes the digital camera 10, two vertical rulers 41, and two horizontal rulers 43; the perpendicular distances h_m1, h_m2 between each horizontal ruler 43 and the surface of the digital camera 10 can be measured by the vertical rulers 41.
The maximal capture angle 2θ_max of the digital camera 10 is restricted to 2θ_s for the accuracy of the measurement of h_s in the calculation model; in this way, the edges of the scanned plane are eliminated in case of blurred edges.
Furthermore, when the maximal angle is restricted to 2θ_s, the maximal horizontal distances D_m1, D_m2 that the digital camera 10 can capture can also be easily measured and calculated by the Triangle Theorem with the following formulas: h_s + h_m1 = (1/2) D_m1 cot θ_s and h_s + h_m2 = (1/2) D_m2 cot θ_s.
Then the formula for the value of cot θ_s can be inferred as follows: h_m1 - h_m2 = (1/2)(D_m1 - D_m2) cot θ_s, so cot θ_s = 2(h_m1 - h_m2) / (D_m1 - D_m2). We can further infer the following proportional relation by comparing the formulas above: (h_s + h_m2) / (h_s + h_m1) = D_m2 / D_m1. Therefore the value of h_s can be found from the inferred formula h_s = (h_m1 × D_m2 - h_m2 × D_m1) / (D_m1 - D_m2).
Another measurement tool is the laser rangefinder, which emits a laser beam to the target object and calculates the distance in between by receiving the laser signals reflected to a laser signal receiver, usually an Avalanche Photo Diode (APD), which turns the laser signals into electric signals. The equation of the calculation is T_d = 2L / C, where T_d denotes the delay between sending and receiving the signals; L denotes the distance between the origin of measurement and the target object; and C denotes the velocity of light. Therefore, measuring the delay T_d leads to the distance L by calculation.
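As a worked illustration of the two prior-art relations just described, the short Python sketch below first recovers cot θ_s and h_s from the two ruler measurements of the calculation model, and then converts a time-of-flight delay into a distance with L = C × T_d / 2. The function names and the sample values are illustrative assumptions, not part of the patent.

```python
def calibrate(h_m1, h_m2, d_m1, d_m2):
    """Recover cot(theta_s) and h_s from two ruler measurements (sketch)."""
    cot_theta_s = 2.0 * (h_m1 - h_m2) / (d_m1 - d_m2)
    h_s = (h_m1 * d_m2 - h_m2 * d_m1) / (d_m1 - d_m2)
    return cot_theta_s, h_s

def tof_distance(t_d, c=299_792_458.0):
    """Distance from a round-trip delay T_d, using T_d = 2L / C."""
    return c * t_d / 2.0

# Illustrative numbers only: two horizontal rulers at 0.40 m and 0.20 m from the
# camera surface, with maximal captured widths of 1.00 m and 0.60 m respectively.
print(calibrate(0.40, 0.20, 1.00, 0.60))
# A 66.7 ns round trip corresponds to roughly 10 m.
print(tof_distance(66.7e-9))
```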
As technology advances, laser rangefinders are widely applied in construction engineering, decoration engineering, and so on. As shown in FIG. 5, a laser rangefinder 100 is used to measure the table 101. If the table 101 is placed along the wall, the signals are reflected by the wall and the length L can be found. However, there is still room for improvement. Firstly, the measurer has to stand at a fixed position and project the laser beam to a pre-determined measuring point rather than to any measuring point in the space, making the measurement inconvenient. Secondly, since the laser beam is projected to a pre-determined measuring point, only the distance can be calculated, not the measure of the area surrounded by any measuring point and the point of origin.
SUMMARY OF THE INVENTION
It is a primary object of the present invention to provide a method for measuring distance and areas by mobile devices combined with light beam projectors that can calculate the distance between any two points projected by the light beam projector of the mobile device, so as to overcome the inability of the prior art to process such a calculation and to make the distance measuring process convenient.
Another object of the present invention is to provide a method for measuring distance and areas by mobile devices combined with light beam projectors that can calculate the distance between any two points projected by the light beam projector of the mobile device, as well as the distances between the projector, taken as the point of origin, and the projected points, so as to calculate the measure of the area surrounded by these distances, to overcome the inability of the prior art to calculate areas, and to augment the effectiveness of the measuring process.
In order to achieve the objects above, the present invention comprises the steps according to the annexed claims.
In summation, the present invention can obtain the moving coordinates of the first and second measuring points individually from the first and second distances and the first and second azimuth data respectively, and can further calculate the coordinate distance between the first and second measuring points with the structures disclosed above. In addition, the method can also calculate the area surrounded by the coordinate distance, the first distance and the second distance, making the measuring process convenient and more effective.
BRIEF DESCRIPTION OF THE DRAWINGS
FIG. 1 is a functional block diagram according to the prior art;
FIG. 2 is a schematic diagram illustrating the distance measuring by the pixel values according to the prior art;
FIG. 3 is a projected perspective view of an image retrieved by a digital camera according to the prior art;
FIG. 4 is a structure diagram of a calculation model for calculating the parameter values according to the prior art;
FIG. 5 is an application example of a rangefinder according to the prior art;
FIG. 6 is a flow diagram of the present invention;
FIG. 7 is a block diagram illustrating the combination of the mobile device and the light beam projector in the present invention;
FIG. 8A is an exploded view of the light beam projector in the present invention;
FIG. 8B is a perspective view of the assembled light beam projector in the present invention;
FIG. 9A is an exploded view of the light beam projector and the mobile device before the combination in the present invention;
FIG. 9B is a perspective view of the light beam projector and the mobile device after the combination in the present invention;
FIG. 9C is a perspective view of the combined light beam projector and mobile device in a matching form in the present invention;
FIG. 10A is another exploded view of the light beam projector and the mobile device before the combination in the present invention;
FIG. 10B is another perspective view of the light beam projector and the mobile device after the combination in the present invention;
FIG. 11A is a partially sectional view of the light emitting module arranged in the same direction as the one of the connecting plug;
FIG. 11B is a cross-sectional view along line 11B-11B in FIG. 11A;
FIG. 11C is a partially sectional view of the light emitting module arranged in a perpendicular direction to the one of the connecting plug;
FIG. 11D is a cross-sectional view along line 11D-11D in FIG. 11C;
FIG. 12 is an application view of the present invention;
FIG. 13A is a practical application view of a coordinate distance measurement in the present invention; and
FIG. 13B is a practical application view of an area measurement in the present invention.
DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENT
Referring to the flow diagram in FIG. 6 in coordination with FIGS. 6-13B, in a preferred embodiment the present invention includes steps S1-S9 as follows. Step S1: initiating the process. Step S2: a) providing a mobile device 50 having at least one MCU 51, a memory 52, a photodetector module 53, and an azimuth sensor 54; the MCU 51 is electrically connected to the memory 52, the photodetector module 53, and the azimuth sensor 54 separately.
The photodetector module 53 comprises either a camera module or an avalanche photodiode (APD). Basically it detects the distance between the observer (e.g. position O) and a pre-determined point in the space (e.g. position A) by the principles of the photodetector; such a function can be easily achieved in the prior art. The mobile device 50 comprises either a smart phone, a tablet PC, or a rangefinder; such devices are equipped with a photodetector module 53. In the following embodiment, the present invention takes a smart phone as the mobile device 50 for illustration, but the present invention is certainly not limited to such an application. Referring to FIG. 7, in this embodiment the mobile device 50 is a smart phone which has a transmission port 55 and an audio jack 55', each electrically connected to the MCU 51 individually.
S3: b) providing a light beam projector 60 electrically linked up with, and driven by, the mobile device 50, the direction of the light beam projected by the light beam projector 60 being identical with the direction of the image retrieved by the photodetector module 53. The light beam projector 60 can be disposed inside or outside the mobile device 50, as shown in FIGS. 8A and 8B. The light beam projector 60 comprises a connecting plug 70 having a PCB 71 with an electrically connecting element 72 arranged at the front end thereof, a light emitting module 80 arranged aside the PCB 71, a driving circuit 73 coupled to and disposed between the PCB 71 and the light emitting module 80, and a casing 90 wrapping the PCB 71 and the light emitting module 80. The electrically connecting element 72 is designed to match the specifications of the transmission port 55 and the audio jack 55' of the mobile device 50, so that it can be inserted to access electricity and signals. A light emitting hole 91 is arranged on a surface of the periphery of the casing 90 for the light emitting module 80 to project the light. Besides, the driving circuit 73 can be disposed either on the PCB 71 or inside the light emitting module 80.
Furthermore, in a preferred embodiment, the transmission port 55 is arranged at the rear of the smart phone for the light beam projector 60 to link up with, as illustrated in FIGS. 9A, 9B, and 9C. In another preferred embodiment, the transmission port 55 is arranged at the side of the smart phone for the light beam projector 60 to link up with, as illustrated in FIGS. 10A and 10B. From the disclosed embodiments it can be concluded that the light beam projector 60 can be applied to the transmission port 55 of any smart phone. Also, the mobile device 50 can be a tablet PC or a rangefinder other than a smart phone, as stated before. Hence, the light beam projector 60 can not only link up to the transmission port 55 and the audio jack 55' of a smart phone as an electrically connected interface, it can also link up to any transmission port on a tablet PC or a rangefinder. Alternatively, the light beam projector 60 can be built inside the mobile device 50.
With reference to FIGS. 11A-11D, the light emitting module 80 includes a hollow tube 81, a luminous element 82 arranged inside the hollow tube 81 and having a plurality of pins 83 at the bottom thereof, and an optical lens 84 arranged inside the hollow tube 81 ahead of the luminous element 82. The luminous element 82 comprises either a laser diode or an LED, and the LED comprises either a visible LED or an infrared LED. In this embodiment, the light emitting module 80 can be arranged in the same direction as the connecting plug 70, as shown in FIG. 11A, or in a direction perpendicular to the connecting plug 70, as shown in FIG. 11C. In addition, referring to FIGS. 11C and 11D, the light emitting hole 91 of the casing 90 is arranged in an L shape; at the corner thereof a reflector 92 with a 45° incline is arranged for the light to be refracted to a pre-determined direction.
With reference to FIG. 12, the next steps are as follows. S4: c) setting up the connections between the memory 52, the photodetector module 53, the azimuth sensor 54 and the light beam projector 60 by the MCU 51, so that when the connection is activated, the azimuth sensor 54 initializes and sets up the photodetector module 53 as being at the point of origin O.
S5: d) turning the photodetector module 53 and linking up the light beam projector 60 for the azimuth sensor 54 to produce first azimuth data (α1, β1), and then activating the photodetector module 53 to retrieve first image information P1 after the light beam is projected to a first measuring point A in the space; a first distance OA between the first measuring point A and the photodetector module 53 is calculated by the MCU 51, and the moving coordinates (X1, Y1, Z1) of the first measuring point A are calculated by the MCU 51 from the first distance OA and the first azimuth data (α1, β1) with the formulas below: X1 = OA × sin(β1) × cos(α1); Y1 = OA × sin(β1) × sin(α1); and Z1 = OA × cos(β1).
Then the first distance OA and the moving coordinates (X1, Y1, Z1) of the first measuring point A are stored in the memory 52.
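Step d) above, like the analogous step e) below, is essentially a spherical-to-Cartesian conversion, with the azimuth data supplying the angles and the measured range supplying the radius. The following Python sketch is illustrative only; the function name, the angle convention (β measured from the Z axis, α in the X-Y plane) and the sample values are assumptions restating the formulas of this step, not an implementation taken from the patent.

```python
import math

def measuring_point_coordinates(distance, alpha, beta):
    """Moving coordinates (X, Y, Z) of a measuring point, per steps d)/e).

    distance : measured range (e.g. OA or OB)
    alpha    : first azimuth angle, in radians
    beta     : second azimuth angle, measured from the Z axis, in radians
    """
    x = distance * math.sin(beta) * math.cos(alpha)
    y = distance * math.sin(beta) * math.sin(alpha)
    z = distance * math.cos(beta)
    return (x, y, z)

# Illustrative example: point A at 2.5 m with azimuth data (30 deg, 60 deg).
point_a = measuring_point_coordinates(2.5, math.radians(30), math.radians(60))
print(point_a)
```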
S6: e) turning again the photodetector module 53 and linking up the light beam projector 60 for the azimuth sensor 54 to produce second azimuth data (α2, β2), and then activating the photodetector module 53 to retrieve second image information P2 after the light beam is projected to a second measuring point B in the space; a second distance OB between the second measuring point B and the photodetector module 53 is calculated by the MCU 51, and the moving coordinates (X2, Y2, Z2) of the second measuring point B are calculated by the MCU 51 from the second distance OB and the second azimuth data (α2, β2) with the formulas below: X2 = OB × sin(β2) × cos(α2); Y2 = OB × sin(β2) × sin(α2); and Z2 = OB × cos(β2).
Then the second distance OB and the moving coordinates (X2, Y2, Z2) of the second measuring point B are stored in the memory 52.
S7: f) accessing the moving coordinates (X1, Y1, Z1) and (X2, Y2, Z2) of the first measuring point A and the second measuring point B by the MCU 51 and calculating the coordinate distance AB between the first measuring point and the second measuring point with the formula AB = √((X1 - X2)² + (Y1 - Y2)² + (Z1 - Z2)²). Then the next step would be S9: terminating.
The present invention may further include S8: g) storing the coordinate distance AB between the first measuring point A and the second measuring point B in the memory 52 and accessing it to calculate the area OAB surrounded by said coordinate distance AB, the first distance OA and the second distance OB by the MCU 51, and then proceeding to S9: terminating.
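The patent does not state which formula the MCU uses to obtain the area OAB from the three side lengths OA, OB and AB; Heron's formula is one natural choice, so the sketch below uses it as an assumption. The function names and the sample coordinates are illustrative, not part of the disclosed method.

```python
import math

def coordinate_distance(p1, p2):
    """Coordinate distance between two points (X, Y, Z), per step f)."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p1, p2)))

def triangle_area(oa, ob, ab):
    """Area of triangle OAB from its three side lengths.

    The patent does not specify the formula; Heron's formula is assumed here.
    """
    s = (oa + ob + ab) / 2.0  # semi-perimeter
    return math.sqrt(s * (s - oa) * (s - ob) * (s - ab))

# Illustrative example: O at the origin, A and B obtained as in steps d)/e).
a = (1.0, 0.5, 2.0)
b = (2.0, 1.5, 1.0)
oa = coordinate_distance((0.0, 0.0, 0.0), a)
ob = coordinate_distance((0.0, 0.0, 0.0), b)
ab = coordinate_distance(a, b)
print(ab, triangle_area(oa, ob, ab))
```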
Hence, the mobile device 50 includes a display module 56 electrically connected to the MCU 51 and an application program 57 stored in the memory 52 and written with steps c)-g) (S4-S8); through the display module 56 the mobile device 50 can operate the MCU 51 and access the application program 57, so that the display module 56 displays a measurement list M of the coordinate distance AB and the area OAB.
Referring to FIG. 13A, after the coordinate distance AB on the measurement list M is selected, the first and second measuring points A, B can be projected anywhere in the space and the coordinate distance AB can be easily calculated. Further referring to FIG. 13B, after the area OAB on the measurement list M is selected, the first and second measuring points A, B can be projected anywhere in the space to produce the distances AB, OA, and OB for calculating the area OAB. Alternatively, a third measuring point C can be projected anywhere in the space; its moving coordinates (X3, Y3, Z3) and the coordinate distance BC are then calculated, producing an area OBC surrounded by the distances BC, OB, and OC, so that a larger area combining the areas OAB and OBC can be obtained, which further augments the effectiveness of the present invention.
Although particular embodiments of the invention have been described in detail for purposes of illustration, various modifications and enhancements may be made without departing from the spirit and scope of the invention. Accordingly, the invention is not to be limited except as by the appended claims.

Claims (9)

  1. What Is Claimed Is: 1. A method for measuring distance and areas by mobile devices combined with light beam projectors comprising: a) providing a mobile device (50) having at least one MCU (51), a memory (52), a photodetector module (53), and an azimuth sensor (54), and said MCU (51) being electrically connected to the memory (52), the photodetector module (53), and the azimuth sensor (54) separately; b) providing a light beam projector (60) electrically linked up with said mobile device (50) to be driven by, and the direction of the light beam projected by the light beam projector (60) being identical with the direction of the image retrieved by the photodetector module (53); c) setting up the connections between the memory (52), the photodetector module (53), the azimuth sensor (54) and the light beam projector (60) by the MCU (51) so that when the connection being activated, the azimuth sensor (54) would initialize and set up the photodetector module (53) as being at the point of the origin (O); d) turning the photodetector module (53) and linking up the light beam projector (60) for the azimuth sensor (54) to produce a first azimuth data (α1, β1), and then activating the photodetector module (53) to retrieve a first image information (P1) after the light beam being projected to a first measuring point (A) in the space; a first distance (OA) between the first measuring point (A) and the photodetector module (53) being calculated by the MCU (51), and a moving coordinates (X1, Y1, Z1) of the first measuring point (A) being calculated by the MCU (51) with the first distance (OA) and the first azimuth data (α1, β1), then the first distance (OA) and the moving coordinates (X1, Y1, Z1) of the first measuring point (A) being stored in said memory (52); e) turning again the photodetector module (53) and linking up the light beam projector (60) for the azimuth sensor (54) to produce a second azimuth data (α2, β2), and then activating the photodetector module (53) to retrieve a second image information (P2) after the light beam being projected to a second measuring point (B) in the space; a second distance (OB) between the second measuring point (B) and the photodetector module (53) being calculated by the MCU (51), and a moving coordinates (X2, Y2, Z2) of the second measuring point (B) being calculated by the MCU (51) with the second distance (OB) and the second azimuth data (α2, β2), then the second distance (OB) and the moving coordinates (X2, Y2, Z2) of the second measuring point (B) being stored in said memory (52); and f) accessing the moving coordinates (X1, Y1, Z1), (X2, Y2, Z2) of the first measuring point (A) and the second measuring point (B) by the MCU (51) and calculating the coordinate distance (AB) between the first measuring point (A) and the second measuring point (B).
  2. 2. The method as claimed in claim 1, wherein the method further includes a step g) storing the coordinate distance (AB) in the memory (52) and accessing it to calculate the area (OAB) surrounded by said coordinate distance (AB), the first distance (OA) and the second distance (OB) by the MCU (51).
  3. 3. The method as claimed in claim 2, wherein the mobile device (50) includes a display module (56) electrically connected to the MCU (51) and an application program (57) stored in the memory (52) and written with the steps c)-g); the mobile device (50) can handle the display module (56) to operate the MCU (51) and access the application program (57) so that the display module (56) would display a measurement list (M) of the coordinate distance (AB) and area (OAB).
  4. 4. The method as claimed in claim 1, wherein the mobile device (50) comprises either a smart phone, a tablet PC, or a rangefinder.
  5. 5. The method as claimed in claim 1, wherein the photodetector module (53) comprises either a camera module or an avalanche photodiode.
  6. 6. The method as claimed in claim 1, wherein the light beam projector (60) comprises a connecting plug (70) having a PCB (71) with an electrically connecting element (72) arranged at the front end thereof, a light emitting module (80) arranged aside the PCB (71), a driving circuit (73) coupled to the PCB (71) and the light emitting module (80) and disposed in-between, and a casing (90) wrapping the PCB (71) and the light emitting module (80); the electrically connecting element (72) being designed to match the specifications of a transmission port (55) or an audio jack (55') of the mobile device (50), in order to be inserted in for accessing the electricity and signals and being exposed at the inner side of the casing (90); and a light emitting hole (91) being arranged on a surface of the periphery of the casing (90) for the light emitting module (80) to project the lights.
  7. 7. The method as claimed in claim 6, wherein the light emitting module (80) includes a hollow tube (81), a luminous element (82) being arranged inside the hollow tube (81) and having a plurality of pins (83) at the bottom thereof, and an optical lens (84) arranged inside the hollow tube (81) ahead of the luminous element (82); the luminous element (82) comprises either a laser diode or a LED, and the LED comprises either a Visible LED or an Infrared LED.
  8. 8. The method as claimed in claim 6, wherein the light emitting module (80) can be arranged in the same direction or in the perpendicular direction with the connecting plug (70).
  9. 9. The method as claimed in claim 6, wherein the light emitting hole (91) of the casing (90) is arranged in an L shape, at the corner thereof a reflector (92) with a 45° incline is arranged for the lights to be refracted to a pre-determined direction.
GB1500712.3A 2015-01-16 2015-01-16 A method for measuring distance and areas by mobile devices combined with light beam projectors Withdrawn GB2534190A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
GB1500712.3A GB2534190A (en) 2015-01-16 2015-01-16 A method for measuring distance and areas by mobile devices combined with light beam projectors

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
GB1500712.3A GB2534190A (en) 2015-01-16 2015-01-16 A method for measuring distance and areas by mobile devices combined with light beam projectors

Publications (2)

Publication Number Publication Date
GB201500712D0 GB201500712D0 (en) 2015-03-04
GB2534190A true GB2534190A (en) 2016-07-20

Family

ID=52630673

Family Applications (1)

Application Number Title Priority Date Filing Date
GB1500712.3A Withdrawn GB2534190A (en) 2015-01-16 2015-01-16 A method for measuring distance and areas by mobile devices combined with light beam projectors

Country Status (1)

Country Link
GB (1) GB2534190A (en)

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2004036246A1 (en) * 2002-10-18 2004-04-29 Peter Stevrin Mobile phone with laser range finder
TW201400791A (en) * 2012-06-25 2014-01-01 Univ Southern Taiwan Tech Distance measuring device on mobile phone
US20140357316A1 (en) * 2013-05-29 2014-12-04 Apple Inc. Electronic Device With Mapping Circuitry

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11906290B2 (en) 2016-03-04 2024-02-20 May Patents Ltd. Method and apparatus for cooperative usage of multiple distance meters
US11037315B2 (en) 2018-07-09 2021-06-15 Toughbuilt Industries, Inc. Dual laser measuring device and online ordering system using the same

Also Published As

Publication number Publication date
GB201500712D0 (en) 2015-03-04

Legal Events

Date Code Title Description
WAP Application withdrawn, taken to be withdrawn or refused ** after publication under section 16(1)