CN108304119B - Object measuring method, intelligent terminal and computer readable storage medium - Google Patents

Object measuring method, intelligent terminal and computer readable storage medium

Info

Publication number
CN108304119B
CN108304119B (application CN201810058366.2A)
Authority
CN
China
Prior art keywords
measurement
user interface
measured
point
terminal
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201810058366.2A
Other languages
Chinese (zh)
Other versions
CN108304119A (en)
Inventor
陈泽滨
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tencent Technology Shenzhen Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tencent Technology Shenzhen Co Ltd filed Critical Tencent Technology Shenzhen Co Ltd
Priority to CN201810058366.2A priority Critical patent/CN108304119B/en
Publication of CN108304119A publication Critical patent/CN108304119A/en
Application granted granted Critical
Publication of CN108304119B publication Critical patent/CN108304119B/en

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/60 Analysis of geometric attributes
    • G06T 7/62 Analysis of geometric attributes of area, perimeter, diameter or volume

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Geometry (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The embodiment of the invention discloses an object measuring method, an intelligent terminal and a computer readable storage medium, wherein the method comprises the following steps: acquiring a trigger event of object measurement; calling a camera component of the terminal to shoot an object to be measured, and displaying an environment image of the object to be measured on a user interface of the terminal; acquiring an object measurement instruction; determining, according to the object measurement instruction, measurement points set on the user interface, the measurement points including a measurement starting point and a measurement end point of the object to be measured in the environment image; acquiring reference coordinate data of the measurement points according to their positions on the terminal screen; acquiring length measurement data of the object to be measured according to the reference coordinate data; and displaying the length measurement data in the user interface. The length measurement data of the object to be measured can thus be measured intelligently, which improves the accuracy of the measurement result.

Description

Object measuring method, intelligent terminal and computer readable storage medium
Technical Field
The present invention relates to the field of internet technologies, and in particular, to a method for measuring an object, an intelligent terminal, and a computer-readable storage medium.
Background
With the development of science and technology, terminals provide increasingly rich functions; a present-day terminal not only meets users' daily communication needs but may also offer an object measurement function. The object measurement function of current terminals works by simulating a graduated ruler on the terminal screen according to real-world size proportions, and the measurement interface of such a terminal may be as shown in fig. 1. As can be seen from fig. 1, since this measuring method displays a simulated measuring ruler on the terminal screen, its measurable length is limited. If the length of the object to be measured is greater than the length of the simulated ruler, the measurement cannot be performed with the simulated ruler. Moreover, if the user wants to use this object measurement function to measure the length of an object, the user needs to hold the simulated ruler against the object in the same way as a physical ruler, which results in a poor measurement experience and inaccurate measurement results.
Disclosure of Invention
The embodiment of the invention provides an object measuring method, an intelligent terminal and a computer readable storage medium, which can intelligently measure the length of an object to be measured, thereby improving the accuracy of a measuring result.
In one aspect, an embodiment of the present invention provides an object measurement method, where the method includes: acquiring a trigger event of object measurement, calling a camera component of the terminal to shoot the object to be measured, and displaying an environment image of the object to be measured on a user interface of the terminal; obtaining an object measurement instruction, and determining measurement points set on the user interface according to the object measurement instruction, where the measurement points include a measurement starting point and a measurement end point of the object to be measured in the environment image; acquiring reference coordinate data of the measurement points according to the positions of the measurement points on the terminal screen, and acquiring length measurement data of the object to be measured according to the reference coordinate data; and displaying the length measurement data in the user interface.
In another aspect, an embodiment of the present invention provides a terminal, where the terminal includes:
an acquisition unit for acquiring a trigger event of an object measurement.
And the shooting unit is used for calling the camera shooting assembly of the terminal to shoot the object to be measured and displaying the environment image of the object to be measured on the user interface of the terminal.
The acquisition unit is also used for acquiring an object measurement instruction.
A determination unit configured to determine measurement points set on the user interface according to the object measurement instruction, the measurement points including a measurement starting point and a measurement end point of the object to be measured in the environment image.
The acquisition unit is also used for acquiring the reference coordinate data of the measuring point according to the position of the measuring point on the terminal screen.
The acquisition unit is also used for acquiring length measurement data of the object to be measured according to the reference coordinate data.
A display unit for displaying the length measurement data in the user interface.
In another aspect, an embodiment of the present invention provides another intelligent terminal, which includes a processor, an input device, an output device, and a memory, where the processor, the input device, the output device, and the memory are connected to each other, where the memory stores a computer program, the computer program includes program instructions, and the processor is configured to call the program instructions to execute the object measurement method.
In yet another aspect, an embodiment of the present invention provides a computer-readable storage medium, where a computer program is stored, where the computer program includes program instructions, and the program instructions, when executed, implement the above object measurement method.
According to the embodiment of the invention, the length measurement data can be measured and displayed by shooting the object to be measured and combining this with the user's object measurement instruction. The user can measure an object at any time and in any place as long as the user carries the intelligent terminal, which realizes automated and intelligent object measurement and better satisfies the user's measurement needs.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings needed in the description of the embodiments are briefly introduced below. Obviously, the drawings described below show only some embodiments of the present invention, and those skilled in the art can obtain other drawings based on these drawings without creative effort.
FIG. 1 is a schematic view of a measurement interface of measurement software provided in the background art;
FIG. 2 is a schematic flow chart diagram of a method for measuring an object according to an embodiment of the present invention;
fig. 3 is a schematic diagram of a terminal determining a mapping point of a measurement point of an object to be measured on a fitting plane according to an embodiment of the present invention;
FIG. 4 is a schematic diagram of a mapping relationship between a pixel plane coordinate system and an image coordinate system according to an embodiment of the present invention;
fig. 5 is a schematic diagram of a mapping relationship between a camera coordinate system and an image coordinate system in a camera imaging process according to an embodiment of the present invention;
fig. 6 is a schematic view of a measurement scenario provided in an embodiment of the present invention;
FIG. 6a is a schematic diagram of a user interface provided by an embodiment of the present invention;
FIG. 6b is a schematic view of another user interface provided by an embodiment of the present invention;
FIG. 6c is a schematic view of another user interface provided by embodiments of the present invention;
FIG. 6d is a schematic view of another user interface provided by embodiments of the present invention;
FIG. 6e is a schematic diagram of another user interface provided by embodiments of the present invention;
FIG. 6f is a schematic view of another user interface provided by embodiments of the present invention;
FIG. 6g is a schematic diagram of another user interface provided by an embodiment of the invention;
FIG. 6h is a schematic diagram of another user interface provided by embodiments of the present invention;
FIG. 7 is a schematic diagram of a user interface provided by another embodiment of the present invention;
FIG. 8 is a schematic block diagram of an object measuring device provided by an embodiment of the present invention;
fig. 9 is a schematic block diagram of an intelligent terminal according to an embodiment of the present invention.
Detailed Description
In the embodiment of the invention, a user interface can be provided for the user. On one hand, an environment image captured by the terminal's camera component can be displayed on the user interface; on the other hand, length measurement operations, such as a user's click operation, can be received to trigger measurement of an object to be measured in the environment image. Through the length measurement operation, the user can set a measurement starting point and a measurement end point of the object to be measured on the user interface. The reference coordinate data of the measurement starting point and the measurement end point are determined based on the conversion relationship between the pixel coordinate system and the reference coordinate system, so that the length from the measurement starting point to the measurement end point is determined and the measurement of the object to be measured is completed. The object to be measured may be a whole area or a partial area of an object in the captured environment image, and the measured length is the length between the two measurement points selected by the user. For example, when a user wishes to measure the size of a computer display screen, two opposite corners of the display screen can be set as the measurement starting point and the measurement end point respectively through the user interface, and the size of the display screen is finally obtained.
Referring to fig. 2, a schematic flowchart of an object measurement method according to an embodiment of the present invention is shown, where the method according to the embodiment of the present invention may be implemented by an intelligent terminal, for example, a mobile intelligent terminal such as a smart phone and a tablet computer. The object measuring method as shown in fig. 2 may include the following steps.
S201, after the terminal acquires a trigger event of object measurement, calling a camera shooting assembly to shoot the object to be measured, and displaying an environment image of the object to be measured on a user interface of the terminal.
In one embodiment, the trigger event of the object measurement may refer to an operation of a user to turn on a measurement function in the terminal. If a user wants to measure an object to be measured, the user can turn on a measuring function in the terminal and align a camera shooting component in the terminal with the object to be measured. After the terminal obtains the operation of opening the measurement function by the user, the camera shooting assembly of the terminal can be called to shoot the object to be measured, and the environment image of the object to be measured is displayed on the user interface of the terminal.
It should be noted that the camera module mentioned in the embodiment of the present invention is a camera module in a terminal. In other possible embodiments, the camera assembly may be camera equipment externally connected to the terminal. Therefore, the present invention is not limited thereto.
S202, the terminal obtains an object measurement instruction and determines a measurement point set on the user interface according to the object measurement instruction.
The measurement points include a measurement starting point and a measurement end point of the object to be measured in the environment image.
In one embodiment, the user interface has a registration icon displayed thereon, which may be a small circle or other icon in the middle of the user interface. When the terminal acquires the object measurement instruction, the coincidence point of the alignment icon and the image of the object to be measured in the user interface can be determined according to the object measurement instruction, and the coincidence point is used as a measurement point set on the user interface. Specifically, if the terminal acquires a first object measurement instruction, a first coincident point of the alignment icon and an object to be measured in the environment image displayed on the user interface is determined, and the first coincident point is used as a measurement starting point set on the user interface. And after the terminal acquires a second object measurement instruction, determining a second coincident point of the alignment icon and the object to be measured in the environment image displayed on the user interface, and taking the second coincident point as a measurement terminal point set on the user interface. It should be noted that the object measurement command may be a click command on the user interface, a voice command, or a key command, which is not limited herein.
In one embodiment, the alignment icon in the embodiment of the present invention may be located in the middle of the user interface, and the user needs to align the alignment icon with the actual measurement starting point of the object to be measured when confirming the measurement starting point. Of course, in other embodiments of the invention, the alignment icon may be located at any position in the user interface. In this case, the manner of confirming the measurement starting point by the user is the same as the manner of confirming the measurement starting point by the user when the alignment icon is located in the middle of the user interface, and details are not described here. Or, the alignment icon may not be present in the user interface, and when the user needs to confirm the measurement starting point, the user only needs to click the position of the measurement starting point in the user interface to confirm. The foregoing is by way of example only and is not exhaustive.
In one embodiment, the terminal may further acquire a moving direction of the terminal when a measurement start point set on the user interface is determined according to the object measurement instruction. Drawing a measurement graphic in the user interface in the moving direction from the measurement starting point. And stopping drawing the measurement graph when the measurement end point set on the user interface is determined according to the object measurement instruction. And displaying the measurement graph in the user interface, wherein the drawing starting point of the measurement graph is coincided with the measurement starting point of the object to be measured, and the drawing end point of the measurement graph is coincided with the measurement end point of the object to be measured. In one embodiment, after determining the measurement starting point of the object to be measured, the terminal may obtain the moving direction of the terminal in real time through an acceleration sensor, a gyroscope or other components in the terminal, which may obtain the moving direction of the terminal, and draw a measurement graph in the user interface in real time from the measurement starting point along the moving direction.
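A platform-agnostic sketch of this real-time drawing logic is given below; sensor access and rendering are abstracted away as inputs, and all names are illustrative assumptions rather than part of the described method:

```python
import numpy as np

def current_drawing_end(start_point, move_direction, distance_moved):
    """
    Sketch: the measurement graphic is a segment from the measurement starting
    point to an end point obtained by offsetting the start point along the
    (normalised) moving direction by the distance moved so far. Rendering the
    segment and reading the sensors are left to the UI and sensor layers.
    """
    direction = np.asarray(move_direction, dtype=float)
    direction /= np.linalg.norm(direction)
    return np.asarray(start_point, dtype=float) + distance_moved * direction

print(current_drawing_end([0.0, 0.0, 0.0], [1.0, 0.0, 0.0], 0.12))   # [0.12 0. 0.]
```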
S203, the terminal acquires the reference coordinate data of the measuring point according to the position of the measuring point on the terminal screen, and acquires the length measuring data of the object to be measured according to the reference coordinate data.
The length measurement data is a distance value between a measurement starting point and a measurement end point of the object to be measured in the reference coordinate system.
In one embodiment, the terminal may first obtain the pixel coordinate data of the measuring point in the environment image according to the position of the measuring point on the terminal screen. The pixel coordinate data is then converted into reference coordinate data in a reference coordinate system. Optionally, converting the pixel coordinate data into reference coordinate data in the reference coordinate system involves three conversion steps: converting the pixel coordinate data into image coordinate data, converting the image coordinate data into camera coordinate data, and converting the camera coordinate data into reference coordinate data.
In one embodiment, converting the pixel coordinate data to image coordinate data may include the following computational process:
Pixel plane coordinate system u-v: the digital image captured by the camera may be stored in the computer as an array, and the value of each element (pixel) in the array is the brightness (gray level) of the corresponding image point. A rectangular coordinate system defined on the image is the pixel plane coordinate system u-v. The coordinates (u, v) of each pixel are, respectively, the column number and the row number of the pixel in the array.
Image coordinate system x-y: a coordinate system established to describe the location of a pixel in the image. The intersection of the camera optical axis and the image plane (typically the center of the image plane) is defined as the origin O_1 of the image coordinate system.
The mapping between the pixel plane coordinate system u-v and the image coordinate system x-y may be as shown in fig. 4. Suppose (u_0, v_0) represents the coordinates of O_1 in the pixel plane coordinate system u-v, and dx and dy represent the physical dimensions of each pixel along the horizontal axis x and the vertical axis y, respectively. Then the coordinates of each pixel in the pixel plane coordinate system u-v and its coordinates in the image coordinate system x-y satisfy:
u = x/dx + u_0    (formula 1.1)
v = y/dy + v_0    (formula 1.2)
Converting formulas 1.1 and 1.2 above into matrix form:
$$\begin{bmatrix} u \\ v \\ 1 \end{bmatrix} = \begin{bmatrix} 1/dx & 0 & u_0 \\ 0 & 1/dy & v_0 \\ 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} x \\ y \\ 1 \end{bmatrix} \quad \text{(formula 1.3)}$$
Formula 1.3 is the conversion relationship between the pixel plane coordinate system u-v and the image coordinate system x-y. When the terminal acquires the pixel coordinate data of the measuring point, the pixel coordinate data can be converted into image coordinate data according to formula 1.3.
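For illustration, a minimal Python sketch of formula 1.3 and its inverse follows; the pixel size and principal-point values are assumed placeholders, not parameters from the patent:

```python
import numpy as np

# Assumed camera parameters, for illustration only.
dx, dy = 0.0015, 0.0015      # physical size of one pixel (mm) along x and y
u0, v0 = 960.0, 540.0        # principal point O_1 in pixel coordinates

# Formula 1.3: homogeneous mapping between the pixel plane system u-v
# and the image coordinate system x-y.
M = np.array([[1.0 / dx, 0.0,      u0],
              [0.0,      1.0 / dy, v0],
              [0.0,      0.0,      1.0]])

def image_to_pixel(x, y):
    """Apply formula 1.3 directly: (x, y) in mm -> (u, v) in pixels."""
    u, v, _ = M @ np.array([x, y, 1.0])
    return u, v

def pixel_to_image(u, v):
    """Invert formula 1.3: recover image coordinates (x, y) from pixel (u, v)."""
    x, y, _ = np.linalg.inv(M) @ np.array([u, v, 1.0])
    return x, y

print(pixel_to_image(1200.0, 700.0))   # image-plane offset from O_1 in mm
```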
In one embodiment, converting the image coordinate data to camera coordinate data may include the following calculation process:
Camera coordinate system O-XcYcZc: the camera coordinate system may also be referred to as the observation coordinate system. It is a rectangular coordinate system constructed by taking the optical center of the camera as the origin, taking axes parallel to the x axis and the y axis of the image coordinate system as the Xc axis and the Yc axis, respectively, and taking the optical axis of the camera as the Zc axis.
The geometric relationship between the camera coordinate system O-XcYcZc and the image coordinate system x-y may be as shown in fig. 5, where the point O is located at the projection center of the camera, the Xc axis and the Yc axis are parallel to the x axis and the y axis of the image coordinate system x-y, and Zc is the optical axis of the camera. According to the principle of similar triangles, the following relationships can be obtained:
x = f · (Xc/Zc)    (formula 1.4)
y = f · (Yc/Zc)    (formula 1.5)
Converting formulas 1.4 and 1.5 above into matrix form:
$$Z_c \begin{bmatrix} x \\ y \\ 1 \end{bmatrix} = \begin{bmatrix} f & 0 & 0 & 0 \\ 0 & f & 0 & 0 \\ 0 & 0 & 1 & 0 \end{bmatrix} \begin{bmatrix} X_c \\ Y_c \\ Z_c \\ 1 \end{bmatrix} \quad \text{(formula 1.6)}$$
Formula 1.6 is the transformation between the image coordinate system x-y and the camera coordinate system O-XcYcZc, where f is the focal length of the camera. After the terminal acquires the image coordinate data of the measuring point, the image coordinate data can be converted into camera coordinate data according to formula 1.6.
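A minimal sketch of formulas 1.4 to 1.6, assuming the focal length f is known and the depth Zc of the point along the optical axis is available; the value of f below is an assumed placeholder:

```python
f = 0.026  # assumed focal length in metres (placeholder value)

def camera_to_image(Xc, Yc, Zc):
    """Formulas 1.4 and 1.5: perspective projection onto the image plane."""
    return f * Xc / Zc, f * Yc / Zc

def image_to_camera(x, y, Zc):
    """Invert the projection when the depth Zc along the optical axis is known."""
    return x * Zc / f, y * Zc / f, Zc

# A point 2 m in front of the camera and 0.5 m to the right:
x, y = camera_to_image(0.5, 0.0, 2.0)
print(image_to_camera(x, y, 2.0))   # recovers (0.5, 0.0, 2.0)
```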
In one embodiment, converting the camera coordinate data to the reference coordinate data may include the following calculation process:
Reference coordinate system O-XYZ: the reference coordinate system is the absolute coordinate system of the system, also called the world coordinate system; it is a rectangular coordinate system introduced to describe the position of the camera. Before a user coordinate system is established, the positions of all points in the picture are determined with respect to the origin of this coordinate system. Since a camera device (e.g., the camera component of a terminal, a video camera, etc.) can be placed at any position in an environment, a reference coordinate system needs to be selected in the environment to describe the position of the camera device and the position of any object in the environment.
The relationship between any one of the camera coordinate systems O-XcYcZc and the reference coordinate system O-XYZ can be described by the rotation matrix R and the translation vector T, and the correspondence relationship is as follows:
$$\begin{bmatrix} X_c \\ Y_c \\ Z_c \\ 1 \end{bmatrix} = \begin{bmatrix} R & T \\ 0^T & 1 \end{bmatrix} \begin{bmatrix} X \\ Y \\ Z \\ 1 \end{bmatrix} \quad \text{(formula 1.7)}$$
wherein R is the 3 × 3 rotation matrix, T is the 3 × 1 translation vector, and 0^T is (0, 0, 0). The extrinsic parameter matrix of the camera
$$\begin{bmatrix} R & T \\ 0^T & 1 \end{bmatrix}$$
can be reduced to a 3 × 4 matrix L_W:
$$L_W = \begin{bmatrix} R & T \end{bmatrix}$$
Thus, reducing formula 1.7 yields the following formula:
$$\begin{bmatrix} X_c \\ Y_c \\ Z_c \end{bmatrix} = L_W \begin{bmatrix} X \\ Y \\ Z \\ 1 \end{bmatrix} \quad \text{(formula 1.8)}$$
Formula 1.8 is the conversion relationship between the camera coordinate system O-XcYcZc and the reference coordinate system O-XYZ. After the terminal acquires the camera coordinate data of the measuring point, the camera coordinate data can be converted into reference coordinate data according to formula 1.8.
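The following sketch illustrates formula 1.8 and its inverse under the assumption that the rotation matrix R and translation vector T are already known; the particular R and T values below are placeholders for illustration:

```python
import numpy as np

# Placeholder extrinsic parameters: a 90-degree rotation about the Z axis
# and a small translation, chosen only for illustration.
R = np.array([[0.0, -1.0, 0.0],
              [1.0,  0.0, 0.0],
              [0.0,  0.0, 1.0]])
T = np.array([0.1, 0.2, 0.3])

def reference_to_camera(p_ref):
    """Formula 1.8: [Xc, Yc, Zc]^T = [R | T] [X, Y, Z, 1]^T."""
    return R @ p_ref + T

def camera_to_reference(p_cam):
    """Invert formula 1.8: X = R^T (Xc - T), since R is orthonormal."""
    return R.T @ (p_cam - T)

p = np.array([1.0, 2.0, 3.0])
print(camera_to_reference(reference_to_camera(p)))   # recovers (1, 2, 3)
```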
Further, the terminal may combine equation 1.3, equation 1.6, and equation 1.8 to obtain the following relationship:
$$Z_c \begin{bmatrix} u \\ v \\ 1 \end{bmatrix} = \begin{bmatrix} 1/dx & 0 & u_0 \\ 0 & 1/dy & v_0 \\ 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} f & 0 & 0 & 0 \\ 0 & f & 0 & 0 \\ 0 & 0 & 1 & 0 \end{bmatrix} \begin{bmatrix} R & T \\ 0^T & 1 \end{bmatrix} \begin{bmatrix} X \\ Y \\ Z \\ 1 \end{bmatrix} \quad \text{(formula 1.9)}$$
and the intrinsic parameter matrix of the camera is:
$$\begin{bmatrix} f/dx & 0 & u_0 & 0 \\ 0 & f/dy & v_0 & 0 \\ 0 & 0 & 1 & 0 \end{bmatrix}$$
Thus, formula 1.9 can be converted to:
$$Z_c \begin{bmatrix} u \\ v \\ 1 \end{bmatrix} = \begin{bmatrix} f/dx & 0 & u_0 & 0 \\ 0 & f/dy & v_0 & 0 \\ 0 & 0 & 1 & 0 \end{bmatrix} \begin{bmatrix} R & T \\ 0^T & 1 \end{bmatrix} \begin{bmatrix} X \\ Y \\ Z \\ 1 \end{bmatrix} \quad \text{(formula 2.0)}$$
as can be seen from equation 2.0, the reference coordinate data of a point can be obtained by acquiring the pixel coordinates (u, v) of the point, the intrinsic parameter and the extrinsic parameter of the camera, and the optical axis of the camera. Therefore, the process of the terminal converting the pixel coordinate data into the reference coordinate data in the reference coordinate system may include the following steps s11 to s13:
s11: acquiring parameter information of the camera shooting assembly and an optical axis of the camera shooting assembly, wherein the parameter information comprises internal parameters and external parameters.
s12: and determining the corresponding relation between the reference coordinate system and the pixel coordinate system according to the parameter information and the optical axis.
s13: and converting the pixel coordinate data into reference coordinate data in the reference coordinate system according to the corresponding relation.
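As an illustration of steps s11 to s13, the following sketch assumes that the 3×3 intrinsic matrix, the extrinsic parameters R and T, and the depth Zc of the measuring point along the optical axis (for example, derived from the fitting plane described below) are already available; the function name and parameters are illustrative, not taken from the patent:

```python
import numpy as np

def pixel_to_reference(u, v, K, R, T, Zc):
    """
    Steps s11 to s13 in one sketch (formula 2.0 rearranged):
      1) undo the intrinsic mapping to get a ray in the camera frame,
      2) scale the ray by the known depth Zc along the optical axis,
      3) transform from the camera frame to the reference frame via R and T.
    K is the 3x3 intrinsic matrix [[f/dx, 0, u0], [0, f/dy, v0], [0, 0, 1]].
    """
    ray = np.linalg.inv(K) @ np.array([u, v, 1.0])   # equals (Xc/Zc, Yc/Zc, 1)
    p_cam = Zc * ray                                  # camera coordinates
    return R.T @ (p_cam - T)                          # reference coordinates

# Example call with placeholder parameters:
K = np.array([[1000.0, 0.0, 960.0],
              [0.0, 1000.0, 540.0],
              [0.0, 0.0, 1.0]])
R, T = np.eye(3), np.zeros(3)
print(pixel_to_reference(960.0, 540.0, K, R, T, Zc=2.0))   # point on the optical axis, 2 units away
```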
In a specific implementation process, the terminal can acquire the values of the internal parameters, the external parameters and the optical axis of the camera shooting assembly, and substitute these values into formula 2.0. The pixel coordinate data are then converted into reference coordinate data in the reference coordinate system according to formula 2.0 with each parameter value determined. From geometric relationships, the distance between two points is the length of the vector difference between the two points. Therefore, after the terminal determines the reference coordinate data start-v = (x_1, y_1, z_1) of the measurement starting point and the reference coordinate data end-v = (x_2, y_2, z_2) of the measurement end point, the two vectors may be subtracted to obtain diff-v = start-v - end-v = (x_1 - x_2, y_1 - y_2, z_1 - z_2). The length between the measurement starting point and the measurement end point is therefore length(diff-v) = sqrt((x_1 - x_2)^2 + (y_1 - y_2)^2 + (z_1 - z_2)^2).
For example, suppose the basic unit length of the reference coordinate system is 1 cm. If the reference coordinate data of the measurement starting point obtained by the terminal is start-v = (1, 2, 3) and the reference coordinate data of the measurement end point is end-v = (4, 6, 3), the vector difference between the measurement starting point and the measurement end point is diff-v = (-3, -4, 0), and then length(diff-v) = sqrt((-3)^2 + (-4)^2 + 0^2) = 5 cm.
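The worked example above can be reproduced with a short sketch (using numpy for the vector norm):

```python
import numpy as np

start_v = np.array([1.0, 2.0, 3.0])   # reference coordinates of the measurement starting point
end_v   = np.array([4.0, 6.0, 3.0])   # reference coordinates of the measurement end point

diff_v = start_v - end_v               # (-3, -4, 0)
length = np.linalg.norm(diff_v)        # sqrt(9 + 16 + 0) = 5.0
print(f"{length:.1f} cm")              # 5.0 cm, matching the worked example
```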
In one embodiment, when the terminal receives an object measurement instruction, the camera component of the terminal may be called to obtain image feature points of the plane where the object to be measured is located; a fitting plane is then constructed according to the image feature points, and a pixel coordinate system is constructed according to the fitting plane. Therefore, when the terminal acquires the pixel coordinate data of the measurement starting point in the environment image, the terminal can project a straight line through the alignment icon along the shooting direction of the camera shooting assembly, and take the intersection point of the projected straight line and the constructed fitting plane as the mapping point of the measurement starting point on the fitting plane. The terminal then acquires the coordinate data of this mapping point in the pixel coordinate system and takes it as the pixel coordinate data of the measurement starting point in the environment image. It should be understood that the pixel coordinate data of the measurement end point in the environment image is obtained on the same principle as that of the measurement starting point, and details are not repeated here.
In addition, if a measuring point has no mapping point on the fitting plane, a prompt message is output in the user interface. In one embodiment, since the length measurement data of the object to be measured is calculated from the coordinate data of the measurement points, all measurement points need to be in the same pixel coordinate system. Each pixel coordinate system is determined by a fitting plane, so all measurement points are required to lie on the same fitting plane. If a measuring point has no mapping point on the fitting plane, the measuring point is not on the fitting plane. In this case, the terminal may output a prompt message in the user interface to prompt the user to re-confirm the location of the measurement point. The prompt message may take the form of a prompt box popped up in the user interface, whose content may be "unable to calculate, please confirm the measurement point again"; the terminal shaking the current user interface at a preset frequency; or the terminal marking the measurement graphic drawn in real time on the user interface in red. Of course, it should be noted that the prompt message may be output in many forms; the above are only examples and are not exhaustive.
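The mapping-point determination above can be sketched as a ray-plane intersection; when there is no valid intersection, the function returns None, corresponding to the case where the terminal outputs the prompt message. The function and parameter names are illustrative assumptions:

```python
import numpy as np

def ray_plane_intersection(origin, direction, plane_point, plane_normal, eps=1e-9):
    """
    Intersect the ray origin + t * direction (t >= 0) with the fitting plane
    given by a point on the plane and its normal. Returns the mapping point,
    or None when there is no valid intersection (the case in which the
    terminal should prompt the user to re-confirm the measurement point).
    """
    origin, direction = np.asarray(origin, float), np.asarray(direction, float)
    plane_point, plane_normal = np.asarray(plane_point, float), np.asarray(plane_normal, float)
    denom = np.dot(plane_normal, direction)
    if abs(denom) < eps:                 # ray parallel to the plane
        return None
    t = np.dot(plane_normal, plane_point - origin) / denom
    if t < 0:                            # the plane lies behind the camera
        return None
    return origin + t * direction

# Camera at the origin looking along +Z, horizontal plane at Z = 2:
print(ray_plane_intersection([0, 0, 0], [0, 0, 1], [0, 0, 2], [0, 0, 1]))   # [0. 0. 2.]
```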
And S204, the terminal displays the length measurement data in the user interface.
In one embodiment, the terminal may display the length measurement data at any location in the user interface, and may also display the length measurement data at a location in the user interface associated with the measurement graphic. It should be noted that, as the terminal moves, the size of the measurement graph drawn in the user interface changes. The position of the center point of the measurement pattern may also vary with the size of the measurement pattern. The position in the user interface associated with the measurement profile can therefore be determined from the centre point of the measurement profile. Of course, it should be understood that the location in the user interface associated with the measurement graphic may also be determined based on other factors of the measurement graphic, and is not limited herein.
In the embodiment of the invention, the terminal acquires a trigger event of object measurement, calls the camera component of the terminal to shoot the object to be measured, and displays an environment image of the object to be measured on the user interface of the terminal. The terminal may then obtain an object measurement instruction and determine measurement points set on the user interface according to the object measurement instruction, the measurement points including a measurement starting point and a measurement end point of the object to be measured in the environment image. Reference coordinate data of the measurement points are then acquired according to the positions of the measurement points on the terminal screen, and length measurement data of the object to be measured are acquired according to the reference coordinate data. Finally, the length measurement data is displayed in the user interface. By separately acquiring the reference coordinate data of the measurement starting point and the measurement end point and obtaining the length measurement data of the object to be measured from these reference coordinate data, the length of the object to be measured can be measured intelligently and the accuracy of the measurement result can be improved. In addition, as shown in fig. 1, the prior art simulates and displays a ruler on the terminal screen, so its measurable length is limited. The embodiment of the invention is an augmented reality measurement mode attached directly to the real-world object being measured, i.e., the length measurement data is obtained from the real coordinate data of the measurement points, so there is no limitation on the measurable length. Moreover, no measuring ruler needs to be held against the object to be measured, so the whole operation process is simpler.
Fig. 6 is a schematic view of a measurement scenario according to an embodiment of the present invention. If the user needs to measure the length of one pencil, the user can open the measuring function of the terminal first. After receiving an instruction of opening the measurement function of the terminal by the user, the terminal may call the camera component to capture an environment image including the pencil, and output a user interface as shown in fig. 6 a. In fig. 6a, the photographed object to be measured (pencil) is presented, and some icons, buttons, display frames, and prompt frames available for use in the measurement process are displayed. Such as an alignment icon at the center of the user interface, a deletion icon displayed below the user interface for facilitating the user to delete the confirmed measuring point, a photographing icon for directly photographing the environment image, etc., a display frame for displaying the length measurement data, and prompt frames for displaying prompt information of the length measurement function to the user.
As shown in fig. 6a, an anchor point in the shape of concentric circles is located right in the middle of the user interface; this anchor point is the alignment icon in the user interface. At this moment the anchor point is gray, indicating that the measurement environment does not satisfy the required conditions and measurement cannot be performed; the unsatisfied condition may be a factor such as strong ambient light, which can adversely affect measurement accuracy. In one embodiment, the user interface further includes a prompt box, and the user can slowly move the terminal according to the content of the prompt box so that the color of the anchor point turns black, after which the measurement work can proceed. That is, on the user interface, the user may be given measurement prompts by setting different colors for the alignment icon, including a prompt that measurement cannot be performed or a prompt that measurement can be performed.
In one embodiment, after the terminal acquires the action of the user to slowly move the terminal, the user interface output by the terminal can be seen in fig. 6b. As shown in fig. 6b, the anchor point in the user interface has turned black, i.e. it is illustrated that the user may start the measurement job. And the content of the prompt box also becomes 'confirm starting point, aiming at the center of the anchor point, and clicking any area of the screen can confirm the starting point'. At this time, the user can set a measurement start point according to the contents of the prompt box. The user can align the center of the anchor point in the user interface with one end of the pencil, and after the center of the anchor point is aligned with one end of the pencil, the user can click any area of the terminal screen to complete the setting of the measurement starting point. In the process of determining the measurement starting point by the user, the terminal receives an object measurement command, which may be a click command of the user clicking on a screen of the terminal. When the terminal receives a click command of a user, a coincidence point of the anchor point center and an image of an object to be measured in the user interface can be determined, and then the coincidence point is determined as a measurement starting point set on the user interface. After the measurement starting point is determined, the terminal may obtain pixel coordinate data of the measurement starting point, and obtain reference coordinate data of the measurement starting point according to the pixel coordinate data.
After the terminal obtains that the user completes the setting of the measurement starting point, the terminal outputs a user interface as shown in fig. 6 c. As shown in FIG. 6c, the contents of the prompt box in the user interface have become "drag line". At this time, the user may move the terminal along the measuring edge of the pencil, and in the moving process, the terminal may obtain, in real time, pixel coordinate data of a point on the measuring edge of the pencil corresponding to the alignment icon, convert the pixel coordinate data of the point into reference coordinate data, and then calculate, in real time, a distance value between the point and the measuring start point, that is, length measurement data. In one embodiment, while the user moves the terminal, the terminal may acquire the moving direction of the terminal in real time through an acceleration sensor, a gyroscope or other components in the terminal, draw a straight line along the moving direction from the measurement starting point on the user interface, and update the real-time calculated length measurement data on the user interface in real time, as shown in fig. 6 d. At this time, the user can see that the length measurement data displayed on the user interface changes in real time, and the straight line drawn on the user interface gradually becomes longer as the terminal moves.
In one embodiment, the terminal may simultaneously display the length measurement data at a location in the user interface associated with the measurement graphic and in an area in the user interface where the length measurement data is displayed when the user interface displays the length measurement data. It should be understood that in other embodiments, the terminal may display the length measurement data only in the location in the user interface associated with the measurement graphic, or the terminal may display the length measurement data only in the area in the user interface where the length measurement data is displayed. For the embodiment of the invention, the measurement graph is a straight line drawn by the terminal in the moving direction of the user interface. When the terminal displays the length measurement data on the user interface, the terminal can display the length measurement data in the area of the user interface where the length measurement data is displayed, and simultaneously display the length measurement data below the drawn straight line.
After the user moves the terminal along the measuring edge of the pencil for a period of time, the user interface diagram may be as shown in FIG. 6 e. As shown in FIG. 6e, the content of the prompt box in the user interface has become "confirm end, align anchor center, click any area of the screen to confirm end". At this point, the user may center the anchor point in the user interface to the other end of the pencil, which may be as shown in FIG. 6 f. After aligning the anchor point center with the other end of the pencil, the user can click any area of the terminal screen to complete the setting of the measurement endpoint. When the terminal obtains a click instruction of a user for clicking a terminal screen, the terminal stops drawing a straight line on the user interface, and length measurement data displayed in the user interface is not changed any more, and the user interface at this time can be as shown in fig. 6 g. It should be noted that, after the user completes the confirmation of the measurement end point, although the length measurement data displayed in the user interface is no longer changed, the position of the object to be measured displayed in the user interface is not fixed. In one embodiment, the user may move the terminal so that the pencil and its length measurement data are centered in the user interface for easy viewing, which may be as shown in FIG. 6 h.
Please refer to fig. 7, which is a schematic diagram of a user interface according to another embodiment of the present invention. As shown in fig. 7, the number of objects to be measured in the embodiment of the present invention is two, that is, the first object to be measured and the second object to be measured. The user can first determine the measurement start point and the measurement end point of the first object to be measured according to the methods listed above. The terminal calculates the length measurement data of the first object to be measured according to the object measurement instruction of the first object to be measured, and displays the length measurement data below the first measurement graph corresponding to the first object to be measured and in the area of the user interface displaying the length measurement data. The user then determines the start and end points of the measurement of the second object to be measured again according to the methods listed above. And the terminal calculates the length measurement data of the second object to be measured according to the object measurement instruction of the second object to be measured, and displays the length measurement data below a second measurement graph corresponding to the second object to be measured. In addition, the length measurement data of the second object to be measured is displayed in the area for displaying the length measurement data in the user interface, that is, the terminal replaces the length measurement data of the first object to be measured with the length measurement data of the second object to be measured in the area for displaying the length measurement data, but the measurement pattern and the length presented in the area for displaying the environment image are not changed, and the measurement interface of which the measurement is completed may be as shown in fig. 7. In one embodiment, the measurement interface shown in fig. 7 after the measurement is completed is not fixed and can be changed along with the change of the distance between the terminal and the object to be measured. The terminal can sense the distance between the terminal and the object to be measured through the distance sensor in the terminal, when the distance between the terminal and the object to be measured is short, the image size of the object to be measured is increased, at the moment, the measurement graph in the measurement interface after measurement is finished is increased along with the increase, but the actual length measurement value is not changed. When the distance between the terminal and the object to be measured is long, the image size of the object to be measured is reduced, and at the moment, the measurement graph in the measurement interface after measurement is finished can be shortened, but the actual length measurement value is not changed.
In one embodiment, the first object to be measured and the second object to be measured may or may not be in the same plane. The terminal can acquire whether the straight line emitted by the alignment icon and the first fitting plane corresponding to the first object to be measured have an intersection point when the measuring point of the second object to be measured is determined. If the intersection point exists, the second object to be measured and the first object to be measured are in the same plane. Otherwise, the same plane is not used. If the first object to be measured and the second object to be measured are in the same plane, when the terminal determines the length measurement data of the second object to be measured, the terminal does not need to acquire the graphic feature points of the plane where the second object to be measured is located, and does not need to construct a second fitting plane according to the image feature points. If the first object to be measured and the second object to be measured are not in the same plane, the terminal needs to execute steps of constructing a second fitting plane and the like when determining the length measurement data of the second object to be measured. It can be understood that the two objects to be measured shown in fig. 7 are only examples, and in practical applications, the terminal may also measure more than two objects to be measured at the same time on one measurement interface, and in one embodiment, the number may be set according to actual needs and the size of the measurement interface, which is not limited in the embodiment of the present invention.
The terminal can measure a plurality of objects to be measured on one measuring interface, and when a user wants to compare the sizes of the plurality of objects, the sizes of the plurality of objects can be measured in sequence on the same measuring interface. Then, the user can know the sizes of the plurality of objects at a glance by the length measurement data of the plurality of objects respectively displayed on the user interface. The user is not required to open a measuring interface to measure the size of one object and record the measurement result on paper, then open the measuring interface to measure the size of the next object, and so on. Therefore, the length measurement data of a plurality of objects to be measured can be displayed simultaneously, so that the complexity of operation can be reduced, and the convenience of the terminal can be improved.
As can be seen from fig. 6 or 7, the user interface of the terminal further includes a delete icon and a capture icon. If the user thinks that the length measurement data is not accurate, the user can delete the displayed length measurement data by clicking the delete icon, and can measure the object to be measured again. If the user wants to store the object to be measured and the length measurement data of the object to be measured after the measurement is completed, the image currently acquired by the camera shooting assembly can be shot and stored by clicking the shooting icon.
Fig. 8 is a schematic block diagram of a terminal according to an embodiment of the present invention. The terminal in the embodiment of the present invention shown in fig. 8 may include:
an acquisition unit 801 for acquiring trigger events for object measurements.
The shooting unit 802 is configured to invoke a camera component to perform shooting processing on an object to be measured, and display an environment image of the object to be measured on a user interface of the terminal.
The acquisition unit 801 is also used for acquiring an object measurement instruction.
A determining unit 803, configured to determine, according to the object measurement instruction, measurement points set on the user interface, the measurement points including a measurement starting point and a measurement end point of the object to be measured in the environment image.
The acquiring unit 801 is further configured to acquire reference coordinate data of the measurement point according to the position of the measurement point on the terminal screen.
An acquiring unit 801 is further configured to acquire length measurement data of the object to be measured from the reference coordinate data.
A display unit 804 for displaying the length measurement data in the user interface.
In an embodiment, the obtaining unit 801 may be specifically configured to: and acquiring pixel coordinate data of the measuring point in the environment image according to the position of the measuring point on the terminal screen. The pixel coordinate data is converted into reference coordinate data in a reference coordinate system.
In an embodiment, the obtaining unit 801 may be specifically configured to: and acquiring parameter information of the camera shooting assembly, wherein the parameter information comprises internal parameters and external parameters. And determining the corresponding relation between the reference coordinate system and the pixel coordinate system according to the parameter information. And converting the pixel coordinate data into reference coordinate data in the reference coordinate system according to the corresponding relation.
In one embodiment, a registration icon is displayed on the user interface. The object measurement instructions include a first object measurement instruction and a second object measurement instruction. Correspondingly, the determining unit 803 may be specifically configured to: and according to the first object measurement instruction, determining a first coincident point of the alignment icon and the object to be measured in the environment image displayed on the user interface, and taking the first coincident point as a measurement starting point set on the user interface. And determining a second coincident point of the alignment icon and the object to be measured in the environment image displayed on the user interface according to the second object measurement instruction, and taking the second coincident point as a measurement terminal point set on the user interface.
In one embodiment, a delete icon is displayed on the user interface. Correspondingly, the display unit 804 is further configured to: and if a click command for the deletion icon is received, deleting the length measurement data displayed on the user interface and/or the measurement graph from the measurement starting point to the measurement ending point displayed on the user interface.
In one embodiment, the display unit 804 is further operable to: and acquiring the moving direction of the terminal when the measuring starting point set on the user interface is determined according to the object measuring instruction. Drawing a measurement graphic in the user interface in the moving direction from the measurement starting point. And stopping drawing the measurement graph when the measurement end point set on the user interface is determined according to the object measurement instruction. And displaying the measurement graph in the user interface, wherein the drawing starting point of the measurement graph is coincided with the measurement starting point of the object to be measured, and the drawing end point of the measurement graph is coincided with the measurement end point of the object to be measured.
In one embodiment, the display unit 804 may be specifically configured to: and displaying the length measurement data at a position associated with the measurement graph in the user interface, wherein the length measurement data is a distance value between a measurement starting point and a measurement end point of the object to be measured in the reference coordinate system.
In the embodiment of the present invention, if the obtaining unit 801 of the terminal obtains the trigger event of object measurement, the shooting unit 802 is called to call the camera component to perform shooting processing on the object to be measured, and an environment image of the object to be measured is displayed on the user interface of the terminal. Next, the acquiring unit 801 is invoked to acquire an object measurement instruction, and the determining unit 803 is invoked to determine a measurement point set on the user interface by the object measurement instruction, where the measurement point includes: and determining a measurement starting point and a measurement end point of the object to be measured in the environment image. Then, the acquisition unit 801 is invoked to acquire reference coordinate data of the measurement point from the position of the measurement point on the terminal screen, and length measurement data of the object to be measured from the reference coordinate data. Finally, the length measurement data is displayed in the user interface via the display unit 804. The embodiment of the invention can intelligently measure the length measurement data of the object to be measured by respectively obtaining the reference coordinate data of the measurement starting point and the measurement end point of the object to be measured and obtaining the length measurement data of the object to be measured through the reference coordinate data, thereby improving the accuracy of the measurement result.
Fig. 9 is a schematic block diagram of a terminal according to another embodiment of the present invention. The terminal in this embodiment as shown in fig. 9 may include: one or more processors 901; one or more input devices 902, one or more output devices 903, and memory 904. The processor 901, the input device 902, the output device 903, and the memory 904 are connected by a bus 905. The memory 904 is used to store a computer program comprising program instructions, and the processor 901 is used to execute the program instructions stored by the memory 904.
In the embodiment of the present invention, the processor 901 loads and executes one or more instructions stored in the computer storage medium to implement the corresponding steps of the method in the corresponding embodiment described above; in a specific implementation, at least one instruction in the computer storage medium is loaded by the processor 901 and performs the following steps:
the method comprises the steps of obtaining a trigger event of object measurement, calling a camera component of a terminal to shoot an object to be measured, and displaying an environment image of the object to be measured on a user interface of the terminal. Obtaining an object measurement instruction, and determining a measurement point set on the user interface according to the object measurement instruction, wherein the measurement point comprises: and determining a measurement starting point and a measurement end point of the object to be measured in the environment image. And acquiring reference coordinate data of the measuring point according to the position of the measuring point on the terminal screen, and acquiring length measuring data of the object to be measured according to the reference coordinate data. The length measurement data is displayed in the user interface.
In one embodiment, the at least one program instruction is loaded by the processor 901, and is further configured to perform: and acquiring pixel coordinate data of the measuring point in the environment image according to the position of the measuring point on the terminal screen. The pixel coordinate data is converted into reference coordinate data in a reference coordinate system.
In one embodiment, the at least one program instruction is loaded by the processor 901 and further configured to perform: acquiring parameter information of the camera shooting assembly, wherein the parameter information comprises internal parameters and external parameters. And determining the corresponding relation between the reference coordinate system and the pixel coordinate system according to the parameter information. And converting the pixel coordinate data into reference coordinate data in the reference coordinate system according to the corresponding relation.
In one embodiment, a registration icon is displayed on the user interface, and the object measurement instructions include a first object measurement instruction and a second object measurement instruction. Correspondingly, the at least one program instruction is loaded by the processor 901, and is further configured to perform: and determining a first coincident point of the alignment icon and the object to be measured in the environment image displayed on the user interface according to the first object measurement instruction, and taking the first coincident point as a measurement starting point set on the user interface. And determining a second coincident point of the alignment icon and the object to be measured in the environment image displayed on the user interface according to a second object measurement instruction, and taking the second coincident point as a measurement terminal point set on the user interface.
In one embodiment, a delete icon is displayed on the user interface, and the at least one program instruction is loaded by the processor 901 and is further configured to perform the following: if a click instruction for the delete icon is received, deleting the length measurement data displayed on the user interface and/or the measurement graphic drawn from the measurement starting point to the measurement end point on the user interface.
In one embodiment, the at least one program instruction is loaded by the processor 901 and is further configured to perform the following: when the measurement starting point set on the user interface is determined according to the object measurement instruction, acquiring the moving direction of the terminal; drawing a measurement graphic in the user interface along the moving direction, starting from the measurement starting point; stopping drawing the measurement graphic when the measurement end point set on the user interface is determined according to the object measurement instruction; and displaying the measurement graphic in the user interface, wherein the drawing starting point of the measurement graphic coincides with the measurement starting point of the object to be measured, and the drawing end point of the measurement graphic coincides with the measurement end point of the object to be measured.
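A rough sketch of how the drawing might be driven frame by frame; ui.draw_line and tracker.center_hit_point are hypothetical names standing in for the terminal's rendering and tracking interfaces.

def update_measurement_graphic(ui, tracker, start_point, end_point_set):
    # Per-frame update of the measurement graphic (illustrative only).
    # The fixed end stays at the measurement starting point; the free end follows the
    # point currently under the alignment icon, so the graphic grows along the moving
    # direction of the terminal until the measurement end point is set.
    if start_point is None or end_point_set:
        return
    current_end = tracker.center_hit_point()  # reference-frame point under the alignment icon
    ui.draw_line(start_point, current_end)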
In one embodiment, the at least one program instruction is loaded by the processor 901 and is further configured to perform the following: displaying the length measurement data at a position associated with the measurement graphic in the user interface, the length measurement data being the distance between the measurement starting point and the measurement end point of the object to be measured in the reference coordinate system.
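The length itself is then simply the Euclidean distance between the two reference-frame points, for example:

import numpy as np

def length_measurement(p_start, p_end):
    # Distance between the measurement starting point and end point in the reference coordinate system.
    return float(np.linalg.norm(np.asarray(p_end) - np.asarray(p_start)))

# e.g. length_measurement([0.0, 0.0, 0.0], [0.3, 0.4, 0.0]) returns 0.5
# (in metres, assuming the reference coordinate system uses metric units)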
The processor 901 may be a Central Processing Unit (CPU) or another general-purpose processor, for example a microprocessor or any conventional processor. The memory 904 may include both read-only memory and random access memory, and provides instructions and data to the processor 901. The specific forms of the processor 901 and the memory 904 are therefore not limited herein.
It should be noted that, for the specific working process of the terminal and the unit described above, reference may be made to the relevant description in each foregoing embodiment, and details are not described here again.
It will be understood by those skilled in the art that all or part of the processes of the methods in the above embodiments may be implemented by a computer program, which may be stored in a computer-readable storage medium and, when executed by a computer, carries out the processes of the above method embodiments. The storage medium may be a magnetic disk, an optical disc, a Read-Only Memory (ROM), a Random Access Memory (RAM), or the like.
While the invention has been described with reference to a number of embodiments, it will be understood by those skilled in the art that various changes in form and details may be made therein without departing from the spirit and scope of the invention as defined by the appended claims.

Claims (9)

1. An object measuring method, comprising:
acquiring a trigger event of object measurement;
calling a camera component of the terminal to shoot an object to be measured, and displaying an environment image of the object to be measured on a user interface of the terminal, wherein icons and a prompt box are further displayed on the user interface, and the displayed icons comprise an alignment icon and a shooting icon; issuing a measurement prompt through different colors set for the alignment icon, wherein the measurement prompt comprises a prompt that measurement cannot be carried out or a prompt that measurement can be carried out;
when it is confirmed from the color of the alignment icon that measurement cannot be carried out, issuing a movement prompt message on the terminal through the prompt box so that measurement can be carried out conveniently;
when it is confirmed from the color of the alignment icon that measurement can be carried out, displaying, through the prompt box, prompt information for setting a measurement starting point, and determining that a first object measurement instruction is obtained when a click instruction at any position of the terminal screen is received;
if the first object measurement instruction is acquired, determining a first coincident point of the alignment icon and the object to be measured in the environment image displayed on the user interface, and taking the first coincident point as a measurement starting point set on the user interface;
drawing, according to an acquired moving direction of the terminal, a straight line on the user interface from the measurement starting point along the moving direction, wherein one end of the drawn straight line coincides with the measurement starting point determined by the click instruction, and the other end of the drawn straight line coincides with the alignment icon;
after a second object measurement instruction is acquired, determining a second coincident point of the alignment icon and the object to be measured in the environment image displayed on the user interface, and taking the second coincident point as a measurement end point set on the user interface;
acquiring reference coordinate data of the measurement starting point and the measurement end point according to the positions of the measurement points on the terminal screen;
acquiring length measurement data of the object to be measured according to the reference coordinate data;
displaying the length measurement data in the user interface;
and if a click instruction for the shooting icon is received, capturing and storing the image currently obtained by the camera component, the image comprising the object to be measured and the length measurement data of the object to be measured.
2. The method according to claim 1, wherein the acquiring reference coordinate data according to the positions of the measurement points on the terminal screen comprises:
acquiring pixel coordinate data of the measurement points in the environment image according to the positions of the measurement points on the terminal screen;
and converting the pixel coordinate data into reference coordinate data in a reference coordinate system.
3. The method of claim 2, wherein converting the pixel coordinate data into reference coordinate data in a reference coordinate system comprises:
acquiring parameter information of the camera component, wherein the parameter information comprises intrinsic parameters and extrinsic parameters;
determining the correspondence between the reference coordinate system and a pixel coordinate system according to the parameter information;
and converting the pixel coordinate data into reference coordinate data in the reference coordinate system according to the correspondence.
4. The method of claim 1, wherein a delete icon is displayed on the user interface, the method further comprising:
and if a click instruction for the delete icon is received, deleting the length measurement data displayed on the user interface and/or a measurement graphic drawn from the measurement starting point to the measurement end point on the user interface.
5. The method of any one of claims 1 to 4, further comprising:
when a measurement starting point set on the user interface is determined according to the object measurement instruction, the moving direction of the terminal is obtained;
drawing a measurement graphic in the user interface in the moving direction from the measurement starting point;
stopping drawing the measurement graphic when a measurement end point set on the user interface is determined according to the object measurement instruction;
and displaying the measurement graphic in the user interface, wherein the drawing starting point of the measurement graphic coincides with the measurement starting point of the object to be measured, and the drawing end point of the measurement graphic coincides with the measurement end point of the object to be measured.
6. The method of claim 5, wherein said displaying the length measurement data in the user interface comprises:
and displaying the length measurement data at a position associated with the measurement graphic in the user interface, wherein the length measurement data is the distance between the measurement starting point and the measurement end point of the object to be measured in a reference coordinate system.
7. An object measuring device, characterized by comprising means for performing the method according to any one of claims 1 to 6.
8. An intelligent terminal, comprising a processor, an input device, an output device and a memory, the processor, the input device, the output device and the memory being interconnected, wherein the memory is configured to store a computer program, the computer program comprising program instructions, the processor being configured to invoke the program instructions to perform the method according to any one of claims 1 to 6.
9. A computer-readable storage medium storing a computer program comprising program instructions that, when executed by a processor, cause the processor to perform the method of any one of claims 1 to 6.
CN201810058366.2A 2018-01-19 2018-01-19 Object measuring method, intelligent terminal and computer readable storage medium Active CN108304119B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810058366.2A CN108304119B (en) 2018-01-19 2018-01-19 Object measuring method, intelligent terminal and computer readable storage medium

Publications (2)

Publication Number Publication Date
CN108304119A (en) 2018-07-20
CN108304119B (en) 2022-10-28

Family

ID=62865731

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810058366.2A Active CN108304119B (en) 2018-01-19 2018-01-19 Object measuring method, intelligent terminal and computer readable storage medium

Country Status (1)

Country Link
CN (1) CN108304119B (en)

Families Citing this family (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111095024A (en) * 2018-09-18 2020-05-01 深圳市大疆创新科技有限公司 Height determination method, height determination device, electronic equipment and computer-readable storage medium
CN109579745A (en) * 2018-11-26 2019-04-05 江苏科技大学 Novel house Area computing method based on augmented reality and cell phone software
CN109859265B (en) * 2018-12-28 2024-04-19 维沃移动通信有限公司 Measurement method and mobile terminal
CN109974580A (en) * 2019-03-28 2019-07-05 江苏瑞奇海力科技有限公司 A kind of measurement method, device, electronic equipment and storage medium
CN110030928A (en) * 2019-04-11 2019-07-19 接楚添 The method and system of space object positioning and measurement based on computer vision
CN112797897B (en) * 2019-04-15 2022-12-06 Oppo广东移动通信有限公司 Method and device for measuring geometric parameters of object and terminal
CN110208780B (en) * 2019-05-14 2021-10-19 北京华捷艾米科技有限公司 Method and device for measuring distance based on somatosensory camera and storage medium
CN110276317B (en) * 2019-06-26 2022-02-22 Oppo广东移动通信有限公司 Object size detection method, object size detection device and mobile terminal
CN110276774B (en) * 2019-06-26 2021-07-23 Oppo广东移动通信有限公司 Object drawing method, device, terminal and computer-readable storage medium
CN110926334B (en) * 2019-11-29 2022-02-22 深圳市商汤科技有限公司 Measuring method, measuring device, electronic device and storage medium
CN113646606A (en) * 2019-12-31 2021-11-12 深圳市大疆创新科技有限公司 Control method, control equipment, unmanned aerial vehicle and storage medium
CN111141217A (en) * 2020-04-03 2020-05-12 广东博智林机器人有限公司 Object measuring method, device, terminal equipment and computer storage medium
CN111680251A (en) * 2020-05-28 2020-09-18 平安普惠企业管理有限公司 Browser element measuring method and device, electronic equipment and storage medium
CN115046480B (en) * 2021-03-09 2023-11-10 华为技术有限公司 Method for measuring length, electronic equipment and mobile equipment
CN114166187A (en) * 2021-11-17 2022-03-11 深圳市宝尔爱迪科技有限公司 Mobile terminal-based quadratic element image measuring method and device
CN117516485B (en) * 2024-01-04 2024-03-22 东北大学 Pose vision measurement method for automatic guiding and mounting of aircraft engine

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105122000A (en) * 2013-04-05 2015-12-02 莱卡地球系统公开股份有限公司 Measuring device with function for calibrating a display image position of an electronic reticule
CN105222717A (en) * 2015-08-28 2016-01-06 宇龙计算机通信科技(深圳)有限公司 A kind of subject matter length measurement method and device
EP3176678A2 (en) * 2015-11-17 2017-06-07 Huawei Technologies Co., Ltd. Gesture-based object measurement method and apparatus

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9080855B2 (en) * 2011-09-23 2015-07-14 Mitutoyo Corporation Method utilizing image correlation to determine position measurements in a machine vision system
US9696897B2 (en) * 2011-10-19 2017-07-04 The Regents Of The University Of California Image-based measurement tools
CN105783731A (en) * 2016-03-08 2016-07-20 上海易景信息科技有限公司 Method for measuring length of measured object by means of double cameras
CN106123784B (en) * 2016-07-22 2019-03-15 广东小天才科技有限公司 method and device for measuring length
CN107238349A (en) * 2017-05-16 2017-10-10 广东小天才科技有限公司 Length measuring method and device

Also Published As

Publication number Publication date
CN108304119A (en) 2018-07-20

Similar Documents

Publication Publication Date Title
CN108304119B (en) Object measuring method, intelligent terminal and computer readable storage medium
CN111932664B (en) Image rendering method and device, electronic equipment and storage medium
US20040246229A1 (en) Information display system, information processing apparatus, pointing apparatus, and pointer cursor display method in information display system
CN114663618B (en) Three-dimensional reconstruction and correction method, device, equipment and storage medium
CN113240769B (en) Spatial link relation identification method and device and storage medium
CN108344401B (en) Positioning method, positioning device and computer readable storage medium
CN113066086A (en) Road disease detection method and device, electronic equipment and storage medium
CN113052919A (en) Calibration method and device of visual sensor, electronic equipment and storage medium
CN114640833B (en) Projection picture adjusting method, device, electronic equipment and storage medium
CN114494388B (en) Three-dimensional image reconstruction method, device, equipment and medium in large-view-field environment
CN114820814A (en) Camera pose calculation method, device, equipment and storage medium
KR102207725B1 (en) 3D survey system and survey method in multiple survey mode
CN114578329A (en) Multi-sensor joint calibration method, device, storage medium and program product
CN113610702B (en) Picture construction method and device, electronic equipment and storage medium
CN114004890B (en) Attitude determination method and apparatus, electronic device, and storage medium
CN114290338A (en) Two-dimensional hand-eye calibration method, device, storage medium, and program product
JP5152281B2 (en) Image processing apparatus, method, and program
CN112767479A (en) Position information detection method, device and system and computer readable storage medium
KR20220058846A (en) Robot positioning method and apparatus, apparatus, storage medium
CN111161350B (en) Position information and position relation determining method, position information acquiring device
CN113628284B (en) Pose calibration data set generation method, device and system, electronic equipment and medium
CN113066134A (en) Calibration method and device of visual sensor, electronic equipment and storage medium
JP2022138883A (en) Image creation method, control method, and information processing apparatus
WO2021134715A1 (en) Control method and device, unmanned aerial vehicle and storage medium
US20220345621A1 (en) Scene lock mode for capturing camera images

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant