CN110196422B - Laser ranging test method and device and mobile terminal - Google Patents


Info

Publication number
CN110196422B
CN110196422B (application CN201910562541.6A)
Authority
CN
China
Prior art keywords
distance
mobile terminal
cameras
acquiring
detected
Prior art date
Legal status
Expired - Fee Related
Application number
CN201910562541.6A
Other languages
Chinese (zh)
Other versions
CN110196422A (en)
Inventor
薛兵
Current Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority application: CN201910562541.6A
Publication of CN110196422A
Application granted
Publication of CN110196422B

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S11/00 Systems for determining distance or velocity not using reflection or reradiation
    • G01S11/12 Systems for determining distance or velocity not using reflection or reradiation using electromagnetic waves other than radio waves
    • G01S17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/02 Systems using the reflection of electromagnetic waves other than radio waves
    • G01S17/06 Systems determining position data of a target
    • G01S17/08 Systems determining position data of a target for measuring distance only
    • G01S7/00 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/497 Means for monitoring or calibrating

Abstract

The application relates to the technical field of laser ranging and provides a laser ranging test method, a test device, and a mobile terminal. The method comprises the following steps: acquiring a first distance between an object to be detected and a mobile terminal through a camera of the mobile terminal; calibrating a laser chip in the mobile terminal; acquiring a second distance between the object to be detected and the mobile terminal using the calibrated laser chip; and acquiring a test result of the ranging function of the laser chip according to the first distance and the second distance. The method and device simplify the operation of laser calibration and improve its precision, thereby improving the test precision of laser ranging.

Description

Laser ranging test method and device and mobile terminal
Technical Field
The application belongs to the technical field of laser ranging, and particularly relates to a test method and a test device for laser ranging and a mobile terminal.
Background
Mobile terminals such as mobile phones and tablet computers have become essential items in daily life. They provide functions such as instant messaging, photographing, and mobile payment, and can use laser ranging to achieve rapid focusing when photographing. At present, the laser calibration of a mobile terminal is generally performed with customized equipment: the terminal must be placed and fixed on the equipment, calibration takes a long time, the terminal is inconvenient to move, the false-detection rate is high, and the test precision of laser ranging suffers.
Disclosure of Invention
In view of this, the present application provides a laser ranging test method, a test device, and a mobile terminal, so as to simplify the operation of laser calibration and improve its precision, thereby improving the test precision of laser ranging.
The first aspect of the present application provides a test method for laser ranging, where the test method includes:
acquiring a first distance between an object to be detected and a mobile terminal through a camera of the mobile terminal;
calibrating a laser chip in the mobile terminal;
acquiring a second distance between the object to be detected and the mobile terminal using the calibrated laser chip;
and acquiring a test result of the distance measuring function of the laser chip according to the first distance and the second distance.
A second aspect of the present application provides a test apparatus for laser ranging, the test apparatus comprising:
the first distance acquisition module is used for acquiring a first distance between an object to be detected and the mobile terminal through a camera of the mobile terminal;
the calibration module is used for calibrating the laser chip in the mobile terminal;
the second distance acquisition module is used for acquiring a second distance between the object to be detected and the mobile terminal using the calibrated laser chip;
and the test result acquisition module is used for acquiring a test result of the distance measurement function of the laser chip according to the first distance and the second distance.
A third aspect of the present application provides a mobile terminal comprising a memory, a processor and a computer program stored in the memory and executable on the processor, the processor implementing the steps of the test method according to the first aspect when executing the computer program.
A fourth aspect of the present application provides a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, carries out the steps of the testing method as described in the first aspect above.
A fifth aspect of the application provides a computer program product comprising a computer program which, when executed by one or more processors, performs the steps of the testing method as described in the first aspect above.
As can be seen from the above, the scheme of the present application first acquires the first distance between the object to be measured and the mobile terminal through the camera of the mobile terminal, and calibrates the laser chip in the mobile terminal. It then measures with the calibrated laser chip to acquire the second distance between the object and the terminal, and tests the ranging function of the laser chip according to the first and second distances, thereby obtaining a test result. Because the mobile terminal calibrates the laser chip itself, no customized equipment is needed: laser calibration is simpler to operate and more precise, and the accuracy of laser ranging improves. Comparing the laser-measured distance with the camera-derived distance yields an accurate test result, improving the test precision of laser ranging.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings described below are only some embodiments of the present application; those skilled in the art can obtain other drawings from them without inventive effort.
Fig. 1 is a schematic flow chart illustrating an implementation process of a laser ranging testing method according to an embodiment of the present application;
fig. 2 is a schematic flow chart illustrating an implementation of a laser ranging testing method according to a second embodiment of the present application;
fig. 3 is a schematic flow chart illustrating an implementation of a laser ranging test method according to a third embodiment of the present application;
fig. 4 is a schematic view of a laser ranging testing apparatus according to a fourth embodiment of the present disclosure;
fig. 5 is a schematic diagram of a mobile terminal according to a fifth embodiment of the present application;
fig. 6 is a schematic diagram of a mobile terminal according to a sixth embodiment of the present application.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth, such as particular system structures, techniques, etc. in order to provide a thorough understanding of the embodiments of the present application. It will be apparent, however, to one skilled in the art that the present application may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present application with unnecessary detail.
It will be understood that the terms "comprises" and/or "comprising," when used in this specification and the appended claims, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
It is also to be understood that the terminology used in the description of the present application herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the application. As used in the specification of the present application and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise.
In particular implementations, the mobile terminals described in embodiments of the present application include, but are not limited to, other portable devices such as mobile phones, laptop computers, or tablet computers having touch sensitive surfaces (e.g., touch screen displays and/or touch pads). It should also be understood that in some embodiments, the device is not a portable communication device, but is a desktop computer having a touch-sensitive surface (e.g., a touch screen display and/or touchpad).
In the discussion that follows, a mobile terminal that includes a display and a touch-sensitive surface is described. However, it should be understood that the mobile terminal may include one or more other physical user interface devices such as a physical keyboard, mouse, and/or joystick.
The mobile terminal supports various applications, such as one or more of the following: a drawing application, a presentation application, a word processing application, a website creation application, a disc burning application, a spreadsheet application, a gaming application, a telephone application, a video conferencing application, an email application, an instant messaging application, an exercise support application, a photo management application, a digital camera application, a web browsing application, a digital music player application, and/or a digital video player application.
Various applications that may be executed on the mobile terminal may use at least one common physical user interface device, such as a touch-sensitive surface. One or more functions of the touch-sensitive surface and corresponding information displayed on the terminal can be adjusted and/or changed between applications and/or within respective applications. In this way, a common physical architecture (e.g., touch-sensitive surface) of the terminal can support various applications with user interfaces that are intuitive and transparent to the user.
It should be understood that, the sequence numbers of the steps in this embodiment do not mean the execution sequence, and the execution sequence of each process should be determined by the function and the inherent logic of the process, and should not constitute any limitation to the implementation process of the embodiment of the present application.
In order to explain the technical solution described in the present application, the following description will be given by way of specific examples.
Referring to fig. 1, which is a schematic view of an implementation flow of a laser ranging testing method provided in an embodiment of the present application, as shown in the figure, the laser ranging testing method may include the following steps:
step S101, a first distance between an object to be measured and a mobile terminal is obtained through a camera of the mobile terminal.
In the embodiment of the present application, the cameras of the mobile terminal include, but are not limited to, a main lens, a 2M lens, and a telephoto lens; together, the three cameras can perform ranging to locate the distance between the object to be measured and the mobile terminal. The first distance is the distance between the object to be measured and the mobile terminal obtained through the cameras: for example, when a person 1 meter away from the phone is photographed, the distance between the person and the phone is 1 meter. The object to be measured may refer to the object to be photographed, and a camera of the mobile terminal may refer to an optical component used to form an image in the terminal. The main lens may use a 48-megapixel "586" sensor; it has a prominent lens structure, is suitable for general shooting, and offers high definition. The 2M lens may use a 2-megapixel OV sensor; it has a small structure and is suitable for macro use. The telephoto lens is a lens whose focal length is longer than that of the standard lens, and can be divided into a normal telephoto lens, whose focal length is close to that of the standard lens, and an ultra-telephoto lens, whose focal length is far greater.
And step S102, calibrating the laser chip in the mobile terminal.
In the embodiment of the application, before the laser chip in the mobile terminal is used for ranging, the mobile terminal can calibrate the chip with an embedded calibration algorithm, without relying on customized equipment. Calibration is thus simple to operate, takes less time, is mobile, and is suitable for after-sales calibration; its precision is improved, and the accuracy of laser-chip ranging improves accordingly. The laser chip in the mobile terminal can measure the distance between the terminal and the object to be measured.
Step S103, obtaining a second distance between the object to be measured and the mobile terminal using the calibrated laser chip.
In this embodiment of the application, after the laser chip is calibrated, ranging is performed with the calibrated chip to acquire the second distance between the object to be measured and the mobile terminal. This improves the accuracy of the acquired second distance and, in turn, the test precision of the laser chip's ranging function.
And step S104, obtaining a test result of the distance measurement function of the laser chip according to the first distance and the second distance.
In this embodiment of the application, the first distance and the second distance can be compared. If the deviation between the second distance and the first distance is small, the laser chip's ranging can be determined to be accurate; if the deviation is large, the ranging can be determined to be inaccurate, with a large error.
Optionally, the obtaining a test result of the distance measurement function of the laser chip according to the first distance and the second distance includes:
calculating a relative percentage deviation P of the first distance X1 and the second distance X2:

P = |X1 − X2| / ((X1 + X2) / 2) × 100%
if P is less than or equal to a preset threshold, determining that the ranging function of the laser chip meets the preset measurement deviation requirement;
and if P is greater than the preset threshold, determining that the ranging function of the laser chip does not meet the preset measurement deviation requirement.
In the embodiment of the application, the relative percentage deviation between the first distance and the second distance can be calculated and compared with the preset threshold to determine whether the deviation between the two distances meets the preset measurement deviation requirement, and hence whether the laser chip's ranging is accurate. The preset threshold is a preset standard value for judging whether this relative percentage deviation meets the requirement for accurate ranging; the user may set it according to actual needs, and it is not limited here.
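The pass/fail comparison above can be sketched as follows. Since the original formula image is not reproduced here, the standard relative-percent-deviation form is assumed, and the 5% default is an illustrative placeholder for the preset threshold:

```python
def range_test_result(x1: float, x2: float, threshold_pct: float = 5.0) -> bool:
    """Pass/fail test of the laser chip's ranging function.

    x1: camera-measured first distance; x2: laser-measured second distance.
    Returns True when the relative percentage deviation P is within the
    preset threshold (both the formula and the 5% default are assumptions,
    not specifics taken from the patent text).
    """
    # P = |X1 - X2| / ((X1 + X2) / 2) * 100%
    p = abs(x1 - x2) / ((x1 + x2) / 2.0) * 100.0
    return p <= threshold_pct
```

For example, a laser reading of 1.02 m against a camera reading of 1.00 m deviates by about 2% and passes a 5% threshold, while 1.5 m against 1.0 m does not.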
The embodiment of the application enables the mobile terminal to calibrate the laser chip itself, without customized equipment, which simplifies laser calibration and improves its precision, thereby improving the accuracy of laser ranging. By comparing the laser-measured distance with the camera-derived distance, the test result of laser ranging can be accurately obtained, improving the test precision of laser ranging.
Referring to fig. 2, it is a schematic view of an implementation flow of a laser ranging test method provided in the second embodiment of the present application, and as shown in the figure, the test method may include the following steps:
step S201, a third distance between the object to be measured and the mobile terminal is obtained.
In the embodiment of the application, the third distance between the object to be measured and the mobile terminal is first estimated. According to this estimated third distance, the three cameras of the mobile terminal can be divided into a main camera and auxiliary cameras, so that the strengths of each camera are fully used: the main camera is selected from the three according to the third distance and selects the focus point, while the other two cameras assist in positioning.
Optionally, the obtaining a third distance between the object to be measured and the mobile terminal includes:
controlling the three cameras to simultaneously acquire images of the object to be detected;
selecting a focusing point from the focusing area of the object to be detected;
and acquiring the distance between the focusing point and the mobile terminal according to the images acquired by the three cameras, and taking the distance between the focusing point and the mobile terminal as a third distance between the object to be detected and the mobile terminal.
In the embodiment of the application, a focus point can be selected from the focusing area of the object to be measured to estimate the third distance between the object and the mobile terminal. The focusing area is the area used for focusing on the object; the image it forms in each of the three cameras generally covers many pixel points, while one focus point corresponds to a single pixel point in each camera. Optionally, the focus point selected from the focusing area may be any point in the area, or the point at its center, which is not limited here.
Optionally, the obtaining the distance between the focusing point and the mobile terminal according to the images collected by the three cameras includes:
acquiring the positions of the images formed by the focus point in each of the three cameras;
acquiring the distances between the focus point and the three cameras according to those image positions;
acquiring the three-dimensional coordinate of the focus point in a world coordinate system according to its distances to the three cameras;
acquiring the three-dimensional coordinate, in the world coordinate system, of the center point of the triangular area formed by the three cameras;
and acquiring the distance between the focus point and the center point from their three-dimensional coordinates in the world coordinate system, and taking this distance as the distance between the focus point and the mobile terminal.
In the embodiment of the application, according to the camera imaging principle, the positions of the images formed by one focus point on the object to be measured in the three cameras can be obtained, and from these image positions the distances between the focus point and each of the three cameras can be derived. Because the positions of the three cameras in the mobile terminal are fixed, their three-dimensional coordinates in the world coordinate system are also fixed once the terminal is fixed. The three-dimensional coordinate of the focus point in the world coordinate system can then be calculated from the cameras' coordinates and the focus point's distances to them. In addition, the three-dimensional coordinate of the center point of the triangular area formed by the three cameras can be calculated from the cameras' coordinates. Finally, the distance between the focus point and the center point is calculated from their world coordinates; this distance is taken as the distance between the focus point and the mobile terminal.
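Two pieces of the procedure above can be sketched briefly. `depth_from_disparity` uses the classic two-view relation Z = f·B/d (focal length in pixels × baseline ÷ disparity) as a stand-in for deriving a focus point's distance from image positions, and `distance_to_center` computes the distance to the centroid of the camera triangle; the function names and sample values are illustrative, not from the patent:

```python
import math

def depth_from_disparity(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Distance of a focus point from one camera pair, via Z = f * B / d."""
    return focal_px * baseline_m / disparity_px

def distance_to_center(point: tuple, cameras: list) -> float:
    """Distance from a focus point's 3-D world coordinate to the centroid
    of the triangle formed by the three camera positions."""
    centroid = tuple(sum(c[i] for c in cameras) / 3.0 for i in range(3))
    return math.dist(point, centroid)
```

With a 1000 px focal length, a 2 cm baseline, and a 20 px disparity, the sketch yields a 1 m depth; the centroid distance then stands in for "distance to the mobile terminal".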
And S202, selecting a target camera from the three cameras according to a third distance between the object to be detected and the mobile terminal.
Optionally, the selecting a target camera from the three cameras according to a third distance between the object to be measured and the mobile terminal includes:
and acquiring a distance range of a third distance between the object to be detected and the mobile terminal, and taking a camera corresponding to the distance range in the three cameras as the target camera.
In the embodiment of the present application, the correspondence between different distance ranges and different target cameras may be preset, for example, when the third distance is smaller than the first distance threshold, a 2M lens in the three cameras is used as a target camera, when the third distance is greater than or equal to the first distance threshold and is smaller than or equal to the second distance threshold, a main lens in the three cameras is used as a target camera, and when the third distance is greater than the second distance threshold, a telephoto lens in the three cameras is used as a target camera. Optionally, the first distance threshold and the second distance threshold may be set according to actual needs, for example, the first distance threshold is 2 centimeters, and the second distance threshold is 3 meters, which is not limited herein.
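The distance-range-to-camera mapping above can be sketched as follows, using the example thresholds from the text (2 cm and 3 m); the function and label names are hypothetical:

```python
def select_target_camera(third_distance_m: float,
                         near_thresh_m: float = 0.02,
                         far_thresh_m: float = 3.0) -> str:
    """Pick the target camera from the estimated third distance, per the
    correspondence described in the text (thresholds are the example
    values given there: 2 cm and 3 m)."""
    if third_distance_m < near_thresh_m:
        return "2M"         # macro lens for very close objects
    if third_distance_m <= far_thresh_m:
        return "main"       # high-definition main lens for mid range
    return "telephoto"      # long focal length for distant objects
```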
And step S203, controlling the three cameras to simultaneously acquire images of the object to be detected.
And step S204, selecting a plurality of focusing points from the focusing area of the object to be detected according to the image acquired by the target camera.
In the embodiment of the application, the focal length of the 2M lens is small, the 2M lens is suitable for close-range shooting, when the third distance between the object to be detected and the mobile terminal is smaller than the first distance threshold, it is indicated that the object to be detected is closer to the mobile terminal, a plurality of focusing points can be selected from a focusing area of the object to be detected according to an image collected by the 2M lens, and the main shooting lens and the telephoto lens are positioned in an auxiliary manner according to the selected plurality of focusing points, so that the advantage of the 2M macro is fully exerted. When the third distance between the object to be detected and the mobile terminal is greater than or equal to the first distance threshold and less than or equal to the second distance threshold, a plurality of focusing points are selected from a focusing area of the object to be detected according to an image collected by the main shooting lens, and the 2M lens and the telephoto lens are positioned in an auxiliary mode according to the selected focusing points, so that the advantage of high definition of the main shooting lens can be fully exerted. When the third distance between the object to be detected and the mobile terminal is greater than the second distance threshold, the object to be detected is far away from the mobile terminal, a plurality of focusing points are selected from a focusing area of the object to be detected according to an image collected by the telephoto lens, and the 2M lens and the main camera lens perform auxiliary positioning according to the selected plurality of focusing points, so that the advantage of telephoto distance can be fully played.
Step S205, obtaining a distance between each focusing point of the plurality of focusing points and the mobile terminal according to the images collected by the three cameras.
In this embodiment of the application, referring to the method for obtaining the distance between the one focusing point and the mobile terminal according to the images collected by the three cameras in step S201, the distance between each focusing point in the plurality of focusing points and the mobile terminal may be obtained.
Step S206, acquiring a first distance between the object to be measured and the mobile terminal according to the distance between each focusing point in the focusing points and the mobile terminal.
In this embodiment of the application, the distances between the plurality of focus points and the mobile terminal may be averaged, with the mean taken as the first distance between the object to be measured and the mobile terminal; alternatively, the root mean square of those distances may be calculated and taken as the first distance.
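Both aggregation options described above can be sketched as (function name illustrative):

```python
import math

def fuse_focus_distances(distances: list, method: str = "mean") -> float:
    """Combine per-focus-point distances into the first distance.
    'mean' averages them; 'rms' takes the root mean square, matching the
    two alternatives offered in the text."""
    if method == "mean":
        return sum(distances) / len(distances)
    if method == "rms":
        return math.sqrt(sum(d * d for d in distances) / len(distances))
    raise ValueError(f"unknown method: {method}")
```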
And step S207, calibrating the laser chip in the mobile terminal.
The step is the same as step S102, and reference may be made to the related description of step S102, which is not repeated herein.
Step S208, acquiring a second distance between the object to be detected and the mobile terminal using the calibrated laser chip.
The step is the same as step S103, and reference may be made to the related description of step S103, which is not described herein again.
Step S209, obtaining a test result of the distance measurement function of the laser chip according to the first distance and the second distance.
The step is the same as step S104, and reference may be made to the related description of step S104, which is not repeated herein.
In this embodiment, the distance between the object to be measured and the mobile terminal is first coarsely positioned, and then re-measured according to that preliminary distance. The strengths of the three cameras in the mobile terminal are thus fully used in calculating the distance, improving the precision of triple-camera ranging and hence the test precision of the laser chip's ranging function.
Referring to fig. 3, it is a schematic view of an implementation flow of a laser ranging test method provided in the third embodiment of the present application, and as shown in the figure, the test method may include the following steps:
step S301, a first distance between an object to be measured and the mobile terminal is obtained through a camera of the mobile terminal.
The step is the same as step S101, and reference may be made to the related description of step S101, which is not repeated herein.
And step S302, carrying out SPAD calibration on the laser chip of the mobile terminal.
The SPAD calibration may refer to calibration of the emitter and receiver of the laser chip. For example, the emitter is controlled to emit a preset number of photons and the number of photons received by the receiver is obtained; if the received count is greater than N times the preset number, the laser chip is determined to be normal, and if it is less than or equal to N times the preset number, the chip is determined to be abnormal. N is a number less than or equal to 1, for example 0.9.
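The pass/fail criterion above can be sketched in a few lines (the function name is illustrative):

```python
def spad_check(emitted: int, received: int, n: float = 0.9) -> bool:
    """SPAD calibration check: the chip is judged normal when the receiver
    counts more than n times the emitted photon count (n <= 1, e.g. 0.9),
    per the criterion described in the text."""
    return received > n * emitted
```

With n = 0.9, receiving 950 of 1000 emitted photons passes, while receiving exactly 900 (equal to n times the preset number) fails.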
Step S303, the laser chip of the mobile terminal is calibrated by Xtalk.
Here, Xtalk may refer to crosstalk. The Xtalk calibration may refer to acquiring an Xtalk value according to internal parameters of the mobile phone and the test environment (the Xtalk value being an important metric for the laser), and determining whether the Xtalk value falls within a preset calibration range. If it does, the influence of crosstalk on the distance measurement of the laser chip is determined to be small; if it does not, the crosstalk is determined to be large, and the laser chip will have a large ranging error. It should be noted that a correspondence between the first distance between the object to be measured and the mobile terminal and the preset calibration range may be set in advance, with different first distances corresponding to different preset calibration ranges. After the first distance is obtained via the cameras of the mobile terminal, the mobile terminal may adaptively adjust the preset calibration range according to the first distance, improving the accuracy of the Xtalk calibration and hence of the laser ranging.
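The distance-dependent calibration range described above can be sketched as a lookup. The band boundaries and Xtalk limits below are invented for illustration only; the patent does not disclose concrete values:

```python
# Hypothetical (first-distance band) -> (allowed Xtalk range) table; bands in
# metres, Xtalk limits in arbitrary units. Both are assumed example values.
XTALK_RANGES = [
    ((0.0, 0.5), (0.0, 2.0)),           # near objects: wider range allowed
    ((0.5, 2.0), (0.0, 1.0)),
    ((2.0, float("inf")), (0.0, 0.5)),  # far objects: stricter range
]

def xtalk_within_range(xtalk_value, first_distance):
    """Pick the preset calibration range according to the first distance,
    then test whether the measured Xtalk value falls inside it."""
    for (lo, hi), (xlo, xhi) in XTALK_RANGES:
        if lo <= first_distance < hi:
            return xlo <= xtalk_value <= xhi
    return False
```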
Step S304, calibrating the offset of the laser chip of the mobile terminal.
The offset calibration may refer to calibration of a compensation function, so as to improve the precision of the laser ranging. The compensation function is Y = AX² + 0.9X + B, where X is the distance between the object to be measured and the mobile terminal as measured by the calibrated laser chip, A and B are compensation adjustment parameters, and Y is the compensated distance between the object to be measured and the mobile terminal. Theoretically, Y is infinitely close to the known theoretical value, that is, the accurate distance between the object to be measured and the mobile terminal. After calibration of the laser chip, the distance X between the object to be measured and the mobile terminal is measured twice, and the two measured values of X are substituted into the compensation function, so that the compensation adjustment parameters A and B can be solved. The compensation function can thus be adapted to the distance between the object to be measured and the mobile terminal, improving the accuracy of the offset calibration.
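Since the compensation function Y = AX² + 0.9X + B is linear in A and B, the two test measurements suffice to solve for them. A minimal sketch (the function name and the sample values are illustrative, not from the patent):

```python
def solve_compensation(x1, y1, x2, y2):
    """Solve Y = A*X**2 + 0.9*X + B for A and B from two (X, Y) pairs, where
    X is the raw calibrated-chip reading and Y the known true distance.
    Requires x1**2 != x2**2, i.e. two genuinely different test distances."""
    r1 = y1 - 0.9 * x1  # A*x1**2 + B
    r2 = y2 - 0.9 * x2  # A*x2**2 + B
    a = (r1 - r2) / (x1 ** 2 - x2 ** 2)
    b = r1 - a * x1 ** 2
    return a, b
```

For example, raw readings X = 1.0 and X = 2.0 against true distances 0.96 and 1.89 recover A = 0.01 and B = 0.05.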
Step S305, obtaining a second distance between the object to be measured and the mobile terminal through the calibrated laser chip.
The step is the same as step S103, and reference may be made to the related description of step S103, which is not described herein again.
And S306, acquiring a test result of the distance measurement function of the laser chip according to the first distance and the second distance.
The step is the same as step S104, and reference may be made to the related description of step S104, which is not repeated herein.
By calibrating the laser chip with the calibration algorithm embedded in the mobile terminal, the embodiment of the present application needs no customized equipment: the calibration operation is simpler, the calibration time is shortened, and the method is portable and suitable for after-sale calibration, improving the calibration precision and hence the accuracy of the laser chip's distance measurement.
Referring to fig. 4, the schematic diagram of a laser ranging testing apparatus provided in the fourth embodiment of the present application is shown, and for convenience of description, only the relevant portions of the embodiment of the present application are shown.
The test device includes:
a first distance obtaining module 41, configured to obtain, through a camera of a mobile terminal, a first distance between an object to be measured and the mobile terminal;
a calibration module 42, configured to calibrate a laser chip in the mobile terminal;
a second distance obtaining module 43, configured to obtain a second distance between the object to be detected and the mobile terminal through the calibrated laser chip;
and the test result acquisition module 44 is configured to acquire a test result of the distance measurement function of the laser chip according to the first distance and the second distance.
Optionally, the mobile terminal includes three cameras, and the first distance obtaining module 41 includes:
the first acquisition unit is used for acquiring a third distance between the object to be detected and the mobile terminal;
the camera selection unit is used for selecting a target camera from the three cameras according to a third distance between the object to be detected and the mobile terminal;
the acquisition control unit is used for controlling the three cameras to simultaneously acquire images of the object to be detected;
the focusing point selection unit is used for selecting a plurality of focusing points from the focusing area of the object to be detected according to the image acquired by the target camera;
the second acquisition unit is used for acquiring the distance between each focusing point in the focusing points and the mobile terminal according to the images acquired by the three cameras;
and the third acquisition unit is used for acquiring the first distance between the object to be detected and the mobile terminal according to the distance between each focusing point in the focusing points and the mobile terminal.
Optionally, the camera selection unit is specifically configured to:
and acquiring a distance range of a third distance between the object to be detected and the mobile terminal, and taking a camera corresponding to the distance range in the three cameras as the target camera.
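The range-to-camera lookup performed by this unit can be sketched as follows; the three bands and camera names are assumptions for illustration (a typical ultra-wide/main/telephoto module layout), as the patent does not fix concrete thresholds:

```python
# Hypothetical distance bands (in metres) mapped to the three cameras.
CAMERA_BANDS = {
    "ultra_wide": (0.0, 0.3),
    "main": (0.3, 3.0),
    "telephoto": (3.0, float("inf")),
}

def select_target_camera(third_distance):
    """Return the camera whose preset distance range contains the third distance."""
    for name, (lo, hi) in CAMERA_BANDS.items():
        if lo <= third_distance < hi:
            return name
    raise ValueError("third_distance must be non-negative")
```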
Optionally, the first obtaining unit is specifically configured to:
controlling the three cameras to simultaneously acquire images of the object to be detected;
selecting a focusing point from the focusing area of the object to be detected;
and acquiring the distance between the focusing point and the mobile terminal according to the images acquired by the three cameras, and taking the distance between the focusing point and the mobile terminal as a third distance between the object to be detected and the mobile terminal.
Optionally, the first obtaining unit is specifically configured to:
acquiring the positions of the images formed by the one focusing point in the three cameras respectively;
acquiring the distances between the one focus point and the three cameras according to the positions of the images formed by the one focus point in the three cameras respectively;
acquiring a three-dimensional coordinate of the focusing point in a world coordinate system according to the distances between the focusing point and the three cameras respectively;
acquiring three-dimensional coordinates of the central point of a triangular area formed by the three cameras in the world coordinate system;
and according to the three-dimensional coordinate of the focusing point in the world coordinate system and the three-dimensional coordinate of the central point in the world coordinate system, acquiring the distance between the focusing point and the central point, and taking the distance as the distance between the focusing point and the mobile terminal.
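The final two steps above — finding the center point of the camera triangle and the point-to-center distance — reduce to elementary 3-D geometry. A sketch, assuming the world coordinates of the focus point and the three cameras are already known (function names are illustrative):

```python
import math

def triangle_center(c1, c2, c3):
    """Centroid of the triangular area formed by the three cameras."""
    return tuple((a + b + c) / 3.0 for a, b, c in zip(c1, c2, c3))

def focus_to_terminal_distance(focus_point, c1, c2, c3):
    """Euclidean distance between the focus point's world coordinate and the
    center point, taken as the focus-point-to-mobile-terminal distance."""
    center = triangle_center(c1, c2, c3)
    return math.sqrt(sum((f - c) ** 2 for f, c in zip(focus_point, center)))
```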
Optionally, the calibration module 42 is specifically configured to:
and sequentially carrying out SPAD calibration, Xtalk calibration and offset calibration on the laser chip of the mobile terminal.
Optionally, the test result obtaining module 44 includes:
a calculation unit, configured to calculate, according to the first distance X1 and the second distance X2, the relative percentage deviation P = |X1 - X2| / ((X1 + X2) / 2) × 100%;
The first determining unit is used for determining that the distance measuring function of the laser chip meets the requirement of preset measuring deviation if P is smaller than or equal to a preset threshold value;
and the second determining unit is used for determining that the distance measuring function of the laser chip does not meet the requirement of preset measuring deviation if P is larger than the preset threshold.
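The decision made by the calculation and determining units can be sketched as one function. The relative-percentage-deviation formula is the standard one implied by the text, and the 5% threshold is an assumed example value:

```python
def ranging_test_result(x1, x2, threshold_percent=5.0):
    """Relative percentage deviation P between the camera distance x1 and the
    laser-chip distance x2; the ranging function passes if P <= threshold."""
    p = abs(x1 - x2) / ((x1 + x2) / 2.0) * 100.0
    return p, p <= threshold_percent
```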
The test apparatus provided in the embodiment of the present application can be applied to the first, second, and third embodiments of the foregoing methods, and for details, reference is made to the description of the first, second, and third embodiments of the foregoing methods, and details are not repeated here.
Fig. 5 is a schematic diagram of a mobile terminal according to a fifth embodiment of the present application. The mobile terminal as shown in the figure may include: one or more processors 501 (only one shown); one or more input devices 502 (only one shown), one or more output devices 503 (only one shown), and a memory 504. The processor 501, the input device 502, the output device 503, and the memory 504 are connected by a bus 505. The memory 504 is used for storing instructions and the processor 501 is used for executing the instructions stored by the memory 504. Wherein:
the processor 501 is configured to obtain a first distance between an object to be measured and the mobile terminal through a camera of the mobile terminal; calibrate a laser chip in the mobile terminal; obtain a second distance between the object to be measured and the mobile terminal through the calibrated laser chip; and obtain a test result of the distance measurement function of the laser chip according to the first distance and the second distance.
Optionally, the mobile terminal includes three cameras, and the processor 501 is specifically configured to:
acquiring a third distance between the object to be detected and the mobile terminal;
selecting a target camera from the three cameras according to a third distance between the object to be detected and the mobile terminal;
controlling the three cameras to simultaneously acquire images of the object to be detected;
selecting a plurality of focusing points from a focusing area of the object to be detected according to the image acquired by the target camera;
acquiring the distance between each focusing point in the plurality of focusing points and the mobile terminal according to the images acquired by the three cameras;
and acquiring a first distance between the object to be detected and the mobile terminal according to the distance between each focusing point in the focusing points and the mobile terminal.
Optionally, the processor 501 is specifically configured to:
and acquiring a distance range of a third distance between the object to be detected and the mobile terminal, and taking a camera corresponding to the distance range in the three cameras as the target camera.
Optionally, the processor 501 is specifically configured to:
controlling the three cameras to simultaneously acquire images of the object to be detected;
selecting a focusing point from the focusing area of the object to be detected;
and acquiring the distance between the focusing point and the mobile terminal according to the images acquired by the three cameras, and taking the distance between the focusing point and the mobile terminal as a third distance between the object to be detected and the mobile terminal.
Optionally, the processor 501 is specifically configured to:
acquiring the positions of the images formed by the one focusing point in the three cameras respectively;
acquiring the distances between the one focus point and the three cameras according to the positions of the images formed by the one focus point in the three cameras respectively;
acquiring a three-dimensional coordinate of the focusing point in a world coordinate system according to the distances between the focusing point and the three cameras respectively;
acquiring three-dimensional coordinates of the central point of a triangular area formed by the three cameras in the world coordinate system;
and according to the three-dimensional coordinate of the focusing point in the world coordinate system and the three-dimensional coordinate of the central point in the world coordinate system, acquiring the distance between the focusing point and the central point, and taking the distance as the distance between the focusing point and the mobile terminal.
Optionally, the processor 501 is specifically configured to:
and sequentially carrying out SPAD calibration, Xtalk calibration and offset calibration on the laser chip of the mobile terminal.
Optionally, the processor 501 is specifically configured to:
calculating, according to the first distance X1 and the second distance X2, the relative percentage deviation P = |X1 - X2| / ((X1 + X2) / 2) × 100%;
If P is smaller than or equal to a preset threshold value, determining that the distance measurement function of the laser chip meets the preset measurement deviation requirement;
and if the P is larger than the preset threshold value, determining that the distance measurement function of the laser chip does not meet the preset measurement deviation requirement.
It should be understood that, in the embodiment of the present Application, the Processor 501 may be a Central Processing Unit (CPU), and the Processor may also be other general-purpose processors, Digital Signal Processors (DSPs), Application Specific Integrated Circuits (ASICs), Field Programmable Gate Arrays (FPGAs) or other Programmable logic devices, discrete Gate or transistor logic devices, discrete hardware components, and the like. A general purpose processor may be a microprocessor or the processor may be any conventional processor or the like.
The input device 502 may include a touch pad, a fingerprint sensor (for collecting fingerprint information of a user and direction information of the fingerprint), a microphone, a data receiving interface, and the like. The output device 503 may include a display (LCD, etc.), a speaker, a data transmission interface, and the like.
The memory 504 may include a read-only memory and a random access memory, and provides instructions and data to the processor 501. A portion of the memory 504 may also include non-volatile random access memory. For example, the memory 504 may also store device type information.
In a specific implementation, the processor 501, the input device 502, the output device 503, and the memory 504 described in this embodiment of the present application may execute the implementation described in the embodiment of the test method provided in this embodiment of the present application, and may also execute the implementation described in the test apparatus described in the fourth embodiment of the present application, which is not described herein again.
Fig. 6 is a schematic diagram of a mobile terminal according to a sixth embodiment of the present application. As shown in fig. 6, the mobile terminal 6 of this embodiment includes: a processor 60, a memory 61 and a computer program 62 stored in said memory 61 and executable on said processor 60. The steps in the various test method embodiments described above are implemented when the computer program 62 is executed by the processor 60. Alternatively, the processor 60 implements the functions of the modules/units in the above-described apparatus embodiments when executing the computer program 62.
Illustratively, the computer program 62 may be partitioned into one or more modules/units that are stored in the memory 61 and executed by the processor 60 to accomplish the present application. The one or more modules/units may be a series of computer program instruction segments capable of performing specific functions, which are used to describe the execution of the computer program 62 in the mobile terminal 6. For example, the computer program 62 may be divided into a first distance obtaining module, a calibration module, a second distance obtaining module, and a test result obtaining module, where the specific functions of the modules are as follows:
the first distance acquisition module is used for acquiring a first distance between an object to be detected and the mobile terminal through a camera of the mobile terminal;
the calibration module is used for calibrating the laser chip in the mobile terminal;
the second distance acquisition module is used for acquiring a second distance between the object to be detected and the mobile terminal through the calibrated laser chip;
and the test result acquisition module is used for acquiring a test result of the distance measurement function of the laser chip according to the first distance and the second distance.
Optionally, the mobile terminal includes three cameras, and the first distance obtaining module includes:
the first acquisition unit is used for acquiring a third distance between the object to be detected and the mobile terminal;
the camera selection unit is used for selecting a target camera from the three cameras according to a third distance between the object to be detected and the mobile terminal;
the acquisition control unit is used for controlling the three cameras to simultaneously acquire images of the object to be detected;
the focusing point selection unit is used for selecting a plurality of focusing points from the focusing area of the object to be detected according to the image acquired by the target camera;
the second acquisition unit is used for acquiring the distance between each focusing point in the focusing points and the mobile terminal according to the images acquired by the three cameras;
and the third acquisition unit is used for acquiring the first distance between the object to be detected and the mobile terminal according to the distance between each focusing point in the focusing points and the mobile terminal.
Optionally, the camera selection unit is specifically configured to:
and acquiring a distance range of a third distance between the object to be detected and the mobile terminal, and taking a camera corresponding to the distance range in the three cameras as the target camera.
Optionally, the first obtaining unit is specifically configured to:
controlling the three cameras to simultaneously acquire images of the object to be detected;
selecting a focusing point from the focusing area of the object to be detected;
and acquiring the distance between the focusing point and the mobile terminal according to the images acquired by the three cameras, and taking the distance between the focusing point and the mobile terminal as a third distance between the object to be detected and the mobile terminal.
Optionally, the first obtaining unit is specifically configured to:
acquiring the positions of the images formed by the one focusing point in the three cameras respectively;
acquiring the distances between the one focus point and the three cameras according to the positions of the images formed by the one focus point in the three cameras respectively;
acquiring a three-dimensional coordinate of the focusing point in a world coordinate system according to the distances between the focusing point and the three cameras respectively;
acquiring three-dimensional coordinates of the central point of a triangular area formed by the three cameras in the world coordinate system;
and according to the three-dimensional coordinate of the focusing point in the world coordinate system and the three-dimensional coordinate of the central point in the world coordinate system, acquiring the distance between the focusing point and the central point, and taking the distance as the distance between the focusing point and the mobile terminal.
Optionally, the calibration module is specifically configured to:
and sequentially carrying out SPAD calibration, Xtalk calibration and offset calibration on the laser chip of the mobile terminal.
Optionally, the test result obtaining module includes:
a calculation unit, configured to calculate, according to the first distance X1 and the second distance X2, the relative percentage deviation P = |X1 - X2| / ((X1 + X2) / 2) × 100%;
The first determining unit is used for determining that the distance measuring function of the laser chip meets the requirement of preset measuring deviation if P is smaller than or equal to a preset threshold value;
and the second determining unit is used for determining that the distance measuring function of the laser chip does not meet the requirement of preset measuring deviation if P is larger than the preset threshold.
The mobile terminal 6 may be a desktop computer, a notebook, a palm computer, a cloud server, or other computing devices. The mobile terminal may include, but is not limited to, a processor 60, a memory 61. Those skilled in the art will appreciate that fig. 6 is merely an example of a mobile terminal 6 and is not intended to limit the mobile terminal 6 and may include more or fewer components than those shown, or some components may be combined, or different components, e.g., the mobile terminal may also include input-output devices, network access devices, buses, etc.
The processor 60 may be a central processing unit CPU, but may also be other general purpose processors, digital signal processors DSP, application specific integrated circuits ASIC, off-the-shelf programmable gate arrays FPGA or other programmable logic devices, discrete gate or transistor logic devices, discrete hardware components, etc. A general purpose processor may be a microprocessor or the processor may be any conventional processor or the like.
The memory 61 may be an internal storage unit of the mobile terminal 6, such as a hard disk or a memory of the mobile terminal 6. The memory 61 may also be an external storage device of the mobile terminal 6, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) Card, or a Flash Card provided on the mobile terminal 6. Further, the memory 61 may include both an internal storage unit and an external storage device of the mobile terminal 6. The memory 61 is used for storing the computer program and the other programs and data required by the mobile terminal, and may also be used to temporarily store data that has been output or is to be output.

It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above division of the functional units and modules is illustrated; in practical applications, the above functions may be distributed to different functional units and modules as needed, that is, the internal structure of the apparatus may be divided into different functional units or modules to perform all or part of the above functions. The functional units and modules in the embodiments may be integrated into one processing unit, or each unit may exist alone physically, or two or more units may be integrated into one unit; the integrated unit may be implemented in the form of hardware or in the form of a software functional unit. In addition, the specific names of the functional units and modules are only for convenience of distinguishing them from each other and are not used to limit the protection scope of the present application. For the specific working processes of the units and modules in the system, reference may be made to the corresponding processes in the foregoing method embodiments, which are not described herein again.
In the above embodiments, the descriptions of the respective embodiments have respective emphasis, and reference may be made to the related descriptions of other embodiments for parts that are not described or illustrated in a certain embodiment.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus/mobile terminal and method may be implemented in other ways. For example, the above-described apparatus/mobile terminal embodiments are merely illustrative, and for example, the division of the modules or units is only one logical division, and there may be other divisions when actually implemented, for example, multiple units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated modules/units, if implemented in the form of software functional units and sold or used as separate products, may be stored in a computer readable storage medium. Based on such understanding, all or part of the flow in the method of the embodiments described above can be realized by a computer program, which can be stored in a computer-readable storage medium and can realize the steps of the embodiments of the methods described above when the computer program is executed by a processor. Wherein the computer program comprises computer program code, which may be in the form of source code, object code, an executable file or some intermediate form, etc. The computer-readable medium may include: any entity or device capable of carrying the computer program code, recording medium, usb disk, removable hard disk, magnetic disk, optical disk, computer Memory, Read-Only Memory (ROM), Random Access Memory (RAM), electrical carrier wave signals, telecommunications signals, software distribution medium, and the like. It should be noted that the computer readable medium may contain content that is subject to appropriate increase or decrease as required by legislation and patent practice in jurisdictions, for example, in some jurisdictions, computer readable media does not include electrical carrier signals and telecommunications signals as is required by legislation and patent practice.
The above-mentioned embodiments are only used for illustrating the technical solutions of the present application, and not for limiting the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; such modifications and substitutions do not substantially depart from the spirit and scope of the embodiments of the present application and are intended to be included within the scope of the present application.

Claims (9)

1. A test method of laser ranging is characterized by comprising the following steps:
acquiring a first distance between an object to be detected and a mobile terminal through a camera of the mobile terminal;
calibrating a laser chip in the mobile terminal;
acquiring a second distance between the object to be detected and the mobile terminal through the calibrated laser chip;
obtaining a test result of the distance measuring function of the laser chip according to the first distance and the second distance;
the mobile terminal comprises three cameras, and the acquiring of the first distance between the object to be detected and the mobile terminal through the cameras of the mobile terminal comprises:
acquiring a third distance between the object to be detected and the mobile terminal;
selecting a target camera from the three cameras according to a third distance between the object to be detected and the mobile terminal;
controlling the three cameras to simultaneously acquire images of the object to be detected;
selecting a plurality of focusing points from a focusing area of the object to be detected according to the image acquired by the target camera;
acquiring the distance between each focusing point in the plurality of focusing points and the mobile terminal according to the images acquired by the three cameras;
and acquiring a first distance between the object to be detected and the mobile terminal according to the distance between each focusing point in the focusing points and the mobile terminal.
2. The method according to claim 1, wherein the selecting a target camera from the three cameras according to the third distance between the object to be tested and the mobile terminal comprises:
and acquiring a distance range of a third distance between the object to be detected and the mobile terminal, and taking a camera corresponding to the distance range in the three cameras as the target camera.
3. The testing method of claim 1, wherein the obtaining a third distance between the object to be tested and the mobile terminal comprises:
controlling the three cameras to simultaneously acquire images of the object to be detected;
selecting a focusing point from the focusing area of the object to be detected;
and acquiring the distance between the focusing point and the mobile terminal according to the images acquired by the three cameras, and taking the distance between the focusing point and the mobile terminal as a third distance between the object to be detected and the mobile terminal.
4. The testing method of claim 3, wherein the obtaining the distance between the one focusing point and the mobile terminal according to the images collected by the three cameras comprises:
acquiring the positions of the images formed by the one focusing point in the three cameras respectively;
acquiring the distances between the one focus point and the three cameras according to the positions of the images formed by the one focus point in the three cameras respectively;
acquiring a three-dimensional coordinate of the focusing point in a world coordinate system according to the distances between the focusing point and the three cameras respectively;
acquiring three-dimensional coordinates of the central point of a triangular area formed by the three cameras in the world coordinate system;
and according to the three-dimensional coordinate of the focusing point in the world coordinate system and the three-dimensional coordinate of the central point in the world coordinate system, acquiring the distance between the focusing point and the central point, and taking the distance as the distance between the focusing point and the mobile terminal.
5. The test method according to claim 1, wherein the calibrating the laser chip in the mobile terminal comprises:
and sequentially carrying out SPAD calibration, Xtalk calibration and offset calibration on the laser chip of the mobile terminal.
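SPAD, crosstalk (Xtalk) and offset calibration are the standard factory-calibration steps for ST-style time-of-flight laser chips. A sketch of the sequence with hypothetical driver functions; the function names, target distances and placeholder results are assumptions for illustration, not part of the patent:

```python
# Sketch of claim 5's calibration sequence. The driver functions below are
# hypothetical stand-ins for a vendor ToF API; target distances and
# placeholder results are assumptions.

def perform_spad_calibration(chip):
    # Select the optimal set of reference SPADs (single-photon avalanche
    # diodes) so the reference signal falls in the expected rate window.
    chip["spad_count"] = 5  # placeholder result

def perform_xtalk_calibration(chip, target_distance_mm=600):
    # Measure a target at a known distance to estimate the crosstalk signal
    # caused by light reflected off the cover glass.
    chip["xtalk_cps"] = 0.0  # placeholder result

def perform_offset_calibration(chip, target_distance_mm=100):
    # Measure a target at a known distance and store the constant offset
    # between measured and true distance.
    chip["offset_um"] = 0  # placeholder result

def calibrate_laser_chip(chip):
    """Run the three calibrations sequentially, in the order the claim specifies."""
    perform_spad_calibration(chip)
    perform_xtalk_calibration(chip)
    perform_offset_calibration(chip)
    return chip
```

The ordering matters: crosstalk and offset corrections are only meaningful once the reference SPAD set is fixed, which is why the claim runs SPAD calibration first.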
6. The method according to any one of claims 1 to 5, wherein the obtaining of the test result of the distance measurement function of the laser chip according to the first distance and the second distance comprises:
calculating, from the first distance X1 and the second distance X2, a relative percentage deviation P:
P = |X1 − X2| / X1 × 100%;
if P is less than or equal to a preset threshold, determining that the distance measurement function of the laser chip meets the preset measurement deviation requirement;
and if P is greater than the preset threshold, determining that the distance measurement function of the laser chip does not meet the preset measurement deviation requirement.
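The pass/fail check of claim 6 can be sketched as follows; the 5% default threshold is an illustrative assumption, since the patent only requires some preset threshold:

```python
def ranging_test(x1, x2, threshold_percent=5.0):
    """Claim 6's check: relative percentage deviation of the laser-measured
    distance x2 from the camera-measured reference x1.

    The 5% default threshold is an illustrative assumption.
    Returns (deviation_percent, passed).
    """
    p = abs(x1 - x2) / x1 * 100.0
    return p, p <= threshold_percent
```

Note the deviation is normalised by the camera-derived first distance X1, which the test treats as ground truth.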
7. A laser ranging test device, comprising:
the first distance acquisition module is used for acquiring a first distance between an object to be detected and the mobile terminal through a camera of the mobile terminal;
the calibration module is used for calibrating the laser chip in the mobile terminal;
the second distance acquisition module is used for acquiring a second distance between the object to be detected and the mobile terminal through the calibrated laser chip;
the test result acquisition module is used for acquiring a test result of the distance measurement function of the laser chip according to the first distance and the second distance;
the mobile terminal comprises three cameras, and the first distance acquisition module comprises:
the first acquisition unit is used for acquiring a third distance between the object to be detected and the mobile terminal;
the camera selection unit is used for selecting a target camera from the three cameras according to a third distance between the object to be detected and the mobile terminal;
the acquisition control unit is used for controlling the three cameras to simultaneously acquire images of the object to be detected;
the focusing point selection unit is used for selecting a plurality of focusing points from the focusing area of the object to be detected according to the image acquired by the target camera;
the second acquisition unit is used for acquiring the distance between each focusing point in the focusing points and the mobile terminal according to the images acquired by the three cameras;
and the third acquisition unit is used for acquiring the first distance between the object to be detected and the mobile terminal according to the distance between each focusing point in the focusing points and the mobile terminal.
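The excerpt does not say how the third acquisition unit combines the per-focusing-point distances into the first distance; averaging them is one plausible aggregation (an assumption here), since it damps noise in any single triangulated point:

```python
from statistics import mean

def first_distance(per_point_distances_m):
    """Aggregate the per-focusing-point distances into the first distance.

    The patent excerpt does not fix the aggregation rule; taking the mean
    over the focusing points is an assumption made for illustration.
    """
    if not per_point_distances_m:
        raise ValueError("need at least one focusing point")
    return mean(per_point_distances_m)
```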
8. A mobile terminal comprising a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor, when executing the computer program, implements the steps of the test method according to any one of claims 1 to 6.
9. A computer-readable storage medium storing a computer program which, when executed by a processor, implements the steps of the test method according to any one of claims 1 to 6.
CN201910562541.6A 2019-06-26 2019-06-26 Laser ranging test method and device and mobile terminal Expired - Fee Related CN110196422B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910562541.6A CN110196422B (en) 2019-06-26 2019-06-26 Laser ranging test method and device and mobile terminal


Publications (2)

Publication Number Publication Date
CN110196422A CN110196422A (en) 2019-09-03
CN110196422B true CN110196422B (en) 2021-03-26

Family

ID=67755221

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910562541.6A Expired - Fee Related CN110196422B (en) 2019-06-26 2019-06-26 Laser ranging test method and device and mobile terminal

Country Status (1)

Country Link
CN (1) CN110196422B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114152201B (en) * 2021-11-04 2023-10-17 深圳橙子自动化有限公司 Laser altimeter calibration method and device, electronic equipment and storage medium

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0989521A (en) * 1995-09-21 1997-04-04 Hitachi Zosen Corp Calibrating method for use in laser distance measuring equipment
US9470548B2 (en) * 2011-01-31 2016-10-18 Agency For Defense Development Device, system and method for calibration of camera and laser sensor
CN104880204B (en) * 2015-06-17 2019-03-05 沈阳飞机工业(集团)有限公司 Using GPS and automatically track calibration method with measuring system to high precision laser range finder
EP3416370B1 (en) * 2016-03-23 2023-07-26 Huawei Technologies Co., Ltd. Photography focusing method, device, and apparatus for terminal
CN208239606U (en) * 2018-05-29 2018-12-14 天津杰泰高科传感技术有限公司 Laser radar caliberating device and laser radar calibration system
CN109035345A (en) * 2018-07-20 2018-12-18 齐鲁工业大学 The TOF camera range correction method returned based on Gaussian process



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20210326