WO2020216129A1 - Parameter acquisition method and terminal device - Google Patents

Parameter acquisition method and terminal device

Info

Publication number
WO2020216129A1
WO2020216129A1 · PCT/CN2020/085177 · CN2020085177W
Authority
WO
WIPO (PCT)
Prior art keywords
calibration
target object
measurement value
terminal device
distance
Prior art date
Application number
PCT/CN2020/085177
Other languages
English (en)
French (fr)
Inventor
付从华
Original Assignee
维沃移动通信有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 维沃移动通信有限公司 filed Critical 维沃移动通信有限公司
Priority to CA3136987A priority Critical patent/CA3136987A1/en
Priority to EP20795992.5A priority patent/EP3962060A4/en
Priority to JP2021563406A priority patent/JP7303900B2/ja
Priority to AU2020263183A priority patent/AU2020263183B2/en
Priority to BR112021021120A priority patent/BR112021021120A2/pt
Publication of WO2020216129A1 publication Critical patent/WO2020216129A1/zh
Priority to US17/503,545 priority patent/US11769273B2/en

Links

Images

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N17/00Diagnosis, testing or measuring for television systems or their details
    • H04N17/002Diagnosis, testing or measuring for television systems or their details for television cameras
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/80Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/63Control of cameras or camera modules by using electronic viewfinders
    • H04N23/633Control of cameras or camera modules by using electronic viewfinders for displaying additional information relating to control or operation of the camera
    • H04N23/635Region indicators; Field of view indicators
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/64Computer-aided capture of images, e.g. transfer from script file into camera, check of taken image quality, advice or proposal for image composition or decision on when to take image
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/67Focus control based on electronic image sensor signals
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/67Focus control based on electronic image sensor signals
    • H04N23/671Focus control based on electronic image sensor signals in combination with active ranging signals, e.g. using light or sound signals emitted toward objects
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2200/00Indexing scheme for image data processing or generation, in general
    • G06T2200/24Indexing scheme for image data processing or generation, in general involving graphical user interfaces [GUIs]
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10016Video; Image sequence
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/62Control of parameters via user interfaces

Definitions

  • the embodiments of the present disclosure relate to the field of communication technologies, and in particular, to a parameter acquisition method and terminal equipment.
  • the cameras of some electronic products can use the distance sensor to measure the distance of the focused object in order to achieve more accurate focusing.
  • a currently popular solution is to use Time of Flight (TOF) technology to measure the linear distance between the focus object and the camera, that is, the focus object distance.
  • the embodiments of the present disclosure provide a parameter acquisition method and terminal device, so as to solve the problem that the calibration of the distance sensor is relatively complicated.
  • the embodiments of the present disclosure provide a parameter acquisition method, which is applied to a terminal device, and includes:
  • a terminal device including:
  • the second acquisition module is configured to acquire the first measurement value of the distance sensor when the degree of coincidence between the preview image and the calibration area in the shooting preview interface exceeds a threshold;
  • the third acquiring module is used to acquire the calibration distance between the target object and the terminal device
  • the fourth acquisition module is configured to acquire the calibration offset corresponding to the target object based on the first measurement value and the calibration distance.
  • the embodiments of the present disclosure also provide a terminal device, including: a memory, a processor, and a computer program stored in the memory and capable of running on the processor.
  • when the processor executes the computer program, the steps in the parameter acquisition method described above are implemented.
  • the embodiments of the present disclosure also provide a computer-readable storage medium on which a computer program is stored, and when the computer program is executed by a processor, the steps in the parameter acquisition method described above are implemented.
  • the preview image of the target object is matched with the calibration area in the shooting preview interface to obtain the measurement value of the distance sensor at that moment, and then the measurement value and the calibration distance are used to obtain the calibration offset corresponding to the target object. Therefore, by using the embodiments of the present disclosure, the user can use any target object and obtain its calibration offset, and the obtaining method is simple, thereby reducing the complexity of calibrating the distance sensor.
  • Fig. 1 is a flowchart of a parameter acquisition method provided by an embodiment of the present disclosure
  • Figure 2 is one of the structural diagrams of a terminal device provided by an embodiment of the present disclosure
  • FIG. 3 is one of the flowcharts of the calibration method provided by an embodiment of the present disclosure.
  • Figure 4 is one of the display interfaces of the terminal device provided by an embodiment of the present disclosure.
  • FIG. 5 is the second flowchart of the calibration method provided by an embodiment of the present disclosure.
  • FIG. 6 is the second display interface of the terminal device provided by the embodiment of the present disclosure.
  • FIG. 7 is the third display interface of a terminal device provided by an embodiment of the present disclosure.
  • FIG. 8 is the second structural diagram of a terminal device provided by an embodiment of the present disclosure.
  • Fig. 9 is the third structural diagram of a terminal device provided by an embodiment of the present disclosure.
  • FIG. 1 is a flowchart of a parameter acquisition method provided by an embodiment of the present disclosure. As shown in FIG. 1, it includes the following steps:
  • Step 101 Obtain a preview image of the target object in the shooting preview interface.
  • when the camera function is turned on, the terminal device displays the shooting preview interface. At this time, the camera can be used to obtain the image of the target object in the shooting preview interface, which is referred to herein as the preview image of the target object.
  • the preview image may include the target object, and may also include the environment in which the target object is located.
  • the target object may be an object, a person, etc.
  • what kind of object the target object is may be preset. Then, when the user uses a certain object to calibrate, the terminal device can first identify whether that object is the predetermined target object; if so, step 101 is executed, and otherwise the process can be ended or the user can be prompted.
  • Step 102 Acquire a first measurement value of the distance sensor when the degree of coincidence of the preview image and the calibration area in the shooting preview interface exceeds a threshold.
  • the ToF distance sensor is usually composed of a transmitting unit and a receiving unit.
  • the laser light emitted by the transmitting unit is reflected back after encountering the target object, and the reflected light is received by the receiving unit. Then, the flight time between laser emission and reception can be measured. After that, the distance between the terminal device (or distance sensor) and the target object can be calculated according to the propagation speed of the light, that is, the first measurement value.
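The round-trip time-of-flight computation described above can be sketched as follows. This is an illustrative fragment, not part of the patent; the function and constant names are chosen for clarity:

```python
SPEED_OF_LIGHT_M_S = 299_792_458.0  # propagation speed of light in vacuum

def tof_distance_m(round_trip_time_s: float) -> float:
    """First measurement value: distance between the distance sensor and the
    target object, computed from the laser's round-trip flight time."""
    # The laser travels to the object and back, so halve the path length.
    return SPEED_OF_LIGHT_M_S * round_trip_time_s / 2.0
```

For example, a measured round-trip time of 2 ns corresponds to a distance of roughly 0.3 m.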
  • the threshold can be set arbitrarily. In order to increase the accuracy of the calibration, the threshold can be set to 100%. Then, at this time, the preview image coincides with the calibration area in the shooting preview interface.
  • the calibration area may be an area of any shape.
  • the calibration area can be, for example, rectangular, circular, etc.
  • the calibration area can be arbitrarily set to correspond to the shape and size of a specific target object.
  • Step 103 Obtain a calibration distance between the target object and the terminal device.
  • for a target object, its size is known or can be obtained by measurement. For an object of fixed size, the closer it is to the camera, the larger its image on the display screen, and vice versa.
  • the geometric figure corresponding to its imaging contour can be displayed on the display screen.
  • the size of the geometric figure is fixed and is also a known condition.
  • when the imaging of the target object coincides with the geometric figure, it can be considered that the sizes of the two are the same.
  • the true distance between the target object and the distance sensor can be obtained through experimental data, and the true distance is the calibration distance.
  • a correspondence may be stored in the terminal device, recording for each object the calibration distance between that object and the terminal device. Then, in this step, the target object can be identified, and the calibration distance between the target object and the terminal device can be obtained according to this correspondence.
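Such a stored correspondence can be sketched as a small table. The object names and distance values below are hypothetical; the patent does not specify concrete values, which would come from experimental data stored on the terminal device:

```python
# Hypothetical correspondence between recognized objects and their preset
# calibration distances (in metres).
CALIBRATION_DISTANCE_M = {
    "one_yuan_coin": 0.10,
    "id_card": 0.15,
}

def calibration_distance_for(object_name: str) -> float:
    """Look up the preset calibration distance for an identified target object."""
    return CALIBRATION_DISTANCE_M[object_name]
```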
  • the method of identifying the target object is not limited here.
  • Step 104 Obtain a calibration offset corresponding to the target object based on the first measurement value and the calibration distance.
  • the difference between the first measurement value and the calibration distance is used as the calibration offset corresponding to the target object.
  • the above method can be applied to terminal devices such as mobile phones, tablet computers (Tablet Personal Computer), laptop computers (Laptop Computer), personal digital assistants (Personal Digital Assistant, PDA), mobile Internet devices (Mobile Internet Device, MID), or wearable devices (Wearable Device), etc.
  • the preview image of the target object is matched with the calibration area in the shooting preview interface to obtain the measurement value of the distance sensor at that moment, and then the measurement value and the calibration distance are used to obtain the calibration offset corresponding to the target object. Therefore, by using the embodiments of the present disclosure, the user can use any target object and obtain its calibration offset, and the obtaining method is simple, thereby reducing the complexity of calibrating the distance sensor.
  • the method may further include: obtaining a second measurement value of the distance sensor, and obtaining a calibrated second measurement value based on the second measurement value and the calibration offset. Specifically, the difference between the second measurement value and the calibration offset is used as the second measurement value after calibration. In this way, the calculation is simple, so the calibration can be completed quickly.
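The two differences described above (the calibration offset of step 104, and its later application to a raw reading) can be sketched as follows; the function names are chosen for illustration:

```python
def calibration_offset(d1: float, d0: float) -> float:
    """Calibration offset: first measurement value D1 minus calibration distance D0."""
    return d1 - d0

def calibrated_measurement(d2: float, offset: float) -> float:
    """Calibrated second measurement: raw second measurement D2 minus the offset."""
    return d2 - offset
```

For instance, with D1 = 0.12 m against a calibration distance D0 = 0.10 m, the offset is 0.02 m, and a later raw reading of 0.25 m calibrates to 0.23 m.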
  • the method may further include: identifying the target object, and displaying a calibration area matching the shape of the target object in the shooting preview interface.
  • the target object can be identified, and a calibration area matching the shape of the target object is displayed in the shooting preview interface.
  • the correspondence between the object and the calibration area can be stored.
  • the corresponding calibration area can be obtained according to the corresponding relationship and displayed.
  • the target object does not need to be preset, thus facilitating calibration.
  • a calibration area matching the shape of the coin (such as a circle) can be displayed in the shooting preview interface.
  • step 101 at least one of the following steps may be performed:
  • first prompt information is displayed, where the first prompt information is used to prompt the user to move the target object so that the preview image coincides with the calibration area. In this way, the time required for calibration can be reduced and the calibration efficiency can be improved.
  • second prompt information is displayed, where the second prompt information is used to prompt the user to select a target object with preset characteristics.
  • the preset features can be shape features, category features, and so on. In this way, the user can select the target object more quickly, thereby reducing the time required for calibration and improving the calibration efficiency.
  • the terminal device of the embodiment of the present disclosure may include: a display 201, a camera module 202, a distance sensor 203, and a processor 204.
  • the distance sensor 203 may include a receiving unit 2031 and a transmitting unit 2032.
  • the shooting preview interface 2011 is displayed on the display.
  • the laser light emitted by the transmitting unit is reflected back after encountering the target object, and the reflected light is received by the receiving unit. Then, the flight time between laser emission and reception can be measured. After that, the distance between the terminal device (or distance sensor) and the target object can be calculated according to the propagation speed of the light.
  • FIG. 3 is a flowchart of a calibration method provided by an embodiment of the present disclosure.
  • This method can be applied to terminal equipment.
  • the shape of the calibration area and the target object are preset.
  • as an example, the target object is a round object with the size of a 1 yuan coin.
  • the shape of the calibration area is a circle. As shown in Figure 3, it includes the following steps:
  • Step 301 When the camera of the terminal device enters the distance sensor calibration state, display the shooting preview interface on the display, and display the calibration area in any area of the interface.
  • At least one preset geometric figure 31 (circle in this embodiment) is displayed in the shooting preview interface 2011.
  • the interface diagram of the terminal device is shown in Figure 4.
  • prompt information 32 may be displayed on the display. For example, as shown in Figure 4, "Please use a 1 yuan coin as the target object" may be displayed.
  • Step 302 When the preview image of the target object coincides with the calibration area, obtain the first measurement value measured by the distance sensor.
  • when a preview image of an object appears in the shooting preview interface, the terminal device can first identify whether that object is the set target object. If yes, step 302 is performed; otherwise, the user is prompted to use the corresponding object as the target object, or to use an object with a similar shape as the target object.
  • the user can adjust the camera so that the imaging 34 of the target object coincides with the geometric figure.
  • prompt information 33 may be displayed on the display. For example, as shown in Figure 4, "Please adjust the camera distance and angle so that the outline of the target object just coincides with the dotted frame" can be displayed.
  • the first measurement value D1 measured by the distance sensor is obtained.
  • Step 303 Obtain a calibration distance between the target object and the terminal device.
  • the calibration distance of the target object can be determined through experiments, because the size of the preset target object (such as a 1 yuan coin) is uniform. For a target object of fixed size, the closer it is to the camera, the larger its image on the display, and vice versa. The preset geometric figure serves as the imaging contour corresponding to the target object, and its size is fixed and known. Therefore, when the imaging of the target object coincides with the geometric figure, it can be considered that the sizes of the two are the same. At this time, the true distance between the target object and the distance sensor can be obtained through experimental data. This true distance is the calibration distance D0 preset by the system for the target object.
  • Step 304 Obtain the calibration offset of the target object.
  • This calibration offset can be stored and used for subsequent calibration.
  • Step 305 When calibration is required, obtain a second measurement value of the distance sensor.
  • the second measurement value D2 of the distance sensor is obtained.
  • Step 306 Obtain a second measured value after calibration based on the second measured value and the calibration offset.
  • the second measurement value is calibrated by using the calibration offset.
  • specifically, D2 - Doffset is used; that is, the difference between the second measurement value and the calibration offset is the calibrated second measurement value.
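Steps 301 to 306 can be strung together as a small end-to-end sketch. The biased lambda below is a stand-in for a real ToF sensor reading, and all names and values are illustrative rather than taken from the patent:

```python
def make_calibrated_sensor(read_raw, calibration_distance):
    """Perform the one-time calibration (steps 302-304), then return a
    reader that applies the stored offset on every call (steps 305-306)."""
    d1 = read_raw()                      # step 302: first measurement value D1
    offset = d1 - calibration_distance   # step 304: Doffset = D1 - D0
    def read_calibrated():
        return read_raw() - offset       # step 306: D2 - Doffset
    return read_calibrated

# Usage with a fake sensor that over-reads by a constant 0.03 m bias:
true_distance = 0.10
read_raw = lambda: true_distance + 0.03
read = make_calibrated_sensor(read_raw, calibration_distance=0.10)
# read() now returns approximately the true 0.10 m distance.
```

Storing only the offset keeps the per-reading correction to a single subtraction, which matches the patent's remark that the calculation is simple and the calibration can be completed quickly.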
  • FIG. 5 is a flowchart of a calibration method provided by an embodiment of the present disclosure. This method can be applied to terminal equipment. The difference from the embodiment shown in FIG. 3 is that in the embodiment of the present disclosure, a single target object is no longer preset, but the target object selected by the user can be recognized and displayed on the display screen according to the recognition result. Display the corresponding calibration area (such as geometry). As shown in Figure 5, it includes the following steps:
  • Step 501 Identify the target object selected by the user, and display the calibration area on the shooting preview interface.
  • when the camera is pointed at the target object selected by the user, the target object enters the camera's field of view (FOV).
  • the terminal device can identify which category the target object belongs to from its imaging characteristics, and then display the geometric figure corresponding to the target object on the preview interface.
  • the user can be prompted for the available target objects.
  • the geometric figure displayed on the shooting preview interface is a circle; when the target object is a resident ID card, the geometric figure displayed on the shooting preview interface is a rectangle.
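The object-to-shape correspondence in this embodiment can likewise be sketched as a small table; the key names are hypothetical:

```python
# Correspondence between recognized target objects and the geometric figure
# displayed as the calibration area, per the examples in this embodiment.
CALIBRATION_AREA_SHAPE = {
    "coin": "circle",
    "resident_id_card": "rectangle",
}

def calibration_area_for(object_name: str) -> str:
    """Return the calibration-area shape for a recognized object, or a
    fallback when the object has no stored correspondence."""
    return CALIBRATION_AREA_SHAPE.get(object_name, "unknown")
```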
  • Step 502 Prompt the user to adjust the target object so that the preview image of the target object coincides with the displayed calibration area.
  • Step 503 Obtain the first measurement value D1 measured by the distance sensor.
  • Step 504 Obtain a calibration distance between the target object and the terminal device.
  • the calibration distance of the target object can be determined through experiments, because the size of the preset target object (such as a 1 yuan coin) is uniform. For a target object of fixed size, the closer it is to the camera, the larger its image on the display, and vice versa. The preset geometric figure serves as the imaging contour corresponding to the target object, and its size is fixed and known. Therefore, when the imaging of the target object coincides with the geometric figure, it can be considered that the sizes of the two are the same. At this time, the true distance between the target object and the distance sensor can be obtained through experimental data. This true distance is the calibration distance D0 preset by the system for the target object.
  • Step 505 Obtain a calibration offset corresponding to the target object.
  • This calibration offset can be stored and used for subsequent calibration.
  • Step 506 When calibration is required, obtain a second measurement value of the distance sensor.
  • the second measurement value D2 of the distance sensor is obtained.
  • Step 507 Obtain a second measured value after calibration based on the second measured value and the calibration offset.
  • the second measurement value is calibrated by using the calibration offset.
  • specifically, D2 - Doffset is used; that is, the difference between the second measurement value and the calibration offset is the calibrated second measurement value.
  • the distance sensor can be calibrated according to a reference object selected by the user, which reduces the difficulty of calibration, facilitates the user's operation, and improves the user experience.
  • FIG. 8 is a structural diagram of a terminal device provided by an embodiment of the present disclosure.
  • a terminal device 800 includes:
  • the first obtaining module 801 is used to obtain a preview image of the target object in the shooting preview interface; the second obtaining module 802 is used to obtain the first measurement value of the distance sensor when the degree of coincidence between the preview image and the calibration area in the shooting preview interface exceeds a threshold; the third obtaining module 803 is used to obtain the calibration distance between the target object and the terminal device; the fourth obtaining module 804 is used to obtain the calibration offset corresponding to the target object based on the first measurement value and the calibration distance.
  • the fourth acquiring module 804 is specifically configured to use the difference between the first measurement value and the calibration distance as the calibration offset.
  • the third acquisition module 803 includes: an identification sub-module for identifying the target object; and an acquisition sub-module for acquiring the calibration distance between the target object and the terminal device according to the correspondence between the object and the calibration distance.
  • the terminal device further includes:
  • the fifth acquiring module 805 is configured to acquire the second measurement value of the distance sensor
  • the calibration module 806 is configured to obtain a second measurement value after calibration based on the second measurement value and the calibration offset.
  • the calibration module 806 is specifically configured to use the difference between the second measurement value and the calibration offset as the second measurement value after calibration.
  • the terminal device further includes:
  • the recognition module 807 is used to recognize the target object
  • the first display module 808 is configured to display a calibration area matching the shape of the target object in the shooting preview interface.
  • the terminal device further includes at least one of the following modules:
  • the second display module 809 is configured to display first prompt information, where the first prompt information is used to prompt the user to move the target object so that the preview image coincides with the calibration area;
  • the third display module 810 is configured to display second prompt information, and the second prompt information is used to prompt the user to select a target object with preset characteristics.
  • the terminal device 800 can implement the various processes implemented by the terminal device in the foregoing method embodiments, and to avoid repetition, details are not described herein again.
  • the preview image of the target object is matched with the calibration area in the shooting preview interface to obtain the measurement value of the distance sensor at that moment, and then the measurement value and the calibration distance are used to obtain the calibration offset corresponding to the target object. Therefore, by using the embodiments of the present disclosure, the user can use any target object and obtain its calibration offset, and the obtaining method is simple, thereby reducing the complexity of calibrating the distance sensor.
  • the terminal device 900 includes but is not limited to: a radio frequency unit 901, a network module 902, an audio output unit 903, an input unit 904, a sensor 905, and a display unit 906 , User input unit 907, interface unit 908, memory 909, processor 910, and power supply 911.
  • terminal devices include, but are not limited to, mobile phones, tablet computers, notebook computers, palmtop computers, vehicle-mounted mobile terminals, wearable devices, and pedometers.
  • the processor 910 is configured to obtain a preview image of the target object in the shooting preview interface; when the coincidence degree of the preview image and the calibration area in the shooting preview interface exceeds a threshold, obtain the first distance sensor A measurement value; obtaining a calibration distance between the target object and the terminal device; obtaining a calibration offset corresponding to the target object based on the first measurement value and the calibration distance.
  • the preview image of the target object is matched with the calibration area in the shooting preview interface to obtain the measurement value of the distance sensor at that moment, and then the measurement value and the calibration distance are used to obtain the calibration offset corresponding to the target object. Therefore, by using the embodiments of the present disclosure, the user can use any target object and obtain its calibration offset, and the obtaining method is simple, thereby reducing the complexity of calibrating the distance sensor.
  • the processor 910 is configured to use a difference between the first measurement value and the calibration distance as the calibration offset.
  • the processor 910 is configured to obtain a second measurement value of the distance sensor; and obtain a second measurement value after calibration based on the second measurement value and the calibration offset.
  • the processor 910 is configured to use the difference between the second measurement value and the calibration offset as the second measurement value after calibration.
  • the processor 910 is configured to identify the target object; and obtain the calibration distance between the target object and the terminal device according to the correspondence between the object and the calibration distance.
  • the processor 910 is configured to identify the target object; in the shooting preview interface, display a calibration area matching the shape of the target object.
  • the processor 910 is configured to perform at least one of the following steps:
  • first prompt information is displayed, where the first prompt information is used to prompt the user to move the target object so that the preview image coincides with the calibration area;
  • second prompt information is displayed, where the second prompt information is used to prompt the user to select a target object with preset characteristics.
  • the radio frequency unit 901 can be used for receiving and sending signals in the process of sending and receiving information or during a call. Specifically, downlink data from the base station is received and then processed by the processor 910; in addition, uplink data is sent to the base station.
  • the radio frequency unit 901 includes but is not limited to an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like.
  • the radio frequency unit 901 can also communicate with the network and other devices through a wireless communication system.
  • the terminal device provides users with wireless broadband Internet access through the network module 902, such as helping users to send and receive emails, browse web pages, and access streaming media.
  • the audio output unit 903 can convert the audio data received by the radio frequency unit 901 or the network module 902 or stored in the memory 909 into an audio signal and output it as sound. Moreover, the audio output unit 903 may also provide audio output related to a specific function performed by the terminal device 900 (for example, call signal reception sound, message reception sound, etc.).
  • the audio output unit 903 includes a speaker, a buzzer, a receiver, and the like.
  • the input unit 904 is used to receive audio or video signals.
  • the input unit 904 may include a graphics processing unit (GPU) 9041 and a microphone 9042.
  • the graphics processor 9041 processes image data of still pictures or video obtained by an image capture device (such as a camera) in a video capture mode or an image capture mode.
  • the processed image frame may be displayed on the display unit 906.
  • the image frames processed by the graphics processor 9041 may be stored in the memory 909 (or other storage medium) or sent via the radio frequency unit 901 or the network module 902.
  • the microphone 9042 can receive sound, and can process such sound into audio data.
  • in the case of a telephone call mode, the processed audio data can be converted into a format that can be sent to a mobile communication base station via the radio frequency unit 901 for output.
  • the terminal device 900 further includes at least one sensor 905, such as a light sensor, a motion sensor, and other sensors.
  • the light sensor includes an ambient light sensor and a proximity sensor.
  • the ambient light sensor can adjust the brightness of the display panel 9061 according to the brightness of the ambient light.
  • the proximity sensor can turn off the display panel 9061 and/or the backlight when the terminal device 900 is moved to the ear.
  • the accelerometer sensor can detect the magnitude of acceleration in various directions (usually three axes), can detect the magnitude and direction of gravity when stationary, and can be used to identify the posture of the terminal device (such as landscape/portrait switching, related games, and magnetometer posture calibration) and vibration-recognition-related functions (such as a pedometer and tapping); the sensor 905 may also include a fingerprint sensor, a pressure sensor, an iris sensor, a molecular sensor, a gyroscope, a barometer, a hygrometer, a thermometer, an infrared sensor, and the like, which are not described here again.
  • the display unit 906 is used to display information input by the user or information provided to the user.
  • the display unit 906 may include a display panel 9061, and the display panel 9061 may be configured in the form of a liquid crystal display (LCD), an organic light-emitting diode (OLED), etc.
  • the user input unit 907 may be used to receive input digital or character information, and generate key signal input related to user settings and function control of the terminal device.
  • the user input unit 907 includes a touch panel 9071 and other input devices 9072.
  • the touch panel 9071, also called a touch screen, can collect the user's touch operations on or near it (for example, operations performed by the user on or near the touch panel 9071 with a finger, a stylus, or any other suitable object or accessory).
  • the touch panel 9071 may include two parts: a touch detection device and a touch controller.
  • the touch detection device detects the user's touch position, detects the signal brought by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into contact coordinates, sends them to the processor 910, and receives and executes commands sent by the processor 910.
  • the touch panel 9071 can be implemented in multiple types such as resistive, capacitive, infrared, and surface acoustic wave.
  • the user input unit 907 may also include other input devices 9072.
  • other input devices 9072 may include, but are not limited to, a physical keyboard, function keys (such as volume control buttons, switch buttons, etc.), trackball, mouse, and joystick, which will not be repeated here.
  • the touch panel 9071 can be overlaid on the display panel 9061.
  • when the touch panel 9071 detects a touch operation on or near it, it transmits the operation to the processor 910 to determine the type of the touch event, and the processor 910 then provides corresponding visual output on the display panel 9061 according to the type of the touch event.
  • although the touch panel 9071 and the display panel 9061 are used as two independent components to implement the input and output functions of the terminal device, in some embodiments the touch panel 9071 and the display panel 9061 can be integrated to implement the input and output functions of the terminal device, which is not specifically limited here.
  • the interface unit 908 is an interface for connecting an external device and the terminal device 900.
  • the external device may include a wired or wireless headset port, an external power source (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device with an identification module, audio input/output (I/O) port, video I/O port, headphone port, etc.
  • the interface unit 908 may be used to receive input (for example, data information and power) from an external device and transmit the received input to one or more elements in the terminal device 900, or may be used to transfer data between the terminal device 900 and an external device.
  • the memory 909 can be used to store software programs and various data.
  • the memory 909 may mainly include a storage program area and a storage data area.
  • the storage program area may store an operating system, an application required by at least one function (such as a sound playback function or an image playback function), and the like; the storage data area may store data (such as audio data and a phone book) created according to the use of the mobile phone.
  • the memory 909 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or other volatile solid-state storage devices.
  • the processor 910 is the control center of the terminal device. It uses various interfaces and lines to connect the various parts of the entire terminal device, and performs various functions of the terminal device and processes data by running or executing the software programs and/or modules stored in the memory 909 and calling data stored in the memory 909, so as to monitor the terminal device as a whole.
  • the processor 910 may include one or more processing units; preferably, the processor 910 may integrate an application processor and a modem processor, where the application processor mainly handles the operating system, the user interface, applications, and the like, and the modem processor mainly handles wireless communication. It can be understood that the foregoing modem processor may alternatively not be integrated into the processor 910.
  • the terminal device 900 may also include a power supply 911 (such as a battery) for supplying power to the various components; the power supply 911 may be logically connected to the processor 910 through a power management system, so that functions such as charging, discharging, and power consumption management are implemented through the power management system.
  • terminal device 900 includes some functional modules not shown, which will not be repeated here.
  • the embodiment of the present disclosure further provides a terminal device, including a processor 910, a memory 909, and a computer program stored in the memory 909 and running on the processor 910; when the computer program is executed by the processor 910, each process of the foregoing parameter obtaining method embodiment is implemented.
  • the embodiments of the present disclosure also provide a computer-readable storage medium on which a computer program is stored; when the computer program is executed by a processor, each process of the above-mentioned parameter obtaining method embodiment is implemented and the same technical effect can be achieved, which is not repeated here to avoid repetition.
  • the computer-readable storage medium such as read-only memory (Read-Only Memory, ROM), random access memory (Random Access Memory, RAM), magnetic disk or optical disk, etc.
  • the technical solution of the present disclosure, in essence or the part contributing to the related technology, can be embodied in the form of a software product; the computer software product is stored in a storage medium (such as ROM/RAM, a magnetic disk, or an optical disk) and includes several instructions to make a terminal device (which can be a mobile phone, a computer, a server, an air conditioner, a network device, etc.) execute the method described in each embodiment of the present disclosure.

Landscapes

  • Engineering & Computer Science (AREA)
  • Signal Processing (AREA)
  • Multimedia (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • General Health & Medical Sciences (AREA)
  • Human Computer Interaction (AREA)
  • Studio Devices (AREA)
  • Measurement Of Optical Distance (AREA)
  • Optical Radar Systems And Details Thereof (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The present disclosure provides a parameter obtaining method and a terminal device, relating to the field of communications technologies, to solve the problem that calibration of a distance sensor is relatively complex. The method includes: obtaining a preview image of a target object in a shooting preview interface; obtaining a first measurement value of a distance sensor when the degree of coincidence between the preview image and a calibration region in the shooting preview interface exceeds a threshold; and obtaining, based on the first measurement value and a calibration distance, a calibration offset corresponding to the target object.

Description

Parameter Obtaining Method and Terminal Device
Cross-Reference to Related Applications
This application claims priority to Chinese Patent Application No. 201910343628.4, filed in China on April 26, 2019, the entire contents of which are incorporated herein by reference.
Technical Field
Embodiments of the present disclosure relate to the field of communications technologies, and in particular, to a parameter obtaining method and a terminal device.
Background
With advances in sensor technology, cameras in some electronic products can use a distance sensor to measure the focus distance so as to achieve more accurate focusing. For example, one currently popular solution uses Time of Flight (ToF) technology to measure the straight-line distance between the object in focus and the camera, that is, the focus distance.
Although distance sensors based on technologies such as ToF offer high ranging accuracy, they adapt poorly to the environment. Not only does the ranging accuracy of a distance sensor need to be calibrated before an electronic product leaves the factory, it also needs to be recalibrated after the product is repaired or has been used for some time. Calibrating a distance sensor is highly specialized work with strict requirements on operator skill, calibration equipment, and the environment, which makes distance sensor calibration relatively complex.
Summary
Embodiments of the present disclosure provide a parameter obtaining method and a terminal device to solve the problem that calibration of a distance sensor is relatively complex.
According to a first aspect, an embodiment of the present disclosure provides a parameter obtaining method, applied to a terminal device, including:
obtaining a preview image of a target object in a shooting preview interface;
obtaining a first measurement value of a distance sensor when the degree of coincidence between the preview image and a calibration region in the shooting preview interface exceeds a threshold;
obtaining a calibration distance between the target object and the terminal device; and
obtaining, based on the first measurement value and the calibration distance, a calibration offset corresponding to the target object.
According to a second aspect, an embodiment of the present disclosure provides a terminal device, including:
a first obtaining module, configured to obtain a preview image of a target object in a shooting preview interface;
a second obtaining module, configured to obtain a first measurement value of a distance sensor when the degree of coincidence between the preview image and a calibration region in the shooting preview interface exceeds a threshold;
a third obtaining module, configured to obtain a calibration distance between the target object and the terminal device; and
a fourth obtaining module, configured to obtain, based on the first measurement value and the calibration distance, a calibration offset corresponding to the target object.
According to a third aspect, an embodiment of the present disclosure further provides a terminal device, including a memory, a processor, and a computer program stored in the memory and runnable on the processor, where the processor implements the steps of the foregoing parameter obtaining method when executing the computer program.
According to a fourth aspect, an embodiment of the present disclosure further provides a computer-readable storage medium storing a computer program, where the computer program, when executed by a processor, implements the steps of the foregoing parameter obtaining method.
In the embodiments of the present disclosure, the preview image of the target object is matched against the calibration region in the shooting preview interface, the measurement value of the distance sensor at that moment is obtained, and the calibration offset corresponding to the target object is then obtained from that measurement value and the calibration distance. Therefore, with the embodiments of the present disclosure, a user can use an arbitrary target object and obtain its calibration offset in a simple way, reducing the complexity of distance sensor calibration.
附图说明
FIG. 1 is a flowchart of a parameter obtaining method according to an embodiment of the present disclosure;
FIG. 2 is a first structural diagram of a terminal device according to an embodiment of the present disclosure;
FIG. 3 is a first flowchart of a calibration method according to an embodiment of the present disclosure;
FIG. 4 is a first display interface of a terminal device according to an embodiment of the present disclosure;
FIG. 5 is a second flowchart of a calibration method according to an embodiment of the present disclosure;
FIG. 6 is a second display interface of a terminal device according to an embodiment of the present disclosure;
FIG. 7 is a third display interface of a terminal device according to an embodiment of the present disclosure;
FIG. 8 is a second structural diagram of a terminal device according to an embodiment of the present disclosure;
FIG. 9 is a third structural diagram of a terminal device according to an embodiment of the present disclosure.
具体实施方式
The technical solutions in the embodiments of the present disclosure are described below clearly and completely with reference to the accompanying drawings. Obviously, the described embodiments are some rather than all of the embodiments of the present disclosure. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments of the present disclosure without creative effort fall within the protection scope of the present disclosure.
Referring to FIG. 1, FIG. 1 is a flowchart of a parameter obtaining method according to an embodiment of the present disclosure. As shown in FIG. 1, the method includes the following steps:
Step 101: Obtain a preview image of a target object in a shooting preview interface.
When the camera function is opened, the terminal device displays a shooting preview interface. At this point, the camera can capture an image of the target object in the shooting preview interface, referred to herein as the preview image of the target object. The preview image may include the target object and may also include the environment in which the target object is located. The target object may be an object, a person, or the like.
Optionally, in this embodiment of the present disclosure, the kind of object that serves as the target object may be preset. Then, when the user calibrates with a particular object, the terminal device may first identify whether the object used by the user is the predetermined target object; if so, step 101 is performed; otherwise, the procedure may end, or the user may be prompted.
Step 102: Obtain a first measurement value of a distance sensor when the degree of coincidence between the preview image and a calibration region in the shooting preview interface exceeds a threshold.
Taking a ToF-based distance sensor as an example, the ranging principle is briefly introduced. A ToF distance sensor usually consists of a transmitting unit and a receiving unit. Laser light emitted by the transmitting unit is reflected back when it hits the target object, and the reflected light is received by the receiving unit. The flight time of the laser from transmission to reception can thus be measured. The distance between the terminal device (or the distance sensor) and the target object, that is, the first measurement value, can then be calculated from the propagation speed of light.
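As a minimal sketch (an illustration, not taken from the patent itself), the ToF calculation described above reduces to multiplying half the round-trip flight time by the speed of light:

```python
SPEED_OF_LIGHT_M_PER_S = 299_792_458  # propagation speed of light in vacuum

def tof_distance_m(flight_time_s: float) -> float:
    """Distance from a ToF sensor: the laser travels to the target and back,
    so the one-way distance is half the round trip."""
    return SPEED_OF_LIGHT_M_PER_S * flight_time_s / 2

# A round-trip flight time of about 3.34 ns corresponds to roughly 0.5 m.
print(tof_distance_m(3.3356e-9))
```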
In this step, the threshold may be set arbitrarily. To improve calibration accuracy, the threshold may be set to 100%; in that case, the preview image coincides with the calibration region in the shooting preview interface.
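The degree-of-coincidence check can be sketched as an intersection-over-union test between the preview outline's bounding box and the calibration region. The function names and the IoU formulation here are illustrative assumptions, not the patent's definition:

```python
def iou(box_a, box_b):
    """Intersection-over-union of two axis-aligned boxes (x1, y1, x2, y2)."""
    ax1, ay1, ax2, ay2 = box_a
    bx1, by1, bx2, by2 = box_b
    iw = max(0, min(ax2, bx2) - max(ax1, bx1))
    ih = max(0, min(ay2, by2) - max(ay1, by1))
    inter = iw * ih
    union = (ax2 - ax1) * (ay2 - ay1) + (bx2 - bx1) * (by2 - by1) - inter
    return inter / union if union else 0.0

def coincides(preview_box, calibration_box, threshold=1.0):
    """A threshold of 1.0 (100%) requires the two regions to coincide exactly."""
    return iou(preview_box, calibration_box) >= threshold

print(coincides((0, 0, 10, 10), (0, 0, 10, 10)))  # identical boxes coincide
print(coincides((0, 0, 10, 10), (1, 1, 11, 11)))  # offset boxes do not
```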
The calibration region may be a region of any shape. For example, the calibration region may be a rectangle, a circle, or the like. The calibration region may also be set to correspond to the shape and size of a specific target object.
Step 103: Obtain a calibration distance between the target object and the terminal device.
For a given target object, its size is known or can be obtained through measurement. For an object of fixed size, the closer it is to the camera, the larger its image on the display, and vice versa.
Based on the size of the target object, a geometric figure corresponding to its imaging outline can be displayed on the screen. The size of this geometric figure is fixed and is also a known condition. When the image of the target object exactly coincides with the geometric figure, the two can be regarded as having the same size; at this point, the true distance between the target object and the distance sensor can be obtained from experimental data, and this true distance is the calibration distance.
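Under a simple pinhole-camera assumption (an illustration of why a matching outline pins down the distance, not the patent's stated method), the calibration distance for an object of known physical size follows from the focal length and the on-screen size of the matching template:

```python
def calibration_distance_mm(real_width_mm, template_width_px, focal_length_px):
    """Pinhole model: image_width = focal * real_width / distance.  When the
    object's image exactly fills the on-screen template, the true distance is
    distance = focal * real_width / template_width."""
    return focal_length_px * real_width_mm / template_width_px

# A 25 mm coin whose image spans 250 px under a 3000 px focal length sits at
# 3000 * 25 / 250 = 300 mm from the camera (all numbers are hypothetical).
print(calibration_distance_mm(25.0, 250.0, 3000.0))
```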
In practical applications, a correspondence may be stored that maps objects to the calibration distances between those objects and the terminal device. Then, in this step, the target object may be identified, and the calibration distance between the target object and the terminal device may be obtained according to the correspondence between objects and calibration distances. The manner of identifying the target object is not limited herein.
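The stored correspondence can be as simple as a lookup table keyed by the recognized object type. The entries below are invented placeholders; real values would come from factory experiments as the text describes:

```python
# Hypothetical object -> calibration distance (mm) table.
CALIBRATION_DISTANCES_MM = {
    "coin_1_yuan": 300.0,
    "id_card": 450.0,
}

def lookup_calibration_distance(object_type: str) -> float:
    """Return the preset calibration distance D0 for a recognized object."""
    try:
        return CALIBRATION_DISTANCES_MM[object_type]
    except KeyError:
        raise ValueError(f"no calibration entry for {object_type!r}")

print(lookup_calibration_distance("coin_1_yuan"))
```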
Step 104: Obtain, based on the first measurement value and the calibration distance, a calibration offset corresponding to the target object.
Here, the difference between the first measurement value and the calibration distance is used as the calibration offset corresponding to the target object.
In the embodiments of the present disclosure, the above method may be applied to a terminal device, for example, a mobile phone, a tablet personal computer, a laptop computer, a personal digital assistant (PDA), a mobile Internet device (MID), or a wearable device.
In the embodiments of the present disclosure, the preview image of the target object is matched against the calibration region in the shooting preview interface, the measurement value of the distance sensor at that moment is obtained, and the calibration offset corresponding to the target object is then obtained from that measurement value and the calibration distance. Therefore, with the embodiments of the present disclosure, a user can use an arbitrary target object and obtain its calibration offset in a simple way, reducing the complexity of distance sensor calibration.
On the basis of the foregoing embodiment, after step 104, the method may further include: obtaining a second measurement value of the distance sensor, and obtaining a calibrated second measurement value based on the second measurement value and the calibration offset. Specifically, the difference between the second measurement value and the calibration offset is used as the calibrated second measurement value. The calculation is simple, so calibration can be completed quickly.
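The two-step arithmetic described here — an offset Doffset = D1 − D0 computed at calibration time, then D2 − Doffset applied to every later reading — can be sketched as follows (the numeric values are hypothetical):

```python
def calibration_offset(d1_measured: float, d0_calibration: float) -> float:
    """Doffset = D1 - D0: how far the sensor's reading deviates at a known distance."""
    return d1_measured - d0_calibration

def corrected_measurement(d2_measured: float, offset: float) -> float:
    """Calibrated second measurement: D2 - Doffset."""
    return d2_measured - offset

offset = calibration_offset(305.0, 300.0)    # sensor reads 5 mm long
print(offset)                                # 5.0
print(corrected_measurement(412.0, offset))  # 407.0
```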
On the basis of the foregoing embodiment, before step 101, the method may further include: identifying the target object, and displaying, in the shooting preview interface, a calibration region matching the shape of the target object. For example, a correspondence between objects and calibration regions may be stored. After an object is identified, its corresponding calibration region can be obtained from the correspondence and displayed. Here, the target object need not be predetermined, which facilitates calibration. For example, when the target object is identified as a coin, a calibration region matching the shape of the coin (such as a circle) may be displayed in the shooting preview interface.
On the basis of the foregoing embodiment, at least one of the following steps may also be performed after step 101:
Display first prompt information, where the first prompt information is used to prompt the user to move the target object so that the preview image coincides with the calibration region. In this way, the time required for calibration can be reduced and calibration efficiency improved.
Display second prompt information, where the second prompt information is used to prompt the user to select a target object having a preset feature. The preset feature may be a shape feature, a category feature, and so on. In this way, the user can select a target object more quickly, which reduces the time required for calibration and improves calibration efficiency.
As shown in FIG. 2, the terminal device in this embodiment of the present disclosure may include a display 201, a camera module 202, a distance sensor 203, and a processor 204. The distance sensor 203 may include a receiving unit 2031 and a transmitting unit 2032. A shooting preview interface 2011 is displayed on the display.
Taking a ToF-based distance sensor as an example, the ranging principle is briefly introduced. Laser light emitted by the transmitting unit is reflected back when it hits the target object, and the reflected light is received by the receiving unit. The flight time of the laser from transmission to reception can thus be measured, and the distance between the terminal device (or the distance sensor) and the target object can then be calculated from the propagation speed of light.
Referring to FIG. 3, FIG. 3 is a flowchart of a calibration method according to an embodiment of the present disclosure. The method may be applied to a terminal device. In this embodiment, the shape of the calibration region and the target object are preset. The target object is, for example, a circular object the size of a 1-yuan coin; accordingly, the calibration region is circular. As shown in FIG. 3, the method includes the following steps:
Step 301: When the camera of the terminal device enters the distance sensor calibration state, display a shooting preview interface on the display, and display a calibration region in any area of the interface.
In this embodiment of the present disclosure, at least one preset geometric figure 31 (a circle in this embodiment) is displayed in the shooting preview interface 2011. A schematic diagram of the terminal device interface at this point is shown in FIG. 4.
Optionally, to help the user understand which target object corresponds to the geometric figure, prompt information 32 may be displayed on the display. For example, as shown in FIG. 4, "Please use a 1-yuan coin as the target object" may be displayed.
Step 302: When the preview image of the target object coincides with the calibration region, obtain a first measurement value measured by the distance sensor.
In this process, when a preview image of an object appears in the shooting preview interface, the terminal device may first identify whether the object is the set target object. If so, step 302 is performed; otherwise, the user is prompted to use the corresponding object, or an object of similar shape, as the target object.
The user can operate so that the image 34 of the target object coincides with the geometric figure. Optionally, to guide the user, prompt information 33 may be displayed on the display. For example, as shown in FIG. 4, "Please adjust the camera distance and angle so that the outline of the target object exactly coincides with the dashed box" may be displayed. When the target object coincides with the calibration region, the first measurement value D1 measured by the distance sensor is obtained.
Step 303: Obtain a calibration distance between the target object and the terminal device.
In this embodiment of the present disclosure, the calibration offset of the target object may also be determined through experiments. The size of the preset target object (for example, a 1-yuan coin) is uniform. For a target object of fixed size, the closer it is to the camera, the larger its image on the display, and vice versa. The preset geometric figure, as the imaging outline corresponding to the target object, has a fixed size, which is a known condition. Therefore, when the image of the target object exactly coincides with the geometric figure, the two can be regarded as having the same size, and the true distance between the target object and the distance sensor can be obtained from experimental data. This true distance is the calibration distance D0 preset by the system for the target object.
Step 304: Obtain the calibration offset of the target object.
Here, Doffset = D1 - D0 can be used as the calibration offset. This calibration offset can be stored for subsequent calibration.
Step 305: Obtain a second measurement value of the distance sensor when calibration is required.
When calibration is required and the user uses a 1-yuan coin as the reference object, the second measurement value D2 of the distance sensor is obtained.
Step 306: Obtain a calibrated second measurement value based on the second measurement value and the calibration offset.
The second measurement value is calibrated using the calibration offset. Here, D2 - Doffset, that is, the difference between the second measurement value and the calibration offset, is the calibrated second measurement value.
Referring to FIG. 5, FIG. 5 is a flowchart of a calibration method according to an embodiment of the present disclosure. The method may be applied to a terminal device. Unlike the embodiment shown in FIG. 3, in this embodiment of the present disclosure no single target object is preset; instead, the target object selected by the user is identified, and the corresponding calibration region (such as a geometric figure) is displayed on the screen according to the identification result. As shown in FIG. 5, the method includes the following steps:
Step 501: Identify the target object selected by the user, and display a calibration region in the shooting preview interface.
When the camera is aimed at the target object selected by the user, the target object enters the camera's FOV region. The terminal device can use the imaging features of the target object to determine which kind of target object it is, and then display the geometric figure corresponding to that kind of target object in the preview interface.
As shown in FIG. 6, usable target objects may be suggested to the user. When the target object selected by the user is a 1-yuan coin, the geometric figure displayed in the shooting preview interface is a circle; when the target object is a resident identity card, the geometric figure displayed in the shooting preview interface is a rectangle.
Step 502: Prompt the user to adjust the target object so that the preview image of the target object coincides with the displayed calibration region.
Step 503: Obtain the first measurement value D1 measured by the distance sensor.
Step 504: Obtain a calibration distance between the target object and the terminal device.
In this embodiment of the present disclosure, the calibration offset of the target object may also be determined through experiments. The size of the preset target object (for example, a 1-yuan coin) is uniform. For a target object of fixed size, the closer it is to the camera, the larger its image on the display, and vice versa. The preset geometric figure, as the imaging outline corresponding to the target object, has a fixed size, which is a known condition. Therefore, when the image of the target object exactly coincides with the geometric figure, the two can be regarded as having the same size, and the true distance between the target object and the distance sensor can be obtained from experimental data. This true distance is the calibration distance D0 preset by the system for the target object.
Step 505: Obtain the calibration offset corresponding to the target object.
Here, Doffset = D1 - D0 can be used as the calibration offset. This calibration offset can be stored for subsequent calibration.
Step 506: Obtain a second measurement value of the distance sensor when calibration is required.
When calibration is required, for example, when the user uses a 1-yuan coin as the reference object, the second measurement value D2 of the distance sensor is obtained.
Step 507: Obtain a calibrated second measurement value based on the second measurement value and the calibration offset.
The second measurement value is calibrated using the calibration offset. Here, D2 - Doffset, that is, the difference between the second measurement value and the calibration offset, is the calibrated second measurement value.
As can be seen from the above description, in the embodiments of the present disclosure, the distance sensor can be calibrated according to a reference object selected by the user, which simplifies calibration, facilitates user operation, and improves the user experience.
Referring to FIG. 8, FIG. 8 is a structural diagram of a terminal device according to an embodiment of the present disclosure. As shown in FIG. 8, the terminal device 800 includes:
a first obtaining module 801, configured to obtain a preview image of a target object in a shooting preview interface; a second obtaining module 802, configured to obtain a first measurement value of a distance sensor when the degree of coincidence between the preview image and a calibration region in the shooting preview interface exceeds a threshold; a third obtaining module 803, configured to obtain a calibration distance between the target object and the terminal device; and a fourth obtaining module 804, configured to obtain, based on the first measurement value and the calibration distance, a calibration offset corresponding to the target object.
Optionally, the fourth obtaining module 804 is specifically configured to use the difference between the first measurement value and the calibration distance as the calibration offset.
Optionally, the third obtaining module 803 includes: an identification submodule, configured to identify the target object; and an obtaining submodule, configured to obtain the calibration distance between the target object and the terminal device according to the correspondence between objects and calibration distances.
Optionally, the terminal device further includes:
a fifth obtaining module 805, configured to obtain a second measurement value of the distance sensor; and
a calibration module 806, configured to obtain a calibrated second measurement value based on the second measurement value and the calibration offset.
Optionally, the calibration module 806 is specifically configured to use the difference between the second measurement value and the calibration offset as the calibrated second measurement value.
Optionally, the terminal device further includes:
an identification module 807, configured to identify the target object; and
a first display module 808, configured to display, in the shooting preview interface, a calibration region matching the shape of the target object.
Optionally, the terminal device further includes at least one of the following modules:
a second display module 809, configured to display first prompt information, where the first prompt information is used to prompt the user to move the target object so that the preview image coincides with the calibration region; and
a third display module 810, configured to display second prompt information, where the second prompt information is used to prompt the user to select a target object having a preset feature.
The terminal device 800 can implement each process implemented by the terminal device in the foregoing method embodiments. To avoid repetition, details are not described here again.
In the embodiments of the present disclosure, the preview image of the target object is matched against the calibration region in the shooting preview interface, the measurement value of the distance sensor at that moment is obtained, and the calibration offset corresponding to the target object is then obtained from that measurement value and the calibration distance. Therefore, with the embodiments of the present disclosure, a user can use an arbitrary target object and obtain its calibration offset in a simple way, reducing the complexity of distance sensor calibration.
FIG. 9 is a schematic diagram of the hardware structure of a terminal device implementing the embodiments of the present disclosure. The terminal device 900 includes but is not limited to components such as a radio frequency unit 901, a network module 902, an audio output unit 903, an input unit 904, a sensor 905, a display unit 906, a user input unit 907, an interface unit 908, a memory 909, a processor 910, and a power supply 911. A person skilled in the art can understand that the terminal device structure shown in FIG. 9 does not constitute a limitation on the terminal device; the terminal device may include more or fewer components than shown, combine certain components, or have a different arrangement of components. In this embodiment of the present disclosure, the terminal device includes but is not limited to a mobile phone, a tablet computer, a laptop computer, a palmtop computer, an in-vehicle mobile terminal, a wearable device, a pedometer, and the like.
The processor 910 is configured to: obtain a preview image of a target object in a shooting preview interface; obtain a first measurement value of a distance sensor when the degree of coincidence between the preview image and a calibration region in the shooting preview interface exceeds a threshold; obtain a calibration distance between the target object and the terminal device; and obtain, based on the first measurement value and the calibration distance, a calibration offset corresponding to the target object.
In the embodiments of the present disclosure, the preview image of the target object is matched against the calibration region in the shooting preview interface, the measurement value of the distance sensor at that moment is obtained, and the calibration offset corresponding to the target object is then obtained from that measurement value and the calibration distance. Therefore, with the embodiments of the present disclosure, a user can use an arbitrary target object and obtain its calibration offset in a simple way, reducing the complexity of distance sensor calibration.
Optionally, the processor 910 is configured to use the difference between the first measurement value and the calibration distance as the calibration offset.
Optionally, the processor 910 is configured to: obtain a second measurement value of the distance sensor; and obtain a calibrated second measurement value based on the second measurement value and the calibration offset.
Optionally, the processor 910 is configured to use the difference between the second measurement value and the calibration offset as the calibrated second measurement value.
Optionally, the processor 910 is configured to: identify the target object; and obtain the calibration distance between the target object and the terminal device according to the correspondence between objects and calibration distances.
Optionally, the processor 910 is configured to: identify the target object; and display, in the shooting preview interface, a calibration region matching the shape of the target object.
Optionally, the processor 910 is configured to perform at least one of the following steps:
displaying first prompt information, where the first prompt information is used to prompt the user to move the target object so that the preview image coincides with the calibration region; and
displaying second prompt information, where the second prompt information is used to prompt the user to select a target object having a preset feature.
It should be understood that in this embodiment of the present disclosure, the radio frequency unit 901 may be used to receive and send signals during information transmission and reception or during a call. Specifically, downlink data from a base station is received and then sent to the processor 910 for processing, and uplink data is sent to the base station. Generally, the radio frequency unit 901 includes but is not limited to an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like. In addition, the radio frequency unit 901 can also communicate with a network and other devices through a wireless communication system.
The terminal device provides users with wireless broadband Internet access through the network module 902, for example, helping users send and receive emails, browse web pages, and access streaming media.
The audio output unit 903 can convert audio data received by the radio frequency unit 901 or the network module 902 or stored in the memory 909 into an audio signal and output it as sound. Moreover, the audio output unit 903 can also provide audio output related to a specific function performed by the terminal device 900 (for example, a call signal reception sound or a message reception sound). The audio output unit 903 includes a speaker, a buzzer, a receiver, and the like.
The input unit 904 is configured to receive audio or video signals. The input unit 904 may include a graphics processing unit (GPU) 9041 and a microphone 9042. The graphics processor 9041 processes image data of still pictures or video obtained by an image capture device (such as a camera) in a video capture mode or an image capture mode. The processed image frames may be displayed on the display unit 906. The image frames processed by the graphics processor 9041 may be stored in the memory 909 (or another storage medium) or sent via the radio frequency unit 901 or the network module 902. The microphone 9042 can receive sound and process it into audio data. In a telephone call mode, the processed audio data can be converted into a format that can be sent to a mobile communication base station via the radio frequency unit 901 for output.
The terminal device 900 further includes at least one sensor 905, such as a light sensor, a motion sensor, and other sensors. Specifically, the light sensor includes an ambient light sensor and a proximity sensor. The ambient light sensor can adjust the brightness of the display panel 9061 according to the brightness of ambient light, and the proximity sensor can turn off the display panel 9061 and/or the backlight when the terminal device 900 is moved to the ear. As a type of motion sensor, the accelerometer sensor can detect the magnitude of acceleration in various directions (usually three axes), can detect the magnitude and direction of gravity when stationary, and can be used to identify the posture of the terminal device (such as landscape/portrait switching, related games, and magnetometer posture calibration) and vibration-recognition-related functions (such as a pedometer and tapping). The sensor 905 may also include a fingerprint sensor, a pressure sensor, an iris sensor, a molecular sensor, a gyroscope, a barometer, a hygrometer, a thermometer, an infrared sensor, and the like, which are not described here again.
The display unit 906 is configured to display information input by the user or information provided to the user. The display unit 906 may include a display panel 9061, which may be configured in the form of a liquid crystal display (LCD), an organic light-emitting diode (OLED), or the like.
The user input unit 907 may be configured to receive input digital or character information and generate key signal input related to user settings and function control of the terminal device. Specifically, the user input unit 907 includes a touch panel 9071 and other input devices 9072. The touch panel 9071, also called a touch screen, can collect the user's touch operations on or near it (for example, operations performed by the user on or near the touch panel 9071 with a finger, a stylus, or any other suitable object or accessory). The touch panel 9071 may include two parts: a touch detection device and a touch controller. The touch detection device detects the user's touch position, detects the signal brought by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into contact coordinates, sends them to the processor 910, and receives and executes commands sent by the processor 910. In addition, the touch panel 9071 can be implemented in multiple types such as resistive, capacitive, infrared, and surface acoustic wave. Besides the touch panel 9071, the user input unit 907 may also include other input devices 9072. Specifically, the other input devices 9072 may include but are not limited to a physical keyboard, function keys (such as volume control buttons and switch buttons), a trackball, a mouse, and a joystick, which are not described here again.
Further, the touch panel 9071 can be overlaid on the display panel 9061. When the touch panel 9071 detects a touch operation on or near it, it transmits the operation to the processor 910 to determine the type of the touch event, and the processor 910 then provides corresponding visual output on the display panel 9061 according to the type of the touch event. Although in FIG. 9 the touch panel 9071 and the display panel 9061 are used as two independent components to implement the input and output functions of the terminal device, in some embodiments the touch panel 9071 and the display panel 9061 can be integrated to implement the input and output functions of the terminal device, which is not specifically limited here.
The interface unit 908 is an interface for connecting an external device to the terminal device 900. For example, the external device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device with an identification module, an audio input/output (I/O) port, a video I/O port, a headphone port, and the like. The interface unit 908 may be used to receive input (for example, data information and power) from an external device and transmit the received input to one or more elements in the terminal device 900, or may be used to transfer data between the terminal device 900 and an external device.
The memory 909 can be used to store software programs and various data. The memory 909 may mainly include a program storage area and a data storage area. The program storage area may store an operating system, an application required by at least one function (such as a sound playback function or an image playback function), and the like; the data storage area may store data created according to the use of the mobile phone (such as audio data and a phone book). In addition, the memory 909 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or another volatile solid-state storage device.
The processor 910 is the control center of the terminal device. It uses various interfaces and lines to connect the various parts of the entire terminal device, and performs various functions of the terminal device and processes data by running or executing the software programs and/or modules stored in the memory 909 and calling data stored in the memory 909, so as to monitor the terminal device as a whole. The processor 910 may include one or more processing units; preferably, the processor 910 may integrate an application processor and a modem processor, where the application processor mainly handles the operating system, the user interface, applications, and the like, and the modem processor mainly handles wireless communication. It can be understood that the modem processor may alternatively not be integrated into the processor 910.
The terminal device 900 may also include a power supply 911 (such as a battery) for supplying power to the various components. Preferably, the power supply 911 may be logically connected to the processor 910 through a power management system, so that functions such as charging, discharging, and power consumption management are implemented through the power management system.
In addition, the terminal device 900 includes some functional modules that are not shown, which are not described here again.
Preferably, an embodiment of the present disclosure further provides a terminal device, including a processor 910, a memory 909, and a computer program stored in the memory 909 and runnable on the processor 910. When the computer program is executed by the processor 910, each process of the foregoing parameter obtaining method embodiments is implemented and the same technical effect can be achieved. To avoid repetition, details are not described here again.
An embodiment of the present disclosure further provides a computer-readable storage medium storing a computer program. When the computer program is executed by a processor, each process of the foregoing parameter obtaining method embodiments is implemented and the same technical effect can be achieved. To avoid repetition, details are not described here again. The computer-readable storage medium is, for example, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disc.
It should be noted that in this document, the terms "include", "comprise", or any other variants thereof are intended to cover non-exclusive inclusion, so that a process, method, article, or apparatus including a series of elements includes not only those elements but also other elements not explicitly listed, or elements inherent to such a process, method, article, or apparatus. Without further limitation, an element defined by the statement "including a ..." does not exclude the existence of other identical elements in the process, method, article, or apparatus that includes the element.
From the description of the foregoing implementations, a person skilled in the art can clearly understand that the methods of the foregoing embodiments can be implemented by means of software plus a necessary general-purpose hardware platform, and certainly can also be implemented by hardware, but in many cases the former is the better implementation. Based on such an understanding, the technical solutions of the present disclosure, in essence or the part contributing to the related art, can be embodied in the form of a software product. The computer software product is stored in a storage medium (such as a ROM/RAM, a magnetic disk, or an optical disc) and includes several instructions for enabling a terminal device (which may be a mobile phone, a computer, a server, an air conditioner, a network device, or the like) to perform the methods described in the embodiments of the present disclosure.
The embodiments of the present disclosure have been described above with reference to the accompanying drawings, but the present disclosure is not limited to the foregoing specific implementations. The foregoing specific implementations are merely illustrative rather than restrictive. Inspired by the present disclosure, a person of ordinary skill in the art can make many other forms without departing from the spirit of the present disclosure and the scope protected by the claims, all of which fall within the protection of the present disclosure.

Claims (16)

  1. A parameter obtaining method, applied to a terminal device, comprising:
    obtaining a preview image of a target object in a shooting preview interface;
    obtaining a first measurement value of a distance sensor when the degree of coincidence between the preview image and a calibration region in the shooting preview interface exceeds a threshold;
    obtaining a calibration distance between the target object and the terminal device; and
    obtaining, based on the first measurement value and the calibration distance, a calibration offset corresponding to the target object.
  2. The method according to claim 1, wherein the obtaining, based on the first measurement value and the calibration distance, a calibration offset corresponding to the target object comprises:
    using the difference between the first measurement value and the calibration distance as the calibration offset.
  3. The method according to claim 1, wherein after the obtaining a calibration offset corresponding to the target object, the method further comprises:
    obtaining a second measurement value of the distance sensor; and
    obtaining a calibrated second measurement value based on the second measurement value and the calibration offset.
  4. The method according to claim 3, wherein the obtaining a calibrated second measurement value based on the second measurement value and the calibration offset comprises:
    using the difference between the second measurement value and the calibration offset as the calibrated second measurement value.
  5. The method according to claim 1, wherein the obtaining a calibration distance between the target object and the terminal device comprises:
    identifying the target object; and
    obtaining the calibration distance between the target object and the terminal device according to a correspondence between objects and calibration distances.
  6. The method according to claim 1, wherein before the obtaining a preview image of a target object, the method further comprises:
    identifying the target object; and
    displaying, in the shooting preview interface, a calibration region matching the shape of the target object.
  7. The method according to claim 1, wherein after the obtaining a preview image of a target object, the method further comprises at least one of the following:
    displaying first prompt information, wherein the first prompt information is used to prompt the user to move the target object so that the preview image coincides with the calibration region; and
    displaying second prompt information, wherein the second prompt information is used to prompt the user to select a target object having a preset feature.
  8. A terminal device, comprising:
    a first obtaining module, configured to obtain a preview image of a target object in a shooting preview interface;
    a second obtaining module, configured to obtain a first measurement value of a distance sensor when the degree of coincidence between the preview image and a calibration region in the shooting preview interface exceeds a threshold;
    a third obtaining module, configured to obtain a calibration distance between the target object and the terminal device; and
    a fourth obtaining module, configured to obtain, based on the first measurement value and the calibration distance, a calibration offset corresponding to the target object.
  9. The terminal device according to claim 8, wherein the fourth obtaining module is specifically configured to use the difference between the first measurement value and the calibration distance as the calibration offset.
  10. The terminal device according to claim 8, wherein the terminal device further comprises:
    a fifth obtaining module, configured to obtain a second measurement value of the distance sensor; and
    a calibration module, configured to obtain a calibrated second measurement value based on the second measurement value and the calibration offset.
  11. The terminal device according to claim 10, wherein the calibration module is specifically configured to use the difference between the second measurement value and the calibration offset as the calibrated second measurement value.
  12. The terminal device according to claim 8, wherein the third obtaining module comprises:
    an identification submodule, configured to identify the target object; and
    an obtaining submodule, configured to obtain the calibration distance between the target object and the terminal device according to a correspondence between objects and calibration distances.
  13. The terminal device according to claim 8, wherein the terminal device further comprises:
    an identification module, configured to identify the target object; and
    a first display module, configured to display, in the shooting preview interface, a calibration region matching the shape of the target object.
  14. The terminal device according to claim 8, wherein the terminal device further comprises at least one of the following modules:
    a second display module, configured to display first prompt information, wherein the first prompt information is used to prompt the user to move the target object so that the preview image coincides with the calibration region; and
    a third display module, configured to display second prompt information, wherein the second prompt information is used to prompt the user to select a target object having a preset feature.
  15. A terminal device, comprising: a memory, a processor, and a computer program stored in the memory and runnable on the processor, wherein the processor implements the steps of the parameter obtaining method according to any one of claims 1 to 7 when executing the computer program.
  16. A computer-readable storage medium storing a computer program, wherein the computer program, when executed by a processor, implements the steps of the parameter obtaining method according to any one of claims 1 to 7.
PCT/CN2020/085177 2019-04-26 2020-04-16 参数获取方法及终端设备 WO2020216129A1 (zh)

Priority Applications (6)

Application Number Priority Date Filing Date Title
CA3136987A CA3136987A1 (en) 2019-04-26 2020-04-16 Parameter obtaining method and terminal device
EP20795992.5A EP3962060A4 (en) 2019-04-26 2020-04-16 PARAMETER ACQUISITION METHOD AND TERMINAL DEVICE
JP2021563406A JP7303900B2 (ja) 2019-04-26 2020-04-16 パラメータ取得方法及び端末機器
AU2020263183A AU2020263183B2 (en) 2019-04-26 2020-04-16 Parameter Obtaining Method and Terminal Device
BR112021021120A BR112021021120A2 (pt) 2019-04-26 2020-04-16 Método de obtenção de parâmetro e dispositivo terminal.
US17/503,545 US11769273B2 (en) 2019-04-26 2021-10-18 Parameter obtaining method and terminal device

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201910343628.4 2019-04-26
CN201910343628.4A CN110113528B (zh) 2019-04-26 2019-04-26 一种参数获取方法及终端设备

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US17/503,545 Continuation US11769273B2 (en) 2019-04-26 2021-10-18 Parameter obtaining method and terminal device

Publications (1)

Publication Number Publication Date
WO2020216129A1 true WO2020216129A1 (zh) 2020-10-29

Family

ID=67486869

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/085177 WO2020216129A1 (zh) 2019-04-26 2020-04-16 参数获取方法及终端设备

Country Status (8)

Country Link
US (1) US11769273B2 (zh)
EP (1) EP3962060A4 (zh)
JP (1) JP7303900B2 (zh)
CN (1) CN110113528B (zh)
AU (1) AU2020263183B2 (zh)
BR (1) BR112021021120A2 (zh)
CA (1) CA3136987A1 (zh)
WO (1) WO2020216129A1 (zh)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113068154A (zh) * 2021-03-17 2021-07-02 恒大新能源汽车投资控股集团有限公司 车辆信息安全处理方法及装置、系统

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110113528B (zh) 2019-04-26 2021-05-07 维沃移动通信有限公司 一种参数获取方法及终端设备
WO2021077270A1 (zh) * 2019-10-21 2021-04-29 深圳市大疆创新科技有限公司 一种获取目标距离的方法、控制装置及移动平台
CN110830717B (zh) * 2019-11-12 2021-06-25 维沃移动通信有限公司 一种参数值的获取方法及电子设备
CN113565779B (zh) * 2020-04-28 2023-08-11 广东美的环境电器制造有限公司 一种校准方法、装置、风扇及存储介质
CN112607362B (zh) * 2020-12-24 2022-04-26 中建材信息技术股份有限公司 一种基于视频的皮带偏移检测方法
CN115278060B (zh) * 2022-07-01 2024-04-09 北京五八信息技术有限公司 一种数据处理方法、装置、电子设备及存储介质

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7701567B2 (en) * 2008-03-06 2010-04-20 Hong Kong Applied Science & Technology Research Institute Co., Ltd. Optoelectronic distance sensor
CN102597693A (zh) * 2009-11-13 2012-07-18 富士胶片株式会社 测距装置、测距方法、测距程序及测距系统以及拍摄装置
US20140307126A1 (en) * 2013-04-12 2014-10-16 Samsung Electronics Co., Ltd. Electronic apparatus and method of controlling the same
US20170234974A1 (en) * 2014-08-27 2017-08-17 Nikon Vision Co., Ltd. Range finder and optical device
CN109154647A (zh) * 2016-05-11 2019-01-04 三星电子株式会社 距离传感器及由包括距离传感器的设备和系统执行的校准方法
CN109257539A (zh) * 2018-10-15 2019-01-22 昆山丘钛微电子科技有限公司 一种对焦方法、装置、电子设备及介质
CN110113528A (zh) * 2019-04-26 2019-08-09 维沃移动通信有限公司 一种参数获取方法及终端设备

Family Cites Families (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4207980B2 (ja) 2006-06-09 2009-01-14 ソニー株式会社 撮像装置、および撮像装置制御方法、並びにコンピュータ・プログラム
US9335610B2 (en) * 2008-05-19 2016-05-10 Canon Kabushiki Kaisha Image pickup system and lens apparatus
US8866889B2 (en) * 2010-11-03 2014-10-21 Microsoft Corporation In-home depth camera calibration
CN102650691B (zh) * 2011-02-24 2014-01-22 原相科技股份有限公司 具校正功能的测距系统与方法
DE102012103495B8 (de) * 2012-03-29 2014-12-04 Sick Ag Optoelektronische Vorrichtung zur Vermessung von Struktur- oder Objektgrößen und Verfahren zur Kalibrierung
CN102946514B (zh) * 2012-11-08 2014-12-17 广东欧珀移动通信有限公司 移动终端的自拍方法和装置
CN103134489B (zh) * 2013-01-29 2015-12-23 北京凯华信业科贸有限责任公司 基于移动终端进行目标定位的方法
CN203151603U (zh) * 2013-02-06 2013-08-21 上海诺行信息技术有限公司 移动终端的距离传感器校准装置
US9470911B2 (en) * 2013-08-22 2016-10-18 Bespoke, Inc. Method and system to create products
KR101569268B1 (ko) * 2014-01-02 2015-11-13 아이리텍 잉크 얼굴 구성요소 거리를 이용한 홍채인식용 이미지 획득 장치 및 방법
CN103941310B (zh) * 2014-04-09 2017-02-15 苏州佳世达电通有限公司 近距离传感器的校正方法及系统
US10062180B2 (en) * 2014-04-22 2018-08-28 Microsoft Technology Licensing, Llc Depth sensor calibration and per-pixel correction
US9869544B2 (en) * 2014-08-29 2018-01-16 Blackberry Limited Method to determine length and area measurements within a smartphone camera image
CN105100620B (zh) * 2015-07-30 2018-05-25 深圳市永兴元科技股份有限公司 拍摄方法及装置
CN105222788B (zh) * 2015-09-30 2018-07-06 清华大学 基于特征匹配的飞行器航路偏移误差的自校正方法
JP6452585B2 (ja) * 2015-10-01 2019-01-16 株式会社ソニー・インタラクティブエンタテインメント 情報処理装置および位置情報取得方法
CN105245790B (zh) 2015-10-15 2019-04-26 Oppo广东移动通信有限公司 一种补光方法、装置及移动终端
CN105866781B (zh) * 2016-03-24 2020-09-25 联想(北京)有限公司 一种数据处理方法和电子设备
JP6088094B1 (ja) * 2016-06-20 2017-03-01 株式会社Cygames 複合現実環境を作成するためのシステム等
US10560679B2 (en) * 2016-08-30 2020-02-11 Microsoft Technology Licensing, Llc Deformation detection and automatic calibration for a depth imaging system
CN108225278A (zh) * 2017-11-29 2018-06-29 维沃移动通信有限公司 一种测距方法、移动终端

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7701567B2 (en) * 2008-03-06 2010-04-20 Hong Kong Applied Science & Technology Research Institute Co., Ltd. Optoelectronic distance sensor
CN102597693A (zh) * 2009-11-13 2012-07-18 富士胶片株式会社 测距装置、测距方法、测距程序及测距系统以及拍摄装置
US20140307126A1 (en) * 2013-04-12 2014-10-16 Samsung Electronics Co., Ltd. Electronic apparatus and method of controlling the same
US20170234974A1 (en) * 2014-08-27 2017-08-17 Nikon Vision Co., Ltd. Range finder and optical device
CN109154647A (zh) * 2016-05-11 2019-01-04 三星电子株式会社 距离传感器及由包括距离传感器的设备和系统执行的校准方法
CN109257539A (zh) * 2018-10-15 2019-01-22 昆山丘钛微电子科技有限公司 一种对焦方法、装置、电子设备及介质
CN110113528A (zh) * 2019-04-26 2019-08-09 维沃移动通信有限公司 一种参数获取方法及终端设备

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113068154A (zh) * 2021-03-17 2021-07-02 恒大新能源汽车投资控股集团有限公司 车辆信息安全处理方法及装置、系统

Also Published As

Publication number Publication date
AU2020263183A1 (en) 2021-11-11
AU2020263183B2 (en) 2023-01-05
JP7303900B2 (ja) 2023-07-05
US11769273B2 (en) 2023-09-26
CA3136987A1 (en) 2020-10-29
JP2022530144A (ja) 2022-06-27
CN110113528A (zh) 2019-08-09
CN110113528B (zh) 2021-05-07
US20220036588A1 (en) 2022-02-03
BR112021021120A2 (pt) 2021-12-14
EP3962060A1 (en) 2022-03-02
EP3962060A4 (en) 2022-06-22

Similar Documents

Publication Publication Date Title
WO2020216129A1 (zh) 参数获取方法及终端设备
US20220279116A1 (en) Object tracking method and electronic device
CN109461117B (zh) 一种图像处理方法及移动终端
WO2021013009A1 (zh) 拍照方法和终端设备
CN110109593B (zh) 一种截屏方法及终端设备
WO2019206077A1 (zh) 视频通话处理方法及移动终端
WO2021190387A1 (zh) 检测结果输出的方法、电子设备及介质
CN110602389B (zh) 一种显示方法及电子设备
WO2021190390A1 (zh) 调焦的方法、电子设备、存储介质及程序产品
CN109803110B (zh) 一种图像处理方法、终端设备及服务器
WO2021147911A1 (zh) 移动终端、拍摄模式的检测方法及存储介质
WO2021082772A1 (zh) 截屏方法及电子设备
WO2021104232A1 (zh) 显示方法及电子设备
WO2021185254A1 (zh) 内容共享方法及电子设备
CN108881721B (zh) 一种显示方法及终端
CN110457885B (zh) 一种操作方法及电子设备
CN109618055B (zh) 一种位置共享方法及移动终端
CN109005337B (zh) 一种拍照方法及终端
CN108833791B (zh) 一种拍摄方法和装置
CN108093119B (zh) 一种陌生来电号码的标记方法及移动终端
WO2021104265A1 (zh) 电子设备及对焦方法
CN111010496B (zh) 一种图像处理方法及电子设备
CN111031249B (zh) 辅助对焦的方法及电子设备
CN110086916B (zh) 拍照方法及终端
CN110769153B (zh) 一种图像处理方法及电子设备

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20795992

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 3136987

Country of ref document: CA

ENP Entry into the national phase

Ref document number: 2021563406

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

REG Reference to national code

Ref country code: BR

Ref legal event code: B01A

Ref document number: 112021021120

Country of ref document: BR

ENP Entry into the national phase

Ref document number: 2020263183

Country of ref document: AU

Date of ref document: 20200416

Kind code of ref document: A

ENP Entry into the national phase

Ref document number: 2020795992

Country of ref document: EP

Effective date: 20211126

ENP Entry into the national phase

Ref document number: 112021021120

Country of ref document: BR

Kind code of ref document: A2

Effective date: 20211021