WO2022042275A1 - Method, apparatus, electronic device and readable storage medium for measuring distance - Google Patents

Method, apparatus, electronic device and readable storage medium for measuring distance

Info

Publication number
WO2022042275A1
WO2022042275A1, PCT/CN2021/111482, CN2021111482W
Authority
WO
WIPO (PCT)
Prior art keywords
target object
ranging
electronic device
image
distance
Prior art date
Application number
PCT/CN2021/111482
Other languages
English (en)
French (fr)
Inventor
赵杰 (Zhao Jie)
马春晖 (Ma Chunhui)
陈霄汉 (Chen Xiaohan)
黄磊 (Huang Lei)
Original Assignee
华为技术有限公司 (Huawei Technologies Co., Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huawei Technologies Co., Ltd. (华为技术有限公司)
Publication of WO2022042275A1

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C 3/00 Measuring distances in line of sight; Optical rangefinders
    • G01C 3/02 Details
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/60 Control of cameras or camera modules
    • H04N 23/67 Focus control based on electronic image sensor signals
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02D CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D 30/00 Reducing energy consumption in communication networks
    • Y02D 30/70 Reducing energy consumption in wireless communication networks

Definitions

  • the present application belongs to the technical field of data collection and, in particular, relates to a method, apparatus, electronic device, and readable storage medium for measuring distance.
  • the device can also be controlled through a non-contact method.
  • in a non-contact operation, it is often necessary to determine the distance between the electronic device and the user performing the control; therefore, the accuracy of the distance measurement directly affects the accuracy of the non-contact interaction behavior described above.
  • the existing distance measurement technology cannot simultaneously achieve high measurement accuracy and a low cost for the ranging module, which hinders the adoption of non-contact interaction technology.
  • the embodiments of the present application provide a method, device, electronic device, and readable storage medium for measuring distance, which can reduce the measurement cost while improving the measurement accuracy.
  • an embodiment of the present application provides a method for measuring distance, which is applied to an electronic device, including:
  • a distance value between the electronic device and the target object is determined based on the ranging reference parameter.
  • Implementing the embodiments of the present application has the following beneficial effects: by acquiring a ranging image containing a target object (the user performing a non-contact interactive behavior) and extracting a ranging reference parameter from that image, the distance value between the electronic device and the target object can be determined from the ranging reference parameter, so the electronic device only needs to include a camera module.
  • the electronic device does not need to be fitted with an optical-pulse transceiver, a binocular camera, or similar modules, thus greatly reducing the cost of the electronic device;
  • the distance measurement is performed by determining one or more distance measurement reference parameters, instead of directly obtaining the distance value, so that the accuracy of the distance measurement can be improved.
  • the determining a ranging reference parameter according to the ranging image includes:
  • the ranging reference parameter is obtained according to the ratio between the pixel height and the actual height associated with the target object.
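The pixel-height-to-actual-height ratio maps directly onto the pinhole-camera model. The patent does not disclose a concrete formula, so the following is only a sketch; the focal length in pixels is an assumed calibration input, not something stated in the claims:

```python
def estimate_distance(pixel_height, actual_height_m, focal_length_px):
    """Pinhole-camera distance estimate: an object of real height H metres
    that spans h pixels in the image sits roughly f * H / h metres away,
    where f is the focal length expressed in pixels."""
    if pixel_height <= 0:
        raise ValueError("pixel_height must be positive")
    return focal_length_px * actual_height_m / pixel_height
```

For example, a 1.75 m user spanning 350 pixels with an assumed 700-pixel focal length would come out at about 3.5 m.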
  • the method before obtaining the ranging reference parameter according to the ratio between the pixel height and the actual height associated with the target object, the method further includes:
  • the actual height of the target object is extracted from the user information.
  • the method before obtaining the ranging reference parameter according to the ratio between the pixel height and the actual height associated with the target object, the method further includes:
  • the actual height of the target object is obtained from user information of the user account.
  • the determining a ranging reference parameter according to the ranging image includes:
  • the object posture is the first posture, acquiring the pixel length of the preset marker associated with the target object in the ranging image;
  • the ranging reference parameter is obtained according to the actual length of the preset marker and the pixel length.
  • the method further includes:
  • the object posture is the second posture, identifying the pixel height of the target object in the ranging image;
  • the ranging reference parameter is obtained according to the ratio between the pixel height and the actual height associated with the target object.
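The two branches above (marker length in the first posture, body height in the second) reduce to the same pixels-per-metre ratio; only the inputs differ. A minimal sketch, with posture labels that are illustrative rather than taken from the patent:

```python
def ranging_reference(posture, measured_px, known_size_m):
    """Return a pixels-per-metre ranging reference parameter.

    posture      -- "first" (use a marker of known size, e.g. a yoga mat)
                    or "second" (use the target object's body height)
    measured_px  -- pixel length of the marker, or pixel height of the user
    known_size_m -- actual marker length, or actual user height, in metres
    """
    if posture not in ("first", "second"):
        raise ValueError(f"unknown posture: {posture}")
    # Both branches compute the same ratio; the posture only decides
    # which measurement and which known size are fed in.
    return measured_px / known_size_m
```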
  • the preset marker is a yoga mat; and the identifying the object posture of the target object in the ranging image includes:
  • the operation of recognizing the object posture of the target object in the ranging image is performed.
  • the number of the ranging images is M; the M is a positive integer greater than 1; and the determining a ranging reference parameter according to the ranging images includes:
  • the ranging reference parameter is obtained.
  • the obtaining of the second trajectory based on the motion parameters fed back by the wearable device includes:
  • the second trajectory is generated based on the motion parameters corresponding to a plurality of the feedback cycles.
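One plausible way to turn per-cycle motion parameters into a trajectory is simple dead reckoning (double integration of acceleration). This is a sketch under the assumption that each feedback cycle delivers one 2-D acceleration sample; the patent does not specify the motion parameters' form:

```python
def build_trajectory(accel_samples, dt):
    """Integrate (ax, ay) acceleration samples, one per feedback cycle of
    duration dt seconds, into a list of (x, y) positions, starting from
    rest at the origin."""
    vx = vy = x = y = 0.0
    trajectory = [(0.0, 0.0)]
    for ax, ay in accel_samples:
        vx += ax * dt          # velocity update for this cycle
        vy += ay * dt
        x += vx * dt           # position update using the new velocity
        y += vy * dt
        trajectory.append((x, y))
    return trajectory
```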
  • the acquiring a ranging image including the target object includes:
  • the determining a distance value between the electronic device and the target object based on the ranging reference parameter includes:
  • the shooting distance associated with the digital focal length is queried, and the shooting distance is identified as a distance value between the electronic device and the target object.
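Querying the shooting distance associated with a digital focal length amounts to a calibration-table lookup. The table values below are purely illustrative placeholders, not calibration data from the patent:

```python
# Hypothetical calibration table: digital focal length -> shooting distance (m).
FOCUS_TO_DISTANCE = {400: 0.5, 380: 1.0, 360: 2.0, 350: 3.0}

def distance_from_focus(digital_focal_length):
    """Look up the shooting distance for the nearest calibrated focal length."""
    nearest = min(FOCUS_TO_DISTANCE, key=lambda k: abs(k - digital_focal_length))
    return FOCUS_TO_DISTANCE[nearest]
```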
  • when the target application is a non-contact interaction type application, the acquiring of a ranging image containing the target object is performed, so that the non-contact interaction operation can be carried out using the distance value.
  • an embodiment of the present application provides a device for measuring distance, including:
  • a distance value determining unit configured to determine a distance value between the electronic device and the target object based on the ranging reference parameter.
  • embodiments of the present application provide an electronic device including a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor, when executing the computer program, implements the method for measuring distance according to any one of the above first aspects.
  • an embodiment of the present application provides a computer-readable storage medium in which a computer program is stored, wherein, when the computer program is executed by a processor, the method for measuring distance according to any one of the above-mentioned first aspects is implemented.
  • an embodiment of the present application provides a computer program product that, when the computer program product runs on an electronic device, enables the electronic device to execute the method for measuring a distance according to any one of the first aspects above.
  • an embodiment of the present application provides a chip system, including a processor, where the processor is coupled to a memory and executes a computer program stored in the memory, so as to implement the method for measuring distance according to any one of the first aspects above.
  • FIG. 1 is a schematic structural diagram of an electronic device provided by an embodiment of the present application.
  • FIG. 2 is a block diagram of a software structure of an electronic device according to an embodiment of the present application.
  • FIG. 3 is a schematic diagram of an intelligent indoor fitness scene provided by an embodiment of the present application.
  • FIG. 4 is a schematic diagram of the principle of laser-based ranging provided by an embodiment of the present application.
  • FIG. 5 is a schematic diagram of a ranging principle based on a binocular camera provided by an embodiment of the present application
  • FIG. 6 is a schematic diagram of a ranging principle of an imaging module based on depth information provided by an embodiment of the present application
  • FIG. 7 is a flowchart of an implementation of a method for measuring distance provided by an embodiment of the present application.
  • FIG. 8 is a schematic diagram of a screen automatically lighting up according to an embodiment of the present application.
  • FIG. 10 is a schematic diagram of determining a ranging reference parameter based on an identifier provided by an embodiment of the present application
  • FIG. 11 is a schematic diagram of identifying the actual size of a photographed object provided by an embodiment of the present application.
  • FIG. 12 is a schematic diagram of a prompt for multi-person monitoring provided by an embodiment of the present application.
  • FIG. 13 is a schematic diagram of determining a distance value between any two target objects in a multi-person monitoring scenario provided by an embodiment of the present application;
  • FIG. 14 is a flow chart for realizing the determination of the ranging reference parameter provided by the first embodiment of the present application.
  • FIG. 18 is a schematic diagram of obtaining the pixel length when the preset marker is a yoga mat, provided by an embodiment of the present application;
  • FIG. 21 is a flow chart for realizing the determination of the ranging reference parameter provided by the fourth embodiment of the present application.
  • FIG. 24 is a structural block diagram of an apparatus for measuring distance provided by an embodiment of the present application.
  • FIG. 25 is a schematic diagram of an electronic device provided by an embodiment of the present application.
  • references in this specification to "one embodiment” or “some embodiments” and the like mean that a particular feature, structure or characteristic described in connection with the embodiment is included in one or more embodiments of the present application.
  • appearances of the phrases "in one embodiment," "in some embodiments," "in other embodiments," etc. in various places in this specification do not necessarily all refer to the same embodiment, but mean "one or more but not all embodiments" unless specifically emphasized otherwise.
  • the terms "comprising," "including," "having," and their variations mean "including but not limited to" unless specifically emphasized otherwise.
  • the electronic device may be a station (STATION, ST) in a WLAN, a cellular phone, a cordless phone, a Session Initiation Protocol (SIP) phone, a Wireless Local Loop (WLL) station, a Personal Digital Assistant (PDA) device, a handheld device with wireless communication capability, a computing device or other processing device connected to a wireless modem, a computer, a laptop computer, a handheld communication device, a handheld computing device, and/or other devices for communicating on wireless systems and next-generation communication systems, for example, a mobile terminal in a 5G network or a mobile terminal in a future evolved Public Land Mobile Network (PLMN), etc.
  • when the electronic device is a wearable device, the wearable device may also be a general term for devices developed by applying wearable technology to the intelligent design of everyday wear, such as glasses, gloves, watches, clothing, and shoes.
  • a wearable device is a portable device that is worn directly on the body or integrated into the user's clothes or accessories, and collects the user's biometric data by attaching to the user's body. A wearable device is not only a piece of hardware; it also realizes powerful functions through software support, data interaction, and cloud interaction.
  • wearable smart devices include full-featured, large-sized devices that can realize complete or partial functions without relying on a smartphone, such as smart watches or smart glasses, as well as devices that focus on only a certain type of application function and need to be used together with other devices such as smartphones, for example, various smart bracelets and smart jewelry with touch screens.
  • FIG. 1 shows a schematic structural diagram of an electronic device 100 .
  • the electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, a headphone jack 170D, a sensor module 180, buttons 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a subscriber identification module (SIM) card interface 195, and so on.
  • the sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, etc.
  • the structures illustrated in the embodiments of the present invention do not constitute a specific limitation on the electronic device 100 .
  • the electronic device 100 may include more or less components than shown, or combine some components, or separate some components, or arrange different components.
  • the illustrated components may be implemented in hardware, software, or a combination of software and hardware.
  • the processor 110 may include one or more processing units; for example, the processor 110 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU), etc. Different processing units may be independent devices or may be integrated into one or more processors.
  • the controller can generate an operation control signal according to the instruction operation code and timing signal, and complete the control of fetching and executing instructions.
  • a memory may also be provided in the processor 110 for storing instructions and data.
  • the memory in the processor 110 is a cache memory. This memory may hold instructions or data that the processor 110 has just used or uses cyclically. If the processor 110 needs the instruction or data again, it can be fetched directly from this memory, avoiding repeated accesses and reducing the waiting time of the processor 110, thereby improving system efficiency.
  • the processor 110 may include one or more interfaces.
  • the interface may include an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (SIM) interface, and/or a universal serial bus (USB) interface, etc.
  • the I2C interface is a bidirectional synchronous serial bus that includes a serial data line (SDA) and a serial clock line (SCL).
  • the processor 110 may contain multiple sets of I2C buses.
  • the processor 110 can be respectively coupled to the touch sensor 180K, the charger, the flash, the camera 193 and the like through different I2C bus interfaces.
  • the processor 110 may couple the touch sensor 180K through the I2C interface, so that the processor 110 and the touch sensor 180K communicate with each other through the I2C bus interface, so as to realize the touch function of the electronic device 100 .
  • the I2S interface can be used for audio communication.
  • the processor 110 may contain multiple sets of I2S buses.
  • the processor 110 may be coupled with the audio module 170 through an I2S bus to implement communication between the processor 110 and the audio module 170 .
  • the audio module 170 can transmit audio signals to the wireless communication module 160 through the I2S interface, so as to realize the function of answering calls through a Bluetooth headset.
  • the PCM interface can also be used for audio communication, to sample, quantize, and encode analog signals.
  • the audio module 170 and the wireless communication module 160 may be coupled through a PCM bus interface.
  • the audio module 170 can also transmit audio signals to the wireless communication module 160 through the PCM interface, so as to realize the function of answering calls through the Bluetooth headset. Both the I2S interface and the PCM interface can be used for audio communication.
  • the UART interface is a universal serial data bus used for asynchronous communication.
  • the bus may be a bidirectional communication bus. It converts the data to be transmitted between serial communication and parallel communication.
  • a UART interface is typically used to connect the processor 110 with the wireless communication module 160 .
  • the processor 110 communicates with the Bluetooth module in the wireless communication module 160 through the UART interface to implement the Bluetooth function.
  • the audio module 170 can transmit audio signals to the wireless communication module 160 through the UART interface, so as to realize the function of playing music through the Bluetooth headset.
  • the MIPI interface can be used to connect the processor 110 with peripheral devices such as the display screen 194 and the camera 193 .
  • MIPI interfaces include camera serial interface (CSI), display serial interface (DSI), etc.
  • the processor 110 communicates with the camera 193 through a CSI interface to implement the photographing function of the electronic device 100.
  • the processor 110 communicates with the display screen 194 through the DSI interface to implement the display function of the electronic device 100 .
  • the GPIO interface can be configured by software.
  • the GPIO interface can be configured as a control signal or as a data signal.
  • the GPIO interface may be used to connect the processor 110 with the camera 193, the display screen 194, the wireless communication module 160, the audio module 170, the sensor module 180, and the like.
  • the GPIO interface can also be configured as I2C interface, I2S interface, UART interface, MIPI interface, etc.
  • the USB interface 130 is an interface that conforms to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type C interface, and the like.
  • the USB interface 130 can be used to connect a charger to charge the electronic device 100, and can also be used to transmit data between the electronic device 100 and peripheral devices. It can also be used to connect headphones to play audio through the headphones.
  • the interface can also be used to connect other electronic devices, such as AR devices.
  • the interface connection relationship between the modules illustrated in the embodiment of the present invention is only a schematic illustration, and does not constitute a structural limitation of the electronic device 100 .
  • the electronic device 100 may also adopt different interface connection manners in the foregoing embodiments, or a combination of multiple interface connection manners.
  • the charging management module 140 is used to receive charging input from the charger.
  • the charger may be a wireless charger or a wired charger.
  • the charging management module 140 may receive charging input from the wired charger through the USB interface 130 .
  • the charging management module 140 may receive wireless charging input through a wireless charging coil of the electronic device 100 . While the charging management module 140 charges the battery 142 , it can also supply power to the electronic device through the power management module 141 .
  • the power management module 141 is used for connecting the battery 142 , the charging management module 140 and the processor 110 .
  • the power management module 141 receives input from the battery 142 and/or the charging management module 140, and supplies power to the processor 110, the internal memory 121, the display screen 194, the camera 193, and the wireless communication module 160.
  • the power management module 141 can also be used to monitor parameters such as battery capacity, battery cycle times, battery health status (leakage, impedance).
  • the power management module 141 may also be provided in the processor 110 .
  • the power management module 141 and the charging management module 140 may also be provided in the same device.
  • the wireless communication function of the electronic device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, the modulation and demodulation processor, the baseband processor, and the like.
  • Antenna 1 and Antenna 2 are used to transmit and receive electromagnetic wave signals.
  • Each antenna in electronic device 100 may be used to cover a single or multiple communication frequency bands. Different antennas can also be reused to improve antenna utilization.
  • the antenna 1 can be multiplexed as a diversity antenna of the wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
  • the mobile communication module 150 may provide wireless communication solutions including 2G/3G/4G/5G etc. applied on the electronic device 100 .
  • the mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (LNA) and the like.
  • the mobile communication module 150 can receive electromagnetic waves from the antenna 1, filter and amplify the received electromagnetic waves, and transmit them to the modulation and demodulation processor for demodulation.
  • the mobile communication module 150 can also amplify the signal modulated by the modulation and demodulation processor, and then turn it into an electromagnetic wave for radiation through the antenna 1 .
  • at least part of the functional modules of the mobile communication module 150 may be provided in the processor 110 .
  • at least part of the functional modules of the mobile communication module 150 may be provided in the same device as at least part of the modules of the processor 110 .
  • the modem processor may include a modulator and a demodulator.
  • the modulator is used to modulate the low-frequency baseband signal to be sent into a medium-high-frequency signal.
  • the demodulator is used to demodulate the received electromagnetic wave signal into a low frequency baseband signal. Then the demodulator transmits the demodulated low-frequency baseband signal to the baseband processor for processing.
  • the low frequency baseband signal is processed by the baseband processor and passed to the application processor.
  • the application processor outputs sound signals through audio devices (not limited to the speaker 170A, the receiver 170B, etc.), or displays images or videos through the display screen 194 .
  • the modem processor may be a separate device.
  • the modulation and demodulation processor may be independent of the processor 110, and be provided in the same device as the mobile communication module 150 or other functional modules.
  • the wireless communication module 160 can provide wireless communication solutions applied on the electronic device 100, including wireless local area network (WLAN) (such as wireless fidelity (Wi-Fi) networks), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), infrared (IR) technology, and so on.
  • the wireless communication module 160 may be one or more devices integrating at least one communication processing module.
  • the wireless communication module 160 receives electromagnetic waves via the antenna 2 , frequency modulates and filters the electromagnetic wave signals, and sends the processed signals to the processor 110 .
  • the wireless communication module 160 can also receive the signal to be sent from the processor 110 , perform frequency modulation on it, amplify the signal, and convert it into electromagnetic waves for radiation through the antenna 2 .
  • the antenna 1 of the electronic device 100 is coupled with the mobile communication module 150, and the antenna 2 is coupled with the wireless communication module 160, so that the electronic device 100 can communicate with the network and other devices through wireless communication technology.
  • the wireless communication technology may include Global System for Mobile Communications (GSM), General Packet Radio Service (GPRS), Code Division Multiple Access (CDMA), Wideband Code Division Multiple Access (WCDMA), Time-Division Synchronous Code Division Multiple Access (TD-SCDMA), Long Term Evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technology, etc.
  • the GNSS may include the Global Positioning System (GPS), the Global Navigation Satellite System (GLONASS), the BeiDou Navigation Satellite System (BDS), the Quasi-Zenith Satellite System (QZSS), and/or Satellite Based Augmentation Systems (SBAS).
  • the electronic device 100 implements a display function through a GPU, a display screen 194, an application processor, and the like.
  • the GPU is a microprocessor for image processing, and is connected to the display screen 194 and the application processor.
  • the GPU is used to perform mathematical and geometric calculations for graphics rendering.
  • Processor 110 may include one or more GPUs that execute program instructions to generate or alter display information.
  • Display screen 194 is used to display images, videos, and the like.
  • Display screen 194 includes a display panel.
  • the display panel can be a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a MiniLED, a MicroLED, a Micro-OLED, a quantum dot light-emitting diode (QLED), and so on.
  • the electronic device 100 may include one or N display screens 194 , where N is a positive integer greater than one.
  • Display 194 may include a touch panel as well as other input devices.
  • the electronic device 100 may implement a shooting function through an ISP, a camera 193, a video codec, a GPU, a display screen 194, an application processor, and the like.
  • the ISP is used to process the data fed back by the camera 193 .
  • when the shutter is opened, light is transmitted through the lens to the camera's photosensitive element, which converts the optical signal into an electrical signal and transmits it to the ISP for processing, converting it into an image visible to the naked eye.
  • ISP can also perform algorithm optimization on image noise, brightness, and skin tone.
  • ISP can also optimize the exposure, color temperature and other parameters of the shooting scene.
  • the ISP may be provided in the camera 193 .
  • Camera 193 is used to capture still images or video.
  • the object is projected through the lens to generate an optical image onto the photosensitive element.
  • the photosensitive element may be a charge coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor.
  • the photosensitive element converts the optical signal into an electrical signal, and then transmits the electrical signal to the ISP to convert it into a digital image signal.
  • the ISP outputs the digital image signal to the DSP for processing.
  • DSP converts digital image signals into standard RGB, YUV and other formats of image signals.
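A representative instance of that format conversion is turning one full-range YUV sample into RGB. The coefficients below are the standard BT.601 values; the DSP's actual pipeline is not disclosed in this document:

```python
def yuv_to_rgb(y, u, v):
    """Convert one full-range BT.601 YUV sample (each component 0-255)
    to an (R, G, B) tuple, clamping to the valid 8-bit range."""
    r = y + 1.402 * (v - 128)
    g = y - 0.344136 * (u - 128) - 0.714136 * (v - 128)
    b = y + 1.772 * (u - 128)
    clamp = lambda c: max(0, min(255, round(c)))
    return clamp(r), clamp(g), clamp(b)
```

Neutral grey (Y=128, U=V=128) maps to itself, which is a quick sanity check for the coefficients.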
  • the electronic device 100 may include 1 or N cameras 193 , where N is a positive integer greater than 1.
  • a digital signal processor is used to process digital signals; in addition to digital image signals, it can also process other digital signals. For example, when the electronic device 100 selects a frequency point, the digital signal processor is used to perform a Fourier transform on the frequency-point energy, and the like.
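The frequency-point example can be sketched with a discrete Fourier transform: compute the per-bin spectral energy and pick the dominant frequency. The sample rate and test tone below are illustrative:

```python
import numpy as np

def dominant_frequency(samples, sample_rate):
    """Return the frequency (Hz) of the bin with the highest spectral energy."""
    spectrum = np.fft.rfft(samples)          # one-sided DFT of a real signal
    energy = np.abs(spectrum) ** 2           # per-bin energy
    freqs = np.fft.rfftfreq(len(samples), 1.0 / sample_rate)
    return freqs[np.argmax(energy)]

# A 50 Hz tone sampled at 1 kHz for one second (1 Hz bin resolution).
fs = 1000
t = np.arange(fs) / fs
tone = np.sin(2 * np.pi * 50 * t)
```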
  • Video codecs are used to compress or decompress digital video.
  • the electronic device 100 may support one or more video codecs.
  • the electronic device 100 can play or record videos of various encoding formats, such as: Moving Picture Experts Group (moving picture experts group, MPEG) 1, MPEG2, MPEG3, MPEG4 and so on.
  • the NPU is a neural-network (NN) computing processor.
  • Applications such as intelligent cognition of the electronic device 100 can be implemented through the NPU, such as image recognition, face recognition, speech recognition, text understanding, and the like.
  • the external memory interface 120 can be used to connect an external memory card, such as a Micro SD card, to expand the storage capacity of the electronic device 100 .
  • the external memory card communicates with the processor 110 through the external memory interface 120 to realize the data storage function, for example, to save files such as music and videos in the external memory card.
  • Internal memory 121 may be used to store computer executable program code, which includes instructions.
  • the internal memory 121 may include a storage program area and a storage data area.
  • the storage program area can store an operating system, an application program required for at least one function (such as a sound playback function, an image playback function, etc.), and the like.
  • the storage data area may store data (such as audio data, phone book, etc.) created during the use of the electronic device 100 and the like.
  • the internal memory 121 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, universal flash storage (UFS), and the like.
  • the processor 110 executes various functional applications and data processing of the electronic device 100 by executing instructions stored in the internal memory 121 and/or instructions stored in a memory provided in the processor.
  • the electronic device 100 may implement audio functions through an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, an application processor, and the like. Such as music playback, recording, etc.
  • the audio module 170 is used for converting digital audio information into analog audio signal output, and also for converting analog audio input into digital audio signal. Audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be provided in the processor 110 , or some functional modules of the audio module 170 may be provided in the processor 110 .
  • Speaker 170A, also referred to as a "loudspeaker", is used to convert audio electrical signals into sound signals.
  • the electronic device 100 can listen to music through the speaker 170A, or listen to a hands-free call.
  • The microphone 170C, also called a "mic", is used to convert sound signals into electrical signals.
  • The user can input a sound signal into the microphone 170C by speaking close to it.
  • the electronic device 100 may be provided with at least one microphone 170C. In other embodiments, the electronic device 100 may be provided with two microphones 170C, which can implement a noise reduction function in addition to collecting sound signals. In other embodiments, the electronic device 100 may further be provided with three, four or more microphones 170C to collect sound signals, reduce noise, identify sound sources, and implement directional recording functions.
  • the pressure sensor 180A is used to sense pressure signals, and can convert the pressure signals into electrical signals.
  • the pressure sensor 180A may be provided on the display screen 194 .
  • the capacitive pressure sensor may consist of at least two parallel plates of conductive material. When a force is applied to the pressure sensor 180A, the capacitance between the electrodes changes.
  • the electronic device 100 determines the intensity of the pressure according to the change in capacitance. When a touch operation acts on the display screen 194, the electronic device 100 detects the intensity of the touch operation according to the pressure sensor 180A.
  • the electronic device 100 may also calculate the touched position according to the detection signal of the pressure sensor 180A.
  • touch operations acting on the same touch position but with different touch operation intensities may correspond to different operation instructions. For example, when a touch operation whose intensity is less than the first pressure threshold acts on the short message application icon, the instruction for viewing the short message is executed. When a touch operation with a touch operation intensity greater than or equal to the first pressure threshold acts on the short message application icon, the instruction to create a new short message is executed.
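The threshold-based dispatch described above can be sketched in a few lines. This is a minimal illustration, not the patent's implementation: the threshold value, the icon names, and the instruction names are all assumed for the example.

```python
# Hypothetical sketch of pressure-threshold touch dispatch.
# FIRST_PRESSURE_THRESHOLD and the action names are illustrative only.
FIRST_PRESSURE_THRESHOLD = 0.5  # normalized pressure units (assumed)

def dispatch_touch(app_icon: str, pressure: float) -> str:
    """Map a touch on an icon to an instruction based on touch intensity."""
    if app_icon == "short_message":
        if pressure < FIRST_PRESSURE_THRESHOLD:
            return "view_short_message"       # light press: view messages
        return "create_new_short_message"     # firm press: new message
    return "open_" + app_icon                 # default: open the app

print(dispatch_touch("short_message", 0.2))  # light press -> view
print(dispatch_touch("short_message", 0.8))  # firm press -> new message
```

The same position thus maps to different instructions purely through the measured intensity, matching the two-threshold behavior in the text.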
  • the gyro sensor 180B may be used to determine the motion attitude of the electronic device 100 .
  • the angular velocities of the electronic device 100 about three axes (ie, the x, y, and z axes) may be determined by the gyro sensor 180B.
  • the gyro sensor 180B can be used for image stabilization.
  • the gyro sensor 180B detects the shaking angle of the electronic device 100, calculates the distance that the lens module needs to compensate for according to the angle, and lets the lens counteract the shaking of the electronic device 100 through reverse motion, thereby achieving anti-shake.
  • the gyro sensor 180B can also be used for navigation and somatosensory game scenarios.
  • the air pressure sensor 180C is used to measure air pressure.
  • the electronic device 100 calculates the altitude through the air pressure value measured by the air pressure sensor 180C to assist in positioning and navigation.
  • the magnetic sensor 180D includes a Hall sensor.
  • the electronic device 100 can detect the opening and closing of the flip holster using the magnetic sensor 180D.
  • the electronic device 100 can detect the opening and closing of the flip according to the magnetic sensor 180D. Further, characteristics such as automatic unlocking of the flip cover can be set according to the detected opening and closing state of the holster or flip cover.
  • the electronic device 100 can measure the distance through infrared or laser. In some embodiments, when shooting a scene, the electronic device 100 can use the distance sensor 180F to measure the distance to achieve fast focusing.
  • Proximity light sensor 180G may include, for example, light emitting diodes (LEDs) and light detectors, such as photodiodes.
  • the light emitting diodes may be infrared light emitting diodes.
  • the electronic device 100 emits infrared light to the outside through the light emitting diode.
  • Electronic device 100 uses photodiodes to detect infrared reflected light from nearby objects. When sufficient reflected light is detected, it can be determined that there is an object near the electronic device 100 . When insufficient reflected light is detected, the electronic device 100 may determine that there is no object near the electronic device 100 .
  • the electronic device 100 can use the proximity light sensor 180G to detect that the user holds the electronic device 100 close to the ear to talk, so as to automatically turn off the screen to save power.
  • Proximity light sensor 180G can also be used in holster mode and pocket mode to automatically unlock and lock the screen.
  • the ambient light sensor 180L is used to sense ambient light brightness.
  • the electronic device 100 can adaptively adjust the brightness of the display screen 194 according to the perceived ambient light brightness.
  • the ambient light sensor 180L can also be used to automatically adjust the white balance when taking pictures.
  • the ambient light sensor 180L can also cooperate with the proximity light sensor 180G to detect whether the electronic device 100 is in a pocket, so as to prevent accidental touch.
  • the fingerprint sensor 180H is used to collect fingerprints.
  • the electronic device 100 can use the collected fingerprint characteristics to realize fingerprint unlocking, accessing application locks, taking pictures with fingerprints, answering incoming calls with fingerprints, and the like.
  • the temperature sensor 180J is used to detect the temperature.
  • the electronic device 100 uses the temperature detected by the temperature sensor 180J to execute a temperature processing strategy. For example, when the temperature reported by the temperature sensor 180J exceeds a threshold value, the electronic device 100 reduces the performance of the processor located near the temperature sensor 180J in order to reduce power consumption and implement thermal protection.
  • the electronic device 100 when the temperature is lower than another threshold, the electronic device 100 heats the battery 142 to avoid abnormal shutdown of the electronic device 100 caused by the low temperature.
  • the electronic device 100 boosts the output voltage of the battery 142 to avoid abnormal shutdown caused by low temperature.
  • the keys 190 include a power key, a volume key, and the like. Keys 190 may be mechanical keys or touch keys.
  • the electronic device 100 may receive key inputs and generate key signal inputs related to user settings and function control of the electronic device 100 .
  • Motor 191 can generate vibrating cues.
  • the motor 191 can be used for vibrating alerts for incoming calls, and can also be used for touch vibration feedback.
  • touch operations acting on different applications can correspond to different vibration feedback effects.
  • the motor 191 can also correspond to different vibration feedback effects for touch operations on different areas of the display screen 194 .
  • Different application scenarios for example: time reminder, receiving information, alarm clock, games, etc.
  • the touch vibration feedback effect can also support customization.
  • the SIM card interface 195 is used to connect a SIM card.
  • the SIM card can be brought into contact with or separated from the electronic device 100 by inserting it into, or pulling it out of, the SIM card interface 195.
  • the electronic device 100 may support 1 or N SIM card interfaces, where N is a positive integer greater than 1.
  • the SIM card interface 195 can support Nano SIM card, Micro SIM card, SIM card and so on. Multiple cards can be inserted into the same SIM card interface 195 at the same time. The types of the plurality of cards may be the same or different.
  • the SIM card interface 195 can also be compatible with different types of SIM cards.
  • the SIM card interface 195 is also compatible with external memory cards.
  • the electronic device 100 interacts with the network through the SIM card to implement functions such as call and data communication.
  • the electronic device 100 employs an eSIM, ie: an embedded SIM card.
  • the eSIM card can be embedded in the electronic device 100 and cannot be separated from the electronic device 100 .
  • the software system of the electronic device 100 may adopt a layered architecture, an event-driven architecture, a microkernel architecture, a microservice architecture, or a cloud architecture.
  • the embodiment of the present invention takes an Android system with a layered architecture as an example to illustrate the software structure of the electronic device 100.
  • FIG. 2 is a software structural block diagram of the electronic device 100 according to an embodiment of the present invention.
  • the layered architecture divides the software into several layers, and each layer has a clear role and division of labor. Layers communicate with each other through software interfaces.
  • the Android system is divided into four layers, which are, from top to bottom, an application layer, an application framework layer, an Android runtime (Android runtime) and a system library, and a kernel layer.
  • the application package can include applications such as camera, gallery, calendar, call, map, navigation, WLAN, Bluetooth, music, video, short message and so on.
  • the application framework layer may include window managers, content providers, view systems, telephony managers, resource managers, notification managers, and the like.
  • a window manager is used to manage window programs.
  • the window manager can get the size of the display screen, determine whether there is a status bar, lock the screen, take screenshots, etc.
  • the resource manager provides various resources for the application, such as localization strings, icons, pictures, layout files, video files and so on.
  • Android Runtime includes core libraries and a virtual machine. Android runtime is responsible for scheduling and management of the Android system.
  • the core library consists of two parts: one is the functions that the Java language needs to call, and the other is the core library of Android.
  • a system library can include multiple functional modules. For example: surface manager (surface manager), media library (Media Libraries), 3D graphics processing library (eg: OpenGL ES), 2D graphics engine (eg: SGL), etc.
  • the Surface Manager is used to manage the display subsystem and provides a fusion of 2D and 3D layers for multiple applications.
  • the media library supports playback and recording of a variety of commonly used audio and video formats, as well as still image files.
  • the media library can support a variety of audio and video encoding formats, such as: MPEG4, H.264, MP3, AAC, AMR, JPG, PNG, etc.
  • the 3D graphics processing library is used to implement 3D graphics drawing, image rendering, compositing, and layer processing.
  • 2D graphics engine is a drawing engine for 2D drawing.
  • the above non-contact interaction scenario may be an intelligent indoor fitness scenario.
  • FIG. 3 shows a schematic diagram of a smart indoor fitness scene provided by an embodiment of the present application.
  • the electronic device can obtain video data including the user through the camera module and, by analyzing each video image frame in the video data, determine parameters such as the user's height, the distance value between the user and the electronic device, and the movement speed, thereby building a perceptible indoor exercise environment, providing users with accurate exercise guidance, and improving exercise efficiency.
  • Method 1: An electronic device for responding to non-contact interactions may be configured with a laser receiver as well as a laser transmitter.
  • When the distance needs to be measured, the laser transmitter emits a laser beam, the laser receiver receives the laser reflected by the obstacle, and a timer measures the time from emission to reception, so that the distance between the obstacle and the electronic device can be determined.
  • Fig. 4 shows a schematic diagram of the principle of laser-based ranging provided by an embodiment of the present application.
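The laser ranging principle above reduces to a single formula: the beam travels to the obstacle and back, so distance = c · t / 2. A minimal sketch (the timer resolution and example round-trip time are assumed):

```python
# Time-of-flight ranging: the laser covers the distance twice (out and back),
# so the one-way distance is half the round-trip path length.
SPEED_OF_LIGHT = 299_792_458.0  # m/s

def tof_distance(round_trip_seconds: float) -> float:
    """Distance to the reflecting obstacle from the measured round-trip time."""
    return SPEED_OF_LIGHT * round_trip_seconds / 2.0

# A 20 ns round trip corresponds to roughly 3 m.
print(round(tof_distance(20e-9), 3))
```

The nanosecond-scale timing this requires is why laser ranging needs dedicated hardware, which motivates the camera-only approach of the present application.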
  • In this method, the poses between the cameras corresponding to the preset internal parameters need to be exactly the same as the poses of the binocular cameras in actual use, so high accuracy is required of the camera poses and the calculation process is complicated. If the pose of the binocular camera is inconsistent with the pose associated with the internal parameters, the accuracy of distance measurement is greatly reduced. Moreover, for shooting scenes with weak texture, the outline information of the shooting content is weak, the visual disparity information cannot be accurately identified, the accuracy is low, and the requirements for ambient light are high.
  • the electronic device can be configured with an imaging module that can obtain depth information.
  • the imaging module includes a projector for projecting structured light and a camera module for receiving structured light.
  • the corresponding structured light pattern is obtained after the structured light is reflected by the surface of the user or furniture.
  • the camera module collects the above structured light image and decodes the structured light pattern according to the principle of triangulation to generate the depth information of the current environment.
  • Alternatively, the depth information can be determined from the round-trip time (that is, the phase difference) of each light pulse in the structured light, which gives the distance to the reflector; the electronic device can then obtain the distance value between the electronic device and the user based on the above depth information.
  • FIG. 6 shows a schematic diagram of a ranging principle of an imaging module based on depth information provided by an embodiment of the present application.
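The phase-difference variant mentioned above can be made concrete with the standard indirect time-of-flight relation for a continuously modulated light source, distance = c · Δφ / (4π · f). This is one common formulation, not necessarily the exact algorithm of the imaging module; the modulation frequency and phase value below are assumed for illustration.

```python
import math

# Indirect time-of-flight: a phase shift of the modulated light encodes the
# round-trip delay; dividing by 4*pi*f (not 2*pi*f) accounts for the
# out-and-back path.
SPEED_OF_LIGHT = 299_792_458.0  # m/s

def phase_to_distance(phase_rad: float, modulation_hz: float) -> float:
    """Distance to the reflector from the measured phase shift."""
    return SPEED_OF_LIGHT * phase_rad / (4.0 * math.pi * modulation_hz)

# A pi/2 phase shift at 20 MHz modulation is about 1.87 m.
print(round(phase_to_distance(math.pi / 2, 20e6), 2))
```

Note the unambiguous range is limited to c / (2f), which is why such modules trade modulation frequency against maximum measurable distance.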
  • a ranging image containing a target object is acquired.
  • the electronic device can be configured with a camera module, and the camera module can be of any type. Since the type of camera module and its imaging principle do not affect the measurement of the distance value, the construction cost of the electronic device can be reduced by using only an ordinary camera module.
  • the camera module can be a digital camera.
  • the digital camera is equipped with a photosensitive device that collects the optical signal in the shooting scene, converts the optical signal into an electrical signal, and generates a corresponding electronic image, that is, the above-mentioned ranging image.
  • the above-mentioned target object is specifically an object that performs an interactive operation with the electronic device.
  • the number of the target objects may be one or more, and the number of the target objects is not limited herein. It should be noted that, if the number of target objects is two or more, the electronic device can separately identify the distance values between each target object and the electronic device. Optionally, the electronic device may establish a corresponding relationship between the target object and the distance value, so that the electronic device may determine the distance between each target object and itself by querying the corresponding relationship.
  • the electronic device may select at least one candidate object from the multiple candidate objects as the target object. For example, if there are multiple users in the shooting frame, the electronic device may select one of the users as the user to receive fitness exercise prompts; that is, the target object needs to be identified from among multiple users, so as to avoid identifying other users passing through the shooting area as users who need to be monitored, which would reduce the accuracy of fitness exercise monitoring.
  • the implementation may be as follows: the electronic device identifies the area of the area occupied by each candidate object in the ranging image, and selects the candidate object with the largest area as the above-mentioned target object.
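The largest-area selection rule can be sketched as follows; representing each candidate's occupied region by a bounding box and comparing box areas is an assumption made for the example, since the text does not specify how the area is computed.

```python
def select_target_object(candidates):
    """Pick the candidate occupying the largest image area as the target.

    Each candidate is (name, box) with box = (x, y, w, h); the bounding-box
    area stands in for the occupied region (an illustrative simplification).
    """
    def area(candidate):
        _, (x, y, w, h) = candidate
        return w * h

    return max(candidates, key=area)[0] if candidates else None

# The user closer to the camera occupies more pixels and is selected.
people = [("user_a", (10, 20, 80, 200)), ("user_b", (300, 40, 40, 120))]
print(select_target_object(people))  # user_a
```

This matches the intuition in the scenario: the exercising user stands closest to the device and therefore covers the largest region of the ranging image.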
  • the above method of selecting the target object from the multiple candidate objects may also be: the electronic device identifies the key point information contained in each candidate object, and selects the candidate object whose key point information matches the standard key information as the target object. Because part of fitness movement monitoring requires shooting unobstructed video of the user's movement, body parts such as the legs, abdomen, head and hands need to be visible.
  • the electronic device can determine multiple standard key points and assemble all the standard key points into the above-mentioned standard key information.
  • the electronic device can identify the key points contained in each candidate object in the ranging image, generate the above-mentioned key point information, and determine whether the key point information contains all the key points in the standard key information. If so, the match succeeds and the candidate object is identified as the target object; otherwise, the match fails and the candidate object is identified as a non-target object.
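The key-point match described above is a subset test: a candidate qualifies only when its detected key points cover every standard key point. A minimal sketch, with illustrative key-point names (the patent does not fix a specific set):

```python
# Standard key information: the body parts that must be visible and
# unobstructed for fitness monitoring (names are illustrative).
STANDARD_KEYPOINTS = {"head", "hands", "abdomen", "legs"}

def is_target(detected_keypoints) -> bool:
    """True when the candidate exposes all standard key points."""
    return STANDARD_KEYPOINTS.issubset(detected_keypoints)

print(is_target({"head", "hands", "abdomen", "legs", "feet"}))  # True
print(is_target({"head", "hands"}))                             # False
```

A partially occluded bystander (missing legs or abdomen, say) fails the subset test and is classified as a non-target object.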
  • the electronic device may store a plurality of preset actions, and if the electronic device detects that the action of any shooting object matches a preset action, it identifies that shooting object as the target object, and then tracks and identifies the target object through face matching or tracking algorithms.
  • the electronic device when the electronic device needs to determine the distance value between the electronic device and the target object, the electronic device can control the camera module to obtain a ranging image including the target object.
  • the electronic device may receive a ranging request instruction initiated by the user, for example, by clicking a ranging button or speaking a voice control instruction corresponding to the ranging request instruction (such as a voice signal such as "measure distance” or "start ranging”) , and the electronic device executes the operation of S701.
  • the electronic device may run a fitness monitoring application, and the fitness monitoring application may be configured with ranging conditions. For example, when it is detected that the position of the target object changes, a ranging instruction is generated to execute the operation of S701; the above ranging condition may also be that, when it is detected that the monitoring function is activated, the operation of S701 is performed with a preset detection period.
  • the electronic device may be configured with a detection period, so as to realize the periodic determination of the distance value between the electronic device and the target object.
  • the above detection period can be set according to the application program currently running on the electronic device. For example, during fitness exercise monitoring the user moves quickly, so the detection period can be set shorter, for example detecting the distance value once every 0.2s; in the standby state (in which the electronic device can detect the distance value to the user in order to activate the device), the detection period can be set longer, for example detecting the distance value once every 1s.
  • the electronic device may perform an operation to light up the screen according to the distance from the user.
  • FIG. 8 shows a schematic diagram of a screen automatically lighting up according to an embodiment of the present application. Referring to (a) in FIG. 8, the user is far from the electronic device and the electronic device is in an off-screen state (it can also be a standby state); at this time, the electronic device can acquire the distance value between the user and the electronic device with a lower-frequency detection cycle.
  • when the electronic device determines in a certain detection cycle that the distance between the electronic device and the user is smaller than the preset activation threshold, that is, (b) in Figure 8, the electronic device lights up the screen and acquires the distance value between the user and the electronic device with a higher-frequency detection cycle. By dynamically adjusting the detection cycle according to the state of the electronic device, the real-time requirements of distance measurement are met while the energy consumption of the device is reduced.
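The adaptive detection cycle can be sketched as a small state machine. The 1s and 0.2s periods are the example values from the text; the activation threshold of 2 m is an assumed value, since the text leaves it unspecified.

```python
# Adaptive detection cycle: long period while the screen is off, short
# period once the user is within the activation threshold.
ACTIVATION_THRESHOLD_M = 2.0  # assumed activation distance (not in the text)
STANDBY_PERIOD_S = 1.0        # standby: detect once every 1 s
ACTIVE_PERIOD_S = 0.2         # monitoring: detect once every 0.2 s

def next_detection_period(distance_m: float, screen_on: bool):
    """Return (screen_on, period_s) for the next detection cycle."""
    if not screen_on and distance_m < ACTIVATION_THRESHOLD_M:
        screen_on = True  # user approached: light up the screen
    return screen_on, (ACTIVE_PERIOD_S if screen_on else STANDBY_PERIOD_S)

print(next_detection_period(3.5, False))  # user far: stay off, slow cycle
print(next_detection_period(1.2, False))  # user near: wake, fast cycle
```

Running the slow cycle while idle and the fast cycle only during interaction is exactly the energy-versus-latency trade-off the passage describes.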
  • the electronic device after the electronic device obtains the ranging image through the camera module, it can identify whether the ranging image contains the target object. If the ranging image contains the target object, the operation of S702 is performed; otherwise, if the ranging image does not contain any target object, wait for the next ranging image collection operation.
  • the electronic device may be configured with feature information of the target object. For example, if the target object is a real person, the corresponding feature information is human body feature information, such as face information and body part information. If any shooting object in the ranging image matches the above feature information, the ranging image is recognized as containing the target object. For example, if any shooting object in the ranging image contains a human face, that shooting object is recognized as the target object.
  • FIG. 9 shows a specific implementation flowchart of the method for measuring distance provided by an embodiment of the present application.
  • the method for measuring a distance provided in this embodiment may further include S901 and S902 before S701 . Details are as follows:
  • the method further includes:
  • the electronic device may display an application interface through the display module, wherein the application interface includes application icons of multiple application programs.
  • the electronic device may receive an application selection operation initiated by the user based on the application interface, and determine a corresponding target application in the application interface according to the application selection operation initiated by the user.
  • the user may send an application selection operation to the electronic device through the controller.
  • each application may correspond to an application identifier, and the user may generate the above-mentioned application selection instruction by clicking the button corresponding to the application identifier on the controller.
  • the application identifier of an application is "01"
  • a selection cursor can be displayed in the application interface, the user can control the movement of the selection cursor through the controller, and when the selection cursor moves to the display area of the application icon corresponding to the application to be selected, send a confirmation selection instruction to the electronic device, and the electronic The device may determine the target application according to the coordinate position of the current selection cursor, and generate the above application selection operation.
  • the user may generate an application selection operation through a touch screen on the electronic device.
  • the electronic device can output the above-mentioned application interface through the touch screen, and the user can perform a click operation on the touch screen.
  • the electronic device can identify the click coordinates corresponding to the click operation, determine the associated target application according to the click coordinates, and recognize the click operation as the above-mentioned application selection operation.
  • the electronic device can obtain the video data of the target object through the camera module, analyze the video data, determine the motion trajectory of the target object, and determine the target application that the user needs to select according to the motion trajectory; that is, the above motion trajectory serves as the application selection operation initiated by the user.
  • the electronic device may be provided with a table of correspondence between motion trajectories and application programs. After the electronic device determines the motion trajectory of the target object through the video data, it can query whether the above-mentioned correspondence table contains matching standard actions; if there is a matching standard action, the application program corresponding to the standard action is identified as the target application.
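The correspondence table lookup can be sketched as a plain dictionary mapping recognized standard actions to applications. The trajectory labels and application names below are illustrative assumptions, not values from the patent.

```python
# Hypothetical correspondence table between motion trajectories (standard
# actions) and application programs.
TRAJECTORY_TO_APP = {
    "raise_both_arms": "fitness_monitoring",
    "wave_right_hand": "somatosensory_game",
}

def resolve_target_app(motion_trajectory: str):
    """Return the app matched by the standard action, or None if no match."""
    return TRAJECTORY_TO_APP.get(motion_trajectory)

print(resolve_target_app("raise_both_arms"))  # fitness_monitoring
print(resolve_target_app("squat"))            # None: no matching action
```

When the lookup returns None, no standard action matched and the device simply keeps waiting for the next recognizable trajectory.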
  • the electronic device may record the application type of each application program, and all the application programs may be divided into interactive applications and non-interactive applications according to the operation mode for the application programs.
  • Interactive applications can be further divided into contact interaction type applications and non-contact interaction type applications.
  • After the electronic device determines the target application according to the user's application selection operation, it can identify the application type of the target application and determine whether the target application is an interactive application. If it is an interactive application, the electronic device determines whether its operation mode is the contact interaction type or the non-contact interaction type. If it is the contact interaction type, the electronic device receives the user's touch operation and responds to the touch operation through the target application; the contact interaction type includes touch screen interaction and control device interaction (such as mouse control, keyboard control and/or controller control, etc.). If the target application is of the non-contact interaction type, the operation of S902 is performed.
  • If the target application is a non-contact interaction type application, the operation of acquiring a ranging image containing the target object is performed, so that a non-contact interaction operation can be performed by using the distance value.
  • the electronic device detects that the target application is a non-contact interaction type application, such as a fitness monitoring application or a somatosensory game application.
  • When a non-contact interaction type application collects the non-contact operations of an interactive object, in addition to determining the action data of the interactive object (that is, the above-mentioned target object), it is often necessary to collect the distance value between the electronic device and the interactive object, and to use the distance value to perform non-contact interactive operations such as operation prompts, generation of operation instructions, and calibration of motion data.
  • the electronic device will activate the camera module, use the camera module to collect a ranging image containing the interactive object (ie, the target object), obtain the distance value between the electronic device and the interactive object through the operations of S701 to S703, and perform the non-contact interaction based on the distance value.
  • Since the ranging process is performed only when it is actually needed, the accuracy of executing the ranging operation can be ensured and unnecessary ranging operations can be avoided.
  • a ranging reference parameter is determined according to the ranging image.
  • the electronic device may determine, according to the ranging image, a reference parameter used to determine the zoom ratio of the position where the target object was located when the image was captured, that is, the above-mentioned ranging reference parameter.
  • the electronic device can determine the zoom ratio of the target object in the ranging image according to the ranging reference parameter, so that the distance value from the electronic device can be determined according to the zoom ratio.
  • the manner of determining the ranging reference parameter according to the ranging image may be: in a scene where the electronic device is placed, identifiers may be marked at multiple key position points.
  • the user may, according to the prompt of the electronic device, set identifiers at a plurality of key points whose distance from the electronic device is a preset value.
  • FIG. 10 shows a schematic diagram of determining a ranging reference parameter based on an identifier provided by an embodiment of the present application. Referring to FIG. 10 , before performing ranging, the electronic device can output prompt information to prompt the user to set identifiers at multiple key points, which are four key points of 0.5m, 1m, 1.5m and 2m away from the electronic device.
  • the user can configure corresponding identifiers on different key points, for example, paste the corresponding pattern on the corresponding floor position, as shown in Figure 10.
• the electronic device can recognize the identifiers contained in the ranging image. After recognizing each identifier, it can determine the distance value between the target object and the electronic device according to the relative position between the target object and each identifier. As shown in Figure 10, in the ranging image the target object is between the first identifier (0.5m) and the second identifier (1m). Since the distance value between each identifier and the electronic device is fixed, the distance value between the target object and the electronic device must be between 0.5m and 1m.
• for example, the distance value between the target object and the electronic device may be determined to be 0.8m.
  • the actual distance between the target object and the electronic device can be calculated according to the pixel distance between the two identifiers and the preset conversion algorithm.
• the method of determining the ranging reference parameter according to the ranging image may be as follows: the electronic device can identify a preset calibration object included in the ranging image, and obtain the ranging reference parameter according to the pixel size of the preset calibration object in the ranging image.
  • the above-mentioned preset calibration object is any object whose shape and size are fixed and known, such as chairs, tables, and household appliances in the indoor scene where the user is located.
  • the electronic device may pre-store the size of the above-mentioned preset calibration object, for example, the appearance and shape of the table and the corresponding actual size are recorded in the electronic device.
• FIG. 11 shows a schematic diagram of identifying the actual size of a photographed object, provided by an embodiment of the present application. Referring to FIG. 11, when the electronic device acquires the ranging image, the current scene contains a refrigerator, and the surface of the refrigerator bears the trademark of the corresponding manufacturer.
• the electronic device can query the corresponding device model from the Internet according to the appearance of the refrigerator and the manufacturer's trademark, determine the corresponding device size (that is, the above-mentioned actual size) according to the device model, and use the actual size as the above-mentioned ranging reference parameter.
  • a distance value between the electronic device and the target object is determined based on the ranging reference parameter.
  • the electronic device may calculate the distance value between the target object and the electronic device based on the ranging reference parameter.
• the electronic device can be configured with a distance value conversion algorithm; by importing the ranging reference parameter into this algorithm, the distance value associated with the ranging reference parameter can be calculated, and the distance value output by the conversion algorithm is identified as the distance value between the electronic device and the target object.
• the distance value can be fed back to the currently running application, so as to generate and display interactive information through the application. For example, if the electronic device is currently running a fitness monitoring application, it determines, according to the above distance value, whether the target object is too close to the electronic device; since a target object that is too close may hit the electronic device while exercising, a warning is required.
• the fitness monitoring application can compare the above distance value with a preset distance threshold; if the distance value is less than or equal to the distance threshold, it outputs a warning message that the distance is too close; otherwise, if the distance value is greater than the distance threshold, no warning needs to be issued to the target object.
  • FIG. 12 shows a schematic diagram of prompts for multi-person monitoring provided by an embodiment of the present application.
  • the ranging image contains a plurality of target objects, namely user 1, user 2 and user 3.
• the electronic device recognizes, through the above methods, the distance values between each target object and the electronic device, which are 1m, 0.5m and 1.2m respectively; since the preset distance threshold (used to indicate that the distance is too close) is 0.8m, it can be determined that user 2 is too close to the electronic device, and an abnormal-distance prompt can be output for user 2.
• prompt information such as "too close, please stay away from the device" can be output; specifically, the prompt information can be generated according to the difference between the current distance value of the target object and the preset distance threshold, for example, "too close, please stay 0.3m away from the device".
• the distance value between each pair of target objects can also be measured.
• the specific calculation can be performed according to the distance values L1 and L2 between the electronic device and the two target objects whose separation is to be measured, and the pixel distance L3 between the two target objects in the ranging image. FIG. 13 is a schematic diagram of determining the distance value between any two target objects in the multi-person monitoring scenario, provided by an embodiment of the present application; the above three distance values are shown in FIG. 13, and the distance value between the two target objects can then be calculated according to the Pythagorean theorem.
• the electronic device can also directly calculate the actual distance between the two target objects by using the pixel distance between them and the scaling ratio determined by the ranging reference parameter. Similarly, if the distance value between any two target objects is too small, prompt information can also be output; as shown in Figure 12, if the distance between user 1 and user 2 is too small, user 1 can be prompted to move to the left to stay away from user 2.
• the monitoring process is continuous; that is, the electronic device obtains a monitoring video of the target user, parses the monitoring video, and extracts each video image frame in the monitoring video, and each video image frame can be used as the above-mentioned ranging image. The operations of S701 to S703 are performed on each video image frame, and the distance value between the target object and the electronic device in each video image frame is determined, so as to realize continuous monitoring.
• the method for measuring distance provided by this embodiment of the present application can obtain a ranging image including a target object, where the target object is a user performing a non-contact interactive behavior; the ranging reference parameter for measuring distance is extracted from the ranging image, so that the distance value between the electronic device and the target object can be determined according to the ranging reference parameter. The electronic device only needs to include a camera module to measure the distance between the electronic device and the target object: the measurement does not depend on a depth image, nor on the difference in shooting angle of a binocular camera, so the electronic device does not need to be configured with an optical-pulse-based transceiver, a binocular camera or other such modules, which greatly reduces the cost of the electronic equipment. At the same time, during the distance measurement process, the distance value is obtained by determining the ranging reference parameter rather than by direct measurement, so that the accuracy of the distance measurement can be improved.
  • the method of extracting ranging reference parameters from the ranging image, and determining the distance value between the target object and the electronic device according to the ranging reference parameters may include at least the following four methods:
• Method 1: When the target object is standing, the electronic device can determine the zoom ratio according to the pre-stored actual height of the target object and the pixel height of the target object in the ranging image, and use the zoom ratio as the above-mentioned ranging reference parameter.
• Method 2: When the ranging image contains a marker of known size, the electronic device can determine the zoom ratio from the actual size of the marker and the pixel length of the marker in the ranging image, and use the zoom ratio as the above-mentioned ranging reference parameter.
• Method 3: If the user wears a wearable device, the electronic device can determine the zoom ratio according to the actual moving distance of the wearable device and the length of the image trajectory of the wearable device across the plurality of ranging images, and use the zoom ratio as the above-mentioned ranging reference parameter.
• Method 4: If the electronic device is equipped with a digital-focus camera module, it can use the digital focal length at the time of shooting the ranging image as the above-mentioned ranging reference parameter, and determine the distance value between the target user and the electronic device according to the digital focal length.
  • FIG. 14 shows a flow chart for realizing the determination of the ranging reference parameter provided by the first embodiment of the present application.
  • S702 specifically includes S1401 and S1402, which are described in detail as follows:
  • the pixel height of the target object in the ranging image is identified.
• the electronic device can determine, using a preset target object recognition algorithm, the target area image corresponding to the target object in the ranging image; the target area image is specifically the area image divided from the ranging image based on the contour line of the target object.
  • the target object recognition algorithm is specifically a human body recognition algorithm.
  • the electronic device may be configured with a plurality of feature points, such as a head feature point, a shoulder feature point, a hand feature point, a leg feature point, and the like.
• the electronic device can determine whether the ranging image contains the plurality of pre-configured feature points; if so, it recognizes that a real person is photographed in the ranging image, identifies the contour of the target object according to each feature point, and separates the above-mentioned object area image from the ranging image based on the contour. Conversely, if the ranging image does not contain any of the above-mentioned feature points, it recognizes that the ranging image does not contain a real person, that is, there is no target object, and no ranging operation is required.
• the electronic device can recognize the object posture of the target object according to the relative positions between the feature points and/or the above-mentioned contour line.
• the electronic device may identify multiple skeleton nodes in the object area image, such as head skeleton nodes, neck skeleton nodes, foot skeleton nodes and hip skeleton nodes, and determine the object posture of the target object according to the relative positions between the skeleton nodes; the relative position may specifically be the slope of a straight line connecting two different skeleton nodes, so that the object posture of the target object is determined according to the slopes of the multiple different straight lines.
• the electronic device needs to determine the object posture of the target object within the ranging image. If the object posture is a standing posture, the methods of S1401 and S1402 can be used to determine the ranging reference parameter; conversely, if the object posture of the target object is other than the standing posture, Method 1 is not used to determine the ranging reference parameter.
• the electronic device can also determine whether the human body region image of the target object captures a complete human body, for example, whether it includes the complete body from the feet to the head. Although the target object in the ranging image is in a standing posture, if some areas are missing, for example the feet are not captured, the pixel height of the target object in the ranging image does not have a fixed proportional relationship with the actual height of the target object, and therefore cannot be used to determine the ranging reference parameter.
• the electronic device can further identify whether the human body region image contains a complete human body, for example, by determining whether the human body region image contains a plurality of preset key points; if so, it is identified as including a complete human body, and the ranging reference parameters are determined through S1401 and S1402.
• after the electronic device extracts the human body region image of the target object from the ranging image, it can count the pixel height corresponding to the target object in the human body region image; that is, the number of pixels in the interval between the lowest point and the highest point in the vertical direction of the human body region image is the above-mentioned pixel height.
  • the ranging reference parameter is obtained according to the ratio between the pixel height and the actual height associated with the target object.
• the electronic device pre-stores the actual height of the target object, that is, the real height; after determining the pixel height of the target object in the ranging image, the ratio between the pixel height and the actual height can be calculated to determine the corresponding zoom ratio of the target object in the ranging image, and the zoom ratio is used as the above-mentioned ranging reference parameter.
  • FIG. 15 shows a flowchart of determining a distance value based on an actual height and a pixel height provided by an embodiment of the present application.
  • the electronic device may store user information of multiple candidate objects.
  • the electronic device has two methods to determine the user information corresponding to the target object.
  • Method A Based on face recognition, the implementation is as follows:
  • the electronic device may store standard faces of each candidate object.
  • the standard face can be acquired by the above-mentioned camera module, and the user information associated with the collected standard face configuration by the user is received.
• in addition to determining the associated user information through user settings, the electronic device can also be connected to a cloud server, determine the user identifier associated with the standard face through the face database in the cloud server, download the associated user information from the cloud server based on the user identifier, and establish an association relationship between the standard face and the user information.
• the cloud server side can store a face database constructed from the face images of multiple users, together with the user ID corresponding to each uploaded face image. Based on this, in addition to receiving the user's manual setting to establish the association relationship between the standard face and the user information, the electronic device can also obtain the relationship by downloading it from the cloud server.
  • the electronic device may perform face recognition on the ranging image, and extract the face region image contained in the ranging image.
• the electronic device can first identify the face feature points in the ranging image, such as multiple key regions with eye features, mouth features or nose features, so that the face region can be identified, and the face region can be divided from the ranging image to obtain the above-mentioned face area image.
• the electronic device can import the face region image into a preset face feature value recognition algorithm (for example, one that converts the two-dimensional image into a feature matrix or feature vector by means of pooling, convolution, etc.) to obtain the face feature parameters; it then compares the face feature parameters with the pre-stored standard feature parameters of each standard face, selects the standard face with the highest matching degree as the standard face corresponding to the face feature parameters, and uses the user information of that standard face as the user information of the target object.
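The "highest matching degree" selection can be sketched as follows. The patent does not name a similarity metric, so cosine similarity between feature vectors is assumed here; the names and vectors are illustrative.

```python
import math

def match_standard_face(query_vec, standard_faces):
    """Pick the standard face whose stored feature vector is most
    similar to the query vector (assumed cosine-similarity metric)."""
    def cosine(a, b):
        dot = sum(x * y for x, y in zip(a, b))
        na = math.sqrt(sum(x * x for x in a))
        nb = math.sqrt(sum(y * y for y in b))
        return dot / (na * nb)
    # max over the dictionary's keys, scored by similarity to the query
    return max(standard_faces, key=lambda name: cosine(query_vec, standard_faces[name]))

faces = {"alice": [0.9, 0.1, 0.0], "bob": [0.1, 0.8, 0.3]}
print(match_standard_face([0.88, 0.15, 0.02], faces))  # alice
```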
• the electronic device can obtain the ranging image again and perform the above operations repeatedly. If the number of repeated matches is greater than a preset number-of-times threshold and the standard face corresponding to the face feature parameters is still not recognized, prompt information indicating that user information matching has failed is output, to notify the user to configure associated user information for the face feature parameters, or to determine the user information in the manner of Method B.
  • the user information carries the actual height of the target object, that is, the height of the target object.
• the electronic device can extract the height information from the user information, and use the parameter value corresponding to the height information as the actual height of the target object.
• in this way, the actual height of the target object is determined automatically, and the user information does not need to be manually selected, which improves the efficiency of actual height selection.
  • the electronic device may output a user selection interface.
  • the above-mentioned user selection interface may be output before executing "obtaining a ranging image containing the target user".
  • the target object may select one or more candidate users as target users based on at least one candidate user displayed on the user selection interface.
• each candidate user corresponds to a user icon on the above-mentioned user selection interface; the user can determine the target user to be selected through a selection operation such as clicking, and the user account corresponding to the target user is identified as the user account corresponding to the selection operation.
  • the electronic device may receive a touch operation of the target object, and the touch operation is the above-mentioned selection operation, and the touch operation may specifically be a click operation.
  • the electronic device determines the user icon associated with the touch coordinate according to the touch coordinate of the touch operation, and determines the candidate account corresponding to the user icon as the user account corresponding to the above selection operation.
  • the electronic device may receive a control instruction sent by the target object through the control device, where the control instruction is the above-mentioned selection operation.
  • the electronic device determines the associated user icon according to the control instruction, and identifies the candidate account corresponding to the user icon as the user account corresponding to the above selection operation.
  • each user account may record user information of the corresponding user, such as the user's age, height, and weight.
  • the electronic device can extract the associated user information from the database according to the user account, and obtain the actual height of the target object from the user information, that is, the above-mentioned user height.
  • the user account is determined in a manner specified by the target user, and the actual height is obtained from the user information associated with the user account, which can improve the accuracy of the actual height, thereby improving the accuracy of distance measurement.
  • the electronic device may be configured with multiple object key points, and the object key points may include facial feature points (eyes, nose, mouth, etc.) and body key points (head, hand, shoulder, etc.)
• the electronic device can mark each object key point on the ranging image, generate the corresponding posture feature vector according to the pixel distances between the object key points, calculate the vector distance between the posture feature vector and the standard feature vector of each preset standard posture, and select the standard posture with the smallest vector distance as the object posture of the target object in the ranging image.
• the electronic device can determine whether the object posture is the preset first posture or the preset second posture; if the object posture of the target object is the first posture, the operations of S1602 and S1603 are performed; otherwise, if the object posture of the target object is the second posture, the operations of S1604 and S1605 are performed.
• the electronic device can identify the inclination angle of the long side of the yoga mat in the ranging image, and if the inclination angle is greater than a preset angle threshold, it recognizes that the yoga mat is not placed straight (that is, the long side is not parallel to the shooting plane).
• the electronic device can output prompt information, such as "please straighten the yoga mat", so that the long side of the yoga mat is basically parallel to the shooting plane, so as to prevent a deviation in the placement angle from affecting the accuracy of the distance measurement.
  • the pixel height of the target object is not equal to the height of the target object.
  • the ranging reference parameter may be determined according to the pixel length of the preset marker. Therefore, the electronic device can identify the preset marker from the ranging image according to the appearance feature information of the preset marker, and determine the pixel length of the preset marker in the ranging image.
• the pixel length is specifically the length, in the ranging image, of a side of the outer contour of the preset marker that is parallel to the photographing plane of the electronic device; if the preset marker is a yoga mat, the pixel length is specifically the length corresponding to the longer side of the yoga mat in the ranging image.
• the pixel length of the yoga mat in the ranging image is l and its actual length is L; the ratio between the distance Dist from the yoga mat to the electronic device and the focal length f of the electronic device is the same as the ratio of L to l, and the target object is on the yoga mat, so the distance value between the electronic device and the target user can be calculated according to the ratio between the above two lengths.
  • the ranging reference parameter is obtained according to the ratio between the pixel height and the actual height associated with the target object.
• in this way, the applicable range of the method for measuring distance can be improved.
• since the posture of the object is changeable during movement, the ranging reference parameter can be determined through the fixed size of the marker, that is, the reference object, which improves the accuracy of the measurement.
• the number of the ranging images is M, where M is a positive integer greater than 1; that is, the electronic device will acquire a plurality of ranging images at a preset collection frequency. In this case, determining the ranging reference parameter according to the ranging images includes:
• the pixel coordinates of a wearable device worn on the target object are determined from the M ranging images respectively, and a first trajectory of the wearable device is obtained based on the M pixel coordinates.
  • the electronic device may be set with a distance-measuring image collection duration, and within the distance-measuring image collection duration, M distance-measuring images may be acquired at a preset frequency.
  • the frequency is 60Hz, that is, 60 ranging images can be acquired within 1s, and the distance value between the electronic device and the target object is determined based on the 60 ranging images.
• the electronic device can store the M ranging images in a first-in, first-out queue whose length is the above M, and calculate the distance value between the electronic device and the target object at each moment from the ranging images in the queue. At each moment, the electronic device obtains the latest ranging image collected at the current moment T and adds it to the above queue, while the earliest ranging image in the queue, that is, the one corresponding to moment T-M, is removed; the electronic device can then determine the distance value according to the updated queue.
• prompt information may be generated to prompt the user to swing the part wearing the wearable device in a plane perpendicular to the shooting direction, for example, to swing the arm wearing the smart watch upward rather than extending it forward. Since a displacement along the shooting direction cannot be reflected in the ranging image, swinging in that direction reduces the accuracy of the distance measurement; outputting the above prompt information therefore improves the calibration accuracy.
  • FIG. 20 shows a schematic diagram of a first trajectory provided by an embodiment of the present application.
• the above-mentioned wearable device is specifically a smart watch; the electronic device can identify the pixel coordinates of the smart watch in each ranging image, and connect the pixel coordinates in turn to obtain the movement track of the smart watch across the multiple ranging images, that is, the first trajectory.
  • the electronic device may establish a communication connection with the wearable device; specifically, the communication connection may be a wireless communication connection.
• the electronic device and the wearable device can join the same WIFI network, and a corresponding communication link can be established based on the WIFI network for data interaction; alternatively, the wearable device can be added to the Bluetooth wireless network of the electronic device, and the above-mentioned communication connection is established through the Bluetooth module of the wearable device and the Bluetooth module of the electronic device.
  • the wearable device may send the movement trajectory within a preset time period as a motion parameter to the electronic device, and the electronic device may obtain the above-mentioned second trajectory according to the movement trajectory recorded by the wearable device.
• the movement track sent by the wearable device may be marked with multiple collection points, each corresponding to a collection moment; the electronic device can intercept, from the movement track, the segment corresponding to the acquisition period of the M ranging images, to obtain the above-mentioned second trajectory.
  • S1902 may specifically include:
• the wearable device may also send a plurality of motion parameters to the electronic device at a preset feedback cycle; after receiving the motion parameters of a plurality of different feedback cycles, the electronic device can obtain the second trajectory of the wearable device within the monitoring time period from the motion parameters corresponding to the plurality of feedback cycles. For example, if the motion parameter is specifically a speed value or an acceleration value of the wearable device, the above-mentioned second trajectory can be obtained by integrating the motion parameters.
  • the ranging reference parameter is obtained based on the first distance of the first track and the second distance of the second track.
• the electronic device can obtain the three-dimensional movement trajectory of the target object through the motion parameters fed back by the motion sensor in the wearable device, and convert the three-dimensional movement trajectory according to the internal parameters of the camera module; Perspective-n-Point (PnP) matching is then performed between the trajectory points of the three-dimensional movement trajectory and the trajectory points of the first trajectory (that is, the two-dimensional movement trajectory) obtained from the multiple ranging images, an external parameter matrix is obtained, and the distance value is determined based on the external parameter matrix.
  • FIG. 21 shows a flowchart for realizing the determination of the ranging reference parameter provided by the fourth embodiment of the present application.
  • S701 in the method for measuring distance is specifically S2101 and S2102
  • S702 is specifically S2103
  • S703 is specifically S2104, and the details are as follows:
  • the ranging image including the target object is acquired based on the adjusted digital focal length.
  • the associated shooting information may be encapsulated in the ranging image, where the shooting information may include: shooting time, digital focal length when shooting, image format, and the like.
  • the electronic device may extract the above-mentioned digital focal length from the shooting information, and use the digital focal length as the above-mentioned distance measuring reference parameter.
  • the distance value determining unit 243 is configured to determine the distance value between the electronic device and the target object based on the ranging reference parameter.
  • the ranging reference parameter determination unit 242 includes:
  • a face area image acquisition unit used for extracting the face area image of the target object from the ranging image
  • a face area image comparison unit configured to determine user information corresponding to the target object based on the face area image
  • the device for measuring distance also includes:
  • a second user information extraction unit configured to acquire the actual height of the target object from the user information of the user account.
  • a first posture determination unit configured to acquire the pixel length of a preset marker associated with the target object in the ranging image if the object posture is the first posture
  • the device for measuring distance also includes:
  • a second posture determining unit configured to identify the pixel height of the target object in the ranging image if the object posture is the second posture
  • the second attitude ranging unit is configured to obtain the ranging reference parameter according to the ratio between the pixel height and the actual height associated with the target object.
  • the operation of recognizing the object posture of the target object in the ranging image is performed.
• a first trajectory acquisition unit, configured to determine the pixel coordinates of the wearable device worn on the target object from the M ranging images respectively, and obtain a first trajectory of the wearable device based on the M pixel coordinates;
  • a second trajectory obtaining unit configured to obtain a second trajectory based on the motion parameters fed back by the wearable device
  • a moving distance comparison unit configured to obtain the ranging reference parameter based on the first distance of the first track and the second distance of the second track.
  • the second trajectory acquisition unit includes:
  • a motion parameter receiving unit configured to receive the motion parameters sent by the wearable device in a preset feedback cycle
  • a second trajectory drawing unit configured to generate the second trajectory based on the motion parameters corresponding to a plurality of the feedback cycles.
  • the ranging image acquisition unit 241 includes:
  • a digital focal length adjustment unit configured to adjust the digital focal length of the digital focus camera module of the electronic device, so that the focus point of the digital focus camera module is aligned with the target object
  • a digital zoom photographing unit configured to obtain the ranging image including the target object based on the adjusted digital focal length
  • the ranging reference parameter determination unit 242 includes:
  • a digital focal length identifying unit configured to identify the digital focal length of the ranging image, and obtain a ranging reference parameter according to the digital focal length.
  • the distance value determining unit 243 includes:
  • a correspondence lookup unit configured to query the shooting distance associated with the digital focal length based on a preset correspondence between digital focal lengths and shooting distances, and to identify the shooting distance as the distance value between the electronic device and the target object.
  • the device for measuring distance also includes:
  • an application selection operation response unit configured to determine a target application corresponding to the application selection operation in response to the application selection operation of the target object on the application interface
  • the contactless interactive operation response unit is configured to execute the acquiring of the ranging image including the target object if the target application is an application of the contactless interaction type, so as to perform the contactless interactive operation by using the distance value.
  • The apparatus for measuring distance acquires a ranging image containing a target object, the target object being a user performing a contactless interactive behavior, and extracts from the ranging image a ranging reference parameter, so that the distance value between the electronic device and the target object can be determined from that parameter. The electronic device only needs a single camera module: it does not rely on depth images or on the difference in shooting angles of a binocular camera, so it does not need to be equipped with modules such as an optical-pulse-based transceiver or a binocular camera, which greatly reduces the manufacturing cost of the electronic device.
  • At the same time, because distance measurement is performed by determining one or more ranging reference parameters rather than obtaining the distance value directly, the accuracy of the distance measurement can be improved.
  • FIG. 25 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
  • The electronic device 25 of this embodiment includes: at least one processor 250 (only one is shown in FIG. 25), a memory 251, and a computer program stored in the memory 251 and executable on the at least one processor 250.
  • The processor 250 may be a central processing unit (Central Processing Unit, CPU); it may also be another general-purpose processor, a digital signal processor (Digital Signal Processor, DSP), an application-specific integrated circuit (Application Specific Integrated Circuit, ASIC), a field-programmable gate array (Field-Programmable Gate Array, FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like.
  • a general purpose processor may be a microprocessor or the processor may be any conventional processor or the like.
  • the memory 251 may be an internal storage unit of the electronic device 25 in some embodiments, such as a hard disk or a memory of the electronic device 25 .
  • the memory 251 may also be an external storage device of the electronic device 25 in other embodiments, such as a plug-in hard disk, a smart memory card (Smart Media Card, SMC), a secure digital (Secure Digital, SD) card, flash memory card (Flash Card), etc.
  • the memory 251 may also include both an internal storage unit of the electronic device 25 and an external storage device.
  • the memory 251 is used to store an operating system, an application program, a boot loader (Boot Loader), data, and other programs, such as program codes of the computer program, and the like.
  • the memory 251 may also be used to temporarily store data that has been output or will be output.
  • Embodiments of the present application further provide an electronic device, the electronic device comprising: at least one processor, a memory, and a computer program stored in the memory and executable on the at least one processor, the processor executing The computer program implements the steps in any of the foregoing method embodiments.
  • Embodiments of the present application further provide a computer-readable storage medium, where a computer program is stored in the computer-readable storage medium, and when the computer program is executed by a processor, the steps in the foregoing method embodiments can be implemented.
  • the embodiments of the present application provide a computer program product, when the computer program product runs on a mobile terminal, the steps in the foregoing method embodiments can be implemented when the mobile terminal executes the computer program product.
  • the integrated unit if implemented in the form of a software functional unit and sold or used as an independent product, may be stored in a computer-readable storage medium.
  • the present application realizes all or part of the processes in the methods of the above embodiments, which can be completed by instructing the relevant hardware through a computer program, and the computer program can be stored in a computer-readable storage medium.
  • the computer program includes computer program code
  • the computer program code may be in the form of source code, object code, executable file or some intermediate form, and the like.
  • The computer-readable medium may include at least: any entity or apparatus capable of carrying the computer program code to the photographing device/electronic device, a recording medium, a computer memory, a read-only memory (ROM, Read-Only Memory), a random access memory (RAM, Random Access Memory), an electrical carrier signal, a telecommunication signal, and a software distribution medium, for example a USB flash drive, a removable hard disk, a magnetic disk, or an optical disc.
  • In some jurisdictions, according to legislation and patent practice, computer-readable media may not include electrical carrier signals and telecommunication signals.
  • the units described as separate components may or may not be physically separated, and components displayed as units may or may not be physical units, that is, may be located in one place, or may be distributed to multiple network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution in this embodiment.


Abstract

This application applies to the field of data acquisition technology and provides a method, apparatus, electronic device, and readable storage medium for measuring distance. The method includes: acquiring a ranging image containing a target object; determining a ranging reference parameter from the ranging image; and determining the distance value between the electronic device and the target object based on the ranging reference parameter. The technical solution provided by this application does not require the electronic device to be equipped with modules such as an optical-pulse-based transceiver or a binocular camera, thereby greatly reducing the manufacturing cost of the electronic device. At the same time, because distance measurement is performed by determining one or more ranging reference parameters rather than obtaining the distance value directly, the accuracy of ranging can be improved.

Description

Method, apparatus, electronic device, and readable storage medium for measuring distance
This application claims priority to the Chinese patent application No. 202010893041.3, entitled "Method, apparatus, electronic device, and readable storage medium for measuring distance", filed with the State Intellectual Property Office on August 28, 2020, the entire contents of which are incorporated herein by reference.
Technical Field
This application belongs to the field of data acquisition technology, and in particular relates to a method, apparatus, electronic device, and readable storage medium for measuring distance.
Background
As human-computer interaction technology continues to develop, devices can be controlled not only through contact-based methods but also through contactless methods. When performing contactless operations, it is often necessary to determine the distance value between the electronic device and the operating user; the accuracy of the distance measurement therefore directly affects the accuracy of the contactless interactive behavior. However, existing distance measurement technologies cannot simultaneously achieve measurement accuracy and a low manufacturing cost for the ranging module, which hinders the adoption of contactless interaction technology.
Summary
Embodiments of this application provide a method, apparatus, electronic device, and readable storage medium for measuring distance, which can improve measurement accuracy while reducing measurement cost.
In a first aspect, an embodiment of this application provides a method for measuring distance, applied to an electronic device, including:
acquiring a ranging image containing a target object;
determining a ranging reference parameter from the ranging image;
determining the distance value between the electronic device and the target object based on the ranging reference parameter.
Implementing the embodiments of this application has the following beneficial effects: a ranging image containing a target object is acquired, the target object being the user performing a contactless interactive behavior, and a ranging reference parameter used for distance measurement is extracted from the ranging image, so that the distance value between the electronic device and the target object can be determined from the ranging reference parameter. The electronic device only needs a single camera module: measuring the distance value in this embodiment does not rely on depth images or on the difference in shooting angles of a binocular camera, so the electronic device does not need to be equipped with modules such as an optical-pulse-based transceiver or a binocular camera, which greatly reduces its manufacturing cost. At the same time, because distance measurement is performed by determining one or more ranging reference parameters rather than obtaining the distance value directly, ranging accuracy can be improved.
In a possible implementation of the first aspect, determining the ranging reference parameter from the ranging image includes:
identifying the pixel height of the target object in the ranging image;
obtaining the ranging reference parameter from the ratio between the pixel height and the actual height associated with the target object.
In a possible implementation of the first aspect, before obtaining the ranging reference parameter from the ratio between the pixel height and the actual height associated with the target object, the method further includes:
extracting a face region image of the target object from the ranging image;
determining the user information corresponding to the target object based on the face region image;
extracting the actual height of the target object from the user information.
In a possible implementation of the first aspect, before obtaining the ranging reference parameter from the ratio between the pixel height and the actual height associated with the target object, the method further includes:
in response to a selection operation of the target object on a user selection interface, determining the user account corresponding to the selection operation;
acquiring the actual height of the target object from the user information of the user account.
In a possible implementation of the first aspect, determining the ranging reference parameter from the ranging image includes:
identifying the object posture of the target object in the ranging image;
if the object posture is a first posture, acquiring the pixel length, in the ranging image, of a preset marker associated with the target object;
obtaining the ranging reference parameter from the actual length of the preset marker and the pixel length.
In a possible implementation of the first aspect, after identifying the object posture of the target object in the ranging image, the method further includes:
if the object posture is a second posture, identifying the pixel height of the target object in the ranging image;
obtaining the ranging reference parameter from the ratio between the pixel height and the actual height associated with the target object.
In a possible implementation of the first aspect, the preset marker is a yoga mat; and identifying the object posture of the target object in the ranging image includes:
if the target object is in a yoga training mode, performing the operation of identifying the object posture of the target object in the ranging image.
In a possible implementation of the first aspect, the number of ranging images is M, where M is a positive integer greater than 1; and determining the ranging reference parameter from the ranging images includes:
determining, in each of the M ranging images, the pixel coordinates of a wearable device worn on the target object, and obtaining a first trajectory of the wearable device based on the M sets of pixel coordinates;
obtaining a second trajectory based on motion parameters fed back by the wearable device;
obtaining the ranging reference parameter based on a first distance of the first trajectory and a second distance of the second trajectory.
In a possible implementation of the first aspect, obtaining the second trajectory based on the motion parameters fed back by the wearable device includes:
receiving the motion parameters sent by the wearable device at a preset feedback cycle;
generating the second trajectory based on the motion parameters corresponding to a plurality of feedback cycles.
In a possible implementation of the first aspect, acquiring the ranging image containing the target object includes:
adjusting the digital focal length of a digital-focus camera module of the electronic device so that the focus point of the digital-focus camera module is aligned with the target object;
acquiring the ranging image containing the target object based on the adjusted digital focal length;
and determining the ranging reference parameter from the ranging image includes:
identifying the digital focal length of the ranging image, and obtaining the ranging reference parameter from the digital focal length.
In a possible implementation of the first aspect, determining the distance value between the electronic device and the target object based on the ranging reference parameter includes:
querying the shooting distance associated with the digital focal length based on a preset correspondence between digital focal lengths and shooting distances, and identifying the shooting distance as the distance value between the electronic device and the target object.
In a possible implementation of the first aspect, before acquiring the ranging image containing the target object, the method further includes:
in response to an application selection operation of the target object on an application interface, determining the target application corresponding to the application selection operation;
if the target application is an application of the contactless interaction type, performing the acquiring of the ranging image containing the target object, so as to perform a contactless interactive operation using the distance value.
In a second aspect, an embodiment of this application provides an apparatus for measuring distance, including:
a ranging image acquisition unit configured to acquire a ranging image containing a target object;
a ranging reference parameter determination unit configured to determine a ranging reference parameter from the ranging image;
a distance value determination unit configured to determine the distance value between the electronic device and the target object based on the ranging reference parameter.
In a third aspect, an embodiment of this application provides an electronic device including a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor, when executing the computer program, implements the method for measuring distance according to any one of the first aspect.
In a fourth aspect, an embodiment of this application provides a computer-readable storage medium storing a computer program, wherein the computer program, when executed by a processor, implements the method for measuring distance according to any one of the first aspect.
In a fifth aspect, an embodiment of this application provides a computer program product which, when run on an electronic device, causes the electronic device to perform the method for measuring distance according to any one of the first aspect.
In a sixth aspect, an embodiment of this application provides a chip system including a processor coupled to a memory, the processor executing a computer program stored in the memory to implement the method for measuring distance according to any one of the first aspect.
It can be understood that, for the beneficial effects of the second to sixth aspects, reference may be made to the relevant description in the first aspect, which is not repeated here.
Brief Description of the Drawings
FIG. 1 is a schematic structural diagram of an electronic device provided by an embodiment of this application;
FIG. 2 is a block diagram of the software structure of the electronic device according to an embodiment of this application;
FIG. 3 is a schematic diagram of a smart indoor fitness scenario provided by an embodiment of this application;
FIG. 4 is a schematic diagram of the laser-based ranging principle provided by an embodiment of this application;
FIG. 5 is a schematic diagram of the binocular-camera-based ranging principle provided by an embodiment of this application;
FIG. 6 is a schematic diagram of the ranging principle of a depth-information-based imaging module provided by an embodiment of this application;
FIG. 7 is a flowchart of an implementation of the method for measuring distance provided by an embodiment of this application;
FIG. 8 is a schematic diagram of automatic screen lighting provided by an embodiment of this application;
FIG. 9 is a detailed implementation flowchart of the method for measuring distance provided by an embodiment of this application;
FIG. 10 is a schematic diagram of determining the ranging reference parameter based on markers provided by an embodiment of this application;
FIG. 11 is a schematic diagram of identifying the actual size of a photographed object provided by an embodiment of this application;
FIG. 12 is a schematic diagram of prompts in multi-person monitoring provided by an embodiment of this application;
FIG. 13 is a schematic diagram of determining the distance value between any two target objects in a multi-person monitoring scenario provided by an embodiment of this application;
FIG. 14 is a flowchart of determining the ranging reference parameter provided by the first embodiment of this application;
FIG. 15 is a flowchart of determining the distance value based on actual height and pixel height provided by an embodiment of this application;
FIG. 16 is a flowchart of determining the ranging reference parameter provided by the second embodiment of this application;
FIG. 17 is a schematic diagram of the positional relationship between the preset marker and the target object provided by an embodiment of this application;
FIG. 18 is a schematic diagram of acquiring the pixel length when the preset marker is a yoga mat provided by an embodiment of this application;
FIG. 19 is a flowchart of determining the ranging reference parameter provided by the third embodiment of this application;
FIG. 20 is a schematic diagram of the first trajectory provided by an embodiment of this application;
FIG. 21 is a flowchart of determining the ranging reference parameter provided by the fourth embodiment of this application;
FIG. 22 is a schematic diagram of aligning the focus point with the target object provided by an embodiment of this application;
FIG. 23 is a schematic diagram of configuring the aforementioned correspondence table provided by an embodiment of this application;
FIG. 24 is a structural block diagram of the apparatus for measuring distance provided by an embodiment of this application;
FIG. 25 is a schematic diagram of an electronic device provided by an embodiment of this application.
Detailed Description
In the following description, for the purpose of illustration rather than limitation, specific details such as particular system structures and technologies are set forth in order to provide a thorough understanding of the embodiments of this application. However, it should be clear to those skilled in the art that this application can also be implemented in other embodiments without these specific details. In other cases, detailed descriptions of well-known systems, apparatuses, circuits, and methods are omitted so that unnecessary detail does not obscure the description of this application.
It should be understood that, when used in the specification and the appended claims of this application, the term "comprising" indicates the presence of the described features, wholes, steps, operations, elements, and/or components, but does not exclude the presence or addition of one or more other features, wholes, steps, operations, elements, components, and/or collections thereof.
It should also be understood that the term "and/or" used in the specification and the appended claims of this application refers to any combination and all possible combinations of one or more of the associated listed items, and includes these combinations.
As used in the specification and the appended claims of this application, the term "if" may be interpreted, depending on the context, as "when", "once", "in response to determining", or "in response to detecting". Similarly, the phrases "if it is determined" or "if [the described condition or event] is detected" may be interpreted, depending on the context, to mean "once determined", "in response to determining", "once [the described condition or event] is detected", or "in response to detecting [the described condition or event]".
In addition, in the description of the specification and appended claims of this application, the terms "first", "second", "third", and so on are used only to distinguish the descriptions and cannot be understood as indicating or implying relative importance.
References in this specification to "one embodiment" or "some embodiments" and the like mean that a particular feature, structure, or characteristic described in connection with the embodiment is included in one or more embodiments of this application. Thus, the phrases "in one embodiment", "in some embodiments", "in some other embodiments", "in still other embodiments", etc. appearing in different places in this specification do not necessarily all refer to the same embodiment, but rather mean "one or more but not all embodiments", unless specifically emphasized otherwise. The terms "comprising", "including", "having", and their variants all mean "including but not limited to", unless specifically emphasized otherwise.
The method for measuring distance provided by the embodiments of this application can be applied to electronic devices such as mobile phones, tablet computers, wearable devices, vehicle-mounted devices, augmented reality (AR)/virtual reality (VR) devices, notebook computers, ultra-mobile personal computers (UMPC), netbooks, and personal digital assistants (PDA); the embodiments of this application place no restriction on the specific type of electronic device.
For example, the electronic device may be a station (STATION, ST) in a WLAN, a cellular phone, a cordless phone, a Session Initiation Protocol (SIP) phone, a Wireless Local Loop (WLL) station, a Personal Digital Assistant (PDA) device, a handheld device with wireless communication capability, a computing device or other processing device connected to a wireless modem, a computer, a laptop computer, a handheld communication device, a handheld computing device, and/or other devices for communicating on wireless systems, as well as next-generation communication systems, for example a mobile terminal in a 5G network or a mobile terminal in a future evolved Public Land Mobile Network (PLMN), and so on.
By way of example and not limitation, when the electronic device is a wearable device, the wearable device may also be a general term for wearable devices developed by applying wearable technology to the intelligent design of everyday wear, such as glasses, gloves, watches, clothing, and shoes. A wearable device is a portable device that is worn directly on the body or integrated into the user's clothing or accessories and that collects the user's biometric data by being attached to the user. Wearable devices are not just hardware; they achieve powerful functions through software support, data exchange, and cloud interaction. In a broad sense, wearable smart devices include full-featured, large-sized devices that can realize complete or partial functions without relying on a smartphone, such as smart watches or smart glasses, as well as devices that focus on only one type of application function and need to be used together with other devices such as smartphones, such as various smart bracelets and smart jewelry with unlockable touchscreens.
FIG. 1 shows a schematic structural diagram of an electronic device 100.
The electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, a headset jack 170D, a sensor module 180, keys 190, a motor 191, an indicator 192, a camera 193, a display 194, and a subscriber identification module (SIM) card interface 195, among others. The sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, a barometric pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and so on.
It can be understood that the structure illustrated in this embodiment of the invention does not constitute a specific limitation on the electronic device 100. In other embodiments of this application, the electronic device 100 may include more or fewer components than shown, combine certain components, split certain components, or arrange the components differently. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The processor 110 may include one or more processing units; for example, the processor 110 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU), and so on. Different processing units may be independent devices or may be integrated in one or more processors.
The controller can generate operation control signals according to the instruction operation code and timing signals, completing the control of instruction fetching and instruction execution.
A memory may also be provided in the processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache. This memory may hold instructions or data that the processor 110 has just used or uses cyclically. If the processor 110 needs to use the instruction or data again, it can be called directly from this memory. Repeated accesses are avoided and the waiting time of the processor 110 is reduced, thereby improving system efficiency.
In some embodiments, the processor 110 may include one or more interfaces. The interfaces may include an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (SIM) interface, and/or a universal serial bus (USB) interface, and so on.
The I2C interface is a bidirectional synchronous serial bus including a serial data line (SDA) and a serial clock line (SCL). In some embodiments, the processor 110 may contain multiple sets of I2C buses. The processor 110 may be coupled to the touch sensor 180K, a charger, a flash, the camera 193, and so on through different I2C bus interfaces. For example, the processor 110 may be coupled to the touch sensor 180K through an I2C interface, so that the processor 110 communicates with the touch sensor 180K through the I2C bus interface to realize the touch function of the electronic device 100.
The I2S interface can be used for audio communication. In some embodiments, the processor 110 may contain multiple sets of I2S buses. The processor 110 may be coupled to the audio module 170 through an I2S bus to realize communication between the processor 110 and the audio module 170. In some embodiments, the audio module 170 may pass audio signals to the wireless communication module 160 through the I2S interface to realize the function of answering calls through a Bluetooth headset.
The PCM interface can also be used for audio communication, sampling, quantizing, and encoding analog signals. In some embodiments, the audio module 170 and the wireless communication module 160 may be coupled through a PCM bus interface. In some embodiments, the audio module 170 may also pass audio signals to the wireless communication module 160 through the PCM interface to realize the function of answering calls through a Bluetooth headset. Both the I2S interface and the PCM interface can be used for audio communication.
The UART interface is a universal serial data bus used for asynchronous communication. The bus may be a bidirectional communication bus. It converts the data to be transmitted between serial and parallel communication. In some embodiments, the UART interface is usually used to connect the processor 110 and the wireless communication module 160. For example, the processor 110 communicates with the Bluetooth module in the wireless communication module 160 through the UART interface to realize the Bluetooth function. In some embodiments, the audio module 170 may pass audio signals to the wireless communication module 160 through the UART interface to realize the function of playing music through a Bluetooth headset.
The MIPI interface may be used to connect the processor 110 with peripheral devices such as the display 194 and the camera 193. The MIPI interface includes a camera serial interface (CSI), a display serial interface (DSI), and the like. In some embodiments, the processor 110 and the camera 193 communicate through the CSI interface to realize the shooting function of the electronic device 100, and the processor 110 and the display 194 communicate through the DSI interface to realize the display function of the electronic device 100.
The GPIO interface can be configured through software. The GPIO interface can be configured as a control signal or as a data signal. In some embodiments, the GPIO interface may be used to connect the processor 110 with the camera 193, the display 194, the wireless communication module 160, the audio module 170, the sensor module 180, and so on. The GPIO interface may also be configured as an I2C interface, I2S interface, UART interface, MIPI interface, etc.
The USB interface 130 is an interface conforming to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type-C interface, or the like. The USB interface 130 may be used to connect a charger to charge the electronic device 100, and may also be used to transfer data between the electronic device 100 and peripheral devices. It may also be used to connect headphones and play audio through them. This interface may also be used to connect other electronic devices, such as AR devices.
It can be understood that the interface connection relationships between the modules illustrated in this embodiment of the invention are only schematic illustrations and do not constitute a structural limitation on the electronic device 100. In other embodiments of this application, the electronic device 100 may also adopt interface connection methods different from those in the above embodiment, or a combination of multiple interface connection methods.
The charging management module 140 is used to receive charging input from a charger. The charger may be a wireless charger or a wired charger. In some wired charging embodiments, the charging management module 140 may receive the charging input of a wired charger through the USB interface 130. In some wireless charging embodiments, the charging management module 140 may receive wireless charging input through a wireless charging coil of the electronic device 100. While charging the battery 142, the charging management module 140 may also supply power to the electronic device through the power management module 141.
The power management module 141 is used to connect the battery 142, the charging management module 140, and the processor 110. The power management module 141 receives input from the battery 142 and/or the charging management module 140 and supplies power to the processor 110, the internal memory 121, the display 194, the camera 193, the wireless communication module 160, and so on. The power management module 141 can also be used to monitor parameters such as battery capacity, battery cycle count, and battery health status (leakage, impedance). In some other embodiments, the power management module 141 may also be provided in the processor 110. In other embodiments, the power management module 141 and the charging management module 140 may also be provided in the same device.
The wireless communication function of the electronic device 100 can be realized through the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, the modem processor, the baseband processor, and so on.
Antenna 1 and antenna 2 are used to transmit and receive electromagnetic wave signals. Each antenna in the electronic device 100 may be used to cover a single or multiple communication frequency bands. Different antennas may also be multiplexed to improve antenna utilization. For example, antenna 1 may be multiplexed as a diversity antenna of the wireless local area network. In some other embodiments, the antennas may be used in combination with tuning switches.
The mobile communication module 150 can provide wireless communication solutions applied to the electronic device 100, including 2G/3G/4G/5G. The mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (LNA), and so on. The mobile communication module 150 can receive electromagnetic waves via antenna 1, filter and amplify the received electromagnetic waves, and pass them to the modem processor for demodulation. The mobile communication module 150 can also amplify the signal modulated by the modem processor and convert it into electromagnetic waves radiated out via antenna 1. In some embodiments, at least some functional modules of the mobile communication module 150 may be provided in the processor 110. In some embodiments, at least some functional modules of the mobile communication module 150 and at least some modules of the processor 110 may be provided in the same device.
The modem processor may include a modulator and a demodulator. The modulator is used to modulate the low-frequency baseband signal to be sent into a medium-high frequency signal. The demodulator is used to demodulate the received electromagnetic wave signal into a low-frequency baseband signal, which it then passes to the baseband processor for processing. After being processed by the baseband processor, the low-frequency baseband signal is passed to the application processor. The application processor outputs sound signals through audio devices (not limited to the speaker 170A, the receiver 170B, etc.) or displays images or video through the display 194. In some embodiments, the modem processor may be an independent device. In other embodiments, the modem processor may be independent of the processor 110 and provided in the same device as the mobile communication module 150 or other functional modules.
The wireless communication module 160 can provide wireless communication solutions applied to the electronic device 100, including wireless local area networks (WLAN) (such as wireless fidelity (Wi-Fi) networks), Bluetooth (BT), global navigation satellite systems (GNSS), frequency modulation (FM), near field communication (NFC), infrared (IR), and so on. The wireless communication module 160 may be one or more devices integrating at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via antenna 2, frequency-modulates and filters the electromagnetic wave signals, and sends the processed signals to the processor 110. The wireless communication module 160 can also receive signals to be sent from the processor 110, frequency-modulate and amplify them, and convert them into electromagnetic waves radiated out via antenna 2.
In some embodiments, antenna 1 of the electronic device 100 is coupled with the mobile communication module 150 and antenna 2 is coupled with the wireless communication module 160, so that the electronic device 100 can communicate with networks and other devices through wireless communication technologies. The wireless communication technologies may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division code division multiple access (TD-SCDMA), long term evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technologies. The GNSS may include the global positioning system (GPS), the global navigation satellite system (GLONASS), the BeiDou navigation satellite system (BDS), the quasi-zenith satellite system (QZSS), and/or satellite based augmentation systems (SBAS).
The electronic device 100 realizes the display function through the GPU, the display 194, the application processor, and so on. The GPU is a microprocessor for image processing, connecting the display 194 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. The processor 110 may include one or more GPUs that execute program instructions to generate or change display information.
The display 194 is used to display images, videos, and the like. The display 194 includes a display panel. The display panel may use a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a MiniLED, a MicroLED, a Micro-OLED, quantum dot light-emitting diodes (QLED), and so on. In some embodiments, the electronic device 100 may include 1 or N displays 194, where N is a positive integer greater than 1. The display 194 may include a touch panel and other input devices.
The electronic device 100 can realize the shooting function through the ISP, the camera 193, the video codec, the GPU, the display 194, the application processor, and so on.
The ISP is used to process the data fed back by the camera 193. For example, when taking a photo, the shutter opens, light is transmitted through the lens to the camera's photosensitive element, the optical signal is converted into an electrical signal, and the camera's photosensitive element passes the electrical signal to the ISP for processing, converting it into an image visible to the naked eye. The ISP can also perform algorithmic optimization on the image's noise, brightness, and skin tone, and can optimize parameters such as exposure and color temperature of the shooting scene. In some embodiments, the ISP may be provided in the camera 193.
The camera 193 is used to capture still images or video. An object generates an optical image through the lens, which is projected onto the photosensitive element. The photosensitive element may be a charge coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor. The photosensitive element converts the optical signal into an electrical signal, and then passes the electrical signal to the ISP to be converted into a digital image signal. The ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into an image signal in a standard format such as RGB or YUV. In some embodiments, the electronic device 100 may include 1 or N cameras 193, where N is a positive integer greater than 1.
The digital signal processor is used to process digital signals; in addition to digital image signals, it can also process other digital signals. For example, when the electronic device 100 performs frequency point selection, the digital signal processor is used to perform a Fourier transform and the like on the frequency point energy.
The video codec is used to compress or decompress digital video. The electronic device 100 may support one or more video codecs. In this way, the electronic device 100 can play or record videos in multiple encoding formats, such as moving picture experts group (MPEG) 1, MPEG2, MPEG3, MPEG4, and so on.
The NPU is a neural-network (NN) computing processor. By drawing on the structure of biological neural networks, for example the transfer pattern between neurons in the human brain, it processes input information quickly and can also continuously self-learn. Applications such as intelligent cognition of the electronic device 100 can be realized through the NPU, for example image recognition, face recognition, speech recognition, and text understanding.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to expand the storage capacity of the electronic device 100. The external memory card communicates with the processor 110 through the external memory interface 120 to realize the data storage function, for example saving files such as music and video in the external memory card.
The internal memory 121 may be used to store computer-executable program code, which includes instructions. The internal memory 121 may include a program storage area and a data storage area. The program storage area may store the operating system and at least one application required by a function (such as a sound playback function or an image playback function). The data storage area may store data created during the use of the electronic device 100 (such as audio data and the phone book). In addition, the internal memory 121 may include high-speed random access memory and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or universal flash storage (UFS). The processor 110 executes the various functional applications and data processing of the electronic device 100 by running instructions stored in the internal memory 121 and/or instructions stored in a memory provided in the processor.
The electronic device 100 can realize audio functions, such as music playback and recording, through the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the headset jack 170D, the application processor, and so on.
The audio module 170 is used to convert digital audio information into an analog audio signal output, and also to convert an analog audio input into a digital audio signal. The audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be provided in the processor 110, or some functional modules of the audio module 170 may be provided in the processor 110.
The speaker 170A, also called the "horn", is used to convert audio electrical signals into sound signals. The electronic device 100 can play music or hands-free calls through the speaker 170A.
The receiver 170B, also called the "earpiece", is used to convert audio electrical signals into sound signals. When the electronic device 100 answers a call or voice message, the voice can be heard by bringing the receiver 170B close to the ear.
The microphone 170C, also called the "mouthpiece" or "mike", is used to convert sound signals into electrical signals. When making a call or sending a voice message, the user can speak with the mouth close to the microphone 170C to input the sound signal into the microphone 170C. The electronic device 100 may be provided with at least one microphone 170C. In other embodiments, the electronic device 100 may be provided with two microphones 170C, which, in addition to collecting sound signals, can also realize a noise reduction function. In still other embodiments, the electronic device 100 may be provided with three, four, or more microphones 170C to collect sound signals, reduce noise, identify the sound source, realize directional recording functions, and so on.
The headset jack 170D is used to connect wired headphones. The headset jack 170D may be the USB interface 130, a 3.5 mm open mobile terminal platform (OMTP) standard interface, or a cellular telecommunications industry association of the USA (CTIA) standard interface.
The pressure sensor 180A is used to sense pressure signals and can convert pressure signals into electrical signals. In some embodiments, the pressure sensor 180A may be provided on the display 194. There are many types of pressure sensors 180A, such as resistive, inductive, and capacitive pressure sensors. A capacitive pressure sensor may include at least two parallel plates with conductive material. When a force acts on the pressure sensor 180A, the capacitance between the electrodes changes, and the electronic device 100 determines the intensity of the pressure from the change in capacitance. When a touch operation acts on the display 194, the electronic device 100 detects the intensity of the touch operation through the pressure sensor 180A, and may also calculate the touched position from the detection signal of the pressure sensor 180A. In some embodiments, touch operations acting on the same touch position but with different touch operation intensities may correspond to different operation instructions. For example, when a touch operation whose intensity is less than a first pressure threshold acts on the SMS application icon, an instruction to view the SMS message is executed; when a touch operation whose intensity is greater than or equal to the first pressure threshold acts on the SMS application icon, an instruction to create a new SMS message is executed.
The gyroscope sensor 180B may be used to determine the motion posture of the electronic device 100. In some embodiments, the angular velocity of the electronic device 100 about three axes (i.e., the x, y, and z axes) can be determined by the gyroscope sensor 180B. The gyroscope sensor 180B can be used for image stabilization when shooting: exemplarily, when the shutter is pressed, the gyroscope sensor 180B detects the angle of shake of the electronic device 100, calculates from the angle the distance the lens module needs to compensate, and lets the lens counteract the shake of the electronic device 100 through reverse motion to achieve stabilization. The gyroscope sensor 180B can also be used for navigation and motion-sensing game scenarios.
The barometric pressure sensor 180C is used to measure air pressure. In some embodiments, the electronic device 100 calculates altitude from the air pressure value measured by the barometric pressure sensor 180C to assist positioning and navigation.
The magnetic sensor 180D includes a Hall sensor. The electronic device 100 can use the magnetic sensor 180D to detect the opening and closing of a flip leather case. In some embodiments, when the electronic device 100 is a flip phone, the electronic device 100 can detect the opening and closing of the flip cover through the magnetic sensor 180D, and then set features such as automatic unlocking on flip open based on the detected opening/closing state of the leather case or the flip cover.
The acceleration sensor 180E can detect the magnitude of the acceleration of the electronic device 100 in all directions (generally three axes). When the electronic device 100 is stationary, the magnitude and direction of gravity can be detected. It can also be used to identify the posture of the electronic device and applied to landscape/portrait switching, pedometers, and other applications.
The distance sensor 180F is used to measure distance. The electronic device 100 can measure distance by infrared or laser. In some embodiments, in shooting scenarios, the electronic device 100 can use the distance sensor 180F to measure distance to achieve fast focusing.
The proximity light sensor 180G may include, for example, a light-emitting diode (LED) and a light detector such as a photodiode. The light-emitting diode may be an infrared light-emitting diode. The electronic device 100 emits infrared light outward through the light-emitting diode and uses the photodiode to detect infrared light reflected by nearby objects. When sufficient reflected light is detected, it can be determined that there is an object near the electronic device 100; when insufficient reflected light is detected, the electronic device 100 can determine that there is no object nearby. The electronic device 100 can use the proximity light sensor 180G to detect that the user is holding the device close to the ear during a call, so as to automatically turn off the screen to save power. The proximity light sensor 180G can also be used in leather case mode and pocket mode to automatically unlock and lock the screen.
The ambient light sensor 180L is used to sense the ambient light brightness. The electronic device 100 can adaptively adjust the brightness of the display 194 according to the sensed ambient light brightness. The ambient light sensor 180L can also be used to automatically adjust the white balance when taking photos, and can cooperate with the proximity light sensor 180G to detect whether the electronic device 100 is in a pocket to prevent accidental touches.
The fingerprint sensor 180H is used to collect fingerprints. The electronic device 100 can use the collected fingerprint characteristics to realize fingerprint unlocking, accessing application locks, fingerprint photographing, fingerprint call answering, and so on.
The temperature sensor 180J is used to detect temperature. In some embodiments, the electronic device 100 uses the temperature detected by the temperature sensor 180J to execute a temperature processing strategy. For example, when the temperature reported by the temperature sensor 180J exceeds a threshold, the electronic device 100 reduces the performance of a processor located near the temperature sensor 180J in order to reduce power consumption and implement thermal protection. In other embodiments, when the temperature is below another threshold, the electronic device 100 heats the battery 142 to avoid an abnormal shutdown of the electronic device 100 caused by low temperature. In some other embodiments, when the temperature is below yet another threshold, the electronic device 100 boosts the output voltage of the battery 142 to avoid an abnormal shutdown caused by low temperature.
The touch sensor 180K is also called a "touch device". The touch sensor 180K may be provided on the display 194; the touch sensor 180K and the display 194 form a touchscreen, also called a "touch screen". The touch sensor 180K is used to detect touch operations acting on or near it. The touch sensor may pass the detected touch operation to the application processor to determine the touch event type. Visual output related to the touch operation may be provided through the display 194. In other embodiments, the touch sensor 180K may also be provided on the surface of the electronic device 100 at a position different from that of the display 194.
The bone conduction sensor 180M can acquire vibration signals. In some embodiments, the bone conduction sensor 180M can acquire the vibration signal of the vibrating bone of the human vocal part. The bone conduction sensor 180M can also contact the human pulse and receive the blood pressure beat signal. In some embodiments, the bone conduction sensor 180M may also be provided in a headset, combined into a bone conduction headset. The audio module 170 can parse out the voice signal based on the vibration signal of the vocal-part vibrating bone acquired by the bone conduction sensor 180M to realize the voice function. The application processor can parse heart rate information based on the blood pressure beat signal acquired by the bone conduction sensor 180M to realize the heart rate detection function.
The keys 190 include a power key, volume keys, and so on. The keys 190 may be mechanical keys or touch keys. The electronic device 100 can receive key input and generate key signal input related to user settings and function control of the electronic device 100.
The motor 191 can generate vibration prompts. The motor 191 can be used for incoming call vibration prompts and for touch vibration feedback. For example, touch operations acting on different applications (such as photographing and audio playback) may correspond to different vibration feedback effects. Touch operations acting on different areas of the display 194 may also correspond to different vibration feedback effects of the motor 191. Different application scenarios (for example time reminders, receiving messages, alarm clocks, and games) may also correspond to different vibration feedback effects. Touch vibration feedback effects can also be customized.
The indicator 192 may be an indicator light and may be used to indicate the charging status and battery level changes, and may also be used to indicate messages, missed calls, notifications, and so on.
The SIM card interface 195 is used to connect a SIM card. A SIM card can be inserted into or pulled out of the SIM card interface 195 to achieve contact with and separation from the electronic device 100. The electronic device 100 may support 1 or N SIM card interfaces, where N is a positive integer greater than 1. The SIM card interface 195 can support Nano SIM cards, Micro SIM cards, SIM cards, and so on. Multiple cards can be inserted into the same SIM card interface 195 at the same time; the types of the multiple cards may be the same or different. The SIM card interface 195 is also compatible with different types of SIM cards and with external memory cards. The electronic device 100 interacts with the network through the SIM card to realize functions such as calls and data communication. In some embodiments, the electronic device 100 uses an eSIM, i.e., an embedded SIM card. The eSIM card can be embedded in the electronic device 100 and cannot be separated from it.
The software system of the electronic device 100 may adopt a layered architecture, an event-driven architecture, a micro-kernel architecture, a micro-service architecture, or a cloud architecture. This embodiment of the invention takes the Android system with a layered architecture as an example to illustrate the software structure of the electronic device 100.
FIG. 2 is a block diagram of a software structure of the electronic device 100 according to an embodiment of the invention.
The layered architecture divides the software into several layers, each with a clear role and division of labor. The layers communicate through software interfaces. In some embodiments, the Android system is divided into four layers, from top to bottom: the application layer, the application framework layer, the Android runtime and system libraries, and the kernel layer.
The application layer may include a series of application packages.
As shown in FIG. 2, the application packages may include applications such as camera, gallery, calendar, phone, maps, navigation, WLAN, Bluetooth, music, video, and SMS.
The application framework layer provides an application programming interface (API) and programming framework for the applications in the application layer. The application framework layer includes some predefined functions.
As shown in FIG. 2, the application framework layer may include a window manager, content providers, a view system, a telephony manager, a resource manager, a notification manager, and so on.
The window manager is used to manage window programs. The window manager can obtain the display size, determine whether there is a status bar, lock the screen, take screenshots, and so on.
Content providers are used to store and retrieve data and make it accessible to applications. The data may include video, images, audio, calls made and received, browsing history and bookmarks, the phone book, and so on.
The view system includes visual controls, such as controls displaying text and controls displaying pictures. The view system can be used to build applications. A display interface may be composed of one or more views. For example, a display interface including an SMS notification icon may include a view displaying text and a view displaying a picture.
The telephony manager is used to provide the communication functions of the electronic device 100, for example the management of call states (including connecting, hanging up, and so on).
The resource manager provides various resources for applications, such as localized strings, icons, pictures, layout files, and video files.
The notification manager enables applications to display notification information in the status bar. It can be used to convey notification-type messages, which can disappear automatically after a short stay without user interaction. For example, the notification manager is used to notify download completion, message reminders, and so on. The notification manager may also present notifications in the top status bar of the system in the form of charts or scroll bar text, such as notifications of applications running in the background, or notifications appearing on the screen in the form of dialog windows. For example, text information is prompted in the status bar, a prompt sound is emitted, the electronic device vibrates, or the indicator light flashes.
The Android Runtime includes core libraries and a virtual machine. The Android runtime is responsible for the scheduling and management of the Android system.
The core libraries consist of two parts: one part is the functional functions that the Java language needs to call, and the other part is the core libraries of Android.
The application layer and the application framework layer run in the virtual machine. The virtual machine executes the Java files of the application layer and application framework layer as binary files. The virtual machine is used to perform functions such as object lifecycle management, stack management, thread management, security and exception management, and garbage collection.
The system libraries may include multiple functional modules, for example a surface manager, media libraries, a 3D graphics processing library (e.g., OpenGL ES), and a 2D graphics engine (e.g., SGL).
The surface manager is used to manage the display subsystem and provides the fusion of 2D and 3D layers for multiple applications.
The media libraries support playback and recording of many common audio and video formats, as well as static image files. The media libraries can support multiple audio and video encoding formats, such as MPEG4, H.264, MP3, AAC, AMR, JPG, and PNG.
The 3D graphics processing library is used to implement 3D graphics drawing, image rendering, compositing, layer processing, and so on.
The 2D graphics engine is a drawing engine for 2D drawing.
The kernel layer is the layer between hardware and software. The kernel layer contains at least the display driver, camera driver, audio driver, and sensor driver.
The following illustrates the workflow of the software and hardware of the electronic device 100 with reference to a photo-capturing scenario.
When the touch sensor 180K receives a touch operation, a corresponding hardware interrupt is sent to the kernel layer. The kernel layer processes the touch operation into a raw input event (including information such as the touch coordinates and the timestamp of the touch operation). The raw input event is stored in the kernel layer. The application framework layer obtains the raw input event from the kernel layer and identifies the control corresponding to the input event. Taking the touch operation being a touch click operation and the control corresponding to the click operation being the control of the camera application icon as an example, the camera application calls the interface of the application framework layer to start the camera application, then starts the camera driver by calling the kernel layer, and captures a still image or video through the camera 193.
Embodiment 1:
In human-computer interaction, interactive operations can be performed by contact, for example sending control instructions to the electronic device through a mouse, keyboard, or touchscreen, with the electronic device feeding back the corresponding processing result according to the received control instruction; they can also be performed contactlessly, for example converting a voice signal into a corresponding control instruction through speech recognition, or determining the control instruction corresponding to an action through motion capture. In some contactless human-computer interaction processes, the user's behavior can also be monitored and recognized, for example recognizing whether the user's sitting posture is correct and issuing a reminder when it is not; or, when the user is doing exercise training, recognizing whether the user's movements are consistent with the requirements of the training, so as to supervise the training. In the above contactless interactions, it is often necessary to measure the distance value between the electronic device and the interacting user, and whether the distance measurement is accurate directly affects the accuracy of the contactless interactive behavior.
In a possible implementation, the above contactless interaction scenario may be a smart indoor fitness scenario. Exemplarily, FIG. 3 shows a schematic diagram of a smart indoor fitness scenario provided by an embodiment of this application. Referring to FIG. 3, the electronic device can acquire video data containing the user through the camera module and, by parsing each video image frame in the video data, determine parameters such as the user's height, the distance value to the electronic device, and the movement speed, building a perceivable indoor exercise environment that provides the user with precise exercise guidance and improves exercise efficiency.
To realize contactless interactive operations, three distance measurement methods are provided below:
Method 1: the electronic device responding to the contactless interactive behavior may be equipped with a laser receiver and a laser transmitter. When distance measurement is needed, the laser transmitter emits a laser beam, the laser receiver receives the laser reflected by an obstruction, and a timer measures the time from emission to reception, thereby determining the distance value between the obstruction and the electronic device; FIG. 4 shows a schematic diagram of the laser-based ranging principle provided by an embodiment of this application. However, because the spot emitted by the laser is of limited size, the laser may at some moments fail to land on the user while the user is moving, causing the distance to the background to be measured instead; furthermore, laser transceiver components must be fitted, increasing the manufacturing cost of the electronic device. If multiple pairs of laser transceiver components are added to reduce the probability of the laser missing the user, the cost increases further, so ranging accuracy and cost control cannot be balanced.
Method 2: the electronic device may be equipped with a binocular camera, i.e., two or more cameras whose relative positions and angles are fixed. Two or more images containing the user are acquired through the binocular camera; from the preset internal parameters of the cameras and the relative position and angle between the two cameras, the disparity information of the user between the two images can be determined, and the distance value between the user and the camera determined in combination with the triangulation principle; FIG. 5 shows a schematic diagram of the binocular-camera-based ranging principle provided by an embodiment of this application. However, when determining the disparity information, this method requires the camera pose corresponding to the preset internal parameters to be exactly the same as the pose of the binocular camera in actual use; the requirement on pose accuracy is high and the computation is relatively complex, and if the pose of the binocular camera is inconsistent with the pose associated with the internal parameters, the accuracy of the distance measurement is greatly reduced. Second, if the indoor light is too dark or too bright (i.e., in overexposed scenarios), the contour information of the photographed content is weak, the disparity information cannot be accurately identified, and accuracy is low, so the requirement on ambient light is high.
Method 3: the electronic device may be equipped with an imaging module capable of acquiring depth information, which includes a projector that projects structured light and a camera module for receiving it. The structured light is emitted by the projector and reflected on the surface of environmental objects (e.g., users or furniture) to form a corresponding structured light pattern; the camera module collects the structured light image and, according to the triangulation principle or by decoding the structured light pattern, generates depth information of the current environment, where the depth information determines the distance to the reflector from the round-trip time (i.e., phase difference) of each light pulse in the structured light. The electronic device can then obtain the distance value between the electronic device and the user based on the depth information; FIG. 6 shows a schematic diagram of the ranging principle of a depth-information-based imaging module provided by an embodiment of this application. However, this method requires a structured light projector and a corresponding imaging camera module, which greatly increases the manufacturing cost of the electronic device, and the ranging range of structured light is limited: beyond the preset measurement distance, measurement accuracy drops sharply, reducing the user's freedom of movement. On the other hand, when determining depth information based on structured light, the structured light pattern is easily affected by ambient light; for example, non-modulated light such as indoor glare affects the accuracy of the depth information.
It can be seen that none of the above three methods can balance ranging accuracy with controlling the device's manufacturing cost. Therefore, to address the defects of the above distance measurement processes, this application provides a method for measuring distance, detailed as follows. Referring to FIG. 7, the execution subject of the method is an electronic device, which may be a smartphone, tablet computer, computer, smart game console, or any device equipped with a camera module; optionally, the electronic device may be a smart television, which outputs exercise prompt images to realize smart indoor exercise. FIG. 7 shows an implementation flowchart of the method for measuring distance provided by an embodiment of this application, detailed as follows:
In S701, a ranging image containing a target object is acquired.
In this embodiment, the electronic device may be equipped with a camera module, which may be of any type. Since the type and imaging principle of the camera module do not affect the distance measurement process, the manufacturing cost of the electronic device can be reduced and an ordinary camera module suffices. Specifically, the camera module may be a digital camera fitted with a photosensitive element that collects the optical signal of the shooting scene, converts it into an electrical signal, and generates the corresponding electronic image, i.e., the aforementioned ranging image.
In this embodiment, the target object is specifically the object interacting with the electronic device. There may be one target object or multiple; the number of target objects is not limited here. It should be noted that, if there are two or more target objects, the electronic device can identify the distance value between each target object and the electronic device separately. Optionally, the electronic device can establish a correspondence between target objects and distance values, so that the electronic device can determine the distance between each target object and itself by querying that correspondence.
In a possible implementation, the electronic device may select at least one candidate object from multiple candidate objects as the target object. For example, when there are multiple users in the shot, the electronic device may select one of them as the user to receive fitness exercise prompts, i.e., the target object needs to be identified from among multiple users, so as to avoid also identifying other users passing through the shooting area as users to be monitored, which would reduce the accuracy of fitness monitoring. A specific implementation may be: the electronic device identifies the area occupied by each candidate object in the ranging image and selects the candidate object with the largest area as the target object. Another way to select the target object from multiple candidates may be: the electronic device identifies the keypoint information contained in each candidate object and selects the candidate object whose keypoint information matches standard key information as the target object. Since some fitness monitoring requires an unobstructed video of the user's movement, including for example the legs, abdomen, head, and hands, the electronic device can determine multiple standard keypoints and generate the above standard key information from all standard keypoints. The electronic device can identify the keypoints contained by each candidate object in the ranging image, generate the above keypoint information, and judge whether the keypoint information contains all the keypoints in the standard key information; if so, the match succeeds and the candidate object is identified as the target object; otherwise, the match fails and the candidate object is identified as a non-target object.
In a possible implementation, the electronic device may store multiple preset actions. If the electronic device detects that the action of any photographed subject matches a preset action, it identifies that subject as the target object and tracks it through face matching, tracking algorithms, or the like.
In this embodiment, when the electronic device needs to determine the distance value between the electronic device and the target object, it can control the camera module to acquire a ranging image containing the target object. Optionally, the electronic device may receive a ranging request instruction initiated by the user, for example by clicking a ranging button or speaking the voice control instruction corresponding to the ranging request (such as a voice signal like "measure distance" or "start ranging"), upon which the electronic device performs the operation of S701.
In a possible implementation, the electronic device may run a fitness monitoring application, which may be configured with ranging conditions. For example, when it is detected that the position of the target object has changed, a ranging instruction is generated to perform the operation of S701; the ranging condition may also be that, when activation of the monitoring function is detected, the operation of S701 is performed at a preset detection period.
In a possible implementation, the electronic device may be configured with a detection period to periodically determine the distance value between the electronic device and the target object. The detection period can be set according to the application currently launched on the electronic device. For example, during fitness exercise monitoring the user moves quickly, so the period can be set short, e.g., detecting the distance value once every 0.2 s; in the standby state (where the electronic device can be activated by detecting the distance to the user), the period can be set long, e.g., detecting the distance value once every 1 s.
In one application scenario, the electronic device can light up the screen according to the distance to the user. FIG. 8 shows a schematic diagram of automatic screen lighting provided by an embodiment of this application. Referring to (a) in FIG. 8, the user is far from the electronic device, which is in the screen-off state (or standby state); at this time the electronic device can acquire the distance value between the user and itself at a lower-frequency detection period. When the user walks toward the electronic device, the distance becomes shorter; when, in some detection period, the electronic device determines that the distance value between the electronic device and the user is less than a preset activation threshold, as in (b) of FIG. 8, the electronic device lights up the screen and acquires the distance value between the user and itself at a higher-frequency detection period. By dynamically adjusting the detection period according to the state of the electronic device, the real-time requirement of distance measurement is guaranteed while device energy consumption is reduced.
In this embodiment, after obtaining the ranging image through the camera module, the electronic device can identify whether the ranging image contains a target object. If it does, the operation of S702 is performed; otherwise, if the ranging image contains no target object, the device waits for the next acquisition of a ranging image. Specifically, the electronic device may be configured with feature information of the target object; for example, if the target object is a physical person, the corresponding feature information is human body feature information, such as face information and body part information. If any photographed subject in the ranging image matches the above feature information, the ranging image is identified as containing a target object; for example, if any photographed subject in the ranging image contains a face, that subject is identified as the target object.
Further, as another embodiment of this application, FIG. 9 shows a detailed implementation flowchart of the method for measuring distance provided by an embodiment of this application. Referring to FIG. 9, compared with the embodiment described in FIG. 7, the method for measuring distance provided by this embodiment may further include S901 and S902 before S701, detailed as follows:
Further, before acquiring the ranging image containing the target object, the method further includes:
In S901, in response to an application selection operation of the target object on an application interface, the target application corresponding to the application selection operation is determined.
In this embodiment, the electronic device can display an application interface through the display module, where the application interface contains the application icons of multiple applications. The electronic device can receive an application selection operation initiated by the user on the application interface and determine the corresponding target application in the application interface according to that operation.
In a possible implementation, the user may send the application selection operation to the electronic device through a controller. In this case, each application may correspond to an application identifier, and the user may generate the application selection instruction by pressing the keys on the controller corresponding to the application identifier. For example, if the application identifier of a certain application is "01", the user may press key "0" and key "1" on the controller, or directly press key "1", to generate the application selection operation and send it to the electronic device. A selection cursor may also be displayed in the application interface; the user can control the movement of the cursor through the controller and, when the cursor moves to the display area of the application icon of the desired application, send a confirm-selection instruction to the electronic device, which then determines the target application according to the current coordinate position of the cursor and generates the above application selection operation.
In a possible implementation, the user may generate the application selection operation through the touchscreen on the electronic device. The electronic device may output the above application interface through the touchscreen, on which the user can perform a click operation. The electronic device can identify the click coordinates corresponding to the click operation, determine the associated target application from the click coordinates, and identify the click operation as the above application selection operation.
In a possible implementation, the electronic device can acquire video data of the target object through the camera module, parse the video data, determine the action trajectory of the target object, and determine from the action trajectory the target application the user wants to select; that is, the action trajectory is the application selection operation initiated by the user. The electronic device may be provided with a correspondence table between action trajectories and applications. After determining the target object's action trajectory from the video data, the electronic device can query whether the correspondence table contains a matching standard action; if a matching standard action exists, the application corresponding to that standard action is identified as the target application.
In this embodiment, the electronic device may record the application type of each application. According to the operation mode of the application, all applications can be divided into interactive and non-interactive applications; interactive applications can be further divided into contact-interaction-type and contactless-interaction-type applications. After determining the target application from the user's application selection operation, the electronic device can identify the application type of the target application and judge whether it is an interactive application. If not, there is no need to collect ranging images. If the target application is an interactive application, it judges whether the operation mode is the contact interaction type or the contactless interaction type. If it is the contact interaction type, the user's touch operations are received and responded to through the target application; the contact interaction type includes touchscreen interaction or control-device interaction (such as mouse control, keyboard control, and/or controller control). If it is a contactless-interaction-type application, the operation of S902 is performed.
In S902, if the target application is an application of the contactless interaction type, the acquiring of the ranging image containing the target object is performed, so as to perform the contactless interactive operation using the distance value.
In this embodiment, the electronic device detects that the target application is a contactless-interaction-type application, for example a fitness monitoring application or a motion-sensing game application. When collecting the contactless operations of the interacting object, a contactless-interaction-type application, besides determining the action data of the interacting object (i.e., the target object), often needs to collect the distance value between the electronic device and the interacting object, and uses the distance value for contactless interactive operations such as operation prompts, generation of operation instructions, and calibration of action data. On this basis, the electronic device starts the camera module, collects through it a ranging image containing the interacting object (i.e., the target object), obtains the distance value between the electronic device and the interacting object through the operations of S701 to S703, and performs contactless interactive operations based on that distance value.
In this embodiment of the application, by receiving the target object's application selection operation, launching the target application in response, and performing the distance measurement procedure when the target application is a contactless-interaction-type application, the distance measurement operation can be executed precisely and unnecessary ranging operations can be avoided.
In S702, a ranging reference parameter is determined from the ranging image.
In this embodiment, the electronic device can determine from the ranging image a reference parameter used to determine the scaling ratio of the target object's position at the moment the image was acquired, i.e., the above ranging reference parameter. The electronic device can determine the scaling ratio of the target object in the ranging image from this ranging reference parameter, and thereby determine the distance value to the electronic device from the scaling ratio.
In a possible implementation, the way of determining the ranging reference parameter from the ranging image may be: in the scene where the electronic device is placed, markers may be placed at multiple key position points. According to the prompts of the electronic device, the user can set markers at multiple key points whose distances from the electronic device are preset values. FIG. 10 shows a schematic diagram of determining the ranging reference parameter based on markers provided by an embodiment of this application. Referring to FIG. 10, before ranging, the electronic device can output prompt information prompting the user to set markers at multiple key points, namely four key points 0.5 m, 1 m, 1.5 m, and 2 m away from the electronic device. The user can configure corresponding markers at the different key points, for example sticking corresponding patterns on the corresponding floor positions, as shown in FIG. 10. When acquiring the ranging image, the electronic device can identify the markers contained in it. After identifying each marker, the electronic device can determine the distance value between the target object and the electronic device from the relative position between the target object and each marker. As shown in FIG. 10, in the ranging image the target object is between the first marker (i.e., 0.5 m) and the second marker (1 m). Since the distance between each marker and the electronic device is fixed, the distance value between the target object and the electronic device lies between 0.5 m and 1 m; then, from the relative position to the two markers, it can be determined that the distance value between the target object and the electronic device is 0.8 m. Specifically, the actual distance between the target object and the electronic device can be calculated from the pixel distances to the two markers using a preset conversion algorithm.
在一种可能的实现方式中,根据测距图像确定测距参照参量的方式可以为:电子设备可以识别测距图像内包含的预设标定物,根据预设标定物在测距图像内的像素尺寸得到测距参照参量。举例性地,上述预设标定物为形状尺寸固定且可知的任一物体,如用户所在室内场景下的椅子、桌子以及家用电器等。电子设备可以预先存储有上述预设标定物的尺寸大小,如在电子设备中录入有桌子的外观形状以及对应的实际尺寸。若电子设备并非存储有预设标定物的尺寸大小,又或者获取的测距图像内并不包含已经录入的任一预设标定物,则可以识别测距图像内包含的拍摄对象,并通过网络查询或预设的查询应用获取该拍摄对象的尺寸信息,从而确定该拍摄对象的实际尺寸。示例性地,图11示出了本申请一实施例提供的识别拍摄对象实际尺寸的示意图。参见图11所示,电子设备获取测距图像时,当前场景内包含有所需一冰箱,冰箱表面包含有对应的厂商的商标。电子设备可以根据冰箱的表面外观以及厂商的厂商从互联网上查询对应的设备型号,并根据设备型号确定对应的设备尺寸,即上述的实际尺寸,将该实际尺寸作为上述的测距参照参量。
在S703中,基于所述测距参照参量确定所述电子设备与所述目标对象之间的距离值。
在本实施例中，电子设备在确定了测距参照参量后，可以基于该测距参照参量计算目标对象与电子设备之间的距离值。电子设备可以配置有距离值转换算法，将测距参照参量导入到上述距离值转换算法内，可以计算得到与测距参照参量关联的距离值，并将该距离值转换算法输出的距离值识别为电子设备与目标对象之间的距离值。
在一种可能的实现方式中，电子设备在确定了目标对象与电子设备之间的距离值后，可以将该距离值反馈给当前运行的应用程序，以通过应用程序生成交互信息并进行显示。例如，电子设备当前运行着健身监控应用，则根据上述距离值判断目标对象与电子设备之间的距离是否过近，由于距离过近时目标对象在进行健身时有可能撞击到电子设备，需要进行警告。因此，健身监控应用可以将上述距离值与预设的距离阈值进行比对，若该距离值小于或等于距离阈值，则输出距离过近的警告信息；反之，若该距离值大于距离阈值，则无需对目标对象进行异常警告。
在一种可能的实现方式中，若测距图像内包含多个目标对象，且电子设备分别识别了各个目标对象与电子设备之间的距离值，则电子设备可以根据各个目标对象对应的距离值，分别生成关联的交互信息并进行显示。继续以距离是否过近为例进行说明。示例性地，图12示出了本申请一实施例提供的多人监控的提示示意图。参见图12所示，测距图像内包含有多个目标对象，分别为用户1、用户2以及用户3，电子设备通过上述方式分别识别得到各个目标对象与电子设备之间的距离值，分别为1m、0.5m以及1.2m，其中，预设的距离阈值（用于提示距离过近）为0.8m，因此可以确定用户2与电子设备之间的距离过近，从而可以对用户2进行距离异常提示，如图12所示，显示"过近，请远离设备"的提示信息。具体地，可以根据目标对象当前的距离值与预设的距离阈值之间的差值，生成提示信息，例如"过近，请远离设备0.3m"。
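上述根据距离值与距离阈值的差值生成提示信息的逻辑，可以用如下Python草图示意（0.8m的阈值与提示文案仅沿用正文示例，属示例性假设）：

```python
def proximity_warning(dist, threshold=0.8):
    """将距离值与预设的距离阈值比对: 距离过近时按差值生成提示信息, 否则不提示。"""
    if dist <= threshold:
        # 按当前距离值与阈值之间的差值, 生成带有移动建议的提示信息
        return f"过近，请远离设备{threshold - dist:.1f}m"
    return None
```

例如对距离为0.5m的用户2，生成的提示即为正文示例中的"过近，请远离设备0.3m"。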
在一种可能的实现方式中，若测距图像内包含了多个目标对象，则除了可以确定各个目标对象分别与电子设备之间的距离值外，还可以测量各个目标对象之间的距离值。具体的计算过程可以根据所需测量距离的两个目标对象与电子设备之间的距离值L1和L2，以及在测距图像内两个目标对象之间的像素距离L3进行确定。图13示出了本申请一实施例提供的多人监控场景下，确定任意两个目标对象之间距离值的示意图。图13中分别表示出上述三个距离值，则可以根据勾股定理计算得到两个目标对象之间的距离值。需要说明的是，电子设备还可以通过两个目标对象之间的像素距离以及通过测距参照参量确定的缩放比例，直接计算两个对象之间的实际距离。同样地，若任意两个目标对象之间的距离值过近，也可以输出提示信息，如图12所示，用户1和用户2距离过近，则可以提示用户1向左移动，以远离用户2。
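两个目标对象之间距离的一种计算可以用如下Python草图示意（仅为示意性草图：假设纵深差取|L1-L2|，横向实际距离由像素距离L3乘以缩放比例scale得到，再按勾股定理合成；scale的数值为示例性假设）：

```python
import math

def distance_between_targets(L1, L2, pixel_dist, scale):
    """估算两个目标对象之间的实际距离。

    L1, L2:     两个目标对象分别与电子设备之间的距离值(m);
    pixel_dist: 测距图像内两个目标对象之间的像素距离L3;
    scale:      由测距参照参量确定的像素-实际长度换算比例(m/像素)。
    """
    lateral = pixel_dist * scale       # 垂直于拍摄方向的横向实际距离
    depth = abs(L1 - L2)               # 沿拍摄方向的纵深差
    return math.hypot(lateral, depth)  # 勾股定理合成两目标之间的距离
```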
需要说明的是,若监控过程是持续的,即电子设备获取的是目标用户的监控视频,则可以对该监控视频进行解析,提取监控视频内的各个视频图像帧,将各个视频图像帧作为上述的测距图像执行S701至S703的操作,确定各个视频图像帧中目标对象与电子设备之间的距离值,实现了连续监控的目的。
以上可以看出，本申请实施例提供的一种测量距离的方法可以通过获取包含目标对象的测距图像（上述目标对象即为执行非接触式交互行为的用户），通过上述测距图像提取用于测量距离的测距参照参量，从而可以根据测距参照参量确定电子设备与目标对象之间的距离值，电子设备只需包含一个摄像模块即可实现。本实施例测量电子设备与目标对象之间的距离值时，并不依赖深度图像，也不需要通过双目摄像头的拍摄角度差的方式来进行测距，因此电子设备无需配置基于光脉冲的收发器以及双目摄像头等模块，从而大大降低了电子设备的造价成本；与此同时，由于在进行距离测量的过程中，通过确定测距参照参量，并非直接获取距离值，从而能够提高测距的准确性。
作为本申请的另一实施例,从测距图像提取测距参照参量,并根据测距参照参量确定目标对象与电子设备之间的距离值的方式可以至少包含以下四种方式:
方式1：在目标对象为站立时，电子设备可以根据目标对象预存的实际高度以及目标对象在测距图像内的像素高度，确定缩放比例，将缩放比例作为上述的测距参照参量。
方式2:在测距图像内包含尺寸已知的标志物时,电子设备可以确定标志物的实际尺寸与标志物在测距图像内的像素长度,确定缩放比例,将缩放比例作为上述的测距参照参量。
方式3:若用户佩戴了可穿戴设备,则电子设备可以根据可穿戴设备的实际移动距离与在多个测距图像内该可穿戴设备的图像轨迹的距离,确定缩放比例,将缩放比例作为上述的测距参照参量。
方式4:电子设备配置有数字对焦摄像模块,电子设备可以将拍摄测距图像时的数字焦距作为上述的测距参照参量,并根据数字焦距确定目标用户与电子设备之间的距离值。
具体地,上述4种实现方式的具体实现过程如下:
方式1:
图14示出了本申请第一实施例提供的测距参照参量的确定实现流程图。参见图14所示,相对于图7所述的实施例,该测量距离的方法中S702具体包括S1401以及S1402,详述如下:
在S1401中,识别所述目标对象在所述测距图像内的像素高度。
在本实施例中,电子设备可以通过预设的目标对象识别算法,确定目标对象在测距图像内对应的对象区域图像,该对象区域图像具体为基于目标对象的轮廓线从测距图像中划分得到的区域图像。
在一种可能的实现方式中，若该目标对象为实体人，则上述的目标对象识别算法具体为人体识别算法。在该情况下，电子设备可以配置有多个特征点，例如头部特征点、肩部特征点、手部特征点以及腿部特征点等。电子设备可以判断测距图像内是否包含预配置的多个特征点，若包含，则识别该测距图像内拍摄有实体人，并根据各个特征点识别出目标对象的轮廓线，基于轮廓线从测距图像中分离得到上述的对象区域图像；反之，若不包含任一上述特征点，则识别该测距图像内不包含实体人，即不存在目标对象，无需执行测距操作。
在一种可能的实现方式中，电子设备在确定了目标对象的多个特征点并得到轮廓线后，可以根据各个特征点之间的相对位置和/或上述轮廓线，识别目标对象的对象姿态。可选地，电子设备可以在对象区域图像中识别出多个骨骼节点，例如头部骨骼节点、颈部骨骼节点、脚部骨骼节点以及臀部骨骼节点等，根据各个骨骼节点之间的相对位置，确定目标对象的对象姿态，上述相对位置具体可以为多个不同骨骼节点之间所连接成直线的斜率，从而根据多个不同直线之间的斜率确定目标对象的对象姿态。由于获取目标对象的实际高度时，目标对象一般处于站立的姿态，若需要确定准确的缩放比例，则需要获取测距图像时目标对象的姿态与测量实际高度时的姿态一致，即处于站立姿态。因此，电子设备需要确定测距图像内的目标对象的对象姿态。若该对象姿态为站立姿态，则可以采用S1401以及S1402的方式确定测距参照参量；反之，若该目标对象的对象姿态为除站立姿态外的其他姿态，则不采用方式1确定测距参照参量。
在一种可能的实现方式中，电子设备在确定了目标对象的对象姿态后，还可以判断该目标对象的人体区域图像是否拍摄得到完整的人体，例如是否包含脚部至头部完整的人体。虽然测距图像内的目标对象为站立姿态，但若此时部分区域缺失，例如脚部没有拍摄得到，则测距图像内目标对象的像素高度与目标对象的实际高度之间不存在固定的对应关系，无法作为测距参照参量。基于此，电子设备在确定了目标对象的对象姿态为站立姿态后，可以进一步识别该人体区域图像是否包含完整的人体，例如判断人体区域图像内是否包含预设的多个关键点，若包含，则识别为包含完整的人体，则通过S1401以及S1402确定测距参照参量。
在本实施例中,电子设备从测距图像内提取了目标对象的人体区域图像后,可以统计该人体区域图像中目标对象所对应的像素高度,即人体区域图像中竖直方向上最低点与最高点之间间隔的像素点个数,即为上述的像素高度。
在S1402中,根据所述像素高度与所述目标对象关联的实际高度之间的比值,得到所述测距参照参量。
在本实施例中,电子设备预先存储有目标对象的实际高度,即真实身高;在确定了目标对象在测距图像内的像素高度后,可以计算像素高度与实际高度之间的比值,从而可以确定目标对象在测距图像中对应的缩放比例,将该缩放比例作为上述的测距参照参量。
示例性地，图15示出了本申请一实施例提供的基于实际身高与像素高度确定距离值的流程图。参见图15所示，电子设备可以通过开启摄像模块，获取包含目标对象的测距图像（即对应上一实施例的S701的操作），对测距图像进行目标对象的识别，若检测到目标对象，则确定该目标对象在测距图像内的像素高度，以及查询该目标对象的实际高度，计算上述两者的比值，即放大倍率amp，其中，amp=像素高度h/实际高度H；然后电子设备获取摄像模块对应的成像焦距f，由于像素高度h与实际高度H之间的比例，与成像焦距f和距离值dist之间的比例一致，即amp=h/H=f/dist，因此可以将放大倍率amp作为上述的测距参照参量，计算距离值dist，其中，dist=f/amp。
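需要说明的是，按小孔成像模型，放大倍率amp=h/H与f/dist一致，故dist=f/amp=f·H/h。该换算可以用如下Python草图示意（仅为示意性草图，像素高度、身高与焦距数值均为示例性假设，且像素高度h与焦距f需采用同一像素单位）：

```python
def estimate_distance(pixel_height_h, real_height_H, focal_len_f):
    """小孔成像模型: amp = 像素高度h / 实际高度H = 焦距f / 距离dist,
    故 dist = f / amp = f * H / h (h、f 以像素为单位, H 以米为单位)。"""
    amp = pixel_height_h / real_height_H   # 放大倍率, 即测距参照参量
    return focal_len_f / amp
```

例如实际身高1.8m、像素高度300、焦距1000像素时，可估算距离约为6m。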
在本申请的另一实施例中,电子设备可以存储有多个候选对象的用户信息。在该情况下,电子设备有两种方法确定目标对象对应的用户信息。
方法A:基于人脸识别,实现方式如下:
1.从所述测距图像内提取所述目标对象的人脸区域图像;
2.基于所述人脸区域图像确定所述目标对象对应的用户信息;
3.从所述用户信息提取所述目标对象的所述实际高度。
在本实施例中,电子设备可以存储有各个候选对象的标准人脸。该标准人脸可以通过上述的摄像模块采集得到,并接收用户对采集到的标准人脸配置关联的用户信息。
在一种可能的实现方式中，除了通过用户设置确定关联的用户信息外，电子设备还可以与云端服务器相连，通过云端服务器内的人脸数据库，确定该标准人脸关联的用户标识，并基于该用户标识从云端服务器处下载关联的用户信息，建立标准人脸与用户信息之间的关联关系。由于用户可以通过智能手机、平板电脑等智能终端在使用过程中采集多个人脸图像，并将人脸图像上传至对应的云端服务器侧，云端服务器侧可以存储有基于多个用户的人脸图像构建的人脸数据库，以及上传该人脸图像对应的用户标识。基于此，电子设备除了接收用户手动设置的方式建立标准人脸与用户信息之间的关联关系外，还可以通过从云端服务器处下载的方式进行获取。
在本实施例中,电子设备可以对测距图像进行人脸识别,提取测距图像内包含的人脸区域图像。其中,电子设备首先可以识别出测距图像内的人脸特征点,例如具有眼部特征、嘴部特征或鼻子特征的多个关键区域,从而可以识别得到人脸区域,并将人脸区域从测距图像中划分出来,得到上述的人脸区域图像。
在本实施例中,电子设备可以将人脸区域图像导入到预设的人脸特征值识别算法内(例如通过池化、卷积等方式将二维图像转换为特征矩阵或特征向量),得到人脸特征参量;将人脸特征参量与预先存储有的各个标准人脸的标准特征参量进行比对,选取匹配度最高的一个标准人脸作为该人脸特征参量对应的标准人脸,将该标准人脸的用户信息作为目标对象的用户信息。
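"将人脸特征参量与各标准特征参量比对、选取匹配度最高者"的过程可以用如下Python草图示意（仅为示意性草图：以余弦相似度作为匹配度、0.8作为匹配阈值，特征向量与用户名均为示例性假设）：

```python
import math

def match_face(feature, standard_faces, threshold=0.8):
    """feature: 目标对象的人脸特征向量; standard_faces: {用户: 标准特征向量}。
    返回匹配度最高且大于匹配阈值的标准人脸对应的用户, 否则返回 None。"""
    def cosine(a, b):
        # 余弦相似度: 向量夹角越小, 匹配度越高
        dot = sum(x * y for x, y in zip(a, b))
        na = math.sqrt(sum(x * x for x in a))
        nb = math.sqrt(sum(y * y for y in b))
        return dot / (na * nb)

    best_user, best_score = None, threshold
    for user, std in standard_faces.items():
        score = cosine(feature, std)
        if score > best_score:
            best_user, best_score = user, score
    return best_user
```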
在一种可能的实现方式中，若电子设备未存储有与该人脸特征参量相匹配的标准人脸，即各个标准人脸的标准特征参量与该人脸特征参量之间的匹配度均小于匹配阈值，则电子设备可以再次获取测距图像，重复执行上述操作；若重复匹配的次数大于预设的次数阈值，均没有识别得到该人脸特征参量对应的标准人脸，则输出用户信息匹配失败的提示信息，以通知用户为该人脸特征参量配置关联的用户信息，或者采用方法B的方式确定用户信息。
在本实施例中,该用户信息内携带有目标对象的实际高度,即目标对象的身高,电子设备可以从用户信息中提取出身高信息,将身高信息对应的参数值作为上述目标对象的实际高度。
在本申请实施例中,通过从测距图像中提取出人脸区域图像,并基于人脸区域图像自动识别得到关联的用户信息,从而确定了目标对象的实际高度,无需手动选择用户信息,提高了实际高度选取的效率。
方法B:
1.响应于所述目标对象在用户选择界面上的选择操作,确定所述选择操作对应的用户账户。
2.从所述用户账户的用户信息中获取所述目标对象的所述实际高度。
在本实施例中，电子设备在进行测量距离之前，可以输出用户选择界面。可选地，在执行"获取包含目标对象的测距图像"之前，即可输出上述的用户选择界面。目标对象可以从用户选择界面上显示的至少一个候选用户中，选择一个或多个候选用户作为目标用户。其中，每个候选用户在上述用户选择界面上对应一个用户图标，用户可以通过点击或选取等选择操作确定所需选择的目标用户，将该目标用户对应的用户账户识别为选择操作对应的用户账户。
在一种可能的实现方式中,电子设备可以接收目标对象的触控操作,该触控操作即为上述的选择操作,该触控操作具体可以为点击操作。电子设备根据触控操作的触控坐标,确定该触控坐标关联的用户图标,将该用户图标对应的候选账户确定为上述选择操作对应的用户账户。
在一种可能的实现方式中,电子设备可以接收目标对象通过控制设备发送的控制指令,该控制指令即为上述的选择操作。电子设备根据控制指令确定关联的用户图标, 并将该用户图标对应的候选账户识别为上述选择操作对应的用户账户。
在本实施例中,每个用户账户可以记录有对应用户的用户信息,例如用户的年龄、身高以及体重等。电子设备可以根据用户账户从数据库内提取出关联的用户信息,并从用户信息中获取该目标对象的实际高度,即上述的用户身高。
在本申请实施例中,通过目标用户指定的方式确定用户账户,并从用户账户关联的用户信息中获取实际高度,可以提高实际高度的准确性,继而提高距离测量的准确性。
方式2:
图16示出了本申请第二实施例提供的测距参照参量的确定实现流程图。参见图16所示,相对于图7所述的实施例,该测量距离的方法中S702具体包括S1601~S1605,详述如下:
在S1601中,识别所述目标对象在所述测距图像内的对象姿态。
在本实施例中,电子设备可以设置有姿态识别算法,将测距图像导入到上述姿态识别算法内,可以得到该目标对象的对象姿态。其中,该对象姿态包括但不限于:站姿、卧姿、坐姿等不同的姿态。
在一种可能的实现方式中，电子设备可以通过预设的目标对象识别算法，确定目标对象在测距图像内对应的对象区域图像，该对象区域图像具体为基于目标对象的轮廓线从测距图像中划分得到的区域图像；将对象区域图像与各个预设的标准姿态的标准图像进行匹配，根据匹配结果确定该测距图像内目标对象的对象姿态。
在一种可能的实现方式中,电子设备可以配置有多个对象关键点,上述对象关键点可以包括人脸特征点(眼睛、鼻子、嘴巴等)以及身体关键点(头部、手部、肩部、腿部等),电子设备可以在测距图像上标记出各个对象关键点,并根据各个对象关键点之间的像素距离,生成对应的姿态特征向量。计算该姿态特征向量与各个预设的标准姿态的标准特征向量之间的向量距离,选取向量距离最小的一个标准姿态作为该目标对象在测距图像的对象姿态。
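"计算姿态特征向量与各标准特征向量之间的向量距离、选取距离最小的标准姿态"可以用如下Python草图示意（仅为示意性草图，特征向量的维度与数值、姿态名称均为示例性假设）：

```python
import math

def classify_pose(feature_vec, standard_poses):
    """standard_poses: {标准姿态名称: 标准特征向量};
    返回与姿态特征向量的向量距离(欧氏距离)最小的标准姿态。"""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

    return min(standard_poses, key=lambda name: dist(feature_vec, standard_poses[name]))
```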
在本实施例中,电子设备可以判断该对象姿态是否为预设的第一姿态,或是预设的第二姿态。若该目标对象的对象姿态为第一姿态,则执行S1602和S1603的操作;反之,若该目标对象的对象姿态为第二姿态,则执行S1604以及S1605的操作。
在一种可能的实现方式中,该第一姿态为卧姿;该第二姿态为站姿。
在一种可能的实现方式中,电子设备若检测到目标对象的对象姿态并非第一姿态以及第二姿态,则可以显示姿态调整信息,以提示目标对象重新调整当前的姿态后,再进行距离测量。
在S1602中,若所述对象姿态为第一姿态,则获取在所述测距图像内所述目标对象关联的预设标志物的像素长度。
在本实施例中，电子设备在检测到目标对象在测距图像内为第一姿态（例如该第一姿态可以为卧姿或坐姿）时，目标对象处于身高不完全可见的状态，此时无法直接根据目标对象的像素高度确定测距参照参量，因此可以在测距图像内识别出目标对象关联的预设标志物，通过处于完全可见状态下的预设标志物的像素长度确定测距参照参量。
示例性地，图17示出了本申请一实施例提供的预设标志物与目标对象之间位置关系的示意图。与目标对象关联的预设标志物具体可以为：预设标志物与目标对象所在的平面与电子设备的拍摄方向垂直，如图17中的(a)所示，此时，预设标志物与电子设备的拍摄平面之间的距离，与目标对象与电子设备的拍摄平面之间的距离相同，因此可以将标志物与电子设备之间的距离值识别为目标对象与电子设备之间的距离值；又或者，目标对象还可以位于预设标志物上，例如位于某一瓷砖、某一椅子或瑜伽垫上，如图17中的(b)所示，在该情况下，可以识别预设标志物与目标对象处于同一位置，即标志物与电子设备之间的距离，与目标对象与电子设备之间的距离相同。
进一步地，作为本申请的另一实施例，该预设标志物为瑜伽垫，上述S1601具体为：若所述目标对象处于瑜伽训练模式，则执行所述识别所述目标对象在所述测距图像内的对象姿态的操作。即图16所示的实施例具体可以用于监控目标对象进行瑜伽运动的场景。由于在瑜伽运动场景下，目标对象大概率会使用瑜伽垫，而瑜伽垫的尺寸相对固定，因此可以在目标对象进行瑜伽运动之前，预先录入该瑜伽垫的长度，或者从互联网或云端服务器下载得到。由于目标对象的运动基本在瑜伽垫上进行，因此将瑜伽垫与电子设备之间的距离值识别为目标对象与电子设备之间的距离值。
在一种可能的实现方式中,电子设备可以识别瑜伽垫长边在测距图像中的倾斜角度,若该倾斜角度大于预设的角度阈值,则识别该瑜伽垫并未正置(即长边与拍摄平面平行或基本平行),为了提高距离测量的准确性,电子设备可以输出提示信息,例如“请摆正瑜伽垫”,以使瑜伽垫的长边与拍摄平面基本平行,避免因瑜伽垫的放置角度的偏差,影响距离测量的准确性。
在本实施例中，由于目标对象在第一姿态下具体为处于不完全伸展状态，即目标对象在测距图像内并非直立，因此该目标对象的像素高度并不等于该目标对象的身高。在该情况下，可以根据预设标志物的像素长度确定测距参照参量。因此，电子设备可以根据预设标志物的外观特征信息，从测距图像中识别出预设标志物，并确定该预设标志物在测距图像内的像素长度。该像素长度具体为预设标志物的外轮廓中与电子设备的拍摄平面平行的任一边的长度。若该预设标志物为瑜伽垫，则上述像素长度具体为瑜伽垫的较长的边在测距图像内所对应的长度。
示例性地，图18示出了本申请一实施例提供的预设标志物为瑜伽垫时像素长度的获取示意图。参见图18所示，其为电子设备获取的测距图像，其中，目标对象横卧在瑜伽垫上，即对象姿态为第一姿态，此时无法通过目标对象在测距图像内的像素高度确定上述的测距参照参量（即缩放比例），在该情况下，电子设备会识别测距图像中的瑜伽垫，并将瑜伽垫的长边所占的像素个数作为上述的像素长度l。
在S1603中,根据所述预设标志物的实际长度以及所述像素长度,得到所述测距参照参量。
在本实施例中，电子设备预先存储有预设标志物的实际长度，计算预设标志物在测距图像中的像素长度与实际长度之间的比值，确定拍摄测距图像时对应的缩放比例，并且由于预设标志物与电子设备之间的距离值，与目标对象与电子设备之间的距离值相同，因此，预设标志物的缩放比例即为目标对象的缩放比例。同样地，也可以根据上述确定的测距参照参量计算得到电子设备与目标对象之间的距离值。
示例性地，继续参见图18所示，瑜伽垫在测距图像内的像素长度为l，而实际长度为L，瑜伽垫与电子设备之间的距离Dist与电子设备的焦距f之间的比例，与L和l的比例一致，即Dist/f=L/l，并且目标对象在瑜伽垫上，因此可以根据上述两个长度之间的比值，计算得到电子设备与目标对象之间的距离值。
在一种可能的实现方式中，若对象姿态为卧姿，且电子设备并未存储有目标对象的实际高度，在该情况下，可以根据上述的预设标志物的实际长度以及像素长度，确定目标对象的实际高度。实现的方式如下：电子设备可以计算目标对象在测距图像内躺卧姿态下的对象像素长度，即图18所示的像素长度h。由于预设标志物与目标对象的高度之间的比例，无论在实际场景下或是拍摄画面内，均是保持一致的，因此，目标对象的实际高度H与预设标志物的实际长度L之间的比值，与目标对象的对象像素长度h与预设标志物的像素长度l之间的比值相同。通过识别目标对象在躺卧姿态下的对象像素长度，以及上述预设标志物的像素长度和实际长度，即可计算得到目标对象的实际高度，实现了对目标对象进行身高测量的目的，无需用户手动输入，提高了身高录入的便捷性。
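按相似三角形关系，基于瑜伽垫的距离计算可写为Dist=f·L/l，身高估算可写为H=L·h/l。两者可用如下Python草图示意（仅为示意性草图，像素长度与焦距数值均为示例性假设）：

```python
def distance_from_mat(pixel_len_l, real_len_L, focal_len_f):
    """Dist/f = L/l, 故 Dist = f * L / l (l、f 以像素为单位, L 以米为单位)。"""
    return focal_len_f * real_len_L / pixel_len_l

def height_from_mat(obj_pixel_len_h, mat_pixel_len_l, mat_real_len_L):
    """H/L = h/l, 故 H = L * h / l:
    由躺卧姿态下目标对象的像素长度h与瑜伽垫的像素/实际长度估算实际身高。"""
    return mat_real_len_L * obj_pixel_len_h / mat_pixel_len_l
```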
在S1604中,若所述对象姿态为第二姿态,则识别所述目标对象在所述测距图像内的像素高度。
在S1605中,根据所述像素高度与所述目标对象关联的实际高度之间的比值,得到所述测距参照参量。
在本实施例中,若目标对象的对象姿态为第二姿态,具体地,该第二姿态为站立姿态,则可以根据目标对象在测距图像内的像素高度以及预先存储的实际高度,确定测距参照参量,实现的方式可以参见图14所述的实施例,在此不再赘述。
在本申请实施例中,通过识别目标对象的对象姿态,在不完全伸展的状态下通过预设标志物确定目标对象与电子设备之间的距离值,可以提高测量距离方法的适用范围,由于目标对象在运动的过程中的姿态多变,可以通过标志物,即参照物的尺寸固定的特点,确定测距参照参量,提高了测量的准确性。
方式3:
图19示出了本申请第三实施例提供的测距参照参量的确定实现流程图。参见图19所示，相对于图7所述的实施例，该测量距离的方法中S702具体包括S1901~S1903，详述如下：
进一步地,所述测距图像的个数为M个;所述M为大于1的正整数,即电子设备会以预设的采集频率获取多个测距图像,在该情况下,所述根据所述测距图像确定测距参照参量,包括:
在S1901中，分别在M个所述测距图像确定佩戴于所述目标对象上的可穿戴设备的像素坐标，并基于M个所述像素坐标得到所述可穿戴设备的第一轨迹。
在本实施例中，电子设备可以设置有一测距图像采集时长，在该测距图像采集时长内以预设的频率获取M个测距图像。例如，测距图像采集时长为1s，预设的频率为60Hz，即在1s时间内可以获取60张测距图像，基于60张测距图像确定电子设备与目标对象之间的距离值。需要说明的是，在实时测量距离值的场景下，电子设备可以将M个测距图像存放在先进先出的队列中，该队列长度即为上述的M，电子设备可以根据队列中的M个测距图像计算得到各个时刻下电子设备与目标对象之间的距离值。由于在每一时刻，电子设备会获取当前时刻T最新采集的测距图像，此时会将最早添加到上述队列中的测距图像（即第T-M时刻对应的测距图像）移除队列，电子设备可以根据更新后的队列确定距离值。
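上述先进先出队列的维护逻辑可以用如下Python草图示意（M=60沿用正文中60Hz、1s的示例，属示例性假设）：

```python
from collections import deque

M = 60                     # 队列长度, 即参与计算的测距图像个数

frames = deque(maxlen=M)   # 先进先出队列: 队列满时自动移除最早入队的测距图像

def on_new_frame(frame):
    """T时刻采集到新测距图像时入队; 队列已满时, 第T-M时刻的图像被自动移除。"""
    frames.append(frame)
    return len(frames)     # 队列长度始终不超过M
```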
在一种可能的实现方式中，在上述确定目标对象的移动轨迹之前，可以生成提示信息，以提示用户在垂直于拍摄方向的平面内摆动佩戴可穿戴设备的部位，例如在垂直于拍摄方向的平面内摆动穿戴有智能手表的手臂，如将手臂向上移动，而并非向前伸展。由于沿平行于拍摄方向摆动时，上述位移无法在测距图像中体现，从而会降低距离测量的精度，因此通过输出上述提示信息，使位移尽量处于垂直于拍摄方向的平面内，可以提高标定精度。
在本实施例中,目标对象在运动的过程中可以佩戴有可穿戴设备,电子设备可以在多个测距图像中分别标记出可穿戴设备,并将测距图像中该可穿戴设备的关键点所在的坐标作为上述的像素坐标。电子设备可以根据各个测距图像的采集次序,依次连接各个测距图像内该可穿戴设备的像素坐标,从而得到可穿戴设备在测距图像内的移动轨迹,即上述的第一轨迹。
示例性地,图20示出了本申请一实施例提供的第一轨迹的示意图。参见图20所示,上述可穿戴设备具体为一智能手表,电子设备可以识别各个测距图像内智能手表的像素坐标,并依次连接各个像素坐标得到该智能手表在多个测距图像中对应的移动轨迹,即第一轨迹。
在S1902中,基于所述可穿戴设备反馈的运动参量,得到第二轨迹。
在本实施例中，电子设备可以与可穿戴设备建立通信连接；具体地，该通信连接可以为无线通信连接。例如，电子设备与可穿戴设备可以接入到同一WIFI网络中，基于WIFI网络建立对应的通信链路，以进行数据交互；可穿戴设备还可以加入到电子设备的蓝牙无线网络中，通过可穿戴设备的蓝牙模块以及电子设备的蓝牙模块，建立上述的通信连接。
在本实施例中,电子设备在需要测量目标对象与电子设备之间的距离值时,可以通过通信连接向目标对象的可穿戴设备发送运动参量反馈请求。可穿戴设备在接收到运动参量反馈请求后,可以向电子设备反馈运动参量,该运动参量包括但不限于:角速度、加速度、偏转量等与运动相关的参量。
在一种可能的实现方式中，可穿戴设备可以将预设时间段内的移动轨迹作为运动参量发送给电子设备，电子设备可以根据可穿戴设备记录得到的移动轨迹得到上述的第二轨迹。其中，该可穿戴设备发送的移动轨迹中可以标记有多个采集点，每个采集点对应一个采集时刻，电子设备可以根据上述第一轨迹的起始时刻以及终止时刻，从可穿戴设备反馈的移动轨迹中进行轨迹截取，得到上述的第二轨迹。
进一步地,作为本申请的另一实施例,S1902具体还可以包括:
1.接收所述可穿戴设备以预设反馈周期发送的运动参量。
2.基于多个所述反馈周期对应的所述运动参量,生成所述第二轨迹。
在本实施例中，可穿戴设备还可以以预设的反馈周期向电子设备发送多个运动参量，电子设备在接收到多个不同反馈周期的运动参量后，即可以得到在多个反馈周期对应的监控时长内该可穿戴设备的第二轨迹。示例性地，若该运动参量具体为可穿戴设备的速度值或加速度值，则对运动参量进行积分可以得到上述的第二轨迹。
在S1903中,基于所述第一轨迹的第一距离以及所述第二轨迹的第二距离,得到所述测距参照参量。
在本实施例中，基于可穿戴设备反馈的运动参量得到的第二轨迹为可穿戴设备实际移动的轨迹，即第二距离为实际距离；基于测距图像得到的第一轨迹的第一距离为第二距离经过缩放后的距离。因此，通过比对第一距离与第二距离之间的比值，可以得到缩放比例，将缩放比例作为上述的测距参照参量，即可以得到电子设备与目标对象的距离值。其中，由于电子设备的焦距已知，而电子设备与目标对象之间的实际距离与焦距之间的比值，与第二距离（即实际距离）和第一距离之间的比值是相同的，从而可以计算得到电子设备与目标对象之间的距离值。
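第一距离、第二距离的计算及其比值换算可以用如下Python草图示意（仅为示意性草图，轨迹点坐标与焦距数值均为示例性假设）：

```python
import math

def traj_length(points):
    """依次连接各轨迹点, 累加相邻点之间的距离, 得到轨迹长度。"""
    return sum(math.dist(p, q) for p, q in zip(points, points[1:]))

def distance_from_trajectories(pixel_points, real_points, focal_len_f):
    """dist/f 与 第二距离(实际)/第一距离(像素) 的比值一致,
    故 dist = f * 第二距离 / 第一距离 (f 以像素为单位)。"""
    first = traj_length(pixel_points)   # 第一距离: 测距图像中的像素轨迹长度
    second = traj_length(real_points)   # 第二距离: 可穿戴设备反馈的实际轨迹长度(m)
    return focal_len_f * second / first
```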
在一种可能的实现方式中，电子设备可以通过可穿戴设备内的运动传感器反馈的运动参量，得到目标对象的三维移动轨迹，并根据该三维移动轨迹以及摄像模块的内部参数，将三维移动轨迹的各个轨迹点与基于多个测距图像得到的第一轨迹（即二维移动轨迹）的轨迹点进行透视多点投影（Perspective-n-Point，PnP）求解，确定外参矩阵；基于外参矩阵对三维移动轨迹（即第二轨迹）包含的轨迹点进行投影后，可以较高精度地进行二维至三维之间的坐标角度对齐，并基于坐标角度对齐后确定上述测距参照参量，从而提高了计算的精度。
在本申请实施例中,电子设备可以接收可穿戴设备反馈的运动参量,从而确定可穿戴设备的实际移动距离,并且从测距图像中确定出对应的像素移动距离,通过上述两者的比值确定测距参照参量,在目标对象移动较为快速的情况下,也可以实现距离测量的目的,提高了距离测量的适用范围。
方式4:
图21示出了本申请第四实施例提供的测距参照参量的确定实现流程图。参见图21所示,相对于图7所述的实施例,该测量距离的方法中的S701具体为S2101和S2102,S702具体为S2103,S703具体为S2104,详述如下:
在S2101中,调整所述电子设备的数字对焦摄像模块的数字焦距,以使所述数字对焦摄像模块的对焦点对准所述目标对象。
在本实施例中，电子设备内置的摄像模块具体为一数字对焦摄像模块。该数字对焦摄像模块在拍摄图像时，可以通过调整数字焦距以改变拍摄时的焦点，用于聚焦在不同的拍摄对象。基于此，电子设备在获取测距图像时，可以通过调整数字对焦摄像模块的数字焦距，以使拍摄的对焦点对准目标对象，其中，对焦点对准目标对象是指：目标对象在取景框内是清晰无重影的。
示例性地，图22示出了本申请一实施例提供的对焦点对准目标对象的示意图。参见图22所示，数字对焦摄像模块可以在取景框中标记有对焦点，电子设备可以通过移动该对焦点确定所需进行对焦的对象。此时，电子设备可以通过改变数字焦距，以使对焦点对准的对象在取景框中清晰且无重影，而对于没有在对焦点上的拍摄对象，在取景框内则相对较模糊。
在S2102中,基于调整后的所述数字焦距,获取包含所述目标对象的所述测距图像。
在本实施例中,电子设备在确定目标对象在对焦点上时,可以控制数字对焦摄像模块获取当前的图像,即作为上述的测距图像,在测距图像内目标对象为对焦对象。
在S2103中,识别所述测距图像的所述数字焦距,并根据所述数字焦距得到测距参照参量。
在本实施例中,电子设备在获取测距图像时,可以将关联的拍摄信息封装在测距图像内,其中,该拍摄信息可以包括有:拍摄时间、拍摄时的数字焦距、图像格式等。电子设备可以从拍摄信息中提取上述数字焦距,将数字焦距作为上述的测距参照参量。
在S2104中,基于预设的数字焦距与拍摄距离的对应关系,查询所述数字焦距关联的拍摄距离,将所述拍摄距离识别为所述电子设备与所述目标对象之间的距离值。
在本实施例中，电子设备在出厂之前，可以预先配置有数字焦距与拍摄距离之间的对应关系表。其中，建立上述对应关系表的过程可以为：在与数字对焦摄像模块相距多个关键距离点的位置标记有预设的图案。图23示出了本申请一实施例提供的配置上述对应关系表的示意图。参见图23所示，在0.5m、1m、1.5m以及2m等多个位置标记有预设的图案，记录对准各个图案时数字对焦摄像模块对应的数字焦距，从而建立得到数字焦距与拍摄距离之间的对应关系。其中，可以根据各个数字焦距与拍摄距离进行曲线拟合，得到拍摄距离与数字焦距之间的变化曲线，并将该变化曲线或对应关系存储于数字对焦摄像模块的寄存器内，电子设备可以通过读取关联的寄存器获取上述的对应关系。
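"查询数字焦距关联的拍摄距离"可以基于出厂标定点做线性插值，如下Python草图所示（仅为示意性草图：拍摄距离沿用图23中0.5m~2m的示例，数字焦距数值为示例性假设）：

```python
def lookup_distance(calibration, focus):
    """calibration: [(数字焦距, 拍摄距离), ...] 出厂标定的对应关系表;
    在相邻标定点之间线性插值, 查询数字焦距focus关联的拍摄距离。"""
    table = sorted(calibration)                 # 按数字焦距升序排列
    for (f1, d1), (f2, d2) in zip(table, table[1:]):
        if f1 <= focus <= f2:
            return d1 + (focus - f1) / (f2 - f1) * (d2 - d1)
    return None                                 # 超出标定范围
```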
在本实施例中,电子设备根据测距图像的数字焦距,查询上述的对应关系确定目标对象与电子设备之间的距离值。
在本申请实施例中，通过预先存储有数字焦距与拍摄距离之间的对应关系，在拍摄时将对焦点对准目标对象，从而能够通过查询上述对应关系直接获取目标对象与电子设备之间的距离值，提高了距离值获取的便捷性以及适用范围。
应理解,上述实施例中各步骤的序号的大小并不意味着执行顺序的先后,各过程的执行顺序应以其功能和内在逻辑确定,而不应对本申请实施例的实施过程构成任何限定。
实施例二:
对应于上文实施例所述的测量距离的方法,图24示出了本申请实施例提供的测量距离的装置的结构框图,为了便于说明,仅示出了与本申请实施例相关的部分。
参照图24,该测量距离的装置包括:
测距图像获取单元241,用于获取包含目标对象的测距图像;
测距参照参量确定单元242,用于根据所述测距图像确定测距参照参量;
距离值确定单元243,用于基于所述测距参照参量确定电子设备与所述目标对象之间的距离值。
可选地,测距参照参量确定单元242包括:
像素高度获取单元,用于识别所述目标对象在所述测距图像内的像素高度;
像素高度比对单元,用于根据所述像素高度与所述目标对象关联的实际高度之间的比值,得到所述测距参照参量。
可选地,所述测量距离的装置还包括:
人脸区域图像获取单元,用于从所述测距图像内提取所述目标对象的人脸区域图像;
人脸区域图像比对单元,用于基于所述人脸区域图像确定所述目标对象对应的用户信息;
第一用户信息提取单元,用于从所述用户信息提取所述目标对象的所述实际高度。
可选地,所述测量距离的装置还包括:
选择操作接收单元,用于响应于所述目标对象在用户选择界面上的选择操作,确定所述选择操作对应的用户账户;
第二用户信息提取单元,用于从所述用户账户的用户信息中获取所述目标对象的所述实际高度。
可选地,所述测距参照参量确定单元242包括:
对象姿态识别单元,用于识别所述目标对象在所述测距图像内的对象姿态;
第一姿态确定单元,用于若所述对象姿态为第一姿态,则获取在所述测距图像内所述目标对象关联的预设标志物的像素长度;
第一姿态测距单元,用于根据所述预设标志物的实际长度以及所述像素长度,得到所述测距参照参量。
可选地,所述测量距离的装置还包括:
第二姿态确定单元,用于若所述对象姿态为第二姿态,则识别所述目标对象在所述测距图像内的像素高度;
第二姿态测距单元,用于根据所述像素高度与所述目标对象关联的实际高度之间的比值,得到所述测距参照参量。
可选地,所述预设标记物为瑜伽垫;所述对象姿态识别单元具体用于:
若所述目标对象处于瑜伽训练模式,则执行所述识别所述目标对象在所述测距图像内的对象姿态的操作。
可选地,所述测距图像的个数为M个;所述M为大于1的正整数;所述测距参照参量确定单元242包括:
第一轨迹获取单元，用于分别在M个所述测距图像确定佩戴于所述目标对象上的可穿戴设备的像素坐标，并基于M个所述像素坐标得到所述可穿戴设备的第一轨迹；
第二轨迹获取单元,用于基于所述可穿戴设备反馈的运动参量,得到第二轨迹;
移动距离比对单元,用于基于所述第一轨迹的第一距离以及所述第二轨迹的第二距离,得到所述测距参照参量。
可选地,所述第二轨迹获取单元包括:
运动参量接收单元,用于接收所述可穿戴设备以预设反馈周期发送的运动参量;
第二轨迹绘制单元，用于基于多个所述反馈周期对应的所述运动参量，生成所述第二轨迹。
可选地,所述测距图像获取单元241包括:
数字焦距调整单元,用于调整所述电子设备的数字对焦摄像模块的数字焦距,以使所述数字对焦摄像模块的对焦点对准所述目标对象;
数字变焦拍照单元,用于基于调整后的所述数字焦距,获取包含所述目标对象的所述测距图像;
所述测距参照参量确定单元242包括:
数字焦距识别单元,用于识别所述测距图像的所述数字焦距,并根据所述数字焦距得到测距参照参量。
可选地,所述距离值确定单元243包括:
对应关系查找单元,用于基于预设的数字焦距与拍摄距离的对应关系,查询所述数字焦距关联的拍摄距离,将所述拍摄距离识别为所述电子设备与所述目标对象之间的距离值。
可选地,所述测量距离的装置还包括:
应用选择操作响应单元,用于响应于所述目标对象在应用界面上的应用选择操作,确定所述应用选择操作对应的目标应用;
非接触式交互操作响应单元,用于若所述目标应用为非接触式交互类型的应用,则执行所述获取包含目标对象的测距图像,以通过所述距离值进行非接触式交互操作。
因此，本申请实施例提供的测量距离的装置同样可以通过获取包含目标对象的测距图像（上述目标对象即为执行非接触式交互行为的用户），通过上述测距图像提取用于测量距离的测距参照参量，从而可以根据测距参照参量确定电子设备与目标对象之间的距离值，电子设备只需包含一个摄像模块即可实现。本实施例测量电子设备与目标对象之间的距离值时，并不依赖深度图像，也不需要通过双目摄像头的拍摄角度差的方式来进行测距，因此电子设备无需配置基于光脉冲的收发器以及双目摄像头等模块，从而大大降低了电子设备的造价成本；与此同时，由于在进行距离测量的过程中，通过确定一种或多种测距参照参量进行距离测量，并非直接获取距离值，从而能够提高测距的准确性。
图25为本申请一实施例提供的电子设备的结构示意图。如图25所示,该实施例的电子设备25包括:至少一个处理器250(图25中仅示出一个)处理器、存储器251以及存储在所述存储器251中并可在所述至少一个处理器250上运行的计算机程序252,所述处理器250执行所述计算机程序252时实现上述任意各个测量距离的方法实施例中的步骤。
所述电子设备25可以是桌上型计算机、笔记本、掌上电脑及云端服务器等计算设备。该电子设备可包括,但不仅限于,处理器250、存储器251。本领域技术人员可以理解,图25仅仅是电子设备25的举例,并不构成对电子设备25的限定,可以包括比图示更多或更少的部件,或者组合某些部件,或者不同的部件,例如还可以包括输入输出设备、网络接入设备等。
所称处理器250可以是中央处理单元(Central Processing Unit，CPU)，该处理器250还可以是其他通用处理器、数字信号处理器(Digital Signal Processor，DSP)、专用集成电路(Application Specific Integrated Circuit，ASIC)、现成可编程门阵列(Field-Programmable Gate Array，FPGA)或者其他可编程逻辑器件、分立门或者晶体管逻辑器件、分立硬件组件等。通用处理器可以是微处理器或者该处理器也可以是任何常规的处理器等。
所述存储器251在一些实施例中可以是所述电子设备25的内部存储单元,例如电子设备25的硬盘或内存。所述存储器251在另一些实施例中也可以是所述电子设备25的外部存储设备,例如所述电子设备25上配备的插接式硬盘,智能存储卡(Smart Media Card,SMC),安全数字(Secure Digital,SD)卡,闪存卡(Flash Card)等。进一步地,所述存储器251还可以既包括所述电子设备25的内部存储单元也包括外部存储设备。所述存储器251用于存储操作系统、应用程序、引导装载程序(BootLoader)、数据以及其他程序等,例如所述计算机程序的程序代码等。所述存储器251还可以用于暂时地存储已经输出或者将要输出的数据。
需要说明的是,上述装置/单元之间的信息交互、执行过程等内容,由于与本申请方法实施例基于同一构思,其具体功能及带来的技术效果,具体可参见方法实施例部分,此处不再赘述。
所属领域的技术人员可以清楚地了解到,为了描述的方便和简洁,仅以上述各功能单元、模块的划分进行举例说明,实际应用中,可以根据需要而将上述功能分配由不同的功能单元、模块完成,即将所述装置的内部结构划分成不同的功能单元或模块,以完成以上描述的全部或者部分功能。实施例中的各功能单元、模块可以集成在一个处理单元中,也可以是各个单元单独物理存在,也可以两个或两个以上单元集成在一个单元中,上述集成的单元既可以采用硬件的形式实现,也可以采用软件功能单元的形式实现。另外,各功能单元、模块的具体名称也只是为了便于相互区分,并不用于限制本申请的保护范围。上述系统中单元、模块的具体工作过程,可以参考前述方法实施例中的对应过程,在此不再赘述。
本申请实施例还提供了一种电子设备,该电子设备包括:至少一个处理器、存储器以及存储在所述存储器中并可在所述至少一个处理器上运行的计算机程序,所述处理器执行所述计算机程序时实现上述任意各个方法实施例中的步骤。
本申请实施例还提供了一种计算机可读存储介质，所述计算机可读存储介质存储有计算机程序，所述计算机程序被处理器执行时可实现上述各个方法实施例中的步骤。
本申请实施例提供了一种计算机程序产品，当计算机程序产品在移动终端上运行时，使得移动终端执行时可实现上述各个方法实施例中的步骤。
所述集成的单元如果以软件功能单元的形式实现并作为独立的产品销售或使用时，可以存储在一个计算机可读取存储介质中。基于这样的理解，本申请实现上述实施例方法中的全部或部分流程，可以通过计算机程序来指令相关的硬件来完成，所述的计算机程序可存储于一计算机可读存储介质中，该计算机程序在被处理器执行时，可实现上述各个方法实施例的步骤。其中，所述计算机程序包括计算机程序代码，所述计算机程序代码可以为源代码形式、对象代码形式、可执行文件或某些中间形式等。所述计算机可读介质至少可以包括：能够将计算机程序代码携带到拍照装置/电子设备的任何实体或装置、记录介质、计算机存储器、只读存储器(ROM，Read-Only Memory)、随机存取存储器(RAM，Random Access Memory)、电载波信号、电信信号以及软件分发介质。例如U盘、移动硬盘、磁碟或者光盘等。在某些司法管辖区，根据立法和专利实践，计算机可读介质不可以是电载波信号和电信信号。
在上述实施例中,对各个实施例的描述都各有侧重,某个实施例中没有详述或记载的部分,可以参见其它实施例的相关描述。
本领域普通技术人员可以意识到,结合本文中所公开的实施例描述的各示例的单元及算法步骤,能够以电子硬件、或者计算机软件和电子硬件的结合来实现。这些功能究竟以硬件还是软件方式来执行,取决于技术方案的特定应用和设计约束条件。专业技术人员可以对每个特定的应用来使用不同方法来实现所描述的功能,但是这种实现不应认为超出本申请的范围。
在本申请所提供的实施例中,应该理解到,所揭露的装置/网络设备和方法,可以通过其它的方式实现。例如,以上所描述的装置/网络设备实施例仅仅是示意性的,例如,所述模块或单元的划分,仅仅为一种逻辑功能划分,实际实现时可以有另外的划分方式,例如多个单元或组件可以结合或者可以集成到另一个系统,或一些特征可以忽略,或不执行。另一点,所显示或讨论的相互之间的耦合或直接耦合或通讯连接可以是通过一些接口,装置或单元的间接耦合或通讯连接,可以是电性,机械或其它的形式。
所述作为分离部件说明的单元可以是或者也可以不是物理上分开的,作为单元显示的部件可以是或者也可以不是物理单元,即可以位于一个地方,或者也可以分布到多个网络单元上。可以根据实际的需要选择其中的部分或者全部单元来实现本实施例方案的目的。
以上所述实施例仅用以说明本申请的技术方案,而非对其限制;尽管参照前述实施例对本申请进行了详细的说明,本领域的普通技术人员应当理解:其依然可以对前述各实施例所记载的技术方案进行修改,或者对其中部分技术特征进行等同替换;而这些修改或者替换,并不使相应技术方案的本质脱离本申请各实施例技术方案的精神和范围,均应包含在本申请的保护范围之内。

Claims (15)

  1. 一种测量距离的方法,应用于电子设备,其特征在于,包括:
    获取包含目标对象的测距图像;
    根据所述测距图像确定测距参照参量;
    基于所述测距参照参量确定所述电子设备与所述目标对象之间的距离值。
  2. 根据权利要求1所述的方法,其特征在于,所述根据所述测距图像确定测距参照参量,包括:
    识别所述目标对象在所述测距图像内的像素高度;
    根据所述像素高度与所述目标对象关联的实际高度之间的比值,得到所述测距参照参量。
  3. 根据权利要求2所述的方法,其特征在于,在所述根据所述像素高度与所述目标对象关联的实际高度之间的比值,得到所述测距参照参量之前,还包括:
    从所述测距图像内提取所述目标对象的人脸区域图像;
    基于所述人脸区域图像确定所述目标对象对应的用户信息;
    从所述用户信息提取所述目标对象的所述实际高度。
  4. 根据权利要求2所述的方法,其特征在于,在所述根据所述像素高度与所述目标对象关联的实际高度之间的比值,得到所述测距参照参量之前,还包括:
    响应于所述目标对象在用户选择界面上的选择操作,确定所述选择操作对应的用户账户;
    从所述用户账户的用户信息中获取所述目标对象的所述实际高度。
  5. 根据权利要求1所述的方法,其特征在于,所述根据所述测距图像确定测距参照参量,包括:
    识别所述目标对象在所述测距图像内的对象姿态;
    若所述对象姿态为第一姿态,则获取在所述测距图像内所述目标对象关联的预设标志物的像素长度;
    根据所述预设标志物的实际长度以及所述像素长度,得到所述测距参照参量。
  6. 根据权利要求5所述的方法,其特征在于,在所述识别所述目标对象在所述测距图像内的对象姿态之后,还包括:
    若所述对象姿态为第二姿态,则识别所述目标对象在所述测距图像内的像素高度;
    根据所述像素高度与所述目标对象关联的实际高度之间的比值,得到所述测距参照参量。
  7. 根据权利要求5所述的方法,其特征在于,所述预设标记物为瑜伽垫;所述识别所述目标对象在所述测距图像内的对象姿态,包括:
    若所述目标对象处于瑜伽训练模式,则执行所述识别所述目标对象在所述测距图像内的对象姿态的操作。
  8. 根据权利要求1所述的方法,其特征在于,所述测距图像的个数为M个;所述M为大于1的正整数;所述根据所述测距图像确定测距参照参量,包括:
    分别在M个所述测距图像确定佩戴于所述目标对象上的可穿戴设备的像素坐标,并基于M个所述像素坐标得到所述可穿戴设备的第一轨迹;
    基于所述可穿戴设备反馈的运动参量,得到第二轨迹;
    基于所述第一轨迹的第一距离以及所述第二轨迹的第二距离,得到所述测距参照参量。
  9. 根据权利要求8所述的方法，其特征在于，所述基于所述可穿戴设备反馈的运动参量，得到第二轨迹，包括：
    接收所述可穿戴设备以预设反馈周期发送的运动参量;
    基于多个所述反馈周期对应的所述运动参量,生成所述第二轨迹。
  10. 根据权利要求1所述的方法,其特征在于,所述获取包含目标对象的测距图像,包括:
    调整所述电子设备的数字对焦摄像模块的数字焦距,以使所述数字对焦摄像模块的对焦点对准所述目标对象;
    基于调整后的所述数字焦距,获取包含所述目标对象的所述测距图像;
    所述根据所述测距图像确定测距参照参量,包括:
    识别所述测距图像的所述数字焦距,并根据所述数字焦距得到测距参照参量。
  11. 根据权利要求10所述的方法,其特征在于,所述基于所述测距参照参量确定所述电子设备与所述目标对象之间的距离值,包括:
    基于预设的数字焦距与拍摄距离的对应关系,查询所述数字焦距关联的拍摄距离,将所述拍摄距离识别为所述电子设备与所述目标对象之间的距离值。
  12. 根据权利要求1-11任一项所述的方法,其特征在于,在所述获取包含目标对象的测距图像之前,还包括:
    响应于所述目标对象在应用界面上的应用选择操作,确定所述应用选择操作对应的目标应用;
    若所述目标应用为非接触式交互类型的应用,则执行所述获取包含目标对象的测距图像,以通过所述距离值进行非接触式交互操作。
  13. 一种测量距离的装置,其特征在于,包括:
    测距图像获取单元,用于获取包含目标对象的测距图像;
    测距参照参量确定单元,用于根据所述测距图像确定测距参照参量;
    距离值确定单元,用于基于所述测距参照参量确定电子设备与所述目标对象之间的距离值。
  14. 一种电子设备,包括存储器、处理器以及存储在所述存储器中并可在所述处理器上运行的计算机程序,其特征在于,所述处理器执行所述计算机程序时实现如权利要求1至12任一项所述的方法。
  15. 一种计算机可读存储介质,所述计算机可读存储介质存储有计算机程序,其特征在于,所述计算机程序被处理器执行时实现如权利要求1至12任一项所述的方法。
PCT/CN2021/111482 2020-08-28 2021-08-09 测量距离的方法、装置、电子设备及可读存储介质 WO2022042275A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202010893041.3 2020-08-28
CN202010893041.3A CN114111704B (zh) 2020-08-28 2020-08-28 测量距离的方法、装置、电子设备及可读存储介质

Publications (1)

Publication Number Publication Date
WO2022042275A1 true WO2022042275A1 (zh) 2022-03-03

Family

ID=80354556

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/111482 WO2022042275A1 (zh) 2020-08-28 2021-08-09 测量距离的方法、装置、电子设备及可读存储介质

Country Status (2)

Country Link
CN (1) CN114111704B (zh)
WO (1) WO2022042275A1 (zh)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114897655A (zh) * 2022-07-12 2022-08-12 深圳市信润富联数字科技有限公司 基于视觉的防疫控制方法及装置、存储介质、电子设备

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103499334A (zh) * 2013-09-05 2014-01-08 小米科技有限责任公司 距离测量方法、装置及电子设备
CN105187719A (zh) * 2015-08-21 2015-12-23 深圳市金立通信设备有限公司 一种拍摄方法及终端
CN105973140A (zh) * 2016-04-29 2016-09-28 维沃移动通信有限公司 一种测量物体空间参数的方法及移动终端
CN110458888A (zh) * 2019-07-23 2019-11-15 深圳前海达闼云端智能科技有限公司 基于图像的测距方法、装置、存储介质和电子设备
CN110986887A (zh) * 2019-12-31 2020-04-10 长城汽车股份有限公司 物体尺寸检测方法、测距方法、存储介质及单目摄像头
CN111104909A (zh) * 2019-12-20 2020-05-05 深圳市商汤科技有限公司 图像处理方法、装置、计算机设备及存储介质

Family Cites Families (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101183206A (zh) * 2006-11-13 2008-05-21 华晶科技股份有限公司 计算被拍摄物体的距离及其实际尺寸的方法
CN103322984B (zh) * 2013-05-13 2015-09-09 成都理工大学 基于视频图像的测距、测速方法及装置
CN105043271B (zh) * 2015-08-06 2018-09-18 宁波市北仑海伯精密机械制造有限公司 长度测量方法及装置
CN107764233B (zh) * 2016-08-15 2020-09-04 杭州海康威视数字技术股份有限公司 一种测量方法及装置
CN108225278A (zh) * 2017-11-29 2018-06-29 维沃移动通信有限公司 一种测距方法、移动终端
CN107920211A (zh) * 2017-12-28 2018-04-17 深圳市金立通信设备有限公司 一种拍照方法、终端及计算机可读存储介质
CN108458685A (zh) * 2018-02-07 2018-08-28 普联技术有限公司 测距方法和测距装置
CN109714539B (zh) * 2019-01-25 2021-05-18 Oppo广东移动通信有限公司 基于姿态识别的图像采集方法、装置及电子设备
CN110148167B (zh) * 2019-04-17 2021-06-04 维沃移动通信有限公司 一种距离测量方法及终端设备
CN111486798B (zh) * 2020-04-20 2022-08-26 苏州智感电子科技有限公司 图像测距方法、图像测距系统及终端设备
CN111563926B (zh) * 2020-05-22 2024-02-23 上海依图网络科技有限公司 测量图像中物体物理尺寸的方法、电子设备、介质及系统
CN111561906B (zh) * 2020-05-25 2022-03-11 北京洛必德科技有限公司 机器人单目测距方法、系统、电子设备和计算机存储介质

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103499334A (zh) * 2013-09-05 2014-01-08 小米科技有限责任公司 距离测量方法、装置及电子设备
CN105187719A (zh) * 2015-08-21 2015-12-23 深圳市金立通信设备有限公司 一种拍摄方法及终端
CN105973140A (zh) * 2016-04-29 2016-09-28 维沃移动通信有限公司 一种测量物体空间参数的方法及移动终端
CN110458888A (zh) * 2019-07-23 2019-11-15 深圳前海达闼云端智能科技有限公司 基于图像的测距方法、装置、存储介质和电子设备
CN111104909A (zh) * 2019-12-20 2020-05-05 深圳市商汤科技有限公司 图像处理方法、装置、计算机设备及存储介质
CN110986887A (zh) * 2019-12-31 2020-04-10 长城汽车股份有限公司 物体尺寸检测方法、测距方法、存储介质及单目摄像头

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114897655A (zh) * 2022-07-12 2022-08-12 深圳市信润富联数字科技有限公司 基于视觉的防疫控制方法及装置、存储介质、电子设备
CN114897655B (zh) * 2022-07-12 2022-10-28 深圳市信润富联数字科技有限公司 基于视觉的防疫控制方法及装置、存储介质、电子设备

Also Published As

Publication number Publication date
CN114111704A (zh) 2022-03-01
CN114111704B (zh) 2023-07-18


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21860114

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21860114

Country of ref document: EP

Kind code of ref document: A1