CN108614263B - Mobile terminal, position detection method and related product - Google Patents

Mobile terminal, position detection method and related product

Info

Publication number
CN108614263B
Authority
CN
China
Prior art keywords
mobile terminal
target object
wireless earphone
radar sensor
image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
CN201810361907.9A
Other languages
Chinese (zh)
Other versions
CN108614263A (en)
Inventor
张伟正
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd filed Critical Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN201810361907.9A priority Critical patent/CN108614263B/en
Publication of CN108614263A publication Critical patent/CN108614263A/en
Application granted granted Critical
Publication of CN108614263B publication Critical patent/CN108614263B/en

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/88Radar or analogous systems specially adapted for specific applications
    • G01S13/89Radar or analogous systems specially adapted for specific applications for mapping or imaging
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04WWIRELESS COMMUNICATION NETWORKS
    • H04W4/00Services specially adapted for wireless communication networks; Facilities therefor
    • H04W4/02Services making use of location information
    • H04W4/024Guidance services

Landscapes

  • Engineering & Computer Science (AREA)
  • Remote Sensing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Electromagnetism (AREA)
  • General Physics & Mathematics (AREA)
  • Signal Processing (AREA)
  • Radar Systems Or Details Thereof (AREA)

Abstract

The embodiments of the present application disclose a mobile terminal, a position detection method, and related products, applied to a mobile terminal that includes a radar sensor. The method includes the following steps: enabling the radar sensor when a user's position query request for the wireless headset is detected; scanning, by the radar sensor, at least one target object in the area where the mobile terminal is located to obtain a contour image of each target object; comparing a reference image of the wireless headset with the contour image of each target object to obtain a contour image matching the reference image; and determining the position information of the wireless headset according to the matched contour image. The embodiments of the present application help improve the accuracy and convenience with which the mobile terminal locates the wireless headset.

Description

Mobile terminal, position detection method and related product
Technical Field
The application relates to the technical field of mobile terminals, in particular to a mobile terminal, a position detection method and a related product.
Background
At present, after a wireless headset and a mobile phone on the market are successfully paired and connected, and after the mobile phone detects a call request (an outgoing or incoming call) and the user confirms the call, the user can conduct a voice call with the other party through the wireless headset. During use, however, the user may set the wireless headset down somewhere and then have difficulty finding it the next time it is needed.
Disclosure of Invention
The embodiment of the application provides a mobile terminal, a position detection method and a related product, aiming to improve the accuracy and convenience of the mobile terminal in positioning the position of a wireless earphone.
In a first aspect, an embodiment of the present application provides a mobile terminal, including a radar sensor, a memory, a transceiver, and a processor, the processor being connected to the radar sensor, the memory, and the transceiver, wherein,
the processor is used for starting the radar sensor when detecting a position query request of a user for the wireless headset;
the radar sensor is used for scanning at least one target object in the area where the mobile terminal is located to obtain a contour image of each target object;
the processor is further configured to compare the reference image of the wireless headset with the contour image of each target object, to obtain a contour image matching the reference image, and to determine the position information of the wireless headset according to the matched contour image.
In a second aspect, an embodiment of the present application provides a position detection method, which is applied to a mobile terminal, where the mobile terminal includes a radar sensor; the method comprises the following steps:
starting a radar sensor when a position query request of a user for the wireless headset is detected;
scanning at least one target object in the area where the mobile terminal is located through the radar sensor to obtain a contour image of each target object;
comparing the reference image of the wireless earphone with the contour image of each target object to obtain a contour image matched with the reference image;
and determining the position information of the wireless earphone according to the matched contour image.
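The four steps above can be sketched as a single routine. Below is a minimal illustration in Python; the radar object, its `enable` and `scan_contours` methods, the string "contour images", and the caller-supplied match predicate are all hypothetical stand-ins, not names from the patent:

```python
class FakeRadar:
    """Stand-in for the radar sensor; purely illustrative."""

    def __init__(self, contours):
        self.enabled = False
        self._contours = contours

    def enable(self):
        self.enabled = True

    def scan_contours(self):
        # One contour image per target object in the scanned area.
        return list(self._contours)


def locate_wireless_headset(radar, reference_image, is_match):
    """The claimed flow: enable the radar, scan for contour images,
    compare each against the reference image, return the first match."""
    radar.enable()
    for contour in radar.scan_contours():
        if is_match(reference_image, contour):
            return contour
    return None


# Demo with string labels standing in for contour images.
radar = FakeRadar(contours=["mug", "keys", "earbud"])
found = locate_wireless_headset(radar, "earbud", lambda ref, c: ref == c)
```

The position information would then be derived from the matched contour, as described in the fourth step.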
In a third aspect, an embodiment of the present application provides a position detection apparatus, which is applied to a mobile terminal, where the mobile terminal includes a radar sensor; the position detection device comprises a starting unit, a scanning unit, a comparison unit and a determination unit, wherein,
the starting unit is used for starting the radar sensor when detecting a position query request of a user for the wireless headset;
the scanning unit is used for scanning at least one target object in the area where the mobile terminal is located through the radar sensor to obtain a contour image of each target object;
the comparison unit is used for comparing the reference image of the wireless earphone with the contour image of each target object to obtain a contour image matched with the reference image;
the determining unit is used for determining the position information of the wireless earphone according to the matched contour image.
In a fourth aspect, an embodiment of the present application provides a mobile terminal, including a processor, a memory, a communication interface, and one or more programs, where the one or more programs are stored in the memory and configured to be executed by the processor, and the program includes instructions for executing steps of any of the methods in the first aspect of the embodiment of the present application.
In a fifth aspect, the present application provides a computer-readable storage medium, where the computer-readable storage medium stores a computer program for electronic data exchange, where the computer program makes a computer perform part or all of the steps described in any one of the methods in the second aspect of the present application.
In a sixth aspect, the present application provides a computer program product, wherein the computer program product includes a non-transitory computer-readable storage medium storing a computer program, and the computer program is operable to cause a computer to perform some or all of the steps described in any one of the methods of the second aspect of the present application. The computer program product may be a software installation package.
As can be seen, in the embodiment of the present application, when the mobile terminal detects a user's position query request for the wireless headset, it first enables the radar sensor, then scans at least one target object in the area where the mobile terminal is located through the radar sensor to obtain a contour image of each target object, then compares the reference image of the wireless headset with the contour image of each target object to obtain a contour image matching the reference image, and finally determines the position information of the wireless headset according to the matched contour image. In this way, the mobile terminal can scan surrounding objects through the radar sensor to obtain their contour images and accurately identify the position of the wireless headset through image comparison. Compared with the existing scheme of intelligently recording, offline, the position where the user last used the headset, the actual position of the wireless headset is detected in real time, avoiding both the errors introduced by recording that position and the case where the headset is moved while offline without its recorded position being updated, thereby improving the accuracy and convenience with which the mobile terminal locates the wireless headset.
Drawings
To illustrate the embodiments of the present application or the technical solutions in the prior art more clearly, the drawings used in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings in the following description show only some embodiments of the present application; those skilled in the art can derive other drawings from them without creative effort.
Fig. 1A is a schematic structural diagram of a mobile terminal according to an embodiment of the present application;
fig. 1B is an exemplary diagram of a radar sensor scanning a profile of a target object according to an embodiment of the present application;
fig. 1C is a schematic structural diagram of another mobile terminal provided in the embodiment of the present application;
fig. 2 is a schematic flowchart of a position detection method according to an embodiment of the present application;
fig. 3 is a schematic flowchart of another position detection method provided in the embodiment of the present application;
fig. 4 is a schematic flowchart of another position detection method provided in the embodiment of the present application;
fig. 5 is a schematic structural diagram of a wireless headset according to an embodiment of the present application;
fig. 6 is a block diagram of functional units of a position detection apparatus according to an embodiment of the present application.
Detailed Description
To make the technical solutions of the present application better understood, the technical solutions in the embodiments of the present application are described clearly and completely below with reference to the drawings in the embodiments of the present application. Obviously, the described embodiments are only some, not all, of the embodiments of the present application. All other embodiments obtained by those skilled in the art based on the embodiments given herein without creative effort shall fall within the protection scope of the present application.
The terms "first," "second," and the like in the description and claims of the present application and in the above-described drawings are used for distinguishing between different objects and not for describing a particular order. Furthermore, the terms "include" and "have," as well as any variations thereof, are intended to cover non-exclusive inclusions. For example, a process, method, system, article, or apparatus that comprises a list of steps or elements is not limited to only those steps or elements listed, but may alternatively include other steps or elements not listed, or inherent to such process, method, article, or apparatus.
Reference herein to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment can be included in at least one embodiment of the application. The appearances of the phrase in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. It is explicitly and implicitly understood by one skilled in the art that the embodiments described herein can be combined with other embodiments.
The wireless headset in the embodiments of the present application may be a single earphone or a pair of earphones (two earphones worn on the left and right ears, respectively) with wireless communication capability. The wireless headset may support wired or wireless charging, for example by being placed in a dedicated charging box, and after connecting to a mobile terminal such as a mobile phone it may support functions such as calls and music playback. The mobile terminal may include various handheld devices, vehicle-mounted devices, wearable devices, computing devices, or other processing devices connected to a wireless modem with wireless communication capability, as well as various forms of User Equipment (UE), Mobile Stations (MS), terminal devices, and the like.
The following describes embodiments of the present application in detail.
Referring to fig. 1A, fig. 1A is a schematic structural diagram of a mobile terminal 100 according to an embodiment of the present application, where the mobile terminal 100 includes: a housing 110, a processor 120 disposed within the housing 110, a radar sensor 130, a memory 140, and a transceiver 150, the processor 120 being coupled to the radar sensor 130, the memory 140, and the transceiver 150, wherein,
the processor 120 is configured to enable the radar sensor 130 when detecting a location query request of a user for the wireless headset;
the radar sensor 130 is configured to scan at least one target object in an area where the mobile terminal 100 is located, so as to obtain a profile image of each target object;
the processor 120 is further configured to compare the reference image of the wireless headset with the contour image of each target object, and obtain a contour image matched with the reference image; and the position information of the wireless earphone is determined according to the matched contour image.
A radar is a device that detects a target and measures target information by emitting electromagnetic waves and receiving their echoes; a radar sensor is an electronic device based on this principle. Scanning by the radar sensor means that the mobile terminal finds a target object by radio: the radar sensor emits electromagnetic waves to illuminate the target and receives the target's echo, thereby obtaining information such as the distance from the target to the emission point, the rate of change of that distance (radial velocity), the azimuth, and the height.
The reference image refers to a reference contour image of the wireless headset, which the mobile terminal can obtain through camera capture and recognition processing.
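The patent does not specify how the reference contour image is compared with each scanned contour image. One simple stand-in is intersection-over-union of binary contour masks; the sketch below uses that assumption, with sets of filled-pixel coordinates representing the masks:

```python
def contour_iou(a, b):
    """Intersection-over-union of two binary contour masks, each given
    as an iterable of (row, col) filled-pixel coordinates. This is an
    illustrative similarity measure, not the patent's method."""
    a, b = set(a), set(b)
    union = len(a | b)
    return len(a & b) / union if union else 0.0


# Hypothetical 2x2 reference mask vs. a candidate sharing 3 of 4 pixels.
reference = {(0, 0), (0, 1), (1, 0), (1, 1)}
candidate = {(0, 0), (0, 1), (1, 0), (2, 2)}
score = contour_iou(reference, candidate)  # 3 shared / 5 total = 0.6
```

A candidate whose score exceeds some threshold would then be treated as the matching contour image.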
The processor 120 includes an application processor and a baseband processor. The processor is the control center of the mobile terminal: it connects the various parts of the entire terminal through various interfaces and lines, and performs the terminal's functions and processes data by running or executing software programs and/or modules stored in the memory and calling data stored in the memory, thereby monitoring the terminal as a whole. The application processor mainly handles the operating system, user interface, applications, and the like, while the baseband processor mainly handles wireless communication. It will be appreciated that the baseband processor need not be integrated into the processor 120. The memory 140 may be used to store software programs and modules, which the processor runs to execute the terminal's functional applications and data processing. The memory 140 may mainly include a program storage area, which may store the operating system and the applications required for at least one function, and a data storage area, which may store data created during use of the mobile terminal. Further, the memory 140 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other non-volatile solid-state storage device. The transceiver 150 may be, for example, a wireless communication module such as Bluetooth.
As another example of the structure of the mobile terminal 100, shown in fig. 1C, the mobile terminal 100 may include control circuitry, which may include the storage and processing circuitry 30. The storage and processing circuitry 30 may include memory, such as hard disk drive memory, non-volatile memory (e.g., flash memory or other electronically programmable read-only memory used to form a solid state drive), volatile memory (e.g., static or dynamic random access memory), and so on; the embodiments of the present application are not limited in this respect. Processing circuitry in the storage and processing circuitry 30 may be used to control the operation of the mobile terminal 100. The processing circuitry may be implemented based on one or more microprocessors, microcontrollers, digital signal processors, baseband processors, power management units, audio codec chips, application-specific integrated circuits, display driver integrated circuits, and the like.
The storage and processing circuitry 30 may be used to run software in the mobile terminal 100, such as an Internet browsing application, a Voice over Internet Protocol (VoIP) telephone call application, an email application, a media playing application, operating system functions, and so on. Such software may be used to perform control operations such as camera-based image capture, ambient light measurement based on an ambient light sensor, proximity measurement based on a proximity sensor, information display based on a status indicator such as a light-emitting-diode status light, touch event detection based on a touch sensor, functions associated with displaying information on multiple (e.g., layered) displays, operations associated with wireless communication, operations associated with collecting and generating audio signals, control operations associated with collecting and processing button press event data, and other functions in the mobile terminal 100; the embodiments of the present application are not limited in this respect.
The mobile terminal 100 may also include input-output circuitry 42. The input-output circuitry 42 may be used to allow the mobile terminal 100 to input and output data, i.e., to receive data from external devices and to output data from the mobile terminal 100 to external devices. The input-output circuitry 42 may further include the sensor 32. The sensors 32 may include ambient light sensors, optical and capacitive proximity sensors, touch sensors (e.g., optical and/or capacitive touch sensors, which may be part of a display screen or used independently as a touch sensor structure), acceleration sensors, and other sensors.
Input-output circuitry 42 may also include one or more displays, such as display 14. The display 14 may include one or a combination of liquid crystal displays, organic light emitting diode displays, electronic ink displays, plasma displays, displays using other display technologies. The display 14 may include an array of touch sensors (i.e., the display 14 may be a display screen). The touch sensor may be a capacitive touch sensor formed by a transparent touch sensor electrode (e.g., an Indium Tin Oxide (ITO) electrode) array, or may be a touch sensor formed using other touch technologies, such as acoustic wave touch, pressure sensitive touch, resistive touch, optical touch, and the like, and the embodiments of the present application are not limited thereto.
The mobile terminal 100 may also include an audio component 36. The audio component 36 may be used to provide audio input and output functionality for the mobile terminal 100. The audio components 36 in the mobile terminal 100 may include a speaker, a microphone, a buzzer, a tone generator, and other components for generating and detecting sound.
The communication circuit 38 may be used to provide the mobile terminal 100 with the capability to communicate with external devices. The communication circuit 38 may include analog and digital input-output interface circuits, and wireless communication circuits based on radio frequency signals and/or optical signals. The wireless communication circuitry in communication circuitry 38 may include radio-frequency transceiver circuitry, power amplifier circuitry, low noise amplifiers, switches, filters, and antennas. For example, the wireless Communication circuitry in Communication circuitry 38 may include circuitry to support Near Field Communication (NFC) by transmitting and receiving Near Field coupled electromagnetic signals. For example, the communication circuitry 38 may include a near field communication antenna and a near field communication transceiver. The communications circuitry 38 may also include a cellular telephone transceiver and antenna, a wireless local area network transceiver circuit and antenna, and the like.
The mobile terminal 100 may further include a battery, a power management circuit, and other input-output units 40. The input-output unit 40 may include buttons, joysticks, click wheels, scroll wheels, touch pads, keypads, keyboards, cameras, light emitting diodes and other status indicators, etc.
A user may enter commands through input-output circuitry 42 to control operation of mobile terminal 100 and may use output data of input-output circuitry 42 to enable receipt of status information and other outputs from mobile terminal 100.
As can be seen, in the embodiment of the present application, when the mobile terminal detects a user's position query request for the wireless headset, it first enables the radar sensor, then scans at least one target object in the area where the mobile terminal is located through the radar sensor to obtain a contour image of each target object, then compares the reference image of the wireless headset with the contour image of each target object to obtain a contour image matching the reference image, and finally determines the position information of the wireless headset according to the matched contour image. In this way, the mobile terminal can scan surrounding objects through the radar sensor to obtain their contour images and accurately identify the position of the wireless headset through image comparison. Compared with the existing scheme of intelligently recording, offline, the position where the user last used the headset, the actual position of the wireless headset is detected in real time, avoiding both the errors introduced by recording that position and the case where the headset is moved while offline without its recorded position being updated, thereby improving the accuracy and convenience with which the mobile terminal locates the wireless headset.
In one possible example, the radar sensor 130 is further configured to, before the scanning of at least one target object in the area where the mobile terminal 100 is located to obtain the contour image of each target object, scan a plurality of target objects in that area according to a first scanning period to obtain a first contour image of each target object;
the processor 120 is further configured to determine the volume of each target object according to the first contour image of each target object, and to screen out, from the plurality of target objects, at least one target object whose volume is less than or equal to a preset volume, where the preset volume is greater than the volume of the wireless headset;
in the aspect of scanning at least one target object in the area where the mobile terminal 100 is located to obtain the profile image of each target object, the radar sensor 130 is specifically configured to scan the at least one target object according to a second scanning period to obtain the profile image of each target object, where the first scanning period is smaller than the second scanning period.
When the radar sensor scans surrounding objects, it may scan in a preset direction (for example counterclockwise or clockwise; this is not uniquely limited), point by point, line by line, and surface by surface. The process of scanning and detecting the contour of a target object is described in detail with reference to the drawings. As shown in fig. 1B, when the mobile terminal controls the radar sensor to scan a target object, it can detect the intensity of the echo signal in real time. In the first stage, when the emission direction of the transmitted signal moves from the area surrounding the target object onto the first edge contour of the target object, the echo reflection distance changes sharply, so the echo signal intensity at the radar sensor changes sharply; after detecting this change, the mobile terminal can determine that the emission direction corresponds to a contour of the target object. In the second stage, the transmitted signal continues to strike the body of the target object and form echo signals; the echo intensity in this stage varies mainly with the shape of the target object, and its variation interval is relatively small. In the third stage, the emission direction moves from the second edge contour of the target object back to the surrounding area, the echo reflection distance again changes sharply, and the echo signal intensity changes sharply accordingly. Through this detection process, the mobile terminal can identify the contour of the target object.
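The three-stage behavior described above amounts to flagging large jumps in echo intensity along a sweep: the two sharp changes mark the object's edge contours, while the small variations in between come from the object's body. A minimal sketch under that reading (the threshold and the synthetic intensity values are illustrative assumptions, not values from the patent):

```python
def contour_edges(intensity, jump_threshold):
    """Return the indices where the echo intensity jumps by at least
    jump_threshold between adjacent samples — the candidate edge
    contours (stages one and three) in a single radar sweep."""
    edges = []
    for i in range(1, len(intensity)):
        if abs(intensity[i] - intensity[i - 1]) >= jump_threshold:
            edges.append(i)
    return edges


# Synthetic sweep: flat background, sharp jump onto the object, small
# body variation (stage two), sharp jump back off the object.
sweep = [1.0, 1.0, 1.1, 5.0, 5.2, 4.9, 5.1, 1.2, 1.0]
edges = contour_edges(sweep, jump_threshold=2.0)  # two edge contours
```

The two returned indices bracket the object's extent along this sweep line; repeating this over many sweep lines yields the full contour.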
The first scanning period corresponds to a low-precision mode, and the second scanning period corresponds to a high-precision mode.
Therefore, in this example, the mobile terminal can first obtain contour images of objects in the surrounding area in the low-precision mode and estimate the volume of each object, so as to screen out the target objects close in volume to the wireless headset, and then accurately determine the contours of the screened objects in the high-precision mode, which helps locate the contour of the wireless headset quickly and accurately.
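The low-precision screening pass can be sketched as a simple volume filter. In the sketch below, the object records, names, and volume figures are hypothetical; only the rule — keep objects no larger than a preset volume that exceeds the headset's own volume — comes from the text:

```python
def screen_candidates(objects, max_volume):
    """First-pass screening: keep only objects whose estimated volume is
    less than or equal to the preset volume (which the text requires to
    be greater than the wireless headset's volume)."""
    return [o for o in objects if o["volume"] <= max_volume]


# Hypothetical first-pass scan results (volumes in cubic centimetres).
first_pass = [
    {"name": "sofa",   "volume": 500_000.0},
    {"name": "remote", "volume": 150.0},
    {"name": "earbud", "volume": 4.0},
]
candidates = screen_candidates(first_pass, max_volume=200.0)
```

Only the screened candidates are then re-scanned in the slower high-precision mode, which is what makes the two-pass scheme faster than scanning everything precisely.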
In one possible example, in determining the position information of the wireless headset from the matched contour image, the processor 120 is specifically configured to: acquire the position parameters with which the mobile terminal 100 scanned the matched contour image through the radar sensor 130; and determine the position information of the wireless headset according to the position parameters and the position of the mobile terminal 100.
The position parameters include a distance parameter and an angle parameter. The distance from the target object to the radar sensor can be obtained by measuring the time difference between the transmitted signal and the echo signal reflected by the target object, dividing that time difference by 2, and multiplying by the speed of light. The angle information is the direction of the target object relative to the radar sensor, mainly comprising an elevation angle and an azimuth angle, and is obtained through the directivity of the radar sensor's antenna: the beam transmitted by the radar sensor is narrow, like the beam of a searchlight, and highly directional. Directions in which no echo is observed contain no target, while the angles of directions that do return an echo are recorded. With both the distance information and the angle information of the target object relative to the radar sensor, the position of the target object relative to the radar is known, and positioning is achieved.
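The arithmetic above can be sketched directly: halve the round-trip time, multiply by the speed of light to get the distance, then convert (distance, azimuth, elevation) to Cartesian coordinates relative to the sensor. The spherical-to-Cartesian convention chosen here is an assumption — the patent does not specify one:

```python
import math

SPEED_OF_LIGHT = 299_792_458.0  # metres per second


def target_position(round_trip_s, azimuth_rad, elevation_rad):
    """Position of the target relative to the radar sensor.

    Distance = (round-trip time / 2) * c, as described in the text; the
    axis convention for the angle conversion is an illustrative choice.
    """
    distance = round_trip_s / 2.0 * SPEED_OF_LIGHT
    x = distance * math.cos(elevation_rad) * math.cos(azimuth_rad)
    y = distance * math.cos(elevation_rad) * math.sin(azimuth_rad)
    z = distance * math.sin(elevation_rad)
    return x, y, z


# A 20 ns round trip straight ahead corresponds to roughly 3 m.
x, y, z = target_position(20e-9, azimuth_rad=0.0, elevation_rad=0.0)
```

Adding this offset to the mobile terminal's own position gives the headset's position, which is the combination step the example describes.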
Therefore, in this example, the mobile terminal can accurately calculate the actual position of the wireless headset according to the position parameters when the wireless headset is scanned, so that the positioning accuracy is improved.
In one possible example, after determining the position information of the wireless headset from the matched contour image, the processor 120 is further configured to: generate a navigation route according to the position information; and navigate according to the navigation route.
The navigation route may be a navigation route based on an indoor map navigation technology, and specifically, an initial position of a current mobile terminal and a target position of a wireless headset may be presented on an indoor map, and an optimal route from the initial position to the target position is planned.
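Route planning from the terminal's initial position to the headset's target position on an indoor map can be sketched with a breadth-first search over an occupancy grid; the grid layout and its contents below are illustrative assumptions, not data from the patent:

```python
from collections import deque


def shortest_route(grid, start, goal):
    """Shortest path on a 4-connected grid (0 = free cell, 1 = blocked),
    returned as a list of (row, col) cells from start to goal, or None
    if the goal is unreachable."""
    rows, cols = len(grid), len(grid[0])
    prev = {start: None}
    queue = deque([start])
    while queue:
        cell = queue.popleft()
        if cell == goal:
            path = []
            while cell is not None:  # walk parents back to the start
                path.append(cell)
                cell = prev[cell]
            return path[::-1]
        r, c = cell
        for step in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            nr, nc = step
            if (0 <= nr < rows and 0 <= nc < cols
                    and grid[nr][nc] == 0 and step not in prev):
                prev[step] = cell
                queue.append(step)
    return None


# Hypothetical 3x3 indoor map: terminal at top-left, headset at
# bottom-left, with a wall forcing a detour around the right side.
grid = [[0, 0, 0],
        [1, 1, 0],
        [0, 0, 0]]
route = shortest_route(grid, start=(0, 0), goal=(2, 0))
```

Breadth-first search returns a minimum-length route on an unweighted grid, which matches the "optimal route" the text asks for in the simplest setting; a weighted indoor map would call for Dijkstra or A* instead.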
Therefore, in this example, after the mobile terminal locates the position of the wireless headset, the navigation route can be further provided, so that the user can find the wireless headset in time according to the navigation route, and convenience and intelligence of inquiring the wireless headset are improved.
In one possible example, the processor 120 is further configured to: determining that the wireless earphone is in an effective connection distance range according to the position information of the wireless earphone, and establishing wireless connection with the wireless earphone; and the wireless earphone is used for controlling the wireless earphone to play preset audio information at the maximum volume so as to prompt the user of the position.
The preset audio may be a sound that easily attracts the user's attention; it may be preset by the developer or by the user.
Therefore, in the example, after the position of the wireless earphone is determined, the mobile terminal can further perform position reminding by controlling the wireless earphone to play specific sound, so that the implementation mode of determining the position of the wireless earphone through the mobile terminal is further expanded, and the searching convenience and diversity are improved.
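The effective-connection check described above can be sketched as a simple distance test; the 10 m bound used here is an assumed value (typical of a class-2 Bluetooth link), not one specified by the embodiment:

```python
ASSUMED_EFFECTIVE_RANGE_M = 10.0  # hypothetical effective connection range

def within_connection_range(terminal_pos, headset_pos,
                            max_range_m=ASSUMED_EFFECTIVE_RANGE_M):
    """True when the straight-line distance between the mobile terminal
    and the headset does not exceed the effective connection range."""
    squared = sum((h - t) ** 2 for t, h in zip(terminal_pos, headset_pos))
    return squared ** 0.5 <= max_range_m
```

When the check passes, the terminal would establish the wireless connection and instruct the headset to play the preset audio at maximum volume.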
Referring to fig. 2, fig. 2 is a schematic flowchart illustrating a position detection method according to an embodiment of the present application, applied to the mobile terminal shown in fig. 1A, where the mobile terminal includes a radar sensor; as shown in the figure, the position detection method includes:
S201, when the mobile terminal detects a position query request of a user for a wireless earphone, starting the radar sensor;
S202, the mobile terminal scans at least one target object in the area where the mobile terminal is located through the radar sensor to obtain a contour image of each target object;
S203, the mobile terminal compares the reference image of the wireless earphone with the contour image of each target object to obtain a contour image matched with the reference image;
S204, the mobile terminal determines the position information of the wireless earphone according to the matched contour image.
It can be seen that, in the embodiment of the present application, when the mobile terminal detects a location query request of a user for the wireless headset, the radar sensor is first started; at least one target object in the area where the mobile terminal is located is then scanned by the radar sensor to obtain a contour image of each target object; next, the reference image of the wireless headset is compared with the contour image of each target object to obtain a contour image matched with the reference image; finally, the location information of the wireless headset is determined according to the matched contour image. The mobile terminal can therefore scan surrounding objects through the radar sensor to obtain their contour images and accurately identify the position of the wireless headset through image comparison. Compared with the existing scheme that intelligently records, offline, the position where the user last used the headset, the actual position of the wireless headset is detected in real time, which avoids both errors in the recorded position and the case where the headset is moved while offline without the record being updated, thereby improving the accuracy and convenience with which the mobile terminal locates the wireless headset.
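The comparison step is not pinned to a particular algorithm in the text; one simple possibility, shown here purely as an assumed illustration, is an intersection-over-union score between binary contour masks, selecting the best candidate above a similarity threshold:

```python
def iou(ref_mask, mask):
    """Intersection-over-union of two equal-length binary masks
    (1 = contour pixel, 0 = background)."""
    inter = sum(a & b for a, b in zip(ref_mask, mask))
    union = sum(a | b for a, b in zip(ref_mask, mask))
    return inter / union if union else 0.0

def match_contour(reference, candidates, threshold=0.8):
    """Index of the candidate contour that best matches the reference
    image, or None when no candidate is similar enough."""
    best_index, best_score = None, 0.0
    for index, candidate in enumerate(candidates):
        score = iou(reference, candidate)
        if score > best_score:
            best_index, best_score = index, score
    return best_index if best_score >= threshold else None
```

In practice the masks would first be scale- and rotation-normalised, since the radar may observe the earphone from an arbitrary pose.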
In one possible example, before the mobile terminal scans at least one target object in an area where the mobile terminal is located through the radar sensor to obtain a contour image of each target object, the method further includes: the mobile terminal scans a plurality of target objects in an area where the mobile terminal is located through the radar sensor according to a first scanning period to obtain a first contour image of each target object; determining the volume of each target object according to the first contour image of each target object; screening at least one target object with the volume smaller than or equal to a preset volume from the plurality of target objects, wherein the preset volume is larger than the volume of the wireless earphone;
the mobile terminal scans at least one target object in the area where the mobile terminal is located through the radar sensor to obtain a contour image of each target object, and the method comprises the following steps: and the mobile terminal scans the at least one target object through the radar sensor according to a second scanning period to obtain a profile image of each target object, wherein the first scanning period is smaller than the second scanning period.
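The two-stage screening described above can be sketched as follows; `estimate_volume` is a hypothetical placeholder for whatever volume estimate the coarse contour yields (a bounding-box product is assumed here for illustration):

```python
def estimate_volume(bounding_box_cm):
    """Hypothetical volume estimate from a coarse contour's bounding box
    (width, height, depth in centimetres)."""
    w, h, d = bounding_box_cm
    return w * h * d

def screen_candidates(coarse_detections, preset_volume_cm3):
    """First-stage filter: keep only objects whose estimated volume does
    not exceed the preset bound (chosen larger than the earphone itself),
    so the slower second-stage fine scan covers fewer objects."""
    return [obj_id for obj_id, box in coarse_detections
            if estimate_volume(box) <= preset_volume_cm3]
```

The shorter first scanning period trades precision for speed; only the objects surviving this filter are rescanned with the longer, higher-precision second period.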
In one possible example, the mobile terminal determines the position information of the wireless headset according to the matched contour image, and the method comprises the following steps: the mobile terminal acquires the position parameters of the matched contour image scanned by the mobile terminal through the radar sensor; and determining the position information of the wireless earphone according to the position parameters and the position of the mobile terminal.
In one possible example, after the mobile terminal determines the location information of the wireless headset according to the matched scanned image, the method further includes: the mobile terminal generates a navigation route according to the position information; and navigating according to the navigation route.
In one possible example, the method further comprises: the mobile terminal determines that the wireless earphone is in an effective connection distance range according to the position information of the wireless earphone, and establishes wireless connection with the wireless earphone; and controlling the wireless earphone to play preset audio information at the maximum volume so as to prompt the user of the position.
Referring to fig. 3, fig. 3 is a schematic flowchart of a position detection method according to an embodiment of the present application, applied to the mobile terminal shown in fig. 1A, where the mobile terminal includes a radar sensor, and as shown in the figure, the position detection method includes:
S301, when the mobile terminal detects a position query request of a user for the wireless headset, starting the radar sensor;
S302, the mobile terminal scans a plurality of target objects in the area where the mobile terminal is located through the radar sensor according to a first scanning period to obtain a first contour image of each target object;
S303, the mobile terminal determines the volume of each target object according to the first contour image of each target object;
S304, the mobile terminal screens out at least one target object with a volume smaller than or equal to a preset volume from the target objects, wherein the preset volume is larger than the volume of the wireless earphone;
S305, the mobile terminal scans the at least one target object through the radar sensor according to a second scanning period to obtain a contour image of each target object, wherein the first scanning period is smaller than the second scanning period;
S306, the mobile terminal compares the reference image of the wireless earphone with the contour image of each target object to obtain a contour image matched with the reference image;
S307, the mobile terminal determines the position information of the wireless earphone according to the matched contour image.
It can be seen that, in the embodiment of the present application, when the mobile terminal detects a location query request of a user for the wireless headset, the radar sensor is first started; at least one target object in the area where the mobile terminal is located is then scanned by the radar sensor to obtain a contour image of each target object; next, the reference image of the wireless headset is compared with the contour image of each target object to obtain a contour image matched with the reference image; finally, the location information of the wireless headset is determined according to the matched contour image. The mobile terminal can therefore scan surrounding objects through the radar sensor to obtain their contour images and accurately identify the position of the wireless headset through image comparison. Compared with the existing scheme that intelligently records, offline, the position where the user last used the headset, the actual position of the wireless headset is detected in real time, which avoids both errors in the recorded position and the case where the headset is moved while offline without the record being updated, thereby improving the accuracy and convenience with which the mobile terminal locates the wireless headset.
In addition, the mobile terminal can quickly obtain contour images of objects in the surrounding area in a low-precision mode and estimate the volume of each corresponding object, so as to screen out at least one target object whose volume is close to that of the wireless earphone; the contour of each screened object is then determined accurately in a high-precision mode, so that the contour of the wireless earphone is located quickly and accurately.
Referring to fig. 4, fig. 4 is a schematic flowchart of a position detection method provided in an embodiment of the present application, and the position detection method is applied to a mobile terminal including a radar sensor, and as shown in the figure, the position detection method includes:
S401, when the mobile terminal detects a position query request of a user for a wireless earphone, starting the radar sensor;
S402, the mobile terminal scans a plurality of target objects in the area where the mobile terminal is located through the radar sensor according to a first scanning period to obtain a first contour image of each target object;
S403, the mobile terminal determines the volume of each target object according to the first contour image of each target object;
S404, the mobile terminal screens out at least one target object with a volume smaller than or equal to a preset volume from the target objects, wherein the preset volume is larger than the volume of the wireless earphone;
S405, the mobile terminal scans the at least one target object through the radar sensor according to a second scanning period to obtain a contour image of each target object, wherein the first scanning period is smaller than the second scanning period;
S406, the mobile terminal compares the reference image of the wireless earphone with the contour image of each target object to obtain a contour image matched with the reference image;
S407, the mobile terminal acquires the position parameters of the matched contour image scanned by the mobile terminal through the radar sensor;
S408, the mobile terminal determines the position information of the wireless earphone according to the position parameters and the position of the mobile terminal;
S409, the mobile terminal generates a navigation route according to the position information;
and S410, the mobile terminal navigates according to the navigation route.
It can be seen that, in the embodiment of the present application, when the mobile terminal detects a location query request of a user for the wireless headset, the radar sensor is first started; at least one target object in the area where the mobile terminal is located is then scanned by the radar sensor to obtain a contour image of each target object; next, the reference image of the wireless headset is compared with the contour image of each target object to obtain a contour image matched with the reference image; finally, the location information of the wireless headset is determined according to the matched contour image. The mobile terminal can therefore scan surrounding objects through the radar sensor to obtain their contour images and accurately identify the position of the wireless headset through image comparison. Compared with the existing scheme that intelligently records, offline, the position where the user last used the headset, the actual position of the wireless headset is detected in real time, which avoids both errors in the recorded position and the case where the headset is moved while offline without the record being updated, thereby improving the accuracy and convenience with which the mobile terminal locates the wireless headset.
In addition, the mobile terminal can quickly obtain contour images of objects in the surrounding area in a low-precision mode and estimate the volume of each corresponding object, so as to screen out at least one target object whose volume is close to that of the wireless earphone; the contour of each screened object is then determined accurately in a high-precision mode, so that the contour of the wireless earphone is located quickly and accurately.
In addition, the mobile terminal can accurately calculate the actual position of the wireless earphone according to the position parameters when the wireless earphone is scanned, and the positioning accuracy is improved.
In addition, after the mobile terminal locates the position of the wireless earphone, a navigation route can further be provided, so that the user can find the wireless earphone in time according to the navigation route, and the convenience and intelligence of querying the wireless earphone are improved.
In accordance with the embodiments shown in fig. 2, fig. 3, and fig. 4, please refer to fig. 5, and fig. 5 is a schematic structural diagram of a mobile terminal according to an embodiment of the present application, where as shown, the mobile terminal includes a radar sensor, the mobile terminal further includes a processor, a memory, a communication interface, and one or more programs, where the one or more programs are stored in the memory and configured to be executed by the processor, and the programs include instructions for performing the following steps;
starting a radar sensor when a position query request of a user for the wireless headset is detected;
scanning at least one target object in the area where the mobile terminal is located through the radar sensor to obtain a contour image of each target object;
comparing the reference image of the wireless earphone with the contour image of each target object to obtain a contour image matched with the reference image;
and determining the position information of the wireless earphone according to the matched contour image.
It can be seen that, in the embodiment of the present application, when the mobile terminal detects a location query request of a user for the wireless headset, the radar sensor is first started; at least one target object in the area where the mobile terminal is located is then scanned by the radar sensor to obtain a contour image of each target object; next, the reference image of the wireless headset is compared with the contour image of each target object to obtain a contour image matched with the reference image; finally, the location information of the wireless headset is determined according to the matched contour image. The mobile terminal can therefore scan surrounding objects through the radar sensor to obtain their contour images and accurately identify the position of the wireless headset through image comparison. Compared with the existing scheme that intelligently records, offline, the position where the user last used the headset, the actual position of the wireless headset is detected in real time, which avoids both errors in the recorded position and the case where the headset is moved while offline without the record being updated, thereby improving the accuracy and convenience with which the mobile terminal locates the wireless headset.
In one possible example, the program further includes instructions for: before the radar sensor scans at least one target object in the area where the mobile terminal is located to obtain the profile image of each target object, scanning a plurality of target objects in the area where the mobile terminal is located by the radar sensor according to a first scanning period to obtain a first profile image of each target object; determining the volume of each target object according to the first contour image of each target object; screening at least one target object with the volume smaller than or equal to a preset volume from the plurality of target objects, wherein the preset volume is larger than the volume of the wireless earphone;
in the aspect that the at least one target object in the area where the mobile terminal is located is scanned by the radar sensor to obtain the profile image of each target object, the instructions in the program are specifically configured to perform the following operations: and scanning the at least one target object through the radar sensor according to a second scanning period to obtain a profile image of each target object, wherein the first scanning period is smaller than the second scanning period.
In one possible example, in said determining the location information of the wireless headset from the matched contour image, the instructions in the program are specifically configured to: acquiring the position parameters of the matched contour image scanned by the mobile terminal through the radar sensor; and determining the position information of the wireless earphone according to the position parameters and the position of the mobile terminal.
In one possible example, the program further includes instructions for: after the position information of the wireless headset is determined according to the matched scanning image, a navigation route is generated according to the position information; and navigating according to the navigation route.
In one possible example, the program further includes instructions for: determining that the wireless earphone is in an effective connection distance range according to the position information of the wireless earphone, and establishing wireless connection with the wireless earphone; and controlling the wireless earphone to play preset audio information at the maximum volume so as to prompt the user of the position.
The above description has introduced the solution of the embodiment of the present application mainly from the perspective of the method-side implementation process. It is understood that, in order to implement the above-described functions, the mobile terminal includes corresponding hardware structures and/or software modules for performing the respective functions. Those of skill in the art will readily appreciate that the exemplary units and algorithm steps described in connection with the embodiments provided herein can be implemented in hardware, or in a combination of hardware and computer software. Whether a function is performed by hardware or by computer software driving hardware depends upon the particular application and the design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the embodiment of the present application, the mobile terminal may be divided into the functional units according to the method example, for example, each functional unit may be divided corresponding to each function, or two or more functions may be integrated into one processing unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit. It should be noted that the division of the unit in the embodiment of the present application is schematic, and is only a logic function division, and there may be another division manner in actual implementation.
Fig. 6 is a block diagram showing functional units of a position detection device 600 according to an embodiment of the present application. The position detection apparatus 600 is applied to a mobile terminal including a radar sensor, the position detection apparatus 600 includes a start unit 601, a scanning unit 602, a comparison unit 603, and a determination unit 604, wherein,
the starting unit 601 is configured to start the radar sensor when detecting a location query request of a user for the wireless headset;
the scanning unit 602 is configured to scan at least one target object in an area where the mobile terminal is located through the radar sensor, so as to obtain a profile image of each target object;
the comparing unit 603 is configured to compare the reference image of the wireless headset with the contour image of each target object, and obtain a contour image matched with the reference image;
the determining unit 604 is configured to determine the position information of the wireless headset according to the matched contour image.
It can be seen that, in the embodiment of the present application, when the mobile terminal detects a location query request of a user for the wireless headset, the radar sensor is first started; at least one target object in the area where the mobile terminal is located is then scanned by the radar sensor to obtain a contour image of each target object; next, the reference image of the wireless headset is compared with the contour image of each target object to obtain a contour image matched with the reference image; finally, the location information of the wireless headset is determined according to the matched contour image. The mobile terminal can therefore scan surrounding objects through the radar sensor to obtain their contour images and accurately identify the position of the wireless headset through image comparison. Compared with the existing scheme that intelligently records, offline, the position where the user last used the headset, the actual position of the wireless headset is detected in real time, which avoids both errors in the recorded position and the case where the headset is moved while offline without the record being updated, thereby improving the accuracy and convenience with which the mobile terminal locates the wireless headset.
In a possible example, before the scanning, by the radar sensor, at least one target object in an area where the mobile terminal is located to obtain a contour image of each target object, the scanning unit 602 is further configured to: scanning a plurality of target objects in the area where the mobile terminal is located through the radar sensor according to a first scanning period to obtain a first contour image of each target object;
the determining unit 604 is further configured to determine a volume of each target object according to the first contour image of each target object;
the mobile terminal further comprises a screening unit, wherein the screening unit is used for screening at least one target object with the volume smaller than or equal to a preset volume from the plurality of target objects, and the preset volume is larger than the volume of the wireless earphone;
in the aspect that the at least one target object in the area where the mobile terminal is located is scanned by the radar sensor to obtain the profile image of each target object, the scanning unit 602 is specifically configured to: and scanning the at least one target object through the radar sensor according to a second scanning period to obtain a profile image of each target object, wherein the first scanning period is smaller than the second scanning period.
In one possible example, in said determining the position information of the wireless headset from the matched contour image, the determining unit 604 is specifically configured to: acquiring the position parameters of the matched contour image scanned by the mobile terminal through the radar sensor; and the position information of the wireless earphone is determined according to the position parameter and the position of the mobile terminal.
In one possible example, the mobile terminal further comprises a generating unit and a navigation unit,
the generating unit is configured to generate a navigation route according to the position information after the determining unit 604 determines the position information of the wireless headset according to the matched scanned image;
and the navigation unit is used for navigating according to the navigation route.
In one possible example, the mobile terminal further comprises a setup unit and a control unit,
the establishing unit is used for determining that the wireless earphone is in an effective connection distance range according to the position information of the wireless earphone and establishing wireless connection with the wireless earphone;
the control unit is used for controlling the wireless earphone to play preset audio information at the maximum volume so as to prompt the user of the position.
The starting unit 601, the comparing unit 603, and the determining unit 604 may be processors, and the scanning unit 602 may be a micro-control chip of the radar sensor.
Embodiments of the present application also provide a computer storage medium, where the computer storage medium stores a computer program for electronic data exchange, and the computer program enables a computer to execute part or all of the steps of any one of the methods described in the above method embodiments, and the computer includes a mobile terminal.
Embodiments of the present application also provide a computer program product comprising a non-transitory computer readable storage medium storing a computer program operable to cause a computer to perform some or all of the steps of any of the methods as described in the above method embodiments. The computer program product may be a software installation package, the computer comprising a mobile terminal.
It should be noted that, for simplicity of description, the above-mentioned method embodiments are described as a series of acts or combination of acts, but those skilled in the art will recognize that the present application is not limited by the order of acts described, as some steps may occur in other orders or concurrently depending on the application. Further, those skilled in the art should also appreciate that the embodiments described in the specification are preferred embodiments and that the acts and modules referred to are not necessarily required in this application.
In the foregoing embodiments, the descriptions of the respective embodiments have respective emphasis, and for parts that are not described in detail in a certain embodiment, reference may be made to related descriptions of other embodiments.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus may be implemented in other manners. For example, the above-described embodiments of the apparatus are merely illustrative, and for example, the above-described division of the units is only one type of division of logical functions, and other divisions may be realized in practice, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection of some interfaces, devices or units, and may be an electric or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit may be stored in a computer readable memory if it is implemented in the form of a software functional unit and sold or used as a stand-alone product. Based on such understanding, the technical solution of the present application may be substantially implemented or a part of or all or part of the technical solution contributing to the prior art may be embodied in the form of a software product stored in a memory, and including several instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the above-mentioned method of the embodiments of the present application. And the aforementioned memory comprises: a U-disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a removable hard disk, a magnetic or optical disk, and other various media capable of storing program codes.
Those skilled in the art will appreciate that all or part of the steps in the methods of the above embodiments may be implemented by associated hardware instructed by a program, which may be stored in a computer-readable memory, which may include: flash Memory disks, Read-Only memories (ROMs), Random Access Memories (RAMs), magnetic or optical disks, and the like.
The foregoing detailed description of the embodiments of the present application has been presented to illustrate the principles and implementations of the present application, and the above description of the embodiments is only provided to help understand the method and the core concept of the present application; meanwhile, for a person skilled in the art, according to the idea of the present application, there may be variations in the specific embodiments and the application scope, and in summary, the content of the present specification should not be construed as a limitation to the present application.

Claims (9)

1. A mobile terminal comprising a radar sensor and a processor, the processor being connected to the radar sensor, wherein,
the processor is used for starting the radar sensor when detecting a position query request of a user for the wireless earphone, wherein the wireless earphone is used for realizing a sound input and output function after being connected with the mobile terminal;
the radar sensor is used for scanning at least one target object in the area where the mobile terminal is located to obtain a contour image of each target object;
the processor is further configured to compare the reference image of the wireless headset with the contour image of each target object, and acquire a contour image matched with the reference image; and determining location information of the wireless headset according to the matched contour image;
the processor is further used for generating a navigation route according to the position information and navigating according to the navigation route; wherein the navigation route is based on an indoor map navigation technology, and the processor generates the navigation route according to the position information, including: presenting an initial position of a current mobile terminal and a target position of a wireless earphone on an indoor map, and planning an optimal route from the initial position to the target position;
the processor is further configured to: determining that the wireless earphone is in an effective connection distance range according to the position information of the wireless earphone, and establishing wireless connection with the wireless earphone; and the wireless earphone is used for controlling the wireless earphone to play preset audio information at the maximum volume so as to prompt the user of the position, wherein the preset audio information is preset by the user.
2. The mobile terminal according to claim 1, wherein the radar sensor is further configured to scan, before the scanning of the at least one target object in the area where the mobile terminal is located, a plurality of target objects in the area according to a first scanning period to obtain a first contour image of each target object;
the processor is further configured to determine a volume of each target object according to the first contour image of each target object, and to screen, from the plurality of target objects, at least one target object whose volume is smaller than or equal to a preset volume, wherein the preset volume is larger than the volume of the wireless earphone;
in the aspect of scanning the at least one target object in the area where the mobile terminal is located to obtain the contour image of each target object, the radar sensor is specifically configured to scan the at least one target object according to a second scanning period to obtain the contour image of each target object, wherein the first scanning period is greater than the second scanning period.
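The coarse-to-fine screening of claim 2 (a first scan estimates each object's volume, and only objects no larger than a preset volume are rescanned at finer resolution) might look like the following sketch. The `Target` dataclass, the volume figures, and the threshold constants are all invented for illustration and are not values from the patent:

```python
from dataclasses import dataclass

EARBUD_VOLUME_CM3 = 5.0    # assumed volume of the wireless earphone
PRESET_VOLUME_CM3 = 20.0   # preset threshold, chosen larger than the earbud

@dataclass
class Target:
    ident: int
    volume_cm3: float      # estimated from the first (coarse) contour image

def filter_candidates(targets):
    """Keep only objects small enough to possibly be the earphone; large
    objects (furniture, people) are excluded before the fine second scan."""
    return [t for t in targets if t.volume_cm3 <= PRESET_VOLUME_CM3]

targets = [Target(0, 1200.0), Target(1, 4.8), Target(2, 350.0), Target(3, 18.0)]
candidates = filter_candidates(targets)   # only the two small objects remain
```

The point of the two-period design is that the expensive fine scan runs only on the few small candidates, not on everything in the room.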
3. The mobile terminal according to claim 1, wherein, in the determining of the position information of the wireless earphone according to the matched contour image, the processor is specifically configured to: acquire a position parameter at which the mobile terminal scans the matched contour image through the radar sensor; and determine the position information of the wireless earphone according to the position parameter and the position of the mobile terminal.
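Claim 3 derives the earphone's absolute position from the radar's position parameter combined with the terminal's own position. Assuming the position parameter is a range and azimuth in the terminal's frame (the patent does not specify the parameterization), the combination is a polar-to-Cartesian conversion plus a translation:

```python
import math

def headset_position(terminal_xy, range_m, azimuth_deg):
    """Absolute position = terminal position + radar-relative offset.
    terminal_xy: (x, y) of the mobile terminal in map coordinates.
    range_m, azimuth_deg: hypothetical radar position parameters."""
    a = math.radians(azimuth_deg)
    dx = range_m * math.cos(a)
    dy = range_m * math.sin(a)
    return (terminal_xy[0] + dx, terminal_xy[1] + dy)
```

A real radar would also report elevation and carry measurement uncertainty; this sketch keeps only the 2-D geometry that the claim's wording implies.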
4. A position detection method, applied to a mobile terminal comprising a radar sensor, the method comprising:
starting a radar sensor when a position query request of a user for a wireless earphone is detected, wherein the wireless earphone is used for realizing a sound input and output function after being connected with a mobile terminal;
scanning at least one target object in the area where the mobile terminal is located through the radar sensor to obtain a contour image of each target object;
comparing the reference image of the wireless earphone with the contour image of each target object to obtain a contour image matched with the reference image;
determining the position information of the wireless earphone according to the matched contour image;
generating a navigation route according to the position information; and
navigating according to the navigation route;
wherein generating the navigation route according to the position information comprises: presenting an initial position of the mobile terminal and a target position of the wireless earphone on an indoor map, and planning an optimal route from the initial position to the target position;
determining that the wireless earphone is in an effective connection distance range according to the position information of the wireless earphone, and establishing wireless connection with the wireless earphone;
and controlling the wireless earphone to play preset audio information at the maximum volume so as to prompt the user of the position, wherein the preset audio information is preset by the user.
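The route planning step of claim 4 (an optimal route from the terminal's initial position to the earphone's target position on an indoor map) could, under the assumption that the indoor map is an occupancy grid, be a breadth-first search. The grid representation below is a hypothetical stand-in for the patent's unspecified indoor map format:

```python
from collections import deque

def plan_route(grid, start, goal):
    """Shortest 4-connected path on an occupancy grid (0 = free, 1 = wall).
    Returns the list of cells from start to goal, or None if unreachable."""
    rows, cols = len(grid), len(grid[0])
    prev = {start: None}          # visited set doubling as back-pointers
    q = deque([start])
    while q:
        r, c = q.popleft()
        if (r, c) == goal:        # reconstruct the path by walking back
            path, node = [], goal
            while node is not None:
                path.append(node)
                node = prev[node]
            return path[::-1]
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols \
                    and grid[nr][nc] == 0 and (nr, nc) not in prev:
                prev[(nr, nc)] = (r, c)
                q.append((nr, nc))
    return None                   # no route exists
```

Breadth-first search guarantees the shortest path in step count on a uniform grid; a weighted map (e.g. penalizing stairs) would call for Dijkstra or A* instead.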
5. The method according to claim 4, wherein before scanning at least one target object in an area where the mobile terminal is located by the radar sensor to obtain a contour image of each target object, the method further comprises:
scanning a plurality of target objects in the area where the mobile terminal is located through the radar sensor according to a first scanning period to obtain a first contour image of each target object;
determining the volume of each target object according to the first contour image of each target object;
screening at least one target object with the volume smaller than or equal to a preset volume from the plurality of target objects, wherein the preset volume is larger than the volume of the wireless earphone;
wherein the scanning, by the radar sensor, of at least one target object in the area where the mobile terminal is located to obtain a contour image of each target object comprises:
scanning the at least one target object through the radar sensor according to a second scanning period to obtain the contour image of each target object, wherein the first scanning period is greater than the second scanning period.
6. The method according to claim 4, wherein determining the position information of the wireless earphone according to the matched contour image comprises:
acquiring a position parameter at which the mobile terminal scans the matched contour image through the radar sensor; and
determining the position information of the wireless earphone according to the position parameter and the position of the mobile terminal.
7. A position detection device, applied to a mobile terminal comprising a radar sensor, the position detection device comprising a starting unit, a scanning unit, a comparison unit, a determining unit, a generating unit, a navigation unit, an establishing unit, and a control unit, wherein
the starting unit is used for starting the radar sensor when detecting a position query request of a user for the wireless earphone, wherein the wireless earphone is used for realizing a sound input and output function after being connected with the mobile terminal;
the scanning unit is used for scanning at least one target object in the area where the mobile terminal is located through the radar sensor to obtain a contour image of each target object;
the comparison unit is used for comparing the reference image of the wireless earphone with the contour image of each target object to obtain a contour image matched with the reference image;
the determining unit is used for determining the position information of the wireless earphone according to the matched contour image;
the generating unit is used for generating a navigation route according to the position information after the determining unit determines the position information of the wireless earphone according to the matched contour image; wherein the navigation route is based on an indoor map navigation technology, and generating the navigation route according to the position information comprises: presenting an initial position of the mobile terminal and a target position of the wireless earphone on an indoor map, and planning an optimal route from the initial position to the target position;
the navigation unit is used for navigating according to the navigation route;
the establishing unit is used for determining that the wireless earphone is in an effective connection distance range according to the position information of the wireless earphone and establishing wireless connection with the wireless earphone;
the control unit is used for controlling the wireless earphone to play preset audio information at the maximum volume so as to prompt the user of the position, and the preset audio information is preset by the user.
8. A mobile terminal comprising a processor, a memory, a communication interface, and one or more programs stored in the memory and configured for execution by the processor, the programs comprising instructions for performing the steps in the method of any one of claims 4 to 6.
9. A computer-readable storage medium storing a computer program for electronic data exchange, wherein the computer program causes a computer to perform the method according to any one of claims 4 to 6.
CN201810361907.9A 2018-04-20 2018-04-20 Mobile terminal, position detection method and related product Expired - Fee Related CN108614263B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810361907.9A CN108614263B (en) 2018-04-20 2018-04-20 Mobile terminal, position detection method and related product


Publications (2)

Publication Number Publication Date
CN108614263A CN108614263A (en) 2018-10-02
CN108614263B true CN108614263B (en) 2020-01-14

Family

ID=63660566

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810361907.9A Expired - Fee Related CN108614263B (en) 2018-04-20 2018-04-20 Mobile terminal, position detection method and related product

Country Status (1)

Country Link
CN (1) CN108614263B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109799501A (en) * 2018-12-17 2019-05-24 珠海格力电器股份有限公司 Monitoring method and device of monitoring equipment, storage medium and monitoring equipment
CN115065746B (en) * 2020-05-31 2024-05-31 深圳市睿耳电子有限公司 Wireless headset intelligent retrieving method, related device, storage medium and program product
CN117291979B (en) * 2023-09-26 2024-04-26 北京鹰之眼智能健康科技有限公司 Ear hole positioning method, electronic equipment and storage medium

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105827793A (en) * 2015-05-29 2016-08-03 维沃移动通信有限公司 Voice directional output method and mobile terminal
CN107908275A (en) * 2017-11-30 2018-04-13 北京小米移动软件有限公司 Control method, mobile terminal and the storage medium of mobile terminal

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN202995037U (en) * 2012-09-24 2013-06-12 颜慧 Vehicular wireless underground obstacle detector and construction machine
CN105588545A (en) * 2015-12-31 2016-05-18 歌尔科技有限公司 Multi-target positioning method and system
CN105913034B (en) * 2016-04-18 2020-01-03 智车优行科技(北京)有限公司 Vehicle identification method and device and vehicle
CN107730534B (en) * 2016-08-09 2020-10-23 深圳光启合众科技有限公司 Target object tracking method and device
CN106249239B (en) * 2016-08-23 2019-01-01 深圳市速腾聚创科技有限公司 Object detection method and device
CN206378590U (en) * 2017-01-10 2017-08-04 深圳市华儒科技有限公司 A kind of object monitoring device




Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20200114