CN110198372B - Method for determining telescopic state of camera shooting assembly, readable storage medium and related equipment

Method for determining telescopic state of camera shooting assembly, readable storage medium and related equipment

Info

Publication number
CN110198372B
Authority
CN
China
Prior art keywords
assembly, camera, telescopic, driving, camera shooting
Legal status
Active
Application number
CN201910473489.7A
Other languages
Chinese (zh)
Other versions
CN110198372A (en)
Inventor
邓旭同
王朝
辛强
马超
徐雷
陈尧
Current Assignee
Honor Device Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Application filed by Huawei Technologies Co Ltd
Priority to CN201910473489.7A
Publication of CN110198372A
Application granted
Publication of CN110198372B


Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M1/00 Substation equipment, e.g. for use by subscribers
    • H04M1/02 Constructional features of telephone sets
    • H04M1/0202 Portable telephone sets, e.g. cordless phones, mobile phones or bar type handsets
    • H04M1/026 Details of the structure or mounting of specific components
    • H04M1/0264 Details of the structure or mounting of specific components for a camera module assembly
    • H04M1/72 Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724 User interfaces specially adapted for cordless or mobile telephones
    • H04M1/72448 User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions
    • H04M1/72451 User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions according to schedules, e.g. using calendar applications
    • H04M1/72454 User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions according to context-related or environment-related conditions

Abstract

The application discloses a method for determining the telescopic state of a camera assembly, and an electronic device, relating to the field of machinery. The method improves the stability and the accuracy of determining the telescopic state of the camera assembly by simple means while saving cost. The telescopic state of the camera assembly is determined by analyzing the ambient noise collected by built-in microphones (MICs) at two different positions of the electronic device. Because the telescopic state can be determined accurately using the existing hardware configuration of the electronic device, without adding a new component to it, cost is saved. In addition, the method is simple to implement, relatively insensitive to the external environment, and stable in performance, so the determined telescopic state is highly reliable.

Description

Method for determining telescopic state of camera shooting assembly, readable storage medium and related equipment
Technical Field
The embodiments of the application relate to the field of machinery, and in particular to a method for determining the telescopic state of a camera assembly and to an electronic device including a camera assembly.
Background
Nowadays, as consumers demand ever higher screen-to-body ratios for electronic devices (e.g., smartphones, tablet computers, netbooks, etc.), full screens have become a clear trend. The front camera has become the biggest obstacle to realizing a true full screen.
To truly realize a full screen, the prior art adopts a pop-up (lifting) camera. When the user needs the camera, a driving assembly pops it out (as shown at 1A in Figure 1); when the camera is not in use, it is retracted (as shown at 1B in Figure 1) so that it does not occupy screen space. To control the pop-up camera properly, the electronic device needs to determine the telescopic state of the camera while the driving assembly operates, including whether it extends or retracts normally, whether extension succeeds, whether retraction succeeds, whether the drive is stalled (locked rotor), and so on.
An existing method determines the telescopic state of the camera by detecting magnetic flux. However, when the magnetic environment is harsh or disturbed by an external magnetic field, it is difficult to determine the telescopic state accurately. In addition, the calibration threshold of the magnetic field is hard to set because the magnetic environment is unstable. Furthermore, a dedicated magnetic-flux sensor must be added to the electronic device, which increases cost.
Therefore, a method for determining the telescopic state of the camera is needed that has strong anti-interference capability, simple operation, and highly reliable results.
Disclosure of Invention
The embodiments of the application provide a method for determining the telescopic state of a camera assembly, and an electronic device.
In order to achieve the above purpose, the embodiment of the present application adopts the following technical solutions:
In a first aspect, a method for determining the telescopic state of a camera assembly is provided and applied to an electronic device. The electronic device includes a housing, a camera apparatus, a first MIC, and a second MIC; the electronic device is provided with an accommodating cavity, and the housing is provided with an opening. The camera apparatus includes a driving assembly and a camera assembly; the camera assembly is accommodated in the accommodating cavity, and the driving assembly is used to drive the camera assembly to extend out of the housing through the opening or to retract into the accommodating cavity through the opening. The method includes: in response to a first operation by a user, the electronic device starts the driving assembly; while the driving assembly operates, the electronic device collects a first ambient noise N1 through the first MIC and a second ambient noise N2 through the second MIC; and the electronic device determines the telescopic state of the camera assembly according to N1 and N2.
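As a rough illustration of this flow, the sketch below starts the driver, samples both MICs while it runs, and forwards the MIC difference for threshold classification. Every helper name here (start_drive_assembly, sample_mic_db, report_drive_noise) is an assumption made for the sketch, not an API defined by the patent.

```c
/* Minimal sketch of the first-aspect flow, assuming hypothetical platform helpers. */
#include <stdbool.h>

extern void   start_drive_assembly(bool extend);      /* sends enable + direction signal      */
extern bool   drive_assembly_running(void);
extern double sample_mic_db(int mic_id);               /* averaged sound level, in dB          */
extern void   report_drive_noise(double n_db);         /* feeds the threshold classification   */

#define MIC_BOTTOM       1   /* first MIC, far from the camera apparatus  */
#define MIC_NEAR_CAMERA  2   /* second MIC, close to the camera apparatus */

void monitor_telescoping(bool extend)
{
    start_drive_assembly(extend);                      /* in response to the first operation */

    while (drive_assembly_running()) {
        double n1 = sample_mic_db(MIC_BOTTOM);         /* first ambient noise N1  */
        double n2 = sample_mic_db(MIC_NEAR_CAMERA);    /* second ambient noise N2 */

        /* The external ambient component is assumed equal at both MICs, so the
         * difference approximates the noise produced by the driving assembly. */
        report_drive_noise(n2 - n1);
    }
}
```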
According to the technical solution of the first aspect, the telescopic state of the camera assembly is determined by analyzing the ambient noise collected by MICs at two different positions in the electronic device. The telescopic state can be determined accurately using the existing hardware configuration of the electronic device, without building a new component into it. Cost is saved while the accuracy of determining the telescopic state is improved. Moreover, the method is stable and simple to operate.
In one possible implementation, starting the driving assembly by the electronic device includes: the electronic device sends an enable signal and a direction signal to the driving assembly, where the enable signal triggers the driving assembly to start and the direction signal indicates the direction in which the driving assembly drives the camera assembly to move. With the enable signal and the direction signal, the electronic device can accurately drive the camera assembly in the corresponding direction.
In one possible implementation, the distance between the first MIC and the camera apparatus is greater than the distance between the second MIC and the camera apparatus. By comparing the ambient noise collected by two MICs located at different distances from the camera apparatus, the noise produced by the driving assembly while it operates can be determined directly, and the telescopic state of the camera assembly can then be determined.
In one possible implementation, the camera assembly has multiple telescopic states, the electronic device stores a noise threshold corresponding to each telescopic state, and the noise thresholds of different telescopic states differ. Determining the telescopic state of the camera assembly according to N1 and N2 includes: the electronic device obtains the difference N between N1 and N2, and compares N with the noise threshold of each telescopic state to determine the telescopic state of the camera assembly (a sketch of this comparison follows this paragraph). The telescopic state of the camera assembly includes at least any one of the following: start of extension/start of retraction, normal extension/normal retraction, stall (locked rotor) during extension/retraction, and end of extension/end of retraction. By comparing the noise produced while the driving assembly operates with the noise threshold of each telescopic state, the telescopic state of the camera assembly can be determined accurately.
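A minimal sketch of this comparison, assuming the per-state thresholds are stored as dB ranges; the type and function names (telescopic_state_t, noise_threshold_t, classify_telescopic_state) are illustrative assumptions rather than names defined by the patent.

```c
/* Hedged sketch: compare the MIC difference N (in dB) with the stored
 * noise threshold of each telescopic state. */
#include <stddef.h>

typedef enum {
    TS_START,          /* start of extension / start of retraction */
    TS_NORMAL,         /* normal extension / normal retraction     */
    TS_LOCKED_ROTOR,   /* stall during extension / retraction      */
    TS_FINISHED,       /* end of extension / end of retraction     */
    TS_UNKNOWN
} telescopic_state_t;

typedef struct {
    telescopic_state_t state;
    double low_db;     /* lower limit of this state's noise threshold */
    double high_db;    /* upper limit of this state's noise threshold */
} noise_threshold_t;

telescopic_state_t classify_telescopic_state(double n_db,
                                             const noise_threshold_t *thresholds,
                                             size_t count)
{
    for (size_t i = 0; i < count; i++) {
        if (n_db >= thresholds[i].low_db && n_db <= thresholds[i].high_db)
            return thresholds[i].state;
    }
    return TS_UNKNOWN;   /* N matched no stored threshold */
}
```

A single-value threshold, such as the locked-rotor threshold, can be represented in the same table as a degenerate or open-ended range.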
In one possible implementation, the method further includes: in response to the driving assembly starting, the electronic device starts a timer and records the working time of the driving assembly. By recording the working time, it can be taken into account when analyzing the telescopic state of the camera assembly, which improves the accuracy of the result.
In one possible implementation, comparing N with the noise threshold corresponding to each telescopic state to determine the telescopic state of the camera assembly includes: if N is greater than the lower limit T_working1 of the normal telescoping noise threshold and less than the upper limit T_working2 of the normal telescoping noise threshold, the electronic device determines that the camera assembly is telescoping normally; and if the telescoping time t_working of the camera assembly satisfies:

t_working ≥ l / v_working

the electronic device determines that the telescopic state of the camera assembly is end of telescoping; where l is the distance the camera assembly needs to travel to complete extension or retraction, v_working is the normal telescoping speed of the camera assembly, t_working is recorded by the timer, and T_working1 < T_working2. By judging from the recorded working time of the driving assembly whether the preset maximum telescoping distance has been covered, whether telescoping has ended can be further determined, which improves accuracy.
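Under the assumption that the end-of-telescoping condition is the time check reconstructed above, a minimal sketch might look as follows; the parameter names and the timer helper are illustrative assumptions.

```c
/* Hedged sketch of the reconstructed end-of-telescoping test: the recorded
 * working time must reach the time needed to cover the full travel distance. */
extern double drive_timer_elapsed_s(void);   /* hypothetical: seconds since the driver was enabled */

int telescoping_finished(double l_mm, double v_working_mm_s)
{
    /* Reconstructed condition: t_working >= l / v_working */
    return drive_timer_elapsed_s() >= l_mm / v_working_mm_s;
}
```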
In one possible implementation, the method further includes: the electronic device records telescoping information of the camera assembly, where the telescoping information includes the telescopic state information and the time information corresponding to each telescopic state. This recorded information can be used when analyzing the telescopic state and the specific telescopic position of the camera assembly, improving the accuracy of the result.
In one possible implementation, after the electronic device determines the telescopic state of the camera assembly according to N1 and N2, the method further includes: the electronic device determines the telescopic position information of the camera assembly according to the recorded telescoping information. By combining the telescopic state information with the time information of each telescopic state, the telescopic position of the camera assembly can be located accurately, as in the sketch below.
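One plausible way to combine the recorded state and time information into a position estimate is sketched here; the record layout and the assumption that the assembly moves only during normal telescoping, at the speed v_working, are illustrative rather than requirements of the patent.

```c
/* Hedged sketch: estimate the telescopic position (distance extended, in mm)
 * from recorded (state, duration) entries. */
#include <stddef.h>

typedef struct {
    int    normal_telescoping;  /* 1 while the recorded state was "normal telescoping"       */
    int    extending;           /* 1 = extending, 0 = retracting (from the direction signal) */
    double duration_s;          /* time spent in this recorded state                         */
} telescope_record_t;

double estimate_position_mm(const telescope_record_t *log, size_t n,
                            double v_working_mm_s, double travel_mm)
{
    double pos = 0.0;
    for (size_t i = 0; i < n; i++) {
        if (!log[i].normal_telescoping)
            continue;                             /* assume no movement outside normal telescoping */
        double d = v_working_mm_s * log[i].duration_s;
        pos += log[i].extending ? d : -d;
    }
    if (pos < 0.0) pos = 0.0;                     /* clamp to the physical travel range */
    if (pos > travel_mm) pos = travel_mm;
    return pos;
}
```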
In one possible implementation, the method further includes: if the driving assembly does not start or does not drive the camera assembly, and at least the following condition is met, it is determined that the driving assembly has lost power: the duration of the last enable of the camera assembly recorded in the telescoping information is less than t_working. By checking this condition, a failed extension or retraction caused by a power loss can be located accurately, so that the electronic device can keep an accurate view of its own condition.
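A brief sketch of this power-loss check, assuming the last enable duration can be read back from the recorded telescoping information; the structure and field names are illustrative.

```c
/* Hedged sketch: flag an abnormal power-down of the driving assembly. */
typedef struct {
    int    drive_started;          /* did the driver start this cycle?                */
    int    drive_moved_camera;     /* did it actually drive the camera assembly?      */
    double last_enable_duration_s; /* last enable duration from the telescoping log   */
} drive_status_t;

int drive_assembly_powered_down(const drive_status_t *s, double t_working_s)
{
    /* Power-down is assumed if the driver did not start (or never moved the
     * camera assembly) and the last recorded enable lasted less than t_working. */
    return (!s->drive_started || !s->drive_moved_camera)
        && s->last_enable_duration_s < t_working_s;
}
```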
In a second aspect, an electronic device is provided. The electronic device includes a housing, a camera apparatus, a first MIC, and a second MIC; the electronic device is provided with an accommodating cavity, and the housing is provided with an opening. The camera apparatus includes a driving assembly and a camera assembly; the camera assembly is accommodated in the accommodating cavity, and the driving assembly is used to drive the camera assembly to extend out of the housing through the opening or to retract into the accommodating cavity through the opening. The electronic device further includes: a processor, configured to start the driving assembly in response to a first operation by a user; the first MIC, configured to collect a first ambient noise N1 while the driving assembly operates; and the second MIC, configured to collect a second ambient noise N2 while the driving assembly operates. The processor is further configured to determine the telescopic state of the camera assembly according to N1 and N2.
According to the technical solution of the second aspect, the telescopic state of the camera assembly is determined by analyzing the ambient noise collected by MICs at two different positions in the electronic device. The telescopic state can be determined accurately using the existing hardware configuration of the electronic device, without building a new component into it. Cost is saved while the accuracy of determining the telescopic state is improved. Moreover, the method is stable and simple to operate.
In one possible implementation, starting the driving assembly by the processor includes: the processor sends an enable signal and a direction signal to the driving assembly, where the enable signal triggers the driving assembly to start and the direction signal indicates the direction in which the driving assembly drives the camera assembly to move. With the enable signal and the direction signal, the electronic device can accurately drive the camera assembly in the corresponding direction.
In one possible implementation, the distance between the first MIC and the camera apparatus is greater than the distance between the second MIC and the camera apparatus. By comparing the ambient noise collected by two MICs located at different distances from the camera apparatus, the noise produced by the driving assembly while it operates can be determined directly, and the telescopic state of the camera assembly can then be determined.
In one possible implementation, the camera assembly has multiple telescopic states, and the electronic device further includes a memory configured to store a noise threshold corresponding to each telescopic state, where the noise thresholds of different telescopic states differ. Determining the telescopic state of the camera assembly by the processor according to N1 and N2 includes: the processor obtains the difference N between N1 and N2, and compares N with the noise threshold of each telescopic state to determine the telescopic state of the camera assembly. The telescopic state of the camera assembly includes at least any one of the following: start of extension/start of retraction, normal extension/normal retraction, stall (locked rotor) during extension/retraction, and end of extension/end of retraction. By comparing the noise produced while the driving assembly operates with the noise threshold of each telescopic state, the telescopic state of the camera assembly can be determined accurately.
In one possible implementation, the electronic device further includes a timer configured to record the working time of the driving assembly in response to the driving assembly starting. By recording the working time, it can be taken into account when analyzing the telescopic state of the camera assembly, which improves the accuracy of the result.
In one possible implementation, comparing N with the noise threshold corresponding to each telescopic state by the processor to determine the telescopic state of the camera assembly includes: if N is greater than the lower limit T_working1 of the normal telescoping noise threshold and less than the upper limit T_working2 of the normal telescoping noise threshold, the processor determines that the camera assembly is telescoping normally; and if the telescoping time t_working of the camera assembly satisfies:

t_working ≥ l / v_working

the processor determines that the telescopic state of the camera assembly is end of telescoping; where l is the distance the camera assembly needs to travel to complete extension or retraction, v_working is the normal telescoping speed of the camera assembly, t_working is recorded by the timer, and T_working1 < T_working2. By judging from the recorded working time of the driving assembly whether the preset maximum telescoping distance has been covered, whether telescoping has ended can be further determined, which improves accuracy.
In one possible implementation, the memory is further configured to record telescoping information of the camera assembly, where the telescoping information includes the telescopic state information and the time information corresponding to each telescopic state. This recorded information can be used when analyzing the telescopic state and the specific telescopic position of the camera assembly, improving the accuracy of the result.
In one possible implementation, the processor is further configured to determine the telescopic position information of the camera assembly according to the recorded telescoping information after determining the telescopic state of the camera assembly according to N1 and N2. By combining the telescopic state information with the time information of each telescopic state, the telescopic position of the camera assembly can be located accurately.
In one possible implementation, the processor is further configured to determine that the driving assembly has lost power if the driving assembly does not start or does not drive the camera assembly, and at least the following condition is met: the duration of the last enable of the camera assembly recorded in the telescoping information is less than t_working. By checking this condition, a failed extension or retraction caused by a power loss can be located accurately, so that the electronic device can keep an accurate view of its own condition.
In a third aspect, a camera-assembly driving apparatus is provided, including a driving assembly, a camera assembly, and an accommodating cavity, where the camera assembly is accommodated in the accommodating cavity and the driving assembly is used to drive the camera assembly to extend out of the accommodating cavity or to retract into it. The camera-assembly driving apparatus further includes: a storage unit, configured to store computer program code, the computer program code including instructions; and a processing unit, configured to execute the instructions to implement the method for determining the telescopic state of a camera assembly according to any possible implementation of the first aspect.
In a fourth aspect, a computer-readable storage medium is provided, which stores computer-executable instructions. When executed by a processor, the computer-executable instructions implement the method for determining the telescopic state of a camera assembly according to any possible implementation of the first aspect.
In a fifth aspect, a chip system is provided, which includes a processor, a memory, and instructions stored in the memory. When executed by the processor, the instructions implement the method for determining the telescopic state of a camera assembly according to any possible implementation of the first aspect. The chip system may consist of a chip, or may include a chip together with other discrete devices.
Drawings
FIG. 1 is a schematic diagram of a mobile phone including a lift camera assembly;
fig. 2 is a schematic structural diagram of hardware of an electronic device including a lift-type camera assembly according to an embodiment of the present disclosure;
FIG. 3 is a schematic structural diagram of two driving assemblies provided in the present application;
fig. 4 is a schematic structural diagram of a mobile phone including a lift-type camera module according to an embodiment of the present disclosure;
fig. 5 is a flowchart of a method for determining a telescopic state of a camera module according to an embodiment of the present disclosure;
fig. 6 is a schematic diagram of an operating circuit of a driving assembly according to an embodiment of the present disclosure;
FIG. 7 is a schematic diagram illustrating operational noise of a driving assembly according to an embodiment of the present disclosure;
fig. 8 is an exemplary diagram for determining a telescopic state of a camera module according to a working noise of a driving module according to an embodiment of the present application;
fig. 9 is a flowchart of a method for determining a telescopic position of a camera module according to an embodiment of the present disclosure;
fig. 10 is a first schematic diagram of a hardware structure of a mobile phone according to an embodiment of the present disclosure;
fig. 11 is a second schematic diagram of a hardware structure of a mobile phone according to an embodiment of the present application.
Detailed Description
The embodiments of the application provide a method for determining the telescopic state of a camera assembly, and an electronic device. Specifically, in the method provided by the embodiments of the application, the telescopic state of the camera assembly is determined by analyzing the ambient noise collected by MICs at two different positions of the electronic device.
As shown in fig. 2, the electronic device in the embodiments of the present application includes a housing 109, a camera apparatus 110, a first MIC 101, and a second MIC 102. The camera apparatus 110 includes a driving assembly 104 and a camera assembly 103. When the camera assembly 103 is not in use, it is accommodated in an accommodating cavity 111 provided in the electronic device, as shown at 1B in fig. 1. When the camera assembly 103 is needed, the driving assembly 104 drives it to extend through the opening 108 in the housing, as shown at 1A in fig. 1. After the camera apparatus 110 is no longer needed, the driving assembly 104 drives the camera assembly 103 back into the accommodating cavity 111 through the opening 108, as shown at 1B in fig. 1.
It should be noted that the camera assembly 103 in the embodiment of the present application may be understood as a camera, such as a front camera. In some embodiments, the camera assembly 103 in the embodiments of the present application may further include a rear camera, for example, the camera assembly 103 includes a front camera and a rear camera respectively disposed on two opposite planes. In other embodiments, the camera assembly 103 in the embodiment of the present application may further include a rotatable camera, and the camera may freely rotate according to the shooting requirement.
In addition, the first MIC101 and the second MIC102 described above are disposed at different positions of the electronic apparatus.
In some embodiments, the electronic device has two MICs. One is used to pick up the user's voice clearly during calls and can therefore be placed at the bottom end of the device. The other is used to cancel ambient noise: it collects the ambient noise so that a sound wave opposite to it can be emitted, achieving noise cancellation; this MIC can be placed near the camera assembly. For example, the smartphones C8800 and C8650 each have dual built-in MICs for noise suppression, so that high-quality calls can be maintained even when the user talks in a noisy city. Herein, the MIC at the bottom end of the electronic device may be understood as the first MIC 101 of the embodiments of the application, and the MIC near the camera assembly may be understood as the second MIC 102. As shown in fig. 2, the first MIC 101 is built into the bottom of the electronic device 100, and the second MIC 102 is built in to the left of the camera assembly 103 of the electronic device 100. Therefore, the first MIC 101 and the second MIC 102 are at different distances from the camera assembly; more specifically, the distance between the first MIC 101 and the camera apparatus 110 is greater than the distance between the second MIC 102 and the camera assembly 103.
It should be noted that the driving assembly in the embodiment of the present application may be a single driving assembly, or may be a dual driving assembly.
As shown in fig. 2, the driving assembly 104 is a right driving assembly. The driving assembly 104 includes a camera assembly bracket 105, and a drive bracket 106 and a drive motor 107 disposed to the right of the front camera assembly 103 and the camera assembly bracket 105. The driving assembly can also be a left driving assembly. As shown in fig. 3A, the driving assembly 104 then includes a camera assembly bracket 105, and a drive bracket 106 and a drive motor 107 disposed to the left of the front camera assembly 103 and the camera assembly bracket 105. Here, "left" and "right" refer to the position of the driving assembly relative to the front camera assembly.
Fig. 3B shows a dual driving assembly. The driving assembly 104 includes a camera assembly bracket 105, a drive bracket 106-1 and a drive motor 107-1 disposed to the left of the front camera assembly 103 and the camera assembly bracket 105, and a drive bracket 106-2 and a drive motor 107-2 disposed to the right of the front camera assembly 103 and the camera assembly bracket 105.
It should be noted that fig. 2 illustrates the structure of the electronic device 100 only as an example, including two possible MIC positions, the opening 108, the camera apparatus 110, the accommodating cavity 111, and the like. In fact, the two MICs may be built into any other positions of the electronic device 100, and the camera apparatus 110 and the like may likewise be placed elsewhere in the electronic device 100. For example, the opening 108 may be provided in the left side of the housing, and the driving assembly 104 can then drive the camera assembly 103 to extend through that opening.
In addition, the driving assembly 104 in figs. 2 and 3 is merely an example and does not limit the possible structure of the driving assembly 104, including the specific structure of the camera assembly bracket 105, the drive bracket (106, 106-1, or 106-2), and the drive motor (107, 107-1, or 107-2), or the relative positions and couplings among them. The embodiments of the present application also do not limit how the drive motor (107, 107-1, or 107-2) is driven; for example, it may be driven by an electric motor, an ultrasonic motor, magnetic induction, or the like.
The electronic device in the embodiment of the present application may be a smart phone as shown in fig. 1, and may also be other desktop, laptop, and handheld devices, such as a tablet computer, a netbook, a Personal Digital Assistant (PDA), a wearable device (e.g., a smart watch), a Portable Multimedia Player (PMP), a dedicated media Player, an AR (augmented reality)/VR (virtual reality) device, and the like. The embodiment of the present application does not limit the specific type and structure of the electronic device.
Fig. 4 is a diagram illustrating a hardware structure of a mobile phone according to an embodiment of the present application. As shown in fig. 4, the mobile phone 100 may include a processor 410, an external memory interface 420, an internal memory 421, a Universal Serial Bus (USB) interface 430, a charging management module 440, a power management module 441, a battery 442, an antenna 1, an antenna 2, a mobile communication module 450, a wireless communication module 460, an audio module 470, a speaker 470A, a receiver 470B, a microphone 470C, a sensor module 480, a button 490, a motor 491, an indicator 492, a camera 493, a display 494, a Subscriber Identification Module (SIM) card interface 495, and the like. Wherein the sensor module 480 may include a pressure sensor, a gyroscope sensor, an air pressure sensor, a magnetic sensor, an acceleration sensor, a distance sensor, a proximity light sensor, a fingerprint sensor, a temperature sensor, a touch sensor, an ambient light sensor, a bone conduction sensor, etc.
It is to be understood that the illustrated structure of the embodiment of the present invention does not specifically limit the mobile phone 100. In other embodiments of the present application, the handset 100 may include more or fewer components than shown, or some components may be combined, some components may be separated, or a different arrangement of components may be used. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
Processor 410 may include one or more processors. For example: the processor 410 may include an Application Processor (AP), a modem processor, a Graphics Processor (GPU), an Image Signal Processor (ISP), a flight controller, a video codec, a Digital Signal Processor (DSP), a baseband processor, and/or a neural-Network Processing Unit (NPU), among others. The different processors may be separate devices or may be integrated into one or more processors.
A memory may also be provided in processor 410 for storing instructions and data. In some embodiments, the memory in the processor 410 is a cache memory. The memory may hold instructions or data that have just been used or recycled by the processor 410. If the processor 410 needs to use the instruction or data again, it can be called directly from the memory. Avoiding repeated accesses reduces the latency of the processor 410, thereby increasing the efficiency of the system.
In some embodiments, processor 410 may include one or more interfaces. The interface may include an integrated circuit (I2C) interface, an integrated circuit built-in audio (I2S) interface, a Pulse Code Modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a Mobile Industry Processor Interface (MIPI), a general-purpose input/output (GPIO) interface, a Subscriber Identity Module (SIM) interface, and/or a Universal Serial Bus (USB) interface, etc.
The USB interface 430 is an interface conforming to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type C interface, or the like. The USB interface 430 may be used to connect a charger to charge the mobile phone 100, and may also be used to transmit data between the mobile phone 100 and peripheral devices. But also for connecting other electronic devices, such as AR devices, etc.
It should be understood that the connection relationship between the modules according to the embodiment of the present invention is only an exemplary illustration, and does not limit the structure of the mobile phone 100.
The charging management module 440 is configured to receive charging input from a charger. The charger may be a wireless charger or a wired charger. In some wired charging embodiments, the charging management module 440 may receive charging input from a wired charger via the USB interface 430. In some wireless charging embodiments, the charging management module 440 may receive a wireless charging input through a wireless charging coil of the cell phone 100. While the charging management module 440 charges the battery 442, the power management module 441 may also supply power to the electronic device.
The power management module 441 is used to connect the battery 442, the charging management module 440 and the processor 410. The power management module 441 receives input from the battery 442 and/or the charging management module 440 and provides power to the processor 410, the internal memory 421, the display screen 494, the camera 493, the wireless communication module 460, and the like. The power management module 441 may also be used to monitor parameters such as battery capacity, battery cycle number, battery state of health (leakage, impedance), etc. In some other embodiments, the power management module 441 may be disposed in the processor 410. In other embodiments, the power management module 441 and the charging management module 440 may be disposed in the same device.
The wireless communication function of the mobile phone 100 can be implemented by the antenna 1, the antenna 2, the mobile communication module 450, the wireless communication module 460, the modem processor, the baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the handset 100 may be used to cover a single or multiple communications bands. Different antennas can also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed as a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 450 may provide a solution including 2G/3G/4G/5G wireless communication applied to the handset 100. The mobile communication module 450 may include at least one filter, a switch, a power amplifier, a Low Noise Amplifier (LNA), and the like. The mobile communication module 450 may receive the electromagnetic wave from the antenna 1, and filter, amplify, etc. the received electromagnetic wave, and transmit the electromagnetic wave to the modem processor for demodulation. The mobile communication module 450 can also amplify the signal modulated by the modem processor, and convert the signal into electromagnetic wave through the antenna 1 to radiate the electromagnetic wave. In some embodiments, at least some of the functional modules of the mobile communication module 450 may be disposed in the processor 410. In some embodiments, at least some of the functional blocks of the mobile communication module 450 may be disposed in the same device as at least some of the blocks of the processor 410.
The modem processor may include a modulator and a demodulator. The modulator is used for modulating a low-frequency baseband signal to be transmitted into a medium-high frequency signal. The demodulator is used for demodulating the received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then passes the demodulated low frequency baseband signal to a baseband processor for processing. The low frequency baseband signal is processed by the baseband processor and then transferred to the application processor. The application processor outputs sound signals through an audio device (not limited to the speaker 470A, the receiver 470B, etc.) or displays images or video through the display screen 494. In some embodiments, the modem processor may be a stand-alone device. In other embodiments, the modem processor may be separate from the processor 410, and may be located in the same device as the mobile communication module 450 or other functional modules.
The wireless communication module 460 may provide solutions for wireless communication applied to the mobile phone 100, including Wireless Local Area Networks (WLANs) (e.g., WiFi networks), bluetooth BT, Global Navigation Satellite Systems (GNSS), Frequency Modulation (FM), Near Field Communication (NFC), Infrared (IR), and the like. The wireless communication module 460 may be one or more devices integrating at least one communication processing module. The wireless communication module 460 receives electromagnetic waves via the antenna 2, performs frequency modulation and filtering processing on electromagnetic wave signals, and transmits the processed signals to the processor 410. The wireless communication module 460 may also receive a signal to be transmitted from the processor 410, perform frequency modulation and amplification on the signal, and convert the signal into electromagnetic waves through the antenna 2 to radiate the electromagnetic waves.
In some embodiments, the antenna 1 of the handset 100 is coupled to the mobile communication module 450 and the antenna 2 is coupled to the wireless communication module 460, such that the handset 100 can communicate with networks and other devices through wireless communication techniques. The wireless communication technology may include global system for mobile communications (GSM), General Packet Radio Service (GPRS), Code Division Multiple Access (CDMA), Wideband Code Division Multiple Access (WCDMA), time-division code division multiple access (time-division code division multiple access, TD-SCDMA), Long Term Evolution (LTE), LTE, BT, GNSS, WLAN, NFC, FM, and/or IR technologies, etc. The GNSS may include a Global Positioning System (GPS), a global navigation satellite system (GLONASS), a beidou satellite navigation system (BDS), a quasi-zenith satellite system (QZSS), and/or a Satellite Based Augmentation System (SBAS).
The mobile phone 100 implements a display function through the GPU, the display screen 494, and the application processor. The GPU is an image processing microprocessor connected to a display screen 494 and an application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. Processor 410 may include one or more GPUs that execute program instructions to generate or alter display information.
The display screen 494 is used to display images, videos, and the like. The display screen 494 includes a display panel. The display panel may be a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-OLED, a quantum dot light-emitting diode (QLED), or the like. In some embodiments, cell phone 100 may include 1 or N display screens 494, N being a positive integer greater than 1.
The mobile phone 100 may implement a shooting function through the ISP, the camera 493, the video codec, the GPU, the display screen 494, the application processor, and the like.
The ISP is used to process the data fed back by the camera 493. For example, when a photo is taken, the shutter is opened, light is transmitted to the photographic assembly photosensitive element through the lens, the optical signal is converted into an electric signal, and the photographic assembly photosensitive element transmits the electric signal to the ISP for processing and converting into an image visible to naked eyes. The ISP can also carry out algorithm optimization on the noise, brightness and skin color of the image. The ISP can also optimize parameters such as exposure, color temperature and the like of a shooting scene. In some embodiments, the ISP may be provided in camera 493.
The camera 493 is used to capture still images or video. The object generates an optical image through the lens and projects the optical image to the photosensitive element. The photosensitive element may be a Charge Coupled Device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor. The light sensing element converts the optical signal into an electrical signal, which is then passed to the ISP where it is converted into a digital image signal. And the ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into image signal in standard RGB, YUV and other formats. In some embodiments, the handset 100 may include 1 or N cameras 493, where N is a positive integer greater than 1.
The digital signal processor is used for processing digital signals, and can process digital image signals and other digital signals. For example, when the handset 100 is in frequency bin selection, the digital signal processor is used to perform fourier transform or the like on the frequency bin energy.
Video codecs are used to compress or decompress digital video. Handset 100 may support one or more video codecs. Thus, the handset 100 can play or record video in a variety of encoding formats, such as: moving Picture Experts Group (MPEG) 1, MPEG2, MPEG3, MPEG4, and the like.
The NPU is a neural-network (NN) computing processor that processes input information quickly by using a biological neural network structure, for example, by using a transfer mode between neurons of a human brain, and can also learn by itself continuously. The NPU can realize applications such as intelligent recognition of the mobile phone 100, for example: image recognition, face recognition, speech recognition, text understanding, and the like.
The external memory interface 420 may be used to connect an external memory card, such as a Micro SD card, to extend the storage capability of the mobile phone 100. The external memory card communicates with the processor 410 through the external memory interface 420 to implement data storage functions. For example, files such as music, video, etc. are saved in an external memory card.
The internal memory 421 may be used to store computer-executable program code, including instructions. The internal memory 421 may include a program storage area and a data storage area. The storage program area may store an operating system, an application program (such as a sound playing function, an image playing function, etc.) required by at least one function, and the like. The data storage area may store data (e.g., audio data, a phonebook, etc.) created during use of the handset 100, and the like. In addition, the internal memory 421 may include a high-speed random access memory, and may further include a nonvolatile memory, such as at least one magnetic disk storage device, a flash memory device, a universal flash memory (UFS), and the like. The processor 410 executes various functional applications of the cellular phone 100 and data processing by executing instructions stored in the internal memory 421 and/or instructions stored in a memory provided in the processor.
The handset 100 may implement audio functions through the audio module 470, the speaker 470A, the receiver 470B, the microphone 470C, the application processor, and the like. Such as music playing, recording, etc.
The audio module 470 is used to convert digital audio information into an analog audio signal output and also used to convert an analog audio input into a digital audio signal. The audio module 470 may also be used to encode and decode audio signals. In some embodiments, the audio module 470 may be disposed in the processor 410, or some functional modules of the audio module 470 may be disposed in the processor 410.
The speaker 470A, also called a "horn", is used to convert the audio electrical signals into sound signals. The handset 100 may play voice or notify etc. through the speaker 470A.
The receiver 470B, also called "earpiece", is used to convert the electrical audio signal into an acoustic signal. When the cellular phone 100 receives a call or voice information, it can receive voice by placing the receiver 470B close to the ear of the person.
The microphone 470C, also referred to as a "mic," is used to convert sound signals into electrical signals. When making a call or sending voice information, the user can input a sound signal into the microphone 470C by speaking with the mouth close to the microphone 470C. The handset 100 may be provided with at least one microphone 470C. In other embodiments, the handset 100 may be provided with two microphones 470C to achieve noise reduction in addition to collecting sound signals. In other embodiments, the mobile phone 100 may further include three, four, or more microphones 470C to collect sound signals, reduce noise, identify sound sources, and implement directional recording functions.
The pressure sensor is used for sensing a pressure signal and converting the pressure signal into an electric signal.
The gyro sensor can be used to determine the pose during movement of the handset 100.
The magnetic sensor includes a hall sensor. In some embodiments, physical parameters such as current, position, orientation, etc. may be measured by sensing magnetic field strength with a magnetic sensor.
The acceleration sensor can detect the magnitude of acceleration of the cellular phone 100 in various directions (typically three axes).
The fingerprint sensor is used for collecting fingerprints. Any type of sensing technology may be employed including, but not limited to, optical, capacitive, piezoelectric, or ultrasonic sensing technologies, etc. The mobile phone 100 can utilize the collected fingerprint characteristics to realize fingerprint unlocking, access application lock, fingerprint photographing and the like.
Touch sensors, also known as "touch devices". The touch sensor (also referred to as a touch panel) may be disposed on the display screen 494, and the touch sensor and the display screen 494 form a touch screen, also referred to as a "touch screen". The touch sensor is used to detect a touch operation applied thereto or nearby. The touch sensor can communicate the detected touch operation to the application processor to determine a touch event type. Visual output associated with the touch operation may be provided through the display screen 494. In other embodiments, the touch sensor can be disposed on a surface of the handset 100 at a different location than the display screen 494.
The keys 490 include a power-on key, a volume key, etc. The keys 490 may be mechanical keys. Or may be touch keys. The cellular phone 100 may receive a key input, and generate a key signal input related to user setting and function control of the cellular phone 100.
The motor 491 may generate a vibration indication. The motor 491 may be used for both incoming call vibration prompting and touch vibration feedback. For example, touch operations applied to different applications (e.g., photographing, audio playing, etc.) may correspond to different vibration feedback effects. The motor 491 may also respond to different vibration feedback effects in response to touch operations applied to different areas of the display screen 494. Different application scenes (such as time reminding, receiving information, alarm clock, game and the like) can also correspond to different vibration feedback effects. The touch vibration feedback effect may also support customization.
The indicator 492 may be an indicator light, and may be used to indicate a charging status, a change in charge level, or a message, a missed call, a notification, etc.
The SIM card interface 495 is used to connect a SIM card. The SIM card can be attached to and detached from the cellular phone 100 by being inserted into the SIM card interface 495 or being pulled out from the SIM card interface 495. The handset 100 may support 1 or N SIM card interfaces, N being a positive integer greater than 1. The SIM card interface 495 may support a Nano SIM card, a Micro SIM card, a SIM card, etc. Multiple cards can be inserted into the same SIM card interface 495 at the same time. The types of the plurality of cards may be the same or different. The SIM card interface 495 may also be compatible with different types of SIM cards. The SIM card interface 495 may also be compatible with an external memory card. The mobile phone 100 interacts with the network through the SIM card to implement functions such as communication and data communication. In some embodiments, the handset 100 employs esims, namely: an embedded SIM card. The eSIM card can be embedded in the mobile phone 100 and cannot be separated from the mobile phone 100.
The basic principle of the method for determining the telescopic state of the camera assembly in the embodiments of the application is as follows: the electronic device analyzes the ambient noise collected by two built-in MICs located at different positions of the electronic device, and determines the telescopic state of the camera assembly from it. The telescopic state of the camera assembly includes whether telescoping started successfully, whether the assembly is telescoping normally, whether it extended successfully, whether it retracted successfully, whether the drive is stalled (locked rotor), and so on.
The method for determining the telescopic state of the camera module according to the embodiment of the present application is specifically described below with reference to the mobile phone in fig. 4. The methods in the following embodiments may be implemented in an electronic device having the above hardware structure or an electronic device having a similar structure.
The method for determining the telescopic state of the camera assembly according to the embodiments of the application is described below using the relative positions of the first MIC 101, the second MIC 102, and the camera apparatus 110 shown in fig. 2. Specifically, the first MIC 101 is disposed at the bottom end of the mobile phone 100, and the second MIC 102 is disposed near the camera assembly 103. As shown in fig. 5, the method may include S501-S503.
S501. In response to the first operation of the user, the mobile phone 100 starts the driving assembly 104.
The first operation may be an operation that launches the camera. For example, the lock-screen interface of the mobile phone 100 includes a camera icon, and the first operation is the user tapping that icon. As another example, the home screen of the mobile phone 100 includes a camera icon, which the user taps as the first operation. For another example, an application installed on the mobile phone 100 (e.g., pan, a beauty camera, WeChat, etc.) may call the camera, and the first operation is the user triggering the camera call inside the application (e.g., starting a scan in pan, tapping the short-video shooting icon in WeChat, etc.).
In one possible implementation, the mobile phone 100 may trigger the driving assembly 104 to start by sending it an enable signal. Specifically, after the driving assembly 104 receives the enable signal, it moves the drive bracket 106, which moves the camera assembly bracket 105, which in turn drives the camera assembly 103 to extend or retract.
In some embodiments, the mobile phone 100 may also indicate the direction of motion by sending a direction signal to the driving assembly 104. For example, in fig. 2, if the driving assembly 104 receives an upward direction signal, it drives the camera assembly 103 to extend upward; if the direction signal is downward, it drives the camera assembly 103 to retract downward.
Fig. 6 is a schematic diagram of an operating circuit of a driving assembly according to an embodiment of the present application. After receiving the enable signal and the clock signal sent by the CPU of the mobile phone 100, the driving assembly 104 zeroes its initial parameters according to the initialization signal and clears its clock register. The driving assembly 104 then sets its direction parameter based on the received direction signal. Driven by the enable signal, and according to the set direction parameter, the driving assembly 104 outputs A-phase and B-phase currents through the A-phase and B-phase coils. Under the action of the A-phase and B-phase currents, the drive motor is driven in the corresponding direction.
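A hedged, firmware-style sketch of this start-up sequence follows; the register offsets and the drv_write() helper are pure assumptions, since the patent does not define a register map for the driving assembly.

```c
/* Hedged sketch of the Fig. 6 start-up sequence: initialization, direction
 * parameter, then enable. All register names are assumptions. */
#include <stdint.h>
#include <stdbool.h>

extern void drv_write(uint32_t reg, uint32_t val);   /* hypothetical register write */

#define DRV_REG_INIT      0x00u  /* zeroes initial parameters and the clock register       */
#define DRV_REG_DIRECTION 0x04u  /* direction parameter (extend upward / retract downward) */
#define DRV_REG_ENABLE    0x08u  /* enable: starts the A-phase / B-phase current output    */

void start_drive_assembly(bool extend)
{
    drv_write(DRV_REG_INIT, 1u);                     /* initialization signal                        */
    drv_write(DRV_REG_DIRECTION, extend ? 1u : 0u);  /* direction signal -> direction parameter      */
    drv_write(DRV_REG_ENABLE, 1u);                   /* enable -> phase currents drive the motor     */
}
```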
S502. In response to the driving assembly 104 starting, the mobile phone 100 collects the first ambient noise N1 through the first MIC 101 and the second ambient noise N2 through the second MIC 102 while the driving assembly 104 operates.
As described above, in fig. 2 the first MIC 101 is farther from the driving assembly 104, while the second MIC 102 is closer to it. It can therefore be understood that the noise produced by the operation of the driving assembly 104 contributes more to the second ambient noise collected by the second MIC 102 than to the first ambient noise collected by the first MIC 101. Here, the operation of the driving assembly 104 may include the whole process of driving the camera assembly 103 to extend and the whole process of driving it to retract. Accordingly, the first ambient noise consists mainly of the external environmental noise, while the second ambient noise is mainly the superposition of the external environmental noise and the noise produced by the operation of the driving assembly 104. The external environmental component of the first ambient noise can be considered the same as that of the second ambient noise.
S503, the mobile phone 100 determines the telescopic state of the camera assembly 103 according to the first environmental noise N1 and the second environmental noise N2.
The telescopic state of the camera assembly 103 includes, but is not limited to, any of the following: start of extension/start of retraction, normal extension/normal retraction, locked rotor, end of extension/end of retraction, abnormal retraction, and abnormal power loss. The specific methods for determining abnormal retraction and abnormal power loss are described in detail below.
In one possible implementation, the mobile phone 100 may store a noise threshold corresponding to each of the above telescopic states. A noise threshold may be a single value, for example the locked-rotor noise threshold. A noise threshold may also be a range; for example, the noise threshold corresponding to a telescopic state may include {lower limit corresponding to the telescopic state, upper limit corresponding to the telescopic state}, where the lower limit of each telescopic state < the upper limit of that state. For example, the normal telescoping noise threshold includes {lower limit of the normal telescoping noise threshold, upper limit of the normal telescoping noise threshold}. The noise thresholds corresponding to different telescopic states are different.
For example, based on a large amount of experimental data such as that shown in fig. 7, noise thresholds for the different telescopic states may be determined and preconfigured in the handset 100. For example: the start noise threshold T_start is 19 dB to 25 dB, where 19 dB is the lower limit of the start noise threshold and 25 dB is its upper limit; the locked-rotor noise threshold T_locked-rotor is 31 dB; the normal telescoping noise threshold T_working is 7 dB to 12 dB; and the end-of-telescoping noise threshold T_finish is 15 dB to 20 dB.
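Expressed as data, such a preconfigured threshold table might look like the sketch below; the numbers are the illustrative Fig. 7 values, not mandated ones.

    # Example noise thresholds per telescopic state, in dB (Fig. 7 example values).
    # A threshold may be a single bound (locked rotor) or a (lower, upper) range.
    NOISE_THRESHOLDS_DB = {
        "start":        (19.0, 25.0),  # start of extension / start of retraction
        "normal":       (7.0, 12.0),   # normal extension / normal retraction
        "finish":       (15.0, 20.0),  # end of extension / end of retraction
        "locked_rotor": 31.0,          # N at or above this value indicates a locked rotor
    }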
In some embodiments, the electronic device may not be preconfigured with noise thresholds for start of extension/start of retraction and end of extension/end of retraction. That is, the states of start of extension/start of retraction and end of extension/end of retraction are not judged according to a noise threshold.
In a possible implementation, the electronic device may judge the telescopic state according to a preset waiting time. For example, if the preset start-up waiting time is 0.2 s and the noise generated by the driving assembly 104 meets the normal telescoping noise threshold 0.2 s after the enable signal, the driving assembly 104 is considered to have started successfully. Fig. 7 is a schematic diagram of the operating noise of a driving assembly according to an embodiment of the present disclosure. As shown in fig. 7, at time 0 s the driving assembly 104 is activated with a start-up noise of 21 dB, and start-up completes after about 0.2 s. The driving assembly 104 then works normally and the camera assembly 103 extends normally. At time 0.8 s the driving assembly 104 stalls (locked rotor) and the noise jumps to 33 dB. At time 0.9 s the driving assembly 104 retracts the camera assembly 103 and the noise falls back to 0 dB. At time 1.1 s the driving assembly 104 is restarted with a start-up noise of 24 dB, after which the camera assembly 103 extends normally. At time 1.5 s the handset suddenly loses power. After rebooting, the driving assembly 104 again retracts the camera assembly 103, and at time 1.7 s the driving assembly 104 is restarted once more. The camera assembly 103 then extends normally until, at time 2.5 s, the frame of the accommodating cavity 111 is reached. At that point the noise is 18 dB and extension ends.
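A sketch of the preset-wait-time check described above; the 0.2 s wait and the 7 dB to 12 dB normal range are the example values from this description, and measure_drive_noise_db is an assumed helper that returns N = N2 - N1.

    import time

    STARTUP_WAIT_S = 0.2                           # preset start-up waiting time (example)
    NORMAL_LOWER_DB, NORMAL_UPPER_DB = 7.0, 12.0   # normal telescoping range (example)

    def startup_succeeded(measure_drive_noise_db):
        # Wait out the start-up transient, then check that the drive noise has
        # settled into the normal telescoping range.
        time.sleep(STARTUP_WAIT_S)
        n = measure_drive_noise_db()
        return NORMAL_LOWER_DB < n < NORMAL_UPPER_DB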
In one possible implementation, the mobile phone 100 determining the telescopic state of the camera assembly 103 according to the first environmental noise N1 and the second environmental noise N2 (i.e., S503) may include S5031 and S5032:

S5031, the mobile phone 100 obtains a difference N between the first environmental noise N1 and the second environmental noise N2.

The first environmental noise N1 collected by the first MIC101 is mainly noise of the external environment, and the second environmental noise N2 collected by the second MIC102 is the superposition of the external environmental noise and the noise generated by operation of the driving assembly 104. Therefore, the difference N between the first environmental noise N1 and the second environmental noise N2 reflects the noise generated when the driving assembly 104 operates.

S5032, the mobile phone 100 compares N with the noise threshold corresponding to each telescopic state and determines the telescopic state of the camera assembly 103.
In some embodiments, the handset 100 may also start a timer when it sends the enable signal and the direction signal to the drive assembly 104, to record the operating duration of the drive assembly 104. For example, the handset 100 records the operating duration of the driving assembly 104 by sending a clock signal (Clock, CLK) to the driving assembly 104.
For example, as shown in fig. 8, at time t1, the mobile phone 100 drives the driving bracket 106 via the driving motor 107 to extend the camera assembly 103, and CLK starts timing.
In one possible implementation, S5032, that is, the mobile phone 100 comparing N with the noise threshold corresponding to each telescopic state to determine the telescopic state of the camera assembly 103, may include: if N satisfies the start noise threshold T_start, i.e., N is greater than the lower limit T_start1 of the start noise threshold and less than its upper limit T_start2, the mobile phone 100 determines that the camera assembly 103 is starting to telescope; if N satisfies the locked-rotor noise threshold T_locked-rotor, i.e., N is greater than T_locked-rotor, the mobile phone 100 determines that the camera assembly 103 is locked; if N satisfies the normal telescoping noise threshold T_working, i.e., N is greater than the lower limit T_working1 and less than the upper limit T_working2, the mobile phone 100 determines that the camera assembly 103 is telescoping normally; if N satisfies the end-of-telescoping noise threshold T_finish, i.e., N is greater than the lower limit T_finish1 and less than the upper limit T_finish2, the mobile phone 100 determines that the camera assembly 103 has finished telescoping.
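A minimal sketch of this comparison, using the Fig. 7 example thresholds. Where the example ranges overlap (e.g., 19 dB to 20 dB falls in both the start range and the end range), the handset would additionally use the direction signal and the preceding state, as described below.

    def classify_telescopic_state(n1_db, n2_db):
        """Map the drive noise N = N2 - N1 onto the example thresholds from Fig. 7."""
        n = n2_db - n1_db
        if n >= 31.0:                # locked-rotor threshold
            return "locked_rotor"
        if 19.0 < n < 25.0:          # start threshold range
            return "start_extend_or_retract"
        if 15.0 < n < 20.0:          # end-of-telescoping threshold range
            return "end_extend_or_retract"
        if 7.0 < n < 12.0:           # normal telescoping range
            return "normal_extend_or_retract"
        return "unknown"

    print(classify_telescopic_state(20.0, 42.0))   # 22 dB -> start_extend_or_retract
    print(classify_telescopic_state(20.0, 53.0))   # 33 dB -> locked_rotor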
In some embodiments, if the environmental noise collected by the first MIC is greater than a predetermined threshold (e.g., 90 dB), the mobile phone 100 may abandon determining the telescopic state of the camera assembly according to the embodiment of the present application. When the environment is very noisy, the second MIC may have difficulty accurately detecting the operating noise of the drive assembly 104, so the method of this embodiment works poorly. In this case, the mobile phone 100 may simply make no judgment, or it may pop up a window on the display screen or play a voice prompt to alert the user that determination of the telescopic state of the camera assembly will be abandoned while the driving assembly 104 operates. In one possible implementation, the mobile phone 100 can further combine the direction signal to determine whether the camera assembly 103 is locked while extending or locked while retracting, whether it is starting to extend or starting to retract, whether it is extending normally or retracting normally, and whether it has finished extending or finished retracting.
In another possible implementation manner of S5032, the mobile phone 100 comparing N with the noise threshold corresponding to each telescopic state to determine the telescopic state of the camera assembly 103 may include:

if N satisfies the normal telescoping noise threshold T_working, i.e., N is greater than the lower limit T_working1 of the normal telescoping noise threshold and less than its upper limit T_working2, the mobile phone 100 determines that the camera assembly 103 is telescoping normally; and, if the telescoping time t_working of the camera assembly 103 satisfies:

t_working ≥ l / v_working

the telescopic state of the camera assembly 103 is determined to be the end of telescoping. Here l is the distance the camera assembly 103 needs to move to extend or retract fully, and v_working is the speed at which the camera assembly 103 normally extends or retracts, which may be pre-designed and configured in the handset 100.
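A sketch of this travel-time check. The distance l used here is an assumed example value, and v_working = 8 mm/s is the example speed used later in this description.

    TRAVEL_MM = 5.6          # l: assumed full extension/retraction distance (example)
    V_WORKING_MM_S = 8.0     # v_working: nominal telescoping speed (example)

    def telescoping_finished(t_working_s):
        # The movement is considered finished once the assembly has been
        # telescoping normally for at least l / v_working seconds.
        return t_working_s >= TRAVEL_MM / V_WORKING_MM_S

    print(telescoping_finished(0.5))   # False: 0.5 s < 0.7 s
    print(telescoping_finished(0.8))   # True:  0.8 s >= 0.7 s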
For example, as shown in fig. 8, at time t1, the cellular phone 100 drives the driving bracket 106 via the driving motor 107, and the first MIC101 and the second MIC102 start measuring noise. The first MIC101 measures the first environmental noise N1 as 20 dB and the second MIC102 measures the second environmental noise N2 as 42 dB, so the operating noise of the driving assembly 104 is N = N2 - N1 = 22 dB, shown as point A in fig. 8. Because N = 22 dB meets the start noise threshold T_start (e.g., lower limit T_start1 = 19 dB), the cellular phone 100 determines that the telescopic state of the camera assembly 103 at time t1 is the start of extension or the start of retraction. Since the direction signal is upward, the telescopic state of the camera assembly 103 at time t1 can further be determined to be the start of extension.

At time t3, N1 is measured as 20 dB and N2 as 30 dB, so the operating noise of the drive assembly 104 is 10 dB, shown as point C in fig. 8. Because N = 10 dB meets the normal telescoping noise threshold T_working (e.g., lower limit T_working1 = 7 dB), the cellular phone 100 determines that the telescopic state of the camera assembly 103 at time t3 is normal telescoping. Since the direction signal is upward, it can further be determined that the camera assembly 103 is extending normally at time t3.

At time t4, N1 is measured as 20 dB and N2 as 53 dB, so the operating noise of the drive assembly 104 is 33 dB, shown as point D in fig. 8. Because N = 33 dB meets the locked-rotor noise threshold T_locked-rotor (e.g., T_locked-rotor = 31 dB), the cellular phone 100 determines that the telescopic state of the camera assembly 103 at time t4 is locked rotor. Since the direction signal is upward, it can further be determined that the camera assembly 103 is locked while extending at time t4.
At time t5, N1 is measured as 20 dB and N2 as 20 dB, so the operating noise of the drive assembly 104 is 0 dB, shown as point E in fig. 8. Since the previous telescopic state of the driving assembly 104 was locked rotor while extending, the mobile phone 100 determines that the telescopic state of the camera assembly 103 at time t5 is retraction following the locked rotor.
At time t6, N1 is measured as 20 dB and N2 as 44 dB, so the operating noise of the drive assembly 104 is 24 dB, shown as point F in fig. 8. Since 24 dB meets the start noise threshold (lower limit 19 dB), the mobile phone 100 determines that the telescopic state of the camera assembly 103 at time t6 is a restart of extension.

At time t7, N1 is measured as 20 dB and N2 as 32 dB, so the operating noise of the drive assembly 104 is 12 dB, shown as point G in fig. 8. Since N = 12 dB satisfies the normal telescoping noise threshold (lower limit 7 dB), the cellular phone 100 determines that the telescopic state of the camera assembly 103 at time t7 is normal extension.

At time t8, N1 is measured as 20 dB and N2 as 20 dB, so the operating noise of the drive assembly 104 is 0 dB, shown as point H in fig. 8. Since the previous telescopic state of the driving assembly 104 was normal extension, the cellular phone 100 determines that the telescopic state of the camera assembly 103 at time t8 is power loss during extension.

At time t9, N1 is measured as 20 dB and N2 as 44 dB, so the operating noise of the drive assembly 104 is 24 dB, shown as point I in fig. 8. Since 24 dB meets the start noise threshold (lower limit 19 dB), the mobile phone 100 determines that the telescopic state of the camera assembly 103 at time t9 is a restart of extension.

At time t10, N1 is measured as 20 dB and N2 as 32 dB, so the operating noise of the drive assembly 104 is 12 dB, shown as point J in fig. 8. Since N = 12 dB is greater than the lower limit of the normal telescoping noise threshold (7 dB), the cellular phone 100 determines that the telescopic state of the camera assembly 103 at time t10 is normal extension.

At time t11, N1 is measured as 20 dB and N2 as 38 dB, so the operating noise of the drive assembly 104 is 18 dB, shown as point K in fig. 8. Because N = 18 dB meets the end-of-telescoping noise threshold T_finish (e.g., lower limit T_finish1 = 15 dB), the cellular phone 100 determines that the telescopic state of the camera assembly 103 at time t11 is the end of extension.
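The noise differences at these sample points can be replayed numerically; a small sketch using the values quoted above, with the determinations from the text noted in the comments.

    # (N1, N2) samples in dB taken from the Fig. 8 description above.
    samples = {
        "t1":  (20, 42),   # N = 22 dB -> start of extension
        "t3":  (20, 30),   # N = 10 dB -> normal extension
        "t4":  (20, 53),   # N = 33 dB -> locked rotor while extending
        "t5":  (20, 20),   # N =  0 dB -> retraction after the locked rotor
        "t6":  (20, 44),   # N = 24 dB -> restart of extension
        "t11": (20, 38),   # N = 18 dB -> end of extension
    }
    for t, (n1, n2) in samples.items():
        print(t, n2 - n1, "dB")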
In some embodiments, the handset 100 may need to determine where the camera assembly 103 is located. For example, the handset 100 may need to determine the position at which the camera assembly 103 was locked, or its position before power was lost. As another example, the handset 100 may need to know the current telescopic position of the camera assembly 103. As shown in fig. 9, after S503, the method for determining the telescopic state of the camera assembly 103 according to the embodiment of the present application may further include:

S504, the mobile phone 100 determines the telescopic position information of the camera assembly 103 according to the telescopic information of the camera assembly 103.

The telescopic information of the camera assembly 103 can be recorded by the mobile phone 100. The telescopic information includes, but is not limited to, telescopic state information and the time information corresponding to each telescopic state.
As shown in fig. 8, at time t4 the cellular phone 100 determines that the telescopic state of the camera assembly 103 is locked rotor during extension. The mobile phone 100 can determine the specific position of the locked rotor from the time information recorded via the CLK and the telescopic state information of the driving assembly 104. Specifically, at time t1 the camera assembly 103 starts to extend, at time t3 it extends normally, and at time t4 it locks while extending.
In one possible implementation, the distance the camera assembly 103 has extended by time t4 may be calculated according to the formula l = (t4 - t3) × v_working, where v_working is the normal extension or retraction speed of the camera assembly 103, which may be pre-designed and configured in the handset 100.

For example, if v_working is 8 mm/s and the camera assembly 103 starts to extend at 0.3 s and stalls at 0.7 s, it can be determined that the camera assembly 103 stalled after extending 3.2 mm.
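As a quick arithmetic check, a sketch with the example values above:

    # Constant-speed case: distance travelled before the locked rotor occurred.
    v_working_mm_s = 8.0              # nominal telescoping speed (example value)
    t_start_s, t_lock_s = 0.3, 0.7
    extended_mm = (t_lock_s - t_start_s) * v_working_mm_s
    print(round(extended_mm, 1))      # 3.2 (mm)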
In another possible implementation, the extension or retraction speed of the camera assembly 103 may change in real time, and the mobile phone 100 may obtain the real-time telescoping speed of the camera assembly 103 through a sensor or the like. In this implementation, the distance the camera assembly 103 has extended by time t4 can be calculated according to

l = Σ_i (v_i × Δt_i)

where v_i is the real-time telescoping speed measured at time t_i and Δt_i is the duration for which the camera assembly 103 keeps telescoping at speed v_i.
For example, the camera assembly 103 starts extending at 0.3 s at a speed of 6 mm/s, extends at 7 mm/s from 0.4 s and at 8 mm/s from 0.5 s, and stalls at 0.7 s. It can be determined that l = (0.4 s - 0.3 s) × 6 mm/s + (0.5 s - 0.4 s) × 7 mm/s + (0.7 s - 0.5 s) × 8 mm/s = 2.9 mm, i.e., the camera assembly 103 stalled after extending 2.9 mm.
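The same sum expressed as a short sketch, with the values from the example above:

    # Variable-speed case: l is the sum of v_i * dt_i over the recorded segments.
    segments = [            # (speed in mm/s, duration in s)
        (6.0, 0.4 - 0.3),
        (7.0, 0.5 - 0.4),
        (8.0, 0.7 - 0.5),
    ]
    extended_mm = sum(v * dt for v, dt in segments)
    print(round(extended_mm, 1))   # 2.9 (mm)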
In some embodiments, if the driving assembly 104 is not currently activated or is not driving the camera assembly 103, and at least the following condition is satisfied, the mobile phone 100 determines that the driving assembly 104 has lost power, or that the mobile phone 100 has lost power:

the duration of the most recent enablement of the camera assembly 103 recorded in the telescopic information is less than t_working.
In some embodiments, the first MIC101 and the second MIC102 may continue to collect ambient noise after the drive assembly 104 is powered down.
In some embodiments, if the driving assembly 104 has not received the enable signal and the difference N between the first environmental noise and the second environmental noise satisfies T_stop, the handset 100 determines that the most recent retraction of the camera assembly 103 was a press retraction. Here, T_stop is the noise threshold for the drive assembly 104 when it is not activated. For example, T_stop may include {T_stop1, T_stop2}, where T_stop1 is the lower limit of the not-activated noise threshold, T_stop2 is its upper limit, and T_stop1 < T_stop2.
Since the drive assembly 104 has not received the enable signal, the drive motor (107, 107-1 or 107-2) is not operating. Any noise N detected at this time is therefore mechanical noise caused by an external force, and it does not exceed the lower noise threshold for normal operation of the drive assembly 104. Rotation is normally controlled by the drive motor, whereas rotation produced by manual pressing generates only mechanical noise below that lower threshold; so if such noise is detected while the drive motor is not operating, the mechanism can be considered to be moving passively, for example because the camera assembly was pressed back in by hand.
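A sketch of these two checks; the T_stop range and the helper parameters are assumptions for illustration only.

    T_STOP_DB = (3.0, 7.0)   # assumed {T_stop1, T_stop2} range for a non-enabled drive

    def drive_powered_down(enabled, last_enable_duration_s, required_t_working_s):
        # Power loss: the drive is not enabled/driving, yet the most recent
        # enablement recorded in the telescopic information was shorter than a
        # full travel (t_working = l / v_working) would require.
        return (not enabled) and last_enable_duration_s < required_t_working_s

    def press_retraction(enabled, n_db):
        # Press retraction: no enable signal was sent, but mechanical noise in
        # the not-activated range T_stop is detected, so the camera assembly
        # was moved by an external force (e.g., pressed back in by hand).
        lower, upper = T_STOP_DB
        return (not enabled) and lower < n_db < upper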
It is understood that, in order to realize the functions of any of the above embodiments, the handset 100 includes corresponding hardware structures and/or software modules for performing the respective functions. Those skilled in the art will readily appreciate that the various illustrative units and algorithm steps described in connection with the embodiments disclosed herein may be implemented as hardware or as a combination of hardware and computer software. Whether a function is performed by hardware or by computer software driving hardware depends on the particular application and the design constraints of the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.

In the embodiment of the present application, the mobile phone 100 may be divided into functional modules; for example, each function may be assigned its own functional module, or two or more functions may be integrated into one processing module. The integrated module may be implemented in hardware or as a software functional module. It should be noted that the division of modules in the embodiment of the present application is schematic and is only one kind of logical function division; there may be other division manners in actual implementation.
For example, in the case where functional modules are divided in an integrated manner, fig. 10 is a schematic diagram of a hardware structure of a mobile phone provided in the embodiment of the present application. The mobile phone 100 may include a driving assembly 104, a camera assembly 103, a first MIC101, a second MIC102, a processing unit 1010, and a storage unit 1020. The processing unit 1010 is configured to activate the driving assembly 104 in response to a first operation by a user. The driving assembly 104 is used for driving the camera assembly 103 to extend out of the mobile phone frame or driving the camera assembly 103 to retract behind the mobile phone frame. The first MIC101 and the second MIC102 are used to collect first environmental noise and second environmental noise, respectively, during operation of the driving assembly 104. The processing unit 1010 is further configured to determine the telescopic state of the camera assembly 103 according to the first environmental noise and the second environmental noise.
In one possible configuration, as shown in fig. 11, the handset 100 may further include a timing unit 1030 for recording the operating time of the drive assembly 104.
It should be noted that, as shown in fig. 9, the mobile phone 100 may further include a radio frequency circuit. Specifically, the mobile phone 100 may receive and transmit wireless signals through the radio frequency circuit. Typically, the radio frequency circuitry includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like. In addition, the radio frequency circuitry may also communicate with other devices via wireless communication. The wireless communication may use any communication standard or protocol including, but not limited to, global system for mobile communications, general packet radio service, code division multiple access, wideband code division multiple access, long term evolution, email, short message service, and the like.
In an alternative, when the data transfer is implemented using software, it may be implemented in whole or in part in the form of a computer program product. The computer program product includes one or more computer instructions. When the computer program instructions are loaded and executed on a computer, the processes or functions described in the embodiments of the present application are implemented in whole or in part. The computer may be a general purpose computer, a special purpose computer, a network of computers, or other programmable device. The computer instructions may be stored in a computer readable storage medium or transmitted from one computer readable storage medium to another, for example, from one website site, computer, server, or data center to another website site, computer, server, or data center via wired (e.g., coaxial cable, fiber optic, Digital Subscriber Line (DSL)) or wireless (e.g., infrared, wireless, microwave, etc.). The computer-readable storage medium can be any available medium that can be accessed by a computer or a data storage device, such as a server, a data center, etc., that incorporates one or more of the available media. The usable medium may be a magnetic medium (e.g., floppy Disk, hard Disk, magnetic tape), an optical medium (e.g., DVD), or a semiconductor medium (e.g., Solid State Disk (SSD)), among others.
The steps of a method or algorithm described in connection with the embodiments disclosed herein may be embodied in hardware or in software instructions executed by a processor. The software instructions may consist of corresponding software modules, which may be stored in RAM memory, flash memory, ROM memory, EPROM memory, EEPROM memory, registers, a hard disk, a removable disk, a CD-ROM, or any other form of storage medium known in the art. An exemplary storage medium is coupled to the processor such that the processor can read information from, and write information to, the storage medium. Of course, the storage medium may also be integral to the processor. The processor and the storage medium may reside in an ASIC. In addition, the ASIC may reside in a detection apparatus. Of course, the processor and the storage medium may also reside as discrete components in the detection apparatus.
Through the above description of the embodiments, it is clear to those skilled in the art that, for convenience and simplicity of description, the foregoing division of the functional modules is merely used as an example, and in practical applications, the above function distribution may be completed by different functional modules according to needs, that is, the internal structure of the device may be divided into different functional modules to complete all or part of the above described functions.
In the several embodiments provided in the present application, it should be understood that the disclosed user equipment and method may be implemented in other manners. For example, the above-described device embodiments are merely illustrative, and for example, the division of the modules or units is only one logical functional division, and there may be other divisions when actually implemented, for example, a plurality of units or components may be combined or may be integrated into another device, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may be one physical unit or a plurality of physical units, that is, may be located in one place, or may be distributed in a plurality of different places. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a readable storage medium. Based on such understanding, the technical solutions of the embodiments of the present application may be essentially or partially contributed to by the prior art, or all or part of the technical solutions may be embodied in the form of a software product, where the software product is stored in a storage medium and includes several instructions to enable a device (which may be a single chip, a chip, or the like) or a processor (processor) to execute all or part of the steps of the methods described in the embodiments of the present application. And the aforementioned storage medium includes: various media capable of storing program codes, such as a usb disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk.
The above description is only an embodiment of the present application, but the scope of the present application is not limited thereto, and any changes or substitutions within the technical scope of the present disclosure should be covered by the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (19)

1. A method for determining a telescopic state of a camera shooting assembly, applied to an electronic device, wherein the electronic device comprises: a housing, a camera device, a first microphone MIC and a second MIC; the electronic device is provided with an accommodating cavity, the housing is provided with an opening, the camera device comprises a driving assembly and a camera shooting assembly, the camera shooting assembly is accommodated in the accommodating cavity, and the driving assembly is configured to drive the camera shooting assembly to extend out of the housing through the opening or to drive the camera shooting assembly to retract into the accommodating cavity through the opening; the method comprises:
in response to a first operation of a user, the electronic device starts the driving component;
the electronic equipment collects first environmental noise N1 through the first MIC and collects second environmental noise N2 through the second MIC in the working process of the driving component;
the electronic equipment acquires a difference value N between the N1 and the N2, and determines the telescopic state of the camera shooting assembly according to the N;
the distance between the first MIC and the image pickup device is larger than the distance between the second MIC and the image pickup device.
2. The method of claim 1, wherein the electronic device activates the drive assembly, comprising:
the electronic equipment sends an enabling signal and a direction signal to the driving assembly, the enabling signal is used for triggering the driving assembly to start, and the direction signal is used for indicating the direction of the driving assembly driving the camera shooting assembly to move.
3. The method of claim 1 or 2, wherein the camera assembly has a plurality of telescopic states, and a noise threshold corresponding to each telescopic state is stored in the electronic device, and the noise thresholds corresponding to different telescopic states are different;
the electronic equipment determines the telescopic state of the camera shooting assembly according to the obtained difference value N of the N1 and the N2, and comprises the following steps:
the electronic equipment respectively compares the N with the noise threshold corresponding to each telescopic state to determine the telescopic state of the camera shooting assembly;
wherein the telescopic state of the camera shooting assembly at least comprises any one of the following states: start of extension/start of retraction, normal extension/normal retraction, locked rotor during extension/locked rotor during retraction, end of extension/end of retraction.
4. The method of claim 3, further comprising:
and responding to the starting of the driving component, the electronic equipment starts a timer and records the working time of the driving component.
5. The method of claim 4, wherein the electronic device respectively comparing N with the noise threshold corresponding to each telescopic state to determine the telescopic state of the camera shooting assembly comprises:

if N is greater than the lower limit T_working1 of the normal telescoping noise threshold and less than the upper limit T_working2 of the normal telescoping noise threshold, the electronic device determines that the camera shooting assembly is telescoping normally; and

if the telescoping time t_working of the camera shooting assembly satisfies:

t_working ≥ l / v_working

the electronic device determines that the telescopic state of the camera shooting assembly is the end of telescoping; wherein l is the distance the camera shooting assembly moves to complete the telescoping movement, v_working is the normal telescoping speed of the camera shooting assembly, t_working is recorded by the timer, and T_working1 < T_working2.
6. The method of claim 5, further comprising:
the electronic equipment records the telescopic information of the camera shooting assembly, and the telescopic information comprises telescopic state information and time information corresponding to each telescopic state.
7. The method of claim 6, wherein after the electronic device determines the telescopic state of the camera assembly from the N1 and the N2, the method further comprises:
and the electronic equipment determines the telescopic position information of the camera shooting assembly according to the telescopic information of the camera shooting assembly.
8. The method of claim 6, further comprising:
if the driving assembly is not started or does not drive the camera assembly, and at least the following conditions are met, determining that the driving assembly is powered down:
the last telescoping time t_working' of the camera shooting assembly recorded in the telescopic information is less than t_working.
9. An electronic device, characterized in that the electronic device comprises: the electronic equipment comprises a shell, a camera device, a first microphone MIC and a second MIC, wherein the electronic equipment is provided with an accommodating cavity, the shell is provided with an opening, the camera device comprises a driving assembly and a camera component, the camera component is accommodated in the accommodating cavity, and the driving assembly is used for driving the camera component to extend out of the shell from the opening or driving the camera component to be accommodated into the accommodating cavity from the opening; the electronic device further includes:
a processor for activating the drive assembly in response to a first operation by a user;
the first MIC is used for collecting first environment noise N1 in the working process of the driving assembly;
the second MIC is used for collecting second ambient noise N2 in the working process of the driving assembly;
the processor is further used for acquiring a difference value N between the N1 and the N2, and determining the telescopic state of the camera shooting assembly according to the N;
the distance between the first MIC and the image pickup device is larger than the distance between the second MIC and the image pickup device.
10. The electronic device of claim 9, wherein the processor activates the drive assembly, comprising:
the processor sends an enabling signal and a direction signal to the driving assembly, the enabling signal is used for triggering the driving assembly to start, and the direction signal is used for indicating the direction of the driving assembly driving the camera shooting assembly to move.
11. The electronic device of claim 9 or 10, wherein the camera assembly has a plurality of telescopic states, the electronic device further comprising:
the memory is used for storing the noise threshold corresponding to each telescopic state, and the noise thresholds corresponding to different telescopic states are different;
the acquiring the difference value N between the N1 and the N2 and determining the telescopic state of the camera shooting assembly according to the N comprise the following steps:
the processor respectively compares the N with the noise threshold corresponding to each telescopic state to determine the telescopic state of the camera shooting assembly;
wherein the telescopic state of the camera shooting assembly at least comprises any one of the following states: start of extension/start of retraction, normal extension/normal retraction, locked rotor during extension/locked rotor during retraction, end of extension/end of retraction.
12. The electronic device of claim 11, further comprising:
and the timer is used for responding to the starting of the driving component and recording the working time length of the driving component.
13. The electronic device of claim 12, wherein the processor respectively comparing N with the noise threshold corresponding to each telescopic state to determine the telescopic state of the camera shooting assembly comprises:

if N is greater than the lower limit T_working1 of the normal telescoping noise threshold and less than the upper limit T_working2 of the normal telescoping noise threshold, the processor determines that the camera shooting assembly is telescoping normally; and

if the telescoping time t_working of the camera shooting assembly satisfies:

t_working ≥ l / v_working

the processor determines that the telescopic state of the camera shooting assembly is the end of telescoping; wherein l is the distance the camera shooting assembly moves to complete the telescoping movement, v_working is the normal telescoping speed of the camera shooting assembly, t_working is recorded by the timer, and T_working1 < T_working2.
14. The electronic device of claim 13, wherein the memory is further configured to record telescopic information of the camera assembly, and the telescopic information includes telescopic state information and time information corresponding to each telescopic state.
15. The electronic device of claim 14, wherein the processor is further configured to,
after the processor determines the telescopic state of the camera assembly according to the N1 and the N2, the telescopic position information of the camera assembly is determined according to the telescopic information of the camera assembly.
16. The electronic device of claim 15, wherein the processor is further configured to determine that the driving component is powered down if the driving component is not activated or does not drive the camera component, and at least the following conditions are met:
the last enabling time of the camera shooting assembly of the telescopic information record is less than tworking
17. A camera module driving apparatus, comprising a driving assembly, a camera shooting assembly and an accommodating cavity, wherein the camera shooting assembly is accommodated in the accommodating cavity, and the driving assembly is used for driving the camera shooting assembly to extend out of the accommodating cavity or driving the camera shooting assembly to be accommodated into the accommodating cavity; the camera module driving apparatus further comprises:
a storage unit to store computer program code, the computer program code comprising instructions;
a processing unit for executing the instructions to implement the method of determining a telescopic state of a camera assembly according to any of claims 1-8.
18. A computer readable storage medium having computer executable instructions stored thereon which, when executed by processing circuitry, implement a method of determining a camera assembly telescopic state as claimed in any of claims 1-8.
19. A chip system, comprising a processor, a memory, wherein the memory has instructions stored therein; the instructions, when executed by the processor, implement a method of determining camera assembly telescopic state as claimed in any of claims 1-8.