CN112351156B - Lens switching method and device - Google Patents

Lens switching method and device

Info

Publication number: CN112351156B
Application number: CN201910721605.2A
Authority: CN (China)
Prior art keywords: lens, switched, switching, image sensor, position corresponding
Legal status: Active (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Other languages: Chinese (zh)
Other versions: CN112351156A
Inventor: 冯帅 (Feng Shuai)
Current assignee: Honor Device Co Ltd (the listed assignees may be inaccurate)
Original assignee: Honor Device Co Ltd
Application filed by Honor Device Co Ltd
Priority application: CN201910721605.2A
Related PCT application: PCT/CN2020/104722 (WO2021023035A1)

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
        • H04N23/45 Generating image signals from two or more image sensors being of different type or operating in different modes, e.g. with a CMOS sensor for moving images in combination with a charge-coupled device [CCD] for still images
        • H04N23/50 Constructional details
            • H04N23/54 Mounting of pick-up tubes, electronic image sensors, deviation or focusing coils
            • H04N23/55 Optical parts specially adapted for electronic image sensors; Mounting thereof
        • H04N23/58 Means for changing the camera field of view without moving the camera body, e.g. nutating or panning of optics or image sensors
        • H04N23/60 Control of cameras or camera modules
            • H04N23/667 Camera operation mode switching, e.g. between still and video, sport and normal or high- and low-resolution modes

Abstract

The embodiment of the application provides a lens switching method and a lens switching device. The lens switching device comprises a camera module, and the camera module comprises M lenses, N image sensors and a motor, wherein M is a positive integer greater than or equal to 2, N is a positive integer greater than or equal to 1, and M is greater than N. The method comprises the steps of determining a lens to be switched and switching the lens to be switched to the position corresponding to a target image sensor, so that the lens is aligned with and coupled to the image sensor. Because the number of image sensors can be smaller than the number of lenses, the method meets the shooting requirements of users while reducing the number of image sensors and thus the cost of the terminal device.

Description

Lens switching method and device
Technical Field
The present invention relates to the field of communications, and in particular, to a method and an apparatus for switching a lens.
Background
At present, users place increasing importance on the camera function, which has become one of the core functions of mobile terminal devices. In order to meet the photographing requirements of users in different scenes, current mobile terminal devices adapt to different shooting scenes by being equipped with a plurality of independent cameras (camera modules). For example, when the shooting target is far away, a telephoto camera is used; when the shooting target is very close, a macro camera is used. Typically, a camera includes an image sensor and a lens. Equipping a mobile terminal device with multiple independent cameras significantly increases its manufacturing cost. Therefore, how to reduce the manufacturing cost of the device while meeting the requirements of different shooting scenes has become a problem to be solved.
Disclosure of Invention
The embodiment of the application provides a lens switching method and a lens switching device, which decouple the fixed correspondence between the image sensors and the lenses in a camera module and rotate the lenses, thereby allowing M lenses and N image sensors to be combined freely. The lens switching method can be applied to a lens switching device, which may be a terminal device or an apparatus used in a terminal device. The lens switching device comprises a camera module, and the camera module comprises M lenses, N image sensors and a motor, where M is a positive integer greater than or equal to 2, N is a positive integer greater than or equal to 1, and M is greater than N; that is, the camera module may contain fewer image sensors than lenses. According to the user's shooting requirements, the lens switching method switches the lens to be switched to a designated position, where it is coupled with an image sensor. Because fewer image sensors than lenses are needed, the user's shooting requirements are met while the number of image sensors, and hence the cost of the terminal device, is reduced.
In a first aspect, an embodiment of the present application provides a lens switching method, which may be performed by a lens switching device. The lens switching device includes a camera module, and the camera module includes M lenses, N image sensors, and a motor, where M is a positive integer greater than or equal to 2, N is a positive integer greater than or equal to 1, and M is greater than N. The lens switching device determines a lens to be switched and switches it to the position corresponding to a target image sensor, so that the lens and the image sensor are recombined and the shooting function is realized. Since the number of image sensors can be smaller than the number of lenses, reducing the number of image sensors reduces the cost of the terminal device.
In a possible design, the lens switching device may determine the rotation angle between the lens to be switched and the position corresponding to the target image sensor, and switch the lens to be switched to that position in the counterclockwise or clockwise direction according to the rotation angle. Determining the rotation angle allows the lens to be switched to the position corresponding to the target image sensor more accurately.
In a possible design, the rotation angles include a first rotation angle, through which the lens to be switched would rotate in the clockwise direction, and a second rotation angle, through which it would rotate in the counterclockwise direction. If the first rotation angle is smaller than the second rotation angle, the lens to be switched is switched to the position corresponding to the target image sensor in the clockwise direction; if the second rotation angle is smaller than the first rotation angle, it is switched in the counterclockwise direction. Comparing the clockwise and counterclockwise angles and rotating through the smaller one simplifies the lens switching process.
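The direction selection described above can be sketched in a few lines. This is a minimal illustration rather than code from the patent; the function name, the degree-based counterclockwise-positive angle convention, and the tie-break toward clockwise are all assumptions.

```python
def choose_rotation(lens_angle_deg, target_angle_deg):
    """Pick the smaller of the clockwise (first) and counterclockwise (second)
    rotation angles from the lens position to the target sensor position.

    Angles are in degrees, measured counterclockwise-positive on a circle.
    Returns (direction, angle).
    """
    clockwise = (lens_angle_deg - target_angle_deg) % 360          # first rotation angle
    counterclockwise = (target_angle_deg - lens_angle_deg) % 360   # second rotation angle
    if clockwise <= counterclockwise:
        return ("clockwise", clockwise)
    return ("counterclockwise", counterclockwise)
```

For a lens at 0 degrees and a sensor position at 300 degrees, the clockwise path is 60 degrees and the counterclockwise path 300 degrees, so the motor would rotate 60 degrees clockwise.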
In one possible design, the lens switching device may determine shooting parameters, including the shooting scene and the shooting mode, and then determine the lens to be switched according to those parameters. In this mode, the lens switching device determines the lens to be switched automatically from the shooting parameters.
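As one hedged illustration of this design, the mapping from shooting parameters to a lens could be a simple lookup table; the scene, mode, and lens names below are invented for the example and do not come from the patent.

```python
# Hypothetical mapping from (shooting scene, shooting mode) to the lens to
# switch in; all names here are illustrative only.
LENS_FOR_PARAMS = {
    ("distant", "photo"): "telephoto",
    ("close-up", "photo"): "macro",
    ("landscape", "photo"): "wide-angle",
}

def determine_lens(scene, mode, default="main"):
    """Determine the lens to be switched from the detected shooting scene and
    the user-selected shooting mode, falling back to a default lens."""
    return LENS_FOR_PARAMS.get((scene, mode), default)
```

In a real device the scene would come from the camera pipeline (e.g. subject distance estimation) rather than a string, but the table-lookup structure is the same.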
In one possible design, the lens switching device may output a simulation interface for the camera module and receive a user operation on the interface, where the operation is either a dragging operation that drags a target lens to the position corresponding to the target image sensor, or a clicking operation on the target lens; the target lens is then determined to be the lens to be switched. In this mode, the lens switching device determines the lens to be switched according to the user's selection, so the user's shooting requirements can be met.
In one possible design, the lens switching device may output lens options and receive the user's selection of a lens; the lens chosen by the selection operation is then determined to be the lens to be switched. In this mode, the lens switching device determines the lens to be switched according to the user's selection, so the user's shooting requirements can be met.
In a possible design, the lens switching device may issue a switching prompt while switching the lens to be switched to the position corresponding to the target image sensor. The switching prompt presents the lens switching process more vividly and improves the user's operating experience.
In one possible design, the switching cue includes an audio cue and/or a vibration cue.
In a second aspect, an embodiment of the present application provides a lens switching device, where the device has the function of implementing the lens switching method provided in the first aspect. The function can be realized by hardware, or by hardware executing corresponding software. The hardware or software includes one or more modules corresponding to the function described above.
In a third aspect, an embodiment of the present application provides a terminal device, where the terminal device includes a processor and a memory; the memory is configured to store a computer program, and the processor executes the computer program stored in the memory to cause the terminal device to perform the method of the first aspect or any of the possible implementations of the first aspect.
In a fourth aspect, an embodiment of the present application provides a readable storage medium comprising a program or instructions that, when run on a computer, cause the computer to perform the method of the first aspect or any one of its possible implementations.
The system-on-chip in the above aspects may be a system on chip (SoC), a baseband chip, or the like, where the baseband chip may include a processor, a channel encoder, a digital signal processor, a modem, an interface module, and the like.
Drawings
To illustrate the technical solutions in the embodiments of the present invention more clearly, the drawings needed in the embodiments are briefly described below. Obviously, the drawings in the following description show only some embodiments of the present invention, and those skilled in the art can derive other drawings from them without creative effort.
Fig. 1 is a schematic view of a camera module provided in the prior art;
fig. 2a is a schematic view of a camera module according to an embodiment of the present disclosure;
fig. 2b is a schematic view of another camera module provided in this embodiment of the present application;
fig. 2c is a schematic view of another camera module provided in this embodiment of the present application;
fig. 3a is a schematic view of another camera module according to an embodiment of the present disclosure;
fig. 3b is a schematic view of another camera module according to an embodiment of the present application;
fig. 4a is a schematic view of another camera module provided in this embodiment of the present application;
fig. 4b is a schematic view of another camera module provided in this embodiment of the present application;
fig. 5a is a corresponding relationship diagram of a lens and an image sensor provided in the prior art;
fig. 5b is a corresponding relationship diagram of a lens and an image sensor provided in an embodiment of the present application;
fig. 6 is a schematic structural diagram of a terminal device according to an embodiment of the present application;
fig. 7a is a schematic structural diagram of a software system and a hardware layer of a terminal device according to an embodiment of the present application;
fig. 7b is a schematic diagram of a software and hardware processing flow of a terminal device according to an embodiment of the present application;
fig. 8 is a schematic flowchart of a lens switching method according to an embodiment of the present application;
fig. 9 is a schematic diagram of an application scenario of a lens switching method according to an embodiment of the present application;
fig. 10 is a schematic diagram of an application scenario of another lens switching method provided in an embodiment of the present application;
fig. 11 is a schematic view of an application scenario of another lens switching method provided in an embodiment of the present application;
fig. 12 is a schematic diagram of an application scenario of another lens switching method provided in an embodiment of the present application;
fig. 13 is a schematic structural diagram of a lens switching device according to an embodiment of the present application;
fig. 14 is a schematic structural diagram of another lens switching device according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be described below with reference to the drawings in the embodiments of the present application.
With users placing ever more importance on camera functions, current terminal devices usually adopt a multi-camera system design to meet the shooting requirements of different scenes. For example, when the shooting scene requires a wider field of view, a wide-angle camera is used; when the shooting target is far away, a telephoto camera is used; and when shooting a close-up, a macro camera is used, all in pursuit of the best shooting experience. However, an existing terminal device needs to be equipped with multiple independent cameras to adapt to different scenes; that is, it needs multiple lenses and a corresponding number of image sensors. As shown in fig. 1, the terminal device includes three independent cameras, each comprising one lens and a corresponding image sensor. The multi-camera scheme shown in fig. 1 increases the manufacturing cost of the terminal device and scales poorly: when none of the existing cameras can meet the requirements of a new shooting scene, another camera must be added.
To solve the above problem, embodiments of the present application provide a lens switching method and apparatus, where the lens switching apparatus may be a terminal device or an apparatus used in a terminal device. The lens switching method decouples the fixed correspondence between the image sensors and the lenses in the camera module and rotates the lenses, thereby allowing M lenses and N image sensors to be combined freely. When M is larger than N, the number of image sensors is reduced, and with it the cost of the terminal device. The lens switching apparatus includes a camera module with M lenses, N image sensors and a motor, where M is a positive integer greater than or equal to 2, N is a positive integer greater than or equal to 1, and M is greater than N; that is, the camera module may contain fewer image sensors than lenses. According to the user's shooting requirements, the lens switching apparatus switches the lens to be switched to a designated position, where it is coupled with an image sensor, meeting the user's shooting requirements while reducing the manufacturing cost of the terminal device.
In an example, a camera module provided in an embodiment of the present application is shown in figs. 2a, 2b, and 2c, and includes 3 lenses and 1 image sensor, that is, M = 3 and N = 1. In this embodiment, the position of the image sensor is fixed, and the 3 lenses can be rotated by the motor to switch lenses; when a lens rotates to the position corresponding to the image sensor, the optical axes are aligned and the lens is coupled with the image sensor. In one possible implementation, the included angles between the 3 lenses remain constant; each lens can rotate clockwise or counterclockwise through at most 90 degrees. Figs. 2a and 2b show two different positional relationships between the motor and the image sensor: the motor in fig. 2a is located above the image sensor, and the motor in fig. 2b is located below it. Taking fig. 2a as an example, lens 2 may rotate counterclockwise to the position corresponding to the image sensor, with lens 1 and lens 3 rotating counterclockwise along with it. In another possible implementation, the included angles between the 3 lenses may vary. As shown in fig. 2c, each lens can rotate through a full 360 degrees, so lens 2 can rotate clockwise to the position corresponding to the image sensor while the positions of lens 1 and lens 3 remain unchanged. It can be understood that the lenses shown in this embodiment can be placed independently and in parallel, without affecting the thickness of the whole device.
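The first implementation above, in which all lenses keep their included angles and turn together as one assembly, can be sketched as follows. This is an illustrative model only: the function name and the degree-based counterclockwise-positive convention are assumptions, the shorter direction is chosen as in the earlier design, and the 90-degree limit from the example is not enforced here.

```python
def rotate_assembly(lens_angles, lens_index, sensor_angle):
    """Rotate the whole lens assembly (included angles fixed) so that the
    lens at `lens_index` lands on the sensor position, turning in the
    direction with the smaller angle. Angles are in degrees, measured
    counterclockwise-positive; returns the new angle of every lens."""
    counterclockwise = (sensor_angle - lens_angles[lens_index]) % 360
    clockwise = (lens_angles[lens_index] - sensor_angle) % 360
    # Signed rotation applied identically to every lens in the assembly.
    delta = counterclockwise if counterclockwise <= clockwise else -clockwise
    return [(angle + delta) % 360 for angle in lens_angles]
```

With lenses at 90, 210 and 330 degrees and the sensor at 270 degrees, switching the middle lens (index 1) rotates the assembly 60 degrees counterclockwise, moving the lenses to 150, 270 and 30 degrees.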
In an example, another camera module provided in an embodiment of the present application is shown in figs. 3a and 3b; the camera module includes 5 lenses and 2 image sensors, that is, M = 5 and N = 2. In this example, the positions of the image sensors are fixed, the 5 lenses can be rotated by a motor to switch lenses, and each lens can rotate through a full 360 degrees. In one possible implementation, the included angles between the 5 lenses remain constant. For example, if the image sensor corresponding to lens 2 is sensor 1 and the image sensor corresponding to lens 3 is sensor 2, lens 2 can rotate clockwise to the position corresponding to sensor 1, lens 3 can rotate clockwise to the position corresponding to sensor 2, and the other lenses rotate correspondingly, as shown in fig. 3a. In another possible implementation, the included angles between the 5 lenses may vary. For example, lens 2 may rotate clockwise to the position corresponding to sensor 1 and lens 3 may rotate clockwise to the position corresponding to sensor 2 while the positions of the other lenses remain unchanged, as shown in fig. 3b. It can be understood that the camera module provided by this embodiment can support N image sensors simultaneously matched with lenses to obtain images and implement corresponding functions (such as a picture-in-picture function).
It should be noted that the lenses and the image sensor in the camera module provided in this embodiment may also be arranged in a row or a column. For example, the camera module includes 2 lenses and one image sensor arranged in a row (e.g., from left to right), as shown in fig. 4a. When the camera module switches lenses, each lens, or only the lens to be switched, moves horizontally left or right to the position corresponding to the image sensor; for example, lens 1 in fig. 4a can move horizontally to the right to the position corresponding to the image sensor. As another example, the camera module includes 3 lenses and an image sensor arranged in a column (e.g., from top to bottom), as shown in fig. 4b. When the camera module switches lenses, each lens, or only the lens to be switched, moves vertically up or down to the position corresponding to the image sensor; for example, lens 1 in fig. 4b can move vertically upward to the position corresponding to the image sensor.
It should be noted that the camera module provided in this embodiment enables dynamic matching between lenses and image sensors, and the number of image sensors can be smaller than the number of lenses, reducing the device cost. Moreover, with fewer image sensors, the terminal device can choose to install image sensors with higher resolution, so that different lenses can all be matched with a higher-resolution image sensor, improving the shooting quality of the camera. For example, in the multi-camera scheme shown in fig. 1, since each camera is independent, a high-resolution image sensor may not be matched to every lens once the manufacturing cost of the terminal device is taken into account. As shown in fig. 5a, the image sensor corresponding to the mid-focus lens has the highest resolution, the image sensor corresponding to the wide-angle lens the second highest, and the image sensor corresponding to the telephoto lens the lowest. The camera module shown in figs. 2a to 2c includes 3 lenses and 1 image sensor, where the image sensor may be a high-resolution one; when any one of the lenses is switched to the position corresponding to the image sensor, it can be coupled with the image sensor to realize the shooting function. As shown in fig. 5b, all 3 lenses can be matched with that image sensor, which helps improve the shooting quality of the camera.
The terminal device described in this embodiment may be a device with a wireless transceiving function, and may be deployed on land, including indoors or outdoors, handheld, wearable, or vehicle-mounted; it may also be deployed on the water surface (e.g., on a ship); and it may also be deployed in the air (e.g., on airplanes, balloons, or satellites). The terminal device may be a mobile phone, a tablet computer (Pad), a computer with a wireless transceiving function, a virtual reality (VR) terminal device, an augmented reality (AR) terminal device, a wireless terminal in industrial control, a vehicle-mounted terminal device, a wireless terminal in self driving, a wireless terminal in remote medical care, a wireless terminal in a smart grid, a wireless terminal in transportation safety, a wireless terminal in a smart city, a wireless terminal in a smart home, a wearable terminal device, and so on. The embodiments of the present application do not limit the application scenario. The terminal device may also be referred to as a user equipment (UE), an access terminal device, a vehicle-mounted terminal, an industrial control terminal, a UE unit, a UE station, a mobile station, a remote terminal device, a mobile device, a UE terminal device, a wireless communication device, a UE agent, or a UE apparatus. The terminal device may be fixed or mobile. The terminal device of the present application may also be an on-board module, on-board component, on-board chip, or on-board unit built into a vehicle as one or more components or units, and the vehicle may implement the method of the present application through the built-in on-board module, on-board component, on-board chip, or on-board unit.
For convenience of description, in the following embodiments, a terminal device is taken as an example for description.
Referring to fig. 6, the terminal device 600 may include a processor 610, an external memory interface 620, an internal memory 621, a Universal Serial Bus (USB) interface 630, a charging management module 640, a power management module 641, a battery 642, an antenna 1, an antenna 2, a mobile communication module 650, a wireless communication module 660, an audio module 670, a speaker 670A, a receiver 670B, a microphone 670C, an earphone interface 670D, a sensor module 680, a button 690, a motor 691, an indicator 692, a camera 693, a display 694, and a Subscriber Identity Module (SIM) card interface 695. The sensor module 680 may include a pressure sensor 680A, a gyroscope sensor 680B, an air pressure sensor 680C, a magnetic sensor 680D, an acceleration sensor 680E, a distance sensor 680F, a proximity light sensor 680G, a fingerprint sensor 680H, a temperature sensor 680J, a touch sensor 680K, an ambient light sensor 680L, a bone conduction sensor 680M, and the like.
It is to be understood that the illustrated structure of the embodiment of the present invention does not specifically limit the terminal device 600. In other embodiments of the present application, terminal device 600 may include more or fewer components than shown, or some components may be combined, some components may be split, or a different arrangement of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
Processor 610 may include one or more processing units, such as: the processor 610 may include an Application Processor (AP), a modem processor, a Graphics Processing Unit (GPU), an Image Signal Processor (ISP), a controller, a video codec, a Digital Signal Processor (DSP), a baseband processor, and/or a neural-Network Processing Unit (NPU), etc. The different processing units may be separate devices or may be integrated into one or more processors.
The controller can generate an operation control signal according to the instruction operation code and the timing signal to complete the control of instruction fetching and instruction execution.
A memory may also be provided in the processor 610 for storing instructions and data. In one embodiment, the memory in the processor 610 is a cache. The memory may hold instructions or data that the processor 610 has just used or uses cyclically. If the processor 610 needs the instructions or data again, it can fetch them directly from this memory, avoiding repeated accesses and reducing the waiting time of the processor 610, thereby improving the efficiency of the system.
In one embodiment, the processor 610 may include one or more interfaces. The interface may include an integrated circuit (I2C) interface, an integrated circuit built-in audio (I2S) interface, a Pulse Code Modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a Mobile Industry Processor Interface (MIPI), a general-purpose input/output (GPIO) interface, a Subscriber Identity Module (SIM) interface, and/or a Universal Serial Bus (USB) interface, etc.
The I2C interface is a bidirectional synchronous serial bus including a serial data line (SDA) and a Serial Clock Line (SCL). In one embodiment, processor 610 may include multiple sets of I2C buses. The processor 610 may be coupled to the touch sensor 680K, the charger, the flash, the camera 693, etc. through different I2C bus interfaces. For example: the processor 610 may be coupled to the touch sensor 680K through an I2C interface, so that the processor 610 and the touch sensor 680K communicate through an I2C bus interface to implement the touch function of the terminal device 600.
The I2S interface may be used for audio communication. In one embodiment, processor 610 may include multiple sets of I2S buses. The processor 610 may be coupled to the audio module 670 by an I2S bus, enabling communication between the processor 610 and the audio module 670. In one embodiment, the audio module 670 may transmit an audio signal to the wireless communication module 660 through an I2S interface, so as to implement a function of answering a call through a bluetooth headset.
The PCM interface may also be used for audio communication, sampling, quantizing and encoding analog signals. In one embodiment, the audio module 670 and the wireless communication module 660 may be coupled by a PCM bus interface. In one embodiment, the audio module 670 may also transmit audio signals to the wireless communication module 660 through the PCM interface, so as to implement a function of answering a call through a bluetooth headset. Both the I2S interface and the PCM interface may be used for audio communication.
The UART interface is a universal serial data bus used for asynchronous communication. The bus may be a bidirectional communication bus that converts the data to be transmitted between serial and parallel forms. In one embodiment, a UART interface is generally used to connect the processor 610 and the wireless communication module 660. For example, the processor 610 communicates with the Bluetooth module in the wireless communication module 660 through the UART interface to implement the Bluetooth function. In an embodiment, the audio module 670 may transmit an audio signal to the wireless communication module 660 through the UART interface, so as to realize the function of playing music through a Bluetooth headset.
The MIPI interface may be used to connect the processor 610 to peripheral devices such as the display screen 694 and the camera 693. MIPI interfaces include the camera serial interface (CSI), the display serial interface (DSI), and the like. In one embodiment, the processor 610 and the camera 693 communicate through a CSI interface to implement the capture function of the terminal device 600, and the processor 610 and the display screen 694 communicate through a DSI interface to implement the display function of the terminal device 600.
The GPIO interface may be configured by software. The GPIO interface may be configured as a control signal and may also be configured as a data signal. In one embodiment, a GPIO interface may be used to connect the processor 610 to the camera 693, the display 694, the wireless communication module 660, the audio module 670, the sensor module 680, and the like. The GPIO interface may also be configured as an I2C interface, I2S interface, UART interface, MIPI interface, and the like.
The USB interface 630 is an interface conforming to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type-C interface, or the like. The USB interface 630 may be used to connect a charger to charge the terminal device 600, and may also be used to transmit data between the terminal device 600 and a peripheral device. It can also be used to connect a headset and play audio through the headset. The interface may further be used to connect other terminal devices, such as AR devices.
It should be understood that the connection relationship between the modules according to the embodiment of the present invention is only an exemplary illustration, and does not limit the structure of the terminal device 600. In other embodiments of the present application, the terminal device 600 may also adopt different interface connection manners or a combination of multiple interface connection manners in the above embodiments.
The charging management module 640 is used to receive charging input from a charger. The charger can be a wireless charger or a wired charger. In one wired charging embodiment, the charging management module 640 may receive charging input from a wired charger via the USB interface 630. In one wireless charging embodiment, the charging management module 640 may receive a wireless charging input through a wireless charging coil of the terminal device 600. The charging management module 640 may also supply power to the terminal device through the power management module 641 while charging the battery 642.
The power management module 641 is configured to connect the battery 642, the charging management module 640 and the processor 610. The power management module 641 receives the input from the battery 642 and/or the charging management module 640, and supplies power to the processor 610, the internal memory 621, the display 694, the camera 693, the wireless communication module 660, and the like. The power management module 641 may also be configured to monitor battery capacity, battery cycle count, battery state of health (leakage, impedance), and other parameters. In one embodiment, the power management module 641 may also be disposed in the processor 610. In another embodiment, the power management module 641 and the charging management module 640 may be disposed in the same device.
The wireless communication function of the terminal device 600 may be implemented by the antenna 1, the antenna 2, the mobile communication module 650, the wireless communication module 660, the modem processor, the baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in terminal device 600 may be used to cover a single or multiple communication bands. Different antennas can also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed as a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 650 may provide a solution for wireless communication, including 2G/3G/4G/5G, applied on the terminal device 600. The mobile communication module 650 may include at least one filter, switch, power amplifier, low-noise amplifier (LNA), and the like. The mobile communication module 650 may receive electromagnetic waves from the antenna 1, perform processing such as filtering and amplification on the received electromagnetic waves, and transmit the result to the modem processor for demodulation. The mobile communication module 650 may also amplify the signal modulated by the modem processor and convert it into electromagnetic waves for radiation through the antenna 1. In one embodiment, at least some of the functional modules of the mobile communication module 650 may be disposed in the processor 610. In one embodiment, at least some of the functional modules of the mobile communication module 650 may be disposed in the same device as at least some of the modules of the processor 610.
The modem processor may include a modulator and a demodulator. The modulator is used for modulating a low-frequency baseband signal to be transmitted into a medium-high frequency signal. The demodulator is used for demodulating the received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then passes the demodulated low frequency baseband signal to a baseband processor for processing. The low frequency baseband signal is processed by the baseband processor and then transferred to the application processor. The application processor outputs a sound signal through an audio device (not limited to the speaker 670A, the receiver 670B, etc.) or displays an image or video through the display screen 694. In one embodiment, the modem processor may be a stand-alone device. In other embodiments, the modem processor may be separate from the processor 610 and may be located in the same device as the mobile communication module 650 or other functional modules.
The wireless communication module 660 may provide a solution for wireless communication applied to the terminal device 600, including wireless local area networks (WLANs) (e.g., wireless fidelity (Wi-Fi) networks), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), infrared (IR), and the like. The wireless communication module 660 may be one or more devices integrating at least one communication processing module. The wireless communication module 660 receives electromagnetic waves via the antenna 2, performs frequency modulation and filtering on the electromagnetic wave signals, and transmits the processed signals to the processor 610. The wireless communication module 660 may also receive a signal to be transmitted from the processor 610, perform frequency modulation and amplification on the signal, and convert the signal into electromagnetic waves for radiation through the antenna 2.
In one embodiment, the antenna 1 of the terminal device 600 is coupled to the mobile communication module 650 and the antenna 2 is coupled to the wireless communication module 660, so that the terminal device 600 can communicate with the network and other devices through wireless communication technologies. The wireless communication technologies may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division code division multiple access (TD-SCDMA), long term evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technologies. The GNSS may include the global positioning system (GPS), the global navigation satellite system (GLONASS), the BeiDou navigation satellite system (BDS), the quasi-zenith satellite system (QZSS), and/or a satellite-based augmentation system (SBAS).
The terminal device 600 implements the display function through the GPU, the display screen 694, and the application processor. The GPU is a microprocessor for image processing, connected to the display screen 694 and an application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. Processor 610 may include one or more GPUs that execute program instructions to generate or alter display information.
The display screen 694 is used for displaying images, videos, and the like. The display screen 694 includes a display panel, and may specifically be a foldable screen or a special-shaped screen. The display panel may adopt a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Micro LED, a quantum dot light-emitting diode (QLED), and the like. In one embodiment, the terminal device 600 may include 1 or N display screens 694, where N is a positive integer greater than 1.
The terminal device 600 may implement a shooting function through the ISP, the camera module 693, the video codec, the GPU, the display screen 694, and the application processor, wherein the camera module 693 may include a lens and an image sensor.
For example, when a picture is taken, the shutter is opened, light is transmitted to the image sensor through the lens, and the image sensor converts the optical signal into an electrical signal and transmits it to the ISP for processing, converting it into an image visible to the naked eye. The ISP can also perform algorithm optimization on image noise, brightness, and skin tone. The ISP can also optimize parameters such as exposure and color temperature of a shooting scene.
The lens is used for capturing a static image or a video. An object generates an optical image through the lens, and the optical image is projected onto the image sensor, which may be a charge-coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor. The image sensor converts the optical signal into an electrical signal and transmits it to the ISP, which converts it into a digital image signal. The ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into an image signal in a standard format such as RGB or YUV.
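As an illustrative sketch only (not part of the claimed embodiment), the final DSP conversion step between RGB and YUV can be modeled with the standard BT.601 color-space equations; the function name and the full-range coefficients below are assumptions chosen for illustration:

```python
def rgb_to_yuv(r, g, b):
    """Convert one full-range RGB pixel (0-255) to YUV using BT.601 coefficients."""
    y = 0.299 * r + 0.587 * g + 0.114 * b       # luma
    u = -0.14713 * r - 0.28886 * g + 0.436 * b  # blue-difference chroma
    v = 0.615 * r - 0.51499 * g - 0.10001 * b   # red-difference chroma
    return y, u, v
```

For a pure white pixel (255, 255, 255), the luma is 255 and both chroma components are approximately zero, which matches the expectation that a gray-scale pixel carries no color information.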
The digital signal processor is used for processing digital signals; it can process digital image signals as well as other digital signals. For example, when the terminal device 600 selects a frequency point, the digital signal processor is used to perform a Fourier transform or the like on the frequency point energy.
Video codecs are used to compress or decompress digital video. The terminal device 600 may support one or more video codecs. In this way, the terminal device 600 can play or record videos in a plurality of encoding formats, such as: moving Picture Experts Group (MPEG) 1, MPEG2, MPEG3, MPEG4, and the like.
The NPU is a neural-network (NN) computing processor. By referring to the structure of biological neural networks, for example the transfer mode between neurons of a human brain, it processes input information quickly and can also perform continuous self-learning. The NPU can implement applications such as intelligent recognition for the terminal device 600, for example: image recognition, face recognition, speech recognition, text understanding, and the like.
The external memory interface 620 may be used to connect an external memory card, such as a Micro SD card, to extend the memory capability of the terminal device 600. The external memory card communicates with the processor 610 through the external memory interface 620 to implement a data storage function. For example, files such as music, video, etc. are saved in the external memory card.
Internal memory 621 may be used to store computer-executable program code, which includes instructions. The internal memory 621 may include a program storage area and a data storage area. The program storage area may store an operating system, an application program required by at least one function (such as a sound playing function or an image playing function), and the like. The data storage area may store data created during use of the terminal device 600 (such as audio data and a phonebook), and the like. In addition, the internal memory 621 may include a high-speed random access memory, and may further include a nonvolatile memory, such as at least one magnetic disk storage device, a flash memory device, or a universal flash storage (UFS). The processor 610 executes the various functional applications and data processing of the terminal device 600 by running instructions stored in the internal memory 621 and/or instructions stored in a memory disposed in the processor.
The terminal device 600 may implement an audio function through the audio module 670, the speaker 670A, the receiver 670B, the microphone 670C, the earphone interface 670D, and the application processor, etc. Such as music playing, recording, etc.
The audio module 670 is used to convert digital audio information into an analog audio signal output and also used to convert an analog audio input into a digital audio signal. The audio module 670 may also be used to encode and decode audio signals. In one embodiment, the audio module 670 may be disposed in the processor 610, or some functional modules of the audio module 670 may be disposed in the processor 610.
The speaker 670A, also known as a "horn", is used to convert electrical audio signals into acoustic signals. The terminal apparatus 600 can listen to music or listen to a handsfree call through the speaker 670A.
The receiver 670B, also called "earpiece", is used to convert the electrical audio signal into an acoustic signal. When the terminal apparatus 600 answers a call or voice information, it is possible to answer voice by bringing the receiver 670B close to the human ear.
The microphone 670C, also called a "mike", is used to convert acoustic signals into electrical signals. When making a call or sending voice information, the user can input a sound signal into the microphone 670C by speaking close to it. The terminal device 600 may be provided with at least one microphone 670C. In other embodiments, the terminal device 600 may be provided with two microphones 670C to implement a noise reduction function in addition to collecting sound signals. In other embodiments, the terminal device 600 may further be provided with three, four, or more microphones 670C to collect sound signals, reduce noise, identify sound sources, implement a directional recording function, and the like.
The earphone interface 670D is used to connect a wired earphone. The earphone interface 670D may be the USB interface 630, or may be an Open Mobile Terminal Platform (OMTP) standard interface of 3.5mm, or a cellular telecommunications industry association (cellular telecommunications industry association of the USA, CTIA) standard interface.
The pressure sensor 680A is used for sensing a pressure signal and converting the pressure signal into an electrical signal. In one embodiment, the pressure sensor 680A may be disposed on the display screen 694. Pressure sensors 680A come in a wide variety, such as resistive pressure sensors, inductive pressure sensors, capacitive pressure sensors, and the like. A capacitive pressure sensor may comprise at least two parallel plates of electrically conductive material. When a force acts on the pressure sensor 680A, the capacitance between the electrodes changes, and the terminal device 600 determines the intensity of the pressure from the change in capacitance. When a touch operation is applied to the display screen 694, the terminal device 600 detects the intensity of the touch operation based on the pressure sensor 680A. The terminal device 600 may also calculate the touched position from the detection signal of the pressure sensor 680A. In one embodiment, touch operations that are applied to the same touch position but with different intensities may correspond to different operation instructions. For example: when a touch operation whose intensity is less than a first pressure threshold acts on the short message application icon, an instruction for viewing the short message is executed; when a touch operation whose intensity is greater than or equal to the first pressure threshold acts on the short message application icon, an instruction for creating a new short message is executed.
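The threshold behavior described above can be sketched as a simple mapping; the threshold value and the instruction names below are hypothetical, chosen only to illustrate the two branches:

```python
FIRST_PRESSURE_THRESHOLD = 0.5  # hypothetical normalized touch intensity

def sms_icon_instruction(touch_intensity):
    """Map a touch on the short message icon to an operation instruction by intensity."""
    if touch_intensity < FIRST_PRESSURE_THRESHOLD:
        return "view_sms"  # lighter press: view the short message
    return "new_sms"       # firmer press: create a new short message
```

Note that the "greater than or equal to" comparison in the text places a touch exactly at the threshold on the new-message branch.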
The gyro sensor 680B may be used to determine the motion attitude of the terminal device 600. In one embodiment, the angular velocities of the terminal device 600 about three axes (i.e., the x, y, and z axes) may be determined by the gyro sensor 680B. The gyro sensor 680B may be used for image stabilization (anti-shake) during shooting. Illustratively, when the shutter is pressed, the gyro sensor 680B detects the shake angle of the terminal device 600, calculates the distance the lens module needs to compensate according to the shake angle, and lets the lens counteract the shake of the terminal device 600 through a reverse movement, thereby achieving anti-shake. The gyro sensor 680B may also be used in navigation and somatosensory gaming scenarios.
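The compensation distance mentioned above can be approximated with simple geometry: for a detected shake angle θ and a lens of focal length f, the required lens displacement is roughly f·tan(θ). A minimal sketch, with a hypothetical function name and units:

```python
import math

def compensation_distance(focal_length_mm, shake_angle_deg):
    """Lens displacement (mm) needed to offset a detected shake angle: d = f * tan(theta)."""
    return focal_length_mm * math.tan(math.radians(shake_angle_deg))
```

For the small angles typical of hand shake, tan(θ) ≈ θ (in radians), so the displacement grows nearly linearly with the shake angle.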
The air pressure sensor 680C is used to measure air pressure. In one embodiment, the terminal device 600 calculates altitude, aiding positioning and navigation, from the barometric pressure value measured by the barometric pressure sensor 680C.
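The altitude calculation can be sketched with the international barometric formula; the constants below are the standard-atmosphere values, and the function name is an assumption:

```python
def altitude_from_pressure(pressure_hpa, sea_level_hpa=1013.25):
    """Estimate altitude in meters from barometric pressure (hPa)."""
    # International barometric formula with standard-atmosphere constants.
    return 44330.0 * (1.0 - (pressure_hpa / sea_level_hpa) ** (1.0 / 5.255))
```

At the standard sea-level pressure of 1013.25 hPa the estimate is 0 m, and a reading of 900 hPa corresponds to roughly 1 km of altitude, which is the kind of coarse estimate useful for aiding positioning and navigation.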
The magnetic sensor 680D includes a Hall sensor. The terminal device 600 may detect the opening and closing of a flip holster using the magnetic sensor 680D. In one embodiment, when the terminal device 600 is a flip phone, the terminal device 600 may detect the opening and closing of the flip cover according to the magnetic sensor 680D. Features such as automatic unlocking upon flip-open can then be set according to the detected open or closed state of the holster or the flip cover.
The acceleration sensor 680E may detect the magnitude of acceleration of the terminal device 600 in various directions (generally, three axes). The magnitude and direction of gravity can be detected when the terminal device 600 is stationary. The sensor can also be used to recognize the posture of the terminal device, and is applied in landscape/portrait switching, pedometers, and other applications.
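The landscape/portrait recognition mentioned above can be sketched by comparing the gravity components measured along the device's x and y axes; the function name and the simple tie-breaking rule are assumptions for illustration:

```python
def screen_orientation(ax, ay):
    """Infer orientation from gravity components (m/s^2) on the device's x and y axes."""
    # Gravity mostly along y -> device held upright (portrait);
    # gravity mostly along x -> device turned on its side (landscape).
    return "portrait" if abs(ay) >= abs(ax) else "landscape"
```

A real implementation would also debounce near the 45° boundary to avoid flickering between orientations.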
A distance sensor 680F is used for measuring distance. The terminal device 600 may measure the distance by infrared or laser. In one embodiment, in a shooting scenario, the terminal device 600 may use the distance sensor 680F to measure distance for fast focusing.
The proximity light sensor 680G may include, for example, a light-emitting diode (LED) and a light detector, such as a photodiode. The light-emitting diode may be an infrared light-emitting diode. The terminal device 600 emits infrared light outward through the light-emitting diode and detects infrared light reflected from a nearby object using the photodiode. When sufficient reflected light is detected, it can be determined that there is an object near the terminal device 600; when insufficient reflected light is detected, the terminal device 600 can determine that there is no object nearby. The terminal device 600 can use the proximity light sensor 680G to detect that the user is holding the terminal device 600 close to the ear during a call, so as to automatically turn off the screen and save power. The proximity light sensor 680G can also be used in holster mode and pocket mode to automatically unlock and lock the screen.
The ambient light sensor 680L is used to sense the ambient light level. The terminal device 600 may adaptively adjust the brightness of the display 694 according to the perceived ambient light level. The ambient light sensor 680L may also be used to automatically adjust the white balance when taking a picture. The ambient light sensor 680L may also cooperate with the proximity light sensor 680G to detect whether the terminal device 600 is in a pocket, in order to prevent accidental touches.
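The adaptive brightness adjustment can be sketched as a clamped linear mapping from sensed illuminance to panel brightness; all numeric limits below are hypothetical, chosen only to make the mapping concrete:

```python
def adjust_brightness(lux, min_nit=2.0, max_nit=500.0, max_lux=1000.0):
    """Map sensed ambient illuminance (lux) to a display brightness (nit)."""
    lux = max(0.0, min(lux, max_lux))  # clamp the sensor reading to the mapped range
    return min_nit + (max_nit - min_nit) * lux / max_lux
```

In darkness the panel stays at the dim floor, and above the clamp limit it holds the maximum brightness; production implementations typically use a non-linear curve tuned to human brightness perception rather than this linear sketch.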
The fingerprint sensor 680H is used to collect a fingerprint. The terminal device 600 can utilize the collected fingerprint characteristics to realize fingerprint unlocking, access to an application lock, fingerprint photographing, fingerprint incoming call answering and the like.
The temperature sensor 680J is used to detect temperature. In one embodiment, the terminal device 600 implements a temperature handling strategy using the temperature detected by the temperature sensor 680J. For example, when the temperature reported by the temperature sensor 680J exceeds a threshold, the terminal device 600 reduces the performance of a processor located near the temperature sensor 680J, so as to reduce power consumption and implement thermal protection. In other embodiments, the terminal device 600 heats the battery 642 when the temperature is below another threshold, to avoid an abnormal shutdown of the terminal device 600 caused by low temperature. In other embodiments, when the temperature is below a further threshold, the terminal device 600 boosts the output voltage of the battery 642 to avoid an abnormal shutdown caused by low temperature.
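The three-threshold strategy above can be sketched as a lookup; the specific temperature values below are hypothetical, since the embodiment does not disclose them:

```python
def thermal_action(temp_c, high=45.0, low=0.0, very_low=-10.0):
    """Return the handling strategy for a reported temperature (thresholds hypothetical)."""
    if temp_c > high:
        return "reduce_processor_performance"  # thermal protection, lower power consumption
    if temp_c < very_low:
        return "boost_battery_output_voltage"  # avoid abnormal shutdown at very low temperature
    if temp_c < low:
        return "heat_battery"                  # avoid abnormal shutdown at low temperature
    return "normal"
```

The more extreme threshold is checked first so that a very cold reading triggers voltage boosting rather than just battery heating.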
Touch sensor 680K is also referred to as a "touch device". The touch sensor 680K may be disposed on the display screen 694, and the touch sensor 680K and the display screen 694 form a touch screen, also called a "touch panel". The touch sensor 680K is used to detect a touch operation acting on or near it. The touch sensor may pass the detected touch operation to the application processor to determine the touch event type. Visual output associated with the touch operation may be provided via the display screen 694. In other embodiments, the touch sensor 680K may be disposed on the surface of the terminal device 600, at a position different from that of the display screen 694.
The bone conduction sensor 680M may acquire a vibration signal. In one embodiment, the bone conduction sensor 680M can acquire a vibration signal of the vibrating bone mass of a human vocal part. The bone conduction sensor 680M can also contact the human pulse to receive a blood pressure beating signal. In one embodiment, the bone conduction sensor 680M may also be disposed in a headset, integrated into a bone conduction headset. The audio module 670 may parse out a voice signal based on the vibration signal of the vocal-part bone mass acquired by the bone conduction sensor 680M, so as to implement a voice function. The application processor may parse out heart rate information based on the blood pressure beating signal acquired by the bone conduction sensor 680M, so as to implement a heart rate detection function.
The keys 690 include a power-on key, a volume key, and the like. The keys 690 may be mechanical keys. Or may be touch keys. The terminal apparatus 600 may receive a key input, and generate a key signal input related to user setting and function control of the terminal apparatus 600.
The motor 691 may produce a vibration indication. Motor 691 can be used for incoming call vibration prompting, as well as for touch vibration feedback. For example, touch operations applied to different applications (e.g., photographing, audio playing, etc.) may correspond to different vibration feedback effects. The motor 691 may also respond to different vibration feedback effects for touch operations applied to different areas of the display screen 694. Different application scenes (such as time reminding, receiving information, alarm clock, game and the like) can also correspond to different vibration feedback effects. The touch vibration feedback effect may also support customization.
Indicator 692 may be an indicator light that may be used to indicate a state of charge, a change in charge, or may be used to indicate a message, a missed call, a notification, etc.
The SIM card interface 695 is used for connecting a SIM card. A SIM card can be attached to or detached from the terminal device 600 by being inserted into or pulled out of the SIM card interface 695. The terminal device 600 may support 1 or N SIM card interfaces, where N is a positive integer greater than 1. The SIM card interface 695 can support a Nano SIM card, a Micro SIM card, a SIM card, and the like. Multiple cards can be inserted into the same SIM card interface 695 at the same time; the types of the cards may be the same or different. The SIM card interface 695 may also be compatible with different types of SIM cards, and may further be compatible with an external memory card. The terminal device 600 interacts with the network through the SIM card to implement functions such as calls and data communication. In one embodiment, the terminal device 600 employs an eSIM, namely an embedded SIM card. The eSIM card can be embedded in the terminal device 600 and cannot be separated from it.
Referring to fig. 7a, the software system of the terminal device in this embodiment may adopt a layered architecture, which includes, from top to bottom, an application layer, an application framework layer, and a kernel layer.
The application layer may include a series of application packages. As shown in fig. 7a, the application package may include camera, gallery, calendar, call, map, navigation, WLAN, bluetooth, music, video, short message, etc. applications.
The application framework layer provides an Application Programming Interface (API) and a programming framework for the application program of the application layer. The application framework layer includes a number of predefined functions. As shown in FIG. 7a, the application framework layer may include a window manager, content provider, view system, phone manager, resource manager, notification manager, system application program interface, and the like.
The window manager is used for managing window programs. The window manager can obtain the size of the display screen, determine whether there is a status bar, lock the screen, take screenshots, and the like.
The content provider is used to store and retrieve data and make it accessible to applications. The data may include video, images, audio, calls made and received, browsing history and bookmarks, phone books, etc.
The view system includes visual controls such as controls to display text, controls to display pictures, and the like. The view system may be used to build applications. The display interface may be composed of one or more views. For example, the display interface including the short message notification icon may include a view for displaying text and a view for displaying pictures.
The phone manager is used for providing the communication function of the terminal device, for example, management of call status (including connected, hung up, and the like).
The resource manager provides various resources for the application, such as localized strings, icons, pictures, layout files, video files, and the like.
The notification manager enables an application to display notification information in the status bar, and can be used to convey notification-type messages that disappear automatically after a short stay without requiring user interaction. For example, the notification manager is used to notify download completion, message alerts, and the like. The notification manager may also present notifications in the form of a chart or scroll-bar text in the status bar at the top of the system, such as a notification of an application running in the background, or on the screen in the form of a dialog window. For example, text information is prompted in the status bar, an alert tone is played, the terminal device vibrates, or an indicator light blinks.
A system Application Program Interface (API), also called an application programming interface, is a set of definitions, procedures and protocols through which computer software can communicate with each other. One of the primary functions of an API is to provide a common set of functions. The API is also a middleware and provides data sharing for various platforms.
The application layer and the application framework layer run in a virtual machine. The virtual machine executes the java files of the application layer and the application framework layer as binary files. The virtual machine is used to perform functions such as object life cycle management, stack management, thread management, security and exception management, and garbage collection.
The kernel layer is a layer between hardware and software. The kernel layer may include a lens rotation driver, an image data driver, a display driver, a sensor driver, an audio driver, a motor driver, and the like. Through these drivers, the corresponding hardware devices can be driven to work normally; for example, the lens rotation driver drives a hardware-layer motor or a manual trigger switch to complete lens switching.
The hardware layer in this embodiment may include various devices, such as a motor, a manual trigger switch, a motor device, an audio device, an image sensor, and a lens. Each device performs its corresponding function; for example, the shooting function can be implemented after the lens and the image sensor are locked and recombined, and the audio device can play an alert tone.
With reference to the above description of the software system and the hardware layer, a detailed description is given of a software and hardware processing flow of the lens switching method in the terminal device, as shown in fig. 7b, which may specifically include the following steps:
according to the shooting requirements of users, a camera application program of an application layer can call a camera equipment control module to issue a lens switching control flow;
the application program framework layer receives a lens switching control flow issued by the application layer, and transmits the lens switching control flow to the kernel layer by calling an API (application program interface);
a lens switching driver of the kernel layer receives the lens switching control flow and generates a switching control signal to control a motor or a manual trigger switch to rotate the lens;
and a motor or a manual trigger switch of the hardware layer receives the switching control signal, and the motor or the manual trigger switch starts to operate to rotate the lens to be switched to the position corresponding to the target image sensor.
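The four-step flow above can be sketched as a single pass through the layers; the dictionary keys, the signal format, and the callback below are illustrative assumptions, with the hardware motor (or manual trigger switch) represented by a callable:

```python
def switch_lens(target_lens_id, rotate_lens):
    """One pass through the layered flow: application -> framework -> kernel -> hardware."""
    # Application layer: the camera app issues a lens switching control stream.
    control_stream = {"cmd": "switch_lens", "target": target_lens_id}
    # Application framework layer: the control stream is passed down via the camera device API.
    api_call = {"api": "camera_device", "payload": control_stream}
    # Kernel layer: the lens switching driver turns it into a switching control signal.
    control_signal = ("ROTATE_TO", api_call["payload"]["target"])
    # Hardware layer: the motor or manual trigger switch rotates the lens into position.
    return rotate_lens(control_signal)
```

Modeling the hardware as an injected callable keeps the sketch testable: a fake motor can record the control signal it receives and report the final lens position.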
The camera application program comprises a camera device control module and a camera data stream module; the camera device control module is used for controlling the camera device, for example switching it on or off and switching lenses, and the camera data stream module is used for receiving image data. The application framework layer comprises a camera device API and a data stream API, which are used for transmitting the control flow or the data flow. The kernel layer comprises a lens switching driver and an image data driver; the lens switching driver is used to generate a control signal to drive the motor to implement lens switching, and the image data driver is used to drive the camera module to acquire image data. The hardware layer comprises a motor, a manual trigger switch, a lens, and an image sensor; the lens and the image sensor are used for imaging after being recombined.
In one example, an audio driver and/or a motor driver of the kernel layer may also receive the shot-switching control stream, where the audio driver may generate an audio control signal according to the shot-switching control stream to control an audio device to play an alert tone; the motor drive may generate a vibration control signal according to the lens switching control flow to control vibration of the motor. Correspondingly, the hardware layer also comprises an audio device and/or a motor device.
Correspondingly, after the lens switching is completed, the image sensor can re-image with the matched lens to generate a data stream, and transmit the data stream to the image data driver of the kernel layer; the image data driver then transmits the data stream, through the data stream API of the application framework layer, to the camera data stream module of the application layer, thereby providing shooting data for the camera application.
The following describes in detail the steps of the lens switching method provided in the embodiment of the present application, please refer to fig. 8, where the method specifically includes the following steps:
S801: the lens switching device determines a lens to be switched, where the lens to be switched is a lens among the M lenses.
The lens switching device can select a lens in the camera module according to the user's shooting requirement; the selected lens is the lens to be switched, and there may be one or more lenses to be switched. For example, when the user needs to shoot a scene with a wide field of view (for example, a broad landscape), the lens switching device determines that the lens to be switched is the wide-angle lens; when the user needs to shoot a close-up (for example, a character close-up), the lens switching device determines that the lens to be switched is the macro lens.
In one example, each lens in the camera module may correspond to a lens identifier. Specifically, determining the lens to be switched may be determining the lens identifier of the lens to be switched, where the lens identifier corresponding to each lens may be preset by the lens switching device. For example, the camera module includes three lenses, namely a wide-angle lens, a telephoto lens, and a macro lens; the lens identifier of the wide-angle lens is lens 1, the lens identifier of the telephoto lens is lens 2, and the lens identifier of the macro lens is lens 3.
The lens switching device can determine the lens to be switched in the following three modes, and may of course determine the lens to be switched in other manners, which is not limited in the embodiment of the present application. In mode one, the lens switching device automatically determines the lens to be switched according to shooting parameters. In modes two and three, the lens switching device determines the lens to be switched according to a selection operation of the user.
Mode one: determine shooting parameters, where the shooting parameters include a shooting scene and a shooting mode; and determine the lens to be switched according to the shooting parameters.
For example, when the shooting parameters include a shooting scene, the lens switching device may determine the shooting scene from the image capture preview picture, and then determine the lens to be switched according to the shooting scene. The lens switching device can automatically recognize the shooting scene from the content of the preview picture. For example, when the preview picture is opened, the lens switching device may analyze which type of image it contains, such as a landscape image or a character image, and determine image parameters such as the depth of field and the shooting distance of the image. Based on this recognition of the preview content, the lens switching device can determine the shooting scene from the image type and the image parameters. After the shooting scene is determined, the lens to be switched can be determined according to the correspondence between shooting scenes and lenses. This correspondence may be preset and stored in the lens switching device, for example as shown in table 1 below. The lenses and corresponding shooting scenes shown in table 1 are only examples for illustration and do not limit the application scenarios of this embodiment.
Table 1: correspondence table of shooting scene and lens
Wide-angle lens Is suitable for scenes requiring wide field angle, such as building, scenery and other shooting scenes
Telephoto lens Suitable for scenes far away from the shooting target, such as long shot scenes, aerial shooting scenes and the like
Macro lens Scenes suitable for close-up shooting, e.g. character features, scenery features, or the like
For example, as shown in fig. 9, when the user opens the camera application, the display interface of the lens switching device displays an image capture preview picture, and the lens switching device determines the shooting scene from the preview picture. For example, if the preview picture in fig. 9 displays a portrait, it can be determined that the image type in the preview picture is a character; then, because the person occupies about two thirds of the preview picture, it is determined that the shooting distance is short. According to this content recognition result, the lens switching device determines that the shooting scene is a character close-up shooting scene; from table 1 it can be seen that such a scene suits the macro lens, so the lens to be switched is determined to be the macro lens.
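Mode one with a shooting scene reduces to a lookup in a preset correspondence table. A minimal sketch, where scene recognition is assumed to have already produced a scene label and the table keys are illustrative:

```python
# Minimal sketch of mode one: the lens to be switched is looked up from a
# preset scene-to-lens correspondence (cf. Table 1).  Scene labels and
# lens names here are assumptions for illustration.

SCENE_TO_LENS = {
    "building": "wide-angle lens",
    "landscape": "wide-angle lens",
    "long shot": "telephoto lens",
    "aerial": "telephoto lens",
    "character close-up": "macro lens",
    "scenery close-up": "macro lens",
}

def determine_lens_to_switch(scene):
    """Return the lens to be switched for a recognized shooting scene."""
    return SCENE_TO_LENS[scene]

# A portrait filling most of the preview picture is recognized as a
# character close-up scene, so the macro lens is selected.
print(determine_lens_to_switch("character close-up"))
```

The shooting-mode variant of mode one works the same way, with a mode-to-lens table (cf. Table 2) in place of the scene-to-lens table.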
For another example, when the shooting parameters include a shooting mode, selection items for the shooting mode may be output on the display interface, a selection operation of the user on a shooting mode is received, and the lens to be switched is then determined according to the shooting mode selected by the selection operation. When a user takes a picture, the shooting mode to be used is generally determined by the shooting target; for example, when the user shoots a landscape, the landscape shooting mode is usually selected to obtain a better image. Each shooting mode usually corresponds to one lens, and the lens switching device can determine that the lens corresponding to the selected shooting mode is the lens to be switched. The correspondence between shooting modes and lenses may be preset and stored in the lens switching device, for example as shown in table 2 below. The lenses and corresponding shooting modes shown in table 2 are only examples for illustration and do not limit the application scenarios of this embodiment.
Table 2: correspondence table between shooting mode and lens
Wide-angle lens Suitable for night scene mode, landscape mode, large aperture mode, etc
Telephoto lens Suitable for long-range mode, aerial photography mode and the like
Macro lens Adapted for portrait mode or the like
For example, as shown in fig. 10, when the user opens the camera application, the display interface displays the selection items of the shooting modes; the user clicks to select one of them, for example the portrait shooting mode when a portrait is to be shot. When the display interface displays the selection items, all selectable shooting modes can be shown in an option bar that slides left and right. After receiving the user's selection operation, the lens switching device looks up the correspondence table between shooting modes and lenses according to the selected shooting mode to determine the lens to be switched; for example, if the lens corresponding to the portrait shooting mode is the macro lens, the lens to be switched is determined to be the macro lens.
Mode two: a simulation interface of the camera module can be output on the display interface of the lens switching device, and an operation of the user on the simulation interface can be received, where the operation may be a dragging operation of dragging a target lens to the position corresponding to the target image sensor, or a clicking operation on the target lens. The target lens is then determined as the lens to be switched according to the operation of the user on the simulation interface.
The simulation interface of the camera module is a visualized simulation of the camera module of the lens switching device. Through this visual interface the user can manually select a lens to switch to, and the corresponding hardware device performs the related operation, such as lens switching, at the hardware layer according to the user's operation on the simulation interface. The operation may be a dragging operation of dragging the target lens to the position corresponding to the target image sensor, for example dragging it in the clockwise or counterclockwise direction. The operation may also be a clicking operation on the target lens: for example, the user clicks the target lens in the simulation interface, and after receiving the clicking operation, the lens switching device automatically switches the target lens to the position corresponding to the target image sensor.
For example, as shown in fig. 11, the user opens the camera application, clicks the setting item, and can choose to display the simulation interface of the camera module. After entering the simulation interface, it displays the simulation structure of the camera module in the lens switching device, where the simulation structure is a visualized representation of the hardware structure of the camera module. The simulation interface shown in fig. 11 includes three lenses and an image sensor, and each lens corresponds to a lens identifier (e.g., lens 1). The simulation interface can also display detailed information about each lens, such as its parameters and the shooting scenes it suits. According to this detailed information and the shooting requirement, the user can determine that the target lens is lens 1; the user then drags lens 1 to rotate to the position corresponding to the sensor, that is, the lens to be switched is determined to be lens 1.
Optionally, after the user finishes operating the lens in the simulation interface, the simulation interface may be closed. For example, a close option may be set in the upper right corner of the simulation interface; when the lens switching device detects that the user clicks it, the simulation interface is closed. For another example, if no further operation on the simulation interface is detected within a period of time after the last operation, the lens switching device assumes by default that lens switching is complete and closes the simulation interface automatically.
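In mode two, both the drag and the click operation resolve to the same result: the operated lens becomes the lens to be switched. A hedged sketch, where the event dictionary shape is an assumption for illustration only:

```python
# Sketch of mode two: mapping the user's operation on the simulation
# interface to the lens to be switched.  A drag moves the target lens
# onto the sensor position; a click selects the target lens and lets the
# device rotate it automatically.  The event format is hypothetical.

def lens_from_simulation_op(op):
    """Return the identifier of the lens selected on the simulation interface."""
    if op["type"] in ("drag", "click"):
        return op["lens_id"]
    raise ValueError("unsupported simulation operation: " + op["type"])

print(lens_from_simulation_op({"type": "drag", "lens_id": 1}))
print(lens_from_simulation_op({"type": "click", "lens_id": 3}))
```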
Mode three: selection items for the lenses can be output on the display interface of the lens switching device, a selection operation of the user on a lens is received, and the lens selected by the user is determined as the lens to be switched.
For example, the lens switching device includes a plurality of lenses, each with different lens details that can be displayed on the display interface; for example, the details of lens 1 may be that its focal length is 8 mm and that it suits building shooting scenes.
For example, as shown in fig. 12, when the user opens the camera application and clicks the lens selection item, the display interface of the lens switching device displays the lens selection items. For example, if the camera module includes three lenses, the display interface shows the lens parameters and the applicable shooting scenes of all three. The user clicks to select one of the lenses according to the shooting requirement; for example, when a portrait is to be shot, the user can select the macro lens, and the lens to be switched is then determined to be the macro lens.
S802, the lens switching device switches the lens to be switched to a position corresponding to a target image sensor, wherein the target image sensor is a sensor in the N image sensors.
The target image sensor may be any one of the N image sensors; since the position of each image sensor in this embodiment is fixed, once the target image sensor is determined, the position to which the lens to be switched must move is also determined. For example, when the camera module includes one image sensor and a plurality of lenses, the target image sensor is that image sensor, and after the lens to be switched is determined, it can be switched directly to the position corresponding to the image sensor. For example, in the camera module shown in fig. 2a, the lens to be switched is lens 2, and lens 2 is switched directly to the position corresponding to the sensor.
The target image sensor may also be a plurality of image sensors among the N image sensors. For example, when the camera module includes a plurality of image sensors and a plurality of lenses, free combination between them can be achieved. After the lenses to be switched are determined, the image sensor corresponding to each lens to be switched is determined, so that each lens to be switched can be switched to the position corresponding to its image sensor. For example, the lens switching device may store the correspondence between lenses and image sensors in advance; it then obtains the image sensor corresponding to each lens to be switched as the target image sensor according to this pre-stored correspondence, and switches the lens to the position corresponding to that target image sensor. For example, the camera module includes two image sensors, where sensor 1 has a high image resolution and sensor 2 a lower one. According to the user's shooting requirement, the lenses to be switched are determined to be lens 1 and lens 2; lens 1 is a wide-angle lens that needs an image sensor with higher resolution, while lens 2 is a telephoto lens whose resolution requirement is lower than that of lens 1. The correspondence between the lenses to be switched and the image sensors can therefore be determined as lens 1 matched with sensor 1 and lens 2 matched with sensor 2, so lens 1 is switched to the position corresponding to sensor 1 and lens 2 to the position corresponding to sensor 2.
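The multi-sensor case above is again a table lookup over a pre-stored correspondence. An illustrative sketch, following the two-sensor example; all identifiers are assumptions:

```python
# Sketch of matching lenses to be switched with target image sensors from
# a pre-stored correspondence: sensor 1 (high resolution) for the
# wide-angle lens, sensor 2 (lower resolution) for the telephoto lens.
# All identifiers are hypothetical.

LENS_TO_SENSOR = {
    "lens 1": "sensor 1",   # wide-angle lens -> high-resolution sensor
    "lens 2": "sensor 2",   # telephoto lens  -> lower-resolution sensor
}

def target_sensors(lenses_to_switch):
    """Map each lens to be switched to its target image sensor."""
    return {lens: LENS_TO_SENSOR[lens] for lens in lenses_to_switch}

print(target_sensors(["lens 1", "lens 2"]))
```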
When the lens to be switched is switched to the position corresponding to the target image sensor, the switching can be performed according to a rotation angle. Specifically: determine the rotation angle between the lens to be switched and the position corresponding to the target image sensor, and switch the lens to be switched to that position along the counterclockwise or clockwise direction according to the rotation angle. Further, the direction can be determined from a first rotation angle and a second rotation angle, where the first rotation angle is the clockwise rotation angle and the second rotation angle is the counterclockwise rotation angle: if the first rotation angle is smaller than the second rotation angle, the lens to be switched is switched to the position corresponding to the target image sensor in the clockwise direction; if the second rotation angle is smaller than the first rotation angle, it is switched in the counterclockwise direction. For example, in the camera module shown in fig. 2a, the lens to be switched is lens 2, its first rotation angle is about 270 degrees and its second rotation angle is about 30 degrees, so lens 2 rotates counterclockwise to the position corresponding to the target image sensor.
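The direction rule can be sketched as a short computation. Representing positions on the lens carousel as clockwise-increasing angles in degrees is an assumption made for illustration:

```python
# Sketch of the rotation-direction rule: compute the first (clockwise)
# and second (counterclockwise) rotation angles between the lens to be
# switched and the position corresponding to the target image sensor,
# then rotate along the smaller one.  The angle convention is assumed.

def switch_direction(lens_angle, sensor_angle):
    """Return the rotation direction with the smaller rotation angle."""
    first = (sensor_angle - lens_angle) % 360    # clockwise rotation angle
    second = (lens_angle - sensor_angle) % 360   # counterclockwise rotation angle
    return "clockwise" if first < second else "counterclockwise"

# A lens 90 degrees clockwise of the sensor position would need a 270
# degree clockwise rotation but only 90 degrees counterclockwise.
print(switch_direction(90, 0))   # counterclockwise
print(switch_direction(270, 0))  # clockwise
```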
In one example, the lens to be switched may be switched to the position corresponding to the target image sensor by a motor driving the lens to rotate. For example, a switching control signal may be sent to the motor, and after receiving it, the motor drives the lens to be switched to rotate to the position corresponding to the target image sensor.
In another example, the lens to be switched may be switched to the position corresponding to the target image sensor manually by the user. For example, a manual trigger switch (e.g., a manual knob) may be installed on the terminal device. When the lens is switched manually, the camera module may be visible (e.g., the camera module area of the terminal has a transparent housing, or the display interface of the terminal displays the simulation interface of the camera), and the user rotates the lens to be switched to the position corresponding to the target image sensor by operating the manual trigger switch directly. The whole switching process is thus visible, which helps ensure that the lens to be switched is switched accurately to the position corresponding to the target image sensor.
In one example, in the process of switching the lens to be switched to the position corresponding to the target image sensor, a switching prompt may be performed simultaneously, where the switching prompt may include, but is not limited to, an audio prompt, a vibration prompt, and the like. For example, in the switching process, a prompt tone (such as playing music, playing a lens switching voice, etc.) may be played, and the terminal device may also vibrate to prompt the user that the lens is being switched, thereby improving the user experience in the switching process.
The embodiment of the application provides a lens switching method: a lens to be switched is determined, and the lens to be switched is then switched to the position corresponding to a target image sensor, so that the lens and the image sensor are engaged and recombined. The method can be applied to a lens switching device that includes a camera module in which the lenses and the image sensors are decoupled; free combination between a plurality of lenses and the sensors can be achieved by switching lenses, and the manufacturing cost of the terminal device is reduced by reducing the number of sensors. Optionally, the lens switching process can be linked with peripherals (such as an audio device and a motor device), so that the user can perceive the lens switching process, which improves the user experience.
An embodiment of the present application provides a lens switching device, as shown in fig. 13. The lens switching apparatus 1300 can be used for executing the lens switching method described in fig. 8, and the lens switching apparatus includes:
a determining unit 1301, configured to determine a lens to be switched, where the lens to be switched is one of the M lenses;
a switching unit 1302, configured to switch the lens to be switched to a position corresponding to a target image sensor, where the target image sensor is one of the N image sensors.
In an implementation manner, the switching unit 1302 may specifically be configured to:
determining a rotation angle between the lens to be switched and a position corresponding to a target image sensor;
and switching the lens to be switched to a position corresponding to the target image sensor along the anticlockwise direction or the clockwise direction according to the rotation angle.
In one implementation manner, the rotation angles include a first rotation angle at which the lens to be switched rotates in a clockwise direction and a second rotation angle at which the lens to be switched rotates in a counterclockwise direction; the switching unit 1302 may specifically be configured to:
if the first rotation angle is smaller than the second rotation angle, switching the lens to be switched to a position corresponding to a target image sensor in a clockwise direction;
and if the second rotation angle is smaller than the first rotation angle, switching the lens to be switched to a position corresponding to the target image sensor in a counterclockwise direction.
In an implementation manner, the determining unit 1301 may specifically be configured to:
determining shooting parameters, wherein the shooting parameters comprise a shooting scene and a shooting mode;
and determining a lens to be switched according to the shooting parameters.
In an implementation manner, the determining unit 1301 may specifically be configured to:
outputting a simulation interface of the camera module;
receiving an operation of a user on the simulation interface, wherein the operation is a dragging operation of dragging the target lens to a position corresponding to the target image sensor by the user, or the operation is a clicking operation on the target lens;
and determining the target lens as a lens to be switched.
In an implementation manner, the determining unit 1301 may specifically be configured to:
outputting a selection item of the shot;
receiving a selection operation of a user on a lens;
and determining the lens selected by the selection operation as the lens to be switched.
In an implementation manner, the lens switching device further includes a prompting unit 1303, where the prompting unit is configured to perform a switching prompt in a process of switching the lens to be switched to a position corresponding to the target image sensor.
In one implementation, the switching cue includes an audio cue and/or a vibration cue.
The embodiment of the present application provides another lens switching device, as shown in fig. 14. The lens switching device 1400 may include a processor 1401, configured to determine a lens to be switched or to switch the lens to be switched to the position corresponding to a target image sensor; the related functions implemented by the determining unit 1301, the switching unit 1302 and the prompting unit 1303 shown in fig. 13 can be implemented by the processor 1401. The processor 1401 may include one or more processors, for example one or more central processing units (CPUs), network processors (NPs), hardware chips, or any combination thereof. When the processor 1401 is one CPU, the CPU may be a single-core or multi-core CPU.
The lens switching device 1400 may further include a memory 1402, which stores program code and the like. The memory 1402 may include volatile memory, such as random access memory (RAM); it may also include non-volatile memory, such as read-only memory (ROM), flash memory, a hard disk drive (HDD), or a solid-state drive (SSD); it may also include a combination of the above types of memory.
The processor 1401 and the memory 1402 may be configured to implement the lens switching method described in fig. 8, where the processor 1401 is configured to determine a lens to be switched, and is further configured to switch the lens to be switched to a position corresponding to a target image sensor.
In one implementation, processor 1401 may be specifically configured to:
determining a rotation angle between the lens to be switched and a position corresponding to a target image sensor;
and switching the lens to be switched to a position corresponding to the target image sensor along the counterclockwise direction or the clockwise direction according to the rotation angle.
In one implementation manner, the rotation angles include a first rotation angle at which the lens to be switched rotates in a clockwise direction and a second rotation angle at which the lens to be switched rotates in a counterclockwise direction; processor 1401 may specifically be configured to:
if the first rotation angle is smaller than the second rotation angle, switching the lens to be switched to a position corresponding to a target image sensor in a clockwise direction;
and if the second rotation angle is smaller than the first rotation angle, switching the lens to be switched to a position corresponding to the target image sensor in a counterclockwise direction.
In one implementation, processor 1401 may be specifically configured to:
determining shooting parameters, wherein the shooting parameters comprise a shooting scene and a shooting mode;
and determining a lens to be switched according to the shooting parameters.
In one implementation, processor 1401 may be specifically configured to:
outputting a simulation interface of the camera module;
receiving an operation of a user on the simulation interface, wherein the operation is a dragging operation of dragging the target lens to a position corresponding to the target image sensor by the user, or the operation is a clicking operation on the target lens;
and determining the target lens as a lens to be switched.
In one implementation, processor 1401 may be specifically configured to:
outputting a selection item of the shot;
receiving selection operation of a user on a lens;
and determining the lens selected by the selection operation as the lens to be switched.
In one implementation, processor 1401 may be specifically configured to:
and in the process of switching the lens to be switched to the position corresponding to the target image sensor, carrying out switching prompt.
In one implementation, the switching cue includes an audio cue and/or a vibration cue.
The apparatus in the above embodiments may be a terminal device, or may be a chip applied in the terminal device, or other combined devices and components having the above terminal function.
An embodiment of the present application further provides a readable storage medium, which includes a program or an instruction, and when the program or the instruction is run on a computer, the program or the instruction causes the computer to execute the lens switching method executed by the lens switching apparatus in the above method embodiment.
In the above embodiments, all or part of the implementation may be realized by software, hardware, firmware, or any combination thereof. When implemented in software, it may be implemented in whole or in part in the form of a computer program product. The computer program product includes one or more computer instructions. When the computer instructions are loaded and executed on a computer, the processes or functions described in accordance with the embodiments of the application occur in whole or in part. The computer may be a general purpose computer, a special purpose computer, a network of computers, or another programmable device. The computer instructions may be stored on a computer readable storage medium or transmitted from one computer readable storage medium to another, for example from one website, computer, server, or data center to another via wired (e.g., coaxial cable, optical fiber, digital subscriber line (DSL)) or wireless (e.g., infrared, radio, microwave) means. The computer-readable storage medium can be any available medium that can be accessed by a computer, or a data storage device, such as a server or a data center, that incorporates one or more available media. The available medium may be a magnetic medium (e.g., a floppy disk, a hard disk, a magnetic tape), an optical medium (e.g., a digital video disk (DVD)), or a semiconductor medium (e.g., a solid state disk (SSD)), among others.
Those of ordinary skill in the art will appreciate that the units and algorithm steps of the examples described in connection with the embodiments disclosed herein may be implemented in electronic hardware, computer software, or a combination of both; to illustrate clearly the interchangeability of hardware and software, the components and steps of the examples have been described above in general functional terms. Whether such functionality is implemented as hardware or software depends upon the particular application and the design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
The above description is only for the specific embodiments of the present invention, but the scope of the present invention is not limited thereto, and any person skilled in the art can easily conceive of the changes or substitutions within the technical scope of the present invention, and all the changes or substitutions should be covered within the scope of the present invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the appended claims.

Claims (11)

1. A lens switching method is characterized in that the method is applied to a lens switching device, the lens switching device comprises a camera module, and the camera module comprises M lenses, N image sensors and a motor; the M is a positive integer greater than 2, the N is a positive integer greater than 1, and the M is greater than N, the method comprising:
determining a lens to be switched, wherein the lens to be switched is a plurality of lenses among the M lenses; wherein determining the lens to be switched comprises: outputting a simulation interface of the camera module, wherein the simulation interface displays a simulation structure of the camera module in the lens switching device, the simulation structure is a visualized representation of the hardware structure of the camera module, the simulation interface also displays detailed information of each lens, and the detailed information of each lens comprises parameters of the M lenses and the shooting scene applicable to the lens; receiving an operation of a user on the simulation interface, the operation being a dragging operation of dragging a target lens to a position corresponding to a target image sensor by the user; and determining the target lens as the lens to be switched; or determining shooting parameters, wherein the shooting parameters comprise a shooting scene and a shooting mode, and determining the lens to be switched according to the shooting parameters; or outputting a selection item of the lens, receiving a selection operation of a user on the lens, and determining the lens selected by the selection operation as the lens to be switched;
and sending a lens switching control flow to a lens switching driving program according to the lens to be switched, and generating a switching control signal to control a motor to switch the lens to be switched to a position corresponding to a target image sensor, wherein the target image sensor is a plurality of image sensors in the N image sensors, and the plurality of image sensors have different image resolutions.
2. The method according to claim 1, wherein switching the lens to be switched to a position corresponding to a target image sensor comprises:
determining a rotation angle between the lens to be switched and a position corresponding to a target image sensor;
and switching the lens to be switched to the position corresponding to the target image sensor in a counterclockwise or clockwise direction according to the rotation angle.
3. The method according to claim 2, wherein the rotation angle comprises a first rotation angle by which the lens to be switched rotates in the clockwise direction and a second rotation angle by which the lens to be switched rotates in the counterclockwise direction;
wherein switching the lens to be switched to the position corresponding to the target image sensor in the counterclockwise or clockwise direction according to the rotation angle comprises:
if the first rotation angle is smaller than the second rotation angle, switching the lens to be switched to a position corresponding to a target image sensor in a clockwise direction;
and if the second rotation angle is smaller than the first rotation angle, switching the lens to be switched to a position corresponding to the target image sensor in a counterclockwise direction.
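The direction choice in claims 2 and 3 amounts to rotating through the shorter of the two arcs. A minimal sketch, assuming positions are expressed in degrees on a full 360° rotation and that increasing degrees corresponds to clockwise motion (both assumptions ours, not the patent's):

```python
def choose_rotation(current_deg: float, target_deg: float):
    """Return (direction, angle) for the shorter rotation from the lens's
    current angular position to the position of the target image sensor.

    "cw" is the first (clockwise) rotation angle of claim 3;
    "ccw" is the second (counterclockwise) rotation angle.
    """
    cw = (target_deg - current_deg) % 360.0   # first rotation angle
    ccw = (current_deg - target_deg) % 360.0  # second rotation angle
    if cw < ccw:
        return ("cw", cw)    # first angle smaller: rotate clockwise
    return ("ccw", ccw)      # second angle smaller: rotate counterclockwise
```

For example, moving from 350° to 10° would rotate 20° clockwise rather than 340° counterclockwise. The claim leaves the tie case (equal angles) unspecified; this sketch arbitrarily picks counterclockwise.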
4. The method according to any one of claims 1 to 3, further comprising:
and outputting a switching prompt during the process of switching the lens to be switched to the position corresponding to the target image sensor.
5. The method according to claim 4, wherein the switching prompt comprises an audio prompt and/or a vibration prompt.
6. A lens switching device, comprising a camera module, wherein the camera module comprises M lenses, N image sensors, and a motor; M is a positive integer greater than 2, N is a positive integer greater than 1, and M is greater than N; and the lens switching device further comprises:
a determining unit, configured to determine a lens to be switched, wherein the lens to be switched is one or more of the M lenses; wherein, in determining the lens to be switched, the determining unit is specifically configured to:
output a simulation interface of the camera module, wherein the simulation interface displays a simulation structure of the camera module in the lens switching device, the simulation structure being a visualized rendering of the hardware structure of the camera module; the simulation interface further displays detailed information of each lens, the detailed information comprising parameters of the M lenses and the shooting scenes to which each lens is applicable; receive an operation performed by a user on the simulation interface, the operation being a drag operation of dragging a target lens to a position corresponding to a target image sensor; and determine the target lens as the lens to be switched; or, determine shooting parameters, wherein the shooting parameters comprise a shooting scene and a shooting mode, and determine the lens to be switched according to the shooting parameters; or, output lens selection items, receive a selection operation performed by the user on a lens, and determine the lens selected by the selection operation as the lens to be switched; and
a switching unit, configured to send a lens switching control flow to a lens switching driver according to the lens to be switched, and to generate a switching control signal to control the motor to switch the lens to be switched to a position corresponding to a target image sensor, wherein the target image sensor is one or more of the N image sensors, and the image sensors have different image resolutions.
7. The apparatus according to claim 6, wherein the switching unit, when switching the lens to be switched to the position corresponding to the target image sensor, is specifically configured to:
determining a rotation angle between the lens to be switched and a position corresponding to a target image sensor;
and switching the lens to be switched to the position corresponding to the target image sensor in a counterclockwise or clockwise direction according to the rotation angle.
8. The apparatus according to claim 7, wherein the rotation angle comprises a first rotation angle by which the lens to be switched rotates in the clockwise direction and a second rotation angle by which the lens to be switched rotates in the counterclockwise direction;
wherein, in switching the lens to be switched to the position corresponding to the target image sensor in the counterclockwise or clockwise direction according to the rotation angle, the switching unit is specifically configured to:
if the first rotation angle is smaller than the second rotation angle, switching the lens to be switched to a position corresponding to a target image sensor in a clockwise direction;
and if the second rotation angle is smaller than the first rotation angle, switching the lens to be switched to a position corresponding to the target image sensor in a counterclockwise direction.
9. The device according to any one of claims 6 to 8, further comprising a prompting unit, wherein the prompting unit is configured to output a switching prompt during the process of switching the lens to be switched to the position corresponding to the target image sensor.
10. The device according to claim 9, wherein the switching prompt comprises an audio prompt and/or a vibration prompt.
11. A readable storage medium, comprising a program or instructions, wherein, when the program or instructions are run on a computer, the method according to any one of claims 1 to 5 is performed.
CN201910721605.2A 2019-08-06 2019-08-06 Lens switching method and device Active CN112351156B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201910721605.2A CN112351156B (en) 2019-08-06 2019-08-06 Lens switching method and device
PCT/CN2020/104722 WO2021023035A1 (en) 2019-08-06 2020-07-27 Lens switching method and apparatus

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910721605.2A CN112351156B (en) 2019-08-06 2019-08-06 Lens switching method and device

Publications (2)

Publication Number Publication Date
CN112351156A CN112351156A (en) 2021-02-09
CN112351156B true CN112351156B (en) 2023-02-14

Family

ID=74366509

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910721605.2A Active CN112351156B (en) 2019-08-06 2019-08-06 Lens switching method and device

Country Status (2)

Country Link
CN (1) CN112351156B (en)
WO (1) WO2021023035A1 (en)

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113395435A (en) * 2020-03-11 2021-09-14 北京芯海视界三维科技有限公司 Device for realizing 3D shooting and 3D display terminal
CN113055599B (en) * 2021-03-29 2023-02-07 维沃软件技术有限公司 Camera switching method and device, electronic equipment and readable storage medium
CN115236920A (en) * 2021-04-22 2022-10-25 华为技术有限公司 Imaging device, electronic apparatus, imaging device control method, and imaging device control apparatus
CN113660402A (en) * 2021-08-19 2021-11-16 南昌逸勤科技有限公司 Camera module and electronic equipment
CN113766109A (en) * 2021-09-02 2021-12-07 成都中科创达软件有限公司 Camera module, lens switching control method, device and equipment
CN218830168U (en) * 2021-09-26 2023-04-07 深圳市道通智能航空技术股份有限公司 Ultra-high definition multi-camera input switching device
CN113784052B (en) * 2021-09-26 2023-09-15 深圳市道通智能航空技术股份有限公司 Ultra-high definition multi-shot input switching device, method and system

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2005249824A (en) * 2004-03-01 2005-09-15 Casio Comput Co Ltd Imaging apparatus
CN102231097A (en) * 2011-07-28 2011-11-02 青岛海信移动通信技术股份有限公司 Method and device for unlocking screen
CN105052124A (en) * 2013-02-21 2015-11-11 日本电气株式会社 Image processing device, image processing method and permanent computer-readable medium
CN105242485A (en) * 2015-11-20 2016-01-13 重庆盾银科技有限公司 Multi-spectrum filter switching device for image imaging device
CN105979157A (en) * 2016-06-30 2016-09-28 维沃移动通信有限公司 Photographing mode switching method and mobile terminal
CN107800942A (en) * 2017-12-14 2018-03-13 信利光电股份有限公司 A kind of camera and terminal
CN107800951A (en) * 2016-09-07 2018-03-13 深圳富泰宏精密工业有限公司 Electronic installation and its Shot change method

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004208176A (en) * 2002-12-26 2004-07-22 Matsushita Electric Ind Co Ltd Portable terminal device
CN2666062Y (en) * 2003-11-20 2004-12-22 明基电通股份有限公司 Mobile phone and digital camera thereof
US20060187322A1 (en) * 2005-02-18 2006-08-24 Janson Wilbert F Jr Digital camera using multiple fixed focal length lenses and multiple image sensors to provide an extended zoom range
CN101551581A (en) * 2008-03-31 2009-10-07 纬创资通股份有限公司 Protecting shell applied to electronic device
CN102854593A (en) * 2012-09-17 2013-01-02 吴江市聚力机械有限公司 Rotary disc type mobile phone lens
CN203643672U (en) * 2013-11-25 2014-06-11 中兴通讯股份有限公司 Electronic device
US10871628B2 (en) * 2017-10-31 2020-12-22 Pony Ai Inc. Camera multi-lens mount for assisted-driving vehicle
CN109729246A (en) * 2018-11-27 2019-05-07 努比亚技术有限公司 A kind of multi-camera circuit structure, terminal and computer readable storage medium
CN109922179A (en) * 2019-03-21 2019-06-21 维沃移动通信(杭州)有限公司 A kind of camera module, camera control method and terminal
CN110881100A (en) * 2019-12-31 2020-03-13 华勤通讯技术有限公司 Lens module


Also Published As

Publication number Publication date
CN112351156A (en) 2021-02-09
WO2021023035A1 (en) 2021-02-11

Similar Documents

Publication Publication Date Title
US11785329B2 (en) Camera switching method for terminal, and terminal
CN112351156B (en) Lens switching method and device
WO2020073959A1 (en) Image capturing method, and electronic device
WO2020259452A1 (en) Full-screen display method for mobile terminal, and apparatus
CN115866121B (en) Application interface interaction method, electronic device and computer readable storage medium
CN113475057B (en) Video frame rate control method and related device
US20220174143A1 (en) Message notification method and electronic device
CN110248037B (en) Identity document scanning method and device
CN114125130B (en) Method for controlling communication service state, terminal device and readable storage medium
CN114615423A (en) Method and equipment for processing callback stream
CN113448382B (en) Multi-screen display electronic device and multi-screen display method of electronic device
CN110138999B (en) Certificate scanning method and device for mobile terminal
WO2022100219A1 (en) Data transfer method and related device
WO2021037034A1 (en) Method for switching state of application, and terminal device
CN112532508B (en) Video communication method and video communication device
CN111249728B (en) Image processing method, device and storage medium
CN114528581A (en) Safety display method and electronic equipment
CN114006698B (en) token refreshing method and device, electronic equipment and readable storage medium
CN113542574A (en) Shooting preview method under zooming, terminal, storage medium and electronic equipment
CN116709018B (en) Zoom bar segmentation method and electronic equipment
CN116708751B (en) Method and device for determining photographing duration and electronic equipment
CN116095512B (en) Photographing method of terminal equipment and related device
CN116166347A (en) Screen capturing method and related device
CN114691066A (en) Application display method and electronic equipment
CN117692693A (en) Multi-screen display method and related equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20210427

Address after: Unit 3401, unit a, building 6, Shenye Zhongcheng, No. 8089, Hongli West Road, Donghai community, Xiangmihu street, Futian District, Shenzhen, Guangdong 518040

Applicant after: Honor Device Co.,Ltd.

Address before: 518129 Bantian HUAWEI headquarters office building, Longgang District, Guangdong, Shenzhen

Applicant before: HUAWEI TECHNOLOGIES Co.,Ltd.

GR01 Patent grant