CN217770227U - Camera module and intelligent terminal - Google Patents

Camera module and intelligent terminal

Info

Publication number
CN217770227U
CN217770227U (utility model; application CN202222128725.1U)
Authority
CN
China
Prior art keywords
module
lens
light
filter
camera
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202222128725.1U
Other languages
Chinese (zh)
Inventor
倪磊
揭应平
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Chongqing Chuanyin Communication Technology Co ltd
Original Assignee
Chongqing Chuanyin Communication Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Chongqing Chuanyin Communication Technology Co ltd
Priority to CN202222128725.1U
Application granted; publication of CN217770227U
Legal status: Active

Landscapes

  • Studio Devices (AREA)

Abstract

The application provides a camera module and an intelligent terminal. The camera module includes at least two lenses, a single photosensitive module, at least two filters, and a control module. Each lens has a light incident end and a light emergent end; the photosensitive module is located on the light emitting path of the lenses; a single filter is arranged for each single lens, is located between that lens and the photosensitive module, and receives the light emitted from the lens's light emergent end; the control module is connected to the filters and controls which one of the at least two lenses passes light. This camera module reduces the number of separate, fully independent cameras and therefore the cost of the module: a single photosensitive module, combined with multiple controllable filters mounted behind different lenses, realizes the functions of multiple cameras and effectively solves the technical problem of the high space occupancy of existing camera modules.

Description

Camera module and intelligent terminal
Technical Field
The application relates to the technical field of imaging, in particular to a camera module and an intelligent terminal.
Background
When a traditional camera module includes multiple cameras, those cameras exist independently: each camera is separately equipped with its own lens, filter, and photosensitive module.
In conceiving and implementing this application, the inventors found at least the following problem: for a traditional camera module to provide multiple imaging functions, more than two kinds of cameras must be arranged, and because each camera has its own independent photosensitive module, a module with multiple cameras occupies a large amount of space.
The foregoing description is provided for general background information and is not admitted to be prior art.
SUMMARY OF THE UTILITY MODEL
In view of the above technical problems, the application provides a camera module and an intelligent terminal, aiming to solve the high cost and high space occupancy of existing camera modules.
To solve these technical problems, the application provides a camera module, applied to an intelligent terminal, including:
at least two lenses, each comprising a light incident end and a light emergent end;
a photosensitive module, positioned on the light emitting path of the lenses and used for receiving the light emitted from the light emergent ends;
at least two filters, a single filter being arranged for a single lens and positioned between that lens and the photosensitive module; and
a control module, connected to the filters and used for selecting one of the filters to transmit light.
Optionally, the control module is configured to transmit a preset voltage signal to the selected filter, so that the filter transmits light.
Optionally, the camera module further includes a clock module, the clock module is connected to the control module, and the control module is configured to selectively transmit a preset voltage signal to one of the optical filters according to a clock signal when receiving the clock signal of the clock module, so that the optical filter transmits light.
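By way of illustration only, the clock-driven selection described above can be sketched as follows. The class and attribute names (`FilterChannel`, `CameraModuleController`, `preset_voltage`) and the round-robin switching order are assumptions for the sketch, not details from the utility model.

```python
from dataclasses import dataclass

@dataclass
class FilterChannel:
    """One electrically switchable filter in front of one lens (assumed model)."""
    name: str
    preset_voltage: float  # voltage that makes this filter transmissive
    applied_voltage: float = 0.0

    @property
    def transmissive(self) -> bool:
        return self.applied_voltage >= self.preset_voltage

class CameraModuleController:
    """Drives exactly one filter transmissive per clock tick (round-robin)."""
    def __init__(self, channels):
        self.channels = list(channels)
        self.index = -1

    def on_clock_tick(self) -> FilterChannel:
        # De-energize every filter, then energize only the next one, so at
        # most one lens passes light to the shared photosensitive module.
        self.index = (self.index + 1) % len(self.channels)
        for i, ch in enumerate(self.channels):
            ch.applied_voltage = ch.preset_voltage if i == self.index else 0.0
        return self.channels[self.index]

controller = CameraModuleController([
    FilterChannel("wide-angle", preset_voltage=3.3),
    FilterChannel("telephoto", preset_voltage=5.0),
])
first = controller.on_clock_tick()   # selects the wide-angle channel
second = controller.on_clock_tick()  # selects the telephoto channel
```

Because every tick first zeroes all channel voltages, the shared photosensitive module never receives light from two lenses at once.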
Optionally, the control module at least includes a first preset voltage output end and a second preset voltage output end, and the first preset voltage output end and the second preset voltage output end are respectively connected to different optical filters, so that the optical filters transmit different light bands.
Optionally, the camera module further includes a lens barrel, the photosensitive module is disposed at a rear end of the lens barrel, the lens and the filter are disposed at a front end of the lens barrel, and the lens and/or the filter are detachably connected to the lens barrel.
Optionally, the lens includes at least one of a standard lens, a wide-angle lens, an ultra-wide-angle lens, a telephoto lens, a macro lens, and a blurring lens.
Optionally, the filter is an electro-optical filter.
Optionally, the filter is an electro-optic tunable filter.
The application also provides an intelligent terminal, which comprises a main body; and
the camera module of any of the above, the camera module being mounted to the main body.
Optionally, the intelligent terminal includes an input module, and the input module is connected to the control module of the camera module.
As described above, the camera module of this application includes at least two lenses, a single photosensitive module, at least two filters, and a control module. Each lens includes a light incident end and a light emergent end; the photosensitive module is located on the light emitting path of the lenses; a single filter is arranged for each single lens, is located between that lens and the photosensitive module, and receives the light emitted from the light emergent end; the control module is connected to the filters and controls which one of the at least two lenses passes light. This camera module reduces the number of separate cameras and therefore the cost of the module: a single photosensitive module, paired with multiple controllable filters behind different lenses, realizes the functions of multiple cameras and effectively solves the technical problem of the high space occupancy of existing camera modules.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the application and, together with the description, serve to explain the principles of the application. To illustrate the technical solutions of the embodiments more clearly, the drawings needed in the description of the embodiments are briefly introduced below; those skilled in the art can derive other drawings from them without inventive effort.
Fig. 1 is a schematic hardware structure diagram of a mobile terminal implementing various embodiments of the present application;
fig. 2 is a diagram illustrating a communication network system architecture according to an embodiment of the present application;
fig. 3 is a schematic structural view of a camera module according to the first embodiment;
fig. 4 is a schematic view showing a control state of the camera module according to the first embodiment;
fig. 5 is a schematic view showing another control state of the camera module according to the first embodiment;
fig. 6 is a schematic diagram showing automatic control of the image pickup module according to the second embodiment with reference to a clock signal.
Reference numerals (the full reference-numeral table is rendered as images in the original document; the numerals used in the description are): 100 mobile terminal; 300 camera module; 30 lens; 31 light incident end; 32 light emergent end; 50 photosensitive module; 70 filter; 90 control module.
the implementation, functional features and advantages of the objectives of the present application will be further explained with reference to the accompanying drawings. With the above figures, there are shown specific embodiments of the present application, which will be described in more detail below. These drawings and written description are not intended to limit the scope of the inventive concepts in any manner, but rather to illustrate the inventive concepts to those skilled in the art by reference to specific embodiments.
Detailed Description
Reference will now be made in detail to the exemplary embodiments, examples of which are illustrated in the accompanying drawings. The following description refers to the accompanying drawings in which the same numbers in different drawings represent the same or similar elements unless otherwise indicated. The implementations described in the following exemplary examples do not represent all implementations consistent with the present application. Rather, they are merely examples of apparatus and methods consistent with certain aspects of the application, as detailed in the appended claims.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of additional identical elements in the process, method, article, or apparatus that comprises the element. Elements having the same designation may or may not have the same meaning in different embodiments of the application; the particular meaning is determined by its interpretation in, or by further reference to the context of, the particular embodiment.
It should be understood that although the terms first, second, third, etc. may be used herein to describe various information, such information should not be limited by these terms. These terms are only used to distinguish one type of information from another. For example, first information may also be referred to as second information and, similarly, second information may also be referred to as first information, without departing from the scope herein. The word "if," as used herein, may be interpreted as "when" or "upon" or "in response to a determination," depending on the context. Also, as used herein, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context indicates otherwise. It will be further understood that the terms "comprises," "comprising," "includes," and/or "including," when used in this specification, specify the presence of stated features, steps, operations, elements, components, items, species, and/or groups, but do not preclude the presence or addition of one or more other features, steps, operations, elements, components, species, and/or groups thereof. As used herein, the terms "or," "and/or," "including at least one of the following," and the like are to be construed as inclusive, meaning any one or any combination. For example, "includes at least one of: A, B, C" means "any of the following: A; B; C; A and B; A and C; B and C; A and B and C"; likewise, "A, B or C" or "A, B and/or C" means the same. An exception to this definition occurs only when a combination of elements, functions, steps, or operations is inherently mutually exclusive in some way.
It should be understood that, although the steps in the flowcharts in the embodiments of the present application are shown in the order indicated by the arrows, they are not necessarily performed in that order; unless explicitly stated herein, the steps are not bound to a strict order and may be performed in other orders. Moreover, at least some of the steps in the figures may include multiple sub-steps or stages that are not necessarily performed at the same time but may be performed at different times and in different orders, and they may be performed alternately or in turns with other steps or with at least a portion of the sub-steps or stages of other steps.
The word "if," as used herein, may be interpreted as "when" or "upon" or "in response to a determination" or "in response to a detection," depending on the context. Similarly, the phrases "if determined" or "if (a stated condition or event) is detected" may be interpreted as "when determined" or "in response to a determination" or "when (a stated condition or event) is detected" or "in response to detecting (a stated condition or event)," depending on the context.
It should be understood that the specific embodiments described herein are merely illustrative of the present application and are not intended to limit the present application.
In the following description, suffixes such as "module," "component," or "unit" used to indicate elements are used only to facilitate the description of the present application and have no particular meaning in themselves. Thus, "module," "component," and "unit" may be used interchangeably.
The intelligent terminal may be implemented in various forms. For example, the smart terminal described in the present application may include smart terminals such as a mobile phone, a tablet computer, a notebook computer, a palmtop computer, a Personal Digital Assistant (PDA), a Portable Media Player (PMP), a navigation device, a wearable device, a smart band, a pedometer, and the like, and fixed terminals such as a Digital TV, a desktop computer, and the like.
The following description will be given taking a mobile terminal as an example, and it will be understood by those skilled in the art that the configuration according to the embodiment of the present application can be applied to a fixed type terminal in addition to elements particularly used for mobile purposes.
Referring to fig. 1, which is a schematic diagram of a hardware structure of a mobile terminal for implementing various embodiments of the present application, the mobile terminal 100 may include: an RF (Radio Frequency) unit 101, a WiFi module 102, an audio output unit 103, an A/V (audio/video) input unit 104, a sensor 105, a display unit 106, a user input unit 107, an interface unit 108, a memory 109, a processor 110, and a power supply 111. Those skilled in the art will appreciate that the mobile terminal architecture shown in fig. 1 is not limiting: a mobile terminal may include more or fewer components than those shown, some components may be combined, or the components may be arranged differently.
The following specifically describes the components of the mobile terminal with reference to fig. 1:
The radio frequency unit 101 may be configured to receive and transmit signals during information transmission and reception or during a call; specifically, it receives downlink information from a base station and forwards it to the processor 110 for processing, and it transmits uplink data to the base station. Typically, the radio frequency unit 101 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like. In addition, the radio frequency unit 101 can also communicate with a network and other devices through wireless communication. The wireless communication may use any communication standard or protocol, including but not limited to GSM (Global System for Mobile communications), GPRS (General Packet Radio Service), CDMA2000 (Code Division Multiple Access 2000), WCDMA (Wideband Code Division Multiple Access), TD-SCDMA (Time Division-Synchronous Code Division Multiple Access), FDD-LTE (Frequency Division Duplex-Long Term Evolution), TDD-LTE (Time Division Duplex-Long Term Evolution), 5G, and so on.
WiFi is a short-range wireless transmission technology. Through the WiFi module 102, the mobile terminal can help the user receive and send e-mails, browse web pages, access streaming media, and the like, providing wireless broadband Internet access. Although fig. 1 shows the WiFi module 102, it is not an essential part of the mobile terminal and may be omitted as needed without changing the essence of the invention.
The audio output unit 103 may convert audio data received by the radio frequency unit 101 or the WiFi module 102 or stored in the memory 109 into an audio signal and output as sound when the mobile terminal 100 is in a call signal reception mode, a call mode, a recording mode, a voice recognition mode, a broadcast reception mode, or the like. Also, the audio output unit 103 may also provide audio output related to a specific function performed by the mobile terminal 100 (e.g., a call signal reception sound, a message reception sound, etc.). The audio output unit 103 may include a speaker, a buzzer, and the like.
The A/V input unit 104 is used to receive audio or video signals. The A/V input unit 104 may include a Graphics Processing Unit (GPU) 1041 and a microphone 1042; the graphics processor 1041 processes image data of still pictures or video obtained by an image capturing device (e.g., a camera) in a video capturing mode or an image capturing mode. The processed image frames may be displayed on the display unit 106, stored in the memory 109 (or other storage medium), or transmitted via the radio frequency unit 101 or the WiFi module 102. The microphone 1042 can receive sound (audio data) in a phone call mode, a recording mode, a voice recognition mode, or the like, and process it into audio data. In the phone call mode, the processed audio (voice) data may be converted into a format transmittable to a mobile communication base station via the radio frequency unit 101. The microphone 1042 may implement various types of noise cancellation (or suppression) algorithms to cancel (or suppress) noise or interference generated while receiving and transmitting audio signals.
The mobile terminal 100 also includes at least one sensor 105, such as a light sensor, a motion sensor, and other sensors. Optionally, the light sensor includes an ambient light sensor that may adjust the brightness of the display panel 1061 according to the brightness of ambient light, and a proximity sensor that may turn off the display panel 1061 and/or the backlight when the mobile terminal 100 is moved to the ear. As one of the motion sensors, the accelerometer sensor can detect the magnitude of acceleration in each direction (generally, three axes), can detect the magnitude and direction of gravity when stationary, and can be used for applications of recognizing the gesture of the mobile phone (such as horizontal and vertical screen switching, related games, magnetometer gesture calibration), vibration recognition related functions (such as pedometer and tapping), and the like; as for other sensors such as a fingerprint sensor, a pressure sensor, an iris sensor, a molecular sensor, a gyroscope, a barometer, a hygrometer, a thermometer, and an infrared sensor, which can be configured on the mobile phone, the description is omitted here.
The display unit 106 is used to display information input by a user or information provided to the user. The Display unit 106 may include a Display panel 1061, and the Display panel 1061 may be configured in the form of a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED), or the like.
The user input unit 107 may be used to receive input numeric or character information and generate key signal inputs related to user settings and function control of the mobile terminal. Alternatively, the user input unit 107 may include a touch panel 1071 and other input devices 1072. The touch panel 1071, also referred to as a touch screen, may collect a touch operation performed by a user on or near the touch panel 1071 (e.g., an operation performed by the user on or near the touch panel 1071 using a finger, a stylus, or any other suitable object or accessory), and drive a corresponding connection device according to a predetermined program. The touch panel 1071 may include two parts of a touch detection device and a touch controller. Optionally, the touch detection device detects a touch orientation of a user, detects a signal caused by a touch operation, and transmits the signal to the touch controller; the touch controller receives touch information from the touch sensing device, converts the touch information into touch point coordinates, sends the touch point coordinates to the processor 110, and can receive and execute commands sent by the processor 110. In addition, the touch panel 1071 may be implemented in various types such as resistive, capacitive, infrared, and surface acoustic wave. In addition to the touch panel 1071, the user input unit 107 may include other input devices 1072. Optionally, other input devices 1072 may include, but are not limited to, one or more of a physical keyboard, function keys (e.g., volume control keys, switch keys, etc.), a trackball, a mouse, a joystick, and the like, and are not limited thereto.
Alternatively, the touch panel 1071 may cover the display panel 1061, and when the touch panel 1071 detects a touch operation on or near the touch panel 1071, the touch operation is transmitted to the processor 110 to determine the type of the touch event, and then the processor 110 provides a corresponding visual output on the display panel 1061 according to the type of the touch event. Although the touch panel 1071 and the display panel 1061 are shown in fig. 1 as two separate components to implement the input and output functions of the mobile terminal, in some embodiments, the touch panel 1071 and the display panel 1061 may be integrated to implement the input and output functions of the mobile terminal, and is not limited herein.
The interface unit 108 serves as an interface through which at least one external device is connected to the mobile terminal 100. For example, the external device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like. The interface unit 108 may be used to receive input (e.g., data information, power, etc.) from external devices and transmit the received input to one or more elements within the mobile terminal 100 or may be used to transmit data between the mobile terminal 100 and external devices.
The memory 109 may be used to store software programs as well as various data. The memory 109 may mainly include a program storage area and a data storage area, and optionally, the program storage area may store an operating system, an application program (such as a sound playing function, an image playing function, and the like) required by at least one function, and the like; the storage data area may store data (such as audio data, a phonebook, etc.) created according to the use of the cellular phone, and the like. Further, the memory 109 may include high speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other volatile solid state storage device.
The processor 110 is a control center of the mobile terminal, connects various parts of the entire mobile terminal using various interfaces and lines, performs various functions of the mobile terminal and processes data by operating or executing software programs and/or modules stored in the memory 109 and calling data stored in the memory 109, thereby integrally monitoring the mobile terminal. Processor 110 may include one or more processing units; preferably, the processor 110 may integrate an application processor and a modem processor, optionally the application processor primarily handles operating systems, user interfaces, application programs, etc., and the modem processor primarily handles wireless communications. It will be appreciated that the modem processor described above may not be integrated into the processor 110.
The mobile terminal 100 may further include a power supply 111 (e.g., a battery) for supplying power to various components, and preferably, the power supply 111 may be logically connected to the processor 110 via a power management system, so as to manage charging, discharging, and power consumption management functions via the power management system.
Although not shown in fig. 1, the mobile terminal 100 may further include a bluetooth module or the like, which is not described in detail herein.
In order to facilitate understanding of the embodiments of the present application, a communication network system on which the mobile terminal of the present application is based is described below.
Referring to fig. 2, fig. 2 is an architecture diagram of a communication network system according to an embodiment of the present disclosure. The communication network system is an LTE system of the universal mobile telecommunications technology, and includes a UE (User Equipment) 201, an E-UTRAN (Evolved UMTS Terrestrial Radio Access Network) 202, an EPC (Evolved Packet Core) 203, and an operator's IP services 204, which are communicatively connected in sequence.
Optionally, the UE201 may be the terminal 100 described above, and is not described herein again.
The E-UTRAN 202 includes an eNodeB 2021 and other eNodeBs 2022, among others. Optionally, the eNodeB 2021 may be connected with the other eNodeBs 2022 through a backhaul (e.g., an X2 interface); the eNodeB 2021 is connected to the EPC 203 and may provide the UE 201 access to the EPC 203.
The EPC 203 may include an MME (Mobility Management Entity) 2031, an HSS (Home Subscriber Server) 2032, other MMEs 2033, an SGW (Serving Gateway) 2034, a PGW (PDN Gateway) 2035, a PCRF (Policy and Charging Rules Function) 2036, and the like. Optionally, the MME 2031 is a control node that handles signaling between the UE 201 and the EPC 203, providing bearer and connection management. The HSS 2032 provides registers for managing functions such as the home location register (not shown) and holds subscriber-specific information about service characteristics, data rates, etc. All user data may be sent through the SGW 2034; the PGW 2035 may provide IP address assignment for the UE 201 among other functions; and the PCRF 2036 is the policy and charging control decision point for traffic data flows and IP bearer resources, which selects and provides available policy and charging control decisions for a policy and charging enforcement function (not shown).
IP services 204 may include the internet, intranets, IMS (IP Multimedia Subsystem), or other IP services, among others.
Although the LTE system is described as an example, it should be understood by those skilled in the art that the present application is not limited to the LTE system, but may also be applied to other wireless communication systems, such as GSM, CDMA2000, WCDMA, TD-SCDMA, and future new network systems (e.g. 5G), and the like.
Based on the above mobile terminal hardware structure and communication network system, various embodiments of the present application are provided.
Referring to fig. 3 to 6, the camera module 300 of the present application includes at least two lenses 30, a photosensitive module 50, at least two filters 70, and a control module 90, where the lens 30 includes a light incident end 31 and a light emergent end 32; the photosensitive module 50 is located on the light emitting path of the lens 30 and is used for receiving the light emitted from the light emitting end 32; a single filter 70 is disposed corresponding to the single lens 30, and the filter 70 is located between the lens 30 and the photosensitive module 50; the control module 90 is connected to the filter 70, and the control module 90 is used for selecting one of the filters 70 to transmit light.
As described above, in the present embodiment, the camera module 300 includes at least two lenses 30, a photosensitive module 50, at least two optical filters 70, and a control module 90. Each lens 30 includes a light incident end 31 and a light emergent end 32; the photosensitive module 50 is located on the light emitting path of the lens 30, spaced apart from the light emergent end 32; a single optical filter 70 is disposed for each single lens 30, is located between that lens 30 and the photosensitive module 50, and receives the light emitted from the light emergent end 32; the control module 90 is connected to the optical filters 70 and controls which one of the at least two lenses 30 passes light. This camera module 300 reduces the number of separate cameras and therefore the cost: a single photosensitive module 50 carries multiple optical filters 70 that control multiple different lenses 30, realizing the functions of multiple cameras and effectively solving the high cost and high space occupancy of existing camera modules.
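The single-sensor multiplexing idea described above can be modeled minimally as follows; the function name and the numeric light values are illustrative assumptions, not details from the utility model.

```python
def shared_sensor_capture(lens_outputs, energized_index):
    """lens_outputs: the light each lens delivers to its own filter.
    Only the energized filter transmits; every other filter blocks its
    lens, so the single photosensitive module integrates the light of
    exactly one lens."""
    transmitted = [
        light if i == energized_index else 0
        for i, light in enumerate(lens_outputs)
    ]
    return sum(transmitted)  # the shared sensor sums all incident light

# With the filter of lens 1 energized, only lens 1 contributes:
reading = shared_sensor_capture([10, 25, 40], energized_index=1)  # 25
```

Switching `energized_index` thus plays the role of switching cameras, without duplicating the photosensitive module per lens.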
Optionally, the camera module 300 further includes a lens barrel, the photosensitive module 50 is disposed at a rear end of the lens barrel, the lens 30 and the optical filter 70 are disposed at a front end of the lens barrel, and the lens 30 and/or the optical filter 70 are detachably connected to the lens barrel.
In this embodiment, the camera module includes a lens barrel; the lens 30 and the filter 70 are disposed at the front end of the lens barrel, and the photosensitive module 50 is disposed at the rear end, with the filter 70 between the lens 30 and the photosensitive module 50 and adjacent to the lens 30. The filter 70 and the lens 30 may each be mounted to the lens barrel independently, so that both are detachably connected and the filter 70 is easy to replace. Alternatively, the optical filter 70 and at least one lens element may be packaged together, and the combined lens 30 then assembled into the lens barrel, fixing the relative position between the optical filter 70 and the lens and simplifying assembly.
Further, each lens 30 includes at least one lens element, and the filter 70 is located between the lens element(s) and the photosensitive module 50. When a lens 30 contains multiple lens elements, the filter 70 is located between the photosensitive module 50 and the outermost lens element at the light emergent end 32 (the end away from the light incident end 31), so as to filter the light emitted from the light emergent end 32.
In an embodiment, the lens 30 and the lens barrel of the camera module 300 are mutually detachable, so that lenses 30 with different functions can be conveniently exchanged; the camera module 300 may include three or more lenses 30 with different functions.
Further, the lenses 30 include at least two of a standard lens, a wide-angle lens 30, an ultra-wide-angle lens 30, a telephoto lens 30, a macro lens 30, and a blurring lens 30; that is, the lenses 30 of the camera module 300 include, but are not limited to, the types listed above, and are configured according to actual requirements.
It should be noted that a filter extracts radiation within a spectral band of a certain width from a continuous spectrum or a line spectrum. Tunable filters are of particular note, and many kinds exist based on different mechanisms. For example, a grating filter is mechanically tunable: the grating is mounted on a rotating component, and the angle between the grating and the incident light determines the wavelength to be selected, so rotating the grating changes this angle and thereby tunes the wavelength. The filter 70 of the present application is such a tunable filter 70.
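The mechanical grating tuning described above can be sketched numerically. The Python fragment below is an illustrative assumption, not part of this disclosure: it evaluates the first-order grating equation in the Littrow configuration (incidence angle equal to diffraction angle), where the selected wavelength follows directly from the grating rotation angle; the function name, groove spacing, and angle values are all invented for illustration.

```python
# Illustrative grating tuning: the grating equation is
#   m * wavelength = d * (sin(theta_i) + sin(theta_d)).
# In the Littrow configuration (theta_i == theta_d == theta) this reduces to
#   wavelength = 2 * d * sin(theta) / m,
# so rotating the grating (changing theta) tunes the selected wavelength.
import math

def littrow_wavelength_nm(groove_spacing_nm, theta_deg, order=1):
    """Wavelength (nm) selected by a grating rotated to angle theta (Littrow)."""
    return 2.0 * groove_spacing_nm * math.sin(math.radians(theta_deg)) / order

# Hypothetical 1200 lines/mm grating (~833 nm groove spacing) rotated to 20 deg:
wl = littrow_wavelength_nm(groove_spacing_nm=833.0, theta_deg=20.0)
```

As the sketch shows, a small change in the rotation angle shifts the selected wavelength, which is the mechanism the mechanically tunable grating filter relies on.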
Further, the filter 70 is an electro-optical filter; and/or the filter 70 is an electro-optically tunable filter.
In the present embodiment, the electro-optical filter achieves wavelength tuning by applying a voltage to change the birefringence of an electro-optical functional material; wavelength tuning can also be achieved through the rotation of an electro-optic crystal in an electric field.
In another embodiment, as crystal materials science develops, the filter 70 can also be implemented as an electro-optically tunable filter using the electro-optic effect of a crystal. For example, the electro-optically tunable filter may be a liquid crystal tunable filter, in which an applied voltage controls the rotation of the liquid crystal so as to adjust the amount of transmitted light and the spectral bandwidth of the filter 70.
Optionally, the control module 90 is configured to transmit a preset voltage signal to the selected filter 70 to make the filter 70 transparent.
In this embodiment, the camera module 300 further includes a voltage output structure connected to the filters 70 and to the control module 90; the control module 90 can output a control instruction to the voltage output structure to set the value of the output voltage. The control module 90 controls the voltage output structure to output a preset voltage signal and apply the corresponding voltage to the selected filter 70, thereby controlling whether that filter 70 transmits light. In this way, one of the plurality of lenses 30 is selected to transmit light, so that the photosensitive module 50 receives incident light from a single lens 30 in any given period, reducing interference.
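The one-of-N selection behavior described in this embodiment can be sketched in software. The following Python is a hypothetical illustration only: the class and function names (`VoltageOutput`, `select_lens`) and the voltage values are invented, and real drive voltages would depend on the specific filter material.

```python
# Illustrative one-of-N filter selection: apply the preset ("opaque") voltage
# to every non-selected filter so that only the chosen lens passes light to
# the shared photosensitive module. All names and voltage values are invented.

OPAQUE_V = 5.0       # hypothetical drive voltage that renders a filter opaque
TRANSPARENT_V = 0.0  # hypothetical voltage (no drive) for the transparent state

class VoltageOutput:
    """Stand-in for the voltage output structure that drives the filters."""
    def __init__(self, n_filters):
        self.levels = [TRANSPARENT_V] * n_filters

    def apply(self, index, volts):
        self.levels[index] = volts

def select_lens(output, selected, n_filters):
    """Leave only `selected` transparent; drive every other filter opaque."""
    for i in range(n_filters):
        output.apply(i, TRANSPARENT_V if i == selected else OPAQUE_V)

out = VoltageOutput(3)
select_lens(out, 1, 3)   # out.levels is now [5.0, 0.0, 5.0]
```

Driving the non-selected filters, rather than the selected one, mirrors the description that each filter is preset to a light-transmitting state and is made opaque by an applied voltage.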
It can be understood that there are at least two ways to control the voltage output structure. One is a trigger structure, such as a slider or a trigger button, on the terminal device hosting the camera module 300, allowing the user to select the desired lens 30 according to personal preference. Alternatively, the control module 90 may run an automatic control program, for example switching the plurality of filters 70 on and off in different time slots by time division, so as to automatically collect incident-light data from the plurality of lenses 30 at different times.
Optionally, the camera module 300 further includes a clock module 80, the clock module 80 is connected to the control module 90, and the control module 90 is configured to selectively transmit a preset voltage signal to one of the optical filters 70 according to a clock signal when receiving the clock signal of the clock module 80, so as to enable the optical filter 70 to transmit light.
In the present embodiment, the clock module 80 outputs a clock signal used to switch the different lenses 30 to transmit light in a time-sharing manner; the light-transmitting duration of each lens 30 is recorded and fed back to the control module 90, which automatically switches between the different lenses 30 in a time-sharing manner according to the feedback data.
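The time-division switching described here can be sketched as a simple scheduling loop. The Python below is an illustrative assumption, not part of the disclosure: the function name, lens identifiers, and slot length are all invented.

```python
# Illustrative time-division schedule driven by a clock signal: each lens is
# given a light-transmitting slot in turn, cycling until the total duration
# is covered. Slot boundaries double as the recorded per-lens durations.
import itertools

def time_shared_schedule(lens_ids, slot_s, total_s):
    """Return (lens_id, start, end) slots cycling through the lenses."""
    slots = []
    t = 0.0
    for lens in itertools.cycle(lens_ids):
        if t >= total_s:
            break
        slots.append((lens, t, min(t + slot_s, total_s)))
        t += slot_s
    return slots

# Hypothetical two-lens module, 0.5 s slots over a 2 s window:
sched = time_shared_schedule(["30A", "30B"], slot_s=0.5, total_s=2.0)
```

Each slot tells the control module when to make a given lens's filter transparent, so the shared photosensitive module only ever receives light from one lens per slot.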
Optionally, the control module 90 includes at least a first preset voltage output terminal and a second preset voltage output terminal, which are respectively connected to different filters 70, so that the filters 70 transmit light of different wavebands.
Referring to fig. 6, the camera module 300 includes two lenses 30, denoted lens 30A and lens 30B. A first filter 701 is disposed corresponding to lens 30A and a second filter 702 is disposed corresponding to lens 30B, and both the first filter 701 and the second filter 702 are preset to a light-transmitting state.
In this embodiment, the voltage output structure includes at least a first preset voltage output terminal connected to the first filter 701 corresponding to lens 30A, and a second preset voltage output terminal connected to the second filter 702 corresponding to lens 30B. When the camera module 300 selects lens 30A to transmit light, the second preset voltage output terminal applies its output voltage to the second filter 702 so that lens 30B does not transmit light; when the camera module 300 selects lens 30B, the first preset voltage output terminal applies its output voltage to the first filter 701 so that lens 30A does not transmit light. In this way, lenses 30 with different functions can be switched.
While lens 30A is in the light-transmitting state, the first preset voltage output terminal can output a plurality of voltage thresholds to switch the first filter 701 among a plurality of different transmitted wavebands; likewise, while lens 30B is transmitting, the second preset voltage output terminal outputs a plurality of voltage thresholds to switch the transmitted wavebands of the second filter 702. In both states, different voltage values dynamically adjust the transmitted waveband of the filter 70, so that a single filter 70 can pass light of different wavebands to obtain images of different chromaticities.
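The threshold-to-waveband behavior described above can be sketched as a lookup. The Python below is purely illustrative: the voltage thresholds and band edges are invented numbers, not values from this disclosure.

```python
# Illustrative mapping from drive-voltage thresholds to the waveband a single
# tunable filter transmits. Every number here is hypothetical.
BAND_TABLE = [
    (1.0, (380, 500)),   # drive <= 1.0 V: pass a blue-ish band (nm)
    (2.0, (500, 600)),   # drive <= 2.0 V: pass a green-ish band
    (3.0, (600, 700)),   # drive <= 3.0 V: pass a red-ish band
]

def band_for_voltage(volts):
    """Return the (lo, hi) waveband, in nm, selected by a given drive voltage."""
    for threshold, band in BAND_TABLE:
        if volts <= threshold:
            return band
    raise ValueError("voltage outside tunable range")
```

Stepping the preset voltage output terminal through such thresholds while one lens is transmitting would let the same filter capture images of different chromaticities, as the embodiment describes.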
Further, the control module 90 controls the lens 30A and the lens 30B to transmit light in a time-sharing manner according to the clock signal sent by the clock module 80, so as to automatically collect the incident light data of the lens 30A and the lens 30B in a time-sharing manner.
The above are only reference examples; to avoid redundancy they are not exhaustively listed here. In actual development or application they may be flexibly combined according to actual needs, and any such combination belongs to the technical solution of the present application and falls within its protection scope.
The present application further provides an intelligent terminal, including a main body and the camera module 300 of any of the above embodiments, where the camera module 300 is mounted to the main body and the lens 30 of the camera module 300 is individually detachable from the main body.
Optionally, the intelligent terminal includes an input module, and the input module is connected to the control module 90 of the camera module 300.
In this embodiment, the input module may be a touch-enabled structural component disposed on the main body, or an operation interface of the intelligent terminal, such as the professional mode or a selection mode in the shooting software of a camera or mobile phone, making manual control convenient for the user. The input module is connected to the control module 90, and the control module 90 can also recognize it: upon receiving an input instruction from the input module, the control module 90 switches from the automatic control state to the manual control state, so that the user can select functions according to preference; when no input instruction is received within a preset time, the control module 90 switches back from the manual control state to the automatic control state, enabling automatic operation, for example switching the camera according to the clock signal.
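The manual/automatic switching described above can be sketched as a small state machine. The following Python is an illustrative assumption only; the class name, method names, and the timeout value are invented, not taken from the disclosure.

```python
# Illustrative manual/automatic mode switching: an input event from the input
# module forces manual control; absence of input for a preset duration reverts
# to automatic (clock-driven) control. All names and the timeout are invented.
AUTO, MANUAL = "auto", "manual"
TIMEOUT_S = 10.0  # hypothetical preset idle time before reverting to auto

class ModeController:
    def __init__(self):
        self.mode = AUTO
        self.last_input = None

    def on_input(self, now):
        """An instruction from the input module switches to manual control."""
        self.mode = MANUAL
        self.last_input = now

    def tick(self, now):
        """Periodic check: revert to automatic control after the idle timeout."""
        if self.mode == MANUAL and now - self.last_input >= TIMEOUT_S:
            self.mode = AUTO
        return self.mode
```

In automatic mode the controller would hand lens switching back to the clock-signal schedule; in manual mode the user's selection takes precedence until the idle timeout elapses.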
It can be understood that the clock signal includes at least one preset duration threshold, which is generally short; this facilitates switching between lenses 30 of different functions within a short time, so that image data from a plurality of different lenses 30 can be obtained within one period for convenient subsequent selection and processing.
The embodiments of the intelligent terminal provided in the present application may include all technical features of any embodiment of the camera module 300; the expanded description of those features is substantially the same as in the camera module embodiments above and is not repeated here.
Embodiments of the present application also provide a computer program product, which includes computer program code, when the computer program code runs on a computer, the computer is caused to execute the method in the above various possible embodiments.
Embodiments of the present application further provide a chip, which includes a memory and a processor, where the memory is used to store a computer program, and the processor is used to call and run the computer program from the memory, so that a device in which the chip is installed executes the method in the above various possible embodiments.
It is to be understood that the foregoing scenarios are only examples, and do not constitute a limitation on application scenarios of the technical solutions provided in the embodiments of the present application, and the technical solutions of the present application may also be applied to other scenarios. For example, as can be known by those skilled in the art, with the evolution of system architecture and the emergence of new service scenarios, the technical solution provided in the embodiments of the present application is also applicable to similar technical problems.
The above-mentioned serial numbers of the embodiments of the present application are merely for description and do not represent the merits of the embodiments.
The steps in the method of the embodiment of the application can be sequentially adjusted, combined and deleted according to actual needs.
The units in the device of the embodiment of the application can be combined, divided and deleted according to actual needs.
In the present application, the same or similar terms, technical solutions, and/or application scenarios are generally described in detail only at their first occurrence; for brevity, later repetitions are not described in detail again, and for any such term, technical solution, or application scenario not detailed later, reference may be made to the earlier detailed description.
In the present application, each embodiment is described with emphasis, and reference may be made to the description of other embodiments for parts that are not described or illustrated in any embodiment.
All possible combinations of the technical features in the embodiments are not described in the present application for the sake of brevity, but should be considered as the scope of the present application as long as there is no contradiction between the combinations of the technical features.
Through the description of the foregoing embodiments, it is clear to those skilled in the art that the method of the foregoing embodiments may be implemented by software plus a necessary general hardware platform, and certainly may also be implemented by hardware, but in many cases, the former is a better implementation. Based on such understanding, the technical solutions of the present application may be embodied in the form of a software product, which is stored in a storage medium (e.g., ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal device (e.g., a mobile phone, a computer, a server, a controlled terminal, or a network device) to execute the method of each embodiment of the present application.
In the above embodiments, the implementation may be realized wholly or partially by software, hardware, firmware, or any combination thereof. When implemented in software, it may be realized wholly or partially in the form of a computer program product. The computer program product includes one or more computer instructions. When the computer program instructions are loaded and executed on a computer, the procedures or functions according to the embodiments of the present application are generated in whole or in part. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable device. The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another; for example, the computer instructions may be transmitted from one website, computer, server, or data center to another by wire (e.g., coaxial cable, optical fiber, digital subscriber line) or wirelessly (e.g., infrared, radio, microwave). The computer-readable storage medium can be any available medium accessible by a computer, or a data storage device such as a server or data center integrating one or more available media. The available medium may be a magnetic medium (e.g., floppy disk, hard disk, magnetic tape), an optical medium (e.g., DVD), or a semiconductor medium (e.g., a solid-state disk (SSD)), among others.
The above description is only a preferred embodiment of the present application, and not intended to limit the scope of the present application, and all modifications of equivalent structures and equivalent processes, which are made by the contents of the specification and the drawings of the present application, or which are directly or indirectly applied to other related technical fields, are included in the scope of the present application.

Claims (10)

1. A camera module, characterized in that the camera module comprises:
a lens, the lens comprising a light incident end and a light emergent end;
a photosensitive module, located on a light emergent path of the lens and configured to receive light emitted from the light emergent end;
a filter, wherein a single filter is disposed corresponding to a single lens, and the filter is located between the lens and the photosensitive module; and
a control module, connected to the filter and configured to select one of the filters to transmit light.
2. The camera module of claim 1, wherein the control module is configured to transmit a predetermined voltage signal to the selected filter to make the filter transparent.
3. The camera module of claim 1, further comprising a clock module, wherein the clock module is connected to the control module, and the control module is configured to selectively transmit a preset voltage signal to one of the optical filters according to a clock signal when receiving the clock signal from the clock module, so that the optical filter transmits light.
4. The camera module according to claim 1, wherein the control module comprises at least a first preset voltage output terminal and a second preset voltage output terminal, the first and second preset voltage output terminals being respectively connected to different filters, so that the filters transmit light of different wavebands.
5. The camera module of claim 1, further comprising a lens barrel, wherein the photosensitive module is disposed at a rear end of the lens barrel, the lens and the filter are disposed at a front end of the lens barrel, and the lens and/or the filter are detachably connected to the lens barrel.
6. The camera module of any one of claims 1-5, wherein the lenses comprise at least two of a standard lens, a wide-angle lens, an ultra-wide-angle lens, a telephoto lens, a macro lens, and a blurring lens.
7. The camera module of any of claims 1-5, wherein at least one of the filters is an electro-optical filter.
8. The camera module of any of claims 1-5, wherein at least one of the filters is an electro-optically tunable filter.
9. An intelligent terminal, characterized in that, intelligent terminal includes:
a main body; and
the camera module of any one of claims 1-8, mounted to the main body.
10. The intelligent terminal according to claim 9, wherein the intelligent terminal comprises an input module, and the input module is connected with the control module of the camera module.
CN202222128725.1U 2022-08-11 2022-08-11 Camera module and intelligent terminal Active CN217770227U (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202222128725.1U CN217770227U (en) 2022-08-11 2022-08-11 Camera module and intelligent terminal

Publications (1)

Publication Number Publication Date
CN217770227U true CN217770227U (en) 2022-11-08

Family

ID=83878484



Legal Events

Date Code Title Description
GR01 Patent grant