CN115184956A - TOF sensor system and electronic device - Google Patents

TOF sensor system and electronic device

Info

Publication number
CN115184956A
Authority
CN
China
Prior art keywords
light
tof
mode
sensor system
sub
Prior art date
Legal status
Granted
Application number
CN202211099980.6A
Other languages
Chinese (zh)
Other versions
CN115184956B (en)
Inventor
刘忠兴
冯晓刚
Current Assignee
Honor Device Co Ltd
Original Assignee
Honor Device Co Ltd
Priority date
Filing date
Publication date
Application filed by Honor Device Co Ltd filed Critical Honor Device Co Ltd
Priority to CN202211099980.6A
Publication of CN115184956A
Application granted
Publication of CN115184956B
Status: Active


Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01S - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 - Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88 - Lidar systems specially adapted for specific applications
    • G01S17/89 - Lidar systems specially adapted for specific applications for mapping or imaging
    • G01S17/894 - 3D imaging with simultaneous measurement of time-of-flight at a 2D array of receiver pixels, e.g. time-of-flight cameras or flash lidar
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01J - MEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
    • G01J1/00 - Photometry, e.g. photographic exposure meter
    • G01J1/42 - Photometry, e.g. photographic exposure meter using electric radiation detectors
    • G01J1/4204 - Photometry, e.g. photographic exposure meter using electric radiation detectors with determination of ambient light
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01S - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00 - Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48 - Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/481 - Constructional features, e.g. arrangements of optical elements

Abstract

The present application provides a TOF sensor system and an electronic device, and relates to the technical field of electronic devices. The TOF sensor system not only supports 3D imaging but can also detect the ambient light in the external environment in which the electronic device is located, so that a separate ambient light sensor can be omitted. This helps reduce the number of devices in the electronic device and saves space inside the electronic device.

Description

TOF sensor system and electronic device
Technical Field
The present application relates to the technical field of electronic devices, and in particular, to a TOF sensor system and an electronic device.
Background
Three-dimensional (3D) imaging technology extracts 3D information of an object by optical means and recovers the 3D features of the object as faithfully as possible during reconstruction. Accurate acquisition of depth information is the key to 3D imaging. Time-of-flight (TOF) sensors are the most widely used sensors in 3D imaging because they obtain depth information efficiently and with high quality. The working principle of a TOF sensor is as follows: infrared light (e.g., light with a wavelength of 850 nm or 940 nm) is continuously emitted toward the target object, the infrared light reflected from the target object is received, and the distance between the TOF sensor and the target object is calculated from the measured round-trip time of the infrared light, thereby generating depth information and, in turn, a 3D image.
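The distance computation described above can be illustrated with a short sketch (not part of the patent; the function name and values are illustrative): the distance to the target object is half the product of the speed of light and the measured round-trip time.

```python
# Illustrative sketch of the TOF distance computation described above.
SPEED_OF_LIGHT_M_PER_S = 299_792_458.0

def tof_distance_m(round_trip_time_s: float) -> float:
    """Distance = (speed of light x round-trip time) / 2."""
    return SPEED_OF_LIGHT_M_PER_S * round_trip_time_s / 2.0

# A round trip of about 3.34 nanoseconds corresponds to roughly 0.5 m.
print(f"{tof_distance_m(3.34e-9):.3f} m")
```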
At present, however, an electronic device is provided not only with a TOF sensor but also with an ambient light sensor (ALS). The ambient light intensity in the current environment of the electronic device, as measured by the ambient light sensor, is used to adjust the screen brightness.
Arranging the TOF sensor and the ambient light sensor separately, however, occupies more space inside the electronic device.
Disclosure of Invention
Embodiments of the present application provide a TOF sensor system and an electronic device. The TOF sensor system not only implements 3D imaging but can also detect the ambient light in the external environment in which the electronic device is located, so that a separate ambient light sensor can be omitted. This helps reduce the number of devices in the electronic device and saves space inside the electronic device.
In order to achieve the above purpose, the embodiment of the present application adopts the following technical solutions:
a TOF sensor system according to an embodiment of the first aspect of the application, comprising: the device comprises a light source, a light receiving device and a processing circuit. The light source is used for providing infrared light for the target object; the light receiving device comprises a light filter and an image sensor, the light filter comprises a first light filtering part and a second light filtering part, the first light filtering part is transparent to visible light, the second light filtering part is cut off to the visible light and is transparent to infrared light, the image sensor is arranged on one side of the light filter, the image sensor comprises a pixel layer, the pixel layer comprises a first pixel area and a second pixel area, the first pixel area is opposite to the first light filtering part and is used for sensing the first light filtered by the first light filtering part, the second pixel area is opposite to the second light filtering part, the infrared light is reflected by a target object and forms second light after being filtered by the second light filtering part, and the second pixel area is used for sensing the second light; the processing circuit is used for outputting a first signal according to first light sensed by the first pixel region, and the first signal is used for indicating ambient light intensity information in an external environment; the processing circuit is further configured to output a second signal according to a second light sensed by the second pixel region, where the second signal is used to indicate distance information between X positions of the reflecting surface of the target object and the image sensor, where X is a positive integer greater than 1.
In the TOF sensor system of the embodiments of the present application, the function of an ambient light sensor can be integrated while the original imaging function of the TOF sensor is retained. When the TOF sensor system is used in an electronic device, this helps simplify the structure of the electronic device, saves space inside the electronic device, and makes the structure of the electronic device more compact.
In some embodiments, the first filtering portion is disposed around the perimeter of the second filtering portion. This helps ensure the image acquisition performance of the TOF sensor system in the TOF mode.
In some embodiments, the first filtering portion includes two first portions disposed on opposite sides of the second filtering portion. This likewise helps ensure the image acquisition performance of the TOF sensor system in the TOF mode.
In some embodiments, the two first portions are symmetrically disposed with respect to the second filtering portion.
In some embodiments, the first filtering portion includes N first portions and the second filtering portion includes M second portions. The N first portions and the M second portions form (N + M) filtering portions arranged in an array, with one second portion located between every two adjacent first portions in each row and one second portion located between every two adjacent first portions in each column, where N and M are positive integers greater than or equal to 2. This improves the ambient light detection performance of the TOF sensor system.
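A minimal sketch of such an alternating arrangement is given below; it assumes a checkerboard-style layout, which is only one way to satisfy the row and column condition above.

```python
# Checkerboard-style layout sketch: '1' marks a first (visible-light) portion,
# '2' marks a second (infrared) portion; adjacent first portions in any row or
# column are separated by a second portion.
def filter_layout(rows: int, cols: int):
    return [["1" if (r + c) % 2 == 0 else "2" for c in range(cols)]
            for r in range(rows)]

for row in filter_layout(4, 4):
    print(" ".join(row))
# 1 2 1 2
# 2 1 2 1
# 1 2 1 2
# 2 1 2 1
```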
In some embodiments, the TOF sensor system further includes an optical lens on the side of the optical filter away from the image sensor, and the optical axis of the optical lens is collinear with the central axis of the second filtering portion. This helps ensure the image acquisition performance of the TOF sensor system in the TOF mode.
In some embodiments, the area of the second filtering portion is greater than the area of the first filtering portion. This also helps ensure the image acquisition performance of the TOF sensor system in the TOF mode.
In some embodiments, the TOF sensor system has a proximity light mode and a TOF mode, and the TOF sensor system can be switched between the proximity light mode and the TOF mode; the second pixel region includes a first sub-region. In the TOF mode, the processing circuit is configured to output the second signal according to the second light sensed by the second pixel region. In the proximity light mode, the processing circuit is configured to output a third signal according to the second light sensed by the first sub-region; the third signal indicates distance information between Y positions on the reflecting surface of the target object and the image sensor, where Y is a positive integer smaller than X. In this way, the TOF sensor system can also provide a proximity detection function.
In some embodiments, the TOF sensor system has a proximity light mode and a TOF mode, and the TOF sensor system can be switched between the proximity light mode and the TOF mode; the second pixel region includes a first sub-region. In the TOF mode, the processing circuit is configured to output the second signal according to the second light sensed by the second pixel region. In the proximity light mode, the processing circuit is configured to generate distance information between Y positions on the reflecting surface of the target object and the image sensor according to the second light sensed by the first sub-region, process the Y pieces of distance information to obtain first distance information, compare the first distance information with a preset threshold, and output a third signal according to the magnitude relationship between the first distance information and the preset threshold. The third signal is either proximity indication information indicating that the target object is close to the image sensor or non-proximity indication information indicating that the target object is not close to the image sensor, where Y is a positive integer smaller than X. In this way, the TOF sensor system provides a proximity detection function while reducing the computing load on the processor and improving its operating reliability.
In some embodiments, the processing circuit is configured to determine the Y pieces of distance information and to take the minimum value of the Y pieces of distance information as the first distance information. This helps ensure reliable detection of whether the target object is close to the electronic device.
In some embodiments, the processing circuit is configured to determine the Y pieces of distance information and to take the average of the Y pieces of distance information as the first distance information. This likewise helps ensure reliable detection of whether the target object is close to the electronic device.
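The two aggregation strategies above (minimum or average of the Y distances) and the subsequent threshold comparison can be sketched as follows; the 5 cm threshold and the signal values are assumptions, not values from the patent.

```python
# Hedged sketch of deriving the first distance information from Y distances and
# producing the third signal by comparing it with a preset threshold.
def first_distance(distances_m, strategy="min"):
    return min(distances_m) if strategy == "min" else sum(distances_m) / len(distances_m)

def third_signal(distances_m, threshold_m=0.05, strategy="min"):
    """Return 'near' if the aggregated distance is below the preset threshold."""
    return "near" if first_distance(distances_m, strategy) < threshold_m else "not_near"

print(third_signal([0.04, 0.09, 0.12]))                  # 'near' (minimum is 4 cm)
print(third_signal([0.04, 0.09, 0.12], strategy="avg"))  # 'not_near' (average is about 8.3 cm)
```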
In some embodiments, the TOF sensor system further includes a power driver electrically connected to the light source. In the TOF mode, the power driver is configured to output a first current to the light source; in the proximity light mode, the power driver is configured to output a second current to the light source, where the first current is greater than the second current. This helps reduce the power consumption of the TOF sensor system in the proximity light mode.
In some embodiments, the light source includes a first sub-light source and a second sub-light source, and the number of first sub-light sources is greater than the number of second sub-light sources. The first sub-light source is configured to provide infrared light to the target object in the TOF mode, and the second sub-light source is configured to provide infrared light to the target object in the proximity light mode. This also helps reduce the power consumption of the TOF sensor system in the proximity light mode.
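The two power-saving measures above (a smaller drive current and fewer active sub-light sources in the proximity light mode) can be summarized by the following configuration sketch; the numbers are illustrative assumptions, not values from the patent.

```python
# Illustrative per-mode light source configuration; in hardware this would
# program the power driver and enable the selected sub-light sources.
MODE_CONFIG = {
    "tof":             {"drive_current_ma": 100, "active_sub_sources": 16},
    "proximity_light": {"drive_current_ma": 20,  "active_sub_sources": 2},
}

def configure_light_source(mode: str) -> dict:
    return MODE_CONFIG[mode]

print(configure_light_source("proximity_light"))
```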
In some embodiments, the first sub-region is located in the middle of the second pixel region. This helps ensure the accuracy of proximity detection.
In some embodiments, the light receiving device further includes a circuit substrate with a light-passing hole. The image sensor is electrically connected to the circuit substrate, and the optical filter and the image sensor are fixed on opposite sides of the circuit substrate. The orthographic projections of the first filtering portion, the second filtering portion, the first pixel region, and the second pixel region on the circuit substrate are all located within the light-passing hole. This reduces the distance between the pixel layer and the optical filter and improves the alignment accuracy between the pixel layer and the optical filter.
An electronic device according to an embodiment of the second aspect of the present application includes a TOF sensor system and a processor. The TOF sensor system is the TOF sensor system of any of the embodiments described above. The processor is electrically connected to the TOF sensor system and is configured to control the electronic device to perform corresponding operations according to the first signal and the second signal.
In some embodiments, the TOF sensor system has a proximity light mode and a TOF mode and can be switched between them; the second pixel region includes a first sub-region. In the TOF mode, the processing circuit is configured to output the second signal according to the second light sensed by the second pixel region. In the proximity light mode, the processing circuit is configured to output a third signal according to the second light sensed by the first sub-region, the third signal indicating distance information between Y positions on the reflecting surface of the target object and the image sensor, where Y is a positive integer smaller than X. The processor receives the third signal, processes the Y pieces of distance information to obtain first distance information, compares the first distance information with a preset threshold, and controls the electronic device to perform corresponding operations according to the magnitude relationship between the first distance information and the preset threshold.
In some embodiments, the processor is configured to determine the Y pieces of distance information and to take the minimum value of the Y pieces of distance information as the first distance information. This helps ensure reliable detection of whether the target object is close to the electronic device.
In some embodiments, the processor is configured to determine the Y pieces of distance information and to take the average of the Y pieces of distance information as the first distance information. This likewise helps ensure reliable detection of whether the target object is close to the electronic device.
In some embodiments, the TOF sensor system has a proximity light mode and a TOF mode and can be switched between them; the second pixel region includes a first sub-region. In the TOF mode, the processing circuit is configured to output the second signal according to the second light sensed by the second pixel region. In the proximity light mode, the processing circuit is configured to generate distance information between Y positions on the reflecting surface of the target object and the image sensor according to the second light sensed by the first sub-region, process the Y pieces of distance information to obtain first distance information, compare the first distance information with a preset threshold, and output a third signal according to the magnitude relationship between the first distance information and the preset threshold. The third signal is either proximity indication information indicating that the target object is close to the image sensor or non-proximity indication information indicating that the target object is not close to the image sensor, where Y is a positive integer smaller than X. The processor is further configured to control the electronic device to perform corresponding operations according to the third signal.
In some embodiments, the processing circuit is configured to determine the Y pieces of distance information and to take the minimum value of the Y pieces of distance information as the first distance information. This helps ensure reliable detection of whether the target object is close to the electronic device.
In some embodiments, the processing circuit is configured to determine the Y pieces of distance information and to take the average of the Y pieces of distance information as the first distance information. This likewise helps ensure reliable detection of whether the target object is close to the electronic device.
In some embodiments, the TOF sensor system further includes a power driver electrically connected to the light source, and the processor is configured to control the power driver to output a first current to the light source in the TOF mode and to output a second current to the light source in the proximity light mode, where the first current is greater than the second current.
In some embodiments, the light source includes a first sub-light source and a second sub-light source, and the number of first sub-light sources is greater than the number of second sub-light sources. The processor is configured to control the first sub-light source to provide infrared light to the target object in the TOF mode, and to control the second sub-light source to provide infrared light to the target object in the proximity light mode.
For the technical effects of any design in the second aspect, reference may be made to the technical effects of the corresponding designs in the first aspect; details are not repeated here.
Drawings
FIG. 1 is a schematic view of an electronic device in the related art;
FIG. 2 is a schematic diagram of a TOF sensor according to some embodiments of the related art;
FIG. 3 is a schematic structural diagram of an electronic device according to an embodiment of the present application;
FIG. 4 is a schematic illustration of an operating state of a TOF sensor system according to some embodiments of the present application;
FIG. 5 is a schematic structural diagram of an electronic device according to some embodiments of the present application;
FIG. 6 is a schematic diagram of a TOF sensor system according to some embodiments of the present application;
FIG. 7 is a schematic cross-sectional view of the filter shown in FIG. 6;
FIG. 8 is a schematic diagram showing the relative positions of the optical filter and the image sensor in the light receiving device shown in FIG. 6;
FIG. 9 is a schematic view of the light source emitting device shown in FIG. 6;
FIG. 10 is a schematic diagram illustrating the relative positions of a filter and an image sensor according to other embodiments of the present application;
FIG. 11 is a schematic diagram illustrating the relative positions of a filter and an image sensor according to still other embodiments of the present application;
FIG. 12 is a schematic view of a TOF sensor system according to further embodiments of the present application.
Detailed Description
In the embodiments of the present application, the terms "exemplary" or "such as" are used to mean serving as an example, instance, or illustration. Any embodiment or design described herein as "exemplary" or "e.g.," is not necessarily to be construed as preferred or advantageous over other embodiments or designs. Rather, use of the word "exemplary" or "such as" is intended to present concepts related in a concrete fashion.
In the embodiments of the present application, the terms "first" and "second" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include one or more of that feature.
In the description of the embodiments of the present application, the term "at least one" means one or more, and "a plurality" means two or more. "At least one of the following" or similar expressions refer to any combination of these items, including any combination of single or plural items. For example, at least one of a, b, or c may represent: a; b; c; a and b; a and c; b and c; or a, b, and c, where a, b, and c may each be singular or plural.
In the description of the embodiments of the present application, the term "and/or" describes an association relationship between associated objects and covers any and all possible combinations of one or more of the listed items. For example, "A and/or B" may mean: A alone, both A and B, or B alone. In addition, the character "/" in this application generally indicates an "or" relationship between the associated objects.
In the description of the embodiments of the present application, the terms "comprises", "comprising", or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element preceded by "comprising a ..." does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises the element.
An electronic device such as a mobile phone can be configured with various sensors to enrich its functions and give the user a better experience. Referring to fig. 1, fig. 1 is a schematic diagram of an electronic device 200 in the related art. The front of the electronic device 200 may be provided with a TOF sensor 20a. The TOF sensor 20a performs 3D imaging to realize a 3D face unlocking function. As another example, a camera (e.g., a front camera) of the electronic device 200 uses the TOF sensor 20a to assist focusing and to determine depth of field when photographing, so as to implement background blurring and the like.
Referring to fig. 2, fig. 2 is a schematic diagram of a TOF sensor 20a according to some embodiments of the related art. The TOF sensor 20a includes a light receiving device 20a1 and a light source emitting device 20a2.
The light receiving device 20a1 includes an optical lens 20a13, an infrared filter 20a11, and an image sensor 20a12. The infrared filter 20a11 is interposed between the optical lens 20a13 and the image sensor 20a12 and is used to filter light so that infrared light passes through while visible light is blocked.
The light source emitting device 20a2 may provide a plurality of infrared rays to the target object at the same time. The infrared rays reach different positions on the reflecting surface of the target object. After being reflected by the reflecting surface, the infrared light passes in turn through the optical lens 20a13 and the infrared filter 20a11, reaches different photosensitive pixels in the pixel layer of the image sensor 20a12, and is sensed by the corresponding photosensitive pixels to generate corresponding electrical signals. Distance information between different positions on the reflecting surface of the target object and the image sensor can be obtained from the round-trip flight time of the infrared light sensed by the different photosensitive pixels, thereby generating depth information and 3D image information and assisting the electronic device in implementing corresponding functions.
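A per-pixel version of the same computation, as a hedged sketch: each photosensitive pixel contributes one round-trip time, and the resulting grid of distances forms the depth map. The input values are arbitrary.

```python
# Minimal depth map sketch: distance per pixel = (speed of light x time) / 2.
SPEED_OF_LIGHT = 299_792_458.0  # m/s

def depth_map(round_trip_times_s):
    """round_trip_times_s: 2D list of per-pixel round-trip times in seconds."""
    return [[SPEED_OF_LIGHT * t / 2.0 for t in row] for row in round_trip_times_s]

times = [[3.3e-9, 3.4e-9],
         [6.6e-9, 6.8e-9]]
for row in depth_map(times):
    print([round(d, 3) for d in row])  # distances in metres
```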
Referring to fig. 1, besides the TOF sensor 20a, light-sensing sensors such as an ambient light sensor (ALS) 20b and a proximity light sensor (PS) 20c are disposed on the front surface of the electronic device. The ambient light sensor 20b typically includes a light-sensitive element and a logic circuit. The light-sensitive element detects ambient light and generates a current. The logic circuit includes a current amplifier and a passive low-pass filter to detect and process the output voltage signal produced by the optical input. Using the ambient light intensity measured by the ambient light sensor 20b in the current environment, the electronic device can automatically adjust the screen brightness.
The proximity light sensor 20c includes an emitting end and a receiving end. The emitting end provides infrared light to the target object. The infrared light is reflected after reaching the target object, and the reflected infrared light is sensed by a photosensitive element at the receiving end. Because the distance between the proximity light sensor 20c and the target object varies, the intensity of the infrared light reflected by the target object also varies: the closer the distance, the stronger the reflected infrared light; the farther the distance, the weaker the reflected infrared light. The proximity light sensor 20c is therefore used to detect whether an object is approaching the electronic device, so that the screen can be turned off automatically during a call and accidental touches can be avoided.
Referring to fig. 1, since the TOF sensor 20a and the light-sensing sensors are disposed at intervals on the front surface of the electronic device 200 and are independent of each other, they occupy more space inside the electronic device 200 and on its main circuit board, and the cost is higher.
To solve the above technical problems, and based on the fact that both the TOF sensor 20a and the light-sensing sensors sense light intensity, the inventors of the present application improved on the related art by integrating the TOF sensor 20a and the light-sensing sensors into a single unit, and provide a TOF sensor system. In the TOF sensor system of the embodiments of the present application, the function of an ambient light sensor, or the functions of both an ambient light sensor and a proximity light sensor, can be integrated while the original imaging function of the TOF sensor is retained. When the TOF sensor system is used in an electronic device, this helps simplify the structure of the electronic device, saves space inside the electronic device, and makes the structure more compact.
The electronic device according to embodiments of the present application will be described in detail below with reference to the drawings. For example, the electronic device described in the embodiments of the present application may be a mobile phone, a tablet computer, a desktop computer, a laptop computer, a handheld computer, a notebook computer, an ultra-mobile personal computer (UMPC), a netbook, a cellular phone, a personal digital assistant (PDA), a wearable electronic device (e.g., a smart watch), an augmented reality (AR)/virtual reality (VR) device, or the like; the embodiments of the present application do not specifically limit the form of the device.
Referring to fig. 3, fig. 3 is a schematic structural diagram of an electronic device according to an embodiment of the present disclosure. The electronic device 100 may include a main circuit board, a processor 110, an external memory interface 120, an internal memory 121, a Universal Serial Bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, a button 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a Subscriber Identification Module (SIM) card interface 195, and the like.
The processor 110, the external memory interface 120, the internal memory 121, a Universal Serial Bus (USB) interface 130, the charging management module 140, the power management module 141, the battery 142, the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the earphone interface 170D, the sensor module 180, the button 190, the motor 191, the indicator 192, the camera 193, the display screen 194, and the Subscriber Identity Module (SIM) card interface 195 may be electrically connected to the main circuit board to electrically connect the devices.
The sensor module 180 may include a TOF sensor system 10, a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, a bone conduction sensor 180M, and the like.
It is to be understood that the illustrated structure of the embodiment of the present application does not specifically limit the electronic device 100. In other embodiments of the present application, electronic device 100 may include more or fewer components than shown, or some components may be combined, some components may be split, or a different arrangement of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
Processor 110 may include one or more processing units, such as: the processor 110 may include an Application Processor (AP), a modem processor, a Graphics Processing Unit (GPU), an Image Signal Processor (ISP), a controller, a memory, a video codec, a Digital Signal Processor (DSP), a baseband processor, and/or a neural-Network Processing Unit (NPU), etc. The different processing units may be separate devices or may be integrated into one or more processors. The controller may be, among other things, a neural center and a command center of the electronic device 100. The controller can generate an operation control signal according to the instruction operation code and the timing signal to complete the control of instruction fetching and instruction execution.
A memory may also be provided in the processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache memory. The memory may hold instructions or data that have just been used or recycled by the processor 110. If the processor 110 needs to use the instruction or data again, it can be called directly from memory. Avoiding repeated accesses reduces the latency of the processor 110, thereby increasing the efficiency of the system.
In some embodiments, processor 110 may include one or more interfaces. The interface may include an integrated circuit (I2C) interface, an integrated circuit built-in audio (I2S) interface, a Pulse Code Modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a Mobile Industry Processor Interface (MIPI), a general-purpose input/output (GPIO) interface, a Subscriber Identity Module (SIM) interface, and/or a Universal Serial Bus (USB) interface, etc.
The I2C interface is a bidirectional synchronous serial bus including a serial data line (SDA) and a Serial Clock Line (SCL). In some embodiments, processor 110 may include multiple sets of I2C buses. The processor 110 may be coupled to the TOF sensor system 10, the touch sensor 180K, the charger, the flash, the camera 193, etc., respectively, through different I2C bus interfaces. For example, the processor 110 may be coupled to the TOF sensor system 10 through an I2C interface, so that the processor 110 and the TOF sensor system 10 communicate through an I2C bus interface, thereby implementing a 3D face unlocking function of the electronic device 100, detecting an ambient light intensity of an external environment where the electronic device 100 is currently located, and implementing functions of the electronic device 100, such as automatically adjusting screen brightness.
MIPI interfaces may be used to connect processor 110 with peripheral devices such as display screen 194, camera 193, and the like. The MIPI interface includes a Camera Serial Interface (CSI), a Display Serial Interface (DSI), and the like. In some embodiments, processor 110 and camera 193 communicate through a CSI interface to implement the capture functionality of electronic device 100. The processor 110 and the display screen 194 communicate through the DSI interface to implement the display function of the electronic device 100.
The GPIO interface may be configured by software. The GPIO interface may be configured as a control signal and may also be configured as a data signal. In some embodiments, a GPIO interface may be used to connect the processor 110 with the camera 193, the display screen 194, the wireless communication module 160, the audio module 170, the sensor module 180, such as the TOF sensor system 10, and the like. The GPIO interface may also be configured as an I2C interface, I2S interface, UART interface, MIPI interface, and the like.
It should be understood that the interface connection relationship between the modules illustrated in the embodiments of the present application is only an illustration, and does not limit the structure of the electronic device 100. In other embodiments of the present application, the electronic device 100 may also adopt different interface connection manners or a combination of multiple interface connection manners in the above embodiments.
The charging management module 140 is configured to receive charging input from a charger. The charger may be a wireless charger or a wired charger. In some wired charging embodiments, the charging management module 140 may receive charging input from a wired charger via the USB interface 130. In some wireless charging embodiments, the charging management module 140 may receive a wireless charging input through a wireless charging coil of the electronic device 100. The charging management module 140 may also supply power to the electronic device through the power management module 141 while charging the battery 142.
The power management module 141 is used to connect the battery 142, the charging management module 140 and the processor 110. The power management module 141 receives input from the battery 142 and/or the charge management module 140 and provides power to the processor 110, the internal memory 121, the external memory, the display screen 194, the camera 193, the sensor module 180, such as the TOF sensor system 10 and the wireless communication module 160, and the like. The power management module 141 may also be used to monitor parameters such as battery capacity, battery cycle count, battery state of health (leakage, impedance), etc. In some other embodiments, the power management module 141 may also be disposed in the processor 110. In other embodiments, the power management module 141 and the charging management module 140 may be disposed in the same device.
The wireless communication function of the electronic device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the electronic device 100 may be used to cover a single or multiple communication bands. Different antennas can also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed as a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 150 may provide a solution including 2G/3G/4G/5G wireless communication applied to the electronic device 100. The mobile communication module 150 may include at least one filter, a switch, a power amplifier, a Low Noise Amplifier (LNA), and the like. The mobile communication module 150 may receive the electromagnetic wave from the antenna 1, filter, amplify, etc. the received electromagnetic wave, and transmit the electromagnetic wave to the modem processor for demodulation. The mobile communication module 150 may also amplify the signal modulated by the modem processor, and convert the signal into electromagnetic wave through the antenna 1 to radiate the electromagnetic wave. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the processor 110. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the same device as at least some of the modules of the processor 110.
The wireless communication module 160 may provide a solution for wireless communication applied to the electronic device 100, including Wireless Local Area Networks (WLANs) (e.g., wireless fidelity (Wi-Fi) networks), bluetooth (bluetooth, BT), global Navigation Satellite System (GNSS), frequency Modulation (FM), near Field Communication (NFC), infrared (IR), and the like. The wireless communication module 160 may be one or more devices integrating at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 2, performs frequency modulation and filtering processing on electromagnetic wave signals, and transmits the processed signals to the processor 110. The wireless communication module 160 may also receive a signal to be transmitted from the processor 110, perform frequency modulation and amplification on the signal, and convert the signal into electromagnetic waves through the antenna 2 to radiate the electromagnetic waves.
The electronic device 100 implements display functions via the GPU, the display screen 194, and the application processor. The GPU is a microprocessor for image processing, and is connected to the display screen 194 and an application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. The processor 110 may include one or more GPUs that execute program instructions to generate or alter display information.
The display screen 194 is used to display images, video, and the like. The display screen 194 includes a display panel. The display panel may be a liquid crystal display (LCD), an organic light-emitting diode (OLED) display, an active-matrix organic light-emitting diode (AMOLED) display, a flexible light-emitting diode (FLED) display, a Mini-LED display, a Micro-OLED display, a quantum dot light-emitting diode (QLED) display, or the like.
The electronic device 100 may implement a photographing function through the TOF sensor system 10, the ISP, the camera 193, the video codec, the GPU, the display screen 194, and the application processor, etc.
The ISP is used to process the data fed back by the camera 193. For example, when a photo is taken, the shutter is opened, light is transmitted to the camera photosensitive element through the lens, the optical signal is converted into an electrical signal, and the camera photosensitive element transmits the electrical signal to the ISP for processing and converting into an image visible to naked eyes. The ISP can also carry out algorithm optimization on the noise, brightness and skin color of the image. The ISP can also optimize parameters such as exposure, color temperature and the like of a shooting scene. In some embodiments, the ISP may be provided in camera 193.
The camera 193 is used to capture still images or video. The object generates an optical image through the lens and projects the optical image to the photosensitive element. The photosensitive element may be a Charge Coupled Device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor. The light sensing element converts the optical signal into an electrical signal, which is then passed to the ISP where it is converted into a digital image signal. And the ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into image signal in standard RGB, YUV and other formats.
The digital signal processor is used for processing digital signals, and can process other digital signals besides digital image signals. For example, when the electronic device 100 selects a frequency bin, the digital signal processor is used to perform fourier transform or the like on the frequency bin energy.
Video codecs are used to compress or decompress digital video. The electronic device 100 may support one or more video codecs. In this way, the electronic device 100 may play or record video in a variety of encoding formats, such as: moving Picture Experts Group (MPEG) 1, MPEG2, MPEG3, MPEG4, and the like.
The NPU is a neural-network (NN) computing processor that processes input information quickly by using a biological neural network structure, for example, by using a transfer mode between neurons of a human brain, and can also learn by itself continuously. Applications such as intelligent recognition of the electronic device 100 can be realized through the NPU, for example: image recognition, face recognition, speech recognition, text understanding, and the like.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to extend the storage capability of the electronic device 100. The external memory card communicates with the processor 110 through the external memory interface 120 to implement a data storage function. For example, files such as music, video, etc. are saved in the external memory card.
The internal memory 121 may be used to store computer-executable program code, which includes instructions. The processor 110 executes various functional applications of the electronic device 100 and data processing by executing instructions stored in the internal memory 121. The internal memory 121 may include a program storage area and a data storage area. The storage program area may store an operating system, an application program (such as a sound playing function, an image playing function, etc.) required by at least one function, and the like. The storage data area may store data (such as audio data, phone book, etc.) created during use of the electronic device 100, and the like. In addition, the internal memory 121 may include a high-speed random access memory, and may further include a nonvolatile memory, such as at least one magnetic disk storage device, a flash memory device, a universal flash memory (UFS), and the like.
The electronic device 100 may implement audio functions via the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the headphone interface 170D, and the application processor. Such as music playing, recording, etc.
The audio module 170 is used to convert digital audio information into an analog audio signal output and also to convert an analog audio input into a digital audio signal. The audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be disposed in the processor 110, or some functional modules of the audio module 170 may be disposed in the processor 110.
The speaker 170A, also called a "horn", is used to convert the audio electrical signal into a sound signal. The electronic apparatus 100 can listen to music through the speaker 170A or listen to a handsfree call.
The receiver 170B, also called "earpiece", is used to convert the electrical audio signal into an acoustic signal. When the electronic apparatus 100 receives a call or voice information, it can receive voice by placing the receiver 170B close to the ear of the person.
The microphone 170C, also referred to as a "microphone," is used to convert sound signals into electrical signals. When making a call or transmitting voice information, the user can input a voice signal to the microphone 170C by speaking near the microphone 170C through the mouth. The electronic device 100 may be provided with at least one microphone 170C. In other embodiments, the electronic device 100 may be provided with two microphones 170C to achieve a noise reduction function in addition to collecting sound signals. In other embodiments, the electronic device 100 may further include three, four or more microphones 170C to collect sound signals, reduce noise, identify sound sources, perform directional recording, and so on.
The earphone interface 170D is used to connect a wired earphone. The earphone interface 170D may be the USB interface 130, or may be a 3.5 mm Open Mobile Terminal Platform (OMTP) standard interface or a Cellular Telecommunications Industry Association of the USA (CTIA) standard interface.
The keys 190 include a power-on key, a volume key, and the like. The keys 190 may be mechanical keys. Or may be touch keys. The electronic apparatus 100 may receive a key input, and generate a key signal input related to user setting and function control of the electronic apparatus 100.
The motor 191 may generate a vibration cue. The motor 191 may be used for incoming call vibration prompts as well as for touch vibration feedback. For example, touch operations applied to different applications (e.g., photographing, audio playing, etc.) may correspond to different vibration feedback effects. The motor 191 may also respond to different vibration feedback effects for touch operations applied to different areas of the display screen 194. Different application scenes (such as time reminding, receiving information, alarm clock, game and the like) can also correspond to different vibration feedback effects. The touch vibration feedback effect may also support customization.
Indicator 192 may be an indicator light that may be used to indicate a state of charge, a change in charge, or a message, missed call, notification, etc.
The SIM card interface 195 is used to connect a SIM card. The SIM card can be brought into and out of contact with the electronic device 100 by being inserted into or pulled out of the SIM card interface 195. The electronic device 100 may support 1 or N SIM card interfaces, N being a positive integer greater than 1. The SIM card interface 195 may support a Nano SIM card, a Micro SIM card, a SIM card, and the like. Multiple cards can be inserted into the same SIM card interface 195 at the same time; the cards may be of the same type or of different types. The SIM card interface 195 may also be compatible with different types of SIM cards and with external memory cards. The electronic device 100 interacts with the network through the SIM card to implement functions such as calls and data communication. In some embodiments, the electronic device 100 employs an eSIM, namely an embedded SIM card. The eSIM card can be embedded in the electronic device 100 and cannot be separated from the electronic device 100.
The pressure sensor 180A is used for sensing a pressure signal, and can convert the pressure signal into an electrical signal. In some embodiments, the pressure sensor 180A may be disposed on the display screen 194. When a touch operation is applied to the display screen 194, the electronic apparatus 100 detects the intensity of the touch operation according to the pressure sensor 180A. The electronic apparatus 100 may also calculate the touched position from the detection signal of the pressure sensor 180A.
The gyro sensor 180B may be used to determine the motion attitude of the electronic device 100. In some embodiments, the angular velocity of electronic device 100 about three axes (i.e., the x, y, and z axes) may be determined by gyroscope sensor 180B. The gyro sensor 180B may be used for photographing anti-shake. For example, when the shutter is pressed, the gyro sensor 180B detects a shake angle of the electronic device 100, calculates a distance to be compensated for by the lens module according to the shake angle, and allows the lens to counteract the shake of the electronic device 100 through a reverse movement, thereby achieving anti-shake. The gyroscope sensor 180B may also be used for navigation, somatosensory gaming scenes.
The air pressure sensor 180C is used to measure air pressure. In some embodiments, electronic device 100 calculates altitude, aiding in positioning and navigation, from barometric pressure values measured by barometric pressure sensor 180C.
The magnetic sensor 180D includes a Hall sensor. The electronic device 100 may use the magnetic sensor 180D to detect the opening and closing of a flip holster. In some embodiments, when the electronic device 100 is a flip phone, the electronic device 100 may detect the opening and closing of the flip cover according to the magnetic sensor 180D, and then set features such as automatic unlocking upon opening according to the detected open or closed state of the holster or the flip cover.
The acceleration sensor 180E may detect the magnitude of the acceleration of the electronic device 100 in various directions (typically along three axes), and may detect the magnitude and direction of gravity when the electronic device 100 is stationary. It may also be used to recognize the posture of the electronic device and is applied to landscape/portrait switching, pedometers, and other applications.
The distance sensor 180F is used to measure distance. The electronic device 100 may measure distance by infrared or laser. In some embodiments, in a photographing scenario, the electronic device 100 may use the distance sensor 180F to measure distance for fast focusing.
The fingerprint sensor 180H is used to collect a fingerprint. The electronic device 100 can utilize the collected fingerprint characteristics to unlock the fingerprint, access the application lock, photograph the fingerprint, answer an incoming call with the fingerprint, and so on.
The temperature sensor 180J is used to detect temperature. In some embodiments, electronic device 100 implements a temperature processing strategy using the temperature detected by temperature sensor 180J. For example, when the temperature reported by the temperature sensor 180J exceeds a threshold, the electronic device 100 performs a reduction in performance of a processor located near the temperature sensor 180J, so as to reduce power consumption and implement thermal protection. In other embodiments, the electronic device 100 heats the battery 142 when the temperature is below another threshold to avoid the low temperature causing the electronic device 100 to shut down abnormally.
The touch sensor 180K is also referred to as a "touch panel". The touch sensor 180K may be disposed on the display screen 194, and the touch sensor 180K and the display screen 194 form a touch screen, which is also called a "touch screen". The touch sensor 180K is used to detect a touch operation applied thereto or nearby. The touch sensor can communicate the detected touch operation to the application processor to determine the touch event type. Visual output associated with the touch operation may be provided through the display screen 194. In other embodiments, the touch sensor 180K may be disposed on a surface of the electronic device 100, different from the position of the display screen 194.
The bone conduction sensor 180M may acquire a vibration signal. In some embodiments, the bone conduction sensor 180M may acquire a vibration signal of the human vocal part vibrating the bone mass. The bone conduction sensor 180M may also contact the human pulse to receive the blood pressure pulsation signal. In some embodiments, the bone conduction sensor 180M may also be disposed in a headset, integrated into a bone conduction headset. The audio module 170 may analyze a voice signal based on the vibration signal of the bone mass vibrated by the sound part acquired by the bone conduction sensor 180M, so as to implement a voice function. The application processor can analyze heart rate information based on the blood pressure beating signals acquired by the bone conduction sensor 180M, and the heart rate detection function is achieved.
The TOF sensor system 10 may implement 3D imaging to realize a 3D face unlocking function, or a camera (e.g., a front camera) of the electronic device 100 may use the TOF sensor to assist focusing and to determine depth of field when photographing, so as to implement background blurring and other functions. In addition, the TOF sensor system may be used to detect the ambient light intensity in the external environment in which the electronic device 100 is located, that is, the TOF sensor system 10 may integrate the function of an ambient light sensor. Further, the TOF sensor system may also have the function of a proximity light sensor. In the following description, the TOF sensor system 10 is described as having both the ambient light and proximity detection functions; after reading this application, those skilled in the art will understand how to integrate only an ambient light sensor into the TOF sensor system 10.
Specifically, the TOF sensor system 10 can have a TOF mode, an ambient light mode, and a proximity light mode. The TOF sensor system 10 can operate in any one of these modes alone, in the TOF mode and the ambient light mode simultaneously, or in the ambient light mode and the proximity light mode simultaneously, and it can be switched freely between these modes and mode combinations.
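The allowed mode combinations can be summarized by a short sketch; the enum and the switching function are assumptions made for illustration, since the patent does not define a software interface.

```python
# Hedged sketch of the mode combinations listed above.
from enum import Enum

class Mode(Enum):
    TOF = "tof"
    AMBIENT_LIGHT = "ambient_light"
    PROXIMITY_LIGHT = "proximity_light"

# Combinations the text says the TOF sensor system 10 can operate in.
ALLOWED = [
    {Mode.TOF}, {Mode.AMBIENT_LIGHT}, {Mode.PROXIMITY_LIGHT},
    {Mode.TOF, Mode.AMBIENT_LIGHT},
    {Mode.AMBIENT_LIGHT, Mode.PROXIMITY_LIGHT},
]

def switch_to(requested: set) -> set:
    """Switch freely to any allowed mode or mode combination."""
    if requested in ALLOWED:
        return requested
    raise ValueError(f"unsupported mode combination: {requested}")

print(switch_to({Mode.TOF, Mode.AMBIENT_LIGHT}))
```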
Referring to fig. 4, fig. 4 is a schematic diagram illustrating operation of the TOF sensor system 10 according to some embodiments of the present disclosure. The TOF sensor system 10 is on a side of the display screen 194 that faces away from a display surface of the display screen 194. Specifically, the TOF sensor system 10 is fixed to the main circuit board 101 and is located between the main circuit board 101 and the display screen 194. Display screen 194 may have a light-transmissive portion (not shown) through which infrared light delivered to the target object by TOF sensor system 10 may pass. The light-transmissive portion may also be used for ambient light in the external environment in which the electronic device 100 is located and for infrared light reflected by the target object to pass through for receipt by the TOF sensor system 10. Illustratively, the light-transmitting portion may be divided into a first light-transmitting portion and a second light-transmitting portion that are spaced apart. The first light-transmitting portion is used for infrared light emitted by the TOF sensor system 10 to pass through so as to be projected to a target object. The second light-transmitting portion is used for light rays in the external environment where the electronic device 100 is located and infrared light reflected by the target object to pass through so as to be received by the TOF sensor system 10. Of course, it is understood that in other examples, there may be one light-transmitting portion.
When the TOF sensor system 10 is in the TOF mode, the TOF sensor system 10 projects infrared light (e.g., light having a wavelength of 850 nm or 940 nm) toward the target object; the infrared light reaches different positions of a reflecting surface of the target object and is reflected by the reflecting surface. The TOF sensor system 10 receives the infrared light reflected from the reflecting surface of the target object and detects the round-trip time of the infrared light reflected from the different positions of the reflecting surface. After detecting the round-trip times of the different infrared rays, the TOF sensor system 10 can calculate the distance between the electronic device 100 and each position of the reflecting surface of the target object according to distance = (speed of light × round-trip time) / 2, thereby generating depth information and image information and assisting the electronic device 100 in implementing corresponding functions, for example, 3D face unlocking and depth-of-field determination for background blurring when a shooting function is used.
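By way of a non-limiting illustration (not part of the claimed structure), the following Python sketch shows the distance calculation described above for a single reflected infrared ray; the function name and the example round-trip time are assumptions introduced here for explanation only.

# Illustrative sketch: one-way distance from a measured round-trip time,
# using distance = (speed of light x round-trip time) / 2.
SPEED_OF_LIGHT_M_PER_S = 299_792_458.0

def distance_from_round_trip(round_trip_time_s: float) -> float:
    """Return the one-way distance in meters for one reflected infrared ray."""
    return SPEED_OF_LIGHT_M_PER_S * round_trip_time_s / 2.0

# A round trip of about 2 nanoseconds corresponds to roughly 0.3 m.
print(distance_from_round_trip(2e-9))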
The TOF sensor system 10 can function as an ambient light sensor when the TOF sensor system 10 is in an ambient light mode. The TOF sensor system 10 can be used to sense the current ambient light intensity of the external environment in which the electronic device 100 is located. The electronic device 100 may adjust the display screen brightness according to the obtained light intensity of the ambient light. For example, when the ambient light is dark, the brightness of the display screen 194 is reduced to prevent glare; when the ambient light is brighter, the brightness of the display screen 194 is increased, so that the display screen 194 can be displayed more clearly.
When the TOF sensor system 10 is in the proximity light mode, the TOF sensor system 10 can use the TOF principle to detect whether an object is close to the electronic device 100, that is, it can use the TOF principle to implement the function of a proximity light sensor. Specifically, when the TOF sensor system 10 is in the proximity light mode, the distance between the electronic device 100 and the target object can be calculated from the round-trip time of the detected infrared light, so as to determine whether an object is near the electronic device 100. In this way, when it is detected that the user holds the electronic device 100 close to an ear for a call, the electronic device 100 can control the display screen to be turned off, which prevents the human body from accidentally touching the screen and saves power.
It can be appreciated that the electronic device 100 (for example, the processor 110) can control the TOF sensor system 10 to operate in different modes when the electronic device 100 is in different operating states. That is, the electronic device 100 may control the TOF sensor system 10 to operate in the mode corresponding to its current operating state. For example, when the electronic device 100 is in a bright-screen state (including a bright screen while locked and a bright screen after unlocking), the electronic device 100 controls the ambient light mode of the TOF sensor system 10 to be turned on. For another example, when the electronic device 100 is in a bright-screen state while locked and is being unlocked using the 3D face unlocking function, the electronic device 100 controls both the TOF mode and the ambient light mode of the TOF sensor system 10 to be turned on. For another example, when the user is making a call on the electronic device 100, the electronic device 100 controls the proximity light mode of the TOF sensor system 10 to be turned on. For another example, when the user is making a call on the electronic device 100 and the electronic device 100 is in the bright-screen state, the electronic device 100 controls both the proximity light mode and the ambient light mode to be turned on.
To facilitate control of the TOF sensor system 10 by the processor 110 according to different operating states of the electronic device 100, please refer to fig. 5, which is a schematic structural diagram of the electronic device 100 according to some embodiments of the present application. The electronic device 100 includes a detection component 111. The detection component 111 is configured to acquire current operating state information of the electronic device 100 and feed the acquired operating state information back to the processor 110. The processor 110 controls the TOF sensor system 10 to operate in a corresponding mode based on the operating state information. For example, when the operating state information acquired by the processor 110 indicates that the electronic device 100 is in a bright-screen state (including a bright screen while locked and a bright screen after unlocking), the processor 110 controls the ambient light mode of the TOF sensor system 10 to be turned on. For another example, when the operating state information indicates that the electronic device 100 is in a bright-screen state while locked and is being unlocked using the 3D face unlocking function, the processor 110 controls both the TOF mode and the ambient light mode of the TOF sensor system 10 to be turned on. For another example, when the operating state information indicates that the user is making a call on the electronic device 100, the processor 110 controls the proximity light mode of the TOF sensor system 10 to be turned on. For another example, when the operating state information indicates that the user is making a call on the electronic device 100 and the electronic device 100 is in the bright-screen state, the processor 110 controls both the proximity light mode and the ambient light mode to be turned on.
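By way of illustration only, the following Python sketch summarizes the mode-selection logic described above; the type and function names (Mode, select_modes) and the boolean state flags are assumptions introduced here and are not part of the patent.

# Hypothetical sketch: map the current operating state to the modes to enable.
from enum import Flag, auto

class Mode(Flag):
    NONE = 0
    TOF = auto()
    AMBIENT_LIGHT = auto()
    PROXIMITY = auto()

def select_modes(screen_on: bool, face_unlocking: bool, in_call: bool) -> Mode:
    modes = Mode.NONE
    if screen_on:
        modes |= Mode.AMBIENT_LIGHT             # bright-screen state: ambient light mode on
    if face_unlocking:
        modes |= Mode.TOF | Mode.AMBIENT_LIGHT  # 3D face unlocking: TOF mode + ambient light mode
    if in_call:
        modes |= Mode.PROXIMITY                 # call in progress: proximity light mode on
    return modes

# Example: a call with the screen lit turns on both the proximity light mode and the ambient light mode.
print(select_modes(screen_on=True, face_unlocking=False, in_call=True))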
The detecting component 111 includes, but is not limited to, the aforementioned display screen 194, the receiver 170B, the microphone 170C, the camera 193, the key 190, the mobile communication module 150, the wireless communication module 160, the pressure sensor 180A, the gyroscope sensor 180B, the air pressure sensor 180C, the magnetic sensor 180D, the acceleration sensor 180E, the distance sensor 180F, the fingerprint sensor 180H, the temperature sensor 180J, the touch sensor 180K, the bone conduction sensor 180M, and the like. Illustratively, in the standby state of the electronic device 100, when the power key is triggered by the user, the power key transmits the detected trigger signal to the processor 110, and the processor 110 controls the display screen 194 to be lit and controls both the ambient light mode and the TOF mode of the TOF sensor system 10 to be on. The acquisition of the current operating state information of the electronic device 100 by the processor 110 is well known in the art and will not be described in detail herein.
To achieve the above-described functionality of the TOF sensor system 10, a detailed description of the specific structure of the TOF sensor system 10 is provided below with reference to the accompanying drawings.
Referring to fig. 6, fig. 6 is a schematic diagram of a TOF sensor system 10 according to some embodiments of the present application. The TOF sensor system 10 includes a light receiving device 10a, a light source emitting device 10b, and a processing circuit 10a22.
The light source emitting device 10b and the light receiving device 10a are disposed on the same side of the electronic device 100. That is, the light source emitting device 10b and the light receiving device 10a are disposed side by side on the same side of the electronic device 100.
The electronic device 100 may control the light source emitting device 10b to provide infrared light to the target object when the TOF sensor system 10 is in the TOF mode, and may likewise control the light source emitting device 10b to provide infrared light to the target object when the TOF sensor system 10 is in the proximity light mode. Specifically, in some embodiments, the processor 110 may control the light source emitting device 10b to provide infrared light to the target object when the TOF sensor system 10 is in the TOF mode, and may also control the light source emitting device 10b to provide infrared light to the target object when the TOF sensor system 10 is in the proximity light mode. In other embodiments, the TOF sensor system 10 further includes a control unit electrically connected to the processor 110; the control unit can receive control instructions from the processor 110 and, according to those instructions, control the light source emitting device 10b to provide infrared light to the target object in the TOF mode and in the proximity light mode. The infrared light provided by the light source emitting device 10b reaches the target object and, after being reflected by the target object, reaches the light receiving device 10a.
With reference to fig. 6, the light source emitting device 10b includes a substrate 10b1, a light source 10b2 and a diffusion sheet 10b3.
The substrate 10b1 is electrically connected to the processor 110. The substrate 10b1 may be a printed circuit board (PCB). The substrate 10b1 may be, for example, a ceramic substrate, a resin substrate, a copper substrate, or the like.
The light source 10b2 is used to provide infrared light to the target object. The light source 10b2 is disposed on the substrate 10b1. Illustratively, the light source 10b2 may be a laser, such as a vertical-cavity surface-emitting laser (VCSEL) or an edge-emitting laser (EEL). There may be one light source 10b2 or a plurality of light sources 10b2.
The diffusion sheet 10b3 may be provided on the light exit side of the light source 10b2. The diffusion sheet 10b3 may function to diffuse the light projected from the light source 10b2, which is advantageous for more light to reach the target object.
It is understood that fig. 6 only schematically shows some components included in the light source emitting device 10b, and the actual configuration and actual size of these components are not limited by fig. 6. Further, the light source emitting device 10b may include other devices. Other structures of the light source emitting device 10b are well known to those skilled in the art and are not described herein.
Referring to fig. 6, the light receiving device 10a includes a fixing bracket 10a4, a filter 10a1, an image sensor 10a2, an optical lens 10a3, and a circuit substrate 10a5.
The fixing bracket 10a4 may function to support and fix the optical filter 10a1, the image sensor 10a2, the optical lens 10a3, the circuit substrate 10a5, and the like. Exemplary materials for the fixing bracket 10a4 include, but are not limited to, plastic, metal, and combinations thereof. The fixing bracket 10a4 has an accommodating space 10a41. Opposite sides of the accommodating space 10a41 may have openings.
The optical lens 10a3 is fixed to one of the openings of the fixing bracket 10a 4. The optical lens 10a3 may function to collect light so that more light passes through the filter 10a1 and is directed to the image sensor 10a2.
The circuit substrate 10a5 may be a printed circuit board (PCB). The circuit substrate 10a5 may be, for example, a ceramic substrate, a resin substrate, a copper substrate, or the like. The circuit substrate 10a5 may be fixed to the fixing bracket 10a4 at the opening on the other side of the fixing bracket 10a4. The light receiving device 10a is electrically connected to the processor 110 through the circuit substrate 10a5.
The filter 10a1 is disposed in the accommodating space 10a41, between the circuit substrate 10a5 and the optical lens 10a3. To facilitate fixing of the filter 10a1, the inner peripheral wall of the fixing bracket 10a4 is provided with a support plate 10a42. The support plate 10a42 has a through hole 10a421. The filter 10a1 is supported and fixed on the support plate 10a42 and faces the through hole 10a421 so as to allow light to pass through. Exemplary ways of attaching the filter 10a1 to the support plate 10a42 include, but are not limited to, gluing, snapping, screwing, or integral molding.
With continued reference to fig. 6, the filter 10a1 includes a first filter portion 10a13 and a second filter portion 10a14. The first filtering portion 10a13 and the second filtering portion 10a14 may pass light of different wavelength bands.
Specifically, the first filter portion 10a13 transmits visible light; that is, visible light may pass through the first filter portion 10a13. Illustratively, the first filter portion 10a13 is transparent only to visible light, that is, only visible light may pass through the first filter portion 10a13, and light of other wavelength bands, for example infrared light, cannot pass through it. In this way, interference from infrared light reflected by the target object is prevented when the TOF sensor system 10 operates in the ambient light mode. Also illustratively, the first filter portion 10a13 is transparent to both visible light and infrared light, that is, both visible light and infrared light may pass through the first filter portion 10a13. For example, the first filter portion 10a13 is entirely transparent.
The second filter portion 10a14 passes infrared light and cuts off visible light. That is, the visible light cannot pass through the second filter portion 10a14, and the infrared light can pass through the second filter portion 10a14.
Specifically, in order to allow the first filtering portion 10a13 and the second filtering portion 10a14 to pass light of different wavelength bands, please refer to fig. 7, which is a schematic cross-sectional view of the filter 10a1 shown in fig. 6. The optical filter 10a1 may include a light-transmitting substrate 10a11 and an optical film layer 10a12. Illustratively, the light-transmitting substrate 10a11 is glass. The light-transmitting substrate 10a11 includes a first region and a second region. The optical film layer 10a12 includes a first optical film layer 10a121 and a second optical film layer 10a122. The first optical film layer 10a121 and the second optical film layer 10a122 transmit light of different wavelength bands. Specifically, in the actual processing of the optical filter 10a1, the first optical film layer 10a121 may be disposed in the first region of the light-transmitting substrate 10a11, and the second optical film layer 10a122 may be disposed in the second region of the light-transmitting substrate 10a11. The first optical film layer 10a121 and the first region define the first filtering portion 10a13. The second optical film layer 10a122 and the second region define the second filtering portion 10a14. It can be understood that, in other examples, to simplify the manufacturing process of the optical filter 10a1, the optical filter 10a1 may not include the first optical film layer 10a121, i.e., the first filtering portion 10a13 is defined by the first region alone.
With reference to fig. 6, the image sensor 10a2 is disposed on a surface of the circuit substrate 10a5 facing the filter 10a 1. The image sensor 10a2 is electrically connected to the circuit board 10a5.
Referring to fig. 8, fig. 8 is a schematic diagram illustrating a relative position relationship between the filter 10a1 and the image sensor 10a2 in the light receiving device 10a shown in fig. 6. The image sensor 10a2 includes a pixel layer 10a21. The pixel layer 10a21 includes a first pixel region 10a211 and a second pixel region 10a212.
The first pixel region 10a211 is opposite to the first filter portion 10a13. Exemplarily, the orthographic projection of the first pixel region 10a211 on the first filter portion 10a13 coincides with the first filter portion 10a13. Further exemplarily, the orthographic projection of the first pixel region 10a211 on the first filter portion 10a13 is located within the outer contour of the first filter portion 10a13.
The second pixel region 10a212 is opposite to the second filtering portion 10a14. Illustratively, the orthographic projection of the second pixel region 10a212 on the second filter portion 10a14 coincides with the second filter portion 10a14. As another example, the orthographic projection of the second pixel region 10a212 on the second filter portion 10a14 is located within the outer contour of the second filter portion 10a14.
Specifically, the first pixel region 10a211 may include Z pixel units, where Z is a positive integer, that is, the first pixel region 10a211 may include one pixel unit or may include a plurality of pixel units. The second pixel region 10a212 includes X pixel units. X is a positive integer greater than 1. That is, the second pixel region 10a212 includes a plurality of pixel units.
Illustratively, the plurality of pixel units of the first pixel region 10a211 and the pixel units of the second pixel region 10a212 are arranged in a pixel array.
Generally, each pixel unit includes a photoelectric converter and a switching tube. The photoelectric converter is a device that converts an optical signal into an electrical signal, for example a photodiode (PD) or an avalanche photodiode (APD). Illustratively, the switching tube may be an N-channel or P-channel metal-oxide-semiconductor field-effect transistor (NMOSFET or PMOSFET). The input end of the switching tube is connected to the signal output end of the photoelectric converter. When the switching tube is turned off, the pixel unit does not output an electrical signal; when the switching tube is turned on, the pixel unit can output the electrical signal converted by the photoelectric converter. Other structures of the pixel unit are well known to those skilled in the art and will not be described in detail herein.
The electronic device 100 can control the on and off of the switching tubes of different pixel regions according to the current operating state information of the electronic device 100, so as to control the TOF sensor system 10 to operate in different modes. In some embodiments, the processor 110 may be electrically connected to the switching tubes of different pixel regions to control the switching tubes of different pixel regions to be turned on and off. In other embodiments, the control unit may control the switching tubes of different pixel regions to be turned on and off according to a control instruction of the processor 110. Specifically, the electronic device 100 controls the switching tube of the first pixel region 10a211 to be turned on so as to enter the ambient light mode. The electronic device 100 controls the switching tube of the second pixel region 10a212 to be turned on so as to enter the TOF mode.
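As a rough illustration of the switch control described above, the following Python sketch enables or disables the switching tubes of each pixel region per mode; set_region_switches() is a hypothetical placeholder, since the actual control path (processor 110 or control unit) is hardware-specific.

# Minimal sketch, assuming a register/GPIO-style control line per pixel region.
def set_region_switches(region: str, on: bool) -> None:
    # Placeholder for driving the switching tubes of one pixel region.
    print(f"{region} switching tubes {'on' if on else 'off'}")

def enter_ambient_light_mode() -> None:
    set_region_switches("first pixel region 10a211", True)   # visible-light pixels output
    set_region_switches("second pixel region 10a212", False)

def enter_tof_mode() -> None:
    set_region_switches("second pixel region 10a212", True)  # infrared pixels output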
Since the first filter portion 10a13 transmits visible light, the ambient light in the external environment where the electronic device 100 is located forms first light after being filtered by the first filter portion 10a13. Since the first pixel region 10a211 is opposite to the first filter portion 10a13, the first light can reach the first pixel region 10a211. The photoelectric converter of a pixel unit of the first pixel region 10a211 can sense the first light and convert the optical signal into a first electrical signal. When the electronic device 100 controls the TOF sensor system 10 to operate in the ambient light mode, the photoelectric converter outputs the first electrical signal to the processing circuit 10a22. The processing circuit 10a22 receives the first electrical signal and processes it, for example by converting the voltage signal into a digital signal, to output a first signal indicating the ambient light intensity information in the external environment.
The processor 110 may receive the first signal and control the electronic device 100 to perform a corresponding operation according to the first signal. For example, the processor 110 adaptively adjusts the brightness of the display screen 194 according to the obtained first signal. For example, when the ambient light is dark, the screen brightness is reduced to prevent dazzling; when the ambient light is brighter, the screen brightness is improved, and the screen display can be clearer. In addition, the processor 110 can also be used to implement other functions, such as automatically adjusting the white balance during the shooting function, according to the obtained first signal.
For example, the processor 110 may receive the first signal and determine the light intensity interval in which the ambient light intensity carried by the first signal falls. Different light intensity intervals may correspond to different screen brightness values. After determining the light intensity interval in which the ambient light intensity carried by the first signal falls, the processor 110 determines the screen brightness value corresponding to the current light intensity interval and controls the brightness of the display screen 194 to be adjusted to that screen brightness value.
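The following Python sketch illustrates the interval lookup just described; the lux boundaries and brightness values are assumed for illustration and are not specified by the patent.

# Illustrative sketch: map ambient light intensity to a screen brightness value.
LIGHT_INTENSITY_INTERVALS = [
    (0, 50, 30),        # dark environment    -> low screen brightness value
    (50, 500, 120),     # ordinary indoor     -> medium screen brightness value
    (500, 10_000, 255), # bright / outdoors   -> high screen brightness value
]

def brightness_for_ambient_light(lux: float) -> int:
    for low, high, brightness in LIGHT_INTENSITY_INTERVALS:
        if low <= lux < high:
            return brightness
    return 255  # brighter than the last interval: maximum brightness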
In some embodiments, the processing circuitry 10a22 may comprise first processing circuitry. The first processing circuit comprises a first analog-to-digital converter. The first analog-to-digital converter may be electrically connected to an output terminal of the switching tube of the first pixel region 10a211. The first analog-to-digital converter is used for receiving the first electric signal and converting the first electric signal into a digital signal. Of course, it will be appreciated that in other embodiments, the first processing circuitry may include more devices to perform more processing on the first electrical signal.
In the ambient light mode, reference may be made to prior-art ambient light sensors for the other structures required by the TOF sensor system 10 to achieve ambient light detection, which are not described herein.
When the electronic device 100 controls the TOF sensor system 10 to operate in the TOF mode, the electronic device 100 controls the light source 10b2 to provide a plurality of infrared rays to the target object. The plurality of infrared rays provided by the light source to the target object can respectively reach different positions of the reflecting surface of the target object and are reflected by those positions. The infrared rays reflected by the different positions of the reflecting surface of the target object pass through the second filter portion 10a14 to form a plurality of second light rays. Since the second pixel region 10a212 is opposite to the second filtering portion 10a14, the X pixel units of the second pixel region 10a212 may sense X second light rays, and each pixel unit generates a second electrical signal corresponding to the second light ray it senses, i.e., X second electrical signals are generated. Each second electrical signal may be used to indicate the time of flight of the second light ray corresponding to that second electrical signal. It can be understood that the X second light rays correspond to X positions of the reflecting surface of the target object, and each of the X second light rays is reflected by the corresponding position of the reflecting surface and sensed by the corresponding pixel unit of the second pixel region 10a212.
The processing circuit 10a22 receives the X second electrical signals and processes the X second electrical signals to output second signals. The second signal is used to indicate distance information between the X positions of the reflective surface of the target object and the image sensor. The distance information between the X positions of the reflecting surface of the target object and the image sensor corresponds to the X second electric signals one by one, and the distance information between each of the X positions of the reflecting surface of the target object and the image sensor is generated based on the corresponding second electric signals.
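Purely for illustration, the following Python sketch arranges X per-pixel distances into a grid of the kind carried by the second signal; it assumes each second electrical signal has already been converted into a round-trip time in seconds, which real hardware reports as phase or time codes rather than floating-point values.

# Rough sketch: per-pixel distances for the X positions of the reflecting surface.
SPEED_OF_LIGHT_M_PER_S = 299_792_458.0

def second_signal_distances(round_trip_times_s, rows, cols):
    """Arrange the X per-pixel distances (m) as a rows x cols grid matching the
    layout of the second pixel region."""
    distances = [SPEED_OF_LIGHT_M_PER_S * t / 2.0 for t in round_trip_times_s]
    return [distances[r * cols:(r + 1) * cols] for r in range(rows)]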
The processor 110 may receive the second signal and generate image information of the reflecting surface of the target object according to the second signal, thereby controlling the electronic device 100 to perform a corresponding operation. For example, the processor 110 may receive the second signal, generate image information of the reflecting surface of the target object according to the second signal, match the image information with preset image information, and, after the matching succeeds, control the electronic device 100 to implement the 3D face unlocking function.
In some embodiments, the processing circuit 10a22 may include a second processing circuit. The second processing circuit may include a second analog-to-digital converter, a processing module, and an output module. The second analog-to-digital converter may be electrically connected to the output end of the switching tube of the second pixel region 10a212. The second analog-to-digital converter is used for receiving the second electrical signal and converting it into a digital signal. The processing module may process the digital signal converted from the second electrical signal to generate the distance information, and output the second signal to the processor 110 via the output module.
Specifically, the processing circuit 10a22 may be entirely integrated in the image sensor 10a2. The processing circuit 10a22 may also be partially integrated in the image sensor 10a2 and partially integrated in the processor 110. The processing circuit 10a22 may also be integrated in other modules of the TOF sensor system 10, such as the control unit described above.
For the TOF sensor system 10, achieving 3D imaging places a high requirement on the number of pixels in the TOF mode. In the proximity light mode, complete image information does not need to be formed; only one or a few pieces of distance information need to be generated to determine whether the target object is close to the electronic device 100, so the TOF sensor system 10 has a low pixel requirement in the proximity light mode. Based on this, in order to simplify the structure of the light receiving device 10a, the second pixel region 10a212 may include a first sub-region 10a2121 and a remaining region 10a2122. The first sub-region 10a2121 includes Y pixel units, and the sensing of infrared light by the first sub-region 10a2121 is used for proximity detection. Y is a positive integer, and Y is less than X.
The electronic device 100 may control the on and off of the switching tube of the first sub-region 10a2121 according to the current operating state information of the electronic device 100, so as to control whether the TOF sensor system operates in the proximity light mode. In some embodiments, the processor 110 may be electrically connected to the switching tube of the first sub-region 10a2121 to control the switching tube to be turned on and off. In other embodiments, the control unit may control the switching tubes of the first sub-region 10a2121 to be turned on and off according to a control instruction of the processor 110. In some specific examples, the electronic device 100 controls the switching tubes of the first sub-region 10a2121 to be turned on, and controls the switching tubes of the remaining region 10a2122 to be turned off, so as to enter the approach light mode.
When the electronic device 100 controls the TOF sensor system 10 to operate in the proximity light mode, the electronic device 100 controls the light source 10b2 to provide a plurality of infrared lights to the target object. The plurality of infrared rays provided by the light source to the target object may respectively reach different positions of the reflecting surface of the target object and be reflected by the different positions of the reflecting surface of the target object. The infrared rays reflected by different positions of the reflecting surface of the target object may be transmitted through the second filter portion 10a14 to form a plurality of second rays. Since the first sub-area 10a2121 is opposite to the second filtering portion 10a14, Y pixel units of the first sub-area 10a2121 may sense Y second light rays, and each pixel unit may generate a third electrical signal corresponding to the second light ray sensed thereby. I.e. generating Y third electrical signals. Each third electrical signal may be used to indicate a time of flight of the second light corresponding to the third electrical signal. It can be understood that the Y second light rays correspond to Y positions of the reflection surface of the target object, and each of the Y second light rays is reflected by the corresponding position of the reflection surface of the target object and is sensed by the pixel unit corresponding to the first sub-area 10a 2121.
In some embodiments, processing circuit 10a22 receives Y third electrical signals and processes the Y third electrical signals to output third signals. The third signal is used to indicate distance information between the Y positions of the reflective surface of the target object and the image sensor. Distance information between the Y positions of the reflecting surface of the target object and the image sensor corresponds to the Y third electric signals one by one, and the distance information between each of the Y positions of the reflecting surface of the target object and the image sensor is generated based on the corresponding third electric signals.
The processor 110 may receive the third signal, evaluate the Y pieces of distance information, determine the minimum value of the Y pieces of distance information as the first distance information, compare the first distance information with a preset threshold, and control the electronic device 100 to perform a corresponding operation according to the magnitude relationship between the first distance information and the preset threshold. If the processor 110 determines that the first distance information does not exceed (i.e., is less than or equal to) the preset threshold, it can be determined that a target object is near the electronic device 100, and the electronic device 100 is controlled to perform a screen locking operation, so as to prevent the human body from accidentally touching the screen and to save power while the user is talking. If the processor 110 determines that the first distance information exceeds (i.e., is greater than) the preset threshold, it can be determined that no target object is near the electronic device 100, and the processor 110 keeps the display screen 194 of the electronic device 100 in the unlocked state, i.e., the processor 110 does not control the electronic device 100 to lock the screen during the call.
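A minimal sketch of this decision in Python is given below; the 0.04 m threshold is an assumed value (the patent does not specify a number), and the function names are introduced here for explanation only.

# Illustrative sketch: proximity decision from the Y pieces of distance information.
PRESET_THRESHOLD_M = 0.04  # assumed value for illustration

def first_distance(distances_m):
    """Minimum of the Y distance values (an average could also be used, see below)."""
    return min(distances_m)

def handle_proximity(distances_m) -> str:
    if first_distance(distances_m) <= PRESET_THRESHOLD_M:
        return "perform screen locking operation"  # a target object is near the device
    return "keep display screen unlocked"          # no target object nearby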
In other embodiments, the processing circuit 10a22 receives Y third electrical signals, and generates distance information between Y positions of the reflective surface of the target object and the image sensor according to the Y third electrical signals. The processing circuit is further used for judging the Y pieces of distance information and determining the minimum value in the Y pieces of distance information as first distance information, and the processing circuit is further used for comparing the first distance information with a preset threshold value and outputting a third signal according to the size relation between the first distance information and the preset threshold value. The third signal may be instruction information for instructing the electronic apparatus 100 to perform a corresponding operation. The command information may be non-approach command information indicating that the target object is not in proximity to the image sensor 10a2, or may be approach command information indicating that the target object is in proximity to the image sensor 10a2. Specifically, when the processing circuit 10a22 determines that the first distance information does not exceed the preset threshold, that is, it can be determined that the target object exists near the electronic device 100, the third signal output by the processing circuit 10a22 is the proximity instruction information for indicating that the target object is in proximity to the image sensor 10a2. When the processing circuit 10a22 determines that the first distance information exceeds the preset threshold, that is, it can be determined that there is no target object near the electronic device 100, the processing circuit 10a22 outputs a third signal as non-proximity instruction information indicating that the target object is not in proximity to the image sensor 10a2.
The processor 110 controls the electronic device 100 to perform corresponding operations according to the different instructions carried by the received third signal. For example, when the processor 110 receives the third signal carrying the non-proximity instruction information, the processor 110 keeps the display screen 194 of the electronic device 100 in the unlocked state, i.e., the processor 110 does not control the electronic device 100 to lock the screen during the call. When receiving the third signal carrying the proximity instruction information, the processor 110 controls the electronic device 100 to perform a screen locking operation, so as to prevent the human body from accidentally touching the screen and to save power. This offloads part of the computation from the processor 110 and frees computing capacity of the processor 110 of the electronic device 100 for other tasks.
It should be noted that the determination manner of the first distance information is not limited to the manner described above, and in other examples, the first distance information may be an average value of Y pieces of distance information.
In order to improve the reception of infrared light by the first sub-region 10a2121 in the proximity light mode, in some embodiments the first sub-region 10a2121 is located in a middle region of the second pixel region 10a212.
Here, "middle region" should be broadly understood; it refers to a position away from the edge of the second pixel region 10a212. Illustratively, the center of the first sub-region 10a2121 coincides with the center of the second pixel region 10a212.
With reference to fig. 8, the first filter portion 10a13 is ring-shaped, and the first filter portion 10a13 is disposed around the edge of the second filter portion 10a14. Thus, the first pixel region 10a211 has a ring shape, and the first pixel region 10a211 surrounds the outer periphery of the second pixel region 10a212. Therefore, the sensing effect of the first pixel region 10a211 and the second pixel region 10a212 on light rays is facilitated, and the signal acquisition effect of the TOF sensor system 10 is further improved. Illustratively, the central axis of the first filter portion 10a13 is collinear with the central axis of the second filter portion 10a14.
In the TOF mode, the second signal output by the processing circuit 10a22 is used to indicate image information of the target object. To ensure the image acquisition effect of the TOF sensor system 10, in some embodiments, please refer back to fig. 6, the optical axis O1 of the optical lens 10a3 is collinear with the central axis of the second filtering portion 10a14. Illustratively, the central axis of the second filtering portion 10a14, the central axis of the first filtering portion 10a13, the central axis of the first pixel region 10a211, and the central axis of the second pixel region 10a212 are all collinear with the optical axis of the optical lens 10a3, thereby improving the acquisition effect of the TOF sensor system 10.
Likewise, because in the TOF mode the second signal output by the processing circuit 10a22 is used to indicate image information of the target object, in some embodiments the area of the second filtering portion 10a14 is larger than the area of the first filtering portion 10a13 in order to ensure the image acquisition effect of the TOF sensor system 10. This helps ensure that the area of the second pixel region 10a212 is larger than that of the first pixel region 10a211, thereby improving the image acquisition effect of the TOF sensor system 10.
For the TOF sensor system 10 to achieve 3D imaging, in the TOF mode more infrared light beams need to be projected onto the target object so that more infrared light beams can be reflected onto the image sensor, which requires a higher-power light source 10b2. In the proximity light mode, only one or a few pieces of distance information need to be generated to determine whether the target object is close to the electronic device 100, rather than forming image information; therefore, the power of the light source 10b2 can be relatively low in the proximity light mode. To reduce the power consumption of the TOF sensor system 10, in some embodiments, please refer to fig. 9, which is a schematic diagram of the light source emitting device 10b of fig. 6. The light source emitting device 10b includes a power supply driver 10b4. The power supply driver 10b4 may be electrically connected to the power management module 141. The power supply driver 10b4 may be disposed on the substrate 10b1 and is electrically connected to the light source 10b2. In the TOF mode, the processor 110 may control the power supply driver 10b4 to output a first current to the light source 10b2; in the proximity light mode, the processor 110 may control the power supply driver 10b4 to output a second current to the light source 10b2, where the first current is greater than the second current. Illustratively, the first current is in the range of 1 to 4 amperes, and the second current is in the range of 10 to 50 milliamperes.
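A short Python sketch of this drive-current selection follows; the specific current values are chosen from the illustrative ranges above, and set_drive_current is a hypothetical callback standing in for programming the power supply driver 10b4.

# Illustrative sketch: choose the light-source drive current by operating mode.
FIRST_CURRENT_A = 2.0    # within the 1-4 A range given for the TOF mode
SECOND_CURRENT_A = 0.03  # within the 10-50 mA range given for the proximity light mode

def configure_power_supply_driver(mode: str, set_drive_current) -> None:
    if mode == "tof":
        set_drive_current(FIRST_CURRENT_A)        # first current
    elif mode == "proximity":
        set_drive_current(SECOND_CURRENT_A)       # second current (first > second)

configure_power_supply_driver("proximity", lambda amps: print(f"drive current: {amps} A"))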
To reduce the power consumption of the TOF sensor system 10, in other embodiments the light source 10b2 includes a plurality of first sub-light sources and a plurality of second sub-light sources. The first sub-light sources and the second sub-light sources are arranged at intervals; illustratively, they are arranged in an array. In the TOF mode, the processor 110 may control the plurality of first sub-light sources to provide infrared light to the target object. In the proximity light mode, the processor 110 controls the plurality of second sub-light sources to provide infrared light to the target object. The number of first sub-light sources may be greater than the number of second sub-light sources. Of course, it can be understood that in other examples, in the TOF mode the processor 110 may control all of the sub-light sources to provide infrared light to the target object, and in the proximity light mode the processor 110 may control only the first sub-light sources or only the second sub-light sources to provide infrared light to the target object.
It should be understood that the number of the second sub-light sources may also be one, as long as it is ensured that the number of the first sub-light sources may be larger than the number of the second sub-light sources.
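The following Python sketch illustrates the sub-light-source grouping described above; the emitter counts and the index-based grouping are assumptions introduced only for illustration.

# Illustrative sketch: select which sub-light sources to enable in each mode.
FIRST_SUB_LIGHT_SOURCES = list(range(12))  # larger group of emitters, used in the TOF mode
SECOND_SUB_LIGHT_SOURCES = [12]            # smaller group (may be a single emitter)

def sub_light_sources_for_mode(mode: str):
    return FIRST_SUB_LIGHT_SOURCES if mode == "tof" else SECOND_SUB_LIGHT_SOURCES

print(sub_light_sources_for_mode("proximity"))  # -> [12]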
Referring to fig. 10, fig. 10 is a schematic diagram illustrating a relative position relationship between a filter 10a1 and an image sensor 10a2 according to another embodiment of the present disclosure. In the embodiment shown in fig. 10, the first filter portion 10a13 includes two first portions 10a131. The second filter portion 10a14 is an integral body. The two first portions are located on opposite sides of the second filter portion 10a14. At this time, the first pixel region 10a211 includes two first pixel sub-regions 10a2111. The second pixel region 10a212 is an integral body. The two first pixel sub-regions 10a2111 are located at opposite sides of the second pixel region 10a212. Illustratively, two first portions 10a131 are symmetrically disposed at both sides of the second filter portion 10a14. Thus, the structure is simple, and it is beneficial to realize the collinear arrangement of the central axis of the second filtering portion 10a14, the central axis of the second pixel region 10a212 and the optical axis of the optical lens 10a3, thereby improving the image capturing effect of the TOF sensor system 10.
Referring to fig. 11, fig. 11 is a schematic diagram illustrating a relative position relationship between a filter 10a1 and an image sensor 10a2 according to still another embodiment of the present disclosure. In the embodiment shown in fig. 11, the first filter portion 10a13 includes N first portions 10a131. The second filter portion 10a14 includes M second portions 10a141. The N first portions and the M second portions constitute (N + M) filter portions. The (N + M) filter portions are arranged in an array, one second portion 10a141 is arranged between two adjacent first portions 10a131 in each row of filter portions, and one second portion 10a141 is arranged between two adjacent first portions 10a131 in each column of filter portions, wherein N, M are positive integers greater than or equal to 2. At this time, the first pixel region 10a211 includes N first pixel sub-regions 10a2111, and the second pixel region 10a212 includes M second pixel sub-regions 10a2123. The N first pixel sub-regions 10a2111 correspond one-to-one to the N first portions 10a131. The M second pixel sub-regions 10a2123 correspond to the M second portions 10a141 one to one. This configuration is simple, and is advantageous in improving the effect of detecting the ambient light by the first pixel region 10a211.
For example, the first sub-region 10a2121 of the second pixel region 10a212 may be one of the second pixel sub-regions 10a2123, a part of one of the second pixel sub-regions 10a2123, or a part of M second pixel sub-regions 10a2123.
Referring to fig. 12, fig. 12 is a schematic diagram of a TOF sensor system 10 according to further embodiments of the present application. In order to reduce the distance between the pixel layer 10a21 and the filter 10a1 and to improve the alignment accuracy between the pixel layer 10a21 of the image sensor 10a2 and the filter 10a1, so as to ensure that the first pixel region 10a211 is opposite to the first filter portion 10a13 and the second pixel region 10a212 is opposite to the second filter portion 10a14, the circuit substrate 10a5 is provided with a light-transmitting hole. The optical filter 10a1 and the image sensor 10a2 are fixed to the two sides of the circuit substrate 10a5, and the orthographic projection of the first filter portion 10a13 on the circuit substrate 10a5, the orthographic projection of the second filter portion 10a14 on the circuit substrate 10a5, the orthographic projection of the first pixel region 10a211 on the circuit substrate 10a5, and the orthographic projection of the second pixel region 10a212 on the circuit substrate 10a5 are all located within the light-transmitting hole. For example, the optical filter 10a1, the image sensor 10a2, and the circuit substrate 10a5 are packaged as one body by a flip-chip process.
In some embodiments, when the electronic device 100 controls the TOF sensor system 10 to operate only in the ambient light mode, the electronic device 100 can control the power management module 141 to supply power only to the first pixel region 10a211 and not to the second pixel region 10a212 in the TOF sensor system 10, so as to reduce power consumption.
In the description herein, particular features, structures, materials, or characteristics may be combined in any suitable manner in any one or more embodiments or examples.
Finally, it should be noted that: the above embodiments are only used to illustrate the technical solutions of the present application, and not to limit the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; and such modifications or substitutions do not depart from the spirit and scope of the corresponding technical solutions of the embodiments of the present application.

Claims (19)

1. A TOF sensor system, comprising:
a light source for providing infrared light to a target object;
the light receiving device comprises a light filter and an image sensor, the light filter comprises a first light filtering part and a second light filtering part, the first light filtering part is transparent to visible light, the second light filtering part is cut off to the visible light and is transparent to infrared light, the image sensor is arranged on one side of the light filter, the image sensor comprises a pixel layer, the pixel layer comprises a first pixel region and a second pixel region, the first pixel region is opposite to the first light filtering part and is used for sensing the first light filtered by the first light filtering part, the second pixel region is opposite to the second light filtering part, the infrared light is reflected by a target object and forms second light after being filtered by the second light filtering part, and the second pixel region is used for sensing the second light;
the processing circuit is used for outputting a first signal according to first light sensed by the first pixel region, and the first signal is used for indicating ambient light intensity information in an external environment; the processing circuit is further configured to output a second signal according to a second light sensed by the second pixel region, where the second signal is used to indicate distance information between X positions of a reflecting surface of the target object and the image sensor, where X is a positive integer greater than 1.
2. The TOF sensor system of claim 1, wherein the first filter portion is disposed around a perimeter of the second filter portion.
3. The TOF sensor system of claim 1, wherein the first filtering portion comprises two first portions disposed on opposite sides of the second filtering portion.
4. The TOF sensor system of claim 3, wherein two of the first portions are symmetrically disposed with respect to the second filtering portion.
5. The TOF sensor system of claim 1, wherein the first light filtering portion comprises N first portions, the second light filtering portion comprises M second portions, the N first portions and the M second portions constitute (N + M) light filtering portions, the (N + M) light filtering portions are arranged in an array, one second portion is arranged between two adjacent first portions in each row of the light filtering portions, and one second portion is arranged between two adjacent first portions in each column of the light filtering portions, wherein N, M are positive integers greater than or equal to 2.
6. The TOF sensor system of any of claims 1-4, further comprising an optical lens on a side of the filter remote from the image sensor, an optical axis of the optical lens being collinear with a central axis of the second filtering portion.
7. The TOF sensor system of claim 1, wherein an area of the second filtered portion is larger than an area of the first filtered portion.
8. The TOF sensor system of claim 1, wherein the TOF sensor system has a proximity light mode and a TOF mode, the TOF sensor system being switchable between the proximity light mode and the TOF mode;
the second pixel region includes a first sub-region;
in the TOF mode, the processing circuit is configured to output the second signal according to the second light sensed by the second pixel region;
in the approach light mode, the processing circuit is used for outputting a third signal according to the second light rays sensed by the first sub-area;
the third signal is used for indicating distance information between Y positions of the reflecting surface of the target object and the image sensor, wherein Y is a positive integer and is smaller than X.
9. The TOF sensor system of claim 1, wherein the TOF sensor system has a proximity light mode and a TOF mode, the TOF sensor system being switchable between the proximity light mode and the TOF mode;
the second pixel region includes a first sub-region;
in the TOF mode, the processing circuit is configured to output the second signal according to the second light sensed by the second pixel region;
in the approach light mode, the processing circuit is configured to generate distance information between Y positions of a reflecting surface of a target object and the image sensor according to the second light rays sensed by the first sub-region, the processing circuit is further configured to process the Y distance information to obtain first distance information, the processing circuit is further configured to compare the first distance information with a preset threshold, and output a third signal according to a magnitude relationship between the first distance information and the preset threshold, where the third signal is approach instruction information for indicating that the target object is close to the image sensor or non-approach instruction information for indicating that the target object is not close to the image sensor, Y is a positive integer and is smaller than X.
10. The TOF sensor system of claim 9, wherein the processing circuit is configured to determine Y of the distance information and determine a minimum of the Y of the distance information as the first distance information.
11. The TOF sensor system of any of claims 8-10, further comprising: a power supply drive electrically connected to the light source;
in the TOF mode, the power driver is configured to output a first current to the light source, and in the proximity light mode, the power driver is configured to output a second current to the light source, wherein the first current is greater than the second current.
12. The TOF sensor system of any of claims 8-10, wherein the light source comprises a first sub light source and a second sub light source, the first sub light source being greater in number than the second sub light source, the first sub light source for providing infrared light to a target object in the TOF mode and the second sub light source for providing infrared light to a target object in the proximity light mode.
13. The TOF sensor system of any of claims 8-10, wherein the first sub-region is in the middle of the second pixel region.
14. The TOF sensor system of claim 1, wherein the light receiving device further comprises a circuit substrate having a light hole, the image sensor is electrically connected to the circuit substrate, the optical filter and the image sensor are respectively fixed on two sides of the circuit substrate, and an orthographic projection of the first filtering portion on the circuit substrate, an orthographic projection of the second filtering portion on the circuit substrate, an orthographic projection of the first pixel region on the circuit substrate, and an orthographic projection of the second pixel region on the circuit substrate are all located in the light hole.
15. An electronic device, comprising:
a TOF sensor system according to any one of claims 1-14; and
the processor is electrically connected with the TOF sensor system and used for controlling the electronic equipment to execute corresponding operation according to the first signal and the second signal.
16. The electronic device of claim 15, wherein the TOF sensor system has a proximity light mode and a TOF mode, the TOF sensor system being switchable between the proximity light mode and the TOF mode;
the second pixel region includes a first sub-region;
in the TOF mode, the processing circuit is configured to output the second signal according to the second light sensed by the second pixel region;
in the approach light mode, the processing circuit is configured to output a third signal according to the second light rays sensed by the first sub-region, where the third signal is used to indicate distance information between Y positions of the reflecting surface of the target object and the image sensor, where Y is a positive integer and is smaller than X;
the processor receives the third signal and processes the Y pieces of distance information to obtain first distance information, the processor is further used for comparing the first distance information with a preset threshold value, and the processor controls the electronic equipment to execute corresponding operation according to the size relation between the first distance information and the preset threshold value.
17. The electronic device of claim 15, wherein the TOF sensor system has a proximity light mode and a TOF mode, the TOF sensor system being switchable between the proximity light mode and the TOF mode;
the second pixel region includes a first sub-region;
in the TOF mode, the processing circuit is configured to output the second signal according to the second light sensed by the second pixel region;
in the approach light mode, the processing circuit is configured to generate distance information between Y positions of a reflecting surface of a target object and the image sensor according to second light rays sensed by the first sub-region, the processing circuit is further configured to process the Y distance information to obtain first distance information, the processing circuit is further configured to compare the first distance information with a preset threshold, and output a third signal according to a magnitude relation between the first distance information and the preset threshold, where the third signal is approach instruction information for indicating that the target object is close to the image sensor or non-approach instruction information for indicating that the target object is not close to the image sensor; wherein Y is a positive integer and is less than X;
the processor is further configured to control the electronic device to perform a corresponding operation according to the third signal.
18. The electronic device of claim 16 or 17, wherein the TOF sensor system further comprises: a power driver in electrical communication with the light source, the processor configured to control the power driver to output a first current to the light source in a TOF mode and configured to control the power driver to output a second current to the light source in the proximity light mode, wherein the first current is greater than the second current.
19. The electronic device of claim 16 or 17, wherein the light source comprises first sub-light sources and second sub-light sources, the number of the first sub-light sources is greater than the number of the second sub-light sources, the processor is configured to control the first sub-light sources to provide infrared light to the target object in the TOF mode, and the processor is further configured to control the second sub-light sources to provide infrared light to the target object in the proximity light mode.
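The sketch below illustrates the emitter partitioning of claim 19: a larger group of first sub-light sources for TOF imaging and a smaller group of second sub-light sources for the proximity light mode. The 12/2 split, the emitter indices, and the helper names are illustrative assumptions; the claim only requires that there be more first sub-light sources than second sub-light sources.

```python
# Sketch under assumptions: group sizes and indices are hypothetical examples.
from dataclasses import dataclass, field
from enum import Enum, auto
from typing import List


class Mode(Enum):
    TOF = auto()
    PROXIMITY_LIGHT = auto()


@dataclass
class EmitterBank:
    """Indices of the infrared emitters making up the light source."""
    first_sub_sources: List[int] = field(default_factory=lambda: list(range(0, 12)))
    second_sub_sources: List[int] = field(default_factory=lambda: list(range(12, 14)))


def emitters_for_mode(bank: EmitterBank, mode: Mode) -> List[int]:
    """Claim 19: drive the larger first group in the TOF mode and the smaller
    second group in the proximity light mode."""
    return bank.first_sub_sources if mode is Mode.TOF else bank.second_sub_sources


print(len(emitters_for_mode(EmitterBank(), Mode.TOF)))              # 12
print(len(emitters_for_mode(EmitterBank(), Mode.PROXIMITY_LIGHT)))  # 2
```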
CN202211099980.6A 2022-09-09 2022-09-09 TOF sensor system and electronic device Active CN115184956B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211099980.6A CN115184956B (en) 2022-09-09 2022-09-09 TOF sensor system and electronic device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211099980.6A CN115184956B (en) 2022-09-09 2022-09-09 TOF sensor system and electronic device

Publications (2)

Publication Number Publication Date
CN115184956A true CN115184956A (en) 2022-10-14
CN115184956B CN115184956B (en) 2023-01-13

Family

ID=83524258

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211099980.6A Active CN115184956B (en) 2022-09-09 2022-09-09 TOF sensor system and electronic device

Country Status (1)

Country Link
CN (1) CN115184956B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116359945A (en) * 2023-05-16 2023-06-30 荣耀终端有限公司 TOF sensing module and electronic equipment

Citations (26)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN205844201U (en) * 2016-03-31 2016-12-28 中国人民公安大学 A kind of device of quick identification scene vestige
CN207281284U (en) * 2017-08-15 2018-04-27 东莞市迈科新能源有限公司 A kind of vehicle-mounted stereo visual system based on TOF camera
CN108445499A (en) * 2018-02-07 2018-08-24 余晓智 A kind of the ambient light suppression system and method for TOF sensor
CN108632594A (en) * 2018-07-17 2018-10-09 王锐 A kind of intelligence commodity information display system and method
CN208314204U (en) * 2018-05-14 2019-01-01 孙向明 A kind of TOF 3D depth image sensor of environment resistant light interference
US20190178711A1 (en) * 2016-07-28 2019-06-13 Philips Lighting Holding B.V. Methods and systems for camera-based ambient light estimation
CN110456379A (en) * 2019-07-12 2019-11-15 深圳奥比中光科技有限公司 The depth measurement device and distance measurement method of fusion
WO2020074467A1 (en) * 2018-10-08 2020-04-16 Sony Semiconductor Solutions Corporation Time of flight apparatus and method
CN210469533U (en) * 2019-11-04 2020-05-05 深圳市灵明光子科技有限公司 Ambient light adjusting device, image sensor, and electronic device
CN111427048A (en) * 2020-02-25 2020-07-17 深圳奥比中光科技有限公司 ToF depth measuring device, method for controlling ToF depth measuring device and electronic equipment
WO2020151493A1 (en) * 2019-01-25 2020-07-30 深圳市光鉴科技有限公司 Light projection system
US20200300987A1 (en) * 2019-03-18 2020-09-24 Kabushiki Kaisha Toshiba Electronic apparatus and method
EP3715904A1 (en) * 2019-03-25 2020-09-30 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Time of flight assembly, terminal device and control method for time of flight assembly
CN111736249A (en) * 2020-08-17 2020-10-02 深圳市汇顶科技股份有限公司 Infrared bandpass filter and sensor system
CN111766604A (en) * 2020-06-09 2020-10-13 中国电子科技集团公司第十一研究所 Composite distance measurement method and system
CN112363180A (en) * 2020-10-28 2021-02-12 Oppo广东移动通信有限公司 Imaging distance measuring sensor, method, system and storage medium
EP3833019A1 (en) * 2018-08-22 2021-06-09 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Electronic device and control method therefor
WO2021120402A1 (en) * 2019-12-18 2021-06-24 深圳奥比中光科技有限公司 Fused depth measurement apparatus and measurement method
WO2021208926A1 (en) * 2020-04-14 2021-10-21 华为技术有限公司 Photographing device and method
CN113945950A (en) * 2021-09-22 2022-01-18 荣耀终端有限公司 Electronic device and depth detection device
US20220018946A1 (en) * 2020-07-17 2022-01-20 Samsung Electronics Co., Ltd. Multi-function time-of-flight sensor and method of operating the same
CN114019474A (en) * 2021-09-22 2022-02-08 深圳阜时科技有限公司 Emission module, optical detection device and electronic equipment
CN114283753A (en) * 2022-01-04 2022-04-05 海信视像科技股份有限公司 Display device and control method of screen backlight brightness
CN114280628A (en) * 2022-03-03 2022-04-05 荣耀终端有限公司 Sensor module and electronic device
CN114500869A (en) * 2020-11-13 2022-05-13 荣耀终端有限公司 Electronic equipment and light detection subassembly under screen
WO2022121879A1 (en) * 2020-12-09 2022-06-16 华为技术有限公司 Tof apparatus and electronic device

Patent Citations (28)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN205844201U (en) * 2016-03-31 2016-12-28 中国人民公安大学 A kind of device of quick identification scene vestige
US20190178711A1 (en) * 2016-07-28 2019-06-13 Philips Lighting Holding B.V. Methods and systems for camera-based ambient light estimation
CN207281284U (en) * 2017-08-15 2018-04-27 东莞市迈科新能源有限公司 A kind of vehicle-mounted stereo visual system based on TOF camera
CN108445499A (en) * 2018-02-07 2018-08-24 余晓智 A kind of the ambient light suppression system and method for TOF sensor
CN208314204U (en) * 2018-05-14 2019-01-01 孙向明 A kind of TOF 3D depth image sensor of environment resistant light interference
CN108632594A (en) * 2018-07-17 2018-10-09 王锐 A kind of intelligence commodity information display system and method
EP3833019A1 (en) * 2018-08-22 2021-06-09 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Electronic device and control method therefor
US20210349193A1 (en) * 2018-10-08 2021-11-11 Sony Semiconductor Solutions Corporation Time of flight apparatus and method
WO2020074467A1 (en) * 2018-10-08 2020-04-16 Sony Semiconductor Solutions Corporation Time of flight apparatus and method
WO2020151493A1 (en) * 2019-01-25 2020-07-30 深圳市光鉴科技有限公司 Light projection system
US20200300987A1 (en) * 2019-03-18 2020-09-24 Kabushiki Kaisha Toshiba Electronic apparatus and method
EP3715904A1 (en) * 2019-03-25 2020-09-30 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Time of flight assembly, terminal device and control method for time of flight assembly
US20200309909A1 (en) * 2019-03-25 2020-10-01 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Time of flight assembly, terminal device and control method for time of flight assembly
CN110456379A (en) * 2019-07-12 2019-11-15 深圳奥比中光科技有限公司 The depth measurement device and distance measurement method of fusion
CN210469533U (en) * 2019-11-04 2020-05-05 深圳市灵明光子科技有限公司 Ambient light adjusting device, image sensor, and electronic device
WO2021120402A1 (en) * 2019-12-18 2021-06-24 深圳奥比中光科技有限公司 Fused depth measurement apparatus and measurement method
CN111427048A (en) * 2020-02-25 2020-07-17 深圳奥比中光科技有限公司 ToF depth measuring device, method for controlling ToF depth measuring device and electronic equipment
WO2021208926A1 (en) * 2020-04-14 2021-10-21 华为技术有限公司 Photographing device and method
CN111766604A (en) * 2020-06-09 2020-10-13 中国电子科技集团公司第十一研究所 Composite distance measurement method and system
US20220018946A1 (en) * 2020-07-17 2022-01-20 Samsung Electronics Co., Ltd. Multi-function time-of-flight sensor and method of operating the same
CN111736249A (en) * 2020-08-17 2020-10-02 深圳市汇顶科技股份有限公司 Infrared bandpass filter and sensor system
CN112363180A (en) * 2020-10-28 2021-02-12 Oppo广东移动通信有限公司 Imaging distance measuring sensor, method, system and storage medium
CN114500869A (en) * 2020-11-13 2022-05-13 荣耀终端有限公司 Electronic equipment and light detection subassembly under screen
WO2022121879A1 (en) * 2020-12-09 2022-06-16 华为技术有限公司 Tof apparatus and electronic device
CN113945950A (en) * 2021-09-22 2022-01-18 荣耀终端有限公司 Electronic device and depth detection device
CN114019474A (en) * 2021-09-22 2022-02-08 深圳阜时科技有限公司 Emission module, optical detection device and electronic equipment
CN114283753A (en) * 2022-01-04 2022-04-05 海信视像科技股份有限公司 Display device and control method of screen backlight brightness
CN114280628A (en) * 2022-03-03 2022-04-05 荣耀终端有限公司 Sensor module and electronic device

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
MILOS DAVIDOVIC: "TOF range finding sensor in 90nm CMOS capable of suppressing 180 klx ambient light", 2010 IEEE SENSORS *
吴翔骅 (WU Xianghua): "Depth image super-resolution reconstruction algorithm based on 3D TOF technology" (基于3D TOF技术的深度图像超分辨率重建算法), China Master's Theses Full-text Database, Information Science and Technology Series *
朱才喜 (ZHU Caixi): "Design and implementation of a high-precision TOF 3D depth camera" (高精度TOF三维深度相机设计与实现), China Master's Theses Full-text Database, Information Science and Technology Series *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116359945A (en) * 2023-05-16 2023-06-30 荣耀终端有限公司 TOF sensing module and electronic equipment
CN116359945B (en) * 2023-05-16 2023-10-20 荣耀终端有限公司 TOF sensing module and electronic equipment

Also Published As

Publication number Publication date
CN115184956B (en) 2023-01-13

Similar Documents

Publication Publication Date Title
US20220342074A1 (en) Electronic Device and Sensor Control Method
CN110347269B (en) Empty mouse mode realization method and related equipment
CN111262975B (en) Bright screen control method, electronic device, computer-readable storage medium, and program product
CN110798568B (en) Display control method of electronic equipment with folding screen and electronic equipment
CN113810601B (en) Terminal image processing method and device and terminal equipment
CN110602309A (en) Device unlocking method and system and related device
CN110649719A (en) Wireless charging method and electronic equipment
CN113347560B (en) Bluetooth connection method, electronic device and storage medium
CN110557740A (en) Electronic equipment control method and electronic equipment
CN115589051B (en) Charging method and terminal equipment
CN112686981A (en) Picture rendering method and device, electronic equipment and storage medium
CN113572956A (en) Focusing method and related equipment
CN114090102A (en) Method, device, electronic equipment and medium for starting application program
CN115184956B (en) TOF sensor system and electronic device
CN114880251A (en) Access method and access device of storage unit and terminal equipment
CN113393856A (en) Sound pickup method and device and electronic equipment
CN113542613A (en) Device and method for photographing
CN114095602B (en) Index display method, electronic device and computer readable storage medium
CN114356109A (en) Character input method, electronic device and computer readable storage medium
CN115032640B (en) Gesture recognition method and terminal equipment
CN113781548A (en) Multi-device pose measurement method, electronic device and system
CN114999535A (en) Voice data processing method and device in online translation process
CN114302063B (en) Shooting method and equipment
CN115714890A (en) Power supply circuit and electronic device
CN113572957A (en) Shooting focusing method and related equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant