CN115842964A - Image acquisition device, display device, image processing method and device


Info

Publication number
CN115842964A
Authority
CN
China
Prior art keywords
light
light sources
image
reflected light
target pixel
Prior art date
Legal status
Pending
Application number
CN202111110947.4A
Other languages
Chinese (zh)
Inventor
林绍杰
任明坤
Current Assignee
Hisense Visual Technology Co Ltd
Original Assignee
Hisense Visual Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Hisense Visual Technology Co Ltd
Priority to CN202111110947.4A
Publication of CN115842964A

Landscapes

  • Control Of Indicators Other Than Cathode Ray Tubes (AREA)

Abstract

The embodiments provide an image acquisition device, a display device, an image processing method, and an image processing apparatus. The image acquisition device includes at least two light sources of different powers, an image sensor including a pixel lattice for receiving the reflected light of the at least two light sources, and a processor connected to the at least two light sources and the image sensor. The processor controls the at least two light sources to project energy at their corresponding powers in a time-sharing manner, controls the pixel lattice to receive the reflected light of the at least two light sources, and obtains a grayscale image according to the light intensity information of that reflected light. The device can detect the depth information of distant objects more accurately and thereby obtain a higher-quality grayscale image.

Description

Image acquisition device, display device, image processing method and device
Technical Field
The present application relates to image processing technology, and more particularly to an image acquisition device, a display device, an image processing method, and an image processing apparatus.
Background
Time of Flight (TOF) technology is broadly understood as measuring the time taken by an object, particle, or wave to travel a given distance in a fixed medium. With the development of science and technology, TOF technology has found increasingly broad application, for example in three-dimensional modeling, gaming, navigation, automatic driving, and gesture capture.
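For reference, the standard TOF ranging relation (well-known background, not a formula stated in this application) converts the measured round-trip time of the emitted light into a distance. A minimal sketch:

```python
# Standard time-of-flight relation: the emitted light travels to the object
# and back, so distance = speed of light * round-trip time / 2.
SPEED_OF_LIGHT_M_PER_S = 299_792_458.0

def tof_distance_m(round_trip_time_s: float) -> float:
    """Distance to the reflecting object given the measured round-trip time."""
    return SPEED_OF_LIGHT_M_PER_S * round_trip_time_s / 2.0

# Example: a 20 ns round trip corresponds to roughly 3 m.
print(tof_distance_m(20e-9))  # ~2.998
```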
Currently, image capture devices employing TOF technology provide flood illumination via an active light source projector. Flood illumination, a uniform illumination mode, allows the ranging calculation to obtain depth point cloud information with rich detail at close range. However, as the illumination distance grows, the energy of the projected light falls off rapidly and is easily affected by ambient light, so the depth information of distant objects cannot be detected and the resulting grayscale image is of low quality.
Disclosure of Invention
The exemplary embodiments of the present application provide an image acquisition device, a display device, an image processing method, and an image processing apparatus that can detect the depth information of distant objects more accurately and thereby obtain a higher-quality grayscale image.
In a first aspect, an embodiment of the present application provides an image capturing apparatus, including:
at least two light sources, the at least two light sources having different powers;
an image sensor comprising a pixel lattice for receiving the reflected light of the at least two light sources;
a processor connected to the at least two light sources and the image sensor, configured to:
controlling the at least two light sources to project energy at their corresponding powers in a time-sharing manner;
controlling the pixel lattice to receive the reflected light of the at least two light sources; and
obtaining a grayscale image according to the light intensity information of the reflected light of the at least two light sources.
In some possible implementations, the processor is configured to: control the at least two light sources to sequentially project energy at their corresponding powers in corresponding frame times within one period.
In some possible implementations, the processor is configured to: determine, in the pixel lattice, the target pixel lattices respectively corresponding to the at least two light sources; and control each target pixel lattice to receive the reflected light of its corresponding light source.
In some possible implementations, the processor is configured to: for each of the at least two light sources, obtain a grayscale image according to the light intensity information of the reflected light of that light source and a light intensity threshold corresponding to that light source, where the light intensity threshold is used to filter stray light and/or interference light out of the reflected light of the light source.
In some possible implementations, the at least two light sources include a first light source and a second light source, the power of the first light source being greater than the power of the second light source, and the processor is configured to: in a first target pixel lattice corresponding to the first light source, set light intensities in the light intensity information of the reflected light of the first light source that are higher than a first light intensity threshold to a preset value; in a second target pixel lattice corresponding to the second light source, set light intensities in the light intensity information of the reflected light of the second light source that are lower than a second light intensity threshold to the preset value; superpose the first target pixel lattice and the second target pixel lattice to obtain a third target pixel lattice; and obtain a grayscale image according to the light intensity information of the reflected light in the third target pixel lattice.
In some possible implementations, the processor is configured to: send the grayscale image to a display connected to the image acquisition device, the display being used to display the grayscale image.
In a second aspect, an embodiment of the present application provides a display device, including:
a display for displaying a grayscale image, and the image acquisition device according to the first aspect of the present application.
In a third aspect, an embodiment of the present application provides an image processing method applied to an image acquisition device, where the image acquisition device includes at least two light sources of different powers and an image sensor, the image sensor includes a pixel lattice for receiving the reflected light of the at least two light sources, and the image processing method includes:
controlling the at least two light sources to project energy at their corresponding powers in a time-sharing manner;
controlling the pixel lattice to receive the reflected light of the at least two light sources; and
obtaining a grayscale image according to the light intensity information of the reflected light of the at least two light sources.
In some possible implementations, controlling the at least two light sources to project energy at their corresponding powers in a time-sharing manner includes: controlling the at least two light sources to sequentially project energy at their corresponding powers in corresponding frame times within one period.
In some possible implementations, controlling the pixel lattice to receive the reflected light of the at least two light sources includes: determining, in the pixel lattice, the target pixel lattices respectively corresponding to the at least two light sources; and controlling each target pixel lattice to receive the reflected light of its corresponding light source.
In some possible implementations, obtaining a grayscale image according to the light intensity information of the reflected light of the at least two light sources includes: for each of the at least two light sources, obtaining a grayscale image according to the light intensity information of the reflected light of that light source and a light intensity threshold corresponding to that light source, where the light intensity threshold is used to filter stray light and/or interference light out of the reflected light of the light source.
In some possible implementations, the at least two light sources include a first light source and a second light source, the power of the first light source being greater than the power of the second light source, and obtaining the grayscale image according to the light intensity information of the reflected light of the light source and the light intensity threshold corresponding to the light source includes: setting, in a first target pixel lattice corresponding to the first light source, light intensities in the light intensity information of the reflected light of the first light source that are higher than a first light intensity threshold to a preset value; setting, in a second target pixel lattice corresponding to the second light source, light intensities in the light intensity information of the reflected light of the second light source that are lower than a second light intensity threshold to the preset value; superposing the first target pixel lattice and the second target pixel lattice to obtain a third target pixel lattice; and obtaining a grayscale image according to the light intensity information of the reflected light in the third target pixel lattice.
In some possible implementations, the image processing method further includes: sending the grayscale image to a display connected to the image acquisition device, the display being used to display the grayscale image.
In a fourth aspect, an embodiment of the present application provides an image processing apparatus applied to an image acquisition device, where the image acquisition device includes at least two light sources of different powers and an image sensor, the image sensor includes a pixel lattice for receiving the reflected light of the at least two light sources, and the image processing apparatus includes:
a first control module, configured to control the at least two light sources to project energy at their corresponding powers in a time-sharing manner;
a second control module, configured to control the pixel lattice to receive the reflected light of the at least two light sources; and
an obtaining module, configured to obtain a grayscale image according to the light intensity information of the reflected light of the at least two light sources.
In some possible implementations, the first control module is specifically configured to control the at least two light sources to sequentially project energy at their corresponding powers in corresponding frame times within one period.
In some possible implementations, the second control module is specifically configured to: determine, in the pixel lattice, the target pixel lattices respectively corresponding to the at least two light sources; and control each target pixel lattice to receive the reflected light of its corresponding light source.
In some possible implementations, the obtaining module is specifically configured to: for each of the at least two light sources, obtain a grayscale image according to the light intensity information of the reflected light of that light source and a light intensity threshold corresponding to that light source, where the light intensity threshold is used to filter stray light and/or interference light out of the reflected light of the light source.
In some possible implementations, the at least two light sources include a first light source and a second light source, the power of the first light source being greater than the power of the second light source, and when obtaining the grayscale image according to the light intensity information of the reflected light of the light source and the light intensity threshold corresponding to the light source, the obtaining module is specifically configured to: set, in a first target pixel lattice corresponding to the first light source, light intensities in the light intensity information of the reflected light of the first light source that are higher than a first light intensity threshold to a preset value; set, in a second target pixel lattice corresponding to the second light source, light intensities in the light intensity information of the reflected light of the second light source that are lower than a second light intensity threshold to the preset value; superpose the first target pixel lattice and the second target pixel lattice to obtain a third target pixel lattice; and obtain a grayscale image according to the light intensity information of the reflected light in the third target pixel lattice.
In some possible implementations, the image processing apparatus further includes a sending module, configured to send the grayscale image to a display connected to the image acquisition device, where the display is configured to display the grayscale image.
In a fifth aspect, an embodiment of the present application provides a computer-readable storage medium having computer program instructions stored therein that, when executed, implement the image processing method according to any implementation of the third aspect of the present application.
In a sixth aspect, the present application provides a computer program product comprising a computer program that, when executed by a processor, implements the image processing method according to any implementation of the third aspect of the present application.
The image acquisition device provided by the embodiments of the present application includes at least two light sources of different powers, an image sensor including a pixel lattice for receiving the reflected light of the at least two light sources, and a processor connected to the at least two light sources and the image sensor. The processor controls the at least two light sources to project energy at their corresponding powers in a time-sharing manner, controls the pixel lattice to receive the reflected light of the at least two light sources, and obtains a grayscale image according to the light intensity information of that reflected light. Because the at least two light sources of different powers are controlled to project energy at their corresponding powers in a time-sharing manner, and the grayscale image is obtained from the received light intensity information of the reflected light corresponding to the different light sources, the depth information of distant objects can be detected more accurately and a higher-quality grayscale image obtained.
These and other aspects of the present application will become more readily apparent from the following description of embodiments.
Drawings
To illustrate the embodiments of the present application or the related art more clearly, the drawings needed in the description of the embodiments or the related art are briefly introduced below. It is apparent that the drawings in the following description show some embodiments of the present application, and that those skilled in the art can obtain other drawings from them without inventive effort.
Fig. 1 is a schematic view of an application scenario of an image capturing apparatus according to an embodiment of the present disclosure;
fig. 2 is a block diagram of a hardware configuration of a display device according to an embodiment of the present application;
FIG. 3 is a software system diagram of a display device provided herein;
fig. 4 is a flowchart of an image processing method according to an embodiment of the present application;
FIG. 5 is a flowchart of an image processing method according to another embodiment of the present application;
fig. 6 is a schematic diagram of a TOF camera according to an embodiment of the present application for image processing;
FIG. 7 is a schematic diagram of a pixel array included in an image sensor according to an embodiment of the present disclosure;
fig. 8 is a schematic diagram illustrating a filtering process performed on light intensity information of reflected light of a light source according to an embodiment of the present application;
fig. 9 is a schematic diagram illustrating a filtering process performed on light intensity information of reflected light of a light source according to another embodiment of the present application;
FIG. 10 is a schematic diagram illustrating an exemplary embodiment of obtaining a third target pixel lattice;
fig. 11 is a schematic diagram of projection distance ranges respectively corresponding to a high-power laser and a low-power laser included in a TOF camera according to an embodiment of the present disclosure;
fig. 12 is an overall schematic view of a TOF camera according to an embodiment of the present application;
fig. 13 is a schematic structural diagram of an image processing apparatus according to an embodiment of the present application;
fig. 14 is a schematic diagram of a display device according to an embodiment of the present application.
Detailed Description
To make the objects, embodiments, and advantages of the present application clearer, the exemplary embodiments of the present application are described below clearly and completely with reference to the accompanying drawings. It should be understood that the described exemplary embodiments are only a part of the embodiments of the present application, not all of them.
All other embodiments obtained by a person skilled in the art from the exemplary embodiments described herein without inventive effort fall within the scope of the appended claims. In addition, while the disclosure herein is presented in terms of one or more exemplary embodiments, it should be appreciated that individual aspects of the disclosure can be utilized in a variety of other forms and embodiments.
It should be noted that the brief descriptions of the terms in the present application are only for the convenience of understanding the embodiments described below, and are not intended to limit the embodiments of the present application. These terms should be understood in their ordinary and customary meaning unless otherwise indicated.
The terms "first", "second", "third", and the like in the description, claims, and drawings of this application are used to distinguish similar objects or entities and do not necessarily define a particular order or sequence unless otherwise indicated. It is to be understood that the terms so used are interchangeable under appropriate circumstances, so that the embodiments described herein can, for example, be practiced in orders other than those illustrated or described herein.
Furthermore, the terms "comprises" and "comprising," as well as any variations thereof, are intended to cover a non-exclusive inclusion, such that a product or device that comprises a list of elements is not necessarily limited to those elements explicitly listed, but may include other elements not expressly listed or inherent to such product or device.
The term "module" as used herein refers to any known or later developed hardware, software, firmware, artificial intelligence, fuzzy logic, or combination of hardware and/or software code that is capable of performing the functionality associated with that element.
The term "remote control" as used in this application refers to a component of an electronic device, such as the display device disclosed in this application, that can typically control the device wirelessly over a short range. The remote control typically connects to the electronic device using infrared and/or radio frequency (RF) signals and/or Bluetooth, and may also include WiFi, wireless USB, Bluetooth, and motion-sensor functionality. For example, a handheld touch remote control replaces most of the physical built-in hard keys of a conventional remote control device with a user interface on a touch screen.
The term "gesture" as used in this application refers to a user's behavior of conveying a desired idea, action, purpose, or result through a change in hand shape or a hand motion.
Fig. 1 is a schematic view of an application scenario of an image acquisition device according to an embodiment of the present application. As shown in Fig. 1, the image acquisition device is, for example, a TOF camera 100, and the display 275 of the display device 200 displays a grayscale image of a human body captured by the TOF camera 100.
As also shown in Fig. 1, the display apparatus 200 also performs data communication with the server 400 through various communication means. The display apparatus 200 may be communicatively connected through a local area network (LAN), a wireless local area network (WLAN), or other networks. The server 400 may provide various contents and interactions to the display apparatus 200.
The display 275 of the display device 200 may be a liquid crystal display, an OLED display, or a projection display device. The particular display device type, size, resolution, and the like are not limited; those skilled in the art will appreciate that the display device 200 may be changed in performance and configuration as needed.
In addition to the broadcast receiving television function, the display apparatus 200 may additionally provide a smart network television function with computer support, including but not limited to network TV, smart TV, Internet Protocol TV (IPTV), and the like.
Fig. 2 is a block diagram of a hardware configuration of a display device according to an embodiment of the present application. As shown in fig. 2, in some embodiments, at least one of the controller 250, the tuner demodulator 210, the communicator 220, the detector 230, the input/output interface 255, the display 275, the audio output interface 285, the memory 260, the power supply 290, the user interface 265, the external device interface 240 is included in the display apparatus 200.
In some embodiments, the display 275 is configured to receive image signals from the output of the first processor and to display video content and images and components of the menu manipulation interface.
In some embodiments, the display 275 includes a display screen assembly for presenting a picture and a driving assembly that drives the display of images.
In some embodiments, the displayed video content may come from broadcast television content or from various broadcast signals received via wired or wireless communication protocols. Alternatively, various image contents received from a network server via a network communication protocol may be displayed.
In some embodiments, the display 275 is used to present a user-manipulated UI interface generated in the display apparatus 200 and used to control the display apparatus 200.
In some embodiments, a driver assembly for driving the display is also included, depending on the type of display 275.
In some embodiments, display 275 is a projection display and may also include a projection device and a projection screen.
In some embodiments, communicator 220 is a component for communicating with external devices or external servers according to various communication protocol types. For example: the communicator may include at least one of a Wifi chip, a bluetooth communication protocol chip, a wired ethernet communication protocol chip, and other network communication protocol chips or near field communication protocol chips, and an infrared receiver.
In some embodiments, the display apparatus 200 may establish control signal and data signal transmission and reception with an external control apparatus or a content providing apparatus through the communicator 220.
In some embodiments, user interface 265 may be configured to receive infrared control signals from a control device (e.g., an infrared remote control, etc.).
In some embodiments, the detector 230 is a component used by the display device 200 to collect signals from the external environment or to interact with the outside.
In some embodiments, the detector 230 includes a light receiver, a sensor for collecting the intensity of ambient light, so that display parameters can be adaptively changed according to the collected ambient light, and the like.
In some embodiments, the detector 230 may further include an image collector, such as a camera. Here the camera is a TOF camera, which may be configured to collect external environment scenes and the attributes of a user or gestures used to interact with the user, so as to adaptively change display parameters and also recognize the user's gestures, thereby implementing interaction with the user.
In some embodiments, the detector 230 may also include a temperature sensor or the like, for example for sensing the ambient temperature.
In some embodiments, the display apparatus 200 may adaptively adjust the display color temperature of an image. For example, the display apparatus 200 may be adjusted to display a cool tone when the ambient temperature is high, or a warm tone when the ambient temperature is low.
In some embodiments, the detector 230 may also include a sound collector, such as a microphone, which may be used to receive the user's voice, for example a voice signal containing a control instruction for controlling the display device 200, or to collect ambient sound for recognizing the type of the ambient scene, so that the display device 200 can adapt to the ambient noise.
In some embodiments, as shown in Fig. 2, the input/output interface 255 is configured to allow data transfer between the controller 250 and other external devices or other controllers 250, such as receiving video signal data, audio signal data, or command instruction data from an external device.
In some embodiments, the external device interface 240 may include, but is not limited to, one or more of the following: a High-Definition Multimedia Interface (HDMI), an analog or data high-definition component input interface, a composite video input interface, a USB input interface, an RGB port, and the like. A plurality of these interfaces may form a composite input/output interface.
In some embodiments, as shown in Fig. 2, the tuner demodulator 210 is configured to receive broadcast television signals in a wired or wireless manner, perform modulation and demodulation processing such as amplification, mixing, and resonance, and demodulate, from a plurality of wireless or wired broadcast television signals, the audio and video signals carried in the television channel frequency selected by the user, as well as an EPG data signal.
In some embodiments, the frequency points demodulated by the tuner demodulator 210 are controlled by the controller 250, which can issue control signals according to the user's selection so that the tuner demodulator responds to the television signal frequency selected by the user and demodulates the television signal carried on that frequency.
In some embodiments, the broadcast television signal may be classified into a terrestrial broadcast signal, a cable broadcast signal, a satellite broadcast signal, an internet broadcast signal, or the like according to the broadcasting system of the television signal. Or may be classified into a digital modulation signal, an analog modulation signal, and the like according to a modulation type. Or the signals are classified into digital signals, analog signals and the like according to the types of the signals.
In some embodiments, the controller 250 and the tuner demodulator 210 may be located in different devices; that is, the tuner demodulator 210 may also be in a device external to the main device where the controller 250 is located, such as an external set-top box. The set-top box then outputs the television audio and video signals demodulated from the received broadcast television signals to the main device, and the main device receives the audio and video signals through the first input/output interface.
In some embodiments, the controller 250 controls the operation of the display device and responds to user operations through various software control programs stored in memory. The controller 250 may control the overall operation of the display apparatus 200. For example: in response to receiving a user command for selecting a UI object displayed on the display 275, the controller 250 may perform an operation related to the object selected by the user command.
In some embodiments, the object may be any one of selectable objects, such as a hyperlink or an icon. Operations related to the selected object, such as: displaying an operation connected to a hyperlink page, document, image, or the like, or performing an operation of a program corresponding to the icon. The user command for selecting the UI object may be a command input through various input means (e.g., a mouse, a keyboard, a touch pad, etc.) connected to the display apparatus 200 or a voice command corresponding to a voice spoken by the user.
As shown in Fig. 2, the controller 250 includes at least one of a random access memory 251 (RAM), a read-only memory 252 (ROM), a video processor 270, an audio processor 280, other processors 253 (e.g., a graphics processing unit, GPU), a CPU processor 254, a communication interface, and a communication bus 256 that connects the components.
In some embodiments, RAM 251 is used to store temporary data for the operating system or other programs that are running, and in some embodiments, ROM 252 is used to store instructions for various system boots.
In some embodiments, the ROM 252 is used to store a Basic Input Output System (BIOS). The system is used for completing power-on self-test of the system, initialization of each functional module in the system, a driver of basic input/output of the system and booting an operating system.
In some embodiments, when the display apparatus 200 is powered on upon receiving a power-on signal, the CPU executes the system start-up instructions in the ROM 252 and copies the temporary data of the operating system stored in the memory into the RAM 251 in order to start or run the operating system. After the operating system has started, the CPU copies the temporary data of the various application programs in the memory into the RAM 251, and the various application programs are then started or run.
In some embodiments, the CPU processor 254 is used to execute the operating system and application program instructions stored in memory, and to execute various application programs, data, and contents according to the various interactive instructions received from the outside, so as to finally display and play various audio and video contents.
In some example embodiments, the CPU processor 254 may comprise a plurality of processors. The plurality of processors may include a main processor and one or more sub-processors: a main processor for performing some operations of the display apparatus 200 in a pre-power-up mode and/or for displaying a screen in the normal mode, and one or more sub-processors for operations in a standby mode and the like.
In some embodiments, the graphics processor 253 is used to generate various graphics objects, such as icons, operation menus, and graphics displayed for user input instructions. The graphics processor includes an arithmetic unit, which performs operations on the various interactive instructions input by the user and displays various objects according to their display attributes, and a renderer, which renders the objects produced by the arithmetic unit for display on the display.
In some embodiments, the video processor 270 is configured to receive an external video signal and, according to the standard codec protocol of the input signal, perform video processing such as decompression, decoding, scaling, noise reduction, frame rate conversion, resolution conversion, and image synthesis, so as to obtain a signal that can be displayed or played directly on the display device 200.
In some embodiments, video processor 270 includes a demultiplexing module, a video decoding module, an image synthesis module, a frame rate conversion module, a display formatting module, and the like.
The demultiplexing module is used to demultiplex the input audio and video data stream; for example, an input MPEG-2 stream is demultiplexed into a video signal and an audio signal.
The video decoding module is used to process the demultiplexed video signal, including decoding, scaling, and the like.
The image synthesis module, such as an image synthesizer, is used to superpose and mix the GUI signal generated by the graphics generator in response to user input with the scaled video image, so as to generate an image signal for display.
The frame rate conversion module is configured to convert the frame rate of the input video, for example converting a 60 Hz frame rate into a 120 Hz or 240 Hz frame rate, normally by means of frame interpolation.
The display formatting module is used to convert the received video output signal after frame rate conversion into a signal conforming to the display format, for example outputting an RGB data signal.
In some embodiments, the graphics processor 253 and the video processor may be integrated or configured separately. When integrated, they can process graphics signals output to the display together; when configured separately, they perform different functions, for example a GPU + FRC (Frame Rate Conversion) architecture.
In some embodiments, the audio processor 280 is configured to receive an external audio signal, decompress and decode the received audio signal according to a standard codec protocol of the input signal, and perform noise reduction, digital-to-analog conversion, and amplification processes to obtain an audio signal that can be played in a speaker.
In some embodiments, video processor 270 may comprise one or more chips. The audio processor may also include one or more chips.
In some embodiments, the video processor 270 and the audio processor 280 may be separate chips or may be integrated together with the controller in one or more chips.
In some embodiments, under the control of the controller 250, the audio output receives the sound signal output by the audio processor 280. In addition to the speaker 286 carried by the display device 200 itself, the audio output may include an external sound output terminal that can output to the sound-generating device of an external device, such as an external sound interface or an earphone interface, and may also include a near field communication module in the communication interface, for example a Bluetooth module for outputting sound to a Bluetooth speaker.
Under the control of the controller 250, the power supply 290 supplies the display device 200 with power input from an external power source. The power supply 290 may include a built-in power supply circuit installed inside the display apparatus 200, or a power supply interface installed outside the display apparatus 200 that provides external power to the display apparatus 200.
A user interface 265 for receiving an input signal of a user and then transmitting the received user input signal to the controller 250. The user input signal may be a remote controller signal received through an infrared receiver, and various user control signals may be received through the network communication module.
In some embodiments, the user inputs a user command through the control device or the mobile terminal, the user input interface responds according to the user's input, and the display apparatus 200 responds to the user's input through the controller 250.
In some embodiments, a user may enter user commands on a Graphical User Interface (GUI) displayed on the display 275, and the user input interface receives the user input commands through the Graphical User Interface (GUI). Alternatively, the user may input the user command by inputting a specific sound or gesture, and the user input interface receives the user input command by recognizing the sound or gesture through the sensor.
In some embodiments, a "user interface" is a media interface for interaction and information exchange between an application or operating system and a user that enables conversion between an internal form of information and a form that is acceptable to the user.
The memory 260 stores various software modules for driving the display device 200, for example the various software modules stored in the first memory, including at least one of a basic module, a detection module, a communication module, a display control module, a browser module, and various service modules.
The basic module is a bottom-layer software module for signal communication between the various hardware components in the display device 200 and for sending processing and control signals to upper-layer modules. The detection module is used to collect various information from sensors or user input interfaces, and the management module is used to perform digital-to-analog conversion and analysis management.
Fig. 3 is a software system diagram of a display device provided in the present application. Referring to fig. 3, in some embodiments, the system is divided into four layers, which are, from top to bottom, an Application (Applications) layer (referred to as an "Application layer"), an Application Framework (Application Framework) layer (referred to as a "Framework layer"), an Android runtime (Android runtime) layer and a system library layer (referred to as a "system runtime library layer"), and a kernel layer.
In some embodiments, at least one application program runs in the application program layer, and the application programs can be Window (Window) programs carried by an operating system, system setting programs, clock programs, camera applications and the like; or may be an application developed by a third party developer such as a hi program, a karaoke program, a magic mirror program, or the like. In specific implementation, the application packages in the application layer are not limited to the above examples, and may actually include other application packages, which is not limited in this embodiment of the present application.
The framework layer provides an application programming interface (API) and a programming framework for the application programs of the application layer. The application framework layer includes a number of predefined functions and acts as a processing center that decides the actions of the applications in the application layer. Through the API interface, an application program can access the resources of the system and obtain the services of the system during execution.
As shown in fig. 3, in the embodiment of the present application, the application framework layer includes a manager (Managers), a Content Provider (Content Provider), and the like, where the manager includes at least one of the following modules: an Activity Manager (Activity Manager) is used for interacting with all activities running in the system; the Location Manager (Location Manager) is used for providing the system service or application with the access of the system Location service; a Package Manager (Package Manager) for retrieving various information related to an application Package currently installed on the device; a Notification Manager (Notification Manager) for controlling display and clearing of Notification messages; a Window Manager (Window Manager) is used to manage the icons, windows, toolbars, wallpapers, and desktop components on a user interface.
In some embodiments, the activity manager is to: managing the life cycle of each application program and the general navigation backspacing function, such as controlling the exit of the application program (including switching the user interface currently displayed in the display window to the system desktop), opening, backing (including switching the user interface currently displayed in the display window to the previous user interface of the user interface currently displayed), and the like.
In some embodiments, the window manager is configured to manage all window programs, such as obtaining a size of the display screen, determining whether a status bar exists, locking the screen, clipping the screen, controlling a change of the display window (e.g., zooming the display window out, dithering the display, distorting the display, etc.), and the like.
In some embodiments, the system runtime layer provides support for the upper layer, i.e., the framework layer, and when the framework layer is used, the android operating system runs the C/C + + library included in the system runtime layer to implement the functions to be implemented by the framework layer.
In some embodiments, the kernel layer is a layer between hardware and software. As shown in fig. 3, the core layer comprises at least one of the following drivers: audio drive, display drive, bluetooth drive, camera drive, WIFI drive, USB drive, HDMI drive, sensor drive (such as fingerprint sensor, temperature sensor, touch sensor, pressure sensor, etc.), and so on.
In some embodiments, the kernel layer further comprises a power driver module for power management.
In some embodiments, software programs and/or modules corresponding to the software architecture of fig. 3 are stored in the first memory or the second memory shown in fig. 2.
Currently, image capture devices employing TOF technology (such as TOF cameras) provide flood illumination via an active light source projector. Flood illumination, a uniform illumination mode, allows the ranging calculation to obtain depth point cloud information with rich detail at close range. However, as the illumination distance grows, the energy of the projected light falls off rapidly and is easily affected by ambient light, so the depth information of distant objects cannot be detected. To measure objects at medium and long distances, the power of the laser can be increased, but with the increased laser power, the strong light reflected by near objects causes multipath interference or stray light. To reduce the multipath interference or stray light caused by near objects, the laser power must be reduced; but then, as the distance increases, the energy projected by the laser weakens, data jitter grows, the accuracy of the measured data for medium- and long-distance objects drops rapidly, and the resulting grayscale image is of low quality.
To address the above problems, the present application provides an image acquisition device, a display device, an image processing method, and an image processing apparatus, in which at least two light sources of different powers in the image acquisition device project energy at their different powers in a time-sharing manner and the received reflected light corresponding to the different light sources is processed, so that the measurement accuracy of the image acquisition device at different distances can be improved and a higher-quality grayscale image obtained.
In the following embodiments of the present application, a TOF camera is taken as an example of the image acquisition device, and detailed embodiments describe how the image acquisition device performs image processing. It should be noted that the image acquisition device of the present application is not limited to a TOF camera.
Fig. 4 is a flowchart of an image processing method according to an embodiment of the present application. The image processing method is applied to a TOF camera that includes at least two light sources of different powers, an image sensor, and a processor connected to the at least two light sources and the image sensor; the image sensor includes a pixel lattice for receiving the reflected light of the at least two light sources. As shown in Fig. 4, the processor connected to the at least two light sources and the image sensor is configured to perform the following steps:
in S401, at least two light sources are controlled to project energy with corresponding power in a time-sharing manner.
In the embodiment of the present application, the light source is, for example, a laser, and the laser may be used to emit laser pulses. Illustratively, the TOF camera includes, for example, two lasers with different powers, namely a high-power laser and a low-power laser, respectively, and it can be understood that the power of the high-power laser is greater than that of the low-power laser, and the high-power laser and the low-power laser can be controlled to project energy with corresponding power in a time-sharing manner, that is, the high-power laser can be controlled to project energy with corresponding power first, then the high-power laser is turned off, and then the low-power laser is controlled to project energy with corresponding power. For how to control the at least two light sources to project the energy with the corresponding power in a time-sharing manner, reference may be made to the following embodiments, which are not described herein again.
In S402, the pixel lattice is controlled to receive the reflected light of the at least two light sources.
After the at least two light sources are controlled to project energy at their corresponding powers in a time-sharing manner, the pixel lattice can be controlled to receive their reflected light. For example, for a TOF camera that includes a high-power laser and a low-power laser, the high-power laser is controlled to project energy at its power, and the pixel lattice is controlled to receive the reflected light of the high-power laser; the high-power laser is then turned off, the low-power laser is controlled to project energy at its power, and the pixel lattice is controlled to receive the reflected light of the low-power laser. For how to control the pixel lattice to receive the reflected light of the at least two light sources, reference may be made to the following embodiments, which are not repeated here.
In S403, a grayscale image is obtained according to the light intensity information of the reflected light of the at least two light sources.
After the pixel lattice is controlled to receive the reflected light of the at least two light sources, a grayscale image is obtained according to the light intensity information of that reflected light. For how to obtain the grayscale image from the light intensity information of the reflected light of the at least two light sources, reference may be made to the subsequent embodiments, which are not repeated here.
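Taken together, S401 to S403 amount to a simple control loop. The sketch below is only an illustration under assumed names (StubLaser, StubSensor, and every method name are inventions of this description, not an interface defined by this application); the embodiments that follow expand each step.

```python
import numpy as np

class StubLaser:
    """Illustrative stand-in for one of the light sources."""
    def __init__(self, name: str):
        self.name = name
    def on(self) -> None:
        pass  # would start projecting at this laser's power
    def off(self) -> None:
        pass  # would stop projecting

class StubSensor:
    """Illustrative stand-in for the image sensor."""
    def expose(self, laser_name: str) -> np.ndarray:
        # Stands in for reading out the pixel lattice while `laser_name`
        # illuminates the scene; random numbers replace a real readout.
        return np.random.rand(4, 4)

def capture_grayscale(lasers, sensor) -> np.ndarray:
    frames = []
    for laser in lasers:                          # S401: time-shared projection
        laser.on()
        frames.append(sensor.expose(laser.name))  # S402: lattice receives reflections
        laser.off()
    intensity = sum(frames)                       # S403: combine intensity information
    return (intensity / intensity.max() * 255).astype(np.uint8)

image = capture_grayscale([StubLaser("high-power"), StubLaser("low-power")],
                          StubSensor())
```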
The image processing method provided by this embodiment of the application is applied to a TOF camera that includes at least two light sources of different powers, an image sensor whose pixel lattice receives the reflected light of the at least two light sources, and a processor connected to the at least two light sources and the image sensor. The processor controls the at least two light sources to project energy at their corresponding powers in a time-sharing manner, controls the pixel lattice to receive the reflected light of the at least two light sources, and obtains a grayscale image according to the light intensity information of that reflected light. Because the at least two light sources of different powers are controlled to project energy at their corresponding powers in a time-sharing manner, and the grayscale image is obtained from the received light intensity information of the reflected light corresponding to the different light sources, the depth information of distant objects can be detected more accurately and a higher-quality grayscale image obtained.
The following describes the image processing method provided by the present application in detail with reference to specific steps.
On the basis of the foregoing embodiment, fig. 5 is a flowchart of an image processing method according to another embodiment of the present application. As shown in fig. 5, the processor of the TOF camera is configured to perform the following steps:
in this embodiment of the application, the step S401 in fig. 4 may further include the following step S501:
in S501, in one period, at least two light sources are controlled to sequentially project energy with corresponding power in corresponding frame time.
In this step, exemplarily, fig. 6 is a schematic diagram of a TOF camera provided in an embodiment of the present application for performing image processing, and as shown in fig. 6, the TOF camera includes two lasers with different powers, which are respectively a high-power laser and a low-power laser, and the high-power laser and the low-power laser are distributed on two sides of an image sensor of the TOF camera. Optionally, the high-power laser and the low-power laser may also be distributed on one side of an image sensor of the TOF camera, and may be set as needed, which is not limited in this application. One period is, for example, two frame times, and is divided into a previous frame time and a next frame time within the two frame times, and the two frame times are consistent in duration. And controlling the high-power laser and the low-power laser to project energy of corresponding power in sequence in corresponding frame time in one period. Specifically, a light source of a high-power laser is turned on in the previous frame time, and the light source irradiates an object to generate reflected light; the high power laser is then turned off while the light source of the low power laser is turned on at a later frame time, which will illuminate the object, producing reflected light.
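A minimal sketch of this alternating schedule follows; the Laser class, the power values, and the frame duration are illustrative assumptions, not parameters given in this application.

```python
import time

FRAME_TIME_S = 1 / 30  # assumed duration of one frame time

class Laser:
    """Illustrative stand-in for a controllable laser light source."""
    def __init__(self, name: str, power_mw: float):
        self.name = name
        self.power_mw = power_mw
    def on(self) -> None:
        print(f"{self.name} projecting at {self.power_mw} mW")
    def off(self) -> None:
        print(f"{self.name} off")

def run_one_period(high_power: Laser, low_power: Laser) -> None:
    # One period = two frame times of equal duration: the high-power laser
    # projects in the previous frame time, the low-power laser in the next.
    for laser in (high_power, low_power):
        laser.on()                # the light illuminates the object, producing reflected light
        time.sleep(FRAME_TIME_S)  # hold for exactly one frame time
        laser.off()               # strict time-sharing: the lasers are never on together

run_one_period(Laser("high-power laser", 200.0), Laser("low-power laser", 50.0))
```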
In this embodiment of the application, step S402 in Fig. 4 may further include the following two steps S502 and S503:
In S502, the target pixel lattices respectively corresponding to the at least two light sources are determined in the pixel lattice.
In S503, each target pixel lattice is controlled to receive the reflected light of its corresponding light source.
Illustratively, the TOF camera includes two lasers of different powers, a high-power laser and a low-power laser. Fig. 7 is a schematic diagram of a pixel lattice included in an image sensor according to an embodiment of the present application. As shown in Fig. 7, the target pixel lattices corresponding to the high-power laser and the low-power laser are determined in the pixel lattice. Specifically, the small circles in the odd rows (e.g., row 1) of Fig. 7 form the target pixel lattice corresponding to the high-power laser, and the small circles in the even rows (e.g., row 2) form the target pixel lattice corresponding to the low-power laser. Within one period, the high-power laser is turned on in the previous frame time; the light it emits illuminates the object and produces reflected light, and the target pixel lattice represented by the small circles in the odd rows of Fig. 7 is controlled to receive the reflected light corresponding to the high-power laser, i.e., the pixels that receive this reflected light are placed in the odd rows. The high-power laser is then turned off and the low-power laser is turned on in the next frame time; the light it emits illuminates the object and produces reflected light, and the target pixel lattice represented by the small circles in the even rows of Fig. 7 is controlled to receive the reflected light corresponding to the low-power laser, i.e., the pixels that receive this reflected light are placed in the even rows. It should be noted that Fig. 7 shows only one way of determining the target pixel lattices corresponding to two light sources in a pixel lattice; this embodiment does not limit the positional relationship between the two target pixel lattices of Fig. 7, and the target pixel lattice corresponding to each light source may be set as needed. For example, the small circles in the odd columns of Fig. 7 could serve as the target pixel lattice corresponding to the high-power laser, and the small circles in the even columns as the target pixel lattice corresponding to the low-power laser. A sketch of this row-interleaved assignment is given below.
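A minimal sketch of the row-interleaved assignment, assuming an 8x8 lattice (the shape and the readout are assumptions; 0-based row indices 0, 2, 4, ... correspond to the 1-based odd rows 1, 3, 5, ... of Fig. 7):

```python
import numpy as np

ROWS, COLS = 8, 8  # assumed lattice size

# Mask the rows assigned to each laser.
high_power_mask = np.zeros((ROWS, COLS), dtype=bool)
high_power_mask[0::2, :] = True    # rows 1, 3, 5, ... of Fig. 7
low_power_mask = ~high_power_mask  # rows 2, 4, 6, ... of Fig. 7

def target_lattice(frame: np.ndarray, mask: np.ndarray) -> np.ndarray:
    """Keep only the pixels assigned to one laser; zero everything else."""
    return np.where(mask, frame, 0.0)

frame = np.random.rand(ROWS, COLS)                      # stand-in for one frame's readout
first_lattice = target_lattice(frame, high_power_mask)  # high-power laser's frame
second_lattice = target_lattice(frame, low_power_mask)  # low-power laser's frame
```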
In this embodiment of the application, step S403 in Fig. 4 may further include the following step S504:
In S504, for each of the at least two light sources, a grayscale image is obtained according to the light intensity information of the reflected light of that light source and the light intensity threshold corresponding to that light source.
The light intensity threshold is used for filtering stray light and/or interference light in reflected light of the light source.
In this step, light intensity thresholds corresponding to different light sources may be set as needed, and the light intensity thresholds may be used to filter stray light, or interference light, or stray light and/or interference light in reflected light of the light sources. For each light source in the at least two light sources, after the control target pixel lattice receives the reflected light of the corresponding light source, a gray image can be obtained according to the light intensity information of the reflected light of the light source and the light intensity threshold value corresponding to the light source.
Further, optionally, the at least two light sources include a first light source and a second light source, where the power of the first light source is greater than the power of the second light source, and when the processor of the TOF camera is used to obtain a grayscale image according to the light intensity information of the reflected light of the light sources and the light intensity threshold corresponding to the light source, the processor is specifically configured to: setting the light intensity higher than a first light intensity threshold in the light intensity information of the reflected light of the first light source as a preset value in a first target pixel dot matrix corresponding to the first light source; setting the light intensity lower than a second light intensity threshold in the light intensity information of the reflected light of the second light source as a preset value in a second target pixel dot matrix corresponding to the second light source; superposing the first target pixel dot matrix and the second target pixel dot matrix to obtain a third target pixel dot matrix; and obtaining a gray image according to the light intensity information of the reflected light in the third target pixel dot matrix.
Illustratively, a TOF camera includes two lasers of different power: a high-power laser (i.e., the first light source) and a low-power laser (i.e., the second light source). Fig. 8 is a schematic diagram of filtering the light intensity information of the reflected light of a light source according to an embodiment of the present application. As shown in fig. 8, within one period the high-power laser is turned on during the previous frame time, and the resulting first target pixel lattice is shown as 801 in fig. 8: among the odd rows, row 1 contains the two reflected light signal intensities 10 and 18, and row 3 contains 12 and 30. A first light intensity threshold is set, and the preset value is, for example, 0; any reflected light signal intensity in the odd-row pixels higher than the first light intensity threshold is set to 0, giving the result shown as 802 in fig. 8: row 1 still contains 10 and 18, while row 3 now contains 12 and 0. It can be understood that the three intensities 10, 18, and 12 are far-object reflections, while the intensity 30 is a near-object reflection that is much higher than the far-object intensities; relative to the reflected light signal of the far object, the reflected light signal of the near object is a stray or interference light signal and needs to be filtered out. Fig. 9 is a schematic diagram of filtering the light intensity information of the reflected light of a light source according to another embodiment of the present application. As shown in fig. 9, within the same period the high-power laser is turned off and the low-power laser is turned on during the next frame time, and the resulting second target pixel lattice is shown as 901 in fig. 9: among the even rows, row 2 contains the two reflected light signal intensities 3 and 11, and row 4 contains 5 and 23. A second light intensity threshold is set; any reflected light signal intensity in the even-row pixels lower than the second light intensity threshold is set to 0, giving the result shown as 902 in fig. 9: row 2 now contains 0 and 11, and row 4 contains 0 and 23. It can be understood that the intensities 11 and 23 are near-object reflections, while 3 and 5 are far-object reflections that are much lower; relative to the reflected light signal of the near object, the reflected light signal of the far object is a stray or interference light signal and needs to be filtered out.
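Continuing the sketch above, the fig. 8 and fig. 9 values can be reproduced; the concrete thresholds used here (25 and 8) are assumed, since the embodiment does not state their values.

```python
import numpy as np

# Odd rows of lattice 801 (high-power laser), filtered as in fig. 8.
row_1, row_3 = np.array([10, 18]), np.array([12, 30])
print(filter_first_lattice(row_1, 25))  # [10 18]: both far returns kept
print(filter_first_lattice(row_3, 25))  # [12  0]: near return 30 filtered out

# Even rows of lattice 901 (low-power laser), filtered as in fig. 9.
row_2, row_4 = np.array([3, 11]), np.array([5, 23])
print(filter_second_lattice(row_2, 8))  # [ 0 11]: weak far return 3 filtered out
print(filter_second_lattice(row_4, 8))  # [ 0 23]: weak far return 5 filtered out
```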
After the first target pixel lattice and the second target pixel lattice are obtained, they may be superposed to obtain a third target pixel lattice. For example, the reflected light corresponding to the first target pixel lattice may be taken as the first reflected light signal intensity and the reflected light corresponding to the second target pixel lattice as the second reflected light signal intensity, and the two intensities superposed to obtain the third target pixel lattice. Fig. 10 is a schematic diagram of obtaining a third target pixel lattice according to an embodiment of the present application. As shown in fig. 10, the filtered first target pixel lattice 1003 is obtained by filtering the light intensity information of the reflected light in the first target pixel lattice 1001 against the first light intensity threshold; the filtered second target pixel lattice 1004 is obtained by filtering the light intensity information of the reflected light in the second target pixel lattice 1002 against the second light intensity threshold; and the third target pixel lattice 1005 shown in fig. 10 is obtained by superposing the first target pixel lattice 1003 and the second target pixel lattice 1004. In the third target pixel lattice 1005, row 1 contains the two reflected light signal intensities 10 and 18, row 2 contains 0 and 11, row 3 contains 12 and 0, and row 4 contains 0 and 23. This way of obtaining the third target pixel lattice ensures that the far reflected light signal is strong enough while the near reflected light signal does not saturate. For a depth image within one period, when the TOF camera shoots both a far object and a near object, the lasers are started twice, and the light emitted by both lasers is reflected by both the near object and the far object. The difference is that the near object is close to the lasers: the reflected light energy of the high-power laser is strong and easily causes multipath interference, so the near-object reflected light signal intensity is erroneous, whereas the reflected light energy of the low-power laser is lower, multipath interference is less likely, and the confidence of the low-power reflected light signal intensity is high. Correspondingly, the far object is relatively far from the lasers: the reflected light energy of the low-power laser is weak and the imaging quality poor, so the depth point cloud information of the far object jitters strongly, whereas the reflected light energy of the high-power laser is higher, the depth information jitters less, the imaging quality is better, and the confidence of the high-power reflected light signal intensity is high.
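Because the two filtered lattices occupy disjoint rows, the superposition reduces to an element-wise addition. A 4×2 excerpt with the fig. 10 values illustrates this (zeros mark pixels that were filtered out or not assigned to that laser):

```python
import numpy as np

first_filtered = np.array([[10, 18],   # row 1: far returns (high-power frame)
                           [ 0,  0],
                           [12,  0],   # row 3: far returns, 30 already removed
                           [ 0,  0]])
second_filtered = np.array([[ 0,  0],
                            [ 0, 11],  # row 2: near returns, 3 already removed
                            [ 0,  0],
                            [ 0, 23]]) # row 4: near returns, 5 already removed
third_lattice = first_filtered + second_filtered
# rows 1..4 of third_lattice: [10 18], [0 11], [12 0], [0 23] -- lattice 1005
```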
Exemplarily, referring to fig. 6, within one period, the reflected light corresponding to the one frame of the high-power laser is received by the target pixel lattice corresponding to the high-power laser in the image sensor as the first reflected light signal intensity; the reflected light corresponding to the one frame of the low-power laser is received by the target pixel lattice corresponding to the low-power laser as the second reflected light signal intensity; and the first and second reflected light signal intensities (i.e., the two frames of reflected light signal intensity) are superposed to obtain the third target pixel lattice. After the third target pixel lattice is obtained, a grayscale image can be obtained according to the light intensity information of the reflected light in it. As to how the grayscale image is obtained from that light intensity information, a preset method may be adopted, or reference may be made to the current related art; details are not repeated here.
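The mapping from the superposed intensities to gray levels is left to the related art; purely as an assumed example, a linear normalization to 8-bit gray values could look like this, applied to the third lattice from the previous sketch.

```python
import numpy as np

def to_grayscale(lattice, max_intensity=None):
    """Linearly scale intensities into 0..255; the scaling choice is an
    assumption of this sketch, not prescribed by the embodiment."""
    m = max_intensity if max_intensity is not None else lattice.max()
    scaled = lattice.astype(np.float64) / max(int(m), 1) * 255.0
    return np.clip(scaled, 0, 255).astype(np.uint8)

gray_image = to_grayscale(third_lattice)  # 8-bit grayscale lattice
```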
In S505, the grayscale image is sent to a display connected to the TOF camera, where the display is used to display the grayscale image.
After the grayscale image is obtained, the grayscale image may be sent to a display connected to the TOF camera to cause the display to display the grayscale image.
In the image processing method provided by the embodiment of the application and applied to a TOF camera, the processor of the TOF camera controls at least two light sources to sequentially project energy of corresponding power in their corresponding frame times within one period; determines, in the pixel lattice, the target pixel lattices respectively corresponding to the at least two light sources; controls the target pixel lattices to receive the reflected light of the corresponding light sources; obtains, for each of the at least two light sources, a grayscale image according to the light intensity information of the reflected light of that light source and the light intensity threshold corresponding to it; and sends the grayscale image to a display connected to the TOF camera, the display being used to display the grayscale image. Because the embodiment of the application controls at least two light sources of different power to project energy of corresponding power in a time-sharing manner, and filters the stray and/or interference light out of the received reflected light corresponding to the different light sources, the depth information of a far object can be detected more accurately; and because the target pixel lattices corresponding to the light sources are superposed and the grayscale image is obtained from the light intensity information of the reflected light in the superposed target pixel lattice, the obtained grayscale image is of higher quality.
On the basis of the foregoing embodiment, fig. 11 is a schematic diagram of the projection distance ranges respectively corresponding to the high-power laser and the low-power laser included in a TOF camera according to an embodiment of the present application. As shown in fig. 11, and with reference to fig. 6, a projection distance range L1 corresponds to the high-power laser and a projection distance range L2 corresponds to the low-power laser, where L1 may be understood as the energy of corresponding power projected by the high-power laser onto a far object, and L2 as the energy of corresponding power projected by the low-power laser onto a near object. It can be understood that the near object uses the reflected light energy of the low-power laser to obtain the low-power reflected light signal intensity, and the far object uses the reflected light energy of the high-power laser to obtain the high-power reflected light signal intensity, so that high-precision light signal intensity information can be obtained at different distances.
On the basis of the foregoing embodiments, fig. 12 is an overall schematic diagram of a TOF camera according to an embodiment of the present application. As shown in fig. 12, the TOF camera includes a high-power laser 1201, a low-power laser 1202, an image sensor 1203, a motherboard 1204 (whose main chip may be understood as the processor connected to the at least two light sources and the image sensor in the foregoing embodiments), and a Universal Serial Bus (USB) terminal 1205, with the components connected by connection lines. The USB terminal 1205 may, for example, be connected to a USB terminal of a flat-panel television. Within one period, after receiving the reflected light corresponding to the high-power laser 1201 and the low-power laser 1202, the image sensor 1203 transmits it to the main chip of the motherboard 1204, which superposes the intensities of the two frames of reflected light signals, obtains the corresponding grayscale image from the light intensity information of the superposed reflected light, and sends the grayscale image to the flat-panel television so that the flat-panel television displays it.
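Bringing the earlier sketches together, one period of the fig. 12 pipeline might be orchestrated as below. The laser, sensor, and display handles are hypothetical abstractions rather than a real driver API, and the helpers are the illustrative functions defined above.

```python
def run_one_period(high_laser, low_laser, sensor, display,
                   first_threshold, second_threshold):
    # Previous frame time: expose the odd rows under the high-power laser.
    high_laser.on()
    frame_high = sensor.read_rows(high_power_rows)
    high_laser.off()

    # Next frame time: expose the even rows under the low-power laser.
    low_laser.on()
    frame_low = sensor.read_rows(low_power_rows)
    low_laser.off()

    # Filter each lattice against its threshold, superpose, map to gray.
    third = (filter_first_lattice(frame_high, first_threshold)
             + filter_second_lattice(frame_low, second_threshold))
    display.show(to_grayscale(third))  # e.g. over the USB link to the TV
```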
The following are embodiments of the apparatus of the present application that may be used to perform embodiments of the method of the present application. For details which are not disclosed in the embodiments of the apparatus of the present application, reference is made to the embodiments of the method of the present application.
Fig. 13 is a schematic structural diagram of an image processing apparatus according to an embodiment of the present application. The image processing apparatus is applied to an image acquisition device that includes at least two light sources of different power, an image sensor, and a processor connected to the at least two light sources and the image sensor, the image sensor including a pixel lattice for receiving the reflected light of the at least two light sources. As shown in fig. 13, the image processing apparatus 1300 of this embodiment includes a first control module 1301, a second control module 1302, and an obtaining module 1303. Wherein:
the first control module 1301 is configured to control the at least two light sources to project energy with corresponding power in a time-sharing manner.
The second control module 1302 is configured to control the pixel array to receive the reflected light of the at least two light sources.
The obtaining module 1303 is configured to obtain a grayscale image according to the light intensity information of the reflected light of the at least two light sources.
In some possible implementations, the first control module 1301 may specifically be configured to control the at least two light sources to sequentially project energy of corresponding power in corresponding frame times within one period.
In some possible implementations, the second control module 1302 may specifically be configured to: determine, in the pixel lattice, the target pixel lattices respectively corresponding to the at least two light sources; and control the target pixel lattices to receive the reflected light of the corresponding light sources.
In some possible implementations, the obtaining module 1303 may specifically be configured to obtain, for each of the at least two light sources, a grayscale image according to the light intensity information of the reflected light of the light source and a light intensity threshold corresponding to the light source, where the light intensity threshold is used for filtering stray light and/or interference light in the reflected light of the light source.
In some possible implementations, the at least two light sources include a first light source and a second light source, the power of the first light source being greater than that of the second light source. When obtaining the grayscale image according to the light intensity information of the reflected light of a light source and the corresponding light intensity threshold, the obtaining module 1303 may specifically be configured to: set, in the first target pixel lattice corresponding to the first light source, the light intensity higher than the first light intensity threshold to a preset value; set, in the second target pixel lattice corresponding to the second light source, the light intensity lower than the second light intensity threshold to the preset value; superpose the first target pixel lattice and the second target pixel lattice to obtain a third target pixel lattice; and obtain the grayscale image according to the light intensity information of the reflected light in the third target pixel lattice.
In some possible implementations, the image processing apparatus further includes a sending module 1304 for sending the grayscale image to a display connected to the image acquisition device, where the display is used for displaying the grayscale image.
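As an assumed illustration only, the module split above could map onto code as follows, reusing the sketch helpers from the method embodiments; the class and method names are hypothetical.

```python
class ImageProcessingApparatus:
    """Mirrors the fig. 13 split: two control modules and an obtaining module."""

    def __init__(self, first_threshold, second_threshold):
        self.first_threshold = first_threshold
        self.second_threshold = second_threshold

    def control_light_sources(self, high_laser, low_laser):
        # First control module 1301: time-shared projection (omitted here).
        ...

    def control_pixel_lattice(self, sensor):
        # Second control module 1302: route reflected light to assigned rows.
        ...

    def obtain_grayscale(self, lattice_high, lattice_low):
        # Obtaining module 1303: threshold, superpose, and map to gray.
        third = (filter_first_lattice(lattice_high, self.first_threshold)
                 + filter_second_lattice(lattice_low, self.second_threshold))
        return to_grayscale(third)
```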
It should be noted that the apparatus provided in this embodiment may be used to execute the image processing method, and the implementation manner and the technical effect are similar, which are not described herein again.
On the basis of the foregoing embodiments, fig. 14 is a schematic diagram of a display device provided in an embodiment of the present application. As shown in fig. 14, a display device 1400 includes a display 1401 for displaying a grayscale image and an image acquisition device 1402 according to any one of the foregoing embodiments. Wherein:
the display 1401 receives the grayscale image transmitted by the image acquisition device and displays it. For the specific process by which the image acquisition device obtains the grayscale image, reference may be made to the above method embodiments; details are not repeated here.
It should be noted that the division of the above apparatus into modules is only a logical division; in actual implementation, the modules may be wholly or partially integrated into one physical entity, or physically separated. These modules may all be implemented in the form of software invoked by a processing element; or all in the form of hardware; or some in the form of software invoked by a processing element and some in the form of hardware. For example, a processing module may be a separately established processing element, or may be implemented integrated in a chip of the apparatus, or may be stored in a memory of the apparatus in the form of program code, with its function invoked and executed by a processing element of the apparatus. The other modules are implemented similarly. In addition, all or some of the modules may be integrated together or implemented independently. The processing element here may be an integrated circuit with signal processing capability. In implementation, each step of the above method, or each of the above modules, may be implemented by an integrated logic circuit of hardware in the processor element or by instructions in the form of software.
For example, the above modules may be one or more integrated circuits configured to implement the above methods, such as one or more application-specific integrated circuits (ASICs), one or more digital signal processors (DSPs), or one or more field-programmable gate arrays (FPGAs), etc. For another example, when one of the above modules is implemented in the form of program code scheduled by a processing element, the processing element may be a general-purpose processor, such as a CPU or another processor that can invoke program code. As another example, these modules may be integrated together and implemented in the form of a system-on-a-chip (SoC).
In the above embodiments, the implementation may be realized in whole or in part by software, hardware, firmware, or any combination thereof. When implemented in software, it may be realized in whole or in part in the form of a computer program product. The computer program product includes one or more computer programs. When the computer program instructions are loaded and executed on a computer, the procedures or functions according to the embodiments of the present application are generated in whole or in part. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable device. The computer program may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another; for example, it may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center by wire (e.g., coaxial cable, optical fiber, Digital Subscriber Line (DSL)) or wirelessly (e.g., infrared, radio, microwave). The computer-readable storage medium may be any available medium accessible to a computer, or a data storage device such as a server or data center that integrates one or more available media. The available medium may be a magnetic medium (e.g., floppy disk, hard disk, magnetic tape), an optical medium (e.g., DVD), or a semiconductor medium (e.g., a Solid State Disk (SSD)), etc.
The present application further provides a computer-readable storage medium, in which a computer program is stored, and when the computer program is executed by a processor, the image processing method according to any one of the above method embodiments is implemented.
Embodiments of the present application further provide a computer program product, which includes a computer program, where the computer program is stored in a computer-readable storage medium, and at least one processor can read the computer program from the computer-readable storage medium, and when the computer program is executed by the at least one processor, the at least one processor can implement the image processing method according to any one of the above method embodiments.
Finally, it should be noted that: the above embodiments are only used for illustrating the technical solutions of the present application, and not for limiting the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some or all of the technical features may be equivalently replaced; and the modifications or the substitutions do not make the essence of the corresponding technical solutions depart from the scope of the technical solutions of the embodiments of the present application.
The foregoing description, for purposes of explanation, has been presented in conjunction with specific embodiments. However, the illustrative discussions above are not intended to be exhaustive or to limit the embodiments to the precise forms disclosed above. Many modifications and variations are possible in light of the above teaching. The embodiments were chosen and described in order to best explain the principles and the practical application, to thereby enable others skilled in the art to best utilize the embodiments and various embodiments with various modifications as are suited to the particular use contemplated.

Claims (10)

1. An image acquisition apparatus, comprising:
at least two light sources, the power of the at least two light sources being different;
an image sensor comprising a pixel array for receiving reflected light of the at least two light sources;
a processor connected to the at least two light sources and the image sensor, configured to:
control the at least two light sources to project energy of corresponding power in a time-sharing manner;
control the pixel lattice to receive the reflected light of the at least two light sources; and
obtain a grayscale image according to the light intensity information of the reflected light of the at least two light sources.
2. The image acquisition device of claim 1, wherein the processor is configured to:
control the at least two light sources to sequentially project energy of corresponding power in corresponding frame times within one period.
3. The image acquisition device of claim 1, wherein the processor is configured to:
determine, in the pixel lattice, target pixel lattices respectively corresponding to the at least two light sources; and
control the target pixel lattices to receive the reflected light of the corresponding light sources.
4. The image acquisition device of claim 1, wherein the processor is configured to:
obtain, for each of the at least two light sources, a grayscale image according to the light intensity information of the reflected light of the light source and a light intensity threshold corresponding to the light source, wherein the light intensity threshold is used for filtering stray light and/or interference light in the reflected light of the light source.
5. The image capture device of claim 4, wherein the at least two light sources comprise a first light source and a second light source, the first light source having a power greater than the second light source, the processor configured to:
set, in a first target pixel lattice corresponding to the first light source, the light intensity higher than a first light intensity threshold in the light intensity information of the reflected light of the first light source to a preset value;
set, in a second target pixel lattice corresponding to the second light source, the light intensity lower than a second light intensity threshold in the light intensity information of the reflected light of the second light source to the preset value;
superpose the first target pixel lattice and the second target pixel lattice to obtain a third target pixel lattice; and
obtain the grayscale image according to the light intensity information of the reflected light in the third target pixel lattice.
6. The image acquisition device of any one of claims 1 to 5, wherein the processor is configured to:
send the grayscale image to a display connected to the image acquisition device, wherein the display is used for displaying the grayscale image.
7. A display device, comprising:
a display for displaying a grayscale image; and an image acquisition apparatus according to any one of claims 1 to 6.
8. An image processing method is applied to an image acquisition device, the image acquisition device comprises at least two light sources and an image sensor, the power of the at least two light sources is different, the image sensor comprises a pixel lattice for receiving the reflected light of the at least two light sources, and the image processing method comprises the following steps:
controlling the at least two light sources to project energy with corresponding power in a time-sharing manner;
controlling the pixel lattice to receive the reflected light of the at least two light sources;
and obtaining a grayscale image according to the light intensity information of the reflected light of the at least two light sources.
9. The method of claim 8, wherein the controlling the at least two light sources to project energy of corresponding power in a time-sharing manner comprises:
controlling the at least two light sources to sequentially project energy of corresponding power in corresponding frame times within one period.
10. The method of claim 8, wherein the controlling the pixel array to receive the reflected light from the at least two light sources comprises:
determining, in the pixel lattice, target pixel lattices respectively corresponding to the at least two light sources; and
controlling the target pixel lattices to receive the reflected light of the corresponding light sources.
CN202111110947.4A 2021-09-18 2021-09-18 Image acquisition device, display equipment, image processing method and device Pending CN115842964A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111110947.4A 2021-09-18 2021-09-18 Image acquisition device, display equipment, image processing method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111110947.4A 2021-09-18 2021-09-18 Image acquisition device, display equipment, image processing method and device

Publications (1)

Publication Number Publication Date
CN115842964A true CN115842964A (en) 2023-03-24

Family

ID=85574470

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111110947.4A Pending CN115842964A (en) 2021-09-18 2021-09-18 Image acquisition device, display equipment, image processing method and device

Country Status (1)

Country Link
CN (1) CN115842964A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116485626A (en) * 2023-04-10 2023-07-25 北京辉羲智能科技有限公司 Automatic driving SoC chip for sensor data dump
CN116485626B (en) * 2023-04-10 2024-03-12 北京辉羲智能科技有限公司 Automatic driving SoC chip for sensor data dump

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination