CN115604572B - Image acquisition method, electronic device and computer readable storage medium - Google Patents


Info

Publication number
CN115604572B
CN115604572B (application CN202211071812.6A)
Authority
CN
China
Prior art keywords
camera
near infrared
mode
image
visible light
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202211071812.6A
Other languages
Chinese (zh)
Other versions
CN115604572A (en)
Inventor
乔晓磊
肖斌
Current Assignee
Honor Device Co Ltd
Original Assignee
Honor Device Co Ltd
Priority date
Filing date
Publication date
Application filed by Honor Device Co Ltd
Priority to CN202211071812.6A
Publication of CN115604572A
Application granted
Publication of CN115604572B
Status: Active
Anticipated expiration


Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/30Transforming light or analogous information into electric information
    • H04N5/33Transforming infrared radiation

Abstract

An embodiment of the present application provides an image acquisition method and apparatus applied to an electronic device on which a camera application runs, the electronic device having a near-infrared flash and a near-infrared camera. The method comprises: in response to a shooting operation on a preview interface displayed by the camera application, lighting the near-infrared flash, acquiring image data captured by the near-infrared camera while the near-infrared flash is lit, and obtaining a near-infrared image based on the image data. Because the near-infrared image is not affected by ambient brightness, an image with richer detail can be obtained. In particular, in a dark environment, the detail of an image captured by a visible-light camera may fail to meet visual requirements for sharpness and similar metrics, whereas the near-infrared image can meet them, thereby achieving the aim of obtaining a better-quality image.

Description

Image acquisition method, electronic device and computer readable storage medium
Technical Field
The present application relates to the field of electronic information technology, and in particular to an image acquisition method and apparatus.
Background
The photographing function is a common function of electronic devices, and includes capturing photos, videos, and the like. To provide a better user experience, the electronic device can process the captured image and display the processed image to the user.
Therefore, how to obtain images with better quality becomes one of the current research hotspots.
Disclosure of Invention
The application provides an image acquisition method and apparatus, aiming to solve the problem of how to acquire a better-quality image.
In order to achieve the above object, the present application provides the following technical solutions:
a first aspect of the present application provides an image acquisition method applied to an electronic device, on which a camera application is running, the electronic device having a near infrared flash and a near infrared camera, the method comprising: and in response to shooting operation on a preview interface displayed based on the camera application, a near infrared flash is lightened (namely, the near infrared flash is controlled to be in a light-emitting state), image data shot by the near infrared camera in the state that the near infrared flash is lightened is acquired, and a near infrared image is obtained based on the image data. Because the near infrared image is not influenced by the ambient brightness, the image with richer detail information can be obtained, especially in the dark environment, the detail information of the image shot by the visible light camera can not meet the requirement of vision on indexes such as definition and the like, and the near infrared image can meet the requirement of vision on indexes such as definition and the like, so that the aim of obtaining images with better quality is fulfilled.
In some implementations, the electronic device also has a visible-light flash, and the preview interface comprises a preview interface in a first shooting mode, the first shooting mode being a shooting mode selected when the ambient brightness value is less than a first threshold and the visible-light flash is off. This condition restricts the first shooting mode to a dark-light environment, so that the advantage of near-infrared shooting can be fully exploited.
In some implementations, a method by which the camera application displays the preview interface in the first shooting mode comprises: after the camera application is opened, displaying the preview interface in the first shooting mode in response to the ambient brightness value being less than the first threshold and the visible-light flash being off. That is, the first shooting mode is selected automatically based on the dark-light environment and the off state of the visible-light flash, which improves convenience of use and yields a better user experience.
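The automatic selection described above amounts to a simple predicate. The threshold value below is an assumed example, since the patent only refers to an unspecified "first threshold":

```python
FIRST_THRESHOLD_LUX = 1.0  # assumed example value; the patent only says "first threshold"


def select_shooting_mode(ambient_lux: float, visible_flash_on: bool) -> str:
    """Pick the first (dark-light) shooting mode only when the scene is dark
    and the visible-light flash is off, as the implementation describes."""
    if ambient_lux < FIRST_THRESHOLD_LUX and not visible_flash_on:
        return "first_mode"   # e.g. the "black and white polar night" sub-mode
    return "default_mode"
```

Either condition failing (bright scene, or visible-light flash enabled) keeps the camera in its default mode.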
In some implementations, a method by which the camera application displays the preview interface in the first shooting mode comprises: after the camera application is opened, displaying a preview interface in a second shooting mode in response to the ambient brightness value being less than a second threshold, and displaying the preview interface in the first shooting mode in response to an operation of selecting the first shooting mode on the preview interface of the second shooting mode. That is, the first shooting mode is recommended automatically while leaving room for the user to choose otherwise, which improves convenience of use and yields a better user experience.
In some implementations, the electronic device also has a visible-light flash, and the preview interface displays a first control indicating that the visible-light flash is off and a second control indicating that the filter function is off. Because lighting the visible-light flash would destroy the dark-light environment, and the effect of a filter is hardly visible in such an environment, the filter function is turned off automatically and the off state of the visible-light flash and of the filter function is indicated in the preview interface.
In some implementations, the first control and the second control each further prompt that the corresponding function cannot be enabled, and the method further comprises: in response to an operation on the first control or the second control, turning off the near-infrared flash. Because lighting the visible-light flash would destroy the dark-light environment, and the effect of a filter is hardly visible in such an environment, the controls prompt that these functions are not enabled while still leaving the user the opportunity to choose: if the visible-light flash or the filter function is enabled through a control, the shooting mode matched to the dark-light environment is exited, yielding a better user experience and ensuring that the captured image has better quality.
In some implementations, the method further comprises: in response to an operation on the first control or the second control, switching the preview interface displayed in the camera application. On the one hand, if the visible-light flash or the filter function is enabled through a control, the shooting mode matched to the dark-light environment is exited, yielding a better user experience and ensuring that the captured image has better quality. On the other hand, after that mode is exited, the change of shooting mode is indicated by switching the preview interface, further improving the user experience.
In some implementations, the method further comprises: in response to a zoom operation on the preview interface displayed by the camera application, switching the preview interface displayed in the camera application. To prevent infrared light emitted by the near-infrared flash from interfering with laser focusing, the near-infrared camera uses a fixed focal length in the dark-light mode, so no zoom control is displayed in the preview interface; the user nevertheless retains the option to zoom, and if the user performs a zoom operation, the dark-light shooting mode is exited, yielding a better user experience and ensuring that the captured image has better quality.
In some implementations, the electronic device further has a visible-light camera, and before acquiring the image data captured by the near-infrared camera while the near-infrared flash is lit, the method further comprises: disabling synchronization of functions between the visible-light camera and the near-infrared camera, the functions comprising automatic exposure, automatic focusing, and automatic white balance, so that cameras of different types do not affect one another and degrade the quality of the captured image; the near-infrared camera also no longer performs automatic white balance, saving resources.
In some implementations, the electronic device further has a visible-light camera, and before acquiring the image data captured by the near-infrared camera while the near-infrared flash is lit, the method further comprises at least one of the following: in response to the focusing mode of the visible-light camera being laser-based focusing, setting the focusing mode of the visible-light camera to a fixed focal length, to avoid inaccurate focusing or frequent jumping of the preview picture caused by the near-infrared light emitted by the near-infrared flash interfering with laser focusing; and in response to the focusing mode of the near-infrared camera being laser-based focusing, setting the focusing mode of the near-infrared camera to a fixed focal length, for the same reason.
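The focus fallback described above can be sketched as a small guard applied to each camera's state before capture. The dictionary keys and mode names are illustrative assumptions, not identifiers from the patent:

```python
def configure_focus(camera_state: dict) -> dict:
    """If a camera currently focuses with laser assistance, pin it to a fixed
    focal length so that NIR light from the flash cannot mislead the laser
    ranging. Any other focusing mode is left untouched."""
    if camera_state.get("focus_mode") == "laser":
        camera_state["focus_mode"] = "fixed"
    return camera_state
```

Applied to both the visible-light camera state and the near-infrared camera state, this implements the "at least one of the following" steps above.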
In some implementations, before the near-infrared flash is lit, the method further comprises: setting a current value for the near-infrared flash based on an ambient brightness parameter, and lighting the near-infrared flash specifically comprises: controlling the near-infrared flash to emit light at that current value, so that the intensity of the near-infrared light emitted by the flash better matches the ambient brightness and the quality of the near-infrared image is improved.
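One plausible mapping from ambient brightness to drive current is a clamped linear ramp: the darker the scene, the stronger the flash. The patent only requires that the current be set from the brightness; every numeric value and the linear form below are assumptions of this sketch:

```python
def flash_current_ma(ambient_lux: float,
                     min_ma: float = 50.0, max_ma: float = 500.0,
                     dark_lux: float = 0.01, bright_lux: float = 1.0) -> float:
    """Map ambient brightness to a NIR-flash drive current in milliamps.

    Brightness is clamped to [dark_lux, bright_lux]; the darkest scene gets
    max_ma, the brightest gets min_ma, with linear interpolation in between.
    """
    lux = min(max(ambient_lux, dark_lux), bright_lux)
    frac = (bright_lux - lux) / (bright_lux - dark_lux)  # 1.0 when darkest
    return min_ma + frac * (max_ma - min_ma)
```

A real implementation would also respect the flash LED's thermal and safety limits, which this sketch ignores.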
In some implementations, obtaining the near-infrared image based on the image data comprises: selecting a near-infrared raw image from the image data, invoking an image enhancement algorithm to process the near-infrared raw image to obtain an enhancement result, and fusing the enhancement result with an image captured by the visible-light camera to obtain the near-infrared image. The near-infrared image is a black-and-white image; after fusion with the image captured by the visible-light camera it carries color information, which improves its visual effect.
In some implementations, fusing the enhancement result with the image captured by the visible-light camera to obtain the near-infrared image comprises: performing the fusion only when the ambient brightness is not less than a preset threshold. That is, when the color information of an RGB image captured in an excessively dark environment is insufficient to improve the visual effect of the near-infrared image, no fusion is performed, which avoids pointless resource overhead.
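The conditional enhance-then-fuse pipeline of the last two implementations can be sketched as follows. The `enhance` and `fuse` inner functions are placeholders for the real image-enhancement and fusion algorithms, and the threshold value is an assumed example:

```python
def derive_nir_image(nir_raw, rgb_frame, ambient_lux, fuse_threshold_lux=0.05):
    """Enhance the NIR raw frame, then fuse in color from the visible-light
    frame only when the scene is bright enough for that color to be useful."""
    def enhance(img):
        return img                                  # placeholder enhancement

    def fuse(mono, color):
        return {"luma": mono, "chroma": color}      # placeholder fusion

    enhanced = enhance(nir_raw)
    if ambient_lux >= fuse_threshold_lux and rgb_frame is not None:
        return fuse(enhanced, rgb_frame)   # color information is worth fusing
    return enhanced                        # too dark: keep the monochrome result
```

Skipping the fusion branch in very dark scenes is exactly the resource saving the text describes: the RGB frame would contribute mostly noise, so it is never processed.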
A second aspect of the present application provides an electronic device comprising: one or more processors, and one or more memories storing one or more programs that, when executed by the processors, cause the electronic device to perform the image acquisition method provided by the first aspect of the present application.
A third aspect of the present application provides a computer-readable storage medium having a computer program stored therein, which when executed by a processor, causes the processor to perform the image acquisition method provided in the first aspect of the present application.
A fourth aspect of the application provides a computer program product comprising computer program code which, when run on an electronic device, causes the electronic device to perform the image acquisition method provided by the first aspect of the application.
Drawings
Fig. 1a is an example of an image taken by a mobile phone with a visible-light camera in a scene with an ambient brightness of 0.1 lux;
FIG. 1b is an example of an image obtained after image enhancement processing of an acquired infrared image;
fig. 2a is an exemplary diagram of a preview interface corresponding to a night scene mode of a camera displayed in an execution process of an image acquisition method according to an embodiment of the present application;
FIG. 2b is an exemplary diagram of a preview interface in the "black and white polar night" mode;
fig. 3 is a diagram illustrating a composition example of an electronic device according to an embodiment of the present application;
fig. 4 is an example of some parameters of a main camera and a near infrared camera in an electronic device according to an embodiment of the present application;
FIG. 5 is a flow chart of acquisition of near infrared images based on execution of a software framework in an electronic device;
FIG. 6 is a flowchart of an image acquisition method according to an embodiment of the present application;
fig. 7a is a parameter example of different cameras at different stages in the image acquisition method according to the embodiment of the present application;
fig. 7b is another example of parameters of different cameras at different stages in the image acquisition method according to the embodiment of the present application;
FIG. 8a is an example of a flow of processing a near infrared raw image into a near infrared image by an electronic device;
fig. 8b is a further flow example of the electronic device processing the near infrared raw image into a near infrared image.
Detailed Description
The technical solutions in the embodiments of the present application are described below clearly and completely with reference to the accompanying drawings. The terminology used in the following examples is for the purpose of describing particular embodiments only and is not intended to limit the application. As used in the specification of the application and the appended claims, the singular forms "a," "an," and "the" are intended to include forms such as "one or more," unless the context clearly indicates otherwise. It should also be understood that in embodiments of the present application, "one or more" means one, two, or more than two; "and/or" describes an association relationship between associated objects and indicates that three relationships may exist; for example, A and/or B may represent: A alone, both A and B, or B alone, where A and B may each be singular or plural. The character "/" generally indicates that the associated objects are in an "or" relationship.
Reference in the specification to "one embodiment" or "some embodiments" or the like means that a particular feature, structure, or characteristic described in connection with the embodiment is included in one or more embodiments of the application. Thus, appearances of the phrases "in one embodiment," "in some embodiments," "in other embodiments," and the like in the specification are not necessarily all referring to the same embodiment, but mean "one or more but not all embodiments" unless expressly specified otherwise. The terms "comprising," "including," "having," and variations thereof mean "including but not limited to," unless expressly specified otherwise.
In the embodiments of the present application, "a plurality of" means two or more. It should be noted that, in the description of the embodiments of the present application, terms such as "first" and "second" are used only to distinguish between descriptions and are not to be understood as indicating or implying relative importance or order.
The camera of the electronic device can provide the shooting function of the image for the user. In order to obtain a better user experience, cameras are typically capable of providing multiple shooting modes, such as panoramic mode, sports mode, night scene mode, and the like.
The night scene mode can be understood as a shooting mode in which an image acquired in a dark scene is enhanced, for example by denoising and brightening, to obtain a better-quality image. A better-quality image can be understood as one whose parameters, such as sharpness, meet the requirements, that is, the image meets visual requirements for sharpness and similar metrics.
The inventors found in the course of research that, for images obtained in some dim-light scenes, for example at an ambient brightness of about 1 lux, a better-quality image can be obtained with an image enhancement algorithm. However, in darker scenes, for example at an ambient brightness of 0.1 lux, the information in the raw image acquired by the camera is insufficient because the light is too dim, so even with an image enhancement algorithm the resulting image cannot meet visual requirements for sharpness and similar metrics. Taking fig. 1a as an example, fig. 1a is an image taken by a mobile phone in a scene with an ambient brightness of 0.1 lux; because the ambient light is too dim, the image lacks detail, and even after image enhancement processing it is unlikely that an image satisfying visual sharpness requirements can be obtained.
The inventors also found during research that an infrared image acquired with the infrared flash on has rich detail, so that after processing an infrared image obtained with the infrared flash in a (visible-light) dim scene, visual requirements for sharpness and similar metrics can be met. For example, in a 0.1 lux scene, the infrared flash is turned on and an infrared camera collects an infrared image; fig. 1b is the image obtained by performing image enhancement processing on that infrared image. Compared with fig. 1a, the detail in fig. 1b is rich and can meet visual requirements for sharpness and similar metrics.
Moreover, because infrared light cannot be perceived by the human eye, enabling the infrared flash for night-mode shooting allows imperceptible image enhancement in extremely dark or lightless environments.
For the above reasons, an embodiment of the application discloses an image acquisition method that turns on an infrared flash in a dark-light environment to acquire an infrared image and performs image enhancement processing on the infrared image, to obtain an image meeting visual requirements for sharpness and similar metrics.
The image acquisition method provided by the embodiment of the application is triggered by the user's operation of the camera application; the improvements to the camera interface are described below.
Fig. 2a is a preview interface corresponding to a night scene mode of a camera displayed in an executing process of the image acquisition method according to the embodiment of the present application.
Night scene mode is one of the shooting modes of the camera. As shown in fig. 2a, examples of the camera's shooting modes include portrait, photo, night scene, video, and panorama; the names of these shooting modes are displayed on the slidable control 8.
In fig. 2a and 2b, the "night scene mode" is treated as a sub-mode of the photo mode, rather than the conventional night scene mode selected via the slidable control 8. It will be appreciated that the night scene mode shown in fig. 2a and 2b may also be the night scene mode selected on the slidable control 8, i.e. a conventional night scene mode modified to have the interface shown in fig. 2a or 2b.
In some implementations, after the user opens the camera application, the photo mode can be selected through the slidable control 8, and the night scene mode is then selected under the photo mode's menu. In other implementations, after the user opens the camera application, when the artificial intelligence (AI) function is on (control 1 in fig. 2a), the camera automatically switches to the night scene mode within the photo mode (hereinafter simply "night scene mode") based on the ambient brightness collected by the camera (e.g. less than 1 lux) and on the visible-light flash being off (control 2 in fig. 2a).
As shown in fig. 2a, in the preview interface of the night scene mode, control 3 indicates that the filter is off; because the filter effect is hardly visible in a dark environment, the filter is turned off. Control 4 represents the two-dimensional code scanning function; clicking it opens the code-scanning interface. Control 5 represents the camera's settings function; clicking it opens the camera's settings interface.
Control 2 and control 3 in fig. 2a are both operational. After the user clicks control 2, which indicates that the visible-light flash is off, the visible-light flash switches from off to on; since the flash will then emit light when the user clicks the shooting control 9, the scene is no longer a dark-light environment. Therefore, if the user clicks control 2 to switch the visible-light flash on, the camera exits the night scene mode, for example switching from the night scene mode to the photo mode.
After the user clicks control 3 to switch the filter from off to on, in some implementations the camera exits the night scene mode; in other implementations the camera remains in the current night scene mode and may additionally issue a reminder that the filter's visual effect will be poor.
In the preview interface of the night scene mode, a control 6 representing the night scene mode is also displayed; it is understood that the user may click the "x" on control 6 to turn off the night scene mode. After the night scene mode is turned off, the camera switches to a default mode, such as the default photo mode.
It will be appreciated that after the shooting mode of the camera is changed, the preview interface is also adjusted accordingly.
Unlike the preview interface of a conventional camera's night scene mode, in fig. 2a the night scene mode includes two sub-modes: a standard night scene mode and an extremely dark night scene mode. The control 61 named "standard" in fig. 2a represents the standard night scene mode, and the control 62 named "black and white polar night" represents the extremely dark night scene mode.
In some implementations, the camera selects a sub-mode of the night scene mode based on the ambient brightness sensed by the visible-light camera and indicates the selected sub-mode in the preview interface of the night scene mode. In fig. 2a, assuming the selected sub-mode is the standard night scene mode, control 61 is emphasized to show the user the current sub-mode.
It will be appreciated that in the standard night scene mode shown in fig. 2a, the user can click control 62 to switch from the standard night scene mode to the "black and white polar night" mode.
As can also be seen from fig. 2a, in the standard night scene mode a zoom control 7 is displayed; the user can select the zoom level of the preview image by sliding on the zoom control 7.
After the user clicks the shooting control 9 in fig. 2a, the camera invokes the visible-light camera to capture an image.
Fig. 2b is the preview interface in the "black and white polar night" mode. The night scene mode control 6 is shown in fig. 2b with control 62 highlighted, to indicate the mode the user is currently in.
After the user clicks the shooting control 9 in fig. 2b, the camera invokes the infrared camera to capture an image with the infrared flash on, and obtains the image shot in the "black and white polar night" mode based on the result of enhancing the acquired image data.
Unlike fig. 2a, control 2 and control 3 are shown in gray in fig. 2b, and the zoom control is no longer displayed.
Because the camera invokes the infrared flash and the infrared camera in the "black and white polar night" mode, control 2 (the visible-light flash) is displayed in gray to indicate that the visible-light flash cannot be turned on in this mode. In some implementations, if the user clicks control 2 in fig. 2b, the visible-light flash switches from off to on but the camera exits the night scene mode, e.g. switching from the night scene mode to the photo mode. In other implementations, if the user clicks control 2 in fig. 2b, the camera does not respond.
Because the infrared image is a black-and-white image, the filter effect cannot be supported; therefore control 3 is displayed in gray to indicate that the filter is unavailable in the "black and white polar night" mode. In some implementations, if the user clicks control 3 in fig. 2b, the filter function is turned on and the camera exits the night scene mode, e.g. switching to the default photo mode. In other implementations, if the user clicks control 3 in fig. 2b, the camera does not respond or prompts that the filter function is unavailable.
To prevent infrared light emitted by the infrared flash from interfering with laser focusing, the infrared camera uses a fixed focal length in the "black and white polar night" mode, so the zoom control is not shown in fig. 2b. In some implementations, if the user performs a zoom gesture in the interface shown in fig. 2b, the camera exits the "black and white polar night" mode. In other implementations, if the user performs a zoom gesture in the interface shown in fig. 2b, the camera issues a message that zooming is unavailable or does not respond.
It will be appreciated that the preview interfaces depicted in fig. 2a and 2b are shown in black because the camera is currently in a dark environment, so the preview image displayed based on the image captured by the visible-light camera appears as shown in fig. 2a or fig. 2b.
It will be appreciated that the interfaces and controls in the interfaces shown in fig. 2a and 2b are merely examples and are not intended to be limiting of the interfaces displayed in the image acquisition methods provided by embodiments of the present application.
In conjunction with fig. 2a and fig. 2b, it can be understood that one way to enter the night scene mode is automatic selection based on the collected ambient brightness and the off state of the visible-light flash, and another is selecting the night scene mode among the camera's shooting modes. Within the night scene mode, the user can switch between the standard night scene mode and the extremely dark night scene mode through the controls.
The image acquisition method provided by the embodiment of the application is applied to an electronic device, which includes, but is not limited to, a device with a camera such as a mobile phone, tablet computer, desktop computer, laptop or notebook computer, ultra-mobile personal computer (UMPC), handheld computer, netbook, personal digital assistant (PDA), wearable electronic device, or smart watch.
Fig. 3 is a composition example of an electronic device according to an embodiment of the present application. Taking a mobile phone as an example, the electronic device may include a processor 110, an internal memory 120, a display 130, a camera 140, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, and the like.
It is to be understood that the structure illustrated in the present embodiment does not constitute a specific limitation on the electronic apparatus. In other embodiments, the electronic device may include more or fewer components than shown, or certain components may be combined, or certain components may be split, or different arrangements of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The processor 110 may include one or more processing units, such as: the processor 110 may include an application processor (application processor, AP), a modem processor, a graphics processor (graphics processing unit, GPU), an image signal processor (image signal processor, ISP), a controller, a video codec, a digital signal processor (digital signal processor, DSP), a baseband processor, and/or a neural network processor (neural-network processing unit, NPU), etc. Wherein the different processing units may be separate devices or may be integrated in one or more processors.
The internal memory 120 may be used to store computer-executable program code, which includes instructions. The internal memory 120 may include a program storage area and a data storage area. The program storage area may store the operating system and the application programs required for at least one function (such as a sound playing function or an image playing function). The data storage area may store data created during use of the electronic device (such as audio data and a phonebook). In addition, the internal memory 120 may include high-speed random access memory and may also include nonvolatile memory, such as at least one magnetic disk storage device, flash memory device, or universal flash storage (UFS). The processor 110 performs the various functional applications and data processing of the electronic device by executing instructions stored in the internal memory 120 and/or instructions stored in a memory provided within the processor.
The electronic device implements display functions through the GPU, the display 130, and the application processor, etc. The GPU is a microprocessor for image processing, and is connected to the display 130 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. Processor 110 may include one or more GPUs that execute program instructions to generate or change display information.
The display screen 130 is used to display images, videos, and the like. The display 130 includes a display panel. The display panel may employ a liquid crystal display (liquid crystal display, LCD), an organic light-emitting diode (organic light-emitting diode, OLED), an active-matrix organic light-emitting diode (active-matrix organic light-emitting diode, AMOLED), a flexible light-emitting diode (flexible light-emitting diode, FLED), a Mini-LED, a Micro-LED, a quantum dot light-emitting diode (quantum dot light emitting diodes, QLED), or the like. In some embodiments, the electronic device may include 1 or N display screens 130, N being a positive integer greater than 1.
The electronic device may implement shooting functions through the ISP, the camera 140, the video codec, the GPU, the display 130, the application processor, and the like.
The ISP is used to process the data fed back by the camera 140. For example, when a photo is taken, the shutter opens and light reaches the camera's photosensitive element through the lens; the photosensitive element converts the optical signal into an electrical signal and passes it to the ISP, which processes it into an image visible to the naked eye. The ISP can also optimize the noise, brightness, and color of the image, as well as parameters such as the exposure and color temperature of the shooting scene. In some embodiments, the ISP may be provided in the camera 140.
The camera 140 is used to capture still images or video. The object generates an optical image through the lens and projects the optical image onto the photosensitive element. The photosensitive element may be a charge coupled device (charge coupled device, CCD) or a Complementary Metal Oxide Semiconductor (CMOS) phototransistor. The photosensitive element converts the optical signal into an electrical signal, which is then transferred to the ISP to be converted into a digital image signal. The ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into an image signal in a standard RGB, YUV, or the like format. In some embodiments, the electronic device may include 1 or N cameras 140, N being a positive integer greater than 1.
The camera 140 may comprise a variety of types of cameras, including, but not limited to: visible light cameras such as RGB cameras (abbreviated as RGB), and near infrared cameras (abbreviated as NIR). Also, in an embodiment of the present application, the electronic device further includes a visible light flash 180A and a near infrared flash 180B.
In some implementations, the visible light camera is a primary camera of the electronic device. Fig. 4 shows an example of some parameters of the main camera and the near infrared camera.
The digital signal processor is used to process digital signals; in addition to digital image signals, it can process other digital signals. For example, when the electronic device selects a frequency point, the digital signal processor is used to perform a Fourier transform on the frequency point energy, and so on.
Video codecs are used to compress or decompress digital video. The electronic device may support one or more video codecs. In this way, the electronic device may play or record video in a variety of encoding formats, such as: dynamic picture experts group (moving picture experts group, MPEG) 1, MPEG2, MPEG3, MPEG4, etc.
The electronic device may implement audio functions through an audio module 170, a speaker 170A, a microphone 170B, an application processor, and the like. Such as music playing, recording, etc.
The audio module 170 is used to convert digital audio information into an analog audio signal output and also to convert an analog audio input into a digital audio signal. The audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be disposed in the processor 110, or a portion of the functional modules of the audio module 170 may be disposed in the processor 110.
The speaker 170A, also referred to as a "horn," is used to convert audio electrical signals into sound signals. The electronic device may listen to music, or to hands-free conversations, through speaker 170A.
In some embodiments, speaker 170A may play video information with special effects as mentioned in embodiments of the present application.
The microphone 170B, also referred to as a "mic" or "mouthpiece", is used to convert sound signals into electrical signals. When making a call or sending voice information, the user can speak with the mouth close to the microphone 170B to input a sound signal into the microphone 170B.
The wireless communication function of the electronic device may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals.
The mobile communication module 150 may provide a solution for wireless communication including 2G/3G/4G/5G, etc. applied on an electronic device. The mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (low noise amplifier, LNA), etc. The mobile communication module 150 may receive electromagnetic waves from the antenna 1, perform processes such as filtering, amplifying, and the like on the received electromagnetic waves, and transmit the processed electromagnetic waves to the modem processor for demodulation. The mobile communication module 150 can amplify the signal modulated by the modem processor, and convert the signal into electromagnetic waves through the antenna 1 to radiate.
The wireless communication module 160 may provide solutions for wireless communication including wireless local area network (wireless local area networks, WLAN) (e.g., wireless fidelity (wireless fidelity, wi-Fi) network), bluetooth (BT), global navigation satellite system (global navigation satellite system, GNSS), frequency modulation (frequency modulation, FM), near field wireless communication technology (near field communication, NFC), infrared technology (IR), etc. for application on an electronic device. The wireless communication module 160 may be one or more devices that integrate at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 2, modulates the electromagnetic wave signals, filters the electromagnetic wave signals, and transmits the processed signals to the processor 110. The wireless communication module 160 may also receive a signal to be transmitted from the processor 110, frequency modulate it, amplify it, and convert it to electromagnetic waves for radiation via the antenna 2.
In addition, an operating system is run on the components. Such as iOS operating systems, android operating systems, windows operating systems, and the like.
Taking the Android operating system as an example, in some embodiments the Android system is divided, from top to bottom, into an application layer, an application framework layer, the Android Runtime (ART) and system libraries, a hardware abstraction layer (HAL), and a kernel layer. Each layer has a clear role and division of labor, and the layers communicate with each other through software interfaces.
The application layer may include a series of application packages. The application framework layer provides an application programming interface (application programming interface, API) and a programming framework for the applications of the application layer; it includes a number of predefined functions. The Android Runtime includes a core library and virtual machines and is responsible for the scheduling and management of the Android system. The core library consists of two parts: one part is the functions that the Java language needs to call, and the other part is the core library of Android.
The application layer and the application framework layer run in a virtual machine. The virtual machine executes java files of the application program layer and the application program framework layer as binary files. The virtual machine is used for executing the functions of object life cycle management, stack management, thread management, security and exception management, garbage collection and the like.
The system library may include a plurality of functional modules. The hardware abstraction layer is used to abstract the hardware: it hides the hardware interface details of a specific platform and provides a virtual hardware platform for the operating system, so that the operating system is hardware-independent and can be ported across various platforms. The kernel layer is the layer between hardware and software; hardware drivers generally run in the kernel layer.
The modules related to the embodiment of the present application are distributed across the application layer, the system library, the HAL, and the kernel layer. Fig. 5 shows an example of the layers and modules involved in the embodiment of the present application; the functions of layers and modules not shown in fig. 5 are not described again.
Although the Android system is taken as an example for explanation, the basic principle of the embodiment of the application is also applicable to electronic devices based on iOS, windows and other operating systems.
Taking fig. 5 as an example, the applications running in the application layer include a camera. The system library includes an image processing algorithm library, in which algorithms for image processing are preconfigured, including but not limited to an infrared image enhancement algorithm. A camera HAL runs in the HAL layer. A visible light flash driver, a visible light camera driver, a near infrared flash driver, and a near infrared camera driver run in the kernel layer.
The basic flow of obtaining a near infrared image in a very dark environment, in connection with fig. 2a, 2b, and 5, is as follows: in response to the user's operation of starting the camera, the camera sends an opening instruction for the visible light camera to the camera HAL, and the camera HAL sends the opening instruction to the visible light camera driver to control the visible light camera to turn on.
The camera sends the ambient brightness parameter acquired through the visible light camera and the flash state parameter (the flash being off) to the camera HAL. The camera HAL sends information of the night scene mode, selected based on these parameters, to the camera, and the camera displays the interface shown in fig. 2a based on the received information. It can be understood that after the visible light camera is turned on, the image captured by the visible light camera is displayed in the preview interface of the camera.
In response to the user selecting the "black and white night" mode, the camera sends information of the selected mode to the camera HAL, and the camera HAL sends an opening instruction to the near infrared camera driver based on this information to control the near infrared camera to turn on.
In response to the user's shooting operation, the camera sends a shooting instruction to the camera HAL. The camera HAL controls the near infrared flash to turn on by sending an opening instruction to the near infrared flash driver, selects near-infrared raw images from the images acquired by the near infrared camera while the near infrared flash is on, invokes the image enhancement algorithm in the image processing algorithm library to enhance the near-infrared raw images, and sends the processed near infrared image to the gallery. The gallery stores the processed near infrared image.
It can be understood that after shooting is completed, the camera HAL sends closing instructions to the near infrared flash driver and the near infrared camera driver to turn off the near infrared flash and the near infrared camera, respectively. After the user exits the camera, the camera sends a closing instruction to the camera HAL, and the camera HAL turns off the visible light camera by sending a closing instruction to the visible light camera driver.
The image acquisition method according to the embodiment of the present application will be described in more detail below on the basis of fig. 5. It can be appreciated that the image acquisition method disclosed in this embodiment is applicable to scenes shot with an electronic device, where the shooting scene includes, but is not limited to, a dark-light scene, and shooting includes, but is not limited to, taking photographs or recording video.
As shown in fig. 6, the image acquisition method according to the present embodiment includes the steps of:
S01, the camera, in response to the operation of starting the camera, transmits an opening instruction for the visible light camera to the camera HAL.
In some implementations, a user clicks an icon of a camera on a desktop of an electronic device, and the camera detects a turn-on operation.
It will be appreciated that the camera may transmit an on command of the visible light camera to the camera HAL through the corresponding module of the application framework layer, and will not be described in detail in fig. 6.
S02, the camera HAL transmits an opening instruction of the visible light camera to the visible light camera driver.
It can be appreciated that the visible light camera drive controls the opening of the visible light camera in response to an opening command of the visible light camera.
S03, the visible light camera driver, in response to the opening instruction of the visible light camera, controls the visible light camera to turn on.
S04, the camera obtains image data shot by the visible light camera.
After the visible light camera is started, the preview interface of the camera displays images (i.e. preview images) acquired by the visible light camera. In some implementations, the image data collected by the visible light camera is transmitted to the camera through the camera HAL module after being processed by the ISP and the like, and the details are not repeated in fig. 6.
The image data includes parameters such as an ambient brightness parameter (which may be simply referred to as a brightness parameter), or the brightness parameter or the like can be obtained based on the image data.
S05, the camera transmits image data and visible light flash lamp state parameters to the camera HAL.
It is understood that the visible light flash status parameter indicates whether the visible light flash is in an on or off state.
This embodiment assumes that the user turns on the camera outdoors at night with no lighting, and that the visible light flash is off. The brightness parameter transmitted by the camera to the camera HAL therefore characterizes a dim-light environment, e.g., less than 1 lux, and the visible light flash state parameter indicates that the flash is off.
S06, the camera HAL selects a shooting mode (i.e., a first target shooting mode) based on the image data and the visible light flash status parameter.
It will be appreciated that the camera includes a plurality of shooting modes, and the night scene mode is a shooting mode suitable for a dim light environment.
In some implementations, the camera HAL obtains the brightness parameter from the image data; in other implementations, the camera HAL calculates the brightness parameter based on the image data.
In some implementations, a correspondence between the photographing mode and at least one of the brightness parameter and the setting parameters of the camera is preconfigured in the camera HAL, and the camera HAL determines the photographing mode based on the brightness parameter in the received image data and this correspondence.
In some implementations, the camera HAL converts the brightness parameter into LuxIndex units, regardless of how the brightness parameter was obtained. For example, if the ambient brightness parameter satisfies the condition LuxIndex > 500 and the flash is off, the "black and white night" mode is selected; if it satisfies the condition LuxIndex < 480 and the flash is off, the standard night scene mode is selected.
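The selection rule above can be sketched as follows. This is a minimal illustration: a larger LuxIndex means a darker scene, the 500/480 thresholds are the examples given in the text, and the function name and "default" return value are assumptions, not part of the patent.

```python
def select_night_mode(lux_index: int, flash_on: bool) -> str:
    """Sketch of the S06 mode-selection rule described in the text."""
    if flash_on:
        return "default"                  # the rule only applies with the flash off
    if lux_index > 500:
        return "black and white night"    # extremely dark: NIR-assisted capture
    if lux_index < 480:
        return "standard night scene"
    return "default"                      # 480..500: no rule is given in the text

assert select_night_mode(520, flash_on=False) == "black and white night"
assert select_night_mode(450, flash_on=False) == "standard night scene"
```

Note that the text leaves the interval 480 ≤ LuxIndex ≤ 500 unspecified; the sketch falls back to a default mode there.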
For convenience of description, a photographing mode selected by the camera HAL is referred to as a first target photographing mode.
It can be appreciated that, in addition to being obtained from the image data transmitted by the camera, the brightness parameter can be obtained from a light intensity sensor on the electronic device, i.e., the camera HAL obtains the brightness value of the environment from the light intensity sensor.
S07, the camera HAL transmits information of the first target photographing mode to the camera.
The information of the photographing mode may be understood as information representing the photographing mode, for example, an identification of the photographing mode.
S08, the camera displays a preview interface corresponding to the first target shooting mode based on the information of the first target shooting mode, and displays the image captured by the visible light camera on the preview interface.
In one example, the first target shooting mode is the "black and white night" mode, and the camera displays the preview interface shown in fig. 2b; in another example, the first target shooting mode is the standard night scene mode, and the camera displays the preview interface shown in fig. 2a.
It will be appreciated that the user may switch the shooting mode from "black and white night" mode to the standard night scene mode by clicking the control 61 on the preview interface shown in fig. 2b, or the user may switch the shooting mode from the standard night scene mode to "black and white night" mode by clicking the control 62 on the preview interface shown in fig. 2 a. Alternatively, the user may operate on the preview interface shown in fig. 2a or 2b to switch from the night scene mode to the other shooting mode, that is, the user may reselect the shooting mode by himself after the camera recommends the shooting mode to the user.
It can be appreciated that, because the camera has already obtained the image data captured by the visible light camera at S04 but has not yet received the information of the first target mode from the camera HAL at that point, after the camera is turned on it first displays a preview interface corresponding to a default mode (for example, the ordinary photographing mode), and then switches to the preview interface corresponding to the first target mode, for example, the "black and white night" mode. This interface switch may or may not be perceived by the user.
In the case that the camera HAL obtains the brightness parameter from the light intensity sensor, S04 may be skipped, so that the user does not perceive the switching of the shooting mode, improving the user experience.
S09, the camera transmits information of the second target photographing mode to the camera HAL.
As previously described, the user may reselect the photographing mode based on the camera's preview interface, for example, clicking the control 62 in fig. 2a to switch from the standard night scene mode to the "black and white night" mode. The purpose of this step is therefore to inform the camera HAL of the latest target photographing mode (referred to as the second target photographing mode).
S10, the camera HAL judges whether near infrared shooting needs to be started or not based on the information of the second target mode, if yes, S11 is executed, and if not, the process is ended (or the process of shooting based on the visible light camera is executed).
It is understood that near infrared photographing may be understood as turning on a near infrared flash and a near infrared camera, or turning on a near infrared camera.
In connection with fig. 2b and the specific implementation of the foregoing steps, it can be understood that, in the case where the second target mode is the "black and white night" mode, it is determined that the near infrared shooting needs to be turned on.
Because linkage logic is configured among the different types of cameras, and the near infrared flash affects the various cameras, when the determination result of S10 is yes, the configuration of each camera needs to be modified to reduce the influence of the near infrared flash on the other cameras and improve the quality of the image data acquired by each camera.
S11, the camera HAL turns off 3A synchronization.
3A is an abbreviation for Auto Exposure (AE), auto Focus (AF), and Auto white balance (Auto White Balance, AWB).
Typically, the cameras on an electronic device are preconfigured with 3A, and the different cameras are configured for 3A synchronization. However, the near infrared image is a black-and-white image, so AWB does not need to be configured for it. Therefore, when the camera invokes the near infrared camera to collect image data, the AWB of the near infrared camera is turned off to save resources, while the AWB of the visible light camera remains on. Because the AWB configurations of the two cameras now differ, the 3A synchronization between the near infrared camera and the visible light camera needs to be turned off.
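The 3A reconfiguration in S11 can be sketched as a small state change. All dictionary keys and the function name here are hypothetical; the point is only the relationship the text describes: NIR AWB off, RGB AWB on, hence 3A sync off.

```python
def configure_3a_for_nir_capture(cameras: dict) -> dict:
    """Illustrative sketch of step S11: disable AWB on the NIR camera and
    break 3A synchronization between the two cameras."""
    cameras["nir"]["awb"] = False   # NIR output is monochrome: AWB is useless
    cameras["rgb"]["awb"] = True    # visible light camera keeps AWB on
    cameras["sync_3a"] = False      # configurations now differ, so stop syncing 3A
    return cameras

state = configure_3a_for_nir_capture({
    "rgb": {"ae": True, "af": True, "awb": True},
    "nir": {"ae": True, "af": True, "awb": True},
})
assert state["sync_3a"] is False and state["nir"]["awb"] is False
```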
At this point, the near infrared camera is ready to collect near-infrared image data.
S12, the camera responds to shooting operation of a user and transmits shooting instructions to the camera HAL.
An example of a shooting operation by a user is shown in connection with fig. 2a or fig. 2b, in which the user clicks the shooting control 9.
S13, the camera HAL judges whether a focus locking condition is met, if yes, S14 is executed, and if no, S15 is executed.
The near infrared light emitted by the near infrared flash can interfere with the laser sensing used for AF, reducing the accuracy of the ranging result and potentially causing the AF focal length to change frequently. Because the preview interface shown in fig. 2b displays the image data collected by the visible light camera, the frequent focal-length changes would appear as a jumping picture on that preview interface, degrading the user experience; the AF of the visible light camera therefore needs to be turned off. It can be appreciated that a fixed focal length does not require laser focusing, so it avoids this problem. The focus locking condition of the visible light camera thus includes: the focusing mode of the visible light camera is laser-based AF.
Similarly, the AF of the near infrared camera is also based on laser ranging, and the near infrared light emitted by the near infrared flash can interfere with the laser sensing used for AF, reducing the accuracy of the ranging result and causing frequent focal-length changes, which affects the quality of the images collected by the near infrared camera. The focus locking condition of the near infrared camera therefore includes: the focusing mode of the near infrared camera is laser-based AF.
S14, the camera HAL changes the focusing mode of both the visible light camera and the near infrared camera to a fixed focal length.
It is understood that the value of the fixed focal length may be preconfigured in the electronic device.
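Steps S13–S14 can be sketched together as a per-camera check and reconfiguration. The config keys, the saved-mode bookkeeping (used later when S24 restores the focusing mode), and the default focal-length value are illustrative assumptions, not values from the patent.

```python
def apply_focus_lock(camera_cfg: dict, fixed_focal_length: float = 1.0) -> dict:
    """Sketch of S13-S14: if a camera focuses via laser-based AF, switch it to
    a fixed focal length so NIR flash light cannot disturb the laser ranging."""
    if camera_cfg.get("focus_mode") == "laser_af":   # focus-lock condition (S13)
        camera_cfg["saved_focus_mode"] = "laser_af"  # remembered for restore (S24)
        camera_cfg["focus_mode"] = "fixed"           # modification done in S14
        camera_cfg["focal_length"] = fixed_focal_length
    return camera_cfg

cfg = apply_focus_lock({"focus_mode": "laser_af"})
assert cfg["focus_mode"] == "fixed"
```

A camera already using a fixed focal length passes through unchanged, matching the "if no, execute S15" branch of S13.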
S15, the camera HAL sends a current value for the near infrared flash, determined based on the brightness parameter, to the near infrared flash driver.
Examples of current values for the near infrared flash, set based on different brightness parameters, are shown in Table 1:
TABLE 1

Ambient brightness (LuxIndex)    Current value (mA)
450                              250
480                              500
510                              750
540                              1000
It can be seen that a larger LuxIndex corresponds to lower ambient brightness, so a higher intensity of near infrared light is required, and thus a higher current value for the near infrared flash is required.
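A sketch of the Table 1 lookup used in S15 follows. The table rows come from the text; the interpolation policy (pick the smallest table entry whose LuxIndex is at least the measured value, saturating at the last row) is an assumption, since the text only lists sample points.

```python
# Table 1 from the text: LuxIndex -> NIR flash drive current (mA).
FLASH_CURRENT_MA = {450: 250, 480: 500, 510: 750, 540: 1000}

def nir_flash_current(lux_index: int) -> int:
    """Illustrative lookup: larger LuxIndex = darker scene = more current."""
    for threshold in sorted(FLASH_CURRENT_MA):
        if lux_index <= threshold:
            return FLASH_CURRENT_MA[threshold]
    return max(FLASH_CURRENT_MA.values())  # darker than the last row: max current

assert nir_flash_current(460) == 500
assert nir_flash_current(600) == 1000
```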
S16, the near infrared flash driver controls the near infrared flash to light up based on the received current value.
S17, the near infrared camera driver controls the near infrared camera to acquire near-infrared image data according to the set parameters.
The set parameters include, but are not limited to, the focusing mode being set to a fixed focal length.
That is, in addition to the focusing mode, the cameras are configured with other shooting parameters. In some examples, the sensitivity (ISO) of the main camera is 6400 with an exposure time of 1/2 second, while the ISO of the near infrared camera is 1309 with an exposure time of 1/17 second. It can be appreciated that, thanks to the assistance of the near infrared flash, the image captured by the near infrared camera is brighter than that captured by the visible light camera even though both its ISO and its exposure time are lower than those of the main camera.
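The example shooting parameters quoted above can be collected into a small configuration sketch. The dictionary layout and key names are illustrative; only the numeric values come from the text.

```python
# Example shooting parameters quoted in the text for S17 (layout is hypothetical).
# With the NIR flash assisting, the near infrared camera uses a lower ISO and a
# shorter exposure than the main camera, yet yields a brighter image.
CAPTURE_PARAMS = {
    "main_rgb": {"iso": 6400, "exposure_s": 1 / 2,  "focus_mode": "fixed"},
    "nir":      {"iso": 1309, "exposure_s": 1 / 17, "focus_mode": "fixed"},
}

# Both values are lower for the NIR camera, as the text notes:
assert CAPTURE_PARAMS["nir"]["iso"] < CAPTURE_PARAMS["main_rgb"]["iso"]
assert CAPTURE_PARAMS["nir"]["exposure_s"] < CAPTURE_PARAMS["main_rgb"]["exposure_s"]
```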
In some implementations, the near infrared camera driver sends the set parameters to a controller of the near infrared camera, and the controller controls the near infrared camera to collect near-infrared image data based on the set parameters. It can be understood that after the visible light camera is turned on, it collects visible light image data, such as RGB raw images, at a preset frequency until it is turned off.
It can be appreciated that the near infrared camera may be turned on by the camera HAL through the near infrared camera driver after S11, or after S12, S13, or S14. After S14, the set parameters are transmitted by the camera HAL to the near infrared camera driver. Whichever way the near infrared camera is turned on, after S16 the near infrared camera driver controls the near infrared camera to acquire near-infrared image data based on the set parameters.
S18, the camera HAL selects near-infrared raw images from the near-infrared data, and selects RGB raw images from the image data acquired by the visible light camera.
In some implementations, the image data collected by the near infrared camera and the image data collected by the visible light camera are processed by hardware modules such as ISP and then transmitted to the camera HAL through corresponding driving in the kernel layer.
In one example, 4 frames of near-infrared raw images and 6 frames of RGB raw images are selected. The specific implementation of selecting the near-infrared raw images, the RGB raw images, and the RGB thumbnail is not repeated here.
In some implementations, the camera HAL may also select RGB thumbnails from the image data captured by the visible light camera. RGB thumbnails may be displayed in the thumbnail box in the lower left corner of the preview interface after photographing is completed. In other implementations, the near infrared image obtained in S19 is displayed in a thumbnail box in the lower left corner of the preview interface after the photographing is completed.
S19, the camera HAL invokes the image enhancement algorithm to process the near-infrared raw images, obtaining an enhanced near infrared image.
In connection with fig. 5, the camera HAL invokes the image enhancement algorithm in the image processing algorithm library of the system library. The purpose of the image enhancement algorithm is to enhance the detail features of the image so that it meets the user's visual requirements for clarity; the specific logic of the enhancement algorithm is not repeated here.
It will be appreciated that the near infrared image resulting from the enhancement process is a black and white image.
S20, the camera HAL obtains the near infrared image based on the RGB raw images and the enhanced near infrared image.
The RGB raw image is fused with the enhanced near infrared image so that the near infrared image carries color information, yielding a better visual effect and thus improving the user experience.
Considering that the color information of an RGB image captured in an excessively dark environment is insufficient to improve the visual effect of the near infrared image, in some implementations the near infrared image is obtained based on the RGB raw image and the enhanced near infrared image only when the ambient brightness is not less than a preset threshold (e.g., 0.1 lux).
When the ambient brightness is below the preset threshold (e.g., 0.1 lux), the color information of the RGB raw image is no longer fused, reducing pointless resource overhead.
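The brightness gate around S20 can be sketched as follows. The 0.1 lux default follows the example in the text; the function names are hypothetical, and `fuse` is a stand-in for the fusion algorithm, which the text does not specify.

```python
def fuse(nir_enhanced, rgb_raw):
    # Placeholder for the unspecified fusion step: a real implementation would
    # transfer colour information from the RGB frame onto the NIR frame.
    return {"base": nir_enhanced, "colour": rgb_raw}

def finalize_image(nir_enhanced, rgb_raw, ambient_lux: float,
                   threshold_lux: float = 0.1):
    """Sketch of S20 plus the brightness gate: fuse RGB colour into the
    enhanced NIR image only when the scene is bright enough for the RGB
    frame to carry usable colour."""
    if ambient_lux >= threshold_lux:
        return fuse(nir_enhanced, rgb_raw)  # colourised near infrared image
    return nir_enhanced                     # too dark: skip pointless fusion
```

In the very dark case the enhanced NIR image is returned unchanged, matching the resource-saving behaviour described above.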
S21, the camera HAL transmits the near infrared image to the gallery.
The near infrared image is saved in the gallery for viewing by the user. It can be appreciated that the RGB thumbnail may also be transferred to the gallery for storage.
S22, the camera HAL turns off the near infrared flash through the near infrared flash driver.
In some implementations, the camera HAL sends a close command to a near infrared flash drive, which closes the near infrared flash in response to the close command.
S23, the camera HAL turns off the near infrared camera through the near infrared camera driver.
Since the shooting performed in response to the user's shooting operation has ended, the near infrared flash and the near infrared camera are turned off. It can be appreciated that the visible light camera remains on as long as the camera remains open, so that the preview interface can output preview images.
In some implementations, the camera HAL sends a close command to a near infrared camera driver that closes the near infrared camera in response to the close command.
It is understood that the execution order of S22 and S23 is not limited.
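A minimal sketch of S22–S23 is given below, assuming a simple command-passing interface between the HAL and the kernel-layer drivers. The class and method names (`Driver`, `CameraHal`, `finish_capture`) are illustrative assumptions, not part of the patent.

```python
# Illustrative sketch of S22-S23: the camera HAL sends close commands to the
# kernel-layer drivers, which turn the devices off. The order of the two
# close commands is not limited.

class Driver:
    """Stand-in for a kernel-layer device driver."""

    def __init__(self, device_name):
        self.device_name = device_name
        self.device_on = True

    def handle(self, command):
        if command == "close":
            self.device_on = False  # the driver turns the device off


class CameraHal:
    def __init__(self):
        self.nir_flash_driver = Driver("nir_flash")
        self.nir_camera_driver = Driver("nir_camera")

    def finish_capture(self):
        # S22 and S23; their relative order is not limited
        self.nir_flash_driver.handle("close")
        self.nir_camera_driver.handle("close")


hal = CameraHal()
hal.finish_capture()
# Both NIR devices are now off; the visible light camera is not touched here,
# since it stays on to feed the preview interface.
```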
S24, the camera HAL restores the focusing mode of the visible light camera and the near infrared camera.
It is understood that S24 is an optional step: in the case where the determination result of S13 is no and S14 is therefore not performed, S24 is not performed either.
In summary, in the present embodiment there are three stages: before the user performs the photographing operation (i.e., the preview stage), after the near infrared camera is turned on in response to the photographing operation (i.e., the photographing stage), and after the near infrared camera is turned off (i.e., the photographing-completion stage). Assuming the electronic device has a laser auto-focus function, the 3A configuration states of the RGB camera and the near infrared camera in these three stages are as shown in fig. 7a; assuming the electronic device does not have a laser auto-focus function (i.e., the cameras of the electronic device adopt a fixed focal length), the 3A configuration states are as shown in fig. 7b.
In some implementations, the camera HAL configures the restored focusing mode to the cameras through the corresponding drivers of the kernel layer, which is not described in detail here.
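S24 can be sketched as follows, under the assumption that the HAL saved each camera's original focusing mode before forcing a fixed focal length at S14. The function and field names (`restore_focus_modes`, `focus_mode`, `"laser_af"`) are illustrative, not from the patent.

```python
# Hypothetical sketch of S24: after capture, the camera HAL restores the
# focusing mode that was switched to a fixed focal length before capture.
# Cameras whose mode was never changed (saved mode is None, i.e. S14 was
# not performed for them) are left untouched.

def restore_focus_modes(cameras, saved_modes):
    """Restore each camera's pre-capture focusing mode via its driver."""
    for name, camera in cameras.items():
        saved = saved_modes.get(name)
        if saved is not None:  # S14 was performed for this camera
            camera["focus_mode"] = saved  # configured via the kernel driver


cams = {
    "visible": {"focus_mode": "fixed"},
    "nir": {"focus_mode": "fixed"},
}
# Only the visible camera had laser auto-focus before capture in this example.
restore_focus_modes(cams, {"visible": "laser_af", "nir": None})
```

This mirrors the optionality noted above: when S14 was skipped, the saved mode is empty and S24 becomes a no-op.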
S25, the camera, in response to an operation of closing the camera, transmits a close instruction to the camera HAL.
S26, the camera HAL, in response to the close instruction, controls the visible light camera to turn off.
In some implementations, the camera HAL transmits the close instruction to the visible light camera driver, which controls the visible light camera to turn off in response to the close instruction.
It will be appreciated that the image data acquired by a camera is typically processed by both hardware and software modules; S18-S20 in fig. 6 belong to the software-module processing steps.
In some implementations, as shown in fig. 8a, the near infrared raw image is first processed by an ISP chip (hardware-based processing), then sequentially subjected to raw processing (i.e., S18-S20), RGB processing and YUV processing, and finally the YUV processing result is subjected to JPEG processing (e.g., JPEG encoding) to obtain the near infrared image.
In other implementations, a color information fusion step is added to the processing shown in fig. 8a. Specifically, as shown in fig. 8b, the ISP-processed near infrared raw image and RGB raw image are fused before raw processing is performed, so that the obtained near infrared image carries color information and thus has a richer visual effect.
In this embodiment, the specific implementations of RGB processing, YUV processing and JPEG processing are not described in detail.
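The two pipelines of figs. 8a and 8b can be sketched as follows. Only the stage ordering comes from the text; every stage function here is a stand-in that merely tags the frame, and the fusion step is a placeholder.

```python
# Illustrative sketch of the fig. 8a / fig. 8b processing orders. The stage
# functions are stand-ins (they just record which stage ran); the real ISP,
# raw/RGB/YUV processing and JPEG encoding are not specified here.

def isp(frame):         return frame + ["isp"]    # hardware-based processing
def raw_proc(frame):    return frame + ["raw"]    # i.e. S18-S20
def rgb_proc(frame):    return frame + ["rgb"]
def yuv_proc(frame):    return frame + ["yuv"]
def jpeg_encode(frame): return frame + ["jpeg"]


def pipeline_8a(nir_raw):
    """fig. 8a: ISP -> raw -> RGB -> YUV -> JPEG."""
    return jpeg_encode(yuv_proc(rgb_proc(raw_proc(isp(nir_raw)))))


def fuse(nir, rgb):
    return nir + rgb  # placeholder for the color-fusion step


def pipeline_8b(nir_raw, rgb_raw):
    """fig. 8b: fuse the ISP-processed NIR and RGB raw images, then
    continue with the same raw -> RGB -> YUV -> JPEG chain as fig. 8a."""
    fused = fuse(isp(nir_raw), isp(rgb_raw))
    return jpeg_encode(yuv_proc(rgb_proc(raw_proc(fused))))
```

The only structural difference between the two variants is the extra fusion step inserted before raw processing in `pipeline_8b`.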

Claims (14)

1. An image acquisition method, applied to an electronic device, wherein a camera runs on the electronic device, the electronic device is provided with a visible light camera, a visible light flash, a near infrared flash and a near infrared camera, the camera displays a first night scene sub-mode or a second night scene sub-mode within a night scene mode, the first night scene sub-mode and the second night scene sub-mode are shooting modes suitable for a dark light environment, the ambient brightness suitable for the first night scene sub-mode is lower than the ambient brightness suitable for the second night scene sub-mode, in a preview interface of the first night scene sub-mode the visible light flash and a filter function are off and cannot be turned on, and in a preview interface of the second night scene sub-mode the visible light flash and the filter function are off but can be turned on, the method comprising:
in response to a photographing operation on the preview interface of the first night scene sub-mode displayed by the camera, acquiring a current value of the near infrared flash based on a brightness parameter indicating ambient brightness, the brightness parameter being acquired by a light intensity sensor on the electronic device;
lighting the near infrared flash based on the current value;
acquiring image data captured by the near infrared camera with preset parameters in a state where the near infrared flash is lit, wherein the preset parameters include light sensitivity ISO and exposure time, the ISO of the near infrared camera is smaller than the ISO of the visible light camera, and the exposure time of the near infrared camera is shorter than the exposure time of the visible light camera;
and obtaining a near infrared image based on the image data.
2. The method of claim 1, wherein the first night scene sub-mode comprises: a shooting mode selected in a case where the ambient brightness value is less than a first threshold and the visible light flash is off.
3. The method of claim 2, wherein the displaying, by the camera, of the preview interface in the first night scene sub-mode comprises:
after the camera is turned on, in response to the ambient brightness value being less than the first threshold and the visible light flash being turned off, a preview interface in the first night scene sub-mode is displayed in the camera.
4. The method of claim 2, wherein the displaying, by the camera, of the preview interface in the first night scene sub-mode comprises:
after the camera is turned on, in response to the ambient brightness value being less than a second threshold, displaying a preview interface in the second night scene sub-mode in the camera;
in response to the first night scene sub-mode being selected on the preview interface in the second night scene sub-mode, displaying the preview interface in the first night scene sub-mode in the camera.
5. The method of any of claims 1-4, wherein the preview interface of the first night scene sub-mode displays a first control for indicating that the visible light flash is off and a second control for indicating that the filter function is off.
6. The method of claim 5, wherein the first control is further for indicating that it is inoperable and the second control is further for indicating that it is inoperable;
the method further comprises:
closing the near infrared flash in response to an operation on the first control or the second control.
7. The method of claim 5, wherein the method further comprises:
in response to an operation on the first control or the second control, switching the preview interface displayed in the camera.
8. The method according to any one of claims 1-4, further comprising:
in response to a zoom operation on the preview interface of the first night scene sub-mode displayed by the camera, switching the preview interface displayed in the camera.
9. The method according to any one of claims 1 to 4, further comprising, before the acquiring of the image data captured by the near infrared camera with the preset parameters in a state where the near infrared flash is lit:
turning off synchronization of functions between the visible light camera and the near infrared camera, the functions comprising: automatic exposure, automatic focusing and automatic white balance.
10. The method according to any one of claims 1 to 4, further comprising, before the acquiring of the image data captured by the near infrared camera with the preset parameters in a state where the near infrared flash is lit, at least one of:
setting the focusing mode of the visible light camera to a fixed focal length in response to the focusing mode of the visible light camera being laser-based focusing;
setting the focusing mode of the near infrared camera to a fixed focal length in response to the focusing mode of the near infrared camera being laser-based focusing.
11. The method according to any one of claims 1-4, wherein said obtaining a near infrared image based on said image data comprises:
selecting a near infrared original image from the image data;
invoking an image enhancement algorithm to process the near infrared original image to obtain an enhancement processing result;
fusing the enhancement processing result with an image captured by the visible light camera to obtain the near infrared image.
12. The method of claim 11, wherein fusing the enhancement processing result with an image captured by a visible camera to obtain the near infrared image comprises:
fusing the enhancement processing result with the image captured by the visible light camera to obtain the near infrared image in a case where the ambient brightness is not less than a preset threshold.
13. An electronic device, comprising:
one or more processors;
one or more memories;
wherein the one or more memories store one or more programs that, when executed by the one or more processors, cause the electronic device to perform the image acquisition method of any one of claims 1-12.
14. A computer readable storage medium, characterized in that the computer readable storage medium stores therein a computer program, which when executed by a processor causes the processor to perform the image acquisition method according to any one of claims 1 to 12.
CN202211071812.6A 2022-09-02 2022-09-02 Image acquisition method, electronic device and computer readable storage medium Active CN115604572B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211071812.6A CN115604572B (en) 2022-09-02 2022-09-02 Image acquisition method, electronic device and computer readable storage medium


Publications (2)

Publication Number Publication Date
CN115604572A CN115604572A (en) 2023-01-13
CN115604572B true CN115604572B (en) 2023-12-01

Family

ID=84843799

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211071812.6A Active CN115604572B (en) 2022-09-02 2022-09-02 Image acquisition method, electronic device and computer readable storage medium

Country Status (1)

Country Link
CN (1) CN115604572B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117499789A (en) * 2023-12-25 2024-02-02 荣耀终端有限公司 Shooting method and related device
CN117499797A (en) * 2023-12-29 2024-02-02 荣耀终端有限公司 Image processing method and related equipment

Citations (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1743950A (en) * 2004-09-01 2006-03-08 富士胶片株式会社 Lens-fitted photo film unit
CN104954681A (en) * 2015-06-16 2015-09-30 广东欧珀移动通信有限公司 Method for switching off laser focusing mode and terminal
CN105187726A (en) * 2015-06-17 2015-12-23 广州市巽腾信息科技有限公司 Multifunctional mobile image processing device, processing method of utilizing multifunctional mobile image processing device, and use of multifunctional mobile image processing device
CN105827902A (en) * 2015-06-19 2016-08-03 维沃移动通信有限公司 Night scene detection method and terminal
CN106331519A (en) * 2016-10-31 2017-01-11 维沃移动通信有限公司 Method for adjusting light-compensating lamp and mobile terminal
CN106897716A (en) * 2017-04-27 2017-06-27 广东工业大学 A kind of dormitory safety monitoring system and method
CN108429888A (en) * 2018-04-28 2018-08-21 维沃移动通信有限公司 The method and mobile terminal of light filling
CN108737728A (en) * 2018-05-03 2018-11-02 Oppo广东移动通信有限公司 A kind of image capturing method, terminal and computer storage media
CN108848294A (en) * 2018-08-15 2018-11-20 努比亚技术有限公司 A kind of shooting parameter adjustment method, terminal and computer readable storage medium
CN110351490A (en) * 2018-04-04 2019-10-18 杭州海康威视数字技术股份有限公司 Exposure method, device and picture pick-up device
CN110557549A (en) * 2018-05-31 2019-12-10 晨星半导体股份有限公司 Circuit for controlling image capturing equipment and related control method
CN111614892A (en) * 2020-04-28 2020-09-01 深圳英飞拓智能技术有限公司 Face image acquisition method, shooting device and computer-readable storage medium
CN112437235A (en) * 2020-11-11 2021-03-02 Oppo广东移动通信有限公司 Night scene picture generation method and device and mobile terminal
WO2021208789A1 (en) * 2020-04-14 2021-10-21 华为技术有限公司 Photographing method and electronic device
WO2022030367A1 (en) * 2020-08-05 2022-02-10 パナソニックIpマネジメント株式会社 Imaging device
CN114708326A (en) * 2021-12-14 2022-07-05 燕山大学 Full-automatic camera calibration system and method for adaptively adjusting brightness and ambiguity




Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant