CN111093031B - Image generation method and electronic device


Info

Publication number
CN111093031B
CN111093031B (application CN201911418903.0A)
Authority
CN
China
Prior art keywords
image
image sensor
image plane
target object
plane
Prior art date
Legal status
Active
Application number
CN201911418903.0A
Other languages
Chinese (zh)
Other versions
CN111093031A (en)
Inventor
许盟
Current Assignee
Vivo Mobile Communication Co Ltd
Original Assignee
Vivo Mobile Communication Co Ltd
Priority date
Filing date
Publication date
Application filed by Vivo Mobile Communication Co Ltd
Priority to CN201911418903.0A
Publication of CN111093031A
Application granted
Publication of CN111093031B

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60: Control of cameras or camera modules
    • H04N23/67: Focus control based on electronic image sensor signals
    • H04N23/64: Computer-aided capture of images, e.g. transfer from script file into camera, check of taken image quality, advice or proposal for image composition or decision on when to take image

Abstract

The invention provides an image generation method and an electronic device. The method comprises the following steps: when a camera of the electronic device has completed focusing, controlling a first image plane corresponding to a target object to move to the position of an image sensor, wherein the first image plane is an ideal image plane of the target object and coincides with the plane of the image sensor; determining parameter information of a second image plane corresponding to the target object, wherein the second image plane is an actual image plane of the target object and the first image plane is tangent to the second image plane; moving the image sensor at least once according to the parameter information so that the image sensor intersects the second image plane; collecting at least one pixel point from the image sensor; and outputting a target image of the target object according to each pixel point. In this way, there is no need to assemble a lens of complex structure to reduce aberration; the lens and the image sensor are directly combined to perform aberration correction on the image, thereby improving the imaging quality of images captured by the electronic device.

Description

Image generation method and electronic device
Technical Field
The present invention relates to the field of electronic devices, and in particular, to an image generation method and an electronic device.
Background
As the market matures, consumers' demands on the cameras of electronic devices keep increasing.
In the design of lenses for electronic devices, different lens elements are adopted, so the images captured through different lenses exhibit more or less aberration. Aberration refers to the deviation of an actual optical system from the ideal condition of Gaussian optics, that is, the deviation between the result obtained by non-paraxial ray tracing and the result obtained by paraxial ray tracing.
In order to improve imaging quality, some lenses of electronic devices use several aspheric surfaces to reduce the aberration of the captured image; however, even when an aspheric lens is used, aberration may still remain, which affects the quality of the captured image.
Disclosure of Invention
The embodiment of the invention provides an image generation method and electronic equipment, and aims to solve the problem of image aberration in the prior art.
In order to solve the technical problem, the invention is realized as follows:
In a first aspect, an embodiment of the present invention provides an image generation method, where the method includes: when a camera of the electronic device has completed focusing, controlling a first image plane corresponding to a target object to move to the position of an image sensor, wherein the first image plane is an ideal image plane of the target object and the first image plane coincides with the plane of the image sensor; determining parameter information of a second image plane corresponding to the target object, wherein the second image plane is an actual image plane of the target object and the first image plane is tangent to the second image plane; moving the image sensor at least once according to the parameter information so that the image sensor intersects the second image plane; collecting at least one pixel point from the image sensor; and outputting a target image of the target object according to each pixel point, wherein the target image is an aberration-free image.
In a second aspect, an embodiment of the present invention further provides an electronic device, where the electronic device includes: a control module, configured to control a first image plane corresponding to a target object to move to the position of an image sensor when the camera of the electronic device completes focusing, wherein the first image plane is an ideal image plane of the target object and the first image plane coincides with the plane of the image sensor; a determining module, configured to determine parameter information of a second image plane corresponding to the target object, wherein the second image plane is an actual image plane of the target object and the first image plane is tangent to the second image plane; a moving module, configured to move the image sensor at least once according to the parameter information so that the image sensor intersects the second image plane; an acquisition module, configured to collect at least one pixel point from the image sensor; and an output module, configured to output a target image of the target object according to each pixel point, wherein the target image is an aberration-free image.
In a third aspect, an embodiment of the present invention further provides an electronic device, which includes a processor, a memory, and a computer program stored on the memory and executable on the processor, where the computer program, when executed by the processor, implements the steps of the image generation method.
In a fourth aspect, the embodiment of the present invention further provides a computer-readable storage medium, on which a computer program is stored, where the computer program, when executed by a processor, implements the steps of the image generation method.
In the embodiment of the invention, when the camera of the electronic device has completed focusing, the first image plane corresponding to the target object is controlled to move to the position of the image sensor, wherein the first image plane coincides with the plane of the image sensor; parameter information of the second image plane corresponding to the target object is determined, wherein the first image plane is an ideal image plane of the target object, the second image plane is an actual image plane of the target object, and the first image plane is tangent to the second image plane; the image sensor is moved at least once according to the parameter information so that the image sensor intersects the second image plane; at least one pixel point is collected from the image sensor; and a target image of the target object is output according to each pixel point. In this way, there is no need to assemble a lens of complex structure to reduce aberration; instead, the lens and the image sensor are directly combined to perform aberration correction on the image, thereby improving the imaging quality of images captured by the electronic device.
Drawings
FIG. 1 is a flowchart illustrating steps of an image generation method according to a first embodiment of the present invention;
FIG. 2 is a flowchart illustrating steps of an image generation method according to a second embodiment of the present invention;
FIG. 3 is a schematic diagram of generating a first image plane and a second image plane for a target object according to a second embodiment of the present invention;
FIG. 4 is a block diagram of an electronic device according to a third embodiment of the present invention;
FIG. 5 is a block diagram of an electronic device according to a fourth embodiment of the present invention;
FIG. 6 is a schematic diagram of a hardware structure of an electronic device according to a fifth embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Example one
Referring to fig. 1, a flowchart illustrating steps of an image generation method according to a first embodiment of the present invention is shown.
The image generation method provided by the embodiment of the invention comprises the following steps:
step 101: and under the condition that the camera of the electronic equipment finishes focusing, controlling the first image plane corresponding to the target object to move to the position of the image sensor.
The first image plane is an ideal image plane of the target object, and the first image plane coincides with the plane of the image sensor.
The first image plane of the target object is its ideal image plane. In optical imaging, the actual image surface formed for the target object is generally a spherical surface, an aspherical surface, or the like, whereas the first image plane is the ideal image plane given by an ideal optical system. An ideal optical system images point to point (for each point in object space there is one and only one corresponding point in image space), line to line (for each straight line in object space there is one and only one corresponding straight line in image space), and plane to plane (for each plane in object space there is one and only one corresponding plane in image space). The plane occupied by the target object in object space therefore has a unique corresponding image plane in image space, namely the ideal image plane of the target object.
The first image plane corresponding to the target object is moved until it coincides with the plane of the image sensor by using a VCM (voice coil motor) or another displacement device.
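As a purely illustrative aid (not part of the patented method), the paraxial position of this ideal image plane can be estimated with the Gaussian thin-lens relation; the sketch below assumes a simple thin-lens model, and the function name and numbers are hypothetical.

```python
# Illustrative sketch, assuming a thin-lens model: locate the ideal (first)
# image plane with the Gaussian lens equation 1/v - 1/u = 1/f, using the
# convention that the object distance u is negative on the incoming side.

def ideal_image_distance(focal_length_mm: float, object_distance_mm: float) -> float:
    """Return the paraxial (ideal) image distance v behind a thin lens."""
    u = -abs(object_distance_mm)        # object in front of the lens
    f = focal_length_mm
    return 1.0 / (1.0 / f + 1.0 / u)    # 1/v = 1/f + 1/u

# Example: a 5 mm lens focused on an object 500 mm away places the ideal image
# plane about 5.05 mm behind the lens; the VCM would drive the sensor there.
print(f"ideal image plane at {ideal_image_distance(5.0, 500.0):.3f} mm")
```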
Step 102: and determining the parameter information of the target object corresponding to the second image plane.
The second image surface is an actual image surface of the target object, and the first image surface is tangent to the second image surface.
The second image plane of the target object is an actual image plane of the target object and can be a spherical surface, a curved surface, an aspheric surface and the like, and the shape of the second image plane is related to a lens of the camera of the electronic device.
Step 103: and moving the image sensor at least once according to the parameter information so as to enable the image sensor to be intersected with the second image plane.
The image sensor is moved at least once according to the parameter information of the second image plane, so that the image sensor intersects the second image plane at each movement, and the image presented at each of the different positions is sampled; moving the image sensor in this way according to the parameter information realizes aberration correction of the finally captured image.
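As one illustration of such a displacement scheme (an assumption for explanation, not the patent's optimized parameters), the discrete sensor positions could be spread over the sag range of the actual image surface so that every position intersects it at some field height; the parabolic surface model and names below are hypothetical.

```python
# Hedged sketch: pick discrete sensor positions along the optical (Z) axis so
# that each one intersects a curved actual image surface somewhere in the field.
# The sag model z(r) = r**2 / (2*R) is an illustrative assumption.
import numpy as np

def sensor_z_steps(field_radius_mm: float, surface_radius_mm: float, n_steps: int):
    """Return discrete Z offsets (relative to the ideal image plane) that
    together span the sag of the curved image surface."""
    r = np.linspace(0.0, field_radius_mm, 256)
    sag = r**2 / (2.0 * surface_radius_mm)       # surface height versus field
    return np.linspace(sag.min(), sag.max(), n_steps)

# Example: a 3 mm image circle on a surface with a 30 mm radius of curvature,
# sampled at 5 discrete sensor positions during one exposure.
print(sensor_z_steps(3.0, 30.0, 5))
```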
Step 104: and collecting at least one pixel point in the image sensor.
For example, when the second image plane is a curved surface whose curvature differs across different fields of view, the density of the collected pixel points can be increased where the curvature is larger and appropriately reduced where the curvature is smaller.
The field of view represents the maximum range that the camera can observe and is usually expressed as an angle; the larger the field of view, the larger the observable range. The curvature of a curve at a point is the rate of rotation of the tangent direction angle with respect to arc length, defined by differentiation, and indicates the degree to which the curve deviates from a straight line.
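The sketch below shows one way this curvature-weighted sampling could be expressed; the curvature formula kappa = |z''| / (1 + z'^2)^(3/2) is standard, but the surface profile, the sample budget and all names are assumptions rather than anything specified by the patent.

```python
# Hedged sketch: allocate more pixel samples to field zones where the actual
# image surface is more strongly curved. Profile and names are illustrative.
import numpy as np

def sampling_weights(r: np.ndarray, z_of_r) -> np.ndarray:
    """Weight each field position by the local curvature of the profile z(r),
    kappa = |z''| / (1 + z'**2)**1.5, normalised so the weights sum to 1."""
    z = z_of_r(r)
    dz = np.gradient(z, r)
    d2z = np.gradient(dz, r)
    kappa = np.abs(d2z) / (1.0 + dz**2) ** 1.5
    return kappa / kappa.sum()

r = np.linspace(1e-3, 3.0, 100)                           # field heights in mm
weights = sampling_weights(r, lambda r: r**2 / 60.0 + 0.002 * r**4)
samples_per_zone = np.round(1000 * weights).astype(int)   # spread 1000 samples
```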
Step 105: and outputting a target image of the target object according to each pixel point.
Wherein the target image is an aberration-free image.
The pixel points acquired over the multiple movements are fused to generate the target image, so that the aberration of the output target image is effectively corrected.
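A minimal sketch of such fusion is given below, assuming each collected pixel point is recorded as a (row, column, value) sample and that samples landing on the same output pixel are simply averaged; the data layout and function name are hypothetical, not taken from the patent.

```python
# Hedged sketch: fuse pixel points gathered at different sensor positions into
# one target image by averaging samples that fall on the same output pixel.
import numpy as np

def fuse_samples(height: int, width: int, samples):
    """samples: iterable of (row, col, value) tuples collected across all
    sensor positions during the exposure. Returns the fused target image."""
    target = np.zeros((height, width), dtype=np.float32)
    counts = np.zeros((height, width), dtype=np.int32)
    for row, col, value in samples:
        target[row, col] += value
        counts[row, col] += 1
    counts[counts == 0] = 1              # leave never-sampled pixels at zero
    return target / counts

fused = fuse_samples(4, 4, [(0, 0, 10.0), (0, 0, 12.0), (3, 3, 7.0)])
```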
In the embodiment of the invention, when the camera of the electronic device has completed focusing, the first image plane corresponding to the target object is controlled to move to the position of the image sensor, wherein the first image plane coincides with the plane of the image sensor; parameter information of the second image plane corresponding to the target object is determined, wherein the first image plane is an ideal image plane of the target object, the second image plane is an actual image plane of the target object, and the first image plane is tangent to the second image plane; the image sensor is moved at least once according to the parameter information so that the image sensor intersects the second image plane; at least one pixel point is collected from the image sensor; and a target image of the target object is output according to each pixel point. In this way, there is no need to assemble a lens of complex structure to reduce aberration; instead, the lens and the image sensor are directly combined to perform aberration correction on the image, thereby improving the imaging quality of images captured by the electronic device.
Example two
Referring to fig. 2, a flowchart illustrating steps of an image generation method according to a second embodiment of the present invention is shown.
The image generation method provided by the embodiment of the invention comprises the following steps:
step 201: and under the condition that the camera of the electronic equipment finishes focusing, controlling the first image plane corresponding to the target object to move to the position of the image sensor.
The first image plane of the target object is its ideal image plane. In optical imaging, the actual image surface formed for the target object is generally a spherical surface, an aspherical surface, or the like, whereas the first image plane is the ideal image plane given by an ideal optical system. An ideal optical system images point to point (for each point in object space there is one and only one corresponding point in image space), line to line (for each straight line in object space there is one and only one corresponding straight line in image space), and plane to plane (for each plane in object space there is one and only one corresponding plane in image space). The plane occupied by the target object in object space therefore has a unique corresponding image plane in image space, namely the ideal image plane of the target object.
The first image plane corresponding to the target object is moved until it coincides with the plane of the image sensor by using a VCM (voice coil motor) or another displacement device.
The first image plane is an ideal image plane of the target object, and the first image plane coincides with the plane of the image sensor.
Step 202: determining position information of the first image plane and acquiring lens information of a camera of the electronic equipment.
Based on a detection technique, the position information of the first image plane and the lens information of the camera of the electronic device are detected.
Step 203: and determining the shape information of the second image surface according to the position information and the lens information.
Based on the position information and the lens information, the shape information of the second image plane can be obtained; for example, the shape information may indicate that the second image plane is a curved surface, a spherical surface, an aspherical surface, or the like.
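As a deliberately simplified illustration of deriving shape information from lens information, the actual image surface could be approximated by a Petzval-like sphere computed from the focal length and refractive index; the single-element Petzval relation 1/R_p = -1/(n*f) and every name below are assumptions, not the patent's procedure.

```python
# Illustrative assumption: model the second (actual) image surface as a
# Petzval-like sphere and express its sag relative to the ideal image plane.

def second_surface_sag(field_height_mm: float, focal_length_mm: float,
                       refractive_index: float = 1.5) -> float:
    """Approximate axial departure of the actual image surface from the ideal
    image plane at the given field height (negative means toward the lens)."""
    petzval_radius = -refractive_index * focal_length_mm   # 1/R_p = -1/(n*f)
    return field_height_mm**2 / (2.0 * petzval_radius)     # paraxial sag

# Example: at 2 mm field height with a 5 mm, n = 1.5 singlet, the actual
# surface sits roughly 0.27 mm on the lens side of the ideal image plane.
print(second_surface_sag(2.0, 5.0))
```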
Step 204: the image sensor is moved by discrete step values in accordance with the shape information so that the image sensor intersects the second image plane.
The image sensor is moved in the Z-axis direction of the second image plane, where the discrete step values include the moving distance and the moving angle of the image sensor, and the parameter information includes the shape information of the second image plane.
Fig. 3 is a schematic diagram of the first image plane and the second image plane generated for a target object; in the figure, 1 indicates the first image plane, 2 indicates the second image plane, 3 indicates the image sensor, and 4 indicates the lens.
When the second image plane of the target object is a curved surface, in order to realize the scheme of correcting aberration by taking the position of the image sensor as a parameter, the image sensor is mounted on a base that can make limited translations along the x, y and z directions and limited rotations about the x, y and z axes. While exposing, the translation and rotation of the base are used to adjust the position and angle of the image sensor appropriately: taking the coordinate position (x, y, z) of the base and its included angles (θx, θy, θz) with the x, y and z axes as parameters, the relative position of the image sensor and the second image plane is controlled, and the field-curvature-compensated image on the second image plane is sampled to improve the imaging quality.
Theoretically, a base that moves freely with six degrees of freedom can conform to a curved surface of any shape, so the image sensor can adapt to a second image plane of any shape and correct the corresponding aberrations, such as field curvature. The displacement positions of the image sensor are given by optimized parameters obtained during the design of the optical system; a specific displacement scheme needs to be combined with the exposure and sampling timing of the system, and the image on the second image plane is sampled at different discrete positions. Since the curvature of the image surface differs at different fields of view, the density of sampling positions can be increased where the curvature is larger and appropriately reduced where it is smaller. In general, one exposure requires sampling at the above positions within the exposure time in order to correct the aberration.
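The sketch below illustrates, under a rotationally symmetric surface model that is only an assumption, how a pose (x, y, z, θx, θy, θz) for such a six-degree-of-freedom base might be chosen so that the sensor is locally tangent to the curved image surface; the small-angle tilt formulas and all names are illustrative, not the patent's optimization.

```python
# Hedged sketch: place the sensor at the local height of an assumed surface
# z = sag(r) and tilt it toward the local surface normal (small-angle form).
import numpy as np

def stage_pose(x_mm: float, y_mm: float, sag, eps: float = 1e-4):
    """Return (x, y, z, theta_x, theta_y, theta_z) in mm and radians so that
    the sensor is approximately tangent to z = sag(r) at field point (x, y)."""
    r = np.hypot(x_mm, y_mm)
    z = sag(r)
    slope = (sag(r + eps) - sag(r - eps)) / (2.0 * eps)   # dz/dr at this zone
    theta_x = -slope * (y_mm / r) if r > 0 else 0.0       # tilt about x-axis
    theta_y = slope * (x_mm / r) if r > 0 else 0.0        # tilt about y-axis
    return (x_mm, y_mm, z, theta_x, theta_y, 0.0)

pose = stage_pose(1.0, 0.5, lambda r: r**2 / 60.0)        # example pose
```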
Step 205: and determining the coordinate position of each pixel intersected by the image sensor and the second image surface.
Step 206: and collecting pixel points at least one coordinate position.
For example, when the second image plane is a curved surface whose curvature differs across different fields of view, the number of pixel points collected can be increased where the curvature is larger and appropriately reduced where the curvature is smaller.
Step 207: and converting the optical signals of the pixel points into electric signals for storage.
The optical signals of the pixel points are converted into electric signals and stored, so that a target image can be conveniently generated subsequently.
Step 208: and outputting a target image of the target object according to each pixel point.
Wherein the target image is an aberration-free image.
The pixel points acquired over the multiple movements are fused and the target image is output, so that the aberration in the generated target image is effectively corrected.
In the embodiment of the invention, when the camera of the electronic device has completed focusing, the first image plane corresponding to the target object is controlled to move to the position of the image sensor, wherein the first image plane coincides with the plane of the image sensor; parameter information of the second image plane corresponding to the target object is determined, wherein the first image plane is an ideal image plane of the target object, the second image plane is an actual image plane of the target object, and the first image plane is tangent to the second image plane; the image sensor is moved at least once according to the parameter information so that the image sensor intersects the second image plane; at least one pixel point is collected from the image sensor; and a target image of the target object is output according to each pixel point. In this way, there is no need to assemble a lens of complex structure to reduce aberration; instead, the lens and the image sensor are directly combined to perform aberration correction on the image, thereby improving the imaging quality of images captured by the electronic device.
Example three
Referring to fig. 4, a block diagram of an electronic device according to a third embodiment of the present invention is shown.
The electronic device provided by the embodiment of the invention comprises: a control module 301, configured to control a first image plane corresponding to a target object to move to the position of an image sensor when the camera of the electronic device completes focusing, wherein the first image plane is an ideal image plane of the target object and the first image plane coincides with the plane of the image sensor; a determining module 302, configured to determine parameter information of a second image plane corresponding to the target object, wherein the second image plane is an actual image plane of the target object and the first image plane is tangent to the second image plane; a moving module 303, configured to move the image sensor at least once according to the parameter information so that the image sensor intersects the second image plane; a collecting module 304, configured to collect at least one pixel point from the image sensor; and an output module 305, configured to output a target image of the target object according to each of the pixel points, wherein the target image is an aberration-free image.
In the embodiment of the invention, when the camera of the electronic device has completed focusing, the first image plane corresponding to the target object is controlled to move to the position of the image sensor, wherein the first image plane coincides with the plane of the image sensor; parameter information of the second image plane corresponding to the target object is determined, wherein the first image plane is an ideal image plane of the target object, the second image plane is an actual image plane of the target object, and the first image plane is tangent to the second image plane; the image sensor is moved at least once according to the parameter information so that the image sensor intersects the second image plane; at least one pixel point is collected from the image sensor; and a target image of the target object is output according to each pixel point. In this way, there is no need to assemble a lens of complex structure to reduce aberration; instead, the lens and the image sensor are directly combined to perform aberration correction on the image, thereby improving the imaging quality of images captured by the electronic device.
Example four
Referring to fig. 5, a block diagram of an electronic device according to a fourth embodiment of the present invention is shown.
The electronic device provided by the embodiment of the invention comprises: a control module 401, configured to control a first image plane corresponding to a target object to move to the position of an image sensor when the camera of the electronic device completes focusing, wherein the first image plane is an ideal image plane of the target object and the first image plane coincides with the plane of the image sensor; a determining module 402, configured to determine parameter information of a second image plane corresponding to the target object, wherein the second image plane is an actual image plane of the target object and the first image plane is tangent to the second image plane; a moving module 403, configured to move the image sensor at least once according to the parameter information so that the image sensor intersects the second image plane; an acquisition module 404, configured to collect at least one pixel point from the image sensor; and an output module 405, configured to output a target image of the target object according to each of the pixel points, wherein the target image is an aberration-free image.
Preferably, the determining module 402 comprises: the first determining submodule 4021 is configured to determine position information of the first image plane and acquire lens information of a camera of the electronic device; the second determining submodule 4022 is configured to determine parameter information of the second image plane according to the position information and the lens information.
Preferably, the parameter information includes the shape information of the second image plane, and the moving module 403 is specifically configured to: move the image sensor by discrete step values according to the shape information, so that the image sensor intersects the second image plane, wherein the image sensor is moved in the Z-axis direction of the second image plane, and the discrete step values include the moving distance and the moving angle of the image sensor.
Preferably, the acquisition module 404 includes: a third determining submodule 4041, configured to calculate the coordinate position of each pixel at which the image sensor intersects the second image plane; and a collecting submodule 4042, configured to collect pixel points at at least one of the coordinate positions.
Preferably, the electronic device further includes: a storage module 406, configured to convert the optical signal of each pixel into an electrical signal for storage after the acquisition submodule 4042 acquires N pixel points at each coordinate position.
The electronic device provided in the embodiment of the present invention can implement each process implemented by the electronic device in the method embodiments of fig. 1 to fig. 2, and is not described herein again to avoid repetition.
In the embodiment of the invention, when the camera of the electronic device has completed focusing, the first image plane corresponding to the target object is controlled to move to the position of the image sensor, wherein the first image plane coincides with the plane of the image sensor; parameter information of the second image plane corresponding to the target object is determined, wherein the first image plane is an ideal image plane of the target object, the second image plane is an actual image plane of the target object, and the first image plane is tangent to the second image plane; the image sensor is moved at least once according to the parameter information so that the image sensor intersects the second image plane; at least one pixel point is collected from the image sensor; and a target image of the target object is output according to each pixel point. In this way, there is no need to assemble a lens of complex structure to reduce aberration; instead, the lens and the image sensor are directly combined to perform aberration correction on the image, thereby improving the imaging quality of images captured by the electronic device.
Example five
Referring to fig. 6, a hardware structure diagram of an electronic device for implementing various embodiments of the present invention is shown.
The electronic device 500 includes, but is not limited to: a radio frequency unit 501, a network module 502, an audio output unit 503, an input unit 504, a sensor 505, a display unit 506, a user input unit 507, an interface unit 508, a memory 509, a processor 510, and a power supply 511. Those skilled in the art will appreciate that the electronic device configuration shown in fig. 6 does not constitute a limitation of the electronic device, and that the electronic device may include more or fewer components than shown, or some components may be combined, or a different arrangement of components. In the embodiment of the present invention, the electronic device includes, but is not limited to, a mobile phone, a tablet computer, a notebook computer, a palm computer, a vehicle-mounted terminal, a wearable device, a pedometer, and the like.
A processor 510, configured to: control a first image plane corresponding to a target object to move to the position of an image sensor when the camera of the electronic device completes focusing, wherein the first image plane is an ideal image plane of the target object and the first image plane coincides with the plane of the image sensor; determine parameter information of a second image plane corresponding to the target object, wherein the second image plane is an actual image plane of the target object and the first image plane is tangent to the second image plane; move the image sensor at least once according to the parameter information so that the image sensor intersects the second image plane; collect at least one pixel point from the image sensor; and output a target image of the target object according to each pixel point, wherein the target image is an aberration-free image.
In the embodiment of the invention, when the camera of the electronic device has completed focusing, the first image plane corresponding to the target object is controlled to move to the position of the image sensor, wherein the first image plane coincides with the plane of the image sensor; parameter information of the second image plane corresponding to the target object is determined, wherein the first image plane is an ideal image plane of the target object, the second image plane is an actual image plane of the target object, and the first image plane is tangent to the second image plane; the image sensor is moved at least once according to the parameter information so that the image sensor intersects the second image plane; at least one pixel point is collected from the image sensor; and a target image of the target object is output according to each pixel point. In this way, there is no need to assemble a lens of complex structure to reduce aberration; instead, the lens and the image sensor are directly combined to perform aberration correction on the image, thereby improving the imaging quality of images captured by the electronic device.
It should be understood that, in the embodiment of the present invention, the radio frequency unit 501 may be used for receiving and sending signals during a messaging or call process; specifically, it receives downlink data from a base station and forwards it to the processor 510 for processing, and transmits uplink data to the base station. In general, the radio frequency unit 501 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low-noise amplifier, a duplexer, and the like. In addition, the radio frequency unit 501 can also communicate with a network and other devices through a wireless communication system.
The electronic device provides wireless broadband internet access to the user via the network module 502, such as assisting the user in sending and receiving e-mails, browsing web pages, and accessing streaming media.
The audio output unit 503 may convert audio data received by the radio frequency unit 501 or the network module 502 or stored in the memory 509 into an audio signal and output as sound. Also, the audio output unit 503 may also provide audio output related to a specific function performed by the electronic apparatus 500 (e.g., a call signal reception sound, a message reception sound, etc.). The audio output unit 503 includes a speaker, a buzzer, a receiver, and the like.
The input unit 504 is used to receive audio or video signals. The input unit 504 may include a Graphics Processing Unit (GPU) 5041 and a microphone 5042. The graphics processor 5041 processes image data of still pictures or video obtained by an image capturing device (e.g., a camera) in a video capturing mode or an image capturing mode. The processed image frames may be displayed on the display unit 506. The image frames processed by the graphics processor 5041 may be stored in the memory 509 (or other storage medium) or transmitted via the radio frequency unit 501 or the network module 502. The microphone 5042 may receive sounds and can process them into audio data. In the phone call mode, the processed audio data may be converted into a format that can be transmitted to a mobile communication base station via the radio frequency unit 501, and then output.
The electronic device 500 also includes at least one sensor 505, such as light sensors, motion sensors, and other sensors. Specifically, the light sensor includes an ambient light sensor that can adjust the brightness of the display panel 5061 according to the brightness of ambient light, and a proximity sensor that can turn off the display panel 5061 and/or a backlight when the electronic device 500 is moved to the ear. As one type of motion sensor, an accelerometer sensor can detect the magnitude of acceleration in each direction (generally three axes), detect the magnitude and direction of gravity when stationary, and can be used to identify the posture of an electronic device (such as horizontal and vertical screen switching, related games, magnetometer posture calibration), and vibration identification related functions (such as pedometer, tapping); the sensors 505 may also include fingerprint sensors, pressure sensors, iris sensors, molecular sensors, gyroscopes, barometers, hygrometers, thermometers, infrared sensors, etc., which are not described in detail herein.
The display unit 506 is used to display information input by the user or information provided to the user. The Display unit 506 may include a Display panel 5061, and the Display panel 5061 may be configured in the form of a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED), or the like.
The user input unit 507 may be used to receive input numeric or character information and generate key signal inputs related to user settings and function control of the electronic device. Specifically, the user input unit 507 includes a touch panel 5071 and other input devices 5072. Touch panel 5071, also referred to as a touch screen, may collect touch operations by a user on or near it (e.g., operations by a user on or near touch panel 5071 using a finger, stylus, or any suitable object or attachment). The touch panel 5071 may include two parts of a touch detection device and a touch controller. The touch detection device detects the touch direction of a user, detects a signal brought by touch operation and transmits the signal to the touch controller; the touch controller receives touch information from the touch sensing device, converts the touch information into touch point coordinates, sends the touch point coordinates to the processor 510, and receives and executes commands sent by the processor 510. In addition, the touch panel 5071 may be implemented in various types such as a resistive type, a capacitive type, an infrared ray, and a surface acoustic wave. In addition to the touch panel 5071, the user input unit 507 may include other input devices 5072. In particular, other input devices 5072 may include, but are not limited to, a physical keyboard, function keys (e.g., volume control keys, switch keys, etc.), a trackball, a mouse, and a joystick, which are not described in detail herein.
Further, the touch panel 5071 may be overlaid on the display panel 5061, and when the touch panel 5071 detects a touch operation thereon or nearby, the touch operation is transmitted to the processor 510 to determine the type of the touch event, and then the processor 510 provides a corresponding visual output on the display panel 5061 according to the type of the touch event. Although in fig. 6, the touch panel 5071 and the display panel 5061 are two independent components to implement the input and output functions of the electronic device, in some embodiments, the touch panel 5071 and the display panel 5061 may be integrated to implement the input and output functions of the electronic device, and is not limited herein.
The interface unit 508 is an interface for connecting an external device to the electronic apparatus 500. For example, the external device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like. The interface unit 508 may be used to receive input (e.g., data information, power, etc.) from external devices and transmit the received input to one or more elements within the electronic apparatus 500 or may be used to transmit data between the electronic apparatus 500 and external devices.
The memory 509 may be used to store software programs as well as various data. The memory 509 may mainly include a storage program area and a storage data area, wherein the storage program area may store an operating system, an application program required by at least one function (such as a sound playing function, an image playing function, etc.), and the like; the storage data area may store data (such as audio data, a phonebook, etc.) created according to the use of the cellular phone, and the like. Further, the memory 509 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other volatile solid-state storage device.
The processor 510 is a control center of the electronic device, connects various parts of the whole electronic device by using various interfaces and lines, performs various functions of the electronic device and processes data by running or executing software programs and/or modules stored in the memory 509 and calling data stored in the memory 509, thereby performing overall monitoring of the electronic device. Processor 510 may include one or more processing units; preferably, the processor 510 may integrate an application processor, which mainly handles operating systems, user interfaces, application programs, etc., and a modem processor, which mainly handles wireless communications. It will be appreciated that the modem processor described above may not be integrated into processor 510.
The electronic device 500 may further include a power supply 511 (e.g., a battery) for supplying power to various components, and preferably, the power supply 511 may be logically connected to the processor 510 via a power management system, so as to implement functions of managing charging, discharging, and power consumption via the power management system.
In addition, the electronic device 500 includes some functional modules that are not shown, and are not described in detail herein.
Preferably, an embodiment of the present invention further provides an electronic device, which includes a processor 510, a memory 509, and a computer program that is stored in the memory 509 and can be run on the processor 510, and when the computer program is executed by the processor 510, the processes of the above-mentioned embodiment of the image generation method are implemented, and the same technical effect can be achieved, and in order to avoid repetition, details are not described here again.
The embodiment of the present invention further provides a computer-readable storage medium, where a computer program is stored on the computer-readable storage medium, and when the computer program is executed by a processor, the computer program implements each process of the embodiment of the image generation method, and can achieve the same technical effect, and in order to avoid repetition, details are not repeated here. The computer-readable storage medium may be a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solutions of the present invention may be embodied in the form of a software product, which is stored in a storage medium (such as ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal (such as a mobile phone, a computer, a server, an air conditioner, or a network device) to execute the method according to the embodiments of the present invention.
While the present invention has been described with reference to the embodiments shown in the drawings, the present invention is not limited to the embodiments, which are illustrative and not restrictive, and it will be apparent to those skilled in the art that various changes and modifications can be made therein without departing from the spirit and scope of the invention as defined in the appended claims.

Claims (12)

1. An image generation method applied to an electronic device, the method comprising:
under the condition that a camera of the electronic device finishes focusing, controlling a first image plane corresponding to a target object to move to the position of an image sensor, wherein the first image plane is an ideal image plane of the target object, and the first image plane coincides with the plane of the image sensor;
determining parameter information of a second image plane corresponding to the target object, wherein the second image plane is an actual image plane of the target object, and the first image plane is tangent to the second image plane;
moving the image sensor multiple times according to the parameter information, so that the image sensor intersects the second image plane each time the image sensor is moved, and sampling the images presented at the different positions;
each time the image sensor is moved, collecting at least one pixel point from the image sensor;
and according to each of the pixel points, fusing the pixel points and outputting a target image of the target object, wherein the target image is an aberration-free image.
2. The method according to claim 1, wherein the step of determining the parameter information of the second image plane corresponding to the target object comprises:
determining position information of the first image plane, an
Acquiring lens information of a camera of the electronic equipment;
and determining parameter information of the second image plane according to the position information and the lens information.
3. The method of claim 1, wherein the parameter information includes shape information of the second image plane, and wherein the step of moving the image sensor a plurality of times in accordance with the parameter information so that each movement of the image sensor intersects the second image plane comprises:
and according to the shape information, moving the image sensor according to discrete step values so as to enable the image sensor to be intersected with the second image surface, wherein the image sensor is moved in the Z-axis direction of the second image surface, and the discrete step values comprise the moving distance and the moving angle of the image sensor.
4. The method of claim 1, wherein said step of capturing at least one pixel in said image sensor each time said image sensor is moved comprises:
determining the coordinate position of each pixel intersected by the image sensor and the second image plane;
and collecting pixel points at at least one of the coordinate positions.
5. The method of claim 4, wherein after the step of collecting pixel points at at least one of the coordinate positions, the method further comprises:
and converting the optical signals of the pixel points into electric signals for storage.
6. An electronic device, characterized in that the electronic device comprises:
the control module is used for controlling a first image plane corresponding to a target object to move to the position of the image sensor under the condition that the camera of the electronic device finishes focusing, wherein the first image plane is an ideal image plane of the target object, and the first image plane coincides with the plane of the image sensor;
the determining module is used for determining parameter information of a second image plane corresponding to the target object, wherein the second image plane is an actual image plane of the target object, and the first image plane is tangent to the second image plane;
the moving module is used for moving the image sensor multiple times according to the parameter information, so that the image sensor intersects the second image plane each time the image sensor is moved, and sampling the images presented at the different positions;
the acquisition module is used for acquiring at least one pixel point in the image sensor every time the image sensor is moved;
and the output module is used for, according to each of the pixel points, fusing the pixel points and outputting a target image of the target object, wherein the target image is an aberration-free image.
7. The electronic device of claim 6, wherein the determining module comprises:
the first determining submodule is used for determining the position information of the first image plane and acquiring lens information of a camera of the electronic equipment;
and the second determining submodule is used for determining the parameter information of the second image plane according to the position information and the lens information.
8. The electronic device of claim 6, wherein the parameter information includes shape information of the second image plane, and the moving module is specifically configured to:
move the image sensor by discrete step values according to the shape information, so that the image sensor intersects the second image plane, wherein the image sensor is moved in the Z-axis direction of the second image plane, and the discrete step values comprise the moving distance and the moving angle of the image sensor.
9. The electronic device of claim 6, wherein the acquisition module comprises:
the third determining submodule is used for calculating the coordinate position of each pixel intersected by the image sensor and the second image plane;
and the acquisition submodule is used for acquiring at least one pixel point at the coordinate position.
10. The electronic device of claim 9, further comprising:
and the storage module is used for converting the optical signals of the pixel points into electrical signals for storage after the acquisition submodule acquires the pixel points at at least one of the coordinate positions.
11. An electronic device, comprising a processor, a memory and a computer program stored on the memory and executable on the processor, the computer program, when executed by the processor, implementing the steps of the image generation method according to any one of claims 1 to 5.
12. A computer-readable storage medium, on which a computer program is stored which, when being executed by a processor, carries out the steps of the image generation method according to any one of claims 1 to 5.
CN201911418903.0A 2019-12-31 2019-12-31 Image generation method and electronic device Active CN111093031B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911418903.0A CN111093031B (en) 2019-12-31 2019-12-31 Image generation method and electronic device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911418903.0A CN111093031B (en) 2019-12-31 2019-12-31 Image generation method and electronic device

Publications (2)

Publication Number Publication Date
CN111093031A CN111093031A (en) 2020-05-01
CN111093031B true CN111093031B (en) 2021-06-15

Family

ID=70398229

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911418903.0A Active CN111093031B (en) 2019-12-31 2019-12-31 Image generation method and electronic device

Country Status (1)

Country Link
CN (1) CN111093031B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112637474B (en) * 2020-12-30 2022-04-12 维沃移动通信有限公司 Camera module and electronic equipment

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5398064A (en) * 1990-05-14 1995-03-14 Minolta Co., Ltd. Camera system with automatic variator compensation in a zooming mode of operation
JP2004118972A (en) * 2002-09-27 2004-04-15 Sharp Corp Optical pickup device
CN101183175A (en) * 2006-11-13 2008-05-21 华晶科技股份有限公司 Optical aberration correcting system and method of digital cameras
CN203164571U (en) * 2011-10-24 2013-08-28 奥林巴斯映像株式会社 Photographic device and optical device
CN104113703A (en) * 2013-04-15 2014-10-22 佳能株式会社 Image processing apparatus and image processing method
CN106973203A (en) * 2017-04-21 2017-07-21 华为机器有限公司 Camera module
CN108600623A (en) * 2018-04-20 2018-09-28 维沃移动通信有限公司 Refocusing display methods and terminal device

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050013021A1 (en) * 2003-06-10 2005-01-20 Olympus Corporation Decentered optical system, light transmitting device, light receiving device, and optical system
US7369320B2 (en) * 2005-12-30 2008-05-06 3M Innovative Properties Company Projection system with beam homogenizer
CN102519593B (en) * 2011-12-27 2013-08-21 中国科学院长春光学精密机械与物理研究所 Optical system for super-wide-angle short wave infrared push-broom hyperspectral imager

Also Published As

Publication number Publication date
CN111093031A (en) 2020-05-01


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant