WO2018076938A1 - 图像处理装置及方法和计算机存储介质 - Google Patents

图像处理装置及方法和计算机存储介质 Download PDF

Info

Publication number
WO2018076938A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
size
cropping
unit
images
Prior art date
Application number
PCT/CN2017/100949
Other languages
English (en)
French (fr)
Inventor
魏宇虹
Original Assignee
努比亚技术有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 努比亚技术有限公司 filed Critical 努比亚技术有限公司
Publication of WO2018076938A1 publication Critical patent/WO2018076938A1/zh

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/62Control of parameters via user interfaces
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/67Focus control based on electronic image sensor signals

Definitions

  • the present invention relates to image processing technologies in the field of information technology, and in particular, to an image processing apparatus and method and a computer storage medium.
  • In the related art, zooming is usually performed during image capture.
  • Zooming involves the user operating the capture device, which may cause the electronic device to shake; such shake can produce blur phenomena such as ghosting in the captured image, so the quality of the captured image is degraded.
  • embodiments of the present invention are directed to providing an image processing apparatus and method and a computer storage medium that at least partially solve the problem of poor image quality.
  • a first aspect of the embodiments of the present invention provides an image processing apparatus, including:
  • a zoom unit configured to respond to a zoom operation and control the acquisition module to perform digital zoom
  • the acquiring unit is configured to collect the first image of the first size by using the acquiring module after zooming;
  • a cropping unit configured to crop the first image to obtain a second image of a second size, wherein the second size is smaller than the first size;
  • a display unit configured to display the second image.
  • a second aspect of the embodiments of the present invention provides an image processing method, including:
  • controlling the acquisition module to perform digital zoom in response to a zoom operation;
  • collecting, by the zoomed acquisition module, a first image of a first size;
  • cropping the first image to obtain a second image of a second size, wherein the second size is smaller than the first size;
  • displaying the second image.
  • a third aspect of the embodiments of the present invention provides an image processing apparatus, including:
  • An image collector configured to acquire an image
  • a memory configured to store information including at least an image acquired by the image collector and a computer program
  • a processor coupled to the image collector, the memory, and the display, respectively, configured to control the image processing apparatus to perform at least the following steps by executing the computer program:
  • controlling the acquisition module to perform digital zoom in response to a zoom operation;
  • collecting, by the zoomed acquisition module, a first image of a first size;
  • cropping the first image to obtain a second image of a second size, wherein the second size is smaller than the first size;
  • displaying the second image.
  • a fourth aspect of the embodiments of the present invention provides an image processing apparatus, including:
  • An image collector configured to acquire an image
  • a memory configured to store information including at least an image acquired by the image collector
  • a processor connected to the image collector, the memory, and the display, respectively, and configured to implement one or more of the image processing methods described above by executing the computer program.
  • a fifth aspect of the embodiments of the present invention is a computer storage medium, wherein the computer storage medium stores a computer program for executing the one or more image processing methods.
  • In the image processing apparatus and method and the computer storage medium provided by the embodiments of the present invention, in order to prevent zoom-induced shake from degrading the quality (for example, the sharpness) of the image to be output, a first image of a relatively large first size is collected, and a second image of higher quality than the first image is then obtained from it by cropping, thereby improving the quality of the output image.
  • The image processing apparatus does not introduce additional optical anti-shake structures to improve image quality, and is therefore low-cost and simple to implement.
  • FIG. 1 is a schematic structural diagram of a camera according to an embodiment of the present invention.
  • FIG. 2 is a schematic flowchart of an image processing method according to an embodiment of the present invention.
  • FIG. 3A is a schematic diagram of a first cropping according to an embodiment of the present invention.
  • FIG. 3B is a schematic diagram of a second cutting according to an embodiment of the present invention.
  • FIG. 4 is a schematic structural diagram of an image collection device according to an embodiment of the present invention.
  • FIG. 5 is a schematic structural diagram of a mobile terminal according to an embodiment of the present disclosure.
  • FIG. 6 is a schematic structural diagram of a communication system according to an embodiment of the present invention.
  • Embodiments of the present invention first provide an image acquisition structure, which may be a camera.
  • Figure 1 is a block diagram of the electrical structure of the camera.
  • the photographic lens 1211 is composed of a plurality of optical lenses for forming a subject image, and is a single focus lens or a zoom lens.
  • The photographic lens 1211 is movable in the optical axis direction under the control of the lens driver 1221. The lens driver 1221 controls the focus position of the photographic lens 1211 in accordance with a control signal from the lens drive control circuit 1222 and, in the case of a zoom lens, can also control the focal distance.
  • the lens drive control circuit 1222 performs drive control of the lens driver 1221 in accordance with a control command from the microcomputer 1217.
  • An imaging element 1212 is disposed on the optical axis of the photographic lens 1211 near the position of the subject image formed by the photographic lens 1211.
  • the imaging element 1212 is for capturing an image of a subject and acquiring captured image data.
  • Photodiodes constituting each pixel are arranged two-dimensionally and in a matrix on the imaging element 1212. Each photodiode generates a photoelectric conversion current corresponding to the amount of received light, and the photoelectric conversion current is charged by a capacitor connected to each photodiode.
  • the front surface of each pixel is provided with a Bayer array of RGB color filters.
  • the imaging element 1212 is connected to the imaging circuit 1213.
  • The imaging circuit 1213 performs charge accumulation control and image signal readout control in the imaging element 1212, reduces the reset noise of the read analog image signal, performs waveform shaping, and then performs gain adjustment or the like to obtain an appropriate signal level.
  • the imaging circuit 1213 is connected to an A/D converter 1214 that performs analog-to-digital conversion on the analog image signal and outputs a digital image signal (hereinafter referred to as image data) to the bus 1227.
  • the bus 1227 is a transmission path for transmitting various data read or generated inside the camera.
  • the A/D converter 1214 is connected to the bus 1227, and an image processor 1215, a JPEG processor 1216, a microcomputer 1217, a SDRAM (Synchronous Dynamic Random Access Memory) 1218, and a memory interface are also connected. (hereinafter referred to as memory I/F) 1219, LCD (Liquid Crystal Display) driver 1220.
  • The image processor 1215 performs various image processing on image data based on the output of the imaging element 1212, such as OB subtraction processing, white balance adjustment, color matrix calculation, gamma conversion, color difference signal processing, noise removal processing, simultaneous processing, and edge processing.
  • the JPEG processor 1216 compresses the image data read out from the SDRAM 1218 in accordance with the JPEG compression method when the image data is recorded on the recording medium 1225. Further, the JPEG processor 1216 performs decompression of JPEG image data for image reproduction display.
  • A file recorded on the recording medium 1225 is read, decompression processing is performed in the JPEG processor 1216, and the decompressed image data is temporarily stored in the SDRAM 1218 and displayed on the LCD 1226.
  • the JPEG method is adopted as the image compression/decompression method.
  • the compression/decompression method is not limited thereto, and other compression/decompression methods such as MPEG, TIFF, and H.264 may be used.
  • the microcomputer 1217 functions as a control unit of the entire camera, and collectively controls various processing sequences of the camera.
  • the microcomputer 1217 is connected to the operation unit 1223 and the flash memory 1224.
  • The operation unit 1223 includes, but is not limited to, physical or virtual buttons, such as a power button, a camera button, an edit button, a dynamic image button, a reproduction button, a menu button, a cross button, an OK button, a delete button, and an enlarge button, and detects the operation state of these operation controls.
  • the detection result is output to the microcomputer 1217. Further, a touch panel is provided on the front surface of the LCD 1226 as a display, and the touch position of the user is detected, and the touch position is output to the microcomputer 1217.
  • the microcomputer 1217 executes various processing sequences corresponding to the user's operation in accordance with the detection result from the operation position of the operation unit 1223.
  • the flash memory 1224 stores programs for executing various processing sequences of the microcomputer 1217.
  • the microcomputer 1217 performs overall control of the camera in accordance with the program. Further, the flash memory 1224 stores various adjustment values of the camera, and the microcomputer 1217 reads out the adjustment value, and performs control of the camera in accordance with the adjustment value.
  • the SDRAM 1218 is an electrically rewritable volatile memory for temporarily storing image data or the like.
  • the SDRAM 1218 temporarily stores image data output from the A/D converter 1214 and image data processed in the image processor 1215, the JPEG processor 1216, and the like.
  • the memory interface 1219 is connected to the recording medium 1225, and performs control for writing image data and a file header attached to the image data to the recording medium 1225 and reading out from the recording medium 1225.
  • the recording medium 1225 is, for example, a recording medium such as a memory card that can be detachably attached to the camera body.
  • the recording medium 1225 is not limited thereto, and may be a hard disk or the like built in the camera body.
  • The LCD driver 1220 is connected to the LCD 1226. Image data processed by the image processor 1215 is stored in the SDRAM 1218, then read from the SDRAM 1218 and displayed on the LCD 1226; alternatively, image data compressed by the JPEG processor 1216 is stored in the SDRAM 1218.
  • The JPEG processor 1216 reads the compressed image data from the SDRAM 1218, decompresses it, and the decompressed image data is displayed on the LCD 1226.
  • the LCD 1226 is configured to display an image on the back of the camera body.
  • The display is not limited to the LCD 1226; various other display panels, such as an organic EL panel, may be used.
  • this embodiment provides an image processing method, including:
  • Step S110: controlling the acquisition module to perform digital zoom in response to a zoom operation;
  • Step S120: collecting, by the zoomed acquisition module, a first image of a first size;
  • Step S130: cropping the first image to obtain a second image of a second size, wherein the second size is smaller than the first size;
  • Step S140: displaying the second image.
  • the embodiment provides an image processing method, which can be applied to an electronic device including an acquisition module.
  • the acquisition module herein may include the camera shown in FIG. 1 or any existing camera.
  • In step S110, the acquisition module is controlled to perform digital zoom in response to the zoom operation.
  • For example, the user touches the touch screen with a finger to input a zoom operation to the electronic device; when the finger is withdrawn after zooming, the electronic device usually shakes.
  • In a camera, the user may zoom by rotating a hardware structure of the lens, and shake likewise occurs when the hand is withdrawn.
  • Such shake may cause image blur, for example ghosting.
  • Therefore, in this embodiment, image capture is performed with a larger first size, obtaining the first image. The first size here may correspond to a number of unit pixels.
  • Shake of the electronic device causes the acquisition module to move up, down, left, and right within its vertical plane, which may cause quality problems such as blurring in parts of the image.
  • The electronic device then crops the first image in the background to obtain a second image that is smaller in size than the first image.
  • For example, the first size includes S1 unit pixels and the second image includes S2 unit pixels, where S2 is smaller than S1 and the S2 unit pixels are a subset of the S1 unit pixels.
  • The S1 - S2 cropped pixels are the parts that do not satisfy the sharpness condition, so cropping in the background of the electronic device gives the displayed image higher definition, realizing anti-shake processing for the user and improving image quality.
  • the method further includes deleting the first image and storing the second image to facilitate subsequent user review of the second image.
  • the step S130 may include:
  • At least a portion of the peripheral area of the first image is cropped to form the second image.
  • a cropping strategy is pre-stored, or a cropping strategy is received from other electronic devices.
  • Study has found that the blurred areas of images acquired while the electronic device shakes are usually located around the periphery of the image.
  • Therefore, the peripheral area of the first image is cropped and its intermediate area is retained to form the second image.
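As an illustrative sketch only (the helper name, fixed margin, and list-of-rows image representation are assumptions, not part of the patent), cropping away the peripheral area while retaining the intermediate area might look like:

```python
def crop_periphery(image, margin):
    """Return the intermediate region of `image` (a list of pixel rows),
    dropping `margin` rows and columns from every side."""
    if margin <= 0:
        return [row[:] for row in image]
    return [row[margin:-margin] for row in image[margin:-margin]]

# A 6x6 "first image" cropped with margin 1 yields a 4x4 "second image"
# whose top-left pixel sat at position (1, 1) of the first image.
first = [[(r, c) for c in range(6)] for r in range(6)]
second = crop_periphery(first, 1)
```

In a real device the margin would be chosen by the cropping strategy, e.g. from the expected shake amplitude.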
  • the left image of FIG. 3A is the first image
  • the right image of FIG. 3A is the second image
  • A portion corresponding to the broken-line frame in the first image forms the second image. It is apparent that the cropped second image is smaller than the first image, and the second image is the intermediate portion of the first image.
  • the left image of FIG. 3B is the first image
  • the right image of FIG. 3B is the second image
  • A portion corresponding to the broken-line frame in the first image forms the second image. It is apparent that the cropped second image is smaller than the first image, and the second image includes the intermediate portion of the first image and part of the peripheral region.
  • the step S120 may include:
  • Acquiring N first images of the first size by using the zoomed acquisition module, wherein N is an integer not less than 2;
  • the step S130 may include:
  • the overlapping regions in the N first images are retained during cropping to form the second image.
  • In this embodiment, the acquisition module collects a plurality of first images before the second image is output; these first images are captured at different times while the electronic device shakes. Because the device is shaking, the objects, persons, and/or animals captured by the acquisition module change between frames, and the areas that change first are necessarily the peripheral areas. As long as the shake amplitude is not especially large, some captured content remains unchanged between any two first images; this content is usually displayed in the middle of each first image and is usually sharp, so the overlapping area is the intermediate area of the N first images.
  • The overlapping area of the N first images, i.e. their intermediate area, is selected as the reserved area to form the second image.
  • The size of the second image thus formed is obviously smaller than the first size.
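A minimal sketch of how the common (overlapping) field of view of N shifted frames could be computed, assuming per-frame pixel offsets relative to the first frame have already been estimated; the offsets, values, and helper name are hypothetical, not taken from the patent:

```python
def overlap_region(width, height, offsets):
    """Given per-frame (dx, dy) shifts expressed in the first frame's
    coordinates, return the rectangle (left, top, right, bottom)
    visible in every frame."""
    lefts = [dx for dx, _ in offsets]
    tops = [dy for _, dy in offsets]
    left, top = max(lefts), max(tops)
    right = min(dx + width for dx in lefts)
    bottom = min(dy + height for dy in tops)
    return left, top, right, bottom

# Three 100x80 frames shaken by a few pixels share a 96x74 central region.
region = overlap_region(100, 80, [(0, 0), (3, -2), (-1, 4)])
```

Cropping every frame to this rectangle keeps only the content that all N first images agree on, which matches the "retain the overlapping regions" idea above.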
  • the second size is a predetermined output image size after zooming;
  • the step S130 may include:
  • An intermediate region equal to the second size in the first image is selected as a reserved region, and the first image is cropped to form the second image.
  • The digital zoom corresponds to an image size, which in this embodiment is directly used as the predetermined second size.
  • An image area of the second size, taking the center point of the first image as the center point of the second image, is reserved to form the second image.
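When the second size is fixed in advance, the crop reduces to keeping a centered window of that size. A sketch under the same assumed list-of-rows representation (helper name is illustrative):

```python
def center_crop(image, out_h, out_w):
    """Keep an out_h x out_w region whose center coincides with the
    center of `image` (a list of pixel rows)."""
    h, w = len(image), len(image[0])
    top, left = (h - out_h) // 2, (w - out_w) // 2
    return [row[left:left + out_w] for row in image[top:top + out_h]]

# An 8x8 first image cropped to the predetermined 4x6 second size.
first = [[(r, c) for c in range(8)] for r in range(8)]
second = center_crop(first, 4, 6)
```

This is the static variant: the reserved area depends only on the predetermined size, not on per-frame analysis.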
  • the step S130 may further include:
  • An area in which the definition satisfies the preset definition condition is selected as the reserved area to form the second image.
  • Specifically, the first image may first be divided into a plurality of regions, and the sharpness of each region obtained by various processing manners, for example edge gradient detection, correlation detection, or an evaluation function based on statistics and/or changes.
  • A plurality of regions whose sharpness is greater than a preset threshold, or whose sharpness is relatively high, are selected as the reserved regions of the crop to form the second image.
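One crude way to realize the edge-gradient scoring hinted at above; the grid split, threshold, and function names are illustrative assumptions rather than the patent's actual evaluation function:

```python
def gradient_energy(image, top, left, h, w):
    """Sum of squared horizontal pixel differences inside one region --
    a crude edge-gradient sharpness score (grayscale 2-D list)."""
    score = 0
    for r in range(top, top + h):
        for c in range(left, left + w - 1):
            d = image[r][c + 1] - image[r][c]
            score += d * d
    return score

def sharp_regions(image, rows, cols, threshold):
    """Split `image` into rows x cols regions and return the (row, col)
    indices of regions whose sharpness exceeds `threshold`."""
    h = len(image) // rows
    w = len(image[0]) // cols
    keep = []
    for i in range(rows):
        for j in range(cols):
            if gradient_energy(image, i * h, j * w, h, w) > threshold:
                keep.append((i, j))
    return keep

# Left half flat (blurred-looking), right half high-contrast (sharp).
img = [[0 if c < 4 else 255 * (c % 2) for c in range(8)] for _ in range(8)]
kept = sharp_regions(img, 2, 2, 0)
```

Only the two right-hand regions survive, so the crop would retain that side of the frame.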
  • Although regions of low sharpness statistically tend to appear in the periphery of the first image, in this embodiment the second image may be formed based on measured sharpness rather than position alone.
  • The reserved area in this embodiment is therefore not limited to the intermediate area, so a second image of the largest possible size can be obtained.
  • The step S130 may include dividing the first image into N equal areas, then selecting, according to image sharpness, M continuously distributed areas that satisfy the preset sharpness condition as the reserved area to form the second image, where M is less than or equal to N.
  • For example, N may equal 16 and M may equal 4.
  • The reserved M regions may be intermediate regions and/or peripheral regions of the N regions.
  • the step S130 may include:
  • Forming the second image by selecting, from the N regions, M continuously distributed regions whose sharpness meets the preset sharpness condition as the reserved region, wherein M is a positive integer smaller than N and N is the total number of regions of the first image.
  • The step S120 may include: after zooming, based on a single capture operation, acquiring a plurality of first images of the first size; that is, for one capture instruction the device automatically collects two or more first images.
  • the step S130 may include: comparing the sharpness of the N first images, selecting the clearest first image to perform cropping to obtain the cropped second image.
  • Only the clearest first image is selected for cropping; it is unnecessary to crop each first image, and the insufficiently sharp uncropped images can be discarded directly, reducing the amount of stored data.
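Picking the sharpest of the N captures before cropping might be sketched as follows; the whole-frame gradient score is an illustrative proxy, not the patent's actual sharpness metric:

```python
def frame_sharpness(image):
    """Crude whole-frame sharpness: total squared horizontal gradient
    over a grayscale image given as a list of pixel rows."""
    return sum((row[c + 1] - row[c]) ** 2
               for row in image for c in range(len(row) - 1))

def pick_sharpest(frames):
    """Of the N captured first images, keep only the sharpest one;
    the less sharp frames can simply be discarded."""
    return max(frames, key=frame_sharpness)

flat = [[128] * 8 for _ in range(8)]                           # blurred-looking
edgy = [[255 * (c % 2) for c in range(8)] for _ in range(8)]   # high-contrast
best = pick_sharpest([flat, edgy])
```

Discarding the other frames before cropping is what keeps the stored-data volume low.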
  • The step S130 may include: when no single first image contains a continuously distributed image area of the second size that satisfies the preset sharpness condition, cropping a plurality of the first images to obtain a plurality of image regions that satisfy the preset sharpness condition, and splicing the plurality of image regions to obtain the second image satisfying the preset sharpness condition.
  • In this way, smaller image areas satisfying the sharpness condition can be cropped from different first images and then spliced to form the second image, which can reduce the number of repeated captures required of the user.
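Splicing sharp regions taken from different first images into one second image can be sketched, in the simplest side-by-side case; the function name and two-strip layout are assumptions for illustration:

```python
def hstitch(left_strip, right_strip):
    """Join two equally tall image strips side by side into one image
    (each strip is a list of pixel rows)."""
    assert len(left_strip) == len(right_strip), "strips must be equally tall"
    return [l + r for l, r in zip(left_strip, right_strip)]

# Sharp left half cropped from frame A, sharp right half from frame B.
left_half = [["A"] * 3 for _ in range(4)]
right_half = [["B"] * 3 for _ in range(4)]
second = hstitch(left_half, right_half)
```

A real implementation would also have to align the strips, since the frames were captured at slightly different positions during the shake.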
  • the embodiment provides an image processing apparatus, including:
  • the zoom unit 310 is configured to control the acquisition module to perform digital zoom in response to the zoom operation;
  • the collecting unit 320 is configured to collect, by the zoomed acquisition module, a first image of a first size;
  • the cropping unit 330 is configured to crop the first image to obtain a second image of a second size, wherein the second size is smaller than the first size;
  • the display unit 340 is configured to display the second image.
  • the image processing apparatus described in this embodiment may be a structure applied to the aforementioned mobile terminal or a camera or an electronic device.
  • The zoom unit 310 may correspond to a controller that controls the capture module to perform digital zoom.
  • the acquisition module can include an acquisition lens.
  • The collecting unit 320 may be a structure including the acquisition module, and may further include a controller or control circuit that controls the acquisition module to perform image sensing and then image collection.
  • the cropping unit 330 can correspond to a processor or processing circuit, and the processor can include a central processing unit, a microprocessor, a digital signal processor, an application processor or a programmable array, and the like.
  • the processing circuit can include an application specific integrated circuit or the like.
  • the processor or processing circuit can be coupled to a storage medium, and the cropping operation of the cropping unit 330 can be implemented by reading executable code in the storage medium.
  • The display unit 340 may correspond to various display structures capable of displaying the second image, for example a liquid crystal display, an organic light emitting diode (OLED) display, a projection display, an electronic ink display, or a plasma display.
  • The user can see the captured image on the display of the display unit 340; the second image output by the device in this embodiment is clearer than the image output after zooming in the prior art, and the image effect is better.
  • the cropping unit 330 is configured to crop at least a portion of the peripheral region of the first image to form the second image according to a cropping strategy.
  • the apparatus can also include a storage unit that can correspond to a storage medium for storing the cropping strategy.
  • The apparatus may also include a communication unit, corresponding to a communication interface, operable to receive a cropping policy from a peripheral device.
  • The cropping unit 330 is connected to the storage unit or the communication unit and, according to the cropping strategy, crops at least part of the peripheral area of the first image to obtain a second image with higher definition. If the first image is rectangular, the peripheral area may comprise four peripheral regions, one along each side.
  • The cropping unit 330 may crop only 1, 2, or 3 of the peripheral regions, retaining part of the peripheral region to form the second image.
  • the collecting unit 320 is configured to acquire N first images of the first size by using the zoomed acquisition module; wherein the N is an integer not less than 2;
  • the cropping unit 330 is configured to compare the N first images and retain the overlapping regions in the N first images when cropping to form the second image.
  • The acquisition unit 320 may acquire at least two first images before the display unit 340 finally displays the second image, or before the cropping unit 330 performs cropping.
  • The cropping unit 330 determines, by image comparison, the reserved area of the first image that forms the second image, which is equivalent to dynamically determining the area of the second size.
  • the second size is a predetermined output image size after zooming;
  • the cropping unit 330 is configured to select an intermediate region in the first image that is equal to the second size as a reserved region, and crop the first image to form the second image.
  • the second size is statically set, and is relatively simple to implement with respect to dynamic determination.
  • the cropping unit 330 is configured to acquire the sharpness of each region of the first image, and select the regions whose sharpness meets the preset sharpness condition as the reserved area to form the second image.
  • the cropping will be performed according to the sharpness.
  • the cropped second image satisfies at least the clarity condition, and the sharpness of the second image displayed by the display unit 340 can be ensured.
  • the cropping unit 330 may be configured to select, from the N regions, M regions whose resolution meets the preset definition condition and are continuously distributed as the reserved region, to form the second image; Wherein M is a positive integer smaller than N; and N is a total number of regions of the first image.
  • the collecting unit 320 is configured to collect, after zooming and based on a single capture operation, a plurality of first images of the first size;
  • the cropping unit is configured to compare the sharpness of the N first images, and select the clearest first image to perform cropping to obtain the cropped second image.
  • The cropping unit 330 is configured to: when no single first image contains a continuously distributed image area of the second size that satisfies the preset sharpness condition, crop a plurality of the first images to obtain a plurality of image regions that satisfy the preset sharpness condition, and splice the plurality of image regions to obtain the second image satisfying the preset sharpness condition.
  • the embodiment of the invention provides an image processing device, which can correspond to a mobile terminal, and includes:
  • An image collector configured to acquire an image
  • a memory configured to store information including at least an image acquired by the image collector and a computer program
  • a processor coupled to the image collector, the memory, and the display, respectively, configured to control the image processing apparatus to perform at least the following steps by executing the computer program:
  • controlling the acquisition module to perform digital zoom in response to a zoom operation;
  • collecting, by the zoomed acquisition module, a first image of a first size;
  • cropping the first image to obtain a second image of a second size, wherein the second size is smaller than the first size;
  • displaying the second image.
  • the processor is further configured to:
  • acquiring N first images of the first size by using the zoomed acquisition module, wherein N is an integer not less than 2;
  • forming the second image includes:
  • the overlapping regions in the N first images are retained during cropping to form the second image.
  • the image processing apparatus provided in this embodiment may perform the image processing method provided by any one of the foregoing technical solutions.
  • the image collector described in this embodiment may be various sensors or devices capable of image acquisition, for example, corresponding to the camera 121 or the lens in FIG.
  • the memory may be a device including various types of storage media, such as a random access memory or a read only memory, or a disk storage including a flash memory, etc., such as the memory 160 in FIG.
  • The processor can be various types of processors, such as a central processing unit, a microprocessor, a digital signal processor, an application processor, a programmable array, or an application specific integrated circuit.
  • the processor can be the controller 180 as shown in FIG.
  • the processor can be connected to the memory and the image collector through a bus interface such as an integrated circuit bus, and can send control commands to the memory and the image collector through the bus to control image acquisition and information storage, and can also be from the memory and/or image collector. Read data to implement image processing.
  • An embodiment of the present invention further provides an image processing apparatus, including:
  • An image collector configured to acquire an image
  • a memory configured to store information including at least an image acquired by the image collector
  • a processor connected to the image collector, the memory, and the display, respectively, and configured to implement the image processing method provided by any one or more of the technical solutions by executing the computer program.
  • the embodiment of the invention further provides a computer storage medium, wherein the computer storage medium stores a computer program, and the computer program can be used to implement an image processing method provided by any one or more technical solutions.
  • the computer storage medium is a non-transitory storage medium.
  • the embodiment further provides a mobile terminal, which may include any one of the above image processing apparatuses.
  • the mobile terminal can be implemented in various forms.
  • The terminals described in the present invention may include mobile terminals such as mobile phones, smart phones, notebook computers, digital broadcast receivers, personal digital assistants (PDAs), tablet computers (PADs), portable multimedia players (PMPs), and navigation devices, as well as fixed terminals such as digital TVs and desktop computers.
  • In the following, it is assumed that the terminal is a mobile terminal.
• those skilled in the art will appreciate that configurations in accordance with embodiments of the present invention can also be applied to fixed-type terminals, except for components specifically intended for mobile purposes.
  • FIG. 5 is a schematic diagram showing the hardware structure of a mobile terminal 100 that implements various embodiments of the present invention.
  • the mobile terminal 100 may include a wireless communication unit 110, an audio/video (A/V) input unit 120, and a user input unit 130.
• Figure 5 illustrates a mobile terminal 100 having various components, but it should be understood that not all illustrated components are required; more or fewer components may be implemented instead. The elements of the mobile terminal 100 will be described in detail below.
  • Wireless communication unit 110 typically includes one or more components that permit radio communication between mobile terminal 100 and a wireless communication system or network.
  • the wireless communication unit 110 may include at least one of a broadcast receiving module 111, a mobile communication module 112, a wireless internet module 113, a short-range communication module 114, and a location information module 115.
  • the broadcast receiving module 111 receives a broadcast signal and/or broadcast associated information from an external broadcast management server via a broadcast channel.
  • the broadcast channel can include a satellite channel and/or a terrestrial channel.
• the broadcast management server may be a server that generates and transmits a broadcast signal and/or broadcast associated information, or a server that receives a previously generated broadcast signal and/or broadcast associated information and transmits it to the terminal.
  • the broadcast signal may include a TV broadcast signal, a radio broadcast signal, a data broadcast signal, and the like.
  • the broadcast signal may further include a broadcast signal combined with a TV or radio broadcast signal.
  • the broadcast associated information may also be provided via a mobile communication network, and in this case, the broadcast associated information may be received by the mobile communication module 112.
• the broadcast associated information may exist in various forms, for example, in the form of an Electronic Program Guide (EPG) of Digital Multimedia Broadcasting (DMB), an Electronic Service Guide (ESG) of Digital Video Broadcasting-Handheld (DVB-H), and the like.
  • the broadcast receiving module 111 can receive a signal broadcast by using various types of broadcast systems.
• the broadcast receiving module 111 can receive digital broadcasts by using digital broadcasting systems such as multimedia broadcasting-terrestrial (DMB-T), digital multimedia broadcasting-satellite (DMB-S), digital video broadcasting-handheld (DVB-H), the data broadcasting system of forward link media (MediaFLO), terrestrial digital broadcasting integrated services (ISDB-T), and the like.
• the broadcast receiving module 111 can be constructed to be suitable for various broadcast systems that provide broadcast signals, in addition to the above-described digital broadcast systems.
• the broadcast signal and/or broadcast associated information received via the broadcast receiving module 111 may be stored in the memory 160 (or other type of storage medium).
• the mobile communication module 112 transmits radio signals to and/or receives radio signals from at least one of a base station (e.g., an access point, a Node B, etc.), an external terminal, and a server.
  • Such radio signals may include voice call signals, video call signals, or various types of data transmitted and/or received in accordance with text and/or multimedia messages.
  • the wireless internet module 113 supports wireless internet access of the mobile terminal 100.
  • the wireless internet module 113 can be internally or externally coupled to the terminal.
• the wireless internet access technologies involved in the wireless internet module 113 may include Wireless Local Area Network (WLAN), Wireless Fidelity (Wi-Fi), Wireless Broadband (Wibro), Worldwide Interoperability for Microwave Access (Wimax), High Speed Downlink Packet Access (HSDPA), and the like.
  • the short range communication module 114 is a module for supporting short range communication.
• Some examples of short-range communication technology include Bluetooth™, Radio Frequency Identification (RFID), Infrared Data Association (IrDA), Ultra Wideband (UWB), ZigBee™, and the like.
  • the location information module 115 is a module for checking or acquiring location information of the mobile terminal 100.
  • a typical example of location information module 115 is Global Positioning System (GPS) module 115.
  • the GPS module 115 calculates distance information and accurate time information from three or more satellites and applies triangulation to the calculated information to accurately calculate three-dimensional current position information based on longitude, latitude, and altitude.
  • the method for calculating position and time information uses three satellites and corrects the calculated position and time information errors by using another satellite.
  • the GPS module 115 is capable of calculating speed information by continuously calculating current position information in real time.
  • the A/V input unit 120 is for receiving an audio or video signal.
• the A/V input unit 120 may include a camera 121 and a microphone 122; the camera 121 processes image data of still pictures or video obtained by the image capture device in a video capture mode or an image capture mode.
  • the processed image frame can be displayed on the display unit 151.
  • the image frames processed by the camera 121 may be stored in the memory 160 (or other storage medium) or transmitted via the wireless communication unit 110, and two or more cameras 121 may be provided according to the configuration of the mobile terminal 100.
  • the microphone 122 can receive sound (audio data) via a microphone in an operation mode of a telephone call mode, a recording mode, a voice recognition mode, and the like, and can process such sound as audio data.
• the processed audio (voice) data can be converted into a format that can be transmitted to the mobile communication base station via the mobile communication module 112 and output, in the case of the telephone call mode.
  • the microphone 122 can implement various types of noise cancellation (or suppression) algorithms to cancel (or suppress) noise or interference generated during the process of receiving and transmitting audio signals.
  • the user input unit 130 may generate key input data according to a command input by the user to control various operations of the mobile terminal 100.
• the user input unit 130 allows the user to input various types of information and may include a keyboard, a dome switch, a touch pad (e.g., a touch-sensitive component that detects changes in resistance, pressure, capacitance, etc. due to contact), a scroll wheel, a rocker, and the like. In particular, when the touch pad is superimposed on the display unit 151 in the form of a layer, a touch screen can be formed.
• the sensing unit 140 detects the current state of the mobile terminal 100 (e.g., the open or closed state of the mobile terminal 100), the location of the mobile terminal 100, the presence or absence of user contact with the mobile terminal 100 (i.e., touch input), and the orientation and acceleration or deceleration movement of the mobile terminal 100, and generates commands or signals for controlling the operations of the mobile terminal 100.
  • the sensing unit 140 can sense whether the slide type phone is turned on or off.
  • the sensing unit 140 can detect whether the power supply unit 190 provides power or whether the interface unit 170 is coupled to an external device.
  • the interface unit 170 serves as an interface through which at least one external device can connect with the mobile terminal 100.
• the external device may include a wired or wireless headset port, an external power (or battery charger) port, a wired or wireless data port, a memory card port (a typical example is a Universal Serial Bus (USB) port), a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and so on.
• the identification module may store various information for authenticating the user of the mobile terminal 100 and may include a User Identity Module (UIM), a Subscriber Identity Module (SIM), a Universal Subscriber Identity Module (USIM), and the like.
  • the device having the identification module (hereinafter referred to as "identification device”) may take the form of a smart card, and thus the identification device may be connected to the mobile terminal 100 via a port or other connection device.
• the interface unit 170 can be configured to receive input (e.g., data information, power, etc.) from an external device and transmit the received input to one or more components within the mobile terminal 100, or can be used to transfer data between the mobile terminal 100 and an external device.
• the interface unit 170 may serve as a path through which power is supplied from the base to the mobile terminal 100, or as a path through which various command signals input from the base are transmitted to the mobile terminal 100.
  • Various command signals or power input from the base can be used as signals for identifying whether the mobile terminal 100 is accurately mounted on the base.
  • Output unit 150 is configured to provide an output signal (eg, an audio signal, a video signal, an alarm signal, a vibration signal, etc.) in a visual, audio, and/or tactile manner.
  • the output unit 150 may include a display unit 151, an audio output module 152, an alarm unit 153, and the like.
  • the display unit 151 can display information processed in the mobile terminal 100. For example, when the mobile terminal 100 is in a phone call mode, the display unit 151 can display a user interface (UI) or a graphical user interface (GUI) related to a call or other communication (eg, text messaging, multimedia file download, etc.). When the mobile terminal 100 is in a video call mode or an image capturing mode, the display unit 151 may display a captured image and/or a received image, a UI or GUI showing a video or image and related functions, and the like.
  • the display unit 151 can function as an input device and an output device.
  • the display unit 151 may include at least one of a liquid crystal display (LCD), a thin film transistor LCD (TFT-LCD), an organic light emitting diode (OLED) display, a flexible display, a three-dimensional (3D) display, and the like.
• Some of these displays may be configured to be transparent to allow viewing from the outside; these may be referred to as transparent displays, and a typical transparent display may be, for example, a TOLED (Transparent Organic Light Emitting Diode) display.
  • the mobile terminal 100 may include two or more display units (or other display devices), for example, the mobile terminal 100 may include an external display unit (not shown) and an internal display unit (not shown) ).
  • the touch screen can be used to detect touch input pressure as well as touch input position and touch input area.
• when the mobile terminal 100 is in a call signal receiving mode, a call mode, a recording mode, a voice recognition mode, a broadcast receiving mode, or the like, the audio output module 152 may convert audio data received by the wireless communication unit 110 or stored in the memory 160 into an audio signal and output it as sound.
  • the audio output module 152 can provide audio output (eg, call signal reception sound, message reception sound, etc.) associated with a particular function performed by the mobile terminal 100.
  • the audio output module 152 can include a speaker, a buzzer, and the like.
  • the alarm unit 153 can provide an output to notify the mobile terminal 100 of the occurrence of an event. Typical events may include call reception, message reception, key signal input, touch input, and the like. In addition to audio or video output, the alert unit 153 can provide an output in a different manner to notify of the occurrence of an event. For example, the alarm unit 153 can provide an output in the form of vibrations, and when a call, message, or some other incoming communication is received, the alarm unit 153 can provide a tactile output (ie, vibration) to notify the user of it. By providing such a tactile output, the user is able to recognize the occurrence of various events even when the user's mobile phone is in the user's pocket. The alarm unit 153 can also provide an output of the notification event occurrence via the display unit 151 or the audio output module 152.
  • the memory 160 may store a software program or the like that performs processing and control operations performed by the controller 180, or may temporarily store data (for example, a phone book, a message, a still image, a video, and the like) that has been output or is to be output. Moreover, the memory 160 can store data regarding vibrations and audio signals of various manners that are output when a touch is applied to the touch screen.
• the memory 160 may include at least one type of storage medium including a flash memory, a hard disk, a multimedia card, a card-type memory (e.g., SD or XD memory), a random access memory (RAM), a static random access memory (SRAM), a read-only memory (ROM), an electrically erasable programmable read-only memory (EEPROM), a programmable read-only memory (PROM), a magnetic memory, a magnetic disk, an optical disk, and the like.
  • the mobile terminal 100 can cooperate with a network storage device that performs a storage function of the memory 160 through a network connection.
  • the controller 180 typically controls the overall operation of the mobile terminal 100.
  • the controller 180 performs the control and processing associated with voice calls, data communications, video calls, and the like.
  • the controller 180 may include a multimedia module 181 for reproducing or playing back multimedia data, which may be constructed within the controller 180 or may be configured to be separate from the controller 180.
  • the controller 180 may perform a pattern recognition process to recognize a handwriting input or a picture drawing input performed on the touch screen as a character or an image.
  • the power supply unit 190 receives external power or internal power under the control of the controller 180 and provides appropriate power required to operate the various components and components.
  • the various embodiments described herein can be implemented in a computer readable medium using, for example, computer software, hardware, or any combination thereof.
• the embodiments described herein may be implemented by using at least one of application specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), processors, controllers, microcontrollers, microprocessors, and electronic units designed to perform the functions described herein; in some cases, such embodiments may be implemented in the controller 180.
  • implementations such as procedures or functions may be implemented with separate software modules that permit the execution of at least one function or operation.
• the software code can be implemented by a software application (or program) written in any suitable programming language, which can be stored in the memory 160 and executed by the controller 180.
  • the mobile terminal 100 has been described in terms of its function.
• among various types of mobile terminals 100, such as folding-type, bar-type, swing-type, and slide-type mobile terminals 100, a slide-type mobile terminal 100 will be described as an example. However, the present invention can be applied to any type of mobile terminal 100 and is not limited to the slide-type mobile terminal 100.
  • the mobile terminal 100 as shown in FIG. 5 can be configured to operate using a communication system such as a wired and wireless communication system and a satellite-based communication system that transmits data via frames or packets.
  • a communication system in which the mobile terminal 100 according to the present invention can operate will now be described with reference to FIG.
  • Such communication systems may use different air interfaces and/or physical layers.
• air interfaces used by communication systems include, for example, Frequency Division Multiple Access (FDMA), Time Division Multiple Access (TDMA), Code Division Multiple Access (CDMA), the Universal Mobile Telecommunications System (UMTS) (in particular, Long Term Evolution (LTE)), the Global System for Mobile Communications (GSM), and the like.
  • the following description relates to a CDMA communication system, but such teachings are equally applicable to other types of systems.
  • the CDMA wireless communication system may include a plurality of mobile terminals 100, a plurality of base stations (BS) 270, a base station controller (BSC) 275, and a mobile switching center (MSC) 280.
  • the MSC 280 is configured to interface with a public switched telephone network (PSTN) 290.
  • the MSC 280 is also configured to interface with a BSC 275 that can be coupled to the base station 270 via a backhaul line.
• the backhaul line can be constructed in accordance with any of a number of well-known interfaces including, for example, E1/T1, ATM, IP, PPP, Frame Relay, HDSL, ADSL, or xDSL. It will be appreciated that the system as shown in Figure 6 can include multiple BSCs 275.
  • Each BS 270 can serve one or more partitions (or regions), with each partition covered by a multi-directional antenna or an antenna pointing in a particular direction radially away from the BS 270. Alternatively, each partition may be covered by two or more antennas for diversity reception. Each BS 270 can be configured to support multiple frequency allocations, and each frequency allocation has a particular frequency spectrum (eg, 1.25 MHz, 5 MHz, etc.).
  • BS 270 may also be referred to as a Base Transceiver Subsystem (BTS) or other equivalent terminology.
  • the term "base station” can be used to generally mean a single BSC 275 and at least one BS 270.
  • a base station can also be referred to as a "cell station.”
  • each partition of a particular BS 270 may be referred to as multiple cellular stations.
  • a broadcast transmitter (BT) 295 transmits a broadcast signal to the mobile terminal 100 operating within the system.
  • a broadcast receiving module 111 as shown in FIG. 5 is provided at the mobile terminal 100 to receive a broadcast signal transmitted by the BT 295.
  • several satellites 300 are shown, for example, a Global Positioning System (GPS) satellite 300 can be employed.
  • the satellite 300 helps locate at least one of the plurality of mobile terminals 100.
  • a plurality of satellites 300 are depicted, but it is understood that useful positioning information can be obtained using any number of satellites.
• the GPS module 115 as shown in Figure 5 is typically configured to cooperate with the satellite 300 to obtain the desired positioning information. Instead of or in addition to GPS tracking techniques, other techniques that can track the location of the mobile terminal 100 can be used. In addition, at least one GPS satellite 300 can selectively or additionally process satellite DMB transmissions.
  • BS 270 receives reverse link signals from various mobile terminals 100.
  • Mobile terminal 100 typically participates in calls, messaging, and other types of communications.
  • Each reverse link signal received by a particular base station 270 is processed within a particular BS 270.
  • the obtained data is forwarded to the relevant BSC 275.
  • the BSC provides call resource allocation and coordinated mobility management functions including a soft handoff procedure between the BSs 270.
  • the BSC 275 also routes the received data to the MSC 280, which provides additional routing services for interfacing with the PSTN 290.
  • PSTN 290 interfaces with MSC 280, which forms an interface with BSC 275, and BSC 275 controls BS 270 accordingly to transmit forward link signals to mobile terminal 100.
• the mobile communication module 112 of the wireless communication unit 110 in the mobile terminal accesses the mobile communication network (such as a 2G/3G/4G mobile communication network) based on necessary data of the mobile communication network (including user identification information and authentication information) built into the mobile terminal, and transmits mobile communication data (including uplink and downlink mobile communication data) for services such as web browsing and network multimedia playback for the mobile terminal user.
  • the disclosed apparatus and method may be implemented in other manners.
  • the device embodiments described above are merely illustrative.
• the division of units is only a logical function division; in actual implementation there may be other division manners, for example, multiple units or components may be combined or integrated into another system, or some features may be ignored or not executed.
• the coupling, direct coupling, or communication connection between the components shown or discussed may be indirect coupling or communication connection through some interfaces, devices, or units, and may be electrical, mechanical, or in other forms.
• the units described above as separate components may or may not be physically separated, and the components displayed as units may or may not be physical units; that is, they may be located in one place or distributed over multiple network units. Some or all of the units may be selected according to actual needs to achieve the objectives of the solutions of this embodiment.
• each functional unit in each embodiment of the present invention may be integrated into one processing module, or each unit may serve as a separate unit, or two or more units may be integrated into one unit; the integrated unit can be implemented in the form of hardware, or in the form of hardware plus software functional units.
• the foregoing program may be stored in a computer readable storage medium; when executed, the program performs the steps including those of the foregoing method embodiments. The foregoing storage medium includes: a removable storage device, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, an optical disk, or any other medium that can store program code.
• the image processing technical solution provided by the embodiments of the present invention captures an image of a larger area after digital zoom is completed, and then obtains a smaller, high-definition image by cropping, which is stored and/or output as the final captured image. Compared with directly capturing an image of only the required area, whose edge regions may be blurred, this improves the definition of the image the user finally sees and improves the user experience. It has a positive industrial effect and a promising prospect of wide industrial application.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Human Computer Interaction (AREA)
  • Studio Devices (AREA)
  • Image Processing (AREA)

Abstract

An image processing apparatus and method, and a computer storage medium. The apparatus includes: a zoom unit configured to control a capture module to perform digital zoom in response to a zoom operation; a capture unit configured to capture a first image of a first size using the zoomed capture module; a cropping unit configured to crop the first image to obtain a second image of a second size, where the second size is smaller than the first size; and a display unit configured to display the second image. In this embodiment, to prevent shake after zooming from degrading image quality such as the definition of the captured output image, a first image of a relatively large first size is captured, and a second image of higher image quality than the first image is then obtained from it, thereby improving the image quality of the captured output.

Description

Image Processing Apparatus and Method, and Computer Storage Medium
This application is based on, and claims priority to, Chinese patent application No. 201610971287.1, filed on October 28, 2016, the entire contents of which are incorporated herein by reference.
Technical Field
The present invention relates to image processing technologies in the field of information technology, and in particular to an image processing apparatus and method and a computer storage medium.
Background
With the development of information technology, electronic devices usually carry a camera for image capture. During image capture, zooming is usually performed. Zooming generally involves a user operation on the capture device, which may cause the electronic device to shake; such shake can blur the captured image, for example by introducing ghosting, degrading the quality of the captured image.
Summary
In view of this, embodiments of the present invention are expected to provide an image processing apparatus and method and a computer storage medium that at least partially solve the problem of poor image quality.
To achieve the above object, the technical solutions of the present invention are implemented as follows:
A first aspect of the embodiments of the present invention provides an image processing apparatus, including:
a zoom unit configured to control a capture module to perform digital zoom in response to a zoom operation;
a capture unit configured to capture a first image of a first size using the zoomed capture module;
a cropping unit configured to crop the first image to obtain a second image of a second size, where the second size is smaller than the first size;
a display unit configured to display the second image.
A second aspect of the embodiments of the present invention provides an image processing method, including:
in response to a zoom operation, controlling a capture module to perform digital zoom;
capturing a first image of a first size using the zoomed capture module;
cropping the first image to obtain a second image of a second size, where the second size is smaller than the first size;
displaying the second image.
A third aspect of the embodiments of the present invention provides an image processing apparatus, including:
an image collector configured to capture images;
a memory configured to store information, the information including at least the images captured by the image collector and a computer program;
a display configured to display information;
a processor connected to the image collector, the memory, and the display respectively, configured to control the image processing apparatus, by executing the computer program, to perform at least the following steps:
in response to a zoom operation, controlling a capture module to perform digital zoom;
capturing a first image of a first size using the zoomed capture module;
cropping the first image to obtain a second image of a second size, where the second size is smaller than the first size;
displaying the second image.
A fourth aspect of the embodiments of the present invention provides an image processing apparatus, including:
an image collector configured to capture images;
a memory configured to store information, the information including at least the images captured by the image collector;
a display configured to display information;
a computer program stored on the memory;
a processor connected to the image collector, the memory, and the display respectively, configured to implement one or more of the above image processing methods by executing the computer program.
A fifth aspect of the embodiments of the present invention provides a computer storage medium storing a computer program for executing one or more of the above image processing methods.
In the image processing apparatus and method and the computer storage medium provided by the embodiments of the present invention, to prevent shake after zooming from degrading image quality such as the definition of the captured output image, a first image of a relatively large first size is captured, and a second image of higher image quality than the first image is then obtained from it, thereby improving the image quality of the captured output. Moreover, in this embodiment the image processing apparatus does not introduce additional optical structures for anti-shake to improve image quality, and is therefore low-cost and simple to implement.
Brief Description of the Drawings
FIG. 1 is a schematic structural diagram of a camera provided by an embodiment of the present invention;
FIG. 2 is a schematic flowchart of an image processing method provided by an embodiment of the present invention;
FIG. 3A is a first cropping schematic diagram provided by an embodiment of the present invention;
FIG. 3B is a second cropping schematic diagram provided by an embodiment of the present invention;
FIG. 4 is a schematic structural diagram of an image capture apparatus provided by an embodiment of the present invention;
FIG. 5 is a schematic structural diagram of a mobile terminal provided by an embodiment of the present invention;
FIG. 6 is a schematic structural diagram of a communication system provided by an embodiment of the present invention.
Detailed Description
The technical solutions of the present invention are further elaborated below with reference to the drawings and specific embodiments. It should be understood that the preferred embodiments described below are only used to illustrate and explain the present invention and are not intended to limit it.
An embodiment of the present invention first provides an image capture structure, which may be a camera. FIG. 1 is a block diagram of the electrical structure of the camera. The photographing lens 1211 is composed of a plurality of optical lenses for forming a subject image, and is a single-focus lens or a zoom lens. The photographing lens 1211 can move in the optical axis direction under the control of the lens driver 1221; the lens driver 1221 controls the focus position of the photographing lens 1211 according to a control signal from the lens drive control circuit 1222 and, in the case of a zoom lens, can also control the focal length. The lens drive control circuit 1222 performs drive control of the lens driver 1221 in accordance with control commands from the microcomputer 1217.
An imaging element 1212 is arranged on the optical axis of the photographing lens 1211, near the position of the subject image formed by the photographing lens 1211. The imaging element 1212 captures the subject image and obtains captured image data. Photodiodes constituting the pixels are arranged two-dimensionally in a matrix on the imaging element 1212. Each photodiode generates a photoelectric conversion current corresponding to the amount of received light, and the charge of this current is accumulated by a capacitor connected to each photodiode. A Bayer-arranged RGB color filter is arranged on the front surface of each pixel.
The imaging element 1212 is connected to an imaging circuit 1213, which performs charge accumulation control and image signal readout control in the imaging element 1212, reduces reset noise in the read-out image signal (analog image signal), performs waveform shaping, and then raises the gain so as to obtain an appropriate signal level.
The imaging circuit 1213 is connected to an A/D converter 1214, which performs analog-to-digital conversion on the analog image signal and outputs a digital image signal (hereinafter referred to as image data) to a bus 1227.
The bus 1227 is a transfer path for transferring various data read out or generated inside the camera. Connected to the bus 1227 are the above-mentioned A/D converter 1214, as well as an image processor 1215, a JPEG processor 1216, a microcomputer 1217, an SDRAM (Synchronous Dynamic Random Access Memory) 1218, a memory interface (hereinafter referred to as memory I/F) 1219, and an LCD (Liquid Crystal Display) driver 1220.
The image processor 1215 performs various kinds of image processing on the image data based on the output of the imaging element 1212, such as OB subtraction, white balance adjustment, color matrix operation, gamma conversion, color difference signal processing, noise removal, synchronization, and edge processing. When recording image data on the recording medium 1225, the JPEG processor 1216 compresses the image data read from the SDRAM 1218 in accordance with the JPEG compression method. In addition, the JPEG processor 1216 decompresses JPEG image data for image reproduction and display. During decompression, a file recorded on the recording medium 1225 is read out and decompressed in the JPEG processor 1216, and the decompressed image data is temporarily stored in the SDRAM 1218 and displayed on the LCD 1226. In this embodiment, the JPEG method is adopted as the image compression/decompression method; however, the compression/decompression method is not limited to this, and other methods such as MPEG, TIFF, and H.264 may of course be adopted.
The microcomputer 1217 functions as the control unit of the entire camera and collectively controls its various processing sequences. The microcomputer 1217 is connected to an operation unit 1223 and a flash memory 1224.
The operation unit 1223 includes, but is not limited to, physical or virtual keys; these physical or virtual keys may be operation controls such as a power button, a shutter key, an edit key, a moving-image button, a playback button, a menu button, a cross key, an OK button, a delete button, a zoom button, and other input buttons and keys, and the operation states of these operation controls are detected.
The detection results are output to the microcomputer 1217. In addition, a touch panel is provided on the front surface of the LCD 1226 serving as a display; it detects the user's touch position and outputs the touch position to the microcomputer 1217. The microcomputer 1217 executes various processing sequences corresponding to the user's operation according to the detection results of the operation position from the operation unit 1223.
The flash memory 1224 stores programs for executing the various processing sequences of the microcomputer 1217, and the microcomputer 1217 controls the entire camera according to these programs. In addition, the flash memory 1224 stores various adjustment values of the camera; the microcomputer 1217 reads out the adjustment values and controls the camera in accordance with them.
The SDRAM 1218 is an electrically rewritable volatile memory for temporarily storing image data and the like. It temporarily stores the image data output from the A/D converter 1214 and the image data processed in the image processor 1215, the JPEG processor 1216, and the like.
The memory interface 1219 is connected to the recording medium 1225 and controls writing of image data, and of data such as file headers attached to image data, into the recording medium 1225 and reading them out from it. The recording medium 1225 is, for example, a recording medium such as a memory card that can be freely attached to and detached from the camera body; however, it is not limited to this and may also be a hard disk or the like built into the camera body.
The LCD driver 1210 is connected to the LCD 1226. Image data processed by the image processor 1215 is stored in the SDRAM 1218; when display is required, the stored image data is read and displayed on the LCD 1226. Alternatively, image data compressed by the JPEG processor 1216 is stored in the SDRAM 1218; when display is required, the JPEG processor 1216 reads the compressed image data from the SDRAM 1218, decompresses it, and displays the decompressed image data through the LCD 1226.
The LCD 1226 is arranged on the back of the camera body and displays images. The display is not limited to an LCD; various other display panels, such as organic EL panels, may also be adopted.
Based on the above hardware structure of the mobile terminal and the schematic electrical structure of the camera, embodiments of the photographing method of the present invention are proposed.
As shown in FIG. 2, this embodiment provides an image processing method, including:
Step S110: in response to a zoom operation, controlling a capture module to perform digital zoom;
Step S120: capturing a first image of a first size using the zoomed capture module;
Step S130: cropping the first image to obtain a second image of a second size, where the second size is smaller than the first size;
Step S140: displaying the second image.
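The flow of steps S110–S140 can be sketched in a few lines of Python. This is a minimal illustration, not the patent's implementation: the capture module is a stub, a frame is modelled as a list of rows, and all function names are hypothetical.

```python
def capture_first_image(width, height):
    """Stub for step S120: the zoomed capture module returns a
    width x height frame; pixels are (x, y) tags for illustration."""
    return [[(x, y) for x in range(width)] for y in range(height)]

def crop_to_second_size(frame, out_w, out_h):
    """Step S130: keep the middle out_w x out_h region, discarding the
    peripheral pixels that are most likely blurred by shake."""
    h, w = len(frame), len(frame[0])
    top, left = (h - out_h) // 2, (w - out_w) // 2
    return [row[left:left + out_w] for row in frame[top:top + out_h]]

# S110 (digital zoom) is assumed done; capture a larger first image,
# then crop it down to the second size before display (S140).
first = capture_first_image(8, 6)          # first size: 8 x 6
second = crop_to_second_size(first, 4, 4)  # second size: 4 x 4
assert len(second) == 4 and len(second[0]) == 4
assert second[0][0] == (2, 1)              # top-left of the kept middle region
```

The second size here is fixed for clarity; later embodiments determine it dynamically or from the zoom setting.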
This embodiment provides an image processing method applicable to an electronic device including a capture module; the capture module here may include the camera shown in FIG. 1 or any existing camera structure.
In step S110 of this embodiment, the capture module is controlled to perform digital zoom in response to a zoom operation. For example, a user inputs a zoom operation to the electronic device by touching the touch screen with a finger; when the finger is withdrawn after zooming, the electronic device usually shakes. As another example, on a camera where the user zooms by rotating the hardware structure of the lens, there is also a high probability that the handheld device shakes after the user finishes the operation. Such shake may cause the captured image to blur, for example by ghosting.
In this embodiment, to improve image quality, for example to reduce blur and improve definition, the image is captured at a relatively large first size to obtain the first image. The first size here may correspond to a number of unit pixels.
Shake of an electronic device usually causes the capture module to move up, down, left, and right in the vertical plane where it is located, which causes image quality problems such as blur in some regions of the image.
In this embodiment, the electronic device crops the first image in the background to obtain a second image of a smaller size than the first image. For example, the first size includes S1 unit pixels and the second image may include S2 unit pixels, where S2 is smaller than S1 and the S2 unit pixels are a subset of the S1 unit pixels. The S1-S2 pixels cropped away are usually the part that does not satisfy the definition condition. In this way, cropping in the background of the electronic device makes the displayed image sharper, achieving anti-shake processing from the user's point of view and improving the intelligence of the electronic device.
In a specific implementation, the method further includes deleting the first image and storing the second image, facilitating subsequent viewing of the second image by the user.
In some embodiments, step S130 may include:
cropping away at least part of the peripheral region of the first image according to a cropping strategy, to form the second image.
In this embodiment, the electronic device stores a cropping strategy in advance, or receives a cropping strategy from another electronic device. Research shows that in images captured while the electronic device shakes, the blurred regions are usually located at the periphery of the image. In this embodiment, the peripheral region of the first image is cropped away mainly according to the cropping strategy, and the middle region of the first image is retained to form the second image.
The left part of FIG. 3A is the first image, the right part is the second image, and the portion corresponding to the dashed box in the first image is the second image. Clearly, the cropped second image is smaller in size than the first image, and the second image is the middle region of the first image.
The left part of FIG. 3B is the first image, the right part is the second image, and the portion corresponding to the dashed box in the first image is the second image. Clearly, the cropped second image is smaller in size than the first image, and the second image includes the middle region of the first image and part of the peripheral region.
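FIG. 3A corresponds to trimming all four sides, while FIG. 3B keeps part of the periphery; both can be seen as instances of trimming configurable margins. A small sketch with illustrative names, modelling a frame as a list of rows:

```python
def crop_margins(frame, top=0, bottom=0, left=0, right=0):
    """Trim the given number of pixels from each side of the frame."""
    h, w = len(frame), len(frame[0])
    return [row[left:w - right] for row in frame[top:h - bottom]]

frame = [[10 * y + x for x in range(6)] for y in range(4)]  # 6 x 4 frame

# FIG. 3A style: trim every side, keeping only the middle region.
middle = crop_margins(frame, top=1, bottom=1, left=1, right=1)
assert middle == [[11, 12, 13, 14], [21, 22, 23, 24]]

# FIG. 3B style: trim only some sides, keeping part of the periphery.
partial = crop_margins(frame, top=1, right=2)
assert len(partial) == 3 and len(partial[0]) == 4
```

A stored cropping strategy would simply supply the four margin values.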
In some embodiments, step S120 may include:
capturing N first images of the first size using the zoomed capture module, where N is an integer not smaller than 2;
step S130 may include:
comparing the N first images and, during cropping, retaining the overlapping region of the N first images to form the second image.
In this embodiment, the capture module captures multiple first images before outputting the second image; the first images are images captured at different moments while the electronic device shakes. Because shaking changes the scenery, people, and/or animals captured by the capture module, and the regions that change first are certainly the peripheral regions, as long as the shake amplitude of the electronic device is not especially large, some captured objects in two first images will certainly remain unchanged; these objects usually appear in the middle region of the first images and are usually sharp, that is, the overlapping region is the middle region of the N first images. To determine the size of the retained middle region, in this embodiment the N first images are compared and their overlapping region is selected and retained as the middle region, generating the second image. Clearly, the second image formed in this way is smaller in size than the first image.
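Under the assumption that the N frames differ only by a small translation, the retained region is the intersection of the frame rectangles once each frame's offset in shared scene coordinates is known (estimating the offsets themselves, e.g. by block matching, is outside this sketch; all names are illustrative):

```python
def overlap_region(rects):
    """Intersect axis-aligned rectangles (x0, y0, x1, y1); returns the
    region common to every frame, or None if they do not all overlap."""
    x0 = max(r[0] for r in rects)
    y0 = max(r[1] for r in rects)
    x1 = min(r[2] for r in rects)
    y1 = min(r[3] for r in rects)
    return (x0, y0, x1, y1) if x0 < x1 and y0 < y1 else None

# Three 8 x 6 frames captured while the device shakes by a few pixels:
frames = [(0, 0, 8, 6), (1, 0, 9, 6), (0, 1, 8, 7)]
assert overlap_region(frames) == (1, 1, 8, 6)
```

The result dynamically determines the second size: the larger the shake, the smaller the retained overlap.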
In some embodiments, the second size is the image size predetermined for output after zooming;
step S130 may include:
selecting the middle region of the first image that is equal to the second size as the retained region, and cropping the first image to form the second image.
Each digital zoom corresponds to an image size; in this embodiment, the second size is directly this predetermined image size. During cropping, an image region of the second size is retained with the center point of the first image as the center point of the second image, forming the second image. As mentioned above, since research shows that blur occurs more frequently in the peripheral regions, this clearly improves the definition of the second image and thus its image quality.
In some embodiments, step S130 may further include:
obtaining the definition of each region of the first image;
selecting regions whose definition satisfies a preset definition condition as retained regions, to form the second image.
In this embodiment, the first image may first be divided into multiple regions, and the definition of each region is then obtained using various processing methods, for example, edge gradient detection, correlation detection, or evaluation functions based on statistics and/or variation.
Next, multiple regions whose definition is greater than a preset threshold, or whose definition is relatively high, are selected as the retained cropping regions to form the second image.
Of course, in the method of this embodiment, according to statistical probability, regions of low definition most likely appear at the periphery of the first image, so the second image may also be formed based on an image strategy. It is also possible that parts of the peripheral region are sharp; in that case the retained region in this embodiment need not be limited to the middle region, so that a second image of maximum size can be obtained.
Optionally, step S130 may include: dividing the first image into N regions of equal area; then, according to the definition of the image, selecting M contiguously distributed regions satisfying the preset definition condition as retained regions, to form the second image, where M is smaller than or equal to N. For example, N equals 16 and M may equal 4. The M retained regions may be the middle region and/or peripheral region of the N regions. For example, step S130 may include:
selecting, from the N regions, M contiguously distributed regions whose definition satisfies the preset definition condition as the retained regions, to form the second image, where M is a positive integer smaller than N and N is the total number of regions of the first image.
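One crude realization: score each region with a gradient-energy measure (a stand-in for the edge gradient detection mentioned above) and keep a contiguous run of regions that clears a threshold. The 1-D cell layout and all names are simplifications for illustration; a real implementation would work on a 2-D grid:

```python
def sharpness(cell):
    """Gradient energy of one region: sum of absolute horizontal steps.
    Flat (blurred) content scores low, edges score high."""
    return sum(abs(row[i + 1] - row[i])
               for row in cell for i in range(len(row) - 1))

def pick_contiguous(cells, threshold):
    """Return the longest run of adjacent regions whose sharpness meets
    the threshold (the retained M of the N regions)."""
    best, run = [], []
    for idx, cell in enumerate(cells):
        if sharpness(cell) >= threshold:
            run.append(idx)
            if len(run) > len(best):
                best = list(run)
        else:
            run = []
    return best

sharp = [[0, 9, 0], [9, 0, 9]]   # strong edges
flat = [[5, 5, 5], [5, 5, 5]]    # blurred / featureless
cells = [flat, sharp, sharp, flat]
assert pick_contiguous(cells, threshold=10) == [1, 2]
```

Correlation-based or statistical evaluation functions could replace `sharpness` without changing the selection step.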
在一些实施例中,所述步骤S120可包括:在变焦后基于一个采集确定操作之后,采集多张等于所述第一尺寸的第一图像;相当于一个采集指令,设备会自动采集2个或2个以上的第一图像。
所述步骤S130可包括:比对N张所述第一图像的清晰度,选择最清晰的第一图像进行裁剪以获得裁剪后的所述第二图像。
在本实施例中，选择最清晰的一张进行裁剪即可，不必裁剪每一张第一图像；未裁剪的不够清晰的图像可以直接丢弃，减少存储数据量。
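“比对多张第一图像的清晰度并选择最清晰的一张”可以用如下Python草图示意（同样以梯度能量作为整体清晰度的简化度量，仅为假设性示例）：

```python
def sharpness(image):
    """全图梯度能量, 作为整体清晰度的简化度量(示意)。"""
    return sum((image[r][c + 1] - image[r][c]) ** 2
               for r in range(len(image))
               for c in range(len(image[0]) - 1))

def pick_sharpest(images):
    """比对多张第一图像的清晰度, 返回最清晰的一张;
    其余图像可直接丢弃, 以减少存储数据量。"""
    return max(images, key=sharpness)

blurred = [[5, 5, 5], [5, 5, 5]]   # 模糊: 灰度平坦, 梯度能量为 0
sharp = [[0, 9, 0], [9, 0, 9]]     # 清晰: 边缘明显, 梯度能量大
best = pick_sharpest([blurred, sharp])
```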
所述步骤S130可包括：当多张所述第一图像中没有一张图像具有等于所述第二尺寸、连续分布且满足预设清晰度条件的图像区域时，裁剪多张所述第一图像，获得满足所述预设清晰度条件的多个图像区域；拼接多个图像区域，获得满足所述预设清晰度条件的所述第二图像。
当没有一张第一图像可以提供第二尺寸的连续清晰区域时，在本实施例中可以通过裁剪，从不同的第一图像中获取较小的满足清晰度条件的图像区域，然后拼接形成第二图像，对于用户而言可以减少反复采集的次数。
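从不同第一图像拼接清晰区域的过程可以用如下Python草图示意（假设各区域已裁剪为相同高度且已对齐；专利文本未限定拼接的具体方式，这里仅作最简单的横向拼接，函数名为说明而设）：

```python
def stitch_horizontal(regions):
    """将多个等高的清晰图像区域从左到右拼接为第二图像(示意实现)。

    假设各区域已裁剪为相同高度且已对齐; 这里仅作
    最简单的横向拼接, 未处理接缝融合等细节。"""
    h = len(regions[0])
    assert all(len(reg) == h for reg in regions)
    return [sum((reg[r] for reg in regions), []) for r in range(h)]

# 两张第一图像各贡献一个满足清晰度条件的窄区域
region_a = [[1], [2]]
region_b = [[3], [4]]
second = stitch_horizontal([region_a, region_b])  # [[1, 3], [2, 4]]
```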
如图4所示,本实施例提供一种图像处理装置,包括:
变焦单元310,配置为响应变焦操作,控制采集模组进行数码变焦;
采集单元320，配置为利用变焦后所述采集模组采集第一尺寸的第一图像；
裁剪单元330，配置为裁剪所述第一图像，得到第二尺寸的第二图像；其中，所述第二尺寸小于所述第一尺寸；
显示单元340,配置为显示所述第二图像。
本实施例所述的图像处理装置可为应用于前述移动终端或相机或电子设备中的结构。
所述变焦单元310可对应于控制器,控制所述采集模组进行变焦。所述采集模组可包括采集镜头。所述采集单元320可为包括所述采集模组的结构,所述采集单元320还可包括控制器或控制电路,所述控制器或控制电路控制所述采集模组进行图像感光,然后进行图像采集。
所述裁剪单元330可对应于处理器或处理电路,所述处理器可包括中央处理器、微处理器、数字信号处理器、应用处理器或可编程阵列等。所述处理电路可包括专用集成电路等。所述处理器或处理电路可与存储介质连接,可通过读取存储介质中的可执行代码,实现上述裁剪单元330的裁剪操作。
所述显示单元340可对应于各种显示结构，例如，液晶显示屏、有机发光二极管OLED显示屏、投影显示屏、电子墨水显示屏或等离子显示屏等各种可以显示第二图像的显示结构。这样用户就可以通过显示单元340的显示，看到采集的图像，且本实施例中所述装置输出的第二图像相对于现有技术在变焦后输出的图像更加清晰，图像效果更佳。
在一些实施例中,所述裁剪单元330,配置为根据裁剪策略,裁剪掉所述第一图像的至少部分周边区域,形成所述第二图像。
所述装置还可包括存储单元，所述存储单元可对应于存储介质，用于存储所述裁剪策略。所述装置也可以包括通信单元，所述通信单元对应于通信接口，可以用于从外设接收所述裁剪策略。所述裁剪单元330与所述存储单元或所述通信单元连接，根据所述裁剪策略，裁剪掉所述第一图像的至少部分周边区域，获得清晰度更高的第二图像。若所述第一图像为矩形图像，则所述周边区域可根据其所在边，至少包括4个周边区域。所述裁剪单元330在裁剪的过程中，可以仅裁剪其中1、2或3个周边区域，保留部分周边区域形成所述第二图像。
在一些实施例中,所述采集单元320,配置为利用变焦后的所述采集模组,采集N张所述第一尺寸的第一图像;其中,所述N为不小于2的整数;
所述裁剪单元330,配置为比对N张所述第一图像,裁剪时保留所述N张所述第一图像中的重叠区域,形成所述第二图像。
在本实施例中所述采集单元320会在显示单元340最终显示第二图像之前,或在裁剪单元330裁剪之前,采集至少两张第一图像。所述裁剪单元330通过图像比对,确定所述第一图像的保留区域形成所述第二图像,这样的话,相当于动态确定了所述第二尺寸的面积。
在一些实施例中,所述第二尺寸为变焦后预定输出的图像尺寸;
所述裁剪单元330,配置为选择所述第一图像中等于所述第二尺寸的中间区域作为保留区域,裁剪所述第一图像,形成所述第二图像。
在本实施例中所述第二尺寸为静态设置的,相对于动态确定,具有实现更为简便的特点。
在一些实施例中，所述裁剪单元330，配置为获取所述第一图像各个区域的清晰度；选择清晰度满足预设清晰度条件的区域作为保留区域，形成所述第二图像。在本实施例中将会根据清晰度进行裁剪，这样的话，裁剪得到的第二图像至少满足清晰度条件，能够确保显示单元340显示的第二图像的清晰度。
所述裁剪单元330，可配置为从N个区域中选择清晰度满足所述预设清晰度条件且连续分布的M个区域作为所述保留区域，形成所述第二图像；其中，所述M为小于N的正整数；所述N为第一图像的区域的总个数。
所述采集单元320,配置为在变焦后基于一个采集确定操作之后,采集多张等于所述第一尺寸的第一图像;
所述裁剪单元,配置为比对N张所述第一图像的清晰度,选择最清晰的第一图像进行裁剪以获得裁剪后的所述第二图像。
可选地，所述裁剪单元330，可配置为当多张所述第一图像中没有一张图像具有等于所述第二尺寸、连续分布且满足预设清晰度条件的图像区域时，裁剪多张所述第一图像，获得满足所述预设清晰度条件的多个图像区域；拼接多个图像区域，获得满足所述预设清晰度条件的所述第二图像。
本发明实施例提供一种图像处理装置,可对应于移动终端,包括:
图像采集器,配置为采集图像;
存储器,配置为存储信息,所述信息至少包括所述图像采集器采集的图像及计算机程序;
显示器,用于显示信息;
处理器,分别与所述图像采集器、存储器及所述显示器连接,配置为通过执行所述计算机程序,控制所述图像处理装置至少执行以下步骤:
响应变焦操作,控制采集模组进行数码变焦;
利用变焦后所述采集模组采集第一尺寸的第一图像;
裁剪所述第一图像，得到第二尺寸的第二图像；其中，所述第二尺寸小于所述第一尺寸；
显示所述第二图像。
可选地,所述处理器还用于执行以下操作:
利用变焦后的所述采集模组,采集N张所述第一尺寸的第一图像;其中,所述N为不小于2的整数;
所述根据裁剪策略，裁剪掉所述第一图像的至少部分周边区域，形成所述第二图像，包括：
比对N张所述第一图像,裁剪时保留所述N张所述第一图像中的重叠区域,形成所述第二图像。
总之,本实施例提供的图像处理装置,可以执行前述任意一个技术方案提供的图像处理方法。
本实施例中所述的图像采集器可为各种能够进行图像采集的传感器或设备,例如,对应于图5中的照相机121或镜头等。
所述存储器可为包括各种类型的存储介质的器件,例如,随机存储器或只读存储器,或者,是包括闪存的磁盘存储器等,例如可如图5中的存储器160。
所述处理器可为各种类型的处理器，例如，中央处理器、微处理器、数字信号处理器、应用处理器、可编程阵列或专用集成电路等。所述处理器可为如图5中所示的控制器180。
所述处理器可以通过集成电路总线等总线接口与存储器及图像采集器连接,可以通过总线向存储器及图像采集器发送控制指令,控制图像采集和信息存储,同时可以从存储器和/或图像采集器读取数据,以实现图像处理。
本发明实施例还提供一种图像处理装置,包括:
图像采集器,配置为采集图像;
存储器,配置为存储信息,所述信息至少包括所述图像采集器采集的图像;
显示器,用于显示信息;
计算机程序,存储在所述存储器上;
处理器，分别与所述图像采集器、存储器及所述显示器连接，配置为通过执行所述计算机程序，实现任意一个或多个技术方案提供的图像处理方法。
本发明实施例还提供一种计算机存储介质，所述计算机存储介质中存储有计算机程序，所述计算机程序可用于实现任意一个或多个技术方案提供的图像处理方法。该计算机存储介质为非瞬间存储介质。
本实施例还提供一种移动终端,可以包括上述任意一个图像处理装置。所述移动终端可以以各种形式来实施。例如,本发明中描述的终端可以包括诸如移动电话、智能电话、笔记本电脑、数字广播接收器、个人数字助理(PDA)、平板电脑(PAD)、便携式多媒体播放器(PMP)、导航装置等等的移动终端以及诸如数字TV、台式计算机等等的固定终端。下面,假设终端是移动终端。然而,本领域技术人员将理解的是,除了特别用于移动目的的元件之外,根据本发明的实施方式的构造也能够应用于固定类型的终端。
图5为实现本发明各个实施例的移动终端100的硬件结构示意,如图5所示,移动终端100可以包括无线通信单元110、音频/视频(A/V)输入单元120、用户输入单元130、感测单元140、输出单元150、存储器160、接口单元170、控制器180和电源单元190等等。图5示出了具有各种组件的移动终端100,但是应理解的是,并不要求实施所有示出的组件。可以替代地实施更多或更少的组件。将在下面详细描述移动终端100的元件。
无线通信单元110通常包括一个或多个组件,其允许移动终端100与无线通信系统或网络之间的无线电通信。例如,无线通信单元110可以包括广播接收模块111、移动通信模块112、无线互联网模块113、短程通信模块114和位置信息模块115中的至少一个。
广播接收模块111经由广播信道从外部广播管理服务器接收广播信号和/或广播相关信息。广播信道可以包括卫星信道和/或地面信道。广播管理服务器可以是生成并发送广播信号和/或广播相关信息的服务器，或者接收之前生成的广播信号和/或广播相关信息并且将其发送给终端的服务器。广播信号可以包括TV广播信号、无线电广播信号、数据广播信号等等。而且，广播信号可以进一步包括与TV或无线电广播信号组合的广播信号。广播相关信息也可以经由移动通信网络提供，并且在该情况下，广播相关信息可以由移动通信模块112来接收。广播信号可以以各种形式存在，例如，其可以以数字多媒体广播（DMB）的电子节目指南（EPG）、数字视频广播手持（DVB-H）的电子服务指南（ESG）等等的形式而存在。广播接收模块111可以通过使用各种类型的广播系统接收广播信号。特别地，广播接收模块111可以通过使用诸如多媒体广播-地面（DMB-T）、数字多媒体广播-卫星（DMB-S）、数字视频广播-手持（DVB-H）、前向链路媒体（MediaFLO）的数据广播系统、地面数字广播综合服务（ISDB-T）等等的数字广播系统接收数字广播。广播接收模块111可以被构造为适合提供广播信号的各种广播系统以及上述数字广播系统。经由广播接收模块111接收的广播信号和/或广播相关信息可以存储在存储器160（或者其它类型的存储介质）中。
移动通信模块112将无线电信号发送到基站(例如,接入点、节点B等等)、外部终端以及服务器中的至少一个和/或从其接收无线电信号。这样的无线电信号可以包括语音通话信号、视频通话信号、或者根据文本和/或多媒体消息发送和/或接收的各种类型的数据。
无线互联网模块113支持移动终端100的无线互联网接入。无线互联网模块113可以内部或外部地耦接到终端。无线互联网模块113所涉及的无线互联网接入技术可以包括无线局域网(WLAN)、无线相容性认证(Wi-Fi)、无线宽带(Wibro)、全球微波互联接入(Wimax)、高速下行链路分组接入(HSDPA)等等。
短程通信模块114是用于支持短程通信的模块。短程通信技术的一些示例包括蓝牙TM、射频识别(RFID)、红外数据协会(IrDA)、超宽带(UWB)、 紫蜂TM等等。
位置信息模块115是用于检查或获取移动终端100的位置信息的模块。位置信息模块115的典型示例是全球定位系统(GPS)模块115。根据当前的技术,GPS模块115计算来自三个或更多卫星的距离信息和准确的时间信息并且对于计算的信息应用三角测量法,从而根据经度、纬度和高度准确地计算三维当前位置信息。当前,用于计算位置和时间信息的方法使用三颗卫星并且通过使用另外的一颗卫星校正计算出的位置和时间信息的误差。此外,GPS模块115能够通过实时地连续计算当前位置信息来计算速度信息。
A/V输入单元120用于接收音频或视频信号。A/V输入单元120可以包括相机121和麦克风122,相机121对在视频捕获模式或图像捕获模式中由图像捕获装置获得的静态图片或视频的图像数据进行处理。处理后的图像帧可以显示在显示单元151上。经相机121处理后的图像帧可以存储在存储器160(或其它存储介质)中或者经由无线通信单元110进行发送,可以根据移动终端100的构造提供两个或更多相机121。麦克风122可以在电话通话模式、记录模式、语音识别模式等等运行模式中经由麦克风接收声音(音频数据),并且能够将这样的声音处理为音频数据。处理后的音频(语音)数据可以在电话通话模式的情况下转换为可经由移动通信模块112发送到移动通信基站的格式输出。麦克风122可以实施各种类型的噪声消除(或抑制)算法以消除(或抑制)在接收和发送音频信号的过程中产生的噪声或者干扰。
用户输入单元130可以根据用户输入的命令生成键输入数据以控制移动终端100的各种操作。用户输入单元130允许用户输入各种类型的信息，并且可以包括键盘、锅仔片、触摸板（例如，检测由于被接触而导致的电阻、压力、电容等等的变化的触敏组件）、滚轮、摇杆等等。特别地，当触摸板以层的形式叠加在显示单元151上时，可以形成触摸屏。
感测单元140检测移动终端100的当前状态,(例如,移动终端100的打开或关闭状态)、移动终端100的位置、用户对于移动终端100的接触(即,触摸输入)的有无、移动终端100的取向、移动终端100的加速或减速移动和方向等等,并且生成用于控制移动终端100的操作的命令或信号。例如,当移动终端100实施为滑动型移动电话时,感测单元140可以感测该滑动型电话是打开还是关闭。另外,感测单元140能够检测电源单元190是否提供电力或者接口单元170是否与外部装置耦接。
接口单元170用作至少一个外部装置与移动终端100连接可以通过的接口。例如,外部装置可以包括有线或无线头戴式耳机端口、外部电源(或电池充电器)端口、有线或无线数据端口、存储卡端口(典型示例是通用串行总线USB端口)、用于连接具有识别模块的装置的端口、音频输入/输出(I/O)端口、视频I/O端口、耳机端口等等。识别模块可以是存储用于验证用户使用移动终端100的各种信息并且可以包括用户识别模块(UIM)、客户识别模块(SIM)、通用客户识别模块(USIM)等等。另外,具有识别模块的装置(下面称为“识别装置”)可以采取智能卡的形式,因此,识别装置可以经由端口或其它连接装置与移动终端100连接。
接口单元170可以用于接收来自外部装置的输入(例如,数据信息、电力等等)并且将接收到的输入传输到移动终端100内的一个或多个元件或者可以用于在移动终端100和外部装置之间传输数据。
另外,当移动终端100与外部底座连接时,接口单元170可以用作允许通过其将电力从底座提供到移动终端100的路径或者可以用作允许从底座输入的各种命令信号通过其传输到移动终端100的路径。从底座输入的各种命令信号或电力可以用作用于识别移动终端100是否准确地安装在底座上的信号。
输出单元150被构造为以视觉、音频和/或触觉方式提供输出信号(例如,音频信号、视频信号、警报信号、振动信号等等)。输出单元150可以包括显示单元151、音频输出模块152、警报单元153等等。
显示单元151可以显示在移动终端100中处理的信息。例如,当移动终端100处于电话通话模式时,显示单元151可以显示与通话或其它通信(例如,文本消息收发、多媒体文件下载等等)相关的用户界面(UI)或图形用户界面(GUI)。当移动终端100处于视频通话模式或者图像捕获模式时,显示单元151可以显示捕获的图像和/或接收的图像、示出视频或图像以及相关功能的UI或GUI等等。
同时,当显示单元151和触摸板以层的形式彼此叠加以形成触摸屏时,显示单元151可以用作输入装置和输出装置。显示单元151可以包括液晶显示器(LCD)、薄膜晶体管LCD(TFT-LCD)、有机发光二极管(OLED)显示器、柔性显示器、三维(3D)显示器等等中的至少一种。这些显示器中的一些可以被构造为透明状以允许用户从外部观看,这可以称为透明显示器,典型的透明显示器可以例如为TOLED(透明有机发光二极管)显示器等等。根据特定想要的实施方式,移动终端100可以包括两个或更多显示单元(或其它显示装置),例如,移动终端100可以包括外部显示单元(未示出)和内部显示单元(未示出)。触摸屏可用于检测触摸输入压力以及触摸输入位置和触摸输入面积。
音频输出模块152可以在移动终端100处于呼叫信号接收模式、通话模式、记录模式、语音识别模式、广播接收模式等等模式下时，将无线通信单元110接收的或者在存储器160中存储的音频数据转换为音频信号并且输出为声音。而且，音频输出模块152可以提供与移动终端100执行的特定功能相关的音频输出（例如，呼叫信号接收声音、消息接收声音等等）。音频输出模块152可以包括扬声器、蜂鸣器等等。
警报单元153可以提供输出以将事件的发生通知给移动终端100。典型的事件可以包括呼叫接收、消息接收、键信号输入、触摸输入等等。除了音频或视频输出之外,警报单元153可以以不同的方式提供输出以通知事件的发生。例如,警报单元153可以以振动的形式提供输出,当接收到呼叫、消息或一些其它进入通信(incoming communication)时,警报单元153可以提供触觉输出(即,振动)以将其通知给用户。通过提供这样的触觉输出,即使在用户的移动电话处于用户的口袋中时,用户也能够识别出各种事件的发生。警报单元153也可以经由显示单元151或音频输出模块152提供通知事件的发生的输出。
存储器160可以存储由控制器180执行的处理和控制操作的软件程序等等,或者可以暂时地存储已经输出或将要输出的数据(例如,电话簿、消息、静态图像、视频等等)。而且,存储器160可以存储关于当触摸施加到触摸屏时输出的各种方式的振动和音频信号的数据。
存储器160可以包括至少一种类型的存储介质,所述存储介质包括闪存、硬盘、多媒体卡、卡型存储器(例如,SD或DX存储器等等)、随机访问存储器(RAM)、静态随机访问存储器(SRAM)、只读存储器(ROM)、电可擦除可编程只读存储器(EEPROM)、可编程只读存储器(PROM)、磁性存储器、磁盘、光盘等等。而且,移动终端100可以与通过网络连接执行存储器160的存储功能的网络存储装置协作。
控制器180通常控制移动终端100的总体操作。例如,控制器180执行与语音通话、数据通信、视频通话等等相关的控制和处理。另外,控制器180可以包括用于再现或回放多媒体数据的多媒体模块181,多媒体模块181可以构造在控制器180内,或者可以构造为与控制器180分离。控制器180可以执行模式识别处理,以将在触摸屏上执行的手写输入或者图片绘制输入识别为字符或图像。
电源单元190在控制器180的控制下接收外部电力或内部电力并且提供操作各元件和组件所需的适当的电力。
这里描述的各种实施方式可以以使用例如计算机软件、硬件或其任何组合的计算机可读介质来实施。对于硬件实施,这里描述的实施方式可以通过使用特定用途集成电路(ASIC)、数字信号处理器(DSP)、数字信号处理装置(DSPD)、可编程逻辑装置(PLD)、现场可编程门阵列(FPGA)、处理器、控制器、微控制器、微处理器、被设计为执行这里描述的功能的电子单元中的至少一种来实施,在一些情况下,这样的实施方式可以在控制器180中实施。对于软件实施,诸如过程或功能的实施方式可以与允许执行至少一种功能或操作的单独的软件模块来实施。软件代码可以由以任何适当的编程语言编写的软件应用程序(或程序)来实施,软件代码可以存储在存储器160中并且由控制器180执行。
至此,已经按照其功能描述了移动终端100。下面,为了简要起见,将描述诸如折叠型、直板型、摆动型、滑动型移动终端100等等的各种类型的移动终端100中的滑动型移动终端100作为示例。因此,本发明能够应用于任何类型的移动终端100,并且不限于滑动型移动终端100。
如图5中所示的移动终端100可以被构造为利用经由帧或分组发送数据的诸如有线和无线通信系统以及基于卫星的通信系统来操作。
现在将参考图6描述其中根据本发明的移动终端100能够操作的通信系统。
这样的通信系统可以使用不同的空中接口和/或物理层。例如,由通信系统使用的空中接口包括例如频分多址(FDMA)、时分多址(TDMA)、码分多址(CDMA)和通用移动通信系统(UMTS)(特别地,长期演进(LTE))、全球移动通信系统(GSM)等等。作为非限制性示例,下面的描述涉及CDMA通信系统,但是这样的教导同样适用于其它类型的系统。
参考图6，CDMA无线通信系统可以包括多个移动终端100、多个基站（BS）270、基站控制器（BSC）275和移动交换中心（MSC）280。MSC 280被构造为与公共电话交换网络（PSTN）290形成接口。MSC 280还被构造为与可以经由回程线路耦接到基站270的BSC 275形成接口。回程线路可以根据若干已知的接口中的任一种来构造，所述接口包括例如E1/T1、ATM、IP、PPP、帧中继、HDSL、ADSL或xDSL。将理解的是，如图6中所示的系统可以包括多个BSC 275。
每个BS 270可以服务一个或多个分区(或区域),由多向天线或指向特定方向的天线覆盖的每个分区放射状地远离BS 270。或者,每个分区可以由用于分集接收的两个或更多天线覆盖。每个BS 270可以被构造为支持多个频率分配,并且每个频率分配具有特定频谱(例如,1.25MHz,5MHz等等)。
分区与频率分配的交叉可以被称为CDMA信道。BS 270也可以被称为基站收发器子系统(BTS)或者其它等效术语。在这样的情况下,术语“基站”可以用于笼统地表示单个BSC 275和至少一个BS 270。基站也可以被称为“蜂窝站”。或者,特定BS 270的各分区可以被称为多个蜂窝站。
如图6中所示,广播发射器(BT)295将广播信号发送给在系统内操作的移动终端100。如图5中所示的广播接收模块111被设置在移动终端100处以接收由BT295发送的广播信号。在图6中,示出了几个卫星300,例如可以采用全球定位系统(GPS)卫星300。卫星300帮助定位多个移动终端100中的至少一个。
在图6中，描绘了多个卫星300，但是理解的是，可以利用任何数目的卫星获得有用的定位信息。如图5中所示的GPS模块115通常被构造为与卫星300配合以获得想要的定位信息。替代GPS跟踪技术或者在GPS跟踪技术之外，可以使用可以跟踪移动终端100的位置的其它技术。另外，至少一个GPS卫星300可以选择性地或者额外地处理卫星DMB传输。
作为无线通信系统的一个典型操作，BS 270接收来自各种移动终端100的反向链路信号。移动终端100通常参与通话、消息收发和其它类型的通信。特定基站270接收的每个反向链路信号在特定BS 270内进行处理，获得的数据被转发给相关的BSC 275。BSC 275提供通话资源分配以及包括BS 270之间的软切换过程的协调在内的移动管理功能。BSC 275还将接收到的数据路由到MSC 280，MSC 280提供用于与PSTN 290形成接口的额外的路由服务。类似地，PSTN 290与MSC 280形成接口，MSC 280与BSC 275形成接口，并且BSC 275相应地控制BS 270以将正向链路信号发送到移动终端100。
移动终端中无线通信单元110的移动通信模块112基于移动终端内置的接入移动通信网络(如2G/3G/4G等移动通信网络)的必要数据(包括用户识别信息和鉴权信息)接入移动通信网络为移动终端用户的网页浏览、网络多媒体播放等业务传输移动通信数据(包括上行的移动通信数据和下行的移动通信数据)。
在本申请所提供的几个实施例中,应该理解到,所揭露的设备和方法,可以通过其它的方式实现。以上所描述的设备实施例仅仅是示意性的,例如,所述单元的划分,仅仅为一种逻辑功能划分,实际实现时可以有另外的划分方式,如:多个单元或组件可以结合,或可以集成到另一个系统,或一些特征可以忽略,或不执行。另外,所显示或讨论的各组成部分相互之间的耦合、或直接耦合、或通信连接可以是通过一些接口,设备或单元的间接耦合或通信连接,可以是电性的、机械的或其它形式的。
上述作为分离部件说明的单元可以是、或也可以不是物理上分开的，作为单元显示的部件可以是、或也可以不是物理单元，即可以位于一个地方，也可以分布到多个网络单元上；可以根据实际的需要选择其中的部分或全部单元来实现本实施例方案的目的。
另外,在本发明各实施例中的各功能单元可以全部集成在一个处理模块中,也可以是各单元分别单独作为一个单元,也可以两个或两个以上单元集成在一个单元中;上述集成的单元既可以采用硬件的形式实现,也可以采用硬件加软件功能单元的形式实现。
本领域普通技术人员可以理解:实现上述方法实施例的全部或部分步骤可以通过程序指令相关的硬件来完成,前述的程序可以存储于一计算机可读取存储介质中,该程序在执行时,执行包括上述方法实施例的步骤;而前述的存储介质包括:移动存储设备、只读存储器(ROM,Read-Only Memory)、随机存取存储器(RAM,Random Access Memory)、磁碟或者光盘等各种可以存储程序代码的介质。
以上所述,仅为本发明的具体实施方式,但本发明的保护范围并不局限于此,凡按照本发明原理所作的修改,都应当理解为落入本发明的保护范围。
工业实用性
本发明实施例提供的图像处理技术方案，在完成一次数码变焦之后，会采集面积较大的图像，然后通过图像裁剪得到面积较小且清晰度高的图像，作为最终的采集图像存储和/或输出。这样，相对于直接采集所需面积的图像时可能因边缘区域出现不清晰的现象而导致图像质量差的问题，本方案提升了用户最终看到的图像的清晰度，提升了用户体验，具有积极的工业效果，且实现简便，具有可在工业上广泛推广的应用前景。

Claims (20)

  1. 一种图像处理装置,包括:
    变焦单元,配置为响应变焦操作,控制采集模组进行数码变焦;
    采集单元,配置为利用变焦后所述采集模组采集第一尺寸的第一图像;
    裁剪单元，配置为裁剪所述第一图像，得到第二尺寸的第二图像；其中，所述第二尺寸小于所述第一尺寸；
    显示单元,配置为显示所述第二图像。
  2. 根据权利要求1所述的装置,其中,
    所述裁剪单元,配置为根据裁剪策略,裁剪掉所述第一图像的至少部分周边区域,形成所述第二图像。
  3. 根据权利要求2所述的装置,其中,
    所述采集单元,配置为利用变焦后的所述采集模组,采集N张所述第一尺寸的第一图像;其中,所述N为不小于2的整数;
    所述裁剪单元,配置为比对N张所述第一图像,裁剪时保留所述N张所述第一图像中的重叠区域,形成所述第二图像。
  4. 根据权利要求2所述的装置,其中,
    所述第二尺寸为变焦后预定输出的图像尺寸;
    所述裁剪单元,配置为选择所述第一图像中等于所述第二尺寸的中间区域作为保留区域,裁剪所述第一图像,形成所述第二图像。
  5. 根据权利要求1或2所述的装置,其中,
    所述裁剪单元,配置为获取所述第一图像各个区域的清晰度;选择清晰度满足预设清晰度条件的区域作为保留区域,形成所述第二图像。
  6. 根据权利要求1所述的装置,其中,
    所述裁剪单元，配置为从N个区域中选择清晰度满足所述预设清晰度条件且连续分布的M个区域作为所述保留区域，形成所述第二图像；其中，所述M为小于N的正整数；所述N为第一图像的区域的总个数。
  7. 根据权利要求6所述的装置,其中,
    所述采集单元,配置为在变焦后基于一个采集确定操作之后,采集多张等于所述第一尺寸的第一图像;
    所述裁剪单元,配置为比对N张所述第一图像的清晰度,选择最清晰的第一图像进行裁剪以获得裁剪后的所述第二图像。
  8. 根据权利要求1所述的装置，其中
    所述裁剪单元，配置为当多张所述第一图像中没有一张图像具有等于所述第二尺寸、连续分布且满足预设清晰度条件的图像区域时，裁剪多张所述第一图像，获得满足所述预设清晰度条件的多个图像区域；
    拼接多个图像区域,获得满足所述预设清晰度条件的所述第二图像。
  9. 一种图像处理方法,包括:
    响应变焦操作,控制采集模组进行数码变焦;
    利用变焦后所述采集模组采集第一尺寸的第一图像;
    裁剪所述第一图像，得到第二尺寸的第二图像；其中，所述第二尺寸小于所述第一尺寸；
    显示所述第二图像。
  10. 根据权利要求9所述的方法,其中,
    所述裁剪所述第一图像,得到第二尺寸的第二图像,包括:
    根据裁剪策略,裁剪掉所述第一图像的至少部分周边区域,形成所述第二图像。
  11. 根据权利要求10所述的方法,其中,
    所述利用变焦后所述采集模组采集第一尺寸的第一图像,包括:
    利用变焦后的所述采集模组，采集N张所述第一尺寸的第一图像；其中，所述N为不小于2的整数；
    所述根据裁剪策略,裁剪掉所述第一图像的至少部分周边区域,形成所述第二图像,包括:
    比对N张所述第一图像,裁剪时保留所述N张所述第一图像中的重叠区域,形成所述第二图像。
  12. 根据权利要求10所述的方法,其中,
    所述第二尺寸为变焦后预定输出的图像尺寸;
    所述根据裁剪策略,裁剪掉所述第一图像的至少部分周边区域,形成所述第二图像,包括:
    选择所述第一图像中等于所述第二尺寸的中间区域作为保留区域,裁剪所述第一图像,形成所述第二图像。
  13. 根据权利要求9或10所述的方法,其中,
    所述裁剪所述第一图像,得到第二尺寸的第二图像,包括:
    获取所述第一图像各个区域的清晰度;
    选择清晰度满足预设清晰度条件的区域作为保留区域,形成所述第二图像。
  14. 根据权利要求9所述的方法,其中,
    所述选择清晰度满足预设清晰度条件的区域作为保留区域,形成所述第二图像,包括:
    从N个区域中选择清晰度满足所述预设清晰度条件且连续分布的M个区域作为所述保留区域,形成所述第二图像;其中,所述M为小于N的正整数;所述N为第一图像的区域的总个数。
  15. 根据权利要求9所述的方法,其中,
    所述利用变焦后所述采集模组采集第一尺寸的第一图像,包括:
    在变焦后基于一个采集确定操作之后，采集多张等于所述第一尺寸的第一图像；
    所述裁剪所述第一图像，得到第二尺寸的第二图像；其中，所述第二尺寸小于所述第一尺寸，包括：
    比对N张所述第一图像的清晰度,选择最清晰的第一图像进行裁剪以获得裁剪后的所述第二图像。
  16. 根据权利要求9所述的方法，其中
    所述裁剪所述第一图像，得到第二尺寸的第二图像；其中，所述第二尺寸小于所述第一尺寸，包括：
    当多张所述第一图像中没有一张图像具有等于所述第二尺寸、连续分布且满足预设清晰度条件的图像区域时，裁剪多张所述第一图像，获得满足所述预设清晰度条件的多个图像区域；
    拼接多个图像区域,获得满足所述预设清晰度条件的所述第二图像。
  17. 一种图像处理装置,包括:
    图像采集器,配置为采集图像;
    存储器,配置为存储信息,所述信息至少包括所述图像采集器采集的图像及计算机程序;
    显示器,用于显示信息;
    处理器,分别与所述图像采集器、存储器及所述显示器连接,配置为通过执行所述计算机程序,控制所述图像处理装置至少执行以下步骤:
    响应变焦操作,控制采集模组进行数码变焦;
    利用变焦后所述采集模组采集第一尺寸的第一图像;
    裁剪所述第一图像，得到第二尺寸的第二图像；其中，所述第二尺寸小于所述第一尺寸；
    显示所述第二图像。
  18. 根据权利要求17所述的装置,其中,
    所述处理器还用于执行以下操作:
    利用变焦后的所述采集模组,采集N张所述第一尺寸的第一图像;其中,所述N为不小于2的整数;
    所述根据裁剪策略,裁剪掉所述第一图像的至少部分周边区域,形成所述第二图像,包括:
    比对N张所述第一图像,裁剪时保留所述N张所述第一图像中的重叠区域,形成所述第二图像。
  19. 一种图像处理装置,包括:
    图像采集器,配置为采集图像;
    存储器,配置为存储信息,所述信息至少包括所述图像采集器采集的图像;
    显示器,用于显示信息;
    计算机程序,存储在所述存储器上;
    处理器，分别与所述图像采集器、存储器及所述显示器连接，配置为通过执行所述计算机程序，实现权利要求9至16任一项提供的图像处理方法。
  20. 一种计算机存储介质,所述计算机存储介质中存储有计算机程序,所述计算机程序用于执行权利要求9至16任一项提供的图像处理方法。
PCT/CN2017/100949 2016-10-28 2017-09-07 图像处理装置及方法和计算机存储介质 WO2018076938A1 (zh)
