CN110490204B - Image processing method, image processing device and terminal


Info

Publication number
CN110490204B
CN110490204B (application number CN201910625191.3A)
Authority
CN
China
Prior art keywords
character
gray
image
pixel points
character image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201910625191.3A
Other languages
Chinese (zh)
Other versions
CN110490204A (en)
Inventor
傅博扬
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Yihua Computer Co Ltd
Shenzhen Yihua Time Technology Co Ltd
Shenzhen Yihua Financial Intelligent Research Institute
Original Assignee
Shenzhen Yihua Computer Co Ltd
Shenzhen Yihua Time Technology Co Ltd
Shenzhen Yihua Financial Intelligent Research Institute
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Yihua Computer Co Ltd, Shenzhen Yihua Time Technology Co Ltd, Shenzhen Yihua Financial Intelligent Research Institute filed Critical Shenzhen Yihua Computer Co Ltd
Priority to CN201910625191.3A
Publication of CN110490204A
Application granted
Publication of CN110490204B
Legal status: Active
Anticipated expiration

Classifications

    • G06T5/94
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/20 Image preprocessing
    • G06V10/26 Segmentation of patterns in the image field; Cutting or merging of image elements to establish the pattern region, e.g. clustering-based techniques; Detection of occlusion
    • G06V10/273 Segmentation of patterns in the image field; Cutting or merging of image elements to establish the pattern region, e.g. clustering-based techniques; Detection of occlusion removing elements interfering with the pattern to be recognised
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/10 Terrestrial scenes
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V30/00 Character recognition; Recognising digital ink; Document-oriented image-based pattern recognition
    • G06V30/10 Character recognition
    • G06V30/14 Image acquisition
    • G06V30/148 Segmentation of character regions
    • G06V30/153 Segmentation of character regions using recognition of characters or words

Abstract

The invention is applicable to the technical field of image recognition, and provides an image processing method, an image processing device, a terminal and a computer readable storage medium. Wherein the image processing method comprises: acquiring a character image corresponding to a character area in an image; calculating the gray average value of the character image; determining foreground pixel points of the character image according to a preset proportion; and carrying out gray enhancement processing on foreground pixel points with gray values smaller than the gray mean value in the character image. The method has higher robustness and computational efficiency, can effectively realize the enhancement and correction of the unclear characters in the image, and is beneficial to improving the recognition efficiency and the recognition accuracy of the characters in the image.

Description

Image processing method, image processing device and terminal
Technical Field
The present invention belongs to the field of image recognition technology, and in particular, to an image processing method, an image processing apparatus, a terminal, and a computer-readable storage medium.
Background
Currently, in bill-handling services, a bill processing device includes a bill issuing module and a bill receiving module; during issuing or receiving, an image sensor is required to capture an image of the bill.
However, in some cases, because of problems when characters are printed on the bill, the bill image may contain incompletely printed characters, missing characters, or characters with uneven ink. In addition, during bill image collection, the parameters of the image sensor and the angle of the fill light also affect the quality of the bill image. These problems leave characters on the bill image unclear, which reduces the recognition efficiency and recognition accuracy of the characters.
Disclosure of Invention
In view of this, the present invention provides an image processing method, an image processing apparatus, a terminal, and a computer-readable storage medium, so as to solve the prior-art problem that unclear characters on a bill image reduce the recognition efficiency and recognition accuracy of the characters.
A first aspect of an embodiment of the present invention provides an image processing method, including:
acquiring a character image corresponding to a character area in an image;
calculating the gray average value of the character image;
determining foreground pixel points of the character image according to a preset proportion;
and carrying out gray level enhancement processing on foreground pixel points with gray level values smaller than the gray level mean value in the character image.
A second aspect of an embodiment of the present invention provides an image processing apparatus including:
the character image acquisition unit is used for acquiring a character image corresponding to a character area in an image;
the first calculation unit is used for calculating the gray average value of the character image;
the foreground pixel determining unit is used for determining foreground pixel points of the character image according to a preset proportion;
and the gray processing unit is used for carrying out gray enhancement processing on the foreground pixel points with the gray values smaller than the gray mean value in the character image.
A third aspect of embodiments of the present invention provides a terminal, including a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor implements the steps of any of the image processing methods when executing the computer program.
A fourth aspect of the embodiments of the present invention provides a computer-readable storage medium storing a computer program which, when executed by a processor, implements the steps of the image processing method according to any one of the above.
Compared with the prior art, the invention has the following beneficial effects:
the method comprises the steps of obtaining a character image corresponding to a character area in an image; calculating the gray average value of the character image; determining foreground pixel points of the character image according to a preset proportion; and carrying out gray enhancement processing on foreground pixel points with gray values smaller than the gray mean value in the character image. The method does not relate to a fixed threshold value in the image processing process, does not adopt an iterative processing means, and only determines the foreground pixel points according to the preset proportion, so that the method has higher robustness and calculation efficiency, can effectively realize the enhancement and correction of the unclear characters in the image, and is favorable for improving the recognition efficiency and the recognition accuracy of the characters in the image.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present invention, the drawings required to be used in the embodiments or the prior art description will be briefly described below, and it is obvious that the drawings in the following description are only some embodiments of the present invention, and for those skilled in the art, other drawings may be obtained according to these drawings without inventive labor.
FIG. 1 is a flowchart illustrating an implementation of an image processing method according to an embodiment of the present invention;
FIG. 2 is a schematic structural diagram of an image processing apparatus according to an embodiment of the present invention;
FIG. 3 is a schematic diagram of a terminal according to an embodiment of the present invention.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth, such as particular system structures, techniques, etc. in order to provide a thorough understanding of the embodiments of the invention. It will be apparent, however, to one skilled in the art that the present invention may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present invention with unnecessary detail.
In order to make the objects, technical solutions and advantages of the present invention more apparent, the following description is made by way of specific embodiments with reference to the accompanying drawings.
Referring to fig. 1, it shows a flowchart of an implementation of the image processing method provided by the embodiment of the present invention, which is detailed as follows:
in step 101, a character image corresponding to a character area in an image is acquired;
in the embodiment of the invention, the image to be processed is an image containing characters, and the character image corresponding to the character area in the image is acquired so as to be convenient for subsequent processing of the character image. For example, the character image may be an image of a rectangular region including a single character, or may be an image of a rectangular region including a plurality of characters.
Optionally, the step 101 may include: locating a character area in the image; and performing character cutting on the character area to obtain a character image at least comprising one character.
In the embodiment of the present invention, when the method is applied to bill images, the coordinates of the character area can be obtained from the actual position of the characters on the bill, and the character area is then determined on the image according to those coordinates.
In another application scenario, the character area may also be located on the image according to a specific color of the characters to be processed.
After the character region is located, the character may be cut to obtain a single character image, or an image containing two characters, or an image containing three characters.
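As an illustration of step 101, the following is a minimal Python sketch, not the patent's implementation; the region coordinates, slice width, and function name are hypothetical values chosen only for the example.

```python
# Hypothetical sketch of step 101: crop the character area at known bill
# coordinates and cut it into character images. region and char_width are
# illustrative values, not values from the patent.
import cv2

def cut_character_images(image_path, region=(50, 120, 400, 40), char_width=20):
    """Return grayscale character images cut from a fixed character area.

    region is (x, y, width, height) of the character area on the bill image;
    char_width is the assumed pixel width of one printed character.
    """
    gray = cv2.imread(image_path, cv2.IMREAD_GRAYSCALE)
    x, y, w, h = region
    char_area = gray[y:y + h, x:x + w]
    # Cut the character area into equal-width slices, each holding at least one character.
    return [char_area[:, i:i + char_width] for i in range(0, w, char_width)]
```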
In step 102, a mean value of the gray levels of the character images is calculated.
In the embodiment of the present invention, the calculated gray-scale mean is a gray-scale mean of all pixel points on the character image.
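A minimal sketch of step 102, assuming the character image is held as a 2-D NumPy array of gray values:

```python
# Sketch of step 102: the gray mean over all pixel points of the character image.
import numpy as np

def gray_mean(char_img):
    # Accumulate in float to avoid uint8 overflow, then return a scalar mean.
    return float(np.mean(char_img, dtype=np.float64))
```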
In step 103, determining foreground pixel points of the character image according to a preset proportion;
In the embodiment of the present invention, the preset proportion is obtained by counting the gray values of foreground pixel points on a certain number of related character images. Illustratively, for bill images, a given character area on the bill is printed with the same ink. Take, for example, a character image with black characters on a white background: among all pixel points of the character image, the pixel points with the smallest gray values (the darkest) should be the pixel points corresponding to the black character pattern, that is, the foreground pixel points. In fact, the gray values of all pixel points belonging to the black character pattern should fall within a certain gray range. In other words, regardless of whether the character is a digit or a letter, and regardless of whether it is the digit 0 or the digit 9, the pixel points whose gray values lie within a certain small proportion at the low end of all gray values of the character image should be the pixel points of the black character pattern.
In the embodiment of the invention, a proportion can therefore be obtained by counting a certain number of character images printed with the same ink; the pixel points falling within that proportion of the gray values of a character image should all be foreground pixel points (or all be background pixel points).
Optionally, the step 103 may include:
counting the gray value of each pixel point of the character image; and determining the preset proportion of pixel points with the smallest gray values as foreground pixel points of the character image.
For example, for a character image with dark characters on a light background, regardless of the character shape, the smallest 15% of the gray values of all pixel points may correspond to the foreground pixel points, and the largest 85% may correspond to the background pixel points.
Therefore, based on the calculated proportion, the division of foreground pixel points and background pixel points of the character image can be realized.
Accordingly, for a character image with bright characters on a dark background, step 103 should instead be: counting the gray value of each pixel point of the character image; and determining the preset proportion of pixel points with the largest gray values as foreground pixel points of the character image.
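A minimal sketch of step 103 under the assumptions above; the 15% proportion is only the illustrative value from the example and would in practice come from the statistics described earlier:

```python
# Sketch of step 103: take the preset proportion of pixel points with the smallest
# (or, for bright characters on a dark background, the largest) gray values as the
# foreground pixel points of the character image.
import numpy as np

def foreground_mask(char_img, proportion=0.15, dark_on_light=True):
    """Return a boolean mask that is True at foreground pixel points."""
    if dark_on_light:
        # Dark characters on a light background: foreground has the smallest gray values.
        threshold = np.quantile(char_img, proportion)
        return char_img <= threshold
    # Bright characters on a dark background: foreground has the largest gray values.
    threshold = np.quantile(char_img, 1.0 - proportion)
    return char_img >= threshold
```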
In step 104, performing gray enhancement processing on foreground pixel points in the character image, of which the gray values are smaller than the gray mean value.
In the embodiment of the invention, the foreground pixel points corresponding to the character pattern in the character image can be determined through the preset proportion, and gray enhancement processing can then be carried out on those foreground pixel points so that the character pattern becomes clearer. Specifically, for a character image with dark characters on a light background, the gray values of the foreground pixel points can be further reduced so that they become darker; for a character image with bright characters on a dark background, the gray values of the foreground pixel points can be further increased so that they become brighter. In addition, gray enhancement processing can make the gray values of the foreground pixel points more uniform, so that the outline of the character pattern is clearer.
Optionally, the performing gray level enhancement processing on the foreground pixel point of which the gray level value is smaller than the gray level mean value in the character image includes: and replacing the gray value of the foreground pixel point with the gray value smaller than the gray average value in the character image with the specified gray value.
In the embodiment of the invention, the designated gray value can be determined based on the actual gray level of the character pattern. The foreground pixel points whose gray values are smaller than the gray mean of the character image are foreground pixel points of low pixel quality, and replacing their gray values with the designated gray value corrects the gray level of these low-quality foreground pixel points, making the character pattern clearer.
Optionally, before replacing the gray value of the foreground pixel point with the gray value smaller than the gray mean value in the character image with the designated gray value, the method further includes:
counting the gray average value of foreground pixel points in the character image; and recording the gray average value of the foreground pixel points in the character image as the designated gray value.
In the embodiment of the invention, the counted gray mean of the foreground pixel points in the character image can be used as the designated gray value, and gray replacement is performed on the foreground pixel points in the character image whose gray values are smaller than the gray mean of the character image.
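Putting these refinements together, one possible sketch of step 104 follows; the implementation details are assumptions where the patent leaves them open, and the mask produced by the previous sketch can be passed in as fg_mask.

```python
# Sketch of step 104: use the gray mean of the foreground pixel points as the
# designated gray value, then replace every foreground pixel point whose gray value
# is below the gray mean of the whole character image with that designated value.
import numpy as np

def enhance_characters(char_img, fg_mask):
    """Return a copy of char_img with low-quality foreground pixel points corrected."""
    out = char_img.copy()
    image_mean = out.mean()                       # step 102: gray mean of the whole image
    designated = out[fg_mask].mean()              # designated gray value (foreground mean)
    low_quality = fg_mask & (out < image_mean)    # foreground points below the image mean
    out[low_quality] = np.uint8(round(designated))
    return out
```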
Optionally, after step 104, the method may further include performing character recognition based on the enhanced character image. In the enhanced character image, the gray values corresponding to the character pattern are more uniform, the gray difference between them and the background pixel points is larger, and the character pattern is clearer, so the efficiency and accuracy of character recognition can be improved.
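The patent does not name a recognition engine; purely as an illustrative assumption, the enhanced character image could be handed to an off-the-shelf OCR library such as Tesseract via pytesseract:

```python
# Illustrative only: recognize a single enhanced character image with Tesseract.
import pytesseract
from PIL import Image

def recognize_character(enhanced_img):
    # --psm 10 tells Tesseract to treat the image as a single character.
    return pytesseract.image_to_string(
        Image.fromarray(enhanced_img), config="--psm 10"
    ).strip()
```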
In the invention, the character image corresponding to the character area in the image is obtained; calculating the gray average value of the character image; determining foreground pixel points of the character image according to a preset proportion; and carrying out gray enhancement processing on foreground pixel points with gray values smaller than the gray mean value in the character image. The method does not relate to a fixed threshold value in the image processing process, does not adopt an iterative processing means, only determines the foreground pixel points according to a preset proportion, has higher robustness and calculation efficiency, can effectively realize the enhancement and correction of the unclear characters in the image, and is beneficial to improving the recognition efficiency and the recognition accuracy of the characters in the image.
It should be understood that, the sequence numbers of the steps in the foregoing embodiments do not imply an execution sequence, and the execution sequence of each process should be determined by functions and internal logic of the process, and should not limit the implementation process of the embodiments of the present invention in any way.
The following are embodiments of the apparatus of the invention, reference being made to the corresponding method embodiments described above for details which are not described in detail therein.
Fig. 2 is a schematic structural diagram of an image processing apparatus according to an embodiment of the present invention, and for convenience of description, only the portions related to the embodiment of the present invention are shown, and the details are as follows:
as shown in fig. 2, the image processing apparatus 2 includes: a character image acquisition unit 21, a first calculation unit 22, a foreground pixel determination unit 23, and a gray processing unit 24.
A character image acquisition unit 21 configured to acquire a character image corresponding to a character area in an image;
a first calculation unit 22 for calculating a grayscale mean of the character image;
a foreground pixel determining unit 23, configured to determine foreground pixels of the character image according to a preset ratio;
and the gray processing unit 24 is configured to perform gray enhancement processing on foreground pixels in the character image, where the gray values of the foreground pixels are smaller than the gray mean value.
Optionally, the image processing apparatus 2 further includes:
a character area positioning unit for positioning a character area in the image;
the character image obtaining unit is specifically configured to perform character cutting on the character area to obtain a character image including at least one character.
Optionally, the image processing apparatus 2 further includes:
the gray value counting unit is used for counting the gray value of each pixel point of the character image;
the foreground pixel determining unit 23 is specifically configured to determine the preset proportion of pixel points with the smallest gray values as foreground pixel points of the character image.
Optionally, the gray processing unit 24 is specifically configured to replace the gray value of the foreground pixel point in the character image, of which the gray value is smaller than the gray average value, with a specified gray value.
Optionally, the image processing apparatus 2 further includes:
and a designated gray value determining unit, configured to count the gray mean of the foreground pixel points in the character image before the gray processing unit 24 replaces the gray values of the foreground pixel points whose gray values are smaller than the gray mean, and to record the gray mean of the foreground pixel points in the character image as the designated gray value.
In the invention, the character image corresponding to the character area in the image is obtained; calculating the gray average value of the character image; determining foreground pixel points of the character image according to a preset proportion; and carrying out gray level enhancement processing on foreground pixel points with gray level values smaller than the gray level mean value in the character image. The method does not relate to a fixed threshold value in the image processing process, does not adopt an iterative processing means, only determines the foreground pixel points according to a preset proportion, has higher robustness and calculation efficiency, can effectively realize the enhancement and correction of the unclear characters in the image, and is beneficial to improving the recognition efficiency and the recognition accuracy of the characters in the image.
Fig. 3 is a schematic diagram of a terminal according to an embodiment of the present invention. As shown in fig. 3, the terminal 3 of this embodiment includes: a processor 30, a memory 31 and a computer program 32 stored in said memory 31 and executable on said processor 30. The processor 30, when executing the computer program 32, implements the steps in the various image processing method embodiments described above, such as the steps 101 to 104 shown in fig. 1. Alternatively, the processor 30, when executing the computer program 32, implements the functions of the modules/units in the above-mentioned device embodiments, such as the functions of the units 21 to 24 shown in fig. 2.
Illustratively, the computer program 32 may be partitioned into one or more modules/units, which are stored in the memory 31 and executed by the processor 30 to implement the present invention. The one or more modules/units may be a series of computer program instruction segments capable of performing specific functions, and these segments are used to describe the execution process of the computer program 32 in the terminal 3. For example, the computer program 32 may be divided into the character image acquisition unit 21, the first calculation unit 22, the foreground pixel determination unit 23 and the gray processing unit 24, and the specific functions of each unit are as follows:
a character image acquisition unit 21 configured to acquire a character image corresponding to a character area in an image;
a first calculation unit 22 for calculating a gray average value of the character image;
a foreground pixel determining unit 23, configured to determine a foreground pixel of the character image according to a preset ratio;
and the gray processing unit 24 is configured to perform gray enhancement processing on foreground pixels in the character image, where the gray values of the foreground pixels are smaller than the gray mean value.
The terminal 3 may be a desktop computer, a notebook, a palm computer, a cloud server, or other computing devices. The terminal may include, but is not limited to, a processor 30, a memory 31. It will be appreciated by those skilled in the art that fig. 3 is only an example of a terminal 3 and does not constitute a limitation of the terminal 3 and may comprise more or less components than those shown, or some components may be combined, or different components, e.g. the terminal may further comprise input output devices, network access devices, buses, etc.
The processor 30 may be a Central Processing Unit (CPU), another general-purpose processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, etc. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor.
The memory 31 may be an internal storage unit of the terminal 3, such as a hard disk or a memory of the terminal 3. The memory 31 may also be an external storage device of the terminal 3, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) Card, a Flash memory Card (Flash Card) and the like provided on the terminal 3. Further, the memory 31 may also include both an internal storage unit and an external storage device of the terminal 3. The memory 31 is used for storing the computer program and other programs and data required by the terminal. The memory 31 may also be used to temporarily store data that has been output or is to be output.
It should be clear to those skilled in the art that, for convenience and simplicity of description, the foregoing division of the functional units and modules is only used for illustration, and in practical applications, the above function distribution may be performed by different functional units and modules as needed, that is, the internal structure of the apparatus may be divided into different functional units or modules to perform all or part of the above described functions. Each functional unit and module in the embodiments may be integrated in one processing unit, or each unit may exist alone physically, or two or more units are integrated in one unit, and the integrated unit may be implemented in a form of hardware, or in a form of software functional unit. In addition, specific names of the functional units and modules are only for convenience of distinguishing from each other, and are not used for limiting the protection scope of the present application. For the specific working processes of the units and modules in the system, reference may be made to the corresponding processes in the foregoing method embodiments, which are not described herein again.
In the above embodiments, the descriptions of the respective embodiments have respective emphasis, and reference may be made to the related descriptions of other embodiments for parts that are not described or illustrated in a certain embodiment.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention.
In the embodiments provided in the present invention, it should be understood that the disclosed apparatus/terminal and method may be implemented in other ways. For example, the above-described apparatus/terminal embodiments are merely illustrative, and for example, the division of the modules or units is only one type of logical function division, and other division manners may exist in actual implementation, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be through some interfaces, indirect coupling or communication connection of devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one position, or may be distributed on multiple network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated modules/units, if implemented in the form of software functional units and sold or used as separate products, may be stored in a computer readable storage medium. Based on such understanding, all or part of the flow of the method according to the embodiments of the present invention may also be implemented by a computer program, which may be stored in a computer-readable storage medium, and when the computer program is executed by a processor, the steps of the method embodiments described above may be implemented. The computer program comprises computer program code, which may be in the form of source code, object code, an executable file, or some intermediate form. The computer-readable medium may include: any entity or device capable of carrying the computer program code, a recording medium, a USB flash drive, a removable hard disk, a magnetic disk, an optical disk, a computer memory, a Read-Only Memory (ROM), a Random Access Memory (RAM), an electrical carrier signal, a telecommunications signal, a software distribution medium, and the like. It should be noted that the content contained in the computer readable medium may be appropriately increased or decreased as required by legislation and patent practice in the jurisdiction; for example, in some jurisdictions, computer readable media do not include electrical carrier signals or telecommunications signals in accordance with legislation and patent practice.
The above-mentioned embodiments are only used for illustrating the technical solutions of the present invention, and not for limiting the same; although the present invention has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; such modifications and substitutions do not substantially depart from the spirit and scope of the embodiments of the present invention, and are intended to be included within the scope of the present invention.

Claims (6)

1. An image processing method, characterized in that the image processing method comprises:
determining a character area corresponding to a character in an image according to the actual position of the character in the image to be processed or the specific color of the character;
performing character cutting on the character area to obtain a character image at least comprising one character;
calculating the gray average value of the character image;
determining foreground pixel points of the character images according to a preset proportion, wherein the preset proportion is obtained by statistics according to gray value ratios of the foreground pixel points on a preset number of related character images;
counting the gray average value of foreground pixel points in the character image;
recording the gray average value of foreground pixel points in the character image as an appointed gray value;
and carrying out gray level enhancement processing on the foreground pixel points with the gray level values smaller than the gray level mean value of the character image in the character image, wherein the enhancement processing mode is to replace the gray level values of the foreground pixel points with the gray level values smaller than the gray level mean value in the character image with the specified gray level values.
2. The image processing method according to claim 1, wherein the determining foreground pixel points of the character image according to a preset ratio comprises:
counting the gray value of each pixel point of the character image;
and determining the preset proportion of pixel points with the smallest gray values as foreground pixel points of the character image.
3. The image processing method according to claim 1 or 2, further comprising, after performing gray enhancement processing on foreground pixel points in the character image whose gray values are smaller than the gray mean:
and performing character recognition based on the character image after the enhancement processing.
4. An image processing apparatus characterized by comprising:
the determining unit is used for determining a character area corresponding to the character in the image according to the actual position of the character in the image to be processed or the specific color of the character;
the character image acquisition unit is used for carrying out character cutting on the character area to obtain a character image at least comprising one character;
the first calculating unit is used for calculating the gray average value of the character image;
the foreground pixel determining unit is used for determining foreground pixels of the character images according to a preset proportion, and the preset proportion is obtained through statistics according to gray value proportion of the foreground pixels on a preset number of related character images;
the second calculation unit is used for counting the gray average value of foreground pixel points in the character image;
the execution unit is used for recording the gray average value of the foreground pixel points in the character image as an appointed gray value;
and the gray processing unit is used for carrying out gray enhancement processing on the foreground pixel points with the gray values smaller than the gray mean value of the character image in the character image, wherein the enhancement processing mode is to replace the gray values of the foreground pixel points with the gray values smaller than the gray mean value of the character image with the specified gray values.
5. A terminal comprising a memory, a processor and a computer program stored in the memory and executable on the processor, characterized in that the processor implements the steps of the image processing method according to any of the preceding claims 1 to 3 when executing the computer program.
6. A computer-readable storage medium, in which a computer program is stored which, when being executed by a processor, carries out the steps of the image processing method according to any one of claims 1 to 3.
CN201910625191.3A 2019-07-11 2019-07-11 Image processing method, image processing device and terminal Active CN110490204B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910625191.3A CN110490204B (en) 2019-07-11 2019-07-11 Image processing method, image processing device and terminal

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910625191.3A CN110490204B (en) 2019-07-11 2019-07-11 Image processing method, image processing device and terminal

Publications (2)

Publication Number Publication Date
CN110490204A CN110490204A (en) 2019-11-22
CN110490204B true CN110490204B (en) 2022-07-15

Family

ID=68546994

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910625191.3A Active CN110490204B (en) 2019-07-11 2019-07-11 Image processing method, image processing device and terminal

Country Status (1)

Country Link
CN (1) CN110490204B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111080554B (en) * 2019-12-20 2023-08-04 成都极米科技股份有限公司 Method and device for enhancing subtitle region in projection content and readable storage medium
CN112017352B (en) * 2020-09-03 2022-12-06 平安科技(深圳)有限公司 Certificate authentication method, device, equipment and readable storage medium
CN113781351B (en) * 2021-09-16 2023-12-08 广州安方生物科技有限公司 Image processing method, apparatus and computer readable storage medium


Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6173542B1 (en) * 2016-08-10 2017-08-02 株式会社Pfu Image processing apparatus, image processing method, and program

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102136064A (en) * 2011-03-24 2011-07-27 成都四方信息技术有限公司 System for recognizing characters from image
CN106247969A (en) * 2016-09-21 2016-12-21 哈尔滨工业大学 A kind of deformation detecting method of industrial magnetic core element based on machine vision
CN108205671A (en) * 2016-12-16 2018-06-26 浙江宇视科技有限公司 Image processing method and device
CN107992874A (en) * 2017-12-20 2018-05-04 武汉大学 Image well-marked target method for extracting region and system based on iteration rarefaction representation

Also Published As

Publication number Publication date
CN110490204A (en) 2019-11-22

Similar Documents

Publication Publication Date Title
CN110490204B (en) Image processing method, image processing device and terminal
CN107403421B (en) Image defogging method, storage medium and terminal equipment
CN109977949B (en) Frame fine adjustment text positioning method and device, computer equipment and storage medium
CN111368587B (en) Scene detection method, device, terminal equipment and computer readable storage medium
CN110969046B (en) Face recognition method, face recognition device and computer-readable storage medium
CN109214996B (en) Image processing method and device
CN110675334A (en) Image enhancement method and device
CN110648284B (en) Image processing method and device with uneven illumination
CN114943649A (en) Image deblurring method, device and computer readable storage medium
CN113487473B (en) Method and device for adding image watermark, electronic equipment and storage medium
CN108268868B (en) Method and device for acquiring inclination value of identity card image, terminal and storage medium
CN111369531B (en) Image definition scoring method, device and storage device
CN109801428B (en) Method and device for detecting edge straight line of paper money and terminal
CN111311610A (en) Image segmentation method and terminal equipment
CN111539975A (en) Method, device and equipment for detecting moving target and storage medium
CN116883336A (en) Image processing method, device, computer equipment and medium
CN110633705A (en) Low-illumination imaging license plate recognition method and device
CN113391779B (en) Parameter adjusting method, device and equipment for paper-like screen
CN114998282A (en) Image detection method, image detection device, electronic equipment and storage medium
CN107103321B (en) The generation method and generation system of road binary image
CN114219760A (en) Reading identification method and device of instrument and electronic equipment
CN114267035A (en) Document image processing method and system, electronic device and readable medium
CN112541899A (en) Incomplete certificate detection method and device, electronic equipment and computer storage medium
CN111062984A (en) Method, device and equipment for measuring area of video image region and storage medium
CN112530079B (en) Method, device, terminal equipment and storage medium for detecting bill factors

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant