CN105931249B - Test method and test device for Camera imaging - Google Patents
- Publication number
- CN105931249B (application CN201610302424.2A)
- Authority
- CN
- China
- Prior art keywords
- value
- camera
- test
- color block
- pixel
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N17/00—Diagnosis, testing or measuring for television systems or their details
- H04N17/002—Diagnosis, testing or measuring for television systems or their details for television cameras
Abstract
The invention discloses a test method and test device for Camera imaging, to solve the prior-art problems of low image-inspection efficiency and a high miss rate. The method is as follows: after a test instruction sent by a host computer is received and the Camera is determined to be in test mode, the original YUV data corresponding to each pixel of the Camera's current preview frame is collected; all target areas are screened out according to the collected original YUV data of each pixel; and the imaging test of the Camera is completed according to the screened-out target areas. In this way, automatic testing of Camera imaging is realized, saving considerable human resources while improving test efficiency and accuracy.
Description
Technical field
The present invention relates to the field of digital imaging technology, and in particular to a test method and test device for Camera imaging.
Background art
With the continuous development of electronic technology, the types of electronic products have become increasingly diverse, and people's functional requirements for electronic products have risen accordingly. To meet these needs and guarantee product quality, the various functions of an electronic product are usually tested during production. For example, the photographing function of an electronic product's Camera is inspected: whether the lens has stains or dead pixels, whether the imaging shows a color cast, and whether focusing is sharp.
In the prior art, the Camera function of an electronic product is generally tested manually: an inspector visually checks the lens for stains and dead pixels, takes photos with the product, and then manually inspects the photos for color cast and focus. This not only consumes considerable human resources, but also suffers from low detection efficiency, a high miss rate, and low accuracy.
Summary of the invention
Embodiments of the present invention provide a test method and test device for Camera imaging, to solve the prior-art problems of low image-inspection efficiency, a high miss rate, and low accuracy.
The specific technical solutions provided by embodiments of the present invention are as follows:
A test method for Camera imaging, comprising:
when a UE determines that its Camera is in test mode, collecting the original YUV data corresponding to each pixel in the Camera's current preview frame;
the UE screening out all target areas based on the original YUV data corresponding to each pixel;
the UE screening out all color block areas from the target areas based on the number of pixels in each target area, and completing the imaging test of the Camera based on each color block area.
Preferably, the UE screening out all target areas based on the original YUV data corresponding to each pixel comprises:
the UE extracting the Y value from the original YUV data corresponding to each pixel;
the UE screening out, based on the Y value corresponding to each pixel, all target areas whose Y values are less than a preset first threshold.
Preferably, the UE screening out all color block areas from the target areas based on the number of pixels in each target area comprises:
the UE traversing each target area, and when determining that the number of pixels in any target area falls within a preset first range, determining that the target area is a color block area.
Preferably, the method further comprises:
when the UE determines that the number of pixels in any target area is not within the preset first range, determining that the target area is a dead-pixel area or a stain area.
Preferably, the UE completing the imaging test of the Camera based on each color block area comprises:
the UE screening out the black-and-white color block from all color block areas, and completing the focus test of the Camera based on the black-and-white color block;
the UE screening out the green, blue, and red color blocks from all color block areas, and completing the color cast test of the Camera based on the green, blue, and red color blocks.
Preferably, the UE completing the focus test of the Camera based on the black-and-white color block comprises:
the UE, based on the Y value of each pixel in the black-and-white color block, successively calculating the Y-value difference between every two adjacent pixels, starting from each pixel;
the UE screening out all Y-value differences that are not less than a preset second threshold, and when determining that the number of such Y-value differences is greater than a preset third threshold, determining that the Camera's focus is normal.
Preferably, the UE completing the color cast test of the Camera based on the green, blue, and red color blocks comprises:
the UE converting the collected original YUV data of all pixels into corresponding RGB data, and extracting the R value corresponding to the red color block and the B value corresponding to the blue color block;
the UE choosing one area from the regions other than the color block areas, and calculating the ratio of the R value to the G value and the ratio of the B value to the G value in that area;
the UE determining that the imaging of the Camera is reddish when the R/G ratio is greater than 1 and the R value corresponding to the red color block is greater than a preset fourth threshold; alternatively,
the UE determining that the imaging of the Camera is bluish when the B/G ratio is greater than 1 and the B value corresponding to the blue color block is greater than a preset fifth threshold.
A test device for Camera imaging, comprising:
a collection unit, configured to collect the original YUV data corresponding to each pixel in the Camera's current preview frame when determining that the Camera is in test mode;
a screening unit, configured to screen out all target areas based on the original YUV data corresponding to each pixel;
a test unit, configured to screen out all color block areas from the target areas based on the number of pixels in each target area, and to complete the imaging test of the Camera based on each color block area.
Preferably, when screening out all target areas based on the original YUV data corresponding to each pixel, the screening unit is configured to:
extract the Y value from the original YUV data corresponding to each pixel;
screen out, based on the Y value corresponding to each pixel, all target areas whose Y values are less than the preset first threshold.
Preferably, when screening out all color block areas from the target areas based on the number of pixels in each target area, the test unit is configured to:
traverse each target area, and when determining that the number of pixels in any target area falls within the preset first range, determine that the target area is a color block area.
Preferably, the test unit is further configured to:
when determining that the number of pixels in any target area is not within the preset first range, determine that the target area is a dead-pixel area or a stain area.
Preferably, when completing the imaging test of the Camera based on each color block area, the test unit is configured to:
screen out the black-and-white color block from all color block areas, and complete the focus test of the Camera based on the black-and-white color block;
screen out the green, blue, and red color blocks from all color block areas, and complete the color cast test of the Camera based on the green, blue, and red color blocks.
Preferably, when completing the focus test of the Camera based on the black-and-white color block, the test unit is configured to:
based on the Y value of each pixel in the black-and-white color block, successively calculate the Y-value difference between every two adjacent pixels, starting from each pixel;
screen out all Y-value differences that are not less than the preset second threshold, and when determining that the number of such Y-value differences is greater than the preset third threshold, determine that the Camera's focus is normal.
Preferably, when completing the color cast test of the Camera based on the green, blue, and red color blocks, the test unit is configured to:
convert the collected original YUV data of all pixels into corresponding RGB data, and extract the R value corresponding to the red color block and the B value corresponding to the blue color block;
choose one area from the regions other than the color block areas, and calculate the R/G ratio and the B/G ratio in that area;
determine that the imaging of the Camera is reddish when the R/G ratio is greater than 1 and the R value corresponding to the red color block is greater than the preset fourth threshold; alternatively,
determine that the imaging of the Camera is bluish when the B/G ratio is greater than 1 and the B value corresponding to the blue color block is greater than the preset fifth threshold.
The embodiments of the present invention have the following beneficial effects: in the embodiments of the present invention, as long as the UE determines that its Camera is in test mode, it automatically performs the imaging test on its own Camera. No manual inspection is needed, which saves considerable human resources while improving test efficiency and accuracy. In addition, the imaging test of the Camera can be realized on the UE side, and only the original YUV data corresponding to each pixel in the current preview frame needs to be analyzed, without taking a photo and uploading it to the host computer for the imaging test; this saves processing time and further improves test efficiency.
Brief description of the drawings
Fig. 1 is a simplified flow diagram of the Camera imaging test method in an embodiment of the present invention;
Fig. 2 is an overview diagram of the Camera imaging test method in an embodiment of the present invention;
Fig. 3 is a structural diagram of the grayscale image in an embodiment of the present invention;
Fig. 4 is a structural diagram of the highlighted image in an embodiment of the present invention;
Fig. 5 is a structural diagram of the black-and-white color block when focus is normal in an embodiment of the present invention;
Fig. 6 is a detailed flow diagram of the Camera imaging test method in an embodiment of the present invention;
Fig. 7 is a functional structure diagram of the Camera imaging test device in an embodiment of the present invention.
Detailed description of the embodiments
The technical solutions in the embodiments of the present invention will be described clearly and completely below with reference to the accompanying drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present invention. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments of the present invention without creative effort shall fall within the protection scope of the present invention.
To solve the prior-art problems of low image-inspection efficiency, a high miss rate, and low accuracy, embodiments of the present invention provide a method for automatically testing Camera imaging. As shown in Fig. 1, the host computer, through the inspection software in the UE, writes the test items and the related test configuration parameters into kernel nodes of the UE in advance, and after determining that the write is complete, sends a test instruction through the inspection software to the hardware abstraction layer (Hardware Abstraction Layer, HAL) in the UE. After receiving the test instruction, the HAL directly performs the imaging test on the Camera based on the test items and test configuration parameters read from the kernel nodes, and writes the test results into the kernel nodes. When the inspection software in the UE determines that the Camera test is complete, it reads the test results from the kernel nodes and reports them to the host computer. The host computer can then judge the imaging test of the UE's Camera according to the reported test results.
The solution of the present invention is described in detail below through specific embodiments; of course, the present invention is not limited to the following embodiments.
As shown in Fig. 2, in embodiments of the present invention the imaging test may, without limitation, be completed by the UE alone, which also determines the abnormal conditions of its Camera; or the host computer may instruct the UE to perform the imaging test on its own Camera, with the UE executing the test and the host computer determining the abnormal conditions. The UE may be any electronic product with a Camera function, such as a mobile phone, a tablet computer, or a digital camera. The following description takes only the case where the host computer instructs the UE to test its own Camera as an example. The detailed flow of the Camera imaging test method is as follows:
Step 200: after the UE receives the test instruction sent by the host computer and determines that the Camera is in test mode, it collects the original YUV data corresponding to each pixel in the Camera's current preview frame.
Specifically, step 200 may, without limitation, be performed as follows:
First, the host computer sends a write instruction to the inspection software in the UE. The write instruction carries the test items and the related test configuration parameters needed for the imaging test, where the test items may include, without limitation, any one or any combination of the color cast test, the focus test, the dead-pixel test, and the stain test, and the test configuration parameters may be, without limitation, the preset thresholds, preset ranges, and the like needed for the imaging test.
Then, after the inspection software in the UE receives the write instruction, it writes the test items and test configuration parameters carried in the write instruction into the kernel nodes of the UE. After the kernel nodes save the test items and test configuration parameters carried in the write instruction, a write-complete response is sent to the host computer through the inspection software.
Next, after receiving the write-complete response, the host computer determines that the test items and test configuration parameters have been successfully written into the kernel nodes of the UE, and sends a test instruction to the inspection software in the UE, instructing the UE to open its Camera through the HAL and, based on the test configuration parameters, complete the tests of all test items on its own Camera.
Finally, after the inspection software in the UE receives the test instruction, it forwards the test instruction to the HAL. After the HAL receives the test instruction sent by the inspection software and determines that the Camera is in test mode, it reads the test items and test configuration parameters from the kernel nodes; after a successful read, it opens the Camera and collects the original YUV data corresponding to each pixel in the Camera's current preview frame.
It is worth noting that the Camera is opened based on the test instruction only when it is determined to be in test mode; the purpose is not to interrupt the user's normal use of the Camera while the imaging test is performed, which improves the user experience.
Specifically, whether the Camera is in test mode may be judged, without limitation, as follows: when the flag bit corresponding to the Camera is determined to be set to 1, the Camera is determined to be in test mode; when the flag bit corresponding to the Camera is determined to be set to 2, the Camera is determined to be in use state.
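The flag-bit judgment above can be sketched as a simple lookup. The flag values (1 for test mode, 2 for use state) follow the text; the function and constant names are illustrative assumptions, not the HAL's actual API:

```python
# Flag values follow the text: 1 = test mode, 2 = use state.
TEST_MODE = 1
USE_STATE = 2

def camera_state(flag_bit: int) -> str:
    """Map the Camera's flag bit to a state name (hypothetical helper)."""
    if flag_bit == TEST_MODE:
        return "test"
    if flag_bit == USE_STATE:
        return "in_use"
    return "unknown"
```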
For example: host computer 1 sends write instruction 1 to UE1.
After the inspection software in UE1 receives write instruction 1, it writes the test items and test configuration parameters carried in write instruction 1 into the kernel nodes of UE1.
After the kernel nodes in UE1 save the test items and test configuration parameters carried in write instruction 1, write-complete response 1 is sent to host computer 1 through the inspection software.
After host computer 1 receives write-complete response 1, it determines that the test items and test configuration parameters have been successfully written into the kernel nodes of UE1, and sends test instruction 1 to the inspection software in UE1.
When the inspection software in UE1 receives test instruction 1, it forwards test instruction 1 to the HAL in UE1.
When the HAL determines that the flag bit corresponding to the Camera is set to 1, i.e., that the Camera is in test mode, it reads the test items and test configuration parameters from the kernel nodes; after a successful read, it opens the Camera and collects the original YUV data corresponding to each pixel in the Camera's current preview frame.
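The write-then-read exchange above amounts to the host persisting configuration in a kernel node that the HAL later reads back. A minimal sketch, in which an ordinary temporary file stands in for the kernel node; the node path, the JSON record format, and the function names are all assumptions for illustration:

```python
import json
import os
import tempfile

# An ordinary file stands in for the UE's kernel node; the real node
# path and serialization format are assumptions, not from the patent.
node_path = os.path.join(tempfile.mkdtemp(), "camera_test_node")

def write_instruction(test_items, params):
    """Inspection software saves test items + config parameters."""
    with open(node_path, "w") as f:
        json.dump({"items": test_items, "params": params}, f)
    return "write-complete"            # the response returned to the host

def hal_read():
    """HAL reads the saved configuration back before testing."""
    with open(node_path) as f:
        return json.load(f)

ack = write_instruction(["color_cast", "focus"], {"first_threshold": 100})
cfg = hal_read()
```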
Step 201: the UE screens out all target areas based on the original YUV data corresponding to each pixel.
Specifically, step 201 may, without limitation, be performed as follows:
First, the UE, through the HAL, extracts the Y value from the original YUV data corresponding to each pixel. After the Y value of every pixel is extracted, a grayscale image corresponding to the current preview frame, as shown in Fig. 3, can be obtained.
Then, the UE, through the HAL, exploits the feature that the Y values of the color block areas and of the dead-pixel areas (or stain areas) are smaller than those of the other areas, and uses a region-growing algorithm to traverse the Y value of each pixel in every row of the grayscale image in turn, screening out all target areas whose Y values are less than the preset first threshold (the target areas include the color block areas, dead-pixel areas, and stain areas).
Finally, the current preview frame may also be copied, and on the copied preview frame the Y value of every pixel contained in each screened-out target area is set to a brightness of 255, so that each screened-out target area is marked by highlighting; after the highlighting, a highlighted image as shown in Fig. 4 can be obtained. In this way, when an abnormality occurs in the Camera test, the highlighted target areas can be obtained directly from the copied preview frame to check whether the screened-out target areas are accurate.
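The screening in step 201 can be sketched as thresholding the Y plane and grouping connected dark pixels into target areas. The patent names a region-growing algorithm without specifying a variant; the 4-connected flood fill below is one common choice, and the default threshold is the illustrative 100 from the example:

```python
def find_target_areas(y_plane, first_threshold=100):
    """Group 4-connected pixels with Y < first_threshold into target areas.

    y_plane: list of rows of Y values. Returns one set of (row, col)
    coordinates per target area (color block / dead pixel / stain).
    """
    h, w = len(y_plane), len(y_plane[0])
    seen, areas = set(), []
    for r in range(h):
        for c in range(w):
            if (r, c) in seen or y_plane[r][c] >= first_threshold:
                continue
            stack, area = [(r, c)], set()        # grow a new region
            seen.add((r, c))
            while stack:
                cr, cc = stack.pop()
                area.add((cr, cc))
                for nr, nc in ((cr + 1, cc), (cr - 1, cc),
                               (cr, cc + 1), (cr, cc - 1)):
                    if (0 <= nr < h and 0 <= nc < w
                            and (nr, nc) not in seen
                            and y_plane[nr][nc] < first_threshold):
                        seen.add((nr, nc))
                        stack.append((nr, nc))
            areas.append(area)
    return areas
```

Highlighting a found area then amounts to setting the Y value of each of its pixels to 255 in a copy of the frame, as the text describes.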
For example, continuing the example above, suppose that in the test configuration parameters written by the host computer, the preset first threshold is 100.
UE1, through the HAL, extracts the Y value of each pixel from the collected original YUV data and obtains the grayscale image corresponding to the current preview frame.
UE1, through the HAL, uses the region-growing algorithm to traverse the Y value of each pixel in every row of the grayscale image in turn, screens out all target areas whose Y values are less than 100 (i.e., the first threshold), including the color block areas, dead-pixel areas, and stain areas, and highlights all pixels contained in each screened-out target area on the copied preview frame.
Step 202: the UE completes the imaging test of the Camera based on all screened-out target areas.
Specifically, step 202 may, without limitation, be performed as follows:
First, the UE, through the HAL, counts the number of pixels contained in each target area and exploits the feature that the number of pixels in each color block area is generally in the range of 600-1000. Starting from the first determined target area, it judges in turn whether the number of pixels in each target area falls within the preset first range; if so, the target area is further determined to be a color block area; otherwise, the target area is determined to be a dead-pixel area or a stain area.
Then, the UE, through the HAL, uses the distribution characteristics of the color block areas to determine that the first screened-out color block area is the green color block, the second is the blue color block, the third is the black-and-white color block, and the fourth is the red color block.
Finally, the UE, through the HAL, completes the focus test of its Camera according to the screened-out black-and-white color block, completes the color cast test of its Camera based on the screened-out green, blue, and red color blocks, and completes the blemish test of the Camera based on the dead-pixel areas and stain areas.
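The first part of step 202 reduces to a size filter over the areas found in step 201. A sketch under the text's assumptions: the 600-1000 range is the illustrative first range, and the two output labels follow the classification described above:

```python
def classify_areas(areas, first_range=(600, 1000)):
    """Split target areas into color blocks vs. dead-pixel/stain areas.

    areas: list of pixel-coordinate collections. Per the text, a color
    block area generally holds 600-1000 pixels; anything outside the
    range is treated as a dead-pixel or stain area.
    """
    lo, hi = first_range
    color_blocks, defects = [], []
    for area in areas:
        (color_blocks if lo <= len(area) <= hi else defects).append(area)
    return color_blocks, defects
```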
Preferably, as shown in Fig. 5 (a structural diagram of the black-and-white color block when focus is normal), when the HAL in the UE completes the focus test of the Camera based on the screened-out black-and-white color block, the following steps may, without limitation, be used:
First, the UE, through the HAL, according to the Y value of each pixel in the black-and-white color block, successively calculates the Y-value difference between every two adjacent pixels, starting from each pixel. The following description takes only the Y-value differences between horizontally adjacent pixels as an example.
Then, the UE, through the HAL, exploits the feature that when focus is normal, the Y-value difference between two adjacent pixels is not less than 80 (when focus is abnormal, the Y-value difference between two adjacent pixels is very small), screens out all Y-value differences that are not less than the preset second threshold, counts them, and writes the count of Y-value differences into the kernel nodes. The inspection software can then read the count from the kernel nodes and send it to the host computer, prompting the host computer to judge, based on the count of Y-value differences, whether the Camera's focus is normal.
Finally, the host computer exploits the feature that when focus is normal, the number of such Y-value differences is not less than 100; when it determines that the number of Y-value differences is greater than the preset third threshold, it determines that the focus of the UE's Camera is normal.
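The focus check above amounts to counting large horizontal Y-value differences inside the black-and-white block. A sketch using the example's values (second threshold 80, third threshold 100) as defaults; this is an illustration of the stated rule, not the exact HAL code:

```python
def focus_is_normal(block_rows, second_threshold=80, third_threshold=100):
    """Count horizontal adjacent-pixel Y differences of at least
    second_threshold inside the black-and-white color block; focus is
    judged normal when the count exceeds third_threshold.

    block_rows: list of rows of Y values from the black-and-white block.
    Returns (verdict, count).
    """
    count = 0
    for row in block_rows:
        for a, b in zip(row, row[1:]):   # every horizontally adjacent pair
            if abs(a - b) >= second_threshold:
                count += 1
    return count > third_threshold, count
```

A sharp black/white edge pattern yields many large differences, while a blurred block yields almost none, matching the feature the text relies on.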
Preferably, when the UE, through the HAL, completes the color cast test of its Camera according to the screened-out green, blue, and red color blocks, the following steps may, without limitation, be used:
First, the UE, through the HAL, converts the collected original YUV data of all pixels into corresponding RGB data, and extracts the R value corresponding to the red color block and the B value corresponding to the blue color block.
Then, the UE, through the HAL, chooses one area from the regions other than the color block areas, calculates the ratio of the R value to the G value and the ratio of the B value to the G value in that area, and writes the extracted R value of the red color block, the extracted B value of the blue color block, and the calculated R/G and B/G ratios of the chosen area into the kernel nodes. The inspection software can then read the R value, the B value, the R/G ratio, and the B/G ratio from the kernel nodes and send them to the host computer, prompting the host computer to judge, based on these four values, whether the Camera shows a color cast.
Finally, the host computer exploits the feature that in the regions other than the color block areas, the ratio of the R, G, and B values is generally 1:1:1. When judging from the R/G and B/G ratios whether the imaging of the UE's Camera shows a color cast, there may be, without limitation, the following two cases (under normal circumstances, the imaging of a Camera does not show a green cast, so whether the imaging is greenish need not be tested):
Case 1: when the R/G ratio is determined to be greater than 1, the feature that without a color cast the R value of the red color block is generally around 120 is further exploited; when the R value corresponding to the red color block is determined to be greater than the preset fourth threshold, the imaging of the UE's Camera is judged to be reddish. The further judgment based on the R value of the red color block is made to guarantee the accuracy of the result.
Case 2: when the B/G ratio is determined to be greater than 1, the feature that without a color cast the B value of the blue color block is generally around 120 is further exploited; when the B value corresponding to the blue color block is determined to be greater than the preset fifth threshold, the imaging of the UE's Camera is judged to be bluish. The further judgment based on the B value of the blue color block is made to guarantee the accuracy of the result.
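The color cast check can be sketched end to end: a YUV-to-RGB conversion followed by the two-case judgment. The full-range BT.601 formula below is one common conversion and an assumption here, since the patent does not specify which variant the HAL uses; the default thresholds are the example's 120:

```python
def yuv_to_rgb(y, u, v):
    """Full-range BT.601 YUV -> RGB; a common conversion, assumed here."""
    r = y + 1.402 * (v - 128)
    g = y - 0.344136 * (u - 128) - 0.714136 * (v - 128)
    b = y + 1.772 * (u - 128)
    clamp = lambda x: max(0, min(255, round(x)))
    return clamp(r), clamp(g), clamp(b)

def judge_color_cast(red_block_r, blue_block_b, ref_r, ref_g, ref_b,
                     fourth_threshold=120, fifth_threshold=120):
    """Apply the two cases from the text: the R/G and B/G ratios come
    from a reference area outside the color blocks, and the color-block
    values confirm the cast."""
    if ref_r / ref_g > 1 and red_block_r > fourth_threshold:
        return "reddish"
    if ref_b / ref_g > 1 and blue_block_b > fifth_threshold:
        return "bluish"
    return "no cast"
```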
For example, continuing the example above, suppose the test configuration parameters written by the host computer into the kernel nodes are: the preset first range is 600-1000; the preset second threshold is 80. The thresholds pre-saved in the host computer are: the preset third threshold is 100; the preset fourth threshold is 120; the preset fifth threshold is 120.
UE1, through the HAL, counts the number of pixels in each target area in turn and, starting from the first determined target area, judges in turn whether the number of pixels in each target area is between 600 and 1000 (i.e., within the preset first range); if so, the target area is determined to be a color block area; otherwise, it is determined to be a dead-pixel area or a stain area, and the location information, count, and so on of the dead-pixel or stain area are written into the kernel nodes.
UE1, through the HAL, determines from the distribution characteristics of the color block areas that the first screened-out color block area is the green color block, the second is the blue color block, the third is the black-and-white color block, and the fourth is the red color block.
UE1, through the HAL, successively calculates the Y-value difference between every two horizontally adjacent pixels, starting from each pixel.
UE1, through the HAL, screens out all Y-value differences that are not less than 80 (i.e., the preset second threshold), counts all the screened-out Y-value differences, and writes the count into the kernel nodes.
UE1, through the HAL, converts the collected original YUV data of all pixels into corresponding RGB data, and extracts the R value corresponding to the red color block and the B value corresponding to the blue color block.
UE1, through the HAL, chooses one area from the regions other than the color block areas, calculates the R/G and B/G ratios in that area, and writes the extracted R and B values together with the calculated R/G and B/G ratios into the kernel nodes.
When the inspection software in UE1 determines that the Camera test is complete, it reads the test results from the kernel nodes and reports them to the host computer.
After the host computer receives the test results reported by the inspection software in UE1, it checks the count of Y-value differences carried in the test results; when it determines that the count is greater than 100 (i.e., the preset third threshold), it determines that the focus of UE1's Camera is normal; otherwise, it determines that the focus of UE1's Camera is abnormal.
The host computer then checks the R and B values and the R/G and B/G ratios carried in the test results: if it determines that the R/G ratio is greater than 1 and the R value corresponding to the red color block is greater than 120 (i.e., the preset fourth threshold), it judges that the imaging of UE1's Camera is reddish; if it determines that the B/G ratio is greater than 1 and the B value corresponding to the blue color block is greater than 120 (i.e., the preset fifth threshold), it judges that the imaging of UE1's Camera is bluish.
Further, after completing the above judgments for the Camera of the UE, the host computer may additionally judge, according to a preset Camera acceptance criterion, whether the Camera of the UE requires corresponding handling. Specifically, the preset Camera acceptance criterion can be set according to the different requirements of each manufacturer.
For example, taking the result of the focusing test: when the host computer determines, from the number of Y value differences carried in the focusing test result, that the number of all Y value differences lies between 80 and 120 (flexibly configurable by different manufacturers), it considers that the Camera of UE1 meets the preset Camera acceptance criterion.
Taking the result of the dead-pixel test (or stain test): when the host computer determines, from the number of pixels in the dead-pixel regions (or stain regions) carried in the test result, that the number of pixels in any single dead-pixel region (or stain region) is less than 3 (flexibly configurable by different manufacturers), it considers that the Camera of UE1 meets the preset Camera acceptance criterion.
The above embodiment is described in further detail below using a specific application scenario. As shown in Fig. 6, in the embodiment of the present invention, the detailed process of the Camera imaging test method is as follows:
Step 600: Host computer 1 sends write instruction 1 to the core node in UE1 through the inspection software in UE1.
Step 601: After receiving write instruction 1, the core node in UE1 saves the test items and test configuration parameters carried in write instruction 1, and sends write-completion response 1 to host computer 1 through the inspection software.
Step 602: After receiving write-completion response 1, host computer 1 determines that the test items and test configuration parameters have been successfully written to the core node in UE1, and sends test instruction 1 to the inspection software in UE1.
Step 603: When the inspection software in UE1 receives test instruction 1, it forwards test instruction 1 to the HAL. When the HAL determines that the flag bit corresponding to the Camera is set to 1, it determines that the Camera is in test mode, reads the test items and test configuration parameters from the core node, and, upon determining that the read is successful, opens the Camera and collects the original YUV data of each pixel in the Camera's current preview frame.
Step 604: Based on the HAL, UE1 extracts the Y value of each pixel from the collected original YUV data of each pixel, obtaining a grey-card image corresponding to the current preview frame.
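The luma extraction of step 604 can be sketched as below; this minimal illustration assumes a planar YUV layout (e.g. I420/NV21, where the first width x height bytes are the Y plane), which the patent does not specify, and `extract_y_plane` is an invented name:

```python
# Sketch of step 604: pull the Y (luma) plane out of a raw YUV frame.
# Assumes the Y plane occupies the first width*height bytes; real HAL
# buffers may carry padding/stride that this toy version ignores.

def extract_y_plane(yuv_bytes, width, height):
    """Return the Y values as a list of rows (the 'grey card' image)."""
    y_plane = yuv_bytes[:width * height]
    return [list(y_plane[r * width:(r + 1) * width]) for r in range(height)]
```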
Step 605: Based on the HAL, UE1 uses a region-growing algorithm to traverse the Y value of each pixel in every row of the grey-card image, filters out all target regions whose Y values are less than 100 (including color block regions, dead-pixel regions and stain regions), and, in a copy of the current preview frame, highlights all pixels contained in each filtered-out target region.
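The region growing of step 605 can be sketched as a flood fill over the grey-card image; 4-connectivity and the iterative stack are assumptions here, since the patent names only "a region-growing algorithm" and the threshold 100:

```python
# Sketch of step 605: group connected pixels whose Y value falls below
# the first threshold (100) into target regions. 4-connectivity assumed.

def grow_dark_regions(grey, threshold=100):
    """Return a list of regions, each a list of (row, col) pixel coords."""
    h, w = len(grey), len(grey[0])
    seen = [[False] * w for _ in range(h)]
    regions = []
    for r in range(h):
        for c in range(w):
            if seen[r][c] or grey[r][c] >= threshold:
                continue
            # grow a new region from this dark seed pixel
            stack, region = [(r, c)], []
            seen[r][c] = True
            while stack:
                y, x = stack.pop()
                region.append((y, x))
                for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
                    if (0 <= ny < h and 0 <= nx < w
                            and not seen[ny][nx] and grey[ny][nx] < threshold):
                        seen[ny][nx] = True
                        stack.append((ny, nx))
            regions.append(region)
    return regions
```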
Step 606: Based on the HAL, UE1 counts the number of pixels in each target region and, starting from the first determined target region, successively judges whether the number of pixels in each target region lies between 600 and 1000; if so, step 607 is executed; otherwise, step 608 is executed.
Step 607: Based on the HAL, UE1 determines that the target region is a color block region.
Step 608: Based on the HAL, UE1 determines that the target region is a dead-pixel region or a stain region, and writes the location information, pixel count and other information of the dead-pixel region or stain region to the core node.
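Steps 606 to 608 amount to splitting the grown regions by pixel count; a sketch with the 600-1000 bounds taken from step 606 (`classify_regions` is an invented helper name):

```python
# Sketch of steps 606-608: regions whose pixel count lies in the first
# range (600-1000 here) are colour blocks; all others are treated as
# dead-pixel or stain regions.

def classify_regions(regions, low=600, high=1000):
    """Split regions (lists of pixels) into (color_blocks, defects)."""
    blocks, defects = [], []
    for region in regions:
        (blocks if low <= len(region) <= high else defects).append(region)
    return blocks, defects
```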
Step 609: Based on the HAL, UE1 judges from the distribution characteristics of the color block region whether it is a black-and-white block; if so, step 610 is executed; otherwise, step 612 is executed.
Step 610: Based on the HAL, UE1 takes each pixel as a starting point and successively calculates the Y value difference between every two horizontally adjacent pixels.
Step 611: Based on the HAL, UE1 selects all Y value differences less than 80, counts the selected Y value differences, and writes the count to the core node.
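Steps 610 and 611 can be sketched as follows; the code follows the text literally by selecting the horizontal Y value differences below the second threshold (80 here) and counting them:

```python
# Sketch of steps 610-611: count luma differences between horizontally
# adjacent pixels in the black-and-white block that fall below the
# second threshold (80, per the passage above).

def count_small_diffs(rows, second_threshold=80):
    """Count |Y(i) - Y(i+1)| values below the threshold, row by row."""
    return sum(1 for row in rows
               for a, b in zip(row, row[1:])
               if abs(a - b) < second_threshold)
```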
Step 612: Based on the HAL, UE1 determines whether the color block region is a green block, a blue block or a red block; after the green, blue and red blocks are determined, it converts the collected original YUV data of all pixels into the corresponding RGB data and extracts the R value of the red block and the B value of the blue block respectively.
Step 613: Based on the HAL, UE1 selects a region from the areas other than the color block regions, calculates the ratio of the R value to the G value and the ratio of the B value to the G value in that region, and writes the extracted R value and B value, together with the R/G and B/G ratios, to the core node.
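Steps 612 and 613 involve a YUV-to-RGB conversion and two channel ratios; a sketch assuming full-range BT.601 conversion coefficients, which the patent does not specify:

```python
# Sketch of steps 612-613: convert YUV samples to RGB, then compute the
# mean R/G and B/G ratios over a reference region. BT.601 full-range
# coefficients (U, V centred at 128) are an assumption.

def yuv_to_rgb(y, u, v):
    """Convert one full-range BT.601 YUV sample to clamped 8-bit RGB."""
    def clamp(x):
        return max(0, min(255, int(round(x))))
    r = y + 1.402 * (v - 128)
    g = y - 0.344136 * (u - 128) - 0.714136 * (v - 128)
    b = y + 1.772 * (u - 128)
    return clamp(r), clamp(g), clamp(b)

def channel_ratios(pixels):
    """Mean R/G and B/G ratios over a list of (r, g, b) pixels."""
    r_sum = sum(p[0] for p in pixels)
    g_sum = sum(p[1] for p in pixels)
    b_sum = sum(p[2] for p in pixels)
    return r_sum / g_sum, b_sum / g_sum
```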
Step 614: When the inspection software in UE1 determines that the imaging test is completed, it reads the test results from the core node, i.e. the location information and pixel counts of the dead-pixel regions or stain regions, the number of Y value differences, the R value, the B value, the R/G ratio and the B/G ratio, and reports the test results to the host computer.
Step 615: After receiving the reported test result, the host computer judges, according to the R value, the R/G ratio, the B value and the B/G ratio carried in the test result, whether the imaging of UE1's Camera has a color cast.
Specifically, when judging whether the Camera imaging has a color cast, the following two situations may exist:
First situation: if the R/G ratio is greater than 1 and the R value of the red block is greater than 140, the imaging of the Camera is determined to be reddish.
Second situation: if the B/G ratio is greater than 1 and the B value of the blue block is greater than 100, the imaging of the Camera is determined to be bluish.
In addition, according to the number of Y value differences carried in the test result, when the host computer determines that this number is greater than 100, it determines that the Camera of UE1 focuses normally; otherwise, it determines that the Camera of UE1 focuses abnormally.
Step 616: The host computer further judges, according to the test result reported by the UE, whether the Camera of UE1 requires corresponding handling.
Specifically, when judging whether the Camera of UE1 requires corresponding handling, the following (non-limiting) approaches may be used:
Taking the focusing test result as an example: when the host computer determines, from the number of Y value differences carried in the focusing test result, that the number of all Y value differences lies between 80 and 120, it considers that the Camera of UE1 meets the preset Camera acceptance criterion.
Taking the dead-pixel test (or stain test) result as an example: when the host computer determines, from the number of pixels in the dead-pixel regions (or stain regions) carried in the test result, that the number of pixels in any single dead-pixel region (or stain region) is less than 3, it considers that the Camera of the UE meets the preset Camera acceptance criterion.
Based on the above embodiments, as shown in Fig. 7, in the embodiment of the present invention, the Camera imaging test device at least includes:
an acquisition unit 700, configured to collect the original YUV data of each pixel in the Camera's current preview frame when it is determined that the Camera is in test mode;
a screening unit 701, configured to filter out all target regions based on the original YUV data of each pixel;
a test unit 702, configured to filter out all color block regions from the target regions based on the number of pixels in each target region, and to complete the imaging test of the Camera based on each color block region.
Preferably, when filtering out all target regions based on the original YUV data of each pixel, the screening unit 701 is configured to:
extract the Y value from the original YUV data of each pixel respectively; and
filter out, based on the Y value of each pixel, all target regions whose Y values are less than a preset first threshold.
Preferably, when filtering out all color block regions from the target regions based on the number of pixels in each target region, the test unit 702 is configured to:
traverse each target region, and determine that any target region is a color block region when the number of pixels in that target region lies within a preset first range.
Preferably, the test unit 702 is further configured to:
determine that any target region is a dead-pixel region or a stain region when the number of pixels in that target region does not lie within the preset first range.
Preferably, when completing the imaging test of the Camera based on each color block region, the test unit 702 is configured to:
filter out a black-and-white block from the color block regions, and complete a focusing test of the Camera based on the black-and-white block; and
filter out a green block, a blue block and a red block from the color block regions, and complete a color cast test of the Camera based on the green, blue and red blocks.
Preferably, when completing the focusing test of the Camera based on the black-and-white block, the test unit 702 is configured to:
take each pixel in the black-and-white block as a starting point, based on the Y value of each of those pixels, and successively calculate the Y value difference between every two adjacent pixels; and
select all Y value differences less than a preset second threshold, and determine that the Camera focuses normally when the number of all Y value differences is greater than a preset third threshold.
Preferably, when completing the color cast test of the Camera based on the green, blue and red blocks, the test unit 702 is configured to:
convert the collected original YUV data of all pixels into the corresponding RGB data respectively, and extract the R value of the red block and the B value of the blue block respectively;
select a region from the areas other than the color block regions, and calculate the ratio of the R value to the G value and the ratio of the B value to the G value in that region;
determine that the imaging of the Camera is reddish when the R/G ratio is greater than 1 and the R value of the red block is greater than a preset third threshold; or
determine that the imaging of the Camera is bluish when the B/G ratio is greater than 1 and the B value of the blue block is greater than a preset fourth threshold.
In conclusion when UE determines that itself Camera is in test mode, beginning to acquire above-mentioned in the embodiment of the present invention
The corresponding original yuv data of each of Camera current preview frame pixel, and it is based on the corresponding original of each pixel
Beginning yuv data filters out all target areas, and the number based on the pixel in each target area, from all
Target area in, filter out all color block areas, finally, being based on each color block areas, complete to being itself Camera
Imaging test.In this way, when as long as UE determines that itself Camera is in test mode, will automatically to itself Camera carry out at
As test, testing efficiency and accuracy are also improved while saving a large amount of human resources without artificial detection.It removes
Except this, the imaging test of Camera can be realized in the side UE, and only need to each of current preview frame pixel
Corresponding original yuv data is analyzed, and carries out imaging test without the photo upload by obtaining after taking pictures to host computer,
The processing time is saved, testing efficiency is further increased.
Those skilled in the art will appreciate that embodiments of the present invention may be provided as a method, a system, or a computer program product. Accordingly, the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware aspects. Furthermore, the present invention may take the form of a computer program product implemented on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, etc.) containing computer-usable program code.
The present invention is described with reference to flowcharts and/or block diagrams of methods, devices (systems) and computer program products according to embodiments of the present invention. It should be understood that each flow and/or block in the flowcharts and/or block diagrams, and combinations of flows and/or blocks therein, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general-purpose computer, a special-purpose computer, an embedded processor or another programmable data processing device to produce a machine, such that the instructions executed by the processor of the computer or other programmable data processing device produce means for implementing the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
These computer program instructions may also be stored in a computer-readable memory capable of directing a computer or another programmable data processing device to work in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
These computer program instructions may also be loaded onto a computer or another programmable data processing device, such that a series of operation steps are performed on the computer or other programmable device to produce computer-implemented processing, whereby the instructions executed on the computer or other programmable device provide steps for implementing the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
Although preferred embodiments of the present invention have been described, additional changes and modifications may be made to these embodiments once a person skilled in the art learns of the basic inventive concept. Therefore, the appended claims are intended to be interpreted as including the preferred embodiments and all changes and modifications that fall within the scope of the present invention.
Obviously, those skilled in the art can make various modifications and variations to the embodiments of the present invention without departing from the spirit and scope of the embodiments of the present invention. Thus, if these modifications and variations fall within the scope of the claims of the present invention and their technical equivalents, the present invention is also intended to include them.
Claims (11)
1. A test method for camera (Camera) imaging, characterized by comprising:
after a user equipment (UE) receives a test instruction sent by a host computer, when it is determined that the Camera is in test mode, collecting the original YUV data of each pixel in the Camera's current preview frame;
the UE filtering out all target regions based on the original YUV data of each pixel;
the UE filtering out all color block regions, dead-pixel regions and stain regions from the target regions based on the number of pixels in each target region;
the UE completing the imaging test of the Camera based on the filtered-out color block regions, dead-pixel regions and stain regions; wherein,
the UE completing the imaging test of the Camera based on the filtered-out color block regions, dead-pixel regions and stain regions comprises:
the UE filtering out a black-and-white block from the color block regions, and completing a focusing test of the Camera based on the black-and-white block;
the UE filtering out a green block, a blue block and a red block from the color block regions, and completing a color cast test of the Camera based on the green block, the blue block and the red block;
the UE completing a stray-point test of the Camera based on the dead-pixel regions and the stain regions.
2. The method according to claim 1, characterized in that filtering out all target regions based on the original YUV data of each pixel comprises:
the UE extracting the Y value from the original YUV data of each pixel respectively;
the UE filtering out, based on the Y value of each pixel, all target regions whose Y values are less than a preset first threshold.
3. The method according to claim 1, characterized in that the UE filtering out all color block regions, dead-pixel regions and stain regions from the target regions based on the number of pixels in each target region comprises:
the UE traversing each target region, and successively judging whether the number of pixels in each target region lies within a preset first range;
the UE determining that a target region whose pixel count lies within the preset first range is a color block region, and determining that a target region whose pixel count does not lie within the preset first range is a dead-pixel region or a stain region.
4. The method according to claim 1, characterized in that the UE completing the focusing test of the Camera based on the black-and-white block comprises:
the UE taking each pixel in the black-and-white block as a starting point, based on the Y value of each of those pixels, and successively calculating the Y value difference between every two adjacent pixels;
the UE selecting all Y value differences less than a preset second threshold, sending the count of all Y value differences to the host computer, and prompting the host computer to: determine, based on the count of Y value differences, that the Camera focuses normally when the count of Y value differences is greater than a preset third threshold.
5. The method according to claim 1, characterized in that the UE completing the color cast test of the Camera based on the green block, the blue block and the red block comprises:
the UE converting the collected original YUV data of all pixels into the corresponding RGB data respectively, and extracting the R value of the red block and the B value of the blue block respectively;
the UE selecting a region from the areas other than the color block regions, and calculating the ratio of the R value to the G value and the ratio of the B value to the G value in that region;
the UE sending the R value of the red block, the B value of the blue block, the R/G ratio and the B/G ratio to the host computer, and prompting the host computer to complete any one of the following operations:
determining that the imaging of the Camera is reddish when the R/G ratio is greater than 1 and the R value of the red block is greater than a preset fourth threshold; or
determining that the imaging of the Camera is bluish when the B/G ratio is greater than 1 and the B value of the blue block is greater than a preset fifth threshold.
6. A test device for camera (Camera) imaging, characterized by comprising:
an acquisition unit, configured to collect the original YUV data of each pixel in the Camera's current preview frame when it is determined that the Camera is in test mode;
a screening unit, configured to filter out all target regions based on the original YUV data of each pixel;
a test unit, configured to filter out all color block regions, dead-pixel regions and stain regions from the target regions based on the number of pixels in each target region, and to complete the imaging test of the Camera based on the filtered-out color block regions, dead-pixel regions and stain regions; wherein,
when completing the imaging test of the Camera based on the filtered-out color block regions, dead-pixel regions and stain regions, the test unit is specifically configured to:
filter out a black-and-white block from the color block regions, and complete a focusing test of the Camera based on the black-and-white block;
filter out a green block, a blue block and a red block from the color block regions, and complete a color cast test of the Camera based on the green block, the blue block and the red block;
complete a stray-point test of the Camera based on the dead-pixel regions and the stain regions.
7. The test device according to claim 6, characterized in that, when filtering out all target regions based on the original YUV data of each pixel, the screening unit is configured to:
extract the Y value from the original YUV data of each pixel respectively;
filter out, based on the Y value of each pixel, all target regions whose Y values are less than a preset first threshold.
8. The test device according to claim 6, characterized in that, when filtering out all color block regions from the target regions based on the number of pixels in each target region, the test unit is configured to:
traverse each target region, and determine that any target region is a color block region when the number of pixels in that target region lies within a preset first range.
9. The test device according to claim 8, characterized in that the test unit is further configured to:
determine that any target region is a dead-pixel region or a stain region when the number of pixels in that target region does not lie within the preset first range.
10. The test device according to claim 6, characterized in that, when completing the focusing test of the Camera based on the black-and-white block, the test unit is configured to:
take each pixel in the black-and-white block as a starting point, based on the Y value of each of those pixels, and successively calculate the Y value difference between every two adjacent pixels;
select all Y value differences less than a preset second threshold, and determine that the Camera focuses normally when the number of all Y value differences is greater than a preset third threshold.
11. The test device according to claim 6, characterized in that, when completing the color cast test of the Camera based on the green block, the blue block and the red block, the test unit is configured to:
convert the collected original YUV data of all pixels into the corresponding RGB data respectively, and extract the R value of the red block and the B value of the blue block respectively;
select a region from the areas other than the color block regions, and calculate the ratio of the R value to the G value and the ratio of the B value to the G value in that region;
determine that the imaging of the Camera is reddish when the R/G ratio is greater than 1 and the R value of the red block is greater than a preset fourth threshold; or
determine that the imaging of the Camera is bluish when the B/G ratio is greater than 1 and the B value of the blue block is greater than a preset fifth threshold.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201610302424.2A CN105931249B (en) | 2016-05-06 | 2016-05-06 | A kind of test method and test device of Camera imaging |
Publications (2)
Publication Number | Publication Date |
---|---|
CN105931249A CN105931249A (en) | 2016-09-07 |
CN105931249B true CN105931249B (en) | 2019-04-26 |
Family
ID=56835530
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201610302424.2A Active CN105931249B (en) | 2016-05-06 | 2016-05-06 | A kind of test method and test device of Camera imaging |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN105931249B (en) |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108234999B (en) * | 2018-01-22 | 2019-08-20 | Oppo广东移动通信有限公司 | The test macro and test method that camera module for electronic device is tested |
CN109655010B (en) * | 2018-10-31 | 2020-07-07 | 上海畅联智融通讯科技有限公司 | Camera dynamic gridding shooting object measurement method |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101193323A (en) * | 2006-11-22 | 2008-06-04 | 乐金电子(昆山)电脑有限公司 | Bad pixel detection method for digital video device |
CN101282418A (en) * | 2007-04-05 | 2008-10-08 | 佳能株式会社 | Image processing apparatus and control method therefor |
CN103475828A (en) * | 2013-10-10 | 2013-12-25 | 旗瀚科技有限公司 | Method for rectifying missing pixels and image sensor |
CN104867159A (en) * | 2015-06-05 | 2015-08-26 | 北京大恒图像视觉有限公司 | Stain detection and classification method and device for sensor of digital camera |
Family Cites Families (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8103121B2 (en) * | 2007-08-31 | 2012-01-24 | Adobe Systems Incorporated | Systems and methods for determination of a camera imperfection for an image |
US8797429B2 (en) * | 2012-03-05 | 2014-08-05 | Apple Inc. | Camera blemish defects detection |
- 2016-05-06 CN CN201610302424.2A patent/CN105931249B/en active Active
Legal Events

Code | Title
---|---
C06 | Publication
PB01 | Publication
C10 | Entry into substantive examination
SE01 | Entry into force of request for substantive examination
GR01 | Patent grant