CN114500995A - Personalized configuration system of USB camera at manufacturing end - Google Patents

Personalized configuration system of USB camera at manufacturing end

Info

Publication number
CN114500995A
CN114500995A (application CN202210396270.3A)
Authority
CN
China
Prior art keywords
configuration
image sensor
usb
video
center point
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202210396270.3A
Other languages
Chinese (zh)
Other versions
CN114500995B (en)
Inventor
王伟光
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beiyuan Technology Shenzhen Co ltd
Original Assignee
Beiyuan Technology Shenzhen Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beiyuan Technology Shenzhen Co ltd filed Critical Beiyuan Technology Shenzhen Co ltd
Priority to CN202210396270.3A priority Critical patent/CN114500995B/en
Publication of CN114500995A publication Critical patent/CN114500995A/en
Application granted granted Critical
Publication of CN114500995B publication Critical patent/CN114500995B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N17/00 - Diagnosis, testing or measuring for television systems or their details
    • H04N17/002 - Diagnosis, testing or measuring for television systems or their details for television cameras
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04L - TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L41/00 - Arrangements for maintenance, administration or management of data switching networks, e.g. of packet switching networks
    • H04L41/08 - Configuration management of networks or network elements
    • H04L41/0803 - Configuration setting
    • H04L41/0813 - Configuration setting characterised by the conditions triggering a change of settings
    • H04L41/082 - Configuration setting characterised by the conditions triggering a change of settings, the condition being updates or upgrades of network functionality

Abstract

The invention belongs to the technical field of computer software and particularly relates to a personalized configuration system for a USB camera at the manufacturing end. The system is provided to the USB camera manufacturing end and performs personalized configuration of the USB camera based on user requirements. It comprises: a basic function configuration part, used to configure device information, video effects and firmware download for the USB camera; a professional function configuration part, used to perform optical correction configuration and personalized function configuration of the USB camera; and a video configuration part, used to perform personalized video function configuration of the USB camera. The invention meets the needs of USB camera manufacturing and of the manufacturer's personalized customization.

Description

Personalized configuration system of USB camera at manufacturing end
Technical Field
The invention belongs to the technical field of computer software, and particularly relates to a personalized configuration system for a USB camera at a manufacturing end.
Background
Cameras that output a video stream over a USB interface are very widely used because of the universality of the USB interface and the convenience of plug and play. Once development of the USB camera firmware is completed, the camera enters the manufacturing and production stage. At this stage the manufacturer faces the varied specification requirements of its end users, as well as quality problems caused by physical factors in the manufacturing process of the USB camera. Manufacturers typically feed these requirements back to the USB camera firmware development side so that the firmware program can be modified. Because end-user requirements are highly varied, this places great time pressure on both the firmware development side and the manufacturing side. Multifunctional configuration software built around the USB camera's upgrade service therefore meets these varied requirements: it helps USB camera manufacturers satisfy user needs through software and greatly reduces the pressure on the camera firmware development side.
Disclosure of Invention
In view of this, the main object of the present invention is to provide a personalized configuration system for a USB camera at the manufacturing end, which meets the needs of USB camera manufacturing and of the manufacturer's personalized customization.
In order to achieve the purpose, the technical scheme of the invention is realized as follows:
the personalized configuration system for a USB camera at the manufacturing end is provided to the USB camera manufacturing end and performs personalized configuration of the USB camera based on user requirements; it comprises:
a basic function configuration part, used to configure device information, video effects and firmware download for the USB camera;
a professional function configuration part, used to perform optical correction configuration and personalized function configuration of the USB camera; the optical correction configuration comprises: USB camera lens vignetting correction configuration, dual-camera center point correction configuration and dual-sensor frame synchronization picture center correction configuration; the personalized function configuration comprises: an automatic exposure mode configuration and a voice coil motor drive configuration; and
a video configuration part, used to perform personalized video function configuration of the USB camera; the personalized video function configuration comprises: USB video resolution configuration, USB video format configuration, USB video photographing setting, USB audio output configuration, USB video bandwidth configuration and USB video color configuration.
Further, the system further comprises an upgrade section; the upgrade section comprises a local end and a server end; the local end is configured to send an upgrade request to the server end; the server end responds to the upgrade request of the local end and sends the update content to the local end; and the local end upgrades the system based on the update content received from the server end.
Further, the method of the dual-camera center point correction configuration comprises: integrating two USB cameras side by side on one circuit board; the method of the dual-sensor frame synchronization picture center correction configuration comprises: connecting the image processor on the circuit board of one USB camera to two image sensors placed side by side.
Further, the implementation process of the dual-sensor frame synchronization picture center correction configuration includes: step A1: enumerating the USB camera; step A2: enumerating the target device; step A3: acquiring the video stream; step A4: obtaining the coordinates of the center points of the feature regions in the output pictures of the first image sensor and the second image sensor; step A5: comparing the coordinate difference between the feature-region center point coordinates of the first and second image sensors; if the comparison result exceeds a preset condition, executing step A6, otherwise executing step A7; step A6: transmitting the deviation value to the ISP through the UVC XU protocol, the ISP controlling the data output by the second image sensor so that the whole image is shifted along the X-axis/Y-axis, and returning to step A4; step A7: the correction is successful; step A8: storing the data into the ISP flash memory.
Further, the step A6 specifically includes: taking the first image sensor as the reference, calculating the deviation between the center point coordinates (x2, y2) of the feature region in the output picture of the second image sensor and the center point coordinates (x1, y1) of the feature region in the first image sensor. If the abscissa deviation x2 - x1 is greater than 4, the deviation value is transmitted to the ISP through the UVC XU, and the ISP controls the data output by the second image sensor so that the whole image is shifted to the left by |x2 - x1| pixel points. If the abscissa deviation x2 - x1 is less than -4, the deviation value is transmitted to the ISP through the UVC XU, and the ISP controls the data output by the second image sensor so that the whole image is shifted to the right by |x2 - x1| pixel points. If x2 - x1 is less than or equal to 4 and greater than or equal to -4, and the ordinate deviation y2 - y1 is greater than 4, the deviation value is transmitted to the ISP through the UVC XU, and the ISP controls the data output by the second image sensor so that the whole image is shifted upward by |y2 - y1| pixel points. If x2 - x1 is less than or equal to 4 and greater than or equal to -4, and the ordinate deviation y2 - y1 is less than -4, the deviation value is transmitted to the ISP through the UVC XU, and the ISP controls the data output by the second image sensor so that the whole image is shifted downward by |y2 - y1| pixel points.
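A minimal sketch of this comparison logic in Python, assuming the +/-4 pixel threshold given above; send_offset_to_isp() is a hypothetical placeholder for the vendor-specific UVC Extension Unit (XU) write to the ISP, which the patent does not detail.

```python
def correct_offset(center0, center1, send_offset_to_isp, threshold=4):
    """center0: (x1, y1) of the first (reference) sensor; center1: (x2, y2) of the second."""
    x1, y1 = center0
    x2, y2 = center1
    dx, dy = x2 - x1, y2 - y1

    if dx > threshold:                       # picture of the second sensor sits too far right
        send_offset_to_isp(axis="x", direction="left", pixels=abs(dx))
    elif dx < -threshold:                    # too far left
        send_offset_to_isp(axis="x", direction="right", pixels=abs(dx))
    elif dy > threshold:                     # x within tolerance, correct the vertical offset
        send_offset_to_isp(axis="y", direction="up", pixels=abs(dy))
    elif dy < -threshold:
        send_offset_to_isp(axis="y", direction="down", pixels=abs(dy))
    else:
        return True                          # within the +/-4 pixel window: done (step A7)
    return False                             # offset sent; re-measure the centers (step A4)
```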
Further, the step A4 specifically includes: step A4.1: performing image segmentation on the outputs of the first image sensor and the second image sensor to complete data separation; step A4.2: performing binarization on the result of the image segmentation; step A4.3: performing black/white point statistics on the binarization result; step A4.4: processing abnormal points through two passes of Gaussian filtering; step A4.5: performing black/white point extreme value statistics in the horizontal direction; step A4.6: counting the range of the black points according to the number of counted points and the directional extreme values; step A4.7: merging pixels according to the range; step A4.8: finding the minimum value point of the pixel combination as the suspected center point; step A4.9: counting and combining the horizontal and vertical coordinates as the center point coordinates; step A4.10: transmitting the center point coordinates.
Further, the execution process of the USB camera lens vignetting correction configuration includes: step B1: transmitting the raw data; step B2: separating the raw data into the R, GR, GB and B channels; step B3: performing median filtering; step B4: performing Gaussian filtering; step B5: dividing the raw data into blocks; step B6: merging the raw data in 3 x 3 blocks; step B7: taking the extreme values of the raw data of each channel and their corresponding positions; step B8: comparing the extreme values of the R, GR, GB and B channels with the mean value to obtain a comparison result; step B9: extracting the mean value at the center of each raw data block; step B10: extracting the extreme value at the center of each raw data block; step B11: performing least-squares curve fitting; step B12: taking the fitted value for the edge and the true value for the remaining part to obtain the result.
Further, the USB video color configuration includes: a USB video color configuration and a USB video black and white configuration.
The personalized configuration system of the USB camera at the manufacturing end has the following beneficial effects:
1. Personalized configuration: the invention provides feasible configuration options for the camera manufacturer, so that the firmware can be modified and upgraded according to the requirements of the end user.
2. Improved camera quality: the professional configuration of the invention can be used to detect and correct the quality of the camera components, thereby improving the yield.
3. Rich functionality: compared with prior-art USB cameras that provide only basic functions, the system of the invention offers more functions when working with the USB camera.
Drawings
Fig. 1 is a schematic system structure diagram of a personalized configuration system for a USB camera at a manufacturing end according to an embodiment of the present invention;
fig. 2 is a schematic diagram of an execution flow of a dual-sensor frame synchronization picture center correction configuration of a USB camera personalized configuration system at a manufacturing end according to an embodiment of the present invention;
fig. 3 is a schematic flowchart of step a6 when the personalized configuration system for USB camera at the manufacturing end performs center correction configuration on a frame synchronization picture of a dual sensor according to an embodiment of the present invention;
fig. 4 is a schematic flowchart of a process of obtaining a center point coordinate when the personalized configuration system of the USB camera at the manufacturing end performs center correction configuration on a frame synchronization picture of a dual sensor according to an embodiment of the present invention;
fig. 5 is a schematic diagram of a black characteristic region generated when the USB camera personalized configuration system at the manufacturing end performs dual-sensor frame synchronization picture center correction configuration, binarization processing, and black and white point statistics according to the embodiment of the present invention;
fig. 6 is a schematic flow chart of a USB camera lens vignetting correction configuration performed by the USB camera personalized configuration system at the manufacturing end according to the embodiment of the present invention.
Detailed Description
The method of the present invention will be described in further detail below with reference to the accompanying drawings and embodiments of the invention.
Example 1
As shown in fig. 1, the personalized configuration system for a USB camera at the manufacturing end, which is provided to the USB camera manufacturing end and performs personalized configuration of the USB camera based on user requirements, includes:
a basic function configuration part, used to configure device information, video effects and firmware download for the USB camera;
a professional function configuration part, used to perform optical correction configuration and personalized function configuration of the USB camera; the optical correction configuration comprises: USB camera lens vignetting correction configuration, dual-camera center point correction configuration and dual-sensor frame synchronization picture center correction configuration; the personalized function configuration comprises: an automatic exposure mode configuration and a voice coil motor drive configuration; and
a video configuration part, used to perform personalized video function configuration of the USB camera; the personalized video function configuration comprises: USB video resolution configuration, USB video format configuration, USB video photographing setting, USB audio output configuration, USB video bandwidth configuration and USB video color configuration.
Specifically, the USB camera is a camera solution or product that uses the USB interface for video output; it mainly follows the USB Video Class (UVC) protocol of the USB video standard established by the official USB standards organization.
Example 2
On the basis of the above embodiment, the system further includes an upgrade section; the upgrade section includes a local end and a server end; the local end is configured to send an upgrade request to the server end; the server end responds to the upgrade request of the local end and sends the update content to the local end; and the local end upgrades the system based on the update content received from the server end.
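As an illustration of the local-end side of this upgrade flow, the sketch below assumes an HTTP transport; the endpoint URL, request fields and apply_update() helper are hypothetical, since the patent does not specify a transport or message format.

```python
import requests

UPDATE_SERVER = "https://example.com/api/check-update"   # placeholder URL, not from the patent

def apply_update(payload: bytes) -> None:
    # Placeholder: store the received update content for a separate installer step.
    with open("update.bin", "wb") as f:
        f.write(payload)

def check_and_upgrade(current_version: str) -> bool:
    # Local end sends an upgrade request carrying its current version.
    resp = requests.post(UPDATE_SERVER, json={"version": current_version}, timeout=10)
    resp.raise_for_status()
    info = resp.json()
    if not info.get("update_available"):
        return False
    # Server end answers with the update content (here modelled as a download location).
    payload = requests.get(info["download_url"], timeout=60).content
    apply_update(payload)
    return True
```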
Specifically, the device information configuration includes: configuring the attribute information of the USB camera device and configuring the enumeration presented to the USB host. The enumeration includes the ID, the name and the product serial number of the device.
Specifically, the video effect configuration includes: adjusting the image-effect attributes defined in the UVC protocol followed by the USB camera. Generally an end user can adjust different values according to his own environment, but the values need to be saved in the camera memory so that they take effect the next time the camera is opened and the default values are not restored.
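For reference, standard UVC image-effect controls of this kind can be adjusted from a host PC with OpenCV as sketched below; the values are placeholders, and persisting them into the camera's own memory is a vendor-specific step not shown here.

```python
import cv2

cap = cv2.VideoCapture(0)                        # first enumerated USB camera
cap.set(cv2.CAP_PROP_BRIGHTNESS, 128)            # placeholder values
cap.set(cv2.CAP_PROP_CONTRAST, 32)
cap.set(cv2.CAP_PROP_SATURATION, 64)
print("brightness now:", cap.get(cv2.CAP_PROP_BRIGHTNESS))
cap.release()
```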
The firmware download configuration includes: configuring the program, i.e. the firmware, that runs in the memory of the USB camera. For the camera to work normally, a burning tool is needed to burn the firmware into the camera memory.
Example 3
On the basis of the previous embodiment, the method of the dual-camera center point correction configuration comprises: integrating two USB cameras side by side on one circuit board; the method of the dual-sensor frame synchronization picture center correction configuration comprises: connecting the image processor on the circuit board of one USB camera to two image sensors placed side by side.
Specifically, camera shading/vignetting correction (in English, lens shading correction) corrects the non-uniformity of brightness and color produced by the camera lens. During camera production, an unreasonable match between the CMOS image sensor, the lens quality and the parameters can make a camera's picture bright in the middle and dark around the edges, with a large color deviation. A manufacturer can therefore use the module disclosed by the invention to correct each camera on the production line, ensuring that the brightness and color of every camera are uniform and the quality is consistent.
Specifically, dual-camera center correction: some camera modules integrate two independent USB cameras on one board, and due to physical errors in the production process the centers of the video pictures of the two cameras deviate from each other. A center correction procedure is therefore added in the production process, so that when a user of the camera develops with it, no extra correction by an algorithm is needed, saving development resources and time.
Dual-sensor frame synchronization picture center correction configuration: in some camera modules, one main control chip (ISP) is connected to two CMOS sensors at the same time and the output pictures are frame-synchronized. Likewise, due to physical errors in the production process, the centers of the pictures output by the two sensors deviate from each other, so a center correction procedure needs to be added in the production process. In this way, when a user of the camera develops with it, no extra correction by an algorithm is needed, saving development resources and time.
Example 4
As shown in fig. 2, on the basis of the above embodiment, the implementation process of the dual-sensor frame synchronization picture center correction configuration includes: step A1: enumerating the USB camera; step A2: enumerating the target device; step A3: acquiring the video stream; step A4: obtaining the coordinates of the center points of the feature regions in the output pictures of the first image sensor and the second image sensor; step A5: comparing the coordinate difference between the feature-region center point coordinates of the first and second image sensors; if the comparison result exceeds a preset condition, executing step A6, otherwise executing step A7; step A6: transmitting the deviation value to the ISP through the UVC XU protocol, the ISP controlling the data output by the second image sensor so that the whole image is shifted along the X-axis/Y-axis, and returning to step A4; step A7: the correction is successful; step A8: storing the data into the ISP flash memory.
Specifically, the target device is the device that stores the image after it has been acquired by the USB camera.
Specifically, the meaning of the automatic exposure mode configuration is as follows: the USB camera supports automatic exposure. A typical USB camera uses the average brightness of the whole image as the reference value for ambient brightness, but this cannot satisfy the brightness requirements of all usage environments. The auto exposure mode of the invention provides three functions. The first is the conventional average metering mode. The second is a center metering mode: in a typical camera scene the effective content being shot lies in the central area of the picture, so center metering can solve the problem of improper brightness of the shot content. The third is arbitrary-region metering (ROI): application software on the computer connected to the USB camera tells the camera which region of the picture should be used as the key brightness reference, and as long as suitable coordinates and range are provided, the USB camera uses this region as the key region for exposure. This solves the application problems of more scenes.
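A sketch of the three metering references described above (average, center and ROI metering), assuming an 8-bit grayscale frame held as a NumPy array; the center region size used here is illustrative.

```python
import numpy as np

def metering_reference(frame: np.ndarray, mode: str = "average", roi=None) -> float:
    """frame: 8-bit grayscale image; roi: (x, y, w, h) supplied by the host software."""
    h, w = frame.shape[:2]
    if mode == "average":                        # whole-frame average metering
        return float(frame.mean())
    if mode == "center":                         # center metering: middle half of the frame
        cy, cx = h // 2, w // 2
        return float(frame[cy - h // 4:cy + h // 4, cx - w // 4:cx + w // 4].mean())
    if mode == "roi" and roi is not None:        # arbitrary-region (ROI) metering
        x, y, rw, rh = roi
        return float(frame[y:y + rh, x:x + rw].mean())
    raise ValueError("unknown metering mode")
```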
Specifically, the meaning of the voice coil motor drive configuration is: for a camera with an auto-focus function, the lens assembly is provided with a motor, and the motor is driven by a driver chip, i.e. the VCM driver. There are many models of VCM driver; the invention adapts to the common models, and camera manufacturers can select the matching one themselves.
Specifically, the USB camera resolution configuration is the list of resolutions that the USB camera can output. The USB camera firmware contains a series of resolutions for the user to choose from, and different users have different resolution requirements. The invention supports self-configuration by the manufacturer according to the user's requirements.
USB video format: the video formats that the USB camera can output, namely a YUV or an MJPEG-compressed video stream.
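On the host side, one of the advertised resolutions and an MJPEG (or YUV) stream can be requested through OpenCV as sketched below; whether a given combination is accepted depends on the resolution and format list the manufacturer configured into the firmware.

```python
import cv2

cap = cv2.VideoCapture(0)
cap.set(cv2.CAP_PROP_FOURCC, cv2.VideoWriter_fourcc(*"MJPG"))  # request an MJPEG stream
cap.set(cv2.CAP_PROP_FRAME_WIDTH, 1920)                        # request 1920 x 1080
cap.set(cv2.CAP_PROP_FRAME_HEIGHT, 1080)
ok, frame = cap.read()
print("frame received:", ok, None if not ok else frame.shape)
cap.release()
```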
In figs. 2, 3 and 4, Sensor0 and Sensor1 represent the first image sensor and the second image sensor, respectively. RAW denotes the raw image data.
Referring to fig. 5: the image size is Width x Height, the target region size is w x h, the binarization threshold is T, and (x, y) are coordinate points in the target region, with Width/2 - w/2 <= x <= Width/2 + w/2 and Height/2 - h/2 <= y <= Height/2 + h/2.
The binarization process is as follows:
the color of a pixel is represented by three RGB values, i.e. each pixel corresponds to the three color values r, g and b. The G value G(x, y) of each point is taken and compared with the binarization threshold T (points smaller than the threshold are marked G(x, y) = 0; points larger than the threshold T are marked G(x, y) = 1), so that the RGB image of the target region is binarized into values consisting only of 0 and 1:
G(x, y) = 1 if G(x, y) > T, otherwise G(x, y) = 0.
The black spot statistical process is as follows:
by HCNT(n) represents the number of G (x, y) =1 in the nth row within the target region:
HCNT(n)=
Figure 671996DEST_PATH_IMAGE009
A_CNT denotes the total number of points with G(x, y) = 1 in the target region:
A_CNT = Σ_n H_CNT(n)
Abnormal point processing:
points in rows where the number of G(x, y) = 1 is less than the rated value 3 are regarded as invalid black points and filtered out, i.e. if H_CNT(n) < 3 then G(x, y) = 0 for every point of row n.
G(x, y) is then Gaussian-filtered to obtain the new G_N(x, y).
Gaussian filtering:
Gaussian matrix: K = [1,1,1; 1,1,1; 1,1,1]
V = G(x-1,y-1) + G(x-1,y) + G(x-1,y+1) + G(x,y-1) + G(x,y) + G(x,y+1) + G(x+1,y-1) + G(x+1,y) + G(x+1,y+1)
G_N(x, y) = V, the count over the 3 x 3 neighbourhood.
Acquiring the coordinates of the center point of the black region in the image:
take all points (x, y) with G_N(x, y) > 3, giving (x_1, x_2, ..., x_n) and (y_1, y_2, ..., y_n), and average each:
X_0 = (x_1 + x_2 + ... + x_n) / n
Y_0 = (y_1 + y_2 + ... + y_n) / n
(X_0, Y_0) are the coordinates of the center point of the black feature region.
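The center-point computation above can be condensed into a short NumPy sketch. It assumes an RGB frame held as a NumPy array and follows the formulas given here, with one stated deviation: the binarization marks the dark pixels, since the target is the black feature region; the 3 x 3 all-ones kernel and the "greater than 3" test are taken directly from the description.

```python
import numpy as np

def feature_center(rgb: np.ndarray, w: int, h: int, T: int):
    """Center of the dark feature region inside the w x h target window."""
    H, W = rgb.shape[:2]
    y0, y1 = H // 2 - h // 2, H // 2 + h // 2
    x0, x1 = W // 2 - w // 2, W // 2 + w // 2
    g = rgb[y0:y1, x0:x1, 1].astype(int)        # G channel of the target region

    # Mark the dark pixels of the black feature region (1 = black point);
    # T plays the role of the binarization threshold above.
    G = (g < T).astype(int)

    row_counts = G.sum(axis=1)                  # H_CNT(n): marked points per row
    G[row_counts < 3, :] = 0                    # rows with fewer than 3 points are invalid

    # 3 x 3 all-ones smoothing (the kernel K in the description)
    GN = np.zeros_like(G)
    GN[1:-1, 1:-1] = sum(G[1 + dy:G.shape[0] - 1 + dy, 1 + dx:G.shape[1] - 1 + dx]
                         for dy in (-1, 0, 1) for dx in (-1, 0, 1))

    ys, xs = np.nonzero(GN > 3)                 # keep points whose smoothed count exceeds 3
    if xs.size == 0:
        return None
    return (x0 + xs.mean(), y0 + ys.mean())     # (X0, Y0) in full-image coordinates
```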
Example 5
On the basis of the previous embodiment, the step A6 specifically includes: taking the first image sensor as the reference, calculating the deviation between the center point coordinates (x2, y2) of the feature region in the output picture of the second image sensor and the center point coordinates (x1, y1) of the feature region in the first image sensor. If the abscissa deviation x2 - x1 is greater than 4, the deviation value is transmitted to the ISP through the UVC XU, and the ISP controls the data output by the second image sensor so that the whole image is shifted to the left by |x2 - x1| pixel points. If the abscissa deviation x2 - x1 is less than -4, the deviation value is transmitted to the ISP through the UVC XU, and the ISP controls the data output by the second image sensor so that the whole image is shifted to the right by |x2 - x1| pixel points. If x2 - x1 is less than or equal to 4 and greater than or equal to -4, and the ordinate deviation y2 - y1 is greater than 4, the deviation value is transmitted to the ISP through the UVC XU, and the ISP controls the data output by the second image sensor so that the whole image is shifted upward by |y2 - y1| pixel points. If x2 - x1 is less than or equal to 4 and greater than or equal to -4, and the ordinate deviation y2 - y1 is less than -4, the deviation value is transmitted to the ISP through the UVC XU, and the ISP controls the data output by the second image sensor so that the whole image is shifted downward by |y2 - y1| pixel points.
The content of the USB video photographing setting includes: the USB camera supports a video photographing function. Instead of the user having to return to the camera firmware development side for modification according to which photographing mode a scene requires, the three photographing modes are made configurable by the manufacturer.
The content of the USB audio output configuration includes: microphone audio processing built on the USB Audio Class (UAC) standard, including enabling of the microphone function. The USB camera has only one audio processing chip; the invention supports three types for manufacturers to adapt to their users.
USB video bandwidth: the data transmission rate of the USB interface is regulated, and different video stream sizes can be obtained by compression for video transmission. A typical user may connect the USB camera to a computer, a Linux device or an Android device; when the video data stream of the USB camera is too large, the picture on these host platforms may fail, or resources such as the CPU may be consumed excessively. Controlling the video stream is therefore a necessary setting. A conventional USB camera has a fixed video stream size, whereas the invention lets manufacturers configure different video stream bandwidths for users according to their different requirements.
USB video color/black and white: the video stream output by the USB camera may be configured as a color image or a black-and-white image. Manufacturers can provide a color or black-and-white effect depending on their customers' scenarios and needs. A typical USB camera does not have this function and requires the camera firmware development side to develop two different sets of programs for the manufacturer.
Example 6
On the basis of the previous embodiment, the step A4 specifically includes: step A4.1: performing image segmentation on the outputs of the first image sensor and the second image sensor to complete data separation; step A4.2: performing binarization on the result of the image segmentation; step A4.3: performing black/white point statistics on the binarization result; step A4.4: processing abnormal points through two passes of Gaussian filtering; step A4.5: performing black/white point extreme value statistics in the horizontal direction; step A4.6: counting the range of the black points according to the number of counted points and the directional extreme values; step A4.7: merging pixels according to the range; step A4.8: finding the minimum value point of the pixel combination as the suspected center point; step A4.9: counting and combining the horizontal and vertical coordinates as the center point coordinates; step A4.10: transmitting the center point coordinates.
Example 7
On the basis of the above embodiment, the execution process of the USB camera lens vignetting correction configuration includes: step B1: transmitting the raw data; step B2: separating the raw data into the R, GR, GB and B channels; step B3: performing median filtering; step B4: performing Gaussian filtering; step B5: dividing the raw data into blocks; step B6: merging the raw data in 3 x 3 blocks; step B7: taking the extreme values of the raw data of each channel and their corresponding positions; step B8: comparing the extreme values of the R, GR, GB and B channels with the mean value to obtain a comparison result; step B9: extracting the mean value at the center of each raw data block; step B10: extracting the extreme value at the center of each raw data block; step B11: performing least-squares curve fitting; step B12: taking the fitted value for the edge and the true value for the remaining part to obtain the result.
Specifically, the original data is original image data acquired by the USB camera.
Specifically, a RAW picture is set: length: w, width: H.
The RAW picture data consists of the four channels R, GR, GB and B; single-channel RAW data for R, GR, GB and B are obtained by separating the data according to the color order of the RAW pattern.
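A sketch of this four-channel separation, assuming an RGGB Bayer layout; the actual color order depends on the sensor and its configuration.

```python
import numpy as np

def split_bayer_rggb(raw: np.ndarray):
    r  = raw[0::2, 0::2]      # R sites
    gr = raw[0::2, 1::2]      # G sites on the R rows (GR channel)
    gb = raw[1::2, 0::2]      # G sites on the B rows (GB channel)
    b  = raw[1::2, 1::2]      # B sites
    return r, gr, gb, b       # each plane is (H/2) x (W/2), as in the text
```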
The median filtering process is as follows:
original matrix: R[W/2, H/2]
matrix after conversion: r[W/2, H/2]
median kernel: k = [0,1,0; 1,1,1; 0,1,0]
median() is the median function:
r[x,y] = median(R[x,y-1], R[x-1,y], R[x,y], R[x+1,y], R[x,y+1])
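The cross-shaped median step defined by this formula can be written directly in NumPy; in this sketch the borders are left unfiltered for brevity.

```python
import numpy as np

def cross_median(R: np.ndarray) -> np.ndarray:
    """Replace each interior sample by the median of itself and its 4 cross neighbours."""
    r = R.astype(float)
    # left, up, centre, down, right neighbours of every interior position
    stacked = np.stack([R[1:-1, :-2], R[:-2, 1:-1], R[1:-1, 1:-1],
                        R[2:, 1:-1], R[1:-1, 2:]]).astype(float)
    r[1:-1, 1:-1] = np.median(stacked, axis=0)
    return r
```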
The Gaussian filtering process is as follows:
original matrix: R[W/2, H/2]
matrix after conversion: K[W/2, H/2]
Gaussian matrix dimension: n x n, with n = 5 and N = (n + 1)/2
5 x 5 Gaussian matrix:
k = [0.3679 0.5353 0.6065 0.5353 0.3679;
0.5353 0.7788 0.8825 0.7788 0.5353;
0.6065 0.8825 1.0000 0.8825 0.6065;
0.5353 0.7788 0.8825 0.7788 0.5353;
0.3679 0.5353 0.6065 0.5353 0.3679]
Temporary matrix:
I = [ r(x-(n-1)/2, y-(n-1)/2), ..., r(x-(n-1)/2, y+(n-1)/2);
...
r(x+(n-1)/2, y-(n-1)/2), ..., r(x+(n-1)/2, y+(n-1)/2) ]
Each element of the matrix I is compared with the central value r[x, y] +/- Δt (Δt is a preset value); data within the range keep their original value, and data outside the range are replaced by r[x, y]. A new matrix T of dimension n x n is obtained.
The matrix T is multiplied by the matrix I to obtain a new matrix V:
V = T * I
Carrying out dicing processing on the matrix V;
S1=[
V(1,1),...,V(1,N),
...
V(N,1),...,V(N,N)
]
S2=[
V(N,1),...,V(N,N),
...
V(2*N-1,1),...,V(2*N-1,N)
]
S3=[
V(1,N),...,V(1,2*N-1),
...
V(N,N),...,V(N,2*N-1),
]
S4=[
V(N,N),...,V(N,2*N-1),
...
V(2*N-1,N),...,V(2*N-1,2*N-1)
]
the identically indexed data of S1, S2, S3 and S4 are sorted respectively:
M1 = Sort(S1(1), S2(1), S3(1), S4(1))
...
M(N*N) = Sort(S1(N*N), S2(N*N), S3(N*N), S4(N*N))
The maximum value and the minimum value of each M are removed to form new one-dimensional matrices:
F1 = [M1(2), ..., M(N*N)(2)]
F2 = [M1(3), ..., M(N*N)(3)]
F = [F1(1)+F2(1), ..., F1(N*N)+F2(N*N)]
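A sketch of this block step for a general N: the four overlapping N x N corner blocks of V are taken, the four samples sharing an index are sorted, the extremes are dropped, and the two middle values are summed to give F.

```python
import numpy as np

def trimmed_block_sum(V: np.ndarray, N: int) -> np.ndarray:
    """V has shape (2N-1, 2N-1); returns the N x N matrix F described above."""
    S1 = V[:N, :N]            # top-left N x N block
    S2 = V[N - 1:, :N]        # bottom-left block (shares row N with S1)
    S3 = V[:N, N - 1:]        # top-right block
    S4 = V[N - 1:, N - 1:]    # bottom-right block
    ordered = np.sort(np.stack([S1, S2, S3, S4]), axis=0)   # sort co-indexed samples
    return ordered[1] + ordered[2]   # drop min and max, sum the two middle values (F)
```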
Least-squares curve fitting is then performed:
x and y are the data points, n is the polynomial order, the returned p is the polynomial coefficient vector with powers from high to low, and polyfit is a MATLAB function:
p=polyfit(x,y,n)。
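The same least-squares fit can be reproduced in Python: numpy.polyfit mirrors MATLAB's polyfit and returns the coefficients from the highest power down. The sample data below is illustrative only.

```python
import numpy as np

x = np.arange(10, dtype=float)            # sample positions (illustrative)
y = 1.0 + 0.02 * (x - 4.5) ** 2           # illustrative vignetting-like profile
p = np.polyfit(x, y, 2)                   # p = polyfit(x, y, n) with n = 2
fitted = np.polyval(p, x)                 # evaluate the fitted curve
print(p)
print(fitted[:3])
```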
Example 8
On the basis of the above embodiment, the USB video color configuration includes: a USB video color configuration and a USB video black and white configuration.
It should be noted that the system provided in the foregoing embodiment is only illustrated by the above division into functional units. In practical applications the functions may be assigned to different functional units as needed; that is, the units or steps in the embodiments of the present invention may be further decomposed or combined. For example, the units of the foregoing embodiment may be combined into one unit or further decomposed into multiple sub-units, so as to complete all or part of the functions described above. The names of the units and steps involved in the embodiments of the present invention are only for distinguishing the units or steps and are not to be construed as unduly limiting the present invention.
It can be clearly understood by those skilled in the art that, for convenience and brevity of description, the specific working processes and related descriptions of the storage device and the processing device described above may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
Those of skill in the art would appreciate that the various illustrative elements, method steps, described in connection with the embodiments disclosed herein may be implemented as electronic hardware, computer software, or combinations of both, and that programs corresponding to the elements, method steps may be located in Random Access Memory (RAM), memory, Read Only Memory (ROM), electrically programmable ROM, electrically erasable programmable ROM, registers, hard disk, a removable disk, a CD-ROM, or any other form of storage medium known in the art. To clearly illustrate this interchangeability of electronic hardware and software, various illustrative components and steps have been described above generally in terms of their functionality. Whether such functionality is implemented as electronic hardware or software depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention.
The terms "first," "second," and the like are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order.
The terms "comprises," "comprising," or any other similar term are intended to cover a non-exclusive inclusion, such that a process, method, article, or unit/apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or unit/apparatus.
So far, the technical solutions of the present invention have been described in connection with the preferred embodiments shown in the drawings, but it is easily understood by those skilled in the art that the scope of the present invention is obviously not limited to these specific embodiments. Equivalent modifications or substitutions of the related technical features may be made by those skilled in the art without departing from the principle of the present invention, and the technical solutions after such modifications or substitutions will fall within the protective scope of the present invention.
The above description is only a preferred embodiment of the present invention, and is not intended to limit the scope of the present invention.

Claims (8)

1. A personalized configuration system for a USB camera at the manufacturing end, characterized in that the system is provided to the USB camera manufacturing end and performs personalized configuration of the USB camera based on user requirements, and comprises:
a basic function configuration part, used to configure device information, video effects and firmware download for the USB camera;
a professional function configuration part, used to perform optical correction configuration and personalized function configuration of the USB camera, wherein the optical correction configuration comprises: USB camera lens vignetting correction configuration, dual-camera center point correction configuration and dual-sensor frame synchronization picture center correction configuration, and the personalized function configuration comprises: an automatic exposure mode configuration and a voice coil motor drive configuration; and
a video configuration part, used to perform personalized video function configuration of the USB camera, wherein the personalized video function configuration comprises: USB video resolution configuration, USB video format configuration, USB video photographing setting, USB audio output configuration, USB video bandwidth configuration and USB video color configuration.
2. The system of claim 1, wherein the system further comprises an upgrade section; the upgrade section comprises a local end and a server end; the local end is configured to send an upgrade request to the server end; the server end responds to the upgrade request of the local end and sends the update content to the local end; and the local end upgrades the system based on the update content received from the server end.
3. The system of claim 1, wherein the method of the dual-camera center point correction configuration comprises: integrating two USB cameras side by side on one circuit board; and the method of the dual-sensor frame synchronization picture center correction configuration comprises: connecting the image processor on the circuit board of one USB camera to two image sensors placed side by side.
4. The system of claim 3, wherein the implementation process of the dual-sensor frame synchronization picture center correction configuration comprises: step A1: enumerating the USB camera; step A2: enumerating the target device; step A3: acquiring the video stream; step A4: obtaining the coordinates of the center points of the feature regions in the output pictures of the first image sensor and the second image sensor; step A5: comparing the coordinate difference between the feature-region center point coordinates of the first and second image sensors; if the comparison result exceeds a preset condition, executing step A6, otherwise executing step A7; step A6: transmitting the deviation value to the ISP through the UVC XU protocol, the ISP controlling the data output by the second image sensor so that the whole image is shifted along the X-axis/Y-axis, and returning to step A4; step A7: the correction is successful; step A8: storing the data into the ISP flash memory.
5. The system according to claim 4, wherein the step A6 specifically includes: taking the first image sensor as the reference, calculating the deviation between the center point coordinates (x2, y2) of the feature region in the output picture of the second image sensor and the center point coordinates (x1, y1) of the feature region in the first image sensor; if the abscissa deviation x2 - x1 is greater than 4, the deviation value is transmitted to the ISP through the UVC XU, and the ISP controls the data output by the second image sensor so that the whole image is shifted to the left by |x2 - x1| pixel points; if the abscissa deviation x2 - x1 is less than -4, the deviation value is transmitted to the ISP through the UVC XU, and the ISP controls the data output by the second image sensor so that the whole image is shifted to the right by |x2 - x1| pixel points; if x2 - x1 is less than or equal to 4 and greater than or equal to -4, and the ordinate deviation y2 - y1 is greater than 4, the deviation value is transmitted to the ISP through the UVC XU, and the ISP controls the data output by the second image sensor so that the whole image is shifted upward by |y2 - y1| pixel points; and if x2 - x1 is less than or equal to 4 and greater than or equal to -4, and the ordinate deviation y2 - y1 is less than -4, the deviation value is transmitted to the ISP through the UVC XU, and the ISP controls the data output by the second image sensor so that the whole image is shifted downward by |y2 - y1| pixel points.
6. The system according to claim 5, wherein the step A4 specifically includes: step A4.1: performing image segmentation on the outputs of the first image sensor and the second image sensor to complete data separation; step A4.2: performing binarization on the result of the image segmentation; step A4.3: performing black/white point statistics on the binarization result; step A4.4: processing abnormal points through two passes of Gaussian filtering; step A4.5: performing black/white point extreme value statistics in the horizontal direction; step A4.6: counting the range of the black points according to the number of counted points and the directional extreme values; step A4.7: merging pixels according to the range; step A4.8: finding the minimum value point of the pixel combination as the suspected center point; step A4.9: counting and combining the horizontal and vertical coordinates as the center point coordinates; step A4.10: transmitting the center point coordinates.
7. The system of claim 1, wherein the execution process of the USB camera lens vignetting correction configuration comprises: step B1: transmitting the raw data; step B2: separating the raw data into the R, GR, GB and B channels; step B3: performing median filtering on the result of the raw data separation; step B4: performing Gaussian filtering on the result of the median filtering; step B5: dividing the result of the Gaussian filtering into raw data blocks; step B6: merging the raw data in 3 x 3 blocks; step B7: taking the extreme values of the raw data of each channel and their corresponding positions; step B8: comparing the extreme values of the R, GR, GB and B channels with the mean value to obtain a comparison result; step B9: extracting the mean value at the center of each raw data block; step B10: extracting the extreme value at the center of each raw data block; step B11: performing least-squares curve fitting; and step B12: taking the fitted value for the edge and the true value for the remaining part to obtain the result.
8. The system of claim 1, wherein the USB video color configuration comprises: a USB video color configuration and a USB video black and white configuration.
CN202210396270.3A 2022-04-15 2022-04-15 Personalized configuration system of USB camera at manufacturing end Active CN114500995B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210396270.3A CN114500995B (en) 2022-04-15 2022-04-15 Personalized configuration system of USB camera at manufacturing end

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210396270.3A CN114500995B (en) 2022-04-15 2022-04-15 Personalized configuration system of USB camera at manufacturing end

Publications (2)

Publication Number Publication Date
CN114500995A true CN114500995A (en) 2022-05-13
CN114500995B CN114500995B (en) 2022-06-21

Family

ID=81489408

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210396270.3A Active CN114500995B (en) 2022-04-15 2022-04-15 Personalized configuration system of USB camera at manufacturing end

Country Status (1)

Country Link
CN (1) CN114500995B (en)

Patent Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20060274170A1 (en) * 2005-06-07 2006-12-07 Olympus Corporation Image pickup device
CN101154222A (en) * 2006-09-27 2008-04-02 中国移动通信集团公司 System and method for on-line updating map
JP2008131177A (en) * 2006-11-17 2008-06-05 Aisin Seiki Co Ltd Correcting device for on-board camera, correcting method, and production method for vehicle using same correcting method
US20110007969A1 (en) * 2009-07-08 2011-01-13 Samsung Electronics Co., Ltd. Method and apparatus for correcting lens shading
US20110279686A1 (en) * 2010-05-17 2011-11-17 Hon Hai Precision Industry Co., Ltd. Image correction device and image correction method thereof
US20120050567A1 (en) * 2010-09-01 2012-03-01 Apple Inc. Techniques for acquiring and processing statistics data in an image signal processor
CN103369347A (en) * 2012-03-05 2013-10-23 苹果公司 Camera blemish defects detection
JP2015215430A (en) * 2014-05-09 2015-12-03 キヤノン株式会社 Image blurring correction device, optical instrument, and image blurring correction method
US20190281221A1 (en) * 2016-08-05 2019-09-12 Sony Corporation Imaging device, solid-state image sensor, camera module, drive control unit, and imaging method
CN108429872A (en) * 2017-02-13 2018-08-21 半导体元件工业有限责任公司 Method, imaging device and imaging system for correcting image vignetting
WO2019009008A1 (en) * 2017-07-05 2019-01-10 Sony Semiconductor Solutions Corporation Imaging apparatus with second imaging element used for correcting vignetting in images captured by first imaging element
CN113170028A (en) * 2019-01-30 2021-07-23 华为技术有限公司 Method for generating image data of imaging algorithm based on machine learning

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Li Zhe et al.: "Moving Object Detection Based on Embedded ZedBoard and OpenCV", Computer & Digital Engineering *

Also Published As

Publication number Publication date
CN114500995B (en) 2022-06-21


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant