CN114500995B - Personalized configuration system of USB camera at manufacturing end - Google Patents

Personalized configuration system of USB camera at manufacturing end

Info

Publication number
CN114500995B
CN114500995B
Authority
CN
China
Prior art keywords
image sensor
configuration
usb
video
center point
Prior art date
Legal status
Active
Application number
CN202210396270.3A
Other languages
Chinese (zh)
Other versions
CN114500995A (en)
Inventor
王伟光
Current Assignee
Beiyuan Technology Shenzhen Co ltd
Original Assignee
Beiyuan Technology Shenzhen Co ltd
Priority date
Filing date
Publication date
Application filed by Beiyuan Technology Shenzhen Co ltd filed Critical Beiyuan Technology Shenzhen Co ltd
Priority to CN202210396270.3A priority Critical patent/CN114500995B/en
Publication of CN114500995A publication Critical patent/CN114500995A/en
Application granted granted Critical
Publication of CN114500995B publication Critical patent/CN114500995B/en

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 17/00 Diagnosis, testing or measuring for television systems or their details
    • H04N 17/002 Diagnosis, testing or measuring for television systems or their details for television cameras
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L 41/00 Arrangements for maintenance, administration or management of data switching networks, e.g. of packet switching networks
    • H04L 41/08 Configuration management of networks or network elements
    • H04L 41/0803 Configuration setting
    • H04L 41/0813 Configuration setting characterised by the conditions triggering a change of settings
    • H04L 41/082 Configuration setting characterised by the conditions triggering a change of settings the condition being updates or upgrades of network functionality

Landscapes

  • Engineering & Computer Science (AREA)
  • Signal Processing (AREA)
  • Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • General Health & Medical Sciences (AREA)
  • Multimedia (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Studio Devices (AREA)

Abstract

The invention belongs to the technical field of computer software, and particularly relates to a personalized configuration system for a USB camera at the manufacturing end. The system is provided to the USB camera manufacturing end and performs personalized configuration of the USB camera based on user requirements. It comprises: a basic function configuration part, used to configure device information, video effects and firmware download for the USB camera; a professional function configuration part, used to perform optical correction configuration and personalized function configuration of the USB camera; and a video configuration part, used to perform personalized video function configuration of the USB camera. The invention meets the needs of USB camera manufacturing and of manufacturers' personalized customization.

Description

Personalized configuration system of USB camera at manufacturing end
Technical Field
The invention belongs to the technical field of computer software, and particularly relates to a personalized configuration system for a USB camera at a manufacturing end.
Background
Cameras that output a video stream over a USB interface are very widely used because of the universality of the USB interface and the convenience of plug and play. Generally, once development of the USB camera firmware is completed, the camera enters the manufacturing and production stage, in which the manufacturer faces the various specification requirements of its end users as well as quality problems caused by physical factors in the manufacturing process. Manufacturers typically feed these requirements back to the USB camera firmware development side so that the firmware program can be modified. End-user requirements are numerous and varied, which puts great time pressure on both the firmware development side and the manufacturing side. Multifunctional, server-upgradeable configuration software for the USB camera therefore meets these varied requirements: it helps USB camera manufacturers satisfy user demands through software and greatly reduces the pressure on the camera firmware development side.
Disclosure of Invention
In view of this, the main objective of the present invention is to provide a personalized configuration system for a USB camera at the manufacturing end, which meets the needs of USB camera manufacturing and of manufacturers' personalized customization.
In order to achieve this purpose, the technical solution of the invention is realized as follows:
the personalized configuration system for a USB camera at the manufacturing end is provided to the USB camera manufacturing end and performs personalized configuration of the USB camera based on user requirements; it comprises:
the basic function configuration part, used to configure device information, video effects and firmware download for the USB camera;
the professional function configuration part, used to perform optical correction configuration and personalized function configuration of the USB camera; the optical correction configuration comprises: USB camera lens vignetting correction configuration, dual-camera center point correction configuration and dual-sensor frame synchronization picture center correction configuration; the personalized function configuration comprises: an automatic exposure mode configuration and a voice coil motor drive configuration;
the video configuration part, used to perform personalized video function configuration of the USB camera; the personalized video function configuration comprises: USB video resolution configuration, USB video format configuration, USB video photographing setting, USB audio output configuration, USB video bandwidth configuration and USB video color configuration.
Further, the system further comprises an upgrade section; the upgrade section includes a local end and a server end; the local end is configured to send an upgrade request to the server end; the server end responds to the upgrade request from the local end and sends the update content to the local end; and the local end upgrades the system based on the update content received from the server end.
Further, the method of the dual-camera center point correction configuration comprises: integrating two USB cameras on a circuit board in a side-by-side arrangement; the method of the dual-sensor frame synchronization picture center correction configuration comprises: connecting the image processor on the circuit board of one USB camera to two image sensors arranged side by side.
Further, the execution process of the dual-sensor frame synchronization picture center correction configuration includes: step A1: enumerating the USB camera; step A2: enumerating the target device; step A3: acquiring the video stream; step A4: obtaining the coordinates of the center point of the feature region in the output pictures of the first image sensor and the second image sensor; step A5: comparing the coordinate difference between the feature-region center point coordinates of the first image sensor and the second image sensor; if the comparison result exceeds a preset condition, executing step A6, and if it does not exceed the preset condition, executing step A7; step A6: transmitting the deviation value to the ISP through the UVC XU protocol, the ISP controlling the data output by the second image sensor so that the whole image is shifted in the X-axis/Y-axis direction, and returning to step A4; step A7: the correction succeeds; step A8: storing the data into the ISP flash memory.
Further, step A6 specifically includes: taking the first image sensor as the reference, calculating the deviation between the center point coordinates (x2, y2) of the feature region in the output picture of the second image sensor and the center point coordinates (x1, y1) of the feature region in the first image sensor. If x2 - x1 is greater than 4, the deviation value is transmitted to the ISP through the UVC XU, and the ISP controls the data output by the second image sensor so that the whole image is shifted left by |x2 - x1| pixel points. If x2 - x1 is less than -4, the deviation value is transmitted to the ISP through the UVC XU, and the ISP controls the data output by the second image sensor so that the whole image is shifted right by |x2 - x1| pixel points. If x2 - x1 is less than or equal to 4 and greater than or equal to -4, and y2 - y1 is greater than 4, the deviation value is transmitted to the ISP through the UVC XU, and the ISP controls the data output by the second image sensor so that the whole image is shifted up by |y2 - y1| pixel points. If x2 - x1 is less than or equal to 4 and greater than or equal to -4, and y2 - y1 is less than -4, the deviation value is transmitted to the ISP through the UVC XU, and the ISP controls the data output by the second image sensor so that the whole image is shifted down by |y2 - y1| pixel points.
Further, step A4 specifically includes: step A4.1: performing image segmentation for the first image sensor and the second image sensor to complete data separation; step A4.2: performing binarization processing on the result of the image segmentation; step A4.3: performing black and white point statistics on the result of the binarization processing; step A4.4: performing abnormal point processing through two passes of Gaussian filtering; step A4.5: performing black and white point extremum statistics in the horizontal direction; step A4.6: counting the range of the black points from the point-count and direction extrema; step A4.7: merging pixels according to the range; step A4.8: finding the minimum point of the pixel merging as the suspected center point; step A4.9: statistically merging the horizontal and vertical coordinates as the center point coordinates; step A4.10: transmitting the center point coordinates.
Further, the execution process of the USB camera lens vignetting correction configuration includes: step B1: transmitting the original data; step B2: separating the original data into the R channel, GR channel, GB channel and B channel respectively; step B3: performing median filtering; step B4: performing Gaussian filtering; step B5: performing original data blocking; step B6: merging the original data in blocks of size 3 x 3; step B7: taking the extreme values of the original data of each channel and their corresponding positions; step B8: comparing the extreme values of the R channel, GR channel, GB channel and B channel with the mean value respectively to obtain a comparison result; step B9: extracting the mean value of the center of each original data block; step B10: extracting the extreme value of the center of each original data block; step B11: performing least squares curve fitting; step B12: taking the fitted value for the edge and the true value for the rest to obtain the result.
Further, the USB video color configuration includes: a USB video color configuration and a USB video black and white configuration.
The personalized configuration system for a USB camera at the manufacturing end has the following beneficial effects:
1. Personalized configuration: the invention provides feasible configuration options to the camera manufacturer, and the firmware can be modified and upgraded according to end-user requirements.
2. Improved camera quality: the professional configuration of the invention can be used to detect and correct the component quality of the camera, thereby improving the yield.
3. Rich functionality: compared with prior-art USB cameras that provide only basic functions, the system of the invention provides more functions when working together with the USB camera.
Drawings
Fig. 1 is a schematic system structure diagram of a personalized configuration system for a USB camera at a manufacturing end according to an embodiment of the present invention;
fig. 2 is a schematic diagram of an execution flow of a dual-sensor frame synchronization picture center correction configuration of a USB camera personalized configuration system at a manufacturing end according to an embodiment of the present invention;
fig. 3 is a schematic flowchart of step a6 when the personalized configuration system for USB camera at the manufacturing end performs center correction configuration on a frame synchronization picture of a dual sensor according to an embodiment of the present invention;
fig. 4 is a schematic flowchart of a process of obtaining a center point coordinate when the personalized configuration system of the USB camera at the manufacturing end performs center correction configuration on a frame synchronization picture of a dual sensor according to an embodiment of the present invention;
fig. 5 is a schematic diagram of the black feature region used in the binarization processing and black/white point statistics of the dual-sensor frame synchronization picture center correction configuration of the USB camera personalized configuration system at the manufacturing end according to the embodiment of the present invention;
fig. 6 is a schematic flow chart of a USB camera lens vignetting correction configuration performed by the USB camera personalized configuration system at the manufacturing end according to the embodiment of the present invention.
Detailed Description
The system of the present invention will be described in further detail below with reference to the accompanying drawings and embodiments of the invention.
Example 1
As shown in fig. 1, the personalized configuration system for a USB camera at the manufacturing end, which is provided to the USB camera manufacturing end and performs personalized configuration of the USB camera based on user requirements, includes:
the basic function configuration part, used to configure device information, video effects and firmware download for the USB camera;
the professional function configuration part, used to perform optical correction configuration and personalized function configuration of the USB camera; the optical correction configuration comprises: USB camera lens vignetting correction configuration, dual-camera center point correction configuration and dual-sensor frame synchronization picture center correction configuration; the personalized function configuration comprises: an automatic exposure mode configuration and a voice coil motor drive configuration;
the video configuration part, used to perform personalized video function configuration of the USB camera; the personalized video function configuration comprises: USB video resolution configuration, USB video format configuration, USB video photographing setting, USB audio output configuration, USB video bandwidth configuration and USB video color configuration.
Specifically, the USB camera: a camera scheme or product that uses the USB interface for video output, mainly following the USB Video Class (UVC) protocol of the USB video standard established by the USB Implementers Forum.
Example 2
On the basis of the above embodiment, the system further includes an upgrade section; the upgrade section includes a local end and a server end; the local end is configured to send an upgrade request to the server end; the server end responds to the upgrade request from the local end and sends the update content to the local end; and the local end upgrades the system based on the update content received from the server end.
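A minimal Python sketch of this upgrade flow is given below; the endpoint URL and the JSON fields of the reply are illustrative assumptions and are not specified by the invention.

```python
import json
import urllib.request
from typing import Optional

UPDATE_URL = "https://example.com/usb-camera-tool/update"  # hypothetical endpoint, not from the patent

def check_and_fetch_update(current_version: str) -> Optional[bytes]:
    """Send an upgrade request to the server end and return the update content, if any."""
    request = urllib.request.Request(
        UPDATE_URL,
        data=json.dumps({"version": current_version}).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(request, timeout=10) as response:
        reply = json.loads(response.read())
    if reply.get("update_available"):                      # assumed response field
        with urllib.request.urlopen(reply["download_url"], timeout=60) as package:
            return package.read()                          # update content applied by the local end
    return None
```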
Specifically, the device information configuration includes: configuration of the attribute information of the USB camera device and configuration of the items enumerated by the USB host. The enumerated items include the device ID, name and product serial number.
Specifically, the video effect configuration includes: adjustment of the video image effect parameters defined as attributes in the UVC protocol followed by the USB camera. Generally, an end user can adjust these values for their own environment, but the values need to be saved in the camera memory so that they take effect the next time the camera is opened instead of reverting to the default values.
The firmware download configuration includes: configuration of the program running in the memory of the USB camera, namely the firmware. For the camera to work normally, a burning tool is needed to burn the firmware into the camera memory.
Example 3
On the basis of the previous embodiment, the method of the dual-camera center point correction configuration comprises: integrating two USB cameras on a circuit board in a side-by-side arrangement; the method of the dual-sensor frame synchronization picture center correction configuration comprises: connecting the image processor on the circuit board of one USB camera to two image sensors arranged side by side.
Specifically, camera shading/vignetting correction: lens shading correction corrects the brightness and color non-uniformity produced by the camera lens. In camera production, unreasonable matching between the CMOS image sensor and the quality and parameters of the lens can make the picture bright in the middle and dark at the edges, with a large color deviation. A manufacturer can therefore use this module of the software to correct each camera on the production line, ensuring that the brightness and color of every camera are uniform and the quality is consistent.
Specifically, dual-camera center correction: some camera modules integrate two independent USB cameras on one board. Because of physical errors in the production process, the centers of the video pictures of the two cameras deviate from each other, so a center correction procedure is added in the production process. In this way, the camera user does not need extra algorithmic correction when developing with the camera, which saves development resources and time.
Dual-sensor frame synchronization picture center correction configuration: in some camera modules, one main control chip (ISP) is connected to two CMOS sensors at the same time, and the output pictures are frame-synchronized. Likewise, because of physical errors in the production process, the centers of the pictures output by the two sensors deviate from each other, so a center correction procedure needs to be added in the production process. In this way, the camera user does not need extra algorithmic correction when developing with the camera, which saves development resources and time.
Example 4
As shown in fig. 2, on the basis of the above embodiment, the execution process of the dual-sensor frame synchronization picture center correction configuration includes: step A1: enumerating the USB camera; step A2: enumerating the target device; step A3: acquiring the video stream; step A4: obtaining the coordinates of the center point of the feature region in the output pictures of the first image sensor and the second image sensor; step A5: comparing the coordinate difference between the feature-region center point coordinates of the first image sensor and the second image sensor; if the comparison result exceeds a preset condition, executing step A6, and if it does not exceed the preset condition, executing step A7; step A6: transmitting the deviation value to the ISP through the UVC XU protocol, the ISP controlling the data output by the second image sensor so that the whole image is shifted in the X-axis/Y-axis direction, and returning to step A4; step A7: the correction succeeds; step A8: storing the data into the ISP flash memory.
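A Python sketch of this A1-A8 correction loop is given below; the camera interface (frame grabbing, the UVC XU transfer and the ISP flash write) and the feature-center helper are assumed placeholders, and only the loop structure and exit condition follow the steps above.

```python
THRESHOLD = 4  # preset condition on the coordinate difference, in pixel points

def correct_frame_centers(camera, find_feature_center, max_iterations=10):
    """Run the A4-A8 loop; steps A1-A3 (enumeration, stream acquisition) are assumed done by `camera`."""
    for _ in range(max_iterations):
        frame0, frame1 = camera.grab_synchronized_frames()    # assumed interface: one frame per sensor
        x1, y1 = find_feature_center(frame0)                  # step A4, Sensor0 (reference)
        x2, y2 = find_feature_center(frame1)                  # step A4, Sensor1
        dx, dy = x2 - x1, y2 - y1                             # step A5: coordinate difference
        if abs(dx) <= THRESHOLD and abs(dy) <= THRESHOLD:     # within the preset condition
            camera.save_to_isp_flash()                        # steps A7-A8 (assumed interface)
            return True
        camera.send_xu_offset(dx, dy)                         # step A6: deviation sent to the ISP via UVC XU
    return False
```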
Specifically, the target device is a device for storing the acquired image after the image is acquired by the USB camera.
Specifically, the meaning of the automatic exposure mode configuration is as follows: the USB camera supports automatic exposure. A typical USB camera uses the average brightness of the image as the reference value for ambient brightness, but this cannot satisfy the brightness requirements of all usage environments. The automatic exposure mode of the invention provides three functions. The first is the conventional average metering mode. The second is a center metering mode: in a typical camera scene the effective subject is in the central area of the picture, so center metering solves the problem of the subject being improperly exposed. The third is arbitrary-region metering (ROI): application software on the computer connected to the USB camera tells the camera which region of the picture to use as the main brightness reference; given suitable coordinates and range, the USB camera uses that region as the key exposure region. This solves the application problems of many more scenes.
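A Python sketch of the three metering modes is given below; it only illustrates how the brightness reference value could be computed, and the center window size and the ROI format are assumptions rather than values specified by the invention.

```python
import numpy as np

def metering_reference(gray: np.ndarray, mode: str = "average", roi=None) -> float:
    """Return the brightness reference used by auto exposure. gray: 2-D luminance image."""
    h, w = gray.shape
    if mode == "average":                        # mode 1: whole-frame average metering
        return float(gray.mean())
    if mode == "center":                         # mode 2: center metering over a central window
        cy, cx = h // 4, w // 4                  # central half of the height and width (assumed size)
        return float(gray[cy:h - cy, cx:w - cx].mean())
    if mode == "roi" and roi is not None:        # mode 3: arbitrary-region (ROI) metering
        x, y, rw, rh = roi                       # assumed ROI format: (x, y, width, height)
        return float(gray[y:y + rh, x:x + rw].mean())
    raise ValueError("unknown metering mode")
```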
Specifically, the meaning of the voice coil motor drive configuration is as follows: for a camera with an auto-focus function, the lens assembly is fitted with a voice coil motor (VCM) driven by a driver chip. There are many VCM driver models; the invention adapts to the common models so that camera manufacturers can select the matching one themselves.
Specifically, the USB camera resolution configuration: the list of resolutions the USB camera can output. The USB camera firmware contains a series of resolutions for the user to choose from, and different users have different resolution requirements; the invention supports configuration by the manufacturer according to user requirements.
USB video format: the video formats the USB camera can output, namely YUV or MJPEG-compressed video streams.
In figs. 2, 3 and 4, Sensor0 and Sensor1 represent the first image sensor and the second image sensor, respectively; RAW denotes the raw data.
Referring to fig. 5: the image size is Width x Height, the target region size is w x h, T is the binarization threshold, and (x, y) is a coordinate point in the target region, with Width/2 - w/2 <= x <= Width/2 + w/2 and Height/2 - h/2 <= y <= Height/2 + h/2.
The binarization process is as follows:
The color of a pixel is represented by three RGB values, so each pixel corresponds to three color values r, g and b. The g (green) value g(x, y) of each point in the target region is compared with the binarization threshold T, so that the RGB image of the target region is binarized into values of only 0 and 1:
G(x, y) = 1 if g(x, y) < T (black point), and G(x, y) = 0 if g(x, y) >= T (white point).
The black spot statistical process is as follows:
by HCNT(n) represents the number of G (x, y) =1 in the nth row within the target region:
HCNT(n)=
Figure DEST_PATH_IMAGE016
with ACNTRepresents the sum of the numbers of all G (x, y) =1 within the target area:
ACNT=
Figure DEST_PATH_IMAGE018
Abnormal point processing:
Rows in which the number of points with G(x, y) = 1 is less than the rated value 3 are treated as containing invalid black points and are filtered out:
if H_CNT(n) < 3, then G(x, n) = 0 for every x in row n.
G(x, y) is then Gaussian-filtered to obtain a new G_N(x, y).
Gaussian filtering:
Filter matrix: K = [1,1,1; 1,1,1; 1,1,1]
V = G(x-1,y-1)+G(x-1,y)+G(x-1,y+1)+G(x,y-1)+G(x,y)+G(x,y+1)+G(x+1,y-1)+G(x+1,y)+G(x+1,y+1)
G_N(x, y) = V
Acquiring the coordinates of the center point of the black region in the image:
Take all (x, y) for which G_N(x, y) is greater than 3, giving (x_1, x_2, ..., x_n) and (y_1, y_2, ..., y_n), and take the average of each:
X_0 = (x_1 + x_2 + ... + x_n) / n
Y_0 = (y_1 + y_2 + ... + y_n) / n
(X_0, Y_0) is the coordinate of the center point of the black feature region.
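A Python sketch of this center-point extraction is given below, assuming the target region is the central w x h window of the frame and that pixels darker than the threshold T are the black points; NumPy array operations stand in for the per-pixel loops, and the default window size is an illustrative assumption.

```python
import numpy as np

def find_feature_center(rgb: np.ndarray, T: int = 128, w: int = 200, h: int = 200):
    """Return (X0, Y0) of the black feature region inside the central w x h target region."""
    H, W = rgb.shape[:2]
    y0, x0 = H // 2 - h // 2, W // 2 - w // 2
    g = rgb[y0:y0 + h, x0:x0 + w, 1].astype(np.int32)   # g (green) values of the target region
    G = (g < T).astype(np.int32)                        # binarization: 1 = black point, 0 = white point

    G[G.sum(axis=1) < 3, :] = 0                         # rows with fewer than 3 black points are invalid

    GN = np.zeros_like(G)                               # 3 x 3 all-ones neighbourhood sum (matrix K)
    GN[1:-1, 1:-1] = sum(
        G[1 + dy:h - 1 + dy, 1 + dx:w - 1 + dx]
        for dy in (-1, 0, 1) for dx in (-1, 0, 1)
    )

    ys, xs = np.nonzero(GN > 3)                         # keep points with more than 3 black neighbours
    if xs.size == 0:
        return None
    return float(xs.mean() + x0), float(ys.mean() + y0)  # (X0, Y0) in full-image coordinates
```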
Example 5
On the basis of the previous embodiment, step A6 specifically includes: taking the first image sensor as the reference, calculating the deviation between the center point coordinates (x2, y2) of the feature region in the output picture of the second image sensor and the center point coordinates (x1, y1) of the feature region in the first image sensor. If x2 - x1 is greater than 4, the deviation value is transmitted to the ISP through the UVC XU, and the ISP controls the data output by the second image sensor so that the whole image is shifted left by |x2 - x1| pixel points. If x2 - x1 is less than -4, the deviation value is transmitted to the ISP through the UVC XU, and the ISP controls the data output by the second image sensor so that the whole image is shifted right by |x2 - x1| pixel points. If x2 - x1 is less than or equal to 4 and greater than or equal to -4, and y2 - y1 is greater than 4, the deviation value is transmitted to the ISP through the UVC XU, and the ISP controls the data output by the second image sensor so that the whole image is shifted up by |y2 - y1| pixel points. If x2 - x1 is less than or equal to 4 and greater than or equal to -4, and y2 - y1 is less than -4, the deviation value is transmitted to the ISP through the UVC XU, and the ISP controls the data output by the second image sensor so that the whole image is shifted down by |y2 - y1| pixel points.
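A Python sketch of this step A6 decision logic is given below; send_xu_offset stands in for the UVC XU transfer to the ISP and is an assumed interface, not part of the invention.

```python
def step_a6_offset(x1, y1, x2, y2, send_xu_offset, threshold=4):
    """Compare the Sensor1 centre (x2, y2) against the Sensor0 reference (x1, y1) and request a shift."""
    dx, dy = x2 - x1, y2 - y1
    if dx > threshold:                       # centre too far right: shift the whole image left
        send_xu_offset(axis="x", direction="left", pixels=abs(dx))
    elif dx < -threshold:                    # centre too far left: shift right
        send_xu_offset(axis="x", direction="right", pixels=abs(dx))
    elif dy > threshold:                     # abscissa within tolerance, centre too low: shift up
        send_xu_offset(axis="y", direction="up", pixels=abs(dy))
    elif dy < -threshold:                    # abscissa within tolerance, centre too high: shift down
        send_xu_offset(axis="y", direction="down", pixels=abs(dy))
```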
The content of the USB video photographing setting includes: the USB camera supports a video photographing function. Previously a user had to go back to the camera firmware development side to have the photographing mode needed for a scene modified; in the invention the 3 photographing modes are made configurable by the manufacturer.
The content of the USB audio output configuration includes: microphone audio processing built on the USB Audio Class (UAC) standard, including enabling of the microphone function. A USB camera has only one audio processing chip, and the invention supports 3 types so that manufacturers can adapt to their users.
USB video bandwidth: the data transmission rate of the USB interface is limited, and video streams of different sizes can be obtained by compression for transmission. A user may connect the USB camera to a computer or to a Linux or Android device; when the camera's video data stream is too large, the picture on these host platforms may fail, or resources such as the CPU may be consumed excessively. Controlling the video stream is therefore a necessary setting. A conventional USB camera has a fixed video stream size; the invention allows manufacturers to configure different video stream bandwidths according to different user requirements.
USB video color/black and white: the video stream output by the USB camera can be configured as color images or black-and-white images. Manufacturers can provide the color or black-and-white effect according to their customers' scenes and needs. An ordinary USB camera does not have this function, and the camera firmware development side would otherwise have to develop two different sets of programs for the manufacturer.
Example 6
On the basis of the previous embodiment, step A4 specifically includes: step A4.1: performing image segmentation for the first image sensor and the second image sensor to complete data separation; step A4.2: performing binarization processing on the result of the image segmentation; step A4.3: performing black and white point statistics on the result of the binarization processing; step A4.4: performing abnormal point processing through two passes of Gaussian filtering; step A4.5: performing black and white point extremum statistics in the horizontal direction; step A4.6: counting the range of the black points from the point-count and direction extrema; step A4.7: merging pixels according to the range; step A4.8: finding the minimum point of the pixel merging as the suspected center point; step A4.9: statistically merging the horizontal and vertical coordinates as the center point coordinates; step A4.10: transmitting the center point coordinates.
Example 7
On the basis of the above embodiment, the execution process of the USB camera lens vignetting correction configuration includes: step B1: transmitting the original data; step B2: separating the original data into the R channel, GR channel, GB channel and B channel respectively; step B3: performing median filtering; step B4: performing Gaussian filtering; step B5: performing original data blocking; step B6: merging the original data in blocks of size 3 x 3; step B7: taking the extreme values of the original data of each channel and their corresponding positions; step B8: comparing the extreme values of the R channel, GR channel, GB channel and B channel with the mean value respectively to obtain a comparison result; step B9: extracting the mean value of the center of each original data block; step B10: extracting the extreme value of the center of each original data block; step B11: performing least squares curve fitting; step B12: taking the fitted value for the edge and the true value for the rest to obtain the result.
Specifically, the original data is original image data acquired by the USB camera.
Specifically, let a RAW picture have length W and width H.
The RAW picture data consists of the four channels R, GR, GB and B; the single-channel RAW data of R, GR, GB and B are obtained by separation according to the color order of the RAW data.
The median filtering process is as follows:
Original matrix: R[W/2, H/2]
Matrix after conversion: r[W/2, H/2]
Median kernel: k = [0,1,0; 1,1,1; 0,1,0]
median() is the median function:
r[x,y] = median(R[x,y-1], R[x-1,y], R[x,y], R[x+1,y], R[x,y+1])
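A Python sketch of this cross-shaped median filter is given below; it follows the five-point kernel k above, with rows indexed first (r[y, x]) as an implementation choice.

```python
import numpy as np

def cross_median_filter(R: np.ndarray) -> np.ndarray:
    """Apply the five-point median (kernel k = [0,1,0; 1,1,1; 0,1,0]) to one RAW channel."""
    r = R.astype(np.float64).copy()
    for y in range(1, R.shape[0] - 1):
        for x in range(1, R.shape[1] - 1):
            r[y, x] = np.median(
                [R[y, x - 1], R[y - 1, x], R[y, x], R[y + 1, x], R[y, x + 1]]
            )
    return r
```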
The Gaussian filtering process is as follows:
Original matrix: r[W/2, H/2]
Matrix after conversion: K[W/2, H/2]
Gaussian matrix dimension: n x n, with n = 5 and N = (n+1)/2
5 x 5 Gaussian matrix:
k = [0.3679, 0.5353, 0.6065, 0.5353, 0.3679;
0.5353, 0.7788, 0.8825, 0.7788, 0.5353;
0.6065, 0.8825, 1.0000, 0.8825, 0.6065;
0.5353, 0.7788, 0.8825, 0.7788, 0.5353;
0.3679, 0.5353, 0.6065, 0.5353, 0.3679]
Temporary matrix:
I = [ r(x-(n-1)/2, y-(n-1)/2), ..., r(x-(n-1)/2, y+(n-1)/2);
...
r(x+(n-1)/2, y-(n-1)/2), ..., r(x+(n-1)/2, y+(n-1)/2) ]
Each element of the I matrix is compared with the central value r[x, y] +/- Δt (Δt is a preset value); data within this range keep their original value, and data outside the range are replaced by r[x, y]. A new matrix T of dimension n x n is obtained.
The matrix T is multiplied by the matrix I to obtain a new matrix V:
V = T * I
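A Python sketch of the clamping step that produces the matrix T is given below; the subsequent multiplication by I is left as written above, and Δt is treated as a user-supplied preset value.

```python
import numpy as np

def clamp_neighbourhood(I: np.ndarray, centre: float, dt: float) -> np.ndarray:
    """Return T: values of I within centre +/- dt keep their value, the rest are replaced by centre."""
    T = I.astype(np.float64).copy()
    T[np.abs(T - centre) > dt] = centre
    return T
```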
Carrying out dicing processing on the matrix V;
S1=[
V(1,1),...,V(1,N),
...
V(N,1),...,V(N,N)
]
S2=[
V(N,1),...,V(N,N),
...
V(2*N-1,1),...,V(2*N-1,N)
]
S3=[
V(1,N),...,V(1,2*N-1),
...
V(N,N),...,V(N,2*N-1),
]
S4=[
V(N,N),...,V(N,2*N-1),
...
V(2*N-1,N),...,V(2*N-1,2*N-1)
]
The data of S1, S2, S3 and S4 at the same index are sorted:
M1 = Sort(S1(1), S2(1), S3(1), S4(1))
...
MN*N = Sort(S1(N*N), S2(N*N), S3(N*N), S4(N*N))
Removing the maximum value and the minimum value in the M matrix to form a new one-dimensional matrix
F1=[M1(2),...,MN*N(2)]
F2=[M1(3),...,MN*N(3)]
F=[F1(1)+F2(1),...,F1(N*N)+F2(N*N)]
F = F / 2 (the mean of the two remaining middle values)
Least squares curve fitting:
x and y are the data points, n is the polynomial order, and the returned p is the polynomial coefficient vector with powers from high to low; polyfit is a function in MATLAB.
p=polyfit(x,y,n)。
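An equivalent of the MATLAB polyfit call using NumPy is sketched below; the x and y data here are made-up placeholders, and np.polyfit returns the coefficients from the highest power down, as in MATLAB.

```python
import numpy as np

x = np.arange(10, dtype=np.float64)        # block index along one axis (placeholder data)
y = 1.0 + 0.002 * (x - 4.5) ** 2           # placeholder brightness falloff profile
n = 2                                      # polynomial order

p = np.polyfit(x, y, n)                    # coefficient vector, highest power first (as in MATLAB)
fitted = np.polyval(p, x)                  # fitted values: used at the edges, true values kept elsewhere
```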
Example 8
On the basis of the above embodiment, the USB video color configuration includes: a USB video color configuration and a USB video black and white configuration.
It should be noted that the system provided in the foregoing embodiments is only illustrated by the division into the above functional units; in practical applications, the functions may be distributed among different functional units as needed, that is, the units or steps in the embodiments of the present invention may be further decomposed or combined. For example, the units of the above embodiments may be combined into one unit or further split into multiple sub-units so as to complete all or part of the functions described above. The names of the units and steps involved in the embodiments of the present invention are only for distinguishing the units or steps and are not to be construed as unduly limiting the present invention.
It can be clearly understood by those skilled in the art that, for convenience and simplicity of description, the specific working processes and related descriptions of the storage device and the processing device described above may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
Those of skill in the art will appreciate that the various illustrative elements, method steps, described in connection with the embodiments disclosed herein may be implemented as electronic hardware, computer software, or combinations of both, and that programs corresponding to the software elements, method steps may be located in Random Access Memory (RAM), memory, Read Only Memory (ROM), electrically programmable ROM, electrically erasable programmable ROM, registers, hard disk, a removable disk, a CD-ROM, or any other form of storage medium known in the art. To clearly illustrate this interchangeability of electronic hardware and software, various illustrative components and steps have been described above generally in terms of their functionality. Whether such functionality is implemented as electronic hardware or software depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention.
The terms "first," "second," and the like are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order.
The terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or unit/apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or unit/apparatus.
So far, the technical solutions of the present invention have been described with reference to the preferred embodiments shown in the drawings, but it is easily understood by those skilled in the art that the scope of the present invention is obviously not limited to these specific embodiments. Those skilled in the art may make equivalent modifications or substitutions to the related technical features without departing from the principle of the present invention, and the technical solutions after such modifications or substitutions will fall within the scope of protection of the present invention.
The above description is only a preferred embodiment of the present invention, and is not intended to limit the scope of the present invention.

Claims (6)

1. A personalized configuration system for a USB camera at the manufacturing end, characterized in that the system is provided to the USB camera manufacturing end and performs personalized configuration of the USB camera based on user requirements, comprising:
a basic function configuration part, used to configure device information, video effects and firmware download for the USB camera;
a professional function configuration part, used to perform optical correction configuration and personalized function configuration of the USB camera; the optical correction configuration comprises: USB camera lens vignetting correction configuration, dual-camera center point correction configuration and dual-sensor frame synchronization picture center correction configuration; the personalized function configuration comprises: an automatic exposure mode configuration and a voice coil motor drive configuration;
a video configuration part, used to perform personalized video function configuration of the USB camera; the personalized video function configuration comprises: USB video resolution configuration, USB video format configuration, USB video photographing setting, USB audio output configuration, USB video bandwidth configuration and USB video color configuration; the method of the dual-sensor frame synchronization picture center correction configuration comprises: connecting the image processor on the circuit board of one USB camera to two image sensors arranged side by side; the execution process of the dual-sensor frame synchronization picture center correction configuration comprises: step A1: enumerating the USB camera; step A2: enumerating the target device; step A3: acquiring the video stream; step A4: obtaining the coordinates of the center point of the feature region in the output pictures of the first image sensor and the second image sensor; step A5: comparing the coordinate difference between the feature-region center point coordinates of the first image sensor and the second image sensor; if the comparison result exceeds a preset condition, executing step A6, and if it does not exceed the preset condition, executing step A7; step A6: transmitting the deviation value to the ISP through the UVC XU protocol, the ISP controlling the data output by the second image sensor so that the whole image is shifted in the X-axis/Y-axis direction, and returning to step A4; step A7: the correction succeeds; step A8: storing the data into the ISP flash memory;
the execution process of the USB camera lens vignetting correction configuration comprises: step B1: transmitting the original data; step B2: separating the original data into the R channel, GR channel, GB channel and B channel respectively; step B3: performing median filtering on the result of the original data separation; step B4: performing Gaussian filtering on the result of the median filtering; step B5: performing original data blocking on the result of the Gaussian filtering; step B6: merging the original data in blocks of size 3 x 3; step B7: taking the extreme values of the original data of each channel and their corresponding positions; step B8: comparing the extreme values of the R channel, GR channel, GB channel and B channel with the mean value respectively to obtain a comparison result; step B9: extracting the mean value of the center of each original data block; step B10: extracting the extreme value of the center of each original data block; step B11: performing least squares curve fitting; step B12: taking the fitted value for the edge and the true value for the rest to obtain the result.
2. The system of claim 1, wherein the system further comprises an upgrade section; the upgrade section includes a local end and a server end; the local end is configured to send an upgrade request to the server end; the server end responds to the upgrade request from the local end and sends the update content to the local end; and the local end upgrades the system based on the update content received from the server end.
3. The system of claim 1, wherein the method of the dual-camera center point correction configuration comprises: integrating two USB cameras on a circuit board in a side-by-side arrangement.
4. The system according to claim 3, wherein the step A6 specifically comprises: taking the first image sensor as the reference, calculating the deviation between the center point coordinates (x2, y2) of the feature region in the output picture of the second image sensor and the center point coordinates (x1, y1) of the feature region in the first image sensor; if x2 - x1 is greater than 4, the deviation value is transmitted to the ISP through the UVC XU, and the ISP controls the data output by the second image sensor so that the whole image is shifted left by |x2 - x1| pixel points; if x2 - x1 is less than -4, the deviation value is transmitted to the ISP through the UVC XU, and the ISP controls the data output by the second image sensor so that the whole image is shifted right by |x2 - x1| pixel points; if x2 - x1 is less than or equal to 4 and greater than or equal to -4, and y2 - y1 is greater than 4, the deviation value is transmitted to the ISP through the UVC XU, and the ISP controls the data output by the second image sensor so that the whole image is shifted up by |y2 - y1| pixel points; if x2 - x1 is less than or equal to 4 and greater than or equal to -4, and y2 - y1 is less than -4, the deviation value is transmitted to the ISP through the UVC XU, and the ISP controls the data output by the second image sensor so that the whole image is shifted down by |y2 - y1| pixel points.
5. The system according to claim 4, wherein the step A4 specifically comprises: step A4.1: performing image segmentation for the first image sensor and the second image sensor to complete data separation; step A4.2: performing binarization processing on the result of the image segmentation; step A4.3: performing black and white point statistics on the result of the binarization processing; step A4.4: performing abnormal point processing through two passes of Gaussian filtering; step A4.5: performing black and white point extremum statistics in the horizontal direction; step A4.6: counting the range of the black points from the point-count and direction extrema; step A4.7: merging pixels according to the range; step A4.8: finding the minimum point of the pixel merging as the suspected center point; step A4.9: statistically merging the horizontal and vertical coordinates as the center point coordinates; step A4.10: transmitting the center point coordinates.
6. The system of claim 1, wherein the USB video color configuration comprises: a USB video color configuration and a USB video black and white configuration.
CN202210396270.3A 2022-04-15 2022-04-15 Personalized configuration system of USB camera at manufacturing end Active CN114500995B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210396270.3A CN114500995B (en) 2022-04-15 2022-04-15 Personalized configuration system of USB camera at manufacturing end

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210396270.3A CN114500995B (en) 2022-04-15 2022-04-15 Personalized configuration system of USB camera at manufacturing end

Publications (2)

Publication Number Publication Date
CN114500995A CN114500995A (en) 2022-05-13
CN114500995B true CN114500995B (en) 2022-06-21

Family

ID=81489408

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210396270.3A Active CN114500995B (en) 2022-04-15 2022-04-15 Personalized configuration system of USB camera at manufacturing end

Country Status (1)

Country Link
CN (1) CN114500995B (en)

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2019009008A1 (en) * 2017-07-05 2019-01-10 Sony Semiconductor Solutions Corporation Imaging apparatus with second imaging element used for correcting vignetting in images captured by first imaging element

Family Cites Families (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7920200B2 (en) * 2005-06-07 2011-04-05 Olympus Corporation Image pickup device with two cylindrical lenses
CN101154222A (en) * 2006-09-27 2008-04-02 中国移动通信集团公司 System and method for on-line updating map
JP4803449B2 (en) * 2006-11-17 2011-10-26 アイシン精機株式会社 On-vehicle camera calibration device, calibration method, and vehicle production method using this calibration method
KR101589310B1 (en) * 2009-07-08 2016-01-28 삼성전자주식회사 Lens shading correction method and apparatus
TW201143429A (en) * 2010-05-17 2011-12-01 Hon Hai Prec Ind Co Ltd System for correcting image and correcting method thereof
US8531542B2 (en) * 2010-09-01 2013-09-10 Apple Inc. Techniques for acquiring and processing statistics data in an image signal processor
US8797429B2 (en) * 2012-03-05 2014-08-05 Apple Inc. Camera blemish defects detection
JP2015215430A (en) * 2014-05-09 2015-12-03 キヤノン株式会社 Image blurring correction device, optical instrument, and image blurring correction method
US10868961B2 (en) * 2016-08-05 2020-12-15 Sony Corporation Imaging device, solid-state image sensor, camera module, drive control unit, and imaging method
US10142568B2 (en) * 2017-02-13 2018-11-27 Semiconductor Components Industries, Llc Methods and apparatus for vignette and out-of-focus correction
CN113170028B (en) * 2019-01-30 2023-06-20 华为技术有限公司 Method for generating image data of machine learning based imaging algorithm


Also Published As

Publication number Publication date
CN114500995A (en) 2022-05-13


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant