CN112969022B - Camera adjustment method, system, storage medium and electronic equipment - Google Patents

Camera adjustment method, system, storage medium and electronic equipment

Info

Publication number
CN112969022B
CN112969022B (application CN202110129635.1A)
Authority
CN
China
Prior art keywords: image, target, camera, target image, brightness
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number: CN202110129635.1A
Other languages: Chinese (zh)
Other versions: CN112969022A
Inventors: 麻越, 陈奕名, 张建鑫, 马丁, 王超, 霍卫涛, 王赛, 董连杰, 张赫
Current Assignee
New Oriental Education Technology Group Co ltd
Original Assignee
New Oriental Education Technology Group Co ltd
Priority date
Filing date
Publication date
Application filed by New Oriental Education Technology Group Co ltd filed Critical New Oriental Education Technology Group Co ltd
Priority to CN202110129635.1A priority Critical patent/CN112969022B/en
Publication of CN112969022A publication Critical patent/CN112969022A/en
Application granted granted Critical
Publication of CN112969022B publication Critical patent/CN112969022B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • H04N23/661 Transmitting camera control signals through networks, e.g. control via the Internet
    • H04L67/06 Protocols specially adapted for file transfer, e.g. file transfer protocol [FTP]
    • H04L67/125 Protocols for proprietary or special-purpose networking environments involving control of end-device applications over a network
    • H04L69/04 Protocols for data compression, e.g. ROHC
    • H04N23/695 Control of camera direction for changing a field of view, e.g. pan, tilt or based on tracking of objects
    • H04N23/76 Circuitry for compensating brightness variation in the scene by influencing the image signals
    • H04N23/80 Camera processing pipelines; Components thereof
    • H04N23/90 Arrangement of cameras or camera modules, e.g. multiple cameras in TV studios or sports stadiums
    • H04N5/2628 Alteration of picture size, shape, position or orientation, e.g. zooming, rotation, rolling, perspective, translation
    • H04N5/76 Television signal recording
    • Y02B20/40 Control techniques providing energy savings, e.g. smart controller or presence detection


Abstract

The disclosure relates to a camera adjustment method, a camera adjustment system, a storage medium, and an electronic device. The camera adjustment system comprises a server, one or more edge devices connected to the server, and one or more cameras connected to the edge devices. An edge device acquires a target image captured by a camera, compares the target image with a reference image to obtain an image offset vector, and sends a first control instruction to the camera according to the image offset vector, so that the camera adjusts its viewing angle according to the first control instruction. The camera viewing angle is thereby adjusted automatically, and a video or image that meets expectations can be captured by the camera.

Description

Camera adjustment method, system, storage medium and electronic equipment
Technical Field
The disclosure relates to the technical field of automatic control, and in particular to a camera adjustment method, a camera adjustment system, a storage medium, and an electronic device.
Background
With the continuous improvement of education, recording and broadcasting offline lessons has become a very important part of the education scene. To record offline lessons, one or more cameras are installed in front of the blackboard according to the layout of the classroom, and the entire teaching process is recorded by the cameras and provided to students, who can consolidate and master difficult points by watching the replays. This places high requirements on the camera viewing angles in the classroom: when various factors make a camera's viewing angle unsuitable, it must be adjusted promptly. In the related art, however, the camera viewing angle is adjusted mainly by hand, which is time-consuming, labor-intensive, and not timely enough.
Disclosure of Invention
In order to solve the above problems, the present disclosure provides a camera adjustment method, a system, a storage medium, and an electronic device.
In a first aspect, the present disclosure provides a camera adjustment system, the system including a server, one or more edge devices connected to the server, one or more cameras connected to the edge devices; wherein:
the server is used for receiving the image or video uploaded by the edge device;
the camera is used for capturing images and sending the captured images to the edge device, and for receiving a first control instruction sent by the edge device and adjusting the camera viewing angle according to the first control instruction;
the edge device is used for acquiring a target image captured by the camera; comparing the target image with a reference image to obtain an image offset vector; and sending the first control instruction to the camera according to the image offset vector so that the camera can adjust its viewing angle according to the first control instruction.
Optionally, the edge device is specifically configured to:
extracting feature points from the target image and the reference image respectively by using the SIFT algorithm, to obtain a plurality of target feature points of the target image and a plurality of reference feature points of the reference image;
performing image feature matching on the target feature points and the reference feature points, and acquiring a position offset vector between each target feature point and its matched reference feature point;
smoothing the position offset vectors to eliminate noise;
and calculating the image offset vector from the position offset vectors after smoothing.
Optionally, the edge device is specifically configured to:
under the condition that the image offset vector is larger than or equal to a preset vector threshold value, circularly executing a camera adjusting step until the image offset vector is smaller than the preset vector threshold value; the camera adjusting step comprises the following steps:
sending the first control instruction to the camera according to the image offset vector so that the camera can adjust the view angle of the camera according to the first control instruction;
acquiring a new target image through the adjusted camera;
and comparing the new target image with the reference image to obtain a new image offset vector, and updating the new image offset vector into the image offset vector.
Optionally, the edge device is specifically configured to:
under the condition that the acquired target image is distorted, performing correction processing on the target image, and then comparing the corrected target image with the reference image to obtain the image offset vector; wherein the correction processing includes one or more of rectangle correction, brightness distribution correction, and noise elimination.
Optionally, the edge device is specifically configured to:
acquiring the brightness distribution of the target image, and confirming whether the brightness distribution of the target image accords with the expected brightness distribution;
correcting the target image by using a preset brightness curve to enable the brightness distribution of the corrected target image to conform to the expected brightness distribution when the brightness distribution of the target image does not conform to the expected brightness distribution, wherein a curve equation used by the preset brightness curve comprises a target brightness adjustment parameter matrix for adjusting the brightness of pixels of the target image, and the target brightness adjustment parameter matrix is obtained by the following steps:
inputting the target image into a preset brightness correction model to obtain a first candidate brightness adjustment parameter matrix;
and smoothing the first candidate brightness adjustment parameter matrix to obtain the target brightness adjustment parameter matrix.
Optionally, the edge device is specifically configured to:
acquiring one or more target boundaries contained in the target image, wherein the target boundaries represent boundaries among a plurality of first target objects in the target image;
acquiring one or more target areas from the target image according to the target boundary;
acquiring a second brightness adjustment parameter matrix corresponding to each target area from the first candidate brightness adjustment parameter matrix;
performing brightness smoothing filtering processing on the second brightness adjustment parameter matrix to eliminate noise;
and obtaining the target brightness adjustment parameter matrix according to the second brightness adjustment parameter matrix after smooth filtering.
Optionally, the edge device is specifically configured to:
acquiring a rectangular target object in the target image;
confirming whether the shape of the rectangular target object in the target image is rectangular or not;
and correcting the target image through an image perspective transformation algorithm under the condition that the shape of the rectangular target object in the target image is not rectangular, so that the rectangular target object in the corrected image is a regular rectangle.
Optionally, the edge device is further configured to:
acquiring a new target image captured by the camera after the viewing angle is adjusted;
and carrying out the correction processing on the new target image, and taking the corrected image as a new reference image.
Optionally:
the server sends a first image acquisition instruction and/or a first camera adjustment instruction to the edge device according to a first user input instruction, and receives the compressed image uploaded by the edge device and displays it to the user;
the edge device is also used for receiving the first image acquisition instruction sent by the server and acquiring an image captured by the camera according to the first image acquisition instruction; performing correction processing on the image captured by the camera and compressing it to a preset target resolution to obtain a compressed image; uploading the compressed image to the server; and,
receiving the first camera adjustment instruction sent by the server, and sending a second control instruction to the camera according to the first camera adjustment instruction, so that the camera can adjust the camera viewing angle according to the second control instruction.
In a second aspect, the present disclosure provides a camera adjustment method applied to an edge device of a camera adjustment system, where the camera adjustment system includes a server, one or more edge devices connected to the server, and one or more cameras connected to the edge devices; the method includes the following steps:
acquiring a target image shot by a camera;
comparing the target image with a reference image to obtain an image offset vector;
and sending the first control instruction to the camera according to the image offset vector so that the camera can adjust the view angle of the camera according to the first control instruction.
Optionally, the comparing the target image with the reference image to obtain an image offset vector includes:
extracting feature points of the target image and the reference image respectively by using a SIFT algorithm to obtain a plurality of target feature points of the target image and a plurality of reference feature points of the reference image;
performing image feature matching on the target feature points and the reference feature points, and acquiring a position offset vector between each target feature point and the matched reference feature points;
smoothing the position offset vector to eliminate noise;
and calculating the image offset vector from the position offset vectors after smoothing.
Optionally, the sending the first control instruction to the camera according to the image offset vector, so that the camera adjusts the camera viewing angle according to the first control instruction includes:
under the condition that the image offset vector is larger than or equal to a preset vector threshold value, circularly executing a camera adjusting step until the image offset vector is smaller than the preset vector threshold value; the camera adjusting step comprises the following steps:
sending the first control instruction to the camera according to the image offset vector;
acquiring a new target image through the adjusted camera;
and comparing the new target image with the reference image to obtain a new image offset vector, and updating the new image offset vector into the image offset vector.
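The closed-loop adjustment in the steps above can be sketched in Python. This is a minimal illustration, not the patented implementation: `MockCamera` is a hypothetical stand-in for a real pan-tilt camera, the offset is reduced to a scalar for brevity (the disclosure uses a 2-D image offset vector), and the threshold and actuation gain are assumed example values.

```python
VECTOR_THRESHOLD = 2.0  # assumed preset vector threshold, in pixels

class MockCamera:
    """Simulates a camera whose view has drifted from the reference position."""
    def __init__(self, offset):
        self.offset = offset              # current drift of the view, in pixels

    def capture_offset(self):
        """Stand-in for 'capture a target image and compare with the reference'."""
        return self.offset

    def apply_control(self, correction):
        """Stand-in for the first control instruction; imperfect actuation."""
        self.offset -= 0.8 * correction   # the pan-tilt corrects ~80% per step

def adjust_until_aligned(camera, threshold=VECTOR_THRESHOLD, max_steps=50):
    """Loop: send control instruction, capture a new image, recompute offset."""
    offset = camera.capture_offset()
    steps = 0
    while abs(offset) >= threshold and steps < max_steps:
        camera.apply_control(offset)       # first control instruction
        offset = camera.capture_offset()   # new target image -> new offset
        steps += 1
    return offset, steps

cam = MockCamera(offset=40.0)
final_offset, steps = adjust_until_aligned(cam)
```

Each iteration mirrors the claim: send the first control instruction, acquire a new target image, recompute the offset vector, and stop once it falls below the preset vector threshold.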
Optionally, comparing the target image with a reference image to obtain an image offset vector includes:
under the condition that the acquired target image is distorted, performing correction processing on the target image, and then comparing the corrected target image with the reference image to obtain the image offset vector; wherein the correction processing includes one or more of rectangle correction, brightness distribution correction, and noise elimination.
Optionally, the brightness distribution correction includes:
acquiring the brightness distribution of the target image, and confirming whether the brightness distribution of the target image accords with the expected brightness distribution;
correcting the target image by using a preset brightness curve to enable the brightness distribution of the corrected target image to conform to the expected brightness distribution when the brightness distribution of the target image does not conform to the expected brightness distribution, wherein a curve equation used by the preset brightness curve comprises a target brightness adjustment parameter matrix for adjusting the brightness of pixels of the target image, and the target brightness adjustment parameter matrix is obtained by the following steps:
inputting the target image into a preset brightness correction model to obtain a first candidate brightness adjustment parameter matrix;
and smoothing the first candidate brightness adjustment parameter matrix to obtain the target brightness adjustment parameter matrix.
Optionally, smoothing the first candidate brightness adjustment parameter matrix to obtain the target brightness adjustment parameter matrix includes:
acquiring one or more target boundaries contained in the target image, wherein the target boundaries represent boundaries among a plurality of first target objects in the target image;
acquiring one or more target areas from the target image according to the target boundary;
acquiring a second brightness adjustment parameter matrix corresponding to each target area from the first candidate brightness adjustment parameter matrix;
performing brightness smoothing filtering processing on the second brightness adjustment parameter matrix to eliminate noise;
and obtaining the target brightness adjustment parameter matrix according to the second brightness adjustment parameter matrix after smooth filtering.
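A minimal numeric sketch of this brightness-correction flow, under assumptions not taken from the disclosure: the "preset brightness correction model" is approximated here by a per-pixel gamma estimate, and the region-wise smoothing of the parameter matrix is simplified to a single global box-mean filter. Function names are illustrative.

```python
import numpy as np

def candidate_gamma_matrix(image, desired_mean=0.5):
    """First candidate brightness-adjustment parameter matrix: a per-pixel
    gamma such that pixel**gamma moves each pixel toward the desired mean."""
    img = np.clip(image, 1e-3, 1.0)       # normalized intensities in (0, 1]
    return np.log(desired_mean) / np.log(img)

def smooth_matrix(matrix, k=3):
    """Smoothing filter (k x k box mean) over the parameter matrix."""
    pad = k // 2
    padded = np.pad(matrix, pad, mode="edge")
    out = np.zeros_like(matrix)
    h, w = matrix.shape
    for dy in range(k):
        for dx in range(k):
            out += padded[dy:dy + h, dx:dx + w]
    return out / (k * k)

def correct_brightness(image, desired_mean=0.5):
    """Apply the preset brightness curve: pixel**gamma, per pixel."""
    gamma = smooth_matrix(candidate_gamma_matrix(image, desired_mean))
    return np.clip(image, 1e-3, 1.0) ** gamma
```

For a uniformly dark frame (intensity 0.2), the corrected mean lands on the desired 0.5, showing how the smoothed parameter matrix drives the brightness curve pixel by pixel.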
Optionally, the rectangle correction includes:
acquiring a rectangular target object in the target image;
confirming whether the shape of the rectangular target object in the target image is rectangular or not;
and correcting the target image through an image perspective transformation algorithm under the condition that the shape of the rectangular target object in the target image is not rectangular, so that the rectangular target object in the corrected target image is a regular rectangle.
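One way to realize the image perspective transformation step, sketched in plain numpy (a real pipeline would more likely call an image-processing library's warp routine). The corner coordinates below are invented example values: four detected blackboard corners and the regular rectangle they should map onto.

```python
import numpy as np

def homography_from_points(src, dst):
    """Solve for H (3x3, h33 = 1) such that dst ~ H @ src, from 4 point pairs."""
    A, b = [], []
    for (x, y), (u, v) in zip(src, dst):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y]); b.append(u)
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y]); b.append(v)
    h = np.linalg.solve(np.array(A, float), np.array(b, float))
    return np.append(h, 1.0).reshape(3, 3)

def apply_homography(H, pt):
    """Map one point through the homography (homogeneous coordinates)."""
    x, y, w = H @ np.array([pt[0], pt[1], 1.0])
    return (x / w, y / w)

# Skewed quadrilateral (as captured) and the regular rectangle it should be.
skewed = [(10, 12), (205, 8), (198, 110), (15, 118)]
rect   = [(0, 0), (200, 0), (200, 100), (0, 100)]
H = homography_from_points(skewed, rect)
```

Warping every pixel of the target image through `H` would make the rectangular target object a regular rectangle, as the correction step requires.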
Optionally, after the adjusting the camera according to the image offset vector, the method further includes:
acquiring a new target image shot by the adjusted camera;
and carrying out the correction processing on the new target image, and taking the corrected image as a new reference image.
Optionally, the method further comprises:
receiving a first image acquisition instruction sent by the server, and acquiring an image shot by the camera according to the first image acquisition instruction;
correcting the image shot by the camera and compressing the image to a preset target resolution to obtain a compressed image;
uploading the compressed image to the server;
and receiving a first camera adjustment instruction sent by the server, and sending a second control instruction to the camera according to the first camera adjustment instruction, so that the camera can adjust the camera viewing angle according to the second control instruction.
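The correct-then-compress step can be illustrated with a toy downsampler. This is a hypothetical sketch: a real edge device would typically resize with an imaging library and encode to JPEG, but block-mean downsampling conveys the idea of reducing a corrected frame to a preset target resolution before uploading.

```python
import numpy as np

def compress_to(image, target_h, target_w):
    """Downsample by integer block averaging (dimensions must divide evenly)."""
    h, w = image.shape
    fh, fw = h // target_h, w // target_w
    return image[:target_h * fh, :target_w * fw] \
        .reshape(target_h, fh, target_w, fw).mean(axis=(1, 3))

frame = np.arange(16.0).reshape(4, 4)   # stand-in for a corrected frame
small = compress_to(frame, 2, 2)        # compressed to the target resolution
```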
In a third aspect, the present disclosure provides a computer readable storage medium having stored thereon a computer program which when executed by a processor performs the steps of the method of the second aspect of the present disclosure.
In a fourth aspect, the present disclosure provides an electronic device comprising: a memory having a computer program stored thereon; a processor for executing the computer program in the memory to implement the steps of the method of the second aspect of the present disclosure.
By adopting the above technical solution, the camera adjustment system includes a server, one or more edge devices connected to the server, and one or more cameras connected to the edge devices. A target image captured by a camera is acquired through the edge device; the target image is compared with a reference image to obtain an image offset vector; and the first control instruction is sent to the camera according to the image offset vector, so that the camera adjusts its viewing angle according to the first control instruction. The camera viewing angle is thus adjusted automatically, and a video or image that meets expectations can be captured by the camera.
Additional features and advantages of the present disclosure will be set forth in the detailed description which follows.
Drawings
The accompanying drawings are included to provide a further understanding of the disclosure and are incorporated in and constitute a part of this specification; they illustrate the disclosure and, together with the description, serve to explain it, but do not limit the disclosure. In the drawings:
fig. 1 is a schematic structural diagram of a camera adjustment system according to an embodiment of the present disclosure;
fig. 2 is a flowchart of a method for adjusting a camera according to an embodiment of the present disclosure;
FIG. 3 is a block diagram of an electronic device provided by an embodiment of the present disclosure;
fig. 4 is a block diagram of another electronic device provided by an embodiment of the present disclosure.
Detailed Description
Specific embodiments of the present disclosure are described in detail below with reference to the accompanying drawings. It should be understood that the detailed description and specific examples, while indicating and illustrating the disclosure, are not intended to limit the disclosure.
In the following description, the words "first," "second," and the like are used merely for distinguishing between the descriptions and not for indicating or implying a relative importance or order.
First, an application scenario of the present disclosure will be described. The disclosure can be applied to camera adjustment scenarios. To record and broadcast offline lessons, one or more cameras are installed in front of the classroom blackboard, and recording clear teaching videos requires the camera viewing angles to be adjusted effectively. In the related art, however, the camera viewing angle is adjusted mainly by hand: workers regularly inspect the cameras in each classroom and manually adjust any camera whose viewing angle has changed; in addition, if a teaching video is unclear or missing content, a worker checks and adjusts the camera that recorded it. This manual adjustment is time-consuming, labor-intensive, and not timely enough.
In order to solve the above problems, the present disclosure provides a camera adjustment method, a system, a storage medium, and an electronic device. The camera adjustment system includes a server, one or more edge devices connected to the server, and one or more cameras connected to the edge devices. A target image captured by a camera is acquired through the edge device; the target image is compared with a reference image to obtain an image offset vector; and a first control instruction is sent to the camera according to the image offset vector, so that the camera adjusts its viewing angle according to the first control instruction. The camera viewing angle is thus adjusted automatically, and a video or image that meets expectations can be captured by the camera.
Specific embodiments of the present disclosure are described in detail below with reference to the accompanying drawings.
Fig. 1 is a schematic structural diagram of a camera adjustment system according to an embodiment of the present disclosure, where, as shown in fig. 1, the camera adjustment system includes a server 101, one or more edge devices 102 connected to the server, and one or more cameras 103 connected to the edge devices; wherein:
the server 101 is configured to receive an image or video uploaded by the edge device 102.
The camera 103 is used for shooting images and sending the shot images to the edge device 102; and receiving a first control instruction sent by the edge device 102 and adjusting the camera view angle according to the first control instruction.
The edge device 102 is configured to acquire a target image captured by the camera 103; comparing the target image with a reference image to obtain an image offset vector; and sending the first control instruction to the camera 103 according to the image offset vector, so that the camera 103 adjusts the camera view angle according to the first control instruction.
The above operations performed by the edge device 102 may be referred to as a camera adjustment operation. The edge device 102 may perform the camera adjustment operation at a preset time set by the user, for example at 8:00 and 18:00 every day, or at 8 a.m. every Sunday. In this way, automatic camera adjustment can be realized without manual intervention.
Of course, the edge device 102 may also perform the above-mentioned camera adjustment operation according to the received adjustment operation instruction, where the adjustment operation instruction may be sent by the server to the edge device according to the user instruction.
The reference image may be an image that is captured by the camera after the user has manually adjusted the camera viewing angle to the expected viewing angle, and that therefore meets the user's expectation. The reference image may be stored on the edge device, or uploaded through the edge device to the server, which receives and stores it; the server may display the reference image to the user so that the user can confirm whether it meets expectations.
It should be noted that conventional centralized cloud computing services cannot meet the real-time, security, and low-energy requirements of big-data processing, and they occupy a large amount of bandwidth. The camera adjustment system in the present disclosure therefore includes edge devices, which may have a processor and a memory to implement the above functions. An edge device may be, for example, an artificial intelligence development board, or a conventional computer or server.
With this system, the target image captured by the camera is acquired through the edge device; the target image is compared with a reference image to obtain an image offset vector; and the first control instruction is sent to the camera according to the image offset vector, so that the camera adjusts its viewing angle according to the first control instruction. The camera viewing angle is thus adjusted automatically, and a video or image that meets expectations can be captured by the camera.
Alternatively, the camera 103 may be provided with a pan-tilt head. The camera 103 may receive the first control instruction sent by the edge device 102 through the pan-tilt head and adjust the camera viewing angle according to the first control instruction.
It should be noted that the pan-tilt head is a supporting device for mounting and fixing the camera; pan-tilt heads may be fixed or motorized. The pan-tilt head in this embodiment may be a motorized one, which can receive the first control instruction and adjust the horizontal angle and/or pitch angle of the camera according to it, thereby adjusting the camera viewing angle.
Further, the first control instruction may include a horizontal angle adjustment amount and/or a pitch angle adjustment amount of the camera, where these adjustment amounts may be obtained from the image offset vector. For example, the image offset vector may include a horizontal offset vector and a vertical offset vector of a second target object between the target image and the reference image. The second target object may be a stationary object in the classroom, for example a blackboard, whiteboard, wall, floor, desk, or chair. The edge device can acquire the distance between the second target object and the camera as a target distance, calculate the horizontal angle adjustment amount from the target distance and the horizontal offset vector, and calculate the pitch angle adjustment amount from the target distance and the vertical offset vector. The target distance may be preset, or obtained from a distance detection module installed on the camera, such as an ultrasonic or laser ranging module.
In this way, an accurate horizontal angle adjustment amount and/or pitch angle adjustment amount can be obtained from the image offset vector, and the camera viewing angle is then adjusted to the expected viewing angle by the first control instruction, so that the target image captured by the camera becomes consistent with the reference image, improving camera adjustment efficiency.
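The conversion from image offset vector and target distance to angle adjustments can be sketched with basic trigonometry. The meters-per-pixel scale, the distance, and the function name are assumed example values, not specified by the disclosure.

```python
import math

def angle_adjustments(offset_px, target_distance_m, meters_per_pixel):
    """offset_px = (horizontal, vertical) image offset of the second target
    object; returns (horizontal_deg, pitch_deg) for the pan-tilt head."""
    dx_m = offset_px[0] * meters_per_pixel   # lateral displacement at target
    dy_m = offset_px[1] * meters_per_pixel   # vertical displacement at target
    horizontal = math.degrees(math.atan2(dx_m, target_distance_m))
    pitch = math.degrees(math.atan2(dy_m, target_distance_m))
    return horizontal, pitch

# A blackboard 5 m away whose image drifted 200 px right, at 5 mm per pixel:
pan, tilt = angle_adjustments((200, 0), target_distance_m=5.0,
                              meters_per_pixel=0.005)
```

The displacement at the target plane (200 px x 0.005 m/px = 1 m) over a 5 m distance gives a horizontal adjustment of about 11.3 degrees and no pitch change.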
In other embodiments of the present disclosure, the edge device may compare the target image to a reference image to obtain an image offset vector by:
first, feature point extraction is performed on the target image and the reference image using SIFT (Scale-Invariant Feature Transform) algorithm, respectively, to obtain a plurality of target feature points of the target image and a plurality of reference feature points of the reference image.
Algorithms for extracting image feature points include the SIFT algorithm and the Harris corner extraction algorithm. This embodiment uses SIFT rather than Harris for the following reason: in a classroom scene, the corners of the desks and chairs look similar under Harris feature expression, so Harris produces a large number of corner mismatches; by comparison, the SIFT algorithm is more robust.
The extraction of each feature point may include: obtaining a feature vector of each feature point from its position, scale, and orientation information using the SIFT algorithm, where the feature vector includes not only the information of the feature point itself but also that of adjacent feature points around it. The feature vector obtained by the SIFT algorithm is therefore highly distinctive, and objects such as individual desks and chairs in a classroom scene can be clearly told apart.
Alternatively, the gradient histogram used by the SIFT algorithm contains 36 bins, each covering an angular range of 10 degrees, so that the 36 bins together cover 0 to 360 degrees.
It should be noted that the gradient histogram used in the standard SIFT algorithm generally contains 8 bins. The inventors found that using 36 bins gives a more precise orientation representation and reduces mismatches between feature points with similar orientations, thereby improving the matching accuracy of the feature points.
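A minimal sketch of the 36-bin orientation histogram described above (pure Python; the input format of (magnitude, angle) gradient pairs is an assumption for illustration):

```python
def orientation_histogram(gradients, num_bins=36):
    """Accumulate (magnitude, angle_in_degrees) gradient samples into an
    orientation histogram: 36 bins of 10 degrees each, as described in
    the text, instead of the 8-bin histogram of the standard SIFT
    descriptor."""
    hist = [0.0] * num_bins
    bin_width = 360.0 / num_bins
    for mag, ang in gradients:
        # Wrap the angle into [0, 360) and weight the bin by magnitude.
        hist[int((ang % 360.0) // bin_width) % num_bins] += mag
    return hist
```

The finer 10-degree quantisation is what gives each feature point a more precise dominant orientation than the usual 45-degree bins.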
And secondly, carrying out image feature matching on the target feature points and the reference feature points, and acquiring a position offset vector between each target feature point and the matched reference feature points.
Third, the position offset vectors are smoothed by a filtering process to eliminate noise.
And finally, calculating the image offset vector according to the position offset vector after the smoothing filtering.
The image offset vector may be the average of the position offset vectors of the target feature points, or may be obtained by summing those position offset vectors.
The smoothing step for eliminating noise is optional; the image offset vector may also be calculated directly from the unsmoothed position offset vectors.
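The averaging alternative above can be sketched as follows; the `trim` parameter is a hypothetical stand-in for the optional noise-elimination step, not the particular smoothing filter used in this disclosure:

```python
def image_offset_vector(offsets, trim=0):
    """Average per-feature-point (dx, dy) position offset vectors into a
    single image offset vector.  `trim` optionally drops the `trim`
    largest-magnitude offsets first, as a crude form of outlier/noise
    removal (illustrative only)."""
    pts = sorted(offsets, key=lambda v: v[0] ** 2 + v[1] ** 2)
    if trim:
        pts = pts[:-trim]
    n = len(pts)
    return (sum(p[0] for p in pts) / n, sum(p[1] for p in pts) / n)
```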
Therefore, more accurate characteristic points can be obtained through the SIFT algorithm, and more accurate image offset vectors can be obtained through characteristic matching, so that the accuracy of camera adjustment is improved.
In other embodiments of the present disclosure, the edge device may also be used to:
under the condition that the obtained target image is distorted, correcting the target image, and then comparing the corrected target image with a reference image to obtain an image offset vector; wherein the correction process includes one or more of brightness distribution correction, rectangular correction, and noise cancellation. The following describes the implementation manner of brightness distribution correction, rectangular correction, and noise cancellation, respectively:
The edge device may perform luminance distribution correction by:
first, the luminance distribution of the target image is acquired, and it is confirmed whether the luminance distribution of the target image matches the expected luminance distribution.
Secondly, in the case that the luminance distribution of the target image does not conform to the expected luminance distribution, the target image is corrected using a preset luminance curve so that the luminance distribution of the corrected target image conforms to the expected luminance distribution.
The curve equation used by the preset brightness curve comprises a target brightness adjustment parameter matrix for adjusting the brightness of the pixel point of the target image, and the target brightness adjustment parameter matrix is obtained by the following steps: inputting the target image into a preset brightness correction model to obtain a first candidate brightness adjustment parameter matrix; and smoothing the first candidate brightness adjustment parameter matrix to obtain the target brightness adjustment parameter matrix.
Alternatively, the above curve equation may be the following equation:
f(x)=x+a*x*(1-x),
where x is the brightness of each pixel of the target image and a is the target brightness adjustment parameter matrix; the value of a can be obtained by training a deep curve estimation network on sample image data.
Further, the preset brightness curve may be obtained by composing the curve equation with itself a preset number of times. Since the curve equation is quadratic, each self-iteration raises the order of the preset brightness curve and further improves the brightness correction effect. For example, the preset number may be 2 or 16; the higher the number, the higher the order of the preset brightness curve and the better the correction effect.
It should be noted that each pixel of the target image has 3 channels, corresponding to R (Red), G (Green), and B (Blue), and each channel of each pixel has its own parameter a, so the number of parameters in the target brightness adjustment parameter matrix for each pixel is three times the preset number of iterations. For example, with a preset number of 8, each pixel corresponds to 24 parameters (3 channels × 8 iterations).
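A sketch of applying the curve f(x) = x + a·x·(1−x) iteratively to one channel of one pixel, with a separate parameter a per iteration (normalised brightness in [0, 1] is an assumption for illustration):

```python
def adjust_brightness(x, a_per_iter):
    """Apply the quadratic curve f(x) = x + a*x*(1-x) once per
    iteration, each iteration using its own parameter a, to a
    normalised pixel brightness x in [0, 1].  With n iterations the
    composed curve has order 2**n, matching the text's point that
    self-iteration raises the order of the brightness curve."""
    for a in a_per_iter:
        x = x + a * x * (1.0 - x)
    return x
```

Note that x = 0 and x = 1 are fixed points of the curve, so pure black and pure white are left unchanged regardless of a.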
Further, after the edge device smoothes the first candidate brightness adjustment parameter matrix, an implementation manner of obtaining the target brightness adjustment parameter matrix may include:
first, one or more target boundaries contained in the target image are acquired, the target boundaries characterizing boundaries between a plurality of first target objects in the target image.
In this step, an image edge detection algorithm may be used to detect a pixel value difference value according to each pixel point and an adjacent pixel point, so as to obtain the target boundary. By way of example, the first target object may include one or more of a blackboard, a whiteboard, a projection courseware, a wall, a floor, a desk, a chair, etc. in a classroom, and the target boundary may be a boundary between the blackboard and the wall or a boundary between the desk and the floor.
Secondly, one or more target areas are obtained from the target image according to the target boundary;
thirdly, acquiring a second brightness adjustment parameter matrix corresponding to each target area from the first candidate brightness adjustment parameter matrix; and performing brightness smoothing filtering processing on the second brightness adjustment parameter matrix to eliminate noise.
And finally, obtaining the target brightness adjustment parameter matrix according to the second brightness adjustment parameter matrix after smooth filtering.
In this way, when the target image shot by the camera is too bright or too dark, the target image is corrected through brightness distribution correction to obtain a target image with proper brightness, so that the target image is compared with the reference image, and a more accurate image offset vector is obtained.
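As an illustration of the per-region smoothing of the brightness parameters, assuming each pixel has already been labelled with a region derived from the detected target boundaries (replacing each parameter with its region mean is one simple choice, not necessarily the filter intended here):

```python
from collections import defaultdict

def smooth_by_region(param, regions):
    """Smooth a 2-D brightness-adjustment parameter matrix within each
    target area: every parameter is replaced by the mean of its region.
    `regions` is a same-shaped matrix of region labels (e.g. blackboard
    vs. wall) obtained from the target boundaries."""
    sums = defaultdict(float)
    counts = defaultdict(int)
    for p_row, r_row in zip(param, regions):
        for p, label in zip(p_row, r_row):
            sums[label] += p
            counts[label] += 1
    means = {k: sums[k] / counts[k] for k in sums}
    return [[means[label] for label in r_row] for r_row in regions]
```

Smoothing within regions but not across them keeps the adjustment consistent inside each object (e.g. the blackboard) without blurring the deliberate brightness step at its boundary.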
The edge device may perform rectangular correction by:
acquiring a rectangular target object in the target image; confirming whether the shape of the rectangular target object in the target image is rectangular; and, if it is not, correcting the target image through an image perspective transformation algorithm so that the rectangular target object in the corrected image is a regular rectangle.
The rectangular target object can comprise one or more of a blackboard, a whiteboard, a projection courseware, a desk, a chair and other rectangular objects in a classroom.
Therefore, under the condition that the target image shot by the camera is distorted in a rectangular mode, the target image is corrected and restored through rectangular correction so as to be compared with the reference image, and an accurate image offset vector is obtained.
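The perspective transformation can be sketched by estimating the 3×3 homography that maps the four detected corners of the rectangular target object to the corners of a regular rectangle (a textbook direct linear solution in pure Python; not the specific algorithm of this disclosure):

```python
def solve(A, b):
    """Gaussian elimination with partial pivoting for the 8x8 system."""
    n = len(A)
    M = [row[:] + [bv] for row, bv in zip(A, b)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def homography_from_points(src, dst):
    """Estimate the perspective transform h (9 values, h[8] fixed to 1)
    mapping each src corner (x, y) to its dst corner (u, v)."""
    A, b = [], []
    for (x, y), (u, v) in zip(src, dst):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y])
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y])
        b += [u, v]
    return solve(A, b) + [1.0]

def apply_h(h, pt):
    """Map one point through the homography, with perspective divide."""
    x = h[0] * pt[0] + h[1] * pt[1] + h[2]
    y = h[3] * pt[0] + h[4] * pt[1] + h[5]
    w = h[6] * pt[0] + h[7] * pt[1] + h[8]
    return (x / w, y / w)
```

Warping every pixel of the target image through the inverse of h then yields the corrected image in which, for example, a skewed blackboard becomes a regular rectangle.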
The edge device may also perform noise cancellation on the target image through a low pass filter. The low-pass filter can adopt a classical mean value algorithm or a Gaussian filtering algorithm.
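A sketch of the classical mean (box) low-pass filter on a 2-D array of pixel values (border pixels are left untouched here purely for brevity):

```python
def mean_filter(img, k=3):
    """k x k mean (box) low-pass filter over a 2-D list of pixel
    values; the border of width k//2 is copied through unchanged."""
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]
    r = k // 2
    for y in range(r, h - r):
        for x in range(r, w - r):
            out[y][x] = sum(img[y + dy][x + dx]
                            for dy in range(-r, r + 1)
                            for dx in range(-r, r + 1)) / (k * k)
    return out
```

A Gaussian filter differs only in weighting the neighbourhood by distance instead of uniformly.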
In other embodiments of the present disclosure, the edge device may be configured to:
under the condition that the image offset vector is larger than or equal to a preset vector threshold, circularly executing a camera adjusting step until the image offset vector is smaller than the preset vector threshold; the camera adjusting step comprises the following steps:
Firstly, the first control instruction is sent to the camera according to the image offset vector, so that the camera can adjust the view angle of the camera according to the first control instruction.
And secondly, acquiring a new target image through the adjusted camera.
Finally, the new target image is compared with the reference image to obtain a new image offset vector, and the new image offset vector is updated to the image offset vector.
It should be noted that an image offset vector greater than or equal to the preset vector threshold may indicate that the camera view angle has deviated from the expected view angle, in which case the video or image captured by the camera may be unclear or missing part of its content, and the camera view angle needs to be adjusted; conversely, an image offset vector smaller than the preset vector threshold may indicate that the camera view angle conforms to the expected view angle, in which case no adjustment is needed.
Thus, by cyclically performing the camera adjustment step, the camera view angle can be adjusted to meet the desired view angle so that a video or image meeting the desired view angle can be captured by the camera.
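The loop above can be sketched as follows; `capture`, `compare`, and `send_control` are hypothetical stand-ins for the edge device's camera interfaces, and `max_iters` is a safety bound not mentioned in the text:

```python
import math

def adjust_camera(capture, compare, send_control, threshold, max_iters=20):
    """Cyclically execute the camera adjustment step until the image
    offset vector (dx, dy) is smaller than the preset vector threshold:
    send a control instruction, acquire a new target image, recompute
    the offset against the reference image."""
    offset = compare(capture())
    for _ in range(max_iters):
        if math.hypot(*offset) < threshold:
            break
        send_control(offset)           # first control instruction
        offset = compare(capture())    # new target image -> new offset
    return offset
```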
Further, after cyclically performing the camera adjustment step until the image offset vector is less than the preset vector threshold, the edge device is further configured to:
Acquiring a new target image shot by a camera after the visual angle is adjusted; the correction processing is performed on the new target image, and the corrected image is used as a new reference image.
The correction process here may include one or more of rectangular correction, brightness distribution correction, and noise cancellation.
Thus, after the camera is adjusted, a new reference image can be obtained, and based on the new reference image, the target image shot by the camera is compared with the new reference image to obtain an image offset vector; and sending the first control instruction to the camera according to the image offset vector, so that the camera can adjust the view angle of the camera according to the first control instruction.
In some other embodiments of the present disclosure, the server in the camera adjustment system is configured to send a first image acquisition instruction and/or a first camera adjustment instruction to the edge device according to a first user input instruction; and receiving the compressed image uploaded by the edge equipment and displaying the compressed image to a user.
The edge equipment is also used for receiving a first image acquisition instruction sent by the server and acquiring an image shot by the camera according to the first image acquisition instruction; correcting and compressing an image shot by the camera to a preset target resolution to obtain a compressed image; uploading the compressed image to the server; the method comprises the steps of,
And receiving a first camera adjustment instruction sent by the server, and sending a second control instruction to the camera according to the first camera adjustment instruction so that the camera can adjust the camera view angle according to the second control instruction.
The preset target resolution may be 480P, 360P or 1080P, or may be 1/4 or 1/16 of the original resolution of the image shot by the camera. The compression algorithm may include a joint bilateral image interpolation algorithm.
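A sketch of reaching 1/4 of the original pixel count by 2×2 box averaging (the joint bilateral interpolation named above would preserve edges better; this is only the simplest stand-in):

```python
def downsample_2x(img):
    """Halve each dimension of a 2-D list of pixel values by averaging
    2x2 blocks, yielding 1/4 of the original pixel count.  Any odd
    trailing row/column is dropped for simplicity."""
    h = len(img) // 2 * 2
    w = len(img[0]) // 2 * 2
    return [[(img[y][x] + img[y][x + 1] +
              img[y + 1][x] + img[y + 1][x + 1]) / 4.0
             for x in range(0, w, 2)]
            for y in range(0, h, 2)]
```

Applying it twice gives 1/16 of the original pixel count, matching the other ratio mentioned above.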
Through the system, the function of manually adjusting the visual angle of the camera can be realized, and under the condition that the visual angle of the camera can not meet the requirement when the edge equipment automatically adjusts the visual angle, the quality of video or image shot by the camera is ensured through manual adjustment.
It should be noted that this camera adjustment system is applied to a classroom scene: the server is a remote server, the edge device and the camera are installed in the classroom, and the edge device and the camera may be connected wirelessly or by wire. The camera is a high-definition camera whose images may reach a resolution of 1920×1080. When the server is connected to multiple edge devices, uploading uncompressed images would occupy a large amount of bandwidth and could cause a bandwidth shortage, while the images the server presents to the user do not require such a high resolution. Therefore, the images are compressed in this embodiment; image compression saves bandwidth between the edge devices and the server and avoids the bandwidth-shortage problem.
In other embodiments of the present disclosure, the camera adjustment system may further include a terminal connected to the server. The terminal may be a mobile phone, a tablet, or another electronic device, where:
the terminal is used for sending a second image acquisition instruction or a second camera adjustment instruction to the server according to a second user input instruction; and receiving the compressed image sent by the server and displaying the compressed image to a user.
The server is used for responding to the received second image acquisition instruction and sending a first image acquisition instruction to the edge equipment; responding to the received second camera adjustment instruction, and sending a first camera adjustment instruction to the edge equipment; and receiving the compressed image uploaded by the edge equipment and sending the compressed image to the terminal.
The edge equipment is also used for receiving a first image acquisition instruction sent by the server and acquiring an image shot by the camera according to the first image acquisition instruction; correcting and compressing an image shot by the camera to a preset target resolution to obtain a compressed image; uploading the compressed image to the server; the method comprises the steps of,
And receiving a first camera adjustment instruction sent by the server, and sending a second control instruction to the camera according to the first camera adjustment instruction so that the camera can adjust the camera view angle according to the second control instruction.
Therefore, the user can realize the function of manually adjusting the visual angle of the camera through the terminal, and the user operation is facilitated.
Fig. 2 is a schematic diagram of a camera adjustment method according to an embodiment of the present disclosure, where, as shown in fig. 2, an execution subject of the method may be an edge device of a camera adjustment system, where the camera adjustment system includes a server, one or more edge devices connected to the server, and one or more cameras connected to the edge devices; the method comprises the following steps:
s201, acquiring a target image shot by a camera.
In this step, a shooting instruction may be sent to the camera to control the camera to start shooting, and a shot target image may be sent to the edge device; or the target image shot by the camera can be acquired in the normal shooting process of the camera.
S202, comparing the target image with a reference image to obtain an image offset vector.
The reference image may be an image that is shot by the camera and meets the user's expectation after the user manually adjusts the camera to the expected view angle. The reference image may be stored in the edge device, or uploaded by the edge device to the server, which receives and stores it; the server may also display the reference image so that the user can confirm whether it meets expectations.
S203, sending the first control instruction to the camera according to the image offset vector, so that the camera can adjust the view angle of the camera according to the first control instruction.
By adopting the method, the target image shot by the camera is acquired; comparing the target image with a reference image to obtain an image offset vector; and sending the first control instruction to the camera according to the image offset vector, so that the camera can adjust the view angle of the camera according to the first control instruction, thereby realizing automatic adjustment of the view angle of the camera, and shooting a video or an image meeting expectations through the camera.
In some other embodiments of the present disclosure, the step S202 may compare the target image with the reference image to obtain the image offset vector, which may include:
first, feature point extraction is performed on the target image and the reference image using SIFT (Scale-Invariant Feature Transform) algorithm, respectively, to obtain a plurality of target feature points of the target image and a plurality of reference feature points of the reference image.
Algorithms for extracting feature points from an image include the SIFT algorithm and the Harris corner extraction algorithm. The SIFT algorithm is used in this embodiment rather than the Harris corner extraction algorithm for the following reason: in a classroom scene, the corner points of the individual desks and chairs are similar under the Harris feature representation, so Harris produces a large number of corner mismatches; the SIFT algorithm is more robust by comparison.
The extraction of each feature point may include: obtaining a feature vector of each feature point from its position, scale, and orientation information using the SIFT algorithm, where the feature vector includes not only the information of the feature point itself but also that of adjacent feature points around it. The feature vector obtained by the SIFT algorithm is therefore highly distinctive, and objects such as individual desks and chairs in a classroom scene can be clearly told apart.
Alternatively, the gradient histogram used by the SIFT algorithm contains 36 bins, each covering an angular range of 10 degrees, so that the 36 bins together cover 0 to 360 degrees.
It should be noted that the gradient histogram used in the standard SIFT algorithm generally contains 8 bins. The inventors found that using 36 bins gives a more precise orientation representation and reduces mismatches between feature points with similar orientations, thereby improving the matching accuracy of the feature points.
And secondly, carrying out image feature matching on the target feature points and the reference feature points, and acquiring a position offset vector between each target feature point and the matched reference feature points.
Third, the position offset vectors are smoothed by a filtering process to eliminate noise.
And finally, calculating the image offset vector according to the position offset vector after the smoothing filtering.
The image offset vector may be the average of the position offset vectors of the target feature points, or may be obtained by summing those position offset vectors.
The smoothing step for eliminating noise is optional; the image offset vector may also be calculated directly from the unsmoothed position offset vectors.
Therefore, more accurate characteristic points can be obtained through the SIFT algorithm, and more accurate image offset vectors can be obtained through characteristic matching, so that the accuracy of camera adjustment is improved.
In other embodiments of the present disclosure, step S202, comparing the target image with the reference image to obtain the image offset vector, may further include:
under the condition that the obtained target image is distorted, correcting the target image, and then comparing the corrected target image with a reference image to obtain an image offset vector; wherein the correction process includes one or more of brightness distribution correction, rectangular correction, and noise cancellation. The following describes the implementation manner of brightness distribution correction, rectangular correction, and noise cancellation, respectively:
The luminance distribution correction can be performed by:
first, the luminance distribution of the target image is acquired, and it is confirmed whether the luminance distribution of the target image matches the expected luminance distribution.
Secondly, in the case that the luminance distribution of the target image does not conform to the expected luminance distribution, the target image is corrected using a preset luminance curve so that the luminance distribution of the corrected target image conforms to the expected luminance distribution.
The curve equation used by the preset brightness curve comprises a target brightness adjustment parameter matrix for adjusting the brightness of the pixel point of the target image, and the target brightness adjustment parameter matrix is obtained by the following steps: inputting the target image into a preset brightness correction model to obtain a first candidate brightness adjustment parameter matrix; and smoothing the first candidate brightness adjustment parameter matrix to obtain the target brightness adjustment parameter matrix.
Alternatively, the above curve equation may be the following equation:
f(x)=x+a*x*(1-x),
where x is the brightness of each pixel of the target image and a is the target brightness adjustment parameter matrix; the value of a can be obtained by training a deep curve estimation network on sample image data.
Further, the preset brightness curve may be obtained by composing the curve equation with itself a preset number of times. Since the curve equation is quadratic, each self-iteration raises the order of the preset brightness curve and further improves the brightness correction effect. For example, the preset number may be 2 or 16; the higher the number, the higher the order of the preset brightness curve and the better the correction effect.
It should be noted that each pixel of the target image has 3 channels, corresponding to R (Red), G (Green), and B (Blue), and each channel of each pixel has its own parameter a, so the number of parameters in the target brightness adjustment parameter matrix for each pixel is three times the preset number of iterations. For example, with a preset number of 8, each pixel corresponds to 24 parameters (3 channels × 8 iterations).
Further, after smoothing the first candidate brightness adjustment parameter matrix, the implementation manner of obtaining the target brightness adjustment parameter matrix may include:
first, one or more target boundaries contained in the target image are acquired, the target boundaries characterizing boundaries between a plurality of first target objects in the target image.
In this step, an image edge detection algorithm may be used to detect a pixel value difference value according to each pixel point and an adjacent pixel point, so as to obtain the target boundary. By way of example, the first target object may include one or more of a blackboard, a whiteboard, a projection courseware, a wall, a floor, a desk, a chair, etc. in a classroom, and the target boundary may be a boundary between the blackboard and the wall or a boundary between the desk and the floor.
Secondly, one or more target areas are obtained from the target image according to the target boundary;
thirdly, acquiring a second brightness adjustment parameter matrix corresponding to each target area from the first candidate brightness adjustment parameter matrix; and performing brightness smoothing filtering processing on the second brightness adjustment parameter matrix to eliminate noise.
And finally, obtaining the target brightness adjustment parameter matrix according to the second brightness adjustment parameter matrix after smooth filtering.
In this way, when the target image shot by the camera is too bright or too dark, the target image is corrected through brightness distribution correction to obtain a target image with proper brightness, so that the target image is compared with the reference image, and a more accurate image offset vector is obtained.
The rectangular correction can be performed by:
acquiring a rectangular target object in the target image; confirming whether the shape of the rectangular target object in the target image is rectangular; and, if it is not, correcting the target image through an image perspective transformation algorithm so that the rectangular target object in the corrected image is a regular rectangle.
The rectangular target object can comprise one or more of a blackboard, a whiteboard, a projection courseware, a desk, a chair and other rectangular objects in a classroom.
Therefore, under the condition that the target image shot by the camera is distorted in a rectangular mode, the target image is corrected and restored through rectangular correction so as to be compared with the reference image, and an accurate image offset vector is obtained.
The target image may be noise canceled by a low pass filter. The low-pass filter can adopt a classical mean value algorithm or a Gaussian filtering algorithm.
In some other embodiments of the present disclosure, the step S203 of sending the first control instruction to the camera according to the image offset vector, so that the camera adjusts the camera view according to the first control instruction may include:
Under the condition that the image offset vector is larger than or equal to a preset vector threshold, circularly executing a camera adjusting step until the image offset vector is smaller than the preset vector threshold; the camera adjusting step comprises the following steps:
firstly, the first control instruction is sent to the camera according to the image offset vector, so that the camera can adjust the view angle of the camera according to the first control instruction.
And secondly, acquiring a new target image through the adjusted camera.
Finally, the new target image is compared with the reference image to obtain a new image offset vector, and the new image offset vector is updated to the image offset vector.
It should be noted that an image offset vector greater than or equal to the preset vector threshold may indicate that the camera view angle has deviated from the expected view angle, in which case the video or image captured by the camera may be unclear or missing part of its content, and the camera view angle needs to be adjusted; conversely, an image offset vector smaller than the preset vector threshold may indicate that the camera view angle conforms to the expected view angle, in which case no adjustment is needed.
Thus, by cyclically performing the camera adjustment step, the camera view angle can be adjusted to meet the desired view angle so that a video or image meeting the desired view angle can be captured by the camera.
Further, after cyclically performing the camera adjustment step until the image offset vector is less than the preset vector threshold, the method may further include:
acquiring a new target image shot by a camera after the visual angle is adjusted; the correction processing is performed on the new target image, and the corrected image is used as a new reference image.
The correction process here may include one or more of rectangular correction, brightness distribution correction, and noise cancellation.
Thus, after the camera is adjusted, a new reference image can be obtained, and based on the new reference image, the target image shot by the camera is compared with the new reference image to obtain an image offset vector; and sending the first control instruction to the camera according to the image offset vector, so that the camera can adjust the view angle of the camera according to the first control instruction.
In still other embodiments of the present disclosure, the method further comprises:
first, a first image acquisition instruction sent by a server is received, and an image shot by the camera is acquired according to the first image acquisition instruction.
The first image acquisition instruction is sent to the edge device by the server according to a first user input instruction. The first user input instruction is used for representing an instruction for viewing a target image shot by the camera by a user.
Secondly, correcting the image shot by the camera and compressing the image to a preset target resolution to obtain a compressed image.
The preset target resolution may be 480P, 360P, or 1080P, or may be 1/4 or 1/16 of the original resolution of the image shot by the camera. The compression algorithm may include a joint bilateral image interpolation algorithm. The correction process here may include one or more of rectangular correction, brightness distribution correction, and noise cancellation.
And uploading the compressed image to the server.
The server receives the compressed image uploaded by the edge device and can display the compressed image to a user. And the user judges whether the compressed image is consistent with the reference image or not according to the compressed image, if the compressed image is inconsistent with the reference image, a second user input instruction is sent, the second user input instruction can comprise a first camera adjustment instruction, and the server can send the first camera adjustment instruction to the edge equipment according to the second user input instruction.
And finally, receiving a first camera adjustment instruction sent by the server, and sending a second control instruction to the camera according to the first camera adjustment instruction so that the camera can adjust the camera visual angle according to the second control instruction.
In this way, the function of manually adjusting the view angle of the camera can be realized, and the quality of the video or the image shot by the camera is ensured through manual adjustment under the condition that the view angle of the camera can not meet the requirement when the edge equipment automatically adjusts the view angle of the camera; in addition, by compressing the image, bandwidth between the edge device and the server can be saved.
Fig. 3 is a block diagram of an electronic device 300, according to an example embodiment. As shown in fig. 3, the electronic device 300 may include: a processor 301, a memory 302. The electronic device 300 may also include one or more of a multimedia component 303, an input/output (I/O) interface 304, and a communication component 305.
The processor 301 is configured to control the overall operation of the electronic device 300 to perform all or part of the steps in the camera adjustment method described above. The memory 302 is used to store various types of data to support operation at the electronic device 300, which may include, for example, instructions for any application or method operating on the electronic device 300, as well as application-related data, such as contact data, transceived messages, pictures, audio, video, and the like. The Memory 302 may be implemented by any type or combination of volatile or non-volatile Memory devices, such as static random access Memory (Static Random Access Memory, SRAM for short), electrically erasable programmable Read-Only Memory (Electrically Erasable Programmable Read-Only Memory, EEPROM for short), erasable programmable Read-Only Memory (Erasable Programmable Read-Only Memory, EPROM for short), programmable Read-Only Memory (Programmable Read-Only Memory, PROM for short), read-Only Memory (ROM for short), magnetic Memory, flash Memory, magnetic disk, or optical disk. The multimedia component 303 may include a screen and an audio component. Wherein the screen may be, for example, a touch screen, the audio component being for outputting and/or inputting audio signals. For example, the audio component may include a microphone for receiving external audio signals. The received audio signals may be further stored in the memory 302 or transmitted through the communication component 305. The audio assembly further comprises at least one speaker for outputting audio signals. The I/O interface 304 provides an interface between the processor 301 and other interface modules, which may be a keyboard, mouse, buttons, etc. These buttons may be virtual buttons or physical buttons. The communication component 305 is used for wired or wireless communication between the electronic device 300 and other devices. 
The wireless communication may be, for example, Wi-Fi, Bluetooth, near field communication (NFC), 2G, 3G, 4G, NB-IoT, eMTC, 5G, or a combination of one or more of them, which is not limited herein. The corresponding communication component 305 may thus include a Wi-Fi module, a Bluetooth module, an NFC module, and the like.
In an exemplary embodiment, the electronic device 300 may be implemented by one or more application-specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), controllers, microcontrollers, microprocessors, or other electronic components, for performing the camera adjustment method described above.
In another exemplary embodiment, a computer readable storage medium is also provided, comprising program instructions which, when executed by a processor, implement the steps of the camera adjustment method described above. For example, the computer readable storage medium may be the memory 302 including program instructions described above, which are executable by the processor 301 of the electronic device 300 to perform the camera adjustment method described above.
Fig. 4 is a block diagram of an electronic device 400 according to an exemplary embodiment. For example, the electronic device 400 may be provided as a server. Referring to Fig. 4, the electronic device 400 includes a processor 422 (the number of which may be one or more) and a memory 432 for storing computer programs executable by the processor 422. The computer program stored in the memory 432 may include one or more modules, each corresponding to a set of instructions. Further, the processor 422 may be configured to execute the computer program to perform the camera adjustment method described above.
In addition, the electronic device 400 may further include a power supply component 426 and a communication component 450. The power supply component 426 may be configured to perform power management of the electronic device 400, and the communication component 450 may be configured to enable wired or wireless communication of the electronic device 400. The electronic device 400 may also include an input/output (I/O) interface 458. The electronic device 400 may operate based on an operating system stored in the memory 432, such as Windows Server, macOS, Unix, Linux, etc.
In another exemplary embodiment, a computer readable storage medium is also provided, comprising program instructions which, when executed by a processor, implement the steps of the camera adjustment method described above. For example, the computer readable storage medium may be the memory 432 described above including program instructions executable by the processor 422 of the electronic device 400 to perform the camera adjustment method described above.
In another exemplary embodiment, a computer program product is also provided, which comprises a computer program executable by a programmable apparatus, the computer program having code portions for performing the above-mentioned camera adjustment method when being executed by the programmable apparatus.
The preferred embodiments of the present disclosure have been described in detail above with reference to the accompanying drawings, but the present disclosure is not limited to the specific details of the above embodiments, and various simple modifications may be made to the technical solutions of the present disclosure within the scope of the technical concept of the present disclosure, and all the simple modifications belong to the protection scope of the present disclosure.
In addition, the specific features described in the above embodiments may be combined in any suitable manner without contradiction. The various possible combinations are not described further in this disclosure in order to avoid unnecessary repetition.
Moreover, any combination of the various embodiments of the present disclosure may also be made without departing from the spirit of the present disclosure, and such combinations should likewise be regarded as content disclosed by the present disclosure.

Claims (12)

1. A camera adjustment system, characterized in that the system comprises a server, one or more edge devices connected with the server, and one or more cameras connected with the edge devices; the camera adjustment system is applied to a classroom scene, the server is a remote server, the edge devices and the cameras are installed in the classroom, and the edge devices and the cameras are connected through a wireless or wired connection; wherein:
the server is used for receiving the image or video uploaded by the edge device;
the camera is used for shooting images and sending the shot images to the edge device, and for receiving a first control instruction sent by the edge device and adjusting the view angle of the camera according to the first control instruction;
the edge device is used for acquiring a target image shot by the camera, comparing the target image with a reference image to obtain an image offset vector, and sending the first control instruction to the camera according to the image offset vector so that the camera adjusts the view angle of the camera according to the first control instruction;
the server is further used for sending a first image acquisition instruction to the edge device according to a first user input instruction, and for receiving the compressed image uploaded by the edge device and displaying the compressed image to a user;
the edge device is further used for receiving the first image acquisition instruction sent by the server, acquiring an image shot by the camera according to the first image acquisition instruction, performing correction processing on the image shot by the camera and compressing it to a preset target resolution to obtain a compressed image, and uploading the compressed image to the server;
The edge device is specifically configured to:
under the condition that the acquired target image is distorted, correcting the target image, and then comparing the corrected target image with the reference image to obtain the image offset vector; wherein the correction processing includes rectangle correction, brightness distribution correction, and noise elimination;
the edge device is specifically configured to:
acquiring the brightness distribution of the target image, and confirming whether the brightness distribution of the target image accords with the expected brightness distribution;
correcting the target image by using a preset brightness curve to enable the brightness distribution of the corrected target image to conform to the expected brightness distribution when the brightness distribution of the target image does not conform to the expected brightness distribution, wherein a curve equation used by the preset brightness curve comprises a target brightness adjustment parameter matrix for adjusting the brightness of pixels of the target image, and the target brightness adjustment parameter matrix is obtained by the following steps:
inputting the target image into a preset brightness correction model to obtain a first candidate brightness adjustment parameter matrix;
smoothing the first candidate brightness adjustment parameter matrix to obtain the target brightness adjustment parameter matrix;
The edge device is specifically configured to:
acquiring one or more target boundaries contained in the target image, wherein the target boundaries represent boundaries among a plurality of first target objects in the target image, and the first target objects comprise one or more of a blackboard, a whiteboard, a projection courseware, a wall, a ground, a desk and a chair in a classroom;
acquiring one or more target areas from the target image according to the target boundary;
acquiring a second brightness adjustment parameter matrix corresponding to each target area from the first candidate brightness adjustment parameter matrix;
performing brightness smoothing filtering processing on the second brightness adjustment parameter matrix to eliminate noise;
obtaining the target brightness adjustment parameter matrix according to the smoothed and filtered second brightness adjustment parameter matrix;
the edge device is specifically configured to:
acquiring a rectangular target object in the target image; the rectangular target object comprises one or more of a blackboard, a whiteboard, a projection courseware, a desk and a chair in a classroom;
confirming whether the shape of the rectangular target object in the target image is rectangular or not;
correcting the target image through an image perspective transformation algorithm under the condition that the shape of the rectangular target object in the target image is not rectangular, so that the rectangular target object in the corrected image is a regular rectangle;
The curve equation used by the preset brightness curve is the following formula:
f(x)=x+a*x*(1-x),
wherein x is the pixel brightness of each pixel of the target image, and a is the target brightness adjustment parameter matrix.
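The brightness curve in claim 1 can be illustrated with a small sketch. Assuming pixel brightness is normalized to [0, 1] and a per-pixel parameter matrix `a` has been produced by the brightness correction model, f(x) = x + a·x·(1−x) brightens pixels where a > 0 and darkens them where a < 0, while leaving pure black and pure white unchanged; the box-filter smoothing below is an illustrative stand-in for the patent's smoothing and filtering step, not the patent's own implementation:

```python
import numpy as np

def smooth_parameters(a, k=3):
    """Box-filter smoothing of the brightness adjustment parameter
    matrix to suppress isolated noisy values (a simple stand-in for
    the smoothing and filtering step)."""
    pad = k // 2
    padded = np.pad(a, pad, mode="edge")
    out = np.zeros_like(a, dtype=float)
    for i in range(a.shape[0]):
        for j in range(a.shape[1]):
            out[i, j] = padded[i:i + k, j:j + k].mean()
    return out

def apply_brightness_curve(x, a):
    """f(x) = x + a*x*(1-x): x is the pixel brightness of each pixel,
    a is the (smoothed) target brightness adjustment parameter matrix."""
    y = x + a * x * (1.0 - x)
    return np.clip(y, 0.0, 1.0)

# A uniformly dark 4x4 image and a parameter matrix that mostly
# brightens, with one noisy outlier that the smoothing absorbs.
img = np.full((4, 4), 0.25)
a_raw = np.full((4, 4), 0.8)
a_raw[1, 1] = -0.8
a = smooth_parameters(a_raw)
corrected = apply_brightness_curve(img, a)
```

Note that f(0) = 0 and f(1) = 1 for any value of a, so the curve only redistributes mid-tone brightness rather than clipping the extremes.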
2. The system according to claim 1, wherein the edge device is specifically configured to:
extracting feature points of the target image and the reference image respectively by using a SIFT algorithm to obtain a plurality of target feature points of the target image and a plurality of reference feature points of the reference image;
performing image feature matching on the target feature points and the reference feature points, and acquiring a position offset vector between each target feature point and the matched reference feature points;
smoothing the position offset vector to eliminate noise;
and calculating the image offset vector according to the position offset vector after the smoothing and filtering.
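As a rough sketch of the comparison in claim 2: suppose SIFT detection and matching (e.g. via an off-the-shelf library such as OpenCV) has already produced pairs of matched coordinates in the target and reference images. The per-point position offset vectors can then be filtered against mismatch noise and averaged into a single image offset vector. The function name and the median-based filtering here are illustrative assumptions, not taken from the patent:

```python
import numpy as np

def image_offset_from_matches(target_pts, ref_pts):
    """target_pts, ref_pts: (N, 2) arrays of matched feature point
    coordinates in the target and reference images.
    Returns a single (dx, dy) image offset vector."""
    offsets = target_pts - ref_pts  # position offset vector per matched pair
    # Median-based outlier rejection: discard matches whose offset
    # deviates strongly from the median (a simple stand-in for the
    # smoothing and filtering step that eliminates mismatch noise).
    med = np.median(offsets, axis=0)
    dist = np.linalg.norm(offsets - med, axis=1)
    inliers = offsets[dist < 3.0 * (np.median(dist) + 1e-6)]
    return inliers.mean(axis=0)

# Reference points shifted by (5, -2), plus one gross mismatch that
# the filtering should reject.
ref = np.array([[10., 10.], [50., 40.], [80., 20.], [30., 70.]])
tgt = ref + np.array([5., -2.])
tgt_with_outlier = np.vstack([tgt, [[300., 300.]]])
ref_with_outlier = np.vstack([ref, [[10., 10.]]])
offset = image_offset_from_matches(tgt_with_outlier, ref_with_outlier)
```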
3. The system according to claim 1, wherein the edge device is specifically configured to:
under the condition that the image offset vector is larger than or equal to a preset vector threshold value, circularly executing a camera adjusting step until the image offset vector is smaller than the preset vector threshold value; the camera adjusting step comprises the following steps:
sending the first control instruction to the camera according to the image offset vector so that the camera can adjust the view angle of the camera according to the first control instruction;
acquiring a new target image through the adjusted camera;
and comparing the new target image with the reference image to obtain a new image offset vector, and updating the new image offset vector into the image offset vector.
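The loop in claim 3 can be sketched as follows. The camera class and the comparison function are hypothetical stand-ins for illustration only (a real implementation would send pan/tilt control commands to the hardware and recompute the feature-based offset from a freshly shot image):

```python
import numpy as np

VECTOR_THRESHOLD = 2.0  # preset vector threshold, in pixels (illustrative value)

class FakeCamera:
    """Toy camera whose view converges toward the reference by a fixed
    fraction of each control instruction - a stand-in for real
    pan/tilt hardware."""
    def __init__(self, offset):
        self.offset = np.asarray(offset, dtype=float)

    def adjust(self, control_vector):
        # The camera under-corrects slightly, so several iterations are
        # needed before the offset drops below the threshold.
        self.offset -= 0.8 * np.asarray(control_vector, dtype=float)

    def shoot_and_compare(self):
        # Stand-in for: shoot a new target image and compare it with the
        # reference image to obtain a new image offset vector.
        return self.offset.copy()

def adjust_until_aligned(camera, max_iters=20):
    """Cyclically execute the camera adjusting step until the image
    offset vector is smaller than the preset vector threshold."""
    offset = camera.shoot_and_compare()
    iters = 0
    while np.linalg.norm(offset) >= VECTOR_THRESHOLD and iters < max_iters:
        camera.adjust(offset)               # send the control instruction
        offset = camera.shoot_and_compare() # update the image offset vector
        iters += 1
    return offset, iters

final_offset, n = adjust_until_aligned(FakeCamera([50.0, 0.0]))
```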
4. The system of claim 1, wherein the edge device is further configured to:
acquiring a new target image shot by the camera after the view angle is adjusted;
and carrying out the correction processing on the new target image, and taking the corrected image as a new reference image.
5. The system according to any one of claims 1 to 4, wherein:
the server sends a first camera adjustment instruction to the edge device according to a first user input instruction;
the edge device is further used for receiving the first camera adjustment instruction sent by the server and sending a second control instruction to the camera according to the first camera adjustment instruction, so that the camera adjusts the view angle of the camera according to the second control instruction.
6. A camera adjustment method, characterized in that the method is applied to an edge device of a camera adjustment system, the camera adjustment system comprising a server, one or more edge devices connected with the server, and one or more cameras connected with the edge devices; the camera adjustment system is applied to a classroom scene, the server is a remote server, the edge devices and the cameras are installed in the classroom, and the edge devices and the cameras are connected through a wireless or wired connection; the method comprises the following steps:
acquiring a target image shot by a camera;
comparing the target image with a reference image to obtain an image offset vector;
sending a first control instruction to the camera according to the image offset vector so that the camera can adjust the view angle of the camera according to the first control instruction;
the method further comprises the steps of:
receiving a first image acquisition instruction sent by the server, and acquiring an image shot by the camera according to the first image acquisition instruction;
correcting the image shot by the camera and compressing the image to a preset target resolution to obtain a compressed image;
uploading the compressed image to the server;
comparing the target image with a reference image to obtain an image offset vector, including:
under the condition that the acquired target image is distorted, correcting the target image, and then comparing the corrected target image with the reference image to obtain the image offset vector; wherein the correction processing includes rectangle correction, brightness distribution correction, and noise elimination;
the brightness distribution correction includes:
acquiring the brightness distribution of the target image, and confirming whether the brightness distribution of the target image accords with the expected brightness distribution;
correcting the target image by using a preset brightness curve to enable the brightness distribution of the corrected target image to conform to the expected brightness distribution when the brightness distribution of the target image does not conform to the expected brightness distribution, wherein a curve equation used by the preset brightness curve comprises a target brightness adjustment parameter matrix for adjusting the brightness of pixels of the target image, and the target brightness adjustment parameter matrix is obtained by the following steps:
inputting the target image into a preset brightness correction model to obtain a first candidate brightness adjustment parameter matrix;
smoothing the first candidate brightness adjustment parameter matrix to obtain the target brightness adjustment parameter matrix;
after the smoothing of the first candidate brightness adjustment parameter matrix, obtaining the target brightness adjustment parameter matrix includes:
acquiring one or more target boundaries contained in the target image, wherein the target boundaries represent boundaries among a plurality of first target objects in the target image, and the first target objects comprise one or more of a blackboard, a whiteboard, a projection courseware, a wall, a ground, a desk and a chair in a classroom;
acquiring one or more target areas from the target image according to the target boundary;
acquiring a second brightness adjustment parameter matrix corresponding to each target area from the first candidate brightness adjustment parameter matrix;
performing brightness smoothing filtering processing on the second brightness adjustment parameter matrix to eliminate noise;
obtaining the target brightness adjustment parameter matrix according to the smoothed and filtered second brightness adjustment parameter matrix;
the rectangle correction includes:
acquiring a rectangular target object in the target image; the rectangular target object comprises one or more of a blackboard, a whiteboard, a projection courseware, a desk and a chair in a classroom;
confirming whether the shape of the rectangular target object in the target image is rectangular or not;
correcting the target image through an image perspective transformation algorithm under the condition that the shape of the rectangular target object in the target image is not rectangular, so that the rectangular target object in the corrected target image is a regular rectangle;
the curve equation used by the preset brightness curve is the following formula:
f(x)=x+a*x*(1-x),
wherein x is the pixel brightness of each pixel of the target image, and a is the target brightness adjustment parameter matrix.
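The rectangle correction above relies on an image perspective transformation: once the four corners of, say, the blackboard are located in the target image, the 3×3 homography mapping them onto a regular rectangle can be solved and applied. Below is a minimal sketch of that solve in plain NumPy; production code would typically use a library routine such as OpenCV's `getPerspectiveTransform`/`warpPerspective`, and the corner coordinates here are invented for illustration:

```python
import numpy as np

def homography_from_points(src, dst):
    """Solve the 3x3 perspective transform H (with h33 fixed to 1)
    that maps each src corner to the corresponding dst corner,
    i.e. dst ~ H @ src in homogeneous coordinates."""
    A, b = [], []
    for (x, y), (u, v) in zip(src, dst):
        # u = (h11*x + h12*y + h13) / (h31*x + h32*y + 1), same for v
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y]); b.append(u)
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y]); b.append(v)
    h = np.linalg.solve(np.array(A, dtype=float), np.array(b, dtype=float))
    return np.append(h, 1.0).reshape(3, 3)

def warp_point(H, pt):
    """Apply the homography to a single 2-D point."""
    x, y, w = H @ np.array([pt[0], pt[1], 1.0])
    return np.array([x / w, y / w])

# A skewed quadrilateral (e.g. a blackboard seen from an angle)
# mapped onto a regular 400x200 rectangle.
src = [(12.0, 8.0), (390.0, 30.0), (402.0, 190.0), (5.0, 175.0)]
dst = [(0.0, 0.0), (400.0, 0.0), (400.0, 200.0), (0.0, 200.0)]
H = homography_from_points(src, dst)
```

Warping every pixel of the image through H (with interpolation) then yields a corrected image in which the rectangular target object is a regular rectangle.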
7. The method of claim 6, wherein comparing the target image with a reference image to obtain an image offset vector comprises:
extracting feature points of the target image and the reference image respectively by using a SIFT algorithm to obtain a plurality of target feature points of the target image and a plurality of reference feature points of the reference image;
performing image feature matching on the target feature points and the reference feature points, and acquiring a position offset vector between each target feature point and the matched reference feature points;
smoothing the position offset vector to eliminate noise;
and calculating the image offset vector according to the position offset vector after the smoothing and filtering.
8. The method of claim 6, wherein the sending the first control instruction to the camera according to the image offset vector so that the camera adjusts a camera view according to the first control instruction comprises:
under the condition that the image offset vector is larger than or equal to a preset vector threshold value, circularly executing a camera adjusting step until the image offset vector is smaller than the preset vector threshold value; the camera adjusting step comprises the following steps:
sending the first control instruction to the camera according to the image offset vector;
acquiring a new target image through the adjusted camera;
and comparing the new target image with the reference image to obtain a new image offset vector, and updating the new image offset vector into the image offset vector.
9. The method of claim 6, wherein after said adjusting said camera according to said image offset vector, said method further comprises:
acquiring a new target image shot by the adjusted camera;
and carrying out the correction processing on the new target image, and taking the corrected image as a new reference image.
10. The method according to any one of claims 6 to 9, further comprising:
and receiving a first camera adjustment instruction sent by the server, and sending a second control instruction to the camera according to the first camera adjustment instruction, so that the camera adjusts the view angle of the camera according to the second control instruction.
11. A computer readable storage medium, on which a computer program is stored, characterized in that the program, when being executed by a processor, implements the steps of the method according to any one of claims 6 to 10.
12. An electronic device, comprising:
a memory having a computer program stored thereon;
a processor for executing the computer program in the memory to implement the steps of the method of any one of claims 6 to 10.
CN202110129635.1A 2021-01-29 2021-01-29 Camera adjustment method, system, storage medium and electronic equipment Active CN112969022B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110129635.1A CN112969022B (en) 2021-01-29 2021-01-29 Camera adjustment method, system, storage medium and electronic equipment


Publications (2)

Publication Number Publication Date
CN112969022A CN112969022A (en) 2021-06-15
CN112969022B true CN112969022B (en) 2023-09-01

Family

ID=76272509


Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113382171B (en) * 2021-06-21 2023-03-24 车路通科技(成都)有限公司 Traffic camera automatic correction method, device, equipment and medium
CN115617532B (en) * 2022-11-22 2023-03-31 浙江莲荷科技有限公司 Target tracking processing method, system and related device

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108389438A (en) * 2018-05-10 2018-08-10 科大讯飞股份有限公司 A kind of writing on the blackboard acquisition system
CN108961175A (en) * 2018-06-06 2018-12-07 平安科技(深圳)有限公司 Face luminance regulating method, device, computer equipment and storage medium
GB201818096D0 (en) * 2018-11-06 2018-12-19 Telensa Holdings Ltd Monitoring system
CA3076028A1 (en) * 2017-09-20 2019-03-28 Jeremy LEFEBVRE Secure, remote support platform with an edge device
CN110458971A (en) * 2019-07-05 2019-11-15 中国平安人寿保险股份有限公司 Check class attendance recording method, device, computer equipment and storage medium
WO2020045837A1 (en) * 2018-08-28 2020-03-05 김영대 Method for smart-remote lecturing using automatic scene-transition technology having artificial intelligence function in virtual and augmented reality lecture room
CN112235605A (en) * 2020-11-03 2021-01-15 新东方教育科技集团有限公司 Video processing system and video processing method

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8274544B2 (en) * 2009-03-23 2012-09-25 Eastman Kodak Company Automated videography systems
JP2019009686A (en) * 2017-06-27 2019-01-17 株式会社日立製作所 Information processing unit and processing method of image data




Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant