CN112104840B - Video acquisition method and mobile baseband workstation - Google Patents


Info

Publication number
CN112104840B
CN112104840B (application CN202010942399.0A)
Authority
CN
China
Prior art keywords
data
workstation
image acquisition
frame
parameters
Prior art date
Legal status
Active
Application number
CN202010942399.0A
Other languages
Chinese (zh)
Other versions
CN112104840A (en)
Inventor
张根 (Zhang Gen)
Current Assignee
Shenzhen Neoway Technology Co Ltd
Original Assignee
Shenzhen Neoway Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Shenzhen Neoway Technology Co Ltd filed Critical Shenzhen Neoway Technology Co Ltd
Priority to CN202010942399.0A priority Critical patent/CN112104840B/en
Publication of CN112104840A publication Critical patent/CN112104840A/en
Application granted granted Critical
Publication of CN112104840B publication Critical patent/CN112104840B/en

Classifications

    • H — ELECTRICITY
        • H04 — ELECTRIC COMMUNICATION TECHNIQUE
            • H04N — PICTORIAL COMMUNICATION, e.g. TELEVISION
                • H04N 7/00 — Television systems
                    • H04N 7/18 — Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
                        • H04N 7/181 — CCTV systems for receiving images from a plurality of remote sources
                • H04N 21/00 — Selective content distribution, e.g. interactive television or video on demand [VOD]
                    • H04N 21/20 — Servers specifically adapted for the distribution of content, e.g. VOD servers; operations thereof
                        • H04N 21/23 — Processing of content or additional data; elementary server operations; server middleware
                            • H04N 21/231 — Content storage operation, e.g. caching movies for short-term storage, replicating data over plural servers, prioritizing data for deletion
                                • H04N 21/23106 — Content storage operation involving caching operations
                            • H04N 21/238 — Interfacing the downstream path of the transmission network, e.g. adapting the transmission rate of a video stream to network bandwidth; processing of multiplex streams
                                • H04N 21/2383 — Channel coding or modulation of digital bit-stream, e.g. QPSK modulation
                            • H04N 21/24 — Monitoring of processes or resources, e.g. monitoring of server load, available bandwidth, upstream requests
                    • H04N 21/40 — Client devices specifically adapted for the reception of or interaction with content, e.g. set-top box [STB]; operations thereof
                        • H04N 21/43 — Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; client middleware
                            • H04N 21/44 — Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs
                                • H04N 21/44004 — Processing of video elementary streams involving video buffer management, e.g. video decoder buffer or video display buffer
                            • H04N 21/442 — Monitoring of processes or resources, e.g. detecting the failure of a recording device, monitoring the downstream bandwidth, the number of times a movie has been viewed, the storage space available from the internal hard disk
                                • H04N 21/44245 — Monitoring the upstream path of the transmission network, e.g. its availability, bandwidth

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Databases & Information Systems (AREA)
  • Closed-Circuit Television Systems (AREA)

Abstract

The application provides a video acquisition method and an electronic device. The method comprises the following steps: the electronic device receives the fixed parameters and identifier parameters of the image acquisition devices and configures their working parameters; the electronic device then receives the image information the devices acquire according to those working parameters, compresses and encodes it, and synthesizes it into video data; the video data may be stored locally or transmitted to a user terminal. Because the electronic device can connect to multiple image acquisition devices and configure each device's working parameters from its reported parameters, a user can choose the positions and number of image acquisition devices to suit the monitored scene, so that the scene is covered effectively, comprehensively, and in a targeted manner.

Description

Video acquisition method and mobile baseband workstation
Technical Field
The invention relates to the field of video monitoring, in particular to a video acquisition method and an electronic device.
Background
In recent years, with the development of Internet-of-Things technology, video monitoring has been widely applied in homes, companies, hotels, warehouses, and other places. Monitoring systems provide better functionality and user experience by adding pan-tilt mounts, raising camera resolution, adjusting camera focal length dynamically, using wide-angle lenses, and so on. The placement of the image acquisition devices largely determines the area a monitoring system can cover, so a user who needs to cover multiple areas, or several isolated scenes, has to deploy multiple monitoring systems.
In a multi-partition scene such as a home or a warehouse, the room layout and the installation positions of the image acquisition devices determine the range the monitoring system can effectively cover. The user must deploy several sets of monitoring systems to cover the monitored scene completely, which is costly. Moreover, when the monitored scene changes, existing monitoring systems cannot adapt their parameters to the new scene, so flexibility is poor and user experience suffers.
Disclosure of Invention
The application aims to provide a video acquisition method and an electronic device that let a user cover a monitored scene comprehensively and in a targeted manner by adjusting the type, number, and working parameters of the image acquisition devices of a single monitoring system, so that multiple sets of monitoring systems are no longer needed. When the number of image acquisition devices changes, the method adjusts their working parameters in real time, making monitoring of the covered area more targeted.
In a first aspect, the application provides a video acquisition method applied to a mobile baseband workstation in a monitoring system. The method comprises: obtaining the fixed parameters and identifier parameters of the image acquisition devices in the monitoring system; configuring, from those parameters, the resolution, frame rate, and other parameters each image acquisition device uses when capturing images; and video-encoding the acquired image information.
It can be seen that the mobile baseband workstation in the monitoring system is connected to a plurality of image acquisition devices. Because the devices sit at different positions in the monitored scene, they cover different areas, and the scene as a whole comprises several monitoring areas between which a priority relationship exists. The method establishes a mapping between the priority of a monitoring area and its image acquisition device through the identifier parameter, and configures each device's working parameters from its identifier parameter and fixed parameters, so that the monitoring system monitors the covered areas in a targeted manner.
Based on the first aspect, in one possible application scenario, the user places image acquisition devices carrying different identifier parameters in different monitoring areas according to the areas' priorities. In other scenarios, the mobile baseband workstation is pre-configured with identifier parameters and the mapping between identifier parameter and image acquisition device is established manually. The identifier parameter therefore represents both the priority of a monitored area and the priority among the image acquisition devices. When the workstation's resources are insufficient to receive and process the image information of all connected image acquisition devices, the working parameters of low-priority devices are reduced first; when the resources suffice to receive and process all of it, the working parameters of high-priority devices are raised first.
It can be seen that the method makes maximum use of the processing capability of the mobile baseband workstation while respecting the priorities among the monitored areas, so monitoring of those areas is more targeted.
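As a rough illustration of this priority rule, the following sketch downgrades the lowest-priority devices while the total rate exceeds capacity and upgrades the highest-priority devices while headroom remains. The `Device` class and the linear bitrate model are assumptions for illustration, not part of the patent.

```python
from dataclasses import dataclass

@dataclass
class Device:
    name: str
    priority: int      # identifier parameter: larger value = higher priority
    level: int         # index into the device's supported modes (0 = lowest)
    max_level: int

    def bitrate(self) -> int:
        # Hypothetical cost model: each mode level adds a fixed bitrate step.
        return 100 * (self.level + 1)

def adapt(devices: list[Device], capacity: int) -> None:
    """Reduce working parameters of low-priority devices first when over
    capacity; raise those of high-priority devices first when under it."""
    total = lambda: sum(d.bitrate() for d in devices)
    # Over budget: downgrade starting from the lowest priority.
    for d in sorted(devices, key=lambda d: d.priority):
        while total() > capacity and d.level > 0:
            d.level -= 1
    # Spare capacity: upgrade starting from the highest priority.
    for d in sorted(devices, key=lambda d: -d.priority):
        while d.level < d.max_level and total() + 100 <= capacity:
            d.level += 1
```

With a 700-unit budget and three devices of priorities 5, 3, 1 all at level 2, only the priority-1 device is downgraded.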
Based on the first aspect, the method may further synthesize the encoded and compressed data into video data and send it to the user terminal over the local area network. Under a weak network, the workstation's sending data rate exceeds the user's receiving data rate; to avoid blocking, a buffer queue on the workstation provides a degree of buffering. Further, if that buffer queue becomes full, the workstation evaluates the current network condition from how long the queue stays full and how many times it has filled within a threshold time, and, according to that evaluation, deletes M frames of buffered data after each frame is sent successfully, where M is an integer greater than or equal to 1.
This optimizes the logic by which the mobile workstation sends video data over the local area network under weak-network conditions, preserving the real-time property of the video data the user receives, so the user sees live monitoring data.
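The evaluation step — choosing M from how long the queue has stayed full and how often it filled within a threshold window — might look like the sketch below. All thresholds and weights are hypothetical; the patent only requires that M ≥ 1.

```python
def frames_to_drop(full_events: list[float], now: float,
                   window: float = 10.0, full_duration: float = 0.0) -> int:
    """Estimate M, the number of buffered frames to discard after one
    frame is sent successfully under a weak network.

    full_events   -- timestamps at which the buffer queue became full
    full_duration -- seconds the queue has currently been full
    """
    recent = [t for t in full_events if now - t <= window]
    m = 1                        # always drop at least one stale frame
    if len(recent) >= 3:         # queue fills often: network badly congested
        m += 2
    if full_duration > 2.0:      # queue stuck full for a long stretch
        m += 1
    return m
```

A queue that rarely fills yields M = 1; one that filled three times in the last ten seconds and has been full for over two seconds yields a larger M, discarding more stale frames per successful send.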
Based on the first aspect, the method may likewise synthesize the encoded and compressed data into video data and send it to the user terminal over the local area network when the network is unstable, i.e. when the transmission rate occasionally drops in infrequent bursts; the buffer queue on the mobile baseband workstation again provides a degree of buffering. Further, if the buffer queue becomes full, N frames of buffered data are deleted after each frame is sent successfully, where N is a preset positive integer greater than or equal to 1.
This optimizes the sending logic under an unstable network, ensuring both the real-time property and the integrity of the video data the user receives, and improving the overall user experience.
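The fixed-N policy can be sketched as a bounded send queue that, whenever it was full, discards the N oldest remaining frames after each successful send — trading completeness for latency. The capacity and drop count below are illustrative.

```python
from collections import deque

class SendBuffer:
    """Bounded frame queue with a preset drop count N (N >= 1)."""

    def __init__(self, capacity: int, n_drop: int = 1):
        assert n_drop >= 1
        self.queue = deque()
        self.capacity = capacity
        self.n_drop = n_drop

    def push(self, frame) -> bool:
        """Enqueue a frame; refuse it when the queue is full."""
        if len(self.queue) < self.capacity:
            self.queue.append(frame)
            return True
        return False

    def pop_after_send(self):
        """Take the next frame to send; if the queue was full,
        also delete the N oldest remaining frames."""
        was_full = len(self.queue) >= self.capacity
        frame = self.queue.popleft()
        if was_full:
            for _ in range(min(self.n_drop, len(self.queue))):
                self.queue.popleft()
        return frame
```

For example, with capacity 4 and N = 2, sending one frame from a full queue also discards the two frames behind it, so the next frame sent is much fresher.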
In a second aspect, the application provides a mobile baseband workstation comprising a USB interface supporting the OTG function, a MIPI interface, a memory, and a processor, connected by a bus. The processor is configured to set the working parameters of each image acquisition device from the identifier parameters and fixed parameters of at least three image acquisition devices; to receive the at least three streams of image information those devices acquire according to the working parameters; and to video-encode each of the streams.
It should be noted that, in a specific embodiment, each of the functional modules may be configured to implement the method described in the first aspect; details are not repeated here.
In a fourth aspect, the application provides a non-transitory computer-readable storage medium storing code that implements the method of the first aspect. When the program code is executed by a computing device, that device performs the method of the first aspect.
Drawings
To illustrate the embodiments of the present invention or the technical solutions in the prior art more clearly, the drawings used in their description are briefly introduced below. Obviously, the drawings described below show only some embodiments of the present invention; those skilled in the art can derive other drawings from them without creative effort.
Fig. 1 shows a system architecture of a monitoring system comprising a mobile baseband workstation.
Fig. 2 shows an exemplary application scenario in which the monitoring device is deployed.
Fig. 3 shows another exemplary application scenario for deploying the monitoring device.
Fig. 4 shows a video capture method applied to a mobile baseband workstation.
Fig. 5 shows a typical image information acquisition flow of the monitoring apparatus.
Fig. 6 shows an exemplary audio information collection flow of the monitoring device.
Fig. 7 shows a workflow of code stream adaptation of the monitoring device.
Fig. 8 shows another workflow of code stream adaptation of the monitoring device.
Fig. 9 shows a data interaction flow between the monitoring device and the network side.
Fig. 10 shows another data interaction flow of the monitoring device and the network side.
Detailed Description
The technical solutions in the embodiments of the present invention are described below clearly and completely with reference to the drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present invention. All other embodiments obtained by a person skilled in the art without inventive effort on the basis of these embodiments fall within the scope of protection of the present invention.
Fig. 1 shows a system architecture of a monitoring system comprising a mobile baseband workstation. As shown in fig. 1, wherein:
the mobile baseband workstation 101 is provided with a processor (System on Chip, SoC) 501. An SoC is an integrated circuit built for a dedicated purpose that integrates a complete system, including its embedded software. The processor runs the Android operating system and supports two functional modules, which may be maintained by synchronous or asynchronous threads: a data processing module 5011 and a code stream adaptation module 5012. The mobile baseband workstation 101 comprises a memory 502 for storing data and, optionally, a network transmission device 503 supporting network transmission. The SoC 501, the memory 502, and the network transmission device 503 are connected by a bus over which they exchange control information and data. The mobile baseband workstation 101 is connected to at least three image acquisition devices, comprising at least one MIPI camera 203 and at least two USB cameras 311, 312, …, 31K.
The Android system running on the SoC 501 enables kernel support for UVC (USB Video Class) devices and the corresponding Android support.
In addition, the mobile baseband workstation 101 further includes a Universal Serial Bus (USB) interface 301 supporting the OTG (On-The-Go) function, a Mobile Industry Processor Interface (MIPI) 201, and an audio input interface 401.
The mobile baseband workstation 101 may have a plurality of USB interfaces 301, MIPI interfaces 201, and audio input interfaces 401. The USB interface 301, the MIPI interface 201, and the SoC 501 are connected through the bus.
A USB camera is a digital camera whose interface is USB; it is mainly used in the field of video communication and supports hot plugging. A USB camera is powered, and transfers data, over the USB transmission line connected to it. A MIPI camera is a digital camera whose interface is the MIPI interface 201, a differential serial port. Attenuation on the MIPI video transmission line 202 rises sharply with length, which per the MIPI specification should not go much beyond 3.5 inches, so the MIPI camera 203 is co-located with the mobile baseband workstation 101. The length of the USB transmission line 304 is generally not limited.
The MIPI camera 203 is connected to the MIPI interface 201 through the MIPI video transmission line 202. The USB cameras 311, 312, …, 31K are each connected through a USB transmission line 304 to a branch port of a HUB 303, whose main port is connected to the USB interface 301 of the mobile baseband workstation 101. An audio capture device 402, such as a microphone, is connected to the audio input interface 401.
It should be noted that, a plurality of MIPI cameras may be respectively connected to a plurality of MIPI interfaces through MIPI video transmission lines; the plurality of audio acquisition devices may be connected to the plurality of audio input interfaces, respectively.
The Soc501, the MIPI interface 201, the USB interface 301, and the audio input interface 401 are connected by a bus to complete mutual transmission of control information and data.
When the USB cameras 311, 312, …, 31K are connected to the mobile baseband workstation 101 through the HUB 303, device identification nodes corresponding to them are created under the system directory /dev, and the SoC 501 can count the USB cameras by detecting those nodes. The Android system running on the SoC 501 modifies the JNI interface to point to the Camera of the JAVA layer, drives the USB cameras 311, 312, …, 31K through the V4L2 (Video for Linux Two) instruction set, and receives the image data, working parameters, and fixed parameters they collect.
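Counting cameras by scanning /dev can be approximated as below. Note that on a real system a single UVC camera may expose more than one /dev/videoN node, so this count is an illustrative upper bound, not the exact enumeration logic of the workstation.

```python
import re
from pathlib import Path

def count_usb_cameras(dev_dir: str = "/dev") -> int:
    """Count V4L2 video device nodes (/dev/videoN) in dev_dir."""
    pattern = re.compile(r"^video\d+$")
    return sum(1 for p in Path(dev_dir).iterdir() if pattern.match(p.name))
```

In practice the result would then be cross-checked against the number of HUB branch ports in use before configuring working parameters.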
The fixed parameters comprise the resolutions and frame rates preset for each image acquisition device; the working parameters are the resolution and frame rate each device actually uses when acquiring image information, and are distinct from the fixed parameters.
A declaration of the MIPI camera 203 is registered in the Android system running on the SoC 501 and its driver is loaded; the system then receives the MIPI camera's image information (in YUV format), working parameters, and fixed parameters, and acquires its identifier parameter.
Independent identifier parameters are configured for the image acquisition devices according to the order in which the SoC 501 identifies them, or according to the mapping of the HUB 303 branch ports; the identifier parameters represent the priority among the devices.
The at least three image acquisition devices may carry identifier parameters, which the SoC 501 can read.
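The two assignment schemes above — a preset hub-port priority map, or fallback to detection order — can be sketched as follows. The numeric identifier values are illustrative; the patent only requires that they encode an order of priority.

```python
from typing import Optional

def assign_identifiers(devices: list[str],
                       port_priority: Optional[dict[str, int]] = None) -> dict[str, int]:
    """Assign each camera an identifier parameter encoding its priority.

    If a hub-port -> priority map is preset, use it; otherwise fall back
    to detection order (earlier-detected device = higher priority).
    """
    if port_priority is not None:
        return {dev: port_priority[dev] for dev in devices}
    n = len(devices)
    return {dev: n - i for i, dev in enumerate(devices)}
```

For three devices detected in order, detection-order fallback yields identifiers 3, 2, 1 — larger meaning higher priority, consistent with the embodiment described below.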
The SoC 501 reads the fixed parameters and identifier parameters of the at least three image acquisition devices and configures their working parameters; the process is shown in figures 5 and 6.
The at least three image acquisition devices respectively transmit the acquired monitoring image information data to the mobile baseband workstation 101. The audio acquisition devices transmit the acquired audio information to the mobile baseband workstation 101, respectively.
The SoC 501 encodes and compresses the image information and the audio information respectively and synthesizes the compressed data into video data; the encoding, compression, and synthesis processes are shown in figs. 6 and 7.
When the user terminal requests data from the mobile baseband workstation 101, the network transmission device 503 on the mobile baseband workstation 101 receives a stream pulling request from the user terminal, and sends the video data to the user terminal through the local area network. The process of the network transmission device 503 transmitting data to the user terminal through the local area network is as shown in fig. 9 and fig. 10.
Fig. 2 shows an exemplary application scenario in which the monitoring device is deployed. As shown in fig. 2, the monitored area is a home, and the user chooses the placement of the image acquisition devices and the audio acquisition device according to the characteristics, size, and importance of each monitored area: the MIPI camera 203 is configured in the living room, the USB camera 311 is placed in the master bedroom, the USB camera 312 in the secondary bedroom, the USB camera 313 on the balcony, and, optionally, the microphone 402 in the living room.
In this embodiment, the mobile baseband workstation 101 configures the identifier parameters of the MIPI camera 203, the USB camera 311, the USB camera 312, and the USB camera 313 to be 5, 3, 1, and 1 according to the order and type of the image acquisition devices. The larger the identifier parameter, the higher the priority it represents.
In other embodiments, the mobile baseband workstation 101 can also configure the identifier parameters of the USB cameras 311 to 313 as follows: the identifier parameters are preset in the SoC 501, and the mapping between them and the at least three image acquisition devices is established through the order of the HUB 303 ports.
The identifier parameter may also be any ordered value, such as letters or binary codes, or a parameter whose priority is assigned manually.
In this embodiment the fixed parameters of the MIPI camera are: resolutions 1920 × 1080, 1280 × 960, 1280 × 720, 800 × 600, and 720 × 600, with frame rates 25, 20, and 15 fps; the USB cameras support resolutions 720 × 600, 640 × 480, 640 × 360, and 320 × 240, with frame rates 20 and 15 fps.
It is worth mentioning that the fixed parameters of the at least three image acquisition devices are not limited to resolution and frame rate. For ease of description, the following describes the SoC 501 configuring the working parameters of the at least three image acquisition devices using only resolution and frame rate as examples.
The code stream adaptation module 5012 running on the SoC 501 configures the working parameters of the at least three image acquisition devices from their fixed parameters and identifier parameters. The adjusted working parameters keep the code rate of the acquired image information within the load processing capability of the data processing module 5011 in the SoC 501, and within the load transmission capability of the USB interface 301.
The code stream adaptation module 5012 lets the system configure the working parameters of each monitoring area's image acquisition device according to the priority represented by its identifier parameter.
In this embodiment, after being adjusted by the system code stream adaptation module 5012, the working parameters of the image capturing device are:
Working parameters of the MIPI camera 203: resolution 1280 × 720, frame rate 25 fps.
Working parameters of the USB camera 311: resolution 720 × 600, frame rate 20 fps.
Working parameters of the USB camera 312: resolution 640 × 360, frame rate 20 fps.
Working parameters of the USB camera 313: resolution 640 × 360, frame rate 20 fps.
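One way to arrive at working parameters like these — a sketch, not necessarily the algorithm the patent intends — is a greedy pass in descending priority over each camera's fixed-parameter modes under a total bitrate budget. The 0.1 bit-per-pixel compressed-rate cost model and the budget values are assumptions.

```python
def pick_working_params(cameras, budget_mbps: float):
    """cameras: list of (name, priority, modes), where modes is a list of
    ((width, height), fps) pairs sorted best-first (the fixed parameters).
    Returns {name: chosen_mode}."""
    def rate(mode):
        (w, h), fps = mode
        return w * h * fps * 0.1 / 1e6   # Mbit/s, hypothetical model

    result, used = {}, 0.0
    for name, _prio, modes in sorted(cameras, key=lambda c: -c[1]):
        for mode in modes:               # try the best mode first
            if used + rate(mode) <= budget_mbps:
                result[name] = mode
                used += rate(mode)
                break
        else:
            result[name] = modes[-1]     # fall back to the cheapest mode
            used += rate(modes[-1])
    return result
```

Shrinking the budget forces the lower-priority camera down to a cheaper mode while the high-priority camera keeps its best one.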
Fig. 3 shows another exemplary application scenario for deploying the monitoring device. As shown in fig. 3, the system supports the user adding a USB camera 314 in the guest bedroom of the application scenario shown in fig. 2; its identifier parameter is 3.
The code stream adaptation module 5012 running on the Soc501 operates in the mode shown in fig. 7 and 8.
In this embodiment, after being configured by the code stream adaptation module 5012, the working parameters of the image capturing device are:
Working parameters of the MIPI camera 203: resolution 1280 × 720, frame rate 25 fps.
Working parameters of the USB camera 311: resolution 720 × 600, frame rate 20 fps.
Working parameters of the USB camera 312: resolution 320 × 240, frame rate 15 fps.
Working parameters of the USB camera 313: resolution 320 × 240, frame rate 15 fps.
Working parameters of the USB camera 314: resolution 720 × 600, frame rate 15 fps.
After the code stream adaptation module 5012 configures the working parameters as above, the combined code rate of the USB cameras 311, 312, 313, and 314 does not exceed the load transmission capability of the USB interface 301, and the code rate of the image acquisition devices does not exceed the load processing capability of the data processing module 5011 in the SoC 501.
The data processing module 5011 running on the Soc501 receives image information and audio information acquired by the image acquisition device and the audio acquisition device, compresses and encodes the acquired image information and audio information, and synthesizes the compressed and encoded data into a video.
The work flow of the data processing module 5011 is shown in fig. 4 and 5.
The data processing module 5011 running on the SoC 501 receives the image information data from the image acquisition devices and compresses it. In this embodiment, four independent threads are preferably used for compression, corresponding respectively to the image encoding and compression of the MIPI camera 203 and the USB cameras 311, 312, and 313.
Optionally, a data processing module 5011 running on Soc501 may receive the audio information of microphone 402. In this embodiment, the format of the audio information recorded by the microphone is a Pulse Code Modulation (PCM) format.
The data processing module 5011 compresses the image information and the audio information as shown in fig. 5. The data processing module 5011 writes the compressed data into the memory 502 and stores the compressed data on the local side to meet the requirement of video playback. The data processing module 5011 writes the compressed data to the network transmission device 503.
The data processing module 5011 may write the synthesized video to the memory 502, preferably using circular segmented storage.
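Circular segmented storage can be sketched as a fixed ring of segments in which, once the ring is full, the oldest segment is overwritten by the newest. This in-memory toy stands in for real segment files; segment sizing and file handling are omitted assumptions.

```python
class SegmentedRecorder:
    """Keep at most max_segments video segments, discarding the oldest."""

    def __init__(self, max_segments: int):
        self.max_segments = max_segments
        self.segments = []      # (index, data) pairs, oldest first
        self.next_index = 0

    def write_segment(self, data: bytes) -> None:
        if len(self.segments) == self.max_segments:
            self.segments.pop(0)            # overwrite the oldest segment
        self.segments.append((self.next_index, data))
        self.next_index += 1
```

This bounds local storage while always retaining the most recent footage for playback.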
The data processing module 5011 may write the synthesized video into the network transmission device 503.
It can be understood that, as the number of image acquisition devices in the scene of fig. 1 is increased or decreased, the system adjusts itself accordingly to achieve targeted monitoring of the monitored area.
Fig. 4 shows a video capture method applied to a mobile baseband workstation, as shown in fig. 4, the method includes:
S401: And acquiring working parameters and identifier parameters of the image acquisition equipment.
The video acquisition method acquires working parameters and identifier parameters of image acquisition equipment connected to a mobile baseband workstation. Acquiring the working parameters of the image acquisition equipment specifically includes: enabling kernel support for UVC devices in the Android system, modifying the JNI interface to point to the Camera of the Java layer, driving the USB camera through the V4L2 (Video for Linux Two) instruction set to acquire the working parameters of the USB camera, and declaring the MIPI camera through registration and loading the driver of the MIPI camera. Acquiring the identifier parameter of the image capturing device specifically includes: reading the identifier parameter of the image capturing device according to a mapping relationship under the system directory /dev, or setting it according to a device class of the system directory /dev. Step S402 is then performed.
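The identifier lookup under the system directory /dev described above can be sketched as follows; this is a minimal Python illustration that assumes the identifier parameter is simply derived from the index of a /dev/videoN node (the function and field names are illustrative, not from the patent):

```python
import re

def derive_identifiers(dev_nodes):
    # Derive an identifier parameter for each V4L2 device node name
    # (e.g. "video0") from its numeric index, mirroring how the method
    # reads identifiers from the mapping under the system directory /dev.
    devices = []
    for node in sorted(dev_nodes):
        m = re.fullmatch(r"video(\d+)", node)
        if m:  # skip non-capture nodes such as "card0"
            devices.append({"node": node, "identifier": int(m.group(1))})
    return devices
```

In a real system the node names would come from scanning /dev (for example with os.listdir), and the identifier could instead be assigned from a device-class table.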
S402: and configuring the working parameters of the image acquisition equipment according to the working parameters and the identifier parameters of the image acquisition equipment, and driving the image acquisition equipment to work.
After the working parameters and the identifier parameters of the image acquisition equipment are obtained, the method configures the working parameters of the image acquisition equipment, and specifically comprises the following steps: determining the priority of the image acquisition equipment according to the identifier parameters; selecting proper resolution and frame number from fixed parameters of the image acquisition equipment as working parameters of the image acquisition equipment according to the priority of the image acquisition equipment; and setting parameters corresponding to the Camera instance of the image acquisition equipment, and driving the image acquisition equipment to acquire image information according to the working parameters. Step S403 is performed.
S403: and acquiring image information acquired by the image acquisition equipment under the working parameters, and compressing and encoding the image information.
After acquiring the image information collected by the image acquisition equipment under the working parameters, the method compresses and encodes the acquired image information. Specifically, the acquired image information is in a YUV format, and the compression coding mode is H.264 compression coding or another compression coding mode.
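The code rate of uncompressed YUV 4:2:0 data shows why compression coding is needed before storage or transmission; a short arithmetic sketch (1.5 bytes per pixel: a full-resolution luma plane plus two quarter-resolution chroma planes):

```python
def yuv420_bitrate_bps(width, height, fps):
    # Raw code rate of uncompressed 8-bit YUV 4:2:0 video:
    # 1.5 bytes per pixel, 8 bits per byte, fps frames per second.
    return int(width * height * 1.5 * 8 * fps)
```

At 1280x720 and 30 frames per second this is roughly 332 Mbit/s, far above typical USB or uplink rates, whereas H.264 compresses such streams by two orders of magnitude or more.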
Fig. 5 shows an exemplary image information collecting flow of the monitoring device, as shown in fig. 5, including:
s501: and receiving YUV data.
The data processing module 5011 running on the Soc501 drives the USB camera and the MIPI camera through the interface in the Android system by using the V4L2 instruction set, respectively, to obtain YUV data acquired by the USB camera and the MIPI camera. The above-described driving processes may be maintained by different threads, respectively, or may be maintained by one thread. Step S502 is performed.
S502: and (5) encoding.
The data processing module 5011 running on the Soc501 encodes the received YUV format data. Step S503 is performed.
S503: audio data is received and video is synthesized.
The data processing module 5011 running on the Soc501 synthesizes the compressed image information with the audio information. When the audio acquisition equipment corresponds to the image acquisition equipment in a one-to-one manner, the audio information and the image information are synthesized into a video according to the corresponding relationship. When the number of the audio equipment and the number of the image acquisition equipment are not equal, establishing a mapping relation between the audio acquisition equipment and the image acquisition equipment, and synthesizing the audio information and the image information into a video according to the mapping relation. Step S504 is performed.
S504: storing video data to a memory 502 or a network transmission device 503
Soc501 transmits the synthesized video data to network transmission module 503 or Soc501 transmits the synthesized video data to memory 502 for storage.
The code synthesis mode can be a multi-thread mode or a single-thread mode. The audio data acquisition flow in step S503 is shown in fig. 6.
In this embodiment, the audio capture device is a microphone 402. The data processing module 5011 running on the Soc501 establishes a mapping relationship between the audio acquisition device and the image acquisition device. The mapping relation is one-to-many, namely one audio acquisition device corresponds to a plurality of image acquisition devices, namely one path of audio information corresponds to a plurality of paths of image information. The data processing module 5011 running on the Soc501 synthesizes the audio information and the image information into a video according to the above mapping relationship.
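The one-to-one and one-to-many mappings between audio and image acquisition devices can be sketched as follows (device names and the round-robin fan-out are illustrative, not from the patent):

```python
def build_audio_map(audio_ids, camera_ids):
    # Map each audio capture device to one or more image capture devices.
    # With equal counts, the mapping is one-to-one; with a single
    # microphone, one path of audio fans out to every camera (one-to-many).
    if len(audio_ids) == len(camera_ids):
        return {a: [c] for a, c in zip(audio_ids, camera_ids)}
    mapping = {a: [] for a in audio_ids}
    for i, cam in enumerate(camera_ids):
        mapping[audio_ids[i % len(audio_ids)]].append(cam)
    return mapping
```

The synthesis step then muxes the same audio track into each video stream listed under its audio device.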
In this embodiment, the format of the synthesized video data is MP4 format; in other embodiments, the format of the synthesized video data may also be 3GP, AVI, MPEG, FLV, or other formats. The encoding and synthesizing method may be h.264 encoding, h.265 encoding, or other various methods.
Fig. 6 shows an exemplary audio information collection process of the monitoring device, as shown in fig. 6, including:
s601: data is acquired for the microphone 402.
The data processing module 5011 running on the Soc501 obtains the data of the audio acquisition device, and the data format is PCM format. In the embodiment shown in fig. 2 or fig. 3, the audio capturing device is a microphone 402. Step S602 is performed.
S602: and (5) encoding.
The data processing module 5011 running on the Soc501 encodes the acquired audio information in PCM format. In the embodiment shown in fig. 2 or fig. 3, the encoding format is AAC.
In this embodiment, the memory 502 stores the synthesized video data, preferably in a circular segment storage manner.
In the circular segmented storage mode, a video file is stored at every fixed interval; the size of each video file is related to the interval time, and each video file is marked with its storage time. When the remaining storage space reaches a threshold, the oldest stored video file is deleted to make room for the newly synthesized video file.
The threshold is I times the size of a single video file; deletion is triggered when the remaining capacity of the memory 502 falls to this value. In this embodiment, I is 1; in other embodiments, I may be any positive integer greater than or equal to 1. The storage-time marks are used to judge the chronological order in which the video files in the memory 502 were stored, so that the earliest-written video file can be deleted according to this mark.
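A minimal sketch of the circular segmented storage policy, assuming the threshold means that deletion is triggered when the free space is no larger than I times the size of a single video file (class and field names are illustrative):

```python
from collections import deque

class CircularSegmentStore:
    # One file per fixed interval, each tagged with its storage time;
    # when free space falls to the threshold (I times the single-file
    # size), the earliest-written file is deleted first.
    def __init__(self, capacity, file_size, i_factor=1):
        self.capacity = capacity
        self.file_size = file_size
        self.threshold = i_factor * file_size
        self.files = deque()          # (storage_time, name), oldest first

    def free_space(self):
        return self.capacity - len(self.files) * self.file_size

    def store(self, storage_time, name):
        while self.free_space() <= self.threshold and self.files:
            self.files.popleft()      # delete the earliest-written file
        self.files.append((storage_time, name))
```

Because files are appended in storage-time order, the deque's head is always the oldest file, so no search over the time marks is needed in this sketch.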
Fig. 7 shows a workflow of code stream adaptation of the monitoring device, as shown in fig. 7, including:
s701: the number of image information collecting devices varies.
The code stream adaptation module 5012 running on the Soc501 detects the number of image acquisition devices in the system directory/dev of the mobile baseband workstation 101 to determine whether the number of image acquisition devices is increased, if so, S702 is executed, and if not, the process is ended.
S702: and configuring the working parameters of the image acquisition equipment.
The code stream adaptation module 5012 running on the Soc501 drives the newly accessed image acquisition device according to the path pointed to by the directory /dev, and obtains the fixed parameters and identifier parameters of the newly accessed image acquisition device. The code stream adaptation module 5012 obtains the working parameters A of the image acquisition device with the closest priority, according to the priority marked by the identifier parameter. The code stream adaptation module 5012 configures the working parameters of the newly accessed image acquisition device according to its fixed parameters and the reference working parameters A. Step S703 is executed.
S703: the total code rate of the image acquisition equipment is less than or equal to the compression coding capacity of Soc 501.
The code stream adaptation module 5012 obtains the working parameters of the image acquisition equipment connected to the mobile baseband workstation 101, and if the total code rate of the image acquisition equipment is less than or equal to the compression coding capacity of the data processing module 5011 running on the Soc501, step S704 is executed; if the total code rate of the image capturing device is greater than the compression coding capability of the data processing module 5011 running on the Soc501, step S705 is executed.
S704: and configuring the working parameters and the priority of the image acquisition equipment.
The code stream adaptation module 5012 running on the Soc501 obtains identifier parameters, working parameters, and fixed parameters of all image acquisition devices connected to the mobile baseband workstation 101, and configures the working parameters of the image acquisition devices. Step S703 is executed.
The configuration process comprises the following steps: the code stream adaptation module 5012 changes the identifier parameter of the image acquisition device by configuring the working parameter of the image acquisition device with the lowest priority and after configuring the working parameter, and specifically includes:
the code stream adaptation module 5012 configures the image acquisition device with the resolution or the frame number and other working parameters higher than those of the previous working parameters, and reduces the identifier parameters of the image acquisition device. The code stream adaptation module 5012 configures the working parameters of the image acquisition device, such as resolution or frame number, which are lower than the previous working parameters, and improves the identifier parameters of the image acquisition device. If the code stream adaptation module 5012 adjusts the resolution and the frame number at the same time, and the code rate of the image acquisition equipment after adjustment is lower than that before, the identifier parameter of the image acquisition equipment is improved; and reducing the identifier parameter of the image acquisition equipment when the code rate of the image acquisition equipment is higher than that before after the adjustment. The code stream adaptation module 5012 reduces the device identifier parameter if the resolution fails to improve the working parameter of the image acquisition device. The code stream adaptation module 5012 increases the device identifier parameter if the failure to reduce the working parameter of the image acquisition device is reduced. The identifier parameter represents a priority between the inside of the image capturing device.
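The adaptation loop of steps S703/S704 can be sketched as follows, modeling "lowering the resolution or frame rate" as halving the device's code rate and modeling priority as a number where a larger value is preferred; this is an illustration of the idea, not the patent's exact rule set:

```python
def adapt_code_rates(devices, capacity):
    # While the summed code rate exceeds the encoder capacity, lower the
    # working parameters of the lowest-priority device (modeled as halving
    # its code rate) and raise its priority so it is not picked again first.
    # Each device is a dict with "rate" and "priority"; the field names
    # are illustrative, not from the patent.
    while sum(d["rate"] for d in devices) > capacity:
        victim = min(devices, key=lambda d: d["priority"])
        victim["rate"] //= 2        # lower resolution / frame rate
        victim["priority"] += 1     # raise its priority after reconfiguration
        if victim["rate"] == 0:     # cannot be reduced any further
            break
    return devices
```

The loop re-checks the total after every adjustment, mirroring how the flow in fig. 7 returns to step S703 after each configuration pass.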
S705: the sum of the USB camera code rates is less than or equal to the maximum rate of the USB interface 301.
The code stream adaptation module 5012 acquires the working parameters of the USB cameras connected to the mobile baseband workstation 101. If the total code rate of the USB cameras is less than or equal to the maximum rate of the USB interface 301, the process ends; if the total code rate of the USB cameras is greater than the maximum rate of the USB interface 301, step S706 is executed.
S706: and configuring working parameters and priority of the USB camera.
The code stream adaptation module 5012 running on the Soc501 obtains the identifier parameters, the working parameters, and the fixed parameters of all the USB cameras connected to the mobile baseband workstation 101. The code stream adaptation module 5012 reconfigures the working parameters of the USB camera according to the identifier parameters so that the total code rate of the image acquisition device is not greater than the maximum rate supported by the USB interface 301 on the Soc 501.
The configuration process is as follows: the code stream adaptation module 5012 configures the working parameters of the USB camera with the lowest priority and, after configuring the working parameters, changes the identifier parameter of that device.
For adjusting the working parameters and identifier parameters of the USB cameras, the code stream adaptation module 5012 proceeds as in step S704.
Fig. 8 shows another workflow of code stream adaptation of the monitoring device, as shown in fig. 8, including:
s801: the number of image information collecting devices varies.
The code stream adaptation module 5012 running on the Soc501 detects the number of image capturing devices in the system directory/dev of the mobile baseband workstation 101 to determine whether the number of image capturing devices is reduced, if so, step S802 is executed, and if not, the process is ended.
S802: and configuring the working parameters and the priority of the image acquisition equipment.
The code stream adaptation module 5012 running on the Soc501 obtains the fixed parameters, working parameters, and identifier parameters of the image acquisition devices. The code stream adaptation module 5012 configures the working parameters of the image acquisition device with the highest priority according to the identifier parameters. Step S803 is executed.
The code stream adaptation module 5012 configures the working parameters and the identifier parameters of the image capturing device as in step S704.
S803: the total code rate of the image acquisition equipment is less than or equal to the Soc501 compression coding capacity.
The code stream adaptation module 5012 running on the Soc501 determines whether the total code rate of the image acquisition device is less than or equal to the compression coding capability of the data processing module 5011 running on the Soc 501. If yes, go to step S804; if the determination result is negative, step S802 is executed.
S804: the total code rate of the USB camera is less than or equal to the maximum rate of the USB interface.
The code stream adaptation module 5012 running on the Soc501 determines whether the total code rate of the USB cameras is less than or equal to the maximum rate supported by the USB interface 301. If the judgment result is yes, the process ends; if the judgment result is negative, step S805 is executed.
S805: and configuring the working parameters and the priority of the USB camera.
The configuration method is as step S706.
In the embodiment shown in fig. 2 or fig. 3, the mobile baseband workstation 101 configures the working parameters of at least three image acquisition devices through the code stream adaptation module 5012 running on the Soc501, so that the image acquisition devices can effectively and comprehensively cover the monitored area in a targeted way.
If the network transmission device 503 running on the Soc501 in the mobile baseband workstation 101 receives a pull request from the network side, the Soc501 establishes a sub-thread to write the data encoded and synthesized by the data processing module 5011 into the head of the data buffer queue or write the video data stored in the memory 502 into the head of the data buffer queue.
The network transmission device 503 running on the mobile baseband workstation 101 may send the synthesized video data to the user terminal through the internet in a frame-dropping transmission mode, or in a frame-spaced transmission mode.
Fig. 9 shows a data interaction flow between the monitoring device and the network side, as shown in fig. 9,
s901: the queue is full.
The network transmission device 503 on the mobile baseband workstation 101 receives the video data synthesized by the data processing module 5011 running on the Soc501, or the video data stored in the memory 502. The network transmission device 503 establishes a data buffer queue; if the data buffer queue is full, step S902 is executed; if the data buffer queue is not full, step S907 is executed.
S902: whether the last time to full queue is greater than a threshold time.
The network transmission device 503 on the mobile baseband workstation 101 records the time at which the data buffer queue becomes full. The network transmission device 503 calculates the interval of the current queue-full event from the current queue-full time and the time identifier. If the interval is greater than the configured threshold time, step S903 is executed; if the interval is less than or equal to the configured threshold time, step S904 is executed.
The time identifier is a record of the most recent time at which the queue became full.
S903: and updating the time identification and deleting partial frame data.
The network transmission device 503 on the mobile baseband workstation 101 updates the time identifier and deletes the data of the partial frame. Step S906 is performed.
The rule for deleting partial frame data is as follows: the network transmission device 503 divides the buffered data queue into O segments and, at a fixed position in each segment, selects one, several, or all of the frames for deletion. When the length of the data queue is not divisible by O, the queue is temporarily padded until its length is divisible by O, and the queue is restored after the frame data are deleted. O is an integer greater than or equal to 1.
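The segment-based deletion rule can be sketched as follows, dropping a fixed number of frames at the head of each of the O segments and padding the queue to a multiple of O as described (function and parameter names are illustrative):

```python
def delete_partial_frames(queue, o_segments, per_segment=1):
    # Split the buffered queue into O equal segments and drop a fixed
    # number of frames at a fixed position (here, the head) of each one.
    # The queue is padded to a multiple of O with placeholders and the
    # padding is stripped again afterwards.
    pad = (-len(queue)) % o_segments
    padded = queue + [None] * pad
    seg_len = len(padded) // o_segments
    kept = []
    for s in range(o_segments):
        segment = padded[s * seg_len:(s + 1) * seg_len]
        kept.extend(segment[per_segment:])        # drop frames at the segment head
    return [f for f in kept if f is not None]     # restore the unpadded queue
```

Spreading the deletions evenly across the queue thins the stream uniformly instead of dropping one contiguous run of frames.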
S904: the threshold time can be less than Q full.
The network transmission device 503 on the mobile baseband workstation 101 determines, from the current queue-full time, the time identifier, and other parameters, that the current queue-full event is the R-th within the threshold time. If R is less than Q, step S905 is executed; if R is not less than Q, step S906 is executed.
S905: and deleting the partial frame data in the limited range.
The network transmission device 503 on the mobile baseband workstation 101 deletes a part of frame data within a limited range in the data buffer queue. Step S907 is performed.
The method of deleting partial frame data is as in step S903.
The limited range extends from the position in the data buffer queue of the data written when the queue last became full to the tail of the data buffer queue, as determined by the network transmission device 503 from parameters such as the current queue-full time and the time identifier.
S906: and deleting all frame data and updating the time identification.
The network transmission device 503 on the mobile baseband workstation 101 deletes all frame data in the buffer queue and updates the time identifier. Step S907 is performed.
S907: the data is written to the head of the queue.
The network transmission device 503 on the mobile baseband workstation 101 writes the video data into the head of the data buffer queue. The threshold time is a configurable time interval.
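The whole S901–S907 flow can be sketched as one function; the state dictionary, the choice of deletion function, and modeling the "limited range" as the newer half of the queue are all illustrative assumptions, not the patent's exact implementation:

```python
def enqueue_with_backpressure(state, frame, now, maxlen,
                              threshold_time, q_limit, delete_fn):
    # state holds the queue (head at index 0), the last queue-full time,
    # and a counter of queue-full events within the threshold time.
    q = state["queue"]
    if len(q) >= maxlen:                                  # S901: queue full
        if now - state["last_full"] > threshold_time:     # S902
            state["last_full"] = now                      # S903: update time mark
            state["full_count"] = 1
            state["queue"] = q = delete_fn(q)             # delete partial frames
        else:
            state["full_count"] += 1
            if state["full_count"] < q_limit:             # S904 -> S905
                tail = delete_fn(q[len(q) // 2:])         # limited range only
                state["queue"] = q = q[:len(q) // 2] + tail
            else:                                         # S906: flush everything
                state["queue"] = q = []
                state["last_full"] = now
    q.insert(0, frame)                                    # S907: write to head
    return state
```

Deleting only a limited range on repeated, closely spaced overflows keeps most recent data, while a long-idle queue that fills again is thinned across its whole length.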
Fig. 10 shows another data interaction flow of the monitoring device with the network side, as shown in fig. 10,
s1001: whether the queue is full.
A data buffer queue is arranged on the network transmission device 503 of the mobile baseband workstation 101, and if the data buffer queue is full, the step S1002 is executed; if the data buffer queue is not full, step S1003 is executed.
S1002: and transmitting every other frame.
The network transmission device 503 on the mobile baseband workstation 101 switches the data transmission mode to the frame-spaced transmission mode; if already in the frame-spaced transmission mode, it stops writing data into the data buffer queue and waits until transmission of the frame at the tail of the queue has finished. Step S1005 is performed.
The frame-spaced transmission mode is as follows: after a frame of data is successfully sent, that frame and the following T frames of data are deleted from the tail of the buffered data queue. T is a positive integer greater than or equal to 1.
S1003: Whether the queue is empty.
The network transmission device 503 on the mobile baseband workstation 101 checks whether its data buffer queue is empty. If the data buffer queue is not empty, step S1004 is executed; if the data buffer queue is empty, step S1005 is executed.
S1004: and continuously transmitting.
The network transmission device 503 on the mobile baseband workstation 101 switches the transmission mode to continuous transmission. Step S1002 is executed.
The continuous transmission mode is as follows: after a frame of data is successfully sent, that frame is deleted from the tail of the buffered data queue.
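The two transmission modes of Fig. 10 can be sketched as one function; frames are sent from the tail of the queue, and the function name and return shape are illustrative:

```python
def send_from_tail(queue, queue_full, t_skip=1):
    # In frame-spaced mode (queue full), after a frame is sent successfully
    # that frame plus the following T frames are deleted from the tail;
    # in continuous mode only the sent frame is removed.
    sent = queue[-1]                        # tail frame is transmitted first
    if queue_full:
        remaining = queue[:-(1 + t_skip)]   # drop the sent frame and T more
    else:
        remaining = queue[:-1]              # drop only the sent frame
    return sent, remaining
```

Skipping T frames per send drains a full queue faster than new frames arrive, at the cost of a lower effective frame rate at the receiver.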
S1005: the data is added.
The network transmission device 503 on the mobile baseband workstation 101 writes the video data into the head of the data buffer queue.
Preferably, when the mobile baseband workstation 101 receives a live streaming request from the network side, the interaction manner between the mobile baseband workstation 101 and the network side is as shown in fig. 9, and the data processing module 5011 serves as a data producer to produce data; when the mobile baseband workstation 101 receives a playback pull request from the network side, the interaction between the mobile baseband workstation 101 and the network side is as shown in fig. 10, and the memory 502 is used as a data producer to produce data, which is video data.
While the invention has been described in connection with what is presently considered to be the most practical and preferred embodiment, it is to be understood that the invention is not to be limited to the disclosed embodiment, but on the contrary, is intended to cover various modifications and equivalent arrangements included within the spirit and scope of the appended claims.

Claims (8)

1. A video acquisition method is characterized in that the method is applied to a mobile baseband workstation in a monitoring system, the monitoring system comprises the mobile baseband workstation and at least three image acquisition devices connected with the mobile baseband workstation, the at least three image acquisition devices comprise at least one MIPI camera and at least two USB cameras, the at least two USB cameras are connected to the mobile baseband workstation through a hub, and the MIPI cameras are in telecommunication connection with the mobile baseband workstation;
the method comprises the following steps:
the mobile baseband workstation acquires fixed parameters and identifier parameters of the at least three image acquisition devices; the fixed parameters comprise the resolution and the frame number preset by each image acquisition device;
the mobile baseband workstation configures working parameters of the at least three image acquisition devices according to the identifier parameters and the fixed parameters of the at least three image acquisition devices respectively; the working parameters are the resolution and the frame number adopted when each image acquisition device actually acquires the image information; the working parameters and the fixed parameters are different;
the mobile baseband workstation respectively receives at least three paths of image information acquired by the at least three image acquisition devices according to working parameters;
the mobile baseband workstation respectively performs compression coding on the at least three paths of image information;
the mobile baseband workstation configures the working parameters of the at least three image acquisition devices according to the identifier parameters and the fixed parameters of the at least three image acquisition devices, and specifically includes:
the mobile baseband workstation determining priorities of the at least three image acquisition devices based on the identifier parameters, the at least three image acquisition devices including a first image acquisition device having a lowest priority;
and under the condition that the code rates corresponding to the working parameters of the at least three image acquisition devices are greater than the compression coding capacity of the mobile baseband workstation, reducing the code rate of the images acquired by the first image acquisition device and improving the priority of the first image acquisition device by configuring the working parameters of the first image acquisition device.
2. The method of claim 1, wherein after the mobile baseband workstation compression encodes the at least three paths of image information, respectively, the method further comprises:
the mobile baseband workstation respectively synthesizes the information respectively compressed and coded into video data;
the mobile baseband workstation sends the video data to a user terminal through a local area network, and the mode for sending the video data is a frame loss sending mode; the frame loss sending mode comprises deleting the frame data and M frame data after the frame after the mobile baseband workstation successfully sends the frame data, wherein M is a positive integer greater than or equal to 1.
3. The method of claim 2, wherein the frame loss sending mode, comprising deleting a frame of data and the M frames of data following the frame after the frame of data is successfully sent by the mobile baseband workstation, wherein M is a positive integer not less than one, comprises:
the mobile baseband workstation is provided with a data buffer queue for sending the video data; the mobile baseband workstation is provided with a threshold time parameter; and the mobile baseband workstation sets the value of M according to the time interval between the completion of the data cache queue and the completion of the data cache queue last time, the completion times of the data cache queue in the threshold time and the threshold time parameter.
4. The method according to claim 1, wherein after the mobile baseband workstation receives at least three paths of image information collected by the at least three image collecting devices according to the operating parameters, the method further comprises:
the mobile baseband workstation synthesizes the compressed and coded information into video data;
the mobile baseband workstation sends the video data to a user terminal through a local area network, and the mode for sending the video data is a frame-spaced sending mode;
the alternate frame sending mode is as follows: the mobile baseband workstation is provided with a data buffer queue for sending the video data; when the data buffer queue of the mobile baseband workstation is full, after the mobile baseband workstation successfully sends a frame of data, deleting the frame of data and the N frames of data after the frame from the tail of the data buffer queue, wherein N is a preset positive integer greater than or equal to 1; and when the data buffer queue on the mobile baseband workstation is not full, deleting the frame of data from the tail of the data buffer queue after the mobile baseband workstation successfully sends the frame of data.
5. A mobile baseband workstation, comprising:
the mobile baseband workstation comprises: a USB interface supporting the OTG function, an MIPI interface, a memory, and a processor, which are connected through a bus;
the mobile baseband workstation is used for connecting at least three image acquisition devices, the at least three image acquisition devices include: the system comprises at least one MIPI camera and at least two USB cameras; the USB camera is connected to the USB interface supporting the OTG function through a concentrator; the MIPI camera is in telecommunication connection with the MIPI interface;
the processor is used for acquiring fixed parameters and identifier parameters of the at least three image acquisition devices; the fixed parameters comprise the resolution and the frame number preset by each image acquisition device;
the processor is used for configuring working parameters of each image acquisition device according to the identifier parameters and the fixed parameters of the at least three image acquisition devices respectively; the working parameters are the resolution and the frame number adopted when each image acquisition device actually acquires the image information; the working parameters and the fixed parameters are different;
the processor is used for respectively receiving at least three paths of image information acquired by the at least three image acquisition devices according to working parameters;
the processor is used for respectively carrying out video coding on the at least three paths of image information;
the processor is configured to configure the operating parameters of each image capturing device according to the identifier parameters and the fixed parameters of the at least three image capturing devices, and specifically includes: the processor is configured to determine priorities of the at least three image capturing devices, where the at least three image capturing devices include a first image capturing device, and the priority of the first image capturing device is the lowest; and under the condition that the code rates corresponding to the working parameters of the at least three image acquisition devices are greater than the compression coding capacity of the mobile baseband workstation, the code rate of the images acquired by the first image acquisition device is reduced and the priority of the first image acquisition device is improved by configuring the working parameters of the first image acquisition device.
6. Mobile baseband workstation according to claim 5,
the mobile baseband workstation also comprises a memory and a network transmission device;
the processor is used for synthesizing the compressed and coded information into video data;
the memory is used for storing the video data;
the network transmission device is used for transmitting the video data to a user terminal through a local area network; the mode of the network transmission device for transmitting the video data is a frame loss transmission mode;
the frame loss transmission mode comprises the following steps: and after the network transmission device successfully sends a frame of data, deleting the frame of data and M frames of data after the frame, wherein M is a positive integer greater than or equal to 1.
7. The mobile baseband workstation of claim 6, wherein said network transmission means deletes a frame of data and M frames of data following said frame after said frame of data is successfully transmitted, wherein said M is a positive integer greater than or equal to 1, comprising:
the network transmission device is provided with a buffer area, and the buffer area is used for sending the video data;
and the network transmission device sets the value of M according to the time interval between the buffer area fullness and the last buffer area fullness, the area fullness times of the buffer area in the threshold time and a preset threshold time parameter.
8. The mobile baseband workstation of claim 5, wherein
the mobile baseband workstation further comprises a memory and a network transmission device;
the processor is configured to synthesize the compression-encoded information into video data;
the memory is configured to store the video data;
the network transmission device is configured to transmit the video data to a user terminal over a local area network;
the network transmission device is provided with a buffer, and the buffer is used for sending the video data;
the network transmission device transmits the video data in an alternate-frame transmission mode; the alternate-frame transmission mode comprises:
when the buffer on the network transmission device is full, after the network transmission device successfully sends a frame of data, deleting that frame and the N frames following it, wherein N is a positive integer greater than or equal to 1; and when the buffer on the network transmission device is not full, the network transmission device deletes only the frame of data after that frame is successfully sent.
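The alternate-frame mode of claim 8 differs from claim 6 in that frames are dropped only while the send buffer is full. A minimal sketch, with the `send_frame` and `buffer_full` callbacks as hypothetical stand-ins for the device's actual interfaces:

```python
from collections import deque

def transmit_alternate(frames, send_frame, buffer_full, n=1):
    """Alternate-frame transmission sketch: delete N trailing
    frames after a successful send only when the buffer is full;
    otherwise delete just the frame that was sent."""
    queue = deque(frames)
    sent = []
    while queue:
        frame = queue.popleft()
        if not send_frame(frame):
            break                   # retry policy is outside the claim
        sent.append(frame)
        if buffer_full():           # buffer full: also delete next N frames
            for _ in range(n):
                if queue:
                    queue.popleft()
    return sent
```

So the stream stays complete under light load and degrades to every (N+1)-th frame only when the buffer is congested.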
CN202010942399.0A 2020-09-09 2020-09-09 Video acquisition method and mobile baseband workstation Active CN112104840B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010942399.0A CN112104840B (en) 2020-09-09 2020-09-09 Video acquisition method and mobile baseband workstation

Publications (2)

Publication Number Publication Date
CN112104840A CN112104840A (en) 2020-12-18
CN112104840B true CN112104840B (en) 2022-10-04

Family

ID=73752521

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010942399.0A Active CN112104840B (en) 2020-09-09 2020-09-09 Video acquisition method and mobile baseband workstation

Country Status (1)

Country Link
CN (1) CN112104840B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112988626B (en) * 2021-02-27 2021-11-16 深圳市数码龙电子有限公司 Method and system for realizing network camera parameter setting by USB bus communication

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103475902A (en) * 2013-09-06 2013-12-25 同观科技(深圳)有限公司 Video coding and network transmission method and video forwarding server
CN105992023A (en) * 2015-02-11 2016-10-05 杭州海康威视数字技术股份有限公司 Video image data processing method and apparatus thereof
CN106559632A (en) * 2015-09-30 2017-04-05 杭州萤石网络有限公司 A kind of storage method and device of multimedia file
CN108933917A (en) * 2017-05-22 2018-12-04 大唐移动通信设备有限公司 A kind of video retransmission method and device
CN109640056A (en) * 2018-12-27 2019-04-16 深圳市有方科技股份有限公司 A kind of USB camera monitoring system and its method based on Android platform
CN109660879A (en) * 2018-12-20 2019-04-19 广州虎牙信息科技有限公司 Frame losing method, system, computer equipment and storage medium is broadcast live
CN109714526A (en) * 2018-11-22 2019-05-03 中国科学院计算技术研究所 Intelligent video camera head and control system
CN111385463A (en) * 2018-12-29 2020-07-07 华为技术有限公司 Method for controlling camera, control device, network equipment and camera

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3882793B2 (en) * 2003-07-14 2007-02-21 セイコーエプソン株式会社 Output image adjustment of image data
JP4458923B2 (en) * 2004-05-11 2010-04-28 富士通マイクロエレクトロニクス株式会社 Image processing device
CN101702778B (en) * 2009-11-30 2011-04-13 公安部第一研究所 Network video encoder using PS encapsulation technology to carry OSD information
JP2014200058A (en) * 2013-03-11 2014-10-23 パナソニック株式会社 Electronic device
US9733715B2 (en) * 2013-03-15 2017-08-15 Leap Motion, Inc. Resource-responsive motion capture

Similar Documents

Publication Publication Date Title
KR100890236B1 (en) A method for capturing video data by utilizing a camera cell phone as a camera of a computer
CN109640056B (en) USB camera monitoring system and method based on Android platform
US20220318195A1 (en) Card Rendering Method and Electronic Device
CN102571726B (en) Method, system and the state judgment server that multi-medium data is shared
CN107484011B (en) Video resource decoding method and device
WO2022048474A1 (en) Method for multiple applications to share camera, and electronic device
CN102724396A (en) Wireless real-time display, control and cloud storage image pickup system based on WIFI (wireless fidelity)
KR102053689B1 (en) Compressing Method of image data for camera and Electronic Device supporting the same
CN103686328A (en) Method and device for adding camera to smart television
CN110198475B (en) Video processing method, device, equipment, server and readable storage medium
US9596435B2 (en) Distribution control apparatus, distribution control method, and computer program product
CN112104840B (en) Video acquisition method and mobile baseband workstation
US10535353B2 (en) Information processing system and information processing apparatus
US10468029B2 (en) Communication terminal, communication method, and computer program product
JP2014116805A (en) Imaging device, information processing device, control method therefor, and video processing system
US20050175036A1 (en) IP image transmission apparatus
CN116052701B (en) Audio processing method and electronic equipment
WO2022170866A1 (en) Data transmission method and apparatus, and storage medium
JPH0818622A (en) Information communication terminal
CN115550559A (en) Video picture display method, device, equipment and storage medium
WO2019000877A1 (en) Audio data processing method and device
CN113141480A (en) Screen recording method, device, equipment and storage medium
CN111131019B (en) Multiplexing method and terminal for multiple HTTP channels
CN117294690B (en) QoE evaluation method and electronic equipment
JP2006195807A (en) Image search system, image search method, and program

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant