Detailed Description
To make the objects, technical solutions and advantages of the present disclosure clearer, the present disclosure will be described in further detail with reference to the accompanying drawings, and it is apparent that the described embodiments are only a part of the embodiments of the present disclosure, rather than all embodiments. All other embodiments, which can be derived by one of ordinary skill in the art from the embodiments disclosed herein without making any creative effort, shall fall within the scope of protection of the present disclosure.
The terminology used in the embodiments of the present disclosure is for the purpose of describing particular embodiments only and is not intended to be limiting of the disclosure. As used in the disclosed embodiments and the appended claims, the singular forms "a", "an", and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise, and "a plurality" typically includes at least two.
It should be understood that the term "and/or" as used herein merely describes an association between associated objects and indicates that three relationships may exist; for example, "A and/or B" may mean: A exists alone, A and B exist simultaneously, or B exists alone. In addition, the character "/" herein generally indicates an "or" relationship between the associated objects before and after it.
It should be understood that although the terms first, second, third, etc. may be used in embodiments of the present disclosure to describe various elements, these elements should not be limited by these terms. These terms are used only to distinguish one element from another. For example, a first element could also be referred to as a second element and, similarly, a second element could also be referred to as a first element, without departing from the scope of embodiments of the present disclosure.
The word "if", as used herein, may be interpreted as "when" or "upon" or "in response to a determination" or "in response to a detection", depending on the context. Similarly, the phrases "if determined" or "if detected (a stated condition or event)" may be interpreted as "when determined" or "in response to a determination" or "when detected (a stated condition or event)" or "in response to detecting (a stated condition or event)", depending on the context.
It is also noted that the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that an article or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such article or apparatus. Without further limitation, an element defined by the phrase "comprising a ..." does not exclude the presence of other like elements in the article or device that comprises the element.
Alternative embodiments of the present disclosure are described in detail below with reference to the accompanying drawings.
Example 1
As shown in fig. 1, a schematic diagram of a hardware structure according to an embodiment of the present disclosure is shown, in which an android device, a USB hub device, a USB video capture card, and a multimedia playing device are connected in sequence. By improving the system of the android device, richer video content can be obtained by means of the external USB video capture card, so as to meet the requirements for different video content. The disclosure is not limited to this particular hardware connection structure; it should be understood that any hardware connection structure applicable to this embodiment is included, and for convenience of explanation, this embodiment is described by taking the above hardware structure as an example. In this embodiment, the device and the method are described uniformly in terms of the android system, but the disclosure is not limited to the android system; iOS and other intelligent operating systems are likewise covered by the present application. For convenience of explanation, the disclosure is explained using an android smart device.
The android device used in this embodiment may be any terminal device having an android operating system, and the terminal device has at least one camera, for example, a smart camera, a smart phone, a PAD, a computer, or the like. For simplicity, the following description may use a smart phone as an example.
The video capture card used in this embodiment may be any existing video capture card with a video processing function; richer video content can be obtained by connecting the video capture card to the smartphone.
In an embodiment provided by the disclosure, a method for controlling video capture content on an android device is provided. The method comprises the following steps:
step S202: the android device is connected with the video capture card through the hub device.
The android device comprises but is not limited to a smart camera, a smart phone, a PAD or a computer.
HUB devices include, but are not limited to, USB HUBs, which are devices that can extend a USB interface to multiple interfaces and allow the interfaces to be used simultaneously. USB HUBs are classified into USB2.0 HUBs, USB3.0 HUBs, and USB3.1 HUBs according to the USB protocol.
A video capture card is used to input video signal data, or mixed video and audio data, output by smartphones, analog cameras, video recorders, LD video disc players, televisions, and the like into a computer, convert it into digital data that the computer can recognize, store it in the computer, and convert it into a video data file that can be edited and processed. The video capture card used in this embodiment includes, but is not limited to, a 1394 capture card, a USB capture card, an HDMI capture card, a VGA video capture card, a PCI video capture card, and a PCI-E video capture card; a USB capture card is preferred.
Optionally, after the android device is connected to the video capture card through the hub device, the method further includes: connecting the video capture card with a multimedia player, the multimedia player being used for playing the captured video content in real time.
The multimedia player includes any hardware or software device capable of playing video and/or audio files. By being connected to the video capture card, the multimedia player can play the multimedia files acquired by the video capture card.
Optionally, the hub device is a USB hub device, and the USB hub device is connected to the android device through a USB interface.
Optionally, the video capture card is a USB video capture card, and the USB hub device is connected to the USB video capture card through a USB interface.
The USB connection is only a preferred mode, being the connection generally used by current interfaces; other improved connection interfaces, such as Type-C, may be used as hardware devices evolve.
Optionally, the USB video capture card is connected to the multimedia player through an HDMI interface, an AV interface, or a YCbCr interface.
HDMI (High Definition Multimedia Interface) is a fully digital video and audio transmission interface that can transmit uncompressed audio and video signals. The HDMI interface can carry audio and video signals simultaneously; because audio and video share the same cable, it greatly simplifies the wiring of the system.
The AV interface is a video interface consisting of three lines colored yellow, white, and red: the yellow line transmits video, while the white and red lines carry the left and right audio channels, respectively.
The YCbCr interface, YCbCr or Y'CbCr, is a color space commonly used in video processing and digital photography systems. Y' is the luma component of the color, while Cb and Cr are the blue-difference and red-difference chroma components. Y' and Y are different: Y is luminance, which represents light intensity and is linear, whereas Y' is the nonlinear, gamma-corrected encoding of it.
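As a rough illustration, a full-range BT.601-style RGB to Y'CbCr conversion (one common definition; the exact coefficients and ranges vary by standard and are not specified by this disclosure) can be sketched as:

```python
def rgb_to_ycbcr(r, g, b):
    """Convert full-range 8-bit RGB to Y'CbCr using BT.601-style
    luma coefficients (0.299, 0.587, 0.114); chroma is offset by 128."""
    y  =  0.299 * r + 0.587 * g + 0.114 * b
    cb = -0.168736 * r - 0.331264 * g + 0.5 * b + 128
    cr =  0.5 * r - 0.418688 * g - 0.081312 * b + 128
    return y, cb, cr
```

For example, pure white (255, 255, 255) maps to maximum luma with neutral chroma (Cb and Cr near 128), and pure black (0, 0, 0) to zero luma with neutral chroma.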
Optionally, after the android device is connected to the video capture card through the hub device, the method further includes the following steps:
step S203: add UVC support to the android system layer of the android device, so that the android device can acquire video content through the USB video capture card.
UVC, short for USB Video Class (also called the USB video device class), is a protocol standard for USB video capture devices defined by the USB Implementers Forum (USB-IF) and is one of the standard USB device classes. Where an android system is to provide UVC support, UVC support needs to be added to the kernel of the android system in advance, so that the android device can acquire video content through the USB video capture card.
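On a kernel with UVC support enabled, UVC devices typically appear as /dev/videoN nodes whose sysfs entries are bound to the uvcvideo driver. A minimal sketch of locating them is shown below; the helper names are illustrative, not part of any standard API:

```python
import os

def select_uvc_nodes(node_drivers):
    """Given a {node_name: driver_name} mapping, return the /dev paths
    of nodes bound to the uvcvideo driver (i.e., UVC capture devices)."""
    return ["/dev/" + n for n, d in sorted(node_drivers.items())
            if d == "uvcvideo"]

def find_uvc_nodes(sys_dir="/sys/class/video4linux"):
    """Scan sysfs for video4linux nodes and report which are UVC devices.
    Returns an empty list when the directory is absent (e.g., off-device)."""
    if not os.path.isdir(sys_dir):
        return []
    drivers = {}
    for entry in os.listdir(sys_dir):
        link = os.path.join(sys_dir, entry, "device", "driver")
        try:
            # The driver symlink's basename identifies the bound driver.
            drivers[entry] = os.path.basename(os.readlink(link))
        except OSError:
            continue
    return select_uvc_nodes(drivers)
```

For example, a mapping like {"video0": "msm_camera", "video2": "uvcvideo"} would yield ["/dev/video2"] as the capture card's node.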
Step S204: configure the device node number of the android device system driver layer so that the device node number matches the device number of the video capture card.
Optionally, configuring the device node number of the android device system driver layer includes: configuring the node number of the dev and/or video device of the android device system driver layer. For example, if the original default number of the front camera is 1 and the number of the rear camera is 0, these device numbers are replaced with that of the USB capture card, so that the video images acquired through the front and rear camera paths conform to the USB capture card.
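The renumbering described above can be sketched as a simple remapping. This is a schematic illustration only; in practice the change is made in the driver-layer configuration, not in application code, and the function name is hypothetical:

```python
def remap_camera_nodes(default_map, capture_node):
    """Point every logical camera (e.g., front/rear) at the capture
    card's /dev/videoN node number instead of its default node number."""
    return {camera: capture_node for camera in default_map}
```

For instance, with the defaults from the text (front = 1, rear = 0) and a capture card on node 3, both camera paths would resolve to node 3, so any app opening either camera receives the capture card's stream.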
Step S206: configure the capability parameters of the front camera and/or rear camera of the android device middle layer so that the capability parameters of the front camera and/or rear camera match the capability parameters of the video capture card.
The capability parameters of the front and/or rear camera include, but are not limited to, pixel size, color saturation, and the like; they are adapted to the capability of the USB capture card so as to avoid program exceptions caused by a mismatch with the configured capability of the USB capture card.
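The matching step amounts to intersecting the camera's configured capabilities with those of the capture card so the configuration never exceeds what the card supports. A minimal sketch under assumed parameter names (the keys and the helper itself are illustrative, not a real Android API):

```python
def match_capabilities(camera_caps, card_caps):
    """Clamp each camera capability to what the capture card supports:
    list-valued capabilities (e.g., resolutions) are intersected,
    scalar capabilities (e.g., saturation) are capped at the card's value."""
    matched = {}
    for key in camera_caps:
        if key not in card_caps:
            continue  # the card says nothing about this parameter
        cam, card = camera_caps[key], card_caps[key]
        if isinstance(cam, (list, set)):
            matched[key] = sorted(set(cam) & set(card))
        else:
            matched[key] = min(cam, card)
    return matched
```

For example, a camera offering 1920x1080 and 1280x720 against a card supporting 1280x720 and 640x480 would be configured to 1280x720, preventing the mismatch-induced exceptions mentioned above.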
Step S208: acquire video content conforming to the capability parameters of the video capture card through the front camera and/or rear camera of the android device.
After each hardware device is connected and configured as described above, video content can be obtained through the smartphone camera by running the camera app of the android smartphone. At this point, the image content acquired by the android device is the content acquired by the USB capture card, that is, the content played by the multimedia player, i.e., the content of the customized video source file.
By providing an additional USB video capture card and improving the existing android system, this method enables existing android devices to capture more distinctive video content, removes the limitation that existing hardware is constrained by the functions and parameters of its built-in camera, and provides a solution for obtaining high-quality video content in specific scenarios. The method completely controls the video capture content of the android device, replacing the content originally captured by the device's camera. Once the video content is controlled, effects such as adding special effects to the captured content, or changing part or all of the captured content, can be realized. The changed content facilitates later applications such as video encoding, debugging, and testing, and has considerable market value.
Example 2
As shown in fig. 1, a schematic diagram of a hardware structure according to an embodiment of the present disclosure is shown, in which an android device, a USB hub device, a USB video capture card, and a multimedia playing device are connected in sequence. By improving the system of the android device, richer video content can be obtained by means of the external USB video capture card, so as to meet the requirements for different video content. The disclosure is not limited to this particular hardware connection structure; it should be understood that any hardware connection structure applicable to this embodiment is included, and for convenience of explanation, this embodiment is described by taking the above hardware structure as an example. In this embodiment, the device and the method are described uniformly in terms of the android system, but the disclosure is not limited to the android system; iOS and other intelligent operating systems are likewise covered by the present application. For convenience of explanation, the disclosure is explained using an android smart device. This embodiment implements the method steps described in embodiment 1 through units with the same names and meanings, and achieves the same technical effects as embodiment 1; a repeated description is therefore omitted.
The android device used in this embodiment may be any terminal device having an android operating system, and the terminal device has at least one camera, for example, a smart camera, a smart phone, a PAD, a computer, or the like. For simplicity, the following description may use a smart phone as an example.
The video capture card used in this embodiment may be any existing video capture card with a video processing function; richer video content can be obtained by connecting the video capture card to the smartphone.
In the implementation provided by the disclosure, an apparatus for controlling video capture content on an android device is provided. The apparatus comprises the following units:
the video capture unit 302: configured to connect the android device with the video capture card through the hub device.
The android device comprises but is not limited to a smart camera, a smart phone, a PAD or a computer.
HUB devices include, but are not limited to, USB HUBs, which are devices that can extend a USB interface to multiple interfaces and allow the interfaces to be used simultaneously. USB HUBs are classified into USB2.0 HUBs, USB3.0 HUBs, and USB3.1 HUBs according to the USB protocol.
A video capture card is used to input video signal data, or mixed video and audio data, output by smartphones, analog cameras, video recorders, LD video disc players, televisions, and the like into a computer, convert it into digital data that the computer can recognize, store it in the computer, and convert it into a video data file that can be edited and processed. The video capture card used in this embodiment includes, but is not limited to, a 1394 capture card, a USB capture card, an HDMI capture card, a VGA video capture card, a PCI video capture card, and a PCI-E video capture card; a USB capture card is preferred.
Optionally, the apparatus further includes a video playing unit: configured to connect the video capture card with a multimedia player after the android device is connected to the video capture card through the hub device, the multimedia player being used for playing the captured video content in real time.
The multimedia player includes any hardware or software device capable of playing video and/or audio files. By being connected to the video capture card, the multimedia player can play the multimedia files acquired by the video capture card.
Optionally, the hub device is a USB hub device, and the USB hub device is connected to the android device through a USB interface.
Optionally, the video capture card is a USB video capture card, and the USB hub device is connected to the USB video capture card through a USB interface.
The USB connection is only a preferred mode, being the connection generally used by current interfaces; other improved connection interfaces, such as Type-C, may be used as hardware devices evolve.
Optionally, the USB video capture card is connected to the multimedia player through an HDMI interface, an AV interface, or a YCbCr interface.
HDMI (High Definition Multimedia Interface) is a fully digital video and audio transmission interface that can transmit uncompressed audio and video signals. The HDMI interface can carry audio and video signals simultaneously; because audio and video share the same cable, it greatly simplifies the wiring of the system.
The AV interface is a video interface consisting of three lines colored yellow, white, and red: the yellow line transmits video, while the white and red lines carry the left and right audio channels, respectively.
The YCbCr interface, YCbCr or Y'CbCr, is a color space commonly used in video processing and digital photography systems. Y' is the luma component of the color, while Cb and Cr are the blue-difference and red-difference chroma components. Y' and Y are different: Y is luminance, which represents light intensity and is linear, whereas Y' is the nonlinear, gamma-corrected encoding of it.
Optionally, the apparatus further comprises a support unit: configured to add UVC support to the android system layer of the android device, so that the android device can acquire video content through the USB video capture card.
UVC, short for USB Video Class (also called the USB video device class), is a protocol standard for USB video capture devices defined by the USB Implementers Forum (USB-IF) and is one of the standard USB device classes. Where an android system is to provide UVC support, UVC support needs to be added to the kernel of the android system in advance, so that the android device can acquire video content through the USB video capture card.
The first configuration unit 304: configured to configure the device node number of the android device system driver layer so that the device node number matches the device number of the video capture card.
Optionally, configuring the device node number of the android device system driver layer includes: configuring the node number of the dev and/or video device of the android device system driver layer. For example, if the original default number of the front camera is 1 and the number of the rear camera is 0, these device numbers are replaced with that of the USB capture card, so that the video images acquired through the front and rear camera paths conform to the USB capture card.
The second configuration unit 306: configured to configure the capability parameters of the front camera and/or rear camera of the android device middle layer so that they match the capability parameters of the video capture card.
The capability parameters of the front and/or rear camera include, but are not limited to, pixel size, color saturation, and the like; they are adapted to the capability of the USB capture card so as to avoid program exceptions caused by a mismatch with the configured capability of the USB capture card.
Video content unit 308: configured to acquire video content conforming to the capability parameters of the video capture card through the front camera and/or rear camera of the android device.
After each hardware device is connected and configured as described above, video content can be obtained through the smartphone camera by running the camera app of the android smartphone. At this point, the image content acquired by the android device is the content acquired by the USB capture card, that is, the content played by the multimedia player, i.e., the content of the customized video source file.
By providing an additional USB video capture card and improving the existing android system, this apparatus enables existing android devices to capture more distinctive video content, removes the limitation that original hardware is constrained by the functions and parameters of its built-in camera, and provides a solution for obtaining high-quality video content in specific scenarios. The apparatus completely controls the video capture content of the android device, replacing the content originally captured by the device's camera. Once the video content is controlled, effects such as adding special effects to the captured content, or changing part or all of the captured content, can be realized. The changed content facilitates later applications such as video encoding, debugging, and testing, and has considerable market value.
Example 3
As shown in fig. 4, this embodiment provides an electronic device for controlling video capture content on an android device, the electronic device comprising: at least one processor; and a memory communicatively coupled to the at least one processor; wherein
the memory stores instructions executable by the at least one processor to cause the at least one processor to perform the method steps of the above embodiments.
Example 4
The disclosed embodiments provide a non-volatile computer storage medium having stored thereon computer-executable instructions that may perform the method steps as described in the embodiments above.
Example 5
Referring now to FIG. 4, shown is a schematic diagram of an electronic device suitable for use in implementing embodiments of the present disclosure. The terminal device in the embodiments of the present disclosure may include, but is not limited to, a mobile terminal such as a mobile phone, a notebook computer, a digital broadcast receiver, a PDA (personal digital assistant), a PAD (tablet computer), a PMP (portable multimedia player), a vehicle terminal (e.g., a car navigation terminal), and the like, and a stationary terminal such as a digital TV, a desktop computer, and the like. The electronic device shown in fig. 4 is only an example, and should not bring any limitation to the functions and the scope of use of the embodiments of the present disclosure.
As shown in fig. 4, the electronic device may include a processing means (e.g., a central processing unit, a graphics processor, etc.) 401 that may perform various appropriate actions and processes according to a program stored in a Read Only Memory (ROM) 402 or a program loaded from a storage means 408 into a Random Access Memory (RAM) 403. In the RAM 403, various programs and data necessary for the operation of the electronic apparatus are also stored. The processing device 401, the ROM 402, and the RAM 403 are connected to each other via a bus 404. An input/output (I/O) interface 405 is also connected to bus 404.
Generally, the following devices may be connected to the I/O interface 405: input devices 406 including, for example, a touch screen, touch pad, keyboard, mouse, camera, microphone, accelerometer, gyroscope, etc.; an output device 407 including, for example, a Liquid Crystal Display (LCD), a speaker, a vibrator, and the like; storage 408 including, for example, tape, hard disk, etc.; and a communication device 409. The communication means 409 may allow the electronic device to communicate with other devices wirelessly or by wire to exchange data. While fig. 4 illustrates an electronic device having various means, it is to be understood that not all illustrated means are required to be implemented or provided. More or fewer devices may alternatively be implemented or provided.
In particular, according to an embodiment of the present disclosure, the processes described above with reference to the flowcharts may be implemented as computer software programs. For example, embodiments of the present disclosure include a computer program product comprising a computer program embodied on a computer readable medium, the computer program comprising program code for performing the method illustrated in the flow chart. In such an embodiment, the computer program may be downloaded and installed from a network via the communication device 409, or from the storage device 408, or from the ROM 402. The computer program performs the above-described functions defined in the methods of the embodiments of the present disclosure when executed by the processing device 401.
It should be noted that the computer readable medium in the present disclosure can be a computer readable signal medium or a computer readable storage medium or any combination of the two. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the present disclosure, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In contrast, in the present disclosure, a computer readable signal medium may comprise a propagated data signal with computer readable program code embodied therein, either in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: electrical wires, optical cables, RF (radio frequency), etc., or any suitable combination of the foregoing.
The computer readable medium may be embodied in the electronic device; or may exist separately without being assembled into the electronic device.
Computer program code for carrying out operations for aspects of the present disclosure may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, Smalltalk, C++, and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider).
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units described in the embodiments of the present disclosure may be implemented by software or hardware. In some cases, the name of a unit does not constitute a limitation on the unit itself.