CN221103455U - Live video acquisition system and electronic equipment - Google Patents

Live video acquisition system and electronic equipment

Info

Publication number
CN221103455U
CN221103455U (application number CN202322946047.4U)
Authority
CN
China
Prior art keywords
module
sensor module
camera lens
image sensor
soc
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202322946047.4U
Other languages
Chinese (zh)
Inventor
程文波
孟环宇
贾宇宁
卢士波
葛天杰
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hangzhou Xingxi Technology Co ltd
Original Assignee
Hangzhou Xingxi Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hangzhou Xingxi Technology Co ltd filed Critical Hangzhou Xingxi Technology Co ltd
Priority to CN202322946047.4U priority Critical patent/CN221103455U/en
Application granted granted Critical
Publication of CN221103455U publication Critical patent/CN221103455U/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Landscapes

  • Studio Devices (AREA)

Abstract

The application relates to a live video acquisition system and electronic equipment. The system comprises an SoC module, a lens control module, an image sensor module and a ToF sensor module. The SoC module is used for encoding and decoding the data acquired from the sensor modules; the lens control module is used for automatically adjusting the aperture, shutter and focus of the camera lens; the image sensor module is used for converting the optical signals acquired by the camera lens into electrical signals; and the ToF sensor module is used for collecting distance information and depth information in the live broadcast picture. The application addresses the problem of improving the acquisition quality of live video images: live data acquisition based on multiple sensors, such as the image sensor and the ToF sensor, is realized, the image quality during live broadcasting is effectively improved, and the lens control module enables automatic control of the camera lens parameters.

Description

Live video acquisition system and electronic equipment
Technical Field
The application relates to the field of live video equipment, in particular to a live video acquisition system and electronic equipment.
Background
With the development of internet media technology, viewers naturally expect to watch real-time program content with higher picture quality. For the anchor, video image acquisition in current multimedia live broadcasting faces a trade-off: using a professional single-lens reflex video camera brings high cost, complex debugging and similar problems, while using a mobile phone or a general-purpose live camera yields poor image quality.
At present, no effective solution has been proposed in the related art for improving the acquisition quality of live video images.
Disclosure of utility model
The application provides a live video acquisition system and electronic equipment, which at least solve the problem in the related art of how to improve the acquisition quality of live video images.
In a first aspect, an embodiment of the present application provides a live video acquisition system, where the system includes an SoC module, a lens control module, an image sensor module, and a ToF sensor module, where the SoC module is electrically connected to the lens control module, the image sensor module, and the ToF sensor module, respectively;
The SoC module is used for encoding and decoding the data acquired from the sensor modules;
The lens control module is used for automatically adjusting the aperture, shutter and focus of the camera lens;
The image sensor module is used for converting the optical signals acquired by the camera lens into electrical signals, and the ToF sensor module is used for acquiring distance information and depth information in the live broadcast picture.
In some of these embodiments, the SoC module has a built-in multi-core processor and a 4K 60fps ISP processor.
In some of these embodiments, the lens control module is adapted to M43-mount camera lenses.
In some embodiments, the image sensor module is an image sensor module with a target surface size of not less than 17.3 mm × 13 mm.
In some embodiments, based on the distance information and depth information acquired by the ToF sensor module, the SoC module adjusts the focus of the camera lens in real time through a preset algorithm.
In some of these embodiments, the SoC module is a HiSilicon 22AP70 SoC module.
In some embodiments, the lens control module is an STM32F103RCT6 chip-based lens control module.
In some of these embodiments, the image sensor module is a Sony IMX294 image sensor.
In some of these embodiments, the system further comprises a motion sensor module, a temperature sensor module, a power supply module and an audio and video output module;
The motion sensor module is used for acquiring pose and acceleration changes of the live video acquisition system in a live broadcast process in real time, and performing automatic anti-shake through an optical anti-shake algorithm;
the temperature sensor module is used for acquiring working temperatures of the SoC module and the image sensor module in real time and ensuring normal operation of the live video acquisition system; the power supply module is used for providing power supply for the live video acquisition system;
The audio and video output module is used for outputting video data after the encoding and decoding processing of the SoC module.
In a second aspect, an embodiment of the present application provides an electronic device comprising a memory, a processor and a computer program stored on the memory and executable on the processor; when executing the computer program, the processor drives the system according to the first aspect.
Compared with the related art, the live video acquisition system and the electronic device provided by the embodiments of the application comprise an SoC module, a lens control module, an image sensor module and a ToF sensor module. The SoC module is used for encoding and decoding the data acquired from the sensor modules; the lens control module is used for automatically adjusting the aperture, shutter and focus of the camera lens; the image sensor module is used for converting the optical signals acquired by the camera lens into electrical signals; and the ToF sensor module is used for collecting distance information and depth information in the live broadcast picture. Through this system, the problem of improving the acquisition quality of live video images is solved: live data acquisition based on multiple sensors, such as the image sensor and the ToF sensor, is realized, the image quality of the live broadcast process is effectively improved, and the lens control module realizes automatic control of the camera lens parameters.
Drawings
The accompanying drawings, which are included to provide a further understanding of the application and are incorporated in and constitute a part of this specification, illustrate embodiments of the application and together with the description serve to explain the application and do not constitute a limitation on the application. In the drawings:
fig. 1 is a block diagram of a live video acquisition system according to an embodiment of the present application;
Fig. 2 is a schematic diagram of a preferred architecture of a live video capture system in accordance with an embodiment of the present application;
Fig. 3 is a schematic diagram of an internal structure of an electronic device according to an embodiment of the present application.
Detailed Description
The present application will be described and illustrated with reference to the accompanying drawings and examples in order to make the objects, technical solutions and advantages of the present application more apparent. It should be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the scope of the application. All other embodiments, which can be made by a person of ordinary skill in the art based on the embodiments provided by the present application without making any inventive effort, are intended to fall within the scope of the present application.
It is apparent that the drawings in the following description are only some examples or embodiments of the present application, and those of ordinary skill in the art can apply the present application to other similar situations according to these drawings without inventive effort. Moreover, it should be appreciated that while such a development effort might be complex and lengthy, it would nevertheless be a routine undertaking of design, fabrication, or manufacture for those of ordinary skill having the benefit of this disclosure, and the disclosure should therefore not be regarded as insufficient.
Reference in the specification to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment may be included in at least one embodiment of the application. The appearances of such phrases in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. It is to be expressly and implicitly understood by those of ordinary skill in the art that the described embodiments of the application can be combined with other embodiments without conflict.
Unless defined otherwise, technical or scientific terms used herein should be given the ordinary meaning as understood by one of ordinary skill in the art to which this application belongs. The terms "a," "an," "the," and similar referents in the context of the application are not to be construed as limiting the quantity, but rather as singular or plural. The terms "comprising," "including," "having," and any variations thereof, are intended to cover a non-exclusive inclusion; for example, a process, method, system, article, or apparatus that comprises a list of steps or modules (elements) is not limited to only those steps or elements but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus. The terms "connected," "coupled," and the like in connection with the present application are not limited to physical or mechanical connections, but may include electrical connections, whether direct or indirect. The term "plurality" as used herein means two or more. "and/or" describes an association relationship of an association object, meaning that there may be three relationships, e.g., "a and/or B" may mean: a exists alone, A and B exist together, and B exists alone. The character "/" generally indicates that the context-dependent object is an "or" relationship. The terms "first," "second," "third," and the like, as used herein, are merely distinguishing between similar objects and not representing a particular ordering of objects.
The embodiment of the application provides a live video acquisition system, fig. 1 is a structural block diagram of the live video acquisition system according to the embodiment of the application, and as shown in fig. 1, the system comprises an SoC module 11, a lens control module 12, an image sensor module 13 and a ToF sensor module 14, wherein the SoC module 11 is electrically connected with the lens control module 12, the image sensor module 13 and the ToF sensor module 14 respectively;
The SoC module 11 is configured to perform encoding and decoding processing on the data acquired from the sensor module;
Specifically, the SoC module 11 has a built-in multi-core processor and a 4K 60fps ISP processor. It should be added that a multi-core processor is a processor integrating multiple CPU cores on one chip (such as a communication processor), which improves parallel computing and multitasking capability and increases the response speed and operating efficiency of the system. ISP is short for Image Signal Processor, a processor dedicated to processing image signals: it can perform operations such as noise reduction, white balance and color correction, auto-focus and auto-exposure on the raw image data collected from an image sensor (e.g., CMOS or CCD) to improve image quality and performance. ISP processors are commonly used in image acquisition and processing devices such as digital cameras, video cameras and monitoring devices.
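Purely as an illustration of one such ISP operation (and not the pipeline implemented by the SoC module 11), the following C sketch applies a simple gray-world white balance to a planar RGB frame; the Frame structure and the 16-bit clamping are assumptions made for the example.

```c
#include <stdint.h>
#include <stddef.h>

/* Minimal gray-world white balance: scale R and B so their averages match
 * the green average. Illustrative only; a real ISP operates on raw Bayer
 * data and applies many more correction stages. */
typedef struct {
    uint16_t *r, *g, *b;   /* planar channels */
    size_t    pixels;      /* number of pixels per channel */
} Frame;

void gray_world_awb(Frame *f)
{
    uint64_t sum_r = 0, sum_g = 0, sum_b = 0;
    for (size_t i = 0; i < f->pixels; i++) {
        sum_r += f->r[i];
        sum_g += f->g[i];
        sum_b += f->b[i];
    }
    if (sum_r == 0 || sum_b == 0)
        return;                               /* degenerate frame, skip */
    double gain_r = (double)sum_g / (double)sum_r;
    double gain_b = (double)sum_g / (double)sum_b;
    for (size_t i = 0; i < f->pixels; i++) {
        double r = f->r[i] * gain_r;
        double b = f->b[i] * gain_b;
        f->r[i] = (uint16_t)(r > 65535.0 ? 65535.0 : r);  /* clamp to 16 bit */
        f->b[i] = (uint16_t)(b > 65535.0 ? 65535.0 : b);
    }
}
```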
Preferably, the SoC module 11 is a HiSilicon 22AP70 SoC module. The present embodiment does not limit the specific model of the SoC module 11; any SoC module with a built-in multi-core processor and a 4K 60fps ISP processor may be used as the SoC module 11 in the present embodiment.
It should be noted that fig. 2 is a schematic diagram of a preferred structure of a live video acquisition system according to an embodiment of the present application. As shown in fig. 2, a HiSilicon 22AP70 SoC module is used as the main control chip of the system. It supports up to four sensor inputs and ISP image processing of up to 4K 60FPS, and supports multiple image enhancement and processing algorithms such as 3F WDR, multi-stage noise reduction, six-axis anti-shake and hardware stitching, providing the user with excellent image processing capability. Its encoding and decoding performance is significantly improved over existing live video encoders on the market and can effectively improve the quality of video images acquired in a live scene.
The lens control module 12 is configured to automatically adjust the aperture of the camera lens, the shutter of the camera lens and the focus of the camera lens. It should be added that the aperture and shutter of the camera lens affect the exposure of the camera: the SoC module 11 calculates appropriate exposure parameters according to factors such as brightness, contrast and color of the live scene, and the lens control module 12 automatically adjusts the aperture size and shutter speed of the camera lens based on these exposure parameters. Similarly, the lens control module 12 automatically adjusts the focus of the camera lens based on the focus point and focal plane calculated by the SoC module 11, so that the live subject is in sharp focus.
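This division of labor can be pictured with a minimal C sketch of one auto-exposure step on the SoC side: measure the mean luminance of the current frame, nudge shutter and aperture toward a mid-gray target, and hand the result to the lens controller. The target value, step sizes and parameter limits below are illustrative assumptions, not the preset algorithm of this embodiment.

```c
/* Illustrative auto-exposure step. Values and step sizes are assumptions. */
typedef struct {
    double shutter_s;   /* exposure time in seconds           */
    double f_number;    /* aperture as an f-number, e.g. 2.8  */
} ExposureParams;

void auto_exposure_step(double mean_luma /* 0..255 */, ExposureParams *p)
{
    const double target = 118.0;            /* roughly mid-gray */
    double error = target - mean_luma;

    if (error > 10.0) {                      /* too dark: expose more */
        if (p->shutter_s < 1.0 / 60.0)
            p->shutter_s *= 1.25;            /* lengthen shutter first */
        else if (p->f_number > 1.8)
            p->f_number /= 1.12;             /* then open aperture ~1/3 stop */
    } else if (error < -10.0) {              /* too bright: expose less */
        if (p->f_number < 16.0)
            p->f_number *= 1.12;             /* close aperture ~1/3 stop */
        else
            p->shutter_s /= 1.25;            /* then shorten shutter */
    }
    /* The resulting parameters would then be sent to the lens control
     * module, which drives the physical aperture and shutter. */
}
```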
Specifically, the lens control module 12 is adapted to M43-mount camera lenses.
Preferably, the lens control module 12 is an STM32F103RCT6-based lens control module. The present embodiment does not limit the specific model of the lens control module 12; any lens control module that can be adapted to M43-mount camera lenses may be used as the lens control module 12 in the present embodiment.
It should be noted that, as shown in fig. 2, the lens control module based on the ST chip (STM32F103RCT6) in the present embodiment can be adapted to M43-mount camera lenses on the market, so as to implement aperture, shutter and automatic focus control of the adapted camera lens. The STM32F103 performance line integrates a high-performance Arm Cortex-M3 32-bit RISC core operating at 72 MHz, high-speed embedded memories (Flash up to 512 KB and SRAM up to 64 KB), and a wide range of enhanced I/Os and peripherals connected to two APB buses. All devices offer three 12-bit ADCs, four general-purpose 16-bit timers plus two PWM timers, as well as standard and advanced communication interfaces (two I2C, three SPI, two I2S, one SDIO, five USARTs, one USB and one CAN).
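To make the control path concrete, the following C sketch frames a lens command for transmission over a serial link between the SoC and the STM32-based lens controller. The opcodes, frame layout, checksum and the uart_send_bytes() helper are all hypothetical assumptions for illustration; they do not represent a documented lens-control protocol of this system.

```c
#include <stdint.h>

/* Hypothetical command frame from the SoC to the lens controller. */
enum LensOpcode {
    LENS_SET_APERTURE = 0x01,   /* value: f-number * 10  */
    LENS_SET_SHUTTER  = 0x02,   /* value: microseconds   */
    LENS_SET_FOCUS    = 0x03    /* value: motor steps    */
};

/* Assumed to be provided by the board support package. */
void uart_send_bytes(const uint8_t *buf, uint32_t len);

void lens_send_command(uint8_t opcode, uint32_t value)
{
    uint8_t frame[7];
    frame[0] = 0xA5;                      /* start-of-frame marker */
    frame[1] = opcode;
    frame[2] = (uint8_t)(value >> 24);    /* value, big-endian */
    frame[3] = (uint8_t)(value >> 16);
    frame[4] = (uint8_t)(value >> 8);
    frame[5] = (uint8_t)(value);
    frame[6] = (uint8_t)(frame[1] ^ frame[2] ^ frame[3] ^
                         frame[4] ^ frame[5]);   /* XOR checksum */
    uart_send_bytes(frame, sizeof(frame));
}
```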
The image sensor module 13 is configured to convert an optical signal collected by the camera lens into an electrical signal.
Specifically, the image sensor module 13 is an image sensor module with an SLR-class large target surface. It should be added that single-lens reflex cameras generally include medium-format, full-frame, APS-format and M43-format cameras: the target surface of a medium-format camera is 4 cm × 6 cm or 6 cm × 6 cm, that of a full-frame camera is 36 mm × 24 mm, that of an APS-format camera is 28.7 mm × 19 mm or 22.5 mm × 15 mm, and that of an M43 camera is 17.3 mm × 13 mm. An SLR-class large target surface may therefore refer to a target surface with a size of not less than 17.3 mm × 13 mm.
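These figures can be cross-checked with a short calculation (assuming the 17.3 mm × 13 mm M43 target surface stated above): the diagonal comes out close to the 21.63 mm diagonal quoted for the IMX294 sensor below.

```latex
d = \sqrt{(17.3\,\text{mm})^2 + (13\,\text{mm})^2}
  = \sqrt{299.29 + 169}\;\text{mm}
  = \sqrt{468.29}\;\text{mm}
  \approx 21.6\;\text{mm}
```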
Preferably, the image sensor module 13 is a Sony IMX294 image sensor. The present embodiment does not limit the specific model of the image sensor module 13; any image sensor module with an SLR-class large target surface may be used as the image sensor module 13 in the present embodiment.
It should be noted that, as shown in fig. 2, the IMX294CJK-C image sensor is a diagonal 21.63 mm (type 4/3) CMOS image sensor with a color square pixel array and approximately 10.71M effective pixels. Its 12-bit digital output allows it to output approximately 9.07M effective pixels (at an aspect ratio of about 17:9), and 12-bit digital output for shooting high-speed moving images is realized through horizontal/vertical addition and sub-sampling. It uses a triple power supply of 2.9 V (analog), 1.2 V (digital) and 1.8 V (I/O interface), which adapts it to a variety of power supply designs and enables low power consumption. In addition, the sensor has an electronic shutter function with high sensitivity, low dark current and variable storage time. In a live broadcast scene it can ensure the transparency, sensitivity, fineness and clarity of the overall image, realize the conversion of the optical signals collected from the lens into electrical signals, and, combined with related algorithms, achieve the image quality of a single-lens reflex camera.
The ToF sensor module 14 is configured to collect distance information and depth information in the live view.
Specifically, based on the distance information and depth information acquired by the ToF sensor module 14, the SoC module 11 adjusts the focus of the camera lens in real time through a preset algorithm. It should be added that, based on the collected depth information (the three-dimensional coordinates of the live subject in space), the SoC module 11 can calculate the focus point of the live subject in the image through an existing optical algorithm, and then obtain the focal plane (the plane of the live subject) based on the collected distance information; the lens control module 12 automatically adjusts the focus of the camera lens based on the focus point and focal plane.
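A minimal C sketch of how ToF depth data might be turned into a focus setting is given below: take the median depth over the subject's region of interest and map it to a focus motor position through a calibration table. Both the median/ROI handling and the calibration table are illustrative assumptions, not the preset algorithm referred to above.

```c
#include <stdint.h>
#include <stdlib.h>

/* Comparator for qsort over 16-bit depth samples. */
static int cmp_u16(const void *a, const void *b)
{
    return (int)(*(const uint16_t *)a) - (int)(*(const uint16_t *)b);
}

/* Median depth (mm) over the subject ROI; sorts the buffer in place. */
uint16_t median_depth_mm(uint16_t *roi_depths, size_t n)
{
    qsort(roi_depths, n, sizeof(uint16_t), cmp_u16);
    return roi_depths[n / 2];
}

/* Hypothetical calibration table: subject distance (mm) -> focus steps. */
typedef struct { uint16_t dist_mm; uint16_t focus_steps; } FocusCalPoint;

uint16_t depth_to_focus_steps(uint16_t dist_mm,
                              const FocusCalPoint *cal, size_t n)
{
    if (dist_mm <= cal[0].dist_mm)
        return cal[0].focus_steps;
    for (size_t i = 1; i < n; i++) {
        if (dist_mm <= cal[i].dist_mm) {
            /* linear interpolation between adjacent calibration points */
            int64_t span  = cal[i].dist_mm - cal[i - 1].dist_mm;
            int64_t off   = dist_mm - cal[i - 1].dist_mm;
            int64_t dstep = (int64_t)cal[i].focus_steps - cal[i - 1].focus_steps;
            return (uint16_t)(cal[i - 1].focus_steps + dstep * off / span);
        }
    }
    return cal[n - 1].focus_steps;
}
```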
It should be noted that the ToF sensor module 14 has the characteristics of a wide field of view, a high pixel count, high precision and a long acquisition distance, and its 940 nm band light source gives it strong immunity to ambient light. As shown in fig. 2, the distance and depth information of the detected subject can be transmitted to the main chip (SoC module 11) in real time, and the lens is adjusted in real time by an algorithm so that the camera focuses accurately and quickly.
In the video acquisition system provided by the embodiment of the application, the HiSilicon 22AP70 SoC module has a built-in quad-core A55 processor and a 4K 60FPS ISP processor, and its encoding and decoding performance is significantly improved over existing live broadcast encoders on the market. Based on the conversion of optical signals into electrical signals by the Sony IMX294 image sensor, scene depth and distance information are collected by the ToF chip, and an image algorithm and a professional live-broadcast-room effect are customized for live scenes, realizing accurate, fast 4K image output dedicated to live broadcasting. The lens control module based on the STM32F103RCT6 chip supports interchangeable M43-mount camera lenses. This addresses the pain points of the live broadcast industry: different live scenes require different equipment, professional cameras are expensive, live cameras are difficult to debug, and ordinary devices give poor image quality.
In some of these embodiments, the system further includes an auxiliary sensor module, a power supply module, and an audio video output module.
The auxiliary sensor module is used for assisting the automatic anti-shake and normal operation of the live video acquisition system;
Specifically, the auxiliary sensor module comprises a motion sensor module and a temperature sensor module. The motion sensor module is used for acquiring in real time the pose and acceleration changes of the live video acquisition system during a live broadcast and performing automatic anti-shake through an optical anti-shake algorithm. The temperature sensor module is used for acquiring in real time the working temperatures of the SoC module and the image sensor module and ensuring normal operation of the live video acquisition system. It should be added that when the collected working temperature of the SoC module or the image sensor module is higher than a preset temperature (e.g., 50 °C), the working frequency of the SoC module or the image sensor module can be actively reduced to reduce heat generation; if the system includes a heat dissipation module, the heat dissipation module can be started to increase heat dissipation.
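The thermal behavior just described can be sketched in a few lines of C; the 50 °C threshold follows the example above, while the frequency values and the platform functions soc_set_frequency_mhz() and fan_enable() are assumptions standing in for platform-specific drivers.

```c
#include <stdbool.h>

#define TEMP_THRESHOLD_C   50.0f
#define NORMAL_FREQ_MHZ    1200
#define THROTTLED_FREQ_MHZ 800

/* Assumed platform-specific drivers. */
void soc_set_frequency_mhz(int mhz);
void fan_enable(bool on);

/* One step of the illustrative thermal policy: throttle above the preset
 * temperature and, if a heat dissipation module exists, switch it on. */
void thermal_policy_step(float soc_temp_c, float sensor_temp_c, bool has_fan)
{
    bool overheated = (soc_temp_c > TEMP_THRESHOLD_C) ||
                      (sensor_temp_c > TEMP_THRESHOLD_C);

    soc_set_frequency_mhz(overheated ? THROTTLED_FREQ_MHZ : NORMAL_FREQ_MHZ);
    if (has_fan)
        fan_enable(overheated);
}
```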
Preferably, as shown in fig. 2, the motion sensor module is a 6-axis MEMS motion tracking device combining a 3-axis gyroscope and a 3-axis accelerometer; it provides real-time motion data for the whole device to cooperate with the optical anti-shake algorithm, realizing automatic anti-shake during shooting. The temperature sensor module is a chip-type NTC thermistor with excellent solderability, high stability and high precision in the application environment; it can collect in real time the working temperatures of the main chip, the image sensor and the storage chip, ensuring that the whole device works normally.
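As a minimal sketch of how gyroscope data can feed a shake correction, the C function below integrates the angular rate into an angle and maps it to an image-shift correction using the effective focal length. The small-angle model and the function signature are illustrative assumptions; a real optical anti-shake algorithm also fuses accelerometer data and filters out intentional panning, which is omitted here.

```c
#include <math.h>

typedef struct {
    double angle_x_rad;   /* accumulated shake around the x (pitch) axis */
    double angle_y_rad;   /* accumulated shake around the y (yaw) axis   */
} ShakeState;

/* Integrate gyro rates over one sample interval and output the pixel
 * correction that cancels the accumulated shake. */
void antishake_update(ShakeState *s,
                      double gyro_x_rad_s, double gyro_y_rad_s,
                      double dt_s,
                      double focal_len_px,
                      double *corr_x_px, double *corr_y_px)
{
    /* integrate angular rate to angle */
    s->angle_x_rad += gyro_x_rad_s * dt_s;
    s->angle_y_rad += gyro_y_rad_s * dt_s;

    /* map rotation to image shift and apply the opposite sign to cancel it */
    *corr_x_px = -focal_len_px * tan(s->angle_y_rad);
    *corr_y_px = -focal_len_px * tan(s->angle_x_rad);
}
```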
The power supply module is used for providing power supply for the live video acquisition system;
Preferably, as shown in fig. 2, the external interface of the power supply module is USB Type-C, with a built-in overvoltage protection chip and support for the DC 9V/12V fast-charging protocol, providing a safe, stable and continuously reliable power supply for the whole hardware.
And the audio and video output module is used for outputting video data after the encoding and decoding processing of the SoC module.
Preferably, as shown in fig. 2, the audio and video output module transmits the real-time video data processed by the SoC module outward through a built-in USB Type-C 3.0 interface and an HDMI interface; that is, 4K 60fps ultra-high-definition live video pictures can be displayed in real time by connecting to the USB Type-C 3.0 interface and the HDMI OUT interface.
In some of these embodiments, as shown in fig. 2, the system further comprises:
Optical filter and camera lens module: two optical filters are used to filter out the influence of external light, addressing the problems of invisible light, moiré and image quality degradation. The camera lens module can be detached and replaced; it acquires environmental image information for the image sensor in real time, and the image sensor converts the optical signals into electrical signals to complete image acquisition.
Serial port module: two serial ports are provided inside the device; the built-in serial ports are used for system debugging.
DDR and eMMC memory modules: used to store the preset kernel program and firmware.
Live broadcast customization algorithm module: adopts a live-broadcast-customized image algorithm and a parameter-tuning app, meeting the customized working requirements of live broadcast scenes.
The above-described respective modules may be functional modules or program modules, and may be implemented by software or hardware. For modules implemented in hardware, the various modules described above may be located in the same processor; or the above modules may be located in different processors in any combination.
The present embodiment also provides an electronic device comprising a memory having stored therein a computer program and a processor arranged to run the computer program to perform the steps of any of the method embodiments described above.
Optionally, the electronic apparatus may further include a transmission device and an input/output device, where the transmission device is connected to the processor, and the input/output device is connected to the processor.
It should be noted that, specific examples in this embodiment may refer to examples described in the foregoing embodiments and alternative implementations, and this embodiment is not repeated herein.
In addition, in combination with the live video capturing system in the above embodiment, the embodiment of the present application may provide a storage medium. The storage medium has a computer program stored thereon; the computer program, when executed by a processor, implements any of the live video capture systems of the above embodiments.
In one embodiment, a computer device is provided, which may be a terminal. The computer device includes a processor, a memory, a network interface, a display screen, and an input device connected by a system bus. Wherein the processor of the computer device is configured to provide computing and control capabilities. The memory of the computer device includes a non-volatile storage medium and an internal memory. The non-volatile storage medium stores an operating system and a computer program. The internal memory provides an environment for the operation of the operating system and computer programs in the non-volatile storage media. The network interface of the computer device is used for communicating with an external terminal through a network connection.
The display screen of the computer equipment can be a liquid crystal display screen or an electronic ink display screen, and the input device of the computer equipment can be a touch layer covered on the display screen, can also be keys, a track ball or a touch pad arranged on the shell of the computer equipment, and can also be an external keyboard, a touch pad or a mouse and the like.
In one embodiment, fig. 3 is a schematic diagram of the internal structure of an electronic device according to an embodiment of the present application. As shown in fig. 3, an electronic device is provided, which may be a server. The electronic device includes a processor, a network interface, an internal memory and a non-volatile memory connected by an internal bus, where the non-volatile memory stores an operating system, computer programs and a database. The processor is used to provide computing and control capabilities, the network interface is used to communicate with an external terminal over a network connection, and the internal memory provides an environment for the operation of the operating system and computer programs.
It will be appreciated by those skilled in the art that the structure shown in fig. 3 is merely a block diagram of a portion of the structure associated with the present inventive arrangements and is not limiting of the electronic device to which the present inventive arrangements are applied, and that a particular electronic device may include more or fewer components than shown, or may combine certain components, or have a different arrangement of components.
Those skilled in the art will appreciate that implementing all or part of the above-described methods may be accomplished by way of a computer program stored on a non-transitory computer-readable storage medium which, when executed, may comprise the steps of the embodiments of the methods described above. Any reference to memory, storage, database or other medium used in the embodiments provided herein may include non-volatile and/or volatile memory. The non-volatile memory can include read-only memory (ROM), programmable ROM (PROM), electrically programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), or flash memory. Volatile memory can include random access memory (RAM) or external cache memory. By way of illustration and not limitation, RAM is available in a variety of forms such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), Synchlink DRAM (SLDRAM), Rambus direct RAM (RDRAM), direct Rambus dynamic RAM (DRDRAM), and Rambus dynamic RAM (RDRAM), among others.
It should be understood by those skilled in the art that the technical features of the above-described embodiments may be combined in any manner, and for brevity, all of the possible combinations of the technical features of the above-described embodiments are not described, however, they should be considered as being within the scope of the description provided herein, as long as there is no contradiction between the combinations of the technical features.
The above examples illustrate only a few embodiments of the utility model, which are described in detail and are not to be construed as limiting the scope of the utility model. It should be noted that it will be apparent to those skilled in the art that several variations and modifications can be made without departing from the spirit of the utility model, which are all within the scope of the utility model. Accordingly, the scope of protection of the present utility model is to be determined by the appended claims.

Claims (10)

1. The live video acquisition system is characterized by comprising an SoC module, a lens control module, an image sensor module and a ToF sensor module, wherein the SoC module is electrically connected with the lens control module, the image sensor module and the ToF sensor module respectively;
The SoC module is used for encoding and decoding the data acquired from the sensor module;
The lens control module is used for automatically adjusting the aperture of the camera lens and the focal length of the camera lens and controlling the shutter of the camera lens;
The image sensor module is used for converting the optical signals acquired by the camera lens into electric signals;
The ToF sensor module is used for collecting distance information and depth information in the live broadcast picture.
2. The system of claim 1, wherein the SoC module houses a multi-core processor and a 4k 60fps ISP processor.
3. The system of claim 1, wherein the lens control module is adapted to an M43 mount camera lens.
4. The system of claim 1, wherein the image sensor module is an image sensor module having a target surface size of no less than 17.3mm by 13 mm.
5. The system of claim 1, wherein the SoC module adjusts the focus of the camera lens in real time by a preset algorithm based on the distance information and depth information collected by the ToF sensor module.
6. The system of claim 2, wherein the SoC module is a HiSilicon 22AP70 SoC module.
7. The system of claim 3, wherein the lens control module is an STM32F103RCT6 chip based lens control module.
8. The system of claim 4, wherein the image sensor module is a Sony IMX294 image sensor.
9. The system of claim 1, further comprising a motion sensor module, a temperature sensor module, a power supply module, and an audio video output module;
The motion sensor module is used for acquiring pose and acceleration changes of the live video acquisition system in a live broadcast process in real time, and performing automatic anti-shake through an optical anti-shake algorithm;
The temperature sensor module is used for acquiring working temperatures of the SoC module and the image sensor module in real time and ensuring normal operation of the live video acquisition system; the power supply module is used for providing power for the live video acquisition system;
The audio and video output module is used for outputting video data after the encoding and decoding processing of the SoC module.
10. An electronic device comprising a memory and a processor, wherein the memory has stored therein a computer program, the processor being arranged to run the computer program to drive the system of any of claims 1 to 9.
CN202322946047.4U 2023-10-30 2023-10-30 Live video acquisition system and electronic equipment Active CN221103455U (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202322946047.4U CN221103455U (en) 2023-10-30 2023-10-30 Live video acquisition system and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202322946047.4U CN221103455U (en) 2023-10-30 2023-10-30 Live video acquisition system and electronic equipment

Publications (1)

Publication Number Publication Date
CN221103455U true CN221103455U (en) 2024-06-07

Family

ID=91328432

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202322946047.4U Active CN221103455U (en) 2023-10-30 2023-10-30 Live video acquisition system and electronic equipment

Country Status (1)

Country Link
CN (1) CN221103455U (en)

Similar Documents

Publication Publication Date Title
CN217849511U (en) Image capturing apparatus, device and system and integrated sensor optical component assembly
US11588984B2 (en) Optimized exposure temporal smoothing for time-lapse mode
US9961272B2 (en) Image capturing apparatus and method of controlling the same
JP2017505004A (en) Image generation method and dual lens apparatus
CN103188423A (en) Camera shooting device and camera shooting method
CN110198418B (en) Image processing method, image processing device, storage medium and electronic equipment
US10692196B2 (en) Color correction integrations for global tone mapping
CN110430370B (en) Image processing method, image processing device, storage medium and electronic equipment
CN113810596B (en) Time-delay shooting method and device
US11503223B2 (en) Method for image-processing and electronic device
CN204316622U (en) Image processing apparatus and user terminal
KR20140080815A (en) Photographing apparatus, method for controlling the same, and computer-readable storage medium
US20100007766A1 (en) Camera device and image processing method
CN112822371A (en) Image processing chip, application processing chip, data statistical system and method
JP7383911B2 (en) Imaging system, image processing device, imaging device and program
CN110266967B (en) Image processing method, image processing device, storage medium and electronic equipment
KR102336449B1 (en) Photographing apparatus and method for controlling the same
CN221103455U (en) Live video acquisition system and electronic equipment
CN108881731B (en) Panoramic shooting method and device and imaging equipment
KR102301940B1 (en) Method and apparatus for image fusion
CN207427311U (en) A kind of PLCC encapsulation plus the cobasis plate dual camera of CSP encapsulation
CN115529411A (en) Video blurring method and device
CN112104796B (en) Image processing method and device, electronic equipment and computer readable storage medium
JP7379884B2 (en) Imaging device, image processing system, method and program
JP7451888B2 (en) Imaging device, imaging system, method and program

Legal Events

Date Code Title Description
GR01 Patent grant