CN215734456U - Camera and display device - Google Patents

Camera and display device

Info

Publication number
CN215734456U
Authority
CN
China
Prior art keywords
image
module
camera
camera body
motor
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202121349111.5U
Other languages
Chinese (zh)
Inventor
Wang Hongbin (王宏斌)
Current Assignee
Hisense Visual Technology Co Ltd
Original Assignee
Hisense Visual Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Hisense Visual Technology Co Ltd
Priority claimed from CN202121349111.5U
Application granted
Publication of CN215734456U
Legal status: Active

Landscapes

  • Controls And Circuits For Display Device (AREA)

Abstract

The application provides a camera and a display device. The camera includes: a first camera body for collecting RGB data; a second camera body connected to the first camera body, the second camera body being configured to receive a synchronization signal from the first camera body and acquire image phase data according to the synchronization signal; and a pan-tilt device on which the first camera body and the second camera body are mounted. The pan-tilt device is configured to receive control information transmitted by the display device and, according to the control information, control the rotation and/or pitch of the first camera body and of the second camera body. By integrating both the second camera body and the pan-tilt device into the camera, the pan-tilt device enables the first camera and the second camera to collect images at different angles, so that a display device connected to the camera can obtain a stereoscopic image after obtaining the image phase data and the RGB data.

Description

Camera and display device
Technical Field
Embodiments of the application relate to display technology, and more particularly, to a camera and a display device.
Background
A schematic structural diagram of a currently used camera is shown in fig. 1, where the camera 01 is connected with a display device 02. The camera 01 includes an image acquisition module, an image data processing module, an image data compression module and a USB controller. The image data collected by the image acquisition module is processed by the image data processing module and the image data compression module and then transmitted to the USB controller, which transmits the image data to an image algorithm module of the display device 02; the image algorithm module performs the related calculation on the image data to obtain the image to be displayed by the display device 02.
Such a camera has the following problems: it can only obtain RGB data, and it cannot rotate freely, so a stereoscopic image cannot be obtained.
SUMMARY OF THE UTILITY MODEL
The exemplary embodiments of the application provide a camera and a display device, so as to solve the problem that the module design complexity of existing cameras is high.
In a first aspect, an embodiment of the present application provides a camera, including:
the first camera body is used for collecting RGB data;
the second camera body is connected with the first camera body; the second camera body is used for receiving the synchronous signal from the first camera body and acquiring image phase data according to the synchronous signal;
a pan-tilt device, on which the first camera body and the second camera body are mounted; the pan-tilt device is configured to receive control information transmitted by the display device, and to control the rotation and/or pitch of the first camera body and of the second camera body according to the control information.
In one embodiment, the camera further includes:
the device comprises an image processing module, an image compression module, an image algorithm module and a phase data processing module;
the input end of the image processing module is connected with the output end of the first camera body, and the output end of the image processing module is respectively connected with the input end of the image compression module and the input end of the image algorithm module; the output end of the image compression module is connected with the display device; the input end of the phase data processing module is connected with the output end of the second camera body; the output end of the phase data processing module is respectively connected with the input end of the image algorithm module and the display device;
the phase data processing module is configured to convert the image phase data into image depth data and transmit the image depth data to the display device and the image algorithm module;
and the image algorithm module is configured to receive a first algorithm instruction from the display device, perform algorithm processing on the first image data and the image depth data to obtain first algorithm result data, and transmit the first algorithm result data to the display device, so that the display device renders and displays the second image data and the image depth data according to the first algorithm result data.
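As a rough illustration of the phase data processing module's role, the following sketch assumes a PDAF-style model in which the disparity between left and right phase samples is inversely related to subject distance. The function name, the constant `k`, and the data shapes are illustrative assumptions, not details taken from the patent.

```python
def phase_to_depth(phase_pairs, k=100.0, eps=1e-6):
    """Convert (left, right) phase samples into coarse depth values.

    Larger left/right disparity -> closer subject -> smaller depth.
    This is a toy model: depth = k / (disparity + eps).
    """
    return [k / (abs(left - right) + eps) for left, right in phase_pairs]

# Each pair is a left/right phase reading for one pixel region.
pairs = [(0.0, 4.0), (1.0, 1.5), (2.0, 2.1)]
depth = phase_to_depth(pairs)  # the "image depth data" handed to the
                               # display device and the algorithm module
```

Under this model, the first pair (largest disparity) yields the smallest depth value, so nearby subjects can be separated from the background.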
In one embodiment, the camera further includes: a controller;
the image processing module, the image compression module and the image algorithm module are arranged on the controller;
the phase data processing module is arranged outside the controller.
In one embodiment, the image processing module, the image compression module, the image algorithm module and the phase data processing module are all arranged on the controller.
In one embodiment, the camera further includes:
an interface conversion module; the interface conversion module is arranged on the controller;
the input end of the interface conversion module is connected with the output end of the phase data processing module; the output end of the interface conversion module is connected with the display device;
the interface conversion module is used for receiving the image depth data transmitted by the phase data processing module, performing interface conversion on the image depth data and transmitting the image depth data to the display equipment.
In one embodiment, the pan-tilt device includes: a processor, a motor control module and a motor;
the processor is respectively connected with the controller and the motor control module;
the motor control module is also connected with the motor;
the processor is used for acquiring control information from the controller and transmitting the control information to the motor control module; and the motor control module is used for controlling the rotation and/or the pitching of the motor according to the control information.
In one embodiment, the motors comprise a pitching motor and a horizontal motor;
and the motor control module is specifically used for controlling the pitching of the pitching motor and the horizontal rotation of the horizontal motor according to the control information.
In one embodiment, the motor control module includes: a connector, a signal transmission circuit, and a power supply circuit for supplying power to the motor;
the control information is transmitted to the motor through the signal transmission circuit and the connector, and the feedback signal from the motor is transmitted to the processor through the connector and the signal transmission circuit.
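The control path described above (processor → motor control module → pitch and horizontal motors, with feedback returned through the same chain) can be sketched as follows. All class names, fields, and message shapes here are illustrative assumptions, not the patent's actual interfaces.

```python
class Motor:
    """One axis of the pan-tilt head (pitch or horizontal)."""
    def __init__(self, name):
        self.name = name
        self.angle = 0.0

    def move_to(self, angle):
        self.angle = angle
        # The return value stands in for the motor's feedback signal.
        return {"motor": self.name, "angle": angle}

class MotorControlModule:
    """Drives the pitch motor and the horizontal motor per control info."""
    def __init__(self):
        self.pitch = Motor("pitch")
        self.horizontal = Motor("horizontal")

    def apply(self, control_info):
        feedback = []
        if "pitch" in control_info:
            feedback.append(self.pitch.move_to(control_info["pitch"]))
        if "rotate" in control_info:
            feedback.append(self.horizontal.move_to(control_info["rotate"]))
        return feedback

class PanTiltProcessor:
    """Receives control info from the camera controller and forwards it."""
    def __init__(self, mcm):
        self.mcm = mcm

    def handle(self, control_info):
        return self.mcm.apply(control_info)

proc = PanTiltProcessor(MotorControlModule())
fb = proc.handle({"pitch": 15.0, "rotate": -30.0})
```

In this sketch the feedback list returned by `handle` models the motor feedback signals that travel back through the connector and signal transmission circuit to the processor.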
In one embodiment, the controller comprises: and a system-on-chip.
In a second aspect, the present application provides a display apparatus, including a display device connected to any one of the cameras above; wherein the display device includes a display; the display includes a display screen assembly for presenting pictures and a driving assembly for driving the display of images; the camera transmits the acquired image data to the display for display.
The application provides a camera and a display device. The camera includes: a first camera body for collecting RGB data; a second camera body connected to the first camera body, the second camera body being configured to receive a synchronization signal from the first camera body and acquire image phase data according to the synchronization signal; and a pan-tilt device on which the first camera body and the second camera body are mounted, the pan-tilt device being configured to receive control information transmitted by the display device and to control the rotation and/or pitch of the first camera body and of the second camera body according to the control information. In the embodiments of the application, the second camera body and the pan-tilt device are both integrated into the camera; the second camera body collects image phase data while the first camera body collects RGB data, and the pan-tilt device enables the first camera and the second camera to collect images at different angles, so that a display device connected to the camera can obtain a stereoscopic image after obtaining the image phase data and the RGB data.
Drawings
In order to more clearly illustrate the embodiments of the present application or the implementations in the related art, the drawings required for the description of the embodiments or the related art are briefly introduced below. Obviously, the drawings in the following description are only some embodiments of the present application, and those skilled in the art can obtain other drawings from these drawings without inventive effort.
Fig. 1 is a schematic view exemplarily showing a camera used in the related art;
FIG. 2 is a schematic diagram illustrating an operational scenario between a display device and a control apparatus according to some embodiments;
fig. 3 is a block diagram illustrating a hardware configuration of a display device according to some embodiments;
fig. 4 is a block diagram illustrating a hardware configuration of a control apparatus according to some embodiments;
FIG. 5 is a diagram illustrating a software configuration in a display device according to some embodiments;
FIG. 6 is a schematic diagram illustrating an icon control interface display of an application in a display device according to some embodiments;
fig. 7 is a schematic view exemplarily illustrating a display device provided in some embodiments;
fig. 8 is a schematic diagram illustrating a camera provided in some embodiments;
fig. 9 is a schematic diagram schematically illustrating another camera provided in some embodiments;
fig. 10 is a schematic view exemplarily showing another camera provided in the related art;
fig. 11 is a schematic diagram illustrating yet another camera provided in some embodiments;
fig. 12 is a schematic view exemplarily illustrating a pan-tilt device provided in some embodiments;
fig. 13 is a schematic diagram illustrating a motor control module provided in some embodiments;
FIG. 14 is a schematic diagram illustrating a processor provided in some embodiments;
fig. 15 is a schematic diagram illustrating a power supply module provided in some embodiments.
Detailed Description
To make the objects, embodiments and advantages of the present application clearer, exemplary embodiments of the present application are described clearly and completely below with reference to the accompanying drawings. It is to be understood that the described exemplary embodiments are only a part of the embodiments of the present application, not all of them.
All other embodiments, which can be derived by a person skilled in the art from the exemplary embodiments described herein without inventive step, are intended to be within the scope of the claims appended hereto. In addition, while the disclosure herein has been presented in terms of one or more exemplary examples, it should be appreciated that each aspect of the disclosure may also be implemented separately as a complete embodiment.
It should be noted that the brief descriptions of the terms in the present application are only for the convenience of understanding the embodiments described below, and are not intended to limit the embodiments of the present application. These terms should be understood in their ordinary and customary meaning unless otherwise indicated.
The terms "first", "second", and the like in the description and claims of this application and in the drawings described above are used for distinguishing between similar or analogous objects or entities and are not necessarily meant to define a particular order or sequence, unless otherwise indicated. It is to be understood that the terms so used are interchangeable under appropriate circumstances, so that the embodiments described herein are, for example, capable of operation in sequences other than those illustrated or otherwise described herein.
Furthermore, the terms "comprises" and "comprising," as well as any variations thereof, are intended to cover a non-exclusive inclusion, such that a product or device that comprises a list of elements is not necessarily limited to those elements explicitly listed, but may include other elements not expressly listed or inherent to such product or device.
The term "module" as used herein refers to any known or later developed hardware, software, firmware, artificial intelligence, fuzzy logic, or combination of hardware and/or software code that is capable of performing the functionality associated with that element.
The term "remote control" as used in this application refers to a component of an electronic device, such as the display device disclosed in this application, that can typically control the electronic device wirelessly over a relatively short distance. The component typically connects with the electronic device using infrared and/or Radio Frequency (RF) signals and/or Bluetooth, and may also include functional modules such as WiFi, wireless USB, Bluetooth and motion sensors. For example, a hand-held touch remote controller replaces most of the physical built-in hard keys of a common remote control device with a user interface in a touch screen.
The term "gesture" as used in this application refers to a user's behavior through a change in hand shape or an action such as hand motion to convey a desired idea, action, purpose, or result.
A camera has basic functions such as video shooting/transmission and still image capture. After an image is collected through the lens, the photosensitive component circuit and the control component inside the camera process and convert the image into a digital signal that the display device can recognize; the digital signal is then input to the display device through a parallel port or USB connection and restored by software.
In general, the display device needs to perform algorithm processing, such as image matting and background replacement, on the image transmitted from the camera, and the image after the algorithm processing is displayed on the display device; this requires the display device to have good computing power. Existing cameras are therefore not suitable for display devices with poor computing power, such as common display devices like televisions, projection devices and watches, which limits the use of the camera.
Based on this, the present application provides a camera. On the one hand, the camera removes the need to integrate a USB hub (USB controller) into the camera as in the prior art, which reduces the module design complexity of the camera; the complexity is further reduced by arranging an SoC (system on chip) that integrates the image processing module, the image compression module and the image algorithm module. In addition, integrating the image algorithm module on the camera side avoids algorithm processing of the image by the display device, so that the camera can be applied to display devices with poor computing capability; the camera also realizes stereoscopic imaging and pan-tilt control, expanding its application range.
Fig. 2 is a schematic diagram illustrating a camera usage scenario according to an embodiment. As shown in fig. 2, an embodiment of the present application provides a camera 500, and the camera 500 may be connected with a display device 200. The user may operate the display apparatus 200 through the mobile terminal 300 and the control device 100.
Specifically, the camera 500 may be embedded in the display device 200, as illustrated in fig. 2. Alternatively, the camera 500 and the display apparatus 200 are connected by a USB cable. Alternatively, the camera 500 and the display device 200 may also be communicatively connected.
As also shown in fig. 2, the display apparatus 200 also performs data communication with the server 400 through various communication means. The display device 200 may be allowed to be communicatively connected through a Local Area Network (LAN), a Wireless Local Area Network (WLAN), and other networks. The server 400 may provide various contents and interactions to the display apparatus 200.
The display device 200 may be a liquid crystal display, an OLED display, a projection display device. The particular display device type, size, resolution, etc. are not limiting, and those skilled in the art will appreciate that the display device 200 may be modified in performance and configuration as desired.
The display apparatus 200 may additionally provide an intelligent network TV function with computer support, including, but not limited to, a network TV, an intelligent TV, an Internet Protocol Television (IPTV), and the like, in addition to the broadcast receiving TV function.
A hardware configuration block diagram of the display apparatus 200 according to an exemplary embodiment is exemplarily shown in fig. 3.
In some embodiments, at least one of the controller 250, the tuner demodulator 210, the communicator 220, the detector 230, the input/output interface 255, the display 275, the audio output interface 285, the memory 260, the power supply 290, the user interface 265, and the external device interface 240 is included in the display apparatus 200.
In some embodiments, a display 275 receives image signals originating from the first processor output and displays video content and images and components of the menu manipulation interface.
In some embodiments, the display 275, includes a display screen assembly for presenting a picture, and a driving assembly that drives the display of an image.
In some embodiments, a driver assembly for driving the display is also included, depending on the type of display 275.
In some embodiments, display 275 is a projection display and may also include a projection device and a projection screen.
In some embodiments, as shown in fig. 3, the input/output interface 255 is configured to allow data transfer between the controller 250 and external other devices or other controllers 250. Such as receiving video signal data and audio signal data of an external device, or command instruction data, etc.
In some embodiments, the frequency points demodulated by the tuner demodulator 210 are controlled by the controller 250, and the controller 250 can send out control signals according to the user's selection, so that the tuner demodulator 210 responds to the television signal frequency selected by the user and demodulates the television signal carried by that frequency.
In some embodiments, the controller 250 and the modem 210 may be located in different separate devices, that is, the modem 210 may also be located in an external device of the main device where the controller 250 is located, such as an external set-top box. Therefore, the set top box outputs the television audio and video signals modulated and demodulated by the received broadcast television signals to the main body equipment, and the main body equipment receives the audio and video signals through the first input/output interface.
As shown in fig. 3, the controller 250 includes at least one of a Random Access Memory 251 (RAM), a Read-Only Memory 252 (ROM), a video processor 270, an audio processor 280, other processors 253 (e.g., a Graphics Processing Unit (GPU)), a Central Processing Unit 254 (CPU), a Communication Interface, and a Communication Bus 256 (Bus) that connects the respective components.
In some embodiments, the video processor 270 is configured to receive an external video signal, and perform video processing such as decompression, decoding, scaling, noise reduction, frame rate conversion, resolution conversion, image synthesis, and the like according to a standard codec protocol of the input signal, so as to obtain a signal that can be displayed or played on the direct display device 200.
In some embodiments, video processor 270 includes a demultiplexing module, a video decoding module, an image synthesis module, a frame rate conversion module, a display formatting module, and the like.
In some embodiments, under the control of the controller 250, the audio output receives sound signals output by the audio processor 280. Besides the speaker 286 carried by the display device 200 itself, the audio output may include an external sound output terminal that outputs to a sound-generating device of an external device, such as an external sound interface or an earphone interface, and may also include a near field communication module in the communication interface, for example a Bluetooth module for sound output to a Bluetooth speaker.
The memory 260 includes a memory storing various software modules for driving the display device 200. Such as: various software modules stored in the first memory, including: at least one of a basic module, a detection module, a communication module, a display control module, a browser module, and various service modules.
The base module is a bottom-layer software module for signal communication between the various hardware components in the display device 200 and for sending processing and control signals to upper-layer modules. The detection module is used for collecting various information from various sensors or user input interfaces, and for performing digital-to-analog conversion and analysis management.
Fig. 4 exemplarily shows a block diagram of a configuration of the control apparatus 100 according to an exemplary embodiment. As shown in fig. 4, the control apparatus 100 includes a controller 110, a communication interface 130, a user input/output interface, a memory, and a power supply source.
The control device 100 is configured to control the display device 200 and may receive an input operation instruction of a user and convert the operation instruction into an instruction recognizable and responsive by the display device 200, serving as an interaction intermediary between the user and the display device 200. Such as: the user responds to the channel up and down operation by operating the channel up and down keys on the control device 100.
In some embodiments, the control device 100 may be a smart device. Such as: the control apparatus 100 may install various applications that control the display apparatus 200 according to user demands.
The controller 110 includes a processor 112 and RAM 113 and ROM 114, a communication interface 130, and a communication bus. The controller is used to control the operation of the control device 100, as well as the communication cooperation between the internal components and the external and internal data processing functions.
The communication interface 130 enables communication of control signals and data signals with the display apparatus 200 under the control of the controller 110. Such as: the received user input signal is transmitted to the display apparatus 200. The communication interface 130 may include at least one of a WiFi chip 131, a bluetooth module 132, an NFC module 133, and other near field communication modules.
In some embodiments, the control device 100 includes at least one of a communication interface 130 and an input-output interface 140. The control device 100 is provided with a communication interface 130, such as: the WiFi, bluetooth, NFC, etc. modules may transmit the user input command to the display device 200 through the WiFi protocol, or the bluetooth protocol, or the NFC protocol code.
Referring to fig. 5, in some embodiments, the system is divided into four layers, which are, from top to bottom, an Application (Applications) layer (referred to as an "Application layer"), an Application Framework (Application Framework) layer (referred to as a "Framework layer"), an Android runtime (Android runtime) layer and a system library layer (referred to as a "system runtime library layer"), and a kernel layer.
As shown in fig. 5, in the embodiment of the present application, the application framework layer includes a manager (Managers), a Content Provider (Content Provider), and the like, where the manager includes at least one of the following modules: an Activity Manager (Activity Manager) is used for interacting with all activities running in the system; the Location Manager (Location Manager) is used for providing the system service or application with the access of the system Location service; a Package Manager (Package Manager) for retrieving various information related to an application Package currently installed on the device; a Notification Manager (Notification Manager) for controlling display and clearing of Notification messages; a Window Manager (Window Manager) is used to manage the icons, windows, toolbars, wallpapers, and desktop components on a user interface.
In some embodiments, the kernel layer is a layer between hardware and software. As shown in fig. 5, the core layer includes at least one of the following drivers: audio drive, display drive, bluetooth drive, camera drive, WIFI drive, USB drive, HDMI drive, sensor drive (such as fingerprint sensor, temperature sensor, touch sensor, pressure sensor, etc.), and so on.
In some embodiments, as shown in fig. 6, the application layer containing at least one application may display a corresponding icon control in the display, such as: the system comprises a live television application icon control, a video on demand application icon control, a media center application icon control, an application center icon control, a game application icon control and the like.
Specifically, the camera 500 acquires image data and transmits it to the display device 200 for display. The display device 200 may receive a demand instruction sent by a user through the control device 100 or the mobile terminal 300, and sends a corresponding algorithm instruction to the camera 500 according to the demand instruction; the camera 500 performs algorithm processing on the acquired image data according to the algorithm instruction and sends the algorithm result to the display device 200, which then displays the image. Illustratively, a user sends a demand instruction to the display device 200 through the control apparatus 100 or the mobile terminal 300, requiring that the image displayed by the display device 200 contain only a human face. After receiving the corresponding algorithm instruction, the camera 500 performs matting algorithm processing on the captured original image to obtain a corresponding algorithm result, and then sends the original image and the algorithm result to the display device 200; the display device 200 renders and displays the original image using the algorithm result, showing the face image the user requires. Moving the algorithm processing to the camera 500 reduces the computing pressure on the display device 200, so that display devices such as televisions can display algorithm-processed images.
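A minimal sketch of this instruction/result exchange is below, with the face-matting algorithm replaced by a toy brightness threshold; the payload shapes and function names are illustrative assumptions, not the patent's actual protocol.

```python
def camera_handle_instruction(original_image, instruction):
    """Camera side: run the requested algorithm and return both the
    original image and the algorithm result to the display device."""
    if instruction == "face_only":
        # Toy stand-in for matting: mark bright pixels as foreground.
        result = [[px > 128 for px in row] for row in original_image]
        return {"image": original_image, "algorithm_result": result}
    raise ValueError(f"unsupported instruction: {instruction}")

def display_render(payload):
    """Display side: render the original image through the mask,
    blanking everything the algorithm marked as background."""
    image, mask = payload["image"], payload["algorithm_result"]
    return [[px if keep else 0 for px, keep in zip(row, mrow)]
            for row, mrow in zip(image, mask)]

frame = [[200, 50], [130, 90]]  # captured original image (toy 2x2)
rendered = display_render(camera_handle_instruction(frame, "face_only"))
```

The display side only applies the mask to the original image, which is the light-weight rendering step that even a low-compute display device can afford.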
It should be emphasized that, unless otherwise specified, the "connection" in any embodiment of the present application may be "electrical connection" or "communication connection", and the embodiment of the present application is not limited thereto.
The technical solution of the present application will be described in detail below with reference to specific examples. It should be noted that the following specific embodiments may be combined with each other, and the same or similar concepts or processes may not be described in detail in some embodiments.
Referring to fig. 7, the present application provides a schematic structural diagram of a camera, where the camera 500 includes:
the first camera body 501, the first camera body 501 is used for collecting RGB data;
a second camera body 502, the second camera body 502 being connected to the first camera body 501; the second camera body 502 is configured to receive the synchronization signal from the first camera body 501, and acquire image phase data according to the synchronization signal;
a pan-tilt device 503, on which the first camera body 501 and the second camera body 502 are mounted; the pan-tilt device 503 is configured to receive control information transmitted by the display device 200, and to control the rotation and/or pitch of the first camera body 501 and of the second camera body 502 according to the control information.
Specifically, the pan-tilt device 503 expands the shooting angles of the first camera body 501 and the second camera body 502 and broadens the picture that can be collected. Because the first camera body 501 and the second camera body 502 are placed on the same pan-tilt device 503, controlling the pan-tilt device makes the two camera bodies rotate and pitch synchronously, so that they collect RGB data and image phase data of the same picture, and RGB data and image phase data can be acquired at different angles.
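The frame-synchronized capture described above (the second camera body sampling phase data only on the first camera body's synchronization signal, so both data sets describe the same picture) can be sketched as follows; the class names and the use of a frame counter as the sync signal are illustrative assumptions.

```python
class FirstCameraBody:
    """Collects RGB data and emits a per-frame synchronization signal."""
    def __init__(self):
        self.frame_id = 0
        self.sync_listeners = []

    def capture(self):
        self.frame_id += 1
        sync_signal = self.frame_id  # shared frame counter as sync signal
        # Listeners acquire their phase data on the same signal, so the
        # RGB data and phase data carry matching frame tags.
        phase_data = [cam.on_sync(sync_signal) for cam in self.sync_listeners]
        rgb_data = {"frame": self.frame_id, "rgb": f"rgb-{self.frame_id}"}
        return rgb_data, phase_data

class SecondCameraBody:
    """Acquires image phase data only when the sync signal arrives."""
    def on_sync(self, sync_signal):
        return {"frame": sync_signal, "phase": f"phase-{sync_signal}"}

first, second = FirstCameraBody(), SecondCameraBody()
first.sync_listeners.append(second)
rgb_data, phase_data = first.capture()  # both tagged with the same frame
```

Matching frame tags are what lets the downstream algorithm module pair each RGB frame with the phase (depth) data captured at the same instant.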
Referring to fig. 8 and 9, the camera 500 further includes:
an image processing module 5041, an image compression module 5042, an image algorithm module 5043, and a phase data processing module 5044;
the input end of the image processing module 5041 is connected to the output end of the first camera body 501, and the output end of the image processing module 5041 is connected to the input end of the image compression module 5042 and the input end of the image algorithm module 5043 respectively; an output of the image compression module 5042 is connected to the display device 200; the input end of the phase data processing module 5044 is connected with the output end of the second camera body 502; the output of the phase data processing module 5044 is connected to the input of the image algorithm module 5043 and the display device 200, respectively;
a phase data processing module 5044 for converting the image phase data into image depth data and transmitting the image depth data to the display device 200 and the image algorithm module 5043;
the image algorithm module 5043 is configured to receive a first algorithm instruction from the display device 200, perform algorithm processing on the first image data and the image depth data to obtain first algorithm result data, and transmit the first algorithm result data to the display device 200, so that the display device 200 renders and displays the second image data and the image depth data according to the first algorithm result data.
In one embodiment, referring to fig. 8, the camera 500 further includes: a controller 504;
an image processing module 5041, an image compression module 5042, and an image algorithm module 5043 are provided on the controller 504; the phase data processing module 5044 is located outside the controller.
In one embodiment, the image processing module 5041, the image compression module 5042, the image algorithm module 5043, and the phase data processing module 5044 are disposed on the controller 504.
In one embodiment, the display device further includes:
an interface conversion module 5045; the interface conversion module 5045 is disposed on the controller 504;
the input end of the interface conversion module 5045 is connected with the output end of the phase data processing module 5044; the output end of the interface conversion module 5045 is connected with the display device 200;
the interface conversion module 5045 is configured to receive the image depth data transmitted by the phase data processing module 5044, perform interface conversion on the image depth data, and transmit the image depth data to the display device 200.
For example, the first camera body 501 transmits the collected RGB data to the image processing module 5041 through a Mobile Industry Processor Interface (MIPI).
The image processing performed by the image processing module 5041 on the RGB data may include, but is not limited to, image enhancement and image restoration. Both aim to improve image quality, for example by removing noise or increasing sharpness. Image enhancement does not consider why the image degraded; it simply highlights the parts of interest: boosting the high-frequency components of an image makes object outlines clear and details distinct, while boosting the low-frequency components suppresses noise. Image restoration, by contrast, requires some knowledge of the cause of degradation: typically a 'degradation model' is built from the degradation process, and the original image is then restored or reconstructed with a suitable filtering method. After processing by the image processing module 5041, the resulting first image data is an image that can be displayed on a display device.
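The enhancement described above — boosting high-frequency components to sharpen contours, with the low-frequency component acting as the smooth, noise-suppressed part — can be sketched as unsharp masking. The patent does not specify an algorithm; the box filter, kernel size, and gain below are assumptions for illustration only:

```python
import numpy as np

def box_blur(img, k=3):
    """Box filter: extracts the low-frequency component of the image."""
    pad = k // 2
    padded = np.pad(img, pad, mode="edge")
    out = np.zeros(img.shape, dtype=float)
    for dy in range(k):
        for dx in range(k):
            out += padded[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out / (k * k)

def enhance(img, amount=1.0):
    """Unsharp masking: add back the high-frequency part to sharpen contours."""
    low = box_blur(img)    # low-frequency part (smooth, low-noise)
    high = img - low       # high-frequency part (edges and details)
    return np.clip(img + amount * high, 0.0, 255.0)
```

On a perfectly flat image the high-frequency part is zero, so enhancement leaves it unchanged; on an edge or a bright spot, the pixel value is pushed further from its neighborhood average.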
In some examples, the image compression module 5042 may compress and/or encode the first image data with H.264 (a highly compressed digital video codec standard) or MJPEG (Motion Joint Photographic Experts Group, a technique that compresses a motion sequence frame by frame as still images). Correspondingly, the display device 200 contains a matching image decompression module: when the compressed second image data arrives, the display device 200 decompresses and/or decodes it through that module and can then play it. The role of the image compression module 5042 is thus to compress the first image data so that the resulting second image data transfers more efficiently from the camera 500 to the display device 200. The second image data may be transmitted from the image compression module 5042 to the display device 200 over USB or over a network; in some embodiments USB is used to improve transmission efficiency, but the specific transmission method is not limited here.
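The compress-on-camera / decompress-on-display data flow can be sketched as a pair of functions. The patent names H.264 or MJPEG as the codec; since those require external libraries, this sketch substitutes the standard-library zlib codec purely as a stand-in so the pipeline shape is runnable — it is not the codec the patent describes:

```python
import zlib

def camera_compress(first_image_data: bytes) -> bytes:
    # Stand-in for the image compression module (5042). In the patent this
    # would be an H.264 or MJPEG encoder; zlib is used only to make the
    # sketch self-contained.
    return zlib.compress(first_image_data)

def display_decompress(second_image_data: bytes) -> bytes:
    # Stand-in for the display device's image decompression module.
    return zlib.decompress(second_image_data)
```

The round trip must be lossless here (zlib is lossless), and for redundant frames the compressed "second image data" is much smaller than the raw "first image data", which is the transfer-efficiency point the paragraph makes.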
In addition, the image algorithm module 5043 may perform an algorithm process on the first image data using DSP (Digital Signal Processing) technology.
In some embodiments, the algorithm processing may include image recognition, which mainly performs image segmentation and feature extraction on the first image data. The techniques employed may include fuzzy pattern recognition and artificial neural networks.
Referring to fig. 10, in the current approach an image phase acquisition module is arranged in the camera 01. The image phase data it collects is passed to a USB controller, which transmits it to the display device 02; the phase data processing module is integrated on the display device 02 side, so the image phase data is processed at the display device.
Specifically, to display a more stereoscopic image on the display device 200, two or more lenses are usually required in the camera: one is the first camera body 501, which collects RGB data, and another is the second camera body 502, which collects TOF data (image phase data). Processing the RGB data together with the image phase data makes the displayed image more stereoscopic.
The synchronization signal makes the second camera body 502 and the first camera body 501 acquire images synchronously; that is, each frame of RGB data collected by the first camera body 501 and each frame of image phase data collected by the second camera body 502 correspond to the same target object at the same moment.
In fig. 8, the image processing module 5041, the image compression module 5042, the image algorithm module 5043, and the interface conversion module 5045 are integrated into the controller 504, while the phase data processing module 5044 is integrated into the camera 500 outside the controller 504. Integrating the phase data processing module 5044 in the camera 500 reduces the computing power required of the display device 200 and thereby widens the range of devices the camera 500 can serve.
In fig. 9, the phase data processing module 5044 is integrated on the controller 504, so that the integration level of each module in the camera 500 is further improved, and the complexity of the design of the modules of the camera 500 is reduced.
In some embodiments, the second camera body 502 includes a pulsed infrared laser transmitter and a corresponding receiver. The transmitter emits a laser pulse and the receiver captures its reflection; the corresponding transmission time and reception time are passed to the phase data processing module 5044 as image phase data. The phase data processing module 5044 computes the corresponding phase difference and, from it, the relative distance of each pixel of the photographed target object, i.e., the image depth data.
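For the pulsed scheme above, depth follows directly from the send and receive times: the pulse covers the camera-to-target distance twice, so depth = c·Δt/2. A minimal sketch (nanosecond timestamps are an assumed unit; the patent does not fix one):

```python
C = 299_792_458.0  # speed of light, m/s

def depth_from_time_of_flight(t_send_ns: float, t_receive_ns: float) -> float:
    """Distance to the target in metres.

    The pulse travels out and back, so the one-way distance is half of
    c times the round-trip time.
    """
    round_trip_s = (t_receive_ns - t_send_ns) * 1e-9
    return C * round_trip_s / 2.0
```

A target 1 m away produces a round trip of roughly 6.67 ns, which illustrates why the receiver-side timing (and the phase-difference computation in module 5044) must be so precise.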
In addition, a plurality of user requirement identifiers may be preset in the display device 200, each corresponding to one first algorithm instruction. A user selects one through the control device 100 or by touching the display screen of the display device 200; the display device 200 then sends the corresponding first algorithm instruction to the image algorithm module 5043, which starts the corresponding algorithm program to process the first image data and the image depth data and obtain the first algorithm result.
Illustratively, the user requirement identifiers preset in the display device 200 include image matting, object matting, background replacement, and so on. Suppose that in the first image data a person holding a cup stands under a tree, and the user wants only the person holding the cup. The user selects portrait matting and object (cup) matting through the control device 100 or by touching the display screen of the display device 200, and the display device 200 sends the corresponding first algorithm instruction to the image algorithm module 5043. Through image recognition, the image algorithm module determines the coordinate values of the portrait and the object (cup) in the first image data and acquires the image depth data corresponding to them; from the image depth data it can be determined whether the cup is in front of or behind the portrait. The coordinate values are transmitted to the display device 200 as the first algorithm result. After receiving the second image data and the first algorithm result, the display device 200 decompresses the second image data to obtain the first image data, selects the pixels at the corresponding positions in the first image data and the image depth data through the coordinate values, and renders and displays them, so that the display device 200 shows the portrait matting and object matting the user asked for.
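The display-side selection step can be sketched as follows: the "coordinate values" from the first algorithm result pick out pixel regions, and the depth data decides which object is in front. The bounding-box representation and the mean-depth comparison are assumptions; the patent specifies only coordinates plus depth data:

```python
import numpy as np

def crop_with_depth(image, depth, box):
    """Select the pixels inside a bounding box (x0, y0, x1, y1) — the
    'coordinate values' of the first algorithm result — together with
    the matching region of the image depth data."""
    x0, y0, x1, y1 = box
    return image[y0:y1, x0:x1], depth[y0:y1, x0:x1]

def is_in_front(depth_a, depth_b):
    """An object is in front of another when its mean depth is smaller."""
    return float(np.mean(depth_a)) < float(np.mean(depth_b))
```

For the cup-and-portrait example, the display device would crop both regions and call `is_in_front` on their depth patches to order them for rendering.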
In this embodiment of the application, the image algorithm module 5043 may also perform other algorithm processing to spare the display device 200 that computation; the display device 200 then only needs to render and display the algorithm result, the image depth data, and the first image data transmitted by the camera. Any algorithm processing performed by the image algorithm module 5043 within the camera 500 falls within the scope of the present application.
In the embodiment of the application, data is transmitted between the first camera body 501 and the image processing module 5041 over MIPI and/or I2C; between the second camera body 502 and the phase data processing module 5044 over MIPI and/or I2C; and between the phase data processing module 5044 and the interface conversion module 5045 over at least one of SPI (Serial Peripheral Interface), MIPI, and I2C.
In addition, the camera 500 further includes a working memory and a storage. The working memory holds the data a program needs while running, storing it only temporarily to exchange cached data with the controller; the storage is an eMMC (Embedded MultiMedia Card), which provides a standard interface and manages the working memory.
In the embodiment of the present application, adding the second camera body 502 lets the display device 200 derive image depth data from the collected image phase data, so the image finally displayed by the display device 200 is more stereoscopic. Furthermore, integrating the phase data processing module 5044 in the camera 500 reduces the computing power required of a display device 200 paired with the camera 500 and expands the camera's application range; and integrating the interface conversion module 5045 in the controller reduces the design complexity of the camera module.
In the embodiment of the application, the image processing module, the image compression module, and the image algorithm module are integrated on the controller rather than scattered across the camera, which reduces the module design complexity of the camera. Integrating the image algorithm module on the camera side also lets the camera work with display devices of limited computing capability, expanding its application range.
In some embodiments, referring to fig. 11, a camera includes:
the first camera body 501, the first camera body 501 is used for collecting RGB data;
a controller 504, on which an image processing module 5041, an image compression module 5042, and an image algorithm module 5043 are integrated; the input end of the image processing module 5041 is connected to the output end of the first camera body 501, and the output end of the image processing module 5041 is connected to the input end of the image compression module 5042 and the input end of the image algorithm module 5043 respectively; the output end of the image compression module 5042 is connected to the display device 200;
the image processing module 5041 is configured to perform image processing on the RGB data to obtain first image data, and transmit the first image data to the image compression module 5042 and the image algorithm module 5043;
an image compression module 5042, configured to perform compression coding processing on the first image data to obtain second image data, and transmit the second image data to the display device 200;
the image algorithm module 5043 is configured to receive a second algorithm instruction from the display device 200, perform algorithm processing on the first image data to obtain second algorithm result data, and transmit the second algorithm result data to the display device 200, so that the display device 200 renders and displays the second image data according to the second algorithm result data.
Specifically, a plurality of user requirement identifiers may be preset in the display device 200, each corresponding to one second algorithm instruction. In practice, a user selects a user requirement identifier through the control device 100 or by touching the display screen of the display device 200; in response, the display device 200 sends the corresponding second algorithm instruction to the image algorithm module 5043, which starts the corresponding algorithm program to process the first image data and obtain the second algorithm result.
Illustratively, the user requirement identifiers preset in the display device 200 include portrait matting. The user selects portrait matting through the control device 100 or by touching the display screen of the display device 200; the display device 200 sends the corresponding second algorithm instruction to the image algorithm module 5043, which determines the coordinate values of the portrait in the first image data through image recognition and sends them to the display device 200 as the second algorithm result. After receiving the second image data and the second algorithm result, the display device 200 decompresses the second image data to obtain the first image data, selects the pixels at the corresponding positions through the coordinate values, and renders and displays them, showing the portrait matting the user asked for.
Referring to fig. 12, the pan-tilt device 503 includes: a processor 531, a motor control module 532, and a motor 533. The processor 531 is connected to the controller 504 and the motor control module 532, respectively; the motor control module 532 is also connected to the motor 533;
the processor 531 is configured to obtain control information from the controller, and transmit the control information to the motor control module 532; a motor control module 532 for controlling the rotation and/or pitch of the motor 533 according to the control information.
Referring to fig. 12, the motor 533 includes a pitch motor 5331 and a horizontal motor 5332; the motor control module 532 is specifically configured to control the pitch of the pitch motor 5331 and the horizontal rotation of the horizontal motor 5332 according to the control information.
Referring to fig. 13, the motor control module 532 includes: a connector 5323, a signal transmission circuit 5324, and a power supply circuit 5325 that supplies power to the motor;
the control information is transmitted to the motor 533 through the signal transmission circuit 5324 and the connector 5323, and the feedback signal from the motor 533 is transmitted to the processor 531 through the connector 5323 and the signal transmission circuit 5324.
Wherein, the controller 504 includes: a system on chip (SoC); the controller 504 may also be another chip, which is not limited herein.
Specifically, the motor control module 532 may include: a pitch motor control module 5321 and a horizontal motor control module 5322. The pitching motor control module 5321 is connected with the pitching motor 5331 and controls the pitching motor 5331 to operate; the pitching motor 5331 controls the first camera body 501 and/or the second camera body 502 to pitch; the horizontal motor control module 5322 is connected with the horizontal motor 5332 to control the operation of the horizontal motor; the horizontal motor 5332 controls the first camera body 501 and/or the second camera body 502 to horizontally rotate.
In one embodiment, the circuit of the horizontal motor control module 5322 is identical to that of the pitch motor control module 5321.
The control information is sent by the display device 200: it is transmitted to the controller, the controller forwards it to the processor 531, and the processor 531 passes it on to the horizontal motor control module 5322 and the pitch motor control module 5321, which control the horizontal motor 5332 and the pitch motor 5331 accordingly.
Specifically, in fig. 13, the DC terminal of the connector 5323 is connected to a +5 V DC voltage obtained from the camera 500; the connector 5323 converts it to a 3.3 V DC voltage for the motor, supplying the motor with power. The P_PWMC, P_FAULT, P_PWMB, and P_PWMA terminals of the connector 5323 are connected to the motor; P_H1 and P_H2 of the connector 5323 are connected to the signal transmission circuit 5324, and MP_H1 and MP_H2 of the signal transmission circuit 5324 are connected to the corresponding MP_H1 and MP_H2 terminals of the MCU 53111 in fig. 14, sending the MCU 53111 the feedback signal the motor transmits through the connector 5323. In addition, the MP_I and P_I terminals of the power supply circuit 5325 are connected to the MCU 53111, and its DC +5V terminal is connected to the camera 500 to obtain a 5 V DC supply, which is then converted to 3.3 V to power the motor. GND denotes ground, and R1, R2, R3, and R4 denote resistors of arbitrary resistance values.
In some embodiments, the signal transmission circuit 5324 includes two identical sub-circuits, a first sub-signal transmission circuit 53241 and a second sub-signal transmission circuit 53242. The first sub-signal transmission circuit 53241 includes a first capacitor C1 and a first resistor R1: the first end of C1 is grounded, its second end is connected to the first end of R1, the second end of R1 is connected to the connector 5323, and the first end of R1 is also connected to the MCU 53111. The second sub-signal transmission circuit 53242 has the same circuit and is likewise connected between the connector 5323 and the MCU 53111. By reading the two feedback signals carried by the first sub-signal transmission circuit 53241 and the second sub-signal transmission circuit 53242, the MCU 53111 determines the positions of the horizontal motor and the pitch motor.
Referring to fig. 14 and 15, the processor 531 further includes: a processing module 5311 and a power supply module 5312. In fig. 14, the processing module 5311 includes: an MCU53111, a signal circuit 53112, and a first power supply 53113; in fig. 15, the power supply module 5312 includes: a second power supply 53121, a voltage stabilizing circuit 53122, and a voltage converter 53123.
The MCU 53111 is connected with the signal circuit 53112 and the first power supply 53113. When the motor works, the MCU 53111 drives the diode VDREL in the signal circuit 53112 to light up, indicating that the motor is running. The MDC +3.3V terminal of the first power supply 53113 is connected to the DC +3.3V terminal of the voltage converter 53123, and the REF +2.5V terminal of the first power supply 53113 is connected to the MCU 53111 to power it.
Further, the DC +5V terminal of the second power supply 53121 is connected to the camera's serial port to obtain a 5 V DC supply from the camera 500. P_TX and P_RX of the second power supply 53121 are connected to the MCU 53111: P_TX sends signals to the MCU 53111, and P_RX receives signals from it, which is used to determine whether the MCU 53111 has started working and, accordingly, whether to draw the 5 V supply from the camera 500. One end of the voltage stabilizing circuit 53122 is connected to the camera's serial port and the other end is grounded, protecting the processor 531; the circuit consists of capacitors C5 to C24 in parallel. The DC +5V terminal of the voltage converter 53123 is connected to the camera's serial port to obtain the 5 V DC supply, converts it to 3.3 V, and powers the first power supply 53113.
Further, in fig. 13, the first resistor R1, the second resistor R2, the third resistor R3, and the fourth resistor R4 are 10 kΩ with a 1% tolerance, and the first capacitor C1, the second capacitor C2, and the third capacitor C3 are all of specification 104/50. In fig. 14, the fifth resistor R5 is 510 Ω. In fig. 15, the fifth capacitor C5 through the twenty-fourth capacitor C24 are of specification 226/10V, and the twenty-fifth capacitor C25 is of specification 104/5V.
In a specific implementation, the display device 200 sets the angle of the first camera body 501 and the second camera body 502 through the serial ModBus-RTU protocol. The display device 200 packages the angle into control information and transmits it to the SoC using the UVC XU instruction protocol; the SoC sends the control information to the processor 531 over the UART serial protocol, the processor 531 passes it to the motor control module 532, and the motor control module 532 controls the motor accordingly. First, the motor's position loop outputs a speed through a PID algorithm (proportional (P), integral (I), derivative (D)); this speed is fed to the motor's speed loop, whose PID algorithm outputs the corresponding PWM (pulse width modulation). The speed loop then sends the angle set by the display device and the PWM to the motor's SVPWM module, which calculates the corresponding SVPWM and inputs it to the motor driver to drive the motor.
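The cascade described above — a position loop whose PID output becomes the speed setpoint of an inner speed loop, whose PID output becomes the PWM command — can be sketched as follows. The gains, time step, and PWM clamp range are illustrative assumptions; the patent specifies only the loop structure:

```python
class PID:
    """Textbook proportional-integral-derivative controller."""

    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = 0.0

    def step(self, error, dt):
        self.integral += error * dt
        derivative = (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

def pan_tilt_step(target_angle, angle, speed, position_pid, speed_pid, dt):
    """One control tick of the cascade: the position loop outputs a speed
    setpoint, and the speed loop turns the speed error into a PWM duty
    command, clamped here to [-1, 1] (sign = direction)."""
    speed_setpoint = position_pid.step(target_angle - angle, dt)
    pwm = speed_pid.step(speed_setpoint - speed, dt)
    return max(-1.0, min(1.0, pwm))
```

When the motor is already at the target angle and at rest, both loop errors are zero and the PWM command is zero; a positive angle error drives a positive PWM command.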
Based on fig. 13 to 15, a circuit is provided which can realize horizontal rotation and pitch control of the motor.
Illustratively, a display device such as a television sends an instruction to the controller; the communication protocol adopts the XU instruction of UVC, and the pan-tilt control XU command set is defined as follows:
1. Pan-tilt control, specifically: the angle returned by the camera; the angle data structure issued by the television side; whether the command was successfully issued to the pan-tilt; the pan-tilt state returned (moving, horizontally in place, vertically in place, or stationary); a fault code containing at least one of horizontal reset abnormal, pitch reset abnormal, horizontal locked rotor, pitch locked rotor, horizontal overcurrent, and pitch overcurrent; real-time angle_horizontal and angle_pitch; get real-time angle_horizontal; get real-time angle_pitch; set absolute angle_horizontal & vertical; set absolute angle_horizontal; set absolute angle_vertical; set relative angle_horizontal & vertical; set relative angle_horizontal; set relative angle_vertical; set movement speed_horizontal; set movement speed_vertical; set movement speed_horizontal & vertical; get movement speed_horizontal & vertical; get movement speed_horizontal; get movement speed_pitch; get real-time current_horizontal; get real-time current_pitch; get the firmware version number.
2. The controller sends instructions to the MCU; the communication protocol adopts the Modbus protocol.
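The fault code in the command set above contains "at least one of" six conditions, which suggests a bit-flag encoding so several faults can be reported at once. A hypothetical sketch — the patent lists the conditions but defines no numeric values, so the bit assignments here are assumptions:

```python
from enum import IntFlag

class PanTiltFault(IntFlag):
    # Hypothetical bit assignments; the patent does not define the encoding.
    HORIZONTAL_RESET_ABNORMAL = 1 << 0
    PITCH_RESET_ABNORMAL = 1 << 1
    HORIZONTAL_LOCKED_ROTOR = 1 << 2
    PITCH_LOCKED_ROTOR = 1 << 3
    HORIZONTAL_OVERCURRENT = 1 << 4
    PITCH_OVERCURRENT = 1 << 5
```

With this encoding, a single fault byte returned by the pan-tilt can carry any combination of the six conditions, and the television side can test for each with a membership check.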
Referring to fig. 7 to 9, an embodiment of the present application further provides a display apparatus, including any one of the cameras 500 described above, where the camera 500 is connected with the display device 200. Wherein, the display device 200 includes: a display; the display comprises a display screen component for presenting pictures and a driving component for driving the display of images; the camera 500 transmits the acquired image data to the display for display.
The application provides a camera and a display device. The camera includes: a first camera body for collecting RGB data; a second camera body connected with the first camera body, which receives a synchronization signal from the first camera body and acquires image phase data according to it; and a pan-tilt device on which the first camera body and the second camera body are mounted, which receives control information transmitted by the display device and controls the rotation and/or tilt of the first camera body and of the second camera body according to that information. In this embodiment, the second camera body and the pan-tilt device are integrated into the camera together: the second camera body collects image phase data, the first camera body collects RGB data, and the pan-tilt device lets the two camera bodies capture images from different angles, so that a display device connected to the camera can obtain a stereoscopic image once it has the image phase data and the RGB data. In addition, the image processing module, the image compression module, and the image algorithm module are integrated on the controller rather than scattered across the camera, which reduces the module design complexity of the camera; and integrating the image algorithm module on the camera side lets the camera be used with display devices of limited computing capability, expanding its application range.
Finally, it should be noted that: the above embodiments are only used for illustrating the technical solutions of the present application, and not for limiting the same; although the present application has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art will understand that: the technical solutions described in the foregoing embodiments may still be modified, or some or all of the technical features may be equivalently replaced; and the modifications or the substitutions do not make the essence of the corresponding technical solutions depart from the scope of the technical solutions of the embodiments of the present application.
The foregoing description, for purposes of explanation, has been presented in conjunction with specific embodiments. However, the illustrative discussions above are not intended to be exhaustive or to limit the embodiments to the precise forms disclosed above. Many modifications and variations are possible in light of the above teaching. The embodiments were chosen and described in order to best explain the principles and the practical application, to thereby enable others skilled in the art to best utilize the embodiments and various embodiments with various modifications as are suited to the particular use contemplated.

Claims (9)

1. A camera, comprising:
the first camera body is used for collecting RGB data;
the second camera body is connected with the first camera body; the second camera body is used for receiving the synchronous signal from the first camera body and acquiring image phase data according to the synchronous signal;
the holder device, the first camera body and the second camera body are placed on the holder device; the holder device is used for receiving control information transmitted by the display equipment; and controlling the rotation and/or the pitching of the first camera body and the rotation and/or the pitching of the second camera body according to the control information.
2. The camera of claim 1, further comprising:
the device comprises an image processing module, an image compression module, an image algorithm module and a phase data processing module;
the input end of the image processing module is connected with the output end of the first camera body, and the output end of the image processing module is respectively connected with the input end of the image compression module and the input end of the image algorithm module; the output end of the image compression module is connected with the display equipment; the input end of the phase data processing module is connected with the output end of the second camera body; the output end of the phase data processing module is respectively connected with the input end of the image algorithm module and the display device;
the phase data processing module is used for converting the image phase data into image depth data and transmitting the image depth data to the display equipment and the image algorithm module;
the image algorithm module is used for receiving a first algorithm instruction from the display device, performing algorithm processing on first image data and the depth data to obtain first algorithm result data, and transmitting the first algorithm result data to the display device, so that the display device renders and displays second image data and the image depth data according to the first algorithm result data.
3. The camera of claim 2, further comprising: a controller;
the image processing module, the image compression module and the image algorithm module are arranged on the controller;
the phase data processing module is arranged outside the controller.
4. The camera of claim 3, further comprising: an interface conversion module; the interface conversion module is arranged on the controller;
the input end of the interface conversion module is connected with the output end of the phase data processing module; the output end of the interface conversion module is connected with the display equipment;
the interface conversion module is used for receiving the image depth data transmitted by the phase data processing module, performing interface conversion on the image depth data and transmitting the image depth data to the display equipment.
5. A camera head according to claim 3, characterized in that said pan-tilt device comprises: the system comprises a processor, a motor control module and a motor;
the processor is respectively connected with the controller and the motor control module;
the motor control module is also connected with the motor;
the processor is used for acquiring the control information from the controller and transmitting the control information to the motor control module; and the motor control module is used for controlling the rotation and/or the pitching of the motor according to the control information.
6. The camera of claim 5, wherein the motor comprises a pitch motor and a horizontal motor;
the motor control module is specifically configured to control the pitching of the pitch motor and the horizontal rotation of the horizontal motor according to the control information.
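A minimal sketch of how a motor control module might route claim 6's control information to the two motors. The message fields and the mechanical travel limits are assumptions for illustration; the patent does not specify a message format or limits:

```python
from dataclasses import dataclass

@dataclass
class ControlInfo:
    pan_deg: float   # horizontal rotation target (hypothetical field)
    tilt_deg: float  # pitch target (hypothetical field)

def clamp(value: float, low: float, high: float) -> float:
    """Restrict a target to a closed interval."""
    return max(low, min(high, value))

def dispatch(control: ControlInfo) -> dict:
    """Split one control message into per-motor targets, clamped to
    assumed travel limits (pan +/-170 deg, tilt -30..+90 deg)."""
    return {
        "horizontal_motor": clamp(control.pan_deg, -170.0, 170.0),
        "pitch_motor": clamp(control.tilt_deg, -30.0, 90.0),
    }

print(dispatch(ControlInfo(pan_deg=200.0, tilt_deg=45.0)))
# {'horizontal_motor': 170.0, 'pitch_motor': 45.0}
```

Per claim 7, the clamped targets would then travel over the signal transmission circuit and connector to the motors, with feedback returning along the same path to the processor.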
7. The camera of claim 5 or 6, wherein the motor control module comprises: a connector, a signal transmission circuit, and a power supply circuit for supplying power to the motor;
the control information is transmitted to the motor through the signal transmission circuit and the connector, and a feedback signal from the motor is transmitted to the processor through the connector and the signal transmission circuit.
8. The camera of claim 3, wherein the controller comprises a system-on-chip.
9. A display device, comprising a display and the camera according to any one of claims 1 to 8, the camera being connected to the display device; wherein the display comprises a display screen assembly for presenting pictures and a driving assembly for driving the display of images; and the camera transmits the acquired image data to the display for display.
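Read together, claims 2 and 4 define a fixed data path from each camera body to the display device. As a reading aid (not part of the claims), the connections can be written out as an adjacency mapping and checked programmatically; the snake_case module names are this sketch's own labels:

```python
# Connection topology described in claims 2 and 4 of the patent.
connections = {
    "first_camera_body": ["image_processing_module"],
    "image_processing_module": ["image_compression_module", "image_algorithm_module"],
    "image_compression_module": ["display_device"],
    "second_camera_body": ["phase_data_processing_module"],
    "phase_data_processing_module": ["image_algorithm_module", "display_device"],
    "image_algorithm_module": ["display_device"],
}

def reaches_display(start: str, graph: dict) -> bool:
    """Depth-first check that data from `start` can reach the display device."""
    stack, seen = [start], set()
    while stack:
        node = stack.pop()
        if node == "display_device":
            return True
        if node in seen:
            continue
        seen.add(node)
        stack.extend(graph.get(node, []))
    return False

print(all(reaches_display(src, connections)
          for src in ("first_camera_body", "second_camera_body")))  # True
```

Both the RGB path (first camera body) and the phase/depth path (second camera body) terminate at the display device, which is what lets the display combine the two streams into a stereoscopic image as the abstract describes.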
CN202121349111.5U 2021-06-17 2021-06-17 Camera and display device Active CN215734456U (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202121349111.5U CN215734456U (en) 2021-06-17 2021-06-17 Camera and display device

Publications (1)

Publication Number Publication Date
CN215734456U true CN215734456U (en) 2022-02-01

Family

ID=80043175

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202121349111.5U Active CN215734456U (en) 2021-06-17 2021-06-17 Camera and display device

Country Status (1)

Country Link
CN (1) CN215734456U (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114745576A (en) * 2022-03-25 2022-07-12 上海合志信息技术有限公司 Family fitness interaction method and device, electronic equipment and storage medium
CN115576765A (en) * 2022-11-16 2023-01-06 南京芯驰半导体科技有限公司 Test method, test device, electronic equipment and storage medium

Similar Documents

Publication Publication Date Title
CN215734456U (en) Camera and display device
KR102170781B1 (en) Electronic device and method for processing image
CN111899680B (en) Display device and setting method thereof
KR102220118B1 (en) Display device and calibration method thereof
KR101677663B1 (en) Method and device for implementing analog high-definition image capturing
CN112672062B (en) Display device and portrait positioning method
CN111880711B (en) Display control method, display control device, electronic equipment and storage medium
US20210289165A1 (en) Display Device and Video Communication Data Processing Method
CN105681720A (en) Video playing processing method and device
CN115526787A (en) Video processing method and device
CN113014804A (en) Image processing method, image processing device, electronic equipment and readable storage medium
CN113473024A (en) Display device, holder camera and camera control method
US11284094B2 (en) Image capturing device, distribution system, distribution method, and recording medium
US20200252537A1 (en) Network-controlled 3d video capture
CN211266945U (en) 4K video all-in-one machine and audio and video system
CN112804547B (en) Interactive live broadcast system based on unmanned aerial vehicle VR makes a video recording
WO2022037215A1 (en) Camera, display device and camera control method
EP3110143A1 (en) Image transmission device and image transmission system
CN115328414A (en) Collaborative display method, electronic device, medium, and program product
CN106060481A (en) Video collection method and device of pan-tilt-zoom camera
CN113824941A (en) Projection display device, method and system
CN112218156A (en) Method for adjusting video dynamic contrast and display equipment
US20120019672A1 (en) Internet protocol camera for visual monitoring system
KR20200091196A (en) Method for augmented reality contents data optimization using user feedback
CN105323427B (en) Mobile terminal and method for realizing camera function in shutdown state

Legal Events

Date Code Title Description
GR01 Patent grant