CN114302101A - Display apparatus and data sharing method - Google Patents

Display apparatus and data sharing method

Info

Publication number
CN114302101A
CN114302101A (application CN202110642158.9A)
Authority
CN
China
Prior art keywords: camera, memory, camera application, application, image data
Prior art date
Legal status: Pending
Application number
CN202110642158.9A
Other languages
Chinese (zh)
Inventor
姜俊厚
刘健
吴汉勇
Current Assignee
Hisense Visual Technology Co Ltd
Original Assignee
Hisense Visual Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Hisense Visual Technology Co Ltd
Priority to CN202110642158.9A
Publication of CN114302101A


Abstract

The present application provides a display device and a data sharing method. The display device includes a display and a controller. The controller is configured to: upon receiving a call request, sent by a first camera application, to invoke the camera, allocate a camera memory and send a memory address pointing to that memory to the first camera application. When the first camera application returns the memory address, the controller stores the image data collected by the camera in the camera memory and simultaneously sends the memory address to a second camera application, so that the second camera application can acquire the image data from that address. The controller also sends the image data in the camera memory to the first camera application, which processes it to realize its own functions. In this way, multiple camera applications can obtain the image data collected by the camera at the same time, achieving data sharing among them.

Description

Display apparatus and data sharing method
Technical Field
The present application relates to the field of display device technologies, and in particular, to a display device and a data sharing method.
Background
With the rapid development of display devices, their functions have become increasingly rich and their performance increasingly powerful. A display device can provide bidirectional human-computer interaction and integrate audio/video, entertainment, data, and other functions to meet users' diverse and personalized needs. A display device may have a built-in or externally connected camera to realize functions such as video chat, photographing, and video recording.
Camera applications that require the use of a camera, such as "video call" or "mirror", may be installed in the display device. After the display device starts such a camera application, the camera collects multiple frames of image data, a picture is generated from the image data, and the picture is shown on the display, thereby realizing the relevant function of the camera application.
However, a camera can only be used by one camera application at a time. When multiple camera applications call the camera simultaneously, the camera application with the highest preset priority gets to use the camera, and the other camera applications cannot obtain the image data it collects.
Disclosure of Invention
The present application provides a display device and a data sharing method, to solve the problem that multiple camera applications in an existing display device cannot simultaneously acquire the image data collected by the camera.
In a first aspect, the present application provides a display device comprising a display and a controller. Wherein the controller is configured to perform the steps of:
upon receiving a call request, sent by a first camera application, to invoke the camera, allocating a camera memory; sending a memory address pointing to the camera memory to the first camera application; upon receiving the memory address returned by the first camera application, storing the image data collected by the camera in the camera memory, and sending the memory address to a second camera application so that the second camera application acquires the image data from the memory address; and sending the image data in the camera memory to the first camera application.
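The first-aspect flow can be sketched as a minimal in-process Python model. Everything here is an illustrative assumption, not from the patent: the class names, the example address 0x1000, and the frame bytes. A real device would use actual shared memory rather than Python objects.

```python
class CameraMemory:
    """Simulates the camera memory region allocated by the controller."""
    def __init__(self, address):
        self.address = address
        self.frames = []          # image data written by the controller

class Controller:
    def __init__(self):
        self.memory = None

    def handle_call_request(self, first_app):
        # Step 1: allocate camera memory and send its address to the caller.
        self.memory = CameraMemory(address=0x1000)   # illustrative address
        first_app.receive_address(self.memory.address)

    def handle_returned_address(self, address, frame, second_app):
        # Step 2: store the captured image data, then forward the same
        # address to the second application so both read the same frames.
        assert address == self.memory.address
        self.memory.frames.append(frame)
        second_app.receive_address(address)

class CameraApp:
    def __init__(self, name):
        self.name = name
        self.address = None
    def receive_address(self, address):
        self.address = address

# Usage: both applications end up holding the same memory address.
ctrl = Controller()
app1, app2 = CameraApp("video-call"), CameraApp("mirror")
ctrl.handle_call_request(app1)
ctrl.handle_returned_address(app1.address, b"frame-0", app2)
print(app1.address == app2.address)   # True
```

The key design point the sketch shows is that the controller, not the first application, owns the buffer; the applications only ever receive a pointer to it.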
In some implementations, the controller is further configured to, upon receiving the call request sent by the first camera application to invoke the camera:
call the camera service through the framework layer; control the camera service to call the hardware abstraction layer; establish a communication connection with the camera through the camera provider process of the hardware abstraction layer; send a start instruction to the camera; and receive the image data sent by the camera.
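The start-up chain just described — framework layer, camera service, hardware abstraction layer, camera provider, camera — can be traced with a toy sketch. The function names are invented for illustration and do not correspond to any real Android API:

```python
trace = []   # records the order in which the layers are invoked

def framework_call_camera_service():
    trace.append("framework: call camera service")
    camera_service_call_hal()

def camera_service_call_hal():
    trace.append("camera service: call hardware abstraction layer")
    hal_provider_connect()

def hal_provider_connect():
    trace.append("camera provider: connect to camera")
    camera_start()

def camera_start():
    trace.append("camera: start, send image data")

# Usage: one top-level call walks the whole chain, top to bottom.
framework_call_camera_service()
print(trace)
```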
In some implementations, the controller is further configured to, when performing the step of sending the image data in the camera memory to the first camera application:
send the image data in the camera memory to a decoder and control the decoder to decode the image data; and send the decoded image data to the first camera application.
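The decode step can be sketched as follows. The `decode` stand-in only simulates a codec by rewriting bytes; a real device would use a hardware decoder, and the frame contents are illustrative:

```python
def decode(raw):
    # Stand-in for hardware decoding (e.g. compressed frame -> raw pixels).
    return raw.replace(b"encoded", b"decoded")

def send_to_first_app(memory_frames):
    # Decode every frame held in the camera memory before delivery.
    return [decode(f) for f in memory_frames]

frames = [b"encoded-frame-0", b"encoded-frame-1"]
print(send_to_first_app(frames))
```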
In some implementations, the controller is further configured to, after performing the step of sending the memory address to the second camera application:
receive an acquisition request sent by the second camera application, the acquisition request requesting the image data stored in the camera memory pointed to by the memory address; and send the decoded image data to the second camera application.
In some implementations, the controller is further configured to:
upon receiving the memory address returned by the first camera application, send the memory address to a third camera application so that the third camera application can acquire the image data from the memory address.
In some implementations, the controller is further configured to, before performing the step of sending the memory address to the second camera application:
detect whether the second camera application has enabled the address acquisition permission; and, upon detecting that the second camera application has enabled the address acquisition permission, perform the step of sending the memory address to the second camera application.
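The permission gate can be sketched like this; the application names, the permission store, and the delivery callback are illustrative assumptions:

```python
# Illustrative permission store: which applications have enabled the
# address acquisition permission.
address_permission = {"mirror": True, "fitness": False}

def share_address(app_name, address, deliver):
    """Forward the memory address only if the permission is enabled."""
    if address_permission.get(app_name, False):
        deliver(app_name, address)
        return True
    return False

# Usage: only the application with the permission enabled gets the address.
delivered = {}
def deliver(name, addr):
    delivered[name] = addr

print(share_address("mirror", 0x1000, deliver))    # True
print(share_address("fitness", 0x1000, deliver))   # False
```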
In some implementations, the controller is further configured to:
in response to a query instruction input by a user for the address acquisition permission status, control the display to show the address acquisition permission status, so that the user can adjust the enabled state of each camera application's address acquisition permission accordingly; the address acquisition permission status includes all camera applications installed in the display device and the enabled state of each application's address acquisition permission.
In a second aspect, the present application further provides a display device comprising a display and a controller. Wherein the controller is configured to perform the steps of:
upon receiving a call request, sent by a first camera application, to invoke the camera, allocating a camera memory; sending a memory address pointing to the camera memory to the first camera application, so that the first camera application sends the memory address to a second camera application, the memory address enabling the second camera application to acquire the image data; upon receiving the memory address returned by the first camera application, storing the image data collected by the camera in the camera memory; and sending the image data in the camera memory to the first camera application.
In a third aspect, the present application further provides a display device comprising a display and a controller. Wherein the controller is configured to perform the steps of:
upon receiving a call request, sent by a first camera application, to invoke the camera, allocating a camera memory; sending a memory address pointing to the camera memory to the first camera application; upon receiving the memory address returned by the first camera application, storing the image data collected by the camera in the camera memory; and sending the image data in the camera memory to the first camera application and to a preset database, the preset database being used to provide the image data to other camera applications.
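The third-aspect variant, in which other applications read from a preset database instead of receiving the memory address, might be sketched as follows (the database is modeled as a plain list; all names are illustrative):

```python
preset_database = []   # stand-in for the preset database other apps query

def store_and_distribute(frame, first_app_frames):
    # The controller delivers each frame both to the first application
    # and to the preset database.
    first_app_frames.append(frame)
    preset_database.append(frame)

def other_app_fetch():
    # Any other camera application reads the shared frames from here.
    return list(preset_database)

# Usage: frames written for the first application are also visible
# to any other application via the database.
first_app_frames = []
store_and_distribute(b"frame-0", first_app_frames)
store_and_distribute(b"frame-1", first_app_frames)
print(other_app_fetch())
```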
In a fourth aspect, the present application provides a data sharing method, applied to a display device, the method including:
upon receiving a call request, sent by a first camera application, to invoke the camera, allocating a camera memory; sending a memory address pointing to the camera memory to the first camera application; upon receiving the memory address returned by the first camera application, storing the image data collected by the camera in the camera memory, and sending the memory address to a second camera application so that the second camera application acquires the image data from the memory address; and sending the image data in the camera memory to the first camera application.
According to the above technical solutions, the present application provides a display device and a data sharing method. Upon receiving a call request, sent by a first camera application, to invoke the camera, the controller allocates a camera memory and sends a memory address pointing to that memory to the first camera application. When the first camera application returns the memory address, the controller stores the image data collected by the camera in the camera memory and simultaneously sends the memory address to a second camera application, which can then acquire the image data from that address. The controller also sends the image data in the camera memory to the first camera application, which processes it to realize its own functions. In this way, multiple camera applications can obtain and process the image data collected by the camera at the same time, achieving data sharing among them and improving the intelligence of the display device.
Drawings
To explain the technical solution of the present application more clearly, the drawings used in the embodiments are briefly described below. It will be apparent to those skilled in the art that other drawings can be derived from these drawings without creative effort.
FIG. 1 illustrates a usage scenario of a display device according to some embodiments;
fig. 2 illustrates a hardware configuration block diagram of the control apparatus 100 according to some embodiments;
fig. 3 illustrates a hardware configuration block diagram of the display apparatus 200 according to some embodiments;
FIG. 4 illustrates a software configuration diagram in the display device 200 according to some embodiments;
FIG. 5 illustrates an interaction flow diagram for components of a display device in some embodiments;
FIG. 6 illustrates a user interface in a display in a possible embodiment;
FIG. 7 illustrates an application list interface in a display in a possible embodiment;
FIG. 8 is a flow diagram that illustrates the controller sending an open command in some embodiments;
FIG. 9 is a diagram illustrating a camera sharing mode confirmation message displayed on the display in one possible embodiment;
FIG. 10a is a diagram illustrating a camera application acquiring a memory address in a possible embodiment;
FIG. 10b is a diagram illustrating a camera application acquiring a memory address in a possible embodiment;
FIG. 10c is a diagram illustrating a camera application acquiring a memory address in a possible embodiment;
FIG. 11 is a diagram illustrating a display displaying the address acquisition permission status in a possible embodiment;
FIG. 12 shows a schematic diagram of a permission list in a possible embodiment;
FIG. 13 shows a flow diagram of one embodiment of a data sharing method.
Detailed Description
To make the purpose and embodiments of the present application clearer, exemplary embodiments of the present application are described clearly and completely below with reference to the accompanying drawings. Obviously, the described exemplary embodiments are only some, not all, of the embodiments of the present application.
It should be noted that the brief descriptions of the terms in the present application are only for the convenience of understanding the embodiments described below, and are not intended to limit the embodiments of the present application. These terms should be understood in their ordinary and customary meaning unless otherwise indicated.
The terms "first," "second," "third," and the like in the description and claims of this application and in the above-described drawings are used for distinguishing between similar or analogous objects or entities and not necessarily for describing a particular sequential or chronological order, unless otherwise indicated. It is to be understood that the terms so used are interchangeable under appropriate circumstances.
The terms "comprises" and "comprising," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a product or apparatus that comprises a list of elements is not necessarily limited to all elements expressly listed, but may include other elements not expressly listed or inherent to such product or apparatus.
The term "module" refers to any known or later developed hardware, software, firmware, artificial intelligence, fuzzy logic, or combination of hardware and/or software code that is capable of performing the functionality associated with that element.
Fig. 1 is a schematic diagram of a usage scenario of a display device according to an embodiment. As shown in fig. 1, the display apparatus 200 is also in data communication with a server 400, and a user can operate the display apparatus 200 through the smart device 300 or the control device 100.
In some embodiments, the control apparatus 100 may be a remote controller. Communication between the remote controller and the display device includes at least one of infrared protocol communication, Bluetooth protocol communication, or other short-distance communication methods, and the remote controller controls the display device 200 wirelessly or by wire. The user may control the display apparatus 200 by inputting user instructions through keys on the remote controller, voice input, control panel input, or the like.
In some embodiments, the smart device 300 may include any of a mobile terminal, a tablet, a computer, a laptop, an AR/VR device, and the like.
In some embodiments, the smart device 300 may also be used to control the display device 200. For example, the display device 200 is controlled using a camera application running on the smart device.
In some embodiments, the smart device 300 and the display device may also be used for communication of data.
In some embodiments, the display device 200 may also be controlled in ways other than through the control apparatus 100 and the smart device 300. For example, a voice instruction from the user may be received directly by a module configured inside the display device 200, or by a voice control apparatus provided outside the display device 200.
In some embodiments, the display device 200 is also in data communication with a server 400. The display device 200 may be communicatively connected through a local area network (LAN), a wireless local area network (WLAN), or other networks. The server 400 may provide various contents and interactions to the display apparatus 200. The server 400 may be one cluster or multiple clusters, and may include one or more types of servers.
In some embodiments, software steps executed by one step execution agent may be migrated on demand to another step execution agent in data communication therewith for execution. Illustratively, software steps performed by the server may be migrated to be performed on a display device in data communication therewith, and vice versa, as desired.
Fig. 2 exemplarily shows a block diagram of a configuration of the control apparatus 100 according to an exemplary embodiment. As shown in fig. 2, the control device 100 includes a controller 110, a communication interface 130, a user input/output interface 140, a memory, and a power supply. The control apparatus 100 may receive an input operation instruction from a user and convert the operation instruction into an instruction recognizable and responsive by the display device 200, serving as an interaction intermediary between the user and the display device 200.
In some embodiments, the communication interface 130 is used for external communication, and includes at least one of a WIFI chip, a bluetooth module, NFC, or an alternative module.
In some embodiments, the user input/output interface 140 includes at least one of a microphone, a touchpad, a sensor, a key, or an alternative module.
Fig. 3 shows a hardware configuration block diagram of the display apparatus 200 according to an exemplary embodiment.
In some embodiments, the display apparatus 200 includes at least one of a tuner demodulator 210, a communicator 220, a detector 230, an external device interface 240, a controller 250, a display 260, an audio output interface 270, a memory, a power supply, and a user interface.
In some embodiments, the controller comprises a central processor, a video processor, an audio processor, a graphics processor, RAM, ROM, and first through nth interfaces for input/output.
In some embodiments, the display 260 includes a display screen component for presenting pictures and a driving component that drives image display. It receives image signals output from the controller and displays video content, image content, menu manipulation interfaces, and user manipulation UI interfaces.
In some embodiments, the display 260 may be at least one of a liquid crystal display, an OLED display, and a projection display, and may also be a projection device and a projection screen.
In some embodiments, the tuner demodulator 210 receives broadcast television signals by wired or wireless reception and demodulates audio/video signals, as well as EPG data signals, from among a plurality of wireless or wired broadcast television signals.
In some embodiments, communicator 220 is a component for communicating with external devices or servers according to various communication protocol types. For example: the communicator may include at least one of a Wifi module, a bluetooth module, a wired ethernet module, and other network communication protocol chips or near field communication protocol chips, and an infrared receiver. The display apparatus 200 may establish transmission and reception of control signals and data signals with the control device 100 or the server 400 through the communicator 220.
In some embodiments, the detector 230 is used to collect signals of the external environment or interaction with the outside. For example, detector 230 includes a light receiver, a sensor for collecting ambient light intensity; alternatively, the detector 230 includes an image collector, such as a camera, which may be used to collect external environment scenes, attributes of the user, or user interaction gestures, or the detector 230 includes a sound collector, such as a microphone, which is used to receive external sounds.
In some embodiments, the external device interface 240 may include, but is not limited to, the following: high Definition Multimedia Interface (HDMI), analog or data high definition component input interface (component), composite video input interface (CVBS), USB input interface (USB), RGB port, and the like. The interface may be a composite input/output interface formed by the plurality of interfaces.
In some embodiments, the controller 250 and the tuner demodulator 210 may be located in separate devices; that is, the tuner demodulator 210 may also be located in a device external to the main device where the controller 250 is located, such as an external set-top box.
In some embodiments, the controller 250 controls the operation of the display device and responds to user operations through various software control programs stored in memory. The controller 250 controls the overall operation of the display apparatus 200. For example: in response to receiving a user command for selecting a UI object to be displayed on the display 260, the controller 250 may perform an operation related to the object selected by the user command.
In some embodiments, the object may be any selectable object, such as a hyperlink, an icon, or another actionable control. The operation related to the selected object may be, for example, displaying the page, document, or image linked to by a hyperlink, or launching the program corresponding to an icon.
In some embodiments, the controller includes at least one of a central processing unit (CPU), a video processor, an audio processor, a graphics processing unit (GPU), random access memory (RAM), read-only memory (ROM), first through nth interfaces for input/output, and a communication bus.
The CPU processor executes the operating system and the application instructions stored in the memory, and runs applications, data, and content according to the various interaction instructions received from the outside, so as to finally display and play various audio and video content. The CPU processor may include a plurality of processors, e.g., a main processor and one or more sub-processors.
In some embodiments, the graphics processor generates various graphics objects, such as icons, operation menus, and graphics displayed in response to user input instructions. The graphics processor includes an arithmetic unit, which operates on the various interactive instructions input by the user and displays objects according to their display attributes, and a renderer, which renders the objects produced by the arithmetic unit for presentation on the display.
In some embodiments, the video processor is configured to receive an external video signal and perform at least one of video processing operations such as decompression, decoding, scaling, noise reduction, frame rate conversion, resolution conversion, and image synthesis according to the standard codec protocol of the input signal, so as to obtain a signal that can be displayed or played on the display device 200.
In some embodiments, the video processor includes at least one of a demultiplexing module, a video decoding module, an image synthesis module, a frame rate conversion module, and a display formatting module. The demultiplexing module demultiplexes the input audio/video data stream. The video decoding module processes the demultiplexed video signal, including decoding and scaling. The image synthesis module superimposes and mixes the GUI signal, input by the user or generated by the graphics generator, with the scaled video image to produce an image signal for display. The frame rate conversion module converts the frame rate of the input video. The display formatting module converts the frame-rate-converted video output signal into a signal conforming to the display format, such as an output RGB data signal.
In some embodiments, the audio processor receives an external audio signal, decompresses and decodes it according to the standard codec protocol of the input signal, and performs at least one of noise reduction, digital-to-analog conversion, and amplification to obtain a sound signal that can be played through the speaker.
In some embodiments, the user may enter user commands on a graphical user interface (GUI) displayed on the display 260, and the user input interface receives the commands through the GUI. Alternatively, the user may input commands through a specific sound or gesture, which the user input interface recognizes through a sensor.
In some embodiments, a "user interface" is a medium for interaction and information exchange between an application or the operating system and the user; it enables conversion between an internal form of information and a form acceptable to the user. A commonly used presentation form of the user interface is a graphical user interface (GUI), i.e., a user interface displayed graphically and related to computer operations. It may consist of interface elements shown on the display screen of the electronic device, such as icons, windows, and controls, where controls may include icons, buttons, menus, tabs, text boxes, dialog boxes, status bars, navigation bars, widgets, and other visual interface elements.
In some embodiments, user interface 280 is an interface that may be used to receive control inputs (e.g., physical buttons on the body of the display device, or the like).
In some embodiments, the system of a display device may include a kernel, a command parser (shell), a file system, and applications. The kernel, shell, and file system together make up the basic operating system structure that allows users to manage files, run programs, and use the system. After power-on, the kernel is started, kernel space is activated, hardware is abstracted, hardware parameters are initialized, and the kernel maintains virtual memory, the scheduler, signals, and inter-process communication (IPC). After the kernel starts, the shell and user applications are loaded. An application is compiled into machine code when started, forming a process.
Referring to fig. 4, in some embodiments, the system is divided into four layers, which are, from top to bottom, the Applications layer (abbreviated "application layer"), the Application Framework layer (abbreviated "framework layer"), the Android runtime and system library layer (abbreviated "system runtime library layer"), and the kernel layer.
In some embodiments, at least one application runs in the application layer. These applications may be a window program of the operating system, a system setting program, a clock program, or the like, or applications developed by third-party developers. The application packages in the application layer are not limited to the above examples.
The framework layer provides an application programming interface (API) and a programming framework for the applications in the application layer. The framework layer includes some predefined functions and acts as a processing center that decides the actions of the applications in the application layer. Through the API interface, an application can access resources in the system and obtain system services during execution.
As shown in fig. 4, in the embodiment of the present application, the framework layer includes Managers, a Content Provider, and the like. The managers include at least one of the following modules: an Activity Manager, which interacts with all activities running in the system; a Location Manager, which provides system services or applications with access to the system location service; a Package Manager, which retrieves various information about the application packages currently installed on the device; a Notification Manager, which controls the display and clearing of notification messages; and a Window Manager, which manages the icons, windows, toolbars, wallpapers, and desktop components on the user interface.
In some embodiments, the activity manager manages the lifecycle of the various applications and the usual navigation fallback functions, such as controlling the exit, opening, and back navigation of applications. The window manager manages all window programs, for example obtaining the display screen size, determining whether a status bar is present, locking the screen, capturing the screen, and controlling changes of the display window (for example, shrinking the window, or applying shake or distortion animations).
In some embodiments, the system runtime library layer provides support for the layer above it, the framework layer. When the framework layer runs, the Android operating system executes the C/C++ libraries included in the system runtime library layer to implement the functions required by the framework layer.
In some embodiments, the kernel layer is the layer between hardware and software. As shown in fig. 4, the kernel layer includes at least one of the following drivers: audio driver, display driver, Bluetooth driver, camera driver, WiFi driver, USB driver, HDMI driver, sensor drivers (such as fingerprint sensor, temperature sensor, and pressure sensor), power driver, and so on.
As noted above, a camera can only be used by one camera application at a time. When multiple camera applications call the camera simultaneously, the camera application with the highest preset priority gets to use the camera, and the other camera applications cannot obtain the image data it collects.
A display device includes a display and a controller.
In some embodiments, the display device further comprises a camera for collecting image data.
The camera may be built into the display device as a detector, or externally connected to the display device as an external device. An external camera can be connected to the external device interface of the display device to access the display device. The user can use the camera to take photographs or record video on the display device, and the data collected by the camera is shown on the display for the user to view.
The camera may further comprise a lens assembly in which a photosensitive element and lenses are arranged. The lenses refract incoming light so that the image of the scene is projected onto the photosensitive element. Depending on the specification of the camera, the photosensitive element may be based on a CCD (Charge-Coupled Device) or CMOS (Complementary Metal Oxide Semiconductor) detection principle; it converts the optical signal into an electrical signal through a photosensitive material and outputs the converted electrical signal as image data. The camera can also acquire image data frame by frame at a set sampling frequency, so as to form video stream data from the image data.
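The frame-by-frame sampling just described can be illustrated with a minimal sketch; the function name and the stand-in frame source are hypothetical:

```python
def capture_stream(read_frame, sampling_hz, duration_s):
    """Collect image data frame by frame at a set sampling frequency,
    forming video stream data from the individual frames."""
    frames = []
    for _ in range(int(sampling_hz * duration_s)):
        frames.append(read_frame())  # one sensor readout -> one image frame
    return frames
```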
In some embodiments, camera applications may be installed in the display device, and these camera applications may call the camera to implement their respective functions. A camera application here refers to an application that needs to access the camera and can process the image data collected by the camera to implement related functions, such as video chat.
FIG. 5 illustrates a flow diagram for interaction of components of a display device in some embodiments.
In some embodiments, the user may view all applications installed in the display device. The user may send a query instruction for all applications to the display device, and the controller may control the display to show the application list in response to the query instruction. FIG. 6 illustrates a user interface in a display in a possible embodiment. The user may click "my applications" in the user interface to view all applications installed in the display device.
The user can select the application to be run from the application list and send a control instruction for starting a camera application to the display device, so as to start the corresponding camera application. Depending on the interaction mode between the display device and the user, the user can input the control instruction for starting the camera application through different interaction actions.
The user can enter the application list interface through key interaction on a control device (such as a remote controller), adjust the position of the focus cursor in the application list interface through the up, down, left, and right keys, and, once the focus cursor is on an application icon, press the confirm/OK key to input the control instruction, so as to start and run the corresponding camera application. Fig. 7 shows an application list interface in a feasible embodiment; the display device currently has 6 camera applications installed, and the user can click one of the icons to start the corresponding camera application.
The user may also input the control instruction for starting the camera application in other ways. For example, on a display device supporting touch operation, the user may directly tap the application icon in the application list interface to start running the corresponding camera application. On a display device with a built-in intelligent voice system, the user can control the display device to start and run the corresponding camera application by inputting a voice instruction such as "open camera application A".
After receiving a control instruction for starting a camera application sent by the user, the display device can run the corresponding camera application according to the instruction and display the interface corresponding to that camera application. The presented application interface differs according to the function of the camera application. For example, when the user inputs a control instruction for starting camera application A, the display device may run camera application A and present the home page of camera application A on the display.
In some embodiments, after the display device starts the camera application, the camera application may send a call request for calling the camera. In response to the call request, the controller may start the camera through the camera service (Camera Service) in the system framework layer, for example by sending a start instruction to the camera control service, which controls the camera to start after receiving the start instruction.
FIG. 8 illustrates a flow diagram for the controller to send an open command in some embodiments.
In some embodiments, when controlling the camera to be turned on, the controller may first call the camera service (Camera Service) through the framework layer to perform the open-camera operation. The camera service may then call the vendor's HAL (Hardware Abstraction Layer) through HIDL.
HIDL (HAL Interface Definition Language) is a description language for defining the interface between the HAL and its users, and is used for communication between independently compiled code libraries, i.e., for inter-process communication.
The controller may establish a communication connection with the camera using the camera provider process (CameraProvider) in the hardware abstraction layer.
In some embodiments, the camera provider process may establish a communication connection with the camera via the UVC (USB Video Class) protocol, the protocol standard defined for USB video capture devices.
In some embodiments, the camera provider process may also establish a communication connection with the camera via a MIPI (Mobile Industry Processor Interface) protocol.
After establishing the communication connection with the camera, the controller can send a start instruction to the camera so that the camera starts. The camera then begins to work, collecting image data and sending the image data to the display device.
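As a rough illustration of the layered open sequence described above (framework layer → camera service → vendor HAL over HIDL → camera provider → start instruction), the sketch below records the call chain as a trace. The function, the trace strings, and the `transport` parameter are illustrative assumptions, not an actual Android implementation:

```python
def open_camera(transport="UVC"):
    """Walk the layered open-camera sequence and return the call trace.

    `transport` models the provider-to-camera link (UVC or MIPI)."""
    trace = []
    trace.append("framework layer: call the camera service to open the camera")
    trace.append("camera service: call the vendor HAL through HIDL")
    trace.append(f"camera provider: connect to the camera via {transport}")
    trace.append("controller: send the start instruction to the camera")
    return trace
```

Calling `open_camera("MIPI")` would model the MIPI-connected variant mentioned earlier, with only the provider step changing.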
In some embodiments, the display device may send a start instruction to the camera while starting the camera application to control the camera to start running.
The camera can transmit the collected image data to the display device, and the controller can further send the image data to the corresponding camera application, so as to implement the related functions of the camera application for the user.
In some embodiments, the display device may be provided with a camera sharing mode. Under the camera sharing mode, a plurality of camera applications can simultaneously acquire image data acquired by the cameras so as to realize the sharing of the image data. Each camera application can acquire image data acquired by the camera and process the image data.
In some embodiments, the user may send a camera sharing mode instruction to the display device by operating a designated key of the remote controller. The correspondence between the camera sharing mode instruction and the remote controller key is bound in advance in practical applications. For example, a camera sharing mode key is arranged on the remote controller; when the user presses the key, the remote controller sends a camera sharing mode instruction to the controller, and the controller then controls the display device to enter the camera sharing mode. When the user presses the key again, the controller can control the display device to exit the camera sharing mode.
In some embodiments, the correspondence between the camera sharing mode instruction and a plurality of remote controller keys may also be bound in advance, so that when the user presses the bound combination of keys, the remote controller sends the camera sharing mode instruction. In a feasible embodiment, the keys bound to the camera sharing mode instruction are the direction keys (left, down, left, down) in sequence; that is, when the user presses the keys (left, down, left, down) consecutively within a preset time, the remote controller sends the camera sharing mode instruction to the controller. This binding method prevents the camera sharing mode instruction from being sent due to user misoperation. The binding between the camera sharing mode instruction and the keys described here is only exemplary; in practical applications it can be set according to the user's habits and is not limited here.
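The key-sequence binding above can be sketched as a detector that fires only when the bound keys arrive in order within the preset time. The class name, the 3-second window, and the injectable clock are assumptions made for illustration:

```python
import time

class KeySequenceDetector:
    """Fires when the bound key sequence is pressed within a time window."""

    def __init__(self, sequence, window_s=3.0, clock=time.monotonic):
        self.sequence = list(sequence)
        self.window_s = window_s
        self.clock = clock          # injectable for deterministic testing
        self.presses = []           # list of (timestamp, key)

    def press(self, key):
        """Record a key press; return True when the bound sequence has just
        been completed within the preset time window."""
        now = self.clock()
        self.presses.append((now, key))
        # discard presses that fall outside the window
        self.presses = [(t, k) for t, k in self.presses
                        if now - t <= self.window_s]
        recent = [k for _, k in self.presses[-len(self.sequence):]]
        return recent == self.sequence
```

A stray single press never completes the sequence, which is how this binding avoids accidental triggering.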
In some embodiments, the user may send the camera sharing mode instruction to the display device by voice input through a sound collector of the display device, for example a microphone, so as to control the display device to enter the camera sharing mode. An intelligent voice system can be arranged in the display device; it can recognize the user's voice and extract the instruction content input by the user. The user can input a preset wake-up word through the microphone to activate the intelligent voice system, after which the controller can respond to instructions input by the user; the camera sharing mode instruction is then input within a certain time so that the display device enters the camera sharing mode. For example, the user may first speak the wake-up word to activate the intelligent voice system and then input "enter camera sharing mode" to send the camera sharing mode instruction to the display device.
In some embodiments, the user may also send the camera sharing mode instruction to the display device through a preset gesture. The display device may detect the user's behavior through an image collector, such as the camera. When the user makes the preset gesture, the user is considered to have sent the camera sharing mode instruction to the display device. For example, it can be set that, when it is detected that the user draws a "V" shape, it is determined that the user has input the camera sharing mode instruction to the display device. The user can also send the camera sharing mode instruction to the display device through a preset action. For example, it can be set that, when it is detected that the user lifts the left foot and the right hand at the same time, it is determined that the user has input the camera sharing mode instruction to the display device.
In some embodiments, when the user controls the display device using a smart device, for example a mobile phone, the camera sharing mode instruction may also be sent to the display device. In practical applications, a control can be provided in the mobile phone through which the user selects whether to enter the camera sharing mode, thereby sending the camera sharing mode instruction to the controller; the controller then controls the display device to enter the camera sharing mode.
In some embodiments, when the user controls the display device using the mobile phone, a continuous click instruction may be issued to the mobile phone. A continuous click instruction means that, within a preset period, the number of times the user clicks the same area of the mobile phone touch screen exceeds a preset threshold. For example, when the user clicks a certain area of the mobile phone touch screen 3 times consecutively within 1 s, this is regarded as a continuous click instruction. After receiving the continuous click instruction, the mobile phone can send the camera sharing mode instruction to the display device, so that the controller controls the display device to enter the camera sharing mode.
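The continuous-click rule (e.g. 3 taps on the same area within 1 s) can be sketched as follows; the class and parameter names are illustrative:

```python
class ContinuousClickDetector:
    """Treats >= `threshold` taps on the same screen area within `period_s`
    seconds as one continuous click instruction."""

    def __init__(self, threshold=3, period_s=1.0):
        self.threshold = threshold
        self.period_s = period_s
        self.taps = []  # list of (timestamp, area_id)

    def tap(self, timestamp, area_id):
        """Record a tap; return True when the rule is satisfied."""
        self.taps.append((timestamp, area_id))
        recent = [a for t, a in self.taps
                  if timestamp - t <= self.period_s and a == area_id]
        return len(recent) >= self.threshold
```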
In some embodiments, when the user uses the mobile phone to control the display device, it may also be set that, when the touch pressure applied by the user to a certain area of the mobile phone touch screen is detected to exceed a preset pressure threshold, the mobile phone sends the camera sharing mode instruction to the display device.
A camera sharing mode option can further be provided in the UI interface of the display device; when the user clicks the option, the display device is controlled to enter or exit the camera sharing mode.
In some embodiments, to prevent the camera sharing mode from being triggered by mistake, when the controller receives the camera sharing mode command, the controller may control the display to display camera sharing mode confirmation information, so that the user performs secondary confirmation. Fig. 9 is a schematic diagram illustrating a camera sharing mode confirmation message displayed on the display in a possible embodiment.
In some embodiments, the display device is in the camera sharing mode. When a user opens camera application A, the camera application sends a call request to the controller, the call request indicating that the camera is to be called. In response to the call request, the controller may allocate a physical memory space as the camera memory, which is used to store the image data collected by the camera.
After allocating the camera memory, the controller may send the memory address to camera application A. When the camera application obtains the memory address, it can establish a connection with the camera memory pointed to by the memory address, so that image data can be acquired.
In some embodiments, the memory address may be a virtual address corresponding to the camera memory. The controller can map the camera memory into user space, thereby obtaining the virtual address of the virtual memory corresponding to the camera memory, the virtual address pointing to the camera memory. Camera application A can establish a connection with the camera memory through the virtual address, so that camera application A can acquire the image data in the camera memory through the virtual address.
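On a POSIX system, mapping a shared camera memory region into user space can be sketched with `mmap`. The helper names are hypothetical, and an unlinked temporary file stands in for the driver-allocated physical memory a real display device would use; the mmap objects play the role of the virtual addresses:

```python
import mmap
import os
import tempfile

def allocate_camera_memory(size):
    """Create a shareable region standing in for the camera memory.

    An unlinked temporary file provides the shared backing in this sketch;
    the returned file descriptor identifies the region."""
    fd, path = tempfile.mkstemp()
    os.unlink(path)           # keep the region anonymous; fd keeps it alive
    os.ftruncate(fd, size)
    return fd

def map_to_user_space(fd, size):
    """Map the camera memory into the caller's address space, yielding the
    'virtual address' through which a camera application reads image data."""
    return mmap.mmap(fd, size)

SIZE = 4096  # illustrative region size
fd = allocate_camera_memory(SIZE)
service_view = map_to_user_space(fd, SIZE)   # the camera service's mapping
service_view[:5] = b"frame"                  # store captured image data
app_a_view = map_to_user_space(fd, SIZE)     # camera application A's mapping
app_b_view = map_to_user_space(fd, SIZE)     # camera application B's mapping
```

Because all mappings share the same backing region, a frame written through the service's mapping is immediately visible through both applications' mappings, which is the essence of the sharing scheme described here.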
When camera application A receives the memory address, it may return the memory address to the controller. When the controller receives the memory address returned by camera application A, it can store the image data collected by the camera into the camera memory.
In some embodiments, the Camera application a may upload a memory address to a Camera Service (Camera Service), and after receiving the memory address, the Camera Service (Camera Service) may store image data acquired by the Camera in a Camera memory corresponding to the memory address.
In some embodiments, the display device further comprises a decoder. The controller may transmit the image data to the decoder to cause the decoder to decode the image data.
In some embodiments, the controller may send the memory address to the decoder, and the decoder may obtain, through the memory address, the camera memory mapped with the memory address, so as to obtain the image data in the camera memory. The decoder may decode the image data and send the decoded image data to the camera application a. At this time, the camera application a can obtain image data acquired by the camera, and can implement the related functions of the camera application a according to the image data.
In some embodiments, when the decoder sends the decoded image data to the camera application a, the controller may further control the display to display the image data, so that the user views the image pictures captured by the camera.
In some embodiments, when receiving the memory address returned by the camera application a, the controller may further send the memory address to another camera application. For example, the controller may send the memory address to the camera application B, and the camera application B may establish a connection relationship with the camera memory through the memory address, so that the camera application B may further acquire the image data in the camera memory through the memory address.
In some embodiments, the display device may be provided with a monitoring module, which detects in real time whether any camera application returns a memory address. When it detects that a camera application has returned a memory address, the monitoring module can read the memory address and send it to the other camera applications. Fig. 10a is a schematic diagram of camera applications acquiring the memory address in a feasible embodiment. After camera application A obtains the memory address, it uploads the memory address to the camera service (Camera Service). On monitoring this upload by camera application A, the monitoring module acquires the memory address and sends it to camera application B. Camera application A and camera application B then both hold the memory address corresponding to the camera memory, and can acquire the image data simultaneously.
In some embodiments, when the Camera application passes the memory address back to the Camera Service (Camera Service), the Camera Service (Camera Service) may directly send the memory address to other Camera applications. Fig. 10b is a schematic diagram illustrating that the camera application acquires the memory address in a feasible embodiment. After the Camera application a obtains the memory address, the memory address is uploaded to a Camera Service (Camera Service). The camera service stores the image data collected by the camera into the camera memory, and meanwhile, the camera service actively sends the memory address to the camera application B. The camera application A and the camera application B both acquire memory addresses corresponding to the camera memories, and further image data can be acquired simultaneously.
In some embodiments, when the Camera application a receives a memory address corresponding to a Camera memory, the memory address may be uploaded to a Camera Service (Camera Service), and meanwhile, the Camera application a may also directly send the memory address to another Camera application. Fig. 10c is a schematic diagram illustrating that the camera application acquires the memory address in a feasible embodiment. After the Camera application a obtains the memory address, the memory address is uploaded to a Camera Service (Camera Service), and meanwhile, the memory address is sent to the Camera application B. The camera application A and the camera application B both acquire memory addresses corresponding to the camera memories, and further image data can be acquired simultaneously.
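All three distribution schemes (Figs. 10a-10c) amount to forwarding the uploaded memory address to the other interested camera applications. The observer-style sketch below models that forwarding; the class, the callbacks, and the sample address value are illustrative, not the patent's actual camera service:

```python
class CameraServiceSketch:
    """Receives the memory address uploaded by the first camera application
    and notifies every registered listener (the monitoring module of
    Fig. 10a, or other camera applications directly as in Fig. 10b)."""

    def __init__(self):
        self.listeners = []
        self.memory_address = None

    def register(self, listener):
        self.listeners.append(listener)

    def upload_address(self, address):
        """Called when camera application A returns the memory address."""
        self.memory_address = address
        for notify in self.listeners:
            notify(address)  # forward the address to the other applications

received = []
service = CameraServiceSketch()
service.register(lambda addr: received.append(("app_B", addr)))
service.register(lambda addr: received.append(("app_C", addr)))
service.upload_address(0x7F00)  # illustrative memory address
```

In the Fig. 10c variant the first application itself would play the forwarding role; the notification pattern is the same.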
In some embodiments, when receiving the memory address, the camera application B may send a data obtaining request to the controller, where the obtaining request is used to obtain the image data in the camera memory pointed by the memory address.
When the controller receives a data acquisition request sent by the camera application B, the controller may send the image data in the camera memory to the decoder, so that the decoder decodes the image data.
After the decoder decodes the image data, the decoded image data may be sent to the camera application B. At this time, the camera application B may obtain image data acquired by the camera, and may implement the related function of the camera application B according to the image data.
In some embodiments, when receiving the memory address returned by the camera application a, the controller may further send the memory address to the plurality of camera applications. For example, the controller may send the memory address to the camera application B and the camera application C at the same time, so that both the camera application B and the camera application C can acquire the image data in the camera memory.
In some embodiments, the controller may set whether each camera application has the address acquisition permission, which refers to the permission allowing the controller to actively send the memory address to a camera application even when that application has not sent a call request.
When the controller receives the memory address returned by a camera application, it can detect whether other camera applications have opened the address acquisition permission. For the camera applications that have opened the address acquisition permission, the controller can actively send the memory address.
For example, the display device is installed with three camera applications, where the camera application a and the camera application B open the address acquisition permission, and the camera application C does not open the address acquisition permission. When a user starts the camera application A, the camera application A sends a calling request, and at the moment, after the controller distributes the memory of the camera, the memory address is sent to the camera application A. When detecting that the camera application a returns the memory address, the controller needs to detect whether other camera applications open the address acquisition permission. Because the camera application B opens the address acquisition permission, the controller can send the memory address to the camera application B, so that the camera application B can acquire the image data according to the memory address. Because the camera application C does not have the address obtaining permission to be opened, the controller does not send the memory address to the camera application C at this time, and therefore the camera application C cannot obtain the memory address.
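The permission check in this example can be sketched as a simple filter over the installed applications; the function name and the permission table are hypothetical:

```python
def distribute_address(address, caller, permissions):
    """Return the camera applications that receive the memory address:
    the caller itself, plus every other application whose address
    acquisition permission is opened."""
    recipients = [caller]
    for app, opened in permissions.items():
        if app != caller and opened:
            recipients.append(app)
    return recipients
```

With the permissions of the example (A and B opened, C not opened), a call request from A leads to the address being sent to A and B only.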
In some embodiments, the display device may show an address acquisition permission condition to the user, where the address acquisition permission condition includes all camera applications installed in the display device and address acquisition permission on states of all the camera applications. Fig. 11 is a schematic diagram of a display for displaying the address acquisition permission in a feasible embodiment.
The user can send the query instruction to the display device by operating a designated key of the remote controller, the correspondence between the query instruction and the remote controller key being bound in advance in practical applications. For example, a query key is arranged on the remote controller; when the user presses it, the remote controller sends the query instruction to the controller, and the controller controls the display to show the address acquisition permission condition, so that the user can set the address acquisition permission on state of each camera application accordingly.
Alternatively, the user calls up a settings UI menu with a set key, and the controller controls the OSD layer menu to display the address acquisition permission condition. The user can then set the address acquisition permission on state of each camera application according to the address acquisition permission condition.
The user can set the address acquisition permission starting state of each camera application by himself, at the moment, the controller can actively send the memory address to the camera application with the address acquisition permission started, and therefore the camera application with the address acquisition permission started can acquire image data collected by the camera, and data sharing is achieved.
In some embodiments, the display device may present to the user a list of permissions for all camera applications for which address acquisition permissions have been opened. Fig. 12 shows a schematic diagram of a rights list in a possible embodiment. The user can acquire all the camera applications with the address acquisition permission opened, and can select to add or delete the camera applications with the address acquisition permission opened.
In some embodiments, when the memory address returned by camera application A is received, the controller may store the image data collected by the camera into the camera memory. The controller then sends the image data in the camera memory to camera application A and to a preset database, which may provide the image data to other camera applications.
For example, the camera application B may send a data acquisition request to the controller, the acquisition request for acquiring image data. When the controller receives a data acquisition request sent by the camera application B, image data in a preset database may be sent to the camera application B. At this time, the camera application B may obtain image data acquired by the camera, and may implement the related function of the camera application B according to the image data.
In some embodiments, the controller may send image data in the camera memory to the decoder. The decoder sends the decoded data to the camera application A and a preset database.
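The preset database variant can be sketched as a small store that the decoder feeds in parallel with the first application, and that other applications query on request; all names are illustrative:

```python
class PresetDatabase:
    """Holds decoded frames so that camera applications other than the
    first one can fetch image data on request."""

    def __init__(self):
        self.frames = []

    def store(self, frame):
        self.frames.append(frame)

    def fetch_latest(self):
        """Serve a data acquisition request from another camera application."""
        return self.frames[-1] if self.frames else None

db = PresetDatabase()
app_a_inbox = []

def on_decoded(frame):
    """Decoder output path: deliver to camera application A and the database."""
    app_a_inbox.append(frame)
    db.store(frame)

on_decoded(b"frame-1")                # decoder emits a decoded frame
app_b_frame = db.fetch_latest()       # camera application B's request
```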
An embodiment of the present application further provides a data sharing method, which is applied to a display device. As shown in fig. 13, the method includes:
S1301: when a call request for calling the camera sent by a first camera application is received, allocating a camera memory;
S1302: sending a memory address pointing to the camera memory to the first camera application;
S1303: when the memory address returned by the first camera application is received, storing the image data collected by the camera into the camera memory, and sending the memory address to a second camera application, so that the second camera application acquires the image data according to the memory address;
S1304: sending the image data in the camera memory to the first camera application.
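Steps S1301-S1304 can be sketched end to end as follows; the `Controller` class, the address arithmetic, and the notification callback are illustrative assumptions rather than the patented implementation:

```python
class Controller:
    """End-to-end sketch of the data sharing method of Fig. 13."""

    def __init__(self):
        self.camera_memory = {}   # address -> stored image data
        self._next_addr = 0x1000  # illustrative address allocator

    def on_call_request(self):
        """S1301 + S1302: allocate a camera memory slot on a call request
        and return its address for the first camera application."""
        addr = self._next_addr
        self._next_addr += 0x1000
        self.camera_memory[addr] = None
        return addr

    def on_address_returned(self, addr, frame, notify_second_app):
        """S1303: store the captured image data into the camera memory and
        send the address to the second camera application."""
        self.camera_memory[addr] = frame
        notify_second_app(addr)

    def read(self, addr):
        """S1304 (and the second application's own read): fetch the image
        data stored at the given address."""
        return self.camera_memory[addr]
```

Both applications end up reading the same slot through the same address, which is the data sharing the method sets out to achieve.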
The same and similar parts in the embodiments in this specification may be referred to one another, and are not described herein again.
Those skilled in the art will readily appreciate that the techniques of the embodiments of the present invention may be implemented as software plus a required general purpose hardware platform. Based on such understanding, the technical solutions in the embodiments of the present invention may be essentially or partially implemented in the form of software products, which may be stored in a storage medium, such as ROM/RAM, magnetic disk, optical disk, etc., and include instructions for causing a computer device (which may be a personal computer, a server, or a network device, etc.) to execute the method in the embodiments or some parts of the embodiments of the present invention.
Finally, it should be noted that: the above embodiments are only used for illustrating the technical solutions of the present application, and not for limiting the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some or all of the technical features may be equivalently replaced; and the modifications or the substitutions do not make the essence of the corresponding technical solutions depart from the scope of the technical solutions of the embodiments of the present application.
The foregoing description, for purposes of explanation, has been presented in conjunction with specific embodiments. However, the illustrative discussions above are not intended to be exhaustive or to limit the embodiments to the precise forms disclosed above. Many modifications and variations are possible in light of the above teaching. The embodiments were chosen and described in order to best explain the principles and the practical application, to thereby enable others skilled in the art to best utilize the embodiments and various embodiments with various modifications as are suited to the particular use contemplated.

Claims (10)

1. A display device, comprising:
a display;
a controller configured to:
when a calling request for calling a camera, which is sent by a first camera application, is received, allocating a camera memory;
sending a memory address pointing to the camera memory to a first camera application;
when a memory address returned by a first camera application is received, storing image data acquired by a camera into a camera memory, and sending the memory address to a second camera application so that the second camera application acquires the image data according to the memory address;
and sending the image data in the camera memory to a first camera application.
2. The display device of claim 1, wherein the controller is further configured to:
when receiving a call request for calling a camera transmitted by a first camera application,
calling a camera service through a framework layer;
controlling the camera service to call a hardware abstraction layer;
establishing communication connection with the camera by using a camera provider process of the hardware abstraction layer;
sending a starting instruction to the camera;
and receiving image data sent by the camera.
3. The display device of claim 1, wherein the controller is further configured to:
in performing the step of sending the image data in the camera memory to the first camera application,
sending the image data in the memory of the camera to a decoder, and controlling the decoder to decode the image data;
and sending the decoded image data to the first camera application.
4. The display device of claim 3, wherein the controller is further configured to:
after performing the step of sending the memory address to the second camera application,
receiving an acquisition request sent by a second camera application, wherein the acquisition request is used for acquiring image data stored in the camera memory pointed by the memory address;
and sending the decoded image data to a second camera application.
5. The display device of claim 1, wherein the controller is further configured to:
and when a memory address returned by the first camera application is received, the memory address is sent to the third camera application, so that the third camera application can acquire image data according to the memory address.
6. The display device of claim 1, wherein the controller is further configured to:
before the step of sending the memory address to the second camera application is performed,
detecting whether the second camera application opens the address acquisition permission;
and when detecting that the second camera application opens the address acquisition right, executing the step of sending the memory address to the second camera application.
7. The display device of claim 6, wherein the controller is further configured to:
responding to a query instruction which is input by a user and indicates the condition of address acquisition permission, and controlling a display to display the condition of the address acquisition permission so that the user can adjust the address acquisition permission starting state applied by each camera according to the condition of the address acquisition permission;
the address acquisition permission condition comprises all camera applications installed in the display equipment and the address acquisition permission starting state of each camera application.
8. A display device, comprising:
a display;
a controller configured to:
allocate a camera memory upon receiving a call request, sent by a first camera application, to invoke a camera;
send a memory address pointing to the camera memory to the first camera application, so that the first camera application sends the memory address to a second camera application, the memory address enabling the second camera application to acquire image data;
store image data captured by the camera into the camera memory upon receiving the memory address returned by the first camera application;
and send the image data in the camera memory to the first camera application.
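The variant in claim 8, where the first application itself forwards the address to the second application, can be sketched as follows. This is an illustration, not the patent's implementation: the shared camera memory is modeled as an in-process dictionary, and every name (`controller_allocate`, `first_app`, `controller_store`) is hypothetical.

```python
# Minimal sketch of the claim-8 flow: the controller gives the memory address
# only to the first application, which forwards it to the second application
# and then returns it to the controller.

shared_memory = {}   # address -> image data; stands in for the camera memory

def controller_allocate():
    # Allocate camera memory and hand back its address.
    addr = 0x2000
    shared_memory[addr] = None
    return addr

def first_app(addr, second):
    second["addr"] = addr        # first application forwards the address
    return addr                  # ...and returns it to the controller

def controller_store(addr, frame):
    # On receiving the returned address, store the captured image data.
    shared_memory[addr] = frame

second_app = {"addr": None}
addr = controller_allocate()
returned = first_app(addr, second_app)
controller_store(returned, b"frame-xyz")
print(shared_memory[second_app["addr"]])   # second app reads via the forwarded address
```

The design point is that the controller never needs to know about the second application; address distribution is delegated to the first one.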
9. A display device, comprising:
a display;
a controller configured to:
allocate a camera memory upon receiving a call request, sent by a first camera application, to invoke a camera;
send a memory address pointing to the camera memory to the first camera application;
store image data captured by the camera into the camera memory upon receiving the memory address returned by the first camera application;
and send the image data in the camera memory to the first camera application and to a preset database, wherein the preset database is configured to provide the image data to other camera applications.
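The claim-9 variant replaces direct address sharing with a preset database that other camera applications query. The sketch below is hypothetical (the patent names no concrete storage); `PresetDatabase` and `capture_and_distribute` are illustrative only.

```python
# Hypothetical sketch of the claim-9 variant: besides delivering frames to the
# first application, the controller mirrors each frame into a "preset database"
# from which other camera applications can read.

class PresetDatabase:
    def __init__(self):
        self._frames = []

    def store(self, frame):
        self._frames.append(frame)

    def latest(self):
        # Other camera applications poll the database for the newest frame.
        return self._frames[-1] if self._frames else None

def capture_and_distribute(db, first_app_frames):
    # Camera memory modeled as a list of captured frames; each frame is sent
    # to the first camera application and copied into the preset database.
    camera_memory = [b"frame-A", b"frame-B"]
    for frame in camera_memory:
        first_app_frames.append(frame)   # delivery to the first camera application
        db.store(frame)                  # delivery to the preset database

db = PresetDatabase()
first_frames = []
capture_and_distribute(db, first_frames)
print(db.latest())
```

Compared with claim 8, this trades an extra copy per frame for decoupling: consumers need no memory address and no cooperation from the first application.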
10. A data sharing method, applied to a display device, the method comprising:
allocating a camera memory upon receiving a call request, sent by a first camera application, to invoke a camera;
sending a memory address pointing to the camera memory to the first camera application;
upon receiving the memory address returned by the first camera application, storing image data captured by the camera into the camera memory, and sending the memory address to a second camera application so that the second camera application acquires the image data according to the memory address;
and sending the image data in the camera memory to the first camera application.
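The method of claim 10 (the controller forwards the address to the second application once the first application returns it) can be sketched end to end. This is a stand-in model, not the patent's code: `CameraService` and `CameraApp`, the address values, and the frame bytes are all hypothetical.

```python
# Illustrative sketch of the claim-10 method, with an in-process dictionary
# standing in for the shared camera memory.

class CameraService:
    """Controller side: allocates camera memory and distributes its address."""

    def __init__(self):
        self._memory = {}          # address -> image data
        self._next_addr = 0x1000

    def handle_call_request(self, first_app):
        # Step 1: on the call request, allocate camera memory.
        addr = self._next_addr
        self._next_addr += 0x100
        self._memory[addr] = None
        # Step 2: send the memory address to the first application.
        returned = first_app.receive_address(addr)
        # Step 3: once the address is returned, store captured image data.
        if returned == addr:
            self._memory[addr] = b"frame-0001"   # stand-in for a captured frame
        return addr

    def read(self, addr):
        return self._memory.get(addr)

class CameraApp:
    def __init__(self, name):
        self.name = name
        self.addr = None

    def receive_address(self, addr):
        # Record the address and return it to confirm readiness.
        self.addr = addr
        return addr

service = CameraService()
first, second = CameraApp("video-call"), CameraApp("beauty-filter")
addr = service.handle_call_request(first)
second.receive_address(addr)   # step 3 continued: controller forwards the address
# Both applications now read the same image data through the shared address.
print(service.read(first.addr) == service.read(second.addr))  # True
```

In a real display device this dictionary would be a shared buffer (for example POSIX shared memory), but the handshake order (allocate, send, return, store, forward) is the substance of the claim.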
CN202110642158.9A 2021-06-09 2021-06-09 Display apparatus and data sharing method Pending CN114302101A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110642158.9A CN114302101A (en) 2021-06-09 2021-06-09 Display apparatus and data sharing method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110642158.9A CN114302101A (en) 2021-06-09 2021-06-09 Display apparatus and data sharing method

Publications (1)

Publication Number Publication Date
CN114302101A true CN114302101A (en) 2022-04-08

Family

ID=80963886

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110642158.9A Pending CN114302101A (en) 2021-06-09 2021-06-09 Display apparatus and data sharing method

Country Status (1)

Country Link
CN (1) CN114302101A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116775317A (en) * 2023-08-24 2023-09-19 广州希倍思智能科技有限公司 Data distribution method and device, storage medium and electronic equipment
CN116775317B (en) * 2023-08-24 2024-03-22 广州希倍思智能科技有限公司 Data distribution method and device, storage medium and electronic equipment

Similar Documents

Publication Publication Date Title
CN112672195A (en) Remote controller key setting method and display equipment
CN114302190A (en) Display device and image quality adjusting method
CN112667184A (en) Display device
CN113490024A (en) Control device key setting method and display equipment
CN112860331A (en) Display device and voice interaction prompting method
CN114077724A (en) Account management method and display device
CN113111214A (en) Display method and display equipment for playing records
CN113163258A (en) Channel switching method and display device
CN114302101A (en) Display apparatus and data sharing method
CN113064691B (en) Display method and display equipment for starting user interface
CN113132809B (en) Channel switching method, channel program playing method and display equipment
CN114390190B (en) Display equipment and method for monitoring application to start camera
CN113064534A (en) Display method and display equipment of user interface
CN113542882A (en) Method for awakening standby display device, display device and terminal
CN114302070A (en) Display device and audio output method
CN114302203A (en) Image display method and display device
CN113608715A (en) Display device and voice service switching method
CN112668546A (en) Video thumbnail display method and display equipment
CN112601116A (en) Display device and content display method
CN112584210A (en) Display device, video recording method and recorded file display method
CN112256449A (en) Interface calling method of webpage application program, display equipment and server
CN114302199A (en) Display apparatus and data sharing method
CN113076042B (en) Local media resource access method and display device
CN112199612B (en) Bookmark adding and combining method and display equipment
CN113064515B (en) Touch display device and USB device switching method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination