CN114302203A - Image display method and display device - Google Patents

Image display method and display device

Info

Publication number
CN114302203A
Authority
CN
China
Prior art keywords: image, image quality, camera, target, display
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110290335.1A
Other languages
Chinese (zh)
Inventor
何营昊
姜俊厚
于新磊
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hisense Visual Technology Co Ltd
Original Assignee
Hisense Visual Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hisense Visual Technology Co Ltd
Priority to CN202110290335.1A
Publication of CN114302203A
Legal status: Pending

Landscapes

  • Controls And Circuits For Display Device (AREA)

Abstract

The application provides an image display method and a display device. An application may call a camera built into the display device, or a camera externally connected to the display device, to capture images. After the camera is called, the display device decodes the original image captured by the camera and performs image quality processing on the decoded data using the application's target image quality parameters to obtain target image data; the display device then uses the target image data to display the image-quality-processed version of the original image on a video display layer of the user interface. In this way, no matter which camera is called on the display device, the display device can perform corresponding image quality processing on image data captured by different cameras, thereby meeting the image quality requirements of different applications and giving users a better camera experience.

Description

Image display method and display device
Technical Field
The present application relates to the field of display technologies, and in particular, to an image display method and a display device.
Background
At present, a display device may carry a built-in camera or be connected to an external camera to implement functions such as photographing and video calls. Both the built-in camera and the external camera have a certain image quality processing capability of their own: they can directly perform image quality processing on the captured image, and the display device then displays the processed image.
However, the scenes in which each application on the display device calls the camera differ, and different scenes place different image quality requirements on the captured images. Both the built-in camera and the external camera have only their own fixed image quality processing capability; if that capability cannot meet the image quality requirement of an application in a specific scene, the image finally displayed by the display device cannot meet the user's viewing requirement, which degrades the user experience.
Disclosure of Invention
The application provides an image display method and a display device, which are used to solve the problem that the image quality processing capability of the camera on an existing display device cannot meet the image quality requirement that an application places on images in a specific scene.
In a first aspect, the present application provides a display device comprising:
a display for displaying a user interface;
a controller configured to:
acquiring decoded data of an original image captured by a camera;
performing image quality processing on the decoded data by using a target image quality parameter to obtain target image data;
and displaying, by using the target image data, the image-quality-processed image of the original image on a video display layer of the user interface.
In some embodiments, the controller is further configured to:
flipping the decoded data to obtain a flipped image of the original image captured by the camera;
and performing image quality processing on the flipped decoded data by using the target image quality parameter to obtain the target image data.
In some embodiments, the controller is further configured to:
when the target application calls the camera, displaying, on the user interface, a prompt for the user to set an image mode;
detecting whether the user sets a target image mode;
and in the case that the user sets a target image mode, determining a target image quality parameter corresponding to the target application according to the target image mode.
In some embodiments, the controller is further configured to:
acquiring a default image quality parameter of the target application in the case that the user does not set a target image mode;
and performing image quality processing on the decoded data by using the default image quality parameter to obtain the target image data.
In some embodiments, the controller is further configured to:
identifying the type of the image shot by the camera; the image types at least comprise a picture type and a video type;
and modifying the target image quality parameter according to the image type.
In some embodiments, the controller is further configured to:
acquiring the image quality requirement of a target application calling the camera;
and modifying the target image quality parameter according to the image quality requirement.
In a second aspect, the present application also provides another display device, including:
a display for displaying a user interface;
a controller configured to:
acquiring decoded data of an original image captured by a camera in the case that the target application calling the camera needs image quality processing;
performing image quality processing on the decoded data by using a target image quality parameter to obtain target image data;
and displaying, by using the target image data, the image-quality-processed image of the original image on a video display layer of the user interface.
In some embodiments, the controller is further configured to:
in the case that the target application calling the camera does not need image quality processing, acquiring the image data stream of the original image captured by the camera;
and displaying the original image on a dynamic display layer of the user interface by using the image data stream.
In a third aspect, the present application provides an image display method, comprising:
acquiring decoded data of an original image captured by a camera;
performing image quality processing on the decoded data by using a target image quality parameter to obtain target image data;
and displaying, by using the target image data, the image-quality-processed image of the original image on a video display layer of a user interface.
In a fourth aspect, the present application further provides another image display method, including:
acquiring decoded data of an original image captured by a camera in the case that the target application calling the camera needs image quality processing;
performing image quality processing on the decoded data by using a target image quality parameter to obtain target image data;
and displaying, by using the target image data, the image-quality-processed image of the original image on a video display layer of a user interface.
As can be seen from the foregoing, the present application provides an image display method and a display device. An application may call the camera built into the display device, or a camera externally connected to the display device, to capture images. After the camera is called, the display device decodes the original image captured by the camera and performs image quality processing on the decoded data using the application's target image quality parameters to obtain target image data; the display device then uses the target image data to display the image-quality-processed version of the original image on a video display layer of the user interface. In this way, no matter which camera is called on the display device, the display device can perform corresponding image quality processing on image data captured by different cameras, thereby meeting the image quality requirements of different applications and giving users a better camera experience.
Drawings
In order to explain the technical solution of the present application more clearly, the drawings required by the embodiments are briefly described below; other drawings may be obtained by those skilled in the art from these drawings without inventive effort.
FIG. 1 illustrates a schematic diagram of a usage scenario of a display device according to some embodiments;
fig. 2 illustrates a hardware configuration block diagram of the control apparatus 100 according to some embodiments;
fig. 3 illustrates a hardware configuration block diagram of the display apparatus 200 according to some embodiments;
FIG. 4 illustrates a software configuration diagram in the display device 200 according to some embodiments;
FIG. 5 illustrates a schematic diagram of a camera image display on a display device 200, according to some embodiments;
FIG. 6 illustrates a schematic diagram of an Android Camera architecture based Camera image display, in accordance with some embodiments;
FIG. 7 illustrates a schematic view of an original image taken by a camera according to some embodiments;
FIG. 8 illustrates a schematic diagram of a display device 200 displaying an original image according to some embodiments;
FIG. 9 illustrates a schematic diagram of a display device 200 displaying a flipped image according to some embodiments;
FIG. 10 illustrates a schematic diagram of a display page of a display device 200 according to some embodiments;
FIG. 11 illustrates a schematic diagram of a display page while in a video call in accordance with some embodiments;
FIG. 12 illustrates a schematic diagram of image quality processing on a display device 200 according to some embodiments;
FIG. 13 illustrates a schematic diagram of an image mode selection page, according to some embodiments;
FIG. 14 shows a schematic diagram of a camera image display on a second display device 200 according to some embodiments;
FIG. 15 illustrates a schematic diagram of another image quality process on a display device 200, according to some embodiments;
FIG. 16 illustrates a schematic diagram of an image mode display, according to some embodiments;
FIG. 17 illustrates a schematic diagram of a setup menu according to some embodiments;
fig. 18 illustrates a schematic diagram of a process of adjusting target image quality parameters according to some embodiments;
fig. 19 illustrates another process for adjusting target image quality parameters, according to some embodiments;
FIG. 20 illustrates a schematic diagram of a video image display in accordance with some embodiments;
FIG. 21 illustrates a schematic diagram of another video image display in accordance with some embodiments;
FIG. 22 illustrates a schematic diagram of a camera image display on the third display device 200 according to some embodiments;
FIG. 23 illustrates a schematic diagram of a hint flow, according to some embodiments.
Detailed Description
To make the purpose and embodiments of the present application clearer, the following will clearly and completely describe the exemplary embodiments of the present application with reference to the attached drawings in the exemplary embodiments of the present application, and it is obvious that the described exemplary embodiments are only a part of the embodiments of the present application, and not all of the embodiments.
It should be noted that the brief descriptions of the terms in the present application are only for the convenience of understanding the embodiments described below, and are not intended to limit the embodiments of the present application. These terms should be understood in their ordinary and customary meaning unless otherwise indicated.
The terms "first," "second," "third," and the like in the description and claims of this application and in the above-described drawings are used for distinguishing between similar or analogous objects or entities and not necessarily for describing a particular sequential or chronological order, unless otherwise indicated. It is to be understood that the terms so used are interchangeable under appropriate circumstances.
The terms "comprises" and "comprising," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a product or apparatus that comprises a list of elements is not necessarily limited to all elements expressly listed, but may include other elements not expressly listed or inherent to such product or apparatus.
The term "module" refers to any known or later developed hardware, software, firmware, artificial intelligence, fuzzy logic, or combination of hardware and/or software code that is capable of performing the functionality associated with that element.
FIG. 1 illustrates a schematic diagram of a usage scenario of a display device according to some embodiments. As shown in fig. 1, the display apparatus 200 is also in data communication with a server 400, and a user can operate the display apparatus 200 through the smart device 300 or the control device 100.
In some embodiments, the control apparatus 100 may be a remote controller. Communication between the remote controller and the display device includes infrared protocol communication, Bluetooth protocol communication, or other short-distance communication methods, and the remote controller controls the display device 200 in a wireless or wired manner. The user may control the display apparatus 200 by inputting user instructions through keys on the remote controller, voice input, control panel input, or the like.
In some embodiments, the smart device 300 may include any of a mobile terminal, a tablet, a computer, a laptop, an AR/VR device, and the like.
In some embodiments, the smart device 300 may also be used to control the display device 200. For example, the display device 200 is controlled using an application program running on the smart device.
In some embodiments, the smart device 300 and the display device may also be used for communication of data.
In some embodiments, the display device 200 may also be controlled in a manner other than the control apparatus 100 and the smart device 300, for example, the voice instruction control of the user may be directly received by a module configured inside the display device 200 to obtain a voice instruction, or may be received by a voice control apparatus provided outside the display device 200.
In some embodiments, the display device 200 is also in data communication with a server 400. The display device 200 may be communicatively connected through a local area network (LAN), a wireless local area network (WLAN), or other networks. The server 400 may provide various contents and interactions to the display apparatus 200. The server 400 may be one cluster or a plurality of clusters, and may include one or more types of servers.
In some embodiments, software steps executed by one step execution agent may be migrated on demand to another step execution agent in data communication therewith for execution. Illustratively, software steps performed by the server may be migrated to be performed on a display device in data communication therewith, and vice versa, as desired.
Fig. 2 illustrates a block diagram of a hardware configuration of the control apparatus 100 according to some embodiments. As shown in fig. 2, the control device 100 includes a controller 110, a communication interface 130, a user input/output interface 140, a memory, and a power supply. The control apparatus 100 may receive an input operation instruction from a user and convert the operation instruction into an instruction recognizable and responsive by the display device 200, serving as an interaction intermediary between the user and the display device 200.
In some embodiments, the communication interface 130 is used for external communication, and includes at least one of a WIFI chip, a bluetooth module, NFC, or an alternative module.
In some embodiments, the user input/output interface 140 includes at least one of a microphone, a touchpad, a sensor, a key, or an alternative module.
Fig. 3 illustrates a hardware configuration block diagram of a display device 200 according to some embodiments.
In some embodiments, the display apparatus 200 includes at least one of a tuner demodulator 210, a communicator 220, a detector 230, an external device interface 240, a controller 250, a display 260, an audio output interface 270, a memory, a power supply, a user interface.
In some embodiments the controller comprises a central processor, a video processor, an audio processor, a graphics processor, a RAM, a ROM, a first interface to an nth interface for input/output.
In some embodiments, the display 260 includes a display screen component for displaying pictures and a driving component for driving image display, and is used to receive image signals output by the controller and to display video content, image content, menu manipulation interfaces, user manipulation UI interfaces, and the like.
In some embodiments, the display 260 may be at least one of a liquid crystal display, an OLED display, and a projection display, and may also be a projection device and a projection screen.
In some embodiments, the tuner demodulator 210 receives broadcast television signals by wired or wireless reception, and demodulates audio/video signals and EPG data signals from a plurality of wireless or wired broadcast television signals.
In some embodiments, communicator 220 is a component for communicating with external devices or servers according to various communication protocol types. For example: the communicator may include at least one of a Wifi module, a bluetooth module, a wired ethernet module, and other network communication protocol chips or near field communication protocol chips, and an infrared receiver. The display apparatus 200 may establish transmission and reception of control signals and data signals with the control device 100 or the server 400 through the communicator 220.
In some embodiments, the detector 230 is used to collect signals of the external environment or interaction with the outside. For example, detector 230 includes a light receiver, a sensor for collecting ambient light intensity; alternatively, the detector 230 includes an image collector, such as a camera, which may be used to collect external environment scenes, attributes of the user, or user interaction gestures, or the detector 230 includes a sound collector, such as a microphone, which is used to receive external sounds.
In some embodiments, the external device interface 240 may include, but is not limited to, the following: high Definition Multimedia Interface (HDMI), analog or data high definition component input interface (component), composite video input interface (CVBS), USB input interface (USB), RGB port, and the like. The interface may be a composite input/output interface formed by the plurality of interfaces.
In some embodiments, the controller 250 and the tuner demodulator 210 may be located in separate devices; that is, the tuner demodulator 210 may also be located in an external device, such as an external set-top box, relative to the main device where the controller 250 is located.
In some embodiments, the controller 250 controls the operation of the display device and responds to user operations through various software control programs stored in memory. The controller 250 controls the overall operation of the display apparatus 200. For example: in response to receiving a user command for selecting a UI object to be displayed on the display 260, the controller 250 may perform an operation related to the object selected by the user command.
In some embodiments, the object may be any one of selectable objects, such as a hyperlink, an icon, or other actionable control. The operations related to the selected object are: displaying an operation connected to a hyperlink page, document, image, or the like, or performing an operation of a program corresponding to the icon.
In some embodiments the controller comprises at least one of a Central Processing Unit (CPU), a video processor, an audio processor, a Graphics Processing Unit (GPU), a RAM Random Access Memory (RAM), a ROM (Read-Only Memory), a first to nth interface for input/output, a communication Bus (Bus), and the like.
A CPU processor is used for executing operating system and application program instructions stored in the memory, and for executing various application programs, data, and contents according to interactive instructions received from external input, so as to finally display and play various audio and video contents. The CPU processor may include a plurality of processors, for example a main processor and one or more sub-processors.
In some embodiments, a graphics processor for generating various graphics objects, such as: at least one of an icon, an operation menu, and a user input instruction display figure. The graphic processor comprises an arithmetic unit, which performs operation by receiving various interactive instructions input by a user and displays various objects according to display attributes; the system also comprises a renderer for rendering various objects obtained based on the arithmetic unit, wherein the rendered objects are used for being displayed on a display.
In some embodiments, the video processor is configured to receive an external video signal and perform at least one of video processing such as decompression, decoding, scaling, noise reduction, frame rate conversion, resolution conversion, and image synthesis according to the standard codec protocol of the input signal, so as to obtain a signal that can be directly displayed or played on the display device 200.
In some embodiments, the video processor includes at least one of a demultiplexing module, a video decoding module, an image composition module, a frame rate conversion module, a display formatting module, and the like. The demultiplexing module is used for demultiplexing the input audio and video data stream. And the video decoding module is used for processing the video signal after demultiplexing, including decoding, scaling and the like. And the image synthesis module is used for carrying out superposition mixing processing on the GUI signal input by the user or generated by the user and the video image after the zooming processing by the graphic generator so as to generate an image signal for display. And the frame rate conversion module is used for converting the frame rate of the input video. And the display formatting module is used for converting the received video output signal after the frame rate conversion, and changing the signal to be in accordance with the signal of the display format, such as an output RGB data signal.
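As a rough illustration only, the module chain described above can be pictured as a sequence of processing stages that the video signal passes through in order. The Java sketch below uses hypothetical stage names and does not correspond to the actual video processor hardware.

```java
// Illustrative sketch of the video processor stages described above.
// All stage types are hypothetical; real hardware performs these steps in dedicated blocks.
import java.util.List;

interface VideoStage {
    byte[] process(byte[] signal);   // each stage transforms the video signal and passes it on
}

class VideoProcessorSketch {
    // Demultiplex -> decode -> compose with GUI -> convert frame rate -> format for the display
    private final List<VideoStage> stages;

    VideoProcessorSketch(VideoStage demux, VideoStage decode, VideoStage compose,
                         VideoStage frameRateConvert, VideoStage displayFormat) {
        this.stages = List.of(demux, decode, compose, frameRateConvert, displayFormat);
    }

    byte[] run(byte[] inputStream) {
        byte[] signal = inputStream;
        for (VideoStage stage : stages) {
            signal = stage.process(signal);
        }
        return signal;   // e.g. an RGB data signal matching the display format
    }
}
```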
In some embodiments, the audio processor is configured to receive an external audio signal, decompress and decode the received audio signal according to a standard codec protocol of the input signal, and perform at least one of noise reduction, digital-to-analog conversion, and amplification processing to obtain a sound signal that can be played in the speaker.
In some embodiments, a user may enter user commands on a Graphical User Interface (GUI) displayed on display 260, and the user input interface receives the user input commands through the Graphical User Interface (GUI). Alternatively, the user may input the user command by inputting a specific sound or gesture, and the user input interface receives the user input command by recognizing the sound or gesture through the sensor.
In some embodiments, a "user interface" is a media interface for interaction and information exchange between an application or operating system and a user that enables conversion between an internal form of information and a form that is acceptable to the user. A commonly used presentation form of the User Interface is a Graphical User Interface (GUI), which refers to a User Interface related to computer operations and displayed in a graphical manner. It may be an interface element such as an icon, a window, a control, etc. displayed in the display screen of the electronic device, where the control may include at least one of an icon, a button, a menu, a tab, a text box, a dialog box, a status bar, a navigation bar, a Widget, etc. visual interface elements.
In some embodiments, user interface 280 is an interface that may be used to receive control inputs (e.g., physical buttons on the body of the display device, or the like).
In some embodiments, a system of a display device may include a Kernel (Kernel), a command parser (shell), a file system, and an application program. The kernel, shell, and file system together make up the basic operating system structure that allows users to manage files, run programs, and use the system. After power-on, the kernel is started, kernel space is activated, hardware is abstracted, hardware parameters are initialized, and virtual memory, a scheduler, signals and interprocess communication (IPC) are operated and maintained. And after the kernel is started, loading the Shell and the user application program. The application program is compiled into machine code after being started, and a process is formed.
Referring to fig. 4, in some embodiments, the system is divided into four layers, which are an Application (Applications) layer (abbreviated as "Application layer"), an Application Framework (Application Framework) layer (abbreviated as "Framework layer"), an Android runtime (Android runtime) and system library layer (abbreviated as "system runtime library layer"), and a kernel layer from top to bottom.
In some embodiments, at least one application program runs in the application program layer, and the application programs may be windows (windows) programs carried by an operating system, system setting programs, clock programs or the like; or an application developed by a third party developer. In particular implementations, the application packages in the application layer are not limited to the above examples.
The framework layer provides an application programming interface (API) and a programming framework for applications. The application framework layer includes a number of predefined functions and acts as a processing center that decides the actions of the applications in the application layer. Through the API interface, an application can access resources in the system and obtain system services during execution.
As shown in fig. 4, in the embodiment of the present application, the application framework layer includes a manager (Managers), a Content Provider (Content Provider), and the like, where the manager includes at least one of the following modules: an Activity Manager (Activity Manager) is used for interacting with all activities running in the system; the Location Manager (Location Manager) is used for providing the system service or application with the access of the system Location service; a Package Manager (Package Manager) for retrieving various information related to an application Package currently installed on the device; a Notification Manager (Notification Manager) for controlling display and clearing of Notification messages; a Window Manager (Window Manager) is used to manage the icons, windows, toolbars, wallpapers, and desktop components on a user interface.
In some embodiments, the activity manager is used to manage the lifecycle of the various applications as well as general navigation fallback functions, such as controlling the exit, opening, and back operations of applications. The window manager is used to manage all window programs, such as obtaining the display screen size, determining whether there is a status bar, locking the screen, capturing the screen, and controlling changes of the display window (for example, shrinking the display window, or displaying it shaken or distorted).
In some embodiments, the system runtime library layer provides support for the framework layer above it; when the framework layer runs, the Android operating system runs the C/C++ libraries included in the system runtime library layer to implement the functions required by the framework layer.
In some embodiments, the kernel layer is a layer between hardware and software. As shown in FIG. 4, the kernel layer includes at least one of the following drivers: an audio driver, a display driver, a Bluetooth driver, a camera driver, a WIFI driver, a USB driver, an HDMI driver, a sensor driver (such as a fingerprint sensor, temperature sensor, or pressure sensor), a power driver, and the like.
The display device 200 may have a built-in camera or be connected to an external camera, so as to implement functions such as photographing and video calls. Both the built-in camera and the external camera of the display device 200 have a certain image quality processing capability: they can directly perform image quality processing on the captured image, and the display device 200 then displays the processed image.
The user interface displayed by the display device 200 may include a video display layer (video layer) and a dynamic display layer (osd layer). Whether an application calls the built-in camera of the display device 200 or an external camera to capture an image, the display device 200 can directly display on the osd layer of the user interface the original image that has been image-quality processed by the camera itself.
However, the scenes in which each application on the display device 200 calls the camera differ, and different scenes place different image quality requirements on the captured images. For example, if application A is dedicated to capturing images or videos, its requirement on image quality is high; application B, by contrast, does not pay attention to image quality and only requires smooth image display, so its requirement on image quality is low. Meanwhile, the image quality processing capabilities of cameras themselves may differ: if application A calls a camera with low image quality processing capability, the image processed by that camera cannot meet the image quality requirement of application A.
It can be seen that, whether the camera of the display device 200 is built in or externally connected, the camera has only its own image quality processing capability; if this capability cannot meet the image quality requirement of an application in a specific scene, the image finally displayed by the display device 200 cannot meet the user's viewing requirement, which affects the user experience.
To solve the problem that the image displayed by the display device 200 cannot meet the requirements of an application scene because the camera's image quality processing capability is insufficient, as shown in FIG. 5, after detecting that an application calls the camera, the display device 200 in the embodiment of the present application may perform image quality processing again on the image captured by the camera according to the image quality requirement of the specific application, and display the processed image on the video layer of the user interface, so that the finally displayed image meets the image quality requirement of the application. In this process, the controller 250 in the display device 200 is configured to: acquire the original image captured by the camera from the camera, and decode the original image to obtain decoded data; then perform image quality processing on the decoded data by using the target image quality parameters provided by the target application calling the camera, so as to obtain processed target image data; and finally display the image-quality-processed image of the original image, that is, the target image, on the video layer of the user interface by using the target image data. The target image quality parameter represents the image quality requirement that the target application calling the camera places on the image.
Both the video layer and the osd layer can display complete image content. The difference between them is that the osd layer can only directly display the image data stream sent by the camera, whereas the video layer can display data obtained by processing that image data stream. In the embodiment of the present application, to avoid the situation where the image processed only by the camera itself fails to meet the image quality requirements of the application, the image data stream of the original image, already image-quality processed by the camera, needs to be decoded and then image-quality processed again; and since the image data stream of the original image obtained from the camera has been decoded, the processed target image data can no longer be displayed through the osd layer. In the embodiment of the present application, the target image data is therefore displayed on the video layer. In this way, the image after image quality processing can be displayed, the displayed image can meet the image quality requirement of the target application, and the user experience is guaranteed.
Since the camera itself can process the captured image and its output image data is an encoded data stream, in order to perform secondary processing on the image data, the controller 250 needs to first decode the image data stream of the original image captured by the camera, and then perform image quality processing on the decoded data.
FIG. 6 illustrates a schematic diagram of a Camera image display based on the Android Camera architecture, according to some embodiments.
In some embodiments, the controller 250 may be configured to decode the image data stream and perform image quality processing and image display on the decoded data based on the Android Camera architecture. As shown in FIG. 6, after the target application calls the Camera, the target application registers with the upper-layer Android Camera architecture, and a server in the Android Camera architecture instantiates a corresponding bottom-layer decoding client. The bottom-layer client communicates with the bottom-layer omx (decoding component), telling omx when decoding should occur, while receiving the decoding parameters set by the target application. When the camera captures an image, the server side takes out the camera image data stream in h264 format through an ioctl call and sends the code stream to the bottom-layer omx for decoding. After receiving the image data stream, omx invokes its decoding capability, decodes the image data stream, and stores the resulting decoded data at a kernel data address according to the decoding parameters.
After decoding, omx sends the kernel data address back to the upper-layer Android Camera architecture. The nvtd display (display component) in the framework acquires the decoded data from the corresponding address, and the render (rendering component) performs image quality processing on the decoded data to obtain the target image data. Finally, nvtd display displays the target image data on the video layer of the user interface.
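For orientation, the flow just described can be summarized as decode, re-apply the application's quality parameters, then render on the video layer. The following Java sketch is a minimal illustration under that assumption; the OmxDecoder, Renderer, and VideoLayerDisplay types are hypothetical placeholders for the components in FIG. 6, not real framework APIs.

```java
// Hypothetical sketch of the decode / image-quality / display hand-off described above.
// None of these types are real Android framework classes.
public class CameraFrameFlowSketch {

    interface OmxDecoder {                       // stands in for the bottom-layer omx component
        byte[] decode(byte[] h264Stream);        // returns the decoded frame data
    }

    interface Renderer {                         // stands in for the render component
        byte[] applyQuality(byte[] decodedData, double[] targetQualityParams);
    }

    interface VideoLayerDisplay {                // stands in for the nvtd display component
        void show(byte[] targetImageData);       // puts the processed frame on the video layer
    }

    private final OmxDecoder omx;
    private final Renderer renderer;
    private final VideoLayerDisplay display;

    public CameraFrameFlowSketch(OmxDecoder omx, Renderer renderer, VideoLayerDisplay display) {
        this.omx = omx;
        this.renderer = renderer;
        this.display = display;
    }

    /** One camera frame: decode the h264 stream, re-apply the application's
     *  target image quality parameters, and render on the video layer. */
    public void onCameraFrame(byte[] h264Stream, double[] targetQualityParams) {
        byte[] decodedData = omx.decode(h264Stream);                        // decoded data of the original image
        byte[] targetImageData = renderer.applyQuality(decodedData, targetQualityParams);
        display.show(targetImageData);                                      // displayed on the video layer
    }
}
```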
The image in the embodiment of the application can generally refer to the content such as pictures, videos and the like shot by the camera. For a picture, the image data stream is data after picture coding; for video, the image data stream is the data after encoding each frame of the video.
In some embodiments, the camera used on the display device 200 is typically a front-facing camera. In order for the image displayed on the display device 200 to form a mirror image of the photographed subject, the controller 250 is further configured to flip the decoded data to obtain a flipped image of the original image captured by the camera. For example, if person A runs in a real scene while raising the left arm and lifting the right leg, then from the camera's point of view the action of person A is raising the right arm and lifting the left leg, as shown in FIG. 7. If the controller 250 displayed FIG. 7 directly on the display device 200, the situation shown in FIG. 8 would occur: person A raises the left arm in front of the display device 200, while person A shown on the display device 200 raises the right arm. This manner of display makes it difficult for the user in front of the display device 200 to determine which direction to move, which affects the user experience. Therefore, the controller 250 needs to flip the decoded data so that the image displayed on the display device 200 is a mirror image of the original image, as shown in FIG. 9; in this way, when the mirrored image of person A is displayed on the display device 200, the movement direction of person A in the image is the same as the movement direction of person A in the real scene.
The controller 250 may then be further configured to perform image quality processing on the flipped decoded data using the target image quality parameters, thereby obtaining the target image data.
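The mirror flip itself can be pictured as a horizontal reversal of each row of the decoded pixel buffer. The standalone Java example below assumes a simple row-major ARGB int[] frame; it is only an illustration, not the device's actual implementation.

```java
// Illustrative horizontal (mirror) flip of a decoded frame stored as one int per pixel.
// The frame layout (width x height, row-major) is an assumption for this sketch.
public class MirrorFlip {

    /** Returns a new frame in which every row of pixels is reversed left-to-right. */
    public static int[] flipHorizontally(int[] pixels, int width, int height) {
        int[] flipped = new int[pixels.length];
        for (int y = 0; y < height; y++) {
            for (int x = 0; x < width; x++) {
                // the pixel at column x moves to column (width - 1 - x) in the same row
                flipped[y * width + (width - 1 - x)] = pixels[y * width + x];
            }
        }
        return flipped;
    }

    public static void main(String[] args) {
        int[] frame = {1, 2, 3, 4, 5, 6};               // a 3x2 test frame
        int[] mirrored = flipHorizontally(frame, 3, 2);  // -> {3, 2, 1, 6, 5, 4}
        System.out.println(java.util.Arrays.toString(mirrored));
    }
}
```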
FIG. 10 illustrates a schematic diagram of a display page of display device 200 according to some embodiments.
"media asset 1" shown in fig. 10 may correspond to "target application a", and "target application a" is an application that can make a video call. When the user selects "asset 1" on the page as shown in fig. 10, "asset 1" calls the camera to the display device 200, and after the video call connection is successful, the video image a of the user in front of the display device 200 is shot by the camera and displayed in the manner as shown in fig. 11, and the video image b of the other party of the video call acquired through the network is displayed in the upper right corner of the display image.
If the current "target application a" has a high requirement on the quality of the displayed video image a, the controller 250 in the embodiment of the present application acquires the image data stream of the user video captured by the camera and the target quality parameter set on the "target application a" after the "target application a" calls the camera and when the camera starts capturing the user video content; then decoding the image data stream to obtain the decoded data of each frame of image; and then, carrying out image quality processing on the decoded data of each frame of image by using the target image quality parameters so that each frame of image can meet the image quality requirement when the user carries out video call.
In some embodiments, in order to provide a better user experience, some target applications may offer the user a selection of image modes and further adjust the image quality of the finally presented image according to the user's selection. Before the user selects an image mode, the display device 200 displays an image mode selection page to the user on the display page after the target application is opened. As shown in FIG. 12, in this process the controller 250 is configured to display a prompt for setting the image mode to the user when the target application calls the camera. The prompt is presented on a selection page as shown in FIG. 13, which also provides the user with several selectable image modes, such as a standard mode, a vivid mode, a movie mode, and a game mode. The user selects a mode according to his or her own needs, that is, sets a target image mode.
In some embodiments, different image modes correspond to different image quality parameters, and the target image quality parameter of the target application is determined after the user selects the target image mode. In this process, as shown in FIG. 12, the controller 250 is further configured to: in the case that the user sets a target image mode, determine the target image quality parameter corresponding to the target application according to the target image mode, and process the decoded data according to the target image quality parameter, so that the processed target image data meets the image quality requirement of the target application.
For example, if the user selects the vivid mode, the target image quality parameters corresponding to the vivid mode differ from those of the other image modes in the parameters used to adjust image color. In this case, the controller 250 processes the decoded data using the target image quality parameters corresponding to the vivid mode to adjust the color vividness of the target image, so that the image quality of the target image meets the requirements of the vivid mode.
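One way to picture the mode-to-parameter relationship is a simple lookup table. The sketch below uses the mode names from the selection page above, but the parameter fields and concrete values are invented for illustration only.

```java
// Illustrative mapping from a user-selected image mode to a set of image quality parameters.
// The parameter fields and the concrete values are invented for this sketch.
import java.util.Map;

class ImageModeParamsSketch {

    enum ImageMode { STANDARD, VIVID, MOVIE, GAME }

    record QualityParams(double brightness, double contrast, double saturation) {}

    // Example table: the vivid mode mainly raises the color-related parameter.
    private static final Map<ImageMode, QualityParams> MODE_TABLE = Map.of(
            ImageMode.STANDARD, new QualityParams(1.00, 1.00, 1.00),
            ImageMode.VIVID,    new QualityParams(1.05, 1.10, 1.30),
            ImageMode.MOVIE,    new QualityParams(0.95, 1.05, 1.00),
            ImageMode.GAME,     new QualityParams(1.00, 1.00, 1.05));

    /** Returns the target image quality parameters for the mode the user selected. */
    static QualityParams targetParamsFor(ImageMode selectedMode) {
        return MODE_TABLE.get(selectedMode);
    }
}
```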
Alternatively, some target applications provide the user with an option similar to a "standard mode" for maintaining the original image quality. If the user selects an image mode of this type, the image captured by the camera called by the target application does not need image quality processing, and the original image quality is used. In this process, as shown in FIG. 14, the controller 250 may also be configured to display the image data stream captured by the camera directly on the osd layer, that is, not to perform image quality processing again on the original image captured by the camera.
In some embodiments, the user may also select the "cancel" option after viewing the selection page shown in FIG. 13, or directly exit the image mode selection; in this case the user can be considered not to have selected a target image mode. In this process, as shown in FIG. 15, the controller 250 is configured to perform image quality processing on the decoded data using the default image quality parameters to obtain the target image data. The default image quality parameters are basic image quality parameters set by a technician according to the usage scene, functional requirements, and the like of the target application. When the user has no special requirement on image quality, the controller 250 may perform image quality processing using the default image quality parameters of the target application, thereby ensuring that the displayed image meets the most basic image quality requirement of the target application.
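The fallback to default parameters can be expressed as a simple check on whether the user made a selection. The snippet below is a self-contained illustration; the parameter record and default values are assumptions.

```java
// Illustrative fallback: use the application's default parameters when no target mode is set.
import java.util.Optional;

class DefaultParamsFallbackSketch {

    record QualityParams(double brightness, double contrast, double saturation) {}

    // Default parameters are assumed to be configured per application by a technician.
    static final QualityParams APP_DEFAULTS = new QualityParams(1.0, 1.0, 1.0);

    /** If the user set a target mode, use its parameters; otherwise fall back to the defaults. */
    static QualityParams resolve(Optional<QualityParams> userSelectedParams) {
        return userSelectedParams.orElse(APP_DEFAULTS);
    }
}
```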
In addition, in some embodiments, regardless of whether the user selects a target image mode, the display device 200 may display the current image mode on the image while the image is displayed. For example, as shown in FIG. 16, if the user selects the vivid mode, the controller 250 displays the image after image quality processing on the video layer and also displays the current "vivid mode" at a certain position on the image. The position of the image mode indicator is not fixed; it is generally placed so as not to block the main content of the image.
In some embodiments, the user may select the image mode again during image display; in this case, the user only needs to input an image mode selection instruction to the display device 200 to call up the selection page again. The image mode selection instruction may be input through function keys such as "up", "down", "left", "right", and "home" on the control apparatus 100, or a voice instruction may be input by voice. For example, on the display page shown in FIG. 17, the user can call up a setting menu containing several function items by pressing the "home" key on the control apparatus 100, and "image mode" is one of these function items. The user may then select the "image mode" option on the display page, thereby controlling the display device 200 to display again a selection page containing several image modes.
Fig. 18 illustrates a schematic diagram of a process of adjusting target image quality parameters according to some embodiments.
In some embodiments, some target applications may place different image quality requirements on the image because of their own usage attributes. For example, application A is dedicated to taking pictures, so the image displayed by application A needs to meet higher requirements on image color, contrast, brightness, and the like while also meeting the image quality requirement set by the user; application B, by contrast, is dedicated to video calls, so the image it displays only needs to meet a basic image quality requirement rather than a higher one, which ensures that the video has better fluency and a lower risk of delay. Therefore, in some embodiments, as shown in FIG. 18, even if the user has set a corresponding target image mode, the target image quality parameter may be further adjusted according to the actual requirement of the target application, so that the target application's requirement on image quality is also realized while the user's requirement on image quality is satisfied.
In this process, the controller 250 is further configured to: first acquire the image quality requirement of the target application from the target application calling the camera, and then modify the target image quality parameter according to that requirement. It should be noted that the image quality parameter in the embodiment of the present application does not refer to a single parameter, but to a series of parameters for adjusting image brightness, color, contrast, motion compensation, noise, and the like. When the target image quality parameter is modified, one or more of its parameters may be modified according to the image quality requirement of the target application.
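Modifying "one or more" of the parameters can be sketched as merging application-specific overrides into the current target parameter set. In the example below the parameter names and the override mechanism are assumptions made for illustration.

```java
// Illustrative merge of an application's image quality requirement into the target parameters.
// The parameter names and the override mechanism are assumptions made for this sketch.
import java.util.HashMap;
import java.util.Map;

class RequirementMergeSketch {

    /** The target parameters are kept as a name/value map so that any subset can be overridden. */
    static Map<String, Double> applyRequirement(Map<String, Double> targetParams,
                                                Map<String, Double> appRequirement) {
        Map<String, Double> adjusted = new HashMap<>(targetParams);
        adjusted.putAll(appRequirement);      // only the keys the application cares about change
        return adjusted;
    }

    public static void main(String[] args) {
        Map<String, Double> target = new HashMap<>(Map.of(
                "brightness", 1.0, "contrast", 1.0, "saturation", 1.0, "noiseReduction", 1.0));
        // e.g. a photo-centric application asks for richer color and contrast
        Map<String, Double> photoAppRequirement = Map.of("saturation", 1.2, "contrast", 1.1);
        System.out.println(applyRequirement(target, photoAppRequirement));
    }
}
```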
Alternatively, in some cases the target application is not a functional application dedicated to photographing or video calls, but a general application that provides both a photographing function and a video-shooting function. The image quality requirement that the target application places on the image may then be determined according to the shooting scene of the camera, that is, what kind of image the camera is capturing. If the camera is taking a picture, the current requirement of the target application on the image needs to meet higher requirements on image color, contrast, brightness, and the like. If the camera is shooting video content, then in order to reduce the delay of video display, the current requirement of the target application is to reduce the delay risk, and the parameters corresponding to time-consuming processes such as motion compensation and noise reduction need to be modified.
Therefore, in some embodiments, as shown in FIG. 19, the controller 250 is further configured to identify the type of image captured by the camera, such as a picture type or a video type, according to the shooting mode of the camera, and then modify the target image quality parameter accordingly according to the image type.
Take the example of a target application calling the camera to shoot a video in which the number of fingers the user holds up changes from one to two. If no additional image quality processing is performed, then, as shown in FIG. 20, the process of the user's fingers changing from one to two can be displayed on the display device 200 in synchronization with the video image. If image quality processing is performed, however, a series of operations such as motion compensation and noise reduction must be applied to every frame of the video, and the processing time per frame grows; as shown in FIG. 21, the user may already be holding up two fingers while the display device 200 still shows an image with one finger, which causes a delay in image display. To avoid this problem, according to the method in the above embodiments, when it is detected that the camera is shooting a video, time-consuming parameter items such as Memc (motion compensation) and noise reduction in the target image quality parameters are modified, and the motion compensation and noise reduction processes are turned off, so that the speed of image quality processing is increased and the delay risk of video image display is reduced.
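The picture-versus-video adjustment can be sketched as switching off the time-consuming items when the image type is video. Modeling MEMC and noise reduction as boolean flags, as below, is an assumption for illustration only.

```java
// Illustrative adjustment of the target parameters by image type: for video, the
// time-consuming motion-compensation and noise-reduction steps are switched off
// to reduce per-frame processing time and display delay.
class ImageTypeAdjustSketch {

    enum ImageType { PICTURE, VIDEO }

    static class QualityParams {
        boolean memcEnabled = true;            // motion compensation (Memc)
        boolean noiseReductionEnabled = true;  // noise reduction
        double saturation = 1.0;
    }

    static QualityParams adjustForType(QualityParams params, ImageType type) {
        if (type == ImageType.VIDEO) {
            params.memcEnabled = false;            // avoid the per-frame delay shown in FIG. 21
            params.noiseReductionEnabled = false;
        }
        return params;
    }
}
```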
As can be seen from the above, in the case where the camera itself already performs basic image quality processing on the original image, the display device 200 in the embodiment of the present application may, when the target application calls the camera to capture images, perform a second round of image quality processing on the image data of the camera-processed original image using the target image quality parameters set by the user, and display the target image on the video layer of the user interface using the processed target image data. In this way, no matter whether the camera called by the target application is the built-in camera of the display device 200 or an external camera, the display device 200 can perform corresponding image quality processing on the image data of different cameras, thereby meeting the image quality requirements of different applications and different users and bringing users a better camera experience.
As described in the previous embodiments, the user interface displayed by the display 260 includes a video layer and an osd layer. When the display device 200 needs to perform image quality processing on an image shot by the camera, the processed image data needs to be displayed on a video layer; when the display device 200 does not need to perform image quality processing on the image captured by the camera, the image data stream captured by the camera can be directly displayed on the osd layer.
Furthermore, as shown in FIG. 22, the display device 200 in the embodiment of the present application may also detect, after the target application calls the camera, whether the target application needs image quality processing. In this process, the controller 250 is configured to: after the target application calls the camera, detect whether the target application needs image quality processing. If the target application calling the camera needs image quality processing, the original image captured by the camera is acquired from the camera and then decoded to obtain decoded data; the decoded data is then image-quality processed using the target image quality parameters provided by the target application to obtain the target image data; and finally the image-quality-processed image of the original image, that is, the target image, is displayed on the video layer of the user interface using the target image data.
If the target application does not need image quality processing, the controller 250 is further configured, as shown in FIG. 22, to obtain the image data stream of the original image from the camera and then display the original image captured by the camera on the osd layer of the user interface using that image data stream.
When the controller 250 detects whether the target application needs image quality processing, the detection may be performed according to the decoding parameters set by the target application itself. In general, in addition to basic decoding parameters such as a decoding frame rate and a decoding format, the decoding parameters of the target application include a field indicating whether the video layer is required for displaying the image. If the content of this field indicates that the target application needs the video layer to display the image, the controller 250 determines that the target application needs image quality processing; if the content of the field indicates that the target application does not need the video layer to display the image, the controller 250 determines that the target application does not need image quality processing.
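The detection step can be pictured as reading one field of the application's decoding parameters. The field name needsVideoLayer and the surrounding structure in the sketch below are invented for illustration.

```java
// Illustrative detection of whether the target application needs secondary image quality
// processing, based on a field in its decoding parameters. Field names are assumptions.
class ProcessingDetectionSketch {

    static class DecodingParams {
        int frameRate;              // basic decoding parameters
        String format;              // e.g. "h264"
        boolean needsVideoLayer;    // hypothetical field: does the app require video-layer display?
    }

    /** The application needs image quality processing exactly when it asks for video-layer display. */
    static boolean needsImageQualityProcessing(DecodingParams params) {
        return params.needsVideoLayer;
    }
}
```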
It should also be noted that a technician may estimate, according to the usage scene, functional requirements, and the like of the target application, whether the target application has a higher requirement on image quality. If the basic image quality processing of the camera can meet the usage requirement of the target application, the technician may determine that the target application does not need the image quality processing of the display device 200 and modify the field content to indicate that image quality processing is not needed; conversely, if the basic image quality processing of the camera cannot easily meet the usage requirement of the target application, the target application is considered to need the image quality processing of the display device 200, and the technician may modify the field content to indicate that image quality processing is needed.
The controller 250 may still decode the image data stream and perform image quality processing and image display on the decoded data based on the Android Camera architecture. After the target application calls the Camera, the target application registers with the upper-layer Android Camera architecture, and during registration it can be determined, according to the decoding parameters of the target application, whether video layer display is required. If it is required, a server in the Android Camera architecture instantiates a corresponding bottom-layer decoding client. The bottom-layer client communicates with the bottom-layer omx (decoding component), telling omx when decoding should occur, while receiving the basic decoding parameters set by the target application. When video data from the camera is available, the server side extracts the camera image data stream in h264 format through an ioctl call and sends the code stream to the bottom-layer omx for decoding. After receiving the image data stream, omx invokes its decoding capability, decodes the image data stream, and stores the resulting decoded data at a kernel data address according to the decoding parameters.
After decoding, omx sends the kernel data address back to the upper-layer Android Camera architecture. The nvtd display (display component) in the framework acquires the decoded data from the corresponding address, and the render (rendering component) performs image quality processing on the decoded data to obtain the target image data. Finally, nvtd display displays the target image data on the video layer of the user interface.
If, when the target application registers, the Android Camera architecture determines from the decoding parameters that video layer display is not needed and osd layer display is needed, the image captured by the camera can be displayed directly through the normal Android Camera process, that is, the camera's image data stream is displayed directly on the osd layer.
FIG. 23 illustrates a schematic diagram of a hint flow, according to some embodiments.
In some embodiments, as shown in fig. 23, when the image captured by the camera called by the target application does not need image quality processing, the display device 200 may send a prompt to the user, reminding the user that the image is currently displayed with the camera's own image quality parameters; when the image captured by the camera called by the target application does need image quality processing, the display device 200 also sends a prompt to the user, reminding the user to select a corresponding image mode. In this process, the controller 250 is further configured to: detect whether the target application needs image quality processing; when the target application does not need image quality processing, display on the display 260 a prompt indicating that the image is displayed with the camera's parameters; and when the target application needs image quality processing, display on the display 260 a prompt asking the user to select an image mode. The page for selecting the image mode can still be seen in fig. 10 of the previous embodiment.
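This prompt branch can be illustrated with the following hypothetical helper; the prompt texts are placeholders, not the wording actually used by the display device 200.

```java
/** Hypothetical helper returning the prompt text for the branch described above. */
final class ImageQualityPrompt {

    static String promptFor(boolean targetAppNeedsProcessing) {
        if (targetAppNeedsProcessing) {
            // The display device applies its own processing, so ask the user to pick an image mode.
            return "Please select an image mode for the camera picture.";
        }
        // The camera's own image quality parameters are used, so simply inform the user.
        return "The picture is displayed with the camera's own image quality parameters.";
    }

    private ImageQualityPrompt() { }
}
```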
As can be seen from the above, the display device 200 provided in the embodiment of the present application can capture images with its own camera or with an external camera. After the target application calls the camera, if the target application needs further image quality processing, the display device 200 decodes the image data stream shot by the camera and performs image quality processing on the decoded data with the target image quality parameters of the application, thereby obtaining the target image data; the display device 200 then uses the target image data to display the image-quality-processed image on the video layer of the user interface. If the target application does not need further image quality processing, the display device 200 may display the image data stream shot by the camera directly on the osd layer, thereby displaying the original image as captured.
Therefore, no matter which camera the target application on the display device 200 calls, the display device 200 in the embodiment of the present application can perform corresponding image quality processing on the image data of different cameras, thereby meeting the image quality requirements of different applications and bringing users a better camera experience.
In addition, in order to meet the image quality requirements of different applications, an embodiment of the present application further provides an image display method, which can be applied to the display device 200 in the foregoing embodiments and implemented by the controller 250, and which specifically includes the following steps: acquiring decoded data of an image data stream shot by a camera; performing image quality processing on the decoded data by using the target image quality parameters to obtain target image data; and displaying the target image shot by the camera on the video layer of the user interface shown on the display 260 by using the target image data.
Meanwhile, in order to meet the image quality requirements of different applications, another image display method is provided in an embodiment of the present application. This method may also be applied to the display device 200 in the foregoing embodiments and implemented by the controller 250, and may specifically include the following steps: in a case where the target application calling the camera needs image quality processing, acquiring decoded data of the image data stream shot by the camera; performing image quality processing on the decoded data by using the target image quality parameters to obtain target image data; and displaying the target image shot by the camera on the video layer of the user interface shown on the display 260 by using the target image data. In a case where the target application calling the camera does not need image quality processing, acquiring the image data stream shot by the camera, and displaying the target image shot by the camera on the osd layer of the user interface shown on the display 260 by using the image data stream.
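A compact sketch of this second method is given below, using hypothetical interfaces for the camera source, the layered display, and the image quality processor; it only mirrors the branch between the video layer path and the osd layer path described above.

```java
/** Hypothetical camera source exposing both the raw stream and its decoded data. */
interface CameraSource {
    byte[] readEncodedStream();   // image data stream shot by the camera
    byte[] readDecodedData();     // decoded data of that stream
}

/** Hypothetical display abstraction with a video layer and an osd layer. */
interface LayeredDisplay {
    void showOnVideoLayer(byte[] targetImageData);
    void showOnOsdLayer(byte[] imageDataStream);
}

/** Hypothetical processor applying the target image quality parameters. */
interface QualityProcessor {
    byte[] apply(byte[] decodedData);
}

/** Sketch of the second image display method: branch on whether the target application needs image quality processing. */
final class ImageDisplayMethod {
    private final CameraSource camera;
    private final LayeredDisplay display;
    private final QualityProcessor processor;

    ImageDisplayMethod(CameraSource camera, LayeredDisplay display, QualityProcessor processor) {
        this.camera = camera;
        this.display = display;
        this.processor = processor;
    }

    void displayCameraImage(boolean targetAppNeedsProcessing) {
        if (targetAppNeedsProcessing) {
            byte[] decoded = camera.readDecodedData();   // acquire decoded data of the image data stream
            byte[] target = processor.apply(decoded);    // image quality processing -> target image data
            display.showOnVideoLayer(target);            // display on the video layer of the user interface
        } else {
            byte[] stream = camera.readEncodedStream();  // acquire the image data stream as-is
            display.showOnOsdLayer(stream);              // display directly on the osd layer
        }
    }
}
```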
Since the image display methods in the embodiments of the present application can be applied to the controller 250 in the foregoing embodiments, the remaining details of these methods can refer to the contents of the foregoing embodiments and are not repeated here.
Finally, it should be noted that: the above embodiments are only used for illustrating the technical solutions of the present application, and not for limiting the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some or all of the technical features may be equivalently replaced; and the modifications or the substitutions do not make the essence of the corresponding technical solutions depart from the scope of the technical solutions of the embodiments of the present application.
The foregoing description, for purposes of explanation, has been presented in conjunction with specific embodiments. However, the illustrative discussions above are not intended to be exhaustive or to limit the embodiments to the precise forms disclosed above. Many modifications and variations are possible in light of the above teaching. The embodiments were chosen and described in order to best explain the principles and the practical application, to thereby enable others skilled in the art to best utilize the embodiments and various embodiments with various modifications as are suited to the particular use contemplated.

Claims (10)

1. A display device, comprising:
a display for displaying a user interface;
a controller configured to:
acquiring decoding data of an original image shot by a camera;
performing image quality processing on the decoded data by using the target image quality parameter to obtain target image data;
and displaying the image of the original image after image quality processing on a video display layer of the user interface by using the target image data.
2. The display device of claim 1, wherein the controller is further configured to:
flipping the decoded data to obtain a flipped image of the original image shot by the camera;
and performing image quality processing on the flipped decoded data by using the target image quality parameters to obtain target image data.
3. The display device of claim 1, wherein the controller is further configured to:
when the target application calls the camera, displaying a prompt for setting an image mode to a user on the user interface;
detecting whether a user sets a target image mode;
and under the condition that a user sets a target image mode, determining a target image quality parameter corresponding to the target application according to the target image mode.
4. The display device of claim 3, wherein the controller is further configured to:
acquiring default image quality parameters of the target application under the condition that a user does not set a target image mode;
and carrying out image quality processing on the decoded data by using the default image quality parameter to obtain target image data.
5. The display device according to any one of claims 1-3, wherein the controller is further configured to:
identifying the type of the image shot by the camera; the image types at least comprise a picture type and a video type;
and modifying the target image quality parameter according to the image type.
6. The display device according to any one of claims 1-3, wherein the controller is further configured to:
acquiring the image quality requirement of a target application calling the camera;
and modifying the target image quality parameter according to the image quality requirement.
7. A display device, comprising:
a display for displaying a user interface;
a controller configured to:
acquiring decoding data of an original image shot by a camera under the condition that a target application calling the camera needs to perform image quality processing;
performing image quality processing on the decoded data by using the target image quality parameter to obtain target image data;
and displaying the image of the original image after image quality processing on a video display layer of the user interface by using the target image data.
8. The display device of claim 7, wherein the controller is further configured to:
under the condition that a target application calling a camera does not need to carry out image quality processing, acquiring an image data stream of an original image shot by the camera;
displaying the original image on a dynamic display layer of the user interface using the image data stream.
9. An image display method, comprising:
acquiring decoding data of an original image shot by a camera;
performing image quality processing on the decoded data by using the target image quality parameter to obtain target image data;
and displaying the image of the original image after image quality processing on a video display layer of a user interface by using the target image data.
10. An image display method, comprising:
acquiring decoding data of an original image shot by a camera under the condition that a target application calling the camera needs to perform image quality processing;
performing image quality processing on the decoded data by using the target image quality parameter to obtain target image data;
and displaying the image of the original image after image quality processing on a video display layer of a user interface by using the target image data.
CN202110290335.1A 2021-03-18 2021-03-18 Image display method and display device Pending CN114302203A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110290335.1A CN114302203A (en) 2021-03-18 2021-03-18 Image display method and display device

Publications (1)

Publication Number Publication Date
CN114302203A (en) 2022-04-08

Family

ID=80963871

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110290335.1A Pending CN114302203A (en) 2021-03-18 2021-03-18 Image display method and display device

Country Status (1)

Country Link
CN (1) CN114302203A (en)

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2010023966A1 (en) * 2008-08-27 2010-03-04 ミツミ電機株式会社 Image quality adjusting device, image quality adjusting method, and image quality adjusting program
CN102685379A (en) * 2011-03-18 2012-09-19 卡西欧计算机株式会社 Image processing apparatus with function for specifying image quality, and method and storage medium
CN104378688A (en) * 2014-10-27 2015-02-25 小米科技有限责任公司 Mode switching method and device
CN109068056A (en) * 2018-08-17 2018-12-21 Oppo广东移动通信有限公司 A kind of electronic equipment and its filter processing method of shooting image, storage medium
CN111405338A (en) * 2020-02-27 2020-07-10 海信视像科技股份有限公司 Intelligent image quality switching method and display device
CN112511750A (en) * 2020-11-30 2021-03-16 维沃移动通信有限公司 Video shooting method, device, equipment and medium

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023246489A1 (en) * 2022-06-21 2023-12-28 苏州源控电子科技有限公司 All-in-one computer device supporting multi-channel audio and video input

Similar Documents

Publication Publication Date Title
CN114302219B (en) Display equipment and variable frame rate display method
WO2022073392A1 (en) Picture display method, and display device
CN112672195A (en) Remote controller key setting method and display equipment
CN111970548B (en) Display device and method for adjusting angle of camera
CN112667184A (en) Display device
WO2022048203A1 (en) Display method and display device for manipulation prompt information of input method control
CN113094142A (en) Page display method and display equipment
CN112698905A (en) Screen protection display method, display device, terminal device and server
CN113825002B (en) Display device and focal length control method
WO2022028060A1 (en) Display device and display method
CN114095769B (en) Live broadcast low-delay processing method of application-level player and display device
CN114302203A (en) Image display method and display device
CN113453069B (en) Display device and thumbnail generation method
CN113064691B (en) Display method and display equipment for starting user interface
CN113434240B (en) Display method and display device of image mode
CN113132809B (en) Channel switching method, channel program playing method and display equipment
CN114390190B (en) Display equipment and method for monitoring application to start camera
CN112905008B (en) Gesture adjustment image display method and display device
CN116980554A (en) Display equipment and video conference interface display method
CN114302101A (en) Display apparatus and data sharing method
CN113992960A (en) Subtitle previewing method on display device and display device
CN113064534A (en) Display method and display equipment of user interface
CN112668546A (en) Video thumbnail display method and display equipment
CN112911371A (en) Double-channel video resource playing method and display equipment
CN114296664A (en) Auxiliary screen brightness adjusting method and display device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination