CN115967830A - Display device and sound and picture synchronization method - Google Patents


Info

Publication number
CN115967830A
CN115967830A (application CN202111169799.3A)
Authority
CN
China
Prior art keywords
audio
output
output delay
data
image data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202111169799.3A
Other languages
Chinese (zh)
Inventor
吴燕丽
王昊
卢平光
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hisense Visual Technology Co Ltd
Original Assignee
Hisense Visual Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hisense Visual Technology Co Ltd filed Critical Hisense Visual Technology Co Ltd
Priority to CN202111169799.3A priority Critical patent/CN115967830A/en
Publication of CN115967830A publication Critical patent/CN115967830A/en
Pending legal-status Critical Current

Classifications

    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02: TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02D: CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D10/00: Energy efficient computing, e.g. low power processors, power management or thermal management

Landscapes

  • Controls And Circuits For Display Device (AREA)

Abstract

The application discloses a display device and a sound and picture synchronization method. The display device includes: a display; an external device interface for connecting an external audio output device; a built-in audio output module; and a controller configured to: acquire the audio and video data of a media asset, the audio and video data including image data and audio data; determine an output delay of the image data and an output delay of the audio data, the output delay of the image data being consistent with the output delay of the audio data; and generate a declaration containing the output delay of the image data and the output delay of the audio data, so as to declare, by the declaration, these output delays to the external audio output device or the built-in audio output module. The method and the device complete the interaction between the display device and the external audio output device or the built-in audio output module through the declaration, thereby realizing sound and picture synchronization between the display device and the external audio output device or the built-in audio output module.

Description

Display device and sound and picture synchronization method
Technical Field
The application relates to the technical field of display devices, and in particular to a display device and a sound and picture synchronization method.
Background
The display device supports media asset playback. During playback, the display device parses the audio and video data of the media asset into image data and audio data. It processes the image data and displays the result on the display, and processes the audio data and plays the result through an audio output device. The display device also supports playback under various image setting parameters, such as different image resolution settings and different image mode settings (such as a standard mode, a sport mode, a game mode, and the like), and the time the display device takes to process the audio and video data differs under different image setting parameters.
Because the image data and the audio data are processed through different data processing paths, and the display device takes longer to process the image data than the audio data, this difference in processing time easily causes the sound and the picture to fall out of synchronization when media assets are played, whether under the same image mode setting parameter or under different ones.
Disclosure of Invention
The application provides a display device and a sound and picture synchronization method, in which the interaction between the display device and an external audio output device, or the built-in audio output module of the display device, is completed through a declaration, so as to realize sound and picture synchronization between the display device and the external audio output device or the built-in audio output module.
In some embodiments of the present application, a display device is provided, including: a display; an external device interface for connecting an external audio output device; a built-in audio output module; and a controller configured to: acquire the audio and video data of a media asset, the audio and video data including image data and audio data; determine an output delay of the image data and an output delay of the audio data, the output delay of the image data being consistent with the output delay of the audio data; and generate a declaration containing the output delay of the image data and the output delay of the audio data, so as to declare these output delays to the external audio output device or the built-in audio output module. With this implementation, the interaction between the display device and the external audio output device or the built-in audio output module is completed through the declaration, thereby realizing sound and picture synchronization between them.
In some embodiments of the present application, in determining the output delay of the image data and the output delay of the audio data, the controller is further configured to: acquire the current image setting parameters of the display device; and determine the output delay of the image data and the output delay of the audio data in combination with those image setting parameters, where the output delays determined for different image setting parameters differ. With this implementation, the output delays can be determined according to the current image setting parameters of the display device, thereby realizing sound and picture synchronization between the display device and the external audio output device or the built-in audio output module.
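As a non-normative illustration, determining the delays from the picture settings can be thought of as a lookup keyed on the current image mode and resolution. The mode names and millisecond values below are hypothetical, not taken from the patent:

```python
# Hypothetical delay table keyed by (image mode, resolution).
# The entries and values are illustrative only.
VIDEO_DELAY_MS = {
    ("standard", "1080p"): 66,
    ("game", "1080p"): 16,
    ("sport", "2160p"): 83,
}

def output_delays(image_mode, resolution):
    """Return (image_output_delay_ms, audio_output_delay_ms).

    The audio output delay is declared equal to the image output delay,
    as the embodiments require the two to be consistent.
    """
    image_delay = VIDEO_DELAY_MS.get((image_mode, resolution), 66)
    return image_delay, image_delay
```

A change of image mode (for example switching to a low-latency game mode) simply selects a different table entry, after which a new declaration would be generated.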
In some embodiments of the application, in generating the declaration containing the output delay of the image data and the output delay of the audio data, the controller is further configured to generate a CEC message including the output delay of the image data and the output delay of the audio data; and, in declaring these output delays to the external audio output device or the built-in audio output module, the controller is further configured to send the CEC message to the external audio output device. With this implementation, the output delays can be declared to the external audio output device in real time through the CEC message, so that the external audio output device acquires them in real time.
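The patent does not name the specific CEC opcode. The HDMI-CEC Dynamic Auto Lipsync feature defines a <Report Current Latency> message (opcode 0xA8) whose video- and audio-latency operands are each encoded as ms/2 + 1; the sketch below assumes that framing and is a simplification, not the patent's implementation:

```python
def encode_latency(ms):
    # HDMI latency byte: value = ms/2 + 1 (2 ms granularity).
    return ms // 2 + 1

def report_current_latency(phys_addr, video_ms, audio_ms):
    """Build a simplified <Report Current Latency> payload (assumed framing)."""
    REPORT_CURRENT_LATENCY = 0xA8  # CEC opcode, Dynamic Auto Lipsync
    return bytes([
        REPORT_CURRENT_LATENCY,
        (phys_addr >> 8) & 0xFF,   # physical address, high byte
        phys_addr & 0xFF,          # physical address, low byte
        encode_latency(video_ms),  # video latency
        0x00,                      # latency flags: normal latency mode
        encode_latency(audio_ms),  # audio output delay
    ])
```

An audio receiver parsing such a message can recover the display's declared delays and schedule its own audio output accordingly.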
In some embodiments of the present application, the controller is further configured to: upon detecting that the image setting parameters of the display device have changed, re-determine the output delay of the image data and the output delay of the audio data in combination with the new image setting parameters; generate a new CEC message according to the re-determined output delays; and send the new CEC message to the external audio output device. With this implementation, the output delays can be declared to the external audio output device in real time through CEC messages under different image setting parameters, so that the external audio output device always holds the current values.
In some embodiments of the present application, the declaration is an extended display identification data declaration. By adopting the implementation mode, when the display equipment outputs the audio data through the built-in audio output module or the external audio output equipment, the sound and picture synchronization of the image data and the audio data at the display equipment end can be automatically realized.
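For reference, the HDMI vendor-specific data block in an EDID's CEA extension can carry video- and audio-latency bytes using an encoding where 0 means unknown, 255 means no output, and any other value maps to (value − 1) × 2 ms. A minimal decoder under that assumption (the field layout within a full EDID is not shown here):

```python
def decode_edid_latency(raw):
    """Decode one EDID/HDMI latency byte to milliseconds.

    Returns None when the field is 'unknown' (0) or 'no output' (255);
    otherwise latency_ms = (raw - 1) * 2.
    """
    if raw in (0, 255):
        return None
    return (raw - 1) * 2
```

A source or sink reading this declaration can thus recover a static latency figure even before any CEC traffic is exchanged.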
In some embodiments of the present application, the audio and video data further carries the signal format of the image data, the signal format being either a progressive (P-system) signal format or an interlaced (I-system) signal format. In determining the output delay of the image data and the output delay of the audio data, the controller is further configured to determine the output delay of image data in the progressive format and the output delay of image data in the interlaced format separately; generating the declaration then includes generating a declaration containing the output delay of the image data in the progressive format, the output delay of the image data in the interlaced format, and the output delay of the audio data. With this implementation, the output delays of image data in different signal formats can be set separately, so that the display device supports sound and picture synchronization with the external audio output device or the built-in audio output module under multiple signal formats.
In some embodiments of the present application, in determining the output delay of the image data and the output delay of the audio data, the controller is further configured to: acquire the processing time of the audio data; subtract the processing time of the audio data from the output delay of the image data to obtain the delay compensation of the audio data; and determine the output delay of the audio data as the processing time of the audio data plus its delay compensation, so that the output delay of the image data is consistent with the output delay of the audio data. With this implementation, the output delay of the audio data matches the output delay of the image data, realizing sound and picture synchronization at the display device side.
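The arithmetic described above can be sketched as follows (function and variable names are illustrative):

```python
def audio_delay_plan(image_output_delay_ms, audio_processing_ms):
    """Return (delay_compensation_ms, audio_output_delay_ms).

    The compensation is the extra buffering the audio delay module must add
    so that the audio output delay equals the image output delay.
    """
    compensation = image_output_delay_ms - audio_processing_ms
    audio_output_delay = audio_processing_ms + compensation
    # By construction the two output delays are consistent.
    assert audio_output_delay == image_output_delay_ms
    return compensation, audio_output_delay
```

For example, with a 66 ms image pipeline and 10 ms of audio processing, the audio path buffers an extra 56 ms.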
In some embodiments of the present application, the controller is further configured to: and sending the statement containing the output delay of the image data and the output delay of the audio data to the built-in audio output module so as to enable the built-in audio output module to output the audio data. By adopting the implementation mode, the audio data can be output through the built-in audio output module, and the sound and picture synchronization of the display equipment end is realized.
In some embodiments of the present application, the controller is further configured to: receive an external output delay of the audio data sent by the external audio output device, the external output delay being determined by the external audio output device according to its own processing time for the audio data; and determine the sum of the output delay of the image data and the external output delay of the audio data as the final output delay of the image data. With this implementation, the audio data can be output through the external audio output device while maintaining sound and picture synchronization between the display device and the external audio output device.
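A sketch of this final-delay computation, with illustrative names:

```python
def final_image_delay(image_output_delay_ms, external_audio_delay_ms):
    """Total delay to apply to the picture when an external audio device is used.

    The picture is held back by the external device's own audio processing
    time as well, so the image and the sound reach the user together.
    """
    return image_output_delay_ms + external_audio_delay_ms
```

For instance, a 66 ms image pipeline plus a soundbar reporting 20 ms of processing yields an 86 ms final image delay.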
In some embodiments of the present application, a sound and picture synchronization method is also provided, applied to a display device that includes an external device interface for connecting an external audio output device, and a built-in audio output module. The method includes: acquiring the audio and video data of a media asset, the audio and video data including image data and audio data; determining an output delay of the image data and an output delay of the audio data, the output delay of the image data being consistent with the output delay of the audio data; and generating a declaration containing the output delay of the image data and the output delay of the audio data, so as to declare these output delays to the external audio output device or the built-in audio output module. With this implementation, the interaction between the display device and the external audio output device or the built-in audio output module is completed through the declaration, thereby realizing sound and picture synchronization between them.
Therefore, in the application, the interaction between the display device and the external audio output device or the built-in audio output module of the display device can be completed through the declaration, so that the sound and picture synchronization between the display device and the external audio output device or the built-in audio output module of the display device is realized.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings needed in the embodiments are briefly described below; it is apparent that those skilled in the art can obtain other drawings from these drawings without creative effort.
FIG. 1 illustrates a schematic diagram of an operational scenario between a display device and a control apparatus, in accordance with some embodiments;
fig. 2 shows a block configuration diagram of the control device 100 according to some embodiments;
fig. 3 illustrates a hardware configuration block diagram of the display apparatus 200 according to some embodiments;
fig. 4 shows a software configuration block diagram of the display device 200 according to some embodiments;
FIG. 5 illustrates an application icon control interface display schematic of the display device 200, according to some embodiments;
FIG. 6 illustrates a media asset playback scenario diagram, in accordance with some embodiments;
FIG. 7 illustrates a schematic diagram of a media asset playback principle, according to some embodiments;
fig. 8 illustrates a process diagram for image quality processing according to some embodiments;
FIG. 9 illustrates a display device configuration flow diagram according to some embodiments;
FIG. 10 illustrates a display device configuration flow diagram according to some embodiments;
FIG. 11 illustrates an EDID claim schematic according to some embodiments;
FIG. 12 illustrates a display device configuration flow diagram according to some embodiments;
FIG. 13 illustrates a display device configuration flow diagram according to some embodiments;
FIG. 14 illustrates an image resolution setting parameter setting diagram according to some embodiments;
FIG. 15 illustrates an image mode setting parameter setting diagram according to some embodiments.
Detailed Description
To make the objects, embodiments and advantages of the present application clearer, exemplary embodiments of the present application are described below clearly and completely with reference to the accompanying drawings. It is to be understood that the described exemplary embodiments are only a part, and not all, of the embodiments of the present application.
All other embodiments derived by a person skilled in the art from the exemplary embodiments described herein without inventive step fall within the scope of the appended claims. In addition, while the disclosure herein is presented in terms of one or more exemplary examples, each aspect of the disclosure may also be implemented on its own as a complete embodiment. It should be noted that the brief explanations of terms in the present application are provided only for convenience in understanding the embodiments described below and are not intended to limit them; unless otherwise indicated, these terms take their ordinary and customary meaning.
Fig. 1 is a schematic diagram of an operation scenario between a display device and a control apparatus according to one or more embodiments of the present application, and as shown in fig. 1, a user may operate the display device 200 through a mobile terminal 300 and the control apparatus 100. The control apparatus 100 may be a remote controller, and the communication between the remote controller and the display device includes infrared protocol communication, bluetooth protocol communication, wireless or other wired method to control the display device 200. The user may input a user command through a key on a remote controller, a voice input, a control panel input, etc. to control the display apparatus 200. In some embodiments, mobile terminals, tablets, computers, laptops, and other smart devices may also be used to control the display device 200.
In some embodiments, the mobile terminal 300 may install a software application corresponding to the display device 200 and establish connection and communication through a network communication protocol, achieving one-to-one control operation and data communication. The audio and video content displayed on the mobile terminal 300 may also be transmitted to the display device 200 for synchronized display. The display device 200 may further perform data communication with the server 400 through multiple communication modes, and may be communicatively connected through a local area network (LAN), a wireless local area network (WLAN), and other networks. The server 400 may provide various contents and interactions to the display device 200. The display device 200 may be a liquid crystal display, an OLED display, or a projection display device, and may additionally provide an intelligent network television function with computer support in addition to the broadcast receiving television function.
Fig. 2 exemplarily shows a block diagram of a configuration of the control apparatus 100 according to an exemplary embodiment. As shown in fig. 2, the control device 100 includes a controller 110, a communication interface 130, a user input/output interface 140, a memory, and a power supply. The control apparatus 100 may receive an input operation instruction from a user and convert the operation instruction into an instruction recognizable and responsive by the display device 200, serving as an interaction intermediary between the user and the display device 200. The communication interface 130 is used for communicating with the outside, and includes at least one of a WIFI chip, a bluetooth module, NFC, or an alternative module. The user input/output interface 140 includes at least one of a microphone, a touch pad, a sensor, a key, or an alternative module.
Fig. 3 shows a hardware configuration block diagram of the display apparatus 200 according to an exemplary embodiment. The display apparatus 200 as shown in fig. 3 includes at least one of a tuner demodulator 210, a communicator 220, a detector 230, an external device interface 240, a controller 250, a display 260, an audio output interface 270, a memory, a power supply, and a user interface 280. The controller includes a central processor, a video processor, an audio processor, a graphic processor, a RAM, a ROM, and first to nth interfaces for input/output. The display 260 may be at least one of a liquid crystal display, an OLED display, a touch display, and a projection display, and may also be a projection device and a projection screen. The tuner demodulator 210 receives a broadcast television signal through a wired or wireless reception manner, and demodulates an audio/video signal, such as an EPG data signal, from a plurality of wireless or wired broadcast television signals. The detector 230 is used to collect signals of an external environment or interaction with the outside. The controller 250 and the tuner-demodulator 210 may be located in different separate devices, that is, the tuner-demodulator 210 may also be located in an external device of the main device where the controller 250 is located, such as an external set-top box.
In some embodiments, the controller 250 controls the operation of the display device and responds to user operations through various software control programs stored in memory. The controller 250 controls the overall operation of the display apparatus 200. The user may input a user command through a Graphical User Interface (GUI) displayed on the display 260, and the user input interface receives the user input command through the Graphical User Interface (GUI). Alternatively, the user may input the user command by inputting a specific sound or gesture, and the user input interface receives the user input command by recognizing the sound or gesture through the sensor.
In some embodiments, a "user interface" is a media interface for interaction and information exchange between an application or operating system and a user that enables conversion between an internal form of information and a form that is acceptable to the user. A common presentation form of a User Interface is a Graphical User Interface (GUI), which refers to a User Interface related to computer operations and displayed in a graphical manner. It may be an interface element such as an icon, a window, a control, etc. displayed in the display screen of the electronic device, where the control may include at least one of an icon, a button, a menu, a tab, a text box, a dialog box, a status bar, a navigation bar, a Widget, etc. visual interface elements.
Fig. 4 is a schematic diagram of a software configuration in a display device 200 according to one or more embodiments of the present application. As shown in fig. 4, the system is divided into four layers, which are, from top to bottom, an Application (Applications) layer (referred to as the "application layer"), an Application Framework layer (referred to as the "framework layer"), the Android runtime and system library layer (referred to as the "system runtime library layer"), and the kernel layer. The kernel layer contains at least one of the following drivers: audio driver, display driver, Bluetooth driver, camera driver, Wi-Fi driver, USB driver, HDMI driver, sensor drivers (such as fingerprint, temperature and pressure sensors), power driver, and the like.
Fig. 5 is a schematic diagram of an icon control interface display of an application program in the display device 200 according to one or more embodiments of the present application, as shown in fig. 5, an application layer includes at least one application program that can display a corresponding icon control in a display, for example: the system comprises a live television application icon control, a video on demand application icon control, a media center application icon control, an application center icon control, a game application icon control and the like. The live television application program can provide live television through different signal sources. A video-on-demand application may provide video from different storage sources. Unlike live television applications, video on demand provides video displays from some storage source. The media center application program can provide various applications for playing multimedia contents. The application program center can provide and store various application programs.
Fig. 6 is a schematic diagram illustrating a scenario of playing media assets according to an exemplary embodiment, and as shown in fig. 6, the scenario of playing media assets may involve one or more devices, such as the control apparatus 100, the display device 200, the smart device 300, the server 400, and the external audio output device 500. Wherein, the display device 200 includes: a display 260; an external device interface 240 for connecting the external audio output apparatus 500; a built-in audio output module; and a controller. The user can input a user instruction to control the playing of the media assets through the control device 100 or the intelligent device 300. The audio and video data of the media asset includes image data and audio data, and the display device 200 plays the image data through the display 260 and plays the audio data through the built-in audio output module or the external audio output device 500.
Fig. 7 exemplarily shows a schematic diagram of a play principle of a media asset according to an exemplary embodiment, and as shown in fig. 7, when the display device plays the media asset, the display device parses the audio-visual data of the media asset into image data and audio data, and processes the image data and the audio data through different data processing paths, respectively. The image data is output to the display after passing through the image quality processing path of the display device so that the display plays the image data, and the audio data is output to the built-in audio output module or the external audio output device after passing through the audio processing path of the display device so that the built-in audio output module or the external audio output device plays the audio data.
The image quality processing path is used to perform image quality processing on image data. The processing procedure of the image quality processing path may include: noise reduction processing, de-interlacing processing, scaling processing and dynamic compensation processing. The noise reduction process is used to reduce noise in the image data. The de-interlacing process is used to convert an interlaced image to a progressive image to eliminate or reduce the disadvantages of interlaced scanning. The scaling process is for matching an image with the size of a display area and generating a thumbnail of the corresponding image. The dynamic compensation process adopts a dynamic mapping system, and adds a frame of motion compensation frame between two traditional frames of images, so that the picture moving at high speed can be natural and clear. In the embodiments illustrated in the present application, the processing procedure of the image quality processing path may be implemented in a video processor of the display device.
It should be added that, in practical applications, the image quality processing path may include other processing procedures besides the above processing procedures, for example, luminance processing, contrast processing, chrominance processing, hue processing, sharpness processing, dynamic contrast processing, gamma correction processing, color temperature processing, white balance processing, color correction processing, luminance dynamic processing, and the like, which will not be described in detail in this embodiment.
It should be noted that the audio processing path is used for performing audio processing on the audio data, and the audio processing path includes an audio processing module. The audio processing module can perform processing procedures such as decoding processing, sound effect processing, transmission processing and the like, and the processing procedure of the sound effect processing mainly comprises the following steps: digital Theater sound System (DTS) sound processing, dolby panoramic sound (ATMOS) sound processing, graphic Equalizer (GEQ) processing, parametric Equalizer (PEQ) processing, and the like.
The DTS and ATMOS sound effect processing are used to process sound effects and improve the playback quality of the sound. A GEQ uses constant-Q filtering, with a slide potentiometer at each frequency point; whether a frequency is boosted or attenuated, the filter bandwidth remains constant, so the positions of the sliders on the panel directly reflect the equalization curve of the current audio data and the boost or attenuation at each frequency. A PEQ allows fine adjustment of every equalization parameter, including frequency band, center frequency, gain and quality factor (Q value); PEQs are commonly built into mixing consoles, but standalone parametric equalizers also exist. These tools can beautify and shape the sound so that its style is more vivid and prominent, achieving the desired artistic effect. In this embodiment, the display device and the external audio output device each include an audio processor, and the specific processing of the audio processing path may be implemented in those audio processors.
It should be noted that, compared to the audio processing process of the audio data, the image quality processing process of the image data takes a long time, so that the time required for the image data to reach the display is longer than the time required for the audio data to reach the built-in audio output module or the external audio output device, thereby easily causing the phenomenon of audio-video asynchronization.
The reason the image quality processing takes a long time is analyzed as follows. When the image quality processing path processes one frame of image data, it must read not only the image data of the current frame but also the image data of the next several frames; only then, based on the current frame and those subsequent frames, can it process the current frame and write the processed data back. Having to read several future frames while processing the current frame introduces significant latency.
Fig. 8 is a schematic diagram illustrating the image quality processing of image data according to an exemplary embodiment. As shown in fig. 8, in some embodiments the read-frame threshold is 4, that is, each time the display device processes a frame of image data it must read the current frame and the following 3 frames. For example, if the current frame to be processed is the nth frame, the image data of the nth, (n + 1)th, (n + 2)th and (n + 3)th frames must be read and processed, and the result is then written into the nth frame. The processing of the (n + 1)th, (n + 2)th and subsequent frames is similar and is not repeated here. Assuming a refresh rate of 60 Hz and a read-frame threshold of 4, reading 4 frames of image data corresponds to an image quality processing delay of 1/60 s × 4 ≈ 66.7 ms per frame.
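As a back-of-envelope check of the delay described above, the per-frame image quality delay is simply the read-frame threshold multiplied by the frame period; the function name is illustrative, not part of any actual device API.

```python
def image_quality_delay_ms(read_frames: int, refresh_hz: float) -> float:
    """Image quality processing delay: N frame periods at the given refresh rate."""
    return read_frames * 1000.0 / refresh_hz

# A read-frame threshold of 4 at 60 Hz gives roughly 66.7 ms per frame.
delay = image_quality_delay_ms(4, 60.0)
```

With a 120 Hz panel the same 4-frame threshold would halve the delay to about 33.3 ms, which is why the delay must be re-derived whenever the refresh rate or threshold changes.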
To solve the audio-video asynchrony caused by the time-consuming image quality processing of the image data, the audio processing path further includes an audio delay processing module. After the audio processing module performs sound-effect processing on the audio data, the audio delay processing module performs delay processing: it buffers the audio data and outputs it to the built-in audio output module or the external audio output device only after a certain buffering time, so that the audio data reaches the built-in audio output module or the external audio output device at the same time the image data reaches the display, eliminating the audio-video asynchrony.
For convenience of description, in the present application the processing time of the image data in the image quality processing path is defined as the output delay of the image data. As can be seen from fig. 8 and the related description, the output delay of the image data is related to the read-frame threshold of the video processor: once that threshold is determined, the corresponding output delay of the image data can be determined. The time taken by the audio processing module in the audio processing path to process the audio data is defined as the processing time of the audio data, and the buffering duration applied by the audio delay processing module is defined as the delay compensation of the audio data. The sum of the processing time and the delay compensation is defined as the output delay of the audio data. It should be noted that the audio processing module and the audio delay processing module are both internal modules of the display device, and the delay compensation of the audio data is performed by the audio processor of the display device.
It should be added that when the audio data of the display device is output by an external audio output device, the external device cannot play the audio data directly; it must first process the audio data via its own audio processor. The time taken by the audio processing module in the audio processing path of the external audio output device is defined as the external processing time of the audio data, and the delay compensation applied by the audio delay processing module of the external device is defined as the external delay compensation of the audio data. The sum of the external processing time and the external delay compensation is defined as the external output delay of the audio data. It should be noted that, in the embodiment of the present application, the audio delay processing module of the external audio output device does not perform delay compensation, so the external output delay is essentially the time the external audio output device takes to process the audio data. The sum of the output delay of the audio data and the external output delay is defined as the final output delay of the audio data.
The following describes a scenario in which the embodiment of the present application is applied.
Scene 1: the display device plays the audio data through the built-in audio output module.
When the display device receives the audio-visual data of a media asset, it parses the data into image data and audio data, which are processed by the image quality processing path and the sound-effect processing path respectively. If the display device outputs through the built-in audio output module, the output delay of the audio data is determined from the output delay of the image data so that the two are consistent, and the audio data is output from the built-in audio output module after the duration of the audio output delay. The built-in audio output module may include a loudspeaker assembly, which in turn may include a power amplifier (AMP) and a speaker. Generally, the loudspeaker assembly can output at least two channels of sound; to achieve a panoramic surround effect, several loudspeaker assemblies must be provided to output multiple channels of sound, which is not described in detail here.
Scene 2: the display device plays the audio data through the external audio output device.
When the display device receives the audio-visual data of a media asset, it parses the data into image data and audio data, which are processed by the image quality processing path and the sound-effect processing path respectively. If the display device outputs through the external audio output device, the output delay of the audio data is determined from the output delay of the image data so that the two are consistent, and the audio data is output from the external audio output device after the final output delay. The display device can be connected to the external audio output device through the audio output interface so that the external device plays the audio data. The external audio output device may include a Bluetooth speaker, a power amplifier, and the like.
It should be noted that, since the display device and the external audio output device belong to different hardware control systems, neither side knows how long the other's audio processing takes. The display device therefore cannot, by itself, keep the output time of its image data consistent with the output time of the audio data at the external audio output device, and real-time audio-video synchronization between the two cannot be achieved without additional coordination.
It should be added that, when the display device is connected to an external audio output device to play audio, the devices involved in the media playing scene can be divided into a sending-end device and a receiving-end device according to the output direction of the audio data. The sending-end device, also called the source end, is the party that sends audio data in the media playing scene, for example the display device shown in fig. 6; the receiving-end device, also called the sink end, is the party that receives audio data, for example the external audio output device shown in fig. 6. In a media playing scene, the sending-end device can send the audio data being played to the receiving-end device through a wireless or wired connection, so that the receiving-end device plays the received audio data in real time. The sending-end and receiving-end roles are determined logically by the output direction of the audio, so the same display device can act as either a sending-end or a receiving-end device depending on that direction.
Illustratively, when the display device outputs audio data to the external audio output device and the audio is played through the external device, the display device is the sending-end device and the external audio output device is the receiving-end device. Conversely, when the external audio output device outputs audio data to the display device and the audio is played through the display device, the external audio output device is the sending-end device and the display device is the receiving-end device.
It should be noted that, in the media playing scene shown in fig. 6, one receiving-end device can establish a connection with only one sending-end device at a time. For example, when the display device is connected to the external audio output device and the external device is outputting the audio data of the media asset being played by the display device, the intelligent device cannot simultaneously connect to the external audio output device, and the external device cannot simultaneously output the audio data of the media asset being played by the intelligent device; conversely, when the intelligent device is connected to the external audio output device and the external device is outputting the intelligent device's audio data, the external device cannot simultaneously output the audio data of the media asset being played by the display device.
Therefore, an embodiment of the present application provides a display device that avoids the audio-video asynchrony caused by the image quality processing of the image data taking longer than the audio processing, that is, by the image data taking longer to reach the display than the audio data takes to reach the built-in audio output module or the external audio output device, and that also solves the problem of the display device and the external audio output device being unable to achieve real-time audio-video synchronization.
In a specific implementation, the present application provides a display device, which is used to avoid the problem of asynchronous sound and picture in the scene 1, where the display device includes a display; an external device interface for connecting an external audio output apparatus; a built-in audio output module; a controller for executing the following steps S901-S904 shown in FIG. 9:
step S901, acquiring audio and video data of the media assets, wherein the audio and video data comprises image data and audio data;
in some embodiments, when a user inputs a user instruction to the display device through the control device or the intelligent device, the display device may perform data communication with the server to acquire the audio-visual data of the media asset. The display device may communicate with the server in a variety of ways; in various embodiments of the present application, it may be connected to the server by wire or wirelessly via a local area network, a wireless local area network, or another network. The server may provide various content and interactions to the display device.
Illustratively, by sending and receiving information and performing Electronic Program Guide (EPG) interactions, the display device may receive software program updates or access a remotely stored digital media library. The servers may be one group or multiple groups, and of one or more types. The server may provide other network service content to the display device, such as video-on-demand and advertising services.
The display device is not limited to obtaining the audio-visual data of the media asset through data communication with the server; it can also obtain the data from the intelligent device through a wired or wireless connection, or from a local storage location.
In an example, the intelligent device establishes a connection with the display device by screen casting, and can cast a game (such as a traditional arcade game) to the display device to enhance the user's gaming experience.
For example, the display device stores a local game internally, such as a motion-sensing game (e.g., a ball game, boxing game, running game, or dance game), which can be opened and run from the local storage location by a user instruction input through the control device or the intelligent device.
In some embodiments, the audio-visual data further includes signal formats of the image data, and the signal formats include a P-system signal format and an I-system signal format. In a specific implementation process, when the display device receives audio and video data of a media asset, the video processor is configured to be capable of automatically acquiring a signal format of image data.
It should be noted that the P-system and I-system signal formats are both effective display formats of the audio-visual data. The P-system signal format is a progressive-scan format: taking 1080P as an example, the effective display format of the image data is 1920 × 1080, where 1080 indicates 1080 scanning lines in the vertical direction. All horizontal scanning lines are presented on the picture simultaneously, so the image is displayed smoothly, which suits sports and movie media. The I-system signal format is an interlaced format: taking 1080I as an example, the effective display format is also 1920 × 1080, but the odd lines are displayed after the even lines, so the image display is less smooth, which suits documentaries and wildlife-subject media.
It should be added that the P-system signal format includes but is not limited to 1080P, and also includes other P-system signal formats such as 720P, 360P, etc.; the I-system signal format includes, but is not limited to 1080I, and also includes 720I and other I-system signal formats.
Step S902, determining the output delay of the image data and the output delay of the audio data, wherein the output delay of the image data is consistent with the output delay of the audio data;
in some embodiments, the controller is configured to perform the following steps S9021-S9023 shown in FIG. 10;
step S9021, the processing of acquiring the audio data consumes time.
The processing time of the audio data is the time taken by the audio processing module to process the audio data; the controller acquires this processing time through the audio processor.
And step S9022, subtracting the processing time consumption of the audio data from the output delay of the image data to obtain the delay compensation of the audio data.
The output delay of the image data is the time taken by the image quality processing path to process the image data; the controller acquires this time through the video processor. To ensure audio-video synchronization, the output delay of the image data must be consistent with the output delay of the audio data, so the delay compensation of the audio data is obtained by subtracting the processing time of the audio data from the output delay of the image data.
And step S9023, determining the output delay of the audio data according to the processing time consumption of the audio data and the delay compensation of the audio data, so that the output delay of the image data is consistent with the output delay of the audio data.
In a specific implementation, taking an image data output delay of 80 ms and an audio processing time of 30 ms as an example: the controller acquires the 30 ms processing time through the audio processor; subtracting the 30 ms processing time from the 80 ms image output delay yields a delay compensation of 50 ms; and the output delay of the audio data is then determined as the sum of the 30 ms processing time and the 50 ms delay compensation, i.e. 80 ms.
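The arithmetic of steps S9021-S9023 can be sketched as follows, using the worked numbers from the text (80 ms image output delay, 30 ms audio processing time); the function names are illustrative, not part of any actual display-device API.

```python
def audio_delay_compensation_ms(image_output_delay: float,
                                audio_processing: float) -> float:
    """S9022: delay compensation = image output delay - audio processing time."""
    return image_output_delay - audio_processing

def audio_output_delay_ms(audio_processing: float,
                          compensation: float) -> float:
    """S9023: audio output delay = processing time + delay compensation."""
    return audio_processing + compensation

comp = audio_delay_compensation_ms(80.0, 30.0)   # 50 ms buffered by the delay module
audio_delay = audio_output_delay_ms(30.0, comp)  # 80 ms, matching the image delay
```

By construction the audio output delay always equals the image output delay, which is exactly the synchronization condition the steps aim for.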
In step S903, a statement including the output delay of the image data and the output delay of the audio data is generated to declare the output delay of the image data and the output delay of the audio data to the built-in audio output module by the statement.
In some embodiments, the declaration is an Extended Display Identification Data (EDID) declaration; EDID is a standard for display identification data. EDID is stored in a Display Data Channel (DDC) memory in the display, and when a computer host is connected to the display, the host can read the EDID stored in the DDC memory over the DDC channel. Because the signal format can be identified in the EDID, the output delay of image data in the P-system signal format and the output delay of image data in the I-system signal format can be declared separately when the delays are set in the EDID.
Fig. 11 exemplarily shows an EDID declaration diagram in the embodiment of the present application. Taking the delays shown in fig. 11 as an example, the image output delay and the audio output delay in the P-system signal format declaration 1101 are both set to 80 ms, and the output delays of the image data and the audio data in the I-system signal format declaration 1102 are both set to 100 ms, generating the EDID declaration shown in fig. 11 that contains the output delay of the image data in the P-system format, the output delay of the image data in the I-system format, and the output delay of the audio data.
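A minimal sketch of the per-format declaration in Fig. 11 is shown below. Note this only models the values being declared; real EDID latency fields live in the HDMI vendor-specific data block and use a compact byte encoding, which is not reproduced here.

```python
# Per-signal-format delay values, mirroring declarations 1101 and 1102 in Fig. 11.
edid_latency_declaration = {
    "progressive": {"video_latency_ms": 80, "audio_latency_ms": 80},    # P-system
    "interlaced":  {"video_latency_ms": 100, "audio_latency_ms": 100},  # I-system
}

def declared_delays(signal_format: str) -> tuple:
    """Return (video, audio) latency in ms for the given scan type."""
    entry = edid_latency_declaration[signal_format]
    return entry["video_latency_ms"], entry["audio_latency_ms"]
```

A sink reading this declaration would pick the entry matching the detected signal format (e.g. 1080P vs 1080I) and delay its audio accordingly.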
Step S904, a statement including the output delay of the image data and the output delay of the audio data is sent to the built-in audio output module, so that the built-in audio output module outputs the audio data.
With this implementation, the interaction between the display device and the built-in audio output module is completed through the EDID declaration: the output delay of the image data at the display device end is kept consistent with the output delay of the audio data, and audio-video synchronization is achieved when the display device plays audio through the built-in audio output module.
In a specific implementation, the present application further provides a display device, configured to avoid a problem of asynchrony between sound and pictures that may occur in the scene 2, where the display device shown in the present application includes a display; an external device interface for connecting an external audio output apparatus; a built-in audio output module; a controller for executing the following steps S1201-S1204 shown in FIG. 12:
step S1201, acquiring audio and video data of the media assets, wherein the audio and video data comprise image data and audio data;
in a specific implementation, a specific manner of acquiring the audio-visual data of the media asset is the same as that in step S901, and details are not described here.
Step S1202, determining the output delay of the image data and the output delay of the audio data, wherein the output delay of the image data is consistent with the output delay of the audio data;
in a specific implementation, the specific manner of determining the output delay of the image data and the output delay of the audio data is the same as that of step S902, and is not described in detail here.
Step S1203, generating a statement including an output delay of the image data and an output delay of the audio data, so as to declare the output delay of the image data and the output delay of the audio data to the external audio output device by the statement;
in some embodiments, the declaration is an EDID declaration, and the image output delay and the audio output delay of the display device are declared to be consistent to the external audio output device through the EDID declaration, so that the sound and picture synchronization is realized on the display device side.
Step S1204, receiving an external output delay of the audio data sent by the external audio output device, wherein the external output delay is determined by the external audio output device according to the processing time consumption of the external audio output device on the audio data; the sum of the output delay of the image data and the external output delay of the audio data is determined as the final output delay of the image data.
In some embodiments, taking an image data output delay of 80 ms and an audio processing time of 30 ms as an example: the controller acquires the 30 ms processing time through the audio processor; subtracting it from the 80 ms image output delay yields a 50 ms delay compensation; the output delay of the audio data is thus 80 ms, the sum of the 30 ms processing time and the 50 ms compensation, and audio-video synchronization is achieved at the display device end. Taking an external output delay of 40 ms as an example, when the external audio output device reads the EDID declaration, it sends its 40 ms external output delay to the display device. On receiving it, the display device determines the final output delay of the image data as the sum of the 80 ms image output delay and the 40 ms external output delay, i.e. 120 ms; meanwhile the final output delay of the audio data is the sum of the 80 ms audio output delay and the 40 ms external output delay, also 120 ms. The final output delays of the image data and the audio data are therefore consistent, and audio-video synchronization between the display device and the external audio output device is achieved.
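The worked example in step S1204 can be condensed into one function: since both final delays grow by the same external output delay, they stay matched. The function name and numbers are illustrative.

```python
def final_delays_ms(image_delay: float, audio_delay: float,
                    external_delay: float) -> tuple:
    """Add the external audio output device's delay to both sides (S1204)."""
    return image_delay + external_delay, audio_delay + external_delay

# 80 ms internal delays plus a 40 ms external output delay -> both become 120 ms.
final_image, final_audio = final_delays_ms(80.0, 80.0, 40.0)
```

The key property is that the display only needs the external device's reported delay, not its internal processing details, to restore synchronization.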
In a specific implementation, the present application further provides another display device, configured to avoid a problem of asynchrony between sound and pictures that may occur in the scene 2, where the display device shown in the present application includes a display; an external device interface for connecting an external audio output apparatus; a built-in audio output module; a controller for performing the following steps S1301-S1304 shown in FIG. 13:
step S1301, obtaining sound and picture data of the media assets, wherein the sound and picture data comprise image data and audio data;
in a specific implementation, a specific manner of acquiring the audio-visual data of the media asset is the same as that in step S901, and details are not described here.
Step S1302, determining the output delay of the image data and the output delay of the audio data, wherein the output delay of the image data is consistent with the output delay of the audio data;
in some embodiments, the current image setting parameters of the display device are obtained, and the output delay of the image data and the output delay of the audio data are determined in combination with those parameters; different image setting parameters yield different output delays. The image setting parameters may include an image resolution setting parameter, an image mode setting parameter, and an image resource setting parameter.
Fig. 14 exemplarily shows an image resolution setting parameter diagram according to an exemplary embodiment. As shown in fig. 14, the general settings interface 1400 includes: a power-on scene control 1401, a screen adjustment control 1402, a resolution adjustment control 1403, a multi-screen interaction control 1404, and a native name control 1405. The user can enter the setting interface of a control by clicking it. For example, when the user clicks the resolution adjustment control 1403, the resolution of the image data can be set on the resolution adjustment interface. As an example, the resolution adjustment interface includes a plurality of resolutions such as 4K (2160 × 4096), 2K (1152 × 2048), 1080P (1080 × 1920), 720P (720 × 1280), and DV (480 × 720).
Fig. 15 exemplarily shows an image mode setting parameter setting diagram according to an exemplary embodiment, and as shown in fig. 15, an image mode setting interface 1500 of a user includes: standard mode control 1501, motion mode control 1502, game mode control 1503, theater mode control 1504, concert mode control 1505, studio mode control 1506, and custom mode control 1507. Wherein a user clicking on the game mode control 1503 may set the display device to game mode.
It should be noted that the output delay of the image data differs under different image mode setting parameters. Taking the game mode as an example: when the display device is in game mode, the display device optimizes the screen response speed of the display.
In a specific implementation, the sources of image data delay are as follows: the delay of image data output from the HDMI interface to the display device through the video cable is 5-10 ms; the image data is transmitted through the HDMI interface of the display device to the controller, which converts it into a form the video processor can handle, taking another 5-10 ms; and the video processor takes 20-100 ms to process the image data. The delay of the image data from transmission to display therefore totals 30-120 ms, which noticeably affects the user experience. Since the video processor generates the longest delay, and its image quality processing path applies multiple video processing effects to improve image quality, when the display device is in game mode it disables those image-quality effects and displays the image data on the display more quickly.
It should be noted that the game mode does not completely eliminate the output delay of the image data but reduces it significantly, by a factor of about 2 to 3: for example, the output delay of the image data is 80 ms when the image mode setting parameter is the standard mode, and 25 ms after switching to the game mode.
Since the game mode reduces the output delay of the image data, it is essentially a low-delay mode. The low-delay modes of the display device include but are not limited to the game mode; there are others, such as an automatic low-delay mode. The low-delay mode can be switched manually by a user operation instruction, or switched automatically according to the media content being played when the display device is in the automatic low-delay mode.
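The mode-dependent delays described above could be modeled as a simple lookup; the 80 ms and 25 ms values come from the example in this section, while the table structure and fallback behavior are illustrative assumptions.

```python
# Image output delay per picture mode (ms). Values are the examples from the
# text; modes not listed fall back to the standard-mode delay.
IMAGE_DELAY_BY_MODE_MS = {
    "standard": 80,
    "game": 25,     # low-delay mode: image-quality effects disabled
}

def image_delay_for_mode(mode: str) -> int:
    return IMAGE_DELAY_BY_MODE_MS.get(mode, IMAGE_DELAY_BY_MODE_MS["standard"])
```

Whichever value this lookup yields would then feed the delay-compensation calculation of steps S9021-S9023, so a mode switch changes the audio buffering as well.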
When a user connects a PC device, the PC device sends the audio-visual data of the media asset to the display device; when that data is a picture resource package, the image resource setting parameter is the picture resource format (image/jpeg).
It should be noted that the image resource setting parameter is the content type (Content-Type) of the audio-visual data of the media asset and is used to determine how to read that data. When the audio-visual data of the media asset is a game resource package, the image resource setting parameter is the game resource format (application/json).
In some embodiments, when it is monitored that the image setting parameters of the display device are changed, the output delay of the image data and the output delay of the audio data are re-determined by combining new image setting parameters;
in specific implementation, when it is monitored that any one image setting parameter of the display device changes, the output delay of the image data and the output delay of the audio data are determined again by combining a new image setting parameter.
The image setting parameter change of the display device includes the following scenes:
scene three: the image resolution setting parameter of the display device is switched from 2k to 4k.
When the image resolution setting parameter of the display device is switched from 2k to 4k, the processing time consumption of the image data is changed, so that the output delay of the image data and the output delay of the audio data need to be re-determined.
Scene four: the image mode setting parameters of the display device are switched from the standard mode to the game mode.
When the image mode setting parameters of the display device are switched from the standard mode to the game mode, the processing time consumption of the image data is changed, so that the output delay of the image data and the output delay of the audio data need to be determined again.
Scene five: and the image resource setting parameters of the display equipment are switched from the image resource format to the game resource format.
When the display device is connected to a PC device, the PC device sends the audio-visual data of the media asset to the display device. When that data is a picture resource package, the image resource setting parameter is the picture resource format; when it is a game resource package, the parameter is the game resource format. When the audio-visual data of the media asset switches from a picture resource package to a game resource package, the image resource setting parameter switches from the picture resource format to the game resource format; because the parameter has changed, the output delay of the image data and the output delay of the audio data must be determined again.
Therefore, in scenarios three to five above, taking a conventional image data output delay of 80 ms as an example: if a hard declaration (such as an EDID declaration) is adopted, the image data output delay declared by the display device to the external audio output device remains 80 ms regardless of whether the image setting parameters change, so only a fixed, non-real-time sound and picture synchronization can be realized. In fact, in the low-latency mode the output delay of the image data is less than 80 ms; when any image setting parameter of the display device is detected to have changed, the output delay of the image data changes and must be re-determined. By determining the output delay of the image data in real time and declaring the real-time value to the external audio output device, real-time sound and picture synchronization between the display device and the external audio output device can be achieved.
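As an illustration of this re-determination logic, the following sketch shows delays being recomputed and re-declared whenever a monitored image setting parameter changes. All names, the delay table, and the `on_change` callback are hypothetical assumptions for illustration, not the patent's actual implementation:

```python
# Hypothetical sketch: re-determine output delays on parameter change
# (scenarios three to five). Delay values are illustrative only.
IMAGE_DELAY_MS = {
    ("2k", "standard"): 80,
    ("4k", "standard"): 80,
    ("2k", "game"): 60,   # low-latency mode: less than the usual 80 ms
    ("4k", "game"): 60,
}

class DelayManager:
    def __init__(self, resolution, mode, on_change):
        self.resolution = resolution
        self.mode = mode
        self.on_change = on_change  # e.g. re-declare latency over CEC
        self._apply()

    def _apply(self):
        image_delay = IMAGE_DELAY_MS[(self.resolution, self.mode)]
        # The audio output delay is kept consistent with the image delay.
        audio_delay = image_delay
        self.on_change(image_delay, audio_delay)

    def set_parameter(self, name, value):
        # Any monitored image setting change triggers re-determination.
        setattr(self, name, value)
        self._apply()

declared = []
mgr = DelayManager("2k", "standard", lambda v, a: declared.append((v, a)))
mgr.set_parameter("mode", "game")  # scenario four: standard -> game mode
```

Here the initial declaration carries 80 ms, and switching to the game mode immediately re-declares 60 ms through the callback.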
Step S1303: generating a CEC message containing the output delay of the image data and the output delay of the audio data, so as to declare the output delay of the image data and the output delay of the audio data to the external audio output device through the CEC message.
When declaring the output delay of the image data and the output delay of the audio data to the external audio output device through the CEC message, the controller is further configured to send the CEC message to the external audio output device.
Consumer Electronics Control (CEC) is a single-wire bus protocol. Through CEC messages, a display device can control all HDMI-connected devices on its HDMI interface, and HDMI devices can command one another without user intervention. HDMI includes a CEC bus and an Audio Return Channel (ARC), where the CEC bus is a general-purpose control bus used for interconnecting HDMI devices. The ARC channel is used to output the digital audio of a television and can be connected to an external device that also supports the ARC function, so that the sound of the display device is transmitted to the external audio output device.
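For contrast with this real-time CEC path, a hard EDID declaration stores latency statically. To our understanding of the HDMI Vendor-Specific Data Block, each latency field is a single byte encoded as `ms/2 + 1`, with 0 meaning "latency unknown"; the helpers below sketch that encoding and should be treated as an illustration to check against the specification, not a normative implementation:

```python
def encode_edid_latency(latency_ms):
    """Encode a latency in milliseconds as the one-byte field used by the
    HDMI Vendor-Specific Data Block (value = ms/2 + 1; 0 = unknown)."""
    if latency_ms is None:
        return 0  # latency information not provided
    value = latency_ms // 2 + 1
    if not 1 <= value <= 251:
        raise ValueError("latency out of encodable range")
    return value

def decode_edid_latency(value):
    """Inverse mapping: stored byte back to milliseconds (None = unknown)."""
    if value == 0:
        return None
    return (value - 1) * 2
```

Under this scheme, the conventional 80 ms image delay discussed above would be stored once as the byte value 41 and could not track later parameter changes.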
In some embodiments, when a change in the image setting parameters of the display device is detected, the output delay of the image data and the output delay of the audio data are re-determined based on the new image setting parameters;
a new CEC message is generated according to the re-determined output delay of the image data and output delay of the audio data;
the new CEC message is sent to the external audio output device.
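One concrete way a display can push such an updated declaration over CEC is the <Report Current Latency> message (opcode 0xA8) defined for HDMI's dynamic auto lip-sync feature. The frame builder below is a hedged sketch: the operand layout, flag bit, and latency byte encoding reflect our reading of the HDMI-CEC specification and should be verified against it before use:

```python
def report_current_latency(phys_addr, video_latency_ms,
                           audio_delay_ms=None, low_latency=False):
    """Build the payload of a CEC <Report Current Latency> frame
    (opcode 0xA8): [opcode][physical address][video latency byte]
    [latency flags][optional audio output delay byte]. Encoding per
    our reading of the spec; treat as an illustrative assumption."""
    OPCODE = 0xA8
    video_byte = video_latency_ms // 2 + 1   # same ms/2 + 1 encoding as EDID
    flags = 0x01 if low_latency else 0x00    # low-latency-mode bit (assumed layout)
    payload = [OPCODE,
               (phys_addr >> 8) & 0xFF, phys_addr & 0xFF,  # e.g. 1.0.0.0
               video_byte, flags]
    if audio_delay_ms is not None:
        payload.append(audio_delay_ms // 2 + 1)
    return bytes(payload)

# Re-declare 60 ms video and audio latency for physical address 1.0.0.0.
frame = report_current_latency(0x1000, 60, audio_delay_ms=60)
```

The display would broadcast such a frame each time the delays are re-determined, so the external audio output device always holds the current values.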
In scenarios three to five, when any image setting parameter of the display device is detected to have changed, the display device configures the delay compensation scheme through a structure (struct). The structure is carried in the CEC message and is used to declare the output delay of the image data and the output delay of the audio data to the external audio output device.
In some embodiments, take the case where the output delay of the image data and the output delay of the audio data are both conventionally 80 ms. If the current image mode setting parameter is the low-latency mode, then when the image resolution setting parameter is switched from 2K to 4K, the output delay of the image data and the output delay of the audio data are re-determined. Because the current image mode setting parameter is the low-latency mode, the actual output delay of the image data is less than 80 ms; combining the current image setting parameters, the output delay of the image data and the output delay of the audio data can be determined to be 60 ms. At the same time, a new CEC message is sent to the external audio output device in real time, declaring to it that the output delay of the image data and the output delay of the audio data are 60 ms.
In some embodiments, take the case where the externally declared output delay of the image data and output delay of the audio data are both conventionally 80 ms. If the current image mode setting parameter is the low-latency mode, then when the image mode setting parameter is switched from the standard mode to the game mode, the output delay of the image data and the output delay of the audio data are re-determined. Because the game mode is essentially a low-latency mode, the actual output delay of the image data is less than 80 ms; combining the current image setting parameters, the output delay of the image data and the output delay of the audio data can be determined to be 60 ms. At the same time, a new CEC message is sent to the external audio output device in real time, declaring to it that the output delay of the image data and the output delay of the audio data are 60 ms.
In some embodiments, take the case where the externally declared output delay of the image data and output delay of the audio data are both conventionally 80 ms. If the current image mode setting parameter is the low-latency mode, then when the image resource setting parameter is switched from the picture resource format to the game resource format, the output delay of the image data and the output delay of the audio data are re-determined. Because the game mode is essentially a low-latency mode, the actual output delay of the image data is less than 80 ms; combining the current image setting parameters, the output delay of the image data and the output delay of the audio data can be determined to be 60 ms. At the same time, a new CEC message is sent to the external audio output device in real time, declaring to it that the output delay of the image data and the output delay of the audio data are 60 ms.
Therefore, taking the above embodiments as examples, when any image setting parameter of the display device is detected to have changed, the output delay of the image data changes; after real-time sound and picture synchronization is achieved on the display device side, the output delay of the image data and the output delay of the audio data are declared to the external audio output device in real time.
Step S1304: receiving the external output delay of the audio data sent by the external audio output device, where the external output delay is determined by the external audio output device according to the time it takes that device to process the audio data;
the sum of the output delay of the image data and the external output delay of the audio data is determined as the final output delay of the image data.
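The combination in step S1304 is a simple sum, sketched below (the function name and the 20 ms soundbar delay are illustrative assumptions):

```python
def final_image_output_delay(image_delay_ms, external_audio_delay_ms):
    # Per step S1304: the display adds the external audio output
    # device's own processing delay to the image output delay, so that
    # picture and sound leave their respective outputs in sync.
    return image_delay_ms + external_audio_delay_ms

# e.g. 80 ms internal image delay + 20 ms external device delay -> 100 ms
delay = final_image_output_delay(80, 20)
```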
In a specific implementation, step S1304 is implemented in the same way as step 804.
In a specific implementation, an embodiment of the present application further provides a sound and picture synchronization method, including:
acquiring sound and picture data of a media asset, where the sound and picture data include image data and audio data;
determining the output delay of the image data and the output delay of the audio data, where the output delay of the image data is consistent with the output delay of the audio data; and
generating a statement containing the output delay of the image data and the output delay of the audio data, so as to declare the output delay of the image data and the output delay of the audio data to the external audio output device or the built-in audio output module through the statement.
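The three steps of the method, together with the delay-compensation arithmetic recited in claim 7, can be sketched end to end as follows (function and field names are illustrative assumptions, not taken from the patent):

```python
from dataclasses import dataclass

@dataclass
class Statement:
    image_delay_ms: int
    audio_delay_ms: int

def synchronize(image_processing_ms, audio_processing_ms):
    # Step 1/2: the image pipeline dominates, so its processing time
    # sets the target output delay.
    image_delay = image_processing_ms
    # Per claim 7: delay compensation for the audio data is the image
    # output delay minus the audio processing time, so that
    # audio_delay = processing + compensation == image_delay.
    compensation = image_delay - audio_processing_ms
    audio_delay = audio_processing_ms + compensation
    # Step 3: generate the statement carrying both delays.
    return Statement(image_delay, audio_delay)

stmt = synchronize(image_processing_ms=80, audio_processing_ms=30)
```

With an 80 ms image pipeline and 30 ms audio pipeline, the audio data receives 50 ms of compensation, and the statement declares a matching 80 ms for both streams.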
It should be understood that for the specific implementation of each step in the sound and picture synchronization method, reference may be made to the foregoing display device embodiments; details are not repeated here. The display device and the sound and picture synchronization method above address the problem that image data takes longer to reach the display than audio data takes to reach the built-in audio output module or the external audio output device, which easily causes the sound and picture to fall out of sync and prevents the display device and the external audio output device from synchronizing in real time, thereby improving the user experience.
Finally, it should be noted that: the above embodiments are only used to illustrate the technical solutions of the present application, and not to limit the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some or all of the technical features may be equivalently replaced; and the modifications or the substitutions do not make the essence of the corresponding technical solutions depart from the scope of the technical solutions of the embodiments of the present application.
The foregoing description, for purpose of explanation, has been described with reference to specific embodiments. However, the illustrative discussions above are not intended to be exhaustive or to limit the embodiments to the precise forms disclosed above. Many modifications and variations are possible in light of the above teaching. The embodiments were chosen and described in order to best explain the principles and the practical application, to thereby enable others skilled in the art to best utilize the embodiments and various embodiments with various modifications as are suited to the particular use contemplated.

Claims (10)

1. A display device, comprising:
a display;
an external device interface for connecting an external audio output apparatus;
a built-in audio output module;
a controller configured to:
acquiring sound and picture data of the media assets, wherein the sound and picture data comprise image data and audio data;
determining the output delay of the image data and the output delay of the audio data, wherein the output delay of the image data is consistent with the output delay of the audio data;
and generating a statement including the output delay of the image data and the output delay of the audio data, so as to declare the output delay of the image data and the output delay of the audio data to the external audio output device or the built-in audio output module through the statement.
2. The display device of claim 1, wherein, in determining the output delay of the image data and the output delay of the audio data, the controller is further configured to:
acquiring current image setting parameters of the display equipment;
and determining the output delay of the image data and the output delay of the audio data by combining the image setting parameters, wherein the output delays determined according to different image setting parameters are different.
3. The display device of claim 2, wherein, in generating a statement including the output delay of the image data and the output delay of the audio data, the controller is further configured to:
generate a CEC message including the output delay of the image data and the output delay of the audio data;
and wherein, in declaring the output delay of the image data and the output delay of the audio data to the external audio output device through the statement, the controller is further configured to send the CEC message to the external audio output device.
4. The display device according to claim 3, wherein the controller is further configured to:
when the image setting parameters of the display equipment are monitored to be changed, the output delay of the image data and the output delay of the audio data are determined again by combining the new image setting parameters;
generating a new CEC message according to the redetermined output delay of the image data and the output delay of the audio data;
transmitting the new CEC message to the external audio output device.
5. The display device of claim 1, wherein the statement is an extended display identification data (EDID) declaration.
6. The display device according to claim 5, wherein the sound and picture data further include a signal format of the image data, the signal formats including a progressive (P) signal format and an interlaced (I) signal format;
wherein, in determining the output delay of the image data and the output delay of the audio data, the controller is further configured to:
respectively determine the output delay of image data in the P signal format and the output delay of image data in the I signal format;
and wherein, in generating a statement containing the output delay of the image data and the output delay of the audio data, the controller is further configured to:
generate a statement containing the output delay of image data in the P signal format, the output delay of image data in the I signal format, and the output delay of the audio data.
7. The display device according to any one of claims 1 to 6, wherein, in determining the output delay of the image data and the output delay of the audio data, the controller is further configured to:
acquire the processing time consumption of the audio data;
subtract the processing time consumption of the audio data from the output delay of the image data to obtain the delay compensation of the audio data; and
determine the output delay of the audio data according to the processing time consumption of the audio data and the delay compensation of the audio data, so that the output delay of the image data is consistent with the output delay of the audio data.
8. The display device of claim 7, wherein the controller is further configured to:
and sending a statement containing the output delay of the image data and the output delay of the audio data to the built-in audio output module so that the built-in audio output module outputs the audio data.
9. The display device of claim 7, wherein the controller is further configured to:
receiving external output delay of the audio data sent by the external audio output equipment, wherein the external output delay is determined by the external audio output equipment according to the processing time consumption of the external audio output equipment on the audio data;
determining a sum of an output delay of the image data and an external output delay of the audio data as a final output delay of the image data.
10. A sound and picture synchronization method, applied to a display device, the display device comprising an external device interface for connecting an external audio output device, and a built-in audio output module;
the method comprises the following steps:
acquiring sound and picture data of the media assets, wherein the sound and picture data comprise image data and audio data;
determining output delay of the image data and output delay of the audio data, wherein the output delay of the image data is consistent with the output delay of the audio data;
and generating a statement including the output delay of the image data and the output delay of the audio data, so as to declare the output delay of the image data and the output delay of the audio data to the external audio output device or the built-in audio output module through the statement.
CN202111169799.3A 2021-10-08 2021-10-08 Display device and sound and picture synchronization method Pending CN115967830A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111169799.3A CN115967830A (en) 2021-10-08 2021-10-08 Display device and sound and picture synchronization method

Publications (1)

Publication Number Publication Date
CN115967830A true CN115967830A (en) 2023-04-14

Family

ID=87358752

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111169799.3A Pending CN115967830A (en) 2021-10-08 2021-10-08 Display device and sound and picture synchronization method

Country Status (1)

Country Link
CN (1) CN115967830A (en)

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination