CN117651196A - Display device and multimedia subtitle display method - Google Patents


Info

Publication number
CN117651196A
Authority
CN
China
Prior art keywords
subtitle
output
played
path
file
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211632453.7A
Other languages
Chinese (zh)
Inventor
朱宗花
李斌
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hisense Visual Technology Co Ltd
Original Assignee
Hisense Visual Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hisense Visual Technology Co Ltd filed Critical Hisense Visual Technology Co Ltd
Priority to CN202211632453.7A priority Critical patent/CN117651196A/en
Publication of CN117651196A publication Critical patent/CN117651196A/en
Pending legal-status Critical Current


Abstract

The embodiments of the present application provide a display device and a multimedia subtitle display method, relating to the technical field of speech understanding. The display device includes: a user interface for receiving a selection operation of a user; and a controller configured to: in response to the selection operation, determine one or more paths of subtitles to be output from the subtitles of a multimedia file to be played; acquire the subtitle data of each path of subtitle to be output; acquire the global clock of the playing pipeline of the multimedia file to be played and the synchronous rendering logic corresponding to each path of subtitle to be output; and finally, based on the global clock and the synchronous rendering logic corresponding to each path of subtitle to be output, synchronously render the subtitle data of each path of subtitle to be output, so that each path of subtitle to be output is displayed synchronously while the multimedia file is played.

Description

Display device and multimedia subtitle display method
Technical Field
The embodiments of the present application relate to the technical field of data processing, and more particularly, to a display device and a multimedia subtitle display method.
Background
In the audio and video field, a multimedia file carries different track (Track) types, such as a video track, an audio track and a subtitle track. Although a display device can obtain multiple paths of subtitles corresponding to the audio and video through the subtitle track, the prior art only supports displaying one path of subtitles, or synthesizes multiple paths of subtitles into a single path for display. As a result, multiple paths of subtitles cannot be displayed on the screen of the display device at the same time, and the user experience is poor.
Disclosure of Invention
The exemplary embodiments of the present application provide a display device, so as to solve the problem that only one path of subtitles can be displayed on a display device, and to improve the user experience.
The technical solutions provided by the embodiments of the present application are as follows:
in a first aspect, an embodiment of the present application provides a display device, including:
a user interface for receiving a selection operation of a user;
a controller configured to:
responding to the selection operation, and determining one or more paths of subtitles to be output from the subtitles of the multimedia file to be played;
acquiring subtitle data of each path of subtitle to be output;
acquiring a global clock of a play pipeline of the multimedia file to be played and synchronous rendering logic corresponding to each path of subtitle to be output;
and synchronously rendering the subtitle data of each path of subtitle to be output, respectively, based on the global clock and the synchronous rendering logic corresponding to each path of subtitle to be output, so as to synchronously display each path of subtitle to be output while the multimedia file is played.
In a second aspect, an embodiment of the present application provides a method for displaying a multimedia subtitle, including:
receiving a selection operation of a user;
responding to the selection operation, and determining one or more paths of subtitles to be output from the subtitles of the multimedia file to be played;
acquiring subtitle data of each path of subtitle to be output;
acquiring a global clock of a play pipeline of the multimedia file to be played and synchronous rendering logic corresponding to each path of subtitle to be output;
and synchronously rendering the subtitle data of each path of subtitle to be output, respectively, based on the global clock and the synchronous rendering logic corresponding to each path of subtitle to be output, so as to synchronously display each path of subtitle to be output while the multimedia file is played.
In a third aspect, embodiments of the present application provide a computer-readable storage medium, where a computer program is stored, and when the computer program is executed by a computing device, the computing device is caused to implement the method performed by the display device according to the first aspect or any embodiment of the first aspect.
In a fourth aspect, embodiments of the present application provide a computer program product which, when run on a computer, causes the computer to implement the multimedia subtitle display method of the second aspect or any embodiment of the second aspect.
As can be seen from the above technical solutions, in the display device and the multimedia subtitle display method provided in the embodiments of the present application, a selection operation of a user is received through the user interface; the controller, in response to the selection operation, determines one or more paths of subtitles to be output from the subtitles of the multimedia file to be played and acquires the subtitle data of each path of subtitle to be output; then the global clock of the playing pipeline of the multimedia file to be played and the synchronous rendering logic corresponding to each path of subtitle to be output are acquired; finally, based on the global clock and the corresponding synchronous rendering logic, the subtitle data of each path of subtitle to be output is synchronously rendered, so that each path of subtitle to be output is displayed synchronously while the multimedia file is played.
Drawings
In order to more clearly illustrate the embodiments of the present application or the implementations in the related art, the drawings required for describing the embodiments or the related art are briefly introduced below. Apparently, the drawings in the following description are only some embodiments of the present application, and those of ordinary skill in the art may obtain other drawings from these drawings.
Fig. 1 illustrates a scene architecture diagram of a multimedia subtitle display method in some embodiments;
Fig. 2 illustrates a hardware configuration block diagram of a control device in some embodiments;
Fig. 3 illustrates a hardware configuration block diagram of a display device in some embodiments;
Fig. 4 illustrates a software configuration diagram of a display device in some embodiments;
Fig. 5 illustrates a flowchart of the steps of a multimedia subtitle display method in some embodiments;
Fig. 6 is a schematic scene diagram of a multimedia subtitle display method according to an embodiment of the present application;
Fig. 7 is a flowchart of the steps of a multimedia subtitle display method according to an embodiment of the present application;
Fig. 8 is a schematic scene diagram based on the multimedia subtitle display method shown in Fig. 7 according to an embodiment of the present application;
Fig. 9 is a flowchart of the steps of another multimedia subtitle display method according to an embodiment of the present application;
Fig. 10 is a schematic scene diagram based on the multimedia subtitle display method shown in Fig. 9 according to an embodiment of the present application;
Fig. 11 is a flowchart of the steps of another multimedia subtitle display method according to an embodiment of the present application;
Fig. 12 is a schematic scene diagram based on the multimedia subtitle display method shown in Fig. 11 according to an embodiment of the present application;
Fig. 13 is a flowchart of the steps of another multimedia subtitle display method according to an embodiment of the present application;
Fig. 14 is a schematic scene diagram based on the multimedia subtitle display method shown in Fig. 13 according to an embodiment of the present application;
Fig. 15 is a flowchart of the steps of another multimedia subtitle display method according to an embodiment of the present application;
Fig. 16 is a schematic scene diagram based on the multimedia subtitle display method shown in Fig. 15 according to an embodiment of the present application;
Fig. 17 is a flowchart of the steps of another multimedia subtitle display method according to an embodiment of the present application;
Fig. 18 is a flowchart of the steps of another multimedia subtitle display method according to an embodiment of the present application.
Detailed Description
For clarity of the purposes and implementations of the present application, exemplary implementations of the present application are described clearly and completely below with reference to the accompanying drawings in which these exemplary implementations are illustrated. Apparently, the described exemplary implementations are only some, but not all, of the embodiments of the present application.
It should be noted that the brief description of the terms in the present application is only for convenience in understanding the embodiments described below, and is not intended to limit the embodiments of the present application. Unless otherwise indicated, these terms should be construed in their ordinary and customary meaning.
The terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a product or apparatus that comprises a list of elements is not necessarily limited to all elements explicitly listed, but may include other elements not expressly listed or inherent to such product or apparatus.
Fig. 1 is a schematic view of a scenario architecture of a control method of a display device according to an embodiment of the present application. As shown in fig. 1, a scenario architecture provided in an embodiment of the present application includes: control device 100, display device 200, and server 300.
The display device provided in the embodiments of the present application may take various implementation forms; for example, the display device may be a television, an intelligent speaker, a refrigerator with a display function, a curtain with a display function, a personal computer (Personal Computer, PC), a laser projection device, a display (monitor), an electronic whiteboard (electronic bulletin board), a wearable device, an on-board device, an electronic desktop (electronic table), and the like.
In some embodiments, the control device 100 may be a remote control, and the communication between the remote control and the display device may include infrared protocol communication or bluetooth protocol communication, and other short-range communication methods, and the display device may be controlled by a wireless or wired method. The user may control the display device by inputting user instructions through keys on the remote control, voice input, control panel input, etc.
In some embodiments, the control device 100 may be a terminal device as well. For example: the terminal equipment can be mobile terminals such as mobile phones, tablet computers, notebook computers and the like.
In some embodiments, the display device may also be controlled in a manner other than through the control device; for example, the selection operation of the user may be received directly through a user-selection receiving module configured inside the display device.
In some embodiments, the display device may also be in data communication with the server 300 to obtain related media assets from the server 300. The display device may communicate with the server via a local area network (LAN) or a wireless local area network (WLAN). The server 300 may provide media asset services and various content and interactions to the display device. The server 300 may be one cluster or a plurality of clusters, and may include one or more types of servers.
Fig. 2 exemplarily shows a block diagram of a configuration of the control device 100 in the embodiment shown in fig. 1. As shown in fig. 2, the control device 100 includes a controller 110, a communication interface 130, a user interface 140, a memory, and a power supply. The control device 100 may receive an operation instruction input by a user, convert the operation instruction into an instruction that can be recognized and responded to by the display device, and forward the operation instruction, or an instruction obtained by converting a voice instruction, to the display device, thereby mediating the interaction between the user and the display device.
In some embodiments, the user interface 140 of the control device 100 is configured to perform the steps of: and receiving a selection operation of a user.
In some embodiments, the user interface 140 of the control device 100 is further configured to perform the steps of: and receiving a deleting operation of a user, wherein the deleting operation is used for stopping synchronous display of the first subtitle in the subtitles to be output.
In some embodiments, the user interface 140 of the control device 100 is further configured to perform the steps of: and receiving an adding operation of a user, wherein the adding operation is used for adding synchronous display of the second subtitles except the subtitles to be output.
As shown in fig. 3, the display apparatus includes at least one of a modem 210, a communicator 220, a detector 230, an external device interface 240, a controller 250, a display 260, an audio output interface 270, a memory, a power supply, and a user interface 280.
In some embodiments, the controller 250 includes a processor, a video processor, an audio processor, a graphics processor, RAM, ROM, a first interface to an nth interface for input/output.
The display 260 includes a display screen component for presenting pictures and a driving component for driving image display; it receives image signals output from the controller and displays video content, image content, a menu manipulation interface, and a user manipulation UI interface.
The display 260 may be a liquid crystal display, an OLED display, a projection device, or a projection screen.
The communicator 220 is a component for communicating with external devices or servers according to various communication protocol types. For example, the communicator may include at least one of a Wi-Fi module, a Bluetooth module, a wired Ethernet module, another network communication protocol chip or near field communication protocol chip, and an infrared receiver. The display device may establish transmission and reception of control signals and data signals with the external control device 100 or the server 300 through the communicator 220.
A user interface operable to receive control signals entered by a user via control device 100 (e.g., an infrared remote control, etc.) or touch or gesture, etc.
The detector 230 is used to collect signals of the external environment or interaction with the outside. For example, detector 230 includes a light receiver, a sensor for capturing the intensity of ambient light; alternatively, the detector 230 includes an image collector such as a camera, which may be used to collect external environmental scenes, user attributes, or user interaction gestures, or alternatively, the detector 230 includes a sound collector such as a microphone, or the like, which is used to receive external sounds.
The external device interface 240 may include, but is not limited to, the following: high Definition Multimedia Interface (HDMI), analog or data high definition component input interface (component), composite video input interface (CVBS), USB input interface (USB), RGB port, etc. The input/output interface may be a composite input/output interface formed by a plurality of interfaces.
The modem 210 receives broadcast television signals through a wired or wireless reception manner, and demodulates audio and video signals, as well as EPG data signals, from a plurality of wireless or wired broadcast television signals.
In some embodiments, the controller 250 and the modem 210 may be located in separate devices, i.e., the modem 210 may also be located in an external device to the main device in which the controller 250 is located, such as an external set-top box or the like.
The controller 250 controls the operation of the display device and responds to the user's operations through various software control programs stored on the memory. The controller 250 controls the overall operation of the display device. For example: in response to receiving a user command to select a UI object to be displayed on the display 260, the controller 250 may perform an operation related to the object selected by the user command.
In some embodiments, the controller includes at least one of a central processing unit (Central Processing Unit, CPU), a video processor, an audio processor, a graphics processor (Graphics Processing Unit, GPU), a RAM (Random Access Memory), a ROM (Read-Only Memory), first to nth interfaces for input/output, a communication bus (Bus), and the like.
The user may input a user command through a Graphical User Interface (GUI) displayed on the display 260, and the user interface receives the user input command through the Graphical User Interface (GUI). Alternatively, the user may enter a user command by entering a specific sound or gesture, and the user interface recognizes the sound or gesture through the sensor to receive the user input command.
A "user interface" is a media interface for interaction and exchange of information between an application or operating system and a user, which enables conversion between an internal form of information and a user-acceptable form. A commonly used presentation form of the user interface is a graphical user interface (Graphic User Interface, GUI), which refers to a user interface related to computer operations that is displayed in a graphical manner. It may be an interface element such as an icon, a window, a control, etc. displayed in a display screen of the electronic device, where the control may include a visual interface element such as an icon, a button, a menu, a tab, a text box, a dialog box, a status bar, a navigation bar, a Widget, etc.
In some embodiments, the user interface 280 in the display device is configured to perform the step of receiving a selection operation of a user; the controller 250 is configured to: determine one or more paths of subtitles to be output from the subtitles of the multimedia file to be played in response to the selection operation; acquire the subtitle data of each path of subtitle to be output; acquire the global clock of the playing pipeline of the multimedia file to be played and the synchronous rendering logic corresponding to each path of subtitle to be output; and synchronously render the subtitle data of each path of subtitle to be output based on the global clock and the corresponding synchronous rendering logic, so as to synchronously display each path of subtitle to be output while the multimedia file is played.
In some embodiments, the manner in which the controller 250 obtains the subtitle data of each path of the subtitle to be output may be: acquiring an encapsulation file corresponding to the multimedia file to be played; unpacking the package file to obtain an elementary stream of the multimedia file to be played, wherein the elementary stream of the multimedia file to be played comprises at least one path of subtitle elementary stream of the subtitle; and decoding the subtitle basic stream of the subtitle to be output in the subtitle basic streams of the at least one path of subtitle to obtain the subtitle data of the subtitle to be output.
In some embodiments, the manner in which the controller 250 obtains the subtitle data of each path of the subtitle to be output may be: acquiring at least one plug-in subtitle file of the multimedia file to be played based on a native layer of the display device; respectively decapsulating the at least one plug-in subtitle file to obtain a subtitle elementary stream corresponding to the at least one plug-in subtitle file; and decoding the subtitle elementary stream of the subtitle to be output in the subtitle elementary streams corresponding to the at least one plug-in subtitle file to obtain the subtitle data of the subtitle to be output.
In some embodiments, the manner in which the controller 250 obtains the subtitle data of each path of the subtitle to be output may be: acquiring at least one plug-in subtitle file of the multimedia file to be played based on an application layer of the display device; and respectively analyzing the at least one plug-in subtitle file to acquire subtitle data corresponding to each plug-in subtitle file.
In some embodiments, the controller 250 synchronously renders the subtitle data of each path of subtitle to be output based on the global clock and the synchronous rendering logic corresponding to each path of subtitle to be output, so as to synchronously display each path of subtitle to be output while the multimedia file is played; this may be done by synchronously rendering the subtitle data of each path of subtitle to be output based on the global clock and a preset synchronous rendering logic, so that each path of subtitle to be output is displayed synchronously while the multimedia file is played.
In some embodiments, the controller 250 is further configured to add identification information of the target subtitle in a rendering event corresponding to the target subtitle; the target subtitle is any one of the one or more paths of subtitles to be output, and the identification information is used for uniquely identifying the target subtitle.
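A minimal sketch of how such identification information might be used is given below; the names (RenderEvent, SubtitleRouter and so on) are illustrative assumptions rather than the actual implementation of the display device. Each rendering event carries the unique ID of the subtitle path it belongs to, so a shared rendering stage can route every event to the correct on-screen subtitle region.

```cpp
// Illustrative sketch only: tag each rendering event with the unique ID of its
// subtitle path, then dispatch it to the renderer registered for that path.
#include <cstdint>
#include <functional>
#include <string>
#include <unordered_map>

struct RenderEvent {
    uint32_t subtitle_id;   // uniquely identifies the target subtitle path
    std::string payload;    // decoded cue content (text shown here for brevity)
};

class SubtitleRouter {
public:
    void Register(uint32_t id, std::function<void(const RenderEvent&)> sink) {
        sinks_[id] = std::move(sink);
    }
    void Dispatch(const RenderEvent& event) {
        // Only the renderer bound to this subtitle path receives the event.
        if (auto it = sinks_.find(event.subtitle_id); it != sinks_.end()) {
            it->second(event);
        }
    }
private:
    std::unordered_map<uint32_t, std::function<void(const RenderEvent&)>> sinks_;
};
```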
In some embodiments, the manner of the controller 250 obtaining the global clock of the playing pipeline of the multimedia file to be played may be: acquiring an audio clock of the multimedia file to be played; and determining the audio clock as a global clock of a playing pipeline of the multimedia file to be played.
In some embodiments, the user interface 280 is further configured to: receiving a deleting operation of a user, wherein the deleting operation is used for stopping synchronous display of a first subtitle in the subtitles to be output; the controller 250 is further configured to: and responding to the deleting operation of the user, stopping synchronously rendering the first subtitle so as to stop synchronously displaying the first subtitle when the played multimedia file is played.
In some embodiments, the user interface 280 is further configured to: receiving an adding operation of a user, wherein the adding operation is used for adding synchronous display of a second subtitle except the subtitle to be output; the controller 250 is further configured to: and acquiring the subtitle data of the second subtitle and synchronous rendering logic corresponding to the second subtitle, and synchronously rendering the subtitle data of the second subtitle according to the global clock and the synchronous rendering logic corresponding to the second subtitle so as to add synchronous display of the second subtitle.
Referring to fig. 4, in some embodiments, the operating system of the display device is divided into two parts, an application layer and a native layer. From top to bottom, the application layer is the application program layer (Applications layer), and the native layer includes the application framework layer (Application Framework layer), the Android runtime (Android Runtime) and system library layer (system runtime layer), and the kernel layer.
In some embodiments, at least one application program is running in the application program layer, and these application programs may be a Window (Window) program of an operating system, a system setting program, a clock program, or the like; or may be an application developed by a third party developer. In particular implementations, the application packages in the application layer are not limited to the above examples.
The framework layer provides an application programming interface (application programming interface, API) and a programming framework for applications. The application framework layer includes a number of predefined functions and acts as a processing center that decides the actions of the applications in the application layer. Through the APIs, an application program can access system resources and obtain system services during execution.
As shown in fig. 4, the application framework layer in the embodiment of the present application includes a manager (manager), a Content Provider (Content Provider), and the like, where the manager includes at least one of the following modules: an Activity Manager (Activity Manager) is used to interact with all activities that are running in the system; a Location Manager (Location Manager) is used to provide system services or applications with access to system Location services; a Package Manager (Package Manager) for retrieving various information about applications currently installed on the device; a notification manager (Notification Manager) for controlling the display and clearing of notification messages; a Window Manager (Window Manager) is used to manage icons, windows, toolbars, wallpaper, and desktop components on the user interface.
In some embodiments, the activity manager is used to manage the lifecycle of the individual applications as well as the usual navigation and rollback functions, such as controlling the exit, opening, and fallback of applications. The window manager is used to manage all window programs, such as obtaining the size of the display screen, determining whether a status bar exists, locking the screen, capturing the screen, and controlling changes of the display window (for example, shrinking the display window, dithering the display, distorting the display, etc.).
In some embodiments, the system runtime layer provides support for the upper layer, the framework layer, and when the framework layer is in use, the android operating system runs the C/C++ libraries contained in the system runtime layer to implement the functions to be implemented by the framework layer.
In some embodiments, the kernel layer is a layer between hardware and software. As shown in fig. 4, the kernel layer contains at least one of the following drivers: audio drive, display drive, bluetooth drive, camera drive, WIFI drive, USB drive, HDMI drive, sensor drive (e.g., fingerprint sensor, temperature sensor, pressure sensor, etc.), and power supply drive, etc.
Fig. 5 is a schematic flow chart illustrating a method for displaying multimedia subtitles according to an embodiment of the present application, and as shown in fig. 5, the method for displaying multimedia subtitles according to an embodiment of the present application includes the following steps:
S501, receiving a selection operation of a user.
In some embodiments, the device that receives the selection operation of the user may be any display device capable of playing audio and video, for example, a television, a mobile phone, etc. The receiving of the selection operation of the user may be implemented as follows: when the display device is a television, the user selects the subtitles through a remote controller, and the television receives the selection signal of the remote controller through a communication interface; when the display device is a mobile phone, the user selects the subtitles to be output through a subtitle selection interface, and a communication interface in the mobile phone receives the selection signal.
S502, determining one or more subtitles to be output from the subtitles of the multimedia file to be played in response to the selection operation.
In some embodiments, the multimedia file to be played may be a multimedia file stored locally on the display device, or may be a multimedia file downloaded by the display device from a network, for example, a multimedia file provided by video playing software on the display device. The multimedia file may include a video file, an audio file, and a subtitle file, where the audio file includes at least one path of audio and the subtitle file includes at least one path of subtitle. The subtitle to be output may be, for example, a text subtitle or a picture subtitle.
A subtitle file included in the multimedia file is called an embedded subtitle file.
S503, acquiring the subtitle data of each path of subtitle to be output.
In some embodiments, acquiring the subtitle data of each path of subtitle to be output may be implemented by decapsulating and decoding the subtitle file of each path of subtitle to be output, or by parsing the subtitle file of each path of subtitle to be output.
S504, acquiring a global clock of a playing pipeline of the multimedia file to be played and synchronous rendering logic corresponding to each path of subtitle to be output.
The method for acquiring the global clock of the playing pipeline of the multimedia file to be played includes the following steps 1 and 2:
step 1, acquiring an audio clock of the multimedia file to be played.
In some embodiments, the audio clock of the multimedia file to be played is carried by an audio stream in the multimedia file, and the audio stream is decoded to obtain the audio clock.
And step 2, determining the audio clock as a global clock of a playing pipeline of the multimedia file to be played.
In some embodiments, the global clock provides monotonically increasing absolute time, that is, the current time of audio and video playing of the multimedia file, and the display period of each path of subtitle to be output can be determined through the global clock.
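As a minimal sketch of this idea (assuming a pipeline object that tracks the audio render position; the names here are illustrative, not the actual player API), the audio position can be exposed as the single monotonically increasing clock that every subtitle path reads:

```cpp
// Hypothetical sketch: the audio rendering position serves as the global clock
// of the playing pipeline, shared by the video branch and every subtitle branch.
#include <atomic>
#include <cstdint>

class PlaybackPipeline {
public:
    // Called by the audio renderer whenever a buffer has been played out.
    void OnAudioRendered(int64_t position_ms) {
        audio_position_ms_.store(position_ms, std::memory_order_relaxed);
    }

    // Read by each subtitle synchronous-rendering branch to decide when to show a cue.
    int64_t GlobalClockMs() const {
        return audio_position_ms_.load(std::memory_order_relaxed);
    }

private:
    std::atomic<int64_t> audio_position_ms_{0};
};
```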
S505, based on the global clock and the synchronous rendering logic corresponding to each path of subtitle to be output, synchronously rendering the subtitle data of each path of subtitle to be output, respectively, so as to synchronously display each path of subtitle to be output while the multimedia file is played.
In S505, the synchronously rendering the subtitle data of each path of the subtitle to be output further includes:
and adding the identification information of the target subtitle in the rendering event corresponding to the target subtitle.
The target subtitle is any one of the one or more paths of subtitles to be output, and the identification information is used for uniquely identifying the target subtitle.
In some embodiments, the identification information corresponding to the target subtitle needs to be determined according to the current rendering event, and synchronous rendering and display are then performed. For example, when the target subtitle is synchronously rendered, the global clock may be obtained through the player pipeline to get the current running time, and synchronization is performed by comparing the display time of the target subtitle frame with this running time: if the display time of the currently delivered target subtitle frame, plus its display duration, falls within a certain threshold range of the current running time, the target subtitle frame is rendered; if it is still ahead of the current running time, rendering waits; otherwise, the frame data is discarded.
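The following is a hedged sketch of that render/wait/drop decision; the function name and the 40 ms tolerance value are illustrative assumptions, not values taken from the patent:

```cpp
// Illustrative sketch of the synchronization decision for one subtitle frame.
#include <cstdint>

enum class SyncAction { kRender, kWait, kDrop };

SyncAction DecideSubtitleFrame(int64_t clock_ms,      // current running time from the global clock
                               int64_t pts_ms,        // display timestamp of the subtitle frame
                               int64_t duration_ms,   // display duration of the subtitle frame
                               int64_t tolerance_ms = 40) {
    if (clock_ms + tolerance_ms < pts_ms) {
        return SyncAction::kWait;    // clock has not yet reached this frame: wait
    }
    if (clock_ms <= pts_ms + duration_ms + tolerance_ms) {
        return SyncAction::kRender;  // inside the display window: render now
    }
    return SyncAction::kDrop;        // display window already passed: discard the frame
}
```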
For example, referring to fig. 6, which is a scene diagram of the multimedia subtitle display method when the user selects 2 paths of subtitles to be displayed on the display device, the scene includes: the display screen 600 of the display device, the subtitle 1 display area 601, and the subtitle 2 display area 602. Specifically, the user may also set the subtitle display areas, as well as the font style, font size, and color of the subtitles.
According to the above technical solution, in the display device and the multimedia subtitle display method provided in the embodiments of the present application, the selection operation of the user is received through the user interface; the controller, in response to the selection operation, determines one or more paths of subtitles to be output from the subtitles of the multimedia file to be played and acquires the subtitle data of each path of subtitle to be output; then the global clock of the playing pipeline of the multimedia file to be played and the synchronous rendering logic corresponding to each path of subtitle to be output are acquired; finally, based on the global clock and the corresponding synchronous rendering logic, the subtitle data of each path of subtitle to be output is synchronously rendered, so that each path of subtitle to be output is displayed synchronously while the multimedia file is played.
Fig. 7 is a schematic flow chart illustrating a method for displaying multimedia subtitles according to an embodiment of the present application, and as shown in fig. 7, the method for displaying multimedia subtitles according to an embodiment of the present application includes the following steps:
S701, receiving a selection operation of a user.
S702, determining one or more subtitles to be output from the subtitles of the multimedia file to be played in response to the selection operation.
S703, obtaining the encapsulation file corresponding to the multimedia file to be played.
In some embodiments, multimedia files are divided into streaming media files and non-streaming media files. For a streaming media file, obtaining the encapsulation file corresponding to the multimedia file to be played means parsing the streaming media protocol to obtain the media segment addresses and downloading the encapsulation file; for a non-streaming media file, the encapsulation file can be obtained directly.
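A rough sketch of this distinction is shown below; the helper name ParseStreamingManifest and the URL prefix check are assumptions for illustration, not the actual protocol handling of the player:

```cpp
// Illustrative sketch: a streaming source is resolved to segment addresses first,
// while a local (non-streaming) file is used as the encapsulation file directly.
#include <string>
#include <vector>

// Hypothetical helper: a real implementation would download and parse the
// streaming manifest here; this stub simply returns the URI as one "segment".
std::vector<std::string> ParseStreamingManifest(const std::string& uri) {
    return {uri};
}

std::vector<std::string> ResolveEncapsulationSources(const std::string& uri) {
    const bool is_streaming =
        uri.rfind("http://", 0) == 0 || uri.rfind("https://", 0) == 0;
    if (is_streaming) {
        return ParseStreamingManifest(uri);   // media segment addresses to download
    }
    return {uri};                             // local container file, opened directly
}
```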
S704, unpacking the package file to obtain the basic stream of the multimedia file to be played.
The elementary streams of the multimedia file to be played comprise at least one path of subtitle elementary streams of the subtitle.
In some embodiments, the elementary streams of the multimedia file to be played further include: a video stream and at least one audio stream.
S705, decoding the subtitle basic stream of the subtitle to be output in the subtitle basic streams of the at least one path of subtitle to obtain the subtitle data of the subtitle to be output.
In some embodiments, the subtitle data of the subtitle to be output includes: for text subtitle data, the text content, display duration, and display timestamp; or, for picture subtitle data, a bitmap (Bitmap), display duration, and display timestamp.
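A possible in-memory representation of one decoded subtitle unit is sketched below; the type names are illustrative assumptions, and the point is only that text and picture subtitles share the same timing fields used for synchronization:

```cpp
// Illustrative sketch of one decoded subtitle unit (text or picture).
#include <cstdint>
#include <string>
#include <variant>
#include <vector>

struct TextCue {
    std::string text;              // decoded text content
};

struct BitmapCue {
    int width = 0;
    int height = 0;
    std::vector<uint8_t> argb;     // decoded picture-subtitle pixels
};

struct SubtitleUnit {
    int64_t pts_ms = 0;            // display timestamp
    int64_t duration_ms = 0;       // display duration
    std::variant<TextCue, BitmapCue> payload;
};
```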
S706, acquiring a global clock of a playing pipeline of the multimedia file to be played and synchronous rendering logic corresponding to each path of subtitle to be output.
S707, based on the global clock and the synchronous rendering logic corresponding to each path of subtitle to be output, synchronously rendering the subtitle data of each path of subtitle to be output, respectively, so as to synchronously display each path of subtitle to be output while the multimedia file is played.
For example, referring to fig. 8, which shows a schematic view of a scene for implementing the multi-subtitle display method based on the player architecture shown in fig. 1. The multimedia file in this scene includes a video file, an audio file, and subtitle files. The scene includes the following modules (a simplified code outline of this wiring is given after the module list):
The decapsulator 81 is configured to decapsulate the multimedia file to obtain a corresponding video stream, at least one audio stream, and at least one subtitle stream.
The buffer 82 is configured to buffer the video stream, the at least one audio stream, and the at least one subtitle stream obtained after the decapsulation by the decapsulator 81.
The buffer queue 83 is configured to buffer the video stream, at least one audio stream, and at least one subtitle stream.
An audio-video selector 84 for selecting one of the at least one audio.
A multi-subtitle selector 85, configured to select the target subtitles according to the identification information of the target subtitle added in the rendering event; the target subtitles are, for example, subtitle 1 and subtitle 2.
The video decoder 86 is configured to decode the video elementary stream to obtain video data.
The audio 1 decoder 87 is configured to decode one audio selected by the audio/video selector 84, and obtain audio data.
The subtitle 1 decoder 88 is configured to decode the elementary stream of subtitle 1 to obtain subtitle 1 data.
The subtitle 2 decoder 89 is configured to decode the elementary stream of subtitle 2 to obtain subtitle 2 data.
The rendering module 810 is configured to synchronously render video data, audio data, subtitle 1 data, and subtitle 2 data.
A display module 811 for playing audio, displaying video, subtitle 1, subtitle 2 on a display device.
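The code outline below is a hedged, highly simplified sketch of how such a player could keep one decode-and-render branch per selected subtitle path alongside the video and audio branches; none of the class or method names come from the patent.

```cpp
// Illustrative sketch of the branch structure in Fig. 8: one demuxed source feeds
// a video branch, one selected audio branch, and one branch per selected subtitle.
#include <cstddef>
#include <cstdint>
#include <vector>

struct Branch {
    uint32_t track_id = 0;   // which track of the container this branch decodes/renders
};

class MultiSubtitlePlayer {
public:
    // Called after the user picks one or more paths of subtitles to output.
    void SelectSubtitles(const std::vector<uint32_t>& subtitle_track_ids) {
        subtitle_branches_.clear();
        for (uint32_t id : subtitle_track_ids) {
            subtitle_branches_.push_back(Branch{id});  // one decoder + renderer pair per path
        }
    }

    std::size_t SubtitleBranchCount() const { return subtitle_branches_.size(); }

private:
    Branch video_branch_{};
    Branch audio_branch_{};
    std::vector<Branch> subtitle_branches_;   // all rendered against the same global clock
};
```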
Fig. 9 is a schematic flow chart illustrating a method for displaying multimedia subtitles according to an embodiment of the present application, and as shown in fig. 9, the method for displaying multimedia subtitles according to an embodiment of the present application includes the following steps:
S901, receiving a selection operation of a user.
And S902, determining one or more paths of subtitles to be output from the subtitles of the multimedia file to be played in response to the selection operation.
S903, acquiring at least one plug-in subtitle file of the multimedia file to be played based on a native layer of the display device.
In some embodiments, the native layer of the display device is a development environment based on the C++ language.
S904, respectively decapsulating the at least one external subtitle file to obtain a subtitle elementary stream corresponding to the at least one external subtitle file.
S905, decoding the subtitle elementary stream of the subtitle to be output in the subtitle elementary streams corresponding to the at least one plug-in subtitle file to obtain the subtitle data of the subtitle to be output.
S906, acquiring a global clock of a playing pipeline of the multimedia file to be played and synchronous rendering logic corresponding to each path of subtitle to be output.
S907, synchronously rendering the subtitle data of each path of subtitle to be output based on the global clock and a preset synchronous rendering logic, so as to synchronously display each path of subtitle to be output while the multimedia file is played.
For example, referring to fig. 10, a schematic view of a scenario for implementing a multi-subtitle display method based on the player architecture shown in fig. 1 is shown, where a streaming media file in the scenario includes a video file and an audio file, and the scenario includes:
the stream media file decapsulator 1001 is configured to decapsulate the stream media file to obtain a corresponding video stream and at least one audio stream.
The caption file decapsulator 1002 is configured to decapsulate the plug-in caption file to obtain at least one path of corresponding caption stream.
The buffer 1003 is configured to buffer the decapsulated video stream, at least one audio stream, and at least one subtitle stream.
The buffer queue 1004 is configured to buffer the video stream, at least one audio stream, and at least one subtitle stream.
An audio/video selector 1005 is configured to select one audio from at least one audio.
A multi-subtitle selector 1006, configured to select the target subtitles according to the identification information of the target subtitle added in the rendering event; the target subtitles are, for example, subtitle 1 and subtitle 2.
A video decoder 1007 is used for decoding the video elementary stream to obtain video data.
The audio 1 decoder 1008 is used for decoding the path of audio selected by the audio/video selector 1005 to obtain audio data.
The subtitle 1 decoder 1009 is configured to decode the elementary stream of subtitle 1 to obtain subtitle 1 data.
The subtitle 2 decoder 1010 is configured to decode the elementary stream of subtitle 2 to obtain subtitle 2 data.
The rendering module 1011 is used for synchronously rendering video data, audio data, caption 1 data and caption 2 data.
A display module 1012 is used for playing audio, displaying video, caption 1, and caption 2 on the display device.
In the technical solution of the above embodiment, the subtitle file of each path of subtitle to be output is handled in the native layer of the display device: at least one plug-in subtitle file of the multimedia file to be played is obtained in the native layer, the at least one plug-in subtitle file is decapsulated to obtain the subtitle elementary stream corresponding to each plug-in subtitle file, and the subtitle elementary stream of the subtitle to be output among these subtitle elementary streams is decoded to obtain the subtitle data of the subtitle to be output. Then, based on the global clock and a preset synchronous rendering logic, the subtitle data of each path of subtitle to be output is synchronously rendered, so that each path of subtitle to be output is displayed synchronously while the multimedia file is played, thereby realizing the display of at least one path of subtitles on the display device and improving the user experience.
Fig. 11 is a schematic flow chart illustrating a method for displaying multimedia subtitles according to an embodiment of the present application, and as shown in fig. 11, the method for displaying multimedia subtitles according to an embodiment of the present application includes the following steps:
S1101, receiving a selection operation of the user.
And S1102, determining one or more paths of subtitles to be output from the subtitles of the multimedia file to be played in response to the selection operation.
S1103, acquiring at least one plug-in subtitle file of the multimedia file to be played based on an application layer of the display device.
In some embodiments, the application layer of the display device is a development environment based on the Java language.
S1104, parsing the at least one plug-in subtitle file respectively to obtain the subtitle data corresponding to each plug-in subtitle file.
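As an illustration of what such application-layer parsing could look like for one common plug-in subtitle format (SRT), a minimal sketch follows. It is shown in C++ only for consistency with the other sketches in this description; the Java-based application layer would do the equivalent, and the SRT format assumption is an example, not a limitation from the patent.

```cpp
// Illustrative sketch: parse one SRT-style plug-in subtitle file into timed cues.
#include <cstdint>
#include <fstream>
#include <regex>
#include <string>
#include <vector>

struct Cue {
    int64_t start_ms = 0;
    int64_t end_ms = 0;
    std::string text;
};

static int64_t ToMs(int h, int m, int s, int ms) {
    return ((h * 60LL + m) * 60 + s) * 1000 + ms;
}

std::vector<Cue> ParseSrt(const std::string& path) {
    std::ifstream in(path);
    std::vector<Cue> cues;
    // Matches "HH:MM:SS,mmm --> HH:MM:SS,mmm" timing lines.
    const std::regex timing(R"((\d+):(\d+):(\d+),(\d+)\s*-->\s*(\d+):(\d+):(\d+),(\d+))");
    std::string line;
    Cue current;
    bool in_cue = false;
    while (std::getline(in, line)) {
        std::smatch m;
        if (std::regex_search(line, m, timing)) {
            current = Cue{ToMs(std::stoi(m[1]), std::stoi(m[2]), std::stoi(m[3]), std::stoi(m[4])),
                          ToMs(std::stoi(m[5]), std::stoi(m[6]), std::stoi(m[7]), std::stoi(m[8])),
                          ""};
            in_cue = true;
        } else if (in_cue && line.empty()) {
            cues.push_back(current);   // a blank line ends the current cue block
            in_cue = false;
        } else if (in_cue) {
            current.text += (current.text.empty() ? "" : "\n") + line;
        }
    }
    if (in_cue) cues.push_back(current);
    return cues;
}
```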
S1105, acquiring a global clock of a playing pipeline of the multimedia file to be played and synchronous rendering logic corresponding to each path of subtitle to be output.
S1106, synchronously rendering the subtitle data of each path of subtitle to be output, respectively, based on the global clock and the synchronous rendering logic corresponding to each path of subtitle to be output, so as to synchronously display each path of subtitle to be output while the multimedia file is played.
For example, in connection with the embodiment described in fig. 11, referring to fig. 12, which shows a schematic view of a scene for implementing the multi-subtitle display method based on the player architecture shown in fig. 1. In this scene, the subtitles to be output are 2 paths, subtitle 1 and subtitle 2, and the subtitle files containing subtitle 1 and subtitle 2 are plug-in subtitle files at the application layer. The scene includes:
the method for obtaining video and audio may refer to the description of the embodiment shown in fig. 8, which is not repeated here.
The subtitle buffer queue 1208 is used for buffering at least one path of plug-in subtitle file.
The parsing module 1209 is configured to parse at least one path of the plug-in subtitle files, and obtain subtitle data corresponding to each of the plug-in subtitle files.
The multi-subtitle synchronizer 1210 is used for acquiring a global clock of a playing pipeline of the multimedia file to be played and synchronous rendering logic corresponding to each path of subtitle to be output.
The subtitle rendering module 1211 is configured to perform synchronous rendering on subtitle data of each path of the subtitle to be output based on the global clock and synchronous rendering logic corresponding to each path of the subtitle to be output, so as to synchronously display each path of the subtitle to be output when the multimedia file is played.
The display module 1212 is configured to select subtitle 1 and subtitle 2 for rendering and display according to the identification information of the target subtitle added in the rendering event corresponding to the target subtitle.
In the technical solution of the foregoing embodiment, the subtitle file of each path of subtitle to be output is handled in the application layer of the display device: at least one plug-in subtitle file of the multimedia file to be played is obtained in the application layer, and the at least one plug-in subtitle file is parsed to obtain the subtitle data corresponding to each plug-in subtitle file. The global clock of the playing pipeline of the multimedia file to be played and the synchronous rendering logic corresponding to each path of subtitle to be output are then acquired, and the subtitle data of each path of subtitle to be output is synchronously rendered, so that each path of subtitle to be output is displayed synchronously while the multimedia file is played, thereby realizing the display of at least one path of subtitles on the display device and improving the user experience.
Fig. 13 is a schematic flow chart illustrating a method for displaying multimedia subtitles according to an embodiment of the present application, and as shown in fig. 13, the method for displaying multimedia subtitles according to an embodiment of the present application includes the following steps:
S1301, receiving a selection operation of a user.
And S1302, determining one or more paths of subtitles to be output from the subtitles of the multimedia file to be played in response to the selection operation.
S1303, obtaining the encapsulation file corresponding to the multimedia file to be played.
S1304, unpacking the package file to obtain an elementary stream of the multimedia file to be played, wherein the elementary stream of the multimedia file to be played comprises at least one path of subtitle elementary stream of the subtitle.
S1305, decoding the subtitle basic stream of the subtitle to be output in the subtitle basic streams of the at least one path of subtitle to obtain the subtitle data of the subtitle to be output.
S1306, acquiring a global clock of a playing pipeline of the multimedia file to be played and synchronous rendering logic corresponding to each path of subtitle to be output.
S1307, based on the global clock and the synchronous rendering logic corresponding to each path of subtitle to be output, synchronously rendering the subtitle data of each path of subtitle to be output, respectively, so as to synchronously display each path of subtitle to be output while the multimedia file is played.
The embodiment of the present application executes the following steps S1308 to S1312 while executing the steps S1303 to S1307 described above:
S1308, obtaining at least one plug-in subtitle file of the multimedia file to be played based on the native layer of the display device.
S1309, respectively decapsulating the at least one plug-in subtitle file to obtain a subtitle elementary stream corresponding to the at least one plug-in subtitle file.
S1310, decoding the subtitle elementary stream of the subtitle to be output in the subtitle elementary streams corresponding to the at least one plug-in subtitle file to obtain the subtitle data of the subtitle to be output.
S1311, acquiring a global clock of a playing pipeline of the multimedia file to be played and synchronous rendering logic corresponding to each path of subtitle to be output.
S1312, based on the global clock and the synchronous rendering logic corresponding to each path of subtitle to be output, synchronously rendering the subtitle data of each path of subtitle to be output, respectively, so as to synchronously display each path of subtitle to be output while the multimedia file is played.
For example, in connection with the embodiment described in fig. 13, referring to fig. 14, which shows a schematic view of a scene for implementing the multi-subtitle display method based on the player architecture shown in fig. 1. In this scene, the subtitles to be output are 2 paths: subtitle 1 and subtitle A. The method for the native layer to acquire subtitle 1 may refer to the description of the scene diagram of the embodiment shown in fig. 8, and the method for the native layer to acquire subtitle A may refer to fig. 10, which are not repeated herein.
Fig. 15 is a schematic flow chart illustrating a method for displaying multimedia subtitles according to an embodiment of the present application, and as shown in fig. 15, the method for displaying multimedia subtitles according to an embodiment of the present application includes the following steps:
S1501, receiving a selection operation of a user.
S1502, one or more subtitles to be output are determined from the subtitles of the multimedia file to be played in response to the selection operation.
S1503, obtaining the encapsulation file corresponding to the multimedia file to be played.
S1504, unpacking the package file to obtain an elementary stream of the multimedia file to be played, wherein the elementary stream of the multimedia file to be played comprises at least one path of subtitle elementary stream of the subtitle.
S1505, decoding the subtitle basic stream of the subtitle to be output in the subtitle basic streams of the at least one path of subtitle to obtain the subtitle data of the subtitle to be output.
S1506, acquiring a global clock of a playing pipeline of the multimedia file to be played and synchronous rendering logic corresponding to each path of subtitle to be output.
S1507, based on the global clock and the synchronous rendering logic corresponding to each path of subtitle to be output, synchronously rendering the subtitle data of each path of subtitle to be output, respectively, so as to synchronously display each path of subtitle to be output while the multimedia file is played.
The embodiment of the present application executes the following S1508-S1510 while executing the above S1503-S1507:
S1508, acquiring at least one plug-in subtitle file of the multimedia file to be played based on the application layer of the display device.
S1509, analyzing the at least one plug-in subtitle file respectively to acquire subtitle data corresponding to each plug-in subtitle file.
S1510, synchronously rendering the subtitle data of each path of subtitle to be output based on the global clock and a preset synchronous rendering logic, so as to synchronously display each path of subtitle to be output while the multimedia file is played.
For example, in connection with the embodiment described in fig. 15, referring to fig. 16, which shows a schematic view of a scene for implementing the multi-subtitle display method based on the player architecture shown in fig. 1. In this scene, the subtitles to be output are 2 paths: subtitle 1 and subtitle A. The method for the native layer to acquire subtitle 1 may refer to the description of the scene diagram of the embodiment shown in fig. 8; the rendered video and subtitle 1 are then delivered to the application layer, and the video and subtitle A are then displayed synchronously on the display device. The method for the application layer to acquire subtitle A may refer to the description of the scene diagram of the embodiment shown in fig. 12, which is not repeated herein.
As an extension and refinement of the above embodiment, referring to fig. 17, the multimedia subtitle display method provided in the embodiment of the present application further includes the following S1701 and S1702:
S1701, receiving a deleting operation of a user.
The deleting operation is used for stopping synchronous display of the first subtitle in the subtitles to be output.
For example, suppose the first subtitle is subtitle 1 while subtitle 1 and subtitle 2 are being displayed on the current display device. If the user no longer needs to display subtitle 1, or wants to switch subtitle 1 to subtitle 3, the user may delete subtitle 1 through a remote controller of the display device or a touch button of the user interface.
Since the subtitle synchronous rendering module does not provide the clock, subtitle decoding modules and subtitle synchronous rendering modules can be dynamically added and removed. If the user selects other subtitles during playback, the data in the corresponding subtitle decoding module and subtitle synchronous rendering module is cleared directly and the modules are removed; then a corresponding subtitle decoding module and subtitle synchronous rendering module are created based on the new elementary stream output by the multi-subtitle selector, and the connections between the data streams are established.
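A minimal sketch of such dynamic removal and creation of subtitle branches is given below; the class and method names are illustrative assumptions, and the flush-then-detach order simply mirrors the clearing of buffered data described above:

```cpp
// Illustrative sketch: subtitle branches can be detached or attached at runtime
// because they do not own the pipeline clock.
#include <cstdint>
#include <map>
#include <memory>

struct SubtitleBranch {
    // In a real player this would hold the decoder and synchronous-rendering module.
    void Flush() { /* clear queued elementary-stream data and pending cues */ }
};

class SubtitleBranchManager {
public:
    // Called when the user deletes a subtitle path.
    void Remove(uint32_t subtitle_id) {
        auto it = branches_.find(subtitle_id);
        if (it != branches_.end()) {
            it->second->Flush();       // clear buffered data first
            branches_.erase(it);       // then remove the decoder + renderer pair
        }
    }

    // Called when the user adds a new subtitle path selected by the multi-subtitle selector.
    void Add(uint32_t subtitle_id) {
        branches_.emplace(subtitle_id, std::make_unique<SubtitleBranch>());
    }

private:
    std::map<uint32_t, std::unique_ptr<SubtitleBranch>> branches_;
};
```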
S1702, responding to a deleting operation of a user, stopping synchronously rendering the first subtitle so as to stop synchronously displaying the first subtitle when the multimedia file is played.
In some embodiments, stopping the synchronous rendering of the first subtitle means finding the first subtitle through the identification information of the target subtitle and then stopping its synchronous rendering.
In the above technical solution, the embodiment of the present application receives the deleting operation of the user and, in response to it, stops the synchronous rendering of the first subtitle, so that the synchronous display of the first subtitle is stopped while the multimedia file is played. The user can thus delete unneeded subtitle displays according to current requirements, which improves the user experience.
As an extension and refinement of the above embodiment, referring to fig. 18, the multimedia subtitle display method provided in the embodiment of the present application further includes the following S1801 and S1802:
S1801, receiving an adding operation of the user.
The adding operation is used for adding synchronous display of the second subtitles except the subtitles to be output.
For example, when the current display device is displaying the subtitles 3 and 4 and the second subtitle is the subtitle 5, the user wants to display the subtitle 5 on the display device or wants to switch the subtitle 4 to the subtitle 5, and the user can add the subtitle 5 through a remote controller of the display device or a touch button of a user interface, or delete the subtitle 4 first and then add the subtitle 5.
S1802, acquiring the caption data of the second caption and the synchronous rendering logic corresponding to the second caption, and synchronously rendering the caption data of the second caption according to the global clock and the synchronous rendering logic corresponding to the second caption so as to add synchronous display of the second caption.
In the above technical solution, the embodiment of the present application receives an adding operation of a user, acquires the subtitle data of the second subtitle and the synchronous rendering logic corresponding to the second subtitle, and synchronously renders the subtitle data of the second subtitle according to the global clock and that synchronous rendering logic, so that synchronous display of the second subtitle is added. The user can thus add the subtitles to be displayed according to the current requirement, which improves the user experience.
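Under the same illustrative model as before (again, hypothetical names only), the adding operation creates a new decoding and synchronous rendering branch for the second subtitle and attaches it to the running global clock:

```python
def handle_add_operation(pipeline: PlayPipeline, second_subtitle_id: str,
                         elementary_stream) -> bool:
    """S1801/S1802 sketch: add synchronous display of a second subtitle that is
    not among the subtitles currently being output."""
    if second_subtitle_id in pipeline.branches:
        return False                  # already displayed; nothing to add
    pipeline.add_subtitle_branch(second_subtitle_id, elementary_stream)
    return True
```

Because the synchronous rendering module takes its time base from the global clock of the play pipeline rather than supplying one itself, the newly added subtitle starts rendering at the current playback position without disturbing the video or the subtitles already on screen.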
In some embodiments, the present application provides a computer-readable storage medium having a computer program stored thereon which, when executed by a computing device, causes the computing device to implement the multimedia subtitle display method according to any of the embodiments above.
In some embodiments, the present application provides a computer program product which, when run on a computer, causes the computer to implement the multimedia subtitle display method of the second aspect or any embodiment of the second aspect.
Finally, it should be noted that the above embodiments are only intended to illustrate the technical solutions of the present application, not to limit them. Although the present application has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art should understand that the technical solutions described in the foregoing embodiments may still be modified, or some or all of their technical features may be replaced by equivalents; such modifications and substitutions do not cause the essence of the corresponding technical solutions to depart from the scope of the technical solutions of the embodiments of the present application.
The foregoing description, for purposes of explanation, has been presented in conjunction with specific embodiments. However, the illustrative discussions above are not intended to be exhaustive or to limit the embodiments to the precise forms disclosed above. Many modifications and variations are possible in light of the above teaching. The embodiments were chosen and described in order to best explain the principles and the practical application, to thereby enable others skilled in the art to best utilize the embodiments and various embodiments with various modifications as are suited to the particular use contemplated.

Claims (10)

1. A display device, characterized by comprising:
a user interface for receiving a selection operation of a user;
A controller configured to:
responding to the selection operation, and determining one or more paths of subtitles to be output from the subtitles of the multimedia file to be played;
acquiring subtitle data of each path of subtitle to be output;
acquiring a global clock of a play pipeline of the multimedia file to be played and synchronous rendering logic corresponding to each path of subtitle to be output;
and respectively carrying out synchronous rendering on the subtitle data of each path of the subtitle to be output based on the global clock and the synchronous rendering logic corresponding to each path of the subtitle to be output, so as to synchronously display each path of the subtitle to be output when the multimedia file to be played is played.
2. The display device of claim 1, wherein the controller is specifically configured to:
acquiring an encapsulation file corresponding to the multimedia file to be played;
decapsulating the encapsulation file to obtain an elementary stream of the multimedia file to be played, wherein the elementary stream of the multimedia file to be played comprises a subtitle elementary stream of at least one path of subtitle;
and decoding the subtitle elementary stream of the subtitle to be output in the subtitle elementary streams of the at least one path of subtitle to obtain the subtitle data of the subtitle to be output.
3. The display device of claim 1, wherein the controller is specifically configured to:
acquiring at least one plug-in subtitle file of the multimedia file to be played based on a native layer of the display device;
respectively decapsulating the at least one plug-in subtitle file to obtain a subtitle elementary stream corresponding to the at least one plug-in subtitle file;
and decoding the subtitle elementary stream of the subtitle to be output in the subtitle elementary streams corresponding to the at least one plug-in subtitle file to obtain the subtitle data of the subtitle to be output.
4. The display device of claim 1, wherein the controller is specifically configured to:
acquiring at least one plug-in subtitle file of the multimedia file to be played based on an application layer of the display device;
and respectively analyzing the at least one plug-in subtitle file to acquire subtitle data corresponding to each plug-in subtitle file.
5. The display device of claim 4, wherein the controller is specifically configured to:
and synchronously rendering the subtitle data of each path of subtitle to be output based on the global clock and a preset synchronous rendering logic, so as to synchronously display each path of subtitle to be output when the multimedia file to be played is played.
6. The display device of claim 1, wherein the controller is further configured to:
adding identification information of a target subtitle in a rendering event corresponding to the target subtitle;
the target subtitle is any one of the one or more paths of subtitles to be output, and the identification information is used for uniquely identifying the target subtitle.
7. The display device of claim 1, wherein the controller is specifically configured to:
acquiring an audio clock of the multimedia file to be played;
and determining the audio clock as a global clock of a playing pipeline of the multimedia file to be played.
8. The display device according to any one of claims 1-7, wherein,
the user interface is further configured to: receiving a deleting operation of a user, wherein the deleting operation is used for stopping synchronous display of a first subtitle in the subtitles to be output;
the controller is further configured to: and responding to the deleting operation of the user, stopping synchronously rendering the first subtitle so as to stop synchronously displaying the first subtitle when the multimedia file to be played is played.
9. The display device according to any one of claims 1-7, wherein,
the user interface is further configured to: receiving an adding operation of a user, wherein the adding operation is used for adding synchronous display of a second subtitle except the subtitle to be output;
the controller is further configured to: and acquiring the subtitle data of the second subtitle and synchronous rendering logic corresponding to the second subtitle, and synchronously rendering the subtitle data of the second subtitle according to the global clock and the synchronous rendering logic corresponding to the second subtitle so as to add synchronous display of the second subtitle.
10. A multimedia subtitle display method, comprising:
receiving a selection operation of a user;
responding to the selection operation, and determining one or more paths of subtitles to be output from the subtitles of the multimedia file to be played;
acquiring subtitle data of each path of subtitle to be output;
acquiring a global clock of a play pipeline of the multimedia file to be played and synchronous rendering logic corresponding to each path of subtitle to be output;
and respectively carrying out synchronous rendering on the subtitle data of each path of the subtitle to be output based on the global clock and the synchronous rendering logic corresponding to each path of the subtitle to be output, so as to synchronously display each path of the subtitle to be output when the multimedia file to be played is played.
CN202211632453.7A 2022-12-19 2022-12-19 Display device and multimedia subtitle display method Pending CN117651196A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211632453.7A CN117651196A (en) 2022-12-19 2022-12-19 Display device and multimedia subtitle display method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211632453.7A CN117651196A (en) 2022-12-19 2022-12-19 Display device and multimedia subtitle display method

Publications (1)

Publication Number Publication Date
CN117651196A true CN117651196A (en) 2024-03-05

Family

ID=90046585

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211632453.7A Pending CN117651196A (en) 2022-12-19 2022-12-19 Display device and multimedia subtitle display method

Country Status (1)

Country Link
CN (1) CN117651196A (en)

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination