WO2022010130A1 - Electronic device and control method therefor - Google Patents

Electronic device and control method therefor

Info

Publication number
WO2022010130A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
electronic device
signals
image data
data
Prior art date
Application number
PCT/KR2021/007715
Other languages
English (en)
Korean (ko)
Inventor
신현종
윤석현
Original Assignee
삼성전자(주) (Samsung Electronics Co., Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 삼성전자(주) (Samsung Electronics Co., Ltd.)
Publication of WO2022010130A1

Classifications

    • H — ELECTRICITY
    • H04 — ELECTRIC COMMUNICATION TECHNIQUE
    • H04N — PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 — Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 — Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43 — Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/4316 — Generation of visual interfaces for content selection or interaction; Content or additional data rendering, for displaying supplemental content in a region of the screen, e.g. an advertisement in a separate window
    • H04N21/4305 — Content synchronisation processes; Synchronising client clock from received content stream, e.g. locking decoder clock with encoder clock, extraction of the PCR packets
    • H04N21/43635 — Interfacing a local distribution network involving a wired protocol, e.g. IEEE 1394; HDMI
    • H04N21/4382 — Interfacing the downstream path of the transmission network; Demodulation or channel decoding, e.g. QPSK demodulation
    • H04N21/440263 — Reformatting operations of video signals by altering the spatial resolution, e.g. for displaying on a connected PDA
    • H04N21/440281 — Reformatting operations of video signals by altering the temporal resolution, e.g. by frame skipping
    • H04N7/00 — Television systems
    • H04N7/0117 — Conversion of standards involving conversion of the spatial resolution of the incoming video signal
    • H04N7/012 — Conversion between an interlaced and a progressive signal

Definitions

  • the present invention relates to an electronic device that processes and displays a multi-view image, and a method for controlling the same.
  • a conventional display device requires a plurality of video processors, one for each of the plurality of image signals received through a plurality of HDMI receivers, for multi-view display.
  • since the conventional display apparatus performs separate image processing on each of the plurality of image signals received through the plurality of HDMI receivers in order to display a multi-view image, the data throughput on the transmission line becomes very large as the amount of data to be processed increases, and the system configuration cost increases.
  • the electronic device includes an input signal processing unit that receives a plurality of image signals, extracts image data from each of the plurality of image signals, and generates a single integrated image signal having integrated image data in which a plurality of images based on the plurality of image data are respectively allocated to a plurality of regions of a screen, and an image processing unit that generates an image frame for display on the screen based on the integrated image data of the integrated image signal.
  • the plurality of image signals may be received through a high-definition multimedia interface (HDMI).
  • the input signal processing unit includes a decoder for decoding the plurality of received image signals, and an image integrator for generating the integrated image data by extracting a plurality of image data from each of the decoded plurality of image signals.
  • the image integrator may generate the integrated image signal based on information about at least one of the positions and sizes of the plurality of image signals.
  • the image integrator may generate a synchronization signal of the integrated image signal corresponding to the integrated image data based on the synchronization signals of the plurality of image signals.
  • the integrated image signal may include information on differences in position or size before and after the integration of the plurality of image signals.
  • the electronic device may further include a display unit for displaying the image frame based on the integrated image signal.
  • the display unit may include a timing controller for adjusting the timing of the image frame generated by the image processing unit.
  • the electronic device may further include a second interface unit for receiving the second image signal and a second image processing unit for generating a second image frame for displaying on the screen based on the second image signal.
  • the display unit may display the integrated image signal composited with the second image signal.
  • the image processing unit may perform at least one of decoding corresponding to the image format of the image data, de-interlacing for converting interlaced image data into progressive image data, scaling for adjusting the image data to a preset resolution, noise reduction for image quality improvement, detail enhancement, and frame refresh rate conversion.
  • a method of controlling an electronic device includes receiving a plurality of image signals; extracting image data from each of the plurality of image signals; generating a single integrated image signal having integrated image data in which a plurality of images based on the plurality of image data are respectively allocated to a plurality of regions of a screen; and generating an image frame for display on the screen based on the integrated image data of the integrated image signal.
  • the generating of the integrated image signal may include decoding the plurality of image signals.
  • the generating of the integrated image signal may be based on information on at least one of the positions and sizes of the plurality of image signals.
  • the generating of the combined image signal may include generating a synchronization signal of the combined image signal corresponding to the combined image data based on the synchronization signals of the plurality of image signals.
  • the method of controlling the electronic device may further include receiving a second image signal from a second interface unit.
  • the method of controlling the electronic device may further include displaying the integrated image signal composited with the second image signal.
  • the generating of the image frame may include performing at least one of decoding corresponding to the image format of the image data, de-interlacing for converting interlaced image data into progressive image data, scaling for adjusting the image data to a preset resolution, noise reduction for improving image quality, detail enhancement, and frame refresh rate conversion.
  • the electronic device integrates a plurality of images to be displayed on a plurality of areas of a screen into one integrated image signal at the input stage, and then processes it into an image frame for display on the screen in a single image processing unit, thereby reducing the data throughput on the transmission line and reducing the system configuration cost, as sketched in the example below.
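  • The following is a minimal sketch of that claimed flow, assuming the patent's units map onto simple functions; all function names (decode_tmds, integrate, process) are hypothetical stand-ins, and frames are modeled as NumPy arrays rather than real TMDS streams.

```python
# Hedged sketch of the single-path flow; not a real HDMI stack.
import numpy as np

def decode_tmds(signal: bytes) -> np.ndarray:
    # Stand-in for the decoder 111: yields a decoded 4K active frame.
    return np.zeros((2160, 3840, 3), dtype=np.uint8)

def integrate(frames: list) -> np.ndarray:
    # Stand-in for the image integrator 112: tile 2x2 into one 8K MVD buffer.
    top = np.hstack(frames[0:2])
    bottom = np.hstack(frames[2:4])
    return np.vstack([top, bottom])

def process(mvd: np.ndarray) -> np.ndarray:
    # Stand-in for the single image processing unit 12 (scaling, NR, etc.).
    return mvd

signals = [b"hdmi1", b"hdmi2", b"hdmi3", b"hdmi4"]   # four received signals
mvd = integrate([decode_tmds(s) for s in signals])   # one integrated buffer
frame = process(mvd)                                 # one processor, one pass
assert frame.shape == (4320, 7680, 3)                # 8K canvas from 4x4K inputs
```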
  • FIG. 1 is a diagram illustrating a screen of an electronic device according to a first embodiment of the present invention.
  • FIG. 2 is a block diagram showing the configuration of an electronic device according to a first embodiment of the present invention.
  • FIG. 3 is a flowchart illustrating a control method of an electronic device for displaying a divided image according to a first embodiment of the present invention.
  • FIG. 4 is a diagram showing video signals 1 to 4 of the TMDS format.
  • FIG. 5 is a diagram illustrating an integrated video signal (MVS).
  • FIG. 6 is a block diagram showing the configuration of an electronic device according to a second embodiment of the present invention.
  • FIG. 7 is a block diagram showing the configuration of an electronic device according to a third embodiment of the present invention.
  • FIG. 8 is a diagram illustrating a scenario in which a plurality of image signals of an electronic device are processed and displayed according to a fourth embodiment of the present invention.
  • FIG. 9 is a diagram illustrating a scenario in which a plurality of image signals of an electronic device are processed and displayed according to a fifth embodiment of the present invention.
  • expressions such as “have,” “may have,” “includes,” or “may include” indicate the presence of a corresponding characteristic (e.g., a numerical value, function, operation, or component such as a part) and do not exclude the presence of additional features.
  • expressions such as “A or B,” “at least one of A or/and B,” or “one or more of A or/and B” may include all possible combinations of the items listed together.
  • “A or B,” “at least one of A and B,” or “at least one of A or B” may refer to any of the cases of (1) including at least one A, (2) including at least one B, or (3) including both at least one A and at least one B.
  • the expression “a device configured to” may mean that the device is “capable of” performing an operation together with other devices or parts.
  • a processor configured to perform A, B, and C may refer to a dedicated processor for performing the corresponding operations (e.g., an embedded processor), or a generic-purpose processor (e.g., a CPU or an application processor) capable of performing the corresponding operations by executing one or more software programs stored in a memory device.
  • the electronic device 1 processes an image and may include, for example, at least one of a television, a smartphone, a tablet PC, a mobile phone, a video phone, an e-book reader, a desktop PC, a laptop PC, a netbook computer, a workstation, a server, a PDA, a portable multimedia player (PMP), an MP3 player, a medical device, a camera, and a wearable device.
  • the electronic device 1 may include, for example, at least one of a Blu-ray player, a digital video disk (DVD) player, a set-top box, a home automation control panel, a security control panel, a media box, a game console, an electronic dictionary, a camcorder, and an electronic picture frame.
  • the electronic device 1 may include at least one of a navigation device, a global navigation satellite system (GNSS), an event data recorder (EDR), a flight data recorder (FDR), an automotive infotainment device, marine electronic equipment (e.g., a navigation device for a ship, a gyro compass, etc.), avionics, a security device, and a head unit for a vehicle.
  • the term user may refer to a person who uses the electronic device 1 or a device (eg, an artificial intelligence electronic device) using the electronic device 1 .
  • FIG. 1 is a view showing a screen of an electronic device 1 according to a first embodiment of the present invention.
  • the electronic device 1 divides the screen into four areas and displays four images (Videos 1 to 4).
  • the electronic device 1 is not limited to dividing and displaying 4 images on one screen, and may divide and display 1 to 3 images or 5 or more images.
  • FIG. 2 is a block diagram showing the configuration of the electronic device 1 according to the first embodiment of the present invention.
  • the electronic device 1 may include an interface unit 10 capable of receiving, for example, four image signals from a source device 2, an input signal processing unit 11, an image processing unit 12, a display unit 13, an audio processing unit 14, an audio output unit 15, and a processor 16.
  • the source device 2 may include a device capable of transmitting image content to the electronic device 1 , for example, a set-top box, a server, a relay device, a computer, or a mobile device.
  • the interface unit 10 may include, for example, four High-Definition Multimedia Interface (HDMI) receivers Rx1 to Rx4 for receiving four video signals in the Transition Minimized Differential Signaling (TMDS) format.
  • the interface unit 10 may include a wired interface unit and a wireless interface unit.
  • the wired interface unit may include a tuner for connecting a terrestrial/satellite broadcasting antenna to receive broadcast signals, a cable connection interface for cable broadcasting, and the like.
  • the wired interface unit may include HDMI, DP, DVI, Component, S-Video, and composite (RCA terminal) for connecting video devices.
  • the wired interface unit may include a USB interface for connecting a general-purpose electronic device.
  • the wired interface unit may include a connection interface of an optical cable device.
  • the wired interface unit may include an audio device connection interface such as a headset, an earphone, and an external speaker.
  • the wired interface unit may include a connection interface of a wired network device such as Ethernet.
  • the wireless interface unit may include a connection interface of a wireless network device such as Wi-Fi, Bluetooth, ZigBee, Z-wave, RFID, WiGig, WirelessHD, Ultra-Wide Band (UWB), Wireless USB, and Near Field Communication (NFC).
  • the wireless interface unit may include an IR transceiver module for transmitting and/or receiving a remote control signal.
  • the wireless interface unit may include a mobile communication device connection interface such as 2G to 5G.
  • the interface unit 10 may include a dedicated communication module for performing dedicated communication with respect to each of the various source devices 2 .
  • the interface unit 10 may include a common communication module for performing communication in common with the various source devices 2 , for example, a Wi-Fi module.
  • the interface unit 10 may include an input interface unit and an output interface unit.
  • the input interface unit and the output interface unit may be integrated into one module or implemented as separate modules.
  • the input signal processing unit 11 may generate one integrated video signal MVS by integrating the four 4k video signals 1 to 4 received through HDMI Rx1 to Rx4.
  • the input signal processing unit 11 may include a decoder 111 for decoding an image signal, an image integrator 112 for integrating a plurality of image signals, and an audio switching unit 113 .
  • the decoder 111 may decode the video signals 1 to 4 of the TMDS format, respectively.
  • the decoder 111 may include, for example, a Forward Error Correction (FEC) decoder or a Display Stream Compression (DSC) decoder.
  • the image integrator 112 may extract, for example, the active image data 1-4 from the image signals 1-4 of the TMDS format, respectively.
  • the image integrator 112 may generate merged video data (MVD) in which the extracted four active image data 1 to 4 are allocated to four regions of the screen at a set position and size, respectively.
  • the image integrator 112 may select an image signal to be displayed on the screen from among the plurality of image signals received from the interface unit 10 .
  • the image integrator 112 may generate the integrated image data (MVD) based on information about at least one of the position and/or size of each selected image signal.
  • the image integrator 112 may generate a synchronization signal corresponding to the combined image data MVD based on the synchronization signals of the four image signals 1 to 4 .
  • the image integrator 112 may generate merged video information (MVI) about the four integrated video signals 1 to 4, for example, information on differences before and after integration, such as changes in the size of the images made during integration.
  • the image integrator 112 may generate a merged video signal (MVS) based on the merged video data (MVD), the synchronization signal, and the merged video information (MVI) generated during integration, and transmit the generated merged video signal (MVS) to the image processing unit 12, as in the compositing sketch below.
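  • The following is a minimal compositing sketch of the integrator's role, assuming frames are NumPy arrays; the names Region and merge_video_signals are hypothetical, and the nearest-neighbor resize merely stands in for whatever scaling the hardware applies.

```python
# Hedged sketch: composite decoded active frames into one MVD buffer and
# record per-input position/size changes as MVI. Illustrative only.
from dataclasses import dataclass
import numpy as np

@dataclass
class Region:
    x: int  # left edge of the target region on the integrated screen
    y: int  # top edge
    w: int  # target width (the source is resized to fit)
    h: int  # target height

def merge_video_signals(frames, regions, out_w, out_h):
    mvd = np.zeros((out_h, out_w, 3), dtype=np.uint8)
    mvi = []
    for frame, r in zip(frames, regions):
        src_h, src_w = frame.shape[:2]
        ys = np.arange(r.h) * src_h // r.h   # nearest-neighbor row map
        xs = np.arange(r.w) * src_w // r.w   # nearest-neighbor column map
        mvd[r.y:r.y + r.h, r.x:r.x + r.w] = frame[ys][:, xs]
        mvi.append({"src_size": (src_w, src_h),
                    "dst_rect": (r.x, r.y, r.w, r.h)})  # before/after record
    return mvd, mvi

# Four 4K inputs tiled onto an 8K canvas, one per quadrant.
inputs = [np.full((2160, 3840, 3), 50 * i, np.uint8) for i in range(1, 5)]
quads = [Region(0, 0, 3840, 2160), Region(3840, 0, 3840, 2160),
         Region(0, 2160, 3840, 2160), Region(3840, 2160, 3840, 2160)]
mvd, mvi = merge_video_signals(inputs, quads, 7680, 4320)
```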
  • the audio switching unit 113 may extract the four audio signals Audio 1 to 4 included in each of the four image signals 1 to 4 and transmit them to the audio processing units 1 to 4 (14).
  • the audio switching unit 113 may switch the audio signals Audio 1 to 4 so that they are assigned to the audio processing units 1 to 4 based on information about the output devices, for example, speakers 1 to 4, that are to output the four audio signals corresponding to the four images Video 1 to 4.
  • Information on speakers 1 to 4 to output the audio signals Audio 1 to 4 may be received from the processor 16 through a user input.
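  • A minimal routing sketch of this switching, assuming a simple table from stream names to sinks; the route table and switch_audio are illustrative, not from the patent.

```python
# Hedged sketch: route each extracted audio stream to the processing path of
# its user-selected output device (information held by the processor 16).
routes = {"Audio1": "speaker1", "Audio2": "speaker2",
          "Audio3": "speaker3", "Audio4": "bt_speaker"}

def switch_audio(streams: dict, routes: dict) -> dict:
    # Assign each decoded audio stream to the sink chosen for it.
    return {routes[name]: data for name, data in streams.items()}

assigned = switch_audio({"Audio1": b"pcm1", "Audio2": b"pcm2",
                         "Audio3": b"pcm3", "Audio4": b"pcm4"}, routes)
print(assigned.keys())  # dict_keys(['speaker1', 'speaker2', 'speaker3', 'bt_speaker'])
```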
  • the input signal processing unit 11 can be implemented in a form included in a main SoC (Main SoC) mounted on a PCB embedded in the electronic device 1 .
  • the input signal processing unit 11 loads at least a part of the control program, including instructions, from the nonvolatile memory in which the control program is installed into the volatile memory, and the processor 16 executes the instructions of the loaded control program.
  • the image processing unit 12 performs various image processing processes on the integrated image signal (MVS) received from the input signal processing unit 11 to generate image frames to be displayed on the display unit 13.
  • the image processing process may include, for example, decoding corresponding to the image format of the integrated image data (MVD), de-interlacing for converting interlaced integrated image data into progressive data, scaling for adjusting the integrated image data to a preset resolution, noise reduction for image quality improvement, detail enhancement, and frame refresh rate conversion (a staged-pipeline sketch follows after the next bullet).
  • the image processing unit 12 may transmit an image frame resulting from performing this process to the display unit 13 built in the electronic device 1 .
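  • The sketch below models these stages as composable functions running on the single merged frame; the stage stubs and run_pipeline are hypothetical stand-ins for the hardware blocks named above.

```python
# Hedged sketch of the single image-processing path; stage bodies are stubs.
from typing import Callable, List
import numpy as np

Frame = np.ndarray  # H x W x 3 integrated frame extracted from the MVS

def deinterlace(f: Frame) -> Frame:
    return f  # placeholder: interlaced-to-progressive conversion

def scale_to(w: int, h: int) -> Callable[[Frame], Frame]:
    def _scale(f: Frame) -> Frame:
        ys = np.arange(h) * f.shape[0] // h   # nearest-neighbor resize
        xs = np.arange(w) * f.shape[1] // w
        return f[ys][:, xs]
    return _scale

def noise_reduce(f: Frame) -> Frame:
    return f  # placeholder for noise reduction / detail enhancement

def run_pipeline(frame: Frame, stages: List[Callable[[Frame], Frame]]) -> Frame:
    for stage in stages:  # one processor runs every stage on the merged frame
        frame = stage(frame)
    return frame

panel_frame = run_pipeline(np.zeros((4320, 7680, 3), np.uint8),
                           [deinterlace, scale_to(7680, 4320), noise_reduce])
```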
  • the display unit 13 may display the image frame processed by the image processing unit 12 .
  • the implementation method of the display unit 13 is not limited, and it may be implemented with various display panel types such as liquid crystal, plasma, light-emitting diode, organic light-emitting diode, surface-conduction electron-emitter, carbon nano-tube, and nano-crystal panels.
  • the display unit 13 may additionally include an additional configuration according to an implementation method.
  • the display unit 13 may include a timing controller 131 for adjusting the timing of an image frame generated by the image processing unit 12 and a panel 132 constituting a screen for displaying an image.
  • the display unit 13 may further include a panel driver that additionally drives the panel 132 .
  • the audio processing units 1 to 4 (14) may process the received four audio signals (Audio 1 to 4).
  • the audio processing units 1 to 4 (14) may convert the digital audio signals Audio 1 to 4 received from the audio switching unit 113 into analog audio signals Audio 1 to 4, and amplify, mix them, and the like.
  • the audio processing units 1 to 4 14 may output the mixed analog audio signals Audio 1 to 4 to the audio output unit 15 .
  • the audio output unit 15 may include four speakers 1 to 4 for reproducing each audio signal (Audio 1 to 4) included in the four image signals 1 to 4 .
  • the speakers 1 to 3 may be built into the electronic device 1, and the speaker 4 may be provided externally and connected through a second interface unit, for example, the Bluetooth communication module 19.
  • all of the speakers 1 to 4 may be built-in, or all may be provided externally.
  • the processor 16 can control each component of the electronic device 1, for example, the interface unit 10, the input signal processing unit 11, the image processing unit 12, the display unit 13, the audio processing unit 14, and the audio output unit 15.
  • the processor 16 may transmit, to the input signal processing unit 11, information about the selection of the images to be displayed on the plurality of regions of the screen, the positions at which the selected images are displayed, and the sizes of the displayed images. Such information may be obtained, for example, based on screen setting information input by the user through the OSD.
  • when a screen setting change is input, the processor 16 may transmit it to the input signal processing unit 11 so that it is reflected in the generation of the integrated image signal (MVS).
  • the screen setting change input may include changing an image displayed in a specific region into a new image, exchanging a display image between regions, or changing the size of a displayed image.
  • the processor 16 may transmit the finally set screen setting information to the input signal processing unit 11 to be reflected in the generation of the integrated video signal (MVS).
  • the input signal processing unit 11 may directly refer to the last screen setting information stored in the memory.
  • the processor 16 may transmit screen setting information set for each user ID to the input signal processing unit 11 .
  • the processor 16 may store information on the speakers 1 to 4 that are to reproduce the audio signals Audio 1 to 4 corresponding to the four images Video 1 to 4 selected or set by the user, or transmit it to the input signal processing unit 11.
  • the processor 16 may perform at least a part of data collection, analysis, and processing to generate result information using at least one of a rule-based algorithm and an artificial intelligence (AI) algorithm such as a machine learning, neural network, or deep learning algorithm.
  • the processor 16 may perform the functions of a learning unit and a recognition unit.
  • the learning unit may, for example, perform a function of generating a trained neural network, and the recognition unit may perform a function of recognizing (or reasoning, predicting, estimating, determining) data using the trained neural network.
  • the learning unit may create or update the neural network.
  • the learning unit may acquire learning data to generate a neural network.
  • the learning unit may acquire learning data from a memory or from the outside.
  • the training data may be data used for learning of the neural network.
  • the learning unit may perform a preprocessing operation on the acquired training data before training the neural network using the training data, or may select data to be used for learning from among a plurality of training data. For example, the learning unit may process the learning data into a preset format, filter it, or add/remove noise to process the learning data into a form suitable for learning.
  • the trained neural network may be composed of a plurality of neural networks (or layers). The nodes of the plurality of neural networks have weights, and the plurality of neural networks may be connected to each other so that an output value of one neural network is used as an input value of another neural network.
  • neural networks may include models such as Convolutional Neural Network (CNN), Deep Neural Network (DNN), Recurrent Neural Network (RNN), Restricted Boltzmann Machine (RBM), Deep Belief Network (DBN), Bidirectional Recurrent Deep Neural Network (BRDNN), and Deep Q-Networks.
  • the recognizer may acquire target data.
  • the target data may be obtained from a memory or externally.
  • the target data may be data to be recognized by the neural network.
  • the recognizer may perform preprocessing on the acquired target data before applying the target data to the learned neural network, or select data to be used for recognition from among a plurality of target data.
  • the recognition unit may process the target data into a preset format, filter, or add/remove noise to process the target data into a form suitable for recognition.
  • the recognizer may obtain an output value output from the neural network by applying the preprocessed target data to the neural network.
  • the recognition unit may acquire a probability value (or a reliability value) together with the output value.
  • the processor 16 may include at least one general-purpose processor that loads at least a portion of the control program, including instructions, from the nonvolatile memory in which the control program is installed into the volatile memory and executes the instructions of the loaded control program, and may be implemented as, for example, a central processing unit (CPU), an application processor (AP), or a microprocessor.
  • the processor 16 may include single core, dual core, triple core, quad core, and multiple cores thereof. A plurality of processors 16 may be provided.
  • the processor 16 may include, for example, a main processor and a sub-processor operating in a sleep mode (eg, a mode in which only standby power is supplied).
  • the processor, ROM and RAM are interconnected through an internal bus.
  • the processor 16 may be implemented in a form included in a main SoC mounted on a PCB embedded in the electronic device 1 .
  • the main SoC may further include an image processing unit.
  • the control program may include program(s) implemented in the form of at least one of a BIOS, a device driver, an operating system, firmware, a platform, and an application program (application).
  • the application program may be installed or stored in the electronic device 1 at the time of manufacture, or may be installed later based on application program data received from the outside at the time of use.
  • Data of the application program may be downloaded to the electronic device 1 from, for example, an external server such as an application market.
  • Such a control program, an external server, etc. is an example of a computer program product, but is not limited thereto.
  • the electronic device 1 may further include a memory 17 .
  • the memory 17 is a computer-readable recording medium in which data is stored without limitation.
  • the memory 17 is accessed by the processor 16, and reading, writing, modifying, deleting, updating, and the like of the data are performed by it.
  • the memory 17 may store screen setting information, for example, image information allocated to display in a plurality of areas of the screen, image location information, image size information, and the like.
  • the data stored in the memory 17 may include various image/audio content received through the interface unit 10 and multiple frame data sequentially displayed by processing the received image.
  • the memory 17 may include a voice recognition module (voice recognition engine) for voice recognition.
  • the memory 17 may include an operating system, various applications executable on the operating system, image data, additional data, and the like.
  • the memory 17 includes a non-volatile memory in which the control program is installed, and a volatile memory in which at least a part of the installed control program is loaded.
  • the memory 17 may include at least one type of storage medium among a flash memory type, a hard disk type, a multimedia card micro type, a card type memory (e.g., SD or XD memory), random access memory (RAM), static random access memory (SRAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), programmable read-only memory (PROM), magnetic memory, a magnetic disk, and an optical disk.
  • the electronic device 1 may include a voice recognition unit 18 .
  • the voice recognition unit 18 may execute the voice recognition module stored in the memory 17 and recognize voice input from or transmitted by a microphone built into the electronic device 1 or an external device, for example, a microphone built into a mobile device or a remote control.
  • the mobile device or the remote control may digitize the analog voice signal and transmit it to the electronic device 1 through, for example, Bluetooth.
  • the received analog voice signal may be digitized and transmitted to the processor 16 of the electronic device 1 .
  • the electronic device 1 may transmit the received voice signal to the server.
  • the server may be an STT server that converts voice signal related data into appropriate text or a main server that also performs an STT server function.
  • the data processed in the STT server may be received again by the electronic device 1 or may be directly transmitted to another server.
  • the electronic device 1 may process the received voice signal by itself in the electronic device 1 without transmitting the received voice signal to the STT server. That is, the electronic device 1 may perform the STT server role by itself.
  • the electronic device 1 may perform a specific function using text transmitted from the server or text converted by itself.
  • it may be the processor 16 in the electronic device 1 that performs the function, or a separate server to which the converted text is transmitted (a server different from the STT server or a server that also serves as an STT server).
  • FIG. 3 is a flowchart illustrating a control method of the electronic device 1 for displaying a divided image according to an embodiment of the present invention.
  • the user can set Video1, Video2, Video3, and Video4 to be displayed in the first, second, third, and fourth quadrants of the screen, respectively. In this case, the user can also adjust the sizes of Video1, Video2, Video3, and Video4 displayed in each area.
  • the processor 16 may transmit such screen setting input information, that is, the position and/or size of the image to the input signal processing unit 11 .
  • the interface unit 10 may receive four image signals 1 to 4 .
  • the interface unit 10 may be HDMI as shown in FIG. 2 .
  • the input signal processing unit 11 may select four 4k video signals 1 to 4 to be displayed on the screen among the plurality of video signals, and perform decoding.
  • the video signals 1 to 4 may be formed in four TMDS formats as shown in FIG. 4 .
  • Transition Minimized Differential Signaling (TMDS) transmits video, audio, and other data using one of three period types: the 'video data period', the 'data island period', and the 'control period'. During the 'video data period', active image data is transmitted.
  • the 'data island period' occurs during the 'horizontal/vertical retrace period', so that voice and other data may be divided into several packets and transmitted.
  • the 'control period' may occur between the 'image data period' and the 'data island period'.
  • HDMI can transmit 10-bit image data using the 8b/10b encoding method for the 'video data period' and the 2b/10b encoding method for the 'control period'.
  • HDMI can transmit voice and other data in the 4b/10b encoding method during the 'data island period'.
  • data as much as 32 pixels may be transmitted in one 'data island period', and a 32-bit packet header describing the contents of the packet may be included.
  • the packet header may include 8-bit BCH Error Correction Code (ECC) parity data for an error correction function.
  • Each packet can have four subpackets, each consisting of 64 bits, and each subpacket may also include 8-bit BCH ECC parity data. A maximum of 18 packets can be transmitted in each 'data island period'. Of the 15 packet types in the HDMI 1.3a specification, 7 can be allocated for audio and the remaining 8 for other data; among these are the standard control packet and the gamut metadata packet (these figures are worked through in the sketch below).
  • the standard control packet may have a function (AVMUTE) for muting a sound when noise is generated and information on color depth.
  • the Gamut metadata packet may contain information about the color space for the video stream being reproduced, which is required to use xvYCC.
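  • The packet figures quoted above imply the following capacity arithmetic; this is a worked sketch from the stated HDMI 1.3a-era numbers, not a citation of the specification itself.

```python
# Worked arithmetic from the packet figures quoted in the text.
HEADER_BITS = 32           # packet header, includes 8-bit BCH ECC parity
SUBPACKETS = 4             # subpackets per packet
SUBPACKET_BITS = 64        # each subpacket carries 8-bit BCH ECC parity
SUBPACKET_ECC_BITS = 8
MAX_PACKETS_PER_ISLAND = 18

payload_bits_per_packet = SUBPACKETS * (SUBPACKET_BITS - SUBPACKET_ECC_BITS)
total_bits_per_packet = HEADER_BITS + SUBPACKETS * SUBPACKET_BITS
island_payload_bytes = MAX_PACKETS_PER_ISLAND * payload_bits_per_packet // 8

print(payload_bits_per_packet)  # 224 payload bits per packet
print(total_bits_per_packet)    # 288 bits per packet on the wire
print(island_payload_bytes)     # 504 payload bytes in a full data island period
```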
  • FIG. 4 shows the video signals 1 to 4 (Video 1 to 4) in the TMDS format.
  • the image signal 1 may include active image data 1 (Active Video1), audio data 1 (Audio1), and other data 1 displayed on the entire screen.
  • the image signal 1 may include vertical and horizontal synchronization signals 1 (VSync1, HSync1) for displaying the active image data 1 on the entire screen.
  • the image signal 2 may include active image data 2 (Active Video2), audio data 2 (Audio2), and other data 2 displayed on the entire screen.
  • the image signal 2 may include vertical and horizontal synchronization signals 2 (VSync2, HSync2) for displaying the active image data 2 on the entire screen.
  • the image signal 3 may include active image data 3 (Active Video3), audio data 3 (Audio3), and other data 3 displayed on the entire screen.
  • the image signal 3 may include vertical and horizontal synchronization signals 3 VSync3 and HSync3 for displaying the active image data 3 on the entire screen.
  • the image signal 4 may include active image data 4 (Active Video4), audio data 4 (Audio4), and other data 4 displayed on the entire screen.
  • the image signal 4 may include vertical and horizontal synchronization signals 4 (VSync4, HSync4) for displaying the active image data 4 on the entire screen.
  • in step S13, the input signal processing unit 11 may extract active video data 1 to 4 (Active Video 1 to 4) from the video signals 1 to 4.
  • in step S14, the input signal processing unit 11 may generate integrated image data (MVD) in which the active video data 1 to 4 are allocated to the plurality of areas of the screen based on the position and size information indicating where each is to be displayed.
  • for example, active image data 1 (Active Video1) of a predetermined size is arranged in the first quadrant of the screen, active image data 2 (Active Video2) of a predetermined size in the second quadrant, and so on.
  • FIG. 5 is a diagram illustrating an integrated video signal (MVS).
  • the input signal processing unit 11 may generate the vertical and horizontal synchronization signals M (MVSync, MHSync) of the integrated image data (MVD) using the positions and/or sizes of the active video data 1 to 4 (Active Video 1 to 4) and their vertical and horizontal synchronization signals 1 to 4 (VSync1 to 4, HSync1 to 4).
  • the input signal processing unit 11 may generate integrated information (MVI), such as information on the integrated image data (MVD) and information on the changes in position and/or size of the active video data 1 to 4 (Active Video 1 to 4) made during integration (the sync-timing arithmetic below illustrates the tiled case).
  • the input signal processing unit 11 may generate an integrated image signal MVS including the generated integrated image data MVD, vertical and horizontal synchronization signals M (MVSync, MHSync), and integrated information MVI.
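  • As a hedged arithmetic sketch (assuming four identical 3840x2160p60 inputs tiled 2x2, with blanking ignored for simplicity; none of these timing figures come from the patent), one MVSync/MHSync pair at the combined raster replaces the four per-input sync pairs:

```python
# Sync-timing arithmetic for the tiled case (illustrative assumptions only).
src_w, src_h, fps = 3840, 2160, 60
tile_cols, tile_rows = 2, 2

dst_w = src_w * tile_cols                  # 7680 active pixels per line
dst_h = src_h * tile_rows                  # 4320 active lines per frame

src_pixel_rate = src_w * src_h * fps       # ~0.50 Gpx/s per input
dst_pixel_rate = dst_w * dst_h * fps       # ~1.99 Gpx/s on the merged path
assert dst_pixel_rate == src_pixel_rate * tile_cols * tile_rows

print(f"MVSync: one vertical sync per {1000 / fps:.2f} ms at {dst_w}x{dst_h}")
```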
  • in step S15, the image processing unit 12 extracts the active integrated image data based on the integrated image signal (MVS), and performs various processes for displaying the extracted integrated image data on the screen to generate an image frame.
  • the processing for generating the image frame may include, for example, decoding corresponding to the image format of the integrated image data (MVD), de-interlacing for converting interlaced integrated image data into progressive data, scaling for adjusting the integrated image data to a preset resolution, noise reduction for image quality improvement, detail enhancement, and frame refresh rate conversion.
  • in step S16, the display unit 13 may display an integrated image corresponding to the integrated image data (MVD) on the entire screen based on the generated image frame.
  • in this way, the electronic device 1 adjusts and integrates the active image data of each of the plurality of image signals according to the positions and/or sizes at which they are to be displayed among the plurality of regions of the screen, so that a single image processing unit 12 can process the plurality of image signals to be displayed on the plurality of areas of the screen.
  • the electronic device 1 of the present invention divides the screen into four and then allocates and displays the four 4k image signals 1 to 4 to each divided area, so that it is possible to display an 8k image as a whole.
  • FIG. 6 is a block diagram showing the configuration of the electronic device 1 according to the second embodiment of the present invention.
  • the electronic device 1 according to the second embodiment excludes a display unit that displays an image by itself; it may integrate the four image signals 1 to 4 received through the interface unit 10, for example HDMI, to generate an integrated image signal (MVS), generate an image frame based on the integrated image signal (MVS), and output it to an external display device 3, for example, a television or a monitor.
  • the audio signals 1 to 4 included in the four video signals 1 to 4 can be processed and output to the externally provided audio output device 4 through the cable C or the Bluetooth communication module 19.
  • the electronic device 1 according to the second embodiment may include a display unit for displaying a simple notification, a control menu, and the like.
  • FIG. 7 is a block diagram showing the configuration of the electronic device 1 according to the third embodiment of the present invention.
  • the electronic device 1 may set, for the display device 3, the images to be integrated, the arrangement of the images on the screen at the time of integration, the sizes of the displayed images, the speaker information for reproducing the audio signals of the displayed images, and the like.
  • the electronic device 1 may generate an integrated video signal (MVS) by integrating, for example, four video signals 1 to 4 to be displayed in a plurality of areas of the screen, and then transmit it to the display device 3. In this case, the image signals 1 to 4 may be received from the external source device 2 through the interface unit.
  • the display device 3 receives, from the electronic device 1, the integrated video signal (MVS) in which the video signals 1 to 4 to be displayed on the plurality of areas of the screen are integrated, generates an image frame for screen display, and displays the image frame on the screen of the display unit 33.
  • FIG. 8 is a diagram illustrating a scenario in which a plurality of image signals of the electronic device 1 are processed and displayed according to the fourth embodiment of the present invention.
  • the processor 16 may transmit the screen setting input information to the input signal processing unit 11 .
  • the interface unit 10 may include HDMI 1 to 4, each receiving one of the four first video signals 1 to 4, and a USB interface connected to a USB device in which a second video signal (USB Movie) is stored.
  • the input signal processing unit 11 may select the first video signals 1 to 3 among the four first video signals 1 to 4 and extract active video data 1 to 3 (Active Video 1 to 3), allocate the active video data 1 to 3 to the first, second, and third quadrants of the screen to generate integrated video data (MVD), and generate an integrated video signal (MVS) based on it.
  • the integrated image data (MVD) may be generated with the fourth quadrant of the screen left empty.
  • the combined image signal MVS may be transmitted to the image processing unit 1 121 .
  • the second image signal (USB Movie) received through the USB interface may be transmitted to the image processing unit 2 122 .
  • the input signal processing unit 11 may extract the audio signals 1 and 2 corresponding to the image signals 1 and 2 and then deliver them to the audio processing units 1 and 2 ( 141 and 142 ).
  • the image processing unit 1 (121) may generate a first image frame for displaying the integrated image signal (MVS) in the first, second, and third quadrants of the screen.
  • the image processing unit 2 (122) may generate a second image frame for displaying the second image signal (USB Movie) in the fourth quadrant of the screen.
  • the display unit 13 may display the first images 1 to 3 (Video 1 to 3) and the second image (USB Movie) on the screen by synthesizing the first image frame and the second image frame.
  • the user can designate that the audio signal 1 of video 1 (Video1) of HDMI1 be reproduced by the TV speaker 151 and the audio signal 2 of video 2 (Video2) of HDMI2 is reproduced by the Bluetooth speaker 152 .
  • the audio processing units 1 and 2 may process the audio signals 1 and 2, respectively, and transmit them to the TV speaker 151 and the Bluetooth speaker 152 of the audio output unit 15 for playback.
  • FIG. 9 is a diagram illustrating a scenario in which a plurality of image signals of the electronic device 1 are processed and displayed according to a fifth embodiment of the present invention.
  • the user can change Video1, Video2, Video3, and Video4, displayed in the first, second, third, and fourth quadrants of the existing screen, to Video3, Video2, Video1, and Video4, respectively.
  • the processor 16 may transmit the screen setting input information to the input signal processing unit 11 .
  • the input signal processing unit 11 may select all of the four first video signals 1 to 4 and extract active video data 1 to 4 (Active Video 1 to 4), allocate the active video data 1 to 4 to the third, second, first, and fourth quadrants of the screen, respectively, to generate integrated video data (MVD), and then generate an integrated video signal (MVS).
  • that is, the input signal processing unit 11 generates the integrated image data (MVD) so that the display order of the images reflects the change, and transmits it to the image processing unit 12, as in the layout sketch below.
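  • A minimal layout sketch of this reordering, assuming the 2x2 quadrant geometry used earlier; the quadrant table and names are illustrative only.

```python
# Hedged sketch: the integrator keeps the same quadrant geometry but permutes
# which input lands in which region after the user's change.
quadrant_origin = {"Q1": (0, 0), "Q2": (3840, 0),
                   "Q3": (0, 2160), "Q4": (3840, 2160)}
# Previous layout: Video1..Video4 in Q1..Q4; the user swaps Video1 and Video3.
layout = {"Q1": "Video3", "Q2": "Video2", "Q3": "Video1", "Q4": "Video4"}
for quad, video in layout.items():
    x, y = quadrant_origin[quad]
    print(f"{video} -> MVD origin ({x}, {y})")  # placement reflects the change
```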
  • the input signal processing unit 11 may extract the audio signals 1 and 2 corresponding to the image signals 1 and 2, and then transmit them to the audio processing units 1 and 2 (141 and 142).
  • the image processing unit 12 can generate an image frame for displaying video 3 (Video3), video 2 (Video2), video 1 (Video1), and video 4 (Video4).
  • the display unit 13 may display the image frame transmitted from the image processing unit 12 on the screen.
  • the user can designate that the audio signal 1 of the video 1 (Video1) of HDMI1 be reproduced by the TV speaker 151 and the audio signal 2 of the video 2 (Video2) of the HDMI2 is reproduced by the Bluetooth speaker 152 .
  • the audio processing units 1 and 2 may process the audio signals 1 and 2, respectively, and transmit them to the TV speaker 151 and the Bluetooth speaker 152 of the audio output unit 15 for playback.
  • the electronic device 1 may be applied to a picture in picture (PIP) capable of simultaneously displaying a small additional image separately within the main image on the screen.
  • the image integrator 112 may receive the PIP setting information and generate an integrated image signal (MVS) by reflecting the position and size of the additional image on the main image of the entire screen.
  • the electronic device 1 may display the PIP image by processing the main image and the additional image by one image processing unit.
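  • The PIP case can be sketched with the same single-integrator idea, composing one full-screen main image and one small inset; compose_pip and its parameters are hypothetical, and the nearest-neighbor downscale merely stands in for the real scaler.

```python
# Hedged PIP sketch: overlay a scaled-down additional image on the main image.
import numpy as np

def compose_pip(main, inset, x, y, scale=0.25):
    out = main.copy()
    h = int(inset.shape[0] * scale)
    w = int(inset.shape[1] * scale)
    ys = np.arange(h) * inset.shape[0] // h   # nearest-neighbor downscale
    xs = np.arange(w) * inset.shape[1] // w
    out[y:y + h, x:x + w] = inset[ys][:, xs]
    return out

main = np.zeros((2160, 3840, 3), np.uint8)       # full-screen main image
inset = np.full((2160, 3840, 3), 200, np.uint8)  # additional image source
pip_frame = compose_pip(main, inset, x=2780, y=100)  # inset near top-right
```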
  • the input signal processing module for allocating a plurality of images to a plurality of areas of a screen and displaying them may be implemented as a computer program product stored in the memory 17 as a computer-readable recording medium, or as a computer program product transmitted and received through network communication. In addition, the above-described input signal processing module may be implemented as a computer program singly or in an integrated form.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Computer Graphics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Business, Economics & Management (AREA)
  • Marketing (AREA)
  • Controls And Circuits For Display Device (AREA)

Abstract

An electronic device for dividing and displaying a plurality of images is disclosed. The electronic device comprises: an input signal processing unit that receives a plurality of image signals, extracts a plurality of image data respectively from the plurality of image signals, and generates a single integrated image signal comprising integrated image data in which a plurality of images based on the extracted plurality of image data are respectively allocated to a plurality of regions of a screen; and an image processing unit that generates an image frame to be displayed on the screen based on the integrated image data of the integrated image signal.
PCT/KR2021/007715 2020-07-10 2021-06-21 Electronic device and control method therefor WO2022010130A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR10-2020-0085406 2020-07-10
KR1020200085406A KR20220007319A (ko) 2020-07-10 2020-07-10 Electronic device and control method therefor

Publications (1)

Publication Number Publication Date
WO2022010130A1 true WO2022010130A1 (fr) 2022-01-13

Family

ID=79553481

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2021/007715 WO2022010130A1 (fr) 2020-07-10 2021-06-21 Dispositif électronique et son procédé de commande

Country Status (2)

Country Link
KR (1) KR20220007319A (fr)
WO (1) WO2022010130A1 (fr)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR100998547B1 (ko) * 2008-12-09 2010-12-07 주식회사 어니언텍 다중화면표시를 위한 방송 시스템 및 방법
KR20140111736A (ko) * 2013-03-12 2014-09-22 삼성전자주식회사 디스플레이장치 및 그 제어방법
KR101553846B1 (ko) * 2014-12-16 2015-09-17 연세대학교 산학협력단 영상 합성을 위한 프레임 동기화 장치 및 그 방법
KR101885215B1 (ko) * 2011-12-30 2018-08-06 삼성전자주식회사 디스플레이 장치 및 그 디스플레이 방법
JP6449318B2 (ja) * 2014-03-26 2019-01-09 コーニンクレッカ フィリップス エヌ ヴェKoninklijke Philips N.V. Hdmiインタフェースを介して補助データフレームを同期送信するためのトランスミッタ、レシーバ、及び、システム

Also Published As

Publication number Publication date
KR20220007319A (ko) 2022-01-18

Similar Documents

Publication Publication Date Title
US8713598B2 (en) Electronic device and control method therein
WO2014116088A1 (fr) Source device, method for providing content using the source device, sink device, and method for controlling the sink device
JP6477692B2 (ja) Communication device, communication method, and computer program
US8887210B2 (en) Transmission apparatus, information transmission method, reception apparatus, and information processing method
US8174619B2 (en) Reception apparatus and method of controlling image output by reception apparatus
WO2014142557A1 (fr) Electronic device and image processing method
WO2018131806A1 (fr) Electronic apparatus and operating method thereof
WO2016098992A1 (fr) Display device and control method thereof
WO2014163394A1 (fr) Power saving for audio/video transmissions over a wired interface
JPWO2014141425A1 (ja) Video display system, source device, sink device, and video display method
EP3459240A1 (fr) Appareil d'affichage et support d'enregistrement
WO2016052908A1 (fr) Transmitter, receiver, and control method therefor
WO2016056804A1 (fr) Content processing apparatus and content processing method thereof
WO2015046724A1 (fr) Image display apparatus, content synchronization server, and method for implementing the server
US10134356B2 (en) Transmission apparatus, method of transmitting image data with wide color gamut, reception apparatus, method of receiving image data with color gamut
WO2022010130A1 (fr) Electronic device and control method therefor
JP2016052015A (ja) Electronic device and color gamut determination method
US10965882B2 (en) Video display apparatus, video display method, and video signal processing apparatus
WO2018221855A1 (fr) Electronic apparatus and operating method thereof
WO2022139182A1 (fr) Electronic device and method for controlling same
WO2015079562A1 (fr) Electronic device and method for controlling power supply between electronic devices
WO2021075672A1 (fr) Display device and operating method thereof
WO2020067701A1 (fr) Display device, control method therefor, and recording medium
JP2012141787A (ja) Video display device and display method thereof
US20140181657A1 (en) Portable device and audio controlling method for portable device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21837248

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21837248

Country of ref document: EP

Kind code of ref document: A1