WO2021218473A1 - Display method and display device

Display method and display device

Info

Publication number: WO2021218473A1
Authority: WO (WIPO PCT)
Prior art keywords: display device, angle, initial, panoramic picture, user
Application number: PCT/CN2021/081562
Other languages: English (en), Chinese (zh)
Inventors: 王大勇, 于颜梅, 杨鲁明, 王卫明, 鲍姗娟, 陈验方
Original Assignee: 海信视像科技股份有限公司
Priority claimed from: CN202010342885.9A, CN202010559804.0A
Application filed by: 海信视像科技股份有限公司
Publication of: WO2021218473A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/0485 Scrolling or panning
    • G06F3/0486 Drag-and-drop
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 Stereoscopic video systems; Multi-view video systems; Details thereof

Definitions

  • This application relates to the technical field of smart TVs, and in particular to a display method and display device.
  • a panoramic picture is a picture rendered with a wide viewing angle, so it can capture more of the surrounding environment than an ordinary picture. Generally, a panoramic picture cannot be displayed in full on the display device. To browse content other than the currently displayed content, the panoramic picture must be shifted so that its other content moves into the display area of the display device.
  • when browsing panoramic pictures on a display device, the operator usually adjusts the displayed content with the remote control. For example, pressing an arrow key on the remote control moves the panoramic picture in the corresponding direction on the screen of the display device, revealing content that was previously hidden on the opposite side. Alternatively, the display device collects the operator's gestures and adjusts the displayed panoramic picture content according to the direction of the gesture; for example, when the operator's finger slides to the right across the display device's screen, the panoramic picture moves to the right on the screen, revealing content that was previously hidden on the left side of the panoramic picture.
  • This application provides a display method and display device.
  • this application provides a panoramic picture browsing method, including:
  • when the display device displays a panoramic picture, identifying a target person in front of the display device, where the target person indicates the operator browsing the panoramic picture in front of the display device;
  • this application also provides a display device, including:
  • a detector, used to collect images in front of the display device;
  • the controller is configured as:
  • when the display device displays a panoramic picture, identifying a target person in front of the display device, where the target person indicates the operator browsing the panoramic picture in front of the display device;
  • FIG. 1 exemplarily shows a schematic diagram of an operation scene between a display device and a control device according to some embodiments
  • FIG. 2 exemplarily shows a hardware configuration block diagram of a display device 200 according to some embodiments
  • FIG. 3 exemplarily shows a block diagram of the hardware configuration of the control device 100 according to some embodiments
  • FIG. 4 exemplarily shows a schematic diagram of software configuration in a display device 200 according to some embodiments
  • FIG. 5 exemplarily shows a schematic diagram of the icon control interface display of the application program in the display device 200 according to some embodiments
  • FIG. 6 is a schematic diagram of interaction between a display device 200 and an operator according to an embodiment of the application
  • FIG. 7 is a flowchart of a panoramic picture browsing method shown in an embodiment of the application.
  • FIG. 8 is a schematic diagram of a detector 230 collecting images according to an embodiment of the application.
  • FIG. 9 is a schematic diagram of aircraft attitude angles according to an embodiment of the application.
  • FIG. 10 is another schematic diagram of interaction between a display device 200 and an operator according to an embodiment of the application.
  • FIG. 11 is a schematic diagram of the initial face angle of the target person obtained by the controller 110 according to an embodiment of the application.
  • FIG. 12 is a schematic diagram of the current face angle of the target person obtained by the controller 110 according to an embodiment of the application.
  • FIG. 13 exemplarily shows a flowchart of a method for dynamically adjusting a control according to an embodiment
  • FIG. 14 exemplarily shows a flowchart of a method for obtaining the initial position parameter of a target control according to an embodiment
  • FIG. 15 exemplarily shows a schematic diagram of the reference coordinate system according to the embodiment.
  • FIG. 16 exemplarily shows a schematic diagram of environmental image data corresponding to the initial position in the embodiment
  • FIG. 17 exemplarily shows a schematic diagram of environmental image data corresponding to the end position in the embodiment
  • FIG. 18 exemplarily shows a schematic diagram of the position change of the center point of the face frame on the display interface during the movement of the user according to the embodiment
  • FIG. 19 exemplarily shows a flowchart of a method for calculating the offset of a target control according to an embodiment
  • FIG. 20 exemplarily shows a schematic diagram when the theoretical second distance is determined according to the embodiment
  • FIG. 21 exemplarily shows the first schematic diagram when the position of the control is dynamically adjusted according to the embodiment
  • Fig. 22 exemplarily shows a second schematic diagram when the position of the control is dynamically adjusted according to the embodiment.
  • the term "module" refers to any known or later developed hardware, software, firmware, artificial intelligence, fuzzy logic, or combination of hardware and/or software code that can perform the function related to the element.
  • the term "remote control" refers to a component of an electronic device (such as the display device disclosed in this application) that can generally control the electronic device wirelessly within a short distance. It is generally connected to the electronic device using infrared and/or radio frequency (RF) signals and/or Bluetooth, and may also include functional modules such as WiFi, wireless USB, Bluetooth, and motion sensors.
  • a handheld touch remote control uses a user interface in a touch screen to replace most of the physical built-in hard keys in general remote control devices.
  • the term "gesture" used in this application refers to a user behavior, expressed through a change of hand shape or a hand movement, that conveys an expected idea, action, goal, and/or result.
  • Fig. 1 exemplarily shows a schematic diagram of an operation scenario between a display device and a control device according to an embodiment. As shown in FIG. 1, the user can operate the display device 200 through the mobile terminal 300 and the control device 100.
  • the control device 100 may be a remote controller; the communication between the remote controller and the display device includes infrared protocol communication, Bluetooth protocol communication, and other short-range communication methods, and the display device 200 is controlled by wireless or wired methods.
  • the user can control the display device 200 by inputting user instructions through keys on the remote control, voice input, control panel input, etc.
  • the user can control the display device 200 by inputting corresponding control commands through the volume up/down keys, channel control keys, up/down/left/right movement keys, voice input keys, menu keys, and power on/off keys on the remote control, so as to realize the corresponding functions.
  • mobile terminals, tablet computers, computers, notebook computers, and other smart devices can also be used to control the display device 200.
  • an application program running on a smart device is used to control the display device 200.
  • the application can be configured to provide users with various controls in an intuitive user interface (UI) on the screen associated with the smart device.
  • a software application may be installed on both the mobile terminal 300 and the display device 200, establishing connection and communication through a network communication protocol, so as to realize one-to-one control operation and data communication.
  • the mobile terminal 300 can be used to establish a control command protocol with the display device 200, the remote control keyboard can be synchronized to the mobile terminal 300, and the function of controlling the display device 200 can be realized by controlling the user interface on the mobile terminal 300. It is also possible to transmit the audio and video content displayed on the mobile terminal 300 to the display device 200 to realize the synchronous display function.
  • the display device 200 also performs data communication with the server 400 through multiple communication methods.
  • the display device 200 may be allowed to communicate through a local area network (LAN), a wireless local area network (WLAN), and other networks.
  • the server 400 may provide various contents and interactions to the display device 200.
  • the display device 200 can receive software program updates or access a remotely stored digital media library by sending and receiving information and interacting with an electronic program guide (EPG).
  • the server 400 may be one cluster or multiple clusters, and may include one or more types of servers.
  • the server 400 provides other network service content such as video-on-demand and advertising services.
  • the display device 200 may be a liquid crystal display, an OLED display, or a projection display device.
  • the specific display device type, size, resolution, etc. are not limited, and those skilled in the art can understand that the display device 200 can make some changes in performance and configuration as required.
  • the display device 200 may additionally provide computer-supported smart network TV functions, including but not limited to network TV, smart TV, Internet Protocol TV (IPTV), and the like.
  • FIG. 2 exemplarily shows a block diagram of the hardware configuration of the display device 200 according to an exemplary embodiment.
  • the display device 200 includes at least one of a controller 250, a tuner and demodulator 210, a communicator 220, a detector 230, an input/output interface 255, a display 275, an audio output interface 285, a memory 260, a power supply 290, a user interface 265, and an external device interface 240.
  • the display 275 is used to receive the image signals output from the first processor and to display video content, images, and the menu manipulation interface.
  • the display 275 includes a display screen component for presenting images, and a driving component for driving image display.
  • the displayed video content can come from broadcast television, that is, various broadcast signals received through wired or wireless communication protocols, or it can be various image content received from a network server through a network communication protocol.
  • the display 275 is used to present a user manipulation UI interface generated in the display device 200 and used to control the display device 200.
  • in some embodiments, a driving component for driving the display is further included.
  • in some embodiments, when the display 275 is a projection display, it may also include a projection device and a projection screen.
  • the communicator 220 is a component for communicating with external devices or external servers according to various communication protocol types.
  • the communicator may include at least one of Wifi chip, Bluetooth communication protocol chip, wired Ethernet communication protocol chip or other network communication protocol chip or near field communication protocol chip, and infrared receiver.
  • the display device 200 may establish control signal and data signal transmission and reception with the external control device 100 or the content providing device through the communicator 220.
  • the user interface 265 may be used to receive infrared control signals of the control device 100 (such as an infrared remote control, etc.).
  • the detector 230 is the component of the display device 200 used to collect signals from the external environment or to interact with the outside.
  • the detector 230 includes a light receiver, a sensor used to collect the intensity of ambient light, so that display parameters and the like can be adapted to the ambient light.
  • the detector 230 may also include an image collector, such as a camera, which can be used to collect external environment scenes and user attributes, and to recognize user gestures, so that display parameters can be changed adaptively and interaction with the user can be realized.
  • in some other exemplary embodiments, the detector 230 may also include a temperature sensor or the like for sensing the ambient temperature.
  • the display device 200 may use the ambient temperature to adaptively adjust the display color temperature of the image. For example, when the ambient temperature is relatively high, the display device 200 can be adjusted to display a colder image color temperature; when the temperature is relatively low, it can be adjusted to display a warmer image.
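  • To make the color temperature behavior concrete, the following is a minimal sketch; the thresholds and Kelvin values are illustrative assumptions, since the application does not specify any.

```python
def color_temperature_for_ambient(ambient_c: float) -> int:
    """Map ambient temperature (degrees C) to a display color temperature (K).

    Sketch only: the thresholds and Kelvin values below are assumptions,
    not values taken from this application.
    """
    if ambient_c >= 28:   # relatively warm environment -> colder (bluer) image
        return 9300
    if ambient_c <= 18:   # relatively cool environment -> warmer (redder) image
        return 5000
    return 6500           # neutral default in between
```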
  • the detector 230 may also be a sound collector or the like, such as a microphone, which may be used to receive the user's voice.
  • this includes voice signals carrying the user's control instructions for the display device 200, as well as environmental sounds collected to identify the type of environmental scene, so that the display device 200 can adapt to the environmental noise.
  • the input/output interface 255 is configured to perform data transmission between the controller 250 and other external devices or other controllers 250, such as receiving video and audio signal data from external devices, or command instruction data.
  • the external device interface 240 may include, but is not limited to, any one or more of the following: a high-definition multimedia interface (HDMI), an analog or digital high-definition component input interface, a composite video input interface, a USB input interface, an RGB port, and the like. Several of these interfaces may also together form a composite input/output interface.
  • the tuner and demodulator 210 is configured to receive broadcast television signals through wired or wireless reception, perform modulation and demodulation processing such as amplification, mixing, and resonance, and demodulate, from among the multiple received channels, the audio and video signal of the television channel selected by the user, together with the EPG data signal.
  • in some embodiments, the frequency demodulated by the tuner and demodulator 210 is controlled by the controller 250: the controller 250 sends a control signal according to the user's selection, so that the modem responds to the TV signal frequency selected by the user and demodulates the TV signal carried on that frequency.
  • broadcast television signals can be classified into terrestrial broadcast signals, cable broadcast signals, satellite broadcast signals, or Internet broadcast signals according to different television signal broadcast formats. Or it can be divided into digital modulation signal, analog modulation signal, etc. according to different modulation types. Or it can be divided into digital signal, analog signal, etc. according to different signal types.
  • the controller 250 and the tuner and demodulator 210 may be located in different separate devices; that is, the tuner and demodulator 210 may also be in a device external to the main device where the controller 250 is located, such as an external set-top box. In this way, the set-top box outputs the demodulated TV audio and video signals of the received broadcast TV signals to the main device, and the main device receives the audio and video signals through the first input/output interface.
  • the controller 250 controls the operation of the display device and responds to user operations through various software control programs stored in the memory.
  • the controller 250 may control the overall operation of the display device 200. For example, in response to receiving a user command for selecting a UI object to be displayed on the display 275, the controller 250 may perform an operation related to the object selected by the user command.
  • the object may be any one of selectable objects, such as a hyperlink or an icon.
  • the operation related to the selected object may be, for example, an operation to display the page, document, or image linked to by a hyperlink, or an operation to launch the application corresponding to an icon.
  • the user command for selecting the UI object may be a command input through various input devices (for example, a mouse, a keyboard, a touch pad, etc.) connected to the display device 200 or a voice command corresponding to the voice spoken by the user.
  • the controller 250 includes a random access memory 251 (Random Access Memory, RAM), a read-only memory 252 (Read-Only Memory, ROM), a video processor 270, an audio processor 280, other processors 253 (such as at least one of a Graphics Processing Unit (GPU) and a Central Processing Unit (CPU)), a communication interface, and a communication bus 256 (Bus).
  • the communication bus connects the various components.
  • RAM 251 is used to store temporary data of the operating system or other running programs
  • the ROM 252 is used to store various system startup instructions.
  • the ROM 252 is used to store a basic input output system (Basic Input Output System, BIOS), which completes the power-on self-check of the system, the initialization of each functional module in the system, and the loading of the system's basic input/output drivers and of the boot operating system.
  • when the power of the display device 200 is switched on, the CPU runs the system startup instructions in the ROM 252 and copies the temporary data of the operating system stored in the memory to the RAM 251, so as to start or run the operating system.
  • after the operating system has started, the CPU copies the temporary data of the various application programs in the memory to the RAM 251, so as to start or run those application programs.
  • the CPU processor 254 is configured to execute the operating system and application program instructions stored in the memory, and, according to the various interactive instructions received from outside, to run various applications and process data and content, so as to finally display and play various audio and video content.
  • the CPU processor 254 may include multiple processors.
  • the multiple processors may include a main processor and one or more sub-processors.
  • the main processor is used to perform some operations of the display device 200 in the pre-power-on mode, and/or to display images in the normal mode.
  • the graphics processor 253 is used to generate various graphics objects, such as icons, operation menus, and graphics displayed in response to user input instructions. It includes an arithmetic unit, which performs operations on the various interactive instructions input by the user and displays objects according to their display attributes, and a renderer, which renders the objects produced by the arithmetic unit for display on the display.
  • the video processor 270 is configured to receive an external video signal, and perform decompression, decoding, scaling, noise reduction, frame rate conversion, resolution conversion, image synthesis, etc. according to the standard codec protocol of the input signal. After video processing, a signal that can be directly displayed or played on the display device 200 can be obtained.
  • the video processor 270 includes a demultiplexing module, a video decoding module, an image synthesis module, a frame rate conversion module, a display formatting module, and the like.
  • the demultiplexing module is used to demultiplex the input audio and video data stream. For example, if MPEG-2 is input, the demultiplexing module will demultiplex into a video signal and an audio signal.
  • the video decoding module is used to process the demultiplexed video signal, including decoding and scaling.
  • the image synthesis module, such as an image synthesizer, is used to superimpose and mix the GUI signal generated by the graphics generator, either in response to user input or by the system itself, with the scaled video image, so as to generate an image signal for display.
  • the frame rate conversion module is used to convert the frame rate of the input video, for example converting a 60Hz frame rate to a 120Hz or 240Hz frame rate, which is usually realized by frame insertion.
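  • The frame insertion idea can be illustrated with a minimal sketch. A real frame rate conversion module synthesizes interpolated frames from motion estimation; repeating each source frame is the simplest form of the technique.

```python
def convert_frame_rate(frames: list, factor: int = 2) -> list:
    """Naive frame rate conversion by frame insertion (repetition).

    A real FRC module would synthesize interpolated frames; repeating each
    source frame `factor` times is the simplest form of frame insertion
    (60 Hz * 2 = 120 Hz, 60 Hz * 4 = 240 Hz).
    """
    output = []
    for frame in frames:
        output.extend([frame] * factor)
    return output

# 60 input frames become 120 output frames at factor 2
assert len(convert_frame_rate(list(range(60)), factor=2)) == 120
```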
  • the display formatting module is used to convert the frame-rate-converted video output signal into a signal conforming to the display format, such as an RGB data signal output.
  • the graphics processor 253 can be integrated with the video processor, or can be configured separately.
  • when they are integrated, the graphics signal output to the display is processed jointly; when they are configured separately, they perform different functions, for example a GPU + FRC (Frame Rate Conversion) architecture.
  • the audio processor 280 is used to receive an external audio signal and, according to the standard codec protocol of the input signal, perform decompression and decoding, as well as processing such as noise reduction, digital-to-analog conversion, and amplification, to obtain a sound signal that can be played in the speaker.
  • the video processor 270 may include one or more chips.
  • the audio processor may also include one or more chips.
  • the video processor 270 and the audio processor 280 may be separate chips, or may be integrated in one or more chips together with the controller.
  • the audio output, under the control of the controller 250, receives the sound signal output by the audio processor 280. It includes the speaker 286 carried by the display device 200 itself, as well as an external audio output terminal for external devices, such as an external audio interface or an earphone interface, and may also include a short-distance communication module in the communication interface, for example a Bluetooth module used to output sound to a Bluetooth speaker.
  • the power supply 290, under the control of the controller 250, provides power for the display device 200 using power input from an external power source.
  • the power supply 290 may include a built-in power supply circuit installed inside the display device 200, or it may be an external power supply, with a power interface provided in the display device 200 for connecting the external power supply.
  • the user interface 265 is used to receive user input signals, and then send the received user input signals to the controller 250.
  • the user input signal may be a remote control signal received through an infrared receiver, and various user control signals may be received through a network communication module.
  • the user inputs user commands through the control device 100 or the mobile terminal 300; the user input interface receives the input, and the display device 200 responds to it through the controller 250.
  • the user may input a user command on a graphical user interface (GUI) displayed on the display 275, and the user input interface receives the user input command through the graphical user interface.
  • the user may input a user command by inputting a specific sound or gesture, and the user input interface recognizes the sound or gesture through the sensor to receive the user input command.
  • the "user interface” is a medium interface for interaction and information exchange between an application or operating system and a user, and it realizes the conversion between the internal form of information and the form acceptable to the user.
  • the commonly used form of the user interface is the Graphic User Interface (GUI), which refers to a user interface related to computer operations that is displayed in a graphical manner. It can be an icon, window, control and other interface elements displayed on the display screen of an electronic device.
  • controls can include visual interface elements such as icons, buttons, menus, tabs, text boxes, dialog boxes, status bars, navigation bars, and widgets.
  • the memory 260 stores various software modules used to drive the display device 200.
  • various software modules stored in the first memory include: at least one of a basic module, a detection module, a communication module, a display control module, a browser module, and various service modules.
  • the basic module is the lower-layer software module used for signal communication between the various hardware components in the display device 200 and for sending processing and control signals to the upper-layer modules.
  • the detection module is a management module used to collect various information from various sensors or user input interfaces, and perform digital-to-analog conversion and analysis management.
  • the voice recognition module includes a voice parsing module and a voice command database module.
  • the display control module is a module for controlling the display to display image content, and can be used to play information such as multimedia image content and UI interface.
  • the communication module is a module used for control and data communication with external devices.
  • the browser module is a module used to perform data communication between browsing servers.
  • the service module is used to provide various services and modules including various applications.
  • the memory 260 is also used to store received external data and user data, images of the various items in the various user interfaces, visual effect diagrams of focus objects, and the like.
  • Fig. 3 exemplarily shows a configuration block diagram of the control device 100 according to an exemplary embodiment.
  • the control device 100 includes a controller 110, a communication interface 130, a user input/output interface, a memory, and a power supply.
  • the control device 100 is configured to control the display device 200, and can receive input operation instructions from the user, and convert the operation instructions into instructions that the display device 200 can recognize and respond to, so as to serve as an intermediary between the user and the display device 200.
  • the user operates the channel up/down keys on the control device 100, and the display device 200 responds to the channel up/down operation.
  • control device 100 may be a smart device.
  • control device 100 can install various applications for controlling the display device 200 according to user requirements.
  • the mobile terminal 300 or other smart electronic devices can perform a function similar to the control device 100 after installing an application that controls the display device 200.
  • by installing such an application, the user can use the various function keys or virtual buttons of the graphical user interface provided on the mobile terminal 300 or other smart electronic devices to realize the functions of the physical keys of the control device 100.
  • the controller 110 includes a processor 112, a RAM 113 and a ROM 114, a communication interface 130, and a communication bus.
  • the controller is used to control the running of the control device 100, the communication and cooperation between its internal components, and the external and internal data processing functions.
  • the communication interface 130 realizes the communication of control signals and data signals with the display device 200 under the control of the controller 110. For example, the received user input signal is sent to the display device 200.
  • the communication interface 130 may include at least one of other near field communication modules such as a WiFi chip 131, a Bluetooth module 132, and an NFC module 133.
  • in the user input/output interface 140, the input interface includes at least one of a microphone 141, a touch panel 142, a sensor 143, a button 144, and other input interfaces.
  • the user can implement the user instruction input function through voice, touch, gesture, pressing and other actions.
  • the input interface converts the received analog signal into a digital signal and the digital signal into a corresponding instruction signal, and sends it to the display device 200.
  • the output interface includes an interface for sending the received user instruction to the display device 200.
  • it may be an infrared interface or a radio frequency interface.
  • in the case of an infrared signal interface, the user input instruction needs to be converted into an infrared control signal according to the infrared control protocol, and then sent to the display device 200 via the infrared sending module.
  • in the case of a radio frequency signal interface, the user input instruction needs to be converted into a digital signal, modulated according to the radio frequency control signal modulation protocol, and then sent to the display device 200 by the radio frequency sending terminal.
  • control device 100 includes at least one of a communication interface 130 and an input/output interface 140.
  • the control device 100 is configured with a communication interface 130, such as WiFi, Bluetooth, NFC, etc. modules, which can encode user input instructions to the display device 200 through the WiFi protocol, or the Bluetooth protocol, or the NFC protocol.
  • the memory 190 is used to store various operating programs, data, and applications for driving and controlling the control device 100, under the control of the controller.
  • the memory 190 can store various control signal instructions input by the user.
  • the power supply 180 is used to provide operating power support for each element of the control device 100 under the control of the controller, and may be a battery and related control circuit.
  • the system may include a kernel (Kernel), a command parser (shell), a file system, and an application program.
  • the kernel, shell, and file system together form the basic operating system structure. They allow users to manage files, run programs, and use the system.
  • the kernel starts, activates the kernel space, abstracts hardware, initializes hardware parameters, etc., runs and maintains virtual memory, scheduler, signals, and inter-process communication (IPC).
  • IPC inter-process communication
  • the Shell and user applications are loaded.
  • after an application is started, it is compiled into machine code, forming a process.
  • in some embodiments, the system is divided into four layers, from top to bottom: the Applications layer (referred to as the "application layer"), the Application Framework layer (referred to as the "framework layer"), the Android runtime and system library layer (referred to as the "system runtime library layer"), and the kernel layer.
  • these applications may be Window programs, system setting programs, clock programs, camera applications, and the like included in the operating system; they may also be application programs developed by third-party developers, such as a Hi Jian program, a karaoke program, a magic mirror program, and the like.
  • the application package in the application layer is not limited to the above examples, and may actually include other application packages, which is not limited in the embodiment of the present application.
  • the framework layer provides application programming interfaces (application programming interface, API) and programming frameworks for applications in the application layer.
  • the application framework layer includes some predefined functions.
  • the application framework layer is equivalent to a processing center, which decides the actions to be taken by the applications in the application layer. Through the API interface, an application can access the resources in the system and obtain the services of the system during execution.
  • the application framework layer in the embodiments of the present application includes managers (Managers), content providers (Content Provider), and the like. The managers include at least one of the following modules: an Activity Manager, which interacts with all the activities running in the system; a Location Manager, which provides system services or applications with access to system location services; a Package Manager, which retrieves various information about the application packages currently installed on the device; a Notification Manager, which controls the display and clearing of notification messages; and a Window Manager, which manages the icons, windows, toolbars, wallpapers, and desktop widgets on the user interface.
  • the activity manager is used to manage the life cycle of each application and the usual navigation and rollback functions, such as exiting an application (including switching the user interface currently displayed in the display window to the system desktop), opening an application, and going back (including switching the user interface currently displayed in the display window to the next-higher-level user interface of the one currently displayed).
  • the window manager is used to manage all window programs, such as obtaining the size of the display screen, determining whether there is a status bar, locking the screen, capturing the screen, and controlling changes in the display window (for example, shrinking the display window, or displaying it with shaking or distortion), and so on.
  • the system runtime layer provides support for the upper layer, that is, the framework layer.
  • the Android operating system will run the C/C++ library included in the system runtime layer to implement functions to be implemented by the framework layer.
  • the kernel layer is the layer between hardware and software. As shown in Figure 4, the kernel layer contains at least one of the following drivers: audio driver, display driver, Bluetooth driver, camera driver, WiFi driver, USB driver, HDMI driver, sensor drivers (such as fingerprint sensor, temperature sensor, touch sensor, pressure sensor, etc.), and so on.
  • the kernel layer further includes a power drive module for power management.
  • the software programs and/or modules corresponding to the software architecture in FIG. 4 are stored in the first memory or the second memory shown in FIG. 2 or FIG. 3.
  • when an input operation is received, the corresponding hardware interrupt is sent to the kernel layer.
  • the kernel layer processes the input operation into the original input event (including the value of the input operation, the time stamp of the input operation and other information).
  • the original input events are stored in the kernel layer.
  • the application framework layer obtains the original input event from the kernel layer, recognizes the control corresponding to the input event according to the current position of the focus, and regards the input operation as a confirmation operation.
  • taking as an example the case where the control corresponding to the confirmation operation is the magic mirror application icon, the magic mirror application calls the interface of the application framework layer to start itself, and then starts the camera driver by calling the kernel layer, so as to capture still images or video through the camera.
  • the display device receives an input operation that the user performs on the display screen (such as a split-screen operation), and the kernel layer generates a corresponding input event based on the input operation and reports the event to the application framework layer.
  • the activity manager of the application framework layer sets the window mode (such as multi-window mode) and the window position and size corresponding to the input operation.
  • the window manager of the application framework layer draws the window according to the settings of the activity manager, and then sends the drawn window data to the display driver of the kernel layer, which displays the corresponding application interfaces in different display areas of the display screen.
  • the application layer includes at least one application whose icon control can be displayed on the display, such as a live TV application icon control, a video-on-demand application icon control, a media center application icon control, an application center icon control, a game application icon control, and so on.
  • the live TV application can provide live TV through different signal sources.
  • a live TV application may use input from cable TV, wireless broadcasting, satellite services, or other types of live TV services to provide TV signals.
  • the live TV application can display the video of the live TV signal on the display device 200.
  • video-on-demand applications can provide videos from different storage sources. Unlike live TV applications, video on demand provides video playback from certain storage sources; for example, video on demand can come from a cloud storage server or from local hard disk storage containing saved video programs.
  • the media center application can provide various multimedia content playback applications.
  • the media center can provide services that are different from live TV or video on demand, and users can access various images or audio through the media center application.
  • the application center may provide storage for various application programs.
  • the application program may be a game, an application program, or some other application program that is related to a computer system or other equipment but can be run on a smart TV.
  • the application center can obtain these applications from different sources, store them in the local storage, and then run on the display device 200.
  • the display device 200 can display not only some ordinary pictures, but also panoramic pictures.
  • a panoramic picture is a picture rendered with a wide viewing angle, so it can capture more of the surrounding environment than an ordinary picture. Generally, a panoramic picture cannot be displayed in full on the display device 200.
  • when content other than the currently displayed content of the panoramic picture needs to be browsed, the panoramic picture needs to be shifted so that other content of the panoramic picture moves into the display area of the display device 200.
  • when browsing panoramic pictures on the display device 200, the operator usually needs to use the remote control to adjust the displayed content: pressing an arrow key moves the panoramic picture in the corresponding direction, revealing content that was previously hidden on the opposite side. Alternatively, the display device 200 collects the operator's gesture and adjusts the displayed content according to the direction of the gesture; for example, when the operator's finger swipes to the right on the screen of the display device 200, the panoramic picture moves to the right on the screen, revealing content that was previously hidden on the left side of the panoramic picture.
  • in both cases, the panoramic picture is moved by a fixed step preset in the display device 200.
  • when the operator only wants to fine-tune the content of a certain panoramic picture, a fixed step that is too large makes the panoramic picture move too far on the screen of the display device 200, and it is difficult to meet the operator's requirement for fine adjustment of the panoramic picture.
  • for this reason, the present application provides a panoramic picture browsing method and display device, which control the moving direction and distance of the panoramic picture on the display device 200 through changes in the face angle of the operator in front of the display device 200, so that the panoramic picture content the operator needs to see is brought onto the screen of the display device 200.
  • in the embodiments of the present application, the display 275 can be used to display a panoramic picture; the detector 230 can be used to collect images of the persons in front of the display device 200; and the controller 110 can be used to identify the target person from the images collected by the detector 230, recognize the face angle information of the target person in front of the display device 200, calculate the offset distance of the panoramic picture, and control the movement of the panoramic picture.
  • FIG. 6 is a schematic diagram of interaction between a display device 200 and an operator according to an embodiment of the application.
  • when browsing panoramic pictures using the panoramic picture browsing method provided by the embodiments of the present application, the operator needs to stand in front of the display device 200 and controls the movement of the panoramic picture on the display device 200 by turning his or her head, so as to display the content he or she wants to watch.
  • Fig. 7 is a flowchart of a panoramic picture browsing method shown in an embodiment of the application. As shown in Figure 7, the panoramic picture browsing method specifically includes the following steps:
  • Step S101: When the display device 200 displays a panoramic picture, a target person in front of the display device 200 is identified, where the target person indicates an operator who browses the panoramic picture in front of the display device 200.
  • the target person mentioned in this embodiment specifically refers to the actual operator.
  • when multiple people in front of the display device 200 are viewing panoramic pictures at the same time, the display device 200 can only accept operations from one person, and that person is the target person.
  • the person in front of the display device is identified first by means of face recognition, and then it is specifically determined whether the person is the target person.
  • the way to select the target person is not unique.
  • for example, the person closest to the display device 200 can be selected as the target person.
  • in this case, the pixel area of each face in front of the display device 200 needs to be compared; generally, the closer a person is to the display device 200, the larger the pixel area of that person's face. Alternatively, if a specific person is to be selected as the target person, it is necessary to check whether that specific face is among the faces in front of the display device 200.
  • the step of identifying the target person in front of the display device 200 may specifically include:
  • Step S201: Detect whether a preset person is in front of the display device 200, where the preset person represents an operator, pre-stored in the display device, who browses panoramic pictures.
  • each display device 200 stores a preset person.
  • the preset person can be set during initialization.
  • when the display device 200 is used to browse panoramic pictures for the first time, it can recognize whether this preset person is in front of the display device 200. In addition, the preset person can also come from the previous panoramic picture browsing session: the target person saved last time is used as the preset person, and the device recognizes whether that preset person is present in front of the display device 200.
  • Step S202: If such a person exists, the preset person in front of the display device 200 is determined to be the target person.
  • the step of identifying the target person in front of the display device 200 may further include:
  • Step S301: When there are multiple persons in front of the display device 200 and none of them is the preset person, the pixel area corresponding to each person's face is calculated separately.
  • the person closest to the display device 200 is selected.
  • the distance between a person and the display device 200 is determined by the pixel area of the person's face. The closer the person is to the display device 200, the larger the pixel area of the person's face.
  • the controller 110 can identify the face image belonging to each person, and then further calculate the pixel area corresponding to each face image.
  • Step S302: The person whose face has the largest pixel area is selected as the target person.
  • FIG. 8 is a schematic diagram of a detector 230 collecting images according to an embodiment of the application.
  • the controller 110 can recognize that the person standing in the front has the largest face pixel area, and then determine that the person is the target person.
  • the dashed frame in FIG. 8 is the face range of the target person recognized by the controller 110.
  • the controller 110 can regard this person as the target person and then control the movement of the panoramic picture accordingly, as illustrated in the sketch below.
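  • Steps S201/S202 and S301/S302 together can be summarized in the following sketch. The `Face` type and the `preset_id` matching are hypothetical helpers standing in for the device's actual face recognition, which this application does not detail.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Face:
    person_id: str   # identity from face recognition (stand-in)
    width_px: int    # width of the detected face box, in pixels
    height_px: int   # height of the detected face box, in pixels

    @property
    def pixel_area(self) -> int:
        return self.width_px * self.height_px

def select_target_person(faces: list, preset_id: Optional[str] = None) -> Optional[Face]:
    """Prefer the preset person if present (S201/S202); otherwise pick the
    face with the largest pixel area, assumed closest to the screen
    (S301/S302)."""
    if not faces:
        return None
    if preset_id is not None:
        for face in faces:
            if face.person_id == preset_id:
                return face
    return max(faces, key=lambda f: f.pixel_area)
```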
  • Step S102 Obtain the initial face angle and the current face angle of the target person before and after the first preset duration.
  • the panoramic picture needs to be adjusted according to the change in the angle of the target person's face.
  • the angle of the face has a certain change time, and this time is the first preset duration.
  • the controller 110 may obtain the face rotation angle of the target person in front of the display device 200 after recognizing the target person.
  • these angles can refer to the three attitude angles used in the aerospace field.
  • FIG. 9 is a schematic diagram of aircraft attitude angles according to an embodiment of the application. As shown in FIG. 9, the attitude of an aircraft is described by its roll angle, pitch angle, and yaw angle.
  • in this embodiment, the concepts of roll angle, pitch angle, and yaw angle are likewise used to define the face angle of the target person, where the roll angle refers to the angle by which the face tilts within its own plane, the pitch angle refers to the angle by which the face turns up or down, and the yaw angle refers to the angle by which the face rotates in the horizontal direction.
  • the initial face angle before and after the first preset duration and the current face angle are generally different.
  • the initial face angle specifically includes the three angles of the face, which are the initial roll angle, the initial pitch angle, and the initial yaw angle;
  • the current face angle likewise includes the three angles of the face, namely the current roll angle, the current pitch angle, and the current yaw angle.
  • therefore, in this embodiment, it is necessary to obtain the initial roll angle, initial pitch angle, and initial yaw angle of the target person in front of the display device 200, and then, after the first preset duration has elapsed, obtain the current roll angle, current pitch angle, and current yaw angle of the target person.
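  • The two samples can be collected as in the following sketch, where `read_face_angles` is a hypothetical callable standing in for the device's face pose estimator, which this application does not name.

```python
import time
from dataclasses import dataclass

@dataclass
class FaceAngles:
    roll: float   # in-plane head tilt, degrees
    pitch: float  # up/down rotation, degrees
    yaw: float    # left/right rotation, degrees

def sample_angle_pair(read_face_angles, first_preset_duration_s: float):
    """Take the initial face angles, wait for the first preset duration,
    then take the current face angles."""
    initial = read_face_angles()
    time.sleep(first_preset_duration_s)
    current = read_face_angles()
    return initial, current
```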
  • Step S103 Obtain the offset distance of the panoramic picture by using the initial face angle and the current face angle.
  • the angle of the face will have changed after the first preset duration, so the current face angle has a certain offset from the initial face angle. In this embodiment, this angular offset is used to obtain the corresponding offset distance of the panoramic picture.
  • FIG. 10 is a schematic diagram of another interaction between a display device 200 and an operator according to an embodiment of the application.
  • when the display device 200 displays a panoramic picture, it can only show an area as large as the screen of the display 275. If the operator wants to browse content on the right side of the panoramic picture that is not displayed on the screen, he or she can face the display device 200 and turn his or her face to the right.
  • the display device 200 then calculates, from the angle through which the operator turned the face, the distance by which the panoramic picture should shift to the left, controls the panoramic picture to shift left by that distance, and thereby displays more of the right-side content on the display device 200.
  • in practice, the operator's turning of the face is never purely to the right or left; it also has a certain upward or downward component.
  • correspondingly, the movement of the panoramic picture on the display 275 is not purely left-right or up-down either.
  • the operator can also turn his or her face in a specific direction to browse content in that direction, according to his or her own browsing needs. For example, if the operator turns the face toward the upper right, the panoramic picture moves in the opposite direction, toward the lower left, so as to display the content in the upper-right direction. Therefore, in this embodiment, all three face rotation angles need to be detected and used in the calculation to obtain an accurate offset distance for the panoramic picture.
  • Step S104 Adjust the display content of the panoramic picture on the display device 200 according to the offset distance.
  • the panoramic picture displayed on the display device 200 is a two-dimensional picture, and the direction in which the two-dimensional picture moves includes two directions, horizontal and vertical. Therefore, the calculated offset distance includes the offset distance of the panoramic picture in the horizontal direction and the offset distance in the vertical direction.
  • the specific adjustment method is not limited to moving the panoramic picture opposite to the direction of the operator's face rotation; moving the panoramic picture in the same direction as the operator's face rotation can also achieve the browsing purpose of this embodiment.
  • the face of the target person identified in the above steps can be displayed on the screen of the display 275, so that the target person can always observe his or her own face angle and choose an appropriate rotation angle based on how far the panoramic picture has moved.
  • the specific display position can be, for example, the upper right corner of the screen. After seeing the target person on the screen, the other people in front of the display device 200 will also know who the actual operator is, and so will avoid coming too close to the display device 200 and interfering with the browsing of the panoramic picture.
  • the panoramic picture browsing method of the embodiments of the present application realizes browsing of the panoramic picture through changes in the face angle of the target person in front of the display device, with the offset distance of the panoramic picture determined by the magnitude of the target person's face rotation. The operator can thus browse the panoramic picture according to his or her own needs, and there is no need to adjust the panoramic picture by a fixed movement step.
  • the step of obtaining the offset distance of the panoramic picture by using the initial face angle and the current face angle includes:
  • Step S401 Calculate the horizontal offset distance of the panoramic picture by using the initial roll angle, the current roll angle, the initial yaw angle and the current yaw angle.
  • the following formula may be used to calculate the horizontal offset distance of the panoramic picture:
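  • the formula itself appears only as an image in the published text; a hedged reconstruction consistent with the variable definitions below (an assumption, not the patent's confirmed expression) is:

$$ \mathrm{Offset}_X = A_1 \cdot \frac{(\mathrm{Yaw}_2 - \mathrm{Yaw}_1) + (\mathrm{Roll}_2 - \mathrm{Roll}_1)}{\Delta_X} \cdot \mathrm{Time} $$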
  • Offset X represents the horizontal offset distance of the panoramic picture
  • Roll 1 and Roll 2 represent the initial roll angle and the current roll angle
  • Yaw 1 and Yaw 2 represent the initial yaw angle and the current yaw angle
  • A1 represents the correction coefficient
  • delta X represents the minimum adjustment angle value in the horizontal direction
  • Time represents the minimum adjustment offset time.
  • Step S402 Calculate the vertical offset distance of the panoramic picture by using the initial pitch angle, the current pitch angle, the initial roll angle and the current roll angle.
  • the following formula may be used to calculate the vertical offset distance of the panoramic picture:
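  • again the formula is reproduced only as an image; a hedged reconstruction consistent with the variable definitions below (an assumption, not the patent's confirmed expression) is:

$$ \mathrm{Offset}_Y = A_2 \cdot \frac{(\mathrm{Pitch}_2 - \mathrm{Pitch}_1) + (\mathrm{Roll}_2 - \mathrm{Roll}_1)}{\Delta_Y} \cdot \mathrm{Time} $$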
  • Offset Y represents the vertical offset distance of the panoramic picture
  • Roll 1 and Roll 2 represent the initial roll angle and the current roll angle
  • Pitch 1 and Pitch 2 represent the initial pitch angle and the current pitch angle
  • A2 represents the correction coefficient
  • delta Y represents the minimum adjustment angle value in the vertical direction
  • Time represents the minimum adjustment offset time.
  • delta X and delta Y need to be determined according to the resolution of the panoramic picture.
  • the larger the resolution of the panoramic picture, the larger the angle values of delta X and delta Y;
  • Time needs to be determined according to the first preset duration; generally, the longer the first preset duration, the longer the Time.
  • the values of A1 and A2 are usually less than 0.5.
  • FIG. 11 is a schematic diagram of the initial face angle of the target person obtained by the controller 110 according to an embodiment of the application
  • FIG. 12 is a schematic diagram of the current face angle of the target person obtained by the controller 110 according to an embodiment of the application.
  • the content in the dashed frame represents the face range of the target person recognized by the controller 110.
  • in this example, the initial roll angle Roll 1 of the target person's face is -3.2°, the initial yaw angle Yaw 1 is 4.2°, and the initial pitch angle Pitch 1 is -20.9°; the current roll angle Roll 2 is -0.4°, the current yaw angle Yaw 2 is 20.5°, and the current pitch angle Pitch 2 is -2.0°.
  • the controller 110 can finally control the panoramic picture to move a corresponding distance in the horizontal direction and the vertical direction, thereby displaying more content of the panoramic picture.
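  • as a worked illustration, the Fig. 11/Fig. 12 angles can be plugged into the hedged formulas reconstructed above; the constants A1, A2, delta X, delta Y and Time are assumed example values, since the text gives no numbers for them:

```python
# Hedged sketch: the offset formulas follow the reconstructions given above,
# not a confirmed expression from the patent; A1, A2, DELTA_* and TIME are
# assumed example values.

A1 = A2 = 0.4            # correction coefficients (text: usually below 0.5)
DELTA_X = DELTA_Y = 1.0  # minimum adjustment angle values, degrees (assumed)
TIME = 0.3               # minimum adjustment offset time (assumed)

# Face angles from the Fig. 11 / Fig. 12 example, in degrees.
roll1, yaw1, pitch1 = -3.2, 4.2, -20.9   # initial angles
roll2, yaw2, pitch2 = -0.4, 20.5, -2.0   # current angles

offset_x = A1 * ((yaw2 - yaw1) + (roll2 - roll1)) / DELTA_X * TIME
offset_y = A2 * ((pitch2 - pitch1) + (roll2 - roll1)) / DELTA_Y * TIME

print(f"Offset X = {offset_x:.2f}, Offset Y = {offset_y:.2f}")
```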
  • the controller 110 may also predetermine a coordinate system on the screen of the display 275: a two-dimensional rectangular coordinate system may be established with one of the four vertices of the screen as the coordinate origin, or with the exact center of the screen as the coordinate origin, and the panoramic picture is then moved based on the established coordinate system.
  • the following steps are further included:
  • step S501 if the target person in front of the display device 200 is not recognized after a preset period of time, the pixel area corresponding to each human face currently existing in front of the display device 200 is recalculated.
  • the operator who initially operated the panoramic picture browsing may quit the operation midway, but other operators still need to continue browsing.
  • the detector 230 cannot recognize the original target person again, and a new target person needs to be determined again.
  • the method for determining the new target person is as described above: the face pixel areas of all persons in front of the display device 200 are compared, and the person closest to the display device 200 is selected as the target person.
  • step S502 the person corresponding to the face with the largest pixel area is selected as the target person.
  • the controller 110 can identify the face image belonging to each person, and then further calculate the pixel area corresponding to each face image.
  • Step S503 Obtain the initial face angle of the target person.
  • Step S504 After a second preset time period, obtain the current face angle of the target person.
  • the second preset duration in this embodiment is essentially a preset duration, but the time range of the second preset duration is larger than the time range of the first preset duration
  • the first preset duration is T1
  • the second preset duration may be 2T1.
  • the first preset duration T1 is usually set to 0.3 seconds.
  • the panoramic picture may also be adjusted only after waiting for a certain period of time. This duration must balance the efficiency of the adjustment against its visual effect, and is usually set to 2 seconds.
  • the panoramic picture browsing method not only needs to recognize the target person and determine whether the target person has changed, but also to determine whether the currently browsed panoramic picture has changed. If, after the panoramic picture adjusted according to the offset distance is displayed on the display device 200, the controller 110 detects that the picture currently displayed on the display 275 is no longer the previous one, the target-person identification process of the above embodiment needs to be performed again for the new panoramic picture.
  • if the controller 110 detects that the panoramic picture currently displayed on the display 275 is still the previous picture, the controller 110 can directly recognize the face angle change of the target person in front of the display device 200 to realize the next adjustment of the panoramic picture.
  • the embodiment of the present application provides a panoramic picture browsing method.
  • the target person in front of the display device 200 can be recognized; then the target person's initial face angle and the current face angle after the preset duration are obtained, and the offset distance that the panoramic picture on the display device 200 needs to move is calculated from the initial face angle and the current face angle.
  • finally, the panoramic picture is adjusted according to the offset distance so that content of the panoramic picture that was not previously displayed is shown on the display device 200.
  • the solution of the present application realizes browsing of the panoramic picture through the change of the face angle of the target person in front of the display device 200; the offset distance of the panoramic picture is determined by the angle through which the target person's face rotates, so the operator can browse the panoramic picture according to his own needs, without adjusting it by a fixed moving step.
  • the present application also provides a display device 200, including: a display 275; a detector 230 for collecting an image in front of the display device 200; and a controller 110 configured to: recognize, when the display device 200 displays a panoramic picture, the target person in front of the display device 200, where the target person indicates the operator browsing the panoramic picture in front of the display device 200; obtain the initial face angle and the current face angle of the target person before and after the first preset duration; obtain the offset distance of the panoramic picture by using the initial face angle and the current face angle; and adjust the display content of the panoramic picture on the display device 200 according to the offset distance.
  • the controller 110 is further configured to detect whether there is a preset person in front of the display device 200, where the preset person represents an operator, pre-stored in the display device 200, who browses the panoramic picture; if there is, the preset person in front of the display device 200 is determined to be the target person.
  • the controller 110 is further configured to: when multiple persons exist in front of the display device 200 and none of them is a preset person, calculate the pixel area corresponding to each person's face, and select the person corresponding to the face with the largest pixel area as the target person.
  • the controller 110 is further configured to: obtain the initial roll angle, the initial pitch angle, and the initial yaw angle of the face of the target person in front of the display device 200; and, after the first preset time period has elapsed, obtain the current roll angle, current pitch angle, and current yaw angle of that person's face.
  • the controller 110 is further configured to: calculate the horizontal offset distance of the panoramic picture using the initial roll angle, the current roll angle, the initial yaw angle, and the current yaw angle; and calculate the vertical offset distance of the panoramic picture using the initial pitch angle, the current pitch angle, the initial roll angle, and the current roll angle.
  • controller 110 is further configured to calculate the horizontal offset distance of the panoramic picture by using the following formula:
  • Offset X represents the horizontal offset distance of the panoramic picture
  • Roll 1 and Roll 2 represent the initial roll angle and the current roll angle
  • Yaw 1 and Yaw 2 represent the initial yaw angle and the current yaw angle
  • A1 represents the correction coefficient
  • delta X represents the minimum adjustment angle value in the horizontal direction
  • Time represents the minimum adjustment offset time.
  • controller 110 is further configured to calculate the vertical offset distance of the panoramic picture by using the following formula:
  • Offset Y represents the vertical offset distance of the panoramic picture
  • Roll 1 and Roll 2 represent the initial roll angle and the current roll angle
  • Pitch 1 and Pitch 2 represent the initial pitch angle and the current pitch angle
  • A2 represents the correction coefficient
  • delta Y represents the minimum adjustment angle value in the vertical direction
  • Time represents the minimum adjustment offset time.
  • the controller 110 is further configured to: if the target person in front of the display device 200 is not recognized after a preset period of time, recalculate the pixel area corresponding to each face currently existing in front of the display device 200; select the person corresponding to the face with the largest pixel area as the target person; obtain the initial face angle of the target person; and, after a second preset time period, obtain the current face angle of the target person.
  • the display device provided by the embodiment of the present invention can, when the user moves while using the display device, control a control to adjust its position according to the user's movement. For example, if the user moves to the left in front of the display device, the control follows and moves a corresponding distance to the left in the display interface, so that the viewing angle between the user and the control remains unchanged, thereby ensuring that, regardless of the user's position relative to the display device, the display content of the control can be seen clearly from a constant angle of view.
  • a display device provided by an embodiment of the present invention includes a controller, and a display and a camera respectively communicating with the controller.
  • the camera is configured to collect environmental image data.
  • the environmental image data is used to characterize the user's position parameters relative to the display.
  • the camera sends the collected environmental image data to the controller, from which the controller can obtain the user's position parameters. The position parameters include the vertical distance between the user and the display, and the position on the display interface where the center point of the user's face frame falls vertically on the display; the face frame is the frame placed around the user's face when the camera captures the image of the user in front of the display device.
  • the center point of the face frame can be the center position of the face frame or the center position between the user's two pupils.
  • the display is configured to present a display interface, and a target control is displayed in the display interface.
  • the target control can be a notification, a pop-up frame, or a floating window.
  • the control key that realizes the function of dynamically adjusting the position of the control can be configured in the controller.
  • if a control on the display page needs to follow the user's movement, the control key can be turned on in advance so that the display device has the function of dynamically adjusting the position of the control. If the control key is not turned on, the control is displayed normally and does not adjust its position when the user moves.
  • FIG. 13 exemplarily shows a flowchart of a method for dynamically adjusting a control according to an embodiment.
  • when realizing the dynamic adjustment of the control, the controller does so based on a face recognition and distance detection algorithm. Specifically, the controller is configured to perform the following steps:
  • after the user turns on the control key, the display device has the function of dynamically adjusting the position of the control, and the controller obtains the environmental image data collected by the camera in real time. If the position of the user of the display device changes, the position before the change is taken as the user's initial position and the position after the change as the user's end position.
  • the environmental image data corresponding to the user's initial position and the environmental image data corresponding to the end position can be obtained.
  • the environmental image data corresponding to different positions can represent different relative distances between the user and the display and different positions on the display interface when the center point of the user's face frame falls vertically on the display.
  • the controller recognizes the number of human faces in the environmental image data and, when only one human face is recognized, continues to execute the subsequent dynamic control adjustment method.
  • the controller is further configured to: receive the environmental image data collected by the camera; recognize the number of faces in the environmental image data; and, when the number of faces in the environmental image data is 1, execute the steps of acquiring the user initial position parameter and the user end position parameter.
  • in other cases, the controller can control the normal display of the target control without executing the dynamic control adjustment method.
  • when multiple faces are recognized, the controller can also choose one of the users as the target follower, and the control then adjusts its position in accordance with that target follower, as in the sketch below.
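  • a minimal sketch of this gating logic, assuming a hypothetical Face type returned by whatever face detector the device uses; breaking ties by the largest face frame is an illustrative assumption that mirrors the target-person rule used for panoramic browsing:

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class Face:
    x: int  # face frame position in image pixels
    y: int
    w: int  # face frame width in pixels
    h: int  # face frame height in pixels

    @property
    def area(self) -> int:
        # pixel area of the face frame
        return self.w * self.h

def choose_follow_target(faces: List[Face]) -> Optional[Face]:
    """Return the face the control should follow, or None for normal display."""
    if not faces:
        return None        # no user detected: display the control normally
    if len(faces) == 1:
        return faces[0]    # exactly one user: follow that user
    # several users: pick one as the target follower; choosing the largest
    # face frame (the nearest user) mirrors the panoramic-browsing rule above
    return max(faces, key=lambda f: f.area)
```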
  • FIG. 14 exemplarily shows a flowchart of a method for obtaining the initial position parameter of a target control according to an embodiment
  • FIG. 15 exemplarily shows a schematic diagram of a reference coordinate system according to an embodiment.
  • to obtain the initial position parameters of the target control, the controller is further configured to perform the following steps:
  • a reference coordinate system can be established in the display interface.
  • the coordinate origin O of the reference coordinate system is set at the upper left corner of the display interface
  • the positive direction of the X axis is the direction from the left to the right of the display interface
  • the positive direction of the Y axis is the direction from the top to the bottom of the display interface.
  • S122 Obtain the number of pixels at the origin of the coordinates and the number of horizontal pixels and the number of vertical pixels at the control center point of the target control.
  • the initial position parameter of the target control can be represented by coordinate values, and the horizontal and vertical coordinate values can be calculated according to the pixel points of the control center point of the target control.
  • the controller can obtain the resolution of the current display device and then determine the number of pixels at the coordinate origin and at the control center point of the target control. Since the coordinate origin is located at the upper left corner of the display interface, its pixel count can equivalently be taken as 0.
  • the position of the target control is represented by the coordinate position of its control center point M.
  • the number of horizontal pixels and the number of vertical pixels of the control center point M of the target control are respectively obtained.
  • the number of horizontal pixels refers to the number of pixels contained in the X-axis direction between the control center point M of the target control and the coordinate origin O.
  • the number of vertical pixels refers to the number of pixels contained in the Y-axis direction between the control center point M of the target control and the coordinate origin O.
  • S123 Calculate the difference in the number of horizontal pixels between the number of pixels at the origin of the coordinates and the number of horizontal pixels at the center of the control, and the difference between the number of pixels at the origin of the coordinates and the number of vertical pixels at the center of the control.
  • the number of pixels of the coordinate origin O is 0, and the coordinates of the corresponding coordinate origin are (0, 0).
  • the number of horizontal pixels of the control center point is P 1
  • the number of vertical pixels of the control center point is P 2
  • the pixel coordinates of the corresponding control center point are (P 1 , P 2 ).
  • for a display of a given resolution, the number of pixels contained in the display interface is fixed; that is, one resolution corresponds to one set of pixel counts. The spacing between two adjacent pixels, i.e. the length value of each pixel, can therefore be obtained; if the pixel is square, its length and width are the same. Multiplying a difference in pixel counts by the length of each pixel yields the corresponding distance.
  • the coordinates of the control center point of the target control are determined by the horizontal initial distance and the vertical initial distance, that is, the coordinates of the control center point are (L1, L2); the pixel coordinates of the control center point are determined by the numbers of horizontal and vertical pixels, that is, (P1, P2). The coordinates of the control center point and the pixel coordinates of the control center point are used as the initial position parameters of the target control.
  • for the display in this example, the length of the long side is fixed at about 135.5 cm.
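  • for illustration, a minimal sketch of steps S121 to S124 under assumed numbers: a 1920x1080 interface on the roughly 135.5 cm wide screen just mentioned; the control-centre pixel coordinates P1 and P2 are invented example values, not values from the patent:

```python
# Hedged sketch of steps S121-S124 under assumed screen numbers.
SCREEN_WIDTH_CM = 135.5
RES_X, RES_Y = 1920, 1080
PIXEL_LEN_CM = SCREEN_WIDTH_CM / RES_X      # length of one square pixel

p1, p2 = 1600, 120                          # control centre M, pixel coords (assumed)
origin = (0, 0)                             # coordinate origin O, top-left corner

# Pixel-count differences to the origin equal P1 and P2 themselves (S123).
l1 = (p1 - origin[0]) * PIXEL_LEN_CM        # horizontal initial distance L1 (S124)
l2 = (p2 - origin[1]) * PIXEL_LEN_CM        # vertical initial distance L2 (S124)

initial_position = {"coords_cm": (l1, l2),  # (L1, L2)
                    "coords_px": (p1, p2)}  # (P1, P2)
```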
  • the controller can directly retrieve the user's initial position parameter and the user's end position parameter from the acquired environmental image data.
  • FIG. 16 exemplarily shows a schematic diagram of the environmental image data corresponding to the initial position in the embodiment
  • FIG. 17 exemplarily shows a schematic diagram of the environmental image data corresponding to the end position in the embodiment.
  • when the user is at the initial position in front of the display device, the controller can directly obtain the vertical distance (relative distance) between the user and the display from the corresponding environmental image data, for example, 1.70 m as shown in FIG. 16.
  • when the user moves from the initial position to the end position, the controller can directly obtain the vertical distance (relative distance) between the user and the display from the environmental image data corresponding to the end position, for example, 2.24 m as shown in FIG. 17.
  • FIG. 18 exemplarily shows a schematic diagram of the position change of the center point of the face frame on the display interface during the movement of the user according to the embodiment.
  • when the user is at the initial position A, the point on the display interface where the center point of the user's face frame falls vertically is point X, and the line AX is perpendicular to the display interface; when the user is at the end position B, the corresponding point on the display interface is point N, and the line BN is perpendicular to the display interface.
  • the line AX is the vertical distance (relative distance) between the user and the display, and point X is the position on the display interface where the center point of the user's face frame falls vertically on the display; the user's initial position parameter can therefore be determined from AX and point X.
  • likewise, the line BN is the vertical distance (relative distance) between the user and the display, and point N is the corresponding position on the display interface; connecting BN and point N therefore determines the user's end position parameter.
  • in order to ensure that the user's viewing angle toward the display device, and thus toward the display content of the target control, remains the same, so that the user can clearly view the target control no matter where he is, the display device provided in this embodiment needs to control the target control to follow the user's movement and adjust its position when the user moves.
  • the offset of the target control can be calculated according to the user's initial position parameter and the user's end position parameter, and the position parameter that the target control needs to move is determined by the offset.
  • the user’s initial position parameters include the user’s initial relative distance to the display and initial position parameters.
  • the user’s end position parameters include the user’s end relative distance to the display and the end position parameters.
  • the position parameters corresponding to the user refer to the parameters of the center point of the face frame.
  • the initial position parameter refers to the position on the display interface when the center point of the user's face frame falls vertically on the display when the user is in the initial position.
  • the end position parameter refers to the position on the display interface where the center point of the user's face frame falls vertically on the display when the user moves to the end position.
  • Fig. 19 exemplarily shows a flow chart of the method for calculating the offset of the target control according to the embodiment.
  • the controller is further configured to perform the following steps in the process of calculating the offset of the target control based on the user's initial position parameter and the user's end position parameter:
  • the first distance refers to the planar distance between the position X on the display interface and the control center point M of the target control when the center point of the user's face frame falls vertically on the display when the user is at the initial position A;
  • the second distance refers to the planar distance between the position N on the display interface, where the center point of the user's face frame falls vertically on the display when the user has moved to the end position B, and the control center point M of the target control.
  • the first distance (XM line) represents the horizontal plane distance along the display interface
  • the second distance (NM line) represents the horizontal plane distance along the display interface.
  • the first distance and the second distance can be calculated by the pixel point difference between the center point of the face frame and the control center point of the target control.
  • when the controller calculates the first distance between the user's initial position parameter and the initial position parameter of the target control, it is further configured to:
  • Step 311 Obtain the number of pixels at the center point of the face frame when the user is at the initial position and the number of pixels at the control center point of the target control.
  • Step 312 Calculate the difference between the number of pixels at the center of the face frame and the number of pixels at the center of the control.
  • Step 313 Calculate the first distance between the initial position parameter corresponding to the user and the initial position parameter of the target control according to the difference in the number of pixels and the length value of each pixel.
  • the number of pixels at the center point of the face frame when the user is at the initial position can be obtained by the controller from the environmental image data corresponding to the initial position, and the number of pixels at the control center point of the target control can be obtained according to the system properties. Both pixel points can read the corresponding pixel point coordinates in the reference coordinate system.
  • during a horizontal movement, the pixel count of the face frame center point does not change in the Y-axis direction. Therefore, the pixel difference between the face frame center point corresponding to the initial position and the control center point, that is, the difference in the number of pixels, is calculated from the number of horizontal pixels at the face frame center and the number of horizontal pixels at the control center.
  • for the specific calculation method of the pixel difference and the first distance, please refer to the content of steps S121 to S124 provided in the foregoing embodiment, which will not be repeated here; a minimal sketch is also given below.
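  • a minimal sketch of steps 311 to 313, assuming square pixels and the same assumed screen numbers as before; the per-pixel length and the example pixel coordinates are illustrative, not values from the patent:

```python
# Hedged sketch of steps 311-313: the first distance from pixel counts.
PIXEL_LEN_CM = 135.5 / 1920   # per-pixel length under the assumed 1920-wide screen

def first_distance_cm(face_center_px_x: int, control_center_px_x: int) -> float:
    """Planar distance XM along the X axis, in centimetres.

    Only horizontal pixel counts are compared because, for a horizontal user
    movement, the face-frame centre's Y pixel count does not change.
    """
    pixel_diff = abs(face_center_px_x - control_center_px_x)   # step 312
    return pixel_diff * PIXEL_LEN_CM                           # step 313

# Example: face-frame centre at x = 900 px, control centre at x = 1600 px.
d1 = first_distance_cm(900, 1600)   # about 49.4 cm under these assumptions
```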
  • when the controller calculates the second distance between the user's end position parameter and the initial position parameter of the target control, it is further configured to:
  • Step 321 Obtain the number of pixels at the center point of the face frame when the user is at the end position and the number of pixels at the center point of the control of the target control.
  • Step 322 Calculate the difference between the number of pixels at the center of the face frame and the number of pixels at the center of the control.
  • Step 323 Calculate the second distance between the end position parameter corresponding to the user and the initial position parameter of the target control according to the difference in the number of pixels and the length value of each pixel.
  • likewise, the pixel count of the face frame center point does not change in the Y-axis direction, so the pixel difference between the face frame center point corresponding to the end position and the control center point, that is, the difference in the number of pixels, is calculated from the number of horizontal pixels at the face frame center and the number of horizontal pixels at the control center.
  • the specific calculation method of the pixel point difference and the second distance can refer to the content of steps S121 to S124 provided in the foregoing embodiment, which will not be repeated here.
  • the user's viewing angle of the target control at the end position should be the same as at the initial position. Therefore, the position of the target control has to change; to adjust it, the theoretical second distance required for the user to view the target control at the same angle after moving to the end position must be determined.
  • the controller calculates the theoretical second distance when the user moves to the end position based on the initial relative distance, the end relative distance, and the first distance as follows:
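  • the formula itself appears only as an image in the published text; from the requirement that the viewing angle stay unchanged (similar triangles over the relative distances AX and BN), a hedged reconstruction, not the patent's confirmed expression, is:

$$ S_2' = S_1 \cdot \frac{BN}{AX} $$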
  • S 2 ′ is the theoretical second distance
  • S 1 is the first distance
  • AX is the initial relative distance
  • BN is the ending relative distance.
  • FIG. 20 exemplarily shows a schematic diagram when the theoretical second distance is determined according to the embodiment.
  • that is, to keep the viewing angle unchanged, the angle between the line from the face frame center point to the control center point and the perpendicular from the face frame center point to the display interface must remain the same before and after the movement.
  • the theoretical second distance is the theoretical distance between the position corresponding to the user's end position and the end position M' of the target control. Therefore, the offset Offset of the target control is obtained from the distance difference between the theoretical second distance and the second distance.
  • the offset is the distance the control center point M of the target control must move to reach point M'.
  • the end position parameter of the target control is obtained, and the target control is moved to the position corresponding to the end position parameter.
  • based on the initial position parameter of the target control and the offset of the target control, the end position of the target control after adjustment can be determined, and the position adjustment of the target control is realized according to the end position parameter.
  • FIG. 21 exemplarily shows the first schematic diagram when the position of the control is dynamically adjusted according to the embodiment.
  • point M is the initial position parameter of the target control
  • point M' is the end position parameter of the target control; moving the target control from point M to point M' realizes the position adjustment of the target control when the user moves from the initial position A to the end position B.
  • the end position parameter of the target control = the initial position parameter of the target control - the offset.
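  • a one-axis sketch of this adjustment, using the hedged S2' reconstruction given above; the vertical axis is handled analogously, and s1, s2 and the initial control position are assumed example values (the 1.70 m and 2.24 m relative distances come from the Fig. 16/Fig. 17 example):

```python
def adjusted_control_position(m_init: float, s1: float, s2: float,
                              ax: float, bn: float) -> float:
    """Return the control's end position M' along one axis.

    m_init: initial control position; s1, s2: first and second distances;
    ax, bn: initial and end user-to-display relative distances.
    """
    s2_theoretical = s1 * bn / ax   # hedged reconstruction: keeps the viewing angle
    offset = s2_theoretical - s2    # distance difference gives the control offset
    return m_init - offset          # end position = initial position - offset

# Fig. 16/17 relative distances (1.70 m -> 2.24 m) with assumed distances in cm.
m_end = adjusted_control_position(m_init=80.0, s1=30.0, s2=60.0, ax=1.70, bn=2.24)
```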
  • the foregoing embodiment is based on a situation in which the position adjustment of the target control is realized when the user moves from the initial position A to the end position B.
  • the user may also move in the vertical direction, for example changing from a standing state to a sitting state; in that case the user's position also changes in the Y-axis direction.
  • in order to adapt to situations where the user's position changes in both the X-axis and Y-axis directions, the display device provided by the embodiment of the present invention needs to determine both a horizontal offset and a vertical offset when determining the offset of the target control. For example, when the user changes from standing directly in front of the display device to sitting on a chair at the rear left, the target control needs to be controlled to move from its initial position toward the lower left corner.
  • the user initial position parameter includes a horizontal initial position parameter and a vertical initial position parameter
  • the user end position parameter includes a horizontal end position parameter and a vertical end position parameter.
  • the horizontal initial position parameter includes the horizontal initial relative distance and the horizontal initial position parameter of the user in the initial position relative to the display
  • the vertical initial position parameter includes the vertical initial relative distance and the vertical initial position parameter of the user in the initial position relative to the display.
  • the horizontal end position parameter includes the horizontal end relative distance and the horizontal end position of the user at the end position relative to the display
  • the vertical end position parameter includes the vertical end relative distance and the vertical end position of the user at the end position relative to the display.
  • the vertical relative distance refers to the distance the center point of the face frame moves along the Y axis, that is, the height difference between the face frame center point when the user is standing and when the user is sitting.
  • the vertical position parameter refers to the position on the display interface when the center point of the user's face frame falls vertically on the display when the user moves vertically to the end position.
  • the controller calculates the offset of the target control based on the user's initial position parameter and the user's end position parameter, and is further configured to:
  • Step 701 Calculate the lateral offset of the target control based on the lateral initial position parameter and the lateral end position parameter.
  • Step 702 Calculate the longitudinal offset of the target control based on the longitudinal initial position parameter and the longitudinal end position parameter.
  • when calculating the horizontal offset and the vertical offset of the target control, reference may be made to the content of step S3 provided in the foregoing embodiment: the horizontal offset of the target control is calculated from the horizontal initial position parameter and the horizontal end position parameter, and the vertical offset is calculated from the vertical initial position parameter and the vertical end position parameter. The specific calculation process is not repeated here.
  • the end position parameter of the target control after adjustment can then be determined from the initial position parameter of the target control and the offset.
  • the initial position parameter of the target control includes a horizontal initial position parameter and a vertical initial position parameter. Therefore, the determined end position parameter of the target control also includes the horizontal end position parameter and the vertical end position parameter of the target control.
  • the controller obtains the end position parameter of the target control based on the initial position parameter of the target control and the offset of the target control, and is further configured to:
  • Step 801 Calculate the horizontal end position parameter of the target control according to the horizontal initial position parameter and the horizontal offset of the target control.
  • Step 802 Calculate the longitudinal end position parameter of the target control according to the longitudinal initial position parameter and the longitudinal offset of the target control.
  • from the horizontal initial position parameter and the horizontal offset of the target control, the horizontal end position after adjustment can be determined; from the vertical initial position parameter and the vertical offset, the vertical end position after adjustment can be determined. According to the horizontal end position parameter and the vertical end position parameter, the position adjustment of the target control is realized, so that the target control follows the user's movement.
  • Fig. 22 exemplarily shows a second schematic diagram when the position of the control is dynamically adjusted according to the embodiment.
  • in this case, the target control needs to be controlled to move from its initial position toward the lower left corner.
  • the horizontal end position parameter of the target control = the horizontal initial position parameter - the horizontal offset
  • the vertical end position parameter of the target control = the vertical initial position parameter + the vertical offset
  • in summary, when the user moves from the initial position to the end position, the controller receives the environmental image data collected by the camera for the initial position and for the end position, and obtains the user initial position parameter and the user end position parameter from them; it calculates the offset of the target control according to these parameters; it obtains the end position parameter of the target control based on the initial position parameter of the target control and the offset; and it moves the target control to the position corresponding to the end position parameter.
  • the display device provided by the embodiment of the present invention can thus adjust the position of the target control following the user's movement, so that the user can watch the target control from any direction within the visual range of the camera, can clearly see the display content of the control, and enjoys an improved subjective visual experience.
  • FIG. 13 exemplarily shows a flowchart of a method for dynamically adjusting a control according to an embodiment.
  • This application also provides a method for dynamically adjusting controls, which is executed by a controller in a display device, and the method includes the following steps:
  • the calculating the offset of the target control based on the user initial position parameter and the user end position parameter includes:
  • the user's initial position parameter includes the user's initial relative distance to the display and initial position parameters
  • the user end position parameter includes the user's end relative distance to the display and end position parameters
  • the position parameter corresponding to the user refers to the parameters of the center point of the face frame
  • the position parameter corresponding to the control refers to the parameter of the center point of the control
  • the theoretical second distance when the user moves to the end position is calculated based on the initial relative distance, the end relative distance, and the first distance; the theoretical second distance characterizes the theoretical distance between the position corresponding to the user's end position and the end position of the target control; and the distance difference between the theoretical second distance and the second distance is calculated to obtain the offset of the target control.
  • the calculating the first distance between the initial position parameter corresponding to the user and the initial position parameter of the target control includes:
  • the first distance between the initial position parameter corresponding to the user and the initial position parameter of the target control is calculated.
  • the calculating the second distance between the end position parameter corresponding to the user and the initial position parameter of the target control includes:
  • the second distance between the end position parameter corresponding to the user and the initial position parameter of the target control is calculated.
  • the calculation of the theoretical second distance when the user moves to the end position based on the initial relative distance, the end relative distance, and the first distance includes:
  • S 2 ′ is the theoretical second distance
  • S 1 is the first distance
  • AX is the initial relative distance
  • BN is the ending relative distance.
  • the obtaining the initial position parameter of the target control includes:
  • the horizontal initial distance and the vertical initial distance between the control center point of the target control and the coordinate origin are calculated, and the horizontal initial distance, the vertical initial distance, the number of horizontal pixels, and the number of vertical pixels of the control center point are used as the initial position parameters of the target control.
  • the calculating the offset of the target control based on the user initial position parameter and the user end position parameter includes:
  • the user initial position parameter includes a horizontal initial position parameter and a vertical initial position parameter
  • the user end position parameter includes a horizontal end position parameter and a vertical end position parameter
  • the lateral offset of the target control is calculated based on the lateral initial position parameter and the lateral end position parameter, and the longitudinal offset of the target control is calculated based on the longitudinal initial position parameter and the longitudinal end position parameter.
  • the obtaining the end position parameter of the target control based on the initial position parameter of the target control and the offset of the target control includes:
  • the initial position parameter of the target control includes a horizontal initial position parameter and a vertical initial position parameter
  • the horizontal end position parameter of the target control is calculated according to the horizontal initial position parameter and the horizontal offset, and the longitudinal end position parameter of the target control is calculated according to the longitudinal initial position parameter and the longitudinal offset.
  • in some embodiments, the method further includes: receiving the environmental image data collected by the camera and recognizing the number of faces in it; when the number of faces in the environmental image data is 1, the step of acquiring the user initial position parameter and the user end position parameter is performed.
  • the present invention also provides a computer storage medium, which may store a program; when executed, the program may perform some or all of the steps of each embodiment of the method for dynamically adjusting controls provided by the present invention.
  • the storage medium may be a magnetic disk, an optical disc, a read-only memory (English: read-only memory, abbreviated as ROM) or a random access memory (English: random access memory, abbreviated as: RAM), etc.

Abstract

The present application relates to a display method and a display device. When the display device displays a panoramic picture, a target person in front of the display device can be identified; an initial face angle of the target person and a current face angle after a preset duration are then obtained separately; an offset distance by which the panoramic picture on the display device needs to move is calculated using the initial face angle and the current face angle; and finally the panoramic picture is adjusted according to the offset distance so that content of the panoramic picture that was not previously displayed is shown on the display device.
PCT/CN2021/081562 2020-04-27 2021-03-18 Procédé et dispositif d'affichage WO2021218473A1 (fr)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
CN202010342885.9A CN113645502B (zh) 2020-04-27 2020-04-27 一种动态调整控件的方法及显示设备
CN202010342885.9 2020-04-27
CN202010559804.0 2020-06-18
CN202010559804.0A CN113825001B (zh) 2020-06-18 2020-06-18 全景图片浏览方法及显示设备

Publications (1)

Publication Number Publication Date
WO2021218473A1 true WO2021218473A1 (fr) 2021-11-04

Family

ID=78374066

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/081562 WO2021218473A1 (fr) 2020-04-27 2021-03-18 Procédé et dispositif d'affichage

Country Status (1)

Country Link
WO (1) WO2021218473A1 (fr)

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9596401B2 (en) * 2006-10-02 2017-03-14 Sony Corporation Focusing an image based on a direction of a face of a user
US20140009503A1 (en) * 2012-07-03 2014-01-09 Tourwrist, Inc. Systems and Methods for Tracking User Postures to Control Display of Panoramas
CN105988578A (zh) * 2015-03-04 2016-10-05 华为技术有限公司 一种交互式视频显示的方法、设备及系统
CN106383655A (zh) * 2016-09-19 2017-02-08 北京小度互娱科技有限公司 在全景播放过程中控制视角转换的交互控制方法及装置
CN106598428A (zh) * 2016-11-29 2017-04-26 宇龙计算机通信科技(深圳)有限公司 播放全景视频的方法、系统及终端设备
CN108319362A (zh) * 2018-01-02 2018-07-24 联想(北京)有限公司 一种全景信息显示方法、电子设备和计算机存储介质
CN108235132A (zh) * 2018-03-13 2018-06-29 哈尔滨市舍科技有限公司 基于人眼定位的全景视频视角调整方法与装置

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114449162A (zh) * 2021-12-22 2022-05-06 天翼云科技有限公司 一种播放全景视频的方法、装置、计算机设备及存储介质
CN114449162B (en) * 2021-12-22 2024-04-30 天翼云科技有限公司 Method, device, computer equipment and storage medium for playing panoramic video

Similar Documents

Publication Publication Date Title
CN113330736A (zh) 一种显示器及图像处理方法
WO2021179359A1 (fr) Dispositif d'affichage et procédé d'adaptation de la rotation des images d'affichage
WO2020248680A1 (fr) Procédé et appareil de traitement de données vidéo et dispositif d'affichage
CN111970548B (zh) 显示设备及调整摄像头角度的方法
CN112291599B (zh) 显示设备及调整摄像头角度的方法
US11960674B2 (en) Display method and display apparatus for operation prompt information of input control
WO2021212463A1 (fr) Dispositif d'affichage et procédé de projection sur écran
CN112866773B (zh) 一种显示设备及多人场景下摄像头追踪方法
CN111899175A (zh) 图像转换方法及显示设备
CN112243141A (zh) 投屏功能的显示方法及显示设备
WO2022028060A1 (fr) Dispositif et procédé d'affichage
WO2021213097A1 (fr) Appareil d'affichage et procédé de projection d'écran
CN114430492B (zh) 显示设备、移动终端及图片同步缩放方法
CN113825002A (zh) 显示设备及焦距控制方法
WO2021212470A1 (fr) Dispositif d'affichage et procédé d'affichage d'image d'écran projeté
WO2021031598A1 (fr) Procédé d'ajustement auto-adaptatif pour la position d'une fenêtre de dialogue en ligne vidéo, et dispositif d'affichage
CN111078926A (zh) 一种人像缩略图像的确定方法及显示设备
WO2021218473A1 (fr) Procédé et dispositif d'affichage
WO2021180223A1 (fr) Procédé et dispositif d'affichage
CN112218156B (zh) 一种调节视频动态对比度的方法及显示设备
CN113824870A (zh) 显示设备及摄像头角度调整方法
CN113825001B (zh) 全景图片浏览方法及显示设备
CN113473024A (zh) 显示设备、云台摄像头和摄像头控制方法
CN114302203A (zh) 图像显示方法及显示设备
CN114417035A (zh) 一种图片浏览方法和显示设备

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21796851

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21796851

Country of ref document: EP

Kind code of ref document: A1