WO2022028060A1 - Display device and display method - Google Patents


Publication number
WO2022028060A1
Authority
WO
WIPO (PCT)
Prior art keywords
angle
camera
target
user
application
Prior art date
Application number
PCT/CN2021/096429
Other languages
English (en)
Chinese (zh)
Inventor
李东航
刘晋
姜俊厚
司洪龙
李保成
Original Assignee
海信视像科技股份有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from CN202010789824.7A (external priority; patent CN111970548B)
Priority claimed from CN202110156378.0A (external priority; patent CN112954425A)
Application filed by 海信视像科技股份有限公司
Priority to CN202180053612.5A (patent CN116264864A)
Publication of WO2022028060A1

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules

Definitions

  • the present application relates to the technical field of display devices, and in particular, to a display device and a display method.
  • the display device can present content such as audio, video, and pictures to the user.
  • display devices can not only provide users with the content of live TV programs received through data broadcasting, but also provide users with various application and service content such as online video and online games.
  • the related art configures a camera on the display device, obtains local image data collected by the camera through the controller of the display device for processing, and implements functions such as video chat, photography, and video recording on the display device.
  • the present application provides a display device, including:
  • the angle at which the camera is positioned can be adjusted, so that the camera captures images in different ranges when it is at different angles;
  • in response to a user operation carrying an identifier indicating a target adjustment direction and an operation mode,
  • the target angle is determined according to the target adjustment direction and the saved starting angle, and the angle of the camera is adjusted to the target angle.
  • the starting angle is the target angle determined last time, or the angle at which the camera was located when the adjustment was last terminated;
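The angle-adjustment rule described above can be sketched as follows; the step size, mechanical range, and direction names are illustrative assumptions rather than values from the application:

```python
# Hypothetical sketch of computing a target angle from the saved starting
# angle and the target adjustment direction carried in a user operation.
# STEP_DEG, MIN_ANGLE and MAX_ANGLE are assumed values, not from the patent.

STEP_DEG = 5                   # degrees moved per adjustment (assumed)
MIN_ANGLE, MAX_ANGLE = 0, 120  # mechanical range of the camera (assumed)

def target_angle(starting_angle: int, direction: str) -> int:
    """Determine the target angle according to the target adjustment
    direction and the saved starting angle, clamped to the range the
    camera can physically reach."""
    delta = STEP_DEG if direction == "up" else -STEP_DEG
    return max(MIN_ANGLE, min(MAX_ANGLE, starting_angle + delta))
```

On the next user operation, the value returned here would be saved as the new starting angle, matching the rule that the starting angle is the target angle determined last time.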
  • FIG. 1 exemplarily shows a schematic diagram of an operation scene between a display device and a control apparatus according to some embodiments
  • FIG. 2 exemplarily shows a hardware configuration block diagram of a display device 200 according to some embodiments
  • FIG. 3 exemplarily shows a hardware configuration block diagram of the control device 100 according to some embodiments
  • FIG. 4 exemplarily shows a schematic diagram of software configuration in the display device 200 according to some embodiments
  • FIG. 5 exemplarily shows a schematic diagram of displaying an icon control interface of an application in the display device 200 according to some embodiments
  • FIG. 6 to FIG. 11 exemplarily show the state diagrams when the camera is at different angles
  • FIG. 12 is a flowchart of a method for adjusting a camera angle according to an exemplary embodiment of the present application.
  • FIG. 13a is a schematic diagram of a display device shown in some embodiments of the present application.
  • FIG. 13b is a schematic diagram of a display device shown in some embodiments of the present application.
  • FIG. 14 exemplarily shows a flow chart of a camera control method provided by the present application.
  • FIG. 15 exemplarily shows a flowchart of a camera control method provided by the present application.
  • module refers to any known or later developed hardware, software, firmware, artificial intelligence, fuzzy logic, or combination of hardware and/or software code capable of performing the function associated with that element.
  • remote control refers to a component of an electronic device, such as the display device disclosed in this application, that can wirelessly control the electronic device, usually over a short distance.
  • a remote control typically connects to the electronic device using infrared and/or radio frequency (RF) signals and/or Bluetooth, and may also include functional modules such as WiFi, wireless USB, Bluetooth, and motion sensors.
  • a hand-held touch remote control replaces most of the physical built-in hard keys in a general remote control device with a user interface in a touch screen.
  • gesture as used in this application refers to a user behavior used to express an expected thought, action, purpose, and/or result through an action such as a change of hand shape or a hand movement.
  • FIG. 1 exemplarily shows a schematic diagram of an operation scenario between a display device and a control apparatus according to an embodiment.
  • a user may operate the display apparatus 200 through the mobile terminal 300 and the control apparatus 100 .
  • the control device 100 may be a remote controller, which communicates with the display device through infrared protocol communication, Bluetooth protocol communication, or other short-range communication methods, and controls the display device 200 wirelessly or through wired methods.
  • the user can control the display device 200 by inputting user instructions through keys on the remote control, voice input, control panel input, and the like.
  • the user can realize control functions of the display device 200 by inputting corresponding control commands through the volume up/down keys, channel control keys, up/down/left/right movement keys, voice input key, menu key, and power on/off key on the remote control.
  • mobile terminals, tablet computers, computers, notebook computers, and other smart devices may also be used to control the display device 200 .
  • the display device 200 is controlled using an application running on the smart device.
  • the app can be configured to provide users with various controls in an intuitive user interface (UI) on the screen associated with the smart device.
  • the mobile terminal 300 may install a software application with the display device 200 to implement connection communication through a network communication protocol, so as to achieve the purpose of one-to-one control operation and data communication.
  • a control command protocol can be established between the mobile terminal 300 and the display device 200
  • the remote control keyboard can be synchronized to the mobile terminal 300
  • the function of controlling the display device 200 can be realized by controlling the user interface on the mobile terminal 300.
  • the audio and video content displayed on the mobile terminal 300 may also be transmitted to the display device 200 to implement a synchronous display function.
  • the display device 200 also performs data communication with the server 400 through various communication methods.
  • the display device 200 may communicate via a local area network (LAN), a wireless local area network (WLAN), or other networks.
  • the server 400 may provide various contents and interactions to the display device 200 .
  • the display device 200 interacts with the server 400 by sending and receiving information, such as exchanging electronic program guide (EPG) data, receiving software program updates, or accessing a remotely stored digital media library.
  • the server 400 may be a cluster or multiple clusters, and may include one or more types of servers. Other network service contents such as video-on-demand and advertising services are provided through the server 400 .
  • the display device 200 may be a liquid crystal display, an OLED display, or a projection display device.
  • the specific display device type, size and resolution are not limited. Those skilled in the art can understand that the display device 200 can make some changes in performance and configuration as required.
  • in addition to the broadcast-receiving TV function, the display device 200 may additionally provide a smart Internet TV function offering computer-supported features, including but not limited to network TV, smart TV, Internet Protocol TV (IPTV), and the like.
  • FIG. 2 exemplarily shows a block diagram of the hardware configuration of the display device 200 according to the exemplary embodiment.
  • the display device 200 includes at least one of: a controller 250, a tuner and demodulator 210, a communicator 220, a detector 230, input/output interfaces (the first interface to the nth interface 255), a display 275, an audio output interface 285, a memory 260, a power supply 290, a user interface 265, and an external device interface 240.
  • the display 275 is configured to receive image signals output from the first processor and to display video content, images, and a menu manipulation interface.
  • the display 275 includes a display screen component for presenting pictures, and a driving component for driving image display.
  • the video content displayed may be from broadcast television content or various broadcast signals that may be received via wired or wireless communication protocols.
  • various image contents received from the network server side via the network communication protocol can also be displayed.
  • the display 275 is also used to present a user-manipulation UI interface generated in the display device 200 for controlling the display device 200.
  • a driving component for driving the display is also included.
  • display 275 is a projection display, and may also include a projection device and projection screen.
  • communicator 220 is a component for communicating with external devices or external servers according to various communication protocol types.
  • the communicator may include at least one of a Wifi chip, a Bluetooth communication protocol chip, a wired Ethernet communication protocol chip and other network communication protocol chips or a near field communication protocol chip, and an infrared receiver.
  • the display device 200 may establish control signal and data signal transmission and reception between the communicator 220 and the external control device 100 or the content providing device.
  • the user interface 265 may be used to receive infrared control signals from the control device 100 (eg, an infrared remote control, etc.).
  • the detector 230 is a component used by the display device 200 to collect signals of the external environment or signals of interaction with the outside.
  • the detector 230 includes a light receiver, a sensor for collecting ambient light intensity, so that display parameters can be adaptively changed according to the collected ambient light.
  • the detector 230 may also include an image collector, such as a camera, which can be used to collect external environment scenes and to collect user attributes or gestures for interacting with the user, so that display parameters can be adaptively changed and user gestures can be recognized to implement interaction with the user.
  • the detector 230 may also include a temperature sensor or the like for sensing the ambient temperature.
  • the display device 200 can adaptively adjust the display color temperature of the image: for example, when the ambient temperature is relatively high, the display device 200 can be adjusted to display the image with a cooler color temperature, and when the temperature is relatively low, with a warmer color temperature.
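As a concrete illustration of this adaptive behavior, the mapping from ambient temperature to display color temperature might look like the following sketch; the thresholds and mode names are assumptions, not values from the application:

```python
def display_color_temperature(ambient_temp_c: float) -> str:
    """Pick a cooler picture color temperature when the ambient
    temperature is relatively high and a warmer one when it is low.
    The thresholds below are illustrative assumptions."""
    if ambient_temp_c >= 28:
        return "cool"      # hot room: cooler color temperature
    if ambient_temp_c <= 18:
        return "warm"      # cold room: warmer color temperature
    return "standard"
```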
  • the detector 230 may also include a sound collector or the like, such as a microphone, which may be used to receive the user's voice, for example a voice signal carrying a control instruction for the display device 200, or to collect ambient sounds for identifying the type of ambient scene, so that the display device 200 can adapt to the ambient noise.
  • the input/output interfaces are configured to enable data transmission between the controller 250 and other external devices or other controllers 250, such as receiving video signal data, audio signal data, or command instruction data from external equipment.
  • the external device interface 240 may include, but is not limited to, any one or more of the following: a high-definition multimedia interface (HDMI), an analog or digital high-definition component input interface, a composite video input interface, a USB input interface, an RGB port, and the like. A composite input/output interface may also be formed from a plurality of the above interfaces.
  • the tuner and demodulator 210 is configured to receive broadcast television signals through wired or wireless reception, to perform processing such as amplification, frequency mixing, and resonance, and to demodulate the audio and video signal from among a plurality of wireless or cable broadcast television signals.
  • the audio and video signal may include the TV audio and video signal carried in the frequency of the TV channel selected by the user, and the EPG data signal.
  • the frequency demodulated by the tuner-demodulator 210 is controlled by the controller 250: the controller 250 can send a control signal according to the user's selection, so that the tuner-demodulator responds to the television signal frequency selected by the user and demodulates the signal carried on that frequency.
  • broadcast television signals may be classified into terrestrial, cable, satellite, or Internet broadcast signals according to the broadcast format; into digital and analog modulation signals according to the modulation type; or into digital and analog signals according to the signal type.
  • the controller 250 and the tuner 210 may be located in different separate devices; that is, the tuner 210 may also be located in a device external to the main device where the controller 250 is located, such as an external set-top box.
  • the set-top box outputs the TV audio and video signals modulated and demodulated by the received broadcast TV signal to the main device, and the main device receives the audio and video signals through the first input/output interface.
  • the controller 250 controls the operation of the display device and responds to user operations.
  • the controller 250 may control the overall operation of the display apparatus 200 .
  • the controller 250 may perform an operation related to the object selected by the user command.
  • the object may be any of the selectable objects, such as a hyperlink or an icon.
  • Operations related to the selected object such as displaying operations linked to hyperlinked pages, documents, images, etc., or executing operations corresponding to the icon.
  • the user command for selecting the UI object may be an input command through various input devices (eg, a mouse, a keyboard, a touchpad, etc.) connected to the display device 200 or a voice command corresponding to a voice spoken by the user.
  • the controller 250 includes at least one of a random access memory 251 (RAM), a read-only memory 252 (ROM), a video processor 270, an audio processor 280, other processors 253 (for example, a graphics processing unit (GPU)), a central processing unit 254 (CPU), a communication interface, and a communication bus 256, where the communication bus connects the components.
  • RAM 251 is used to store temporary data for the operating system or other running programs
  • ROM 252 is used to store various system startup instructions.
  • ROM 252 is used to store a Basic Input Output System (BIOS), which completes the power-on self-test of the system, the initialization of each functional module in the system, the drivers for the basic input/output of the system, and the booting of the operating system.
  • when the display device 200 is powered on, the CPU executes the system start-up instructions in the ROM 252 and copies the temporary data of the operating system stored in the memory to the RAM 251, so as to start or run the operating system.
  • the CPU copies the temporary data of various application programs in the memory to the RAM 251, so as to facilitate starting or running various application programs.
  • the CPU processor 254 executes operating system and application program instructions stored in memory, and executes various application programs, data, and content according to the interactive instructions received from external input, so as to finally display and play various audio and video content.
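The start-up flow in the preceding paragraphs (start-up instructions in ROM, operating-system and application data copied from memory into RAM) can be modeled with a toy sketch; the dict-based layout of ROM and memory is purely illustrative:

```python
def boot(rom: dict, memory: dict) -> tuple:
    """Toy model of the start-up flow: execute the start-up instruction
    held in ROM, copy the operating-system data from memory into RAM,
    then copy each application so it can be started or run."""
    ram = {}
    log = [rom["startup_instruction"]]
    ram["os"] = memory["os"]                 # OS temporary data -> RAM 251
    log.append("operating system started")
    for name, data in memory["apps"].items():
        ram[name] = data                     # application data -> RAM 251
        log.append(f"{name} started")
    return ram, log

ram, log = boot(
    {"startup_instruction": "power-on self-test"},
    {"os": "os-data", "apps": {"clock": "clock-data"}},
)
```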
  • CPU processor 254 may include multiple processors.
  • the plurality of processors may include a main processor and one or more sub-processors.
  • the main processor is used to perform some operations of the display device 200 in the pre-power-on mode, and/or an operation of displaying a picture in the normal mode.
  • one or more sub-processors are used for operations in a state such as standby mode.
  • the graphics processor 253 is used to generate various graphic objects, such as icons, operation menus, and graphics displayed based on user input instructions. It includes an operator, which performs operations on the various interactive instructions input by the user and displays the resulting objects according to their display properties, and a renderer, which renders the objects obtained from the operator for display on the display.
  • the video processor 270 is configured to receive the external video signal and perform decompression, decoding, scaling, noise reduction, frame rate conversion, resolution conversion, image synthesis, etc. according to the standard codec protocol of the input signal. After video processing, a signal that can be directly displayed or played on the display device 200 can be obtained.
  • the video processor 270 includes a demultiplexing module, a video decoding module, an image synthesis module, a frame rate conversion module, a display formatting module, and the like.
  • the demultiplexing module is used for demultiplexing the input audio and video data stream. For example, if MPEG-2 is input, the demultiplexing module demultiplexes it into video signals and audio signals.
  • the video decoding module is used to process the demultiplexed video signal, including decoding and scaling.
  • the image synthesizing module, such as an image synthesizer, is used for superimposing and mixing the GUI signal (generated by the graphics generator according to user input or by the system itself) with the scaled video image, so as to generate an image signal for display.
  • the frame rate conversion module is used to convert the input video frame rate, such as converting 60Hz frame rate to 120Hz frame rate or 240Hz frame rate.
  • frame rate conversion is usually implemented by means of frame insertion.
  • the display formatting module is used for converting the signal received after frame rate conversion into a video output signal conforming to the display format, such as an RGB data signal.
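The modules above can be chained into a single processing pipeline; the sketch below is a simplified model with assumed function names and a dict-based signal, not the device's actual implementation:

```python
# Simplified model of the video-processing chain: demultiplex -> decode and
# scale -> compose with GUI -> frame-rate convert. All names are assumptions.

def demultiplex(stream):
    """Split an input stream (e.g. MPEG-2) into video and audio signals."""
    return stream["video"], stream["audio"]

def decode_and_scale(video, width, height):
    """Decode the demultiplexed video signal and scale it."""
    video["size"] = (width, height)
    return video

def compose_with_gui(video, gui_layer):
    """Superimpose the GUI signal on the scaled video image."""
    video["layers"] = ["video", gui_layer]
    return video

def convert_frame_rate(video, in_hz, out_hz):
    """Frame-rate conversion by frame insertion, e.g. 60 Hz -> 120 Hz."""
    factor = out_hz // in_hz
    video["frames"] = [f for f in video["frames"] for _ in range(factor)]
    return video

stream = {"video": {"frames": ["f0", "f1"]}, "audio": "pcm"}
video, audio = demultiplex(stream)
video = decode_and_scale(video, 3840, 2160)
video = compose_with_gui(video, "menu")
video = convert_frame_rate(video, 60, 120)  # frame insertion doubles frames
```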
  • the graphics processor 253 may be integrated with the video processor, or may be separately configured.
  • when integrated, they can jointly perform the processing of graphics signals output to the display; when separately configured, they may perform different functions respectively, for example a GPU + FRC (Frame Rate Conversion) architecture.
  • the audio processor 280 is configured to receive an external audio signal and, according to the standard codec protocol of the input signal, perform decompression, decoding, noise reduction, digital-to-analog conversion, and amplification processing to obtain a sound signal that can be played in the speaker.
  • the video processor 270 may comprise one or more chips.
  • the audio processor may also include one or more chips.
  • the video processor 270 and the audio processor 280 may be separate chips, or may be integrated into one or more chips together with the controller.
  • the audio output, under the control of the controller 250, receives the sound signal output by the audio processor 280. In addition to the speaker 286 carried by the display device 200 itself, the sound may be output to an external device through an external audio output terminal, such as an external audio interface or an earphone interface; the communication interface may also include a short-range communication module, such as a Bluetooth module for outputting sound to a Bluetooth speaker.
  • the power supply 290 under the control of the controller 250, provides power supply support for the display device 200 with the power input from the external power supply.
  • the power supply 290 may include a built-in power supply circuit installed inside the display device 200 , or may be an external power supply installed in the display device 200 to provide an external power supply interface in the display device 200 .
  • the user interface 265 is used for receiving user input signals, and then sending the received user input signals to the controller 250 .
  • the user input signal may be a remote control signal received through an infrared receiver, and various user control signals may be received through a network communication module.
  • the user inputs user commands through the control device 100 or the mobile terminal 300, the user input interface receives the input, and the display device 200 responds to the user's input through the controller 250.
  • the user may input user commands on a graphical user interface (GUI) displayed on the display 275, and the user input interface receives the user input commands through the graphical user interface (GUI).
  • the user may input a user command by inputting a specific sound or gesture, and the user input interface recognizes the sound or gesture through a sensor to receive the user input command.
  • a "user interface” is a medium interface for interaction and information exchange between an application program or an operating system and a user, which enables conversion between an internal form of information and a form acceptable to the user.
  • the commonly used form of user interface is the Graphical User Interface (GUI), a user interface related to computer operations that is displayed in a graphical manner. It may consist of interface elements such as icons, windows, and controls displayed on the display screen of the electronic device, where the controls can include visual interface elements such as icons, buttons, menus, tabs, text boxes, dialog boxes, status bars, navigation bars, and widgets.
  • the memory 260 stores various software modules for driving the display device 200.
  • various software modules stored in the first memory include at least one of a basic module, a detection module, a communication module, a display control module, a browser module, and various service modules.
  • the basic module is used for signal communication between various hardwares in the display device 200, and is a low-level software module that sends processing and control signals to the upper-layer module.
  • the detection module is a management module used to collect various information from various sensors or user input interfaces, perform digital-to-analog conversion, and analyze and manage.
  • the speech recognition module includes a speech parsing module and a speech instruction database module.
  • the display control module is a module used to control the display to display image content, and can be used to play information such as multimedia image content and UI interface.
  • the communication module is a module for control and data communication with external devices.
  • the browser module is a module for performing data communication with browsing servers. The service modules are used to provide various services and include various application programs.
  • the memory 260 is also used to store and receive external data and user data, images of various items in various user interfaces, and visual effect diagrams of focus objects, and the like.
  • FIG. 3 exemplarily shows a block diagram of the configuration of the control device 100 according to an exemplary embodiment.
  • the control device 100 includes a controller 110, a communication interface 130, a user input/output interface, a memory, and a power supply.
  • the control device 100 is configured to control the display device 200 , and can receive the user's input operation instructions, and convert the operation instructions into instructions that the display device 200 can recognize and respond to, and play an intermediary role between the user and the display device 200 .
  • the user operates the channel addition and subtraction keys on the control device 100, and the display device 200 responds to the channel addition and subtraction operations.
  • control device 100 may be a smart device.
  • control device 100 may install various applications for controlling the display device 200 according to user requirements.
  • the mobile terminal 300 or other intelligent electronic device can perform a similar function of controlling the device 100 after installing an application for operating the display device 200 .
  • by installing such an application, the user can use the various function keys or virtual buttons of the graphical user interface provided on the mobile terminal 300 or other intelligent electronic device to realize the functions of the physical keys of the control device 100.
  • the controller 110 includes a processor 112 and RAM 113 and ROM 114, a communication interface 130, and a communication bus.
  • the controller is used to control the running and operation of the control device 100, the communication and cooperation among the internal components, and the external and internal data processing functions.
  • the communication interface 130 realizes the communication of control signals and data signals with the display device 200 under the control of the controller 110 .
  • the received user input signal is sent to the display device 200 .
  • the communication interface 130 may include at least one of other near field communication modules such as a WiFi chip 131 , a Bluetooth module 132 , and an NFC module 133 .
  • in the user input/output interface 140, the input interface includes at least one of input interfaces such as a microphone 141, a touch panel 142, a sensor 143, and keys 144.
  • the user can implement the user command input function through actions such as voice, touch, gesture, pressing, etc.
  • the input interface converts the received analog signal into a digital signal, and converts the digital signal into a corresponding command signal, and sends it to the display device 200.
  • the output interface includes an interface for transmitting received user instructions to the display device 200 .
  • it can be an infrared interface or a radio frequency interface.
  • when the infrared signal interface is used, the user input instruction needs to be converted into an infrared control signal according to the infrared control protocol and sent to the display device 200 through the infrared sending module.
  • when a radio frequency signal interface is used, the user input command needs to be converted into a digital signal, modulated according to the radio frequency control signal modulation protocol, and then sent to the display device 200 through the radio frequency transmitting terminal.
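The two output paths can be contrasted in a short sketch; the signal encodings below are placeholders, not the actual infrared or radio-frequency control protocols:

```python
def send_user_command(command: str, interface: str) -> dict:
    """Dispatch a user command over the infrared or radio frequency
    output interface. The encodings are illustrative placeholders."""
    if interface == "infrared":
        # Convert to an infrared control signal per the IR control protocol.
        return {"medium": "ir", "signal": f"IR<{command}>"}
    if interface == "rf":
        # Digitize, then modulate per the RF control-signal protocol.
        digital = command.encode().hex()
        return {"medium": "rf", "signal": f"RF<{digital}>"}
    raise ValueError(f"unknown interface: {interface}")
```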
  • control device 100 includes at least one of a communication interface 130 and an input-output interface 140 .
  • the communication interface 130 is configured in the control device 100, such as: WiFi, Bluetooth, NFC and other modules, which can send user input instructions to the display device 200 through WiFi protocol, or Bluetooth protocol, or NFC protocol encoding.
  • the memory 190 is used to store various operating programs, data, and applications for driving and controlling the control device 100 under the control of the controller.
  • the memory 190 can store various control signal instructions input by the user.
  • the power supply 180 is used to provide operating power support for each element of the control device 100 under the control of the controller, and may be a battery and related control circuit.
  • a system may include a kernel (Kernel), a command parser (shell), a file system, and applications.
  • the kernel, shell, and file system together make up the basic operating system structures that allow users to manage files, run programs, and use the system.
  • the kernel starts, activates the kernel space, abstracts hardware, initializes hardware parameters, etc., runs and maintains virtual memory, scheduler, signals and inter-process communication (IPC).
  • the shell and user applications are loaded.
  • An application is compiled into machine code after startup, forming a process.
  • the system is divided into four layers, from top to bottom: an application layer (referred to as the "application layer"), an application framework layer (referred to as the "framework layer"), the Android runtime and system library layer (referred to as the "system runtime layer"), and the kernel layer.
  • at least one application program runs in the application layer. These application programs may be a window program, a system setting program, a clock program, a camera application, or the like built into the operating system; they may also be application programs developed by third-party developers, such as a Hijian program, a karaoke (K song) program, a magic mirror program, and the like.
  • the application package in the application layer is not limited to the above examples, and may actually include other application packages, which is not limited in this embodiment of the present application.
  • the framework layer provides an application programming interface (API) and a programming framework for applications in the application layer.
  • the application framework layer includes some predefined functions.
  • the application framework layer is equivalent to a processing center, which decides what actions the applications in the application layer should take.
  • the application program can access the resources in the system and obtain the services of the system during execution through the API interface.
  • the application framework layer in the embodiment of the present application includes managers (Managers), content providers (Content Provider), etc., wherein the managers include at least one of the following modules: an Activity Manager, used to interact with all activities running in the system; a Location Manager, used to provide system services or applications with access to system location services; a Package Manager, used to retrieve various information related to the application packages currently installed on the device; a Notification Manager, used to control the display and clearing of notification messages; and a Window Manager, used to manage icons, windows, toolbars, wallpapers, and desktop widgets on the user interface.
  • the activity manager is used to manage the life cycle of each application and the usual navigation and back functions, such as controlling the exit of an application (including switching the user interface currently displayed in the display window to the system desktop), opening an application, and going back (including switching the user interface currently displayed in the display window to its upper-level user interface), and the like.
  • the window manager is used to manage all window programs, such as obtaining the size of the display screen, judging whether there is a status bar, locking the screen, taking screenshots, and controlling changes to the display window (for example, shrinking the display window, shaking the display, or distorting the display), and so on.
  • the system runtime layer provides support for the upper layer, that is, the framework layer.
  • the Android operating system will run the C/C++ library included in the system runtime layer to implement the functions to be implemented by the framework layer.
  • the kernel layer is the layer between hardware and software. As shown in Figure 4, the kernel layer at least includes at least one of the following drivers: audio driver, display driver, Bluetooth driver, camera driver, WIFI driver, USB driver, HDMI driver, sensor driver (such as fingerprint sensor, temperature sensor, touch sensors, pressure sensors, etc.), etc.
  • the kernel layer further includes a power driver module for power management.
  • software programs and/or modules corresponding to the software architecture in FIG. 4 are stored in the first memory or the second memory shown in FIG. 2 or FIG. 3 .
  • when the remote control receiving device receives an input operation from the remote control, the corresponding hardware interrupt is sent to the kernel layer.
  • the kernel layer processes the input operation into the original input event (including the value of the input operation, the timestamp of the input operation and other information).
  • Raw input events are stored at the kernel layer.
  • the application framework layer obtains the original input event from the kernel layer, identifies the control corresponding to the input event according to the current position of the focus, and regards the input operation as a confirmation operation, and the control corresponding to the confirmation operation is the control of the magic mirror application icon.
  • the magic mirror application calls the interface of the application framework layer to start the magic mirror application, and then starts the camera driver by calling the kernel layer to capture still images or videos through the camera.
  • the display device receives an input operation (such as a split-screen operation) performed by the user on the display screen, and the kernel layer can generate a corresponding input event according to the input operation and report the event to the application framework layer.
  • the window mode (such as multi-window mode) and window position and size corresponding to the input operation are set by the activity manager of the application framework layer.
  • the window manager of the application framework layer draws the window according to the settings of the activity manager, and then sends the drawn window data to the display driver of the kernel layer, and the display driver displays the corresponding application interfaces in different display areas of the display screen.
  • the application layer contains at least one application that can display corresponding icon controls in the display, such as: live TV application icon control, video on demand application icon control, media center application Program icon controls, application center icon controls, game application icon controls, etc.
  • the live TV application may provide live TV from different sources.
  • a live TV application may provide a TV signal using input from cable, over-the-air, satellite services, or other types of live TV services.
  • the live TV application may display the video of the live TV signal on the display device 200 .
  • a video-on-demand application may provide video from various storage sources. Unlike live TV applications, video-on-demand provides video from certain storage sources; for example, video-on-demand content can come from server-side cloud storage or from local hard disk storage containing existing video programs.
  • the media center application may provide various multimedia content playback applications.
  • a media center may provide services other than live TV or video-on-demand, where users can access various images or audio through a media center application.
  • the application center may provide storage of various applications.
  • An application can be a game or any other application that is related to a computer system or other device but can be run on a smart TV.
  • the application center can obtain these applications from various sources, store them in local storage, and then run them on the display device 200 .
  • the angle at which the camera is located is related to the field of view of the camera.
  • the gimbal is a supporting device for installing and fixing the camera. Gimbals are divided into fixed gimbals and electric gimbals; among them, the electric gimbal is suitable for large-scale scanning and shooting, and it can expand the camera's field of view.
  • the camera of the display device is installed on the electric pan/tilt, and the display device controller is used to control the electric pan/tilt, so that the camera can shoot at multiple angles.
  • the electric gimbal can be a horizontal rotating gimbal that can only rotate left and right, or an omnidirectional gimbal that can rotate left and right as well as up and down.
  • two motors are installed in the omnidirectional pan/tilt head, which are used to drive the pan/tilt head to rotate in the horizontal direction and the vertical direction respectively, so as to change the angle of the camera.
  • the limit angle that the camera can rotate in the horizontal direction and/or the vertical direction can be set by the user according to the needs.
  • the rotatable angle of the camera in the horizontal direction may range from 0° to 120°, where 0° and 120° are the limit angles corresponding to the two horizontal rotation directions (leftward and rightward), respectively; the rotatable angle of the camera in the vertical direction may range from 0° to 180°, where 0° and 180° are the limit angles corresponding to the two vertical rotation directions (upward and downward), respectively.
  • FIG. 6 to FIG. 11 are schematic diagrams exemplarily showing the angle of the camera in this application. FIG. 6 exemplarily shows the state when the tilt angle of the camera in the vertical direction is 0°, FIG. 7 the state when that tilt angle is 90°, and FIG. 8 the state when it is 105°; FIG. 9 exemplarily shows the state when the horizontal angle of the camera is 0°, FIG. 10 the state when that angle is 60°, and FIG. 11 the state when it is 120°.
  • the user can control the rotation of the PTZ by operating a designated button on the control device to adjust the angle at which the camera is located.
  • the user can operate the "Left" and "Right" direction keys on the control device to input user operations for controlling the leftward and rightward rotation of the PTZ, and can operate the "Down" and "Up" direction keys to input user operations for controlling the downward and upward rotation of the gimbal.
  • when the display device receives the aforementioned user operation for adjusting the camera angle, the motor in the PTZ is controlled to rotate, driving the PTZ to rotate in the target adjustment direction indicated by the user operation.
  • the controller performs different processing to adjust the angle of the camera according to different operation modes of the user on the above-mentioned designated keys.
  • in the first operation mode, in response to the received user operation, the controller controls the camera to make an incremental adjustment from its current angle toward the target adjustment direction indicated by the operated key. For example, if the target adjustment direction indicated by the key is "right" and the camera is at 15° when the user's operation is received, the camera is adjusted 10° to the right on the basis of 15°, that is, to 25°.
  • the first operation mode may be a user's short-pressing operation on a designated key. The user can perform a short press operation continuously, so that the controller controls the camera to make continuous incremental adjustments to the target adjustment direction.
  • in the second operation mode, in response to the received user operation, the controller adjusts the camera to the limit angle corresponding to the target adjustment direction indicated by the operated key. For example, if the target adjustment direction is "right" and the limit angle corresponding to "right" is 120°, the camera angle is adjusted to the right until it reaches 120°.
  • the second operation mode may be a user's long-pressing operation on a designated key. Different from the processing logic of adjusting the camera angle through a short press operation, the user can perform a long press operation to adjust the camera angle to the limit angle corresponding to a certain direction at one time.
  • since the controller makes an incremental adjustment based on the camera's current angle each time it receives a short-press operation, the problem may arise that the angle adjustment does not reach the intended position.
  • for example, suppose the controller receives a first short-press operation when the camera is at 15°. In response, the controller controls the camera to rotate 10° to the right on the basis of 15°, that is, toward 25°. If the controller then receives a second short-press operation while the camera is only at 18°, that is, before it has reached 25°, the controller will rotate the camera 10° to the right on the basis of 18°, adjusting it to 28° instead of 35°, resulting in the angle not being adjusted as intended.
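As a minimal sketch (in Python, with hypothetical names; the patent does not supply code), the naive logic that produces this problem can be written as:

```python
STEP = 10  # preset step angle, in degrees, as in the example above

def naive_adjust(current_angle, direction):
    """Compute a target by incrementing from the camera's *actual* current angle."""
    return current_angle + STEP if direction == "right" else current_angle - STEP

# First short press: camera is at 15 deg, so the target becomes 25 deg.
first_target = naive_adjust(15, "right")   # 25

# A second short press arrives while the camera has only reached 18 deg,
# so the new target is 28 deg instead of the intended 35 deg.
second_target = naive_adjust(18, "right")  # 28
```

Because the increment is taken from the motor's lagging position, rapid presses silently lose part of each step.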
  • to solve this problem, the controller of the display device is configured to: receive a user operation for adjusting the angle of the camera, the user operation carrying identifiers indicating the target adjustment direction and the operation mode; if the indicated operation mode is the first operation mode, determine the target angle according to the target adjustment direction and the saved starting angle, and adjust the camera angle to the target angle, wherein the saved starting angle is the last determined target angle or the angle at which the camera was located when the last adjustment was terminated.
  • the first operation mode may be a user's short-press operation on a designated key
  • the second operation mode may be a user's long-press operation on the designated key.
  • the remote control sends the user's operation information to the display device, and the operation information includes an identifier for indicating the target adjustment direction and an identifier for indicating the operation mode.
  • the identifier used to indicate the target adjustment direction may be a key code
  • the identifier used to indicate the operation mode may be a count value. For example, when the count value in the operation information is 0, it indicates that the operation is a short-press operation, and when the count value in the operation information is greater than 0, the operation is a long-press operation.
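Decoding this operation information might look roughly as follows (a sketch; the key-code values are illustrative assumptions, and only the count-value convention comes from the text above):

```python
# Illustrative key-code table; actual remote-control key codes will differ.
KEY_DIRECTION = {0x25: "left", 0x27: "right", 0x26: "up", 0x28: "down"}

def decode_operation(key_code, count_value):
    """Return (target adjustment direction, operation mode) from operation info.

    Per the convention above: a count value of 0 means a short press
    (first operation mode); a count value greater than 0 means a
    long press (second operation mode)."""
    direction = KEY_DIRECTION[key_code]
    mode = "short_press" if count_value == 0 else "long_press"
    return direction, mode
```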
  • the target angle is obtained by adding or subtracting the starting angle and a preset step angle.
  • the preset step angle is 10° or other values as in the above example.
  • when the target adjustment direction is "rightward", the preset step angle is added to the starting angle to obtain the target angle; when the target adjustment direction is "leftward", the preset step angle is subtracted from the starting angle.
  • for example, with a starting angle of 15° and a step of 10°, when the target adjustment direction is "rightward" the target angle is determined to be 25°, and when the target adjustment direction is "leftward" the target angle is determined to be 5°.
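The remedy described here, computing each new target from the saved starting angle rather than from the motor's actual position, can be sketched as follows (names are hypothetical):

```python
STEP = 10  # preset step angle in degrees

class AngleAdjuster:
    def __init__(self, start_angle):
        self.start_angle = start_angle  # saved starting angle

    def short_press(self, direction):
        """First operation mode: derive the target from the saved starting
        angle, then save the determined target for the next press."""
        if direction == "right":
            target = self.start_angle + STEP
        else:  # "left"
            target = self.start_angle - STEP
        self.start_angle = target
        return target

adj = AngleAdjuster(15)
adj.short_press("right")  # 25, regardless of where the motor actually is
adj.short_press("right")  # 35, not the 28 the naive scheme would give
```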
  • the determined target angle is saved in the system, or, after the camera angle adjustment is terminated, the camera angle at termination is saved in the system as the starting angle for the next adjustment.
  • if the operation mode indicated by the identifier is the second operation mode, the limit angle corresponding to the target adjustment direction is used as the target angle. For example, if the limit angle for "rightward" adjustment is 120°, the camera is adjusted with the limit angle of 120° as the target angle.
  • in this way, the user can operate the designated key in different operation modes to input user operations for adjusting the camera angle: a user operation input in the first operation mode controls the camera to make an incremental adjustment from the starting angle toward the target adjustment direction, while a user operation input in the second operation mode controls the camera to rotate in one step to the limit angle corresponding to the target adjustment direction.
  • since the controller, each time it determines a target angle or terminates adjusting the camera angle, saves the determined target angle or the camera angle at termination as the starting angle for the next adjustment, when a user operation based on the first operation mode is received, a new target angle is determined according to the target adjustment direction and the saved starting angle, and the camera is adjusted to that target angle. This avoids the problem of the angle not being adjusted as intended when the user repeatedly short-presses the designated key.
  • for example, upon the first short-press operation, the controller determines the target angle to be 25° from the starting angle of 15° and the preset step angle of 10°, starts rotating the camera to the right toward 25°, and saves the determined target angle of 25° as the starting angle. When the controller receives the second short-press operation, although the camera has actually only reached 18°, the controller uses the saved starting angle of 25° and the preset step angle of 10° to determine a new target angle of 35°. That is, the camera angle will be adjusted to 35° rather than the 28° calculated from 18°, avoiding the problem of the angle not being adjusted in place when the user presses the designated key several times in a row.
  • in the second operation mode, the controller, according to the target adjustment direction indicated by the user's operation, controls the camera to rotate smoothly and at a uniform speed to the limit angle corresponding to that direction. Thus, the user can adjust the camera angle to the limit angle in a single long-press operation.
  • the controller of the display device is further configured to: before adjusting the angle of the camera to the limit angle corresponding to the target adjustment direction, detect whether to receive a message for terminating the A user operation for adjusting the angle of the camera; when receiving a user operation for terminating the adjustment of the angle of the camera, the adjustment of the angle of the camera is terminated.
  • the user operation for terminating the adjustment of the angle of the camera may be a lifting operation after receiving a long-press operation on the designated key.
  • the controller of the display device is further configured to: in the process of adjusting the angle of the camera to the limit angle corresponding to the target adjustment direction, periodically obtain the current angle of the camera, and determine whether the current angle of the camera reaches the target The limit angle corresponding to the adjustment direction; when the current angle of the camera reaches the limit angle corresponding to the target adjustment direction, the adjustment of the angle of the camera is terminated.
  • during a long press, the display device controller controls the camera to rotate in the target adjustment direction; when it detects that the user has lifted the designated key, or that the camera angle has reached the limit angle corresponding to the target adjustment direction, the rotation of the camera is terminated.
  • the camera's current angle reaching the limit angle corresponding to the target adjustment direction includes the case where the current angle exactly matches the limit angle, and/or the case where the difference between the current angle and the limit angle is less than a preset value.
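That tolerance test can be sketched as follows (the 1° tolerance is an assumed value, not from the source):

```python
TOLERANCE = 1.0  # assumed preset value, in degrees

def limit_reached(current_angle, limit_angle, tolerance=TOLERANCE):
    """True when the current angle matches the limit exactly, or the
    difference between them is within the preset tolerance."""
    return abs(limit_angle - current_angle) <= tolerance
```

Allowing a small tolerance avoids the polling loop spinning forever when the motor stops a fraction of a degree short of the mechanical limit.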
  • the main thread receives the user operation for adjusting the camera angle, such as the operation information sent by the remote control, and sends the received user operation to CameraSettingActivity.
  • the user operation is repackaged so that it carries the identifier indicating the target adjustment direction and the identifier indicating the operation mode, and, according to the operation mode, the repackaged user operation is assigned to different sub-threads for processing. Specifically, if it is a user operation based on the first operation mode, the repackaged user operation is added as a pending message to the pending queue of the first sub-thread, and the first sub-thread obtains and processes the pending message from the pending queue; if it is a user operation based on the second operation mode, the repackaged user operation is sent directly to the second sub-thread for processing.
  • the first sub-thread is specifically used to: obtain the pending user operation for adjusting the camera angle from the pending queue; obtain the target adjustment direction from the identifier carried in the user operation; determine the target angle of this operation according to the target adjustment direction and the saved starting angle; and call the method for adjusting the camera angle, through which the camera angle is adjusted to the target angle.
  • when the camera angle reaches the target angle, the first sub-thread is closed.
  • the second sub-thread is specifically used to: receive the user operation for adjusting the camera angle sent by CameraSettingActivity; obtain the target adjustment direction from the identifier carried in the user operation; and call the method for adjusting the camera angle, through which the camera angle is adjusted to the limit angle corresponding to the target adjustment direction. In the process of adjusting toward the limit angle, the current camera angle is obtained periodically, and it is determined whether that current angle has reached the limit angle corresponding to the target adjustment direction; when it has, the adjustment of the camera angle is terminated. When the adjustment of the camera angle is completed or terminated, the second sub-thread is closed.
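The division of labor between the two sub-threads can be sketched roughly like this (the threading model, queue, and all names are assumptions for illustration; the real implementation runs inside CameraSettingActivity):

```python
import queue
import threading

pending = queue.Queue()  # pending queue consumed by the first sub-thread
handled = []             # record of operations actually processed

def first_sub_thread():
    """Drains short-press operations from the pending queue one by one."""
    while True:
        op = pending.get()
        if op is None:   # sentinel: close the first sub-thread
            break
        handled.append(("short_press", op["direction"]))

def dispatch(operation):
    """Route a repackaged user operation to the appropriate sub-thread."""
    if operation["mode"] == "short_press":
        pending.put(operation)  # queued as a pending message
    else:
        # Long press: sent directly to a second sub-thread for processing.
        t = threading.Thread(
            target=lambda: handled.append(("long_press", operation["direction"])))
        t.start()
        t.join()

worker = threading.Thread(target=first_sub_thread)
worker.start()
dispatch({"mode": "short_press", "direction": "right"})
dispatch({"mode": "long_press", "direction": "left"})
pending.put(None)  # close the first sub-thread
worker.join()
```

Queueing the short presses keeps them strictly ordered, which matters because each one updates the shared saved starting angle.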
  • in some embodiments, if the rotatable angle of the camera in the target adjustment direction is within the adjustment error, the camera angle will not be adjusted. Herein, the rotatable angle of the camera in the target adjustment direction is also referred to as the remaining angle, and the adjustment error is also referred to as the preset minimum angle.
  • the controller of the display device is further configured to: before determining the target angle according to the target adjustment direction and the saved starting angle, calculate the remaining angle according to the limit angle corresponding to the target adjustment direction and the starting angle; determine whether the remaining angle is greater than the preset minimum angle; if the remaining angle is greater than the preset minimum angle, execute the step of determining the target angle; if it is not, generate and display an interface prompt to show that the starting angle has reached the limit angle corresponding to the target adjustment direction.
  • for example, suppose the target adjustment direction is "right", the limit angle corresponding to "right" is 120°, the saved starting angle is 118°, and the preset minimum angle is 3°. The remaining angle calculated from the corresponding limit angle and the saved starting angle is 2°, which is smaller than the preset minimum angle; therefore the step of determining the target angle is not performed, and instead an interface prompt is generated and displayed to show that the starting angle has reached the limit angle corresponding to the target adjustment direction.
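The guard in this example can be sketched as follows (names are hypothetical; the 3° minimum comes from the example above):

```python
PRESET_MIN_ANGLE = 3  # degrees, from the example above

def step_allowed(limit_angle, start_angle):
    """Return True if the remaining angle still exceeds the preset minimum,
    False if an interface prompt should be shown instead."""
    remaining = abs(limit_angle - start_angle)
    return remaining > PRESET_MIN_ANGLE

step_allowed(120, 118)  # False: 2 deg remain, less than 3 deg -> show prompt
step_allowed(120, 100)  # True: 20 deg remain -> proceed to the target angle
```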
  • the embodiments of the present application further provide a method for adjusting the angle of a camera.
  • the method is applied to the display device provided by the above embodiments, and the execution subject of the method includes but is not limited to a controller of the display device.
  • FIG. 12 is a flowchart of a method for adjusting the angle of a camera according to an exemplary embodiment of the present application. As shown in FIG. 12 , the method may include:
  • Step 121: receive a user operation for adjusting the angle of the camera, where the user operation carries identifiers indicating the target adjustment direction and the operation mode.
  • Step 122: if the operation mode indicated by the identifier is the first operation mode, determine the target angle according to the target adjustment direction and the saved starting angle, and adjust the camera angle to the target angle; the saved starting angle is the target angle determined last time or the angle at which the camera was located when the adjustment was terminated last time.
  • in some embodiments, the first operation mode is a user's short-press operation on a designated key and the second operation mode is a user's long-press operation on the designated key, where the designated key is the key used to adjust the camera angle.
  • before determining the target angle according to the target adjustment direction and the saved starting angle, the method further includes: calculating the remaining angle according to the limit angle corresponding to the target adjustment direction and the starting angle; determining whether the remaining angle is greater than the preset minimum angle; if the remaining angle is greater than the preset minimum angle, executing the step of determining the target angle; if it is not, generating and displaying an interface prompt to show that the starting angle has reached the limit angle corresponding to the target adjustment direction.
  • determining the target angle according to the target adjustment direction and the saved starting angle includes: adding the preset step angle to, or subtracting it from, the starting angle according to the target adjustment direction, to obtain the target angle.
  • Step 123: if the operation mode indicated by the identifier is the second operation mode, adjust the camera angle to the limit angle corresponding to the target adjustment direction.
  • before adjusting the camera angle to the limit angle corresponding to the target adjustment direction, the method further includes: detecting whether a user operation for terminating the adjustment of the camera angle is received; when such a user operation is received, terminating the adjustment of the camera angle.
  • the user operation for terminating the adjustment of the angle of the camera is a lifting operation after receiving a long-press operation on the designated key.
  • the process of adjusting the angle of the camera to the limit angle corresponding to the target adjustment direction further includes: periodically acquiring the current angle of the camera; judging whether the current angle of the camera reaches The limit angle corresponding to the target adjustment direction; when the current angle of the camera reaches the limit angle corresponding to the target adjustment direction, the adjustment of the angle of the camera is terminated.
  • adjusting the camera angle to the target angle or to the limit angle corresponding to the target adjustment direction includes: controlling the camera to rotate at a predetermined, constant speed to the target angle or the limit angle.
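Constant-speed rotation toward a target can be sketched as a simple stepping loop (the speed of 5° per tick is an assumed value):

```python
def rotate_to(current, target, speed=5.0):
    """Yield successive camera angles while rotating at a constant speed
    (degrees per tick) until landing exactly on the target."""
    step = speed if target >= current else -speed
    angle = current
    while abs(target - angle) > speed:
        angle += step
        yield angle
    yield target  # final tick lands exactly on the target angle

list(rotate_to(15, 25))  # [20.0, 25]
```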
  • after the target angle is determined according to the target adjustment direction and the saved starting angle, the target angle is saved as the starting angle; or, after the adjustment of the camera angle is terminated, the angle at which the camera is located is saved as the starting angle.
  • the angle at which the camera is located is related to the field of view of the camera.
  • the gimbal is a supporting device for installing and fixing the camera. Gimbals are divided into fixed gimbals and electric gimbals; among them, the electric gimbal is suitable for large-scale scanning and shooting, and it can expand the camera's field of view.
  • the electric gimbal can be a horizontal rotating gimbal that can only rotate left and right, or an omnidirectional gimbal that can rotate left and right as well as up and down.
  • two motors are installed in the omnidirectional pan/tilt head, which are used to drive the pan/tilt head to rotate in the horizontal direction and the vertical direction respectively, so as to change the angle of the camera.
  • the camera installed on the electric PTZ is also called the PTZ camera.
  • the display device 200 includes a detector 230, and the detector 230 includes a camera, and the camera may be a pan-tilt camera 232 as shown in FIG. 13a.
  • the display device 200 includes an external device interface 240 through which an external device can access the display device 200 .
  • the external PTZ camera 232 can be connected to the controller 220 of the display device through the external device interface 240 .
  • the PTZ can be controlled by the display device controller, so that the camera can shoot at multiple angles.
  • the limit angle that the camera can rotate in the horizontal and/or vertical direction can be designed according to the needs.
  • the rotatable angle of the camera in the horizontal direction may range from 0° to 120°, where 0° and 120° are the limit angles corresponding to the two horizontal rotation directions (leftward and rightward), respectively; the rotatable angle of the camera in the vertical direction may range from 0° to 180°, where 0° and 180° are the limit angles corresponding to the two vertical rotation directions (upward and downward), respectively.
  • FIG. 6 to FIG. 11 are schematic diagrams exemplarily showing the angle of the camera: FIG. 6 shows the state when the tilt angle of the camera in the vertical direction is 0°, FIG. 7 the state when that tilt angle is 90°, and FIG. 8 the state when it is 105°; FIG. 9 shows the state when the horizontal angle of the camera is 0°, FIG. 10 the state when that angle is 60°, and FIG. 11 the state when it is 120°.
  • the camera being tilted up means that the tilt angle of the camera in the vertical direction is greater than a preset minimum angle; the camera being lowered means that the tilt angle of the camera in the vertical direction is smaller than a preset maximum angle.
  • the preset minimum angle and the preset maximum angle may be the same angle value, or may be different angle values.
  • the state when the camera is raised may be the state shown in FIG. 7
  • the state when the camera is lowered may be the state shown in FIG. 9.
  • during periods when the user is not using the camera, if the camera remains tilted up, then from the perspective of user experience it easily creates the illusion that the camera is working, causing confusion for the user. Moreover, in the event of a user misoperation or a system error, the camera may be turned on by mistake without the user's knowledge, which can easily lead to leakage of user privacy and is not conducive to the user experience.
  • to address this, when the application associated with the camera is switched to the foreground, or when the foreground application calls the interface for starting the camera, the camera is controlled to tilt up; when the display device is turned off or the application associated with the camera exits the foreground, the camera is controlled to lower.
  • however, a meaningless lowering and raising occurs when the foreground application switches between two applications that are both associated with the camera. Specifically, when the foreground application is switched from the first application to the second application, the camera is first lowered because the first application exits the foreground, and then raised again because the second application enters the foreground. This lower-then-raise process is meaningless: it increases the working frequency of the motor, reduces its service life, and prolongs the user's waiting time, which is not conducive to the user experience.
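The optimization, keeping the camera angle when a close instruction is quickly followed by an open instruction, can be sketched as a debounce (the 0.5 s preset duration, the injectable clock, and all names are assumptions for illustration):

```python
PRESET_DURATION = 0.5  # seconds; assumed preset duration

class GimbalController:
    """Defers the physical lowering after a close instruction; if an open
    instruction arrives within the preset duration, the current angle is
    kept and the meaningless lower-then-raise cycle is skipped."""

    def __init__(self, clock):
        self.clock = clock          # injectable time source, for testing
        self.moves = []             # motor commands actually issued
        self.pending_close = None   # timestamp of a close not yet acted on

    def on_close(self):
        self.pending_close = self.clock()

    def on_open(self):
        if (self.pending_close is not None
                and self.clock() - self.pending_close < PRESET_DURATION):
            self.pending_close = None  # debounced: maintain the current angle
            return
        self._flush()
        self.moves.append("raise")

    def _flush(self):
        if self.pending_close is not None:
            self.moves.append("lower")
            self.pending_close = None

now = [0.0]
g = GimbalController(lambda: now[0])
g.on_close(); now[0] = 0.1; g.on_open()   # app switch: no motor commands
g.on_close(); now[0] = 2.0; g.on_open()   # slow reopen: lower, then raise
```

Deferring the lowering rather than acting on the close instruction immediately is what lets a fast app switch complete with no motor motion at all.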
  • An embodiment of the present application provides a display device. As shown in FIG. 2 and FIG. 13a or 6, the display device includes a display 275 for displaying a user interface, a camera 232 for collecting local images, and a controller 220 connected to the camera. The controller 220 is configured to: when an instruction from an application to close the camera and an instruction from an application to open the camera are received in succession, control the camera to maintain its current angle, where "received in succession" means that the time interval between receiving the two instructions is less than a preset duration.
  • For example, an instruction from the first application or the second application to turn on the camera is received within a preset duration after the instruction from the first application to turn off the camera is received.
  • Or, an instruction from the first application to turn off the camera is received within a preset duration after the instruction from the first application to turn on the camera is received.
  • the first application and the second application are different applications associated with the camera.
  • When an instruction from the first application or the second application to turn on the camera is received within the preset duration after the instruction from the first application to turn off the camera, the camera is controlled to remain at the second preset angle.
  • Otherwise, the angle of the camera is adjusted to the first preset angle.
  • The controller 220 is configured to execute the steps shown in FIG. 14:
  • Step 801: when the camera is in an off state, receive an instruction from the first application to start the camera.
  • Step 802: in response to the instruction from the first application to start the camera, start the camera and adjust the angle of the camera from the first preset angle to the second preset angle.
  • The first preset angle is the pitch angle of the camera after it is lowered, and the second preset angle is the pitch angle of the camera after it is raised.
  • Step 803: when the camera is in an on state, receive an instruction from the first application to turn off the camera. The first application sends the close instruction when it is closed or exits the foreground.
  • Step 804: in response to the instruction from the first application to turn off the camera, turn off the camera and monitor whether an instruction from the first application or the second application to turn on the camera is received within the preset duration;
  • Step 805: if an instruction from the first application or the second application to turn on the camera is received within the preset duration, turn on the camera and keep the camera at the second preset angle. The first application or the second application sends the open instruction after it is started or switched to the foreground.
  • Step 806: if no instruction from the first application or the second application to turn on the camera is received within the preset duration, adjust the angle of the camera to the first preset angle.
  • In some embodiments, when the first application starts, it sends an instruction to turn on the camera to the system's CameraService service; after receiving this instruction, the CameraService service calls the method resetTorecorded() provided by the CameraControl service for controlling the tilting of the camera, raising the camera to the second preset angle.
  • When the first application is closed or exits the foreground, it sends an instruction to the CameraService service to close the camera. The CameraService service starts timing after receiving this instruction and simultaneously monitors whether an instruction to turn on the camera is received from the first application or the second application. If the first application or the second application is starting or entering the foreground, it sends the CameraService service an instruction to turn on the camera. If such an instruction is received before the timing reaches the preset duration (such as 3s), the process ends; otherwise, the CameraService service stops timing once the preset duration is reached and calls the method reset() provided by the CameraControl service for controlling the camera to lower, lowering the camera to the first preset angle.
  • both the "Magic Mirror” application and the “Chat” application are applications associated with the camera.
  • the "Magic Mirror” app After the user opens the “Magic Mirror” app or switches the “Magic Mirror” app to the foreground by operation, the “Magic Mirror” app sends an instruction to turn on the camera to the CameraService service of the system; after the CameraService service receives the instruction, it calls the CameraControl service.
  • the method resetTorecorded() provided by the service for controlling the tilt of the camera is used to control the tilt of the camera to the second preset angle.
  • the "Magic Mirror” application When the user switches the “Magic Mirror” application to the foreground and switches the “Chat” application to the foreground, the "Magic Mirror” application will send an instruction to turn off the camera to the CameraService service, and the “Chat” application will send an instruction to the CameraService service. Send an instruction to turn on the camera. After the CameraService service receives the command to close the “Magic Mirror” application, it turns off the camera and starts timing.
  • the camera will be kept on, and the camera will be maintained at the second preset angle; if it is not received before the timing reaches the preset duration To the "Magic Mirror” application's opening command, the camera will be turned off, and the camera will be controlled to land at the first preset angle at the same time.
  • the preset duration such as 3s
  • That is, if the CameraService service receives an instruction from the first application or the second application to start the camera within the preset duration after the first application closes the camera, it does not lower the camera but keeps the camera at the second preset angle, thereby avoiding meaningless lowering and raising caused by application switching.
  • In some embodiments, when the CameraService service receives the instruction from the first application to turn off the camera, it generates a landing task and adds it to a task queue, where the landing task refers to the task of adjusting the angle of the camera to the first preset angle;
  • when the waiting time of the pending landing task in the task queue reaches the preset duration, the landing task is read and executed, that is, the angle of the camera is adjusted to the first preset angle;
  • when the CameraService service receives an instruction from the first application or the second application to turn on the camera, the pending landing task in the task queue is deleted.
  • A pending landing task is read and executed by the CameraService service only when its waiting time in the task queue reaches the preset duration. Therefore, if a landing task is still pending in the task queue when the CameraService service receives the open instruction from the first application or the second application, the waiting time of that pending task must be less than the preset duration; deleting the pending landing task from the task queue prevents it from being read and executed once its waiting time reaches the preset duration.
  • In this way, if the CameraService service receives an instruction from the first application or the second application to start the camera within the preset duration after the first application closes the camera, it does not lower the camera but keeps the camera at the second preset angle, avoiding meaningless lowering and raising caused by application switching.
  • If the CameraService service receives an instruction from an application to turn on the camera when there is no pending landing task in the task queue, it turns on the camera and adjusts the angle of the camera from the first preset angle to the second preset angle.
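The task-queue embodiment above can be sketched deterministically as follows. This is an illustrative Python sketch under assumed names and angle values; `tick(now)` stands in for whatever mechanism the service uses to poll the queue, and timestamps are passed explicitly so the behavior is reproducible.

```python
from collections import deque

# Assumed illustrative angle values.
FIRST_PRESET_ANGLE = 0    # lowered
SECOND_PRESET_ANGLE = 100  # raised

class CameraTaskQueue:
    """Sketch of the landing-task queue: a close instruction enqueues a
    landing task stamped with its enqueue time; an open instruction
    deletes the pending task; tick(now) executes any task whose waiting
    time has reached the preset duration."""

    def __init__(self, preset_duration=3.0):
        self.preset_duration = preset_duration
        self.angle = FIRST_PRESET_ANGLE
        self.tasks = deque()  # enqueue timestamps of pending landing tasks

    def on_close(self, now):
        # Close instruction received: enqueue a landing task.
        self.tasks.append(now)

    def on_open(self, now):
        # Open instruction received: the pending landing task (whose
        # waiting time is necessarily < preset duration) is deleted, and
        # the camera is kept/placed at the second preset angle.
        self.tasks.clear()
        self.angle = SECOND_PRESET_ANGLE

    def tick(self, now):
        # Execute landing tasks whose waiting time reached the duration.
        while self.tasks and now - self.tasks[0] >= self.preset_duration:
            self.tasks.popleft()
            self.angle = FIRST_PRESET_ANGLE
```

Because deletion happens before the waiting time can reach the preset duration, an open instruction arriving during the window guarantees the landing task never runs.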
  • Conversely, when an instruction from the first application to turn off the camera is received within the preset duration after the instruction from the first application to turn on the camera, the camera is controlled to remain at the first preset angle.
  • Otherwise, the angle of the camera is adjusted to the second preset angle.
  • The controller 220 is further configured to perform the steps shown in FIG. 15:
  • Step 901: when the camera is in an on state, receive an instruction from the first application to turn off the camera.
  • Step 902: in response to the instruction from the first application to turn off the camera, turn off the camera and adjust the angle of the camera from the second preset angle to the first preset angle.
  • Step 903: when the camera is in an off state, receive an instruction from the first application to turn on the camera. When the first application enters the foreground, it sends the instruction to turn on the camera.
  • Step 904: in response to the instruction from the first application to turn on the camera, turn on the camera and monitor whether an instruction from the first application to turn off the camera is received within the preset duration;
  • Step 905: if an instruction from the first application to turn off the camera is received within the preset duration, turn off the camera and keep the camera at the first preset angle. The first application sends the close instruction when it is closed or exits the foreground.
  • Step 906: if no instruction from the first application to turn off the camera is received within the preset duration, adjust the angle of the camera to the second preset angle.
  • In some embodiments, when the CameraService service receives an instruction from the first application to turn on the camera, it generates a tilt-up task and adds it to the task queue, where the tilt-up task refers to the task of adjusting the angle of the camera to the second preset angle. When the waiting time of the pending tilt-up task in the task queue reaches the preset duration, the tilt-up task is read and executed, that is, the angle of the camera is adjusted to the second preset angle. When the CameraService service receives an instruction from the first application to turn off the camera, the pending tilt-up task in the task queue is deleted.
  • A pending tilt-up task is read and executed by the CameraService service only when its waiting time in the task queue reaches the preset duration. Therefore, if a tilt-up task is still pending in the task queue when the CameraService service receives the close instruction from the first application, the waiting time of that pending task must be less than the preset duration; deleting it prevents it from being read and executed once its waiting time reaches the preset duration. That is to say, if the CameraService service receives the close instruction from the first application within the preset duration after the first application opened the camera, it does not raise the camera but keeps the camera at the first preset angle, thereby avoiding meaningless lowering and raising.
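The mirrored mechanism of FIG. 15 can be sketched symmetrically: here the *raise* is the deferred action, deleted by a close instruction arriving within the window. As before, this is an illustrative Python sketch with assumed names and angle values, and explicit timestamps keep it deterministic.

```python
# Assumed illustrative angle values.
FIRST_PRESET_ANGLE = 0    # lowered
SECOND_PRESET_ANGLE = 100  # raised

class TiltUpDebouncer:
    """Sketch of steps 901-906 / the tilt-up task queue: an open
    instruction enqueues a tilt-up task; a close instruction arriving
    within the preset duration deletes it, so the camera never leaves
    the first preset angle."""

    def __init__(self, preset_duration=3.0):
        self.preset_duration = preset_duration
        self.angle = FIRST_PRESET_ANGLE
        self.pending = None  # enqueue time of the pending tilt-up task

    def on_open(self, now):
        # Steps 903/904: power on and schedule (but do not yet run)
        # the tilt-up task.
        self.pending = now

    def on_close(self, now):
        # Step 905: a close within the window deletes the tilt-up task;
        # the camera stays at the first preset angle.
        self.pending = None
        self.angle = FIRST_PRESET_ANGLE

    def tick(self, now):
        # Step 906: no close arrived in time; execute the tilt-up task.
        if self.pending is not None and now - self.pending >= self.preset_duration:
            self.pending = None
            self.angle = SECOND_PRESET_ANGLE
```

This symmetry is the whole point of the two embodiments: each direction of motor travel is deferred just long enough to absorb the opposite instruction.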
  • The present invention also provides a computer storage medium. The computer storage medium can store a program, and when the program is executed, it can perform some or all of the steps in the various embodiments of the camera angle adjustment method provided by the present invention.
  • The storage medium may be a magnetic disk, an optical disk, a read-only memory (ROM), a random access memory (RAM), or the like.
  • The technology in the embodiments of the present invention can be implemented by means of software plus a necessary general-purpose hardware platform.
  • The technical solutions in the embodiments of the present invention, in essence or in the parts that contribute to the prior art, may be embodied in the form of a software product. The computer software product may be stored in a storage medium, such as a ROM/RAM, magnetic disk, or optical disk, and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to execute the methods described in the various embodiments, or in some parts of the embodiments, of the present invention.

Abstract

The present invention relates to a display device and a display method. A user operation input according to a first operation method can control a camera to perform, on the basis of a starting angle, incremental adjustments in a target adjustment direction, and a user operation input according to a second operation method can control the camera to rotate, in a single operation, to a limit angle corresponding to the target adjustment direction.
PCT/CN2021/096429 2020-08-07 2021-05-27 Dispositif et procédé d'affichage WO2022028060A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202180053612.5A CN116264864A (zh) 2020-08-07 2021-05-27 一种显示设备及显示方法

Applications Claiming Priority (6)

Application Number Priority Date Filing Date Title
CN202010789824.7A CN111970548B (zh) 2020-08-07 2020-08-07 显示设备及调整摄像头角度的方法
CN202010789824.7 2020-08-07
CN202010851738.4 2020-08-21
CN202010851738 2020-08-21
CN202110156378.0A CN112954425A (zh) 2020-08-21 2021-02-04 显示设备及摄像头控制方法
CN202110156378.0 2021-02-04

Publications (1)

Publication Number Publication Date
WO2022028060A1 true WO2022028060A1 (fr) 2022-02-10

Family

ID=80119870

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/096429 WO2022028060A1 (fr) 2020-08-07 2021-05-27 Dispositif et procédé d'affichage

Country Status (2)

Country Link
CN (1) CN116264864A (fr)
WO (1) WO2022028060A1 (fr)


Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1739534A2 (fr) * 2005-06-30 2007-01-03 Sony Corporation Dispositif d'interface utilisateur graphique, procédé de saisie opérationnelle et dispositif de communication bidirectionnelle
CN105120162A (zh) * 2015-08-27 2015-12-02 广东欧珀移动通信有限公司 一种摄像头旋转控制方法及终端
CN110213489A (zh) * 2019-06-20 2019-09-06 维沃移动通信有限公司 一种控制方法、装置及终端设备
CN110418050A (zh) * 2018-04-26 2019-11-05 Oppo广东移动通信有限公司 移动终端的摄像头控制方法、装置、移动终端及存储介质
CN111970548A (zh) * 2020-08-07 2020-11-20 海信视像科技股份有限公司 显示设备及调整摄像头角度的方法


Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114739361A (zh) * 2022-02-25 2022-07-12 中国科学院空天信息创新研究院 对地观测方法、装置、电子设备及存储介质
CN114739361B (zh) * 2022-02-25 2023-06-13 中国科学院空天信息创新研究院 对地观测方法、装置、电子设备及存储介质
CN115031694A (zh) * 2022-04-25 2022-09-09 中国科学院空天信息创新研究院 对地观测方法、设备、存储介质及程序产品

Also Published As

Publication number Publication date
CN116264864A (zh) 2023-06-16


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21853847

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21853847

Country of ref document: EP

Kind code of ref document: A1