CN116264864A - Display equipment and display method - Google Patents

Display equipment and display method

Info

Publication number
CN116264864A
Authority
CN
China
Prior art keywords
angle
camera
target
user
application
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202180053612.5A
Other languages
Chinese (zh)
Inventor
李东航
刘晋
姜俊厚
司洪龙
李保成
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hisense Visual Technology Co Ltd
Original Assignee
Hisense Visual Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from CN202010789824.7A external-priority patent/CN111970548B/en
Priority claimed from CN202110156378.0A external-priority patent/CN112954425A/en
Application filed by Hisense Visual Technology Co Ltd filed Critical Hisense Visual Technology Co Ltd
Publication of CN116264864A publication Critical patent/CN116264864A/en
Pending legal-status Critical Current

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Controls And Circuits For Display Device (AREA)
  • Control Of Indicators Other Than Cathode Ray Tubes (AREA)
  • Studio Devices (AREA)

Abstract

The application discloses a display device and a display method, wherein a user operation input in a first operation mode controls a camera to adjust incrementally toward a target adjustment direction from an initial angle, and a user operation input in a second operation mode controls the camera to rotate at once to the limit angle corresponding to the target adjustment direction.

Description

Display equipment and display method
The present application claims priority to Chinese patent application No. 202010851738.4, entitled "Display device and camera control method", filed with the China National Intellectual Property Administration on August 21, 2020, the entire contents of which are incorporated herein by reference; to Chinese patent application No. 202110156378.0, entitled "Display device and camera control method", filed on February 4, 2021, the entire contents of which are incorporated herein by reference; and to Chinese patent application No. 202010789824.7, entitled "Display device and camera control method", filed on August 7, 2020, the entire contents of which are incorporated herein by reference.
Technical Field
The application relates to the technical field of display equipment, in particular to display equipment and a display method.
Background
The display device may provide a user with a play screen such as audio, video, pictures, etc. Today, a display device can provide not only live television program content received through data broadcasting but also various applications and service content such as web videos, web games, and the like to a user.
In order to further meet the personalized demands of users, related technologies configure a camera on a display device, acquire local image data acquired by the camera through a controller of the display device for processing, and realize the functions of video chat, photographing, video recording and the like on the display device.
Disclosure of Invention
The application provides a display device, comprising:
a display for displaying a user interface;
the camera is used for collecting images, and the angle of the camera can be adjusted so as to collect images in different ranges when the camera is at different angles;
and the controller is connected with the camera and is used for:
receiving a user operation for adjusting the angle of the camera, wherein the user operation carries an identifier indicating a target adjustment direction and an operation mode;
if the operation mode indicated by the identifier is a first operation mode, determining a target angle according to the target adjustment direction and a stored starting angle, and adjusting the angle of the camera to the target angle, wherein the stored starting angle is the most recently determined target angle or the angle of the camera when adjustment last stopped;
and if the operation mode indicated by the identifier is a second operation mode, adjusting the angle of the camera to a limit angle corresponding to the target adjustment direction.
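The two-mode control logic claimed above can be sketched as follows. This is a minimal illustration, not the patented implementation: the step size, the limit angles, and all names are assumptions introduced for clarity.

```python
# Illustrative sketch of the claimed two-mode camera-angle control.
# STEP_DEG, MIN_ANGLE/MAX_ANGLE, and all names are assumed for illustration.

MIN_ANGLE, MAX_ANGLE = -90, 90   # assumed mechanical limit angles (degrees)
STEP_DEG = 5                     # assumed increment for the first operation mode

class CameraController:
    def __init__(self):
        self.start_angle = 0     # stored starting angle

    def handle_user_operation(self, direction: str, mode: int) -> int:
        """direction: 'left' or 'right'; mode: 1 = incremental, 2 = jump to limit."""
        sign = -1 if direction == "left" else 1
        if mode == 1:
            # First mode: incremental adjustment from the stored starting angle.
            target = self.start_angle + sign * STEP_DEG
        else:
            # Second mode: rotate directly to the limit angle for that direction.
            target = MIN_ANGLE if sign < 0 else MAX_ANGLE
        target = max(MIN_ANGLE, min(MAX_ANGLE, target))  # clamp to the limits
        self.start_angle = target  # the target becomes the next starting angle
        return target
```

Repeated first-mode operations step the camera from its stored starting angle, while a single second-mode operation jumps it to the corresponding limit angle, matching the two behaviours described in the claim.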
Drawings
In order to more clearly illustrate the technical solutions of the present application, the drawings that are needed in the embodiments will be briefly described below, and it will be obvious to those skilled in the art that other drawings can be obtained from these drawings without inventive effort.
A schematic diagram of an operational scenario between a display device and a control apparatus according to some embodiments is schematically shown in fig. 1;
a hardware configuration block diagram of a display device 200 according to some embodiments is exemplarily shown in fig. 2;
a hardware configuration block diagram of the control device 100 according to some embodiments is exemplarily shown in fig. 3;
A schematic diagram of the software configuration in a display device 200 according to some embodiments is exemplarily shown in fig. 4;
an icon control interface display schematic of an application in a display device 200 according to some embodiments is illustrated in fig. 5;
fig. 6 to 11 exemplarily show state diagrams of the camera at different angles;
FIG. 12 is a flowchart of a method of adjusting camera angles according to an exemplary embodiment of the present application;
FIG. 13a is a schematic diagram of a display device shown in some embodiments of the present application;
FIG. 13b is a schematic diagram of a display device shown in some embodiments of the present application;
fig. 14 is a flowchart exemplarily showing a method for controlling a camera provided in the present application;
fig. 15 is a flowchart exemplarily showing a method for controlling a camera provided in the present application.
Detailed Description
To make the purposes, embodiments, and advantages of the present application clearer, the exemplary embodiments of the present application are described below clearly and completely with reference to the accompanying drawings. It is apparent that the described exemplary embodiments are only some, and not all, of the embodiments of the present application.
Based on the exemplary embodiments described herein, all other embodiments obtained by one of ordinary skill in the art without inventive effort fall within the scope of the claims appended hereto. Furthermore, while the disclosure is presented in the context of one or more exemplary embodiments, it should be appreciated that individual aspects of the disclosure may each constitute a complete embodiment.
It should be noted that the brief description of the terms in the present application is only for convenience in understanding the embodiments described below, and is not intended to limit the embodiments of the present application. Unless otherwise indicated, these terms should be construed in their ordinary and customary meaning.
The terms "first", "second", "third", and the like in the description, in the claims, and in the above-described figures are used to distinguish between same or similar objects or entities, and do not necessarily describe a particular sequential or chronological order unless otherwise indicated. It is to be understood that the terms so used are interchangeable under appropriate circumstances, such that the embodiments of the application can, for example, operate in sequences other than those illustrated or otherwise described herein.
Furthermore, the terms "comprise" and "have," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a product or apparatus that comprises a list of elements is not necessarily limited to those elements expressly listed, but may include other elements not expressly listed or inherent to such product or apparatus.
The term "module" as used in this application refers to any known or later developed hardware, software, firmware, artificial intelligence, fuzzy logic, or combination of hardware and/or software code that is capable of performing the function associated with that element.
The term "remote control" as used in this application refers to a component of an electronic device (such as the display device disclosed in this application) that can typically control that device wirelessly over a relatively short distance. The remote control is typically connected to the electronic device using infrared and/or radio frequency (RF) signals and/or Bluetooth, and may also include functional modules such as WiFi, wireless USB, Bluetooth, and motion sensors. For example, a hand-held touch remote control replaces most of the physical built-in hard keys of a typical remote control device with a touch-screen user interface.
The term "gesture" as used herein refers to a user behavior by which a user expresses an intended idea, action, purpose, and/or result through a change in hand shape or movement of a hand, etc.
A schematic diagram of an operation scenario between a display device and a control apparatus according to an embodiment is exemplarily shown in fig. 1. As shown in fig. 1, a user may operate the display apparatus 200 through the mobile terminal 300 and the control device 100.
In some embodiments, the control apparatus 100 may be a remote control, and communication between the remote control and the display device includes infrared protocol communication, Bluetooth protocol communication, and other short-range communication modes; the display device 200 is controlled wirelessly or by other wired means. The user may control the display device 200 by inputting user instructions through keys on the remote control, voice input, control panel input, etc. For example, the user can input corresponding control instructions through the volume up/down keys, channel control keys, up/down/left/right movement keys, voice input key, menu key, and power key on the remote control to implement the functions of controlling the display device 200.
In some embodiments, mobile terminals, tablet computers, notebook computers, and other smart devices may also be used to control the display device 200. For example, the display device 200 is controlled using an application running on a smart device. The application program, by configuration, can provide various controls to the user in an intuitive User Interface (UI) on a screen associated with the smart device.
In some embodiments, the mobile terminal 300 may install a software application with the display device 200, implement connection communication through a network communication protocol, and achieve the purpose of one-to-one control operation and data communication. Such as: it is possible to implement a control command protocol established between the mobile terminal 300 and the display device 200, synchronize a remote control keyboard to the mobile terminal 300, and implement a function of controlling the display device 200 by controlling a user interface on the mobile terminal 300. The audio/video content displayed on the mobile terminal 300 can also be transmitted to the display device 200, so as to realize the synchronous display function.
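The control command protocol established between the mobile terminal 300 and the display device 200 is not specified in the text. The following is a purely hypothetical sketch of what one such command message might look like; the JSON format and all field names are assumptions introduced for illustration only.

```python
# Hypothetical sketch of a control-command message between the mobile
# terminal 300 and the display device 200. The JSON format and the field
# names ("type", "key", "seq") are assumptions; the patent does not
# specify the actual protocol.
import json

def build_control_command(key: str, seq: int) -> bytes:
    """Serialise a remote-key press for transmission over the network link."""
    msg = {"type": "key_event", "key": key, "seq": seq}
    return json.dumps(msg).encode("utf-8")

def parse_control_command(payload: bytes) -> dict:
    """Decode a received control-command message on the display-device side."""
    return json.loads(payload.decode("utf-8"))
```

A sequence number of this kind is one common way to let the receiver discard duplicate or out-of-order key events on an unreliable link.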
As also shown in fig. 1, the display device 200 is also in data communication with the server 400 via a variety of communication means. The display device 200 may be allowed to communicate via a Local Area Network (LAN), a Wireless Local Area Network (WLAN), and other networks. The server 400 may provide various contents and interactions to the display device 200. By way of example, the display device 200 receives software program updates, or accesses a remotely stored digital media library, by sending and receiving information and through Electronic Program Guide (EPG) interactions. The server 400 may be one cluster or multiple clusters, and may include one or more types of servers. Other web service content such as video on demand and advertising services are provided through the server 400.
The display device 200 may be a liquid crystal display, an OLED display, a projection display device. The particular display device type, size, resolution, etc. are not limited, and those skilled in the art will appreciate that the display device 200 may be modified in performance and configuration as desired.
In addition to the broadcast-receiving television function, the display device 200 may additionally provide a smart network television function with computer support, including but not limited to network television, smart television, Internet Protocol Television (IPTV), and the like.
A hardware configuration block diagram of the display device 200 according to an exemplary embodiment is illustrated in fig. 2.
In some embodiments, at least one of the controller 250, the modem 210, the communicator 220, the detector 230, the input/output interface (first through n-th interfaces 255), the display 275, the audio output interface 285, the memory 260, the power supply 290, the user interface 265, and the external device interface 240 is included in the display apparatus 200.
In some embodiments, the display 275 is configured to receive image signals from the first processor output, and to display video content and images and components of the menu manipulation interface.
In some embodiments, display 275 includes a display screen assembly for presenting pictures, and a drive assembly for driving the display of images.
In some embodiments, the displayed video content may come from broadcast television content, or from various broadcast signals received via wired or wireless communication protocols. Alternatively, various image contents received from a network server via a network communication protocol may be displayed.
In some embodiments, the display 275 is used to present a user-manipulated UI interface generated in the display device 200 and used to control the display device 200.
In some embodiments, depending on the type of display 275, a drive assembly for driving the display is also included.
In some embodiments, display 275 is a projection display and may further include a projection device and a projection screen.
In some embodiments, communicator 220 is a component for communicating with external devices or external servers according to various communication protocol types. For example: the communicator may include at least one of a Wifi chip, a bluetooth communication protocol chip, a wired ethernet communication protocol chip, or other network communication protocol chip or a near field communication protocol chip, and an infrared receiver.
In some embodiments, the display device 200 may establish control signal and data signal transmission and reception between the communicator 220 and the external control device 100 or the content providing device.
In some embodiments, the user interface 265 may be used to receive infrared control signals from the control device 100 (e.g., an infrared remote control, etc.).
In some embodiments, the detector 230 is a component used by the display device 200 to collect signals from, or interact with, the external environment.
In some embodiments, the detector 230 includes an optical receiver, i.e., a sensor for capturing the intensity of ambient light, so that display parameters can be adaptively changed according to the ambient light, etc.
In some embodiments, the detector 230 may further include an image collector, such as a camera, a video camera, etc., which may be used to collect external environmental scenes, collect attributes of a user or interact with a user, adaptively change display parameters, and recognize a user gesture to realize an interaction function with the user.
In some embodiments, the detector 230 may also include a temperature sensor or the like, such as by sensing ambient temperature.
In some embodiments, the display device 200 may adaptively adjust the display color temperature of the image. For example, the display device 200 may be adjusted to display a cooler color temperature when the ambient temperature is high, or a warmer color temperature when the ambient temperature is low.
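The temperature-adaptive color temperature described above can be sketched as a simple mapping. The thresholds and Kelvin values below are assumptions for illustration; the patent does not give concrete figures.

```python
# Illustrative sketch of temperature-adaptive display color temperature.
# The threshold temperatures and Kelvin values are assumed for illustration.

def color_temperature_for(ambient_celsius: float) -> int:
    """Map ambient temperature to a display color temperature in Kelvin.

    Higher Kelvin values look cooler (bluer); lower values look warmer.
    """
    if ambient_celsius >= 30:      # hot room: shift toward a cooler tone
        return 7500
    if ambient_celsius <= 10:      # cold room: shift toward a warmer tone
        return 5000
    return 6500                    # comfortable range: neutral default
```

A real implementation would likely interpolate smoothly rather than switch at hard thresholds, but the direction of the adjustment matches the behaviour described above.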
In some embodiments, the detector 230 may also include a sound collector, such as a microphone, that may be used to receive the user's voice. For example, it may collect a voice signal containing a control instruction from the user for controlling the display device 200, or collect environmental sounds to identify the type of environmental scene, so that the display device 200 can adapt to environmental noise.
In some embodiments, as shown in fig. 2, the input/output interfaces (first through nth interfaces 255) are configured to enable data transfer between the controller 250 and external other devices or other controllers 250. Such as receiving video signal data and audio signal data of an external device, command instruction data, or the like.
In some embodiments, external device interface 240 may include, but is not limited to, the following: any one or more interfaces of a high definition multimedia interface HDMI interface, an analog or data high definition component input interface, a composite video input interface, a USB input interface, an RGB port, and the like can be used. The plurality of interfaces may form a composite input/output interface.
In some embodiments, as shown in fig. 2, the modem 210 is configured to receive the broadcast television signal by a wired or wireless receiving manner, and may perform modulation and demodulation processes such as amplification, mixing, and resonance, and demodulate the audio/video signal from the plurality of wireless or wired broadcast television signals, where the audio/video signal may include a television audio/video signal carried in a television channel frequency selected by a user, and an EPG data signal.
In some embodiments, the frequency point demodulated by the modem 210 is controlled by the controller 250, and the controller 250 may send a control signal according to the user selection, so that the modem responds to the television signal frequency selected by the user and modulates and demodulates the television signal carried by the frequency.
In some embodiments, the broadcast television signal may be classified into a terrestrial broadcast signal, a cable broadcast signal, a satellite broadcast signal, an internet broadcast signal, or the like according to a broadcasting system of the television signal. Or may be differentiated into digital modulation signals, analog modulation signals, etc., depending on the type of modulation. Or it may be classified into digital signals, analog signals, etc. according to the kind of signals.
In some embodiments, the controller 250 and the modem 210 may be located in separate devices, i.e., the modem 210 may also be located in an external device to the main device in which the controller 250 is located, such as an external set-top box or the like. In this way, the set-top box outputs the television audio and video signals modulated and demodulated by the received broadcast television signals to the main body equipment, and the main body equipment receives the audio and video signals through the first input/output interface.
In some embodiments, the controller 250 controls the operation of the display device and responds to user operations through various software control programs stored on the memory. The controller 250 may control the overall operation of the display apparatus 200. For example: in response to receiving a user command to select to display a UI object on the display 275, the controller 250 may perform an operation related to the object selected by the user command.
In some embodiments, the object may be any one of selectable objects, such as a hyperlink or an icon. Operations related to the selected object, such as: displaying an operation of connecting to a hyperlink page, a document, an image, or the like, or executing an operation of a program corresponding to the icon. The user command for selecting the UI object may be an input command through various input means (e.g., mouse, keyboard, touch pad, etc.) connected to the display device 200 or a voice command corresponding to a voice uttered by the user.
As shown in fig. 2, the controller 250 includes at least one of a random access memory 251 (Random Access Memory, RAM), a read-only memory 252 (Read-Only Memory, ROM), a video processor 270, an audio processor 280, other processors 253 (e.g., a graphics processor (Graphics Processing Unit, GPU)), a central processing unit 254 (Central Processing Unit, CPU), a communication interface (Communication Interface), and a communication bus 256 (Bus) that connects the respective components.
In some embodiments, RAM 251 is used to store temporary data for the operating system or other running programs.
In some embodiments, ROM 252 is used to store instructions for various system boots.
In some embodiments, ROM 252 is used to store a basic input/output system (Basic Input Output System, BIOS), which comprises a driver program and a boot operating system and is used to complete the power-on self-test of the system, the initialization of each functional module in the system, and the basic input/output of the system.
In some embodiments, upon receipt of a power-on signal, the display device 200 starts up, and the CPU runs the system boot instructions in ROM 252 and copies the temporary data of the operating system stored in memory into RAM 251 in order to start or run the operating system. After the operating system is started, the CPU copies the temporary data of the various applications in memory into RAM 251 to facilitate starting or running the various applications.
In some embodiments, the CPU processor 254 is used to execute the operating system and application program instructions stored in memory, and to execute various application programs, data, and contents according to the various interactive instructions received from outside, so as to finally display and play various audio and video contents.
In some exemplary embodiments, the CPU processor 254 may comprise a plurality of processors. The plurality of processors may include one main processor and one or more sub-processors. The main processor is used to perform some operations of the display device 200 in the pre-power-up mode and/or to display pictures in the normal mode. The one or more sub-processors are used for operations in a standby mode or the like.
In some embodiments, the graphics processor 253 is configured to generate various graphical objects, such as icons, operation menus, and graphics displayed for user input instructions. It comprises an arithmetic unit, which performs operations on the various interaction instructions input by the user and displays various objects according to their display attributes, and a renderer, which renders the various objects obtained by the arithmetic unit for display on the display.
In some embodiments, the video processor 270 is configured to receive an external video signal and perform video processing such as decompression, decoding, scaling, noise reduction, frame rate conversion, resolution conversion, and image composition according to the standard codec protocol of the input signal, so as to obtain a signal that can be displayed or played directly on the display device 200.
In some embodiments, video processor 270 includes a demultiplexing module, a video decoding module, an image compositing module, a frame rate conversion module, a display formatting module, and the like.
The demultiplexing module is used to demultiplex the input audio/video data stream (e.g., an input MPEG-2 stream) into a video signal, an audio signal, and the like.
The video decoding module is used to process the demultiplexed video signal, including decoding, scaling, and the like.
The image synthesis module, such as an image synthesizer, superimposes and mixes the GUI signal (input by the user or generated by the graphics generator) with the scaled video image to generate an image signal for display.
The frame rate conversion module is configured to convert the input video frame rate, for example converting a 60 Hz frame rate into a 120 Hz or 240 Hz frame rate, commonly implemented by frame insertion.
The display formatting module is used to convert the frame-rate-converted video signal into a video output signal conforming to the display format, such as an RGB data signal.
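The frame-insertion step performed by the frame rate conversion module can be sketched as follows. This is a deliberate simplification: production FRC uses motion-compensated interpolation, whereas the sketch below merely averages neighbouring frames to double a 60 Hz sequence to 120 Hz.

```python
# Minimal sketch of frame-rate conversion by frame insertion, as described
# for the frame rate conversion module above: a 60 Hz sequence is doubled to
# 120 Hz by inserting a blended frame between each pair of neighbours.
# Real FRC uses motion-compensated interpolation; averaging is a stand-in.
import numpy as np

def double_frame_rate(frames: list) -> list:
    out = []
    for cur, nxt in zip(frames, frames[1:]):
        out.append(cur)
        # Inserted frame: average of the two neighbours, computed in uint16
        # to avoid uint8 overflow, then converted back to uint8.
        out.append(((cur.astype(np.uint16) + nxt) // 2).astype(np.uint8))
    out.append(frames[-1])
    return out
```

For n input frames the output contains 2n - 1 frames, i.e., one inserted frame between each adjacent pair.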
In some embodiments, the graphics processor 253 may be integrated with the video processor or configured separately. In the integrated configuration, processing of the graphics signals output to the display is performed jointly; in the separate configuration, different functions are performed separately, such as in a GPU + FRC (Frame Rate Conversion) architecture.
In some embodiments, the audio processor 280 is configured to receive an external audio signal, decompress and decode the audio signal according to a standard codec protocol of an input signal, and perform noise reduction, digital-to-analog conversion, and amplification processing, so as to obtain a sound signal that can be played in a speaker.
In some embodiments, video processor 270 may include one or more chips. The audio processor may also comprise one or more chips.
In some embodiments, video processor 270 and audio processor 280 may be separate chips or may be integrated together with the controller in one or more chips.
In some embodiments, the audio output receives, under the control of the controller 250, the sound signal output by the audio processor 280. Besides the speaker 286 carried by the display device 200 itself, the audio output may include an external sound output terminal that outputs to the sound-generating device of an external device, such as an external sound interface or an earphone interface, and may also include a near-field communication module in the communication interface, for example a Bluetooth module for sound output through a Bluetooth speaker.
The power supply 290 supplies power to the display device 200 from an external power source under the control of the controller 250. The power supply 290 may include a built-in power circuit installed inside the display device 200, or a power interface installed on the display device 200 that supplies power from an external power source.
The user interface 265 is used to receive an input signal from a user and then transmit the received user input signal to the controller 250. The user input signal may be a remote control signal received through an infrared receiver, and various user control signals may be received through a network communication module.
In some embodiments, a user inputs a user command through the control apparatus 100 or the mobile terminal 300, the user input interface passes the input to the controller 250, and the display device 200 then responds to the user input.
In some embodiments, a user may input a user command through a Graphical User Interface (GUI) displayed on the display 275, and the user input interface receives the user input command through the Graphical User Interface (GUI). Alternatively, the user may input the user command by inputting a specific sound or gesture, and the user input interface recognizes the sound or gesture through the sensor to receive the user input command.
In some embodiments, a "user interface" is a media interface for interaction and exchange of information between an application or operating system and a user that enables conversion between an internal form of information and a form acceptable to the user. A commonly used presentation form of the user interface is a graphical user interface (Graphic User Interface, GUI), which refers to a user interface related to computer operations that is displayed in a graphical manner. It may be an interface element such as an icon, a window, a control, etc. displayed in a display screen of the electronic device, where the control may include a visual interface element such as an icon, a button, a menu, a tab, a text box, a dialog box, a status bar, a navigation bar, a Widget, etc.
The memory 260 includes memory storing various software modules for driving the display device 200. Such as: various software modules stored in the first memory, including: at least one of a base module, a detection module, a communication module, a display control module, a browser module, various service modules, and the like.
The base module is a bottom software module for signal communication between the various hardware in the display device 200 and for sending processing and control signals to the upper modules. The detection module is used for collecting various information from various sensors or user input interfaces and carrying out digital-to-analog conversion and analysis management.
For example, the voice recognition module includes a voice analysis module and a voice instruction database module. The display control module is used to control the display to show image content and can be used to play multimedia image content, UI interfaces, and other information. The communication module is used for control and data communication with external devices. The browser module is used to perform data communication with browsing servers. The service module is used to provide various services and application programs. The memory 260 also stores received external data and user data, images of various items in various user interfaces, visual effect maps of focus objects, and the like.
Fig. 3 exemplarily shows a block diagram of a configuration of the control apparatus 100 in accordance with an exemplary embodiment. As shown in fig. 3, the control device 100 includes a controller 110, a communication interface 130, a user input/output interface, a memory, and a power supply.
The control device 100 is configured to control the display device 200: it may receive a user's input operation instruction and convert that instruction into one the display device 200 can recognize and respond to, acting as an intermediary for interaction between the user and the display device 200. For example, the user operates the channel up/down keys on the control device 100, and the display device 200 responds to the channel up/down operation.
In some embodiments, the control device 100 may be a smart device. Such as: the control apparatus 100 may install various applications for controlling the display apparatus 200 according to user's needs.
In some embodiments, as shown in fig. 1, a mobile terminal 300 or other intelligent electronic device may function similarly to the control device 100 after installing an application that manipulates the display device 200. For example, the user may implement the functions of the physical keys of the control device 100 through the various function keys or virtual buttons of a graphical user interface available on the mobile terminal 300 or other intelligent electronic device.
The controller 110 includes a processor 112, RAM 113, ROM 114, a communication interface 130, and a communication bus. The controller is used to control the running and operation of the control device 100, the communication and cooperation among the internal components, and the external and internal data processing functions.
The communication interface 130 enables communication of control signals and data signals with the display device 200 under the control of the controller 110. Such as: the received user input signal is transmitted to the display device 200. The communication interface 130 may include at least one of a WiFi chip 131, a bluetooth module 132, an NFC module 133, and other near field communication modules.
A user input/output interface 140, wherein the input interface includes at least one of a microphone 141, a touchpad 142, a sensor 143, keys 144, and other input interfaces. Such as: the user can implement a user instruction input function through actions such as voice, touch, gesture, press, and the like, and the input interface converts a received analog signal into a digital signal and converts the digital signal into a corresponding instruction signal, and sends the corresponding instruction signal to the display device 200.
The output interface includes an interface that transmits the received user instruction to the display device 200. In some embodiments, an infrared interface or a radio frequency interface may be used. For example, when the infrared signal interface is used, the user input instruction needs to be converted into an infrared control signal according to an infrared control protocol, and the infrared control signal is sent to the display device 200 through the infrared sending module. For another example, when the radio frequency signal interface is used, the user input instruction is converted into a digital signal, modulated according to a radio frequency control signal modulation protocol, and then transmitted to the display device 200 through the radio frequency transmission terminal.
In some embodiments, the control device 100 includes at least one of a communication interface 130 and an input-output interface 140. The control device 100 is provided with a communication interface 130, such as a WiFi, Bluetooth, or NFC module, and may encode the user input instruction according to the WiFi protocol, Bluetooth protocol, or NFC protocol and send it to the display device 200.
A memory 190 is used for storing various operation programs, data, and applications for driving and controlling the control device 100 under the control of the controller. The memory 190 may store various control signal instructions input by a user.
A power supply 180 is used for providing operating power support for the various elements of the control device 100 under the control of the controller; it may be a battery and associated control circuitry.
In some embodiments, the system may include a kernel (Kernel), a command parser (shell), a file system, and applications. The kernel, shell, and file system together form the basic operating system architecture that allows users to manage files, run programs, and use the system. After power-up, the kernel is started, the kernel space is activated, hardware is abstracted, hardware parameters are initialized, and virtual memory, the scheduler, signals, and inter-process communication (IPC) are run and maintained. After the kernel is started, the shell and the user application programs are then loaded. An application program is compiled into machine code after being started, forming a process.
Referring to FIG. 4, in some embodiments, the system is divided into four layers, from top to bottom: an application layer (referred to as the "application layer"), an application framework layer (Application Framework layer) (referred to as the "framework layer"), an Android runtime (Android Runtime) and system library layer (referred to as the "system runtime layer"), and a kernel layer.
In some embodiments, at least one application program is running in the application layer. These application programs may be a Window (Window) program of the operating system, a system setting program, a clock program, a camera application, and the like; they may also be application programs developed by third-party developers, such as a Hi-See program, a K-song (karaoke) program, a magic mirror program, etc. In particular implementations, the application packages in the application layer are not limited to the above examples and may actually include other application packages, which is not limited in the embodiments of the present application.
The framework layer provides an application programming interface (application programming interface, API) and a programming framework for the application programs of the application layer. The application framework layer includes a number of predefined functions. The application framework layer acts as a processing center that decides how the applications in the application layer act. Through the API interface, an application program can access the resources in the system and obtain system services during execution.
As shown in fig. 4, the application framework layer in the embodiment of the present application includes a manager (Manager), a Content Provider (Content Provider), and the like, where the manager includes at least one of the following modules: an Activity Manager (Activity Manager) is used to interact with all activities that are running in the system; a Location Manager (Location Manager) is used to provide system services or applications with access to system location services; a Package Manager (Package Manager) is used to retrieve various information about the application packages currently installed on the device; a Notification Manager (Notification Manager) is used to control the display and clearing of notification messages; a Window Manager (Window Manager) is used to manage icons, windows, toolbars, wallpaper, and desktop components on the user interface.
In some embodiments, the activity manager is used to manage the lifecycle of each application program as well as the usual navigation rollback functions, such as controlling the exit of an application program (including switching the user interface currently displayed in the display window to the system desktop), opening, and backing (including switching the user interface currently displayed in the display window to the previous user interface), etc.
In some embodiments, the window manager is configured to manage all window procedures, such as obtaining a display screen size, determining whether there is a status bar, locking the screen, intercepting the screen, controlling display window changes (e.g., scaling the display window down, dithering, distorting, etc.), and so on.
In some embodiments, the system runtime layer provides support for the upper framework layer: when the framework layer is used, the Android operating system runs the C/C++ libraries contained in the system runtime layer to implement the functions required by the framework layer.
In some embodiments, the kernel layer is a layer between hardware and software. As shown in fig. 4, the kernel layer contains at least one of the following drivers: audio drive, display drive, bluetooth drive, camera drive, WIFI drive, USB drive, HDMI drive, sensor drive (e.g., fingerprint sensor, temperature sensor, touch sensor, pressure sensor, etc.), and the like.
In some embodiments, the kernel layer further includes a power driver module for power management.
In some embodiments, the software programs and/or modules corresponding to the software architecture in fig. 4 are stored in the first memory or the second memory shown in fig. 2 or fig. 3.
In some embodiments, taking a magic mirror application (photographing application) as an example, when the remote control receiving device receives an input operation of the remote control, a corresponding hardware interrupt is sent to the kernel layer. The kernel layer processes the input operation into an original input event (including the value of the input operation, the timestamp of the input operation, etc.), and the original input event is stored at the kernel layer. The application framework layer obtains the original input event from the kernel layer, identifies the control corresponding to the input event according to the current position of the focus, and treats the input operation as a confirmation operation. If the control corresponding to the confirmation operation is the control of the magic mirror application icon, the magic mirror application calls an interface of the application framework layer to start the magic mirror application, and further starts the camera driver by calling the kernel layer, so that a still image or video is captured through the camera.
In some embodiments, for a display device with a touch function, taking a split-screen operation as an example, the display device receives an input operation (such as a split-screen operation) applied by a user to the display screen, and the kernel layer may generate a corresponding input event according to the input operation and report the event to the application framework layer. The window mode (e.g., multi-window mode) and the window position and size corresponding to the input operation are set by the activity manager of the application framework layer. The window manager of the application framework layer draws a window according to the settings of the activity manager, then sends the drawn window data to the display driver of the kernel layer, and the display driver displays the application interfaces corresponding to the window data in different display areas of the display screen.
In some embodiments, as shown in fig. 5, the application layer contains at least one icon control that the application can display in the display, such as: a live television application icon control, a video on demand application icon control, a media center application icon control, an application center icon control, a game application icon control, and the like.
In some embodiments, the live television application may provide live television via different signal sources. For example, a live television application may provide television signals using inputs from cable television, radio broadcast, satellite services, or other types of live television services. And, the live television application may display video of the live television signal on the display device 200.
In some embodiments, the video on demand application may provide video from different storage sources. Unlike live television applications, video on demand provides video from stored sources. For example, video on demand may come from the server side of cloud storage, or from a local hard disk storage containing stored video programs.
In some embodiments, the media center application may provide various multimedia content playing applications. For example, a media center may be a different service than live television or video on demand, and a user may access various images or audio through a media center application.
In some embodiments, an application center may be provided to store various applications. The application may be a game, an application, or some other application associated with a computer system or other device but which may be run in a smart television. The application center may obtain these applications from different sources, store them in local storage, and then be run on the display device 200.
The angle at which the camera is positioned, such as its pitch angle in the vertical direction or its angle in the horizontal direction relative to the position directly in front of the display device, determines the camera's field of view. The cradle head is a supporting device for installing and fixing the camera; cradle heads are divided into fixed cradle heads and electric cradle heads, where the electric cradle head is suitable for large-range scanning shooting and can enlarge the field of view of the camera. The camera of the display device is mounted on an electric cradle head, and the electric cradle head is controlled by the display device controller so that the camera can shoot at multiple angles. The electric cradle head may be a horizontal rotating cradle head that can only rotate left and right, or an omnidirectional cradle head that can rotate both left-right and up-down. Generally, two motors are installed in the omnidirectional cradle head, used respectively to drive the cradle head to rotate in the horizontal direction and the vertical direction so as to change the angle of the camera.
The limit angle of the camera which can rotate in the horizontal direction and/or the vertical direction can be set by a user according to the requirement. For example, the rotatable angle range of the camera in the horizontal direction may be 0 ° to 120 °, wherein 0 ° and 120 ° are the corresponding limit angles of the two rotation directions (left and right) in the horizontal direction, respectively; the rotatable angle of the camera in the vertical direction may be 0 ° to 180 °, wherein 0 ° and 180 ° are the limit angles corresponding to the two rotation directions (upward and downward) in the vertical direction, respectively.
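The per-direction limit angles from the example above can be sketched as follows (a minimal Python illustration; the dictionary layout and function name are assumptions, not from the embodiments):

```python
# A minimal sketch (assumption, not from the embodiments) of the per-direction
# limit angles from the example above: horizontal 0-120 deg, vertical 0-180 deg.
LIMIT_ANGLES = {
    "left": 0, "right": 120,   # limit angles of the two horizontal directions
    "up": 0, "down": 180,      # limit angles of the two vertical directions
}

def within_limits(direction: str, angle: float) -> bool:
    """Check whether an angle lies inside the rotatable range of the axis
    that the given rotation direction belongs to."""
    if direction in ("left", "right"):
        lo, hi = LIMIT_ANGLES["left"], LIMIT_ANGLES["right"]
    else:
        lo, hi = LIMIT_ANGLES["up"], LIMIT_ANGLES["down"]
    return lo <= angle <= hi
```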
Fig. 6 to 11 are schematic views of angles at which the camera is exemplarily shown in the present application, in which fig. 6 is an exemplary view showing a state in which a tilt angle of the camera in a vertical direction is 0 °, fig. 7 is an exemplary view showing a state in which a tilt angle of the camera in a vertical direction is 90 °, fig. 8 is an exemplary view showing a state in which a tilt angle of the camera in a vertical direction is 105 °, fig. 9 is an exemplary view showing a state in which a horizontal angle of the camera in a horizontal direction is 0 °, fig. 10 is an exemplary view showing a state in which a horizontal angle of the camera in a horizontal direction is 60 °, and fig. 11 is an exemplary view showing a state in which a horizontal angle of the camera in a horizontal direction is 120 °.
In some embodiments, the user may control the rotation of the pan-tilt by operating a designated key on the control device, so as to adjust the angle at which the camera is located. For example, the user may input a user operation for controlling the rotation of the cradle head to the left and right by operating the "left" and "right" direction keys on the control device, and for example, the user may input a user operation for controlling the rotation of the cradle head to the down and up by operating the "down" and "up" direction keys on the control device. When the display device receives the user operation for adjusting the angle of the camera, the motor in the cradle head is controlled to rotate in response to the received user operation so as to drive the cradle head to rotate towards the target adjustment direction indicated by the user operation.
In some embodiments, the controller performs different processes to adjust the angle of the camera according to the different operation modes of the specified keys by the user.
Taking a physical designated key on the remote controller for adjusting the angle of the camera as an example, when the user operates the designated key in the first operation mode, the controller, in response to the received user operation, controls the camera to make an incremental adjustment toward the target adjustment direction on the basis of the current angle, according to the target adjustment direction indicated by the operated key. For example, if the target adjustment direction indicated by the key is "right" and the angle at which the camera is positioned is 15° when the user operation is received, the camera is adjusted 10° to the right on the basis of 15°, that is, to 25°. The first operation mode may be a short-press operation of the designated key by the user. The user can execute the short-press operation repeatedly so that the controller controls the camera to make successive incremental adjustments toward the target adjustment direction.
When the user operates the designated key in the second operation mode, the controller responds to the received user operation and adjusts the camera to the limit angle corresponding to the target adjustment direction according to the target adjustment direction indicated by the key operated by the user. For example, if the target adjustment direction indicated by the key operated by the user is "right" and the limit angle corresponding to "right" is 120 °, the angle of the camera is adjusted to the right until 120 °. The second operation mode may be a long-press operation of the specified key by the user. Different from the processing logic for adjusting the angle of the camera through short pressing operation, the user can adjust the angle of the camera to a limit angle corresponding to a certain direction at one time by executing long pressing operation once.
In the implementation scenario in which the user performs repeated short-press operations on the designated key, the controller makes an incremental adjustment on the basis of the camera's current angle each time it receives a short-press operation, which can lead to the angle adjustment falling short. For example, suppose the user short-presses the "right" direction key twice in quick succession. When the controller receives the first short-press operation, the angle at which the camera is positioned is 15°, so in response the controller controls the camera to rotate 10° to the right on the basis of the current 15°, i.e., to 25°. If, when the controller receives the second short-press operation, the camera has only rotated to 18° (i.e., has not yet reached 25°), the controller will control the camera to rotate 10° to the right on the basis of the current 18°, i.e., adjust it to 28° instead of 35°, so the angle adjustment falls short.
To solve the above problem, in the embodiments of the present application, the controller of the display device is configured to: receive a user operation for adjusting the angle of the camera, where the user operation carries an identifier for indicating the target adjustment direction and an identifier for indicating the operation mode; if the operation mode indicated by the identifier is the first operation mode, determine a target angle according to the target adjustment direction and a stored starting angle, and adjust the angle of the camera to the target angle, where the stored starting angle is the last determined target angle or the angle at which the camera was positioned when the adjustment last stopped; and if the operation mode indicated by the identifier is the second operation mode, adjust the angle of the camera to the limit angle corresponding to the target adjustment direction.
In the above embodiment, the first operation mode may be a short-press operation of a specified key by a user, and the second operation mode may be a long-press operation of a specified key by a user, where the specified key is a key for adjusting an angle of a camera on a control device (such as a remote controller). When the user presses a designated key on the remote controller for a short time or presses a long time, the remote controller sends operation information of the user to the display device, wherein the operation information comprises an identifier for indicating the target adjustment direction and an identifier for indicating an operation mode. The identifier for indicating the target adjustment direction may be a key code, and the identifier for indicating the operation mode may be a count value. For example, when the count value in the operation information is 0, the operation is indicated as a short press operation, and when the count value in the operation information is greater than 0, the operation is a long press operation.
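The decoding of operation information described above (key code into target adjustment direction, count value into operation mode) can be sketched as follows; the concrete key-code values and the function name are hypothetical, chosen only for illustration:

```python
# Hypothetical sketch of decoding remote-control operation information into a
# target adjustment direction and an operation mode, per the description above.
# The key-code constants are illustrative assumptions, not from the patent.
KEY_CODE_TO_DIRECTION = {0x21: "left", 0x22: "right", 0x23: "up", 0x24: "down"}

def decode_operation(key_code: int, count: int):
    direction = KEY_CODE_TO_DIRECTION[key_code]
    # count value 0 -> short press (first mode); count > 0 -> long press (second mode)
    mode = "short_press" if count == 0 else "long_press"
    return direction, mode
```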
In some embodiments, the target angle is obtained by adding the preset step angle to, or subtracting it from, the starting angle according to the target adjustment direction, where the preset step angle is 10° as in the above example, or another value. For example, when the target adjustment direction is "right", the preset step angle is added to the starting angle to obtain the target angle; when the target adjustment direction is "left", the preset step angle is subtracted from the starting angle to obtain the target angle. For instance, with a stored starting angle of 15° and a preset step angle of 10°, if the target adjustment direction is "right", the target angle is determined to be 25°; if the target adjustment direction is "left", the target angle is determined to be 5°.
In some embodiments, after each time the target angle is determined, the determined target angle is saved in the system, or after the adjustment of the angle of the camera is terminated, the angle at which the camera is positioned at the time of terminating the adjustment is saved in the system, so as to be used as the starting angle of the next adjustment.
When the determined target angle exceeds the limit angle corresponding to the target adjustment direction, the limit angle corresponding to the target adjustment direction is used as the target angle for the adjustment. For example, assuming the limit angle for "rightward" adjustment is 120°, if the target angle determined from the target adjustment direction and the stored starting angle is 125°, the adjustment is performed with the limit angle of 120° as the target angle.
It can be seen that, in the above embodiments, the user may input a user operation for adjusting the angle of the camera by operating the designated key in different operation modes: a user operation input through the first operation mode controls the camera to adjust incrementally toward the target adjustment direction on the basis of the starting angle, while a user operation input through the second operation mode controls the camera to rotate at one time to the limit angle corresponding to the target adjustment direction. Moreover, because the controller saves each determined target angle (or the angle at which the camera is positioned when adjustment stops) as the starting angle of the next adjustment, every time the controller receives a user operation based on the first operation mode, it determines a new target angle from the target adjustment direction and the saved starting angle and adjusts the camera to that target angle. The problem of the angle adjustment falling short when the user presses the designated key several times in quick succession is therefore avoided.
For example, suppose the user short-presses the "right" direction key twice in quick succession, and the stored starting angle is 15° when the first short-press operation is received. In response to the first short-press operation, the controller determines that the target angle is 25° from the starting angle 15° and the preset step angle 10°, starts to control the camera to rotate rightward to 25°, and stores the currently determined target angle 25° as the new starting angle. When the controller receives the second short-press operation, it determines that the new target angle is 35° from the stored starting angle 25° and the preset step angle 10°; that is, the camera angle will be adjusted to 35° rather than the 28° calculated from the intermediate 18°, so the problem of the angle adjustment falling short under repeated short presses is avoided.
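The short-press logic just described can be sketched as follows (a minimal Python illustration; the class and method names are assumptions, and the 10° step and 0°–120° horizontal limits follow the examples above rather than any fixed values in the embodiments):

```python
class PanTiltController:
    """Sketch of the short-press adjustment described above: the target angle
    is computed from the saved starting angle (the last determined target, or
    the angle at which the last adjustment stopped), not from the camera's
    current physical angle, so rapid repeated presses accumulate correctly."""

    STEP = 10                            # preset step angle, as in the examples
    LIMITS = {"left": 0, "right": 120}   # example horizontal limit angles

    def __init__(self, start_angle: int = 0):
        self.start_angle = start_angle   # saved starting angle

    def short_press(self, direction: str) -> int:
        delta = self.STEP if direction == "right" else -self.STEP
        target = self.start_angle + delta
        # clamp to the limit angle corresponding to the target adjustment direction
        target = max(self.LIMITS["left"], min(self.LIMITS["right"], target))
        self.start_angle = target        # save as starting angle of the next press
        return target
```

Note that two rapid presses from 15° yield 25° and then 35°, even if the motor has only physically reached 18° when the second press arrives.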
In addition, if the operation mode indicated by the identifier carried in the user operation is the second operation mode, i.e., a long-press operation, the controller smoothly rotates the camera at a uniform speed to the limit angle corresponding to the target adjustment direction indicated by the user operation. Thus, the user can adjust the camera angle to the limit angle with a single long-press operation.
Note that, in the above implementation scenario in which the user performs a long-press operation on a designated key, if there is no rotation-stop interface for terminating rotation of the camera, the rotation of the camera cannot be stopped halfway.
To address this issue and further enhance the user experience, in some embodiments the controller of the display device is further configured to: before adjusting the angle of the camera to the limit angle corresponding to the target adjustment direction, detect whether a user operation for terminating the adjustment of the camera angle is received; and when such a user operation is received, terminate the adjustment of the camera angle. For example, the user operation for terminating the adjustment of the camera angle may be a key-lift operation following the long-press operation of the designated key.
In other embodiments, the controller of the display device is further configured to: in the process of adjusting the angle of the camera to the limit angle corresponding to the target adjustment direction, periodically acquiring the current angle of the camera, and judging whether the current angle of the camera reaches the limit angle corresponding to the target adjustment direction; and when the current angle of the camera reaches the limit angle corresponding to the target adjustment direction, the adjustment of the angle of the camera is terminated.
As can be seen from the above embodiments, when the user long-presses a designated key, the display device controller controls the camera to rotate toward the target adjustment direction, and stops rotating the camera when it detects that the user has lifted the designated key or that the angle of the camera has reached the limit angle corresponding to the target adjustment direction.
It should be noted that "the current angle of the camera reaches the limit angle corresponding to the target adjustment direction" includes the case where the current angle of the camera exactly matches the limit angle, and/or the case where the difference between the current angle of the camera and the limit angle is smaller than a preset value.
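The long-press rotation loop and the "reached" criterion above can be sketched as follows (an illustrative Python simulation; the function names, the 1° per-cycle step, and the 3° tolerance are assumptions, not values from the embodiments):

```python
def reached(current: float, limit: float, tol: float = 3.0) -> bool:
    """'Reached' per the note above: an exact match with the limit angle,
    or a difference from it smaller than a preset value (here 3 deg)."""
    return current == limit or abs(limit - current) < tol

def rotate_to_limit(current: float, limit: float, stop_requested, step: float = 1.0) -> float:
    """Sketch of the long-press loop: rotate toward the limit angle, polling
    the current angle each cycle, and stop early if the user lifts the key."""
    while not reached(current, limit) and not stop_requested():
        if limit > current:
            current = min(current + step, limit)
        else:
            current = max(current - step, limit)
    return current
```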
In a particular implementation of the camera angle adjustment method, the main thread receives a user operation for adjusting the camera angle, such as operation information sent by a remote controller, and sends the received user operation to CameraSettingActivity. CameraSettingActivity repackages the user operation according to the key code corresponding to the user operation and the user's operation mode, so that the repackaged user operation carries an identifier for indicating the target adjustment direction and an identifier for indicating the operation mode, and distributes the repackaged user operation to different sub-threads for processing according to the user's operation mode. Specifically, if the user operation is based on the first operation mode, the repackaged user operation is added as a to-be-processed message to the to-be-processed queue of the first sub-thread, and the first sub-thread obtains the to-be-processed user operation from the queue and processes it; if the user operation is based on the second operation mode, the repackaged user operation is sent directly to the second sub-thread, which processes it.
The first sub-thread is specifically used for: obtaining a to-be-processed user operation for adjusting the angle of the camera from the to-be-processed queue; obtaining the target adjustment direction from the identifier, carried by the user operation, for indicating the target adjustment direction; determining the target angle of this operation according to the target adjustment direction and the stored starting angle; and calling a method for adjusting the angle of the camera, through which the angle of the camera is adjusted to the target angle. When the angle of the camera reaches the target angle, the first sub-thread is closed.
The second sub-thread is specifically used for: receiving the user operation, sent by CameraSettingActivity, for adjusting the angle of the camera; obtaining the target adjustment direction from the identifier, carried by the user operation, for indicating the target adjustment direction; and calling a method for adjusting the angle of the camera, through which the angle of the camera is adjusted to the limit angle corresponding to the target adjustment direction. In the process of adjusting the angle of the camera to the limit angle corresponding to the target adjustment direction, the current angle of the camera is acquired periodically, and whether the acquired current angle has reached the limit angle corresponding to the target adjustment direction is judged; when it has, the adjustment of the camera angle stops. When the adjustment of the camera angle is completed or terminated, the second sub-thread is closed.
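The two-sub-thread dispatch described above can be sketched as follows (an illustrative Python sketch using a pending queue for the first sub-thread; all class, method, and handler names are hypothetical, not from the embodiments):

```python
import queue
import threading

class Dispatcher:
    """Sketch of the dispatch above: short-press operations are queued for a
    worker sub-thread; long-press operations go straight to a second handler."""

    def __init__(self, on_short, on_long):
        self.pending = queue.Queue()          # to-be-processed queue (first sub-thread)
        self.on_long = on_long
        self.worker = threading.Thread(target=self._drain, args=(on_short,), daemon=True)
        self.worker.start()

    def _drain(self, on_short):
        while True:
            op = self.pending.get()
            if op is None:                    # sentinel: close the sub-thread
                break
            on_short(op)

    def dispatch(self, op, mode):
        if mode == "short_press":
            self.pending.put(op)              # handled by the first sub-thread
        else:
            self.on_long(op)                  # handled by the second path

    def close(self):
        self.pending.put(None)
        self.worker.join()
```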
In the embodiments of the present application, considering the adjustment error that exists when the angle of the camera is adjusted, when the rotatable angle of the camera in the target adjustment direction is smaller than the adjustment error, the angle of the camera is no longer adjusted. In some embodiments, the rotatable angle of the camera in the target adjustment direction is also referred to as the residual angle, and the adjustment error is also referred to as the preset minimum angle.
Based on this, in some embodiments, the controller of the display device is further configured to: before determining the target angle according to the target adjustment direction and the stored starting angle, calculate the residual angle from the limit angle corresponding to the target adjustment direction and the starting angle; judge whether the residual angle is larger than the preset minimum angle; if the residual angle is larger than the preset minimum angle, execute the step of determining the target angle; if the residual angle is not larger than the preset minimum angle, generate an interface prompt and display it to show that the starting angle has reached the limit angle corresponding to the target adjustment direction.
For example, assume the target adjustment direction is "right", the limit angle corresponding to "right" is 120°, the stored starting angle is 118°, and the preset minimum angle is 3°. The residual angle calculated from the limit angle corresponding to the target adjustment direction and the stored starting angle is 2°, which is smaller than the preset minimum angle, so the step of determining the target angle is not performed; instead, an interface prompt is generated and displayed to show that the starting angle has reached the limit angle corresponding to the target adjustment direction.
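The residual-angle check can be sketched as follows (hypothetical function name and return values; the 3° preset minimum angle follows the example above):

```python
def check_residual(direction_limit: float, start_angle: float, min_angle: float = 3.0) -> str:
    """Sketch of the residual-angle check above: if the camera can still rotate
    more than the preset minimum angle toward the limit, adjustment proceeds;
    otherwise an interface prompt should be shown instead of adjusting."""
    residual = abs(direction_limit - start_angle)   # residual (rotatable) angle
    if residual > min_angle:
        return "adjust"
    return "prompt: starting angle has reached the limit angle for this direction"
```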
Based on the above embodiments, an embodiment of the present application further provides a method for adjusting the angle of a camera. The method is applied to the display device provided in the above embodiments, and its execution subject includes, but is not limited to, the controller of the display device. Fig. 12 is a flowchart of a method for adjusting the angle of a camera according to an exemplary embodiment; the method may include:
step 121, receiving a user operation for adjusting the angle of the camera, wherein the user operation carries an identifier for indicating the target adjustment direction and an operation mode.
Step 122, if the operation mode indicated by the identifier is the first operation mode, determining a target angle according to the target adjustment direction and the stored starting angle, and adjusting the angle of the camera to the target angle, where the stored starting angle is the last determined target angle or the angle at which the camera is located when the adjustment is terminated last time.
In some embodiments, the first operation mode is a short-press operation of a specified key by a user, the second operation mode is a long-press operation of the specified key by the user, and the specified key is a key for adjusting the angle of the camera.
In some embodiments, before determining the target angle according to the target adjustment direction and the stored starting angle, the method further includes: calculating the remaining angle according to the limit angle corresponding to the target adjustment direction and the starting angle; judging whether the remaining angle is greater than the preset minimum angle; if the remaining angle is greater than the preset minimum angle, executing the step of determining the target angle; and if the remaining angle is not greater than the preset minimum angle, generating and displaying an interface prompt indicating that the starting angle has reached the limit angle corresponding to the target adjustment direction.
In some embodiments, determining the target angle according to the target adjustment direction and the stored starting angle includes: adding the preset step angle to, or subtracting it from, the starting angle according to the target adjustment direction, to obtain the target angle.
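A minimal sketch of this step computation is given below. The 15° step angle and the 0°–120° horizontal limits are assumed example values, not fixed by the text, and the clamping to the limit angles is one reasonable reading of the embodiment.

```python
def determine_target_angle(start_angle: float, direction: str,
                           step: float = 15.0,
                           limits: tuple = (0.0, 120.0)) -> float:
    """Add or subtract the preset step angle depending on the target
    adjustment direction, clamping the result to the limit angles."""
    low, high = limits
    if direction == "right":
        return min(start_angle + step, high)
    return max(start_angle - step, low)
```

Each computed target angle would then be stored as the starting angle for the next adjustment, as described in the following steps.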
Step 123, if the operation mode indicated by the identifier is the second operation mode, adjusting the angle of the camera to the limit angle corresponding to the target adjustment direction.
In some embodiments, before adjusting the angle of the camera to the limit angle corresponding to the target adjustment direction, the method further includes: detecting whether a user operation for terminating the adjustment of the angle of the camera is received; and when such a user operation is received, stopping the adjustment of the angle of the camera. The user operation for terminating the adjustment is a release operation after the long-press operation of the specified key.
In some embodiments, in the process of adjusting the angle of the camera to the limit angle corresponding to the target adjustment direction, the method further includes: periodically acquiring the current angle of the camera; judging whether the current angle of the camera reaches a limit angle corresponding to the target adjustment direction; and when the current angle of the camera reaches the limit angle corresponding to the target adjustment direction, ending the adjustment of the angle of the camera.
In some embodiments, adjusting the angle of the camera to the target angle or a limit angle corresponding to the target adjustment direction includes: and controlling the camera to rotate to the target angle or the limit angle at a constant speed at a preset speed.
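The constant-speed rotation with periodic polling described in the last two paragraphs can be sketched as follows. The simulated motor, the 30°/s speed, and the 0.1 s polling period are assumptions for illustration; a real implementation would query the pan-tilt hardware rather than a toy object.

```python
class SimulatedPanMotor:
    """Toy stand-in for the pan-tilt motor driver (illustrative only)."""
    def __init__(self, angle: float = 0.0):
        self.angle = angle

    def step_toward(self, limit: float, step: float) -> None:
        # Move by one constant-speed increment, never overshooting the limit.
        if self.angle < limit:
            self.angle = min(self.angle + step, limit)
        else:
            self.angle = max(self.angle - step, limit)

def rotate_to_limit(motor, limit: float,
                    speed: float = 30.0, poll_interval: float = 0.1) -> int:
    """Rotate at constant speed toward the limit angle, periodically
    acquiring the current angle and ending the adjustment once the limit
    is reached. Returns the number of polling periods used."""
    step = speed * poll_interval      # degrees covered per polling period
    polls = 0
    while motor.angle != limit:       # periodic check of the current angle
        motor.step_toward(limit, step)
        polls += 1
    return polls
```

For example, rotating from 0° toward the 120° limit at 30°/s with a 0.1 s poll takes 40 polling periods and stops exactly at 120°, because each increment is clamped to the limit.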
In some embodiments, after the target angle is determined according to the target adjustment direction and the stored starting angle, the target angle is stored as the starting angle, or after the adjustment of the camera angle is terminated, the angle at which the camera is located is stored as the starting angle.
The angle at which the camera is positioned, such as its pitch angle in the vertical direction or its rotation angle in the horizontal direction, determines the camera's field of view. A pan-tilt head is a supporting device for mounting and fixing the camera. Pan-tilt heads are divided into fixed and electric types; an electric pan-tilt head is suitable for wide-range scanning shots and can enlarge the camera's field of view. An electric pan-tilt head may be a horizontal head that rotates only left and right, or an omnidirectional head that rotates both left-right and up-down. Generally, two motors are installed in an omnidirectional head to drive rotation in the horizontal and vertical directions respectively, thereby changing the angle of the camera. A camera mounted on an electric pan-tilt head is also called a pan-tilt camera.
As shown in fig. 2, the display device 200 comprises a detector 230, the detector 230 comprising a camera, which may be a pan-tilt camera 232 as shown in fig. 13 a.
As shown in fig. 2, the display apparatus 200 includes an external device interface 240, and an external device can access the display apparatus 200 through the external device interface 240. As shown in fig. 13b, the external pan/tilt camera 232 may be connected to the controller 220 of the display apparatus through the external device interface 240.
For a display device with a built-in or external pan-tilt camera, the controller of the display device controls the pan-tilt head so that the camera can shoot at multiple angles.
The rotatable limit angle of the camera in the horizontal direction and/or the vertical direction can be designed according to the requirement. For example, the rotatable angle range of the camera in the horizontal direction may be 0 ° to 120 °, wherein 0 ° and 120 ° are the corresponding limit angles of the two rotation directions (left and right) in the horizontal direction, respectively; the rotatable angle of the camera in the vertical direction may be 0 ° to 180 °, wherein 0 ° and 180 ° are the limit angles corresponding to the two rotation directions (upward and downward) in the vertical direction, respectively.
Figs. 6 to 11 are schematic views exemplarily showing angles of the camera in the present application, in which fig. 6 schematically shows a state in which the tilt angle of the camera in the vertical direction is 0°, fig. 7 shows a vertical tilt angle of 90°, fig. 8 shows a vertical tilt angle of 105°, fig. 9 schematically shows a state in which the horizontal angle of the camera is 0°, fig. 10 shows a horizontal angle of 60°, and fig. 11 shows a horizontal angle of 120°.
Note that, cameras in the following embodiments of the present application are pan-tilt cameras unless otherwise specified.
In some embodiments, the camera being tilted up means that the tilt angle of the camera in the vertical direction is greater than a preset minimum angle; the camera being dropped means that the tilt angle of the camera in the vertical direction is smaller than a preset maximum angle. The preset minimum angle and the preset maximum angle may be the same or different angle values. For example, the tilted-up state may be as shown in fig. 7, and the dropped state as shown in fig. 6.
Referring to fig. 7, during a period when the user is not using the camera, if the camera remains tilted up, it easily gives the user the false impression that the camera is working, which causes unease. In addition, in the case of user misoperation or a system error, the camera may be turned on by mistake without the user's knowledge, which easily leads to leakage of the user's privacy and harms the user experience.
In some solutions to the above problem, when an application associated with the camera is switched to the foreground, or when the foreground application invokes the interface for starting the camera, the camera is controlled to tilt up; and the camera is controlled to drop when the display device is shut down or the application associated with the camera exits the foreground.
That is, the camera is tilted up only when it needs to be used, and is dropped after use is finished. In this implementation, a meaningless drop-then-tilt-up of the camera may occur when the foreground application switches between two applications that are both associated with the camera. Specifically, when the foreground application switches from a first application to a second application, the camera is first controlled to drop because the first application exits the foreground, and is then controlled to tilt up because the second application enters the foreground. This drop-then-rise process is meaningless: it increases the number of motor operations, shortens the motor's service life, prolongs the user's waiting time, and harms the user experience.
To avoid such meaningless tilt-up and drop control, an embodiment of the present application provides a display device, as shown in fig. 2 and fig. 13a or fig. 6, comprising a display 275 for displaying a user interface, a camera 232 for capturing local images, and a controller 220 connected to the camera, wherein the controller 220 is configured to: when an instruction from an application to close the camera and an instruction from an application to open the camera are received in succession, control the camera to remain at its current angle. Receiving the two instructions "in succession" means that the time interval between them is smaller than a preset duration. For example, an instruction from the first application or the second application to open the camera is received within the preset duration after an instruction from the first application to close the camera. For another example, an instruction from the first application to close the camera is received within the preset duration after an instruction from the first application to open the camera. The first application and the second application are different applications associated with the camera.
In some embodiments, when an instruction from the first application or the second application to open the camera is received within the preset duration after an instruction from the first application to close the camera is received, the camera is controlled to remain at the second preset angle. When no instruction from the first application or the second application to open the camera is received within that preset duration, the angle of the camera is adjusted to the first preset angle.
In particular, the controller 220 is configured to perform the steps shown in fig. 14:
step 801, when the camera is in a closed state, an instruction of starting the camera by the first application is received.
Step 802, in response to an instruction of the first application to start the camera, starting the camera, and adjusting the angle of the camera from the first preset angle to the second preset angle.
The first preset angle is a pitching angle of the camera after falling down, and the second preset angle is a pitching angle of the camera after rising up.
Step 803, when the camera is in the on state, an instruction from the first application to close the camera is received. The first application sends the instruction to close the camera when it is closed or exits the foreground.
Step 804, responding to an instruction of the first application for closing the camera, and monitoring whether an instruction of the first application or the second application for opening the camera is received within a preset time length;
Step 805, if an instruction from the first application or the second application to open the camera is received within the preset duration, starting the camera and maintaining it at the second preset angle. The first application or the second application sends the instruction to open the camera after it is started or switched to the foreground.
Step 806, if no instruction from the first application or the second application to open the camera is received within the preset duration, adjusting the angle of the camera to the first preset angle. In some implementation scenarios, when the first application is started, it sends an instruction to the CameraService of the system indicating that the camera should be opened; after receiving this instruction, the CameraService calls the method resettureformed() provided by the CameraControl service for controlling the tilt-up of the camera, to raise the camera to the second preset angle.
Then, in one case, if the first application is closed or exits the foreground, it sends an instruction to the CameraService indicating that the camera should be closed. The CameraService starts timing after receiving the close instruction and monitors whether an instruction from the first application or the second application to open the camera is received; if the first application or the second application is started or enters the foreground, it sends an open instruction to the CameraService. If an open instruction is received before the timed duration reaches the preset duration (for example, 3 s), the flow ends. Otherwise, the CameraService stops timing once the preset duration is reached and calls the method reset() provided by the CameraControl service for controlling the camera to drop, to lower the camera to the first preset angle.
In one example, the "magic mirror" application and the "chat" application are both applications associated with the camera. After the user opens the "magic mirror" application or switches it to the foreground, the "magic mirror" application sends an instruction to the CameraService of the system indicating that the camera should be opened; after receiving the instruction, the CameraService calls the tilt-up method provided by the CameraControl service to raise the camera to the second preset angle. When the user switches the "magic mirror" application out of the foreground and switches the "chat" application to the foreground, the "magic mirror" application sends a close instruction to the CameraService, the "chat" application sends an open instruction to the CameraService, and the CameraService closes the camera and starts timing after receiving the close instruction from the "magic mirror" application. If an open instruction (here, the one from the "chat" application) is received before the timed duration reaches the preset duration (for example, 3 s), the camera is kept open and maintained at the second preset angle; if no open instruction is received before the preset duration is reached, the camera is closed and controlled to drop to the first preset angle.
As can be seen from the embodiment shown in fig. 14, if the CameraService receives an instruction from the first application or the second application to open the camera within the preset duration after the first application closes the camera, the camera is not controlled to drop but is maintained at the second preset angle, thereby avoiding the meaningless drop and tilt-up control caused by application switching.
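The timing behaviour around fig. 14 can be sketched as a simple state machine. The class and method names are invented for the example, the 3 s window follows the text's example, and the two preset angles are assumed values; a real CameraService would be driven by service callbacks rather than an explicit clock argument.

```python
FIRST_PRESET = 0.0     # pitch angle after dropping (assumed value)
SECOND_PRESET = 105.0  # pitch angle after tilting up (assumed value)

class CameraAngleDebouncer:
    def __init__(self, preset_duration: float = 3.0):
        self.preset_duration = preset_duration
        self.angle = FIRST_PRESET
        self._close_time = None       # time the last close instruction arrived

    def on_open(self, now: float) -> None:
        # Open instruction: cancel any pending drop and keep/raise the camera.
        self._close_time = None
        self.angle = SECOND_PRESET

    def on_close(self, now: float) -> None:
        # Close instruction: start timing, but do not drop yet.
        self._close_time = now

    def tick(self, now: float) -> None:
        # Called periodically; drop only once the preset duration has elapsed
        # without an intervening open instruction.
        if (self._close_time is not None
                and now - self._close_time >= self.preset_duration):
            self.angle = FIRST_PRESET
            self._close_time = None
```

With this model, a close at t=1 followed by an open at t=2 leaves the camera raised, matching the "magic mirror"/"chat" switching example.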
In other embodiments, when the CameraService receives an instruction from the first application to close the camera, it generates a landing task and adds it to a task queue, where the landing task is a task of adjusting the angle of the camera to the first preset angle. When the waiting time of the pending landing task in the task queue reaches the preset duration, the landing task is read and executed, that is, the angle of the camera is adjusted to the first preset angle. When the CameraService receives an instruction from the first application or the second application to open the camera, the pending landing task in the task queue is deleted.
In these embodiments, when the waiting time of a pending landing task in the task queue reaches the preset duration, the task will be read and executed by the CameraService. Therefore, if a pending landing task still exists in the queue when the CameraService receives an instruction from the first application or the second application to open the camera, its waiting time is necessarily smaller than the preset duration, and deleting it from the queue prevents it from being read and executed once its waiting time reaches the preset duration. That is, if the CameraService receives an instruction from the first application or the second application to open the camera within the preset duration after the first application closes the camera, the camera is not controlled to drop but is maintained at the second preset angle, thereby avoiding the meaningless drop and tilt-up control caused by application switching.
It should be appreciated that if there is no pending landing task in the task queue when the CameraService receives an instruction from an application to open the camera, the camera is opened and its angle is adjusted from the first preset angle to the second preset angle.
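The task-queue variant above can be sketched with a cancellable timer, where `threading.Timer` stands in for the delayed landing task. All names, the preset angles, and the shortened 0.2 s window are illustrative; the real service's task queue and synchronization would be more involved.

```python
import threading

class CameraTaskQueue:
    PRESET = 0.2            # shortened from the text's 3 s example for the demo
    FIRST_PRESET = 0.0      # dropped angle (assumed value)
    SECOND_PRESET = 105.0   # raised angle (assumed value)

    def __init__(self):
        self.angle = self.SECOND_PRESET
        self._drop_task = None
        self._lock = threading.Lock()

    def on_close(self):
        # Enqueue a landing task that runs after the preset duration.
        with self._lock:
            self._drop_task = threading.Timer(self.PRESET, self._drop)
            self._drop_task.start()

    def on_open(self):
        # Delete the pending landing task, if any, and keep the camera raised.
        with self._lock:
            if self._drop_task is not None:
                self._drop_task.cancel()
                self._drop_task = None
            self.angle = self.SECOND_PRESET

    def _drop(self):
        # The landing task itself: adjust to the first preset angle.
        with self._lock:
            self.angle = self.FIRST_PRESET
            self._drop_task = None
```

Cancelling the timer before it fires is the analogue of deleting the pending landing task from the queue.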
In some embodiments, when an instruction from the first application to close the camera is received within the preset duration after an instruction from the first application to open the camera is received, the camera is controlled to remain at the first preset angle. When no such close instruction is received within the preset duration, the angle of the camera is adjusted to the second preset angle.
In particular implementations, the controller 220 is further configured to perform the steps shown in fig. 15:
step 901, when the camera is in an on state, an instruction of closing the camera by the first application is received.
In step 902, the camera is turned off in response to the instruction of the first application to turn off the camera, and the angle at which the camera is located is adjusted from the second preset angle to the first preset angle.
In step 903, when the camera is in the closed state, an instruction of the first application to open the camera is received. When the first application enters the foreground to run, an instruction for opening the camera is sent.
Step 904, responding to an instruction of the first application for opening the camera, and monitoring whether an instruction of the first application for closing the camera is received within a preset duration;
in step 905, if an instruction for closing the camera by the first application is received within a preset time period, closing the camera and maintaining the camera at a first preset angle. And when the first application is closed or exits from the foreground, sending an instruction for closing the camera.
Step 906, if the instruction of closing the camera by the first application is not received within the preset time, adjusting the angle of the camera to a second preset angle.
As can be seen from the embodiment shown in fig. 15, if an instruction from the first application to close the camera is received within the preset duration after the first application opens the camera, the camera is not controlled to tilt up but is maintained at the first preset angle, thereby avoiding meaningless drop and tilt-up control.
In some embodiments, when the CameraService receives an instruction from the first application to open the camera, it generates a tilt-up task and adds it to the task queue, where the tilt-up task is a task of adjusting the angle of the camera to the second preset angle. When the waiting time of the pending tilt-up task in the task queue reaches the preset duration, the task is read and executed, that is, the angle of the camera is adjusted to the second preset angle. When the CameraService receives an instruction from the first application to close the camera, the pending tilt-up task in the task queue is deleted.
In these embodiments, when the waiting time of a pending tilt-up task in the task queue reaches the preset duration, the task will be read and executed by the CameraService. Therefore, if a pending tilt-up task still exists in the queue when the CameraService receives the instruction from the first application to close the camera, its waiting time is necessarily smaller than the preset duration, and deleting it from the queue prevents it from being read and executed once its waiting time reaches the preset duration. That is, if the CameraService receives an instruction from the first application to close the camera within the preset duration after the first application opens the camera, the camera is not controlled to tilt up but is maintained at the first preset angle, thereby avoiding meaningless drop and tilt-up control.
In a specific implementation, the present invention further provides a computer storage medium, where the computer storage medium may store a program, and the program, when executed, may perform some or all of the steps in each embodiment of the method for adjusting a camera angle provided by the present invention. The storage medium may be a magnetic disk, an optical disk, a read-only memory (ROM), a random-access memory (RAM), or the like.
It will be apparent to those skilled in the art that the techniques of the embodiments of the present invention may be implemented by means of software plus a necessary general-purpose hardware platform. Based on such an understanding, the technical solutions in the embodiments of the present invention, in essence or in the part contributing to the prior art, may be embodied in the form of a software product stored in a storage medium such as a ROM/RAM, a magnetic disk, or an optical disk, including several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to execute the methods described in all or part of the embodiments of the present invention.
The embodiments in this specification are described with reference to one another for the same or similar parts. In particular, since the display device embodiments are substantially similar to the method embodiments, their description is relatively brief; for relevant details, refer to the description of the method embodiments.
The embodiments of the present invention described above do not limit the scope of the present invention.

Claims (10)

  1. A display device, characterized by comprising:
    a display for displaying a user interface;
    the camera is used for collecting images, and the angle of the camera can be adjusted so as to collect images in different ranges when the camera is at different angles;
    a controller, connected to the camera and configured to:
    receiving user operation for adjusting the angle of the camera, wherein the user operation carries an identifier for indicating the target adjustment direction and an operation mode;
    if the operation mode indicated by the identification is a first operation mode, determining a target angle according to the target adjustment direction and a stored starting angle, and adjusting the angle of the camera to the target angle, wherein the stored starting angle is the last determined target angle or the angle of the camera when the adjustment is stopped last time;
    and if the operation mode indicated by the identification is a second operation mode, adjusting the angle of the camera to a limit angle corresponding to the target adjustment direction.
  2. The display device according to claim 1, wherein the first operation mode is a short-press operation of a specified key by a user, the second operation mode is a long-press operation of the specified key by the user, and the specified key is a key for adjusting a camera angle.
  3. The display device according to claim 2, wherein before the adjusting the angle of the camera to the limit angle corresponding to the target adjustment direction, further comprising:
    detecting whether a user operation for terminating adjustment of the angle of the camera is received;
    and when the user operation for stopping adjusting the angle of the camera is received, stopping adjusting the angle of the camera.
  4. A display device according to claim 3, wherein the user operation for terminating the adjustment of the angle of the camera is a lifting operation after receiving a long-press operation of the specified key.
  5. The display device according to claim 2, wherein in the process of adjusting the angle of the camera to the limit angle corresponding to the target adjustment direction, further comprising:
    periodically acquiring the current angle of the camera;
    judging whether the current angle of the camera reaches a limit angle corresponding to the target adjustment direction;
    and when the current angle of the camera reaches the limit angle corresponding to the target adjustment direction, ending the adjustment of the angle of the camera.
  6. The display device according to claim 3 or 5, wherein after the target angle is determined according to the target adjustment direction and the stored start angle, the target angle is stored as the start angle, or the angle at which the camera is located is stored as the start angle after the adjustment of the camera angle is terminated.
  7. The display device according to claim 1, wherein before the determining the target angle according to the target adjustment direction and the saved starting angle, further comprises:
    calculating a residual angle according to the limit angle corresponding to the target adjustment direction and the initial angle;
    judging whether the residual angle is larger than a preset minimum angle or not;
    if the remaining angle is larger than the preset minimum angle, executing the step of determining the target angle;
    and if the remaining angle is not greater than the preset minimum angle, generating an interface prompt, and displaying the interface prompt to show that the initial angle reaches the limit angle corresponding to the target adjustment direction.
  8. The display device according to claim 1, wherein the determining the target angle from the target adjustment direction and the saved starting angle comprises:
    and adding or subtracting the initial angle and a preset step angle according to different target adjustment directions to obtain the target angle.
  9. The display device according to claim 1, wherein adjusting the angle of the camera to the target angle or a limit angle corresponding to the target adjustment direction includes:
    controlling the camera to rotate to the target angle or the limit angle at a constant speed at a preset speed.
  10. A display method, the method comprising:
    receiving user operation for adjusting the angle of the camera, wherein the user operation carries an identifier for indicating the target adjustment direction and an operation mode;
    if the operation mode indicated by the identification is a first operation mode, determining a target angle according to the target adjustment direction and a stored starting angle, and adjusting the angle of the camera to the target angle, wherein the stored starting angle is the last determined target angle or the angle of the camera when the adjustment is stopped last time;
    and if the operation mode indicated by the identification is a second operation mode, adjusting the angle of the camera to a limit angle corresponding to the target adjustment direction.
CN202180053612.5A 2020-08-07 2021-05-27 Display equipment and display method Pending CN116264864A (en)

Applications Claiming Priority (7)

Application Number Priority Date Filing Date Title
CN2020107898247 2020-08-07
CN202010789824.7A CN111970548B (en) 2020-08-07 2020-08-07 Display device and method for adjusting angle of camera
CN2020108517384 2020-08-21
CN202010851738 2020-08-21
CN202110156378.0A CN112954425A (en) 2020-08-21 2021-02-04 Display device and camera control method
CN2021101563780 2021-02-04
PCT/CN2021/096429 WO2022028060A1 (en) 2020-08-07 2021-05-27 Display device and display method

Publications (1)

Publication Number Publication Date
CN116264864A true CN116264864A (en) 2023-06-16

Family

ID=80119870

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202180053612.5A Pending CN116264864A (en) 2020-08-07 2021-05-27 Display equipment and display method

Country Status (2)

Country Link
CN (1) CN116264864A (en)
WO (1) WO2022028060A1 (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114739361B (en) * 2022-02-25 2023-06-13 中国科学院空天信息创新研究院 Earth observation method, apparatus, electronic device and storage medium
CN115031694B (en) * 2022-04-25 2023-06-09 中国科学院空天信息创新研究院 Earth observation method, apparatus, storage medium, and program product

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2007043629A (en) * 2005-06-30 2007-02-15 Sony Corp Graphic user interface device, operation input processing method, and two-way communication device
CN105120162B (en) * 2015-08-27 2019-04-16 Oppo广东移动通信有限公司 A kind of camera method of controlling rotation and terminal
CN110418050A (en) * 2018-04-26 2019-11-05 Oppo广东移动通信有限公司 Camera control method, device, mobile terminal and the storage medium of mobile terminal
CN110213489B (en) * 2019-06-20 2021-06-15 维沃移动通信有限公司 Control method, control device and terminal equipment
CN111970548B (en) * 2020-08-07 2022-10-21 海信视像科技股份有限公司 Display device and method for adjusting angle of camera

Also Published As

Publication number Publication date
WO2022028060A1 (en) 2022-02-10

Similar Documents

Publication Publication Date Title
CN112214189B (en) Image display method and display device
CN111970548B (en) Display device and method for adjusting angle of camera
CN112291599B (en) Display device and method for adjusting angle of camera
CN112019782B (en) Control method and display device of enhanced audio return channel
CN111970549B (en) Menu display method and display device
CN112243141B (en) Display method and display equipment for screen projection function
CN112118400A (en) Display method of image on display device and display device
CN112087671A (en) Display method and display equipment for control prompt information of input method control
CN116264864A (en) Display equipment and display method
CN112306604B (en) Progress display method and display device for file transmission
CN112040535B (en) Wifi processing method and display device
CN111984167B (en) Quick naming method and display device
CN116017006A (en) Display device and method for establishing communication connection with power amplifier device
CN114390190B (en) Display equipment and method for monitoring application to start camera
CN111918056B (en) Camera state detection method and display device
CN113542878B (en) Wake-up method based on face recognition and gesture detection and display device
CN114302203A (en) Image display method and display device
CN114302197A (en) Voice separation control method and display device
CN114417035A (en) Picture browsing method and display device
CN111988649A (en) Control separation amplification method and display device
CN113438553B (en) Display device awakening method and display device
CN112866768B (en) Display device and information prompting method
CN113194355B (en) Video playing method and display equipment
CN113825001B (en) Panoramic picture browsing method and display device
CN112231088B (en) Browser process optimization method and display device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination