CN113645502B - Method for dynamically adjusting control and display device - Google Patents


Info

Publication number
CN113645502B
CN113645502B (application CN202010342885.9A)
Authority
CN
China
Prior art keywords: user, distance, position parameter, initial position, target control
Legal status: Active
Application number: CN202010342885.9A
Other languages: Chinese (zh)
Other versions: CN113645502A
Inventors: 王大勇, 鲍姗娟, 王卫明, 陈验方
Current Assignee: Hisense Visual Technology Co Ltd
Original Assignee: Hisense Visual Technology Co Ltd
Application filed by Hisense Visual Technology Co Ltd
Priority: CN202010342885.9A
PCT filing: PCT/CN2021/081562 (published as WO2021218473A1)
Publication of CN113645502A
Application granted
Publication of CN113645502B

Classifications

    • H04N 21/00: Selective content distribution, e.g. interactive television or video on demand [VOD]; H04N 21/40: Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB], with the following subgroups:
    • H04N 21/44218: Detecting physical presence or behaviour of the user, e.g. using sensors to detect if the user is leaving the room or changes his face expression during a TV program
    • H04N 21/4223: Cameras
    • H04N 21/4312: Generation of visual interfaces for content selection or interaction involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations
    • H04N 21/4318: Generation of visual interfaces for content selection or interaction by altering the content in the rendering process, e.g. blanking, blurring or masking an image region

Abstract

The application discloses a method for dynamically adjusting a control, and a display device. While the user moves from an initial position to an end position, the controller receives the environment image data acquired by the camera at the initial position and at the end position, and from it obtains the user's initial position parameter and end position parameter. It then calculates the offset of the target control from those two parameters, derives the control's termination position parameter from the control's initial position parameter and the offset, and moves the control to the position corresponding to the termination position parameter. The position of the target control thus follows the user's movement, so that the viewing angle at which the user sees the control remains unchanged anywhere within the visual range of the display device's camera. This ensures the user can clearly see the control's display content and improves the subjective visual experience.

Description

Method for dynamically adjusting control and display device
Technical Field
The application relates to the technical field of smart televisions, and in particular to a method for dynamically adjusting a control, and a display device.
Background
With the continuous development of communication technology, terminal devices such as computers, smartphones and display devices have become increasingly widespread, and display devices such as smart televisions have gradually entered people's daily lives, providing entertainment. To offer rich display content, applications implementing various functions are installed on the display device, and their content is shown on its display.
When an application is shown in the display interface, the interface contains a number of controls that provide various functions. For example, when an application receives a new message, the message is displayed in the interface as a notification or pop-up, which then acts as a control; or, while one application is shown in the interface, the content of another application is displayed above it in a floating window, which likewise acts as a control.
Existing display devices usually show a control at a fixed position in the display interface, such as the upper-right corner. However, because the user may use the display device from different directions or at different distances, a control at a fixed position subtends a different visual angle from each viewing position, so the control's content appears more or less clear depending on where the user stands. In particular, when the user is far from the display device, the user often cannot see the control's display content clearly, which degrades the user experience.
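The underlying geometry is simple: the visual angle that a fixed on-screen element subtends shrinks as the viewer moves away, which is why a fixed control becomes hard to read from a distance. A minimal sketch of that relationship (the function name and units are illustrative, not from the patent):

```python
import math

def visual_angle_deg(element_size_cm: float, viewer_distance_cm: float) -> float:
    """Visual angle (in degrees) subtended by an on-screen element of the
    given size when viewed from the given distance."""
    return math.degrees(2 * math.atan(element_size_cm / (2 * viewer_distance_cm)))

# The same 10 cm control subtends a much smaller angle from 6 m than from 2 m,
# so its content looks correspondingly smaller and less legible.
near = visual_angle_deg(10, 200)
far = visual_angle_deg(10, 600)
```

Keeping this angle constant as the viewer moves is the stated goal of the adjustment method.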
Disclosure of Invention
The application provides a method for dynamically adjusting a control, and a display device, to solve the problem that a user cannot clearly see a control's display content when using the display device from a position far away from it.
In a first aspect, the present application provides a display device comprising:
a camera configured to acquire environmental image data characterizing a position parameter of a user relative to a display;
a display configured to present a display interface in which a target control is displayed;
a controller in communication with the display and the camera, respectively, the controller configured to:
in the process that a user moves from an initial position to an end position, acquiring initial position parameters of the target control, and receiving environment image data corresponding to the initial position and environment image data corresponding to the end position, which are acquired by the camera;
acquiring a user initial position parameter carried in the environment image data corresponding to the initial position and a user end position parameter carried in the environment image data corresponding to the end position;
calculating the offset of the target control based on the user initial position parameter and the user end position parameter, wherein the offset is used for representing the movement parameter of the target control when the position is adjusted;
and obtaining a termination position parameter of the target control based on the initial position parameter of the target control and the offset of the target control, and moving the target control to a position corresponding to the termination position parameter.
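The four claimed steps can be sketched as follows. All type and function names are hypothetical, since the patent specifies parameters rather than code; the offset computation (detailed by the later claims) is passed in as a separate step:

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Point:
    x: float  # horizontal position on the display, in pixels
    y: float  # vertical position on the display, in pixels

def adjust_control(
    control_initial: Point,
    user_initial: Point,
    user_end: Point,
    compute_offset: Callable[[Point, Point], Point],
) -> Point:
    """Orchestrate the claimed steps: derive the offset from the user's
    start/end position parameters, then shift the control by that offset
    to obtain its termination position."""
    offset = compute_offset(user_initial, user_end)
    return Point(control_initial.x + offset.x, control_initial.y + offset.y)
```

For example, with a naive offset function that simply follows the user's displacement, a control at (100, 50) moves to (130, 40) when the user's tracked point moves by (+30, -10).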
Further, the controller, in performing the calculating the offset of the target control based on the user initial position parameter and the user end position parameter, is further configured to:
the initial user position parameters comprise an initial relative distance and initial position parameters of a user relative to the display, the end user position parameters comprise an end relative distance and end position parameters of the user relative to the display, and the position parameters corresponding to the user refer to parameters of the center point of the face frame;
calculating a first distance between the initial position parameter corresponding to the user and the initial position parameter of the target control, and a second distance between the ending position parameter corresponding to the user and the initial position parameter of the target control, wherein the position parameter corresponding to the target control refers to a parameter of a control center point;
calculating a theoretical second distance of the user when the user moves to the end position based on the initial relative distance, the end relative distance and the first distance, wherein the theoretical second distance is used for representing a theoretical distance between the end position corresponding to the user and the end position of the target control;
and calculating the distance difference between the theoretical second distance and the second distance to obtain the offset of the target control.
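Under the reading that the theoretical second distance scales the first distance by the ratio of the viewer's end and initial distances from the display (which is what keeps the viewing angle constant), the offset computation for one axis might look like this. Both the names and the formula reading are assumptions:

```python
def control_offset(first_distance: float,
                   second_distance: float,
                   initial_relative_distance: float,
                   end_relative_distance: float) -> float:
    """Offset of the target control along one axis.

    Assumed reading of the patent's formula: S2' = (BN * S1) / AX, i.e.
    scaling the on-screen distance S1 by the ratio of viewer distances
    (BN / AX) preserves the viewing angle. The difference between that
    theoretical distance and the actual second distance is how far the
    control must move."""
    theoretical_second = (end_relative_distance * first_distance
                          / initial_relative_distance)
    return theoretical_second - second_distance
```

With S1 = 30, AX = 200 and BN = 300, the theoretical second distance is 45; if the actual second distance is 20, the control must move by 25.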
Further, the controller, in performing the calculating the first distance between the initial position parameter corresponding to the user and the initial position parameter of the target control, is further configured to:
acquiring the pixel point number of the center point of the face frame when the user is at the initial position and the pixel point number of the control center point of the target control;
calculating the pixel point number difference between the pixel point number of the center point of the face frame and the pixel point number of the center point of the control;
and calculating a first distance between the initial position parameter corresponding to the user and the initial position parameter of the target control according to the pixel point number difference and the length value of each pixel point.
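This pixel-difference calculation amounts to the following; names are illustrative, and the pixel length is assumed to be a known physical constant of the display:

```python
def on_screen_distance(face_center_px: int, control_center_px: int,
                       pixel_length: float) -> float:
    """Distance between the face-frame center and the control center,
    computed from their pixel indices along one axis and the physical
    length of a single pixel (e.g. in cm)."""
    return abs(face_center_px - control_center_px) * pixel_length
```

The same computation, applied to the end-position image, yields the second distance in the claim that follows.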
Further, the controller, in performing the calculating a second distance between the end position parameter corresponding to the user and the initial position parameter of the target control, is further configured to:
acquiring the pixel point number of the center point of the face frame when the user is positioned at the end position and the pixel point number of the control center point of the target control;
calculating the pixel point number difference between the pixel point number of the center point of the face frame and the pixel point number of the center point of the control;
and calculating a second distance between the ending position parameter corresponding to the user and the initial position parameter of the target control according to the pixel number difference and the length value of each pixel.
Further, the controller, in performing the calculating of the theoretical second distance of the user when moving to the end position based on the initial relative distance, the end relative distance, and the first distance, is further configured to:
according to formula S 2 '=BNS 1 -AX, calculating a theoretical second distance of the user when moving to the end position;
in the formula, S 2 ' is a theoretical second distance, S 1 AX is the initial relative distance and BN is the ending relative distance.
Further, the controller, in executing the obtaining of the initial position parameter of the target control, is further configured to:
establishing a reference coordinate system by taking the upper left corner of the display interface as a coordinate origin, taking the direction from the left side to the right side of the display interface as an X-axis forward direction, and taking the direction from the upper side to the lower side of the display interface as a Y-axis forward direction;
acquiring the number of pixels of the origin of coordinates and the number of horizontal pixels and the number of vertical pixels of the control center point of the target control;
calculating a horizontal pixel point number difference between the pixel point number of the coordinate origin and the horizontal pixel point number of the control center point, and a longitudinal pixel point number difference between the pixel point number of the coordinate origin and the longitudinal pixel point number of the control center point;
and calculating a transverse initial distance and a longitudinal initial distance between a control center point and a coordinate origin of the target control according to the transverse pixel point number difference, the longitudinal pixel point number difference and the length value of each pixel point, and taking the transverse initial distance, the longitudinal initial distance, the transverse pixel point number and the longitudinal pixel point number of the control center point as initial position parameters of the target control.
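Because the origin's pixel count is zero, the pixel differences reduce to the control center's own pixel counts. A hypothetical sketch of assembling the initial position parameters in the patent's reference frame (top-left origin, X rightward, Y downward); all names are illustrative:

```python
def control_initial_parameters(center_x_px: int, center_y_px: int,
                               pixel_length: float) -> dict:
    """Initial position parameters of the target control: the transverse
    and longitudinal distances of the control center from the top-left
    origin, plus the raw pixel counts, as the claim enumerates."""
    return {
        "x_distance": center_x_px * pixel_length,  # transverse initial distance
        "y_distance": center_y_px * pixel_length,  # longitudinal initial distance
        "x_pixels": center_x_px,
        "y_pixels": center_y_px,
    }
```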
Further, the controller, in performing the calculating the offset of the target control based on the user initial position parameter and the user end position parameter, is further configured to:
the user initial position parameters comprise transverse initial position parameters and longitudinal initial position parameters, and the user end position parameters comprise transverse end position parameters and longitudinal end position parameters;
calculating the transverse offset of the target control based on the transverse initial position parameter and the transverse end position parameter;
and calculating the longitudinal offset of the target control based on the longitudinal initial position parameter and the longitudinal end position parameter.
Further, the controller, in executing the obtaining of the ending position parameter of the target control based on the initial position parameter of the target control and the offset of the target control, is further configured to:
the initial position parameters of the target control comprise a transverse initial position parameter and a longitudinal initial position parameter;
calculating a transverse termination position parameter of the target control according to the transverse initial position parameter and the transverse offset of the target control;
and calculating the longitudinal termination position parameter of the target control according to the longitudinal initial position parameter and the longitudinal offset of the target control.
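Combining the two per-axis claims, the termination position might be assembled as follows (names illustrative; the transverse and longitudinal components are computed independently and then used together):

```python
def termination_parameters(control_initial: tuple, offsets: tuple) -> tuple:
    """Per-axis termination position of the target control: each
    coordinate is the corresponding initial coordinate plus the
    corresponding offset."""
    (x0, y0), (dx, dy) = control_initial, offsets
    return (x0 + dx, y0 + dy)
```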
Further, the controller is further configured to:
receiving environmental image data collected by the camera;
identifying the number of human faces in the environment image data;
and when the number of the human faces in the environment image data is 1, executing the step of acquiring the initial position parameter and the end position parameter of the user.
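This gating condition is a one-line check; the rationale, presumably, is that with several detected faces there is no single viewpoint to track (the function name is illustrative):

```python
def should_adjust(face_count: int) -> bool:
    """Run the dynamic adjustment only when exactly one face is detected
    in the environment image data, as the claim specifies."""
    return face_count == 1
```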
In a second aspect, the present application further provides a method for dynamically adjusting a control, including the following steps:
in the process that a user moves from an initial position to an end position, acquiring initial position parameters of the target control, and receiving environment image data corresponding to the initial position and environment image data corresponding to the end position, which are acquired by the camera;
acquiring a user initial position parameter carried in the environment image data corresponding to the initial position and a user end position parameter carried in the environment image data corresponding to the end position;
calculating the offset of the target control based on the user initial position parameter and the user end position parameter, wherein the offset is used for representing the movement parameter of the target control when the position is adjusted;
and obtaining a termination position parameter of the target control based on the initial position parameter of the target control and the offset of the target control, and moving the target control to a position corresponding to the termination position parameter.
In a third aspect, the present application further provides a computer storage medium that may store a program which, when executed, implements some or all of the steps of the embodiments of the method for dynamically adjusting a control provided in the present application.
According to the technical solution above, while the user moves from an initial position to an end position, the controller receives the environment image data acquired by the camera at the initial position and at the end position, and from it obtains the user's initial position parameter and end position parameter; it calculates the offset of the target control from those parameters, derives the control's termination position parameter from the control's initial position parameter and the offset, and moves the control to the corresponding position. The method and display device provided by the embodiments thus adjust the position of the target control as the user moves, so that the viewing angle at which the user sees the control is unchanged anywhere within the visual range of the display device's camera, ensuring the user can clearly see the control's display content and improving the subjective visual experience.
Drawings
To explain the technical solution of the present application more clearly, the drawings needed in the embodiments are briefly described below; those skilled in the art can obtain other drawings from these drawings without creative effort.
Fig. 1 is a schematic diagram illustrating an operation scenario between a display device and a control apparatus according to an embodiment;
fig. 2 is a block diagram exemplarily showing a hardware configuration of a display device 200 according to an embodiment;
fig. 3 is a block diagram exemplarily showing a hardware configuration of the control apparatus 100 according to the embodiment;
fig. 4 is a diagram exemplarily showing a functional configuration of the display device 200 according to the embodiment;
fig. 5a schematically shows a software configuration in the display device 200 according to an embodiment;
fig. 5b schematically shows a configuration of an application in the display device 200 according to an embodiment;
FIG. 6 is a flow diagram illustrating a method of dynamically adjusting a control according to an embodiment;
FIG. 7 is a flow diagram illustrating a method of obtaining initial position parameters of a target control according to an embodiment;
FIG. 8 is a schematic diagram of a reference coordinate system according to an embodiment;
fig. 9 is a diagram schematically illustrating environment image data corresponding to an initial position according to an embodiment;
fig. 10 is a diagram exemplarily illustrating environment image data corresponding to an end position according to an embodiment;
FIG. 11 is a schematic diagram illustrating an exemplary change in position of a center point of a face frame on a display interface during movement of a user according to an embodiment;
FIG. 12 is a flow diagram illustrating a method of calculating an offset for a target control according to an embodiment;
FIG. 13 is a diagram illustrating the determination of the theoretical second distance according to an embodiment;
FIG. 14 illustrates a first diagram when dynamically adjusting the position of a control according to an embodiment;
FIG. 15 is a second schematic diagram of dynamically adjusting the position of a control according to an embodiment.
Detailed Description
To make the objects, technical solutions and advantages of the exemplary embodiments of the present application clearer, the technical solutions in the exemplary embodiments of the present application will be clearly and completely described below with reference to the drawings in the exemplary embodiments of the present application, and it is obvious that the described exemplary embodiments are only a part of the embodiments of the present application, but not all the embodiments.
All other embodiments obtained by a person skilled in the art from the exemplary embodiments shown in the present application without inventive effort shall fall within the scope of protection of the present application. Moreover, while the disclosure is presented in terms of one or more exemplary examples, it should be understood that each aspect of the disclosure may also be used separately from the other aspects.
It should be understood that the terms "first," "second," "third," and the like in the description, claims and drawings of the present application are used to distinguish similar objects and not necessarily to describe a particular order or sequence. Data so termed are interchangeable where appropriate, so that the embodiments of the application can, for example, be implemented in orders other than those illustrated or described herein.
Furthermore, the terms "comprises" and "comprising," as well as any variations thereof, are intended to cover a non-exclusive inclusion, such that a product or device that comprises a list of elements is not necessarily limited to those elements explicitly listed, but may include other elements not expressly listed or inherent to such product or device.
The term "module," as used herein, refers to any known or later developed hardware, software, firmware, artificial intelligence, fuzzy logic, or combination of hardware and/or software code that is capable of performing the functionality associated with that element.
The term "remote control" as used in this application refers to a component of an electronic device (such as the display device disclosed in this application) that can control the device wirelessly, typically over a relatively short range. It usually connects to the electronic device using infrared and/or radio frequency (RF) signals and/or Bluetooth, and may also include WiFi, wireless USB, Bluetooth and motion-sensor modules. For example, a hand-held touch remote control replaces most of the physical built-in hard keys of a common remote control device with a user interface on a touch screen.
The term "gesture" as used in this application refers to a user behavior that conveys an intended idea, action, purpose or result through a change of hand shape or a hand movement.
Fig. 1 is a schematic diagram illustrating an operation scenario between a display device and a control apparatus according to an embodiment. As shown in fig. 1, the user may operate the display device 200 through the mobile terminal 300 and the control apparatus 100.
The control apparatus 100 may be, for example, a remote controller that controls the display device 200 wirelessly or by other means, including infrared protocol communication, Bluetooth protocol communication and other short-range communication. The user may control the display apparatus 200 by entering user commands through keys on the remote controller, voice input, a control panel, and so on. For example, the user may input corresponding control commands through the volume up/down keys, channel control keys, up/down/left/right movement keys, voice input key, menu key and power on/off key on the remote controller to control the display apparatus 200.
In some embodiments, mobile terminals, tablets, computers, laptops, and other smart devices may also be used to control the display device 200. For example, the display device 200 is controlled using an application program running on the smart device. The application, through configuration, may provide the user with various controls in an intuitive User Interface (UI) on a screen associated with the smart device.
For example, the mobile terminal 300 and the display device 200 may each install a software application, implementing connection and communication through a network communication protocol and thereby achieving one-to-one control operation and data communication. For example, the mobile terminal 300 and the display device 200 can establish a control instruction protocol and synchronize the remote-control keyboard to the mobile terminal 300, so that the display device 200 is controlled through the user interface on the mobile terminal 300; the audio and video content displayed on the mobile terminal 300 can also be transmitted to the display device 200 to implement a synchronous display function.
As also shown in fig. 1, the display apparatus 200 also performs data communication with the server 400 through various communication means. The display device 200 may be communicatively connected through a Local Area Network (LAN), a Wireless Local Area Network (WLAN) and other networks. The server 400 may provide various contents and interactions to the display apparatus 200. Illustratively, the display device 200 receives software program updates, or accesses a remotely stored digital media library, by sending and receiving information and through Electronic Program Guide (EPG) interactions. The server 400 may be one group or multiple groups of servers, of one or more types; other web service content, such as video on demand and advertisement services, is provided through the server 400.
The display device 200 may be a liquid crystal display, an OLED display, a projection display device. The particular display device type, size, resolution, etc. are not limiting, and those skilled in the art will appreciate that the display device 200 may be modified in performance and configuration as desired.
The display apparatus 200 may additionally provide an intelligent network tv function that provides a computer support function in addition to the broadcast receiving tv function. Examples include a web tv, a smart tv, an Internet Protocol Tv (IPTV), and the like.
A hardware configuration block diagram of a display device 200 according to an exemplary embodiment is exemplarily shown in fig. 2. As shown in fig. 2, the display device 200 includes a controller 210, a tuner-demodulator 220, a communication interface 230, a detector 240, an input/output interface 250, a video processor 260-1, an audio processor 260-2, a display 280, an audio output 270, a memory 290, a power supply, and an infrared receiver.
A display 280 for receiving the image signal from the video processor 260-1 and displaying video content, images and the components of the menu manipulation interface. The display 280 includes a display screen assembly for presenting the picture, and a driving assembly that drives image display. The displayed video content may come from broadcast television, or from broadcast signals received via wired or wireless communication protocols; alternatively, various image content received from a network server through network communication protocols may be displayed.
Meanwhile, the display 280 simultaneously displays a user manipulation UI interface generated in the display apparatus 200 and used to control the display apparatus 200.
The driving assembly used depends on the type of the display 280. Where the display 280 is a projection display, it may also comprise a projection device and a projection screen.
The communication interface 230 is a component for communicating with an external device or an external server according to various communication protocol types. For example: the communication interface 230 may be a Wifi chip 231, a bluetooth communication protocol chip 232, a wired ethernet communication protocol chip 233, or other network communication protocol chips or near field communication protocol chips, and an infrared receiver (not shown).
The display apparatus 200 may establish the transmission and reception of control signals and data signals with an external control apparatus or content providing apparatus through the communication interface 230. The infrared receiver is an interface device for receiving infrared control signals from the control apparatus 100 (e.g., an infrared remote controller).
The detector 240 is a component the display device 200 uses to collect signals from the external environment or to interact with the outside. The detector 240 includes a light receiver 242, a sensor for collecting the intensity of ambient light, so that display parameters can be adapted to changes in the ambient light.
The image acquisition device 241, such as a camera, may be used to acquire the external environment scene, acquire attributes of the user, or capture the user's interaction gestures, so as to adaptively change display parameters and recognize user gestures, thereby implementing an interaction function with the user.
In some other exemplary embodiments, the detector 240 may include, for example, a temperature sensor; by sensing the ambient temperature, the display device 200 may adaptively adjust the display color temperature of the image. For example, the display apparatus 200 may be adjusted to display a cool tone when the ambient temperature is high, or a warm tone when the ambient temperature is low.
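The temperature-driven color-tone adaptation described above can be sketched as a simple mapping. The temperature thresholds and Kelvin color-temperature values below are illustrative assumptions, not values taken from the patent:

```python
def adjust_color_temperature(ambient_temp_c: float) -> int:
    """Map ambient temperature (degrees Celsius) to a display color
    temperature in Kelvin.

    Higher Kelvin values look cooler (bluer); lower values look warmer
    (more yellow). The 28/18 degree thresholds and the Kelvin targets
    are illustrative assumptions.
    """
    if ambient_temp_c >= 28:      # hot environment -> cool tone
        return 9300
    if ambient_temp_c <= 18:      # cold environment -> warm tone
        return 5000
    return 6500                   # comfortable range -> neutral tone
```

A real implementation would ramp between these set points rather than switch abruptly, to avoid visible tone jumps.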
In other exemplary embodiments, the detector 240 may include a sound collector, such as a microphone, which may be used to receive the user's voice, including a voice signal carrying a control instruction from the user to control the display device 200, or to collect ambient sound for identifying the type of ambient scene so that the display apparatus 200 can adapt to the ambient noise.
The input/output interface 250, under the control of the controller 210, handles data transmission between the display device 200 and other external devices, such as receiving video and audio signals or command instructions from an external device.
Input/output interface 250 may include, but is not limited to, the following: any one or more of high definition multimedia interface HDMI interface 251, analog or data high definition component input interface 253, composite video input interface 252, USB input interface 254, RGB ports (not shown in the figures), etc.
In some other exemplary embodiments, the input/output interface 250 may also form a composite input/output interface with the above-mentioned plurality of interfaces.
The tuner-demodulator 220 receives broadcast television signals in a wired or wireless manner, may perform modulation and demodulation processing such as amplification, mixing, and resonance, and demodulates, from a plurality of wireless or wired broadcast television signals, the television audio/video signals and EPG data signals carried in the television channel frequency selected by the user.
The tuner-demodulator 220, under the control of the controller 210, responds to the television signal frequency selected by the user and the television signal carried at that frequency.
The tuner-demodulator 220 may receive signals in various ways according to the broadcasting system of the television signal, such as terrestrial broadcast, cable broadcast, satellite broadcast, or internet broadcast signals; and according to the modulation type, the modulation mode may be digital or analog. Depending on the type of television signal received, both analog and digital signals can be processed.
In other exemplary embodiments, the tuner/demodulator 220 may be in an external device, such as an external set-top box. In this way, the set-top box outputs television audio/video signals after modulation and demodulation, and the television audio/video signals are input into the display device 200 through the input/output interface 250.
The video processor 260-1 is configured to receive an external video signal and perform video processing such as decompression, decoding, scaling, noise reduction, frame rate conversion, resolution conversion, and image synthesis according to the standard codec protocol of the input signal, so as to obtain a signal that can be displayed or played directly on the display device 200.
Illustratively, the video processor 260-1 includes a demultiplexing module, a video decoding module, an image synthesizing module, a frame rate conversion module, a display formatting module, and the like.
The demultiplexing module is used for demultiplexing the input audio/video data stream; for example, if an MPEG-2 stream is input, the demultiplexing module demultiplexes it into a video signal and an audio signal.
And the video decoding module is used for processing the video signal after demultiplexing, including decoding, scaling and the like.
The image synthesis module is used to superpose and mix the GUI signal, input by the user or generated by the graphic generator, with the scaled video image, so as to generate an image signal for display.
The frame rate conversion module is configured to convert the input video frame rate, for example converting a 60 Hz frame rate into a 120 Hz or 240 Hz frame rate; the usual conversion is implemented by, for example, frame interpolation.
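A minimal sketch of frame rate doubling by frame insertion follows. Real frame-rate converters use motion estimation to synthesize intermediate frames; simple averaging of neighboring frames, as below, is an illustrative stand-in, and representing a frame as a flat list of pixel values is likewise an assumption for brevity:

```python
def double_frame_rate(frames):
    """Convert e.g. a 60 Hz frame sequence to 120 Hz by inserting one
    interpolated frame between each pair of source frames.

    Each frame is a list of pixel values; the inserted frame is the
    per-pixel average of its two neighbors (an illustrative
    simplification of motion-compensated interpolation).
    """
    out = []
    for a, b in zip(frames, frames[1:]):
        out.append(a)
        out.append([(x + y) / 2 for x, y in zip(a, b)])  # interpolated frame
    out.append(frames[-1])  # last source frame has no successor to blend with
    return out
```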
The display formatting module is used to convert the received video output signal after frame rate conversion, changing it into a signal conforming to the display format, such as an output RGB data signal.
The audio processor 260-2 is configured to receive an external audio signal, decompress and decode the received audio signal according to a standard codec protocol of the input signal, and perform noise reduction, digital-to-analog conversion, amplification processing, and the like to obtain an audio signal that can be played in the speaker.
In other exemplary embodiments, video processor 260-1 may comprise one or more chips. The audio processor 260-2 may also comprise one or more chips.
And, in other exemplary embodiments, the video processor 260-1 and the audio processor 260-2 may be separate chips or may be integrated together with the controller 210 in one or more chips.
An audio output 272 receives the sound signal output by the audio processor 260-2 under the control of the controller 210. Besides the speaker 272 carried by the display device 200 itself, it may include an external sound output terminal 274 that outputs to a sound generating device of an external device, such as an external sound interface or an earphone interface.
The power supply, under the control of the controller 210, provides power support for the display device 200 from power input from an external power source. The power supply may include a built-in power supply circuit installed inside the display device 200, or a power supply interface installed outside the display device 200 that provides external power to the display device 200.
A user input interface for receiving an input signal of a user and then transmitting the received user input signal to the controller 210. The user input signal may be a remote controller signal received through an infrared receiver, and various user control signals may be received through the network communication module.
For example, the user inputs a user command through the remote controller 100 or the mobile terminal 300; the user input interface passes the input to the controller 210, and the display device 200 responds to the user input.
In some embodiments, a user may enter a user command on a Graphical User Interface (GUI) displayed on the display 280, and the user input interface receives the user input command through the Graphical User Interface (GUI). Alternatively, the user may input the user command by inputting a specific sound or gesture, and the user input interface receives the user input command by recognizing the sound or gesture through the sensor.
The controller 210 controls the operation of the display apparatus 200 and responds to the user's operation through various software control programs stored in the memory 290.
As shown in fig. 2, the controller 210 includes a RAM213 and a ROM214, as well as a graphics processor 216, a CPU processor 212, and a communication interface 218, such as: a first interface 218-1 through an nth interface 218-n, and a communication bus. The RAM213 and the ROM214, the graphic processor 216, the CPU processor 212, and the communication interface 218 are connected via a bus.
A ROM213 for storing instructions for various system boots. If the display apparatus 200 starts power-on upon receipt of the power-on signal, the CPU processor 212 executes a system boot instruction in the ROM, copies the operating system stored in the memory 290 to the RAM213, and starts running the boot operating system. After the start of the operating system is completed, the CPU processor 212 copies the various application programs in the memory 290 to the RAM213, and then starts running and starting the various application programs.
A graphics processor 216 generates various graphics objects, such as icons, operation menus, and graphics displayed in response to user input instructions. It includes an arithmetic unit, which performs operations by receiving the various interactive instructions input by the user and displays various objects according to their display attributes, and a renderer, which generates the various objects based on the arithmetic unit and displays the rendered result on the display 280.
A CPU processor 212 for executing operating system and application program instructions stored in memory 290. And executing various application programs, data and contents according to various interactive instructions received from the outside so as to finally display and play various audio and video contents.
In some exemplary embodiments, the CPU processor 212 may include a plurality of processors, for example one main processor and one or more sub-processors. The main processor performs some operations of the display apparatus 200 in a pre-power-up mode and/or operations of displaying a screen in the normal mode. The one or more sub-processors perform operations in a standby mode or the like.
The controller 210 may control the overall operation of the display apparatus 200. For example, in response to receiving a user command for selecting a UI object displayed on the display 280, the controller 210 may perform an operation related to the object selected by the user command.
The object may be any selectable object, such as a hyperlink or an icon. Operations related to the selected object include, for example, displaying the page, document, or image connected to a hyperlink, or launching the program corresponding to an icon. The user command for selecting the UI object may be a command input through various input means (e.g., a mouse, a keyboard, a touch pad) connected to the display apparatus 200, or a voice command corresponding to speech spoken by the user.
The memory 290 stores various software modules for driving the display device 200, including a basic module, a detection module, a communication module, a display control module, a browser module, various service modules, and the like.
The basic module is a bottom-layer software module for signal communication among the various hardware components in the display device 200 and for sending processing and control signals to the upper-layer modules. The detection module is used for collecting various information from the various sensors or the user input interface, and for performing digital-to-analog conversion and analysis management.
For example, the voice recognition module includes a voice analysis module and a voice instruction database module. The display control module controls the display 280 to display image content and may be used to play multimedia image content, UI interfaces, and other information. The communication module is used for control and data communication with external devices. The browser module is used for data communication with browsing servers. The service module is used to provide various services, including the various application programs.
Meanwhile, the memory 290 also stores received external data and user data, images of the various items in various user interfaces, visual effect maps of the focus object, and the like.
A block diagram of the configuration of the control apparatus 100 according to an exemplary embodiment is exemplarily shown in fig. 3. As shown in fig. 3, the control apparatus 100 includes a controller 110, a communication interface 130, a user input/output interface 140, a memory 190, and a power supply 180.
The control device 100 is configured to control the display device 200 and may receive an input operation instruction of a user and convert the operation instruction into an instruction recognizable and responsive by the display device 200, serving as an interaction intermediary between the user and the display device 200. Such as: the user operates the channel up/down keys on the control device 100, and the display device 200 responds to the channel up/down operation.
In some embodiments, the control device 100 may be a smart device. Such as: the control apparatus 100 may install various applications for controlling the display apparatus 200 according to user demands.
In some embodiments, as shown in fig. 1, a mobile terminal 300 or other intelligent electronic device may serve a function similar to that of the control device 100 after installing an application that manipulates the display device 200. For example, by installing such an application, the user may use the various function keys or virtual buttons of the graphical user interface available on the mobile terminal 300 or other intelligent electronic device to implement the functions of the physical keys of the control device 100.
The controller 110 includes a processor 112, a RAM 113 and a ROM 114, a communication interface 218, and a communication bus. The controller 110 is used to control the running and operation of the control device 100, the communication and coordination among its internal components, and external and internal data processing functions.
The communication interface 130 enables communication of control signals and data signals with the display apparatus 200 under the control of the controller 110. Such as: the received user input signal is transmitted to the display apparatus 200. The communication interface 130 may include at least one of a WiFi chip, a bluetooth module, an NFC module, and other near field communication modules.
A user input/output interface 140, wherein the input interface includes at least one of a microphone 141, a touch pad 142, a sensor 143, keys 144, and other input interfaces. Such as: the user can realize a user instruction input function through actions such as voice, touch, gesture, pressing, and the like, and the input interface converts the received analog signal into a digital signal and converts the digital signal into a corresponding instruction signal, and sends the instruction signal to the display device 200.
The output interface includes an interface that transmits the received user instruction to the display apparatus 200. In some embodiments, the interface may be an infrared interface or a radio frequency interface. For example, when the infrared signal interface is used, the user input instruction is converted into an infrared control signal according to an infrared control protocol and sent to the display device 200 through the infrared sending module. As another example, when the RF signal interface is used, the user input command is converted into a digital signal, modulated according to the RF control signal modulation protocol, and then transmitted to the display device 200 through the RF transmitting terminal.
In some embodiments, the control device 100 includes at least one of the communication interface 130 and the output interface. When the control device 100 is configured with the communication interface 130, such as a WiFi, Bluetooth, or NFC module, it may encode and transmit the user input command to the display device 200 through the WiFi, Bluetooth, or NFC protocol.
A memory 190 stores, under the control of the controller 110, various operation programs, data, and applications for driving and controlling the control apparatus 100. The memory 190 may store various control signal commands input by the user.
A power supply 180 provides operational power support to the various elements of the control device 100 under the control of the controller 110, and may include a battery and associated control circuitry.
Fig. 4 is a diagram schematically illustrating a functional configuration of the display device 200 according to an exemplary embodiment. As shown in fig. 4, the memory 290 is used to store an operating system, an application program, contents, user data, and the like, and performs system operations for driving the display device 200 and various operations in response to a user under the control of the controller 210. The memory 290 may include volatile and/or nonvolatile memory.
The memory 290 is specifically configured to store an operating program for driving the controller 210 in the display device 200, and to store various application programs installed in the display device 200, various application programs downloaded by a user from an external device, various graphical user interfaces related to the applications, various objects related to the graphical user interfaces, user data information, and internal data of various supported applications. The memory 290 is used to store system software such as an OS kernel, middleware, and applications, and to store input video data and audio data, and other user data.
The memory 290 is specifically used for storing drivers and related data such as the audio/video processors 260-1 and 260-2, the display 280, the communication interface 230, the tuning demodulator 220, the input/output interface of the detector 240, and the like.
In some embodiments, memory 290 may store software and/or programs, software programs for representing an Operating System (OS) including, for example: a kernel, middleware, an Application Programming Interface (API), and/or an application program. For example, the kernel may control or manage system resources, or functions implemented by other programs (e.g., the middleware, APIs, or applications), and the kernel may provide interfaces to allow the middleware and APIs, or applications, to access the controller to implement controlling or managing system resources.
The memory 290, for example, includes a broadcast receiving module 2901, a channel control module 2902, a volume control module 2903, an image control module 2904, a display control module 2905, an audio control module 2906, an external instruction recognition module 2907, a communication control module 2908, a light receiving module 2909, a power control module 2910, an operating system 2911, and other applications 2912, a browser module, and the like. The controller 210 performs functions such as: a broadcast television signal reception demodulation function, a television channel selection control function, a volume selection control function, an image control function, a display control function, an audio control function, an external instruction recognition function, a communication control function, an optical signal reception function, an electric power control function, a software control platform supporting various functions, a browser function, and the like.
A block diagram of a configuration of a software system in a display device 200 according to an exemplary embodiment is exemplarily shown in fig. 5 a.
As shown in fig. 5a, an operating system 2911, including executing operating software for handling various basic system services and for performing hardware related tasks, acts as an intermediary for data processing performed between application programs and hardware components. In some embodiments, portions of the operating system kernel may contain a series of software to manage the display device hardware resources and provide services to other programs or software code.
In other embodiments, portions of the operating system kernel may include one or more device drivers, which may be a set of software code in the operating system that assists in operating or controlling the devices or hardware associated with the display device. The drivers may contain code that operates the video, audio, and/or other multimedia components. Examples include a display screen, a camera, Flash, WiFi, and audio drivers.
The accessibility module 2911-1 is configured to modify or access the application program to achieve accessibility and operability of the application program for displaying content.
A communication module 2911-2 for connection to other peripherals via associated communication interfaces and a communication network.
The user interface module 2911-3 is used to provide objects that display the user interface, for access by the various applications, thereby enabling user operability.
Control applications 2911-4 for controllable process management, including runtime applications and the like.
The event transmission system 2914 may be implemented within the operating system 2911 or within the application program 2912; in some embodiments it is implemented partly within the operating system 2911 and partly within the application program 2912. It is configured to listen for various user input events and, in response to the recognition of the various types of events or sub-events, invoke the handlers that perform one or more predefined sets of operations.
The event monitoring module 2914-1 is configured to monitor an event or a sub-event input by the user input interface.
The event recognition module 2914-2 is configured with definitions of the various types of events for the various user input interfaces; it recognizes the various events or sub-events and dispatches them to the processes that execute the corresponding one or more sets of handlers.
An event or sub-event refers to an input detected by one or more sensors in the display device 200, or an input from an external control device (e.g., the control device 100), such as various sub-events of voice input, gesture input through gesture recognition, or sub-events of remote control key-press input from the control device. Illustratively, the one or more sub-events from the remote control take a variety of forms, including but not limited to one or a combination of pressing the up/down/left/right keys or the OK key, long key presses, and the like, as well as non-physical key operations such as move, hold, and release.
The interface layout manager 2913, directly or indirectly receiving the input events or sub-events from the event transmission system 2914, monitors the input events or sub-events, and updates the layout of the user interface, including but not limited to the position of each control or sub-control in the interface, and the size, position, and level of the container, and other various execution operations related to the layout of the interface.
As shown in fig. 5b, the application layer 2912 contains various applications that may also be executed at the display device 200. The application may include, but is not limited to, one or more applications such as: live television applications, video-on-demand applications, media center applications, application centers, gaming applications, and the like.
The live television application program can provide live television through different signal sources. For example, a live television application may provide television signals using input from cable television, radio broadcasts, satellite services, or other types of live television services. And, the live television application may display video of the live television signal on display device 200.
A video-on-demand application may provide video from different storage sources. Unlike live television applications, video on demand provides a video display from some storage source. For example, the video on demand may come from a server side of cloud storage, from a local hard disk storage containing stored video programs.
The media center application program can provide various applications for playing multimedia contents. For example, a media center, which may be other than live television or video on demand, may provide services that a user may access to various images or audio through a media center application.
The application program center can provide and store various application programs. The application may be a game, an application, or some other application associated with a computer system or other device that may be run on the smart television. The application center may obtain these applications from different sources, store them in local storage, and then be executable on the display device 200.
In order to enable a user to still clearly see the display content of a control when using the display device from a distant position, the display device provided by the embodiment of the invention can make the control adjust its position as the user moves. For example, if the user moves left in front of the display device, the control follows the user and moves a corresponding distance left in the display interface, so that the visual angle between the user and the control remains unchanged. In this way, no matter where the user is located relative to the display device, the user can clearly see the display content of the control at a fixed visual angle.
Specifically, an embodiment of the present invention provides a display device, including a controller, and a display and a camera each in communication with the controller. The camera is configured to acquire environment image data, which characterizes the position parameters of the user relative to the display; the camera sends the acquired environment image data to the controller, so that the controller can obtain the user's position parameters. The position parameters include the vertical distance between the user and the display, and the position on the display interface at which the center point of the user's face frame vertically projects onto the display. The face frame is the calibration frame of the user's face when the camera captures an image of the user in front of the display device; the center point of the face frame may be the center of the face frame or the midpoint between the user's two pupils. The display is configured to present a display interface in which a target control is displayed; the target control may be a notification, a popup box, a floating window, or the like.
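The position parameters described above (a perpendicular user-to-display distance plus the projected face-frame center point) can be held in a small record. The following is a sketch; the type and field names are illustrative assumptions, not taken from the patent:

```python
from dataclasses import dataclass

@dataclass
class UserPositionParameter:
    """Position parameters of a user relative to the display.

    distance_mm: vertical (perpendicular) distance between the user
        and the display.
    projection_x, projection_y: coordinates on the display interface
        where the center point of the user's face frame projects
        perpendicularly onto the display, in the reference coordinate
        system (origin at the upper-left corner, X to the right,
        Y downward).
    """
    distance_mm: float
    projection_x: float
    projection_y: float
```

The controller would derive one such record from the environment image data at the user's initial position and another at the end position, then compute the control's displacement from the difference.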
In order to give the display device the function of dynamically adjusting the position of a control, a control key for enabling this function may be configured in the controller. When the display device is required to make the control on the display page follow the user's movement, the control key can be turned on in advance so that the display device has the dynamic adjustment function. If the control key is not turned on, the control is displayed normally and does not adjust its position as the user moves.
A flow diagram of a method of dynamically adjusting a control according to an embodiment is illustrated in fig. 6. Referring to fig. 6, in the display device provided in the embodiment of the present invention, when implementing dynamic adjustment of the control, the controller implements control based on a face recognition algorithm and a distance detection algorithm, and specifically, the controller is configured to perform the following steps:
S1: in the process of the user moving from an initial position to an end position, acquire the initial position parameter of the target control, and receive the environment image data corresponding to the initial position and the environment image data corresponding to the end position acquired by the camera.
The user turns on the control key, so that once the display device has the dynamic control-adjustment function, the controller acquires the environment image data collected by the camera in real time. If the position of a user using the display device changes, the position before the change is taken as the user's initial position, and the position after the change is taken as the user's end position.
Because the camera collects the environment image data in front of the display device in real time, the environment image data corresponding to the user's initial position and end position can both be obtained. Environment image data corresponding to different positions characterize different relative distances between the user and the display, and different positions on the display interface where the center point of the user's face frame vertically projects onto the display.
In order to realize the position adjustment of the control, the environment image data acquired by the camera needs to include the face of the user. In some embodiments, the controller performs face number recognition on the environment image data, and when only one face is recognized, the subsequent method for dynamically adjusting the control is continuously performed.
Specifically, the controller is further configured to: receive the environment image data collected by the camera; identify the number of human faces in the environment image data; and, when the number of human faces in the environment image data is 1, execute the step of acquiring the initial position parameter and the end position parameter of the user.
When only one user is using the display device, executing the method of dynamically adjusting the control adjusts the control's position according to that user's position parameters, which improves accuracy. If multiple faces are detected, the controller may display the target control normally without executing the method of dynamically adjusting the control.
In other embodiments, if faces of multiple users are present in the same piece of environment image data, the controller may instead select one of the users as the target following user and use that user as the basis for controlling the control's position adjustment.
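The face-count gating described in the two preceding paragraphs can be sketched as follows. The function name and the tie-break rule for multiple faces (following the largest face frame) are illustrative assumptions; the patent leaves the selection rule open:

```python
def select_following_user(face_boxes):
    """Decide whether dynamic control adjustment should run, and for whom.

    face_boxes: list of (x, y, w, h) face frames detected in one piece
    of environment image data. Returns the face frame to follow, or
    None when the control should simply be displayed normally.

    With exactly one face, that user is followed. With several faces,
    this sketch follows the largest face frame (an assumed tie-break).
    With no face, no adjustment occurs.
    """
    if not face_boxes:
        return None
    if len(face_boxes) == 1:
        return face_boxes[0]
    # Multiple users: pick the face frame with the largest area.
    return max(face_boxes, key=lambda box: box[2] * box[3])
```

A stricter variant matching the first embodiment would return None whenever more than one face is present.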
FIG. 7 is a flow diagram illustrating a method of obtaining the initial position parameter of the target control according to an embodiment; fig. 8 exemplarily shows a schematic diagram of a reference coordinate system according to an embodiment. To adjust the position of the target control, the initial position of the target control is first determined. Referring to fig. 7, in executing the step of obtaining the initial position parameter of the target control, the controller is further configured to perform the following steps:
S121: establish a reference coordinate system with the upper left corner of the display interface as the coordinate origin, the direction from the left side to the right side of the display interface as the positive X-axis direction, and the direction from the top to the bottom of the display interface as the positive Y-axis direction.
In order to accurately determine the initial position parameters of the target control, in this embodiment, a reference coordinate system may be established in the display interface. Referring to fig. 8, the origin O of the reference coordinate system is set at the upper left corner of the display interface, the X-axis forward direction is the direction from the left side to the right side of the display interface, and the Y-axis forward direction is the direction from the top side to the bottom side of the display interface.
And S122, acquiring the number of pixels of the coordinate origin and the number of horizontal pixels and the number of vertical pixels of the control center point of the target control.
The initial position parameter of the target control can be represented by coordinate values, and the horizontal and vertical coordinate values can be calculated according to pixel points of the control center point of the target control.
Therefore, according to the system attributes of the display device, the controller can obtain the resolution of the current display device, and further determine the number of pixel points of the coordinate origin and the number of pixel points of the control center point of the target control. Since the coordinate origin is located at the upper left corner of the display interface, the number of pixel points of the coordinate origin can accordingly be determined to be 0.
In order to accurately represent the initial position parameter of the target control, in this embodiment, the coordinate position of the control center point M of the target control is used. The number of horizontal pixels and the number of vertical pixels of the control center point M are therefore obtained respectively: the number of horizontal pixels refers to the number of pixels contained in the X-axis direction between the control center point M and the coordinate origin O, and the number of vertical pixels refers to the number of pixels contained in the Y-axis direction between the control center point M and the coordinate origin O.
S123, calculating a horizontal pixel point number difference between the pixel point number of the coordinate origin and the horizontal pixel point number of the control center point, and a longitudinal pixel point number difference between the pixel point number of the coordinate origin and the longitudinal pixel point number of the control center point.
The number of pixels of the coordinate origin O is 0, and the coordinate corresponding to the coordinate origin is (0, 0). The number of horizontal pixel points of the control center point is P1 and the number of vertical pixel points is P2, so the pixel point coordinate corresponding to the control center point is (P1, P2).
Horizontal pixel point number difference M1 = horizontal pixel point number of the control center point − pixel point number of the coordinate origin = P1 − 0 = P1.
Vertical pixel point number difference M2 = vertical pixel point number of the control center point − pixel point number of the coordinate origin = P2 − 0 = P2.
And S124, calculating a horizontal initial distance and a vertical initial distance between the control center point and the coordinate origin of the target control according to the horizontal pixel point number difference, the vertical pixel point number difference and the length value of each pixel point, and taking the horizontal initial distance, the vertical initial distance, the horizontal pixel point number and the vertical pixel point number of the control center point as initial position parameters of the target control.
For a display device with a fixed resolution, the number of pixel points contained in the display interface is fixed, that is, one resolution corresponds to one group of pixel point numbers. Therefore, the spacing between two adjacent pixel points, namely the length value of each pixel point, can be obtained. If a pixel point is square, the length value and the width value of the pixel point are the same. Multiplying the length value of each pixel point by the pixel point number difference yields the corresponding distance.
Transverse initial distance L1 = horizontal pixel point number difference × pixel point length = M1 × n, which determines the transverse initial distance between the control center point of the target control and the coordinate origin.
Longitudinal initial distance L2 = vertical pixel point number difference × pixel point length = M2 × n, which determines the longitudinal initial distance between the control center point of the target control and the coordinate origin.
The coordinate of the control center point of the target control is determined according to the transverse initial distance and the longitudinal initial distance, that is, the coordinate of the control center point is (L1, L2); the pixel point coordinate of the control center point of the target control is determined according to the number of horizontal pixel points and the number of vertical pixel points, that is, the pixel point coordinate of the control center point is (P1, P2). The coordinate of the control center point and the pixel point coordinate of the control center point are taken as the initial position parameters of the target control.
For example, in a 60-inch display device, the length of the long side is fixed at about 135.5 cm, and for a common display interface with a resolution of 1080P, the corresponding pixel spacing (the length value of a pixel point) is about 0.00071 m. If the pixel point coordinate of the control center point of the target control is (720P, 480P), the transverse initial distance can be determined as L1 = 720 × 0.00071 = 0.5112 m and the longitudinal initial distance as L2 = 480 × 0.00071 = 0.3408 m, that is, the coordinate of the control center point of the target control is determined to be (0.5112 m, 0.3408 m).
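The conversion in steps S121 to S124 can be sketched in Python; the function name and the default pixel length of 0.00071 m (taken from the 60-inch, 1080P example above) are assumptions for illustration only:

```python
def control_initial_position(px, py, pixel_len=0.00071):
    """Convert the control center point's pixel counts (px, py), measured
    from the coordinate origin O at the upper-left corner, into metric
    coordinates (L1, L2).

    The origin's pixel count is 0, so the pixel point number differences
    are M1 = px - 0 and M2 = py - 0; multiplying by the length value of
    each pixel point gives the transverse and longitudinal initial
    distances.
    """
    m1 = px - 0  # horizontal pixel point number difference
    m2 = py - 0  # vertical pixel point number difference
    l1 = m1 * pixel_len  # transverse initial distance L1
    l2 = m2 * pixel_len  # longitudinal initial distance L2
    return (l1, l2), (px, py)

# Example from the text: control center point at (720P, 480P)
(l1, l2), pixel_coord = control_initial_position(720, 480)
```

With the sample values this reproduces the coordinates (0.5112 m, 0.3408 m) derived above.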
S2, acquiring the initial position parameter of the user carried in the environment image data corresponding to the initial position, and the end position parameter of the user carried in the environment image data corresponding to the end position.
Because the environment image data collected by the camera carries both the vertical distance between the user and the display and the position on the display interface where the center point of the user's face frame vertically falls on the display, the controller can directly obtain the user initial position parameter and the user end position parameter from the acquired environment image data.
Fig. 9 is a diagram schematically illustrating environment image data corresponding to an initial position according to an embodiment; fig. 10 is a schematic diagram illustrating environment image data corresponding to an end position according to an embodiment.
Referring to fig. 9, when the user is at an initial position in front of the display device, the controller may directly acquire a vertical distance (relative distance) between the user and the display from the corresponding environment image data, for example, 1.70m as shown in fig. 9. Referring to fig. 10, when the user moves from the initial position to the end position, the controller may directly obtain a vertical distance (relative distance) between the user and the display, for example, 2.24m shown in fig. 10, from the environment image data corresponding to the end position.
Fig. 11 is a schematic diagram illustrating a change in position of a center point of a face frame on a display interface during movement of a user according to an embodiment. Referring to fig. 11, when the user is located at the initial position a, and the center point of the face frame of the user vertically falls on the display, the position on the display interface where the center point is located is point X, and the AX connection line is perpendicular to the display interface. When the user moves to the end position B and the center point of the face frame of the user vertically falls on the display, the position on the display interface where the center point is located is a point N, and the BN connecting line is vertical to the display interface.
Therefore, when the user is at the initial position A, the connecting line AX is the vertical distance (relative distance) between the user and the display, and the point X is the position on the display interface where the center point of the face frame of the user vertically falls on the display, so that the connecting line AX and the point X can determine the initial position parameters of the user. When the user moves to the ending position B, the connection line BN is the vertical distance (relative distance) between the user and the display, and the point N is the position on the display interface when the center point of the face frame of the user vertically falls on the display, so that the parameters of the ending position of the user can be determined by the connection line BN and the point N.
And S3, calculating the offset of the target control based on the user initial position parameter and the user end position parameter, wherein the offset is used for representing the moving parameter of the target control when the position is adjusted.
In order to keep the viewing angle of the user unchanged, that is, the viewing angle at which the user views the display content of the target control does not change, the display device provided by this embodiment enables the user to clearly view the display content of the target control no matter where the user is located.
Therefore, in order to accurately determine the parameters of the target control in the position adjustment, the offset of the target control can be calculated according to the user initial position parameter and the user end position parameter, and the position parameter of the target control required to move is determined according to the offset.
The initial position parameters of the user comprise an initial relative distance and initial position parameters of the user relative to the display, the end position parameters of the user comprise an end relative distance and end position parameters of the user relative to the display, and the position parameters corresponding to the user refer to the parameters of the center point of the face frame. The initial position parameter refers to a position on a display interface where a center point of a face frame of a user vertically falls on a display when the user is at the initial position. The ending position parameter refers to a position on a display interface where a center point of a face frame of a user vertically falls on a display when the user moves to an ending position.
A flow diagram of a method of calculating an offset for a target control according to an embodiment is illustrated in fig. 12. Referring to fig. 12, in performing the step of calculating the offset of the target control based on the user initial position parameter and the user end position parameter, the controller is further configured to perform the steps of:
and S31, calculating a first distance between the initial position parameter corresponding to the user and the initial position parameter of the target control, and a second distance between the end position parameter corresponding to the user and the initial position parameter of the target control, wherein the position parameter corresponding to the target control is the parameter of the center point of the control.
The first distance is a plane distance between a position X on a display interface where a user faces and a control center point M of a target control when the user faces and the center point of a face frame of the user vertically falls on a display when the user is at an initial position A; the second distance is a plane distance between a position N on the display interface where the center point of the face frame of the user vertically falls on the display and the control center point M of the target control when the user moves to the end position B.
Taking the user's translation from the initial position to the end position as an example, the first distance (XM line) represents the lateral planar distance along the display interface, and the second distance (NM line) represents the lateral planar distance along the display interface. In this embodiment, the first distance and the second distance may be calculated by a pixel point difference between a center point of the face frame and a center point of a control of the target control.
Specifically, the controller, in performing the calculation of the first distance between the initial position parameter corresponding to the user and the initial position parameter of the target control, is further configured to:
and 311, acquiring the pixel point number of the center point of the face frame and the pixel point number of the center point of the control of the target control when the user is at the initial position.
And step 312, calculating the pixel point difference between the pixel point number of the center point of the face frame and the pixel point number of the center point of the control.
And 313, calculating a first distance between the initial position parameter corresponding to the user and the initial position parameter of the target control according to the pixel point number difference and the length value of each pixel point.
The controller may obtain the number of pixel points of the center point of the face frame when the user is at the initial position from the environment image data corresponding to the initial position, and may obtain the number of pixel points of the control center point of the target control from the system attributes. The pixel point coordinates corresponding to these two points can then be read in the reference coordinate system.
Taking the translation of the user from the initial position to the end position as an example, the number of pixels at the center point of the face frame does not change in the Y-axis direction, and therefore, the difference in the number of pixels can be calculated according to the number of pixels at the center point of the face frame corresponding to the initial position and the number of pixels at the center point of the control, that is, according to the number of horizontal pixels at the center point of the face frame and the number of horizontal pixels at the center point of the control. The specific calculation method of the pixel point difference and the first distance may refer to the contents of steps S121 to S124 provided in the foregoing embodiments, and details are not repeated here.
For example, when the user is at the initial position A, the number of pixel points of the face frame center point X is 480P, and the number of pixel points of the control center point of the target control is 720P, so the pixel point number difference is determined to be 720P − 480P = 240P.
Since the length value of each pixel point is 0.00071 m, the first distance can be determined as S1 = 240 × 0.00071 = 0.1704 m.
The controller, in performing the calculating of the second distance between the end position parameter corresponding to the user and the initial position parameter of the target control, is further configured to:
and 321, acquiring the pixel point number of the center point of the face frame and the pixel point number of the center point of the control of the target control when the user is at the end position.
And 322, calculating the pixel point difference between the pixel point number of the center point of the face frame and the pixel point number of the center point of the control.
And 323, calculating a second distance between the ending position parameter corresponding to the user and the initial position parameter of the target control according to the pixel point number difference and the length value of each pixel point.
Taking the translation of the user from the initial position to the end position as an example, the number of pixels at the center point of the face frame does not change in the Y-axis direction, and therefore, the difference in the number of pixels can be calculated according to the number of pixels at the center point of the face frame corresponding to the end position and the number of pixels at the center point of the control, that is, according to the number of horizontal pixels at the center point of the face frame and the number of horizontal pixels at the center point of the control. The specific calculation method of the pixel point difference and the second distance may refer to the contents of steps S121 to S124 provided in the foregoing embodiments, and details are not repeated here.
For example, when the user is at the end position B, the number of pixel points of the face frame center point N is 360P, and the number of pixel points of the control center point of the target control is 720P, so the pixel point number difference is determined to be 720P − 360P = 360P.
Since the length value of each pixel point is 0.00071 m, the second distance can be determined as S2 = 360 × 0.00071 = 0.2556 m.
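Steps 311 to 313 and 321 to 323 share the same pixel-difference computation; a minimal sketch under the same assumed pixel length of 0.00071 m (the function name is an assumption for illustration):

```python
def plane_distance(face_px, control_px, pixel_len=0.00071):
    """Plane distance between the position where the face frame center
    point falls on the display interface and the control center point,
    computed from their horizontal pixel point numbers."""
    pixel_diff = abs(control_px - face_px)  # pixel point number difference
    return pixel_diff * pixel_len

# Values from the examples above: control center point at 720P,
# face frame center at 480P (initial position) and 360P (end position)
s1 = plane_distance(480, 720)  # first distance
s2 = plane_distance(360, 720)  # second distance
```

With the sample values this reproduces S1 = 0.1704 m and S2 = 0.2556 m from the examples above.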
And S32, calculating a theoretical second distance when the user moves to the end position based on the initial relative distance, the end relative distance and the first distance, wherein the theoretical second distance is used for representing the theoretical distance between the end position corresponding to the user and the end position of the target control.
Because the second distance between the user and the control center point of the target control after the user moves cannot guarantee that the viewing angle at which the user views the target control is the same as when the user was at the initial position, the position of the target control needs to be adjusted; that is, the theoretical second distance required for the user to view the target control at the same viewing angle after moving to the end position needs to be determined.
In this embodiment, the controller calculates a theoretical second distance of the user when moving to the end position based on the initial relative distance, the end relative distance, and the first distance as follows:
S2' = BN × S1 / AX;
where S2' is the theoretical second distance, S1 is the first distance, AX is the initial relative distance, and BN is the ending relative distance.
Fig. 13 exemplarily shows the determination of the theoretical second distance according to an embodiment. Referring to fig. 11 and 13, in order to ensure that the viewing angle at which the user views the target control remains consistent during movement, the included angle formed by the line from the face frame center point to the control center point and the line from the face frame center point to the display interface must be the same, that is, α = β. Here, α is the included angle formed when the user is at the initial position, namely the angle between the AM line and the AX line; β is the included angle formed when the user moves to the end position, namely the angle between the BM line and the BN line.
To make α = β, tan(α) = tan(β) is required, that is, S1/AX = S2'/BN, from which the theoretical second distance can be calculated as S2' = BN × S1 / AX.
And S33, calculating the distance difference between the theoretical second distance and the second distance, and obtaining the offset of the target control.
The theoretical second distance is a theoretical distance between the end position corresponding to the user and the end position M' of the target control, and therefore, the Offset of the target control is obtained according to the distance difference between the theoretical second distance and the second distance.
Offset = S2 − S2'.
The offset is the distance by which the control center point M of the target control needs to move to reach point M'.
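Steps S32 and S33 can be combined into one hedged sketch; the sample values AX = 1.70 m and BN = 2.24 m are taken from figs. 9 and 10, and S1 and S2 from the examples above (the function name is an assumption):

```python
def control_offset(ax, bn, s1, s2):
    """Offset of the target control that keeps the viewing angle fixed.

    Requiring alpha == beta gives tan(alpha) = tan(beta), i.e.
    S1 / AX = S2' / BN, hence S2' = BN * S1 / AX and Offset = S2 - S2'.
    """
    s2_theoretical = bn * s1 / ax  # theoretical second distance S2'
    return s2 - s2_theoretical

offset = control_offset(ax=1.70, bn=2.24, s1=0.1704, s2=0.2556)
```

With these values the theoretical second distance is about 0.2245 m, so the offset is about 0.0311 m.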
And S4, obtaining the termination position parameter of the target control based on the initial position parameter of the target control and the offset of the target control, and moving the target control to the position corresponding to the termination position parameter.
And determining the termination position of the target control after the position of the target control needs to be adjusted according to the initial position parameter of the target control and the offset of the target control, and realizing the position adjustment of the target control according to the termination position parameter.
A first schematic diagram of dynamically adjusting the position of a control according to an embodiment is illustrated in fig. 14. Referring to fig. 14, point M is the initial position parameter of the target control, and point M' is the ending position parameter of the target control. The target control is moved from point M to point M', realizing the position adjustment of the target control as the user moves from the initial position A to the end position B. At this time, the ending position parameter of the target control = the initial position parameter − the offset.
In the above embodiment, when the user translates from the initial position a to the end position B, the position of the target control is adjusted. In practical applications, there may be a vertical movement of the user during the movement, that is, the user changes from a standing state to a sitting state, and at this time, there is a position change of the user in the Y-axis direction in the vertical direction.
In order to adapt to the situation that the user changes in both the X-axis direction and the Y-axis direction, the display device provided in the embodiment of the present invention needs to determine the horizontal offset and the vertical offset when determining the offset of the target control. For example, when the user changes from a state of standing right in front of the display device to a state of sitting on a chair at the back left, the target control needs to be controlled to move from the initial position to the lower left corner.
At this time, the user initial position parameters include a lateral initial position parameter and a portrait initial position parameter, and the user end position parameters include a lateral end position parameter and a portrait end position parameter. The lateral initial position parameters include a lateral initial relative distance of the user relative to the display at the initial position and a lateral initial position parameter, and the portrait initial position parameters include a portrait initial relative distance of the user relative to the display at the initial position and a portrait initial position parameter. The lateral end position parameters include a lateral end relative distance of the user relative to the display at the end position and a lateral end position parameter, and the portrait end position parameters include a portrait end relative distance of the user relative to the display at the end position and a portrait end position parameter.
The longitudinal relative distance (including the initial position and the end position) refers to a distance corresponding to the movement of the center point of the face frame along the Y axis, that is, a height difference between the height of the center point of the face frame when the user stands and the height of the center point of the face frame when the user sits down. The longitudinal position parameters (including the initial position and the end position) refer to the position of the center point of the face frame of the user on the display interface when the user longitudinally moves to the end position and the center point vertically falls on the display.
Specifically, the controller in executing calculating the offset of the target control based on the user initial position parameter and the user end position parameter is further configured to:
and 701, calculating the transverse offset of the target control based on the transverse initial position parameter and the transverse end position parameter.
And step 702, calculating the longitudinal offset of the target control based on the longitudinal initial position parameter and the longitudinal end position parameter.
When the horizontal offset and the longitudinal offset of the target control are calculated, the calculation may be performed by referring to all the contents described in step S3 provided in the foregoing embodiment, that is, the horizontal offset of the target control is calculated according to the horizontal initial position parameter and the horizontal end position parameter, and the longitudinal offset of the target control is calculated according to the longitudinal initial position parameter and the longitudinal end position parameter, which is not described herein again in detail.
After the transverse offset and the longitudinal offset of the target control are determined, the termination position parameter after the position of the target control is adjusted can be determined according to the initial position parameter of the target control. In this embodiment, the initial position parameters of the target control include a horizontal initial position parameter and a vertical initial position parameter, and therefore, the determined end position parameters of the target control also include a horizontal end position parameter and a vertical end position parameter of the target control.
Specifically, the controller, in executing the step of obtaining the termination position parameter of the target control based on the initial position parameter of the target control and the offset of the target control, is further configured to:
step 801, calculating a transverse ending position parameter of the target control according to the transverse initial position parameter and the transverse offset of the target control.
And step 802, calculating a longitudinal termination position parameter of the target control according to the longitudinal initial position parameter and the longitudinal offset of the target control.
According to the horizontal initial position parameter and the horizontal offset of the target control, the horizontal end position of the target control after the position of the target control needs to be adjusted can be determined, and according to the longitudinal initial position parameter and the longitudinal offset of the target control, the longitudinal end position of the target control after the position of the target control needs to be adjusted can be determined. And according to the transverse end position parameter and the longitudinal end position parameter, realizing the position adjustment of the target control, so that the target control can adjust the position along with the movement of the user.
A second schematic diagram in accordance with an embodiment of dynamically adjusting the position of a control is illustrated in fig. 15. Referring to fig. 15, when the user changes in both the X-axis direction and the Y-axis direction, for example, the user changes from a state of standing right in front of the display device to a state of sitting on a chair at the back left, the target control needs to be controlled to move from the initial position to the lower left corner.
At this time, the horizontal end position parameter of the target control = the horizontal initial position parameter − the horizontal offset, and the vertical end position parameter of the target control = the vertical initial position parameter + the vertical offset.
As can be seen from the foregoing embodiments, in the process of a user moving from an initial position to an end position, the controller receives the environment image data corresponding to the initial position and the environment image data corresponding to the end position collected by the camera to obtain the user initial position parameter and the user end position parameter; calculates the offset of the target control according to the user initial position parameter and the user end position parameter; and obtains the ending position parameter of the target control based on the initial position parameter of the target control and the offset of the target control, moving the target control to the position corresponding to the ending position parameter. Therefore, the display device provided by the embodiment of the invention can adjust the position of the target control along with the movement of the user, so that the viewing angle of a user watching the target control from any direction within the visual range of the camera remains unchanged, the user can clearly see the display content of the control, and the subjective visual experience of the user is improved.
A flow diagram of a method of dynamically adjusting a control according to an embodiment is illustrated in fig. 6. The application also provides a method for dynamically adjusting the control, which is executed by a controller in the display equipment, and the method comprises the following steps:
s1, acquiring initial position parameters of the target control in the process that a user moves from an initial position to an end position, and receiving environment image data corresponding to the initial position and environment image data corresponding to the end position, wherein the environment image data are acquired by the camera;
s2, acquiring user initial position parameters carried in the environment image data corresponding to the initial position and user end position parameters carried in the environment image data corresponding to the end position;
s3, calculating the offset of the target control based on the user initial position parameter and the user end position parameter, wherein the offset is used for representing the movement parameter of the target control when the position is adjusted;
s4, obtaining the termination position parameter of the target control based on the initial position parameter of the target control and the offset of the target control, and moving the target control to the position corresponding to the termination position parameter.
Further, the calculating the offset of the target control based on the user initial position parameter and the user end position parameter includes:
the initial user position parameters comprise an initial relative distance and initial position parameters of a user relative to the display, the end user position parameters comprise an end relative distance and end position parameters of the user relative to the display, and the position parameters corresponding to the user refer to parameters of the center point of the face frame;
calculating a first distance between the initial position parameter corresponding to the user and the initial position parameter of the target control, and a second distance between the ending position parameter corresponding to the user and the initial position parameter of the target control, wherein the position parameter corresponding to the target control refers to a parameter of a control center point;
calculating a theoretical second distance of the user when the user moves to the end position based on the initial relative distance, the end relative distance and the first distance, wherein the theoretical second distance is used for representing a theoretical distance between the end position corresponding to the user and the end position of the target control;
and calculating the distance difference between the theoretical second distance and the second distance to obtain the offset of the target control.
Further, the calculating a first distance between the initial position parameter corresponding to the user and the initial position parameter of the target control includes:
acquiring the pixel point number of the center point of the face frame when the user is at the initial position and the pixel point number of the control center point of the target control;
calculating the pixel point number difference between the pixel point number of the center point of the face frame and the pixel point number of the center point of the control;
and calculating a first distance between the initial position parameter corresponding to the user and the initial position parameter of the target control according to the pixel point number difference and the length value of each pixel point.
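The pixel-count arithmetic above amounts to one subtraction and one multiplication; a minimal sketch, assuming the physical length of one pixel is known from the panel specification (the names here are illustrative):

```python
def pixel_distance(face_center_px, control_center_px, pixel_length_mm):
    """First distance from the pixel indices of the face-frame centre
    and the control centre: (index difference) x (length per pixel)."""
    pixel_diff = abs(face_center_px - control_center_px)
    return pixel_diff * pixel_length_mm

# centres 300 pixels apart on a panel whose pixels are 0.5 mm wide
print(pixel_distance(960, 660, 0.5))  # 150.0
```

The same helper serves the second distance in the following step, with the end-position face-frame centre substituted for the initial one.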
Further, the calculating a second distance between the end position parameter corresponding to the user and the initial position parameter of the target control includes:
acquiring the pixel point number of the center point of the face frame when the user is positioned at the end position and the pixel point number of the control center point of the target control;
calculating the pixel point number difference between the pixel point number of the center point of the face frame and the pixel point number of the center point of the control;
and calculating a second distance between the ending position parameter corresponding to the user and the initial position parameter of the target control according to the pixel number difference and the length value of each pixel.
Further, the calculating a theoretical second distance of the user when moving to the end position based on the initial relative distance, the end relative distance and the first distance includes:
according to the formula S2' = BN·S1/AX, calculating the theoretical second distance of the user when moving to the end position;
in the formula, S2' is the theoretical second distance, S1 is the first distance, AX is the initial relative distance, and BN is the ending relative distance.
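As a worked numeric check of this formula (a sketch, not the patented implementation): with the user initially 2.0 m from the display (AX), a first distance S1 of 0.5 m, and an ending relative distance BN of 4.0 m, the theoretical second distance doubles to 1.0 m, and the offset is its difference from the measured second distance:

```python
def theoretical_second_distance(s1, ax, bn):
    """S2' = BN * S1 / AX: scale the first distance by the ratio of
    the ending relative distance to the initial relative distance."""
    return bn * s1 / ax

def control_offset(s1, s2, ax, bn):
    """Offset of the target control: theoretical second distance
    minus the measured second distance."""
    return theoretical_second_distance(s1, ax, bn) - s2

print(theoretical_second_distance(0.5, 2.0, 4.0))  # 1.0
print(control_offset(0.5, 0.75, 2.0, 4.0))         # 0.25
```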
Further, the obtaining of the initial position parameter of the target control includes:
establishing a reference coordinate system by taking the upper left corner of the display interface as a coordinate origin, taking the direction from the left side to the right side of the display interface as an X-axis forward direction, and taking the direction from the upper side to the lower side of the display interface as a Y-axis forward direction;
acquiring the number of pixels of the origin of coordinates and the number of horizontal pixels and the number of vertical pixels of the control center point of the target control;
calculating a horizontal pixel point number difference between the pixel point number of the coordinate origin and the horizontal pixel point number of the control center point, and a longitudinal pixel point number difference between the pixel point number of the coordinate origin and the longitudinal pixel point number of the control center point;
and calculating a transverse initial distance and a longitudinal initial distance between a control center point and a coordinate origin of the target control according to the transverse pixel point number difference, the longitudinal pixel point number difference and the length value of each pixel point, and taking the transverse initial distance, the longitudinal initial distance, the transverse pixel point number and the longitudinal pixel point number of the control center point as initial position parameters of the target control.
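Because the coordinate origin sits at pixel (0, 0) in the upper left corner, the transverse and longitudinal pixel-count differences equal the control centre's own indices; a sketch under that assumption (names illustrative):

```python
def control_initial_position(cx_px, cy_px, pixel_length_mm):
    """Initial position parameters of a control centred at pixel
    (cx_px, cy_px) in a frame with the origin at the upper left
    corner, X increasing to the right and Y increasing downward."""
    return {
        "x_mm": cx_px * pixel_length_mm,  # transverse initial distance
        "y_mm": cy_px * pixel_length_mm,  # longitudinal initial distance
        "cx_px": cx_px,                   # transverse pixel index
        "cy_px": cy_px,                   # longitudinal pixel index
    }

print(control_initial_position(400, 200, 0.5))
# {'x_mm': 200.0, 'y_mm': 100.0, 'cx_px': 400, 'cy_px': 200}
```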
Further, the calculating the offset of the target control based on the user initial position parameter and the user end position parameter includes:
the user initial position parameters comprise transverse initial position parameters and longitudinal initial position parameters, and the user end position parameters comprise transverse end position parameters and longitudinal end position parameters;
calculating the transverse offset of the target control based on the transverse initial position parameter and the transverse end position parameter;
and calculating the longitudinal offset of the target control based on the longitudinal initial position parameter and the longitudinal end position parameter.
Further, the obtaining the termination position parameter of the target control based on the initial position parameter of the target control and the offset of the target control includes:
the initial position parameters of the target control comprise a transverse initial position parameter and a longitudinal initial position parameter;
calculating a transverse termination position parameter of the target control according to the transverse initial position parameter and the transverse offset of the target control;
and calculating the longitudinal termination position parameter of the target control according to the longitudinal initial position parameter and the longitudinal offset of the target control.
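The two axes are handled independently, so the termination position parameter is a per-axis addition; a one-line sketch:

```python
def termination_position(x0, y0, dx, dy):
    """Add the transverse offset to the transverse initial parameter
    and the longitudinal offset to the longitudinal one."""
    return x0 + dx, y0 + dy

print(termination_position(200.0, 100.0, 25.0, -10.0))  # (225.0, 90.0)
```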
Further, the method further includes:
receiving environmental image data collected by the camera;
identifying the number of human faces in the environment image data;
and when the number of the faces in the environment image data is 1, executing a step of acquiring a user initial position parameter and a user end position parameter.
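The face-count check is a simple gate; a sketch (the count itself would come from a face detector, which is outside this snippet):

```python
def should_adjust(face_count):
    """Run the position-parameter steps only when exactly one face is
    detected; zero or multiple faces give no single user to track."""
    return face_count == 1

print(should_adjust(1), should_adjust(0), should_adjust(2))  # True False False
```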
In specific implementation, the present invention further provides a computer storage medium, where the computer storage medium may store a program which, when executed, may perform some or all of the steps of the embodiments of the method for dynamically adjusting a control provided by the present invention. The storage medium may be a magnetic disk, an optical disk, a read-only memory (ROM), or a random access memory (RAM).
Those skilled in the art will readily appreciate that the techniques of the embodiments of the present invention may be implemented as software plus a required general purpose hardware platform. Based on such understanding, the technical solutions in the embodiments of the present invention may be essentially or partially implemented in the form of a software product, which may be stored in a storage medium, such as ROM/RAM, magnetic disk, optical disk, etc., and includes several instructions for enabling a computer device (which may be a personal computer, a server, or a network device, etc.) to execute the method according to the embodiments or some parts of the embodiments.
The same and similar parts among the various embodiments in this specification may be referred to each other. In particular, since the embodiment of the method for dynamically adjusting the control is substantially similar to the display device embodiment, its description is relatively brief; for relevant details, refer to the description in the display device embodiment.
The above-described embodiments of the present invention should not be construed as limiting the scope of the present invention.

Claims (7)

1. A display device, comprising:
a camera configured to acquire environmental image data characterizing a position parameter of a user relative to a display;
a display configured to present a display interface in which a target control is displayed;
a controller in communication with the display and the camera, respectively, the controller configured to:
in the process that a user moves from an initial position to an end position, acquiring initial position parameters of the target control, and receiving environment image data corresponding to the initial position and environment image data corresponding to the end position, which are acquired by the camera;
acquiring a user initial position parameter carried in the environment image data corresponding to the initial position and a user end position parameter carried in the environment image data corresponding to the end position; the user initial position parameters comprise a transverse initial position parameter and a longitudinal initial position parameter, and the user end position parameters comprise a transverse end position parameter and a longitudinal end position parameter; the initial user position parameter is determined based on the vertical distance between the initial position of the user and the display and the position of the central point of the face frame of the user on the display interface when the central point vertically falls on the display; the user ending position parameter is determined based on the vertical distance between the user and the display at the ending position and the position of the center point of the face frame of the user on the display interface when the center point vertically falls on the display;
calculating a transverse first distance between the transverse initial position parameter and the initial position parameter of the target control and a transverse second distance between the transverse end position parameter and the initial position parameter of the target control based on the transverse initial position parameter, the transverse end position parameter and the initial position parameter of the target control;
respectively calculating a longitudinal first distance between the longitudinal initial position parameter and the initial position parameter of the target control and a longitudinal second distance between the longitudinal end position parameter and the initial position parameter of the target control based on the longitudinal initial position parameter, the longitudinal end position parameter and the initial position parameter of the target control;
according to the formula S2' = BN·S1/AX, calculating a theoretical second distance of the user when moving to the end position, the theoretical second distance comprising a transverse theoretical second distance and a longitudinal theoretical second distance; in the formula, S2' is the theoretical second distance, S1 is the first distance, AX is the initial relative distance, and BN is the ending relative distance; the first distance comprises the transverse first distance and the longitudinal first distance; the initial relative distance is the vertical distance between the user and the display when the user is at an initial position A, and the ending relative distance is the vertical distance between the user and the display when the user moves to an ending position B;
calculating the distance difference between the transverse theoretical second distance and the transverse second distance to obtain the transverse offset of the target control;
calculating the distance difference between the longitudinal theoretical second distance and the longitudinal second distance to obtain the longitudinal offset of the target control; the offset is used for representing a moving parameter of the target control when the position is adjusted;
and obtaining a termination position parameter of the target control based on the initial position parameter of the target control and the transverse offset and/or the longitudinal offset of the target control, and moving the target control to a position corresponding to the termination position parameter.
2. The display device of claim 1, wherein the controller, in performing the calculating of the transverse first distance and the longitudinal first distance between the initial position parameter corresponding to the user and the initial position parameter of the target control, is further configured to:
acquiring the pixel point number of the center point of the face frame when the user is at the initial position and the pixel point number of the control center point of the target control;
calculating the pixel point number difference between the pixel point number of the center point of the face frame and the pixel point number of the center point of the control;
and calculating a horizontal first distance and/or a longitudinal first distance between the initial position parameter corresponding to the user and the initial position parameter of the target control according to the pixel point number difference and the length value of each pixel point.
3. The display device of claim 1, wherein the controller, in performing the calculating of the transverse second distance and the longitudinal second distance between the ending position parameter corresponding to the user and the initial position parameter of the target control, is further configured to:
acquiring the pixel point number of the center point of the face frame when the user is positioned at the end position and the pixel point number of the control center point of the target control;
calculating the pixel point number difference between the pixel point number of the center point of the face frame and the pixel point number of the center point of the control;
and calculating a transverse second distance and/or a longitudinal second distance between the ending position parameter corresponding to the user and the initial position parameter of the target control according to the pixel point number difference and the length value of each pixel point.
4. The display device according to claim 1, wherein the controller, in executing the initial position parameter of the acquisition target control, is further configured to:
establishing a reference coordinate system by taking the upper left corner of the display interface as a coordinate origin, taking the direction of the display interface from the left side to the right side as an X-axis forward direction, and taking the direction of the display interface from the upper side to the lower side as a Y-axis forward direction;
acquiring the number of pixels of the origin of coordinates and the number of horizontal pixels and the number of vertical pixels of the control center point of the target control;
calculating a horizontal pixel point number difference between the pixel point number of the coordinate origin and the horizontal pixel point number of the control center point, and a longitudinal pixel point number difference between the pixel point number of the coordinate origin and the longitudinal pixel point number of the control center point;
and calculating a transverse initial distance and a longitudinal initial distance between a control center point and a coordinate origin of the target control according to the transverse pixel point number difference, the longitudinal pixel point number difference and the length value of each pixel point, and taking the transverse initial distance, the longitudinal initial distance, the transverse pixel point number and the longitudinal pixel point number of the control center point as initial position parameters of the target control.
5. The display device according to claim 1, wherein the controller, in executing the obtaining of the ending position parameter of the target control based on the initial position parameter of the target control and the lateral offset and the longitudinal offset of the target control, is further configured to:
the initial position parameters of the target control comprise a transverse initial position parameter and a longitudinal initial position parameter;
calculating a transverse termination position parameter of the target control according to the transverse initial position parameter and the transverse offset of the target control;
and calculating the longitudinal termination position parameter of the target control according to the longitudinal initial position parameter and the longitudinal offset of the target control.
6. The display device of claim 1, wherein the controller is further configured to:
receiving environmental image data collected by the camera;
identifying the number of human faces in the environment image data;
and when the number of the human faces in the environment image data is 1, executing the step of acquiring the initial position parameter and the end position parameter of the user.
7. A method for dynamically adjusting a control, comprising the steps of:
in the process that a user moves from an initial position to an end position, acquiring initial position parameters of a target control, and receiving environment image data corresponding to the initial position and environment image data corresponding to the end position, which are acquired by a camera;
acquiring a user initial position parameter carried in the environment image data corresponding to the initial position and a user end position parameter carried in the environment image data corresponding to the end position; the user initial position parameters comprise a transverse initial position parameter and a longitudinal initial position parameter, and the user end position parameters comprise a transverse end position parameter and a longitudinal end position parameter; the initial user position parameter is determined based on the vertical distance between the initial position of the user and the display and the position of the central point of the face frame of the user on the display interface when the central point vertically falls on the display; the user ending position parameter is determined based on the vertical distance between the user and the display at the ending position and the position of the center point of the face frame of the user on the display interface when the center point vertically falls on the display;
calculating a transverse first distance between the transverse initial position parameter and the initial position parameter of the target control and a transverse second distance between the transverse end position parameter and the initial position parameter of the target control based on the transverse initial position parameter, the transverse end position parameter and the initial position parameter of the target control;
respectively calculating a longitudinal first distance between the longitudinal initial position parameter and the initial position parameter of the target control and a longitudinal second distance between the longitudinal end position parameter and the initial position parameter of the target control based on the longitudinal initial position parameter, the longitudinal end position parameter and the initial position parameter of the target control;
according to the formula S2' = BN·S1/AX, calculating a theoretical second distance of the user when moving to the end position, wherein the theoretical second distance comprises a transverse theoretical second distance and a longitudinal theoretical second distance; in the formula, S2' is the theoretical second distance, S1 is the first distance, AX is the initial relative distance, and BN is the ending relative distance; the first distance comprises the transverse first distance and the longitudinal first distance; the initial relative distance is the vertical distance between the user and the display when the user is at an initial position A, and the ending relative distance is the vertical distance between the user and the display when the user moves to an ending position B;
calculating the distance difference between the transverse theoretical second distance and the transverse second distance to obtain the transverse offset of the target control;
calculating the distance difference between the longitudinal theoretical second distance and the longitudinal second distance to obtain the longitudinal offset of the target control;
the offset is used for representing a moving parameter of the target control when the position is adjusted;
and obtaining a termination position parameter of the target control based on the initial position parameter of the target control and the transverse offset and/or the longitudinal offset of the target control, and moving the target control to a position corresponding to the termination position parameter.
CN202010342885.9A 2020-04-27 2020-04-27 Method for dynamically adjusting control and display device Active CN113645502B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202010342885.9A CN113645502B (en) 2020-04-27 2020-04-27 Method for dynamically adjusting control and display device
PCT/CN2021/081562 WO2021218473A1 (en) 2020-04-27 2021-03-18 Display method and display device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010342885.9A CN113645502B (en) 2020-04-27 2020-04-27 Method for dynamically adjusting control and display device

Publications (2)

Publication Number Publication Date
CN113645502A CN113645502A (en) 2021-11-12
CN113645502B true CN113645502B (en) 2022-09-16

Family

ID=78414950

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010342885.9A Active CN113645502B (en) 2020-04-27 2020-04-27 Method for dynamically adjusting control and display device

Country Status (1)

Country Link
CN (1) CN113645502B (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102163129A (en) * 2011-04-26 2011-08-24 宇龙计算机通信科技(深圳)有限公司 Method and device for implementing icon deformation
CN107562304A (en) * 2017-07-31 2018-01-09 努比亚技术有限公司 A kind of control method, mobile terminal and computer-readable recording medium
CN107930119A (en) * 2017-11-21 2018-04-20 网易(杭州)网络有限公司 Information processing method, device, electronic equipment and storage medium
CN108553895A (en) * 2018-04-24 2018-09-21 网易(杭州)网络有限公司 User interface element and the associated method and apparatus of three-dimensional space model

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090064226A1 (en) * 2007-08-29 2009-03-05 Himax Technologies Limited User interface of interactive program guide and method thereof
CN102841725A (en) * 2011-06-21 2012-12-26 鸿富锦精密工业(深圳)有限公司 Electronic device and screen information adjusting method thereof
CN103974008A (en) * 2013-01-30 2014-08-06 联想(北京)有限公司 Information processing method and electronic equipment
CN104464579A (en) * 2013-09-12 2015-03-25 中兴通讯股份有限公司 Data display method, data display device, data display terminal, display control method and display control device
CN103885593B (en) * 2014-03-14 2016-04-06 努比亚技术有限公司 A kind of handheld terminal and screen anti-fluttering method thereof and device
WO2017005070A1 (en) * 2015-07-09 2017-01-12 重庆邮电大学 Display control method and device
CN108427595B (en) * 2018-03-13 2021-07-27 网易(杭州)网络有限公司 Method and device for determining display position of user interface control in virtual reality
CN109871125A (en) * 2019-01-31 2019-06-11 维沃移动通信有限公司 A kind of display control method and terminal device
CN109885368A (en) * 2019-01-31 2019-06-14 维沃移动通信有限公司 A kind of interface display anti-fluttering method and mobile terminal


Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Building a dynamically generated virtual museum using a game engine; Panayiotis Kyriakou et al.; 2013 Digital Heritage International Congress (DigitalHeritage); 20140220; full text *
Museum exhibition design based on interactive experience in an information technology environment; Dong Liying; China Master's Theses Full-text Database; 20151015; full text *

Also Published As

Publication number Publication date
CN113645502A (en) 2021-11-12

Similar Documents

Publication Publication Date Title
CN113330736A (en) Display and image processing method
CN113259741B (en) Demonstration method and display device for classical viewpoint of episode
CN111752518A (en) Screen projection method of display equipment and display equipment
CN112055240B (en) Display device and operation prompt display method for pairing display device with remote controller
CN112073788B (en) Video data processing method and device and display equipment
CN112214189A (en) Image display method and display device
CN112866773B (en) Display equipment and camera tracking method in multi-person scene
CN112073795B (en) Video data processing method and device and display equipment
CN111385631B (en) Display device, communication method and storage medium
CN114430492A (en) Display device, mobile terminal and picture synchronous zooming method
CN112473121A (en) Display device and method for displaying dodging ball based on limb recognition
CN111669662A (en) Display device, video call method and server
CN111078926A (en) Method for determining portrait thumbnail image and display equipment
CN112783380A (en) Display apparatus and method
CN112073663A (en) Audio gain adjusting method, video chatting method and display equipment
CN112399235B (en) Camera shooting effect enhancement method and display device of intelligent television
CN113645502B (en) Method for dynamically adjusting control and display device
CN112437284A (en) Projection picture correction method, terminal equipment and display equipment
CN112218156A (en) Method for adjusting video dynamic contrast and display equipment
CN112073776A (en) Voice control method and display device
CN112073759A (en) Method and device for selecting and scheduling communication modes between two systems and display equipment
CN113825001B (en) Panoramic picture browsing method and display device
CN112073777B (en) Voice interaction method and display device
WO2021218473A1 (en) Display method and display device
CN113453056B (en) Display method and display device for photo album control

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant