CN112486380A - Display interface processing method, device, medium and electronic equipment - Google Patents

Display interface processing method, device, medium and electronic equipment

Info

Publication number
CN112486380A
Authority
CN
China
Prior art keywords
information
image
display
viewer
relative
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202011273105.6A
Other languages
Chinese (zh)
Other versions
CN112486380B (en)
Inventor
王珂晟
黄劲
黄钢
许巧龄
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Anbo Shengying Education Technology Co ltd
Original Assignee
Beijing Anbo Shengying Education Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Anbo Shengying Education Technology Co ltd filed Critical Beijing Anbo Shengying Education Technology Co ltd
Priority to CN202011273105.6A priority Critical patent/CN112486380B/en
Publication of CN112486380A publication Critical patent/CN112486380A/en
Application granted granted Critical
Publication of CN112486380B publication Critical patent/CN112486380B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0484: Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/14: Digital output to display device; Cooperation and interconnection of the display device with other functional units
    • G06F 3/147: Digital output to display device; Cooperation and interconnection of the display device with other functional units using display panels

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The disclosure provides a display interface processing method, a display interface processing device, a display interface processing medium and electronic equipment. The relative display direction of the current terminal display device is determined from a first image of the viewer and the positional relationship information of the viewer's eyes in the first image, and the display positions of the functional display areas are then adjusted according to that relative display direction, so that the functional display areas automatically adjust their display positions as the relative display direction of the terminal display device changes. This solves the problem of viewing vertical-screen layouts on a horizontal screen or horizontal-screen layouts on a vertical screen, and avoids the viewing inconvenience caused by scrolling or shrinking the displayed content.

Description

Display interface processing method, device, medium and electronic equipment
Technical Field
The present disclosure relates to the field of computer technologies, and in particular, to a method, an apparatus, a medium, and an electronic device for processing a display interface.
Background
The traditional blackboard is a hard, two-dimensional writing surface that can be written on and erased repeatedly. It is typically used for teaching, conference discussion, or personal and household note-taking. The teacher and the students are in the same room and use the blackboard for face-to-face teaching.
With the development of computer technology, internet-based remote live teaching has begun to rise, and the panoramic smart blackboard, which combines multimedia technology, has emerged along with live teaching. The panoramic smart blackboard integrates, through functional display areas, the image of the teacher giving the course in the live classroom, the images of the students attending it, the teaching content, and the teacher's temporary blackboard writing, and presents them at both the teacher end and the student end. The people and the teaching content of the live classroom are thus closely combined, so that participants can overcome the sense of distance, the sense of presence is enhanced, and interest in teaching is improved.
In the existing panoramic smart blackboard, the position of each functional display area is fixed and the overall layout is designed for landscape viewing: for example, the image of the teacher giving the course is shown in a functional display area in the upper part of the left third of the screen, the images of the students attending the course are shown in a functional display area in the lower part, and the teaching content is shown in a functional display area in the upper part of the right two-thirds of the screen. However, this fixed layout is very inconvenient for users accustomed to portrait viewing: to see the content hidden in off-screen functional display areas they must either scroll the landscape layout left and right manually, or shrink the landscape layout to fit the portrait screen, in which case the handwriting becomes too small for the teaching content to be read normally.
Disclosure of Invention
This summary is provided to introduce a selection of concepts in a simplified form that are further described below in the detailed description. This summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used to limit the scope of the claimed subject matter.
The present disclosure is directed to a method, an apparatus, a medium, and an electronic device for processing a display interface, which can solve at least one of the technical problems mentioned above. The specific solutions are as follows:
according to a specific implementation manner of the present disclosure, in a first aspect, the present disclosure provides a processing method of a display interface, including:
acquiring a first image of a viewer in front of a terminal display device and position relation information of eyes of the viewer in the first image;
determining a relative display direction of the terminal display device based on the first image and the position relation information, wherein the relative display direction refers to a display direction of the terminal display device relative to the eye position of the viewer;
and adjusting the display position of a function display area in the terminal display device according to the relative display direction.
According to a second aspect, the present disclosure provides a processing apparatus for displaying an interface, including:
an acquisition unit, configured to acquire a first image of a viewer in front of a terminal display device and positional relationship information of the eyes of the viewer in the first image;
a determination unit configured to determine a relative display direction of the terminal display device based on the first image and the positional relationship information, the relative display direction being a display direction of the terminal display device with respect to the viewer eye position;
and the adjusting unit is used for adjusting the display position of the function display area in the terminal display device according to the relative display direction.
According to a third aspect, there is provided a computer readable storage medium having stored thereon a computer program which, when executed by a processor, implements a method of processing a display interface according to any one of the first aspect.
According to a fourth aspect, the present disclosure provides an electronic device, comprising: one or more processors; and a storage device for storing one or more programs which, when executed by the one or more processors, cause the one or more processors to implement the processing method of a display interface according to any implementation of the first aspect.
Compared with the prior art, the scheme of the embodiment of the disclosure at least has the following beneficial effects:
the disclosure provides a display interface processing method, a display interface processing device, a display interface processing medium and electronic equipment. The relative display direction of the current terminal display device is determined through the first image of the viewer and the position relation information of the eyes of the viewer in the first image, and then the display position of the functional display area is adjusted through the relative display direction, so that the functional display area can automatically adjust the display position according to the relative display direction change of the terminal display device. The problem of the display content of vertical screen layout is watched to the horizontal screen or the display content of horizontal screen layout is watched to the vertical screen is solved, and the inconvenience of watching caused by scrolling or reducing the display content is avoided.
Drawings
The above and other features, advantages and aspects of various embodiments of the present disclosure will become more apparent from the following detailed description taken in conjunction with the accompanying drawings. Throughout the drawings, the same or similar reference numbers refer to the same or similar elements. It should be understood that the drawings are schematic and that components and elements are not necessarily drawn to scale. In the drawings:
FIG. 1 shows a landscape-screen display schematic of a method of processing a display interface according to an embodiment of the present disclosure;
FIG. 2 illustrates a portrait screen display schematic of a method of processing a display interface according to an embodiment of the present disclosure;
FIG. 3 shows a flow diagram of a method of processing a display interface according to an embodiment of the present disclosure;
FIG. 4 illustrates a block diagram of the units of a processing apparatus for a display interface according to an embodiment of the present disclosure;
fig. 5 shows an electronic device connection structure schematic according to an embodiment of the present disclosure.
Detailed Description
Embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. While certain embodiments of the present disclosure are shown in the drawings, it is to be understood that the present disclosure may be embodied in various forms and should not be construed as limited to the embodiments set forth herein, but rather are provided for a more thorough and complete understanding of the present disclosure. It should be understood that the drawings and embodiments of the disclosure are for illustration purposes only and are not intended to limit the scope of the disclosure.
It should be understood that the various steps recited in the method embodiments of the present disclosure may be performed in a different order, and/or performed in parallel. Moreover, method embodiments may include additional steps and/or omit performing the illustrated steps. The scope of the present disclosure is not limited in this respect.
The term "include" and variations thereof as used herein are open-ended, i.e., "including but not limited to". The term "based on" is "based, at least in part, on". The term "one embodiment" means "at least one embodiment"; the term "another embodiment" means "at least one additional embodiment"; the term "some embodiments" means "at least some embodiments". Relevant definitions for other terms will be given in the following description.
It should be noted that the terms "first", "second", and the like in the present disclosure are only used for distinguishing different devices, modules or units, and are not used for limiting the order or interdependence relationship of the functions performed by the devices, modules or units.
It is noted that references to "a" and "an" in this disclosure are intended to be illustrative rather than limiting, and those skilled in the art will understand that they mean "one or more" unless the context clearly indicates otherwise.
The names of messages or information exchanged between devices in the embodiments of the present disclosure are for illustrative purposes only, and are not intended to limit the scope of the messages or information.
Alternative embodiments of the present disclosure are described in detail below with reference to the accompanying drawings.
The first embodiment provided by the present disclosure is an embodiment of a processing method of a display interface.
In the existing panoramic smart blackboard, the position of each functional display area is fixed and the overall layout is designed for landscape viewing. For example, as shown in FIG. 1, the image of the teacher giving the course is shown in a functional display area in the upper part of the left third of the screen, the images of the students attending the course are shown in a functional display area in the lower part, and the PPT teaching content is shown in a functional display area in the upper part of the right two-thirds of the screen; the whole panoramic smart blackboard can not only display images but can also be written on directly, like a blackboard. However, this fixed layout is very inconvenient for users accustomed to portrait viewing. For example, a student watching the live teaching on a mobile phone in portrait mode can only see the contents of the different functional display areas by manually scrolling the landscape-mode display window left and right, or the landscape-mode display window is shrunk so that it fits entirely within the phone's portrait screen, but in this reduced display the handwriting and images become too small for the teaching content to be viewed normally.
The embodiments of the present disclosure automatically adjust the display positions of the functional display areas by recognizing changes in the display direction of the screen relative to the viewer, so that all functional display areas can always be displayed clearly across the full screen, as shown in fig. 2.
The embodiments of the present disclosure are described in detail below with reference to fig. 3.
Step S301, acquiring a first image of a viewer in front of a terminal display device and position relation information of the eyes of the viewer in the first image.
The relative display direction of the terminal display device is determined through the position relation information of the eyes of the viewer in front of the terminal display device. The relative display direction refers to the display direction of the terminal display device relative to the eye position of the viewer.
A terminal is a computer or computer system through which a user inputs data and views the computed results, and includes mobile phones, tablet computers, notebook computers, desktop computers, servers, and panoramic smart blackboards.
The first image is a two-dimensional image. Correspondingly, the positional relationship information of the viewer's eyes is also two-dimensional relationship information, that is, relationship information of the viewer's eyes within the two-dimensional image. For example, the positional relationship information is either up-down relationship information or left-right relationship information: when the included angle between the line connecting the viewer's two eyes and the bottom edge line of the first image is greater than or equal to a preset included angle, the positional relationship information of the viewer's eyes is up-down relationship information; when that included angle is smaller than the preset included angle, the positional relationship information is left-right relationship information. As technology advances, cameras have become cheaper and a camera is now a standard part of a terminal, such as the camera of a mobile phone or a notebook computer, and the two-dimensional relationship information can be determined from the information collected by the camera. Accordingly, acquiring a first image of a viewer in front of a terminal display device and positional relationship information of the eyes of the viewer in the first image comprises the following steps:
step S301-1, facial information of the viewer is extracted based on the first image.
Step S301-2, position information of the eyes of the viewer is determined for the face information.
When there is only one viewer in front of the terminal display device, determining the position information of the viewer's eyes is straightforward. However, if there are multiple viewers in front of the terminal display device, the eye position information of the multiple viewers easily drags the process of determining the relative display direction into complicated calculations. In order to improve the efficiency of determining the relative display direction of the terminal display device, determining the position information of the viewer's eyes from the face information comprises the following steps:
step S301-2-1, main face information at a core position is acquired from the face information.
The main face information is the face information of the viewer at the core position in the image captured by the camera, for example the face information of the viewer near the middle of the front row in the captured image, or the face information of the viewer at the middle position of the captured image.
Further, the acquiring main face information at a core position from the face information includes:
step S301-2-1-1, facial information having facial features is extracted from the facial information.
Step S301-2-1-2, determining the main facial information based on facial information having facial features.
In the face information extracted from the captured image, face information with complete facial features should contain the eyebrows, eyes, ears, nose and mouth of the face. If a piece of face information lacks these features, that is, the viewer is partially outside the shooting range or is partially blocked within it, the face information of that viewer is determined not to have complete facial features.
Step S301-2-2, the position information is determined for the main face information.
Through the above steps, face information that clearly cannot serve as the main face information is eliminated, which improves the efficiency of determining the main face information.
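As a minimal illustrative sketch of this selection step (not part of the original disclosure), the main face information can be obtained by discarding faces whose facial features are incomplete and keeping the face closest to the core (central) position of the captured image; the Face structure and the set of required features below are assumptions made for demonstration.

```python
# Minimal illustrative sketch (not from the disclosure): choosing the "main face information".
# The Face structure and the set of required features are assumptions for illustration.
from dataclasses import dataclass
from typing import Dict, List, Optional, Tuple

REQUIRED_FEATURES = {"eyebrows", "left_eye", "right_eye", "ears", "nose", "mouth"}

@dataclass
class Face:
    landmarks: Dict[str, Tuple[float, float]]  # detected feature name -> pixel position
    center: Tuple[float, float]                # center of the face bounding box

def select_main_face(faces: List[Face], image_size: Tuple[int, int]) -> Optional[Face]:
    """Drop faces without complete facial features, then pick the face closest to the image center."""
    cx, cy = image_size[0] / 2.0, image_size[1] / 2.0
    complete = [f for f in faces if REQUIRED_FEATURES.issubset(f.landmarks.keys())]
    if not complete:
        return None
    return min(complete, key=lambda f: (f.center[0] - cx) ** 2 + (f.center[1] - cy) ** 2)
```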
Step S301-3, determining the position relation information according to the first image and the position information.
The positional relationship information of the eyes of the viewer is two-dimensional relationship information. The two-dimensional relationship information includes upper and lower relationship information or left and right relationship information of the eyes of the viewer in the two-dimensional image.
For the two-dimensional relationship information, optionally, the determining the position relationship information according to the first image and the position information includes the following steps:
and S301-3-1, establishing a straight line based on the position information.
And S301-3-2, when the included angle between the straight line and the bottom edge line in the first image is smaller than a preset included angle, determining that the position relation information is left-right relation information.
For example, the predetermined included angle is 45 degrees.
And S301-3-3, when the included angle between the straight line and the bottom edge line in the first image is greater than or equal to a preset included angle, determining that the position relation information is up-down relation information.
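The following is a minimal sketch of steps S301-3-1 to S301-3-3, assuming the two eye positions are already available as pixel coordinates in the first image; the 45-degree default follows the example above.

```python
import math

def eye_positional_relationship(left_eye, right_eye, preset_angle_deg: float = 45.0) -> str:
    """Classify the eyes' positional relationship against the bottom edge of the first image.

    A straight line is established through both eye positions; if its included angle with
    the bottom edge line is smaller than the preset angle the relationship is "left-right",
    otherwise it is "up-down".
    """
    dx = right_eye[0] - left_eye[0]
    dy = right_eye[1] - left_eye[1]
    angle = abs(math.degrees(math.atan2(dy, dx)))
    if angle > 90.0:  # fold into [0, 90] so the ordering of the two eyes does not matter
        angle = 180.0 - angle
    return "left-right" if angle < preset_angle_deg else "up-down"
```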
Step S302, determining the relative display direction of the terminal display device based on the first image and the position relation information.
The relative display direction is not the physical display direction of the terminal display device; it refers to the display direction of the terminal display device relative to the position of the viewer's eyes. For example, in the first image the relative display directions include a relative horizontal screen direction and a relative vertical screen direction: when the included angle between the line connecting the viewer's eyes and the long side line of the first image is smaller than a preset included angle, and the bottom edge line of the first image is the long side line, the relative display direction of the terminal display device is the relative horizontal screen direction; when the included angle between the line connecting the viewer's eyes and the short side line of the first image is smaller than the preset included angle, and the bottom edge line of the first image is the short side line, the relative display direction of the terminal display device is the relative vertical screen direction. Optionally, determining the relative display direction of the terminal display device based on the first image and the positional relationship information includes:
step S302-1, when the aspect ratio of the first image is greater than 1 and the positional relationship information is left-right relationship information, or the aspect ratio of the first image is less than 1 and the positional relationship information is up-down relationship information, determining that the relative display direction is a relative landscape direction.
When the aspect ratio of the first image is greater than 1, the bottom edge of the first image is a long edge; if the positional relationship information is left-right relationship information, the terminal display device as seen by the viewer's eyes is in the horizontal screen direction. When the aspect ratio of the first image is less than 1, the bottom edge of the first image is a short edge; if the positional relationship information is up-down relationship information, the terminal display device as seen by the viewer's eyes is likewise in the horizontal screen direction.
Step S302-2, when the aspect ratio of the first image is greater than 1 and the position relationship information is the up-down relationship information, or the aspect ratio of the first image is less than 1 and the position relationship information is the left-right relationship information, determining that the relative display direction is the relative vertical screen direction.
When the aspect ratio of the first image is greater than 1, the bottom edge of the first image is a long edge; if the positional relationship information is up-down relationship information, the terminal display device as seen by the viewer's eyes is in the vertical screen direction.
When the aspect ratio of the first image is less than 1, the bottom edge of the first image is a short edge; if the positional relationship information is left-right relationship information, the terminal display device as seen by the viewer's eyes is likewise in the vertical screen direction.
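The combination of aspect ratio and eye relationship described in steps S302-1 and S302-2 can be summarized in a small helper function; this is a sketch under the same assumptions as above, and the string labels are arbitrary.

```python
def relative_display_direction(image_width: int, image_height: int, relation: str) -> str:
    """Step S302 as a lookup: aspect ratio of the first image + eye relationship -> direction.

    aspect ratio > 1 (bottom edge is the long edge) with "left-right" eyes, or
    aspect ratio < 1 (bottom edge is the short edge) with "up-down" eyes,
    gives the relative horizontal screen (landscape) direction; the two remaining
    combinations give the relative vertical screen (portrait) direction.
    """
    aspect_ratio = image_width / image_height
    if (aspect_ratio > 1 and relation == "left-right") or (aspect_ratio < 1 and relation == "up-down"):
        return "landscape"
    return "portrait"
```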
Step S303, adjusting the display position of the function display area in the terminal display device according to the relative display direction.
That is, the functional display areas in the terminal display device are moved to preset display positions corresponding to the relative display direction.
Optionally, the adjusting the display position of the function display area in the terminal display device according to the relative display direction includes the following steps:
Step S303-1, setting all functional display areas within the visible area of the terminal display device according to the relative display direction.
As shown in fig. 2, when the preset display positions all fall within the physical extent of the screen, all functional display areas can be displayed in the visible area of the terminal display device, thereby avoiding the viewing inconvenience caused by scrolling or shrinking the displayed content.
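As an illustrative sketch of step S303-1, each relative display direction can be mapped to a preset layout in which every functional display area lies inside the visible area; the concrete region rectangles below are assumptions for demonstration, not values given in the disclosure.

```python
# Illustrative preset layouts; each region is (x, y, width, height) as fractions of the screen.
# The concrete values are assumptions, not taken from the disclosure.
PRESET_LAYOUTS = {
    "landscape": {                      # layout in the style of FIG. 1
        "teacher_video":    (0.0, 0.0, 1 / 3, 0.5),
        "student_video":    (0.0, 0.5, 1 / 3, 0.5),
        "teaching_content": (1 / 3, 0.0, 2 / 3, 1.0),
    },
    "portrait": {                       # all areas stacked so nothing is pushed off screen
        "teaching_content": (0.0, 0.0, 1.0, 0.6),
        "teacher_video":    (0.0, 0.6, 1.0, 0.2),
        "student_video":    (0.0, 0.8, 1.0, 0.2),
    },
}

def adjust_display_positions(direction: str) -> dict:
    """Return the preset positions of all functional display areas for the given relative direction."""
    return PRESET_LAYOUTS[direction]
```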
In this embodiment, the relative display direction of the current terminal display device is determined from the first image of the viewer and the positional relationship information of the viewer's eyes in the first image, and the display positions of the functional display areas are then adjusted according to that relative display direction, so that the functional display areas automatically adjust their display positions as the relative display direction of the terminal display device changes. This solves the problem of viewing vertical-screen layouts on a horizontal screen or horizontal-screen layouts on a vertical screen, and avoids the viewing inconvenience caused by scrolling or shrinking the displayed content.
Corresponding to the first embodiment provided by the present disclosure, the present disclosure also provides a second embodiment, that is, a processing apparatus for displaying an interface. Since the second embodiment is basically similar to the first embodiment, the description is simple, and the relevant portions should be referred to the corresponding description of the first embodiment. The device embodiments described below are merely illustrative.
Fig. 4 shows an embodiment of a processing apparatus for displaying an interface provided by the present disclosure.
As shown in fig. 4, the present disclosure provides a processing apparatus for displaying an interface, including:
an obtaining unit 401, configured to obtain a first image of a viewer in front of a terminal display device and position relationship information of eyes of the viewer in the first image;
a determining unit 402, configured to determine a relative display direction of the terminal display device based on the first image and the positional relationship information, where the relative display direction is a display direction of the terminal display device relative to the eye position of the viewer;
an adjusting unit 403, configured to adjust a display position of a function display area in the terminal display device according to the relative display direction.
Optionally, the obtaining unit 401 includes:
a first face information extraction subunit, configured to extract face information of the viewer based on the first image;
a first position information determination subunit, configured to determine position information of the viewer's eyes from the face information;
a relationship determination subunit, configured to determine the positional relationship information according to the first image and the position information.
Optionally, the first position information determination subunit includes:
a main face information acquisition subunit, configured to acquire main face information at a core position from the face information;
a second position information determination subunit, configured to determine the position information from the main face information.
Optionally, the main face information acquisition subunit includes:
a second face information extraction subunit, configured to extract face information having facial features from the face information;
a main face information determination subunit, configured to determine the main face information based on the face information having facial features.
Optionally, the relationship determination subunit includes:
a straight line establishing subunit, configured to establish a straight line based on the position information;
a left-right relationship determination subunit, configured to determine that the positional relationship information is left-right relationship information when the included angle between the straight line and the bottom edge line of the first image is smaller than a preset included angle;
an up-down relationship determination subunit, configured to determine that the positional relationship information is up-down relationship information when the included angle between the straight line and the bottom edge line of the first image is greater than or equal to the preset included angle.
Optionally, the determining unit 402 includes:
when the aspect ratio of the first image is greater than 1 and the position relationship information is left-right relationship information, or the aspect ratio of the first image is less than 1 and the position relationship information is up-down relationship information, determining that the relative display direction is a relative horizontal screen direction;
when the aspect ratio of the first image is greater than 1 and the positional relationship information is up-down relationship information, or the aspect ratio of the first image is less than 1 and the positional relationship information is left-right relationship information, determining that the relative display direction is the relative vertical screen direction.
Optionally, the adjusting unit 403 includes:
a full-screen display subunit, configured to set all functional display areas within the visible area of the terminal display device according to the relative display direction.
With the apparatus of this embodiment, the relative display direction of the current terminal display device is determined from the first image of the viewer and the positional relationship information of the viewer's eyes in the first image, and the display positions of the functional display areas are then adjusted according to that relative display direction, so that the functional display areas automatically adjust their display positions as the relative display direction of the terminal display device changes. This solves the problem of viewing vertical-screen layouts on a horizontal screen or horizontal-screen layouts on a vertical screen, and avoids the viewing inconvenience caused by scrolling or shrinking the displayed content.
The present disclosure provides a third embodiment, namely an electronic device for the processing method of a display interface. The electronic device includes: at least one processor; and a memory communicatively coupled to the at least one processor; wherein
the memory stores instructions executable by the at least one processor to cause the at least one processor to perform the processing method of a display interface described in the first embodiment.
The fourth embodiment provides a computer storage medium, which stores computer-executable instructions that can execute the processing method of the display interface as described in the first embodiment.
Referring now to FIG. 5, shown is a schematic diagram of an electronic device suitable for use in implementing embodiments of the present disclosure. The terminal device in the embodiments of the present disclosure may include, but is not limited to, a mobile terminal such as a mobile phone, a notebook computer, a digital broadcast receiver, a PDA (personal digital assistant), a PAD (tablet computer), a PMP (portable multimedia player), a vehicle terminal (e.g., a car navigation terminal), and the like, and a stationary terminal such as a digital TV, a desktop computer, and the like. The electronic device shown in fig. 5 is only an example, and should not bring any limitation to the functions and the scope of use of the embodiments of the present disclosure.
As shown in fig. 5, the electronic device may include a processing means (e.g., central processing unit, graphics processor, etc.) 501 that may perform various appropriate actions and processes according to a program stored in a Read Only Memory (ROM) 502 or a program loaded from a storage means 508 into a Random Access Memory (RAM) 503. In the RAM 503, various programs and data necessary for the operation of the electronic apparatus are also stored. The processing device 501, the ROM 502, and the RAM 503 are connected to each other through a bus 504. An input/output (I/O) interface 505 is also connected to bus 504.
Generally, the following devices may be connected to the I/O interface 505: input devices 506 including, for example, a touch screen, touch pad, keyboard, mouse, camera, microphone, accelerometer, gyroscope, etc.; output devices 507 including, for example, a Liquid Crystal Display (LCD), speakers, vibrators, and the like; storage devices 508 including, for example, magnetic tape, hard disk, etc.; and a communication device 509. The communication means 509 may allow the electronic device to communicate with other devices wirelessly or by wire to exchange data. While fig. 5 illustrates an electronic device having various means, it is to be understood that not all illustrated means are required to be implemented or provided. More or fewer devices may alternatively be implemented or provided.
In particular, according to an embodiment of the present disclosure, the processes described above with reference to the flowcharts may be implemented as computer software programs. For example, embodiments of the present disclosure include a computer program product comprising a computer program carried on a non-transitory computer readable medium, the computer program containing program code for performing the method illustrated by the flow chart. In such an embodiment, the computer program may be downloaded and installed from a network via the communication means 509, or installed from the storage means 508, or installed from the ROM 502. The computer program performs the above-described functions defined in the methods of the embodiments of the present disclosure when executed by the processing device 501.
It should be noted that the computer readable medium in the present disclosure can be a computer readable signal medium or a computer readable storage medium or any combination of the two. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the present disclosure, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In contrast, in the present disclosure, a computer readable signal medium may comprise a propagated data signal with computer readable program code embodied therein, either in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: electrical wires, optical cables, RF (radio frequency), etc., or any suitable combination of the foregoing.
In some embodiments, the clients and servers may communicate using any currently known or future-developed network protocol, such as HTTP (HyperText Transfer Protocol), and may be interconnected with digital data communication in any form or medium (e.g., a communication network). Examples of communication networks include a local area network ("LAN"), a wide area network ("WAN"), internetworks (e.g., the Internet), and peer-to-peer networks (e.g., ad hoc peer-to-peer networks), as well as any currently known or future-developed network.
The computer readable medium may be embodied in the electronic device; or may exist separately without being assembled into the electronic device.
Computer program code for carrying out operations of the present disclosure may be written in any combination of one or more programming languages, including but not limited to object-oriented programming languages such as Java, Smalltalk and C++, and conventional procedural programming languages such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider).
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units described in the embodiments of the present disclosure may be implemented by software or hardware. The name of a unit does not, in some cases, constitute a limitation on the unit itself.
The functions described herein above may be performed, at least in part, by one or more hardware logic components. For example, without limitation, exemplary types of hardware logic components that may be used include: field Programmable Gate Arrays (FPGAs), Application Specific Integrated Circuits (ASICs), Application Specific Standard Products (ASSPs), systems on a chip (SOCs), Complex Programmable Logic Devices (CPLDs), and the like.
In the context of this disclosure, a machine-readable medium may be a tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. The machine-readable medium may be a machine-readable signal medium or a machine-readable storage medium. A machine-readable medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples of a machine-readable storage medium would include an electrical connection based on one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
The foregoing description is only a description of the preferred embodiments of the present disclosure and of the principles of the technology employed. It will be appreciated by those skilled in the art that the scope of the disclosure is not limited to technical solutions formed by the particular combination of features described above, but also covers other technical solutions formed by any combination of the above features or their equivalents without departing from the spirit of the disclosure, for example, technical solutions formed by replacing the features described above with (but not limited to) features with similar functions disclosed in this disclosure.
Further, while operations are depicted in a particular order, this should not be understood as requiring that such operations be performed in the particular order shown or in sequential order. Under certain circumstances, multitasking and parallel processing may be advantageous. Likewise, while several specific implementation details are included in the above discussion, these should not be construed as limitations on the scope of the disclosure. Certain features that are described in the context of separate embodiments can also be implemented in combination in a single embodiment. Conversely, various features that are described in the context of a single embodiment can also be implemented in multiple embodiments separately or in any suitable subcombination.
Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described above. Rather, the specific features and acts described above are disclosed as example forms of implementing the claims.

Claims (10)

1. A processing method for a display interface is characterized by comprising the following steps:
acquiring a first image of a viewer in front of a terminal display device and position relation information of eyes of the viewer in the first image;
determining a relative display direction of the terminal display device based on the first image and the position relation information, wherein the relative display direction refers to a display direction of the terminal display device relative to the eye position of the viewer;
and adjusting the display position of a function display area in the terminal display device according to the relative display direction.
2. The processing method according to claim 1, wherein the acquiring of the first image of the viewer in front of the terminal display device and the information on the position relationship of the eyes of the viewer in the first image comprises:
extracting facial information of a viewer based on the first image;
determining position information of the eyes of the viewer for the face information;
and determining the position relation information according to the first image and the position information.
3. The processing method according to claim 2, wherein the determining the position information of the eyes of the viewer for the face information comprises:
acquiring main face information at a core position from the face information;
the position information is determined for the main face information.
4. The processing method according to claim 3, wherein the acquiring the main face information at the core position from the face information comprises:
extracting face information having facial features from the face information;
determining the main facial information based on facial information having facial features.
5. The processing method according to claim 2, wherein the determining the positional relationship information from the first image and the positional information includes:
establishing a straight line based on the position information;
when the included angle between the straight line and the bottom edge line in the first image is smaller than a preset included angle, determining that the position relation information is left-right relation information;
and when the included angle between the straight line and the bottom edge line in the first image is larger than or equal to a preset included angle, determining that the position relation information is up-down relation information.
6. The processing method according to claim 5, wherein the determining a relative display direction of the terminal display device based on the first image and the positional relationship information comprises:
when the aspect ratio of the first image is greater than 1 and the position relationship information is left-right relationship information, or the aspect ratio of the first image is less than 1 and the position relationship information is up-down relationship information, determining that the relative display direction is a relative horizontal screen direction;
and when the aspect ratio of the first image is greater than 1 and the position relation information is the up-down relation information, or the aspect ratio of the first image is less than 1 and the position relation information is the left-right relation information, determining that the relative display direction is the relative vertical screen direction.
7. The processing method according to claim 1, wherein the adjusting the display position of the functional display area in the terminal display device according to the relative display direction comprises:
and setting all function display areas in a visible area of the terminal display device according to the relative display direction.
8. A processing apparatus for displaying an interface, comprising:
an acquisition unit, configured to acquire a first image of a viewer in front of a terminal display device and positional relationship information of the eyes of the viewer in the first image;
a determination unit configured to determine a relative display direction of the terminal display device based on the first image and the positional relationship information, the relative display direction being a display direction of the terminal display device with respect to the viewer eye position;
and the adjusting unit is used for adjusting the display position of the function display area in the terminal display device according to the relative display direction.
9. A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out the processing method according to any one of claims 1 to 7.
10. An electronic device, comprising:
one or more processors;
storage means for storing one or more programs which, when executed by the one or more processors, cause the one or more processors to carry out a processing method according to any one of claims 1 to 7.
CN202011273105.6A 2020-11-13 2020-11-13 Display interface processing method, device, medium and electronic equipment Active CN112486380B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011273105.6A CN112486380B (en) 2020-11-13 2020-11-13 Display interface processing method, device, medium and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011273105.6A CN112486380B (en) 2020-11-13 2020-11-13 Display interface processing method, device, medium and electronic equipment

Publications (2)

Publication Number Publication Date
CN112486380A true CN112486380A (en) 2021-03-12
CN112486380B CN112486380B (en) 2022-06-07

Family

ID=74930557

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011273105.6A Active CN112486380B (en) 2020-11-13 2020-11-13 Display interface processing method, device, medium and electronic equipment

Country Status (1)

Country Link
CN (1) CN112486380B (en)


Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103376893A (en) * 2012-04-25 2013-10-30 华为终端有限公司 Display picture presenting method and terminal
CN103718148A (en) * 2013-01-24 2014-04-09 华为终端有限公司 Screen display module determining method and terminal device
US20160109952A1 (en) * 2014-10-17 2016-04-21 Top Victory Investments Ltd. Method of Controlling Operating Interface of Display Device by User's Motion
CN108614634A (en) * 2016-12-09 2018-10-02 北京视联动力国际信息技术有限公司 A kind of mobile device shows the method and mobile device of screen rotation
CN106951078A (en) * 2017-03-16 2017-07-14 维沃移动通信有限公司 A kind of horizontal/vertical screen changing method and mobile terminal
CN110798570A (en) * 2019-10-18 2020-02-14 深圳传音控股股份有限公司 Message viewing method, intelligent terminal and computer readable storage medium

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117111813A (en) * 2023-10-19 2023-11-24 深圳市慧为智能科技股份有限公司 Display adaptation method and device, electronic equipment and storage medium
CN117111813B (en) * 2023-10-19 2024-02-20 深圳市慧为智能科技股份有限公司 Display adaptation method and device, electronic equipment and storage medium

Also Published As

Publication number Publication date
CN112486380B (en) 2022-06-07

Similar Documents

Publication Publication Date Title
EP4231650A1 (en) Picture display method and apparatus, and electronic device
US12019669B2 (en) Method, apparatus, device, readable storage medium and product for media content processing
US11812152B2 (en) Method and apparatus for controlling video frame image in live classroom
CN111427528A (en) Display method and device and electronic equipment
WO2024104248A1 (en) Rendering method and apparatus for virtual panorama, and device and storage medium
CN114095671A (en) Cloud conference live broadcast system, method, device, equipment and medium
CN112486380B (en) Display interface processing method, device, medium and electronic equipment
CN111862349A (en) Virtual brush implementation method and device and computer readable storage medium
CN115086686A (en) Video processing method and related device
CN111367485B (en) Method, device, medium and electronic equipment for controlling combined multimedia blackboard
CN114125358A (en) Cloud conference subtitle display method, system, device, electronic equipment and storage medium
CN112788426A (en) Display method, device, medium and electronic equipment of function display area
CN110769129B (en) Image processing method, image processing device, electronic equipment and computer readable storage medium
CN112383810A (en) Lyric video display method and device, electronic equipment and computer readable medium
CN114332224A (en) Method, device and equipment for generating 3D target detection sample and storage medium
CN113141464B (en) Camera control method, device, medium and electronic equipment
CN112382147B (en) Method, device, medium and electronic equipment for adjusting function display area
CN113760140B (en) Content display method, device, medium and electronic equipment
CN111401244B (en) Method, device, medium and electronic equipment for adaptively controlling multimedia blackboard
US20220319062A1 (en) Image processing method, apparatus, electronic device and computer readable storage medium
CN112306222A (en) Augmented reality method, device, equipment and storage medium
WO2022213798A1 (en) Image processing method and apparatus, and electronic device and storage medium
CN117234324A (en) Image acquisition method, device, equipment and medium of information input page
CN118283426A (en) Image processing method, device, terminal and storage medium
WO2022055419A2 (en) Character display method and apparatus, electronic device, and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant