CN112506345B - Page display method and device, electronic equipment and storage medium

Info

Publication number
CN112506345B
Authority
CN
China
Prior art keywords
area
page
target page
target
information
Prior art date
Legal status
Active
Application number
CN202011439816.6A
Other languages
Chinese (zh)
Other versions
CN112506345A (en)
Inventor
王聪
Current Assignee
Beijing Dajia Internet Information Technology Co Ltd
Original Assignee
Beijing Dajia Internet Information Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Beijing Dajia Internet Information Technology Co Ltd
Priority to CN202011439816.6A
Publication of CN112506345A
Application granted
Publication of CN112506345B

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011: Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/013: Eye tracking input arrangements
    • G06F3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481: Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/0487: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser

Abstract

The disclosure relates to a page display method and device, an electronic device and a storage medium in the field of human-computer interaction technology, which can improve the convenience of controlling page display and thereby improve the user's browsing experience. The scheme includes the following steps: displaying a target page on a display screen, the target page including at least one page element; determining the position of a gazing area in the target page, the gazing area being the area of the target page at which the user's eyes are gazing; when the page elements located in the gazing area include a basic page element, determining, from the page elements located in the gazing area, the page elements other than the basic page element as target page elements; and removing the target page elements from the gazing area. The information carried by the basic page element belongs to the subject information of the target page; the information carried by the target page element belongs to preset information, which is information different from the subject information of the target page; and the target page element covers the basic page element.

Description

Page display method and device, electronic equipment and storage medium
Technical Field
The disclosure relates to human-computer interaction technology, and in particular to a page display method, a page display device, an electronic device and a storage medium.
Background
Currently, when an electronic device displays a target page, the target page may include, in addition to some necessary subject information, some page elements (such as push messages of Applications (APPs) installed in the electronic device) that are not related to the subject information of the target page. Since these page elements typically obscure part of the subject information in the target page, the user needs to manually adjust the positions of these page elements in order to see the blocked subject information. This reduces the convenience with which the user can control page display and results in a poor browsing experience.
Disclosure of Invention
The embodiment of the disclosure provides a page display method, a device, electronic equipment and a storage medium, so as to at least improve the convenience of controlling page display and further improve the browsing experience of a user. The technical scheme of the present disclosure is as follows:
in a first aspect, an embodiment of the present disclosure provides a page display method, including: firstly, displaying a target page on a display screen; the target page includes at least one page element; then, determining the position of the gazing area in the target page; the gazing area is an area where eyes of a user gaze at a target page; when the page elements in the gazing area comprise basic page elements, determining that the page elements except the basic page elements are target page elements from the page elements in the gazing area; finally, the target page element is removed from the gaze area.
The information recorded by the basic page element belongs to the subject information of the target page; the information recorded by the target page element belongs to preset information; the preset information is information different from the subject information of the target page; the target page element overlays the base page element.
In one possible implementation manner, determining the position of the gazing area in the target page includes: acquiring position information of the gaze point of the user's eyes on the display screen, the distance between the display screen and the user's eyes, and the target visual angle of the user; determining the area of the gazing area according to the distance and the target visual angle; and determining the position of the gazing area in the target page according to the position information of the gaze point and the area of the gazing area. The target visual angle is the visual angle within which the user can correctly identify information. The target visual angle is equal to the apex angle of the cone formed by the user's eyes and the gazing area. The gaze point is the center point of the gazing area.
In another possible embodiment, determining the area of the gazing area according to the distance and the target viewing angle includes: inputting the distance and the target visual angle into a preset trigonometric function to obtain the diameter of the gazing area; and calculating the area of the gazing area according to the diameter of the gazing area.
In another possible implementation manner, removing the target page element from the gazing area includes: hiding the target page element; or, moving the target page element from the gazing area to another area of the target page outside the gazing area; or, deleting the target page element.
In another possible implementation manner, the preset information includes at least one of the following: recommendation information, prompt information, media resources.
In another possible implementation manner, acquiring the position information of the gaze point of the user's eyes on the display screen includes: acquiring a face image including the user's face; and determining the position information of the gaze point according to the face image.
In another possible implementation manner, determining the position information of the gaze point according to the face image includes: establishing a real-time coordinate system with facial feature points of the face image as reference objects, and determining the coordinates of the pupil of the face image in the real-time coordinate system; and determining, according to a pre-established mapping relation between pupil coordinates and gaze point coordinates, the gaze point coordinates corresponding to the coordinates of the pupil of the face image in the real-time coordinate system.
The mapping relation is obtained through training according to a sample face image and the sample gaze point coordinates corresponding to the sample face image; the sample gaze point coordinates represent the location of the sample user's gaze point on the display screen.
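For illustration only, the following Python sketch strings the four steps of the first aspect together into one routine. The element fields (is_basic, bounds), the rectangle representation and all helper names are assumptions introduced for this sketch, not part of the disclosed implementation.

```python
from dataclasses import dataclass
from typing import List, Tuple

Rect = Tuple[float, float, float, float]  # (left, top, width, height) in screen pixels

@dataclass
class PageElement:
    name: str
    bounds: Rect
    is_basic: bool  # True: carries subject information of the target page; False: carries preset information

def rects_overlap(a: Rect, b: Rect) -> bool:
    ax, ay, aw, ah = a
    bx, by, bw, bh = b
    return ax < bx + bw and bx < ax + aw and ay < by + bh and by < ay + ah

def refresh_page(elements: List[PageElement], gazing_area: Rect) -> List[PageElement]:
    """Return the elements that should remain visible for the current gazing area."""
    in_gaze = [e for e in elements if rects_overlap(e.bounds, gazing_area)]
    # Act only when a basic page element lies inside the gazing area.
    if not any(e.is_basic for e in in_gaze):
        return elements
    # Target page elements: the non-basic elements inside the gazing area.
    targets = {id(e) for e in in_gaze if not e.is_basic}
    # "Remove" is modelled here as hiding; moving or deleting are equally valid.
    return [e for e in elements if id(e) not in targets]
```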
In a second aspect, embodiments of the present disclosure further provide a page display device, including: a gazing area determining module, an element determining module and a display module. The display module is used for displaying a target page on the display screen; the gazing area determining module is used for determining the position of the gazing area in the target page; and the element determining module is used for determining, when the page elements located in the gazing area include the basic page element, that the page elements other than the basic page element are target page elements from the page elements located in the gazing area. The display module is further used for removing the target page element from the gazing area.
Wherein the target page comprises at least one page element. The gazing area is an area where the eyes of the user gaze at the target page. The information recorded by the basic page element belongs to the subject information of the target page; the information recorded by the target page element belongs to preset information; the preset information is information different from the subject information of the target page; the target page element overlays the base page element.
In a possible implementation manner, the gazing area determining module is specifically configured to: acquiring position information of a fixation point of eyes of a user on a display screen, a distance between the display screen and the eyes of the user and a target visual angle of the user; determining the area of the gazing area according to the distance and the target visual angle; and determining the position of the gazing area in the target page according to the position information of the gazing point and the area of the gazing area.
The target visual angle is the visual angle within which the user can correctly identify information; the target visual angle is equal to the apex angle of the cone formed by the user's eyes and the gazing area; the gaze point is the center point of the gazing area.
In another possible implementation manner, the gazing area determining module is specifically configured to: inputting the distance and the target visual angle into a preset trigonometric function to obtain the diameter of the gazing area; and calculating the area of the gazing area according to the diameter of the gazing area.
In another possible implementation manner, the display module is specifically configured to hide the target page element; or, moving the target page element from the gazing area to other areas except the gazing area in the target page; alternatively, the target page element is deleted.
In another possible implementation manner, the preset information includes at least one of the following: recommendation information, prompt information, media resources.
In another possible implementation manner, the gazing area determining module is specifically configured to obtain a face image including a face of the user; and determining the position information of the fixation point according to the face image.
In another possible implementation manner, the gazing area determining module is specifically configured to: establishing a real-time coordinate system by taking facial feature points of the face image as reference objects, and determining coordinates of pupils of the face image under the real-time coordinate system; and determining the coordinates of the gaze point corresponding to the coordinates of the pupil of the face image under the real-time coordinate system according to the pre-established mapping relation between the pupil coordinates and the gaze point coordinates.
The mapping relation is obtained through training according to a sample face image and the sample gaze point coordinates corresponding to the sample face image; the sample gaze point coordinates represent the location of the sample user's gaze point on the display screen.
In a third aspect, an embodiment of the present disclosure further provides an electronic device, including: a processor and a memory for storing processor-executable instructions. Wherein the processor is configured to execute instructions to cause the electronic device to perform the page display method as in the first aspect and any one of its possible implementation manners.
In a fourth aspect, embodiments of the present application further provide a computer readable storage medium having stored thereon computer instructions that, when executed on an electronic device, cause the electronic device to perform a page display method as in the first aspect and any one of its possible embodiments.
In a fifth aspect, embodiments of the present application also provide a computer program product comprising one or more instructions executable on an electronic device to cause the electronic device to perform a page display method as in the first aspect and any possible implementation thereof.
The technical scheme provided by the embodiment of the disclosure at least brings the following beneficial effects: when the target page is displayed, the position of a gazing area of the user on the target page is determined, wherein the gazing area is the area in the target page which the user is looking at. Then, if the target page has a base page element in the gaze area, then the target page element located in the gaze area (i.e. the area the user is viewing) is determined. Since the target page element is also located in the gazing area (i.e., the area the user is viewing), the target page element overlays the base page element in the gazing area and the base page element belongs to the subject information of the target page, i.e., the target page element overlays the subject information of the target page the user is viewing. Thus, the electronic device removes the target page element from the gaze area, i.e. displays the subject information of the target page that the user is viewing. Thus, without manual operation of a user, the shielding of the subject information of the target page being viewed by the user is automatically removed, and the subject information of the target page being viewed by the user is displayed. Therefore, the convenience of the user for controlling the page display is improved, and the browsing experience of the user is further improved.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the disclosure and together with the description, serve to explain the principles of the disclosure and do not constitute an undue limitation on the disclosure.
FIG. 1 is a schematic illustration of an interface for a page display provided by the conventional art;
fig. 2 is a schematic hardware structure of an electronic device according to an embodiment of the disclosure;
FIG. 3 is a flowchart showing a method for displaying a page according to an embodiment of the present disclosure;
fig. 4 is a schematic diagram of an electronic device displaying a target page according to an embodiment of the disclosure;
FIG. 5 is a schematic diagram of an electronic device hiding target page elements in a target page provided by an embodiment of the present disclosure;
FIG. 6 is a schematic diagram of an electronic device moving a target page element in a target page according to an embodiment of the present disclosure;
FIG. 7 is a schematic diagram of a geometric relationship between a user's viewing angle and a display screen provided by an embodiment of the present disclosure;
FIG. 8 is a second flowchart of a page display method according to an embodiment of the present disclosure;
Fig. 9 is a schematic structural diagram of a page display device according to an embodiment of the present disclosure.
Detailed Description
In order to enable those skilled in the art to better understand the technical solutions of the present disclosure, the technical solutions of the embodiments of the present disclosure will be clearly and completely described below with reference to the accompanying drawings.
It should be noted that the embodiments described in the following exemplary examples do not represent all embodiments consistent with the present disclosure. Rather, they are merely examples of apparatus and methods consistent with some aspects of the present disclosure as detailed in the accompanying claims.
Currently, more and more users acquire information through electronic devices, so to speak, the electronic devices have become the main information acquisition channels of the users. The electronic device may display any one of the pages (which may be referred to as a target page) to present the subject information of the target page to the user. Second, the target page may include some other information (e.g., advertisement, push message of some APP) that is not related to the subject information, in addition to the subject information.
The target page can be one page of any APP installed in the electronic device. Each piece of information (e.g., a piece of topic information, a piece of other information) in the target page may be referred to as a page element, or a page element may describe a piece of information (e.g., a piece of topic information, a piece of other information). Each page element may be text, video, picture, etc.
The preset information in the following embodiment refers to other information that is not related to the subject information of the target page, and the other information is different from the subject information of the target page.
A mobile phone is taken as an example of the electronic device. A plurality of APPs may be installed on the mobile phone 100. The mobile phone 100 receives a user's viewing operation on a target page of a certain APP; in response to the viewing operation, the target page 110 is displayed through the display screen. The target page 110 includes at least one page element, where the at least one page element includes a base page element and other page elements different from the base page element. The base page element refers to the subject information of the target page 110, and the other page elements refer to information different from the subject information of the target page 110.
As shown in fig. 1, the basic page elements of the target page 110 include Video1, Video2, Video3, and the like; the other page elements include a page element 111 and a page element 112 displayed in a floating frame manner. It can be seen that the page element 111 and the page element 112 cover a part of the basic page elements (i.e. a part of the subject information of the target page 110), and the user needs to manually adjust the position of another page element (e.g. the page element 111 or the page element 112) in order to view the subject information of the target page 110 covered by that page element.
In summary, when the electronic device displays the target page, since some other page elements in the target page generally occlude a portion of the subject information of the target page, the user needs to manually adjust the positions of these other page elements in order to view the occluded subject information. Therefore, the convenience of the user for controlling the page display is reduced, and the browsing experience of the user is poor.
Aiming at the problems in the related art, the embodiment of the disclosure provides a page display method, which can improve the convenience of a user in controlling page display and realize better user browsing experience.
It should be noted that the method steps in the subsequent embodiments of the present disclosure may be performed by a page display device. The page display device may be the above-mentioned electronic device, a client (e.g., an APP) in the electronic device, or a server connected to the electronic device. The page display device may also be a part of the functional modules (such as a central processing unit (Central Processing Unit, CPU)) in the electronic device and/or a part of the functional modules (such as a CPU) in the server; the embodiments of the present disclosure do not limit the execution subject of the page display method. The server provides data support for an APP, which includes generating the target page of the APP and updating the target page of the APP in response to the user's operations on that target page.
By way of example, the electronic device in the embodiments of the present disclosure may be a mobile phone, a video player, a smart television, a tablet, a desktop, a laptop, a handheld computer, a notebook, an ultra-mobile personal computer (UMPC), a netbook, a personal digital assistant (personal digital assistant, PDA), an augmented reality (augmented reality, AR)/virtual reality (VR) device, and the like, and the embodiments of the present disclosure do not limit the specific form of the electronic device.
Illustratively, as shown in a schematic structural diagram of an electronic device in fig. 2, the electronic device 200 in the embodiment of the disclosure includes a processor 201 and a memory 202.
Processor 201 may include one or more processing cores, such as a 4-core processor, an 8-core processor, etc., among others. The processor 201 may include an application processor (application processor, AP), a modem processor, a graphics processor (graphics processing unit, GPU), an image signal processor (image signal processor, ISP), a controller, a memory, a video codec, a digital signal processor (digital signal processor, DSP), a baseband processor, and/or a neural network processor (neural-network processing unit, NPU), etc. Wherein the different processing units may be separate devices or may be integrated in one or more processors.
Memory 202 may include one or more computer-readable storage media, which may be non-transitory. Memory 202 may also include high-speed random access memory, as well as non-volatile memory, such as one or more magnetic disk storage devices, flash memory storage devices. In some embodiments, a non-transitory computer readable storage medium in memory 202 is used to store at least one instruction for execution by processor 201 to implement the page display method provided by embodiments of the present disclosure.
In some embodiments, the electronic device 200 may further optionally include: a peripheral interface 203 and at least one peripheral. The processor 201, memory 202, and peripheral interface 203 may be connected via buses or signal lines. The individual peripheral devices may be connected to the peripheral device interface 203 via buses, signal lines, or circuit boards. Specifically, the peripheral device includes: at least one of radio frequency circuitry 204, a display 205, a camera assembly 206, audio circuitry 207, a positioning assembly 208, and a power supply 209.
The peripheral interface 203 may be used to connect at least one Input/Output (I/O) related peripheral to the processor 201 and the memory 202. In some embodiments, processor 201, memory 202, and peripheral interface 203 are integrated on the same chip or circuit board. In some other embodiments, any one or both of the processor 201, the memory 202, and the peripheral interface 203 may be implemented on a separate chip or circuit board, which is not limited in this embodiment.
The Radio Frequency circuit 204 is used to receive and transmit RF (Radio Frequency) signals, also known as electromagnetic signals. The radio frequency circuitry 204 communicates with a communication network and other communication devices via electromagnetic signals. The radio frequency circuit 204 converts an electrical signal into an electromagnetic signal for transmission, or converts a received electromagnetic signal into an electrical signal. Optionally, the radio frequency circuit 204 includes: antenna systems, RF transceivers, one or more amplifiers, tuners, oscillators, digital signal processors, codec chipsets, subscriber identity module cards, and so forth. The radio frequency circuitry 204 may communicate with other electronic devices via at least one wireless communication protocol. The wireless communication protocol includes, but is not limited to: metropolitan area networks, various generations of mobile communication networks (2G, 3G, 4G, and 5G), wireless local area networks, and/or Wi-Fi (Wireless Fidelity ) networks. In some embodiments, the radio frequency circuitry 204 may also include NFC (Near Field Communication, short range wireless communication) related circuitry, which is not limited by the present disclosure.
The display screen 205 is used to display a UI (User Interface). The UI may include graphics, text, icons, video, and any combination thereof. When the display 205 is a touch screen, the display 205 also has the ability to collect touch signals at or above the surface of the display 205. The touch signal may be input as a control signal to the processor 201 for processing. At this time, the display 205 may also be used to provide virtual buttons and/or a virtual keyboard, also referred to as soft buttons and/or a soft keyboard. In some embodiments, the display 205 may be one, providing a front panel of the electronic device 200; the display 205 may be made of LCD (Liquid Crystal Display ), OLED (Organic Light-Emitting Diode) or other materials.
The camera assembly 206 is used to capture images or video. Optionally, the camera assembly 206 includes a front camera and a rear camera. In general, a front camera is disposed on a front panel of an electronic device, and a rear camera is disposed on a rear surface of the electronic device. The audio circuit 207 may include a microphone and a speaker. The microphone is used for collecting sound waves of users and environments, converting the sound waves into electric signals, and inputting the electric signals to the processor 201 for processing, or inputting the electric signals to the radio frequency circuit 204 for voice communication. For purposes of stereo acquisition or noise reduction, the microphone may be multiple and separately disposed at different locations of the electronic device 200. The microphone may also be an array microphone or an omni-directional pickup microphone. The speaker is used to convert electrical signals from the processor 201 or the radio frequency circuitry 204 into sound waves. The speaker may be a conventional thin film speaker or a piezoelectric ceramic speaker. When the speaker is a piezoelectric ceramic speaker, not only the electric signal can be converted into a sound wave audible to humans, but also the electric signal can be converted into a sound wave inaudible to humans for ranging and other purposes. In some embodiments, the audio circuit 207 may also include a headphone jack.
The location component 208 is used to locate the current geographic location of the electronic device 200 to enable navigation or LBS (Location Based Service, location-based services). The positioning component 208 may be a positioning component based on the GPS (Global Positioning System) of the United States, the Beidou system of China, the GLONASS system of Russia, or the Galileo system of the European Union.
The power supply 209 is used to power the various components in the electronic device 200. The power source 209 may be alternating current, direct current, disposable or rechargeable. When the power source 209 comprises a rechargeable battery, the rechargeable battery may support wired or wireless charging. The rechargeable battery may also be used to support fast charge technology.
In some embodiments, the electronic device 200 further includes one or more sensors 210. The one or more sensors 210 include, but are not limited to: acceleration sensor, gyroscope sensor, pressure sensor, fingerprint sensor, optical sensor, and proximity sensor.
The acceleration sensor may detect the magnitudes of accelerations on three coordinate axes of the coordinate system established with the electronic device 200. The gyro sensor may detect a body direction and a rotation angle of the electronic device 200, and the gyro sensor may collect a 3D motion of the user on the electronic device 200 in cooperation with the acceleration sensor. The pressure sensor may be disposed on a side frame of the electronic device 200 and/or on an underlying layer of the display 205. When the pressure sensor is provided at a side frame of the electronic apparatus 200, a grip signal of the user to the electronic apparatus 200 may be detected. The fingerprint sensor is used for collecting fingerprints of a user. The optical sensor is used to collect the ambient light intensity. A proximity sensor, also called a distance sensor, is typically provided on the front panel of the electronic device 200. The proximity sensor is used to capture the distance between the user and the front of the display 205 of the electronic device 200.
It is to be understood that the structure shown in fig. 2 is not limiting of the electronic device 200 and may include more or fewer components than shown, or may combine certain components, or may employ a different arrangement of components.
In the embodiments of the disclosure, it is considered that, when the electronic device displays the target page, some other page elements in the target page (i.e., information different from the subject information of the target page) cover part of the subject information in the target page; and, if the user's line of sight falls on the occluded subject information, the user has to manually adjust the other page elements that occlude it. Based on this, in the embodiments of the present disclosure, when the electronic device displays the target page, the gazing area of the user on the target page is obtained. The electronic device then removes the other page elements in the target page that are within the gazing area. In this way, the problem of other page elements occluding the subject information in the area the user is gazing at is solved without manual operation by the user.
Referring to fig. 3, a flowchart of a page display method according to an embodiment of the disclosure is provided. As shown in fig. 3, the page display method may include S301 to S304.
S301, displaying a target page on a display screen; the target page includes at least one page element.
The page display device receives a viewing operation of a user on a certain target page, and can control the display screen to display the target page in response to the viewing operation.
Illustratively, taking the case where the electronic device performs the page display method as an example, the electronic device includes a display screen. The electronic device receives a user's viewing operation on a certain target page and, in response to the viewing operation, displays the target page through the display screen.
Taking a page display method executed by a server as an example, the server receives the above-mentioned viewing operation from the electronic device and generates the target page; and then the target page is sent to the electronic equipment, and the electronic equipment is triggered to display the target page on the display screen.
In embodiments of the present disclosure, the target page may include at least one page element. The at least one page element may include a base page element and/or other page elements that are different from the base page element. The basic page element refers to subject information of the target page. The other page element refers to information (called preset information) different from the subject information of the target page, such as media resources (may be called advertisements), prompt information, push message, and the like. Each page element may be text, video, picture, etc.
In the embodiment of the disclosure, the page display device may display the at least one page element by means of interface controls (or UI components). The at least one page element may be in one-to-one correspondence with at least one interface control. The interface controls include popup windows, floating frames, and the like.
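As a rough illustration of the one-to-one correspondence between page elements and interface controls mentioned above, a sketch such as the following is conceivable; the class names, the control kinds and the element identifiers are assumptions made for this example.

```python
from dataclasses import dataclass
from enum import Enum, auto

class ControlKind(Enum):
    POPUP_WINDOW = auto()
    FLOATING_FRAME = auto()
    INLINE = auto()

@dataclass
class InterfaceControl:
    element_id: str      # one interface control per page element (one-to-one correspondence)
    kind: ControlKind
    visible: bool = True

# For instance, the floating-frame elements 111 and 112 of FIG. 1 could be carried by:
controls = [
    InterfaceControl("page_element_111", ControlKind.FLOATING_FRAME),
    InterfaceControl("page_element_112", ControlKind.FLOATING_FRAME),
]
```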
S302, determining the position of a gazing area in a target page; the gazing area is an area where the eyes of the user gaze at the target page.
The page display device may determine a position of the gazing area in the target page while displaying the target page.
In the embodiment of the disclosure, the page display device can acquire features of the eyeballs and the areas around the eyeballs, track the changes of the user's eyes according to these features, and determine the position of the gazing area in the target page.
Alternatively, the page display device may acquire changes in the iris angle, track the changes of the user's eyes according to the changes in the iris angle, and determine the position of the gazing area in the target page.
The page display device can acquire the position information of the gaze point of the eyes of the user on the display screen and the area of the gaze area; and determining the position of the gazing area in the target page according to the position information of the gazing point and the area of the gazing area. The gaze point may refer to a center point of the user's gaze.
Wherein the gaze area may be circular, elliptical or other shape.
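A minimal sketch of this step, assuming a circular gazing area, screen coordinates measured in pixels from the top-left corner of the display screen, and a hypothetical page scroll offset, could look as follows.

```python
import math

def gazing_area_in_page(gaze_point_px, area_px2, page_scroll_offset=(0.0, 0.0)):
    """Locate a circular gazing area in the target page.

    gaze_point_px      -- (x, y) of the gaze point on the display screen
    area_px2           -- area of the gazing area, in square pixels
    page_scroll_offset -- how far the page content is scrolled relative to the
                          screen (an assumption of this sketch, not of the disclosure)
    """
    radius = math.sqrt(area_px2 / math.pi)            # area = pi * r^2
    cx = gaze_point_px[0] + page_scroll_offset[0]     # the gaze point is the center point
    cy = gaze_point_px[1] + page_scroll_offset[1]
    return (cx, cy, radius)
```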
Illustratively, suppose the page display device is a mobile phone and the gazing area is circular. A plurality of APPs, such as a first APP and a second APP that are different from each other, are installed on the mobile phone 400. The mobile phone 400 receives the user's operation of running the first APP, runs the first APP, and displays the target page 410 of the first APP through the display screen. The running operation is the user's viewing operation of the target page 410 in the first APP.
As shown in fig. 4 (a), the target page 410 may include a base page element and other page elements different from the base page element. The base page element includes a plurality of videos, such as Video1, Video2, and Video3. The other page elements may include the push information 411 of the second APP and the advertisement 412. It can be seen that the push information 411 and the advertisement 412 each cover a portion of the base page element of the target page 410.
Secondly, the mobile phone 400 can acquire the position information of the gaze point 421 of the eyes of the user on the mobile phone 400 and the area of the gaze area 422 while displaying the target page 410; the gaze area 422 is then determined based on the position information of the gaze point 421 and the area of the gaze area 422. As shown in fig. 4 (b), the mobile phone 400 determines a gaze area 422 centered on a gaze point 421.
S303, when the page elements located in the gazing area include the basic page element, determining that the page elements other than the basic page element are target page elements from the page elements located in the gazing area.
The information recorded by the basic page element belongs to the subject information of the target page. The information recorded by the target page element belongs to preset information; the preset information is information different from the subject information of the target page. The target page element overlays the base page element.
In the embodiment of the disclosure, the page display device may first determine whether the page element located in the gazing area includes a basic page element. And if the page element positioned in the gazing area comprises a basic page element, determining the target page element from the page elements positioned in the gazing area. If the page element located in the gazing area does not include the base page element, it is determined that there is no target page element.
The page display device can judge whether each page element in the gazing area is subject information of a target page or not; the page element which is the subject information of the target page is determined as the base page element.
In an embodiment of the present disclosure, the preset information includes at least one of the following: recommendation information, hint information, media assets (e.g., advertisements).
It can be understood that the recommendation information, prompt information and media resources included in the preset information are all information different from the subject information of the target page. For a user viewing the target page, the subject information of the target page is more important than information different from it (namely, the preset information); therefore, all such information different from the subject information of the target page can be determined as preset information. Furthermore, a target page element carrying preset information can be regarded as a page element of lower importance, and removing it avoids the degraded browsing experience caused by a less important target page element covering the more important subject information (namely, the basic page element) of the target page.
In the embodiment of the disclosure, the page display device may determine that all page elements located in the gazing area except the base page element are target page elements in the case where the base page element exists in the gazing area.
Alternatively, when a base page element exists in the gazing area, the electronic device may take the page elements located in the gazing area other than the base page element as candidate page elements, judge whether each candidate page element carries preset information, and determine the candidate page elements that carry preset information as target page elements.
The page display device pre-stores, for each page element in the target page, whether it carries subject information or preset information.
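The candidate-based variant of S303 described above could be sketched as follows; the two predicate callables stand in for the pre-stored per-element flags and are assumptions of this sketch.

```python
def select_target_elements(elements_in_gaze, is_subject_info, is_preset_info):
    """Pick target page elements from the page elements located in the gazing area.

    elements_in_gaze -- page elements whose bounds intersect the gazing area
    is_subject_info  -- callable: does this element carry subject information?
    is_preset_info   -- callable: does this element carry preset information?
    """
    if not any(is_subject_info(e) for e in elements_in_gaze):
        return []  # no base page element in the gazing area, nothing to remove
    candidates = [e for e in elements_in_gaze if not is_subject_info(e)]
    return [e for e in candidates if is_preset_info(e)]
```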
Illustratively, continuing with the example of the mobile phone 400 shown in (b) of fig. 4, the mobile phone 400 determines the advertisement 412 as a target page element, where the advertisement 412 is located in the gazing area 422 and is a media resource belonging to the preset information.
S304, removing the target page element from the gazing area.
The page display device may remove the target page elements in the gazing area, so that no target page elements are in the gazing area after the removal.
Taking an example of the electronic device executing the page display method, the electronic device directly removes the target page element from the gazing area after determining the target page element.
Taking a page display method executed by a server as an example, after determining a target page element, the server removes the target page element from a gazing area of the target page, and generates a removed target page; and then sending the removed target page to the electronic equipment so as to trigger the electronic equipment to display the removed target page.
In the embodiment of the disclosure, the page display device can hide the target page element. Alternatively, the page display device may move the target page element from the gazing region to other regions in the target page than the gazing region. Alternatively still, the page display device may delete the target page element.
After the target page element is hidden, if the gazing area changes and the electronic device determines that the target page element is no longer located in the new gazing area, the electronic device displays the target page element again.
After the page display device moves the target page element from the gazing area to another area of the target page outside the gazing area, if the gazing area changes and the page display device determines that the new gazing area does not overlap the original gazing area (i.e., there is no overlapping region between them), the page display device moves the target page element back to its initial position. The initial position refers to the position of the target page element before it was moved to the other area.
It should be noted that hiding, moving or deleting the target page element after it is determined is only an exemplary way of removing the target page element. After being hidden, moved or deleted, the target page element no longer covers the basic page element; the embodiments of the disclosure do not limit the specific manner of removing the target page element.
Since the electronic device may display the at least one page element by means of interface controls, the electronic device may move the interface control bearing the target page element to the other area, delete the interface control bearing the target page element, or hide the interface control bearing the target page element.
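A possible sketch of the three removal strategies and of the restore behaviour described above is given below; the interface-control objects and their hide(), show(), move_to(), delete() methods and position attribute are assumptions, not a real UI framework API.

```python
class ElementRemover:
    """Remove a target page element by hiding, moving or deleting its interface control."""

    def __init__(self, strategy="hide"):
        self.strategy = strategy          # "hide", "move" or "delete"
        self._initial_positions = {}

    def remove(self, control, free_area_position=None):
        if self.strategy == "hide":
            control.hide()
        elif self.strategy == "move":
            self._initial_positions[id(control)] = control.position
            control.move_to(free_area_position)   # an area of the page outside the gazing area
        elif self.strategy == "delete":
            control.delete()

    def restore(self, control):
        # Called when the element no longer falls in the new gazing area (hide case),
        # or when the new gazing area does not overlap the old one (move case).
        if self.strategy == "hide":
            control.show()
        elif self.strategy == "move" and id(control) in self._initial_positions:
            control.move_to(self._initial_positions.pop(id(control)))
```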
Illustratively, continuing with the example of the mobile phone 400 shown in (b) of fig. 4, after the mobile phone 400 determines that the advertisement 412 is a target page element, the advertisement 412 may be hidden. As shown in fig. 5 (a), the target page 410 displayed by the mobile phone 400 does not include the advertisement 412. Then, when the mobile phone determines a new gazing area 430 and determines that the advertisement 412 is not located within the new gazing area 430, the advertisement 412 is displayed again. As shown in fig. 5 (b), the mobile phone 400 redisplays the advertisement 412, and the advertisement 412 does not block the subject information of the target page 410 in the new gazing area 430.
Illustratively, continuing with the example of the mobile phone 400 shown in (b) of fig. 4, after the mobile phone 400 determines that the advertisement 412 is a target page element, the advertisement 412 may be moved to another area of the target page 410 outside the gazing area 422. As shown in fig. 6 (a), the mobile phone 400 moves the advertisement 412 to the top area among the other areas. Then, when the mobile phone determines a new gazing area 430 and determines that the new gazing area 430 does not overlap the gazing area 422, it moves the advertisement 412 back to its initial position. As shown in fig. 6 (b), the mobile phone 400 moves the advertisement 412 back to the initial position it occupied before being moved, and the advertisement 412 does not block the subject information of the target page 410 in the new gazing area 430.
It will be appreciated that the page display device determines the location of the user's gaze area on the target page, i.e. the area of the target page that the user is looking at, when the target page is displayed. Then, if the target page has a base page element in the gaze area, the electronic device determines a target page element located in the gaze area (i.e., the area the user is viewing). Since the target page element is also located in the gazing area (i.e., the area the user is viewing), the target page element overlays the base page element in the gazing area and the base page element belongs to the subject information of the target page, i.e., the target page element overlays the subject information of the target page the user is viewing. Thus, the electronic device removes the target page element from the gaze area, i.e. displays the subject information of the target page that the user is viewing. Thus, without manual operation of a user, the shielding of the subject information of the target page being viewed by the user is automatically removed, and the subject information of the target page being viewed by the user is displayed. Therefore, the convenience of the user for controlling the page display is improved, and the browsing experience of the user is further improved.
In the embodiment of the disclosure, since the user's line of sight resembles a cone, the user's gazing area on the electronic device may be the intersection of that cone with the plane in which the display screen of the electronic device is located, i.e., the gazing area may be elliptical or circular. Therefore, the electronic device can first acquire the distance between the display screen and the user's eyes and the user's target visual angle; then, based on the geometric relationship among the user's eyes, the user's target visual angle and the plane of the display screen, calculate the diameter of the gazing area using that distance and the target visual angle; and finally, calculate the area of the gazing area according to its diameter.
Illustratively, as shown in the schematic diagram of the geometric relationship between the user's viewing angle and the display screen in FIG. 7, the user's eye 710 is looking straight at the display screen 720 of the electronic device. The user's maximum visual angle α bounds the user's line of sight, which forms a cone whose front view is the triangle 730 in fig. 7. The visual angle within which the user can correctly identify information is the target viewing angle β, and the area within the target viewing angle β is the gazing area. The electronic device may calculate the diameter B of the gazing area according to the distance D between the user's eyes and the display screen 720 of the electronic device and the target viewing angle β.
Referring to fig. 8, S302 in the above page display method may include S801 to S803.
S801, acquiring position information of the gaze point of the user's eyes on the display screen, the distance between the display screen and the user's eyes, and the target visual angle of the user; the target visual angle is the visual angle within which the user can correctly identify information.
Illustratively, taking the case where the electronic device performs the page display method as an example, the electronic device may include a front camera. The electronic device can acquire, through the front camera, a face image including the user's eyes or face, and process the face image to determine the position information of the gaze point of the user's eyes on the display screen.
Illustratively, taking the case where the server performs the page display method as an example, the server receives the face image and the distance between the display screen and the user's eyes from the electronic device, and also obtains the user's target viewing angle. The server then processes the face image and determines the position information of the gaze point of the user's eyes on the display screen.
In an embodiment of the present disclosure, the position information of the gaze point may include coordinates of the gaze point, which are coordinates of the gaze point taken under the first coordinate system. The origin of the first coordinate system is the top left vertex of the display screen of the electronic device, the positive direction of the X axis of the first coordinate system is the direction from the top left vertex to the top right vertex of the display screen of the electronic device, and the positive direction of the Y axis of the first coordinate system is the direction from the top left vertex to the bottom left vertex of the display screen of the electronic device.
The electronic device may also include a distance sensor, for example. The electronic equipment can obtain the distance between the display screen and the eyes of the user through measurement of the distance sensor.
In the embodiment of the present disclosure, the target viewing angle of the user may be a preset viewing angle or a viewing angle measured by the electronic device. The preset viewing angle may be taken from the range of viewing angles within which the user's eyes can correctly recognize information. This range means that, as long as a certain distance is maintained between the user's eyes and the information, the user can accurately recognize information falling within that range of viewing angles.
For example, if the viewing angle range within which the user's eyes can correctly identify information is 10°-20°, the preset viewing angle can take a value from 10°-20°; for instance, the preset viewing angle may be the maximum value of that range, namely 20°.
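For illustration, gathering the three inputs of S801 might look like the sketch below; the front_camera, distance_sensor and gaze_estimator objects and their methods are placeholders rather than a real device API, and the 20° value is simply the example maximum given above.

```python
PRESET_VIEW_ANGLE_DEG = 20.0   # taken from the assumed 10 deg - 20 deg range

def gather_gaze_inputs(front_camera, distance_sensor, gaze_estimator):
    """Collect the gaze point, the screen-to-eye distance and the target viewing angle."""
    face_image = front_camera.capture()               # image containing the user's face
    distance_cm = distance_sensor.read()              # distance between display screen and eyes
    gaze_point = gaze_estimator.estimate(face_image)  # (x, y) in the first coordinate system
    return gaze_point, distance_cm, PRESET_VIEW_ANGLE_DEG
```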
S802, determining the area of the gazing area according to the distance and the target visual angle.
In the embodiment of the disclosure, the page display device may input the distance and the target viewing angle to a preset trigonometric function to obtain the diameter of the gazing area; and calculating the area of the gazing area according to the diameter of the gazing area.
The preset trigonometric function is used for calculating the length of a second right-angle side in the right-angle triangle according to the length of the first right-angle side in the right-angle triangle and the acute angle.
In the embodiment of the disclosure, the page display device may input the diameter of the gazing area to a preset area calculation function, and calculate the area of the gazing area. The preset area calculation function is used for calculating the area of the gazing area according to the relation between the diameter and the circular area.
The page display device may calculate, when the gaze area is circular (or when it is determined that the user is looking at the display screen), an area of the gaze area by using the preset area calculation function.
Illustratively, the target viewing angle β and the gaze area in fig. 7 are taken as examples. The preset trigonometric function may be as shown in formula (1), and the preset area calculation function may be as shown in formula (2).
B=D×tan(β/2)×2 (1)
S=π×(B/2)² (2)
Wherein S is the area of the gazing region; b/2 is the radius of the gaze area.
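A short numeric sketch of formulas (1) and (2), using assumed example values (D = 30 cm, β = 20°), is shown below.

```python
import math

def gaze_area_from_geometry(distance, view_angle_deg):
    """Diameter B and area S of the gazing area from formulas (1) and (2)."""
    beta = math.radians(view_angle_deg)
    diameter = distance * math.tan(beta / 2) * 2   # (1)  B = D x tan(beta/2) x 2
    area = math.pi * (diameter / 2) ** 2           # (2)  S = pi x (B/2)^2
    return diameter, area

# Assumed example: D = 30 cm, beta = 20 deg  ->  B is about 10.6 cm, S is about 88 cm^2
print(gaze_area_from_geometry(30.0, 20.0))
```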
S803, determining the position of the gazing area in the target page according to the position information of the gazing point and the area of the gazing area.
The page display device may determine the position of the gazing area in the target page according to the coordinates of the gazing point and the area of the gazing area. Wherein the gaze point is the center point of the gaze area.
In the embodiment of the present disclosure, the position information of the gaze point may be coordinates of the gaze point described above. After the page display device acquires a face image comprising a face of a user, a real-time coordinate system can be established by taking facial feature points of the face image as reference objects, and coordinates of pupils of the face image under the real-time coordinate system are determined; and determining the coordinates of the gaze point corresponding to the coordinates of the pupil of the face image under the real-time coordinate system according to the pre-established mapping relation between the pupil coordinates and the gaze point coordinates.
The facial feature points of the face image are feature points that do not change in position with respect to the head of the face image, such as eyebrow feature points, eye feature points, nose tip feature points, mouth corner feature points, and eyeglass feature points of an eyeglass wearer.
The mapping relation is obtained by training according to the sample face image and the sample gaze point coordinates corresponding to the sample face image. The sample face image is a face image including the sample user's face; the sample gaze point coordinates represent the location of the sample user's gaze point on the display screen. The sample face image and the face image are face images of the same user, i.e., the sample user is the user.
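One way the real-time coordinate system and the mapping could be realised is sketched below; anchoring the frame on the two eye corners and using a second-order polynomial mapping are assumptions of this sketch, not requirements of the disclosure.

```python
import numpy as np

def pupil_in_face_frame(pupil_xy, left_eye_corner, right_eye_corner):
    """Express the pupil center in a real-time coordinate system anchored on facial feature points."""
    origin = np.asarray(left_eye_corner, dtype=float)
    x_axis = np.asarray(right_eye_corner, dtype=float) - origin
    scale = np.linalg.norm(x_axis)                 # inter-corner distance normalizes for head distance
    x_axis = x_axis / scale
    y_axis = np.array([-x_axis[1], x_axis[0]])     # perpendicular unit vector
    rel = np.asarray(pupil_xy, dtype=float) - origin
    return np.array([rel @ x_axis, rel @ y_axis]) / scale

def pupil_to_gaze_point(pupil_uv, mapping_w):
    """Apply a pre-trained mapping (here a 2nd-order polynomial) from pupil to gaze point coordinates."""
    u, v = pupil_uv
    features = np.array([1.0, u, v, u * v, u * u, v * v])
    return features @ mapping_w                    # (x, y) on the display screen, first coordinate system
```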
It can be appreciated that the page display device can determine the position of the gaze area in the target page by acquiring the face image, the distance between the display screen and the eyes of the user, and the target viewing angle of the user. The face image can be acquired through the front camera, the distance between the display screen and eyes of the user can be acquired through the distance sensor, and the target visual angle of the user can be determined according to the visual angle of the correct identification information of the user. That is, the position of the gazing area in the target page can be determined through the acquisition of the front camera and the distance sensor, and the page display method provided by the embodiment of the disclosure has low requirements on hardware equipment, so that the usability of the page display method is improved.
Secondly, since the face images of different users differ, the positional relationships among the feature points in their face images also differ; therefore, when the sample face image used to establish the mapping relation between pupil coordinates and gaze point coordinates and the face image to which it is applied are face images of the same person, the error introduced when the established mapping relation is applied to the face image can be reduced.
In the embodiment of the disclosure, the page display device establishes a mapping relationship between pupil coordinates and gaze point coordinates in advance before acquiring the coordinates of the gaze point. The method specifically comprises the following steps: the page display device obtains a sample face image and a sample fixation point coordinate corresponding to the sample face image; performing feature recognition on the sample face image, and determining pupils of the sample face image and facial feature points of the sample face image; according to the position relation among the facial feature points of the sample face image, a sample coordinate system taking the facial feature points of the sample face image as a reference object is established, and the coordinates of pupils of the sample face image under the sample coordinate system are determined; and fitting the coordinates of the pupil of the sample face image under the sample coordinate system with the sample fixation point coordinates corresponding to the sample face image to obtain the mapping relation between the pupil coordinates and the fixation point coordinates under the sample coordinate system.
The sample gaze point coordinates are the coordinates of the sample gaze point in the first coordinate system. The sample gaze point is a point marked in advance on the display screen of the electronic device. When the sample user looks at the sample gaze point, the page display device acquires the corresponding sample face image.
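To make the fitting step concrete, the sketch below fits a simple second-order polynomial mapping from sample pupil coordinates (in the sample coordinate system) to sample gaze point coordinates by ordinary least squares. The polynomial model and the NumPy-based implementation are assumptions for illustration; the embodiment does not specify a particular fitting method.

```python
import numpy as np

def _poly_features(pupil_xy):
    """Second-order polynomial features of a pupil coordinate pair (assumed model)."""
    x, y = pupil_xy
    return np.array([1.0, x, y, x * y, x * x, y * y])

def fit_pupil_to_gaze(sample_pupils, sample_gazes):
    """Fit the mapping pupil coordinates -> gaze point coordinates by least squares.

    sample_pupils: (N, 2) pupil coordinates in the sample coordinate system
    sample_gazes:  (N, 2) sample gaze point coordinates on the display screen
    Returns a (6, 2) coefficient matrix, one column per screen axis.
    """
    A = np.stack([_poly_features(p) for p in sample_pupils])
    coeffs, *_ = np.linalg.lstsq(A, sample_gazes, rcond=None)
    return coeffs

def pupil_to_gaze(pupil_xy, coeffs):
    """Apply the fitted mapping to one pupil coordinate pair."""
    return _poly_features(pupil_xy) @ coeffs

# Example with synthetic calibration data (pupil positions recorded while the
# sample user looked at pre-marked points on the screen):
pupils = np.array([[0.10, 0.02], [0.25, 0.03], [0.40, 0.01],
                   [0.12, 0.15], [0.30, 0.16], [0.42, 0.14]])
gazes = np.array([[100, 50], [500, 60], [900, 40],
                  [120, 500], [520, 510], [880, 480]], dtype=float)
coeffs = fit_pupil_to_gaze(pupils, gazes)
print(pupil_to_gaze(np.array([0.26, 0.09]), coeffs))
```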
It will be appreciated that the page display device, in order to achieve the above-described functions, includes corresponding hardware structures and/or software modules that perform the respective functions. Those of skill in the art will readily appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as hardware or combinations of hardware and computer software. Whether a function is implemented as hardware or computer software driven hardware depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the embodiments of the present disclosure.
The embodiments of the present disclosure may divide the functional modules of the above-described page display device or the like according to the above-described method example, for example, each functional module may be divided corresponding to each function, or two or more functions may be integrated in one processing module. The integrated modules may be implemented in hardware or in software functional modules. It should be noted that, in the embodiment of the present disclosure, the division of the modules is merely a logic function division, and other division manners may be implemented in actual practice.
Fig. 9 shows a schematic diagram of one possible configuration of the page display device related to the above-described embodiments, in the case where the functional modules are divided according to their corresponding functions. The page display device 900 includes: a gaze area determination module 901, an element determination module 902, and a display module 903. The display module 903 is configured to display a target page on a display screen; the gaze area determination module 901 is configured to determine the position of a gaze area in the target page; the element determination module 902 is configured to determine, when the page elements located in the gaze area include a basic page element, that the page elements other than the basic page element among the page elements located in the gaze area are target page elements. The display module 903 is further configured to remove the target page elements from the gaze area.
Wherein the target page comprises at least one page element. The gaze area is an area where the eyes of the user gaze at the target page. The information recorded by the basic page element belongs to the subject information of the target page; the information recorded by the target page element belongs to preset information; the preset information is information different from the subject information of the target page; and the target page element covers the basic page element.
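As a purely structural illustration of how these three modules could cooperate, the sketch below uses invented placeholder interfaces (page, element, and rectangle objects, and module methods such as locate and pick_targets); none of these names come from the disclosed embodiment.

```python
class PageDisplayDevice:
    """Minimal sketch of the three-module structure of page display device 900."""

    def __init__(self, gaze_module, element_module, display_module):
        self.gaze_module = gaze_module        # determines the gaze area position
        self.element_module = element_module  # picks the target page elements
        self.display_module = display_module  # shows the page / removes elements

    def refresh(self, target_page):
        self.display_module.show(target_page)
        gaze_rect = self.gaze_module.locate(target_page)
        in_area = [e for e in target_page.elements if gaze_rect.intersects(e.rect)]
        # Only act when a basic page element (subject information) lies in the gaze area.
        if any(e.is_basic for e in in_area):
            targets = self.element_module.pick_targets(in_area)  # non-basic elements
            for element in targets:
                self.display_module.remove_from_area(element, gaze_rect)
```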
In one possible implementation, the gaze area determination module 901 is specifically configured to: acquire position information of a gaze point of the user's eyes on the display screen, the distance between the display screen and the user's eyes, and the target visual angle of the user; determine the area of the gaze area according to the distance and the target visual angle; and determine the position of the gaze area in the target page according to the position information of the gaze point and the area of the gaze area.
The target visual angle is the visual angle at which the user can correctly identify information; the target visual angle is equal to the vertex angle of the cone formed by the user's eyes and the gaze area; and the gaze point is the center point of the gaze area.
In another possible implementation, the gaze area determination module 901 is specifically configured to: input the distance and the target visual angle into a preset trigonometric function to obtain the diameter of the gaze area, and calculate the area of the gaze area according to the diameter of the gaze area.
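For example, treating the target visual angle as the vertex angle of the cone formed by the eye and the circular gaze area, the diameter follows from a tangent and the area from the circle formula. A minimal sketch, assuming a circular gaze area and arbitrary but consistent length units:

```python
import math

def gaze_area_size(distance, target_visual_angle_deg):
    """Diameter and area of the gaze area from the eye-screen distance and the
    target visual angle (vertex angle of the eye / gaze-area cone)."""
    half_angle = math.radians(target_visual_angle_deg) / 2.0
    diameter = 2.0 * distance * math.tan(half_angle)   # the "preset trigonometric function"
    area = math.pi * (diameter / 2.0) ** 2
    return diameter, area

# Example: eyes about 500 mm from the screen and a 2-degree visual angle give a
# gaze circle roughly 17.5 mm across.
print(gaze_area_size(500.0, 2.0))
```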
In another possible implementation, the display module 903 is specifically configured to: hide the target page element; or move the target page element from the gaze area to an area of the target page other than the gaze area; or delete the target page element.
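The three alternatives can be pictured as in the sketch below; the element, page, and rectangle objects are hypothetical stand-ins for whatever rendering layer actually manages the page, and placing a moved element just below the gaze rectangle is only one possible choice.

```python
def remove_target_element(element, gaze_rect, page, strategy="hide"):
    """Clear a target page element out of the gaze area in one of three ways."""
    if strategy == "hide":
        element.visible = False                 # keep it in the layout, just not drawn
    elif strategy == "move":
        element.y = gaze_rect.bottom + 1        # relocate it outside the gaze area
    elif strategy == "delete":
        page.elements.remove(element)           # drop it from the target page entirely
    else:
        raise ValueError(f"unknown strategy: {strategy}")
```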
In another possible implementation manner, the preset information includes at least one of the following: recommendation information, prompt information, media resources.
In another possible implementation, the gaze area determination module 901 is specifically configured to obtain a face image including the face of the user, and determine the position information of the gaze point according to the face image.
In another possible implementation, the gaze area determination module 901 is specifically configured to: establish a real-time coordinate system taking facial feature points of the face image as reference objects, and determine the coordinates of the pupil of the face image under the real-time coordinate system; and determine the coordinates of the gaze point corresponding to the coordinates of the pupil of the face image under the real-time coordinate system according to the pre-established mapping relation between pupil coordinates and gaze point coordinates.
The mapping relation is obtained through training on the sample face image and the sample gaze point coordinates corresponding to the sample face image; the sample gaze point coordinates represent the position on the display screen of the sample user's gaze point.
Of course, the page display device 900 includes, but is not limited to, the unit modules listed above. For example, the page display device 900 may also include a storage module. The storage module may be used for storing the sample face image, the mapping relation between pupil coordinates and gaze point coordinates, and the like. In addition, the functions that the above functional units can implement include, but are not limited to, the functions corresponding to the method steps described in the above examples. For detailed descriptions of the other modules of the page display device 900, reference may be made to the detailed descriptions of the corresponding method steps, which are not repeated here in the embodiments of the present disclosure.
It will be appreciated that, when integrated units are used, all of the functions of the page display device 900 described above may be implemented by the electronic device 200 shown in Fig. 2. The functions of the respective modules of the page display device 900 may be implemented in the processor 201 of the electronic device 200; for example, the functions of the gaze area determination module 901, the element determination module 902, and the display module 903 described above may be integrated into the processor 201. The storage module of the page display device 900 corresponds to the memory 202 of the electronic device 200.
The disclosed embodiments also provide a computer-readable storage medium comprising computer instructions which, when run on the above-described electronic device, cause the electronic device to perform the functions or steps of the method embodiments described above. For example, the computer-readable storage medium may be a read-only memory (ROM), a random access memory (RAM), a compact disc read-only memory (CD-ROM), a magnetic tape, a floppy disk, an optical data storage device, or the like.
The disclosed embodiments also provide a computer program product which, when run on an electronic device, causes the electronic device to perform the functions or steps of the method embodiments described above.
It will be apparent to those skilled in the art from this description that, for convenience and brevity of description, only the above-described division of the functional modules is illustrated, and in practical application, the above-described functional allocation may be performed by different functional modules according to needs, i.e. the internal structure of the apparatus is divided into different functional modules to perform all or part of the functions described above.
In the several embodiments provided in the present disclosure, it should be understood that the disclosed apparatus and method may be implemented in other manners. For example, the apparatus embodiments described above are merely illustrative, e.g., the division of modules or units is merely a logical function division, and there may be additional divisions when actually implemented, e.g., multiple units or components may be combined or integrated into another apparatus, or some features may be omitted or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed with each other may be an indirect coupling or communication connection via some interfaces, devices or units, which may be in electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and the parts shown as units may be one physical unit or a plurality of physical units, may be located in one place, or may be distributed in a plurality of different places. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in each embodiment of the present disclosure may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in software functional units.
The integrated units, if implemented in the form of software functional units and sold or used as stand-alone products, may be stored in a readable storage medium. Based on such an understanding, the technical solutions of the embodiments of the present disclosure, in essence, or the part contributing to the prior art, or all or part of the technical solutions, may be embodied in the form of a software product stored in a storage medium, including several instructions for causing a device (which may be a single-chip microcomputer, a chip, or the like) or a processor to perform all or part of the steps of the methods described in the embodiments of the present disclosure. The aforementioned storage medium includes: a USB flash drive, a removable hard disk, a ROM, a RAM, a magnetic disk, an optical disc, or the like.
The foregoing is merely a specific embodiment of the disclosure, but the protection scope of the disclosure is not limited thereto, and any changes or substitutions within the technical scope of the disclosure should be covered in the protection scope of the disclosure. Therefore, the protection scope of the present disclosure shall be subject to the protection scope of the claims.

Claims (16)

1. A method of displaying a page, the method comprising:
displaying a target page on a display screen; the target page comprises at least one page element;
determining the position of a gazing area in the target page; the gazing area is an area where the eyes of the user gaze at the target page;
when the page elements in the gazing area comprise basic page elements, determining that the page elements except the basic page elements are target page elements from the page elements in the gazing area; wherein, the information recorded by the basic page element belongs to the subject information of the target page; the information recorded by the target page element belongs to preset information; the preset information is information different from the subject information of the target page; the target page element covers the basic page element;
and removing the target page element from the gazing area.
2. The method of claim 1, wherein the determining the location of the gaze area in the target page comprises:
acquiring position information of a gaze point of the eyes of the user on the display screen, a distance between the display screen and the eyes of the user, and a target visual angle of the user; the target visual angle is the visual angle at which the user can correctly identify information;
determining the area of the gazing area according to the distance and the target visual angle; wherein the target viewing angle is equal to the vertex angle of a cone formed by the eyes of the user and the gaze area;
determining the position of the gazing area in the target page according to the position information of the gazing point and the area of the gazing area; the gaze point is a center point of the gaze area.
3. The method of claim 2, wherein the determining the area of the gaze area based on the distance and the target viewing angle comprises:
inputting the distance and the target visual angle into a preset trigonometric function to obtain the diameter of the gazing area;
and calculating the area of the gazing area according to the diameter of the gazing area.
4. A method according to any of claims 1-3, wherein said removing the target page element from the gaze area comprises:
hiding the target page element;
or, moving the target page element from the gazing area to other areas except the gazing area in the target page;
or deleting the target page element.
5. A method according to any one of claims 1-3, wherein the preset information comprises at least one of: recommendation information, prompt information, media resources.
6. A method according to claim 2 or 3, wherein said obtaining location information of a gaze point of the user's eye on the display screen comprises:
acquiring a face image comprising the face of the user;
and determining the position information of the gaze point according to the face image.
7. The method of claim 6, wherein the determining the location information of the gaze point from the face image comprises:
establishing a real-time coordinate system by taking facial feature points of the face image as reference objects, and determining coordinates of pupils of the face image under the real-time coordinate system;
determining the coordinates of the gaze point corresponding to the coordinates of the pupil of the face image under the real-time coordinate system according to a pre-established mapping relation between the pupil coordinates and the gaze point coordinates;
the mapping relation is obtained through training according to a sample face image and a sample fixation point coordinate corresponding to the sample face image; the sample gaze point coordinates represent a position of a sample user's gaze point on the display screen.
8. A page display device, characterized in that the page display device comprises: the device comprises a gazing area determining module, an element determining module and a display module;
the display module is used for displaying a target page on a display screen; the target page comprises at least one page element;
the gazing area determining module is used for determining the position of the gazing area in the target page; the gazing area is an area where the eyes of the user gaze at the target page;
the element determining module is used for determining that page elements except the basic page element are target page elements from page elements positioned in the gazing area when the page elements positioned in the gazing area comprise the basic page element; wherein, the information recorded by the basic page element belongs to the subject information of the target page; the information recorded by the target page element belongs to preset information; the preset information is information different from the subject information of the target page; the target page element covers the basic page element;
The display module is further configured to remove the target page element from the gaze area.
9. The apparatus of claim 8, wherein the gaze region determination module for determining a location of a gaze region in the target page comprises:
the gazing area determining module is specifically configured to:
acquiring position information of a gaze point of the eyes of the user on the display screen, a distance between the display screen and the eyes of the user, and a target visual angle of the user; the target visual angle is the visual angle at which the user can correctly identify information;
determining the area of the gazing area according to the distance and the target visual angle; wherein the target viewing angle is equal to the vertex angle of a cone formed by the eyes of the user and the gaze area;
determining the position of the gazing area in the target page according to the position information of the gazing point and the area of the gazing area; the gaze point is a center point of the gaze area.
10. The apparatus of claim 9, wherein the gaze region determination module configured to determine an area of the gaze region based on the distance and the target viewing angle comprises:
The gazing area determining module is specifically configured to:
inputting the distance and the target visual angle into a preset trigonometric function to obtain the diameter of the gazing area;
and calculating the area of the gazing area according to the diameter of the gazing area.
11. The apparatus of any of claims 8-10, wherein the display module to remove the target page element from the gaze area comprises:
the display module is specifically configured to: hide the target page element; or move the target page element from the gazing area to other areas of the target page except the gazing area; or delete the target page element.
12. The apparatus according to any one of claims 8-10, wherein the preset information comprises at least one of: recommendation information, prompt information, media resources.
13. The apparatus according to claim 9 or 10, wherein the gaze area determination module, configured to obtain location information of a gaze point of the user's eyes on the display screen, comprises:
the gazing area determining module is specifically configured to obtain a face image including the face of the user, and determine the position information of the gaze point according to the face image.
14. The apparatus of claim 13, wherein the gaze region determination module configured to determine location information of the gaze point from the face image comprises:
the gazing area determining module is specifically configured to:
establishing a real-time coordinate system by taking facial feature points of the face image as reference objects, and determining coordinates of pupils of the face image under the real-time coordinate system;
determining the coordinates of the gaze point corresponding to the coordinates of the pupil of the face image under the real-time coordinate system according to a pre-established mapping relation between the pupil coordinates and the gaze point coordinates;
the mapping relation is obtained through training according to a sample face image and a sample fixation point coordinate corresponding to the sample face image; the sample gaze point coordinates represent a position of a sample user's gaze point on the display screen.
15. An electronic device, comprising: a processor and a memory for storing instructions executable by the processor;
wherein the processor is configured to execute the instructions to cause the electronic device to perform the page display method of any one of claims 1 to 7.
16. A computer readable storage medium having stored thereon computer instructions which, when executed by a processor of an electronic device, cause the electronic device to perform the page display method of any of claims 1 to 7.
CN202011439816.6A 2020-12-10 2020-12-10 Page display method and device, electronic equipment and storage medium Active CN112506345B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011439816.6A CN112506345B (en) 2020-12-10 2020-12-10 Page display method and device, electronic equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011439816.6A CN112506345B (en) 2020-12-10 2020-12-10 Page display method and device, electronic equipment and storage medium

Publications (2)

Publication Number Publication Date
CN112506345A CN112506345A (en) 2021-03-16
CN112506345B true CN112506345B (en) 2024-04-16

Family

ID=74970783

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011439816.6A Active CN112506345B (en) 2020-12-10 2020-12-10 Page display method and device, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN112506345B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023034049A1 (en) * 2021-08-30 2023-03-09 Chinook Labs Llc Restricting display area of applications
CN114356478A (en) * 2021-12-23 2022-04-15 上海万物新生环保科技集团有限公司 Method and equipment for determining page content display
CN114610432A (en) * 2022-03-17 2022-06-10 芜湖汽车前瞻技术研究院有限公司 Graphic display control method, device, equipment and storage medium for vehicle-mounted display screen

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101489120A (en) * 2009-01-21 2009-07-22 北京中星微电子有限公司 System, apparatus and method for implementing video monitoring regional occlusion
CN105229584A (en) * 2013-05-29 2016-01-06 三菱电机株式会社 Information display device
CN109271027A (en) * 2018-09-17 2019-01-25 北京旷视科技有限公司 Page control method, device and electronic equipment
CN109765994A (en) * 2017-11-09 2019-05-17 托比股份公司 Improvement to the protection and the access that calculate the data in equipment
WO2019171128A1 (en) * 2018-03-06 2019-09-12 Yogesh Chunilal Rathod In-media and with controls advertisement, ephemeral, actionable and multi page photo filters on photo, automated integration of external contents, automated feed scrolling, template based advertisement post and actions and reaction controls on recognized objects in photo or video
CN111488096A (en) * 2020-04-02 2020-08-04 上海连尚网络科技有限公司 Method and equipment for displaying interactive presentation information in reading application

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2395420B1 (en) * 2009-02-05 2018-07-11 Panasonic Intellectual Property Corporation of America Information display device and information display method
US10841662B2 (en) * 2018-07-27 2020-11-17 Telefonaktiebolaget Lm Ericsson (Publ) System and method for inserting advertisement content in 360° immersive video
US10902678B2 (en) * 2018-09-06 2021-01-26 Curious Company, LLC Display of hidden information

Also Published As

Publication number Publication date
CN112506345A (en) 2021-03-16

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant