CN108156321B - Split screen display method and terminal - Google Patents


Publication number
CN108156321B
Authority
CN
China
Prior art keywords
terminal
display area
user
display
dangerous
Prior art date
Legal status
Active
Application number
CN201711441438.3A
Other languages
Chinese (zh)
Other versions
CN108156321A (en)
Inventor
隋亮
胡彩月
Current Assignee
Shanghai Chuanying Information Technology Co Ltd
Original Assignee
Shanghai Spreadrise Technologies Co Ltd
Priority date
Filing date
Publication date
Application filed by Shanghai Spreadrise Technologies Co Ltd
Priority to CN201711441438.3A
Publication of CN108156321A
Application granted
Publication of CN108156321B
Legal status: Active

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M1/00 Substation equipment, e.g. for use by subscribers
    • H04M1/72 Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724 User interfaces specially adapted for cordless or mobile telephones
    • H04M1/72448 User interfaces with means for adapting the functionality of the device according to specific conditions
    • H04M1/72454 User interfaces adapting the functionality of the device according to context-related or environment-related conditions
    • H04M1/72469 User interfaces for operating the device by selecting functions from two or more displayed items, e.g. menus or icons

Abstract

The application discloses a split-screen display method and a terminal. The method includes: the terminal detects that the user's line of sight falls on its display screen; in response to this detection, the terminal judges whether the user is in a dangerous environment; if so, the terminal divides the display screen into at least one first display area and at least one second display area, where the first display area is used for displaying the surrounding environment and the second display area is used for displaying a user interface of the terminal; the terminal then displays the surrounding environment in the first display area. This scheme helps the user notice dangerous objects that could cause harm while using the terminal, improving the user experience.

Description

Split screen display method and terminal
Technical Field
The present application relates to the field of terminal technologies, and in particular, to a split-screen display method and a terminal.
Background
With the development of mobile terminal technology and the wide spread of mobile terminals, the mobile phone has become part of everyday life. More and more people keep their phones at hand all day: they stare at the phone while eating, before sleeping, and even while walking on the road. Walking while watching the phone, however, is very dangerous and threatens personal safety, and at present most mobile phone manufacturers have no effective improvement for this scenario.
Disclosure of Invention
The application provides a split-screen display method and a split-screen display terminal, which can remind a user to pay attention to dangerous objects that could injure the user while using the terminal (staring at the phone), improving the user experience.
In a first aspect, the present application provides a split-screen display method, including:
the terminal detects that the sight of a user falls on a display screen of the terminal.
In response to detecting that the user's line of sight falls on the display screen, the terminal judges whether the user is in a dangerous environment. If the user is in a dangerous environment, the terminal divides the display screen into at least one first display area and at least one second display area, where the first display area is used for displaying the surrounding environment and the second display area is used for displaying a user interface of the terminal.
The terminal displays the surrounding environment in the first display area.
With reference to the first aspect, in some optional embodiments, the determining, by the terminal, whether the user is in a dangerous environment specifically includes:
the terminal acquires an image of the environment where the user is located;
the terminal judges whether the acquired image contains a first image feature; if so, the terminal determines that the user is in a dangerous environment, where the first image feature is used to characterize a dangerous object.
With reference to the first aspect, in some optional embodiments, the determining, by the terminal, whether the user is in a dangerous environment specifically includes:
the terminal detects sounds in the surrounding environment.
The terminal judges whether the acquired sound contains a first sound feature; if so, the terminal determines that the user is in a dangerous environment, where the first sound feature is used to characterize a dangerous object.
With reference to the first aspect, in some optional embodiments, the method further includes:
if the user is in a dangerous environment, the terminal detects the distance between the terminal and a dangerous object in the surrounding environment; if the distance is less than or equal to a preset safety distance, the terminal adjusts the first display area.
With reference to the first aspect, in some optional embodiments, the adjusting, by the terminal, the first display area specifically includes:
the terminal enlarges the area of the first display area.
With reference to the first aspect, in some optional embodiments, the adjusting, by the terminal, the first display area specifically includes:
and the terminal enlarges the display content of the first display area.
In a second aspect, the present application provides a terminal, comprising:
the detection unit is used for detecting that the sight of a user falls on a display screen of the terminal;
the judging unit is used for judging whether the user is in a dangerous environment in response to the detection unit detecting that the user's line of sight falls on the display screen of the terminal;
the dividing unit is used for dividing the display screen of the terminal into at least one first display area and at least one second display area when the judging unit judges that the user is in the dangerous environment, wherein the first display area is used for displaying the surrounding environment, and the second display area is used for displaying the user interface of the terminal;
a display unit for displaying the ambient environment in the first display area.
With reference to the second aspect, in some optional embodiments, the terminal further includes:
and the adjusting unit is used for adjusting the first display area if the user is in the dangerous environment and the distance between the terminal and dangerous objects in the surrounding environment is less than or equal to a preset safety distance.
In a third aspect, the present application provides another terminal, including: an input device, an output device, a memory, and a processor coupled to the memory, the output device including a display screen, wherein:
the input device is used for detecting that the sight line of the user falls on the display screen.
The processor is configured to determine whether the user is in a dangerous environment in response to the input device detecting that the line of sight of the user falls on the display screen of the terminal, and if the user is in the dangerous environment, divide the display screen of the terminal into at least one first display area and at least one second display area, where the first display area is used for displaying a surrounding environment, and the second display area is used for displaying a user interface of the terminal.
The output device is used for displaying the surrounding environment in the first display area.
In a fourth aspect, the present application provides a computer-readable storage medium storing a computer program comprising program instructions which, when executed by a processor, cause the processor to perform the method of the first aspect described above.
In the present application, the terminal detects that the user's line of sight falls on its display screen; in response, it judges whether the user is in a dangerous environment, and if so divides the display screen into at least one first display area and at least one second display area, where the first display area is used for displaying the surrounding environment and the second display area is used for displaying a user interface of the terminal; the terminal then displays the surrounding environment in the first display area. Through this scheme, the user can notice dangerous objects that could cause harm while using the terminal, improving the user experience.
Drawings
To illustrate the technical solutions of the embodiments of the present application more clearly, the drawings used in the description of the embodiments are briefly introduced below. Obviously, the drawings described below show only some embodiments of the present application, and those skilled in the art can obtain other drawings from them without creative effort.
Fig. 1 is a schematic flow chart of a split screen display method provided in the present application;
figs. 2A-2I are schematic diagrams of a terminal split-screen interface provided in the present application;
fig. 3 is a schematic structural diagram of a terminal provided in the present application;
fig. 4 is a schematic structural diagram of another terminal provided in the present application.
Detailed Description
The technical solutions in the present application will be described clearly and completely with reference to the accompanying drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present application. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments given herein without creative effort shall fall within the protection scope of the present application.
It will be understood that the terms "comprises" and/or "comprising," when used in this specification and the appended claims, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
It is also to be understood that the terminology used in the description of the present application herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the application. As used in the specification of the present application and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise.
It should be further understood that the term "and/or" as used in this specification and the appended claims refers to and includes any and all possible combinations of one or more of the associated listed items.
As used in this specification and the appended claims, the term "if" may be interpreted contextually as "when", "upon", "in response to a determination", or "in response to a detection". Similarly, the phrase "if it is determined" or "if a [described condition or event] is detected" may be interpreted contextually to mean "upon determining", "in response to determining", "upon detecting [described condition or event]", or "in response to detecting [described condition or event]".
In particular implementations, the terminals described herein include, but are not limited to, portable devices such as mobile phones, laptop computers, or tablet computers having touch-sensitive surfaces (e.g., touch screen displays and/or touch pads). It should also be understood that in some embodiments, the device is not a portable communication device, but a desktop computer having a touch-sensitive surface (e.g., a touch screen display and/or touchpad).
In the discussion that follows, a terminal that includes a display and a touch-sensitive surface is described. However, it should be understood that the terminal may include one or more other physical user interface devices such as a physical keyboard, mouse, and/or joystick.
The terminal supports various applications, such as one or more of the following: a drawing application, a presentation application, a word processing application, a website creation application, a disc burning application, a spreadsheet application, a gaming application, a telephone application, a video conferencing application, an email application, an instant messaging application, an exercise support application, a photo management application, a digital camera application, a web browsing application, a digital music player application, and/or a digital video player application.
Various applications that may be executed on the terminal may use at least one common physical user interface device, such as a touch-sensitive surface. One or more functions of the touch-sensitive surface and corresponding information displayed on the terminal can be adjusted and/or changed between applications and/or within respective applications. In this way, a common physical architecture (e.g., touch-sensitive surface) of the terminal can support various applications with user interfaces that are intuitive and transparent to the user.
Referring to fig. 1, a schematic flow chart of a split screen display method provided by the present application is shown in fig. 1, where the method may include at least the following steps:
s101, the terminal detects that the sight of the user falls on a display screen of the terminal.
In the present application, the terminal may detect through a camera whether the user's line of sight falls on the display screen of the terminal, that is, whether the user gazes at the screen. Specifically, the terminal detects through the camera whether the line of sight of the user's eyes intersects the display screen; if it does, the terminal considers that the user is gazing at the display screen. The camera may be a front camera of the terminal or a 3D camera of the terminal, which is not limited herein. The display screen may be a touch screen configured with a self-capacitance floating touch panel, or a touch screen configured with an infrared floating touch panel, which is not limited herein.
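As a rough sketch of step S101 (the patent gives no concrete algorithm), suppose a hypothetical front-camera eye tracker reports the point, in screen pixels, where the user's line of sight meets the plane of the display; "gazing at the screen" then reduces to a bounds check. All names below are illustrative assumptions:

```python
from dataclasses import dataclass

@dataclass
class GazeSample:
    # Estimated point (in pixels) where the user's line of sight meets
    # the display plane, as reported by a hypothetical eye tracker.
    # Values may fall outside the physical screen.
    x: float
    y: float

def gaze_on_screen(sample: GazeSample, width_px: int, height_px: int) -> bool:
    """Return True if the gaze point lies within the display bounds,
    i.e. the user's line of sight intersects the display screen."""
    return 0 <= sample.x < width_px and 0 <= sample.y < height_px

# A gaze point inside a 1080x1920 screen counts as "looking at it".
print(gaze_on_screen(GazeSample(540, 960), 1080, 1920))   # True
print(gaze_on_screen(GazeSample(-10, 960), 1080, 1920))   # False
```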
S102, in response to detecting that the user's line of sight falls on the display screen of the terminal, the terminal judges whether the user is in a dangerous environment. If the user is in a dangerous environment, the terminal divides the display screen into at least one first display area and at least one second display area, where the first display area is used for displaying the surrounding environment and the second display area is used for displaying a user interface of the terminal.
In the present application, the hazardous environment may include, but is not limited to, the following three scenarios.
The first scenario: when the terminal detects that the line of sight of the user holding it falls on the display screen, the user is approaching a dangerous object in the surrounding environment.
The second scenario: when the terminal detects that the line of sight of the user holding it falls on the display screen, a dangerous object in the surrounding environment is approaching the user.
The third scenario: when the terminal detects that the line of sight of the user holding it falls on the display screen, the user and a dangerous object in the surrounding environment are approaching each other.
The terminal may determine whether the user is in a dangerous environment in, but not limited to, the following two ways.
The first mode: the terminal acquires an image of the environment where the user is located and judges whether the acquired image contains a first image feature. If so, the terminal determines that the user is in a dangerous environment. The first image feature is used to characterize a dangerous object, which may include any of the following: a moving car, a truck, a bicycle, a puddle, or a tree that could injure the user, without limitation. The first image feature may be an appearance feature of a dangerous object, and may be stored in a memory inside the terminal or in a cloud database of the terminal.
The second mode: the terminal acquires sound from the surrounding environment and judges whether the acquired sound contains a first sound feature. If so, the terminal determines that the user is in a dangerous environment. The first sound feature is used to characterize a dangerous object and may include a sound characteristic produced by a dangerous object. The first sound feature may be stored in a memory inside the terminal or in a cloud database of the terminal.
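The two modes above amount to matching extracted features against stored danger features. The sketch below assumes upstream recognizers have already turned the image and sound into string labels; the label sets are illustrative placeholders, not from the patent:

```python
# Hypothetical danger-feature sets; per the description these could live
# in terminal memory or a cloud database.
DANGEROUS_IMAGE_FEATURES = {"car", "truck", "bicycle", "puddle"}
DANGEROUS_SOUND_FEATURES = {"horn", "engine", "siren"}

def in_dangerous_environment(image_features, sound_features):
    """The user is judged to be in a dangerous environment if either the
    captured image (first mode) or the captured sound (second mode)
    contains a feature characterizing a dangerous object."""
    return bool(image_features & DANGEROUS_IMAGE_FEATURES) or \
           bool(sound_features & DANGEROUS_SOUND_FEATURES)

print(in_dangerous_environment({"tree", "car"}, set()))  # True: image match
print(in_dangerous_environment(set(), {"birdsong"}))     # False: no match
```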
When the terminal determines that the user is in a dangerous environment, the terminal may divide a display screen of the terminal into at least one first display area and at least one second display area, where the first display area may be used to display a surrounding environment and the second display area may be used to display a user interface of the terminal.
Specifically, the first display area may be used to display an ambient environment monitored by the terminal through the camera. The user interface may include at least one of: a system background interface, a WeChat application interface, a QQ application interface, or a video application interface, without limitation.
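The division into first and second display areas can be sketched as computing two rectangles. The single top-strip layout and the 30% proportion below are illustrative assumptions; the patent fixes neither the number, placement, nor proportions of the areas:

```python
from typing import NamedTuple

class Rect(NamedTuple):
    x: int
    y: int
    w: int
    h: int

def split_screen(width, height, env_fraction=0.3):
    """Divide the display into one first display area (surrounding
    environment, top strip) and one second display area (user interface,
    the remainder)."""
    env_h = int(height * env_fraction)
    first = Rect(0, 0, width, env_h)                 # camera feed
    second = Rect(0, env_h, width, height - env_h)   # terminal UI
    return first, second

first, second = split_screen(1080, 1920)
print(first)   # Rect(x=0, y=0, w=1080, h=576)
print(second)  # Rect(x=0, y=576, w=1080, h=1344)
```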
The display area of the display screen of the terminal related to the embodiment of the method of fig. 1 will be described with reference to fig. 2A-2I.
As shown in fig. 2A, the display area of the display screen of the terminal 100 may include: a first display area 105, a second display area 101, a second display area 102, and a second display area 103. Here, the second display area 101 may be used to display a WeChat chat window interface, the second display area 102 may be used to display a QQ chat window interface, and the second display area 103 may be used to display a video playback interface.
When the user stares at the phone while walking (with the terminal 100 simultaneously displaying a WeChat chat window interface, a QQ chat window interface, and a video playing interface), and the terminal 100 detects that a dangerous object (a car) in the surrounding environment is honking and driving toward the user at high speed, the terminal judges that the user is in a dangerous environment and divides the display area of the current display screen so that it includes at least one first display area. That is, the terminal 100 changes the display areas shown in fig. 2A into those shown in fig. 2B.
As shown in fig. 2B, the display area of the display screen of the terminal 100 may include: a second display area 101, a second display area 102, a second display area 103, a first display area 104, and a first display area 105.
The first display area 104 may be used to display the ground environment on which the user stands, and the first display area 105 may be used to display the environment in the walking direction (directly in front) of the user.
When the terminal determines that the user is in a dangerous environment and the user is observing dangerous objects in the surrounding environment through the first display area 104 and the first display area 105 in fig. 2B, if the user receives a very important incoming call, the terminal adjusts the display area of the display screen. At this time, the display area of the display screen of the terminal 100 may include: a first display area for displaying the surrounding environment and a second display area for displaying the application with the highest priority. That is, the terminal 100 changes the display areas shown in fig. 2B into those shown in fig. 2C.
As shown in fig. 2C, the display area of the display screen of the terminal 100 may include: a first display area 104, a first display area 105, and a second display area 106. Where the second display area 106 is used to display the application (e.g., phone) with the highest priority.
Optionally, when the terminal determines that the user is in a dangerous environment, the terminal detects the distance between the user and a dangerous object in the surrounding environment; if the distance is less than or equal to a preset safety distance, the terminal adjusts the first display area. Specifically, the terminal may detect this distance by any of the following means: an infrared sensor, an ultrasonic distance sensor, or dual cameras, without limitation.
The terminal may adjust the first display area in either of the following two implementations.
The first implementation: the terminal enlarges the area of the first display area.
The second implementation: the terminal enlarges the display content of the first display area.
In the present application, the two implementations may also be combined.
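The two implementations and their combination can be sketched as operations on a hypothetical area descriptor (the field names and the 1.5x factor are illustrative assumptions):

```python
from typing import NamedTuple

class Area(NamedTuple):
    w: int       # area width in pixels
    h: int       # area height in pixels
    zoom: float  # magnification applied to the camera feed shown inside

def enlarge_area(a, factor=1.5):
    """First implementation: grow the area itself; content scale unchanged."""
    return Area(int(a.w * factor), int(a.h * factor), a.zoom)

def enlarge_content(a, factor=1.5):
    """Second implementation: keep the area size, magnify what it displays."""
    return Area(a.w, a.h, a.zoom * factor)

a = Area(540, 400, 1.0)
print(enlarge_area(a))     # Area(w=810, h=600, zoom=1.0)
print(enlarge_content(a))  # Area(w=540, h=400, zoom=1.5)
# Combined implementation: apply one on the basis of the other.
print(enlarge_content(enlarge_area(a)))  # Area(w=810, h=600, zoom=1.5)
```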
The first implementation, used on its own, is described below.
As shown in fig. 2D, in the display area of the display screen of the terminal 100, the areas of the first display area 107 and the first display area 108 are enlarged relative to the first display area 104 and the first display area 105 in fig. 2B. On one hand, this implementation reminds the user to shift attention to the display areas showing the surrounding environment; on the other hand, it helps the user clearly see dangerous objects in the surrounding environment and take effective preventive measures in time.
When the terminal determines that the user is in a dangerous environment and the user is observing dangerous objects in the surrounding environment through the first display area 107 and the first display area 108 in fig. 2D, if the user receives a very important incoming call, the terminal adjusts the display area of the display screen. At this time, the display area of the display screen of the terminal 100 may include: a first display area for displaying the surrounding environment and a second display area for displaying the incoming-call interface (the application interface with the highest priority). That is, the terminal 100 changes the display areas shown in fig. 2D into those shown in fig. 2E.
As shown in fig. 2E, the display area of the display screen of the terminal 100 may include: a first display area 107, a first display area 108, and a second display area 106. Where the second display area 106 is used to display the application (e.g., phone) with the highest priority.
The second implementation, used on its own (the terminal enlarges the display content of the first display area), is described below.
As shown in fig. 2F, in the display area of the display screen of the terminal 100, the display contents of the first display area 109 and the first display area 110 are enlarged relative to the first display area 104 and the first display area 105 in fig. 2B, while the areas themselves are unchanged: the area of the first display area 109 equals that of the first display area 104, and the area of the first display area 110 equals that of the first display area 105. On one hand, this implementation reminds the user to shift attention to the display areas showing the surrounding environment; on the other hand, it helps the user clearly see dangerous objects in the surrounding environment and take effective preventive measures in time.
When the terminal determines that the user is in a dangerous environment and the user is observing dangerous objects in the surrounding environment through the first display area 109 and the first display area 110 in fig. 2F, if the user receives a very important incoming call, the terminal adjusts the display area of the display screen. At this time, the display area of the display screen of the terminal 100 may include: a first display area for displaying the surrounding environment and a second display area for displaying the incoming-call interface (the application interface with the highest priority). That is, the terminal 100 changes the display areas shown in fig. 2F into those shown in fig. 2G.
As shown in fig. 2G, the display area of the display screen of the terminal 100 may include: a first display area 109, a first display area 110, and a second display area 106. Here the second display area 106 is used to display the phone interface (the application interface with the highest priority).
The following describes the second implementation applied on the basis of the first implementation.
As shown in fig. 2H, in the display area of the display screen of the terminal 100, the display contents of the first display area 111 and the first display area 112 are enlarged relative to the first display area 107 and the first display area 108 in fig. 2D, while the areas themselves are unchanged: the area of the first display area 111 equals that of the first display area 107, and the area of the first display area 112 equals that of the first display area 108. On one hand, this implementation further reminds the user to shift attention to the display areas showing the surrounding environment; on the other hand, it further helps the user clearly see dangerous objects in the surrounding environment and take effective preventive measures in time.
When the terminal determines that the user is in a dangerous environment and the user is observing dangerous objects in the surrounding environment through the first display area 111 and the first display area 112 in fig. 2H, if the user receives a very important incoming call, the terminal adjusts the display area of the display screen. At this time, the display area of the display screen of the terminal 100 may include: a first display area for displaying the surrounding environment and a second display area for displaying the incoming-call interface (the application interface with the highest priority). That is, the terminal 100 changes the display areas shown in fig. 2H into those shown in fig. 2I.
As shown in fig. 2I, the display area of the display screen of the terminal 100 may include: a first display area 111, a first display area 112, and a second display area 106. Here the second display area 106 is used to display the phone interface (application interface with highest priority).
In the present application, the first implementation and the second implementation may each be used separately, or they may be combined: the second implementation may be applied on the basis of the first, and likewise the first may be applied on the basis of the second; the combinations are not limited to those described above.
In the present application, figs. 2A-2I are provided for illustration purposes only and should not be construed as limiting.
It should be appreciated that the display screen of the terminal 100 may include more or fewer display areas, which is not limited herein.
S103, displaying the surrounding environment in the first display area by the terminal.
In the present application, the first display area displays the surrounding environment monitored through the camera, and the terminal can adjust the first display area as described above.
To facilitate implementation of the embodiments of the present application, the present application provides a terminal for implementing the method described in the embodiment of fig. 1. The terminal shown in fig. 3 may be used to carry out the corresponding embodiments described above. As shown in fig. 3, the terminal 300 may include: a detection unit 301, a judging unit 302, a dividing unit 303, and a display unit 304, wherein:
the detecting unit 301 may be configured to detect that the line of sight of the user falls on the display screen of the terminal.
The judging unit 302 may be configured to judge whether the user is in a dangerous environment in response to the detecting unit 301 detecting that the user's line of sight falls on the display screen of the terminal.
The dividing unit 303 is configured to, when the judging unit 302 judges that the user is in a dangerous environment, divide the display screen of the terminal into at least one first display area and at least one second display area, where the first display area is used for displaying the surrounding environment and the second display area is used for displaying the user interface of the terminal.
The display unit 304 may be configured to display the surrounding environment in the first display area.
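The cooperation of these four units can be pictured as a simple pipeline. The following Python sketch is purely illustrative: the names `DisplayArea` and `divide_screen` and the 40/60 split are assumptions, not details from the patent, which leaves the concrete layout open.

```python
# Hypothetical sketch of the unit pipeline (detection -> judgment ->
# division -> display). All names and the 40/60 split are illustrative
# assumptions, not taken from the patent.
from dataclasses import dataclass


@dataclass
class DisplayArea:
    role: str           # "first" shows the surroundings, "second" the UI
    height_frac: float  # fraction of screen height occupied


def divide_screen(in_danger: bool):
    """Split the screen only when the user is judged to be in danger."""
    if not in_danger:
        return [DisplayArea("second", 1.0)]  # full-screen user interface
    # One first area for the camera feed, one second area for the UI.
    return [DisplayArea("first", 0.4), DisplayArea("second", 0.6)]


areas = divide_screen(in_danger=True)  # one "first" and one "second" area
```

In a real terminal the dividing unit could of course produce several first or second areas, as the claims allow; the two-area case is just the simplest illustration.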
In particular, the detection unit 301 may be configured to detect whether the user's gaze falls on the display screen of the terminal. The display screen may be a touch screen configured with a self-capacitance type floating touch panel, or a touch screen configured with an infrared type floating touch panel, which is not limited herein.
Specifically, the judging unit 302 may include a first acquisition unit, a first judging unit, and a first determining unit.
The first acquisition unit is used for acquiring the image of the environment where the user is located.
The first judging unit is used for judging whether the image acquired by the first acquisition unit contains the first image feature.
The first determining unit is used for determining that the user is in a dangerous environment if the first judging unit judges that the image acquired by the first acquisition unit contains the first image feature. The first image feature is used to characterize a dangerous object, which may include any of the following: a moving car, truck, or bicycle, a puddle, a tree, or any other object that can cause injury to the user, without limitation. Here, the first image feature may be an appearance feature of the dangerous object. The first image feature can be stored in a memory inside the terminal, or in a cloud database associated with the terminal.
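As an illustration of the image-based judgment above, the sketch below reduces "the acquired image contains the first image feature" to a label lookup. The label set, the function names, and the assumed upstream object detector that produces the labels are all hypothetical; the patent does not specify a recognition algorithm.

```python
# Illustrative sketch only: the patent leaves the matching algorithm open,
# so an object-label lookup stands in for "the image contains the first
# image feature". The label set below is invented for illustration.
DANGEROUS_FEATURES = {"car", "truck", "bicycle", "puddle"}  # assumed labels


def contains_first_image_feature(detected_labels):
    """True if any detected object matches a stored danger feature."""
    return any(label in DANGEROUS_FEATURES for label in detected_labels)


def user_in_dangerous_environment(detected_labels):
    # Mirrors the first determining unit: a danger feature present in the
    # acquired image means the user is judged to be in a dangerous environment.
    return contains_first_image_feature(detected_labels)
```

In practice `detected_labels` would come from a classifier running on the camera frames, and the feature set could equally be loaded from the terminal's internal memory or from a cloud database, as the text notes.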
Here, the dangerous environment may include the following three scenarios. The first scenario: while the detection unit 301 detects that the line of sight of the user holding the terminal falls on the display screen of the terminal, the user approaches a dangerous object in the surrounding environment. The second scenario: while the detection unit 301 detects that the line of sight of the user holding the terminal falls on the display screen of the terminal, a dangerous object in the surrounding environment approaches the user. The third scenario: while the detection unit 301 detects that the line of sight of the user holding the terminal falls on the display screen of the terminal, the user and a dangerous object in the surrounding environment approach each other.
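The three scenarios differ only in which party is closing the distance. A hedged sketch, assuming signed approach speeds (positive toward the other party) have already been derived from successive distance measurements; the function name and the small threshold are illustrative:

```python
def classify_scenario(user_speed_toward, object_speed_toward, eps=0.05):
    """Classify the three danger scenarios by who is closing the distance.

    Speeds are in m/s along the user-object line; positive means moving
    toward the other party. The eps dead band is an assumed threshold to
    ignore sensor noise, not a value from the patent.
    """
    user_moving = user_speed_toward > eps
    object_moving = object_speed_toward > eps
    if user_moving and object_moving:
        return "mutual approach"          # third scenario
    if object_moving:
        return "object approaches user"   # second scenario
    if user_moving:
        return "user approaches object"   # first scenario
    return "no approach"
```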
Optionally, the judging unit 302 may include a second acquisition unit, a second judging unit, and a second determining unit.
The second acquisition unit is used for acquiring sound in the surrounding environment.
The second judging unit is used for judging whether the acquired sound contains the first sound feature.
The second determining unit is used for determining that the user is in a dangerous environment if the second judging unit judges that the sound acquired by the second acquisition unit contains the first sound feature. The first sound feature is used to characterize a dangerous object, and may include a sound characteristic produced by the dangerous object. The first sound feature can be stored in a memory inside the terminal, or in a cloud database associated with the terminal.
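For the sound-based judgment, the text only says that the captured sound is compared against stored first sound features. One plausible stand-in is a dominant-frequency band lookup; the band names and frequency ranges below are invented for illustration.

```python
# Hedged sketch of "the acquired sound contains the first sound feature".
# The band values are assumptions; a real terminal might instead compare
# spectral fingerprints stored locally or in a cloud database.
DANGER_SOUND_BANDS = {
    "engine": (80.0, 200.0),  # Hz, assumed range
    "horn": (400.0, 500.0),   # Hz, assumed range
}


def contains_first_sound_feature(dominant_freq_hz):
    """True if the dominant frequency falls inside a stored danger band."""
    return any(lo <= dominant_freq_hz <= hi
               for lo, hi in DANGER_SOUND_BANDS.values())
```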
Further, in addition to the detection unit 301, the judging unit 302, the dividing unit 303, and the display unit 304, the terminal 300 may further include: a first detection unit, a third judging unit, and an adjustment unit.
The first detection unit may be used to detect the distance between the terminal and a dangerous object in the surrounding environment.
The third judging unit is used for judging whether the distance is less than or equal to a preset safety distance.
The adjustment unit is used for adjusting the first display area if the distance is less than or equal to the preset safety distance.
The adjustment unit may adjust the first display area according to the following two implementation manners.
In the first implementation manner, the terminal enlarges the area of the first display area.
In the second implementation manner, the terminal enlarges the display content of the first display area.
In the present application, the first implementation manner and the second implementation manner may be implemented separately or in combination; for example, the second implementation manner may be implemented on the basis of the first implementation manner, or the first on the basis of the second. The implementation is not limited to the manners described above.
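Combining the distance check with the two adjustment manners, a minimal sketch follows. The 3 m safety distance, the 70% area cap, and the 1.5x/1.25x factors are all invented for illustration; the patent leaves these values open.

```python
# Minimal sketch of the adjustment unit applying both implementation
# manners together. All numeric values are assumed, not from the patent.
SAFETY_DISTANCE_M = 3.0  # preset safety distance (assumed value)


def adjust_first_area(distance_m, area_frac, zoom):
    """When the dangerous object is within the preset safety distance,
    enlarge the first display area (first manner, capped at 70% of the
    screen) and zoom its displayed content (second manner)."""
    if distance_m <= SAFETY_DISTANCE_M:
        area_frac = min(0.7, area_frac * 1.5)  # first manner: larger area
        zoom = zoom * 1.25                     # second manner: zoom content
    return area_frac, zoom
```

Either branch could also be applied alone, matching the text's statement that the two manners may be implemented separately or in combination.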
In particular, the display unit 304 may be configured to display the surrounding environment in a first display area and display the user interface of the terminal in a second display area. The user interface may include at least one of: a system background interface, a WeChat application interface, a QQ application interface, or a video application interface, without limitation.
Fig. 4 is a schematic structural diagram of another terminal provided in the present application. In this embodiment of the application, the terminal may include various devices such as a Mobile phone, a tablet computer, a Personal Digital Assistant (PDA), a Mobile Internet Device (MID), and an intelligent wearable Device (e.g., an intelligent watch and an intelligent bracelet), which are not limited in this embodiment of the application. As shown in fig. 4, the terminal 400 may include: a baseband chip 401, memory 402 (one or more computer-readable storage media), a Radio Frequency (RF) module 403, and a peripheral system 404. These components may communicate over one or more communication buses 405.
The baseband chip 401 includes: one or more processors 406, a clock module 407, and a power management module 408. The clock module 407 integrated in the baseband chip 401 is mainly used for generating clocks required for data transmission and timing control for the processor 406. The power management module 408 integrated in the baseband chip 401 is mainly used to provide stable and high-precision voltages for the processor 406, the Radio Frequency (RF) module 403, and the peripheral system 404.
A Radio Frequency (RF) module 403, which may be used to receive and transmit radio frequency signals, primarily integrates the receiver and transmitter of the terminal 400. Radio Frequency (RF) module 403 communicates with a communication network and other communication devices via radio frequency signals. In particular implementations, the Radio Frequency (RF) module 403 may include, but is not limited to: GPS module 409, a SIM card, an antenna system, an RF transceiver, one or more amplifiers, one or more oscillators, a digital signal processor, CODEC chips, a storage medium, and the like. In some embodiments, the Radio Frequency (RF) module 403 may be implemented on a separate chip.
The peripheral system 404 is mainly used to implement an interactive function between the terminal 400 and a user/external environment, and mainly includes an input and output device of the terminal 400. In particular implementations, the peripheral system 404 may include: a touch screen controller 410, a camera controller 411, and a sensor management module 412. Wherein each controller may be coupled to a respective peripheral device (e.g., touch screen 413, camera 414, and infrared sensor 415). In some embodiments, the touch screen may be configured with a self-capacitive floating touch panel, or may be configured with an infrared floating touch panel. In some embodiments, camera 414 may be a 3D camera. It should be noted that the peripheral system 404 may also include other I/O peripherals.
It will be appreciated that camera 414 may be used to monitor the surrounding environment for hazards, infrared sensor 415 may be used to detect distance from hazards in the environment, and touch screen 413 may be used to display the surrounding environment monitored by camera 414.
The memory 402 is coupled to the processor 406 and may be used to store various software programs and/or sets of instructions. In particular implementations, memory 402 may include high-speed random access memory and may also include non-volatile memory, such as one or more magnetic disk storage devices, flash memory devices, or other non-volatile solid-state storage devices. The memory 402 may store an operating system (hereinafter referred to simply as a system), such as an embedded operating system like ANDROID, IOS, WINDOWS, or LINUX. The memory 402 may also store a network communication program that may be used to communicate with one or more additional devices, one or more terminal devices, one or more network devices. The memory 402 may further store a user interface program, which may vividly display the content of the application program through a graphical operation interface, and receive a control operation of the application program from a user through input controls such as menus, dialog boxes, and buttons.
The memory 402 may also store one or more application programs. As shown in fig. 4, these applications may include: social applications (e.g., Facebook), image management applications (e.g., photo album), map-like applications (e.g., Google map), browsers (e.g., Safari, Google Chrome), and so forth.
It should be understood that terminal 400 is only one example provided by the embodiments of the present application, and that terminal 400 may have more or fewer components than shown, may combine two or more components, or may have a different configuration or arrangement of components.
It can be understood that, with regard to the specific implementation manner of the functional modules included in the terminal 400 of fig. 4, reference may be made to the foregoing embodiments, and details are not repeated here.
A computer-readable storage medium stores a computer program which, when executed by a processor, implements the split-screen display method described in the foregoing embodiments.
The computer readable storage medium may be an internal storage unit of the terminal according to any of the foregoing embodiments, for example, a hard disk or a memory of the terminal. The computer readable storage medium may also be an external storage device of the terminal, such as a plug-in hard disk provided on the terminal, a Smart Media Card (SMC), a Secure Digital (SD) Card, a Flash memory Card (Flash Card), and the like. Further, the computer-readable storage medium may also include both an internal storage unit and an external storage device of the terminal. The computer-readable storage medium is used for storing a computer program and other programs and data required by the terminal. The computer readable storage medium may also be used to temporarily store data that has been output or is to be output.
Those of ordinary skill in the art will appreciate that the units and algorithm steps of the examples described in connection with the embodiments disclosed herein may be implemented in electronic hardware, computer software, or a combination of both. To clearly illustrate the interchangeability of hardware and software, the components and steps of the examples have been described above generally in terms of their functionality. Whether such functionality is implemented as hardware or software depends upon the particular application and the design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
It can be clearly understood by those skilled in the art that, for convenience and brevity of description, the specific working processes of the terminal and the unit described above may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the several embodiments provided in the present application, it should be understood that the disclosed terminal and method may be implemented in other manners.
The above-described terminal embodiments are merely illustrative, and for example, the division of the units is only one logical function division, and other division manners may be available in actual implementation, for example, a plurality of units or components may be combined or may be integrated into another system, or some features may be omitted, or may not be executed. Further, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, terminals or units, and may also be an electrical, mechanical or other form of connection.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiments of the present application.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present application may be substantially or partially contributed by the prior art, or all or part of the technical solution may be embodied in a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present application. And the aforementioned storage medium includes: various media capable of storing program codes, such as a usb disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk.
While the invention has been described with reference to specific embodiments, the scope of the invention is not limited thereto, and those skilled in the art can easily conceive various equivalent modifications or substitutions within the technical scope of the invention. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (10)

1. A split-screen display method, comprising:
the method comprises the steps that a terminal detects that the sight of a user falls on a display screen of the terminal;
responding to the fact that the terminal detects that the sight of a user falls on a display screen of the terminal, judging whether the user is in a dangerous environment or not by the terminal, and if the user is in the dangerous environment, dividing the display screen of the terminal into at least one first display area and at least one second display area by the terminal;
the terminal displays the surrounding environment in the first display area and respectively displays different user interfaces of the current application in each second display area;
and if the terminal detects a display triggering event of the application with the highest priority, combining the second display areas into one second display area, and displaying the user interface of the application with the highest priority in the combined second display area.
2. The method according to claim 1, wherein the determining, by the terminal, whether the user is in a dangerous environment specifically includes:
the terminal acquires an image of the environment where the user is located;
the terminal judges whether the acquired image contains a first image characteristic, if so, the terminal determines that the user is in a dangerous environment, and the first image characteristic is used for representing a dangerous object.
3. The method according to claim 1, wherein the determining, by the terminal, whether the user is in a dangerous environment specifically includes:
the terminal acquires sound in the surrounding environment;
the terminal judges whether the acquired sound contains a first sound feature, and if so, the terminal determines that the user is in a dangerous environment, the first sound feature being used for representing a dangerous object.
4. The method of claim 1, wherein the method further comprises:
the terminal detects the distance between the terminal and dangerous objects in the surrounding environment, and if the distance is smaller than or equal to a preset safety distance, the terminal adjusts the first display area.
5. The method of claim 4, wherein the adjusting of the first display area by the terminal specifically comprises:
the terminal enlarges the area of the first display area.
6. The method according to claim 4 or 5, wherein the terminal adjusting the first display area specifically comprises:
and the terminal enlarges the display content of the first display area.
7. A terminal, comprising:
the detection unit is used for detecting that the sight of a user falls on a display screen of the terminal;
the judging unit is used for responding to the fact that the sight of the user falls on the display screen of the terminal and judging whether the user is in a dangerous environment or not;
the dividing unit is used for dividing the display screen of the terminal into at least one first display area and at least one second display area when the judging unit judges that the user is in the dangerous environment;
the display unit is used for displaying the surrounding environment in the first display area and respectively displaying different user interfaces of the current application in each second display area;
and the adjusting unit is used for merging the second display areas into one second display area if the display triggering event of the application with the highest priority is detected, and displaying the user interface of the application with the highest priority in the merged second display area.
8. The terminal of claim 7, further comprising:
and the adjusting unit is used for detecting the distance between the terminal and dangerous objects in the surrounding environment, and if the distance is less than or equal to a preset safe distance, the terminal adjusts the first display area.
9. A terminal, comprising: an input device, an output device, and a memory, and a processor coupled to the memory, the output device including a display screen, wherein:
the input device is used for detecting that the sight line of a user falls on the display screen;
the processor is used for responding to the fact that the input device detects that the sight line of a user falls on the display screen of the terminal, judging whether the user is in a dangerous environment, and if the user is in the dangerous environment, dividing the display screen of the terminal into at least one first display area and at least one second display area, wherein the first display area is used for displaying the surrounding environment, and the second display area is used for displaying the user interface of the terminal;
the output device is used for displaying the surrounding environment in the first display area and respectively displaying different user interfaces of the current application in the second display areas; and if the display triggering event of the application with the highest priority is detected, combining the second display areas into one second display area, and displaying the user interface of the application with the highest priority in the combined second display area.
10. A computer-readable storage medium, characterized in that the computer storage medium stores a computer program comprising program instructions that, when executed by a processor, cause the processor to perform the method according to any of claims 1-6.
CN201711441438.3A 2017-12-26 2017-12-26 Split screen display method and terminal Active CN108156321B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201711441438.3A CN108156321B (en) 2017-12-26 2017-12-26 Split screen display method and terminal

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201711441438.3A CN108156321B (en) 2017-12-26 2017-12-26 Split screen display method and terminal

Publications (2)

Publication Number Publication Date
CN108156321A CN108156321A (en) 2018-06-12
CN108156321B true CN108156321B (en) 2021-12-03

Family

ID=62463128

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201711441438.3A Active CN108156321B (en) 2017-12-26 2017-12-26 Split screen display method and terminal

Country Status (1)

Country Link
CN (1) CN108156321B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109495641B (en) * 2018-10-24 2021-01-08 维沃移动通信有限公司 Reminding method and mobile terminal

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105741856A (en) * 2016-04-08 2016-07-06 王美金 Earphone capable of prompting environmental crisis sounds in listening to music state
JP2016181876A (en) * 2015-03-25 2016-10-13 凸版印刷株式会社 Mobile terminal device
CN106550150A (en) * 2016-11-09 2017-03-29 广东欧珀移动通信有限公司 Hazardous environment reminding method, device and terminal
CN106899766A (en) * 2017-03-13 2017-06-27 宇龙计算机通信科技(深圳)有限公司 A kind of safety instruction method and its device and mobile terminal

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101905150B1 (en) * 2012-07-17 2018-10-08 엘지전자 주식회사 Mobile terminal


Also Published As

Publication number Publication date
CN108156321A (en) 2018-06-12


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20221214

Address after: 201203 1st floor, building 1, Lane 36, Xuelin Road, Pudong New Area Free Trade Zone, Shanghai

Patentee after: SHANGHAI TRANSSION INFORMATION TECHNOLOGY Ltd.

Address before: Room 922 / 926, block a, No.1 Lane 399, shengxia Road, Pudong New Area pilot Free Trade Zone, Shanghai 201203

Patentee before: SHANGHAI SPREADRISE COMMUNICATION TECHNOLOGY Ltd.
