CN112558836A - Display control method, display control device, electronic device, and medium - Google Patents

Display control method, display control device, electronic device, and medium Download PDF

Info

Publication number
CN112558836A
CN112558836A CN202011546259.8A CN202011546259A CN112558836A CN 112558836 A CN112558836 A CN 112558836A CN 202011546259 A CN202011546259 A CN 202011546259A CN 112558836 A CN112558836 A CN 112558836A
Authority
CN
China
Prior art keywords
area
program
sub
display
input
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202011546259.8A
Other languages
Chinese (zh)
Inventor
林汉忠
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Vivo Mobile Communication Co Ltd
Original Assignee
Vivo Mobile Communication Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Vivo Mobile Communication Co Ltd filed Critical Vivo Mobile Communication Co Ltd
Priority to CN202011546259.8A priority Critical patent/CN112558836A/en
Publication of CN112558836A publication Critical patent/CN112558836A/en
Pending legal-status Critical Current

Links

Images

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/048Indexing scheme relating to G06F3/048
    • G06F2203/04803Split screen, i.e. subdividing the display area or the window area into separate subareas

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The application discloses a display control method, a display control device, electronic equipment and a medium, and belongs to the technical field of electronic equipment. The method comprises the following steps: receiving a first input of a user to the induction control area under the condition that a display area of the display control device comprises a program interface of a first program and receives unread information of a second program; responding to a first input, displaying a first program interface of a first program in a first sub-area of a display area, and displaying a second program interface of a second program in a second sub-area of the display area; wherein the display scale of the first and second sub-areas is determined based on the first input. According to the scheme, when the electronic equipment processes a certain transaction, the received other transaction messages can be processed in parallel, friendly user interaction design is achieved, a user can customize the display ratio of the first program interface and the second program interface, and the flexibility of the display ratio of the first program interface and the second program interface is improved.

Description

Display control method, display control device, electronic device, and medium
Technical Field
The application belongs to the technical field of electronic equipment, and particularly relates to a display control method and device, electronic equipment and a medium.
Background
The AR (Augmented Reality) technology is a technology that skillfully fuses virtual information and the real world, and a plurality of technical means such as multimedia, three-dimensional modeling, real-time tracking and registration, intelligent interaction, sensing and the like are widely applied, and virtual information such as characters, images, three-dimensional models, music, videos and the like generated by a computer is applied to the real world after analog simulation, and the two kinds of information complement each other, thereby realizing the 'enhancement' of the real world.
AR glasses are a further developed AR technology to allow more users to step into the augmented reality world. AR glasses are different from mobile phones, a convenient operation mode needs to be redesigned, and different operation requirements in multi-scene use are improved. Providing a sufficiently good operation is also an important means to enhance the competitiveness of AR eyewear products. In the prior art, in the process of using the AR glasses, a current display program of the AR glasses is a certain program, for example, a video program, when a new message comes from a chat program, a user of the AR glasses needs to quit the current video program first, then display of the new message is controlled by eyeballs, after the message is read, the user needs to switch back to the video program to continue using the video program, and switching between multiple programs needs to be performed, which is troublesome to operate.
Disclosure of Invention
An object of the embodiments of the present application is to provide a display control method, apparatus, electronic device, and medium, which can reduce the operation time for processing multiple transactions simultaneously during using AR glasses.
In order to solve the technical problem, the present application is implemented as follows:
in a first aspect, an embodiment of the present application provides a display control method, including:
receiving a first input of a user to the induction control area under the condition that a display area of the display control device comprises a program interface of a first program and receives unread information of a second program;
in response to the first input, displaying a first program interface of the first program on a first sub-area of the display area, and displaying a second program interface of the second program on a second sub-area of the display area;
wherein a display scale of the first and second sub-areas is determined based on the first input.
In a second aspect, an embodiment of the present application provides a display control apparatus, which is applied to an augmented reality AR device, and includes:
the first receiving module is used for receiving first input of a user to the induction control area under the condition that a display area of the display control device comprises a program interface of a first program and receives unread information of a second program;
a first response module, configured to, in response to the first input, display a first program interface of the first program in a first sub-area of the display area, and display a second program interface of the second program in a second sub-area of the display area;
wherein a display scale of the first and second sub-areas is determined based on the first input.
In a third aspect, an embodiment of the present application provides an electronic device, which includes a processor, a memory, and a program or instructions stored on the memory and executable on the processor, and when executed by the processor, the program or instructions implement the steps of the method according to the first aspect.
In a fourth aspect, embodiments of the present application provide a readable storage medium, on which a program or instructions are stored, which when executed by a processor implement the steps of the method according to the first aspect.
In a fifth aspect, an embodiment of the present application provides a chip, where the chip includes a processor and a communication interface, where the communication interface is coupled to the processor, and the processor is configured to execute a program or instructions to implement the method according to the first aspect.
In the embodiment of the application, under the condition that the display area comprises a program interface of a first program and receives unread information of a second program, first input of a user to the induction control area is received; in response to the first input, the display proportions of the first sub-area and the second sub-area are determined based on the first input, the first program interface of the first program is displayed in the first sub-area of the display area, and the second program interface of the second program is displayed in the second sub-area of the display area, so that when the AR device processes a certain transaction, the received other transaction messages can be processed in parallel, the display proportions of the first program interface and the second program interface can be adjusted at the same time, and friendly user interaction design is achieved.
Drawings
FIG. 1 is a schematic flow chart illustrating a display control method according to an embodiment of the present application;
FIG. 2 is a schematic diagram of a display state according to an embodiment of the present application;
FIG. 3 is a second schematic diagram of a display state according to an embodiment of the present application;
FIG. 4 is a schematic diagram of an input operation in an embodiment of the present application;
FIG. 5 is a third exemplary illustration of a display state according to an embodiment of the present application;
FIG. 6 is a fourth illustration of a display state in accordance with an embodiment of the present application;
FIG. 7 is a schematic view of AR glasses according to an embodiment of the present application;
FIG. 8 is a fifth illustration of a display state in accordance with an embodiment of the present application;
FIG. 9 is a sixth exemplary illustration of a display state according to an embodiment of the present application;
fig. 10 is a block diagram of a display control apparatus according to an embodiment of the present application;
fig. 11 is a hardware configuration diagram of an electronic device according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are some, but not all, embodiments of the present application. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The terms first, second and the like in the description and in the claims of the present application are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used is interchangeable under appropriate circumstances such that the embodiments of the application are capable of operation in sequences other than those illustrated or described herein. In addition, "and/or" in the specification and claims means at least one of connected objects, a character "/" generally means that a preceding and succeeding related objects are in an "or" relationship.
The following describes the control method provided by the embodiments of the present application in detail through specific embodiments and application scenarios thereof with reference to the accompanying drawings.
As shown in fig. 1, the present application provides a display control method applied to an electronic device, which is specifically an augmented reality AR device, such as AR glasses.
Specifically, the display control method includes the steps of:
step 11: receiving a first input of a user to the induction control area under the condition that a display area of the display control device comprises a program interface of a first program and receives unread information of a second program;
as an implementation manner, the sensing control area is a pressure sensing area corresponding to a pressure sensing component arranged on the electronic device, and the first input is a first input of the pressure sensing area by a user.
Illustratively, the display area is a display area of a display interface of the augmented reality AR device, and as shown in fig. 2, the display area of the AR device displays a program interface 1 of the first program, and receives an unread message 2 of the second program at this time. And if the user needs to process the unread message of the second program in parallel while processing the first program, performing first input on the induction control area of the AR device. Wherein the first input includes, but is not limited to, to the sensing control area: an input or operation such as a click input, a press input, a touch input, or a slide input.
Step 12: in response to the first input, displaying a first program interface of the first program on a first sub-area of the display area, and displaying a second program interface of the second program on a second sub-area of the display area; wherein a display scale of the first and second sub-areas is determined based on the first input.
Illustratively, as shown in fig. 3, a schematic diagram of a first program interface of a first program and a second program interface of a second program corresponding to an unread message are displayed in a split-screen manner, where the first sub-area on the left displays the first program interface of the first program currently processed, and the second sub-area on the right displays the second program interface of the second program corresponding to the unread message.
In the above embodiment, when the display area includes a program interface of the first program and receives unread information of the second program, the first input of the user to the sensing control area is received; in response to a first input, determining display proportions of the first sub-area and the second sub-area based on the first input, and displaying a first program interface of a first program on the first sub-area of the display area, in the second sub-area of the display area, the second program interface of the second program can realize that when the AR device processes the transaction corresponding to the first program, meanwhile, the received unread messages of the second program are processed in parallel, so that parallel transaction processing in an application scene of the AR equipment can be realized, the operation time for processing multiple transactions simultaneously during using the AR glasses is reduced, the display proportion of the first program interface and the second program interface can be adjusted simultaneously, friendly user interaction design is realized, a user can customize the display proportion of the first program interface and the second program interface, and the flexibility of the display proportion of the first program interface and the second program interface is improved.
In an embodiment, in step 11, receiving a first input to the AR device from a user includes:
receiving sliding input of a user to the induction control area;
for example, a sliding input to the AR device by a user's finger or an operating device (such as a stylus) is received.
Exemplarily, as shown in fig. 4, taking the electronic device as AR glasses as an example, the sensing control area is a pressure sensing area 3 corresponding to pressure sensing components arranged on a left frame and a right frame of the AR glasses, where the pressure sensing area 3 is arranged along a length direction of the AR glasses and is used for receiving a first input to wake up the AR glasses to perform split-screen display on the first program interface and the second program interface.
Specifically, the first input is a sliding input of a finger of a user or an operation device to the length direction of the pressure sensing area 3 on the right frame of the AR glasses. For example, when a user slides a finger from the position a to the position B of the pressure sensing area 3, the display ratio of the first program interface to the second program interface in the display area is defined to be 3:1, if the hand of the user does not release the pressure sensing area 3 at this time, the display ratio of the first program interface to the second program interface in the display area is updated in real time, and if the user is not satisfied, the hand can not be released to continue sliding, for example, the finger is slid from the position B to the position C of the pressure sensing area 3, so that the display ratio of the first program interface to the second program interface in the display area is adjusted and updated to be 3: 2.
Further, in an embodiment, the step 12 includes:
in response to the sliding input, obtaining a sliding length of the sliding input;
determining the display scale of the first sub-area and the second sub-area according to the sliding length;
and displaying a first program interface of the first program on the first sub-area of the display area and displaying a second program interface of the second program on the second sub-area of the display area according to the display proportion of the first sub-area and the second sub-area.
For example, the sliding length is proportional to the size of the second sub-region and inversely proportional to the size of the first sub-region; or the sliding length is inversely proportional to the size of the second sub-area and directly proportional to the size of the first sub-area.
Illustratively, the total length of the sensing control area of the right frame of the AR glasses is 100%, and the display proportion of the second display area and the first display area in the display area of the AR equipment is controlled according to the proportion of the length of the sliding input in the total length of the sensing control area, wherein the proportion of the length of the sliding input is in positive correlation or negative correlation with the display proportion. It can be understood that when the length of the slide input is in positive correlation with the display proportion, the larger the proportion of the length of the slide input to the total length of the right frame sensing control area is, the larger the second sub-area is, and the smaller the first sub-area is. When the length proportion of the slide input is in negative correlation with the display proportion, the larger the proportion of the length proportion of the slide input to the total length of the right frame induction control area is, the smaller the second sub area is, and the larger the first sub area is.
In the embodiment, the display proportion of the second sub-area and the first sub-area can be flexibly adjusted by adjusting the length of the sliding input, so that the display requirements of different users with different display proportions can be met.
In an embodiment, the displaying, according to the display scale of the first sub-area and the second sub-area, a first program interface of the first program in the first sub-area of the display area, and a second program interface of the second program in the second sub-area of the display area, includes:
and under the condition that the ratio of the sliding length of the sliding input to the length of the induction control area is larger than a preset ratio, displaying a second program interface of the second program on the full screen of the display area.
Illustratively, when the proportion of the length of the sliding input to the length of the sensing control area on the right frame of the AR glasses is 100%, the unread message content of the second program is displayed in a full screen mode. As shown in fig. 5, it is a schematic diagram showing the unread message content of the second program in full screen.
In the above embodiment, the display ratio of the second sub-area to the first sub-area can be flexibly adjusted by adjusting the length of the sliding input, so that the program interfaces of the first program and the second program can be displayed in a split screen manner, the program interface of the second program can be displayed in a full screen manner, and the display requirements of different clients can be met.
In an embodiment, the displaying, in response to the first input, a first program interface of the first program in a first sub-area of the display area, and after displaying a second program interface of the second program in a second sub-area of the display area, further includes:
receiving a second input of the user to the induction control area;
locking programs associated with the first sub-region and the second sub-region in response to the second input.
In this embodiment, the second input includes, but is not limited to, to the sensing control area: an input or operation such as a click input, a press input, a touch input, or a slide input.
Exemplarily, taking the AR glasses as an example, as shown in fig. 7, the upper side frame of the AR glasses is provided with a sensing control area 3, and the second input may be a double-click operation on the sensing control area in the upper side frame of the AR glasses, and in response to the double-click operation, the program associated with the first sub-area and the second sub-area is locked. As shown in fig. 6, a schematic locked state diagram is shown, where a lock icon a is used to prompt a user that a program associated with the first sub-area and the second sub-area is currently in a locked state, and specifically, the display of the lock icon a may be cancelled after a preset time.
In the above embodiment, by locking the program associating the first sub-area and the second sub-area, the display content of the first sub-area and the display content of the second sub-area can be prevented from being erroneously switched due to an erroneous operation.
Further, for example, when the user double clicks the upper frame sensing control area of the AR glasses again, the locking state of the program associated with the first sub-area and the second sub-area may be released, and after the locking state is released, the user may perform operations such as switching display of the event program or exiting the split screen display.
In an embodiment, in a case where a program associated with the second sub-area is not locked, the responding to the first input, displaying a first program interface of the first program on the first sub-area of the display area, and after displaying a second program interface of the second program on the second sub-area of the display area, further includes:
receiving a third input of the user to the induction control area;
and responding to the third input, and updating the display content of the second sub-area to be a third program interface of a third program.
Wherein, be equipped with the sensing control district on the electronic equipment, the third input includes but not limited to the sensing district: click input, press input, touch input, slide input, or the like.
For example, taking the AR glasses as an example, as shown in fig. 7, the upper side frame of the AR glasses is provided with a pressure sensing area 3, the second input may be a sliding input operation to the pressure sensing area 3 in the upper side frame of the AR glasses, and in response to the sliding input operation, the display content of the second sub-area is updated to a third program interface of a third program. As shown in fig. 8, it is a schematic diagram showing that the first sub-area on the left is in a locked state, and the display content of the second sub-area on the right is switched and displayed as a program interface of a video program (application interface of a third program).
For example, taking the AR glasses shown in fig. 4 as an example, the sensing control area is a pressure sensing area 3 corresponding to the pressure sensing components arranged on the left frame and the right frame of the AR glasses, the second input may be a pressing operation on the sensing control area on the left frame of the AR glasses, and in response to the pressing operation, the display content of the second sub-area is updated to a third program interface of a third program.
For example, when the interface of the M program is displayed on the second sub-area, if the unread message of the N program is received, the interface of the M program displayed on the second sub-area can be switched to the program interface of the N program by receiving the slide input operation or the press operation of the user on the pressure sensing area 3 in the upper side frame of the AR glasses.
In the above embodiment, under the condition that the program associated with the second sub-area is not locked, if a plurality of programs to be processed exist, the user can perform the third input to the sensing control area, so that the second sub-area can switch and display program interfaces corresponding to different programs, the switching and display of different programs can be quickly realized, and the operation process of the user is facilitated.
Further, in an embodiment, after step 12, the method further includes:
receiving a fourth input of the user to the induction control area;
and responding to the fourth input, and displaying a first program interface of the first program in a full screen mode in the display area.
Further, in an embodiment, after step 12, the method further includes:
receiving a fifth input of the user to the induction control area;
and responding to the fifth input, and displaying a second program interface of the second program in a full screen mode in the display area.
Wherein, be equipped with the sensing control area on the electronic equipment, fourth input and fifth input include but are not limited to the sensing control area: click input, press input, touch input, slide input, or the like.
Exemplarily, taking the AR glasses shown in fig. 4 as an example, the sensing control area is a pressure sensing area 3 corresponding to the pressure sensing components arranged on the left frame and the right frame of the AR glasses, when the pressure sensing area 3 on the right side of the AR glasses is double-clicked, the second program interface is displayed in a full screen mode, as shown in fig. 5. When the screen is split, the pressure sensing area 3 on the left side of the AR glasses is double clicked, and the first program interface is displayed in a full screen mode, as shown in fig. 9.
The first to fifth inputs may be inputs using an eyeball detection method, an infrared detection method, or the like, and are not limited to touch inputs.
It should be noted that, in the display control method provided in the embodiment of the present application, the execution main body may be a display control device, or a control module in the display control device for executing the loaded display control method. In the embodiment of the present application, a display control method implemented by a display control device is taken as an example to describe the display control method provided in the embodiment of the present application.
As shown in fig. 10, an embodiment of the present invention further provides a display control apparatus, which is applied to an augmented reality AR device, where the apparatus 1000 includes:
a first receiving module 1001, configured to receive a first input of a user to an induction control area when a display area of a display control apparatus includes a program interface of a first program and receives unread information of a second program;
a first response module 1002, configured to, in response to the first input, display a first program interface of the first program in a first sub-area of the display area, and display a second program interface of the second program in a second sub-area of the display area;
wherein a display scale of the first and second sub-areas is determined based on the first input.
Optionally, the first receiving module 1001 includes:
the first receiving submodule is used for receiving a sliding input of a user to the sensing control area;
the first response module 1002 includes:
the first response submodule is used for responding to the sliding input and acquiring the sliding length of the sliding input;
the second response submodule is used for determining the display proportion of the first sub-area and the second sub-area according to the sliding length;
and the third response submodule is used for displaying the first program interface of the first program on the first sub-area of the display area and displaying the second program interface of the second program on the second sub-area of the display area according to the display proportion of the first sub-area and the second sub-area.
Optionally, the third response sub-module includes:
and the response unit is used for displaying a second program interface of the second program in a full screen mode in the display area under the condition that the ratio of the sliding length of the sliding input to the length of the sensing control area is larger than a preset ratio.
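One way to read the slide-to-proportion rule of the submodules above is as a pure function of the sliding length: the longer the slide, the larger the share of the display area given to the second program interface, and once the ratio of sliding length to sensing-control-area length exceeds the preset ratio, the second interface takes the whole display area. The linear mapping and the concrete preset ratio (0.8) below are assumptions for illustration only; the embodiment only requires that the proportion be determined from the sliding length:

```python
def split_proportions(slide_length: float, control_area_length: float,
                      preset_ratio: float = 0.8) -> tuple:
    """Return (first_subarea_share, second_subarea_share) of the display area.

    A linear mapping from sliding length to the second sub-area's share is
    assumed here. Past the preset ratio, the second program interface is
    displayed full screen, matching the response unit described above.
    """
    ratio = slide_length / control_area_length
    if ratio > preset_ratio:
        return 0.0, 1.0  # second program interface full screen
    second = min(max(ratio, 0.0), 1.0)  # clamp to a valid share
    return 1.0 - second, second
```

For example, under these assumptions a slide covering 30% of the sensing control area would yield a 70/30 split, while a slide covering 90% would switch the second program interface to full screen.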
Optionally, the apparatus 1000 further comprises:
the second receiving module is used for receiving a second input of the user to the sensing control area;
a second response module, configured to lock the programs associated with the first sub-area and the second sub-area in response to the second input.
Optionally, the apparatus 1000 further comprises:
the third receiving module is used for receiving a third input of the user to the sensing control area;
and the third response module is used for responding to the third input and updating the display content of the second sub-area into a third program interface of a third program.
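Taken together, the second and third response modules above imply a small piece of state: a lock flag set by the second input and consulted before the third input may swap a third program's interface into the second sub-area. The class and method names below are illustrative assumptions, not the embodiment's API:

```python
class SplitScreenState:
    """Minimal sketch of the lock/update behaviour of the second and third
    response modules. All names are assumptions for illustration."""

    def __init__(self, first_program: str, second_program: str):
        self.first_program = first_program
        self.second_program = second_program
        self.locked = False

    def on_second_input(self) -> None:
        # Second input: lock the programs associated with both sub-areas.
        self.locked = True

    def on_third_input(self, third_program: str) -> bool:
        # Third input: update the second sub-area to a third program's
        # interface, but only while the associated program is not locked.
        if self.locked:
            return False
        self.second_program = third_program
        return True
```

This matches the optional condition stated for the third input: the update of the second sub-area proceeds only in the case that the program associated with it is not locked.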
The display control device in the embodiment of the present application may be a device, or may be a component, an integrated circuit, or a chip in a terminal. The device can be mobile electronic equipment or non-mobile electronic equipment. Illustratively, the mobile electronic device may be a mobile phone, a tablet computer, a notebook computer, a palmtop computer, a vehicle-mounted electronic device, a wearable device, an ultra-mobile personal computer (UMPC), a netbook, or a Personal Digital Assistant (PDA), and the like, and the non-mobile electronic device may be a server, a Network Attached Storage (NAS), a Personal Computer (PC), a Television (TV), a teller machine, a self-service machine, and the like; the embodiments of the present application are not particularly limited.
The display control device in the embodiment of the present application may be a device having an operating system. The operating system may be an Android (Android) operating system, an ios operating system, or other possible operating systems, and embodiments of the present application are not limited specifically.
The display control device provided in the embodiment of the present application can implement each process implemented by the display control device in the method embodiment of fig. 1, and is not described herein again to avoid repetition.
In the display control apparatus 1000 according to the embodiment of the present application, when the display area includes a program interface of a first program and unread information of a second program is received, a first input of a user to the sensing control area is received. In response to the first input, the display proportions of the first sub-area and the second sub-area are determined based on the first input; the first program interface of the first program is displayed in the first sub-area of the display area, and the second program interface of the second program is displayed in the second sub-area. In this way, while the AR device processes the transaction corresponding to the first program, the received unread message of the second program can be processed in parallel, realizing parallel transaction processing in the application scene of the AR device. At the same time, the display proportions of the first program interface and the second program interface can be adjusted, providing a friendly user interaction design: the user can customize the display proportions of the two interfaces, which improves their flexibility.
Optionally, an electronic device is further provided in this embodiment of the present application, and includes a processor 1110, a memory 1109, and a program or an instruction stored in the memory 1109 and capable of running on the processor 1110, where the program or the instruction, when executed by the processor 1110, implements each process of the above display control method embodiment and can achieve the same technical effect; to avoid repetition, details are not described here again.
It should be noted that the electronic devices in the embodiments of the present application include the mobile electronic devices and the non-mobile electronic devices described above.
Fig. 11 is a schematic diagram of a hardware structure of an electronic device implementing an embodiment of the present application.
The electronic device 1100 includes, but is not limited to: a radio frequency unit 1101, a network module 1102, an audio output unit 1103, an input unit 1104, a sensor 1105, a display unit 1106, a user input unit 1107, an interface unit 1108, a memory 1109, a processor 1110, and the like.
Those skilled in the art will appreciate that the electronic device 1100 may further include a power source (e.g., a battery) for supplying power to the various components; the power source may be logically connected to the processor 1110 via a power management system, so as to manage charging, discharging, and power consumption via the power management system. The electronic device structure shown in fig. 11 does not constitute a limitation of the electronic device; the electronic device may include more or fewer components than those shown, combine some components, or arrange the components differently, which is not repeated here.
Optionally, when the electronic device 1100 is the AR device, the user input unit 1107 is configured to receive a first input of the user to the sensing control area when the display area of the display control apparatus includes a program interface of the first program and receives unread information of the second program;
a processor 1110 configured to display a first program interface of the first program in a first sub-area of the display area and a second program interface of the second program in a second sub-area of the display area in response to the first input;
wherein a display scale of the first and second sub-areas is determined based on the first input.
In the electronic device 1100 provided in the embodiment of the application, when the display area includes a program interface of the first program and unread information of the second program is received, the first input of the user to the sensing control area is received. In response to the first input, the display proportions of the first sub-area and the second sub-area are determined based on the first input; the first program interface of the first program is displayed in the first sub-area of the display area, and the second program interface of the second program is displayed in the second sub-area. In this way, while the AR device processes the transaction corresponding to the first program, the received unread message of the second program can be processed in parallel, realizing parallel transaction processing in the application scene of the AR device. At the same time, the display proportions of the first program interface and the second program interface can be adjusted, providing a friendly user interaction design: the user can customize the display proportions of the two interfaces, which improves their flexibility.
Optionally, the user input unit 1107 is further configured to receive a sliding input of the user to the sensing control area;
optionally, the processor 1110 is further configured to obtain a sliding length of the sliding input in response to the sliding input;
determining the display scale of the first sub-area and the second sub-area according to the sliding length;
and displaying a first program interface of the first program on the first sub-area of the display area and displaying a second program interface of the second program on the second sub-area of the display area according to the display proportion of the first sub-area and the second sub-area.
Optionally, the processor 1110 is further configured to display a second program interface of the second program in a full screen manner in the display area when the ratio of the sliding length of the sliding input to the length of the sensing control area is greater than a preset ratio.
Optionally, after the first program interface of the first program is displayed in the first sub-area of the display area and the second program interface of the second program is displayed in the second sub-area of the display area in response to the first input, the user input unit 1107 is further configured to receive a second input of the user to the sensing control area;
processor 1110 is further configured to lock the programs associated with the first sub-area and the second sub-area in response to the second input.
Optionally, in the case that the program associated with the second sub-area is not locked, after the first program interface of the first program is displayed in the first sub-area of the display area and the second program interface of the second program is displayed in the second sub-area of the display area in response to the first input, the user input unit 1107 is further configured to receive a third input of the user to the sensing control area;
the processor 1110 is further configured to update the display content of the second sub-area to a third program interface of a third program in response to the third input.
It should be understood that, in the embodiment of the present application, the input unit 1104 may include a graphics processing unit (GPU) 11041 and a microphone 11042; the graphics processor 11041 processes image data of still pictures or video obtained by an image capturing device (such as a camera) in a video capturing mode or an image capturing mode. The display unit 1106 may include a display panel 11061, and the display panel 11061 may be configured in the form of a liquid crystal display, an organic light emitting diode, or the like. The user input unit 1107 includes a touch panel 11071, also called a touch screen, and other input devices 11072. The touch panel 11071 may include two portions: a touch detection device and a touch controller. Other input devices 11072 may include, but are not limited to, a physical keyboard, function keys (e.g., volume control keys, switch keys, etc.), a trackball, a mouse, and a joystick, which are not described in detail herein. The memory 1109 may be used for storing software programs and various data, including, but not limited to, application programs and an operating system. The processor 1110 may integrate an application processor, which primarily handles the operating system, user interfaces, applications, etc., and a modem processor, which primarily handles wireless communications. It will be appreciated that the modem processor may alternatively not be integrated into the processor 1110.
The embodiment of the present application further provides a readable storage medium, where a program or an instruction is stored on the readable storage medium, and when the program or the instruction is executed by a processor, the program or the instruction implements each process of the above display control method embodiment, and can achieve the same technical effect, and in order to avoid repetition, details are not repeated here.
The processor is the processor in the electronic device described in the above embodiment. The readable storage medium includes a computer readable storage medium, such as a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and so on.
The embodiment of the present application further provides a chip, where the chip includes a processor and a communication interface, the communication interface is coupled to the processor, and the processor is configured to execute a program or an instruction to implement each process of the above display control method embodiment, and can achieve the same technical effect, and the details are not repeated here to avoid repetition.
It should be understood that the chips mentioned in the embodiments of the present application may also be referred to as system-on-chip, system-on-chip or system-on-chip, etc.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a ..." does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element. Further, it should be noted that the scope of the methods and apparatus of the embodiments of the present application is not limited to performing the functions in the order illustrated or discussed, but may include performing the functions in a substantially simultaneous manner or in a reverse order based on the functions involved; e.g., the methods described may be performed in an order different than that described, and various steps may be added, omitted, or combined. In addition, features described with reference to certain examples may be combined in other examples.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solutions of the present application may be embodied in the form of a software product, which is stored in a storage medium (such as ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal (such as a mobile phone, a computer, a server, an air conditioner, or a network device) to execute the method according to the embodiments of the present application.
While the present embodiments have been described with reference to the accompanying drawings, it is to be understood that the invention is not limited to the precise embodiments described above, which are meant to be illustrative and not restrictive, and that various changes may be made therein by those skilled in the art without departing from the spirit and scope of the invention as defined by the appended claims.

Claims (12)

1. A display control method, comprising:
receiving a first input of a user to a sensing control area under the condition that a display area of the display control device comprises a program interface of a first program and unread information of a second program is received;
in response to the first input, displaying a first program interface of the first program on a first sub-area of the display area, and displaying a second program interface of the second program on a second sub-area of the display area;
wherein a display scale of the first and second sub-areas is determined based on the first input.
2. The display control method of claim 1, wherein receiving a first input from a user to the sensing control area comprises:
receiving a sliding input of a user to the sensing control area;
the displaying, in response to the first input, a first program interface of the first program in a first sub-region of the display area, comprising:
in response to the sliding input, obtaining a sliding length of the sliding input;
determining the display scale of the first sub-area and the second sub-area according to the sliding length;
and displaying a first program interface of the first program on the first sub-area of the display area and displaying a second program interface of the second program on the second sub-area of the display area according to the display proportion of the first sub-area and the second sub-area.
3. The display control method according to claim 2, wherein the displaying a first program interface of the first program in a first sub-area of the display area and a second program interface of the second program in a second sub-area of the display area in accordance with a display ratio of the first sub-area to the second sub-area comprises:
and under the condition that the ratio of the sliding length of the sliding input to the length of the sensing control area is larger than a preset ratio, displaying a second program interface of the second program in a full screen mode in the display area.
4. The display control method according to claim 1, wherein the displaying, in response to the first input, a first program interface of the first program in a first sub-area of the display area, and after displaying a second program interface of the second program in a second sub-area of the display area, further comprises:
receiving a second input of the user to the sensing control area;
locking the programs associated with the first sub-area and the second sub-area in response to the second input.
5. The display control method according to claim 1, wherein, in a case where a program associated with a second sub-area is not locked, the displaying, in response to the first input, a first program interface of the first program in the first sub-area of the display area, and after displaying a second program interface of the second program in the second sub-area of the display area, further comprises:
receiving a third input of the user to the sensing control area;
and responding to the third input, and updating the display content of the second sub-area to be a third program interface of a third program.
6. A display control apparatus, characterized by comprising:
the first receiving module is used for receiving a first input of a user to a sensing control area under the condition that a display area of the display control device comprises a program interface of a first program and unread information of a second program is received;
a first response module, configured to, in response to the first input, display a first program interface of the first program in a first sub-area of the display area, and display a second program interface of the second program in a second sub-area of the display area;
wherein a display scale of the first and second sub-areas is determined based on the first input.
7. The display control apparatus according to claim 6, wherein the first receiving means comprises:
the first receiving submodule is used for receiving a sliding input of a user to the sensing control area;
the first response module comprises:
the first response submodule is used for responding to the sliding input and acquiring the sliding length of the sliding input;
the second response submodule is used for determining the display proportion of the first sub-area and the second sub-area according to the sliding length;
and the third response submodule is used for displaying the first program interface of the first program on the first sub-area of the display area and displaying the second program interface of the second program on the second sub-area of the display area according to the display proportion of the first sub-area and the second sub-area.
8. The display control apparatus according to claim 7, wherein the third response submodule includes:
and the response unit is used for displaying a second program interface of the second program in a full screen mode in the display area under the condition that the ratio of the sliding length of the sliding input to the length of the sensing control area is larger than a preset ratio.
9. The display control apparatus according to claim 6, further comprising:
the second receiving module is used for receiving a second input of the user to the sensing control area;
a second response module, configured to lock the programs associated with the first sub-area and the second sub-area in response to the second input.
10. The display control apparatus according to claim 6, further comprising:
the third receiving module is used for receiving a third input of the user to the sensing control area;
and the third response module is used for responding to the third input and updating the display content of the second sub-area into a third program interface of a third program.
11. An electronic device comprising a processor, a memory, and a program or instructions stored on the memory and executable on the processor, the program or instructions when executed by the processor implementing the steps of the display control method according to any one of claims 1 to 5.
12. A readable storage medium, characterized in that the readable storage medium stores thereon a program or instructions which, when executed by a processor, implement the steps of the display control method according to any one of claims 1 to 5.
CN202011546259.8A 2020-12-24 2020-12-24 Display control method, display control device, electronic device, and medium Pending CN112558836A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011546259.8A CN112558836A (en) 2020-12-24 2020-12-24 Display control method, display control device, electronic device, and medium


Publications (1)

Publication Number Publication Date
CN112558836A true CN112558836A (en) 2021-03-26

Family

ID=75030529

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011546259.8A Pending CN112558836A (en) 2020-12-24 2020-12-24 Display control method, display control device, electronic device, and medium

Country Status (1)

Country Link
CN (1) CN112558836A (en)

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104423584A (en) * 2013-09-02 2015-03-18 Lg电子株式会社 Wearable device and method of outputting content thereof
US20160065952A1 (en) * 2014-08-28 2016-03-03 Samsung Electronics Co., Ltd. Method and apparatus for configuring screen for virtual reality
CN105829948A (en) * 2013-12-18 2016-08-03 微软技术许可有限责任公司 Wearable Display Input System
CN106209797A (en) * 2016-06-29 2016-12-07 乐视控股(北京)有限公司 A kind of method and device of split screen display available reminder message
US20180232056A1 (en) * 2017-02-14 2018-08-16 Samsung Electronics Co., Ltd. Method for display of information from real world environment on a virtual reality (vr) device and vr device thereof
CN109062467A (en) * 2018-07-03 2018-12-21 Oppo广东移动通信有限公司 Split screen application switching method, device, storage medium and electronic equipment
CN109144447A (en) * 2018-07-30 2019-01-04 Oppo广东移动通信有限公司 Split screen window adjusting method, device, storage medium and electronic equipment
CN109739427A (en) * 2018-12-03 2019-05-10 北京梧桐车联科技有限责任公司 Screen operating method and device, display equipment and storage medium
CN110286842A (en) * 2019-06-28 2019-09-27 Oppo广东移动通信有限公司 The control method of a kind of electronic equipment and split screen display available
CN110347305A (en) * 2019-05-30 2019-10-18 华为技术有限公司 A kind of VR multi-display method and electronic equipment



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20210326