WO2020213834A1 - Electronic device for displaying execution screens of a plurality of applications, and operating method thereof - Google Patents

Electronic device for displaying execution screens of a plurality of applications, and operating method thereof

Info

Publication number
WO2020213834A1
Authority
WO
WIPO (PCT)
Prior art keywords
window
pop
screen
display
electronic device
Prior art date
Application number
PCT/KR2020/003130
Other languages
English (en)
Korean (ko)
Inventor
이상언
김다솜
이상기
Original Assignee
삼성전자 주식회사
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 삼성전자 주식회사
Publication of WO2020213834A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/14 Digital output to display device; Cooperation and interconnection of the display device with other functional units
    • G06F 3/1423 Digital output to display device; Cooperation and interconnection of the display device with other functional units controlling a plurality of local displays, e.g. CRT and flat panel display
    • G06F 3/1431 Digital output to display device; Cooperation and interconnection of the display device with other functional units controlling a plurality of local displays, e.g. CRT and flat panel display using a single graphics controller
    • G06F 9/00 Arrangements for program control, e.g. control units
    • G06F 9/06 Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F 9/46 Multiprogramming arrangements
    • G06F 9/50 Allocation of resources, e.g. of the central processing unit [CPU]
    • G06F 2203/00 Indexing scheme relating to G06F 3/00 - G06F 3/048
    • G06F 2203/048 Indexing scheme relating to G06F 3/048
    • G06F 2203/04803 Split screen, i.e. subdividing the display area or the window area into separate subareas

Definitions

  • Various embodiments of the present disclosure relate to an electronic device that simultaneously displays execution screens of a plurality of applications and a method of operating the same.
  • Recent electronic devices provide a function of simultaneously displaying execution screens of two or more applications on one screen. For example, in order to display the execution screens of two or more applications, the electronic device may divide the display into two or more regions and display an application execution screen in each divided region, or may display a plurality of overlaid windows, each showing the execution screen of one of the applications.
  • An electronic device that displays the execution screens of a plurality of applications together may arrange the execution screens in a predetermined layout or superimpose them on one another while the execution screens are displayed together.
  • In conventional devices, the size of an execution screen can be adjusted only when the user selects a specific part of the execution screen (e.g., a corner of the execution screen), and no interface is provided for easily aligning multiple execution screens.
  • To address this problem, various embodiments of the present invention provide an electronic device, and an operating method thereof, capable of easily adjusting the size of an execution screen and aligning the position of the execution screen while the execution screens of a plurality of applications are displayed together.
  • An electronic device according to various embodiments includes a display including a touch screen and a processor operatively connected to the display. While a designated screen is displayed through the entire screen of the display, the processor displays the execution screen of a first application by overlaying a first pop-up window on the designated screen based on a first user input, and moves the first pop-up window based on a second user input.
  • When the moved first pop-up window is adjacent to a boundary of the display, the processor displays the first pop-up window overlaid on the designated screen in a first designated area of the display adjacent to that boundary. Based on a third user input for the first pop-up window, the processor displays the execution screen of the first application as a first split screen in a second designated area of the display that includes the first designated area, and displays the designated screen as a second split screen in a third designated area of the display excluding the second designated area.
  • A method of displaying execution screens of a plurality of applications by an electronic device according to various embodiments may include: displaying the execution screen of a first application as a first pop-up window overlaid on a designated screen based on a first user input while the designated screen is displayed through the entire screen of the display; moving the first pop-up window based on a further user input; displaying the execution screen of the first application as a first split screen in a second designated area of the display that includes a first designated area adjacent to a boundary of the display; and displaying the designated screen as a second split screen in a third designated area of the display excluding the second designated area.
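  • The claimed interaction can be summarized as a small state flow. Below is a minimal Kotlin sketch of that flow under stated assumptions: the state names, the snapThresholdPx value, and the area labels are illustrative placeholders and are not taken from the disclosure.

        // Minimal sketch of the claimed flow; names and thresholds are illustrative only.
        enum class WindowMode { POPUP, POPUP_ALIGNED, SPLIT }

        data class WindowState(val mode: WindowMode, val area: String)

        // First user input: overlay the first application's execution screen as a pop-up window.
        fun onFirstInput() = WindowState(WindowMode.POPUP, area = "free-floating")

        // Second user input: if the moved pop-up ends up adjacent to a display boundary,
        // align it in the first designated area next to that boundary.
        fun onSecondInput(state: WindowState, distanceToBoundaryPx: Int, snapThresholdPx: Int = 48) =
            if (state.mode == WindowMode.POPUP && distanceToBoundaryPx <= snapThresholdPx)
                WindowState(WindowMode.POPUP_ALIGNED, area = "first designated area")
            else state

        // Third user input: convert the aligned pop-up into a split screen occupying the second
        // designated area; the designated screen keeps the remaining third designated area.
        fun onThirdInput(state: WindowState) =
            if (state.mode == WindowMode.POPUP_ALIGNED)
                WindowState(WindowMode.SPLIT, area = "second designated area")
            else state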
  • An electronic device that controls execution screens of a plurality of applications according to various embodiments can easily adjust the size of an execution screen and align the positions of the execution screens while displaying the execution screens of the plurality of applications.
  • FIG. 1 is a block diagram of an electronic device in a network environment according to various embodiments.
  • FIG. 2 is a block diagram of a processor and a display device according to various embodiments.
  • FIG. 3 is an example in which an electronic device displays a pop-up window according to an embodiment.
  • FIG. 4 is an example of a pop-up window according to an embodiment.
  • FIG. 5 is an example of a method of adjusting transparency of a pop-up window according to an exemplary embodiment.
  • FIG. 6 is an example in which a display according to an exemplary embodiment has a notch area.
  • FIG. 7 is an example for explaining a method of generating a pop-up window according to an exemplary embodiment.
  • FIG. 8 is an example for explaining a method of generating a pop-up window according to another embodiment.
  • FIGS. 9 and 10 are examples for explaining a method of arranging pop-up windows by an electronic device according to an exemplary embodiment.
  • FIG. 11 is a detailed example of an aligned pop-up window.
  • FIGS. 12 and 13 are examples illustrating a location of a pop-up window according to various embodiments.
  • FIG. 15 is an example of a process of adding a new pop-up window by an electronic device according to an embodiment.
  • FIGS. 16 and 17 are examples for explaining a method of arranging a new pop-up window by an electronic device according to an embodiment.
  • FIG. 18 is an example in which the position of an aligned pop-up window changes when an electronic device rotates according to an embodiment.
  • FIGS. 19 to 21 are examples illustrating a method of converting a pop-up window into a split screen by an electronic device according to an exemplary embodiment.
  • FIGS. 22 and 23 are examples illustrating a method of converting any one of a plurality of pop-up windows into a split screen by an electronic device according to an exemplary embodiment.
  • FIGS. 24 and 25 are examples illustrating a process of adding a new pop-up window in a split window state by an electronic device according to an embodiment.
  • FIG. 26 is an example of a process in which a position of a divided window is changed when an electronic device rotates according to an embodiment.
  • FIG. 27 is a flowchart illustrating a method of driving an electronic device according to various embodiments of the present disclosure.
  • FIG. 28 is a flowchart illustrating a detailed method of arranging pop-up windows by an electronic device according to various embodiments of the present disclosure.
  • FIG. 1 is a block diagram of an electronic device 101 in a network environment 100 according to various embodiments.
  • According to various embodiments, the electronic device 101 may communicate with the electronic device 102 through a first network 198 (e.g., a short-range wireless communication network), or may communicate with the electronic device 104 or the server 108 through a second network 199 (e.g., a long-range wireless communication network).
  • the electronic device 101 may communicate with the electronic device 104 through the server 108.
  • The electronic device 101 may include a processor 120, a memory 130, an input device 150, a sound output device 155, a display device 160, an audio module 170, a sensor module 176, an interface 177, a haptic module 179, a camera module 180, a power management module 188, a battery 189, a communication module 190, a subscriber identification module 196, or an antenna module 197.
  • at least one of these components may be omitted or one or more other components may be added to the electronic device 101.
  • some of these components may be implemented as one integrated circuit.
  • For example, the sensor module 176 (e.g., a fingerprint sensor, an iris sensor, or an illuminance sensor) may be embedded in the display device 160 (e.g., a display).
  • The processor 120 may, for example, execute software (e.g., a program 140) to control at least one other component (e.g., a hardware or software component) of the electronic device 101 connected to the processor 120, and may perform various data processing or operations. According to an embodiment, as at least part of the data processing or operations, the processor 120 may load commands or data received from another component (e.g., the sensor module 176 or the communication module 190) into the volatile memory 132, process the commands or data stored in the volatile memory 132, and store the resulting data in the nonvolatile memory 134.
  • According to an embodiment, the processor 120 may include a main processor 121 (e.g., a central processing unit or an application processor) and an auxiliary processor 123 (e.g., a graphics processing unit, an image signal processor, a sensor hub processor, or a communication processor) that can operate independently of, or together with, the main processor 121. Additionally or alternatively, the auxiliary processor 123 may be set to use less power than the main processor 121 or to be specialized for a designated function. The auxiliary processor 123 may be implemented separately from the main processor 121 or as part of it.
  • The auxiliary processor 123 may, for example, control at least some of the functions or states related to at least one of the components of the electronic device 101 (e.g., the display device 160, the sensor module 176, or the communication module 190) on behalf of the main processor 121 while the main processor 121 is in an inactive (e.g., sleep) state, or together with the main processor 121 while the main processor 121 is in an active (e.g., application execution) state. According to an embodiment, the auxiliary processor 123 (e.g., an image signal processor or a communication processor) may be implemented as part of another functionally related component (e.g., the camera module 180 or the communication module 190).
  • the memory 130 may store various data used by at least one component of the electronic device 101 (eg, the processor 120 or the sensor module 176).
  • the data may include, for example, software (eg, the program 140) and input data or output data for commands related thereto.
  • the memory 130 may include a volatile memory 132 or a nonvolatile memory 134.
  • the program 140 may be stored as software in the memory 130, and may include, for example, an operating system 142, middleware 144, or an application 146.
  • the input device 150 may receive a command or data to be used for a component of the electronic device 101 (eg, the processor 120) from an outside (eg, a user) of the electronic device 101.
  • the input device 150 may include, for example, a microphone, a mouse, a keyboard, or a digital pen (eg, a stylus pen).
  • The sound output device 155 may output a sound signal to the outside of the electronic device 101.
  • the sound output device 155 may include, for example, a speaker or a receiver.
  • the speaker can be used for general purposes such as multimedia playback or recording playback, and the receiver can be used to receive incoming calls. According to one embodiment, the receiver may be implemented separately from the speaker or as part of it.
  • the display device 160 may visually provide information to the outside of the electronic device 101 (eg, a user).
  • the display device 160 may include, for example, a display, a hologram device, or a projector and a control circuit for controlling the device.
  • According to an embodiment, the display device 160 may include touch circuitry set to sense a touch, or a sensor circuit (e.g., a pressure sensor) set to measure the intensity of a force generated by a touch.
  • The audio module 170 may convert sound into an electrical signal or, conversely, convert an electrical signal into sound. According to an embodiment, the audio module 170 may acquire sound through the input device 150, or may output sound through the sound output device 155 or through an external electronic device (e.g., the electronic device 102, such as a speaker or headphones) directly or wirelessly connected to the electronic device 101.
  • The sensor module 176 may detect an operating state (e.g., power or temperature) of the electronic device 101 or an external environmental state (e.g., a user state), and may generate an electrical signal or data value corresponding to the detected state.
  • The sensor module 176 may include, for example, a gesture sensor, a gyro sensor, an atmospheric pressure sensor, a magnetic sensor, an acceleration sensor, a grip sensor, a proximity sensor, a color sensor, an infrared (IR) sensor, a biometric sensor, a temperature sensor, a humidity sensor, or an illuminance sensor.
  • the interface 177 may support one or more designated protocols that may be used for the electronic device 101 to directly or wirelessly connect with an external electronic device (eg, the electronic device 102 ).
  • the interface 177 may include, for example, a high definition multimedia interface (HDMI), a universal serial bus (USB) interface, an SD card interface, or an audio interface.
  • the connection terminal 178 may include a connector through which the electronic device 101 can be physically connected to an external electronic device (eg, the electronic device 102 ).
  • the connection terminal 178 may include, for example, an HDMI connector, a USB connector, an SD card connector, or an audio connector (eg, a headphone connector).
  • the haptic module 179 may convert an electrical signal into a mechanical stimulus (eg, vibration or movement) or an electrical stimulus that a user can perceive through a tactile or motor sense.
  • the haptic module 179 may include, for example, a motor, a piezoelectric element, or an electrical stimulation device.
  • the camera module 180 may capture a still image and a video.
  • the camera module 180 may include one or more lenses, image sensors, image signal processors, or flashes.
  • the power management module 188 may manage power supplied to the electronic device 101.
  • the power management module 188 may be implemented as, for example, at least a part of a power management integrated circuit (PMIC).
  • the battery 189 may supply power to at least one component of the electronic device 101.
  • the battery 189 may include, for example, a non-rechargeable primary cell, a rechargeable secondary cell, or a fuel cell.
  • The communication module 190 may support establishment of a direct (e.g., wired) communication channel or a wireless communication channel between the electronic device 101 and an external electronic device (e.g., the electronic device 102, the electronic device 104, or the server 108), and may support communication through the established communication channel.
  • The communication module 190 may include one or more communication processors that operate independently of the processor 120 (e.g., an application processor) and support direct (e.g., wired) communication or wireless communication.
  • According to an embodiment, the communication module 190 may include a wireless communication module 192 (e.g., a cellular communication module, a short-range wireless communication module, or a global navigation satellite system (GNSS) communication module) or a wired communication module 194 (e.g., a local area network (LAN) communication module or a power line communication module).
  • The corresponding communication module may communicate with an external electronic device through the first network 198 (e.g., a short-range communication network such as Bluetooth, Wi-Fi Direct, or infrared data association (IrDA)) or the second network 199 (e.g., a long-range communication network such as a cellular network, the Internet, or a computer network such as a LAN or WAN).
  • The wireless communication module 192 may identify and authenticate the electronic device 101 within a communication network, such as the first network 198 or the second network 199, using subscriber information (e.g., an International Mobile Subscriber Identity (IMSI)) stored in the subscriber identification module 196.
  • the antenna module 197 may transmit a signal or power to the outside (eg, an external electronic device) or receive from the outside.
  • the antenna module may include one antenna including a conductor formed on a substrate (eg, a PCB) or a radiator formed of a conductive pattern.
  • According to an embodiment, the antenna module 197 may include a plurality of antennas. In this case, at least one antenna suitable for a communication method used in a communication network, such as the first network 198 or the second network 199, may be selected from the plurality of antennas by, for example, the communication module 190.
  • the signal or power may be transmitted or received between the communication module 190 and an external electronic device through the at least one selected antenna.
  • According to some embodiments, components other than the radiator (e.g., an RFIC) may be additionally formed as part of the antenna module 197.
  • At least some of the components may be connected to each other through a communication method between peripheral devices (e.g., a bus, general purpose input and output (GPIO), serial peripheral interface (SPI), or mobile industry processor interface (MIPI)), and may exchange signals (e.g., commands or data) with each other.
  • the command or data may be transmitted or received between the electronic device 101 and the external electronic device 104 through the server 108 connected to the second network 199.
  • Each of the electronic devices 102 and 104 may be a device of the same or different type as the electronic device 101.
  • all or part of the operations executed by the electronic device 101 may be executed by one or more of the external electronic devices 102, 104, or 108.
  • For example, when the electronic device 101 needs to perform a function or service automatically, or in response to a request from a user or another device, the electronic device 101 may request one or more external electronic devices to perform at least part of the function or service instead of, or in addition to, executing the function or service by itself.
  • One or more external electronic devices receiving the request may execute at least a part of the requested function or service, or an additional function or service related to the request, and transmit the execution result to the electronic device 101.
  • the electronic device 101 may process the result as it is or additionally and provide it as at least a part of a response to the request.
  • cloud computing, distributed computing, or client-server computing technology may be used.
  • FIG. 2 is a block diagram of a processor 120 and a display device 160 according to various embodiments.
  • the display device 160 may include a display 210 and a display driver IC (DDI) 230 for controlling the display 210.
  • the DDI 230 may include an interface module 231, a memory 233 (eg, a buffer memory), an image processing module 235, or a mapping module 237.
  • The DDI 230 may receive, for example, image data, or image information including an image control signal corresponding to a command for controlling the image data, from another component of the electronic device 101 through the interface module 231.
  • For example, the image information may be received from the processor 120 (e.g., the main processor 121, such as an application processor) or from the auxiliary processor 123 (e.g., a graphics processing unit) operating independently of the main processor 121.
  • the DDI 230 may communicate with the touch circuit 250 or the sensor module 176 through the interface module 231.
  • The DDI 230 may store at least a portion of the received image information in the memory 233, for example, in units of frames.
  • The image processing module 235 may, for example, perform pre-processing or post-processing (e.g., resolution, brightness, or size adjustment) on at least a part of the image data, based at least on the characteristics of the image data or the characteristics of the display 210.
  • The mapping module 237 may generate a voltage value or a current value corresponding to the image data pre-processed or post-processed through the image processing module 235. According to an embodiment, the generation of the voltage value or the current value may be performed based at least in part on the properties of the pixels of the display 210 (e.g., the arrangement of the pixels, such as an RGB stripe or PenTile structure, or the size of each subpixel). At least some pixels of the display 210 may be driven based at least in part on the voltage value or the current value, so that visual information (e.g., text, an image, or an icon) corresponding to the image data is displayed through the display 210.
  • the display device 160 may further include a touch circuit 250.
  • the touch circuit 250 may include a touch sensor 251 and a touch sensor IC 253 for controlling the touch sensor 251.
  • the touch sensor IC 253 may control the touch sensor 251 to detect, for example, a touch input or a hovering input for a specific position of the display 210.
  • the touch sensor IC 253 may detect a touch input or a hovering input by measuring a change in a signal (eg, voltage, amount of light, resistance, or amount of charge) for a specific location of the display 210.
  • the touch sensor IC 253 may provide information (eg, location, area, pressure, or time) on the sensed touch input or hovering input to the processor 120.
  • According to an embodiment, at least a part of the touch circuit 250 may be included as part of the display driver IC 230, as part of the display 210, or as part of another component disposed outside the display device 160 (e.g., the auxiliary processor 123).
  • the display device 160 may further include at least one sensor of the sensor module 176 (eg, a fingerprint sensor, an iris sensor, a pressure sensor, or an illuminance sensor), or a control circuit therefor.
  • the at least one sensor or a control circuit therefor may be embedded in a part of the display device 160 (for example, the display 210 or DDI 230) or a part of the touch circuit 250.
  • For example, when the sensor module 176 embedded in the display device 160 includes a biometric sensor (e.g., a fingerprint sensor), the biometric sensor may acquire biometric information (e.g., a fingerprint image) associated with a touch input through a partial area of the display 210.
  • Alternatively, when the sensor module 176 includes a pressure sensor, the pressure sensor may acquire pressure information associated with a touch input through a partial area or the entire area of the display 210.
  • the touch sensor 251 or the sensor module 176 may be disposed between pixels of a pixel layer of the display 210 or above or below the pixel layer.
  • According to various embodiments, the processor 120 may include an event collection module 271, a window guide module 273, a function execution module 275, or a window processing module 277 in order to display at least some of a plurality of application execution screens in a window form.
  • the event collection module 271 may collect events occurring in an interface (eg, 177 of FIG. 1) or the touch sensor 251.
  • For example, the event collection module 271 may collect events related to a multi-window function (e.g., a function of displaying a plurality of application execution screens together, such as displaying the execution screen of a specific application as a pop-up screen or as a split screen), to settings for that function, or to outputting or removing a window tray.
  • When an event occurs on an item related to multi-windows (e.g., a specific window), the event collection module 271 may transmit the event to at least one of the window guide module 273, the function execution module 275, or the window processing module 277.
  • The event collection module 271 may check the event type. For example, the event collection module 271 may analyze a received event to determine whether a tap (or double-tap) touch event, a long touch event, a sweep event, a pinch event, or the like occurred on an item displayed on the display 210. The event collection module 271 may transmit information related to the event type or the event occurrence location to the related module. For example, when an event related to the window guide module 273 (e.g., a long touch event or a tap event) is collected, the event collection module 271 may notify the window guide module 273.
  • the event collection module 271 may notify the function execution module 275 when an event related to function execution (eg, a double tap touch event or a pinch event) is collected.
  • the event collection module 271 may notify the window processing module 277 if it is an event related to window adjustment (eg, a sweep event or a drag event).
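  • As an illustration of the dispatching described above, the following is a minimal Kotlin sketch. The event types, the handler interfaces, and the routing table are assumptions made for the example; they are not an implementation taken from the disclosure.

        // Illustrative event dispatching; the modules are modeled as plain interfaces.
        enum class EventType { TAP, DOUBLE_TAP, LONG_TOUCH, SWEEP, DRAG, PINCH }

        data class TouchEvent(val type: EventType, val x: Int, val y: Int)

        interface WindowGuideModule { fun onEvent(e: TouchEvent) }
        interface FunctionExecutionModule { fun onEvent(e: TouchEvent) }
        interface WindowProcessingModule { fun onEvent(e: TouchEvent) }

        class EventCollectionModule(
            private val guide: WindowGuideModule,
            private val executor: FunctionExecutionModule,
            private val windows: WindowProcessingModule
        ) {
            // Route each collected event to the module responsible for that event class.
            fun dispatch(e: TouchEvent) = when (e.type) {
                EventType.LONG_TOUCH, EventType.TAP -> guide.onEvent(e)      // guide-related events
                EventType.DOUBLE_TAP, EventType.PINCH -> executor.onEvent(e) // function-execution events
                EventType.SWEEP, EventType.DRAG -> windows.onEvent(e)        // window-adjustment events
            }
        }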
  • The window guide module 273 may control output of a window tray (e.g., a tray in which items related to execution of at least one function are arranged). For example, the window guide module 273 may provide at least one setting related to window tray output (e.g., an icon or menu related to window tray output, or a key button assignment). When an event related to such a setting (e.g., selection of the window tray icon, menu, or key button) occurs, the window guide module 273 may control the display 210 to output the window tray. According to various embodiments, in order to output a plurality of windows on the display 210, the window guide module 273 may control output of a recently executed list in response to occurrence of a designated event. When an item included in the recently executed list is selected, the window guide module 273 may control output of guide information related to the selected item.
  • the window guide module 273 may output designated guide information on the display 210 when receiving a notification related to occurrence of a designated event from the event collection module 271.
  • the window guide module 273 may output guide information obtained by changing a display format of a certain area of the display 210 in response to selection of an item included in a window tray or a recently executed list.
  • For example, the window guide module 273 may output, as guide information, an image illustrating the shape of a window to be output in a designated area of the display 210, such as area information set to be output in a pop-up window format, area information set to be output in a split-screen format, or area information set to be output in a split-window format (e.g., displaying execution screens through a plurality of windows).
  • the window guide module 273 may output a thumbnail or a designated image related to the selected item as guide information in an area where the selected item overlaps.
  • the window guide module 273 may output guide information including at least one of text or images related to a designated window shape.
  • the window guide module 273 may output operation information (eg, at least one of text or image) required to output at least one of a pop-up window or a split window in relation to the selected item as guide information.
  • the window guide module 273 may output at least one of text information indicating various sizes of a window and arrangement information of a plurality of windows as guide information.
  • the window guide module 273 may remove the output guide information when receiving a specific event in relation to the selected item from the event collection module 271.
  • the window guide module 273 may remove at least some of the output guide information when a designated gesture event occurs while guide information is being output.
  • The window guide module 273 may control a return to the state before the item selection (e.g., a state in which the window tray or the recently executed list is displayed, or a state in which a specific window was displayed before the window tray or the recently executed list was displayed).
  • The window guide module 273 may change the guide information according to an event transmitted from the event collection module 271. For example, when an event related to resizing (e.g., a drag event) is received, the window guide module 273 may change the size of the guide information being output. Alternatively, the window guide module 273 may adjust at least one of the text or the images included in the guide information (e.g., reduce or enlarge the output image) according to the event.
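  • A minimal Kotlin sketch of such a resize update is shown below, assuming the guide information is modeled as a rectangle plus a label and that the drag deltas arrive in pixels; these types and names are illustrative, not part of the disclosure.

        // Illustrative handling of guide information by the window guide module.
        data class Rect(val left: Int, val top: Int, val right: Int, val bottom: Int)
        data class GuideInfo(val bounds: Rect, val label: String)

        // On a resize-related event (e.g., a drag), grow or shrink the guide overlay that
        // previews the window shape; dx and dy are the drag deltas in pixels.
        fun resizeGuide(guide: GuideInfo, dx: Int, dy: Int): GuideInfo =
            guide.copy(bounds = guide.bounds.copy(right = guide.bounds.right + dx,
                                                  bottom = guide.bounds.bottom + dy))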
  • the function execution module 275 may execute a function set for a corresponding item.
  • the function execution module 275 may provide a window according to execution of the function to the window processing module 277.
  • the function execution module 275 may control execution of a function related to at least one window when multi-windows are output on the display 210.
  • the function execution module 275 may control execution of a function related to a focused window (eg, a window in which input event processing is designated).
  • the function execution module 275 may process a function related to an unfocused window as background processing.
  • the function execution module 275 may control execution of a corresponding function in response to an event transmitted by the event collection module 271.
  • the function execution module 275 may transmit information according to the function execution to the window processing module 277. In this operation, the function execution module 275 may transmit function execution information including window identification information to the window processing module 277.
  • the window processing module 277 may perform window processing in response to the function execution information transmitted from the function execution module 275.
  • the window processing module 277 may receive function execution information related to a function designated by an input event or a function selected according to a preset job scheduling.
  • the window processing module 277 may generate a window related to outputting the corresponding function execution information.
  • the window processing module 277 may update the window generated according to the function execution information. In this operation, the window processing module 277 may check window identification information included in the function execution information and update the corresponding window.
  • the window processing module 277 may generate a new window upon receiving function execution information related to a new window from the function execution module 275.
  • the window processing module 277 may determine an output type of a new window in response to an event transmitted from the event collection module 271. For example, the window processing module 277 may output a window to be output as a pop-up window or a split window according to an event. Alternatively, the window processing module 277 may adjust at least one of a size or position of a specified pop-up window (or split window) according to an event.
  • the window processing module 277 may receive function execution information according to the widget execution from the function execution module 275. The window processing module 277 may generate a widget window in response to the received function execution information and output it to the display 210.
  • the window processing module 277 may adjust the display type of function execution information according to the type of window to be output. For example, the window processing module 277 may adjust the amount or size of function execution information to be output, or an arrangement position of the information according to the type (or size) of the window to be output. According to various embodiments, the window processing module 277 may adjust a shape (eg, size or position) of a window to be output according to an event. Alternatively, the window processing module 277 may change the shape of the window being output according to an event.
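  • As an illustration of how the window processing module 277 might key windows by the identification information contained in function execution information, here is a minimal Kotlin sketch; the data shapes, the string-typed window type, and the id allocation are assumptions made only for the example.

        // Illustrative window bookkeeping keyed by window identification information.
        data class FunctionExecutionInfo(val windowId: Int?, val content: String)
        data class ManagedWindow(val id: Int, var content: String, var type: String = "popup")

        class WindowProcessingModule {
            private val windows = mutableMapOf<Int, ManagedWindow>()
            private var nextId = 1

            // Update the window named by the identification info, or create a new one if none exists.
            fun process(info: FunctionExecutionInfo): ManagedWindow {
                val id = info.windowId
                return if (id != null && id in windows) {
                    windows.getValue(id).apply { content = info.content }              // update existing window
                } else {
                    ManagedWindow(nextId++, info.content).also { windows[it.id] = it } // create a new window
                }
            }

            // Change the output type of an existing window (e.g., "popup" or "split").
            fun setType(id: Int, type: String) { windows[id]?.type = type }
        }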
  • An electronic device according to various embodiments includes a display including a touch screen and a processor operatively connected to the display. While an execution screen of a first application is displayed, the processor displays an execution screen of a second application as a first pop-up window overlaid on the execution screen of the first application based on a first user input. When, at the position to which the first pop-up window has been moved based on a second user input, the distance between one boundary of the first pop-up window and one boundary of the display is less than a specified value, the processor displays the first pop-up window overlaid on the execution screen of the first application in a designated first area of the display. Based on a third user input entered on the first pop-up window, the processor displays the execution screen of the second application as a split window in a second area including the first area, such that the execution screen of the first application and the execution screen of the second application together constitute the entire screen of the display.
  • FIG. 3 is an example in which the electronic device 101 displays a pop-up window according to an embodiment.
  • Referring to FIG. 3, the electronic device 101 may display a first pop-up window 320 (e.g., a pop-up screen) based on a first user input related to the creation of a pop-up window while displaying the main screen 310.
  • The term "pop-up window" used in this document refers to a window in which the execution screen of a specific application (e.g., a second application) is displayed in an overlay form on a specific screen (e.g., a main screen or a home screen), and may also be referred to as a pop-up screen, a pop-up view, or a floating window.
  • the electronic device may move the pop-up window based on the user touching and dragging the execution screen displayed as an overlay.
  • According to an embodiment, the electronic device 101 may display the main screen 310 through a first layer and, upon receiving a first user input, may display the first pop-up window 320 through a second layer disposed on the first layer. Accordingly, a first sub-screen (e.g., "B" in FIG. 3) included in the first pop-up window 320 may be displayed over the main screen 310 (e.g., "A" in FIG. 3).
  • the main screen 310 may be a home screen of the electronic device 101 or an execution screen of a specific application (eg, a first application).
  • the first sub-screen included in the first pop-up window 320 may be an execution screen of another application (eg, a second application).
  • According to various embodiments, the first user input related to the creation of a pop-up window may be a user input through an edge panel (e.g., 701 in FIG. 7), as shown in FIG. 7, or a user input through a task switching screen. This will be described later in detail with reference to FIGS. 7 and 8.
  • the electronic device 101 may drive the display 210 in a divided state (or divided mode) based on a designated user input.
  • The divided state may be a state in which the electronic device 101 divides the display 210 into a plurality of areas by software and displays a screen related to a different task (e.g., an execution screen of an application) in each divided area.
  • For example, the electronic device 101 may divide the main screen 310 into a main area 311 and a sub area 312, and may divide the sub area 312 into a first sub-area 312a and a second sub-area 312b.
  • According to various embodiments, the electronic device may align a pop-up window (e.g., the first pop-up window 320) in a designated area, as described later with reference to FIGS. 9 and 10, and the designated area may be determined to correspond to the positions of the divided sub-areas used when the display 210 is driven in the divided state (or divided mode).
  • For example, the first sub-area 312a may be disposed above the second sub-area 312b.
  • According to an embodiment, in the divided state, the electronic device 101 may divide the main screen 310 displayed through the first layer into three areas, for example, the main area 311, the first sub-area 312a, and the second sub-area 312b, and may display a screen related to a different task (e.g., an execution screen of an application) in each divided area.
  • In a state other than the divided state (e.g., a normal state or a normal mode), the display 210 is not divided, and the home screen or the execution screen of a specific application (e.g., a first application) may be displayed through the entire main screen 310.
  • FIG. 4 is an example of a pop-up window according to an embodiment.
  • FIG. 5 is an example of a method of adjusting transparency of a pop-up window according to an exemplary embodiment.
  • Referring to FIG. 4, a pop-up window (e.g., the first pop-up window 320 of FIG. 3) according to an embodiment may include a handler 411, and when a user input selecting the handler 411 is received, a quick option menu 322 may be displayed.
  • the quick option menu 322 may include a plurality of icons 323, 324, 325, 326 and 327 for executing functions related to the pop-up window 320.
  • For example, the plurality of icons 323, 324, 325, 326, and 327 may include a first icon 323 for releasing the pop-up state of the pop-up window 320, switching the display 210 to the divided state, and displaying the pop-up window 320 through the sub area 312, a second icon 324 for adjusting the transparency of the pop-up window 320, a third icon 325 for minimizing the pop-up window 320, a fourth icon 326 for expanding the pop-up window 320 to the full screen of the display 210, or a fifth icon 327 for closing the pop-up window 320.
  • When receiving a user input 412 for selecting the second icon 324, the electronic device 101 according to an embodiment may display an adjustment bar 511 for adjusting transparency in the pop-up window 320, as shown in FIG. 5.
  • The electronic device 101 may adjust the transparency of the pop-up window 320 based on a user input 521 that moves the adjustment bar 511 in a specific direction (e.g., a horizontal direction).
  • Here, adjusting the transparency of the pop-up window 320 may mean adjusting the degree to which the sub-screen (e.g., "B" in FIG. 4) included in the pop-up window 320 covers the main screen 310 (e.g., "A" in FIG. 4).
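  • The mapping from the adjustment bar position to a window alpha value could look like the minimal Kotlin sketch below; the 0..100 bar range, the minimum alpha, and the linear mapping are assumptions for illustration, not values from the disclosure.

        // Illustrative transparency control: the adjustment bar position (0..100) is mapped to an
        // alpha value for the pop-up window, where 1.0f fully covers the main screen and lower
        // values let the main screen show through.
        data class PopupWindowModel(val alpha: Float)

        fun applyTransparency(popup: PopupWindowModel, barPosition: Int): PopupWindowModel {
            val clamped = barPosition.coerceIn(0, 100)
            // Keep a small minimum alpha so the pop-up never becomes completely invisible.
            val alpha = 0.2f + 0.8f * (clamped / 100f)
            return popup.copy(alpha = alpha)
        }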
  • FIG. 6 is an example in which the display 210 according to an exemplary embodiment has a notch area.
  • an electronic device 101 may include a notch area 601 and a non-notch area 602.
  • the notch area 601 may be an area for disposing some components (eg, a camera and at least one sensor) on the front surface of the electronic device 101.
  • The notch area 601 may be disposed at one corner of the display 210 in a plan view of the display 210, and the electronic device 101 may drive the display 210 in the divided state based on the position of the notch area 601.
  • For example, in a plan view of the display 210, the electronic device 101 may determine a partial area of the display 210 located below the notch area 601 as the sub area 312, and may determine the remaining area excluding the sub area 312 (i.e., the area not corresponding to the notch area) as the main area 311.
  • The electronic device 101 may divide the sub area 312 into two areas. For example, the electronic device 101 may determine the area of the sub area 312 relatively close to the notch area 601 as the first sub-area 312a, and may determine the area relatively far from the notch area 601 as the second sub-area 312b.
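  • A minimal Kotlin sketch of this layout derivation follows, assuming the notch sits at the top-right corner of a portrait display and that areas are simple rectangles; the geometry helper and the halving of the sub area are illustrative assumptions.

        // Illustrative derivation of the main area and sub areas from the notch position.
        data class Rect(val left: Int, val top: Int, val right: Int, val bottom: Int) {
            val height get() = bottom - top
        }

        fun splitAroundNotch(display: Rect, notch: Rect): Triple<Rect, Rect, Rect> {
            // Sub area (cf. 312): the column of the display directly below the notch.
            val sub = Rect(notch.left, display.top, display.right, display.bottom)
            // Main area (cf. 311): the remaining area that does not correspond to the notch column.
            val main = Rect(display.left, display.top, notch.left, display.bottom)
            // The half closer to the notch (cf. 312a) and the half farther from it (cf. 312b).
            val mid = sub.top + sub.height / 2
            val subA = Rect(sub.left, sub.top, sub.right, mid)
            val subB = Rect(sub.left, mid, sub.right, sub.bottom)
            return Triple(main, subA, subB)
        }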
  • FIG. 7 is an example for explaining a method of generating a pop-up window according to an exemplary embodiment.
  • Referring to FIG. 7, the electronic device 101 may display an edge panel 701.
  • the edge panel 701 may include an icon 711 of an application frequently used by a user, an icon 711 of an application designated by the user, or a folder 712 including at least one icon.
  • The electronic device 101 may generate a pop-up window in response to a user input 722 that moves an icon of a specific application (e.g., a second application) included in the edge panel 701 to the designated area 702.
  • the generated pop-up window 320 may include an execution screen of the specific application (eg, a second application) selected by the user.
  • FIG. 8 is an example for explaining a method of generating a pop-up window according to another embodiment.
  • the electronic device 101 may display a task switching screen in response to a user input 841 related to task switching.
  • The task switching screens 821, 822, and 823 may include a recent list of recently used applications or a recommendation list of recommended tasks.
  • the area related to each task may include a screenshot (or a shortcut, a thumbnail) related to the application, and an icon of the application.
  • The electronic device 101 may display an option menu 832 in response to a user input 842 selecting an icon 831 of an application on the task switching screens 821, 822, and 823.
  • the option menu 832 may include buttons for functions such as information on a corresponding application, opening in a split screen, viewing as a pop-up screen, or locking an app.
  • the electronic device 101 may generate the pop-up window 320 in response to receiving a user input 842 for selecting a button related to the “view pop-up screen”.
  • the pop-up window 320 may include an execution screen related to a task or a specific application selected by the user.
  • FIGS. 9 and 10 are examples for explaining a method of arranging pop-up windows by the electronic device 101 according to an exemplary embodiment.
  • Referring to FIG. 9, while displaying the first pop-up window 320 (e.g., in the state of FIG. 3), the electronic device 101 may receive a second user input for moving the first pop-up window 320 to the boundary area 911 of the main screen 310.
  • The second user input may be an input for moving the first pop-up window 320 to an area overlapping the boundary area 911 of the main screen 310. For example, when the position to which the first pop-up window 320 is moved (e.g., a liftoff position of the first pop-up window 320) overlaps the boundary area 911, the electronic device 101 may determine that the second user input has been detected.
  • Alternatively, the second user input may be an input that moves a part of the first pop-up window 320 (e.g., a right edge) to within a specified distance of the boundary area 911 of the main screen 310. For example, when the position to which the first pop-up window 320 is moved (e.g., the liftoff position of the first pop-up window 320) is within the specified distance of the boundary area 911, the electronic device 101 may determine that the second user input has been detected.
  • According to various embodiments, while displaying the first pop-up window 320 (e.g., in the state of FIG. 3), the electronic device 101 may move the position of the first pop-up window 320 based on a user input that touches and drags at least a portion of the first pop-up window 320 (e.g., the handler 411 of FIG. 4).
  • According to various embodiments, the electronic device 101 may adjust the size of the first pop-up window 320 to a designated size based on the second user input, and may align the resized first pop-up window 320 in a designated area.
  • For example, the electronic device 101 may adjust the size of the pop-up window 320 to correspond to the size of the first sub-area 312a or the second sub-area 312b based on the second user input, and may align the resized pop-up window 320 in a designated area including the boundary area 911.
  • the designated area may be an area overlapping with the first sub-area 312a or an area overlapping with the second sub-area 312b.
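  • The liftoff test and the snap-and-resize step can be sketched in Kotlin as follows; the rectangle type, the gap computation, and the 48-pixel snap distance are illustrative assumptions rather than values taken from the disclosure.

        // Illustrative snap-and-resize on liftoff: if the dragged pop-up window overlaps, or lies
        // within a specified distance of, the boundary area, it is resized to the sub-area bounds.
        data class Rect(val left: Int, val top: Int, val right: Int, val bottom: Int)

        fun overlapsOrNear(window: Rect, boundary: Rect, snapDistancePx: Int): Boolean {
            val horizontalGap = maxOf(boundary.left - window.right, window.left - boundary.right, 0)
            val verticalGap = maxOf(boundary.top - window.bottom, window.top - boundary.bottom, 0)
            return horizontalGap <= snapDistancePx && verticalGap <= snapDistancePx
        }

        fun alignOnLiftoff(window: Rect, boundary: Rect, targetSubArea: Rect,
                           snapDistancePx: Int = 48): Rect =
            if (overlapsOrNear(window, boundary, snapDistancePx)) targetSubArea // snap: adopt the sub-area bounds
            else window                                                         // otherwise leave the pop-up floating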
  • FIG. 11 is a detailed example of an aligned pop-up window.
  • Referring to FIG. 11, a handler 321 for controlling a function related to the aligned first pop-up window 320 may be displayed.
  • When the electronic device 101 detects a user input 1121 for selecting the handler 321, it may display a quick option menu 1110.
  • the quick option menu 1110 may include a plurality of icons 1111, 1112, and 1113 for executing functions related to the arranged first pop-up window 320.
  • The plurality of icons 1111, 1112, and 1113 may include a sixth icon 1111 for returning the aligned first pop-up window 320 to its previous position, a seventh icon 1112 (e.g., the fourth icon 326 of FIG. 4) for expanding the first pop-up window 320 to the full screen of the display 210, or an eighth icon 1113 (e.g., the fifth icon 327 of FIG. 4) for closing the pop-up window 320.
  • FIGS. 12 and 13 are examples illustrating a location of a pop-up window according to various embodiments.
  • the electronic device 101 may arrange a pop-up window (eg, 320 of FIG. 3) at various positions.
  • Referring to FIG. 12, the electronic device 101 may align the first pop-up window 320 in the designated sub-region 1220 above the main screen 310 based on a user input 1231 for moving the first pop-up window 320 to the border area 1220 located above the main screen 310.
  • Referring to FIG. 13, the electronic device 101 may align the first pop-up window 320 in the designated sub-region 1310 on the left side of the main screen 310 based on a user input 1331 for moving the first pop-up window 320 to the border area 1320 located on the left side of the main screen 310.
  • Referring to FIG. 14, according to various embodiments, when the first pop-up window 320 is moved to the boundary area (e.g., 911 in FIG. 9) while the first pop-up window 320 is displayed (e.g., in the state of FIG. 3), the electronic device 101 may determine the position at which the first pop-up window 320 is aligned based on a result of comparing the relative positions of the half line 1411 of the main screen 310 and the half line 1412 of the first pop-up window 320.
  • For example, the electronic device 101 may compare the relative positions of a first half line 1411 crossing the center of the main screen 310 and a second half line 1412 crossing the center of the first pop-up window 320 moved to the border area.
  • When the second half line 1412 is positioned above the first half line 1411, the electronic device 101 may align the resized first pop-up window 320 in a first designated area overlapping the upper area of the sub area 312 (e.g., the first sub-area 312a).
  • Conversely, when the second half line 1412 is positioned below the first half line 1411, the electronic device 101 may align the resized first pop-up window 320 in a second designated area overlapping the lower area of the sub area 312 (e.g., the second sub-area 312b).
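  • In Kotlin, the half-line comparison reduces to comparing vertical centers, as in the minimal sketch below; screen coordinates are assumed to grow downward, and the slot names are illustrative labels for the two sub-areas.

        // Illustrative choice between the upper and lower sub-area by comparing the half line
        // (vertical center) of the moved pop-up window with the half line of the main screen.
        data class Rect(val left: Int, val top: Int, val right: Int, val bottom: Int) {
            val centerY get() = (top + bottom) / 2
        }

        enum class Slot { FIRST_SUB_AREA, SECOND_SUB_AREA } // cf. 312a (upper) and 312b (lower)

        fun chooseSlot(mainScreen: Rect, movedPopup: Rect): Slot =
            if (movedPopup.centerY < mainScreen.centerY) Slot.FIRST_SUB_AREA // pop-up half line above screen half line
            else Slot.SECOND_SUB_AREA                                        // otherwise align in the lower area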
  • FIG. 15 is an example of a process of adding a new pop-up window by the electronic device 101 according to an embodiment.
  • Referring to FIG. 15, while the first pop-up window 320 is aligned (e.g., in the state of FIG. 10), the electronic device 101 may display a new pop-up window (e.g., a second pop-up window 1510) over the main screen 310 and the first pop-up window 320 based on a fourth user input related to the generation of the new pop-up window.
  • the electronic device 101 may display the second pop-up window 1510 through a third layer disposed on the second layer based on a fourth user input.
  • the fourth user input for generating the second pop-up window 1510 may be the same as or similar to the first user input.
  • For example, the second pop-up window 1510 may include an execution screen of a third application.
  • the fourth user input may be the same as or similar to the method described in FIGS. 7 and 8.
  • FIGS. 16 and 17 are examples for explaining a method of arranging a new pop-up window by the electronic device 101 according to an exemplary embodiment.
  • Referring to FIGS. 16 and 17, while displaying the second pop-up window 1510, the electronic device 101 may receive a fifth user input for moving the second pop-up window 1510 to the border area (e.g., 911 in FIG. 9).
  • The fifth user input may be an input for moving the second pop-up window 1510 to an area overlapping the boundary area (e.g., 911 in FIG. 9) of the main screen (e.g., 310 in FIG. 3). For example, when the position to which the second pop-up window 1510 is moved overlaps the boundary area 911 of the main screen 310, the electronic device 101 may determine that the fifth user input has been detected.
  • Alternatively, the fifth user input may be an input that moves a portion (e.g., a right edge) of the second pop-up window 1510 to within a specified distance of the boundary area 911 of the main screen 310. For example, when the position to which the second pop-up window 1510 is moved is within the specified distance of the boundary area 911 of the main screen 310, the electronic device 101 may determine that the fifth user input has been detected.
  • According to various embodiments, while displaying the second pop-up window 1510 (e.g., in the state of FIG. 15), the electronic device 101 may move the position of the second pop-up window 1510 based on a user input that touches and drags at least a portion of the second pop-up window 1510 (e.g., a handler).
  • According to various embodiments, the electronic device 101 may adjust the size of the second pop-up window 1510 to a specified size based on the fifth user input, and may align the resized second pop-up window 1510 in a designated area.
  • For example, the electronic device 101 may adjust the size of the second pop-up window 1510 to correspond to the size of the first sub-area 312a or the second sub-area 312b based on the fifth user input, and may align the resized second pop-up window 1510 in a designated area including the boundary area 911.
  • the designated area may be an area overlapping with the first sub-area 312a or an area overlapping with the second sub-area 312b.
  • For example, when the first pop-up window 320 is disposed in the second sub-area 312b, the electronic device 101 may align the second pop-up window 1510 in the first sub-area 312a in response to the fifth user input. That is, when there is only one previously generated pop-up window, the electronic device 101 may align the added pop-up window over the sub area so that it does not overlap the position of the previously generated pop-up window.
  • According to various embodiments, regardless of the number of previously generated pop-up windows, the electronic device 101 may determine the alignment position of the second pop-up window 1510 based on a result of comparing the relative positions of the half line of the second pop-up window 1510 and the half line of the main screen 310.
  • In this case, based on the position to which the second pop-up window 1510 is moved according to the fifth user input (e.g., the liftoff position of the second pop-up window 1510), the second pop-up window 1510 may be displayed over the previously generated pop-up window (e.g., the first pop-up window 320).
  • The operation of aligning the second pop-up window 1510, based on the position to which it is moved according to the fifth user input (e.g., its liftoff position), so that it overlaps the first sub-area 312a or the second sub-area 312b may be the same as or similar to the operation of aligning the first pop-up window 320 described with reference to FIG. 14.
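  • A minimal Kotlin sketch of the slot choice for a newly aligned pop-up window is given below; the Slot names and the rule of falling back to the half-line comparison are illustrative assumptions.

        // Illustrative placement of a newly aligned pop-up window: with exactly one pop-up already
        // aligned, take the unoccupied sub-area; otherwise fall back to the half-line comparison.
        enum class Slot { FIRST_SUB_AREA, SECOND_SUB_AREA }

        fun other(slot: Slot): Slot =
            if (slot == Slot.FIRST_SUB_AREA) Slot.SECOND_SUB_AREA else Slot.FIRST_SUB_AREA

        fun placeNewPopup(occupied: List<Slot>, halfLineChoice: Slot): Slot =
            if (occupied.size == 1) other(occupied.first()) // avoid covering the existing aligned pop-up
            else halfLineChoice                             // otherwise use the liftoff-position rule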
  • 18 is an example in which the position of an aligned pop-up window changes when the electronic device 101 rotates according to an embodiment.
  • the electronic device 101 may sense the rotation of the electronic device 101 in a state in which the first pop-up window 320 is aligned (eg, the state of FIG. 10 ).
  • the electronic device 101 may sense the rotation of the electronic device 101 based on the direction of gravity (eg, G of FIG. 18) using an acceleration sensor.
  • In a general portrait mode, the edges of the display 210 may be defined as including a first edge 1811 located in the direction of gravity (G), a second edge 1812 facing the first edge 1811, a third edge 1813 located to the left of the first edge 1811, and a fourth edge 1814 located to the right of the first edge 1811.
  • When the electronic device 101 rotates in the counterclockwise direction 1821, the third edge 1813 is positioned in the direction of gravity, and the electronic device 101 may switch the screen of the display 210 to a so-called landscape mode.
  • When the electronic device 101 detects rotation of the electronic device 101 while the first pop-up window 320 is aligned (e.g., the state of FIG. 10), the electronic device 101 may change the position of the designated area in which the first pop-up window 320 is arranged based on the rotation direction, and may rotate and display the first pop-up window 320 through the changed designated area.
  • For example, the first pop-up window 320 arranged close to the fourth edge 1814 may be rearranged close to the first edge 1811.
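A hedged sketch of this rotation handling follows: a coarse orientation is derived from the gravity components reported by an acceleration sensor, and the edge that anchors the aligned pop-up is remapped. Only the fourth-edge-to-first-edge case is stated in the text above; the remaining remappings, the enum names, and the axis convention are assumptions made for illustration.

```kotlin
import kotlin.math.abs

enum class Edge { FIRST_1811, SECOND_1812, THIRD_1813, FOURTH_1814 }
enum class Orientation { PORTRAIT, LANDSCAPE_CCW }

// gx, gy: gravity components along the display's x and y axes in the portrait frame.
// If gravity is mostly along y, the device is upright; otherwise assume a CCW rotation.
fun orientationFromGravity(gx: Float, gy: Float): Orientation =
    if (abs(gy) >= abs(gx)) Orientation.PORTRAIT else Orientation.LANDSCAPE_CCW

// Remap the anchoring edge of the aligned pop-up when the device enters landscape mode.
fun remapAnchor(edge: Edge, orientation: Orientation): Edge = when (orientation) {
    Orientation.PORTRAIT -> edge
    Orientation.LANDSCAPE_CCW -> when (edge) {
        Edge.FOURTH_1814 -> Edge.FIRST_1811   // the case described in the text
        Edge.FIRST_1811 -> Edge.THIRD_1813    // assumed cyclic remapping
        Edge.THIRD_1813 -> Edge.SECOND_1812
        Edge.SECOND_1812 -> Edge.FOURTH_1814
    }
}
```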
  • 19 to 21 are examples illustrating a method of converting a pop-up window into a split screen by the electronic device 101 according to an exemplary embodiment.
  • In a state in which the first pop-up window 320 is aligned (e.g., the state of FIG. 10), the electronic device 101 may receive a third user input 1911 for converting the first pop-up window 320 into a split screen.
  • Converting the first pop-up window 320 into a split screen may mean displaying the execution screen of the application displayed by the first pop-up window 320 in the sub area, in the form of a split window, such that the main screen and the execution screen of the application displayed by the first pop-up window 320 together constitute the entire screen of the display 210.
  • The third user input 1911 may be a touch input of dragging at least a portion of the first pop-up window 320 disposed in the second sub-area 312b, or its handler, in a designated direction (e.g., downward or upward).
  • Alternatively, the third user input 2111 may be a touch input of dragging at least a portion of the first pop-up window 320 disposed in the first sub-area 312a, or its handler, in a designated direction (e.g., downward or upward).
  • The electronic device 101 may close the pop-up window based on the third user input 1911 and drive the display 210 in a divided state.
  • Based on the third user input 1911, the electronic device 101 may divide the main screen 310 in software into the main area 311 and a sub area (e.g., 312 of FIG. 3) including the boundary area (e.g., 911 of FIG. 9), and may display the first sub-screen (e.g., 'B' of FIG. 10) included in the first pop-up window 320 through the entire sub area (e.g., the area including the first sub-area 312a and the second sub-area 312b). Accordingly, the electronic device 101 may convert the display 210 from a pop-up window state in which a plurality of layers are displayed to a state in which a single layer is displayed.
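The change from a layered pop-up presentation to a single split layer can be modeled as a state conversion. The sketch below is illustrative only; the data model and application labels are assumptions, not structures defined by the patent.

```kotlin
// Before the third user input: a base layer (main screen) plus a pop-up overlay.
// After it: a single layer split into a main area and a sub area.
sealed class DisplayState {
    data class PopupOverBase(val baseApp: String, val popupApp: String) : DisplayState()
    data class SplitSingleLayer(val mainApp: String, val subApp: String) : DisplayState()
}

fun convertPopupToSplit(state: DisplayState.PopupOverBase): DisplayState.SplitSingleLayer =
    // The pop-up's execution screen takes the sub area; the base screen keeps the main area.
    DisplayState.SplitSingleLayer(mainApp = state.baseApp, subApp = state.popupApp)
```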
  • While driving the display 210 in the divided state, the electronic device 101 may display an adjustment icon 2011 for adjusting the size of the sub area 312 at the boundary between the main area 311 and the sub area 312.
  • the electronic device 101 may adjust the size of the main area 311 and the size of the sub area 312 based on a user input 2021 moving the adjustment icon 2011 left or right.
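A minimal sketch of the adjustment-icon interaction: dragging the divider by a horizontal offset resizes the main and sub areas in opposite directions. Pixel units, the minimum-width clamp, and the example values are assumptions introduced for illustration.

```kotlin
// Widths are in pixels; the sub area occupies whatever the main area does not.
data class SplitLayout(val displayWidth: Int, val mainWidth: Int) {
    val subWidth: Int get() = displayWidth - mainWidth
}

// Moving the adjustment icon right enlarges the main area; moving it left enlarges the sub area.
fun dragAdjustmentIcon(layout: SplitLayout, dxPx: Int, minWidthPx: Int = 200): SplitLayout {
    val newMain = (layout.mainWidth + dxPx)
        .coerceIn(minWidthPx, layout.displayWidth - minWidthPx)
    return layout.copy(mainWidth = newMain)
}

fun main() {
    var layout = SplitLayout(displayWidth = 1440, mainWidth = 960)
    layout = dragAdjustmentIcon(layout, dxPx = -120)
    println("main=${layout.mainWidth}, sub=${layout.subWidth}") // main=840, sub=600
}
```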
  • 22 and 23 are examples illustrating a method of converting any one of a plurality of pop-up windows into a split screen by the electronic device 101 according to an exemplary embodiment.
  • In a state in which a plurality of pop-up windows are arranged and displayed (e.g., the state of FIG. 17), the electronic device 101 may switch any one of the plurality of pop-up windows to a split screen based on a user input.
  • For example, the electronic device 101 may display the first pop-up window 320 on the second sub-area 312b of the main screen 310 (e.g., the execution screen of the first application), and may display the second pop-up window 1510 on the first sub-area 312a of the main screen 310.
  • Based on a user input 2211 of touching and dragging at least a portion of the second pop-up window 1510, or its handler, in a specified direction (e.g., downward or upward), the electronic device 101 may display the second sub-screen (e.g., 'C' of FIG. 23) included in the second pop-up window 1510 through the entire sub area (e.g., the area of 312 of FIG. 3 including the first sub-area 312a and the second sub-area 312b).
  • Even if the second sub-screen (e.g., 'C' of FIG. 23) included in the second pop-up window 1510 is switched to a split screen, the state in which the first pop-up window 320 is displayed on the second sub-area 312b of the main screen 310 (e.g., the execution screen of the first application) may be maintained.
  • 24 and 25 are examples illustrating a process of adding a new pop-up window in a divided window state by the electronic device 101 according to an exemplary embodiment.
  • While driving the display 210 in the divided state (e.g., the state of FIG. 20), the electronic device 101 may display a third pop-up window 2410 over the first layer, which is divided into the main screen 310 and the sub screen, based on a user input related to the creation of a new pop-up window.
  • the electronic device 101 may display the third pop-up window 2410 through a second layer disposed on the first layer based on a user input.
  • the third pop-up window 2410 may include an execution screen of the fourth application.
  • The user input for generating the third pop-up window 2410 may be the same as or similar to the first user input, or to the method described with reference to FIGS. 7 and 8.
  • the electronic device 101 may receive a user input 2421 for moving the third pop-up window 2410 to the boundary area.
  • the user input 2421 may be an input for moving the third pop-up window 2410 to an area overlapping the boundary area 911 of the main screen 310.
  • When the position to which the third pop-up window 2410 is moved is an area overlapping the boundary area 911 of the main screen 310, it may be determined that the user input 2421 has been detected.
  • Alternatively, the user input 2421 may be an input in which a part (e.g., a right edge) of the third pop-up window 2410 moves close to the boundary area 911 of the main screen 310 within a specified distance.
  • When the position to which the third pop-up window 2410 is moved (e.g., the lift-off position of the third pop-up window 2410) is within a specified distance of the boundary area 911 of the main screen 310, it may be determined that the user input 2421 has been detected.
  • Based on the user input 2421, the electronic device 101 may adjust the size of the third pop-up window 2410 to a specified size and arrange the adjusted third pop-up window 2410 in a designated area. For example, the electronic device 101 may adjust the size of the third pop-up window 2410 to correspond to the size of the first sub-area 312a or the second sub-area 312b based on the user input, and may display the adjusted third pop-up window 2410 over the second sub-screen being displayed through the entire sub area (e.g., the area including the first sub-area 312a and the second sub-area 312b) in the divided state.
  • The display may then be switched to a divided state in which the third sub-screen (e.g., 'D' of FIG. 24) of the third pop-up window 2410 is displayed through the first sub-area (e.g., 312a of FIG. 3) and the second sub-screen (e.g., 'C' of FIG. 24) of the second pop-up window 1510 is displayed through the second sub-area (e.g., 312b of FIG. 3).
  • Accordingly, the electronic device 101 may be converted to a split screen (e.g., a three-way split screen) in which the execution screen of the first application is displayed through the main area 311, the execution screen of the second application is displayed through the first sub-area 312a, and the execution screen of the third application is displayed through the second sub-area 312b, such that the execution screens of the first to third applications constitute the entire screen of the display 210.
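The resulting three-way arrangement can be captured with a tiny data model, as sketched below under the assumption (illustrative only) that the newly added pop-up takes the first sub-area and the previously shown sub-screen moves to the second sub-area, as in the FIG. 24/25 example.

```kotlin
// One execution screen per region of the three-way split.
data class ThreeWaySplit(
    val mainApp: String,      // shown through the main area 311
    val firstSubApp: String,  // shown through the first sub-area 312a
    val secondSubApp: String  // shown through the second sub-area 312b
)

fun addPopupToSplit(mainApp: String, currentSubApp: String, newPopupApp: String): ThreeWaySplit =
    // The new pop-up's screen goes to 312a; the screen previously filling the sub area moves to 312b.
    ThreeWaySplit(mainApp = mainApp, firstSubApp = newPopupApp, secondSubApp = currentSubApp)
```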
  • 26 is an example of a process in which a position of a divided window is changed when the electronic device 101 is rotated according to an exemplary embodiment.
  • the electronic device 101 may sense the rotation of the electronic device 101 while driving the display 210 in the divided state (eg, the state of FIG. 20 ).
  • the electronic device 101 may sense the rotation of the electronic device 101 based on the direction of gravity (eg, G of FIG. 18) using an acceleration sensor.
  • In a general portrait mode, the electronic device 101 may be defined as having a first edge 1811 positioned in the direction of gravity, a second edge 1812 facing the first edge 1811, a third edge 1813 positioned to the left of the first edge 1811, and a fourth edge 1814 positioned to the right of the first edge 1811.
  • When the electronic device 101 rotates in the counterclockwise direction 2621, the third edge 1813 is positioned in the direction of gravity, and the electronic device 101 may switch the screen of the display 210 to a so-called landscape mode.
  • When the electronic device 101 detects rotation of the electronic device 101 while driving the display 210 in the divided state (e.g., the state of FIG. 20), the locations of the main area 311 and the sub area may be changed, and the main screen 310 and the sub screen may be displayed through the changed locations.
  • For example, the sub-screen (e.g., 'C') arranged close to the fourth edge 1814 may be displayed at a changed location after the rotation.
  • 27 is a flowchart illustrating a method of driving an electronic device 101 according to various embodiments of the present disclosure.
  • While displaying the main screen 310, the electronic device 101 may display the first pop-up window 320 on the main screen 310 based on a first user input related to the generation of a pop-up window.
  • the electronic device 101 may display the first pop-up window 320 based on a first user input related to the creation of the pop-up window while displaying the main screen 310.
  • The electronic device 101 may display the main screen 310 through a first layer and, when receiving the first user input, may display the first pop-up window 320 through a second layer disposed on the first layer.
  • the first sub-screen included in the first pop-up window 320 may be displayed on the main screen 310.
  • the main screen 310 may be a home screen of the electronic device 101 or an execution screen of a specific application (eg, a first application).
  • the first sub-screen included in the first pop-up window 320 may be an execution screen of another application (eg, a second application).
  • The electronic device 101 may arrange the first pop-up window 320 in a designated area including the boundary area based on a second user input. For example, as illustrated in FIGS. 9 and 10, in a state in which the first pop-up window 320 is displayed (e.g., the state of FIG. 3), the electronic device 101 may move the first pop-up window 320 to the boundary area. The second user input may be an input of moving the first pop-up window 320 to, or close to, the boundary area.
  • The electronic device 101 may adjust the size of the first pop-up window to a specified size based on the second user input, and may arrange the adjusted first pop-up window 320 in a designated area.
  • For example, the electronic device 101 may adjust the size of the pop-up window to correspond to the size of the first sub-area 312a or the second sub-area 312b based on the second user input, and the resized pop-up window may be arranged in a designated area including the boundary area.
  • the designated area may be an area overlapping with the first sub-area 312a or an area overlapping with the second sub-area 312b.
  • Based on a third user input for converting the arranged first pop-up window 320 into a split screen, the electronic device 101 may close the first pop-up window 320 and drive the display 210 in a divided state. For example, as illustrated in FIGS. 19 and 20, in a state in which the first pop-up window 320 is aligned (e.g., the state of FIG. 10), the electronic device 101 may receive a third user input 1911 for converting the first pop-up window 320 into a split screen.
  • The third user input may be a touch input of dragging at least a portion of the first pop-up window 320 disposed in the second sub-area 312b, or its handler, in a designated direction (e.g., downward or upward).
  • the electronic device 101 may close the pop-up window based on the third user input 1911 and drive the display 210 in a divided state.
  • Based on the third user input 1911, the electronic device 101 may divide the main screen 310 in software into the main area 311 and a sub area including the boundary area, and may display the first sub-screen (e.g., the execution screen of the second application) included in the first pop-up window 320 through the entire sub area (e.g., the area including the first sub-area 312a and the second sub-area 312b).
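Read end to end, the FIG. 27 flow is driven by three user inputs. The sketch below models it as a small state machine; the state and event names are illustrative labels, not terms defined by the patent, and inputs that do not apply to the current state simply leave it unchanged.

```kotlin
enum class UiEvent { FIRST_INPUT_CREATE_POPUP, SECOND_INPUT_MOVE_TO_BOUNDARY, THIRD_INPUT_DRAG_HANDLER }

sealed class ScreenState {
    object MainOnly : ScreenState()      // main screen fills the display
    object PopupShown : ScreenState()    // first pop-up window over the main screen
    object PopupAligned : ScreenState()  // pop-up resized and snapped into the designated area
    object SplitScreen : ScreenState()   // pop-up closed, display driven in the divided state
}

fun next(state: ScreenState, event: UiEvent): ScreenState = when (state to event) {
    ScreenState.MainOnly to UiEvent.FIRST_INPUT_CREATE_POPUP -> ScreenState.PopupShown
    ScreenState.PopupShown to UiEvent.SECOND_INPUT_MOVE_TO_BOUNDARY -> ScreenState.PopupAligned
    ScreenState.PopupAligned to UiEvent.THIRD_INPUT_DRAG_HANDLER -> ScreenState.SplitScreen
    else -> state  // other combinations are ignored in this sketch
}
```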
  • FIG. 28 is a flowchart illustrating a detailed method of arranging pop-up windows by the electronic device 101 according to various embodiments.
  • FIG. 28 may be a detailed flowchart related to operation 2720 of FIG. 27.
  • In operation 2810, the electronic device 101 may compare the relative positions of a first half line 1411 crossing the center of the main screen 310 and a second half line 1412 crossing the center of the first pop-up window 320 moved to the boundary area.
  • For example, in a state in which the first pop-up window 320 is displayed (e.g., the state of FIG. 3), the electronic device 101 may move the first pop-up window to the boundary area. When the second user input is released, the relative positions of the half line of the main screen 310 and the half line of the first pop-up window 320 may be compared, and the position in which the first pop-up window 320 is arranged may be determined based on the comparison result.
  • The electronic device 101 may compare the relative positions of the first half line 1411 crossing the center of the main screen 310 and the second half line 1412 crossing the center of the first pop-up window 320 moved to the boundary area.
  • Depending on the comparison result, the first pop-up window 320 may be arranged in a first designated area overlapping an upper area of the sub area (e.g., the second sub-area 312b).
  • Alternatively, depending on the comparison result, the first pop-up window 320 may be arranged in a designated area overlapping a lower area of the sub area (e.g., the second sub-area 312b).
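A minimal sketch of the half-line comparison in operation 2810 follows. The source does not state which side of the comparison maps to the upper or lower area, so the rule below (pop-up half line at or above the main screen's half line means the upper area) is an assumption, as are the type and enum names.

```kotlin
// "Half line" here is taken to be the horizontal line through a window's vertical midpoint.
data class WindowBounds(val top: Int, val bottom: Int) {
    val halfLine: Int get() = (top + bottom) / 2
}

enum class AlignedPosition { UPPER_AREA, LOWER_AREA }

fun alignedPositionFor(popup: WindowBounds, mainScreen: WindowBounds): AlignedPosition =
    if (popup.halfLine <= mainScreen.halfLine) AlignedPosition.UPPER_AREA  // assumed mapping
    else AlignedPosition.LOWER_AREA
```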
  • Electronic devices may be devices of various types.
  • the electronic device may include, for example, a portable communication device (eg, a smart phone), a computer device, a portable multimedia device, a portable medical device, a camera, a wearable device, or a home appliance.
  • Phrases such as “at least one of B or C” may include any one of, or all possible combinations of, the items listed together in the corresponding phrase.
  • Terms such as “first” and “second” may be used simply to distinguish a corresponding component from another corresponding component, and do not limit the corresponding components in other aspects (e.g., importance or order).
  • When a certain (e.g., first) component is referred to as being “coupled” or “connected” to another (e.g., second) component, with or without the terms “functionally” or “communicatively”, it means that the component can be connected to the other component directly (e.g., by wire), wirelessly, or via a third component.
  • The term “module” used in this document may include a unit implemented in hardware, software, or firmware, and may be used interchangeably with terms such as logic, logic block, part, or circuit.
  • the module may be an integrally configured component or a minimum unit of the component or a part thereof that performs one or more functions.
  • the module may be implemented in the form of an application-specific integrated circuit (ASIC).
  • Various embodiments of the present document may be implemented as software (e.g., the program 240) including one or more instructions stored in a storage medium (e.g., the internal memory 236 or the external memory 238) that can be read by a machine (e.g., the electronic device 201).
  • For example, the processor (e.g., the processor 220) of the device (e.g., the electronic device 201) may call and execute at least one instruction among the one or more instructions stored in the storage medium. This makes it possible for the device to be operated to perform at least one function according to the at least one instruction invoked.
  • the one or more instructions may include code generated by a compiler or code executable by an interpreter.
  • a storage medium that can be read by a device may be provided in the form of a non-transitory storage medium.
  • Here, “non-transitory” only means that the storage medium is a tangible device and does not contain a signal (e.g., an electromagnetic wave); this term does not distinguish between a case where data is semi-permanently stored in the storage medium and a case where data is temporarily stored.
  • a method according to various embodiments disclosed in the present document may be provided by being included in a computer program product.
  • Computer program products can be traded between sellers and buyers as commodities.
  • A computer program product may be distributed in the form of a device-readable storage medium (e.g., a compact disc read-only memory (CD-ROM)), or may be distributed (e.g., downloaded or uploaded) online through an application store (e.g., Play Store™) or directly between two user devices (e.g., smartphones).
  • at least a portion of the computer program product may be temporarily stored or temporarily generated in a storage medium that can be read by a device such as a server of a manufacturer, a server of an application store, or a memory of a relay server.
  • Each component (e.g., a module or a program) of the above-described components may include a single entity or multiple entities.
  • one or more components or operations among the above-described corresponding components may be omitted, or one or more other components or operations may be added.
  • According to various embodiments, a plurality of components (e.g., modules or programs) may be integrated into a single component. In such a case, the integrated component may perform one or more functions of each of the plurality of components in the same or similar manner as that performed by the corresponding component among the plurality of components prior to the integration.
  • According to various embodiments, operations performed by a module, program, or other component may be executed sequentially, in parallel, repeatedly, or heuristically; one or more of the operations may be executed in a different order or omitted; or one or more other operations may be added.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Software Systems (AREA)
  • Human Computer Interaction (AREA)
  • Computer Graphics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Controls And Circuits For Display Device (AREA)

Abstract

According to various embodiments, the present invention relates to an electronic device which, while displaying a designated screen through the entire screen of a display, can overlay an execution screen of a first application on the designated screen using a first pop-up window on the basis of a first user input, move the first pop-up window on the basis of a second user input, when a distance between a boundary of the first pop-up window and a boundary of the display is less than or equal to a designated value, superimpose the first pop-up window on the designated screen in a first designated area of the display adjacent to the boundary of the display, display the execution screen of the first application as a first split screen in a second designated area of the display including the first designated area on the basis of a third user input relating to the first pop-up window, and display the designated screen in the form of a second split screen in a third designated area of the display excluding the second designated area.
PCT/KR2020/003130 2019-04-19 2020-03-05 Dispositif électronique pour afficher des écrans d'exécution d'une pluralité d'applications et son procédé de fonctionnement WO2020213834A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR10-2019-0046389 2019-04-19
KR1020190046389A KR20200122945A (ko) 2019-04-19 2019-04-19 복수의 어플리케이션의 실행화면들을 표시하는 전자 장치 및 그의 동작 방법

Publications (1)

Publication Number Publication Date
WO2020213834A1 true WO2020213834A1 (fr) 2020-10-22

Family

ID=72837379

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2020/003130 WO2020213834A1 (fr) 2019-04-19 2020-03-05 Dispositif électronique pour afficher des écrans d'exécution d'une pluralité d'applications et son procédé de fonctionnement

Country Status (2)

Country Link
KR (1) KR20200122945A (fr)
WO (1) WO2020213834A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023051354A1 (fr) * 2021-09-30 2023-04-06 华为技术有限公司 Procédé d'affichage à écran partagé et dispositif électronique

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102566913B1 (ko) * 2021-03-11 2023-08-14 주식회사 한글과컴퓨터 프레젠테이션 문서에 대한 슬라이드 쇼 실행시 개선된 페이지 전환 기능을 제공하는 전자 장치 및 그 동작 방법
EP4287631A4 (fr) 2021-08-02 2024-05-15 Samsung Electronics Co., Ltd. Procédé et dispositif de commande d'écran
KR20230019703A (ko) * 2021-08-02 2023-02-09 삼성전자주식회사 화면 제어 방법 및 장치


Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101373337B1 (ko) * 2011-10-26 2014-03-10 엘지전자 주식회사 이동 단말기 및 이의 제어방법
KR20130054071A (ko) * 2011-11-16 2013-05-24 삼성전자주식회사 다중 어플리케이션을 실행하는 모바일 장치 및 그 방법
KR20170040283A (ko) * 2014-07-31 2017-04-12 마이크로소프트 테크놀로지 라이센싱, 엘엘씨 애플리케이션 윈도우에 대한 동적 조인트 디바이더
KR20170058152A (ko) * 2015-11-18 2017-05-26 삼성전자주식회사 전자 장치 및 전자 장치의 디스플레이 설정 방법
WO2018213241A1 (fr) * 2017-05-15 2018-11-22 Apple Inc. Systèmes et procédés pour interagir avec de multiples applications qui sont affichées simultanément sur un dispositif électronique ayant un dispositif d'affichage tactile


Also Published As

Publication number Publication date
KR20200122945A (ko) 2020-10-28

Similar Documents

Publication Publication Date Title
WO2020213834A1 (fr) Dispositif électronique pour afficher des écrans d'exécution d'une pluralité d'applications et son procédé de fonctionnement
WO2019088667A1 (fr) Dispositif électronique pour reconnaître une empreinte digitale à l'aide d'un dispositif d'affichage
WO2019103396A1 (fr) Procédé de configuration d'interface d'entrée et dispositif électronique associé
WO2021261722A1 (fr) Dispositif électronique comprenant une unité d'affichage souple
WO2019035601A1 (fr) Appareil d'édition d'image utilisant une carte de profondeur et son procédé
WO2019147031A1 (fr) Dispositif électronique et procédé de commande d'affichage
WO2014148771A1 (fr) Terminal portable et procédé de fourniture d'effet haptique
WO2019182403A1 (fr) Procédé prenant en charge une entrée d'utilisateur et dispositif électronique prenant en charge ledit procédé
WO2020085704A1 (fr) Dispositif électronique pliable et son procédé d'affichage multi-étage de contenus
EP3695591A1 (fr) Dispositif électronique pour commander une pluralité d'applications
WO2020017743A1 (fr) Dispositif électronique comprenant une unité d'affichage sur laquelle est affiché un écran d'exécution pour de multiples applications, et procédé de fonctionnement du dispositif électronique
WO2019160347A1 (fr) Procédé de traitement d'entrée tactile et dispositif électronique prenant en charge ledit procédé
WO2020085628A1 (fr) Procédé d'affichage d'objets et dispositif électronique d'utilisation associé
WO2021054710A1 (fr) Procédé, dispositif électronique et support de stockage permettant d'afficher un état de charge en début de charge
WO2020246709A1 (fr) Dispositif électronique comprenant un dispositif d'affichage et procédé de correction d'une image affichée par le dispositif électronique
WO2020091538A1 (fr) Dispositif électronique pour afficher un écran par l'intermédiaire d'un panneau d'affichage en mode de faible puissance et son procédé de fonctionnement
WO2020111720A1 (fr) Dispositif électronique pliable et procédé d'affichage d'informations dans le dispositif électronique pliable
WO2018124823A1 (fr) Appareil d'affichage et son procédé de commande
WO2020013651A1 (fr) Dispositif électronique, et procédé pour la transmission d'un contenu du dispositif électronique
WO2019039734A1 (fr) Dispositif électronique comprenant un capteur et une ou plusieurs couches conductrices à exciter à l'aide d'un signal provenant d'un capteur
WO2022119276A1 (fr) Dispositif électronique d'affichage souple et procédé de fonctionnement associé
WO2021230568A1 (fr) Dispositif électronique permettant de fournir un service de réalité augmentée et son procédé de fonctionnement
WO2021133123A1 (fr) Dispositif électronique comprenant un écran flexible et son procédé de fonctionnement
WO2018143744A1 (fr) Dispositif d'affichage à détection tactile et procédé de commande de son écran
WO2020091491A1 (fr) Dispositif électronique de commande de position ou de zone d'affichage d'image en fonction d'un changement de contenu d'image

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20791954

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20791954

Country of ref document: EP

Kind code of ref document: A1