WO2021194252A1 - Electronic device and screen sharing method - Google Patents

Electronic device and screen sharing method

Info

Publication number
WO2021194252A1
Authority
WO
WIPO (PCT)
Prior art keywords
electronic device
area
screen
user
display
Application number
PCT/KR2021/003644
Other languages
English (en)
Korean (ko)
Inventor
신상민
임원배
강동훈
김나영
민조나단
신재경
전정환
임연욱
Original Assignee
삼성전자 주식회사
Application filed by 삼성전자 주식회사
Publication of WO2021194252A1

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M 1/00 Substation equipment, e.g. for use by subscribers
    • H04M 1/72 Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M 1/724 User interfaces specially adapted for cordless or mobile telephones
    • H04M 1/72403 User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality
    • H04M 1/72409 User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality by interfacing with external accessories
    • H04M 1/72412 User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality by interfacing with external accessories using two-way short-range wireless interfaces
    • H04M 1/72469 User interfaces specially adapted for cordless or mobile telephones for operating the device by selecting functions from two or more displayed items, e.g. menus or icons
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/033 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F 3/0354 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
    • G06F 3/03545 Pens or stylus
    • G06F 3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means

Definitions

  • Various embodiments of the present disclosure relate to a technology for sharing a region selected by a user in an electronic device with another electronic device.
  • With the development of wired/wireless communication technology, electronic devices that display a screen on a display and output data that can be visually recognized by a user may be connected to each other through a wired/wireless communication network.
  • Various data may be transmitted/received between electronic devices through a wired/wireless communication network, and, for example, a screen may be shared between electronic devices.
  • a screen sharing technology such as a mirroring technology or a streaming technology may be used.
  • Screen mirroring or screen casting technology allows the same or a corresponding screen of an electronic device such as a smartphone or a PC to be viewed on the display of another electronic device such as a TV, a beam projector, or a smartphone.
  • In mirroring, the source electronic device compresses its screen data and transmits the compressed screen data to the synchronization electronic device through a wired/wireless network, and the synchronization electronic device decodes the compressed screen data so that the same screen as that of the source electronic device may be displayed on the display of the synchronization electronic device.
  • the screen streaming technology is a technology in which image content of a source electronic device is transmitted to a synchronization electronic device in real time, and the synchronization electronic device displays image content obtained from the source electronic device in real time.
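  • As a hedged illustration of the mirroring and streaming flows described above (the interfaces below are hypothetical and chosen only for this sketch, not taken from the patent), the source side compresses each captured frame and transmits it, and the synchronization side decodes and displays it:

```kotlin
// Minimal sketch of the compress -> transmit -> decode -> display loop.
// FrameCodec and Transport are illustrative abstractions, not real APIs.
interface FrameCodec {
    fun compress(frame: ByteArray): ByteArray
    fun decompress(data: ByteArray): ByteArray
}

interface Transport {
    fun send(data: ByteArray)
    fun receive(): ByteArray
}

class SourceDevice(private val codec: FrameCodec, private val transport: Transport) {
    // Compress the captured screen data and push it over the wired/wireless network.
    fun shareFrame(screenFrame: ByteArray) = transport.send(codec.compress(screenFrame))
}

class SynchronizationDevice(private val codec: FrameCodec, private val transport: Transport) {
    // Decode the received data and render the same screen as the source.
    fun showNextFrame(display: (ByteArray) -> Unit) = display(codec.decompress(transport.receive()))
}
```

  • In the streaming case the loop is the same, except that the transmitted payload is the image content itself rather than a capture of the source screen.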
  • In conventional screen sharing between electronic devices, a user can designate an image to share and an application to share, but it is difficult to arbitrarily set an area for sharing. For example, the user had to share the entire screen of the user terminal, or share the screen with another application (eg, a blank memo or a web browser) covering an area that the user does not want to share.
  • According to various embodiments, an electronic device and method may be provided in which a specific area selected by a user can be shared or recorded, and the shared or recorded area can be changed, by moving or resizing the specified area, according to a change in the contents on the screen or the movement/change of an object.
  • An electronic device for sharing a screen may include: a wireless communication circuit; a display; a memory in which instructions are stored; and a processor electrically connected to the wireless communication circuit, the display, and the memory, wherein the processor, upon execution of the instructions, reproduces a video screen on the display, obtains a first user input specifying a first area corresponding to a portion of the video screen being reproduced, and, after the first area is specified, transmits screen data reproduced in the first area of the video screen to an external device in real time through the wireless communication circuit.
  • a method of sharing a screen in an electronic device may include reproducing a video screen on a display; obtaining a first user input for specifying a first area corresponding to a part of the video screen being reproduced; and after the first area is specified, transmitting screen data reproduced in the first area of the video screen to an external device through the wireless communication circuit in real time.
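  • A minimal sketch of this flow is shown below (all names are illustrative and frames are assumed to arrive as rows of packed RGB pixels): nothing is shared until a first area is specified, after which every reproduced frame is cropped to that area and handed to the wireless transmit path in real time.

```kotlin
// Illustrative region-based sharing controller; not the patent's implementation.
data class Region(val left: Int, val top: Int, val right: Int, val bottom: Int)

class ScreenShareController(
    private val sendToExternalDevice: (ByteArray) -> Unit  // stands in for the wireless communication circuit
) {
    private var firstArea: Region? = null

    // First user input (finger or stylus) specifying the first area.
    fun onUserSelectsArea(area: Region) {
        firstArea = area
    }

    // Called for every frame of the video screen being reproduced.
    fun onVideoFrame(frame: Array<IntArray>) {
        val area = firstArea ?: return                   // nothing is shared until an area is specified
        sendToExternalDevice(encode(crop(frame, area)))  // real-time transmission of the first area only
    }

    private fun crop(frame: Array<IntArray>, r: Region): Array<IntArray> =
        (r.top until r.bottom).map { y -> frame[y].copyOfRange(r.left, r.right) }.toTypedArray()

    private fun encode(pixels: Array<IntArray>): ByteArray =
        pixels.flatMap { row ->
            row.flatMap { px -> listOf((px shr 16).toByte(), (px shr 8).toByte(), px.toByte()) }
        }.toByteArray()
}
```

  • Moving or resizing the area later (see FIGS. 7 and 8) only updates the stored region; the per-frame path is unchanged, so the shared stream follows the user's manipulation in real time.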
  • a user-specified partial area among a video screen being played on an electronic device may be shared with another electronic device in real time.
  • the size or location of a partial region of a video shared in real time by the electronic device may be changed based on a user's intention, and the changed region may be shared with other electronic devices in real time.
  • different portions of a screen of a video being reproduced on an electronic device may be shared in real time with a plurality of different electronic devices.
  • FIG. 1 is a diagram illustrating a screen sharing system according to an embodiment.
  • FIG. 2 is a block diagram of an electronic device 201 in a network environment 200 according to various embodiments of the present disclosure.
  • FIG. 3 is a block diagram illustrating a digital pen according to an exemplary embodiment.
  • FIG. 4 illustrates a method of sharing a screen in an electronic device according to an embodiment.
  • FIG. 5 is a diagram illustrating a state in which a screen is shared from an electronic device to another electronic device in real time according to an exemplary embodiment.
  • FIG. 6A is a diagram illustrating a state in which an image corresponding to a shared area is displayed in a partial area on a display screen of another electronic device, according to an exemplary embodiment.
  • FIG. 6B is a diagram illustrating a state in which an image corresponding to a shared area is displayed in the entire area on a display screen of another electronic device, according to an exemplary embodiment.
  • FIG. 7 is a diagram illustrating a state in which an area to be shared is moved in an electronic device according to an exemplary embodiment.
  • FIG. 8 is a diagram illustrating a state in which a size of a screen shared by an electronic device is changed according to an exemplary embodiment.
  • FIG. 9 is a diagram illustrating a state in which an electronic device shares a plurality of selection areas with a plurality of other electronic devices according to an exemplary embodiment.
  • FIG. 10 is a diagram illustrating a state in which another electronic device performs fast-forward (FF, Fast-Forward) or rewind (RW, Rewind) while sharing a screen from an electronic device to another electronic device, according to an exemplary embodiment.
  • FIG. 11A is a diagram illustrating a state in which a finger 1000 selects a screen shared in an electronic device according to an exemplary embodiment.
  • FIG. 11B is a diagram illustrating a state in which a screen shared in an electronic device is selected by the digital pen 301 according to an exemplary embodiment.
  • FIG. 12A is a diagram illustrating a user interface (UI) capable of sharing data of a sharing area with a plurality of other electronic devices existing within a specific range in an electronic device according to an exemplary embodiment.
  • FIG. 12B is a diagram illustrating a state in which the electronic device shares data of a sharing area with a plurality of other electronic devices existing within a specific range based on a digital pen, according to an exemplary embodiment.
  • FIG. 12C is a diagram illustrating a state in which an electronic device shares data of a sharing area with a plurality of other electronic devices existing within a specific range based on a finger, according to an exemplary embodiment.
  • FIG. 1 illustrates a screen sharing system according to an embodiment.
  • the electronic device 100 may share the partial area 110 of the screen output on the display 125 of the electronic device 100 with other electronic devices 101 , 102 , and 103 .
  • The area 110 may be an area that the user wants to share or record. For example, if the user wants to share or record only a first object among the objects output on the display 125, the user may select an area containing only the first object (eg, only one of the three people), as shown in FIG. 1.
  • the electronic device 100 may share a first object (eg, a region including only one of three people) with other electronic devices 101 , 102 , and 103 , and may record the region.
  • According to an embodiment, the electronic device 100 may share the area 110 included in the display 125 of the electronic device 100 with the other electronic device 101 using a wireless communication technology (eg, a cellular network) through a base station such as 3G/4G/5G.
  • According to an embodiment, the electronic device 100 may share the area 110 included in the display 125 of the electronic device 100 with another electronic device 102 using a wireless communication technology through an access point (AP), such as Wi-Fi.
  • According to an embodiment, the electronic device 100 may share the area 110 included in the display 125 of the electronic device 100 with another electronic device 103 using a wireless communication technology (eg, Wifi-Direct) through a short-range network such as Bluetooth or a device-to-device (D2D) connection.
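  • The three connection paths above can be treated as interchangeable transports. The sketch below is only illustrative; the enum values and the selection rule are assumptions, not taken from the patent.

```kotlin
// Illustrative transport choices corresponding to the paths described above.
enum class SharePath { CELLULAR_VIA_BASE_STATION, WIFI_VIA_ACCESS_POINT, DIRECT_D2D }

// Hypothetical selection rule: prefer a direct device-to-device link when the
// peer is nearby, then an access point, and fall back to the cellular network.
fun choosePath(peerIsNearby: Boolean, apAvailable: Boolean): SharePath = when {
    peerIsNearby -> SharePath.DIRECT_D2D
    apAvailable -> SharePath.WIFI_VIA_ACCESS_POINT
    else -> SharePath.CELLULAR_VIA_BASE_STATION
}
```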
  • The region 110 included in the display 125 of the electronic device 100 is not limited to the example illustrated in FIG. 1, and the position or size of the region 110 may vary according to necessity or circumstances.
  • the area 110 included in the display 125 of the electronic device 100 may include a first object (eg, one out of three).
  • the area 110 may be an area including the first object and the second object (eg, 2 out of 3), which is a wider area than the example shown in FIG. 1 including 1 out of 3 people.
  • the region 110 may be located at the upper end as in the example shown in FIG. 1 or at a lower region different from the example shown in FIG. 1 .
  • the region 110 may be any one of arbitrarily divided regions based on a user's intention among regions within the display screen.
  • the screen shared by the electronic device 100 may be displayed on the entire screen of the display of the other electronic devices 101 , 102 , 103 .
  • The shared screen displayed on the display of the other electronic devices 101, 102, and 103 may be changed according to a user manipulation of the other electronic devices 101, 102, and 103, or may be displayed on a partial area of the display instead of the entire screen according to a default setting, a device setting, or an application setting of the other electronic devices 101, 102, and 103.
  • The area in which the shared screen is displayed may be reduced from the entire area of the other electronic devices 101, 102, and 103 to a partial area, enlarged from a partial area to the entire area, or reduced/enlarged/changed from one partial area to another partial area.
  • According to an embodiment, the electronic device may share a first region (eg, the region 110 of FIG. 1), which is an area that the user wants to share, with another electronic device (eg, the other electronic devices 101, 102, and 103).
  • According to an embodiment, the area that the user wants to share may be a plurality of areas including a first area (eg, the area 170 of FIG. 9) and a second area (eg, the area 180 of FIG. 9), and the plurality of areas may be shared with another electronic device (eg, the other electronic devices 101, 102, and 103).
  • the electronic devices 100 , 101 , 102 , and 103 may have the same or similar structure to the electronic device 201 of FIG. 2 , which will be described later.
  • FIG. 2 is a block diagram of an electronic device 201 in a network environment 200, according to various embodiments.
  • Referring to FIG. 2, the electronic device 201 (eg, the electronic device 100 of FIG. 1) may communicate with the electronic device 202 through a first network 298 (eg, a short-range wireless communication network), or may communicate with the electronic device 204 or the server 208 through a second network 299 (eg, a remote wireless communication network). According to an embodiment, the electronic device 201 may communicate with the electronic device 204 through the server 208.
  • The electronic device 201 may include a processor 320, a memory 330, an input device 250, a sound output device 255, a display device 260, an audio module 270, a sensor module 276, an interface 277, a haptic module 279, a camera module 280, a power management module 288, a battery 389, a communication module 290, a subscriber identification module 296, or an antenna module 297. In some embodiments, at least one of these components (eg, the display device 260 or the camera module 280) may be omitted, or one or more other components may be added to the electronic device 201. In some embodiments, some of these components may be implemented as one integrated circuit. For example, the sensor module 276 (eg, a fingerprint sensor, an iris sensor, or an illuminance sensor) may be implemented while being embedded in the display device 260 (eg, a display).
  • The processor 320 may execute, for example, software (eg, a program 240) to control at least one other component (eg, a hardware or software component) of the electronic device 201 connected to the processor 320, and may perform various data processing or operations. According to an embodiment, as at least part of the data processing or operations, the processor 320 may load commands or data received from other components (eg, the sensor module 276 or the communication module 290) into the volatile memory 232, process the commands or data stored in the volatile memory 232, and store the resulting data in the non-volatile memory 234.
  • The processor 320 may include a main processor 221 (eg, a central processing unit or an application processor), and an auxiliary processor 223 (eg, a graphics processing unit, an image signal processor, a sensor hub processor, or a communication processor) that can operate independently of or together with the main processor 221. Additionally or alternatively, the auxiliary processor 223 may be configured to use less power than the main processor 221 or to be specialized for a designated function. The auxiliary processor 223 may be implemented separately from, or as a part of, the main processor 221.
  • The auxiliary processor 223 may, for example, control at least some of the functions or states related to at least one of the components of the electronic device 201 (eg, the display device 260, the sensor module 276, or the communication module 290) on behalf of the main processor 221 while the main processor 221 is in an inactive (eg, sleep) state, or together with the main processor 221 while the main processor 221 is in an active (eg, application execution) state.
  • the memory 330 may store various data used by at least one component (eg, the processor 320 or the sensor module 276 ) of the electronic device 201 .
  • the data may include, for example, input data or output data for software (eg, the program 240 ) and instructions related thereto.
  • the memory 330 may include a volatile memory 232 or a non-volatile memory 234 .
  • the program 240 may be stored as software in the memory 330 , and may include, for example, an operating system 242 , middleware 244 , or an application 246 .
  • the input device 250 may receive a command or data to be used in a component (eg, the processor 320 ) of the electronic device 201 from the outside (eg, a user) of the electronic device 201 .
  • the input device 250 may include, for example, a microphone, a mouse, a keyboard, or a digital pen (eg, a stylus pen).
  • the sound output device 255 may output a sound signal to the outside of the electronic device 201 .
  • the sound output device 255 may include, for example, a speaker or a receiver.
  • the speaker can be used for general purposes such as multimedia playback or recording playback, and the receiver can be used to receive incoming calls. According to one embodiment, the receiver may be implemented separately from or as part of the speaker.
  • the display device 260 may visually provide information to the outside (eg, a user) of the electronic device 201 .
  • the display device 260 may include, for example, a display, a hologram device, or a projector and a control circuit for controlling the corresponding device.
  • The display device 260 may include a touch circuit configured to sense a touch, or a sensor circuit (eg, a pressure sensor) configured to measure the intensity of a force generated by the touch.
  • The audio module 270 may convert a sound into an electrical signal or, conversely, convert an electrical signal into a sound. According to an embodiment, the audio module 270 may acquire a sound through the input device 250, or may output a sound through the sound output device 255 or an external electronic device (eg, the electronic device 202, such as a speaker or headphones) connected directly or wirelessly with the electronic device 201.
  • The sensor module 276 may detect an operating state (eg, power or temperature) of the electronic device 201 or an external environmental state (eg, a user state), and may generate an electrical signal or data value corresponding to the detected state.
  • the sensor module 276 may include, for example, a gesture sensor, a gyro sensor, a barometric pressure sensor, a magnetic sensor, an acceleration sensor, a grip sensor, a proximity sensor, a color sensor, an IR (infrared) sensor, a biometric sensor, It may include a temperature sensor, a humidity sensor, or an illuminance sensor.
  • the interface 277 may support one or more designated protocols that may be used to directly or wirelessly connect the electronic device 201 with an external electronic device (eg, the electronic device 202 ).
  • the interface 277 may include, for example, a high definition multimedia interface (HDMI), a universal serial bus (USB) interface, an SD card interface, or an audio interface.
  • the connection terminal 278 may include a connector through which the electronic device 201 can be physically connected to an external electronic device (eg, the electronic device 202 ).
  • the connection terminal 278 may include, for example, an HDMI connector, a USB connector, an SD card connector, or an audio connector (eg, a headphone connector).
  • the haptic module 279 may convert an electrical signal into a mechanical stimulus (eg, vibration or movement) or an electrical stimulus that the user can perceive through tactile or kinesthetic sense.
  • the haptic module 279 may include, for example, a motor, a piezoelectric element, or an electrical stimulation device.
  • the camera module 280 may capture still images and moving images.
  • the camera module 280 may include one or more lenses, image sensors, image signal processors, or flashes.
  • the power management module 288 may manage power supplied to the electronic device 201 .
  • The power management module 288 may be implemented as, for example, at least a part of a power management integrated circuit (PMIC).
  • the battery 389 may supply power to at least one component of the electronic device 201 .
  • battery 389 may include, for example, a non-rechargeable primary cell, a rechargeable secondary cell, or a fuel cell.
  • The communication module 290 may support establishment of a direct (eg, wired) communication channel or a wireless communication channel between the electronic device 201 and an external electronic device (eg, the electronic device 202, the electronic device 204, or the server 208), and communication through the established communication channel.
  • the communication module 290 may include one or more communication processors that operate independently of the processor 320 (eg, an application processor) and support direct (eg, wired) communication or wireless communication.
  • According to an embodiment, the communication module 290 may include a wireless communication module 292 (eg, a cellular communication module, a short-range communication module, or a global navigation satellite system (GNSS) communication module) or a wired communication module 294 (eg, a local area network (LAN) communication module or a power line communication module).
  • Among these communication modules, a corresponding communication module may communicate with an external electronic device via the first network 298 (eg, a short-range communication network such as Bluetooth, WiFi direct, or infrared data association (IrDA)) or the second network 299 (eg, a cellular network, the Internet, or a computer network (eg, a telecommunication network such as a LAN or WAN)).
  • The wireless communication module 292 may identify and authenticate the electronic device 201 within a communication network, such as the first network 298 or the second network 299, using subscriber information (eg, an International Mobile Subscriber Identifier (IMSI)) stored in the subscriber identification module 296.
  • the antenna module 297 may transmit or receive a signal or power to the outside (eg, an external electronic device).
  • The antenna module may include one antenna including a radiator formed of a conductor or a conductive pattern formed on a substrate (eg, a PCB).
  • The antenna module 297 may include a plurality of antennas. In this case, at least one antenna suitable for a communication method used in a communication network, such as the first network 298 or the second network 299, may be selected from the plurality of antennas by, for example, the communication module 290. A signal or power may be transmitted or received between the communication module 290 and an external electronic device through the selected at least one antenna.
  • According to some embodiments, components other than the radiator (eg, an RFIC) may be additionally formed as a part of the antenna module 297.
  • At least some of the above-described components may be connected to one another through an inter-peripheral communication method (eg, a bus, general purpose input and output (GPIO), serial peripheral interface (SPI), or mobile industry processor interface (MIPI)) and may exchange signals (eg, commands or data) with one another.
  • the command or data may be transmitted or received between the electronic device 201 and the external electronic device 204 through the server 208 connected to the second network 299 .
  • Each of the electronic devices 202 and 204 may be a device of the same type as, or a different type from, the electronic device 201.
  • all or part of the operations executed in the electronic device 201 may be executed in one or more of the external electronic devices 202 , 204 , or 208 .
  • For example, when the electronic device 201 needs to perform a function or service, the electronic device 201 may, instead of or in addition to executing the function or service itself, request one or more external electronic devices to perform at least a part of the function or service.
  • the one or more external electronic devices that have received the request may execute at least a part of the requested function or service, or an additional function or service related to the request, and transmit a result of the execution to the electronic device 201 .
  • the electronic device 201 may process the result as it is or additionally and provide it as at least a part of a response to the request.
  • cloud computing, distributed computing, or client-server computing technology may be used.
  • the electronic device may have various types of devices.
  • the electronic device may include, for example, a portable communication device (eg, a smart phone), a computer device, a portable multimedia device, a portable medical device, a camera, a wearable device, or a home appliance device.
  • Terms such as “first” and “second” may simply be used to distinguish a component from other components in question, and do not limit the components in other aspects (eg, importance or order). When one (eg, first) component is referred to as being “coupled” or “connected” to another (eg, second) component, with or without the terms “functionally” or “communicatively”, it means that the one component can be connected to the other component directly (eg, by wire), wirelessly, or through a third component.
  • The term “module” may include a unit implemented in hardware, software, or firmware, and may be used interchangeably with terms such as, for example, logic, logic block, component, or circuit.
  • a module may be an integrally formed part or a minimum unit or a part of the part that performs one or more functions.
  • the module may be implemented in the form of an application-specific integrated circuit (ASIC).
  • Various embodiments of the present document may be implemented as software (eg, the program 240) including one or more instructions stored in a storage medium (eg, the internal memory 236 or the external memory 238) readable by a machine (eg, the electronic device 201). For example, the processor (eg, the processor 320) of the machine (eg, the electronic device 201) may call at least one of the one or more instructions stored in the storage medium and execute it. This makes it possible for the machine to be operated to perform at least one function according to the at least one called instruction.
  • the one or more instructions may include code generated by a compiler or code executable by an interpreter.
  • the device-readable storage medium may be provided in the form of a non-transitory storage medium.
  • Here, 'non-transitory' only means that the storage medium is a tangible device and does not contain a signal (eg, an electromagnetic wave); this term does not distinguish between a case where data is semi-permanently stored in the storage medium and a case where data is temporarily stored therein.
  • the method according to various embodiments disclosed in this document may be included and provided in a computer program product.
  • Computer program products may be traded between sellers and buyers as commodities.
  • The computer program product may be distributed in the form of a machine-readable storage medium (eg, a compact disc read only memory (CD-ROM)), or may be distributed (eg, downloaded or uploaded) online, either through an application store (eg, Play Store™) or directly between two user devices (eg, smartphones).
  • a part of the computer program product may be temporarily stored or temporarily created in a machine-readable storage medium such as a memory of a server of a manufacturer, a server of an application store, or a relay server.
  • Each component (eg, a module or a program) of the above-described components may include a singular entity or a plurality of entities.
  • one or more components or operations among the above-described corresponding components may be omitted, or one or more other components or operations may be added.
  • According to various embodiments, a plurality of components (eg, modules or programs) may be integrated into one component. In this case, the integrated component may perform one or more functions of each of the plurality of components identically or similarly to the way they were performed by the corresponding component among the plurality of components prior to the integration.
  • According to various embodiments, operations performed by a module, a program, or another component may be executed sequentially, in parallel, repeatedly, or heuristically, one or more of the operations may be executed in a different order or omitted, or one or more other operations may be added.
  • FIG. 3 illustrates a configuration of a digital pen 301 that is one of means for selecting a screen in relation to an electronic device sharing a screen according to an embodiment of the present disclosure.
  • The digital pen 301 may include a processor 320, a memory 330, a resonance circuit 387, a charging circuit 388, a battery 389, a communication circuit 390, an antenna 397, and/or a trigger circuit 398.
  • According to an embodiment, the processor 320 of the digital pen 301, at least a portion of the resonance circuit 387, and/or at least a portion of the communication circuit 390 may be configured on a printed circuit board or in the form of a chip.
  • the processor 320 , the resonant circuit 387 and/or the communication circuit 390 may be electrically connected to the memory 330 , the charging circuit 388 , the battery 389 , the antenna 397 or the trigger circuit 398 .
  • the digital pen 301 according to an embodiment may include only a resonance circuit and a button.
  • the processor 320 may include a generic processor configured to execute a customized hardware module or software (eg, an application program).
  • The processor 320 may include a hardware component (function) or a software element (program) including at least one of various sensors provided in the digital pen 301, a data measurement module, an input/output interface, a module for managing the state or environment of the digital pen 301, and a communication module.
  • the processor 320 may include, for example, one or a combination of two or more of hardware, software, and firmware.
  • the processor 320 may receive a proximity signal corresponding to the electromagnetic field signal generated from the digitizer of the electronic device 201 through the resonance circuit 387 . When the proximity signal is confirmed, the resonance circuit 387 may be controlled to transmit an electromagnetic resonance (EMR) input signal to the electronic device 201 .
  • the memory 330 may store information related to the operation of the digital pen 301 .
  • the information may include information for communication with the electronic device 201 and frequency information related to an input operation of the digital pen 301 .
  • the resonance circuit 387 may include at least one of a coil, an inductor, and a capacitor.
  • the resonance circuit 387 may be used by the digital pen 301 to generate a signal including a resonance frequency.
  • the digital pen 301 may use at least one of an electro-magnetic resonance (EMR) method, an active electrostatic (AES) method, or an electrically coupled resonance (ECR) method.
  • For example, when the digital pen 301 uses the EMR method, the digital pen 301 may generate a signal including a resonance frequency based on an electromagnetic field generated from an inductive panel of the electronic device 201. When the digital pen 301 uses the AES method, the digital pen 301 may generate a signal using capacitive coupling with the electronic device 201. When the digital pen 301 transmits a signal by the ECR method, the digital pen 301 may generate a signal including a resonance frequency based on an electric field generated from a capacitive device of the electronic device 201.
  • the resonance circuit 387 may be used to change the strength or frequency of the electromagnetic field according to a user's manipulation state.
  • the resonance circuit 387 may provide a frequency for recognizing a hovering input, a drawing input, a button input, or an erasing input.
  • When the charging circuit 388 is connected to the resonance circuit 387 based on a switching circuit, the charging circuit 388 may rectify the resonance signal generated in the resonance circuit 387 into a DC signal and provide it to the battery 389.
  • the digital pen 301 may determine whether the digital pen 301 is inserted into the electronic device 201 by using the voltage level of the DC signal detected by the charging circuit 388 .
  • the battery 389 may be configured to store power required for the operation of the digital pen 301 .
  • the battery may include, for example, a lithium-ion battery or a capacitor, and may be rechargeable or replaceable. According to an embodiment, the battery 389 may be charged using power (eg, a DC signal (DC power)) provided from the charging circuit 388 .
  • The communication circuit 390 may be configured to perform a wireless communication function between the digital pen 301 and the communication module 290 of the electronic device 201.
  • the communication circuit 390 may transmit state information and input information of the digital pen 301 to the electronic device 201 using a short-distance communication method.
  • For example, the communication circuit 390 may transmit, to the electronic device 201, direction information (eg, motion sensor data) of the digital pen 301 acquired through the trigger circuit 398, voice information input through a microphone, or information on the remaining amount of the battery 389.
  • the short-range communication method may include at least one of Bluetooth, Bluetooth low energy (BLE), and wireless LAN.
  • the antenna 397 may be used to transmit or receive a signal or power to the outside (eg, the electronic device 201 ).
  • the digital pen 301 may include a plurality of antennas 397, and among them, at least one antenna 397 suitable for a communication method may be selected. Through the selected at least one antenna 397 , the communication circuit 390 may exchange a signal or power with an external electronic device.
  • the trigger circuit 398 may include at least one button or sensor circuit.
  • the processor 320 may check the input method (eg, touch or press) or type (eg, EMR button or BLE button) of the button of the digital pen 301 .
  • the sensor circuit may generate an electrical signal or data value corresponding to an internal operating state or an external environmental state of the digital pen 301 .
  • the sensor circuit may include at least one of a motion sensor, a remaining battery level detecting sensor, a pressure sensor, an optical sensor, a temperature sensor, a geomagnetic sensor, and a biometric sensor.
  • the trigger circuit 398 may transmit a trigger signal to the electronic device 201 using an input signal of a button or a signal through a sensor.
  • FIG. 4 illustrates a method of sharing a screen in an electronic device according to an embodiment.
  • operations of the electronic device may be controlled by a processor (eg, the processor 220 ).
  • In an embodiment, the electronic device (eg, the electronic device 100 of FIG. 1 or the electronic device 201 of FIG. 2) may reproduce a video screen on the display of the electronic device (eg, the display 125 of FIG. 1 or the display device 260 of FIG. 2).
  • a video screen may be reproduced on a part or the entire area of the display of the electronic device 100 .
  • a video screen may be reproduced in some areas such as an upper area, a lower area, a half area, and a 1/3 area of the display of the electronic device 100 .
  • a video screen may be reproduced through the entire display area of the electronic device 100 .
  • the electronic device 100 may obtain a user input specifying a first region among regions in which a video screen is reproduced.
  • the first area may be an area that the user of the electronic device 100 wants to share with another electronic device. For example, when the user is interested in a specific object in the video screen, a region including the object may be selected as the first region. As another example, if the user is interested in a specific area in the video screen, the first area may be selected to include the specific area.
  • the first area may be an area excluding an area that the user of the electronic device 100 does not want to share with other electronic devices.
  • For example, the user may select the first area so as to exclude an area in which the content of a received message or notification is displayed (eg, a part of the upper portion of the display of the electronic device 100) or an area in which personal information is displayed.
  • the user may select the first region with various intentions, and the attribute or type of the first region is not limited by the above-described intention.
  • the user input may be an input based on a finger or a digital pen (eg, the digital pen 301 of FIG. 3 ).
  • the user input may be a touch input using a finger.
  • the touch input may be, for example, an input based on the number of touches, a touch rhythm, a touch pattern, a multi-touch, and the like.
  • the electronic device 100 may determine the first region according to the region specified by the touch input.
  • the user input may be a pen input using the digital pen 301 .
  • the electronic device 100 may detect an input by the digital pen 301 .
  • the user input may be a non-touch (non-contact) input using the digital pen 301 .
  • the electronic device 100 may detect an input by the digital pen 301 .
  • the electronic device 100 may detect a touch input by the user's hand using a touch panel of the display and detect an input by the digital pen 301 using a separate panel. In another embodiment, the electronic device 100 may sense both the user's hand and the input by the digital pen 301 using the touch panel.
  • the user input may be provided from the user's hand or the digital pen 301 , but may also be provided by using the hand and the digital pen 301 in combination.
  • the electronic device 100 may transmit the first area data to another electronic device.
  • the electronic device 100 may transmit the first area data to another electronic device based on a wireless communication circuit.
  • the first region data may include data related to the first region selected in operation 403, image data corresponding to the first region, or data obtained by compressing/converting/encrypting image data.
  • The first area data may further include other additional information, for example, a title of the video screen, the application executing the video screen, hardware information such as the resolution or model information of the electronic device 100, and information related to a user account.
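  • A possible shape for the first area data is sketched below; the field names are illustrative assumptions, since the description only lists the kinds of information that may be included.

```kotlin
// Sketch of a payload carrying the shared first area and optional metadata.
// Compression/conversion/encryption of the image bytes is assumed to have
// already been applied before the payload is built.
data class FirstAreaData(
    val areaLeft: Int, val areaTop: Int, val areaRight: Int, val areaBottom: Int,  // bounds of the first area
    val imageBytes: ByteArray,               // image data of the area (possibly compressed/converted/encrypted)
    val videoTitle: String? = null,          // title of the video screen
    val appName: String? = null,             // application reproducing the video screen
    val resolution: Pair<Int, Int>? = null,  // hardware info such as display resolution
    val modelInfo: String? = null,           // model information of the electronic device 100
    val userAccount: String? = null          // information related to a user account
)
```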
  • the first area data may be transmitted to another electronic device in real time.
  • For example, when the user sets the first area to include only the first object and the second object (eg, 2 of the 3 people), the data necessary to share the set first area may be transmitted in real time.
  • In this case, the data shared in real time may continuously correspond to the first area.
  • FIG. 5 illustrates a method of sharing a screen from an electronic device to another electronic device in real time according to an embodiment.
  • operations of the electronic device may be controlled by a processor (eg, the processor 220 ).
  • Referring to FIG. 5, the electronic device (eg, the electronic device 100 of FIG. 1) may set a first area (eg, the area 120) including a first object and a second object (eg, 2 of the 3 people) that the user wants to share, based on a user input.
  • the electronic device 100 may set the area as the shared area 120 according to an input for selecting or specifying an area in which two out of three people exist.
  • the electronic device 100 may specify the first area (eg, the area 120 ) based on a user input based on a finger or the digital pen 301 .
  • For example, the electronic device 100 may specify the shared area 120 when the user drags a preset small rectangular selection area while touching the display with a finger, or when the electronic device 100 obtains a user input such as a multi-touch or a pinch zoom-in/out after a multi-touch.
  • For example, the electronic device 100 may obtain an input for selecting or specifying the area 120 through a user input such as enlarging the area as desired while the digital pen 301 is in contact with the display, or manipulating a setting for selecting an area with the digital pen 301 at a predetermined distance away from the display. Details thereof will be described later with reference to FIG. 11.
  • the electronic device 100 may specify the partial area as the sharing area 120 based on a user input for selecting the partial area on the screen of the display.
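  • As an illustration of how such finger or pen gestures can be reduced to building one rectangle (plain pixel coordinates, no specific UI toolkit; the helper names are assumptions):

```kotlin
import kotlin.math.max
import kotlin.math.min

// A selection rectangle on the display, in pixel coordinates.
data class Selection(val left: Int, val top: Int, val right: Int, val bottom: Int)

// Build the shared area from a finger or stylus drag: the rectangle spanned by
// the first contact point and the current position.
fun fromDrag(startX: Int, startY: Int, endX: Int, endY: Int) = Selection(
    left = min(startX, endX), top = min(startY, endY),
    right = max(startX, endX), bottom = max(startY, endY)
)

// Alternative described above: start from a preset small rectangle and drop it
// so that its center lands on the touched point.
fun fromPreset(touchX: Int, touchY: Int, presetWidth: Int, presetHeight: Int) = Selection(
    left = touchX - presetWidth / 2, top = touchY - presetHeight / 2,
    right = touchX + presetWidth / 2, bottom = touchY + presetHeight / 2
)
```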
  • a moving picture may be being reproduced in an upper portion of the display 125 of the electronic device 100 .
  • a video of a web browser may be executed in a portion of the upper portion, or an application may be executed in a portion of a divided screen.
  • For example, the electronic device 100 may share data corresponding to the area 120 among the regions in which the moving picture is being played, for example, data corresponding to the first object and the second object (eg, 2 of the 3 people). For another example, even if the 2 of the 3 people move out of the area 120, the electronic device 100 may continuously transmit data regarding the area 120 to the other electronic device 500 (eg, the electronic devices 202 and 204 of FIG. 2).
  • the electronic device 100 may transmit data regarding the area 120 to another electronic device based on a wireless communication circuit.
  • the other electronic device 500 may display data regarding the area 120 in the third area on the display.
  • the third area may be arbitrarily set by the user of the other electronic device 500 or may be a previously set area.
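  • On the receiving side, fitting the shared image into the third area amounts to a scale-to-fit computation. The sketch below assumes the aspect ratio is preserved and the image is centered, which the description does not specify.

```kotlin
// Compute the largest size that fits a srcW x srcH image inside the third area
// (dstW x dstH) while preserving aspect ratio, plus centering offsets.
data class Fit(val width: Int, val height: Int, val offsetX: Int, val offsetY: Int)

fun fitIntoThirdArea(srcW: Int, srcH: Int, dstW: Int, dstH: Int): Fit {
    val scale = minOf(dstW.toFloat() / srcW, dstH.toFloat() / srcH)
    val w = (srcW * scale).toInt()
    val h = (srcH * scale).toInt()
    return Fit(w, h, (dstW - w) / 2, (dstH - h) / 2)
}
```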
  • 6A and 6B are diagrams illustrating a state in which an image corresponding to a shared area is displayed in a part or the entire area on a display screen of another electronic device, according to an exemplary embodiment.
  • operations of the electronic device may be controlled by a processor (eg, the processor 220 ).
  • According to an embodiment, the other electronic device 500 (eg, the electronic devices 202 and 204 of FIG. 2) that receives the data regarding the shared area may display an image corresponding to the area in a third area (eg, a partial or entire area) of its display screen, and may set the display area based on the user's manipulation.
  • the other electronic device 500 may reproduce an image corresponding to the area 121 in a partial area of the display screen.
  • For example, when the user of the electronic device (eg, the electronic device 100 of FIG. 1) selects the region 121, the other electronic device 500 may display an image corresponding to the region 121 in a partial region of the upper portion of the display of the other electronic device 500, as shown in FIG. 6A.
  • According to an embodiment, the other electronic device 500 may reproduce an image corresponding to the area 121 in the entire area of its display. For example, as shown in FIG. 6B, when the user selects the region 121 including the upper body of one of the three people, the other electronic device 500 may display an image corresponding to the region 121 on the entire area of the display of the other electronic device 500.
  • the electronic devices 100 and 500 may have the same or similar structure to the electronic device 201 of FIG. 2 described above.
  • FIG. 7 illustrates a state in which an area to be shared is moved in an electronic device according to an embodiment.
  • operations of the electronic device may be controlled by a processor (eg, the processor 220 ).
  • Referring to FIG. 7, the electronic device (eg, the electronic device 100 of FIG. 1) may share data regarding the video area 130 including only the first object (eg, the leftmost one of the three people) with the other electronic device 500 (eg, the electronic devices 202 and 204 of FIG. 2). The other electronic device 500 may reproduce an image corresponding to the region 130 in the third region on the display of the other electronic device 500 based on the received data regarding the region 130.
  • According to an embodiment, the electronic device 100 may move the region 130 within the display based on an input using the user's finger or the digital pen 301. For example, when the user drags and drops the area 130 in parallel to the location of the area 140 including the second object while contacting the display of the electronic device 100 with a finger or the digital pen 301, the electronic device 100 may share the area 140 including the second object with the other electronic device 500. For another example, when the user inputs an input for adjusting the size while moving, the electronic device 100 may adjust the size of the region while moving it as the input for adjusting the size is obtained.
  • When the electronic device 100 obtains an input for moving the area, the area 130 may be moved to the location of the other region 140.
  • The user may input a command to move the region 130 by appropriately combining manipulations based on a finger or the digital pen 301, and the electronic device 100 may move the region based on the combined user input.
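  • A sketch of the parallel move (drag and drop) of the shared region is shown below; the rule that keeps the region inside the display bounds is an assumption added for illustration.

```kotlin
// Translate the shared region by the drag offset (dx, dy), keeping it inside a
// display of size displayW x displayH. Coordinates are in pixels.
data class Area(val left: Int, val top: Int, val right: Int, val bottom: Int)

fun moveBy(area: Area, dx: Int, dy: Int, displayW: Int, displayH: Int): Area {
    val w = area.right - area.left
    val h = area.bottom - area.top
    val newLeft = (area.left + dx).coerceIn(0, displayW - w)
    val newTop = (area.top + dy).coerceIn(0, displayH - h)
    return Area(newLeft, newTop, newLeft + w, newTop + h)
}
```

  • Calling such a function for every drag event and streaming the current area also covers the intermediate regions shared during the move, as described below.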
  • According to an embodiment, when the user moves the area 130 including the first object and places it at the position of the other area 140 including the second object, the electronic device 100 may transmit data regarding the shared area 140 to the other electronic device 500 in real time based on the wireless communication circuit, and the other electronic device 500 may receive the data regarding the area 140 in real time.
  • the other electronic device 500 may display an image corresponding to the region 140 in the third region on the display of the other electronic device 500 based on the received data on the region 140 .
  • The electronic device 100 may share, in real time with the other electronic device 500, the intermediate regions between the region 130 and the other region in the process of moving the region 130 to the different location.
  • the user may want to move the area 130 in parallel to the position of the rightmost person.
  • the electronic device 100 may share data on an area that changes in real time while moving.
  • the other electronic device 500 may reproduce an image corresponding to the real-time changing area in the third area of the screen of the display.
  • the electronic device 100 may share the other area 140 with the other electronic device 500 by specifying the other area 140 as the sharing area according to a user input for selecting a new area.
  • For example, while the area 130 including only the leftmost one of the three people is being shared with the other electronic device 500, the user may want to change it to the area 140 including only the person in the middle and share that area instead.
  • In this case, the user may newly select or specify the area 140 including only the person in the middle based on a user input of the finger or the digital pen 301, and the electronic device 100 may transmit data regarding the changed area 140 to the other electronic device 500.
  • Whether to continue sharing the area 130 that the electronic device 100 has already shared may be preset, or may be set at a specific time based on the user's manipulation or selection of the new area.
  • the electronic device 100 may share only the new area according to the user's device setting.
  • the electronic device 100 may be preset to share only the newly selected area 140 instead of the already selected area 130 by the user.
  • the electronic device 100 may transmit only the data related to the area 140 to the other electronic device 500 without another operation or provision of a user interface.
  • For example, in order to determine whether the user wants to continue sharing the previously selected area 130, the electronic device 100 may display a user interface such as a pop-up screen or a setting screen.
  • The user interface may be implemented in various ways, such as voice, text, or icons.
  • the electronic device 100 may share only the new area 140 based on a user manipulation such as voice, text input, or icon click.
  • The user may additionally want to share a new area while continuing to share the already selected area; for example, while sharing the area 130 including the leftmost person, the user may also wish to share the area including the person in the center.
  • For example, the electronic device 100 may be preset with a function of adding sharing of the newly selected area 140 including the person in the center while maintaining sharing of the already selected area 130 including the leftmost person. In this case, when the user selects the new area 140, the electronic device 100 may share data regarding the area 130 and the area 140 with the other electronic device 500 without another operation or provision of a user interface. For another example, when the user newly sets the area 140 in a state in which there is no such preset in the electronic device 100, the electronic device 100 may display a user interface, such as a pop-up screen, asking whether only the new area 140 is to be shared. The user interface asking whether only the new area 140 is to be shared may be implemented in various ways, such as, for example, voice, text, or an icon.
  • In response to a user manipulation such as a voice command, text input, or icon click, the electronic device 100 may share the previously selected area 130 as well as the new area 140 with the other electronic device 500.
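  • The branching described above (preset to replace, preset to add, or ask the user) can be summarized as a small decision function; the enum names and the prompt callback below are illustrative, not taken from the patent.

```kotlin
// What to do when the user specifies a new area while another area is already shared.
enum class NewAreaPolicy { REPLACE_EXISTING, ADD_TO_EXISTING, ASK_USER }

// Returns the set of areas to keep sharing. `askUser` stands in for the pop-up,
// voice, text, or icon-based user interface mentioned above.
fun <A> areasToShare(
    current: List<A>,
    newArea: A,
    policy: NewAreaPolicy,
    askUser: () -> Boolean  // true means "share only the new area"
): List<A> = when (policy) {
    NewAreaPolicy.REPLACE_EXISTING -> listOf(newArea)
    NewAreaPolicy.ADD_TO_EXISTING -> current + newArea
    NewAreaPolicy.ASK_USER -> if (askUser()) listOf(newArea) else current + newArea
}
```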
  • the sharing may be sharing using a wireless communication circuit.
  • Although the shared areas 130 and 140 in the various embodiments have been described by taking two areas as an example, there may be three or more areas.
  • Although the other electronic device 500 with which the areas are shared has been described by taking one electronic device as an example, there may be two or more electronic devices.
  • FIG. 8 illustrates a state in which the size of a screen area shared by the electronic device is changed, according to an embodiment.
  • Operations of the electronic device may be controlled by a processor (e.g., the processor 220).
  • The electronic device may change the size of the sharing area 150 based on a user input from the digital pen 301. In the example shown in FIG. 8, the electronic device 100 obtains an input of the digital pen 301 on the area 150 including the first object and the second object (e.g., two of the three people) and may change it to the area 160 including only the first object (e.g., one of the three people).
  • The size change of the shared area of the electronic device 100 may be performed based on a contact or non-contact input of the digital pen 301. For example, the electronic device 100 may adjust the size of the area 150 based on a drag input made with the digital pen 301, or based on an input of the digital pen 301 that adjusts the size after a basic size of the area has been set in advance. The electronic device 100 may communicate with the digital pen 301 in relation to the adjustment of the area size.
  • In an area adjustment mode, the electronic device 100 may adjust the size of the area 150 based on a non-contact manipulation using the digital pen 301. The manipulation may include non-contact dragging with the digital pen 301 or an input of the digital pen 301 that adjusts a basic size of the area set in advance. The area adjustment mode of the electronic device 100 may be executed by pressing an area adjustment button on the digital pen 301 or by operating a user interface for adjusting the area through communication with the digital pen 301.
  • The electronic device 100 may also adjust the size of the area 150 based on a finger input. The finger input may include, for example, dragging the area 150 while pressing it with a finger, or controlling the size of the area after a basic size has been set in advance. For example, when the user selects and presses the area 150 including the first object and the second object (e.g., two of the three people) and drags its boundary with a finger, the area 150 may be reduced to the smaller area 160 including only the first object. The electronic device 100 may also obtain a finger input while the area adjustment function is executed and, based on the input, increase the size of the area so that it corresponds to the area 160, and may further enlarge it so that it corresponds to the area 150.
  • The electronic device 100 may also change the size of the sharing area 150 based on a multi-touch using the user's fingers. The user may perform a multi-touch, or a pinch-zoom-in/out input that controls enlargement or reduction after the multi-touch. In the example shown in FIG. 8, the electronic device 100 obtains a multi-touch of the fingers, or a pinch-zoom-in/out input following the multi-touch, and may change the area 150 including the first object and the second object to the area 160 including only the first object (e.g., one of the three people) (see the sketch following this list).
  • The electronic device 100 may transmit data to another electronic device in real time while the size of the area is being changed. Specifically, if the size of the area 150 is changed while the electronic device 100 is transmitting data about the area 150 to the other electronic device 500 in real time, the electronic device 100 may transmit the continuously varying data in real time during the change. For example, when the user reduces the size of the area while data about the area 150 is being transmitted, the data about the second object (e.g., one of the people) may gradually decrease as the area 150 is reduced. In this case, the electronic device 100 may transmit the decreasing real-time data to the other electronic device 500. The electronic device 100 may transmit the data about the area 150 to the other electronic device through a wireless communication circuit.
  • The other electronic device may reproduce an image corresponding to the changing area on a third area of its display screen in real time, based on the data about the area received from the electronic device 100.
  • The electronic device 100 may enlarge or reduce the size of the area according to a user input, and may also change the shape of the area while maintaining its size based on the same or a similar method; various other changes may be made.
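As a rough illustration of the pinch-zoom resizing described above, the following plain-Kotlin sketch derives a scale factor from the change in finger spacing and applies it to a rectangular region. Rect, pinchScale, and scaleAroundCenter are hypothetical names, and the clamping limits are arbitrary example values, not values from the patent.

```kotlin
import kotlin.math.hypot

// Hypothetical names: Rect, pinchScale, scaleAroundCenter. Plain Kotlin, no platform APIs.
data class Rect(val centerX: Float, val centerY: Float, val width: Float, val height: Float)

/** Scale factor implied by a two-finger pinch: ratio of current to initial finger spacing. */
fun pinchScale(
    startX1: Float, startY1: Float, startX2: Float, startY2: Float,
    curX1: Float, curY1: Float, curX2: Float, curY2: Float
): Float {
    val startSpan = hypot(startX2 - startX1, startY2 - startY1)
    val currentSpan = hypot(curX2 - curX1, curY2 - curY1)
    return if (startSpan == 0f) 1f else currentSpan / startSpan
}

/** Resizes the shared region around its centre, clamped to sensible minimum/maximum sizes. */
fun scaleAroundCenter(region: Rect, scale: Float, minSize: Float, maxW: Float, maxH: Float): Rect =
    region.copy(
        width = (region.width * scale).coerceIn(minSize, maxW),
        height = (region.height * scale).coerceIn(minSize, maxH)
    )

fun main() {
    val area150 = Rect(centerX = 540f, centerY = 960f, width = 800f, height = 600f)
    // Fingers move closer together -> scale < 1 -> the region shrinks (area 150 -> area 160).
    val scale = pinchScale(100f, 500f, 900f, 500f, 300f, 500f, 700f, 500f)
    println(scaleAroundCenter(area150, scale, minSize = 100f, maxW = 1080f, maxH = 1920f))
}
```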
  • FIG. 9 illustrates a state in which a plurality of selected areas (e.g., a first area and a second area) are shared with a plurality of other electronic devices by the electronic device, according to an embodiment.
  • Operations of the electronic device may be controlled by a processor (e.g., the processor 220).
  • The electronic device (e.g., the electronic device 100 of FIG. 1) may share the area 170 (e.g., including the left person) with the other electronic device 500, and may share the area 180 (e.g., including the right person) with another electronic device 900.
  • The electronic device 100 may specify the area 170 and the area 180 based on an input using the user's digital pen 301 or finger. The inputs specifying the respective areas may be made at the same time or at different times, and there may be no limitation on the order of the inputs.
  • According to a user input, the electronic device 100 may share data about the new area 180 with the other electronic device 900 while sharing the data about the area 170 with the other electronic device 500. The order of sharing may be reversed, and the sharing may also be performed at the same time.
  • In other words, the electronic device 100 may share data about the first area (e.g., the area 170) including the first object (e.g., the left person) with the other electronic device 500, and share data about the second area (e.g., the area 180) including the second object (e.g., the right person) with the other electronic device 900, in real time (a sketch of this routing follows this list). The electronic device 100 may share the data using a wireless communication circuit, and there may be no limitation on the order of sharing.
  • According to an embodiment, the electronic device 100 may share a plurality of windows with the other electronic devices (e.g., the other electronic devices 500 and 900). The screen sharing illustrated in FIG. 9 represents a state in which different areas within the same window are shared with the other electronic devices 500 and 900.
  • The user of the electronic device 100 may use the electronic device 100 while moving among various windows instead of remaining in one specific window. For example, the user may watch a video of three people dancing in one window and then move to another window to watch a video such as a music video or a movie. In this case, the user may select an area to be shared in any window of the dancing video, and may select another area to be shared in another window, such as the music video or movie.
  • The electronic device 100 may transmit data about specific areas in the plurality of windows described above to the other electronic devices 500 and 900 in real time, and the other electronic devices 500 and 900 may display images corresponding to the shared data on their respective displays based on the data received in real time from the electronic device 100.
  • The number of the plurality of other electronic devices may not be particularly limited, and the electronic devices 100, 500, and 900 may have the same or a similar structure to the electronic device 201.
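One way to picture the routing of different areas to different receiving devices is as a mapping from region identifiers to target device identifiers. The sketch below is a minimal, hypothetical illustration in plain Kotlin (ShareRoute, FramePacket, and routeFrames are invented names); the actual transmission over the wireless communication circuit is not shown.

```kotlin
// Hypothetical sketch: ShareRoute, FramePacket, and routeFrames are illustrative names only.
data class ShareRoute(val regionId: String, val targetDeviceIds: Set<String>)

data class FramePacket(val regionId: String, val targetDeviceId: String, val payload: ByteArray)

/**
 * Builds one packet per (region, target device) pair so that, for example, the data for
 * area 170 goes to device 500 while the data for area 180 goes to device 900.
 */
fun routeFrames(routes: List<ShareRoute>, croppedFrames: Map<String, ByteArray>): List<FramePacket> =
    routes.flatMap { route ->
        val payload = croppedFrames[route.regionId] ?: return@flatMap emptyList<FramePacket>()
        route.targetDeviceIds.map { device -> FramePacket(route.regionId, device, payload) }
    }

fun main() {
    val routes = listOf(
        ShareRoute("area170", setOf("device500")),
        ShareRoute("area180", setOf("device900"))
    )
    val frames = mapOf(
        "area170" to byteArrayOf(1, 2, 3),   // stand-in for the cropped pixels of area 170
        "area180" to byteArrayOf(4, 5, 6)    // stand-in for the cropped pixels of area 180
    )
    routeFrames(routes, frames).forEach { println("${it.regionId} -> ${it.targetDeviceId}") }
}
```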
  • FIG. 10 is a diagram illustrating fast-forward (FF) or rewind (RW) performed in another electronic device while a screen is shared from the electronic device to the other electronic device, according to an embodiment.
  • Operations of the electronic device may be controlled by a processor (e.g., the processor 220).
  • The electronic device 100 may reproduce a video image on its display screen, and the other electronic devices 500 and 900 may be receiving the image being shared. For example, the electronic device 100 may be playing a video of a person running and may be sharing that video with the other electronic devices 500 and 900.
  • The other electronic devices 500 and 900 may fast-forward or rewind the image corresponding to the shared area. The other electronic devices 500 and 900 may control the playback of the image corresponding to the area 190 shared from the electronic device 100 based on an input using the user's finger 1000 or the digital pen 301.
  • For example, the other electronic devices 500 and 900 may rewind the image corresponding to the shared area 190 by obtaining an input for executing a rewind function using the user's finger 1000 or the digital pen 301. For example, while the video of the running person is being shared, the other electronic devices 500 and 900 may obtain an input of the user pressing a rewind button and may reproduce an earlier portion of the running video.
  • When executing the rewind function, the other electronic devices 500 and 900 may perform the rewind by obtaining an input such as the user pressing the rewind button, moving an icon indicating the current playback position on the timeline, or operating an external electronic device connected to the other electronic devices 500 and 900. In addition to the methods described above, the operation related to the rewind input may include various methods such as execution of a user interface or a manipulation within an application.
  • Similarly, the other electronic devices 500 and 900 may fast-forward the playback of the image corresponding to the shared area 190 by obtaining an input for executing a fast-forward function using the user's finger 1000 or the digital pen 301. For example, while a video of a person running in a 10 km section is being shared, the other electronic devices 500 and 900 may fast-forward to a 15 km section when an input of the user pressing a fast-forward button is obtained.
  • When executing the fast-forward function, the other electronic devices 500 and 900 may perform the fast-forward by obtaining an input such as the user pressing the fast-forward button, moving the icon indicating the current playback position on the timeline, or operating an external electronic device connected to the other electronic devices 500 and 900. In addition to the methods described above, the fast-forward function may be executed in various ways such as execution of a user interface or an operation within an application.
  • The other electronic devices 500 and 900 may freely rewind or fast-forward within the quantitative limit of the data about the area 190 received from the electronic device 100. For example, they may fast-forward to a 10 km section while watching a 2 km section, and may rewind to a 5 km section (see the sketch following this list).
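The "quantitative limit" on rewind and fast-forward can be pictured as clamping the requested playback position to the span of data already received. The following sketch uses hypothetical names (SharedClip, seekWithinReceivedData) and millisecond timestamps; it is an illustration, not the patent's implementation.

```kotlin
// Illustrative only: SharedClip and seekWithinReceivedData are hypothetical names.
data class SharedClip(val receivedStartMs: Long, val receivedEndMs: Long)

/**
 * Clamps a rewind/fast-forward target to the span of data already received from the
 * sharing device: the receiving device can move freely inside what it has, but not beyond it.
 */
fun seekWithinReceivedData(clip: SharedClip, requestedPositionMs: Long): Long =
    requestedPositionMs.coerceIn(clip.receivedStartMs, clip.receivedEndMs)

fun main() {
    // Suppose data roughly covering the shared portion so far spans 0..600 seconds.
    val clip = SharedClip(receivedStartMs = 0L, receivedEndMs = 600_000L)
    println(seekWithinReceivedData(clip, 450_000L))  // fast forward inside the buffer -> 450000
    println(seekWithinReceivedData(clip, 900_000L))  // beyond the received data -> clamped to 600000
}
```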
  • FIG. 11A illustrates a state in which a user selects a screen to be shared with a finger 1000 in the electronic device, according to an embodiment.
  • The user of the electronic device 100 may specify the area 110 of the display to be shared by using the finger 1000. For example, the user may specify the area 110 with the finger 1000 so that only the leftmost person is included.
  • The size of the area 110 or the location of the area 110 may be selected or specified based on the user's intention, and there may be no particular limitation on them.
  • The input for selecting the area 110 may be an input based on a specific touch input pattern of the finger 1000, which is distinguished from an existing touch input. For example, the input for specifying or selecting the area 110 to be shared in the present disclosure may be a touch input of two times, three times, or the like, which is distinguished from a single touch input. In other words, the input of touching with the finger 1000 in a certain pattern to select the area 110 may be a touch input of a pattern distinct from inputs of existing patterns.
  • When such an input is obtained, the electronic device 100 is switched into a mode for selecting the area 110, in which the user can adjust and select the area 110. For example, the electronic device 100 may recognize an input pattern of three consecutive touches of the finger 1000 as an input for switching into the mode for specifying (or selecting) the area 110 to be shared (see the sketch following this list).
  • The user may freely select the area 110 in the area selection mode, and after the area 110 is selected, the electronic device 100 may share the area 110 with other electronic devices.
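A triple-touch pattern such as the one described above could be recognised with a simple timing rule: count taps and reset whenever the gap between taps grows too large. The sketch below shows this in plain Kotlin; TapPatternDetector and the 400 ms interval are assumptions for illustration only.

```kotlin
// Hypothetical sketch of a triple-tap detector; TapPatternDetector is not a name from the patent.
class TapPatternDetector(
    private val requiredTaps: Int = 3,
    private val maxIntervalMs: Long = 400L
) {
    private val tapTimes = ArrayDeque<Long>()

    /**
     * Feed each touch-down timestamp; returns true when the configured number of taps
     * arrives with no gap longer than maxIntervalMs, i.e. the pattern that switches the
     * device into the area-selection mode described above.
     */
    fun onTap(timestampMs: Long): Boolean {
        if (tapTimes.isNotEmpty() && timestampMs - tapTimes.last() > maxIntervalMs) {
            tapTimes.clear()                       // too slow: start counting again
        }
        tapTimes.addLast(timestampMs)
        if (tapTimes.size < requiredTaps) return false
        tapTimes.clear()                           // pattern recognised; reset for next time
        return true
    }
}

fun main() {
    val detector = TapPatternDetector()
    println(detector.onTap(0L))     // false
    println(detector.onTap(200L))   // false
    println(detector.onTap(380L))   // true -> enter area-selection mode
}
```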
  • FIG. 11B illustrates a state in which the screen to be shared by the electronic device is selected with the digital pen 301, according to an embodiment of the present disclosure.
  • The user of the electronic device 100 may specify the area 110 of the display to be shared by using the digital pen 301. For example, as shown in FIG. 11B, when the user wants to share only one of the three people, the user may specify the area 110 with the digital pen 301 so that only the leftmost person is included in the area 110 to be shared.
  • The selection or specification of the size or the position of the area 110 is made based on the user's intention, and there may be no particular limitation on the size or position of the area 110. For example, if the user wants to share only the face of the leftmost person, the user may freely select the area 110 so that it includes the face. As another example, when the user wants to share the two people on the left, the user may select the area 110 so that it includes the two people.
  • The input for selecting the area 110 may be an input made in a specific mode of the digital pen 301. The digital pen 301 may be switched from a normal input mode to an input mode for selecting or specifying the area 110 to be shared by the electronic device 100. The switch to this input mode may be performed in various ways, such as pressing a specific button of the digital pen 301 or a user setting operation on the electronic device 100. The electronic device 100 may recognize an input made in the specific input mode of the digital pen 301 as an input for specifying (or selecting) the area 110 to be shared. The user may freely select the area 110 in the area selection mode, and after the area 110 is selected, the electronic device 100 may share the area 110 with other electronic devices.
  • Alternatively, the area 110 may be selected by a specific input pattern of the digital pen 301 that is distinguished from an existing input. For example, the input for specifying or selecting the area 110 to be shared may be a touch input of two times, three times, or the like, which is distinguished from a single touch input. When such an input is obtained, the electronic device 100 is switched into a mode for selecting the area 110, in which the user can adjust and select the area 110. For example, the electronic device 100 may recognize an input pattern of three consecutive touches of the digital pen 301 as an input for switching into the mode for specifying (or selecting) the area 110 to be shared. The user may freely select the area 110 in this selection mode, and after the area 110 is selected, the electronic device 100 may share the area 110 with other electronic devices.
  • Operations of the electronic device may be controlled by a processor (e.g., the processor 220).
  • FIGS. 12A to 12C are diagrams illustrating a state in which the electronic device 1200 shares data of a sharing area with a plurality of other electronic devices, according to an embodiment. The electronic device 1200 may have the same or a similar configuration to the electronic devices 100 and 200. The sharing may be ultra-wideband (UWB)-based sharing.
  • FIG. 12A is a diagram illustrating a user interface (UI) for sharing data of the sharing area with a plurality of other electronic devices existing within a specific range in the electronic device 1200, according to an embodiment.
  • The electronic device 1200 may display a plurality of other electronic devices 1211, 1212, 1213, and 1214 together with a specific range on the screen. As shown in FIG. 12A, the electronic device 1200 may display, on the screen, specific ranges 1201 and 1202 within which the data of the sharing area can be shared. For example, the electronic device 1200 may display the specific range 1201 as a range including the other electronic devices 1211 and 1212. As another example, the electronic device 1200 may display the specific range 1202 as a range including the other electronic devices 1211, 1212, and 1213. The electronic device 1200 may also display another electronic device (e.g., the other electronic device 1214) that exists outside the specific range 1202.
  • The electronic device 1200 may display the specific ranges 1201 and 1202 based on various kinds of information.
  • For example, the electronic device 1200 may display the plurality of other electronic devices 1211, 1212, 1213, and 1214 existing within a specific range based on their distances. The electronic device 1200 may display the other electronic device 1211, which is closest to the electronic device 1200, at the bottom of the screen, and may display the other electronic device 1214, which is farthest from the electronic device 1200, at the top of the screen.
  • As another example, the electronic device 1200 may display the plurality of other electronic devices 1211, 1212, 1213, and 1214 existing within a specific range based on their directions. The electronic device 1200 may display the location of the other electronic device 1211 on the screen based on the current location of the electronic device 1200 in consideration of its direction (e.g., the northwest direction), and may display the location of the other electronic device 1212 in consideration of its direction (e.g., the north direction). The electronic device 1200 may display the other electronic device 1213 in the northeast direction and the other electronic device 1214 in a northwest direction that is more biased toward the north. The electronic device 1200 may also take the distance into consideration.
  • As another example, the electronic device 1200 may display, on the screen, the plurality of other electronic devices 1211, 1212, 1213, and 1214 existing within a specific range based on user information of the other electronic devices. The electronic device 1200 may display each of the other electronic devices 1211, 1212, 1213, and 1214 on the screen according to user information stored in the memory. The user information may include, for example, a name, an age, contact information, and the like.
  • As another example, the electronic device 1200 may display the plurality of other electronic devices 1211, 1212, 1213, and 1214 existing within a specific range based on their movement states. The electronic device 1200 may detect, in real time, the state in which the other electronic devices 1211, 1212, 1213, and 1214 are moving relative to the current location of the electronic device 1200 (e.g., moving away from it or moving closer to it) and display the state on the screen. For example, the electronic device 1200 may display, in real time, the other electronic device 1211 moving in the north direction.
  • The electronic device 1200 may variously change the specific ranges 1201 and 1202 according to the user's settings. For example, the electronic device 1200 may change the specific ranges 1201 and 1202 based on the user's changed settings regarding distance, direction, user information, and movement state, and may also change the number of specific ranges and the interval between them.
  • The electronic device 1200 may generate specific ranges based on a free user input (e.g., an input of a digital pen or a finger) and may display the specific ranges on the screen. For example, differently from what is illustrated in FIG. 12A, the electronic device 1200 may display the specific ranges in the form of straight lines or in arbitrary irregular shapes.
  • The electronic device 1200 may transmit the data of the shared area to the other electronic devices 1211, 1212, and 1213 existing within the specific range 1202, or may transmit the data of the shared area to the other electronic devices 1211 and 1212 existing within the specific range 1201. As another example, the electronic device 1200 may transmit the data of the shared area to the other electronic devices 1211, 1212, 1213, and 1214 existing within a specific range (see the sketch following this list).
  • The transmission of the shared area data may be applied in the same or a similar manner to the transmission operations described later with reference to FIGS. 12B and 12C.
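If the UWB ranging results are pictured as a distance and a bearing per nearby device, displaying and filtering devices by range reduces to sorting and thresholding, roughly as in the sketch below. NearbyDevice, devicesInsideRange, and all of the numeric values are hypothetical; they merely mirror the example ranges 1201 and 1202.

```kotlin
// Hypothetical sketch: NearbyDevice and devicesInsideRange are illustrative names; the
// distance/bearing values stand in for UWB ranging results.
data class NearbyDevice(val id: String, val distanceM: Double, val bearingDeg: Double)

/** Devices whose measured distance falls inside the selected range radius, nearest first. */
fun devicesInsideRange(devices: List<NearbyDevice>, rangeRadiusM: Double): List<NearbyDevice> =
    devices.filter { it.distanceM <= rangeRadiusM }.sortedBy { it.distanceM }

fun main() {
    val nearby = listOf(
        NearbyDevice("1211", distanceM = 1.2, bearingDeg = 315.0), // north-west, closest
        NearbyDevice("1212", distanceM = 2.0, bearingDeg = 0.0),   // north
        NearbyDevice("1213", distanceM = 3.5, bearingDeg = 45.0),  // north-east
        NearbyDevice("1214", distanceM = 6.0, bearingDeg = 340.0)  // outside the wider range
    )
    println(devicesInsideRange(nearby, rangeRadiusM = 2.5).map { it.id }) // like range 1201 -> [1211, 1212]
    println(devicesInsideRange(nearby, rangeRadiusM = 4.0).map { it.id }) // like range 1202 -> [1211, 1212, 1213]
}
```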
  • FIG. 12B illustrates a state in which the electronic device shares data of the sharing area with a plurality of other electronic devices existing within a specific range designated based on the digital pen 301, according to an embodiment.
  • The user may designate or select, using the digital pen 301, a specific range to which the data of the shared area is to be transmitted. For example, the electronic device 1200 may designate the specific range 1202 as shown in FIG. 12B so that the other electronic devices 1211, 1212, and 1213 are included. The electronic device 1200 may also designate the specific range 1202 in an irregular shape, a straight-line shape, or the like, differently from what is shown in FIG. 12B, so that the other electronic devices 1211, 1212, and 1213 are included (see the sketch following this list). The electronic device 1200 may transmit the data of the shared area to the other electronic devices 1211, 1212, and 1213 included in the designated specific range 1202.
  • The user may also select, using the digital pen 301, the range to which the data of the shared area is to be transmitted in units of objects (e.g., individual other electronic devices). The electronic device 1200 may create object units each including one of the other electronic devices 1211, 1212, 1213, and 1214. The user may select one or more object units to which the data of the shared area is to be transmitted, and the electronic device 1200 may transmit the data of the shared area to the other electronic devices corresponding to the selected object units.
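Deciding which device icons fall inside a free-form, pen-drawn range is a point-in-polygon question; a standard ray-casting test would suffice, as in the hypothetical sketch below (Point and isInsideRange are invented names, and the rectangle merely stands in for an arbitrary drawn outline).

```kotlin
// Hypothetical sketch of a free-form range test; Point and isInsideRange are illustrative names.
data class Point(val x: Double, val y: Double)

/**
 * Ray-casting test: returns true when a device icon at `p` lies inside the closed,
 * possibly irregular outline drawn with the digital pen.
 */
fun isInsideRange(outline: List<Point>, p: Point): Boolean {
    var inside = false
    var j = outline.size - 1
    for (i in outline.indices) {
        val a = outline[i]
        val b = outline[j]
        val crosses = (a.y > p.y) != (b.y > p.y) &&
            p.x < (b.x - a.x) * (p.y - a.y) / (b.y - a.y) + a.x
        if (crosses) inside = !inside
        j = i
    }
    return inside
}

fun main() {
    // A rough rectangle standing in for the pen-drawn range 1202.
    val range1202 = listOf(Point(0.0, 0.0), Point(10.0, 0.0), Point(10.0, 6.0), Point(0.0, 6.0))
    println(isInsideRange(range1202, Point(3.0, 2.0)))   // true  -> e.g. device 1211 is included
    println(isInsideRange(range1202, Point(12.0, 2.0)))  // false -> e.g. device 1214 is excluded
}
```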
  • FIG. 12C illustrates a state in which the electronic device shares data of the sharing area with a plurality of other electronic devices existing within a specific range designated based on the finger 1000, according to an embodiment.
  • The electronic device may obtain a user input (e.g., an input designating or selecting a specific range to which the data of the shared area is to be transmitted) based on a magnet effect. For example, the user may select a range within a specific distance (e.g., 1 m or 3 m) or a range including a specific number of people (e.g., 3 or 5 people) by using the finger 1000. The electronic device 1200 may obtain the user's input based on the magnet effect and may designate such ranges as the ranges to which the data of the sharing area is to be transmitted (see the sketch following this list).
  • The user may also designate or select, using the finger 1000, a specific range to which the data of the shared area is to be transmitted. For example, based on obtaining the input of the user's finger 1000, the electronic device 1200 may designate the specific range 1202 as shown in FIG. 12C so that the other electronic devices 1211, 1212, and 1213 are included. The electronic device 1200 may also designate the specific range 1202 in an irregular shape, a straight-line shape, or the like, differently from what is shown in FIG. 12C, so that the other electronic devices 1211, 1212, and 1213 are included. The electronic device 1200 may transmit the data of the shared area to the other electronic devices 1211, 1212, and 1213 included in the designated specific range 1202.
  • The user may also select, using the finger 1000, the range to which the data of the shared area is to be transmitted in units of objects (e.g., individual other electronic devices). The electronic device 1200 may create object units each including one of the other electronic devices 1211, 1212, 1213, and 1214. The user may select at least one object unit using the finger 1000, and the electronic device 1200 may transmit the data of the shared area to the other electronic devices corresponding to the at least one object unit selected with the finger 1000.
  • Operations of the electronic device may be controlled by a processor (e.g., the processor 220).
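The magnet-effect selection described above could be read as snapping the selection either to every device within a chosen distance or to a chosen number of nearest devices. The sketch below illustrates both readings under that assumption; Device, selectWithinDistance, and selectClosest are hypothetical names.

```kotlin
// Hypothetical sketch of the "magnet effect" selection; all names and values are illustrative.
data class Device(val id: String, val distanceM: Double)

/** Selects every device within a distance threshold (e.g. 1 m or 3 m). */
fun selectWithinDistance(devices: List<Device>, maxDistanceM: Double): List<Device> =
    devices.filter { it.distanceM <= maxDistanceM }

/** Selects the N closest devices (e.g. "3 people" or "5 people"), regardless of exact distance. */
fun selectClosest(devices: List<Device>, count: Int): List<Device> =
    devices.sortedBy { it.distanceM }.take(count)

fun main() {
    val devices = listOf(
        Device("1211", 0.8), Device("1212", 1.6), Device("1213", 2.4), Device("1214", 5.1)
    )
    println(selectWithinDistance(devices, maxDistanceM = 3.0).map { it.id }) // [1211, 1212, 1213]
    println(selectClosest(devices, count = 2).map { it.id })                 // [1211, 1212]
}
```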

Landscapes

  • Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Information Transfer Between Computers (AREA)

Abstract

A screen-sharing electronic device according to an embodiment of the present document comprises: a wireless communication circuit; a display; a memory storing instructions; and a processor electrically connected to the wireless communication circuit, the display, and the memory, wherein the processor, when the instructions are executed, can reproduce a video screen on the display, obtain a first user input specifying a first region corresponding to a part of the video screen being reproduced, and, when the first region is specified, transmit, in real time to an external device via the wireless communication circuit, screen data reproduced in the first region of the video screen. A method for sharing a screen by an electronic device, according to an embodiment of the present document, may comprise the steps of: reproducing a video screen on a display; obtaining a first user input specifying a first region corresponding to a part of the video screen being reproduced; and, when the first region is specified, transmitting, in real time to an external device via a wireless communication circuit, screen data reproduced in the first region of the video screen. According to various embodiments of the present document, an electronic device and a method may be disclosed that allow a user to share a specific region, and to change a sharing or recording region in response to a change of content or a movement/change of an object in the screen, by moving the specific region or changing its size.
PCT/KR2021/003644 2020-03-27 2021-03-24 Dispositif électronique et procédé de partage d'écran WO2021194252A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR1020200037501A KR20210120589A (ko) 2020-03-27 2020-03-27 화면을 공유하는 전자 장치 및 방법
KR10-2020-0037501 2020-03-27

Publications (1)

Publication Number Publication Date
WO2021194252A1 true WO2021194252A1 (fr) 2021-09-30

Family

ID=77890648

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2021/003644 WO2021194252A1 (fr) 2020-03-27 2021-03-24 Dispositif électronique et procédé de partage d'écran

Country Status (2)

Country Link
KR (1) KR20210120589A (fr)
WO (1) WO2021194252A1 (fr)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20230053204A (ko) * 2021-10-14 2023-04-21 삼성전자주식회사 전자 장치들 간의 위치 관계에 기반하여 사용자 인터페이스를 제공하는 방법 및 그를 위한 전자 장치
KR20240026566A (ko) * 2022-08-22 2024-02-29 삼성전자주식회사 디스플레이 장치 및 그 동작 방법

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH09231044A (ja) * 1996-02-26 1997-09-05 Canon Inc 画面共有システムおよび方法
JP2011154478A (ja) * 2010-01-26 2011-08-11 Canon Inc 画面共有装置及びその制御方法、プログラム、画面共有システム
KR20110107058A (ko) * 2010-03-24 2011-09-30 엘지전자 주식회사 이동 단말기 및 그 제어방법
KR20120079416A (ko) * 2011-01-04 2012-07-12 삼성전자주식회사 웹 페이지의 콘텐츠 서비스 공유 방법, 장치 및 이를 제공하는 시스템
KR20170059331A (ko) * 2015-11-20 2017-05-30 주식회사 티슈 어플을 이용한 이미지 추출 및 공유 시스템과, 그 방법


Also Published As

Publication number Publication date
KR20210120589A (ko) 2021-10-07

Similar Documents

Publication Publication Date Title
WO2020045947A1 (fr) Dispositif électronique de commande de propriété d'écran sur la base de la distance entre un dispositif d'entrée de stylet et le dispositif électronique et son procédé de commande
WO2019117566A1 (fr) Dispositif électronique et procédé de commande d'entrée associé
WO2021261722A1 (fr) Dispositif électronique comprenant une unité d'affichage souple
WO2020162709A1 (fr) Dispositif électronique pour la fourniture de données graphiques basées sur une voix et son procédé de fonctionnement
WO2021118061A1 (fr) Dispositif électronique et procédé de configuration d'agencement utilisant ledit dispositif
WO2020067639A1 (fr) Dispositif électronique d'appariement avec un stylet et procédé associé
WO2020149689A1 (fr) Procédé de traitement d'image et dispositif électronique le prenant en charge
WO2021194252A1 (fr) Dispositif électronique et procédé de partage d'écran
WO2020032347A1 (fr) Dispositif électronique et procédé de fourniture d'environnement de dessin
WO2019164183A1 (fr) Dispositif électronique d'acquisition d'informations biométriques à l'aide d'une électrode sélectionnée parmi des électrodes de capteur biométrique, et son procédé de commande
WO2013169051A1 (fr) Procédé et appareil pour exécuter une dénomination automatique d'un contenu et support d'enregistrement lisible par ordinateur correspondant
WO2019059483A1 (fr) Dispositif electronique et procédé de commande associé
WO2020091538A1 (fr) Dispositif électronique pour afficher un écran par l'intermédiaire d'un panneau d'affichage en mode de faible puissance et son procédé de fonctionnement
WO2020171342A1 (fr) Dispositif électronique permettant de fournir un service d'intelligence artificielle visualisé sur la base d'informations concernant un objet externe, et procédé de fonctionnement pour dispositif électronique
WO2020013651A1 (fr) Dispositif électronique, et procédé pour la transmission d'un contenu du dispositif électronique
WO2021133123A1 (fr) Dispositif électronique comprenant un écran flexible et son procédé de fonctionnement
WO2019066323A1 (fr) Dispositif électronique et procédé d'exécution de contenu utilisant des informations de ligne de vue de celui-ci
WO2020013542A1 (fr) Dispositif électronique et procédé d'exécution de fonction de dispositif électronique
WO2022203195A1 (fr) Dispositif électronique pour fournir une prévisualisation de contenu, procédé de fonctionnement de celui-ci et support de stockage
WO2022030824A1 (fr) Procédé d'affichage d'écran et de lecture audio et dispositif associé
WO2020159213A1 (fr) Procédé et dispositif de configuration contextuelle personnalisée par l'utilisateur
WO2021145614A1 (fr) Dispositif électronique pour commander un dispositif électronique externe et procédé associé
WO2020022829A1 (fr) Dispositif électronique de support d'entrée utilisateur et procédé de commande de dispositif électronique
WO2021201603A1 (fr) Dispositif électronique et son procédé de commande
WO2022182025A1 (fr) Dispositif électronique et son procédé de commande

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21775602

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21775602

Country of ref document: EP

Kind code of ref document: A1