WO2022010092A1 - Electronic device for supporting content sharing - Google Patents

Electronic device for supporting content sharing

Info

Publication number
WO2022010092A1
Authority
WO
WIPO (PCT)
Prior art keywords
electronic device
content
display
image
user input
Prior art date
Application number
PCT/KR2021/006282
Other languages
English (en)
Korean (ko)
Inventor
우광택
김병국
오득규
이창호
안진완
Original Assignee
삼성전자 주식회사
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 삼성전자 주식회사
Publication of WO2022010092A1

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M1/00Substation equipment, e.g. for use by subscribers
    • H04M1/72Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724User interfaces specially adapted for cordless or mobile telephones
    • H04M1/72403User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality
    • H04M1/72409User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality by interfacing with external accessories
    • H04M1/72412User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality by interfacing with external accessories using two-way short-range wireless interfaces
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04817Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance using icons
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16Human faces, e.g. facial parts, sketches or expressions
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M1/00Substation equipment, e.g. for use by subscribers
    • H04M1/72Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724User interfaces specially adapted for cordless or mobile telephones
    • H04M1/72469User interfaces specially adapted for cordless or mobile telephones for operating the device by selecting functions from two or more displayed items, e.g. menus or icons
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M1/00Substation equipment, e.g. for use by subscribers
    • H04M1/72Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/725Cordless telephones
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M2201/00Electronic components, circuits, software, systems or apparatus used in telephone systems
    • H04M2201/34Microprocessors
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M2201/00Electronic components, circuits, software, systems or apparatus used in telephone systems
    • H04M2201/38Displays

Definitions

  • Embodiments disclosed in this document may relate to an electronic device supporting content sharing.
  • Electronic devices may visually provide content to a user through a display.
  • In addition, electronic devices may communicate with an external electronic device through a network, and may transmit (or share) various content to the external electronic device.
  • Embodiments aim to provide an electronic device capable of sharing content with an external electronic device based on display state information of the content changed according to a user's intention.
  • Embodiments also aim to provide an electronic device capable of efficiently sharing information on a plurality of objects with an external electronic device by automatically extracting a plurality of objects of the same type from within one image.
  • An electronic device according to an embodiment includes a display, a communication module for communicating with an external electronic device, and a processor operatively connected to the display and the communication module. The processor may be configured to output at least a portion of content on the display, obtain display state information indicating a display state of the content based on a user input for sharing the content, and transmit original information of the content and the display state information to the external electronic device through the communication module.
  • An electronic device according to another embodiment includes a display, a communication module for communicating with an external electronic device, and a processor operatively connected to the display and the communication module. The processor may be configured to determine, based on a user input for sharing an image, whether a plurality of objects belonging to the same group exist in the original image, and, when it is determined that the plurality of objects belonging to the same group exist in the original image, generate at least one cropped image of at least one of the plurality of objects and transmit the at least one cropped image to the external electronic device through the communication module.
  • the electronic device may share content with an external electronic device based on the display state information of the content changed according to the user's intention.
  • the electronic device may automatically extract a plurality of objects of the same type from within one image, and efficiently share information on the plurality of objects with an external electronic device.
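  • As an illustration only (the disclosure does not prescribe any data format), the two kinds of information named above can be modeled as in the following Kotlin sketch; all type and field names are hypothetical.

```kotlin
// Hypothetical model of the share payload described in the embodiments above.
data class ContentOriginalInfo(
    val url: String? = null,            // URL information of the content, if available
    val originalData: ByteArray? = null // or the original data of the content itself
)

data class DisplayStateInfo(
    val scrollRatio: Float? = null,                 // scroll position as a movement rate (0.0..1.0)
    val effectRanges: List<IntRange> = emptyList(), // positions of added specified effects (e.g., highlights)
    val displaySizePx: Pair<Int, Int>? = null,      // absolute display size of the content on the sender's screen
    val magnification: Float? = null                // enlargement or reduction ratio
)

data class SharePayload(
    val original: ContentOriginalInfo,
    val displayState: DisplayStateInfo? = null // omitted when no display state is shared
)
```

  • A receiving device can simply render the original content when displayState is absent, and otherwise restore the sender's view as described for the figures below.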
  • FIG. 1 is a block diagram of an electronic device in a network environment, according to various embodiments of the present disclosure
  • FIG. 2 is a block diagram of a display module according to various embodiments of the present disclosure.
  • FIG. 3 is a block diagram illustrating a program according to various embodiments.
  • FIG. 4 is a flowchart of an operation of an electronic device according to an exemplary embodiment.
  • FIG. 5 is a flowchart illustrating an operation of an electronic device according to an exemplary embodiment.
  • FIG. 6 is a diagram illustrating an operation of transmitting content from an electronic device to an external electronic device according to an exemplary embodiment.
  • FIG. 7A is a diagram illustrating an operation of transmitting content from an electronic device to an external electronic device according to an exemplary embodiment.
  • FIG. 7B is a diagram of a screen displaying shared content information.
  • FIG. 8 is a diagram illustrating an operation of transmitting content from an electronic device to an external electronic device according to an exemplary embodiment.
  • FIG. 9 is a flowchart illustrating an operation of an electronic device according to an exemplary embodiment.
  • FIG. 10 is a flowchart illustrating an operation of an electronic device according to an exemplary embodiment.
  • FIG. 11 is a diagram illustrating an operation of transmitting a plurality of cropped images to an external electronic device in an electronic device according to an exemplary embodiment
  • FIG. 12 is a diagram illustrating an operation of transmitting a plurality of cropped images to an external electronic device in an electronic device according to an exemplary embodiment.
  • FIG. 13 is a flowchart illustrating an operation of an electronic device according to an exemplary embodiment.
  • FIG. 14 is a diagram illustrating an operation of transmitting a plurality of cropped images to an external electronic device in an electronic device according to an exemplary embodiment.
  • FIG. 15 is a diagram illustrating an operation of transmitting a plurality of cropped images to an external electronic device in an electronic device according to an exemplary embodiment.
  • FIG. 1 is a block diagram of an electronic device 101 in a network environment 100 according to various embodiments.
  • Referring to FIG. 1, an electronic device 101 may communicate with an electronic device 102 through a first network 198 (e.g., a short-range wireless communication network), or may communicate with an electronic device 104 or a server 108 through a second network 199 (e.g., a long-distance wireless communication network). According to an embodiment, the electronic device 101 may communicate with the electronic device 104 through the server 108.
  • According to an embodiment, the electronic device 101 may include a processor 120, a memory 130, an input module 150, a sound output module 155, a display module 160, an audio module 170, a sensor module 176, an interface 177, a connection terminal 178, a haptic module 179, a camera module 180, a power management module 188, a battery 189, a communication module 190, a subscriber identification module 196, or an antenna module 197.
  • In some embodiments, at least one of these components (e.g., the connection terminal 178) may be omitted, or one or more other components may be added to the electronic device 101.
  • In some embodiments, some of these components may be integrated into one component (e.g., the display module 160).
  • The processor 120 may, for example, execute software (e.g., a program 140) to control at least one other component (e.g., a hardware or software component) of the electronic device 101 connected to the processor 120, and may perform various data processing or operations. According to one embodiment, as at least part of the data processing or operations, the processor 120 may store commands or data received from another component (e.g., the sensor module 176 or the communication module 190) in the volatile memory 132, process the commands or data stored in the volatile memory 132, and store the resulting data in the non-volatile memory 134.
  • According to an embodiment, the processor 120 may include a main processor 121 (e.g., a central processing unit or an application processor) or an auxiliary processor 123 (e.g., a graphics processing unit, a neural processing unit (NPU), an image signal processor, a sensor hub processor, or a communication processor) that can be operated independently of, or together with, the main processor 121.
  • The auxiliary processor 123 may, for example, control at least some of the functions or states related to at least one of the components of the electronic device 101 (e.g., the display module 160, the sensor module 176, or the communication module 190) on behalf of the main processor 121 while the main processor 121 is in an inactive (e.g., sleep) state, or together with the main processor 121 while the main processor 121 is in an active (e.g., application-executing) state.
  • According to an embodiment, the auxiliary processor 123 (e.g., an image signal processor or a communication processor) may be implemented as part of another functionally related component (e.g., the camera module 180 or the communication module 190).
  • the auxiliary processor 123 may include a hardware structure specialized for processing an artificial intelligence model.
  • Artificial intelligence models can be created through machine learning. Such learning may be performed, for example, in the electronic device 101 itself on which artificial intelligence is performed, or may be performed through a separate server (eg, the server 108).
  • The learning algorithm may include, for example, supervised learning, unsupervised learning, semi-supervised learning, or reinforcement learning, but is not limited to the above examples.
  • the artificial intelligence model may include a plurality of artificial neural network layers.
  • The artificial neural network may be one of a deep neural network (DNN), a convolutional neural network (CNN), a recurrent neural network (RNN), a restricted Boltzmann machine (RBM), a deep belief network (DBN), a bidirectional recurrent deep neural network (BRDNN), a deep Q-network, or a combination of two or more of the above, but is not limited to the above examples.
  • The artificial intelligence model may, additionally or alternatively, include a software structure in addition to the hardware structure.
  • the memory 130 may store various data used by at least one component of the electronic device 101 (eg, the processor 120 or the sensor module 176 ).
  • the data may include, for example, input data or output data for software (eg, the program 140 ) and instructions related thereto.
  • the memory 130 may include a volatile memory 132 or a non-volatile memory 134 .
  • the program 140 may be stored as software in the memory 130 , and may include, for example, an operating system 142 , middleware 144 , or an application 146 .
  • the input module 150 may receive a command or data to be used in a component (eg, the processor 120 ) of the electronic device 101 from the outside (eg, a user) of the electronic device 101 .
  • the input module 150 may include, for example, a microphone, a mouse, a keyboard, a key (eg, a button), or a digital pen (eg, a stylus pen).
  • the sound output module 155 may output a sound signal to the outside of the electronic device 101 .
  • the sound output module 155 may include, for example, a speaker or a receiver.
  • the speaker can be used for general purposes such as multimedia playback or recording playback.
  • the receiver may be used to receive an incoming call. According to one embodiment, the receiver may be implemented separately from or as part of the speaker.
  • the display module 160 may visually provide information to the outside (eg, a user) of the electronic device 101 .
  • The display module 160 may include, for example, a display, a hologram device, or a projector, and a control circuit for controlling the corresponding device.
  • the display module 160 may include a touch sensor configured to sense a touch or a pressure sensor configured to measure the intensity of a force generated by the touch.
  • The audio module 170 may convert a sound into an electric signal or, conversely, convert an electric signal into a sound. According to an embodiment, the audio module 170 may acquire a sound through the input module 150, or may output a sound through the sound output module 155 or an external electronic device (e.g., the electronic device 102, such as a speaker or headphones) connected directly or wirelessly with the electronic device 101.
  • The sensor module 176 may detect an operating state (e.g., power or temperature) of the electronic device 101 or an external environmental state (e.g., a user state), and may generate an electrical signal or data value corresponding to the detected state.
  • The sensor module 176 may include, for example, a gesture sensor, a gyro sensor, a barometric pressure sensor, a magnetic sensor, an acceleration sensor, a grip sensor, a proximity sensor, a color sensor, an infrared (IR) sensor, a biometric sensor, a temperature sensor, a humidity sensor, or an illuminance sensor.
  • the interface 177 may support one or more designated protocols that may be used by the electronic device 101 to directly or wirelessly connect with an external electronic device (eg, the electronic device 102 ).
  • the interface 177 may include, for example, a high definition multimedia interface (HDMI), a universal serial bus (USB) interface, an SD card interface, or an audio interface.
  • the connection terminal 178 may include a connector through which the electronic device 101 can be physically connected to an external electronic device (eg, the electronic device 102 ).
  • the connection terminal 178 may include, for example, an HDMI connector, a USB connector, an SD card connector, or an audio connector (eg, a headphone connector).
  • the haptic module 179 may convert an electrical signal into a mechanical stimulus (eg, vibration or movement) or an electrical stimulus that the user can perceive through tactile or kinesthetic sense.
  • the haptic module 179 may include, for example, a motor, a piezoelectric element, or an electrical stimulation device.
  • the camera module 180 may capture still images and moving images. According to an embodiment, the camera module 180 may include one or more lenses, image sensors, image signal processors, or flashes.
  • the power management module 188 may manage power supplied to the electronic device 101 .
  • the power management module 188 may be implemented as, for example, at least a part of a power management integrated circuit (PMIC).
  • the battery 189 may supply power to at least one component of the electronic device 101 .
  • battery 189 may include, for example, a non-rechargeable primary cell, a rechargeable secondary cell, or a fuel cell.
  • The communication module 190 may support establishment of a direct (e.g., wired) communication channel or a wireless communication channel between the electronic device 101 and an external electronic device (e.g., the electronic device 102, the electronic device 104, or the server 108), and communication through the established communication channel.
  • the communication module 190 may include one or more communication processors that operate independently of the processor 120 (eg, an application processor) and support direct (eg, wired) communication or wireless communication.
  • According to an embodiment, the communication module 190 may include a wireless communication module 192 (e.g., a cellular communication module, a short-range wireless communication module, or a global navigation satellite system (GNSS) communication module) or a wired communication module 194 (e.g., a local area network (LAN) communication module or a power line communication module).
  • A corresponding communication module among these communication modules may communicate with the external electronic device 104 through the first network 198 (e.g., a short-range communication network such as Bluetooth, wireless fidelity (WiFi) direct, or infrared data association (IrDA)) or the second network 199 (e.g., a long-distance communication network such as a legacy cellular network, a 5G network, a next-generation communication network, the Internet, or a computer network (e.g., a telecommunication network such as a LAN or a WAN)).
  • The wireless communication module 192 may identify or authenticate the electronic device 101 within a communication network, such as the first network 198 or the second network 199, using subscriber information (e.g., an International Mobile Subscriber Identity (IMSI)) stored in the subscriber identification module 196.
  • The wireless communication module 192 may support a 5G network following a 4G network and next-generation communication technology, for example, new radio (NR) access technology.
  • The NR access technology may support high-speed transmission of high-capacity data (enhanced mobile broadband (eMBB)), minimization of terminal power and access by multiple terminals (massive machine type communications (mMTC)), or high reliability and low latency (ultra-reliable and low-latency communications (URLLC)).
  • the wireless communication module 192 may support a high frequency band (eg, mmWave band) to achieve a high data rate, for example.
  • The wireless communication module 192 may support various technologies for securing performance in a high-frequency band, for example, beamforming, massive multiple-input and multiple-output (MIMO), full-dimensional MIMO (FD-MIMO), an array antenna, analog beamforming, or a large-scale antenna.
  • the wireless communication module 192 may support various requirements specified in the electronic device 101 , an external electronic device (eg, the electronic device 104 ), or a network system (eg, the second network 199 ).
  • The wireless communication module 192 may support a peak data rate (e.g., 20 Gbps or more) for realizing eMBB, loss coverage (e.g., 164 dB or less) for realizing mMTC, or U-plane latency (e.g., 0.5 ms or less for each of downlink (DL) and uplink (UL), or 1 ms or less round trip) for realizing URLLC.
  • the antenna module 197 may transmit or receive a signal or power to the outside (eg, an external electronic device).
  • the antenna module 197 may include an antenna including a conductor formed on a substrate (eg, a PCB) or a radiator formed of a conductive pattern.
  • According to an embodiment, the antenna module 197 may include a plurality of antennas (e.g., an array antenna). In this case, at least one antenna suitable for a communication method used in a communication network, such as the first network 198 or the second network 199, may be selected from the plurality of antennas by, for example, the communication module 190. A signal or power may be transmitted or received between the communication module 190 and an external electronic device through the selected at least one antenna.
  • According to some embodiments, other components (e.g., a radio frequency integrated circuit (RFIC)) in addition to the radiator may be additionally formed as a part of the antenna module 197.
  • the antenna module 197 may form a mmWave antenna module.
  • According to an embodiment, the mmWave antenna module may include a printed circuit board, an RFIC disposed on or adjacent to a first side (e.g., the bottom side) of the printed circuit board and capable of supporting a designated high-frequency band (e.g., the mmWave band), and a plurality of antennas (e.g., an array antenna) disposed on or adjacent to a second side (e.g., the top or a side) of the printed circuit board and capable of transmitting or receiving signals of the designated high-frequency band.
  • At least some of the above-described components may be connected to each other through a communication method between peripheral devices (e.g., a bus, general purpose input and output (GPIO), serial peripheral interface (SPI), or mobile industry processor interface (MIPI)) and may exchange signals (e.g., commands or data) with each other.
  • the command or data may be transmitted or received between the electronic device 101 and the external electronic device 104 through the server 108 connected to the second network 199 .
  • Each of the external electronic devices 102 or 104 may be the same as or different from the electronic device 101 .
  • all or a part of operations executed in the electronic device 101 may be executed in one or more external electronic devices 102 , 104 , or 108 .
  • For example, when the electronic device 101 needs to perform a function or service automatically or in response to a request from a user or another device, the electronic device 101 may request one or more external electronic devices to perform at least a part of the function or the service, instead of or in addition to executing the function or service itself.
  • One or more external electronic devices that have received the request may execute at least a part of the requested function or service, or an additional function or service related to the request, and transmit a result of the execution to the electronic device 101 .
  • the electronic device 101 may process the result as it is or additionally and provide it as at least a part of a response to the request.
  • Cloud computing, distributed computing, mobile edge computing (MEC), or client-server computing technology may be used, for example.
  • the electronic device 101 may provide an ultra-low latency service using, for example, distributed computing or mobile edge computing.
  • the external electronic device 104 may include an Internet of things (IoT) device.
  • Server 108 may be an intelligent server using machine learning and/or neural networks.
  • the external electronic device 104 or the server 108 may be included in the second network 199 .
  • the electronic device 101 may be applied to an intelligent service (eg, smart home, smart city, smart car, or health care) based on 5G communication technology and IoT-related technology.
  • the display module 160 may include a display 210 and a display driver integrated circuit (DDI) 230 for controlling the display 210 .
  • the display driving circuit 230 may include an interface module 231 , a memory 233 (eg, a buffer memory), an image processing module 235 , or a mapping module 237 .
  • The display driving circuit 230 may receive, through the interface module 231, image information including image data or an image control signal corresponding to a command for controlling the image data from another component of the electronic device 101.
  • For example, the image information may be received from the processor 120 (e.g., the main processor 121, such as an application processor) or the auxiliary processor 123 (e.g., a graphics processing unit) that operates independently of the function of the main processor 121.
  • the display driving circuit 230 may communicate with the touch circuit 250 or the sensor module 176 through the interface module 231.
  • The display driving circuit 230 may store at least a portion of the received image information in the memory 233, for example, in units of frames.
  • The image processing module 235 may perform pre-processing or post-processing (e.g., resolution, brightness, or size adjustment) on at least a portion of the image data, based at least on characteristics of the image data or characteristics of the display 210.
  • The mapping module 237 may generate a voltage value or a current value corresponding to the image data pre-processed or post-processed through the image processing module 235. According to an embodiment, the generation of the voltage value or the current value may be performed based at least in part on, for example, properties of the pixels of the display 210 (e.g., an arrangement of pixels such as an RGB stripe or PenTile structure, or the size of each sub-pixel). At least some pixels of the display 210 may be driven based at least in part on the voltage value or the current value, so that visual information (e.g., text, an image, or an icon) corresponding to the image data may be displayed through the display 210.
  • the display module 160 may further include a touch circuit 250 .
  • the touch circuit 250 may include a touch sensor 251 and a touch sensor IC 253 for controlling the touch sensor 251 .
  • the touch sensor IC 253 may control the touch sensor 251 to sense a touch input or a hovering input for a specific position of the display 210 , for example.
  • the touch sensor IC 253 may detect a touch input or a hovering input by measuring a change in a signal (eg, voltage, light amount, resistance, or electric charge amount) for a specific position of the display 210 .
  • the touch sensor IC 253 may provide information (eg, location, area, pressure, or time) regarding the sensed touch input or hovering input to the processor 120 .
  • At least a part of the touch circuit 250 may be disposed as a part of the display driver IC 230 or the display 210, or may be included as a part of another component disposed outside the display module 160 (e.g., the auxiliary processor 123).
  • the display module 160 may further include at least one sensor (eg, a fingerprint sensor, an iris sensor, a pressure sensor, or an illuminance sensor) of the sensor module 176 , or a control circuit therefor.
  • the at least one sensor or a control circuit therefor may be embedded in a part of the display module 160 (eg, the display 210 or the display driving circuit 230 ) or a part of the touch circuit 250 .
  • For example, when the sensor module 176 embedded in the display module 160 includes a biometric sensor (e.g., a fingerprint sensor), the biometric sensor may acquire biometric information (e.g., a fingerprint image) related to a touch input through a partial area of the display 210.
  • the pressure sensor may acquire pressure information related to a touch input through a part or the entire area of the display 210 .
  • the touch sensor 251 or the sensor module 176 may be disposed between pixels of the pixel layer of the display 210 or above or below the pixel layer.
  • FIG. 3 is a block diagram 300 illustrating a program 140 in accordance with various embodiments.
  • According to an embodiment, the program 140 may include an operating system 142 for controlling one or more resources of the electronic device 101, middleware 144, or an application 146 executable in the operating system 142.
  • Operating system 142 may include, for example, Android TM , iOS TM , Windows TM , Symbian TM , Tizen TM , or Bada TM .
  • At least a part of the program 140 may be, for example, preloaded into the electronic device 101 at the time of manufacture, or downloaded or updated from an external electronic device (e.g., the electronic device 102 or 104, or the server 108) when used by a user.
  • the operating system 142 may control management (eg, allocation or retrieval) of one or more system resources (eg, a process, memory, or power) of the electronic device 101 .
  • The operating system 142 may, additionally or alternatively, include one or more driver programs for driving other hardware devices of the electronic device 101, for example, the input module 150, the sound output module 155, the display module 160, the audio module 170, the sensor module 176, the interface 177, the haptic module 179, the camera module 180, the power management module 188, the battery 189, the communication module 190, the subscriber identification module 196, or the antenna module 197.
  • the middleware 144 may provide various functions to the application 146 so that functions or information provided from one or more resources of the electronic device 101 may be used by the application 146 .
  • The middleware 144 may include, for example, an application manager 301, a window manager 303, a multimedia manager 305, a resource manager 307, a power manager 309, a database manager 311, a package manager 313, a connectivity manager 315, a notification manager 317, a location manager 319, a graphics manager 321, a security manager 323, a call manager 325, or a voice recognition manager 327.
  • the application manager 301 may manage the life cycle of the application 146 , for example.
  • the window manager 303 may manage one or more GUI resources used in a screen, for example.
  • The multimedia manager 305 may, for example, identify one or more formats required for playback of media files, and encode or decode a corresponding media file among the media files using a codec suitable for the selected format.
  • The resource manager 307 may, for example, manage the source code of the application 146 or the memory space of the memory 130.
  • The power manager 309 may, for example, manage the capacity, temperature, or power of the battery 189, and may determine or provide related information required for the operation of the electronic device 101 using the corresponding information. According to an embodiment, the power manager 309 may interwork with a basic input/output system (BIOS) (not shown) of the electronic device 101.
  • the database manager 311 may create, retrieve, or change a database to be used by the application 146 , for example.
  • the package manager 313 may manage installation or update of an application distributed in the form of a package file, for example.
  • the connectivity manager 315 may manage, for example, a wireless connection or a direct connection between the electronic device 101 and an external electronic device.
  • the notification manager 317 may provide, for example, a function for notifying the user of the occurrence of a specified event (eg, an incoming call, a message, or an alarm).
  • the location manager 319 may manage location information of the electronic device 101 , for example.
  • the graphic manager 321 may manage, for example, one or more graphic effects to be provided to a user or a user interface related thereto.
  • Security manager 323 may provide, for example, system security or user authentication.
  • the telephony manager 325 may manage, for example, a voice call function or a video call function provided by the electronic device 101 .
  • The voice recognition manager 327 may, for example, transmit the user's voice data to the server 108, and may receive from the server 108 a command corresponding to a function to be performed in the electronic device 101 based at least in part on the voice data, or text data converted based at least in part on the voice data.
  • According to an embodiment, the middleware 144 may dynamically delete some existing components or add new components.
  • at least a portion of the middleware 144 may be included as a part of the operating system 142 or implemented as software separate from the operating system 142 .
  • The application 146 may include, for example, home 351, dialer 353, SMS/MMS 355, instant message (IM) 357, browser 359, camera 361, alarm 363, contact 365, voice recognition 367, email 369, calendar 371, media player 373, album 375, watch 377, health 379 (e.g., measuring biometric information such as exercise or blood sugar), or environment information 381 (e.g., measuring atmospheric pressure, humidity, or temperature information) applications.
  • the application 146 may further include an information exchange application (not shown) capable of supporting information exchange between the electronic device 101 and an external electronic device.
  • the information exchange application may include, for example, a notification relay application configured to transmit specified information (eg, call, message, or alarm) to an external electronic device, or a device management application configured to manage the external electronic device.
  • The notification relay application may, for example, transmit notification information corresponding to a specified event (e.g., mail reception) generated in another application (e.g., the email application 369) of the electronic device 101 to the external electronic device.
  • the notification relay application may receive notification information from the external electronic device and provide it to the user of the electronic device 101 .
  • The device management application may, for example, control a power source (e.g., turn-on or turn-off) or a function (e.g., brightness, resolution, or focus) of an external electronic device communicating with the electronic device 101, or of some components thereof (e.g., the display module 160 or the camera module 180).
  • the device management application may additionally or alternatively support installation, deletion, or update of an application operating in an external electronic device.
  • FIG. 4 is a flowchart of an operation of an electronic device according to an exemplary embodiment.
  • an operation of the electronic device may be referred to as an operation of the processor 120 (refer to FIG. 1 ).
  • the electronic device may display at least a portion of the content (eg, the first area) through the display 210 (refer to FIG. 2 ).
  • the entire area corresponding to the content may be larger than the first area displayed on the display 210 .
  • the first area may be a partial area of content that is initially output based on a user input.
  • For example, the electronic device may receive data of content from an external electronic device (e.g., the electronic device 104 of FIG. 1) or a server (e.g., the server 108 of FIG. 1) through a browser 359 (refer to FIG. 3) application, and may display the content on the display 210 based on the received data.
  • the electronic device may display content on the display 210 using data stored in the memory 130 (refer to FIG. 1 ).
  • the electronic device may change the display state of the content.
  • the electronic device may change the display state of the content by adjusting at least one of a display area, a visual effect, and/or a display magnification of the content.
  • the electronic device may display the second area of the content on the display 210 based on a user input (eg, a touch input or a gesture input).
  • the second area of the content may be a different area from the first area.
  • At least a portion of the second area in the content may be an area other than the first area.
  • the electronic device may display a second area different from the first area on the display by changing (eg, scrolling or moving) the display area of the content based on a user input.
  • the electronic device may add (or input) a specified effect (eg, highlight) to at least a part of the content based on a user input (eg, a drag input).
  • the electronic device may change the display state of the content by adding an effect to the first area or the second area of the displayed content.
  • the electronic device may enlarge or reduce content to display based on a user input (eg, a gesture input).
  • the electronic device may receive at least one user input for sharing content with an external electronic device (eg, the electronic device 104 of FIG. 1 ).
  • the user input for sharing content may include a user touch input for a sharing icon displayed on the display 210 .
  • the share icon may be displayed on one area of the screen together with the content.
  • the screen may be an area in which an image or an image is expressed by the display 210 .
  • the share icon may be displayed based on a specified user input.
  • For example, the user input for sharing the content may include a first user input for a user interface (UI) (e.g., a display state image) related to a display state of the content and a second user input for a share icon.
  • The user may select whether to share the content display state through the first user input to the user interface related to the changed display state of the content, and may transmit the content to the external electronic device 104 through the second user input to the share icon.
  • the electronic device may obtain information about the changed display state of the content.
  • the display state information of the content may include at least one of a scroll position (or a scroll movement rate), a position of an added specified effect, a display size of the content, an enlargement rate, or a reduction rate.
  • the display size of the content may be the absolute size of the content displayed on the display (screen) of the electronic device.
  • The electronic device may transmit the original content information and the changed content display state information to an external electronic device (e.g., the electronic device 104 of FIG. 1) using a communication module (e.g., the communication module 190 of FIG. 1).
  • the content original information may include at least one of URL information of the content and original data of the content.
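  • The sender-side flow of FIG. 4 (obtain the display state when the share input is received, then transmit it with the original information) could look like the following Kotlin sketch; the ContentView interface, its accessors, and the send callback are assumptions made for illustration, not elements of the disclosure.

```kotlin
// Hypothetical sender-side sketch of the FIG. 4 operations.
data class DisplayState(
    val scrollRatio: Float,                // how far the content is scrolled (0.0 = top, 1.0 = bottom)
    val highlightedRanges: List<IntRange>, // positions of added specified effects (e.g., highlights)
    val magnification: Float               // current enlargement or reduction ratio
)

data class SharePacket(val contentUrl: String, val displayState: DisplayState?)

// Assumed accessors for the view that renders the content; a real implementation
// would query the browser or viewer component actually used.
interface ContentView {
    fun scrollRatio(): Float
    fun highlightedRanges(): List<IntRange>
    fun magnification(): Float
}

fun onShareRequested(view: ContentView, contentUrl: String, send: (SharePacket) -> Unit) {
    // Obtain the display state information at the moment the user input for sharing is received.
    val state = DisplayState(view.scrollRatio(), view.highlightedRanges(), view.magnification())
    // Transmit the original information (here represented by the URL) together with the display state.
    send(SharePacket(contentUrl, state))
}
```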
  • FIG. 5 is a flowchart illustrating an operation of an electronic device according to an exemplary embodiment.
  • an operation of the electronic device may be referred to as an operation of the processor 120 (refer to FIG. 1 ).
  • the electronic device may receive at least one user input for sharing the content being output with an external electronic device (eg, the electronic device 104 of FIG. 1 ).
  • the user input for sharing content may include a user touch input for a sharing icon displayed on the display 210 .
  • The user input for sharing the content may include a first user input for a user interface (UI) (e.g., a display state image) related to a display state of the content and a second user input for a share icon.
  • the electronic device may determine whether to change the content display state.
  • the electronic device may determine whether to change at least one of a display area of the content, a visual effect, a display size of the content, and/or a display magnification.
  • the electronic device may determine whether a position of a scroll corresponding to a region in which content is being displayed is different from a position (eg, a scroll position) of the first region of the content.
  • the first area may be a partial area of content that is initially output based on a user input.
  • the electronic device may determine whether a specified effect (eg, highlight) is input to at least a partial area of the content.
  • the electronic device may determine whether the content being output is enlarged or reduced compared to the first area.
  • the electronic device may determine the display size of the content being output.
  • the electronic device may transmit content original information to an external electronic device using a communication module (eg, the communication module 190).
  • the content original information may include at least one of URL information of the content and original data of the content.
  • the electronic device may obtain information about the changed content display state.
  • the content display state information may include at least one of a scroll position (or a scroll movement rate), a position of an input specified effect, a display size of the content, an enlargement rate, or a reduction rate.
  • the electronic device may transmit the original content information and the changed content display state information to the external electronic device using the communication module.
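  • The branch described for FIG. 5 (transmit only the original information when the display state has not changed, otherwise also transmit the display state information) could be sketched as follows; isDefaultState and transmit are assumed helpers used only for illustration.

```kotlin
// Hypothetical sketch of the FIG. 5 decision: include display state information only
// when the display area, effect, size, or magnification was actually changed.
data class DisplayState(val scrollRatio: Float, val highlights: List<IntRange>, val magnification: Float)

fun isDefaultState(s: DisplayState): Boolean =
    s.scrollRatio == 0f && s.highlights.isEmpty() && s.magnification == 1f

fun shareContent(contentUrl: String, state: DisplayState, transmit: (String, DisplayState?) -> Unit) {
    if (isDefaultState(state)) {
        transmit(contentUrl, null)   // only the original content information
    } else {
        transmit(contentUrl, state)  // original information plus changed display state information
    }
}
```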
  • FIG. 6 is a diagram illustrating an operation of transmitting content from an electronic device to an external electronic device according to an exemplary embodiment.
  • the electronic device 101 may display the first area 610 of the content 622 on the screen.
  • the electronic device 101 may display a scroll 611 for moving the area where the content 622 is displayed in one direction.
  • the electronic device according to an embodiment may display an area of the content 622 corresponding to the position of the scroll 611 .
  • the first area 610 may be an area of the content 622 displayed when the scroll 611 is positioned at the top.
  • the electronic device may display the second area 620 of the content 622 .
  • The electronic device 101 may move (or change) the area displayed in the content 622 from the first area 610 to the second area 620, for example, based on a user input (e.g., an input that moves the scroll 611).
  • The electronic device may display at least one application 631 through which transmission of the content 622 can be performed.
  • the at least one application 631 may include at least one of SMS/MMS 355 (refer to FIG. 3 ), IM 357 (refer to FIG. 3 ), and email 369 (refer to FIG. 3 ).
  • the electronic device may list and display icons of at least one application 631 .
  • the electronic device 101 may display a preview image 630 of the content 622 to be shared.
  • the electronic device 101 may display the preview image 630 to provide information about the shared content 622 to the user.
  • the electronic device 101 may transmit the content original information and the changed display state information to the external electronic device 104 .
  • The external electronic device 104 may display the shared content information 641 through the screen 640 that is output based on the execution of the same application as the application selected on the electronic device 101.
  • the electronic device 101 may display the shared content information 641 through the screen of the electronic device 101 that is output based on the execution of the selected application.
  • the content 622 original information may include at least one of URL information of the content 622 and original data of the content 622 .
  • the changed display state information of the content 622 may include a scroll 611 movement rate.
  • the changed display state information of the content 622 may include at least one of a display size of the content, an enlargement ratio or a reduction ratio of the screen.
  • the shared content information 641 may include a thumbnail of the shared content 622 .
  • The second area 620 of the content 622 may be displayed on the external electronic device 104.
  • For example, the external electronic device 104 may output the area of the content 622 corresponding to the scroll position included in the display state information (the second area 620) as the first screen.
  • The external electronic device 104 may output the content to the screen at the same size as the content displayed on the electronic device 101, without separately adjusting the display ratio of the content.
  • the size of the content 622 output from the external electronic device 104 may be the same as the size of the content 622 output from the electronic device 101 .
  • The external electronic device 104 may enlarge or reduce the content 622 and display it based on the obtained enlargement or reduction ratio.
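  • On the receiving side, the scroll movement rate and the enlargement or reduction ratio could be re-applied roughly as sketched below; the RenderedContent interface and its members are assumptions for illustration and do not name any real API.

```kotlin
// Hypothetical receiver-side sketch for FIG. 6: open the shared content at the same
// area and scale the sender was viewing.
interface RenderedContent {
    val contentHeightPx: Int
    val viewportHeightPx: Int
    fun scrollTo(yPx: Int)
    fun setMagnification(scale: Float)
}

fun applyDisplayState(view: RenderedContent, scrollRatio: Float?, magnification: Float?) {
    magnification?.let { view.setMagnification(it) }
    scrollRatio?.let {
        // Convert the movement rate back into a pixel offset relative to the scrollable height.
        val scrollable = (view.contentHeightPx - view.viewportHeightPx).coerceAtLeast(0)
        view.scrollTo((scrollable * it).toInt())
    }
}
```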
  • FIG. 7A is a diagram illustrating an operation of transmitting content from an electronic device to an external electronic device according to an exemplary embodiment.
  • the electronic device 101 may display the first area 710 of the content 722 on the screen.
  • the electronic device according to an embodiment may display an area of the content 722 corresponding to the position of the scroll 711 .
  • the electronic device may display the second area 720 of the content 722 .
  • The electronic device 101 may move (or change) the area displayed in the content 722 from the first area 710 to the second area 720, for example, based on a user input (e.g., an input that moves the scroll 711).
  • The electronic device 101 may display at least one application capable of transmitting the content 722.
  • A preview image 730 and a display state image 735 of the content 722 to be shared may be displayed.
  • the display state image 735 may be an image representing at least a part of the area (the second area 720 ) being displayed in the content 722 .
  • the display state image 735 is not limited to that shown in FIG. 7A , and may be a graphic display including a designated figure or character.
  • When a user input for the display state image 735 (hereinafter also referred to as a "first user input") is received, at least one property of the display state image 735 may be changed.
  • the electronic device 101 may display the border 736 of the display state image 735 with a specified color.
  • at least one of a shape and a color of the display state image 735 may be changed.
  • a check box may be displayed in an area surrounding the display state image 735 or in an area overlapping the display state image 735 .
  • The electronic device 101 may transmit the original information of the content 722 and the changed display state information to the external electronic device 104.
  • the electronic device 101 may transmit the original content 722 information and the changed display state information to the external electronic device 104 through a service providing server (not shown) that provides the selected application 731 .
  • the content 722 original information may include at least one of URL information of the content 722 and original data of the content 722 .
  • the changed display state information of the content 722 may include a scroll 711 movement rate.
  • the changed display state information of the content 722 may include at least one of a display size of the content, an enlargement ratio, or a reduction ratio of the screen.
  • When the first user input is not received, the changed display state information may not be transmitted to the external electronic device 104.
  • The external electronic device 104 may display the shared content information 741 through the screen 740 that is output based on the execution of the same application as the application selected on the electronic device 101.
  • the electronic device 101 may display the shared content information 741 through the screen 740 of the electronic device 101 output by the execution of the selected application.
  • the shared content information 741 may include a thumbnail of the shared content 722 .
  • The second area 720 of the content 722 may be displayed on the external electronic device 104.
  • For example, the external electronic device 104 may output the area of the content 722 corresponding to the scroll position included in the display state information (the second area 720) as the first screen.
  • When the external electronic device 104 acquires only the original content information excluding the display state information, the external electronic device 104 may output the content 722 from the first area 710.
  • FIG. 7B is a diagram of a screen displaying shared content information.
  • the external electronic device 104 may display the shared content information 741 on the screen 740 that is output based on the execution of the selected application.
  • the shared content information 741 may include a thumbnail for the shared content 722 and a notification display 742 indicating that display state information about the changed display state has been transmitted.
  • the notification display 742 may include text indicating the changed display state.
  • the present invention is not limited thereto, and the notification display 742 may include an icon including a figure.
  • FIG. 8 is a diagram illustrating an operation of transmitting content from an electronic device to an external electronic device according to an exemplary embodiment.
  • the electronic device 101 may display the first area 810 of the content 811 on the screen.
  • the first area 810 may be an area initially output from the content 811 .
  • When a user input for changing the display state of the content 811 is received, the electronic device may add a specified effect 822 (e.g., highlight) to one region (e.g., the second region 820) of the content 811. For example, when the user drags a partial area of the content 811, the specified effect may be input (or added) to the dragged area.
  • The electronic device 101 may display at least one application 831 through which transmission of the content 811 can be performed. For example, the electronic device may list and display icons of the at least one application 831. According to an embodiment, when a user input for the share icon 821 is received, the electronic device 101 may display a preview image 830 of the content 811 to be shared. When a user input (e.g., a touch input) for any one of the at least one application 831 is received, the electronic device 101 may transmit the original information of the content 811 and the changed display state information to the external electronic device 104.
  • The external electronic device 104 may display the shared content information 841 through the screen 840 that is output based on the execution of the same application as the application selected on the electronic device 101.
  • the electronic device 101 may display the shared content information 841 through the screen of the electronic device 101 output by the execution of the selected application.
  • the content 811 original information may include at least one of URL information of the content 811 and original data of the content 811 .
  • the changed display state information of the content 811 may include location information of the specified effect 822 .
  • the shared content information 841 may include a thumbnail of the shared content 811 .
  • The second area 820 of the content 811 may be displayed on the external electronic device 104.
  • For example, the external electronic device 104 may output the area of the content 811 corresponding to the location of the specified effect included in the display state information (the second area 820) as the first screen.
  • FIG. 9 is a flowchart illustrating an operation of an electronic device according to an exemplary embodiment.
  • an operation of the electronic device may be referred to as an operation of the processor 120 (refer to FIG. 1 ).
  • the electronic device may display content through the display 210 (refer to FIG. 2).
  • the content may include media content and text content including characters corresponding to each scene of the media content.
  • the electronic device may change the display state of the content. For example, the electronic device may add a specified effect (eg, highlight) to at least a part of text content based on a user input.
  • the electronic device may receive at least one user input for sharing content with an external electronic device (eg, the electronic device 104 of FIG. 1 ).
  • the user input for sharing content may include a user touch input for a sharing icon displayed on the display 210 .
  • The user input for sharing the content may include a first user input for a user interface (UI) (e.g., a display state image) related to a display state of the content and a second user input for a share icon.
  • the electronic device may acquire changed display state information of the content.
  • The electronic device may obtain display state information including playback time information of the media content corresponding to the text, among the text content, to which the specified effect is input.
  • the electronic device may obtain display state information of the content by using data related to text content corresponding to each part of the media content stored in the memory 130 (refer to FIG. 1 ).
  • the electronic device may obtain display state information of the content by using data related to text content corresponding to each part of the media content obtained from the server 108 (refer to FIG. 1 ).
  • The electronic device may obtain information on the location where the specified effect is input and transmit it to the external electronic device or a server, so that the external electronic device or the server communicating with the external electronic device may acquire playback time information of the media content corresponding to the character, among the text content, to which the specified effect is input.
  • The electronic device may transmit the original content information and the changed content display state information to an external electronic device (e.g., the electronic device 104 of FIG. 1) using a communication module (e.g., the communication module 190 of FIG. 1); a minimal payload sketch appears after these embodiments.
  • the content original information may include at least one of URL information of the content and original data of the content.
  • When the external electronic device receives the original content information and the changed content display state information and the content is executed, a scene of the media content corresponding to the character, among the text content, to which the specified effect is input may be displayed first (see the text-to-playback-time mapping sketch after these embodiments).
  • FIG. 10 is a flowchart illustrating an operation of an electronic device according to an exemplary embodiment.
  • an operation of the electronic device may be referred to as an operation of the processor 120 (refer to FIG. 1 ).
  • the electronic device may display the original image through the display 210 (refer to FIG. 2).
  • the electronic device may display the original image based on the execution of the album 275 (refer to FIG. 3 ) application.
  • the original image may include a plurality of objects belonging to the same group.
  • the electronic device may receive at least one user input for sharing content with an external electronic device (eg, the electronic device 104 of FIG. 1 ).
  • the user input for sharing content may include a user touch input for a sharing icon displayed on the display 210 .
  • The user input for sharing the content may include a first user input for a user interface (UI) (e.g., a display state image) related to a display state of the content and a second user input for a share icon.
  • the electronic device may determine whether the original image includes a plurality of objects belonging to one group. For example, the electronic device may extract a plurality of objects that are similar to or greater than a certain level by using the shapes of objects included in the original image. According to an embodiment, the electronic device may extract a plurality of objects for two or more groups. For example, the electronic device may determine whether a plurality of first objects belonging to a first group and a plurality of second objects belonging to a second group classified into a category different from the first group exist in the original image.
  • The electronic device may transmit the original image to an external electronic device using a communication module (e.g., the communication module 190).
  • the electronic device may generate a plurality of crop images for each of the plurality of objects.
  • the cropped image may be an image from which portions other than any one of the plurality of objects are removed.
  • the cropped image may be an image including only one object and a partial area around the object.
  • One crop image may include only one object among a plurality of objects.
  • one crop image may include two or more adjacent objects among a plurality of objects.
  • the electronic device may transmit a plurality of cropped images to an external electronic device (eg, the electronic device 104 of FIG. 1 ) using a communication module (eg, the communication module 190 of FIG. 1 ).
  • the electronic device may transmit the original image together with the plurality of cropped images to the external electronic device.
  • FIG. 11 is a diagram illustrating an operation of transmitting a plurality of cropped images to an external electronic device 104 in an electronic device according to an exemplary embodiment.
  • the electronic device 101 may display an original image 1110 .
  • the original image 1110 may include a plurality of objects 1110a, 1110b, 1110c, 1110d, 1110e, and 1110f belonging to one group.
  • The electronic device 101 may display at least one application 1120 through which image transmission may be performed. For example, the electronic device 101 may list and display icons of the at least one application 1120. According to an embodiment, when a user input for the share icon 1115 is received, the electronic device 101 may display a representative image 1130 of the plurality of extracted objects 1110a, 1110b, 1110c, 1110d, 1110e, and 1110f.
  • The electronic device 101 may transmit each of the plurality of cropped images 1131a, 1131b, 1131c, 1131d, 1131e, and 1131f to the external electronic device 104.
  • The external electronic device 104 may display the plurality of cropped images 1131a, 1131b, 1131c, 1131d, 1131e, and 1131f through a screen output based on the execution of the same application as the application selected on the electronic device 101.
  • The electronic device 101 may display the plurality of cropped images 1131a, 1131b, 1131c, 1131d, 1131e, and 1131f through the screen of the electronic device 101 that is output based on the execution of the selected application.
  • FIG. 12 is a diagram illustrating an operation of transmitting a plurality of cropped images to the external electronic device 104 in the electronic device according to an exemplary embodiment.
  • the electronic device 101 may display an original image 1210 .
  • the original image 1210 may include a plurality of objects 1211a, 1211b, 1211c, 1211d, 1211e, and 1211f belonging to one group.
  • the electronic device may display at least one application 1220 through which image transmission may be performed.
  • the electronic device 101 may list and display icons of at least one application 1220 .
  • The electronic device 101 may display a representative image 1230 representing the plurality of extracted objects 1211a, 1211b, 1211c, 1211d, 1211e, and 1211f.
  • the representative image 1230 may include object images representing a plurality of objects.
  • When an object image is selected by a user input, a property of the selected object image may be changed.
  • the electronic device 101 may display the border 1235 of the selected object image with a specified color.
  • the user may select at least one object to be transmitted to the external electronic device 104 from the representative image 1230 .
  • After the object image is selected, when a user input (e.g., a touch input) for any one of the at least one application 1220 is received, the electronic device 101 may transmit, to the external electronic device 104, cropped images 1231a, 1231d, and 1231e of at least one object corresponding to the selected at least one object image from among the plurality of objects.
  • Each of the cropped images 1231a, 1231d, and 1231e may include one object.
  • each of the cropped images 1231a, 1231d, and 1231e may include two or more adjacent objects.
  • the electronic device 101 may generate one cropped image including two or more objects whose separation distance between the objects is within a specified distance among objects selected in the original image 1210 .
  • The external electronic device 104 may display the cropped images 1231a, 1231d, and 1231e.
  • the electronic device 101 may display the cropped images 1231a, 1231d, and 1231e on the screen of the electronic device 101 that is output based on the execution of the selected application.
  • FIG. 13 is a flowchart illustrating an operation of an electronic device according to an exemplary embodiment.
  • an operation of the electronic device may be referred to as an operation of the processor 120 (refer to FIG. 1 ).
  • the electronic device may display the original image through the display 210 (refer to FIG. 2).
  • the original image may include a plurality of objects belonging to the same group.
  • the electronic device may display an enlarged image of a region of the original image based on a user input (eg, a gesture input).
  • the electronic device may receive at least one user input for sharing content with an external electronic device (eg, the electronic device 104 of FIG. 1 ).
  • the user input for sharing content may include a user touch input for a sharing icon displayed on the display 210 .
  • the user input for sharing the content may include a first user input for a user interface (UI) related to a display state of the content and a second user input for a sharing icon.
  • the electronic device may extract at least one object belonging to the same group as the object included in the enlarged image from the original image.
  • the electronic device may generate a plurality of cropped images for each of the plurality of extracted objects.
  • One crop image may include only one object among a plurality of objects.
  • one crop image may include two or more adjacent objects among a plurality of objects.
  • the electronic device may transmit a plurality of cropped images to an external electronic device (eg, the electronic device 104 of FIG. 1 ) using a communication module (eg, the communication module 190 of FIG. 1 ).
  • the electronic device may transmit the original image together with the plurality of cropped images to the external electronic device.
  • FIG. 14 is a diagram illustrating an operation of transmitting a plurality of cropped images to the external electronic device 104 in the electronic device according to an exemplary embodiment.
  • the electronic device 101 may display an original image 1410 .
  • the original image 1410 may include a plurality of objects belonging to one group.
  • The electronic device 101 may display an enlarged image 1420 in which a partial area 1411 of the original image 1410 is enlarged.
  • The electronic device 101 may extract, from the original image 1410, at least one object belonging to the same group as the object 1421 included in the enlarged image 1420 (see the grouping and cropping sketches after these embodiments).
  • the electronic device 101 may display cropped images 1430 of the extracted objects and at least one application 1435 in which image transmission may be performed.
  • The electronic device 101 may transmit the original image 1442 and each of the plurality of cropped images 1441 to the external electronic device 104.
  • The electronic device 101 according to another embodiment may transmit only the plurality of cropped images 1441 to the external electronic device 104 based on a user input (e.g., a touch input) for any one of the at least one application, and may not transmit the original image 1442 to the external electronic device 104.
  • The external electronic device 104 may display the original image 1442 and each of the plurality of cropped images 1441.
  • The electronic device 101 may display the original image 1442 and the plurality of cropped images 1441 through the screen of the electronic device 101 that is output based on the execution of the selected application.
  • FIG. 15 is a diagram illustrating an operation of transmitting a plurality of cropped images to an external electronic device in an electronic device according to an exemplary embodiment.
  • the electronic device 101 may display an original image 1510 .
  • the original image 1510 may be an image to which an automatic face recognition function is applied to subjects during photographing.
  • The electronic device 101 may display a specified graphic effect 1521 for the face images recognized by the automatic face recognition function.
  • the electronic device 101 may display figures (eg, rectangles) surrounding each of the recognized face images.
  • The electronic device 101 may display a representative image 1530 representing the recognized face images and at least one application 1535 through which image transmission can be performed.
  • the electronic device 101 may transmit each of the plurality of cropped images 1541 to the external electronic device.
  • Based on a user input for the representative image 1530 that selects at least one face image from among the plurality of face images, the electronic device 101 may transmit a cropped image of the selected at least one face image.
  • The external electronic device 104 may display the plurality of cropped images 1541 through a screen output based on the execution of the same application as the application selected on the electronic device 101.
  • the electronic device 101 may display a plurality of cropped images 1541 through the screen of the electronic device 101 that is output based on the execution of the selected application.
  • According to various embodiments, the electronic device includes a display (e.g., the display 210 of FIG. 2), a communication module for communication with an external electronic device (e.g., the communication module 190 of FIG. 1), and a processor operatively connected to the display and the communication module (e.g., the processor 120 of FIG. 1), and the processor may be set to output at least a portion of content (e.g., the content 622 of FIG. 6) to the display, to obtain display state information indicating a display state of the content based on a user input for sharing the content, and to transmit the original information of the content and the display state information to the external electronic device through the communication module.
  • the display state information may include at least one of a scroll movement ratio, a display size of the content, an enlargement ratio of the screen, or a reduction ratio of the screen.
  • The processor may be set to display a share icon (e.g., the share icon 621 of FIG. 6) on the display and, based on a user input for the share icon, to display on the display at least one application (e.g., the at least one application 631 of FIG. 6) through which transmission of the content can be performed.
  • The processor may be set to display, based on the user input for the share icon (e.g., the share icon 721 of FIG. 7A), a display state image (e.g., the display state image 735 of FIG. 7A) indicating the display state information of the content.
  • The processor may be set to change at least one property of the display state image based on a first user input for the display state image and, based on a second user input for any one of the one or more applications, to transmit the original information of the content and the display state information corresponding to the display state image whose property has been changed to the external electronic device through the communication module.
  • The processor may be set to add a specified effect (e.g., highlight) to at least a portion of the content based on a user input for the content, and the display state information of the content may include location information of the specified effect.
  • The content may include media content and text content including characters corresponding to each scene of the media content, and the processor may be set to add a specified effect (e.g., highlight) to at least a part of the text content and, based on the user input for sharing the content, to transmit the display state information, including playback time information of the media content corresponding to the at least a part of the text content to which the specified effect is added, to the external electronic device through the communication module.
  • The content may include media content and text content including characters corresponding to each scene of the media content, and the processor may be set to add a specified effect to at least a part of the text content and to transmit information on a display position of the specified effect to the external electronic device or a server communicating with the external electronic device, so that the external electronic device or the server can determine playback time information of the media content corresponding to the at least a part of the text content to which the specified effect is added.
  • According to various embodiments, the electronic device includes a display (e.g., the display 210 of FIG. 2), a communication module for communication with an external electronic device (e.g., the communication module 190 of FIG. 1), and a processor operatively connected to the display and the communication module (e.g., the processor 120 of FIG. 1), and the processor may be set to determine, based on a user input for image sharing, whether a plurality of objects belonging to the same group (e.g., the plurality of objects 1110a, 1110b, 1110c, 1110d, 1110e, and 1110f of FIG. 11) exist in an original image (e.g., the original image 1110 of FIG. 11), to generate at least one cropped image (e.g., the plurality of cropped images 1131a, 1131b, 1131c, 1131d, 1131e, and 1131f of FIG. 11) of at least one of the plurality of objects belonging to the same group in the original image, and to transmit the at least one cropped image to the external electronic device through the communication module.
  • The processor may be set to display a share icon (e.g., the share icon 1115 of FIG. 11) on the display and, based on a user input for the share icon, to display on the display an application (e.g., the application 1120 of FIG. 11) for performing transmission of the at least one cropped image.
  • The processor may be set to display a representative image (e.g., the representative image 1230 of FIG. 12) including object images representing the plurality of objects, based on the user input for the share icon.
  • The processor may be set to change a property of the selected at least one object image based on a user input for the representative image for selecting at least one object image among the object images.
  • The processor may be set to transmit, based on a user input for the application, the at least one cropped image (e.g., the cropped images 1231a, 1231d, and 1231e of FIG. 12) to the external electronic device through the communication module.
  • The processor may be set to generate one cropped image including two or more objects, among the at least one selected object image, whose separation distance between the objects in the original image is within a specified distance.
  • The processor may be set to display on the display, based on a gesture input for the original image (e.g., the original image 1410 of FIG. 14), an enlarged image (e.g., the enlarged image 1420 of FIG. 14) of at least a portion of the original image, and to extract, from the original image, the plurality of objects included in the same group as an object included in the enlarged image (e.g., the object 1421 of FIG. 14).
  • The processor may be set to determine, based on the user input for sharing an image, whether a plurality of first objects belonging to a first group and a plurality of second objects belonging to a second group classified into a category different from the first group exist in the original image, and to transmit a plurality of first cropped images for the plurality of first objects and a plurality of second cropped images for the plurality of second objects to the external electronic device through the communication module.
  • the processor may be configured to extract the plurality of objects using an automatic face recognition function.
  • The processor may be set to display a specified graphic effect (e.g., the graphic effect 1521 of FIG. 15) based on a user input for at least one of the plurality of objects extracted from the original image.
  • The processor may be set to display a share icon on the display and, based on a user input for the share icon, to display a representative image (e.g., the representative image 1530 of FIG. 15) representing the plurality of objects on which the specified graphic effect is displayed.
  • The processor may be set to transmit, based on a user input for the representative image that selects at least one object from among the plurality of objects, the at least one cropped image of the at least one object selected from the representative image to the external electronic device through the communication module.
  • the electronic device may have various types of devices.
  • the electronic device may include, for example, a portable communication device (eg, a smart phone), a computer device, a portable multimedia device, a portable medical device, a camera, a wearable device, or a home appliance device.
  • Terms such as first, second, or first or second may be used simply to distinguish an element from other elements in question and do not limit the elements in other aspects (e.g., importance or order). When one (e.g., first) component is referred to as being "coupled" or "connected" to another (e.g., second) component, with or without the terms "functionally" or "communicatively", it means that the one component can be connected to the other component directly (e.g., by wire), wirelessly, or through a third component.
  • The term "module" used in various embodiments of this document may include a unit implemented in hardware, software, or firmware, and may be used interchangeably with terms such as logic, logic block, component, or circuit.
  • a module may be an integrally formed part or a minimum unit or a part of the part that performs one or more functions.
  • the module may be implemented in the form of an application-specific integrated circuit (ASIC).
  • Various embodiments of this document may be implemented as software (e.g., the program 140) including one or more instructions stored in a storage medium readable by a device (e.g., the electronic device 101). For example, a processor (e.g., the processor 120) of the device may call at least one of the one or more instructions stored in the storage medium and execute it.
  • the one or more instructions may include code generated by a compiler or code executable by an interpreter.
  • the device-readable storage medium may be provided in the form of a non-transitory storage medium.
  • 'Non-transitory' only means that the storage medium is a tangible device and does not contain a signal (e.g., electromagnetic wave); this term does not distinguish between a case where data is semi-permanently stored in the storage medium and a case where it is temporarily stored.
  • the method according to various embodiments disclosed in this document may be provided as included in a computer program product.
  • Computer program products may be traded between sellers and buyers as commodities.
  • The computer program product may be distributed in the form of a machine-readable storage medium (e.g., compact disc read only memory (CD-ROM)), or may be distributed (e.g., downloaded or uploaded) online, either through an application store (e.g., Play Store™) or directly between two user devices (e.g., smartphones).
  • a part of the computer program product may be temporarily stored or temporarily created in a machine-readable storage medium such as a memory of a server of a manufacturer, a server of an application store, or a relay server.
  • Each component (e.g., module or program) of the above-described components may include a singular entity or a plurality of entities, and some of the plurality of entities may be separately disposed in other components.
  • one or more components or operations among the above-described corresponding components may be omitted, or one or more other components or operations may be added.
  • According to various embodiments, a plurality of components (e.g., modules or programs) may be integrated into one component.
  • The integrated component may perform one or more functions of each of the plurality of components identically or similarly to those performed by the corresponding component among the plurality of components prior to the integration.
  • Operations performed by a module, program, or other component may be executed sequentially, in parallel, repeatedly, or heuristically, one or more of the operations may be executed in a different order or omitted, or one or more other operations may be added.
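
The embodiments above describe transmitting the content's original information together with display state information (for example, a scroll movement ratio, an enlargement or reduction ratio, or the location of a specified effect) so that the receiving device can output the shared area first. The Kotlin sketch below illustrates one possible shape for such a payload and for the receiving-side restore step; it is only an illustration under assumed names (DisplayState, SharedContentPayload, initialScrollOffsetPx), since the disclosure does not specify a data format.

```kotlin
/** Display state info accompanying the shared content, as a minimal illustrative sketch. */
data class DisplayState(
    val scrollRatio: Float = 0f,          // 0.0..1.0 vertical position within the content
    val zoomRatio: Float = 1f,            // enlargement / reduction ratio of the screen
    val highlightRange: IntRange? = null  // character offsets of a highlighted span, if any
)

/** Original content info plus optional display state info. */
data class SharedContentPayload(
    val contentUrl: String?,        // URL information of the content
    val originalData: ByteArray?,   // or original data of the content
    val displayState: DisplayState? // null when only the original info is shared
)

/**
 * Receiving side: if display state info is present, compute the scroll offset so that the
 * shared area is output first; otherwise fall back to the first area of the content.
 */
fun initialScrollOffsetPx(contentHeightPx: Int, payload: SharedContentPayload): Int {
    val ratio = payload.displayState?.scrollRatio ?: 0f
    return (ratio.coerceIn(0f, 1f) * contentHeightPx).toInt()
}
```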
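For content that combines media content with text content (for example, lyrics whose characters correspond to scenes of a song), the embodiments describe deriving the playback time of the media content from the character to which a highlight was applied. The following is a minimal sketch of that mapping, assuming a hypothetical cue table (TextCue) in which each piece of text is tied to a start time; the actual structure of the text content is not specified in the disclosure.

```kotlin
/** One cue of text content tied to a playback time of the media content (e.g., a lyric line). */
data class TextCue(val startTimeMs: Long, val text: String)

/**
 * Given the character offset of a highlighted character within the concatenated text content,
 * return the playback time of the media scene that character corresponds to.
 * Returns null if the offset falls outside the text content.
 */
fun playbackTimeForHighlight(cues: List<TextCue>, highlightOffset: Int): Long? {
    var consumed = 0
    for (cue in cues) {
        val end = consumed + cue.text.length
        if (highlightOffset in consumed until end) return cue.startTimeMs
        consumed = end
    }
    return null
}

fun main() {
    val cues = listOf(
        TextCue(0, "First line of lyrics "),
        TextCue(12_000, "second line of lyrics "),
        TextCue(25_500, "third line of lyrics")
    )
    // A highlight placed on a character in the second line maps to 12 000 ms.
    println(playbackTimeForHighlight(cues, 25))
}
```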
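Several embodiments determine whether an original image contains a plurality of objects belonging to the same group, for example by comparing object shapes against a similarity threshold, and may also separate objects into two groups of different categories. The sketch below shows one way such grouping could be done from per-object feature vectors; the feature extraction itself (shape descriptors, face embeddings, and so on) is assumed and not part of the disclosure.

```kotlin
/** Cosine similarity between two shape/feature vectors of detected objects. */
fun cosine(a: FloatArray, b: FloatArray): Float {
    var dot = 0f; var na = 0f; var nb = 0f
    for (i in a.indices) { dot += a[i] * b[i]; na += a[i] * a[i]; nb += b[i] * b[i] }
    return dot / (kotlin.math.sqrt(na) * kotlin.math.sqrt(nb))
}

/**
 * Group detected objects whose feature similarity to the first member of an existing group is
 * at least [threshold]; objects matching no group start a new one. Returns groups of indices.
 */
fun groupBySimilarity(features: List<FloatArray>, threshold: Float = 0.8f): List<List<Int>> {
    val groups = mutableListOf<MutableList<Int>>()
    features.forEachIndexed { idx, f ->
        val g = groups.firstOrNull { cosine(features[it.first()], f) >= threshold }
        if (g != null) g += idx else groups += mutableListOf(idx)
    }
    return groups
}
```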
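Once objects are selected, the embodiments generate cropped images, with the option of placing two or more objects in a single cropped image when their separation distance is within a specified distance. The greedy merge below sketches that rule over bounding boxes; Box, gap, and cropRegions are illustrative names, and a real implementation would then crop the original bitmap to each returned region.

```kotlin
import kotlin.math.max
import kotlin.math.min

/** Bounding box of a detected object within the original image (pixel coordinates). */
data class Box(val left: Int, val top: Int, val right: Int, val bottom: Int)

/** Separation between two boxes along x/y; 0 when they touch or overlap. */
private fun gap(a: Box, b: Box): Int {
    val dx = max(0, max(a.left, b.left) - min(a.right, b.right))
    val dy = max(0, max(a.top, b.top) - min(a.bottom, b.bottom))
    return max(dx, dy)
}

/** Smallest box containing both inputs. */
private fun union(a: Box, b: Box) =
    Box(min(a.left, b.left), min(a.top, b.top), max(a.right, b.right), max(a.bottom, b.bottom))

/**
 * Merge selected object boxes whose separation is within [maxGapPx] into a single crop region,
 * so two adjacent objects can end up in one cropped image.
 */
fun cropRegions(selected: List<Box>, maxGapPx: Int): List<Box> {
    val regions = mutableListOf<Box>()
    for (box in selected) {
        val near = regions.indexOfFirst { gap(it, box) <= maxGapPx }
        if (near >= 0) regions[near] = union(regions[near], box) else regions += box
    }
    return regions
}
```

For instance, with maxGapPx = 20, two faces whose boxes are 15 px apart would be merged into one crop region, while a third face 200 px away would produce its own region.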

Landscapes

  • Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Theoretical Computer Science (AREA)
  • Signal Processing (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Multimedia (AREA)
  • General Health & Medical Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • User Interface Of Digital Computer (AREA)
  • Telephone Function (AREA)

Abstract

Disclosed is an electronic device comprising: a display; a communication module for communicating with an external electronic device; and a processor operatively connected to the display and the communication module. The processor is configured to: output at least a portion of content to the display; acquire display state information indicating the display state of the content, on the basis of a user input for sharing the content; and transmit original information of the content and the display state information to the external electronic device via the communication module. Various other embodiments identified from the description are possible.
PCT/KR2021/006282 2020-07-06 2021-05-20 Dispositif électronique pour prendre en charge un partage de contenu WO2022010092A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR1020200082653A KR20220005171A (ko) 2020-07-06 2020-07-06 컨텐츠 공유를 지원하는 전자 장치
KR10-2020-0082653 2020-07-06

Publications (1)

Publication Number Publication Date
WO2022010092A1 true WO2022010092A1 (fr) 2022-01-13

Family

ID=79341889

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2021/006282 WO2022010092A1 (fr) 2020-07-06 2021-05-20 Dispositif électronique pour prendre en charge un partage de contenu

Country Status (2)

Country Link
KR (1) KR20220005171A (fr)
WO (1) WO2022010092A1 (fr)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20110028811A (ko) * 2009-09-14 2011-03-22 엘지전자 주식회사 이동 단말기 및 이동 단말기에서의 정보 제공 방법
KR20140066059A (ko) * 2012-11-22 2014-05-30 엘지전자 주식회사 이동 단말기 및 그의 제스처 레코딩 방법
KR20180081231A (ko) * 2017-01-06 2018-07-16 삼성전자주식회사 데이터를 공유하기 위한 방법 및 그 전자 장치
KR20180127831A (ko) * 2017-05-22 2018-11-30 삼성전자주식회사 전자 장치 및 그의 정보 공유 방법
KR20200007424A (ko) * 2018-07-13 2020-01-22 삼성전자주식회사 전자 장치 및 전자 장치의 컨텐트 전송 방법

Also Published As

Publication number Publication date
KR20220005171A (ko) 2022-01-13

Similar Documents

Publication Publication Date Title
WO2022085885A1 (fr) Procédé de commande de fenêtre et dispositif électronique associé
WO2021075786A1 (fr) Dispositif électronique et procédé de traitement d'une fenêtre surgissante utilisant une multi-fenêtre de celui-ci
WO2022031051A1 (fr) Procédé permettant de fournir une fonction de capture et dispositif électronique associé
WO2019164098A1 (fr) Appareil et procédé permettant de fournir une fonction associée à une disposition de clavier
WO2022030996A1 (fr) Dispositif électronique comprenant un dispositif d'affichage et procédé de fonctionnement associé
WO2022030890A1 (fr) Procédé de capture d'image à fenêtres multiples et dispositif électronique associé
WO2022108192A1 (fr) Dispositif électronique et procédé de commande multi-fenêtre de dispositif électronique
WO2022119276A1 (fr) Dispositif électronique d'affichage souple et procédé de fonctionnement associé
WO2022098195A1 (fr) Dispositif électronique et procédé de fonctionnement en mode flexible
WO2022031055A1 (fr) Dispositif électronique et procédé permettant de commander une sortie de vibrations de celui-ci
WO2022086272A1 (fr) Dispositif électronique pour fournir une interface utilisateur, et procédé associé
WO2022030998A1 (fr) Dispositif électronique comprenant une unité d'affichage et son procédé de fonctionnement
WO2022154264A1 (fr) Dispositif électronique et procédé de fonctionnement d'un dispositif électronique
WO2022065845A1 (fr) Procédé de traitement de données d'entrée et dispositif électronique le prenant en charge
WO2022030921A1 (fr) Dispositif électronique, et procédé de commande de son écran
WO2022010092A1 (fr) Dispositif électronique pour prendre en charge un partage de contenu
WO2022014836A1 (fr) Procédé et appareil d'affichage d'objets virtuels dans différentes luminosités
WO2022182025A1 (fr) Dispositif électronique et son procédé de commande
WO2023214675A1 (fr) Dispositif électronique et procédé de traitement d'entrée tactile
WO2022030824A1 (fr) Procédé d'affichage d'écran et de lecture audio et dispositif associé
WO2023106606A1 (fr) Serveur en nuage supportant une édition collaborative entre des dispositifs électroniques et son procédé de fonctionnement
WO2023113167A1 (fr) Dispositif électronique et son procédé de fonctionnement
WO2022131617A1 (fr) Dispositif électronique de traitement de saisie en écriture manuscrite et son procédé de fonctionnement
WO2024072057A1 (fr) Dispositif électronique et procédé de planification d'affichage d'image sur la base d'un signal provenant d'un circuit tactile
WO2022030752A1 (fr) Dispositif électronique et procédé de fourniture d'informations de prévisualisation à un dispositif externe par l'utilisation dudit dispositif électronique

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21838704

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21838704

Country of ref document: EP

Kind code of ref document: A1