WO2022050772A1 - Preview drawing method and electronic device therefor


Info

Publication number
WO2022050772A1
WO2022050772A1 PCT/KR2021/011963 KR2021011963W WO2022050772A1 WO 2022050772 A1 WO2022050772 A1 WO 2022050772A1 KR 2021011963 W KR2021011963 W KR 2021011963W WO 2022050772 A1 WO2022050772 A1 WO 2022050772A1
Authority
WO
WIPO (PCT)
Prior art keywords
input
preview
electronic device
display
image
Prior art date
Application number
PCT/KR2021/011963
Other languages
English (en)
Korean (ko)
Inventor
강보순
이다현
이정원
진준호
Original Assignee
삼성전자 주식회사 (Samsung Electronics Co., Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 삼성전자 주식회사 (Samsung Electronics Co., Ltd.)
Publication of WO2022050772A1


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0354 Pointing devices with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
    • G06F3/03545 Pens or stylus
    • G06F3/038 Control and interface arrangements therefor, e.g. drivers or device-embedded control circuitry
    • G06F3/0383 Signal control means within the pointing device
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484 Interaction techniques for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/0487 Interaction techniques using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 Interaction techniques using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F2203/00 Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/048 Indexing scheme relating to G06F3/048
    • G06F2203/04804 Transparency, e.g. transparent or translucent windows

Definitions

  • Various embodiments disclosed in this document relate to a method for drawing a preview and an electronic device for the same.
  • the electronic device may include a stylus pen that may be inserted into or detached from the electronic device.
  • the stylus pen may provide an environment in which the user inputs handwriting on the display of the electronic device.
  • the stylus pen may interact with an electronic device based on an electromagnetic induction method as well as a short-range wireless communication protocol.
  • the digital drawing application may provide a drawing environment based on a stylus pen and a user's touch input.
  • the digital drawing application may provide a user interface for changing the properties of the tool corresponding to the input.
  • the user may change the color, texture, pattern, and/or size of the tool corresponding to the input.
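  • The tool-property model described above can be sketched as a simple record (a minimal Python illustration; the field names are assumptions for illustration, not taken from the patent):

```python
from dataclasses import dataclass


@dataclass
class PenTool:
    """Illustrative drawing-tool attributes; names are not from the patent."""
    color: str = "black"
    texture: str = "solid"
    pattern: str = "none"
    size: int = 4


# The user changes the tool's properties before drawing with it.
brush = PenTool()
brush.color = "red"
brush.size = 8
```

Subsequent strokes would then be rendered with whatever property set the tool holds at input time.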
  • For digital drawing, the user may change the properties of the tool corresponding to the input. For example, the user may draw through the input after changing the properties, and may then want to check whether the tool with the changed properties matches the user's intention. If it does not, the user may want to cancel the input.
  • a digital drawing application may provide an input cancellation function. However, when the user performs input several times, the user may have to execute input cancellation several times.
  • a digital drawing application may provide an erase function. However, a portion not intended by the user may be erased by the erase function.
  • a user may wish to cancel only part of an input. Since the input cancel function cancels the input sequentially, the input may be canceled even in the part that the user wants to leave. As another example, the user may wish to leave some of the input without canceling.
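  • The limitation described above can be seen with a plain undo stack (a minimal Python illustration; nothing here is from the patent text): undo is strictly last-in-first-out, so canceling an early stroke forces canceling every later stroke as well.

```python
strokes = []


def draw(stroke):
    strokes.append(stroke)


def undo():
    # Sequential cancellation: only the most recent input can be removed.
    if strokes:
        strokes.pop()


draw("A")
draw("B")
draw("C")
# To cancel "A", the user must first undo "C" and "B" as well,
# even if those strokes should be kept.
undo()
undo()
undo()
```

The preview-drawing approach of this document avoids the problem by keeping tentative strokes out of the committed drawing in the first place.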
  • Various embodiments disclosed in this document may provide an electronic device and method for solving the above-described problems in a digital drawing environment.
  • An electronic device according to an embodiment includes a display, a pen input interface configured to receive pen input from a digital pen, a processor, and a memory. The memory may store instructions that, when executed, cause the processor to: display a first image on the display; receive a first touch input to the display; set at least a partial region of the display as a preview region while the first touch input is maintained; set a property for input of the digital pen to a first property; receive a first input to the preview region from the digital pen; display a first preview image based on the first input overlaid on the first image; and, when the first touch input is released, stop displaying the first preview image while maintaining the property for the digital pen input as the first property.
  • A method according to an embodiment includes: displaying a first image on a display of the electronic device; receiving a first touch input to the display; setting a preview region for at least a partial region of the display while the first touch input is maintained; and releasing the preview region when the first touch input is released. Setting the preview region includes setting a property for input of a digital pen to a first property, receiving a first input to the preview region from the digital pen, and displaying a first preview image based on the first input overlaid on the first image. Releasing the preview region includes stopping display of the first preview image while maintaining the property for the digital pen input as the first property.
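  • The claimed flow can be sketched as a small state machine (a minimal Python illustration under the assumption that strokes and properties are plain values; the class and method names are illustrative, not from the patent):

```python
class PreviewCanvas:
    """Minimal sketch of the preview-drawing flow described above."""

    def __init__(self):
        self.committed_strokes = []  # strokes applied to the first image
        self.preview_strokes = []    # strokes overlaid while preview is active
        self.preview_active = False
        self.pen_property = None     # e.g. the tool's color/size

    def touch_down(self, pen_property):
        # While the first touch input is maintained, the preview region is
        # active and the pen property is set to the first property.
        self.preview_active = True
        self.pen_property = pen_property

    def pen_input(self, stroke):
        if self.preview_active:
            # Preview strokes are overlaid on the first image, not committed.
            self.preview_strokes.append((stroke, self.pen_property))
        else:
            self.committed_strokes.append((stroke, self.pen_property))

    def touch_up(self):
        # Releasing the touch stops display of the preview image,
        # but the pen keeps the first property for subsequent drawing.
        self.preview_active = False
        self.preview_strokes.clear()
```

With this sketch, releasing the touch discards the preview strokes without disturbing the committed drawing, while the pen retains its changed property.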
  • the electronic device may improve user experience and convenience through preview drawing.
  • the electronic device may provide a visual feedback to the user through the preview drawing.
  • the electronic device may provide a preview drawing that can be selectively applied or canceled.
  • the electronic device may support partial application of the preview drawing.
  • the electronic device may allow part of the preview drawing to be used as a tool.
  • FIG. 1 is a block diagram of an electronic device in a network environment according to various embodiments of the present disclosure
  • FIG. 2 is a block diagram of an electronic device according to an exemplary embodiment.
  • FIG. 4 illustrates an example of a drawing screen according to a pen input.
  • FIG. 5 illustrates a first UI according to an embodiment.
  • FIG. 6 illustrates a preview screen according to an exemplary embodiment.
  • FIG. 7 illustrates a screen configuration according to an exemplary embodiment.
  • FIG. 8 illustrates provision of a first UI according to an embodiment.
  • FIG. 9 illustrates a preview image according to an exemplary embodiment.
  • FIG. 10 illustrates cancellation of a preview image according to an embodiment.
  • FIG. 11 illustrates application of a partial preview image according to an embodiment.
  • FIG. 13 illustrates a pen input using a shortcut according to an exemplary embodiment.
  • FIG. 15 is a flowchart of a method for preview drawing according to an embodiment.
  • FIG. 1 is a block diagram of an electronic device 101 in a network environment 100 according to various embodiments.
  • According to an embodiment, the electronic device 101 may communicate with the electronic device 102 through a first network 198 (e.g., a short-range wireless communication network), or with the electronic device 104 or the server 108 through a second network 199 (e.g., a long-range wireless communication network). According to an embodiment, the electronic device 101 may communicate with the electronic device 104 through the server 108.
  • the electronic device 101 may include a processor 120, a memory 130, an input module 150, a sound output module 155, a display module 160, an audio module 170, a sensor module 176, an interface 177, a connection terminal 178, a haptic module 179, a camera module 180, a power management module 188, a battery 189, a communication module 190, a subscriber identification module 196, or an antenna module 197.
  • In some embodiments, at least one of these components (e.g., the connection terminal 178) may be omitted, or one or more other components may be added to the electronic device 101. In some embodiments, some of these components may be integrated into one component (e.g., the display module 160).
  • The processor 120 may, for example, execute software (e.g., a program 140) to control at least one other component (e.g., a hardware or software component) of the electronic device 101 connected to the processor 120, and may perform various data processing or operations. According to an embodiment, as at least part of the data processing or operations, the processor 120 may store commands or data received from another component (e.g., the sensor module 176 or the communication module 190) in the volatile memory 132, process the commands or data stored in the volatile memory 132, and store the resulting data in the non-volatile memory 134.
  • The processor 120 may include a main processor 121 (e.g., a central processing unit or an application processor) or an auxiliary processor 123 (e.g., a graphics processing unit, a neural processing unit (NPU), an image signal processor, a sensor hub processor, or a communication processor) that can operate independently of or together with the main processor 121. The auxiliary processor 123 may be implemented separately from, or as part of, the main processor 121.
  • The auxiliary processor 123 may, for example, control at least some of the functions or states related to at least one of the components of the electronic device 101 (e.g., the display module 160, the sensor module 176, or the communication module 190), on behalf of the main processor 121 while the main processor 121 is in an inactive (e.g., sleep) state, or together with the main processor 121 while the main processor 121 is in an active (e.g., application-executing) state. According to an embodiment, the auxiliary processor 123 (e.g., an image signal processor or a communication processor) may be implemented as part of another functionally related component (e.g., the camera module 180 or the communication module 190).
  • the auxiliary processor 123 may include a hardware structure specialized for processing an artificial intelligence model.
  • Artificial intelligence models can be created through machine learning. Such learning may be performed, for example, in the electronic device 101 itself on which artificial intelligence is performed, or may be performed through a separate server (eg, the server 108).
  • the learning algorithm may include, for example, supervised learning, unsupervised learning, semi-supervised learning, or reinforcement learning, but is not limited to the above examples.
  • the artificial intelligence model may include a plurality of artificial neural network layers.
  • Artificial neural networks include deep neural networks (DNNs), convolutional neural networks (CNNs), recurrent neural networks (RNNs), restricted boltzmann machines (RBMs), deep belief networks (DBNs), bidirectional recurrent deep neural networks (BRDNNs), It may be one of deep Q-networks or a combination of two or more of the above, but is not limited to the above example.
  • the artificial intelligence model may additionally or alternatively include a software structure in addition to the hardware structure.
  • the memory 130 may store various data used by at least one component of the electronic device 101 (eg, the processor 120 or the sensor module 176 ).
  • the data may include, for example, input data or output data for software (eg, the program 140 ) and instructions related thereto.
  • the memory 130 may include a volatile memory 132 or a non-volatile memory 134 .
  • the program 140 may be stored as software in the memory 130 , and may include, for example, an operating system 142 , middleware 144 , or an application 146 .
  • the input module 150 may receive a command or data to be used in a component (eg, the processor 120 ) of the electronic device 101 from the outside (eg, a user) of the electronic device 101 .
  • the input module 150 may include, for example, a microphone, a mouse, a keyboard, a key (eg, a button), or a digital pen (eg, a stylus pen).
  • the sound output module 155 may output a sound signal to the outside of the electronic device 101 .
  • the sound output module 155 may include, for example, a speaker or a receiver.
  • the speaker can be used for general purposes such as multimedia playback or recording playback.
  • the receiver may be used to receive an incoming call. According to one embodiment, the receiver may be implemented separately from or as part of the speaker.
  • the display module 160 may visually provide information to the outside (eg, a user) of the electronic device 101 .
  • the display module 160 may include, for example, a display, a hologram device, or a projector, and a control circuit for controlling the corresponding device.
  • the display module 160 may include a touch sensor configured to sense a touch or a pressure sensor configured to measure the intensity of a force generated by the touch.
  • The audio module 170 may convert a sound into an electrical signal or, conversely, convert an electrical signal into a sound. According to an embodiment, the audio module 170 may acquire a sound through the input module 150, or may output a sound through the sound output module 155 or an external electronic device (e.g., the electronic device 102, such as a speaker or headphones) connected directly or wirelessly to the electronic device 101.
  • The sensor module 176 may detect an operating state (e.g., power or temperature) of the electronic device 101 or an external environmental state (e.g., a user state), and may generate an electrical signal or data value corresponding to the detected state.
  • the sensor module 176 may include, for example, a gesture sensor, a gyro sensor, a barometric pressure sensor, a magnetic sensor, an acceleration sensor, a grip sensor, a proximity sensor, a color sensor, an infrared (IR) sensor, a biometric sensor, a temperature sensor, a humidity sensor, or an illuminance sensor.
  • the interface 177 may support one or more designated protocols that may be used by the electronic device 101 to directly or wirelessly connect with an external electronic device (eg, the electronic device 102 ).
  • the interface 177 may include, for example, a high definition multimedia interface (HDMI), a universal serial bus (USB) interface, a secure digital (SD) card interface, or an audio interface.
  • the connection terminal 178 may include a connector through which the electronic device 101 can be physically connected to an external electronic device (eg, the electronic device 102 ).
  • the connection terminal 178 may include, for example, an HDMI connector, a USB connector, an SD card connector, or an audio connector (eg, a headphone connector).
  • the haptic module 179 may convert an electrical signal into a mechanical stimulus (eg, vibration or movement) or an electrical stimulus that the user can perceive through tactile or kinesthetic sense.
  • the haptic module 179 may include, for example, a motor, a piezoelectric element, or an electrical stimulation device.
  • the camera module 180 may capture still images and moving images. According to an embodiment, the camera module 180 may include one or more lenses, image sensors, image signal processors, or flashes.
  • the power management module 188 may manage power supplied to the electronic device 101 .
  • the power management module 188 may be implemented, for example, as at least part of a power management integrated circuit (PMIC).
  • the battery 189 may supply power to at least one component of the electronic device 101 .
  • battery 189 may include, for example, a non-rechargeable primary cell, a rechargeable secondary cell, or a fuel cell.
  • The communication module 190 may support establishment of a direct (e.g., wired) communication channel or a wireless communication channel between the electronic device 101 and an external electronic device (e.g., the electronic device 102, the electronic device 104, or the server 108), and may support communication through the established channel.
  • the communication module 190 may include one or more communication processors that operate independently of the processor 120 (eg, an application processor) and support direct (eg, wired) communication or wireless communication.
  • The communication module 190 may include a wireless communication module 192 (e.g., a cellular communication module, a short-range wireless communication module, or a global navigation satellite system (GNSS) communication module) or a wired communication module 194 (e.g., a local area network (LAN) communication module or a power line communication module).
  • A corresponding communication module among these communication modules may communicate with the external electronic device 104 through the first network 198 (e.g., a short-range communication network such as Bluetooth, wireless fidelity (WiFi) direct, or infrared data association (IrDA)) or the second network 199 (e.g., a long-range communication network such as a legacy cellular network, a 5G network, a next-generation communication network, the Internet, or a computer network such as a LAN or a WAN).
  • The wireless communication module 192 may identify or authenticate the electronic device 101 within a communication network, such as the first network 198 or the second network 199, using subscriber information (e.g., an International Mobile Subscriber Identity (IMSI)) stored in the subscriber identification module 196.
  • the wireless communication module 192 may support a 5G network, after a 4G network, and next-generation communication technology, for example, new radio (NR) access technology.
  • The NR access technology may support high-speed transmission of high-capacity data (enhanced mobile broadband, eMBB), minimization of terminal power and access by multiple terminals (massive machine type communications, mMTC), or high reliability and low latency (ultra-reliable and low-latency communications, URLLC).
  • the wireless communication module 192 may support a high-frequency band (e.g., the mmWave band) to achieve, for example, a high data rate.
  • The wireless communication module 192 may support various technologies for securing performance in a high-frequency band, for example, beamforming, massive multiple-input and multiple-output (MIMO), full-dimensional MIMO (FD-MIMO), an array antenna, analog beamforming, or a large-scale antenna.
  • the wireless communication module 192 may support various requirements specified in the electronic device 101 , an external electronic device (eg, the electronic device 104 ), or a network system (eg, the second network 199 ).
  • The wireless communication module 192 may support a peak data rate (e.g., 20 Gbps or more) for realizing eMBB, loss coverage (e.g., 164 dB or less) for realizing mMTC, or U-plane latency (e.g., 0.5 ms or less for each of downlink (DL) and uplink (UL), or 1 ms or less round trip) for realizing URLLC.
  • the antenna module 197 may transmit or receive a signal or power to the outside (eg, an external electronic device).
  • the antenna module 197 may include an antenna including a conductor formed on a substrate (eg, a PCB) or a radiator formed of a conductive pattern.
  • According to an embodiment, the antenna module 197 may include a plurality of antennas (e.g., an array antenna). In this case, at least one antenna suitable for a communication method used in a communication network, such as the first network 198 or the second network 199, may be selected from the plurality of antennas by, for example, the communication module 190. A signal or power may be transmitted or received between the communication module 190 and an external electronic device through the selected at least one antenna. According to some embodiments, other components (e.g., a radio frequency integrated circuit (RFIC)) may additionally be formed as part of the antenna module 197.
  • the antenna module 197 may form a mmWave antenna module.
  • The mmWave antenna module may include a printed circuit board, an RFIC disposed on or adjacent to a first surface (e.g., the bottom surface) of the printed circuit board and capable of supporting a designated high-frequency band (e.g., the mmWave band), and a plurality of antennas (e.g., an array antenna) disposed on or adjacent to a second surface (e.g., the top or side surface) of the printed circuit board and capable of transmitting or receiving signals of the designated high-frequency band.
  • At least some of the components may be connected to each other through a communication method between peripheral devices (e.g., a bus, general purpose input and output (GPIO), serial peripheral interface (SPI), or mobile industry processor interface (MIPI)) and may exchange signals (e.g., commands or data) with each other.
  • the command or data may be transmitted or received between the electronic device 101 and the external electronic device 104 through the server 108 connected to the second network 199 .
  • Each of the external electronic devices 102 or 104 may be the same as or different from the electronic device 101 .
  • all or a part of operations executed in the electronic device 101 may be executed in one or more external electronic devices 102 , 104 , or 108 .
  • For example, when the electronic device 101 needs to perform a function or service automatically, or in response to a request from a user or another device, the electronic device 101 may request one or more external electronic devices to perform at least part of the function or service, instead of or in addition to executing the function or service itself.
  • One or more external electronic devices that have received the request may execute at least a part of the requested function or service, or an additional function or service related to the request, and transmit a result of the execution to the electronic device 101 .
  • the electronic device 101 may process the result, as it is or additionally, and provide it as at least part of a response to the request.
  • To this end, for example, cloud computing, distributed computing, mobile edge computing (MEC), or client-server computing technology may be used.
  • the electronic device 101 may provide an ultra-low latency service using, for example, distributed computing or mobile edge computing.
  • the external electronic device 104 may include an Internet of things (IoT) device.
  • Server 108 may be an intelligent server using machine learning and/or neural networks.
  • the external electronic device 104 or the server 108 may be included in the second network 199 .
  • the electronic device 101 may be applied to an intelligent service (eg, smart home, smart city, smart car, or health care) based on 5G communication technology and IoT-related technology.
  • the electronic device may have various types of devices.
  • the electronic device may include, for example, a portable communication device (e.g., a smartphone), a computer device, a portable multimedia device, a portable medical device, a camera, a wearable device, or a home appliance.
  • Terms such as "first", "second", or "first or second" may be used simply to distinguish a given component from other components, and do not limit the components in other aspects (e.g., importance or order). When one (e.g., a first) component is said to be "coupled" or "connected" to another (e.g., a second) component, with or without the terms "functionally" or "communicatively", it means that the one component may be connected to the other component directly (e.g., by wire), wirelessly, or through a third component.
  • module used in various embodiments of this document may include a unit implemented in hardware, software, or firmware, and is interchangeable with terms such as, for example, logic, logic block, component, or circuit.
  • a module may be an integrally formed part or a minimum unit or a part of the part that performs one or more functions.
  • the module may be implemented in the form of an application-specific integrated circuit (ASIC).
  • various embodiments of this document may be implemented as software (e.g., the program 140) including one or more instructions stored in a storage medium readable by a machine (e.g., the electronic device 101); for example, a processor (e.g., the processor 120) of the machine may invoke and execute at least one of the stored instructions.
  • the one or more instructions may include code generated by a compiler or code executable by an interpreter.
  • the device-readable storage medium may be provided in the form of a non-transitory storage medium.
  • "non-transitory" only means that the storage medium is a tangible device and does not include a signal (e.g., an electromagnetic wave); this term does not distinguish between cases where data is stored semi-permanently in the storage medium and cases where it is stored temporarily.
  • the method according to various embodiments disclosed in this document may be provided as included in a computer program product.
  • Computer program products may be traded between sellers and buyers as commodities.
  • the computer program product may be distributed in the form of a machine-readable storage medium (e.g., a compact disc read only memory (CD-ROM)), or may be distributed (e.g., downloaded or uploaded) online through an application store (e.g., Play Store™) or directly between two user devices (e.g., smartphones).
  • a part of the computer program product may be temporarily stored or temporarily generated in a machine-readable storage medium such as a memory of a server of a manufacturer, a server of an application store, or a relay server.
  • each component (e.g., module or program) of the above-described components may include a singular entity or a plurality of entities, and some of the plurality of entities may be separately disposed in another component.
  • one or more components or operations among the above-described corresponding components may be omitted, or one or more other components or operations may be added.
  • according to various embodiments, a plurality of components (e.g., modules or programs) may be integrated into one component. In this case, the integrated component may perform one or more functions of each of the plurality of components identically or similarly to the way the corresponding component among the plurality of components performed them prior to the integration.
  • operations performed by a module, program, or other component may be executed sequentially, in parallel, repeatedly, or heuristically; one or more of the operations may be executed in a different order or omitted, or one or more other operations may be added.
  • FIG. 2 is a block diagram of an electronic device according to an exemplary embodiment.
  • the electronic device 201 disclosed in this document may include at least some of the components of the electronic device described above with reference to FIG. 1 (eg, the electronic device 101 of FIG. 1 ).
  • the electronic device 201 may include a processor 220 (e.g., the processor 120 of FIG. 1), a memory 230 (e.g., the memory 130 of FIG. 1), a pen input interface 250, a display 260, and a communication circuit 290.
  • Processor 220 may be operatively coupled to memory 230 , pen input interface 250 , display 260 , and communication circuitry 290 .
  • the memory 230 may store instructions that, when executed, cause the processor 220 to perform various operations of the electronic device 201 .
  • the pen input interface 250 may be set to receive an input by the input device 202 (eg, a stylus pen or a digital pen).
  • the pen input interface 250 may include a grid-shaped circuit pattern, and may be set to detect an input position of the input device 202 using the circuit pattern.
  • the pen input interface 250 may be embedded in the display 260 or located below the display 260 .
  • the display 260 may include a plurality of pixels.
  • the display 260 may further include a touch input detection circuit and/or a pen input detection circuit (eg, the pen input interface 250 ) for detecting a touch input (eg, a touch input by a user).
  • the processor 220 may distinguish and detect a touch input by an external object (eg, a user's hand) and an input by the input device 202 .
  • the processor 220 may detect a touch input using a touch input detection circuit, and may detect an input of the input device 202 using a pen input detection circuit. For example, the processor 220 may detect a hovering input from the input device 202 when the input device 202 is spaced apart from the display 260 within a specified distance.
  • the processor 220 may detect a pen input (eg, pen touch input or pen contact input) from the input device 202 when one end of the input device 202 is in contact with the display 260 .
  • the processor 220 may detect a button input from the input device 202 when the button of the input device 202 is triggered.
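The hover / contact / button distinction described above can be sketched as a simple classifier. This is a minimal illustration: the field names and the 10 mm hover range are assumptions, not the device's actual driver interface.

```python
from dataclasses import dataclass

# Hypothetical raw event as it might arrive from the pen input interface.
# The field names are illustrative, not the device's actual driver API.
@dataclass
class PenSignal:
    distance_mm: float      # estimated distance between pen tip and display
    button_pressed: bool    # state of the input device's button

HOVER_RANGE_MM = 10.0       # assumed "specified distance" for hovering

def classify_pen_event(sig: PenSignal) -> str:
    """Distinguish button, pen (contact), and hovering inputs."""
    if sig.button_pressed:
        return "button"     # the button of the input device is triggered
    if sig.distance_mm <= 0.0:
        return "pen"        # one end of the input device contacts the display
    if sig.distance_mm <= HOVER_RANGE_MM:
        return "hover"      # spaced apart within the specified distance
    return "none"
```

Because the touch detection circuit and the pen detection circuit are separate, a touch input by a hand would be classified independently of any of these pen event types.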
  • the communication circuit 290 may be configured to support short-range wireless communication based on a Bluetooth protocol (eg, legacy Bluetooth and/or BLE) and/or a wireless LAN.
  • communication circuitry 290 may provide communication with input device 202 .
  • the electronic device 201 may further include a configuration not shown in FIG. 2 .
  • the electronic device 201 may further include a housing.
  • the housing may include a magnetic pad for attachment of the input device 202 and/or a slot for insertion of the input device 202 .
  • the input device 202 may be referred to as a stylus, digital pen, or digitizer pen.
  • the input device 202 may be detachable from the electronic device 201 or may be inserted into the electronic device 201 .
  • the input device 202 may receive an electromagnetic field signal (eg, a proximity signal) generated from a digitizer (eg, the pen input interface 250 ) of the electronic device 201 .
  • the input device 202 may receive an electromagnetic field signal using a resonant circuit.
  • the input device 202 may transmit an electromagnetic resonance (EMR) input signal to the electronic device 201 .
  • the input device 202 may use at least one of an active electrostatic (AES) method or an electrically coupled resonance (ECR) method.
  • the input device 202 may generate a signal using capacitive coupling with the electronic device 201.
  • when the input device 202 transmits a signal by the ECR method, the input device 202 may generate a signal including a resonance frequency based on an electric field generated from a capacitive device of the electronic device 201.
  • the input device 202 may include a communication circuit for communication with the electronic device 201 .
  • the input device 202 may communicate with the electronic device 201 using short-range wireless communication (eg, at least one of Bluetooth, Bluetooth low energy (BLE), or wireless LAN).
  • the input device 202 may include at least one button. When an input for at least one button is received, the input device 202 may transmit a signal corresponding to the button input to the electronic device 201 using a resonance circuit and/or a communication circuit.
  • the electronic device 201 may detect a hovering input of the input device 202 .
  • the electronic device 201 may detect a hovering input of the input device 202 with respect to the first point 310 of the display 260 .
  • the electronic device 201 may identify the first point 310 corresponding to one end of the input device 202 as a point at which a hovering input by the input device 202 is received.
  • the electronic device 201 may receive the electromagnetic field signal from the input device 202 and identify the first point 310 based on the magnitude of the electromagnetic field signal.
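One plausible way to identify the first point 310 from the magnitude of the electromagnetic field signal is a magnitude-weighted centroid over the grid-shaped circuit pattern. The weighting below is a rough illustration under that assumption; real digitizers use calibrated interpolation.

```python
def locate_hover_point(magnitudes):
    """Estimate the hovering position as the magnitude-weighted centroid
    of the electromagnetic field signal measured over the grid pattern.

    `magnitudes` is a 2D list indexed as [row][col]; the weighting scheme
    is illustrative only.
    """
    total = 0.0
    cx = cy = 0.0
    for y, row in enumerate(magnitudes):
        for x, m in enumerate(row):
            total += m
            cx += m * x
            cy += m * y
    if total == 0.0:
        return None  # no signal: no hovering input detected
    return (cx / total, cy / total)
```

For a signal peaked at a single grid cell, the centroid coincides with that cell; with signal spread across neighboring cells, it interpolates between them.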
  • the electronic device 201 may identify a hovering input when one end (eg, a pen tip) of the input device 202 does not physically contact the display 260 .
  • the electronic device 201 may detect a touch input.
  • the electronic device 201 may detect a touch input by detecting a change in voltage and/or a change in the amount of electric charge caused by the external object 390 (eg, a finger).
  • the electronic device 201 may identify a point on the display 260 to which the touch input is input (eg, the second point 320 ).
  • the electronic device 201 may independently identify a touch input and a hovering input.
  • the electronic device 201 may be set to perform different operations according to the type of input. For example, when the touch input and the hovering input are simultaneously identified, the electronic device 201 may perform an operation corresponding to the touch input and an operation corresponding to the hovering input.
  • FIG. 4 illustrates an example of a drawing screen according to a pen input.
  • the electronic device 201 may detect a pen input by the input device 202 .
  • the pen input may be referred to as an input performed while one end (eg, a pen tip) of the input device 202 is in physical contact with the display 260 .
  • the pen input may include a stroke input and/or a spot input by the input device 202 .
  • the electronic device 201 may provide feedback corresponding to a pen input on the display 260. For example, the electronic device 201 may display a line 410 corresponding to a pen input on the drawing screen 710 of the display 260. The electronic device 201 may display the line 410 based on the attributes set for the pen input. Attributes of a pen input may include, for example, color, pen type (e.g., pencil, brush, pastel, etc.), pattern, size, and/or shape.
  • the electronic device 201 may independently identify a pen input, a touch input, and a hovering input.
  • the electronic device 201 may be set to perform different operations according to the type of input. For example, when the touch input and the pen input are simultaneously identified, the electronic device 201 may perform an operation corresponding to the touch input and an operation corresponding to the pen input.
  • the drawing screen 710 may be a logical area in which a user can draw a picture based on a drawing input (eg, a pen input).
  • the drawing screen 710 may be referred to as a drawing area or a drawing layer.
  • the drawing screen 710 may include one or more layers.
  • the drawing screen 710 may be, for example, the portion of the entire logical drawing area that is displayed on the display 260, or the entire area.
  • FIG. 5 illustrates a first UI according to an embodiment.
  • the electronic device 201 may display the first UI 720 .
  • the electronic device 201 may display the first UI 720 at a position corresponding to the touch input.
  • the electronic device 201 may display the first UI 720 on a partial area of the drawing screen 710 .
  • the first UI 720 may be displayed as a pop-up image on the drawing screen 710 .
  • the electronic device 201 may change the display position of the first UI 720 according to a drag input to a point of the first UI 720 .
  • the first UI 720 may be referred to as a first UI area, a first UI layer, or a pop-up image.
  • the first UI 720 may include at least one interface for changing a property of a pen input.
  • the first UI 720 may include a pen type setting UI 723 and a pen color setting UI 725 .
  • the electronic device 201 may change the pen type specified for the pen input based on the input to the pen type setting UI 723 .
  • pen types may include, for example, fountain pens, markers, pencils, and/or brushes. According to a change of the pen type, the shape of the pen tip corresponding to the pen input, the blur of the pen input, the bleeding of the pen input, and/or the transparency of the pen input may be changed.
  • the electronic device 201 may change the color of the pen input based on the input of the pen color setting UI 725 .
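The pen attributes managed through the first UI 720 (pen type, color, and related rendering properties) can be modeled as an immutable record that the setting UIs replace. This is a minimal sketch of one possible data model, not the device's actual implementation.

```python
from dataclasses import dataclass, replace

@dataclass(frozen=True)
class PenAttributes:
    color: str = "black"
    pen_type: str = "fountain"   # e.g. fountain pen, marker, pencil, brush
    size: int = 2
    transparency: float = 0.0    # may change along with the pen type

# An input to the pen type setting UI can be modeled as deriving a new
# attribute set; pen inputs received afterwards are displayed with it.
attrs = PenAttributes()
brush = replace(attrs, pen_type="brush", transparency=0.3)
```

Keeping the record immutable makes it cheap to remember the attribute set each stroke was drawn with, which matters later when preview strokes are reused as shortcuts.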
  • the first UI 720 may include a preview activation button 721 .
  • the electronic device 201 may set a preview area.
  • FIG. 6 illustrates a preview screen according to an exemplary embodiment.
  • the electronic device 201 may receive a touch input by the hand 390 for the preview activation button 721 .
  • the electronic device 201 may set at least a partial area of the drawing screen 710 as the preview screen 730 .
  • the electronic device 201 may maintain the preview screen 730 while the input to the preview activation button 721 is maintained, and may release the setting of the preview screen 730 when the input to the preview activation button 721 is released.
  • the electronic device 201 may set a preview area at a position corresponding to the input device 202 .
  • the preview area may be referred to as a preview screen 730 .
  • the electronic device 201 may set the preview screen 730 at a position where a hovering input of the input device 202 is sensed.
  • the electronic device 201 may set the preview screen 730 centered on the position where the hovering input of the input device 202 is sensed.
  • the preview screen 730 may be a logical area in which a user can draw a picture based on a drawing input (eg, a pen input).
  • the preview screen 730 may be referred to as a preview area or a preview layer.
  • the preview screen 730 may be, for example, the portion of the entire logical preview area that is displayed on the display 260, or the entire area.
  • the preview screen 730 may be displayed while overlapping at least a portion of the drawing screen 710 . Since the input to the preview screen 730 is displayed overlapping the drawing on the drawing screen 710 , the user can predict the drawing result through the input to the preview screen 730 . As will be described later with reference to FIGS. 8 to 15 , when the preview screen 730 is released, at least some of the inputs to the preview screen 730 may be canceled. The user can easily cancel the input to the preview screen 730 by releasing the preview screen 730 .
  • the position, size, and shape of the preview screen 730 illustrated in FIG. 6 are exemplary and embodiments of the present document are not limited thereto.
  • the preview screen 730 may be set as at least a partial area of the drawing screen 710 .
  • the preview screen 730 may be located at a designated location or at the center of the drawing screen 710 .
  • the drawing screen 710 , the first UI 720 , and/or the preview screen 730 may constitute an execution screen of a specified application (eg, a drawing application).
  • a specified application eg, a drawing application
  • FIG. 7 illustrates a screen configuration according to an exemplary embodiment.
  • the drawing screen 710 , the first UI 720 , and the preview screen 730 may be arranged according to a logical hierarchical structure.
  • the drawing screen 710 may correspond to a lower layer of the first UI 720 and the preview screen 730 .
  • the first UI 720 may correspond to the uppermost layer.
  • the preview screen 730 may correspond to a layer between the first UI 720 and the drawing screen 710 .
  • although the drawing screen 710 is illustrated as one layer in FIG. 7, the drawing screen 710 may be composed of a plurality of layers.
  • the electronic device 201 may process an input to a point of the display 260 as an input to the uppermost layer at the corresponding point. For example, when an input is received at a location corresponding to the first UI 720, the electronic device 201 may process it as an input to the first UI 720. When an input is received at a location where the preview screen 730 and the drawing screen 710 overlap, the electronic device 201 may process it as an input to the preview screen 730. When an input is received at a location on the drawing screen 710 that overlaps neither the preview screen 730 nor the first UI 720, the electronic device 201 may process it as an input to the drawing screen 710.
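The topmost-layer routing rule can be sketched as a hit test over z-ordered layers. The rectangles, coordinates, and layer names below are illustrative assumptions chosen to mirror the hierarchy of FIG. 7 (first UI above preview screen above drawing screen).

```python
from dataclasses import dataclass

@dataclass
class Layer:
    name: str
    z: int              # higher z = closer to the top of the hierarchy
    bounds: tuple       # (x0, y0, x1, y1); illustrative rectangles

def route_input(layers, point):
    """Deliver the input at `point` to the uppermost layer covering it."""
    x, y = point
    hit = [l for l in layers
           if l.bounds[0] <= x < l.bounds[2] and l.bounds[1] <= y < l.bounds[3]]
    if not hit:
        return None
    return max(hit, key=lambda l: l.z).name

# Hypothetical layout: the drawing screen fills the display, while the
# preview screen and first UI occupy sub-rectangles on higher layers.
screen = [
    Layer("drawing", z=0, bounds=(0, 0, 100, 100)),
    Layer("preview", z=1, bounds=(10, 10, 60, 60)),
    Layer("first_ui", z=2, bounds=(40, 40, 80, 80)),
]
```

A point inside all three rectangles resolves to the first UI; a point inside the preview rectangle but outside the first UI resolves to the preview screen; elsewhere the input falls through to the drawing screen.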
  • FIG. 8 illustrates provision of a first UI according to an embodiment.
  • an image is displayed on the drawing screen 710 .
  • the image on the drawing screen 710 may include at least one of an image based on a pen input and a loaded image.
  • the image is for illustrative purposes only, and embodiments of the present document are not limited thereto.
  • a touch input by an external object 390 is received at a point on the drawing screen 710 .
  • the electronic device 201 may display the first UI 720 at a position corresponding to the touch input.
  • the first UI 720 may include a preview activation button 721 .
  • FIG. 9 illustrates a preview image according to an exemplary embodiment.
  • the electronic device 201 may receive a touch input for the preview activation button 721 .
  • the electronic device 201 may set the preview screen 730 based on the input. For example, the electronic device 201 may set the preview screen 730 at a position corresponding to the input device 202 .
  • the electronic device 201 may provide a visual feedback corresponding to the preview screen 730 .
  • the electronic device 201 may display an outline of the preview screen 730 .
  • the electronic device 201 may set the transparency of the preview screen 730 to a specified value.
  • the electronic device 201 may set the preview screen 730 to be translucent to provide visual feedback on the preview screen 730 .
  • the electronic device 201 may receive at least one pen input by the input device 202 .
  • the electronic device 201 may maintain the preview screen 730 while an input to the preview activation button 721 is maintained.
  • the electronic device 201 may receive a pen input by the input device 202 while displaying the preview screen 730 .
  • the electronic device 201 may receive a first pen input 910 and a second pen input 920 with respect to the preview screen 730.
  • the user may perform the first pen input 910 after setting the pen input property as the first property through the input to the first UI 720 .
  • the electronic device 201 may display the first pen input 910 according to the first attribute.
  • the user may change the pen input property from the first property to the second property through the input to the first UI 720 , and then perform the second pen input 920 .
  • the electronic device 201 may display the second pen input 920 according to the second attribute.
  • the electronic device 201 may receive a pen input to the drawing screen 710 other than the preview screen 730 .
  • the electronic device 201 may receive the third pen input 930 .
  • although the electronic device 201 is illustrated as receiving an input for the drawing screen 710 outside the preview screen 730, embodiments of the present document are not limited thereto.
  • the electronic device 201 may not allow drawing of an area outside the preview screen 730 while displaying the preview screen 730 .
  • FIG. 10 illustrates cancellation of a preview image according to an embodiment.
  • FIG. 10 shows a situation in which an input to the preview activation button 721 from reference numeral 903 of FIG. 9 is released.
  • the electronic device 201 may cancel at least some of the input to the preview screen 730 .
  • the electronic device 201 may cancel an unspecified input among the input to the preview screen 730 .
  • the electronic device 201 may stop displaying the canceled input.
  • the electronic device 201 may cancel the first pen input 910 and the second pen input 920 of FIG. 9. While the first pen input 910 and the second pen input 920 to the preview screen 730 are canceled, the electronic device 201 may maintain an input to the drawing screen 710 other than the preview screen 730 (e.g., the third pen input 930).
  • FIG. 11 illustrates application of a partial preview image according to an embodiment.
  • FIG. 11 may correspond to a situation following reference numeral 903 of FIG. 9 .
  • the electronic device 201 may receive a selection input for at least a partial region of the preview screen 730 .
  • the electronic device 201 may receive a selection input by the input device 202 .
  • the electronic device 201 may change the input property of the input device 202 to a selection tool.
  • the electronic device 201 may change the input property of the input device 202 to a selection tool based on an input to the first UI 720 or an input to a button of the input device 202 .
  • the electronic device 201 may set the selection area 1110 based on the pen input of the input device 202 set as the selection tool. It may be assumed that a position corresponding to the first pen input 910 is set as the selection area 1110 , but the second pen input 920 is not selected.
  • an input to the preview activation button 721 is released.
  • the electronic device 201 may cancel the remaining inputs to the preview screen 730, except for the input corresponding to the region set as the selection area 1110.
  • the electronic device 201 may maintain the first pen input 910 and cancel the second pen input 920 .
  • the partial area of the preview screen 730 may be applied to the drawing screen 710 .
  • the electronic device 201 may merge the first pen input 910 into the same layer as the drawing screen 710 .
  • the electronic device 201 may apply the first pen input 910 on a layer separate from the drawing screen 710 .
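The release behavior of FIGS. 10 and 11 — canceling preview inputs except those covered by a selection area, while inputs to the drawing screen survive — can be sketched as a small session object. The stroke identifiers and list-based layers are illustrative assumptions, not the device's actual rendering model.

```python
class PreviewSession:
    """Sketch of the preview lifecycle: strokes drawn while the preview
    activation button is held live in a preview layer; when the button is
    released, only selected strokes are merged into the drawing layer."""

    def __init__(self, drawing_layer):
        self.drawing = drawing_layer   # committed strokes (drawing screen 710)
        self.preview = []              # pending strokes (preview screen 730)
        self.selected = set()          # strokes covered by a selection area

    def draw(self, stroke_id):
        self.preview.append(stroke_id)

    def select(self, stroke_id):
        self.selected.add(stroke_id)

    def release(self):
        """Preview activation button released: apply selected strokes to
        the drawing layer and cancel the rest."""
        self.drawing.extend(s for s in self.preview if s in self.selected)
        self.preview.clear()
        self.selected.clear()

# Mirrors FIG. 11: the first pen input is selected and kept, the second is
# canceled, and the earlier input to the drawing screen is unaffected.
session = PreviewSession(drawing_layer=["third_pen_input"])
session.draw("first_pen_input")
session.draw("second_pen_input")
session.select("first_pen_input")
session.release()
```

With an empty selection, `release()` reduces to the FIG. 10 case: every preview stroke is canceled and only the drawing-screen input remains.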
  • FIG. 12 may correspond to a situation following reference numeral 903 of FIG. 9 .
  • the electronic device 201 may receive a selection input for at least a partial region of the preview screen 730 .
  • the electronic device 201 may set the selection area 1210 based on a pen input of the input device 202 set as the selection tool. It may be assumed that a position corresponding to the second pen input 920 is set as the selection area 1210 , but the first pen input 910 is not selected.
  • the electronic device 201 may receive a drag input for moving the second pen input 920 corresponding to the selection area 1210 out of the preview screen 730 .
  • the electronic device 201 may move the second pen input 920 out of the preview screen 730 based on the drag input.
  • the input to the preview activation button 721 is released.
  • the electronic device 201 may cancel the remaining inputs to the preview screen 730, except for the input corresponding to the region set as the selection area 1210.
  • the electronic device 201 may cancel the first pen input 910 .
  • the electronic device 201 may set the second pen input 920 moved out of the preview screen 730 as a shortcut 1230.
  • the electronic device 201 may set the shortcut 1230 as a shortcut for the property and/or shape of the second input 920 .
  • the electronic device 201 may set the properties and/or shape of the pen input according to the properties and/or shape of the shortcut 1230 .
  • the electronic device 201 may display the shortcut 1230 on an upper layer of the drawing screen 710 .
  • the shortcut 1230 may be referred to as, for example, an icon or a floating user interface (UI).
  • FIG. 13 illustrates a pen input using a shortcut according to an exemplary embodiment.
  • FIG. 13 may correspond to a situation following reference numeral 1203 of FIG. 12 .
  • the electronic device 201 may receive a pen input for the shortcut 1230 .
  • the electronic device 201 may set the property of the pen input as the property of the shortcut 1230 based on the pen input for the shortcut 1230 .
  • the electronic device 201 may receive a pen input 1310 .
  • the electronic device 201 may display the pen input 1310 according to the properties of the shortcut 1230 .
  • the user may apply at least a portion of the input to the preview screen 730 to the input to the drawing screen 710 by using the shortcut 1230 .
  • FIG. 14 may correspond to a situation following reference numeral 1203 of FIG. 12 .
  • the electronic device 201 may receive a pen input for the shortcut 1230 .
  • the electronic device 201 may set the shape and properties of the pen input to those of the shortcut 1230 based on the pen input for the shortcut 1230 .
  • shortcut 1230 may be used as a stamp.
  • the electronic device 201 may receive a pen input 1410 .
  • the electronic device 201 may pattern the properties and shape of the shortcut 1230 and display the pen input 1410 according to the pattern.
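The shortcut behavior of FIGS. 13 and 14 can be sketched as a record of the kept preview stroke's properties and shape, replayed at each new pen input position when used as a stamp. The field names and the offset-based shape representation are illustrative assumptions.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Shortcut:
    """A preview stroke kept as a preset: it stores the stroke's
    properties and, for stamp use, its shape."""
    color: str
    pen_type: str
    shape: tuple  # (x, y) offsets of the stroke relative to the stamp point

def stamp(shortcut, at):
    """Render the shortcut's patterned shape at a new pen input position
    (the FIG. 14 stamp behavior)."""
    ax, ay = at
    return [(ax + dx, ay + dy) for dx, dy in shortcut.shape]

# Hypothetical shortcut derived from the second pen input of FIG. 12.
star = Shortcut(color="red", pen_type="marker", shape=((0, 0), (1, 1), (2, 0)))
```

For the FIG. 13 behavior, only the stored properties (`color`, `pen_type`) would be applied to subsequent pen inputs, without replaying the shape.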
  • FIG. 15 is a flowchart of a method for preview drawing according to an embodiment.
  • the electronic device 201 may perform the operations of FIG. 15 .
  • the processor 220 of the electronic device 201 may be configured to perform the operations of FIG. 15 when instructions stored in the memory 230 are executed.
  • the electronic device 201 may display the first image on the display 260 .
  • the first image may correspond to the drawing screen 710 .
  • the first image may include a drawing and/or a background image based on a user's pen input.
  • the electronic device 201 may receive a first touch input.
  • the first touch input may be a touch input for a designated UI (eg, a preview activation button 721).
  • the electronic device 201 may receive a touch input prior to the first touch input and may display a designated UI (e.g., the first UI 720 including the preview activation button 721) based on the touch input.
  • the electronic device 201 may receive a touch input and display a designated UI based on the touch input.
  • although the electronic device 201 is described with reference to FIGS. 5 to 14 as displaying the first UI 720 based on a touch input, embodiments of the present document are not limited thereto.
  • the electronic device 201 may display a designated UI (eg, a preview activation button 721) together with the first image.
  • the electronic device 201 may determine whether the first touch input is maintained. For example, when the first touch input for the specified UI is maintained, the electronic device 201 may determine that the first touch input is maintained. When the first touch input for the specified UI is released, the electronic device 201 may determine that the first touch input is not maintained.
  • the electronic device 201 may set at least a partial area of the first image as a preview area (eg, the preview screen 730 ).
  • the electronic device 201 may display the first preview image by superimposing it on the first image based on the first input (eg, pen input).
  • the electronic device 201 may display the first input in the preview area on the first image according to the first attribute set for the first input. For example, the electronic device 201 may display the first input according to the first attribute.
  • the electronic device 201 may set the pen input property as the first property based on the user input.
  • the electronic device 201 may remove the preview area and the first preview image.
  • the electronic device 201 may cancel the preview area setting and stop displaying the first preview image. In this case, the electronic device 201 may maintain the pen input attribute as the first attribute.
  • although the electronic device 201 stops displaying the first preview image in operation 1530, embodiments of the present document are not limited thereto.
  • the electronic device 201 may not remove at least a portion of the preview image as described above with reference to FIGS. 11 to 14 .
  • the electronic device 201 may not remove the selected area of the preview image.
  • the electronic device 201 may apply a preview image corresponding to the selected area to the first image.
  • the electronic device 201 may generate a preview image corresponding to the selected area as a shortcut 1230 .
  • according to an embodiment, the electronic device 201 includes a display 260, a pen input interface 250 configured to receive a pen input from a digital pen (e.g., the input device 202), a processor 220, and a memory 230.
  • the memory 230 may store instructions that, when executed, cause the processor 220 to perform operations to be described later.
  • the pen input interface 250 may be located within the display 260 .
  • the processor 220 may display a first image on the display 260, receive a first touch input to the display 260, set at least a partial area of the display 260 as a preview area while the first touch input is maintained, set a property for an input of the digital pen as a first property, receive a first input for the preview area from the digital pen, and display a first preview image based on the first input by superimposing it on the first image.
  • when the first touch input is released, the processor 220 may stop displaying the first preview image and maintain the property of the digital pen input as the first property. For example, the processor 220 may set the preview area at a position corresponding to a hovering input by the digital pen.
  • the first touch input may include a touch input for a button on the display.
  • the processor 220 may receive a second touch input for the display, and display a user interface including the button on the display based on the second touch input.
  • while the first touch input is maintained, the processor 220 may receive a second input for the preview area from the digital pen, display a second preview image based on the second input by superimposing it on the first image, and receive a selection input for the second preview image. When the first touch input is released, the processor 220 may stop displaying the first preview image and apply the second preview image designated by the selection input to the first image. The processor 220 may move the second preview image designated by the selection input to the outside of the preview area based on a drag input while the first touch input is maintained; when the first touch input is released, the processor 220 may set the second preview image as a shortcut and maintain the display of the second preview image.
  • the processor 220 may change the property of the input to the digital pen according to the property of the second preview image.
  • the processor 220 may change the property and shape of the input to the digital pen according to the property and shape of the second preview image.
  • a method for preview drawing of an electronic device 201 includes displaying a first image on a display 260 of the electronic device, receiving a first touch input on the display 260, setting a preview area for at least a partial area of the display while the first touch input is maintained, and releasing the preview area when the first touch input is released.
  • the setting of the preview area may include setting a property for an input of the digital pen as a first property, receiving a first input for the preview area from the digital pen, and displaying a first preview image based on the first input by superimposing it on the first image.
  • the operation of releasing the preview area may include stopping the display of the first preview image and maintaining an input property of the digital pen as the first property.
  • the setting of the preview area may further include setting the preview area at a position corresponding to a hovering input by the digital pen.
  • the setting of the preview area may include receiving a second input for the preview area from the digital pen, and displaying a second preview image based on the second input by superimposing it on the first image.
  • the method may further include receiving a selection input for the second preview image.
  • the operation of releasing the preview area may further include applying the second preview image designated by the selection input to the first image.
  • the setting of the preview area may further include moving the second preview image designated by the selection input out of the preview area based on a drag input, and the releasing of the preview area may further include setting the second preview image as a shortcut and maintaining display of the second preview image.
  • the method may further include, when an input for the shortcut is received, changing a property of the digital pen input according to a property of the second preview image.
  • the method may further include, when the input for the shortcut is received, changing the property and shape of the digital pen input according to the property and shape of the second preview image.
  • the first touch input may include a touch input for a button on the display.
  • the method may further include receiving a second touch input for the display, and displaying a user interface including the button on the display based on the second touch input.
  • the user interface may further include at least one graphic object for changing a property of an input to the digital pen.
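The bullets above describe a small interaction state machine: while a first touch input (e.g. on a button) is held, a preview area is active and pen strokes are rendered only as a preview superimposed on the first image; when the touch is released, the preview is discarded but the pen property set during previewing is retained. The sketch below is a hypothetical illustration of that flow; all class and method names are illustrative and do not come from the patent.

```python
# Hypothetical sketch of the preview-drawing state machine described above.
# Names (PreviewDrawingController, press_preview_button, ...) are illustrative.

class PreviewDrawingController:
    def __init__(self, base_image):
        self.base_image = base_image      # the "first image" (list of applied strokes)
        self.preview_active = False       # preview area set while first touch is held
        self.preview_strokes = []         # strokes shown superimposed on base_image
        self.pen_property = "default"     # e.g. brush type / color / width

    def press_preview_button(self, pen_property):
        """First touch input: set the preview area and the pen's first property."""
        self.preview_active = True
        self.pen_property = pen_property

    def pen_input(self, stroke):
        """Pen input in the preview area is drawn only on the preview layer."""
        if self.preview_active:
            self.preview_strokes.append((stroke, self.pen_property))

    def apply_selected(self, stroke_with_property):
        """Selection input: apply a designated preview stroke to the first image."""
        self.base_image.append(stroke_with_property)

    def release_preview_button(self):
        """Releasing the first touch input: stop displaying the preview,
        but keep the pen property that was set while previewing."""
        self.preview_active = False
        discarded = self.preview_strokes
        self.preview_strokes = []
        return discarded                  # unselected previews never touch base_image
```

A short usage example: holding the button, drawing one preview stroke, then releasing leaves the base image untouched while the pen keeps its new property.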

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

Described is an electronic device comprising a display, a processor, a memory, and a pen input interface configured to receive a pen input from a digital pen. The electronic device may display a first image on the display and receive a first touch input on the display. The electronic device may set at least a partial area of the display as a preview area while the first touch input is maintained, set a property for input of the digital pen as a first property, receive a first input for the preview area from the digital pen, and display a first preview image based on the first input by superimposing the first preview image on the first image. When the first touch input is released, the electronic device may stop displaying the first preview image and maintain the property for input of the digital pen as the first property. Various other embodiments identified throughout this document are possible.
PCT/KR2021/011963 2020-09-04 2021-09-03 Preview drawing method and electronic device therefor WO2022050772A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR1020200113387A KR20220031455A (ko) 2020-09-04 2020-09-04 Preview drawing method and electronic device therefor
KR10-2020-0113387 2020-09-04

Publications (1)

Publication Number Publication Date
WO2022050772A1 true WO2022050772A1 (fr) 2022-03-10 Preview drawing method and electronic device therefor

Family

ID=80491223

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2021/011963 WO2022050772A1 (fr) 2020-09-04 2021-09-03 Preview drawing method and electronic device therefor

Country Status (2)

Country Link
KR (1) KR20220031455A (fr)
WO (1) WO2022050772A1 (fr)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20130080102A * 2012-01-04 2013-07-12 LG Electronics Inc. Mobile terminal and control method thereof
KR20130092934A * 2012-02-13 2013-08-21 Samsung Electronics Co., Ltd. Tablet having user interface
KR20140070196A * 2012-11-30 2014-06-10 LG Electronics Inc. Mobile terminal and control method thereof
KR20150080454A * 2013-12-30 2015-07-09 Samsung Electronics Co., Ltd. User terminal device for providing user interaction and method therefor
KR20160006516A * 2014-07-09 2016-01-19 LG Electronics Inc. Mobile terminal and control method thereof


Also Published As

Publication number Publication date
KR20220031455A (ko) 2022-03-11

Similar Documents

Publication Publication Date Title
WO2022030890A1 Multi-window image capturing method and electronic device therefor
WO2022211271A1 Electronic device for processing handwriting input on the basis of learning, operation method thereof, and storage medium
WO2022119276A1 Flexible display electronic device and operation method thereof
WO2022031048A1 Electronic device and method for displaying electronic pen pointer thereof
WO2022030921A1 Electronic device and screen control method thereof
WO2022086272A1 Electronic device for providing user interface, and method therefor
WO2022177299A1 Call function control method and electronic device supporting same
WO2022108125A1 Electronic device comprising battery and method therefor
WO2022086071A1 Electronic device for controlling operation of electronic pen device, operation method in electronic device, and non-transitory storage medium
WO2021177640A1 Method for controlling application of external electronic device, and electronic device supporting same
WO2022030933A1 Electronic device and handwriting input processing method thereof
WO2022050772A1 Preview drawing method and electronic device therefor
WO2023146142A1 Electronic device, and notification and connection method for electronic pen
WO2022119416A1 Electronic device using electronic pen and method thereof
WO2022103010A1 Electronic device and operation method thereof
WO2022181949A1 Electronic device for providing AR/VR environment, and operation method thereof
WO2024019300A1 Electronic device and method for detecting attachment of user input device
WO2024101704A1 Wearable device and method for identifying touch input, and non-transitory computer-readable storage medium
WO2023146173A1 Screen provision method and electronic device supporting same
WO2023163380A1 Electronic device and pen input device, and method for using multiple pen input devices in electronic device
WO2021242025A1 Electronic device, operation method of electronic device, and non-transitory recording medium
WO2023106622A1 Electronic apparatus comprising flexible display
WO2022177105A1 Electronic device having transparent display and operation method thereof
WO2022103108A1 Electronic device and method for detecting touch input on electronic device
WO2021162268A1 Electronic device and method for synchronizing editing of objects between launcher screens thereof

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
Ref document number: 21864731; Country of ref document: EP; Kind code of ref document: A1
NENP Non-entry into the national phase
Ref country code: DE
122 Ep: pct application non-entry in european phase
Ref document number: 21864731; Country of ref document: EP; Kind code of ref document: A1