WO2022065844A1 - Method for displaying a preview image and electronic device supporting the same


Info

Publication number
WO2022065844A1
WO2022065844A1 PCT/KR2021/012863 KR2021012863W WO2022065844A1 WO 2022065844 A1 WO2022065844 A1 WO 2022065844A1 KR 2021012863 W KR2021012863 W KR 2021012863W WO 2022065844 A1 WO2022065844 A1 WO 2022065844A1
Authority
WO
WIPO (PCT)
Prior art keywords
electronic device
image
area
input
camera
Prior art date
Application number
PCT/KR2021/012863
Other languages
English (en)
Korean (ko)
Inventor
강보순
이정원
김혜령
이다현
진준호
Original Assignee
Samsung Electronics Co., Ltd. (삼성전자 주식회사)
Priority date
Filing date
Publication date
Application filed by Samsung Electronics Co., Ltd.
Publication of WO2022065844A1

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04M: TELEPHONIC COMMUNICATION
    • H04M 1/02: Constructional features of telephone sets
    • H04M 1/0216: Foldable telephones, foldable in one direction, i.e. using a one degree of freedom hinge
    • H04M 1/0264: Details of the structure or mounting of a camera module assembly
    • H04M 1/72469: User interfaces specially adapted for cordless or mobile telephones, for operating the device by selecting functions from two or more displayed items, e.g. menus or icons
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/60: Control of cameras or camera modules
    • H04N 23/632: Graphical user interfaces [GUI] for displaying or modifying preview images prior to image capturing, e.g. variety of image resolutions or capturing parameters
    • H04N 23/661: Transmitting camera control signals through networks, e.g. control via the Internet
    • H04N 23/667: Camera operation mode switching, e.g. between still and video, sport and normal or high- and low-resolution modes
    • H04N 23/695: Control of camera direction for changing a field of view, e.g. pan, tilt or based on tracking of objects

Definitions

  • Various embodiments disclosed in this document relate to a method of displaying a preview image and an electronic device supporting the same.
  • the digital drawing application may provide a drawing environment based on a stylus pen and a user's touch input.
  • the digital drawing application may provide a user interface for changing the properties of the tool corresponding to the input.
  • a preview image provided by a digital drawing application may be acquired in various ways.
  • the preview image may be acquired by changing the folding angle of the electronic device.
  • The electronic device may detect a specified touch input (e.g., a pen touch input) and, in response to the detected input, perform a preview-image function that overlays an image on the screen displayed on the display.
  • The electronic device may be unable to display a preview image that matches the user's intention because of framing limitations. For example, when a user photographs the external environment while holding the electronic device in one hand, it may be difficult to capture various angles of view.
  • While the digital drawing application is running, the user may be provided with the drawing function while the electronic device is held in a fixed position. To be provided with a preview image captured from an angle other than the one currently provided, the user previously had to change the fixed state of the electronic device.
  • An electronic device according to various embodiments may include a camera supporting a first angle of view, a second angle of view, and a third angle of view; a first housing; a second housing; a hinge structure disposed between the first housing and the second housing; a flexible display; a processor; and a memory operatively coupled to the processor.
  • The memory may store one or more instructions that, when executed, cause the processor to: display a user interface (UI) including a plurality of buttons for changing a drawing property of an input in a first area of the flexible display; display, in a second area of the flexible display, a first image acquired using the camera set to the first angle of view; overlay an image generated based on a drawing input within a specified range of the second area on the first image displayed in the second area; and, when the drawing input is detected in at least a part of the second area outside the specified range, display in the second area a second image acquired by changing the angle of view of the camera to the second angle of view, which is wider than the first angle of view.
  • According to various embodiments, a method for a foldable electronic device to display a preview image includes: displaying a user interface (UI) including a plurality of buttons for changing a drawing property of an input in a first area of a flexible display; displaying, in a second area of the flexible display, a first image acquired using a camera set to a first angle of view; overlaying an image generated based on a drawing input within a specified range of the second area on the first image displayed in the second area; and, when the drawing input is sensed in at least a portion of the second area outside the specified range, displaying in the second area a second image acquired by changing the angle of view of the camera to a second angle of view wider than the first angle of view. A rough sketch of this control flow is given below.
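  • The instruction sequence above can be read as a simple input-routing rule. The following Kotlin sketch is illustrative only; the AngleOfView, CameraController, and PreviewDrawingScreen names, and the one-step widening policy, are assumptions introduced here and are not taken from the disclosure.

      enum class AngleOfView { FIRST, SECOND, THIRD }

      data class Point(val x: Float, val y: Float)

      data class Rect(val left: Float, val top: Float, val right: Float, val bottom: Float) {
          fun contains(p: Point) = p.x in left..right && p.y in top..bottom
      }

      data class Preview(val angle: AngleOfView)

      // Minimal stand-in for the camera described in the claims.
      interface CameraController {
          var angleOfView: AngleOfView
          fun capturePreview(): Preview
      }

      class PreviewDrawingScreen(
          private val camera: CameraController,
          private val specifiedRange: Rect, // drawing region inside the second display area
      ) {
          private val strokes = mutableListOf<Point>()

          // Called for each drawing input (pen touch or finger touch) detected in the second area.
          fun onDrawingInput(position: Point) {
              if (specifiedRange.contains(position)) {
                  // Inside the specified range: keep the current preview and record the stroke,
                  // which is then overlaid on the displayed image.
                  strokes += position
              } else {
                  // Outside the specified range: widen the angle of view by one step and
                  // show the newly acquired preview image in the second area.
                  camera.angleOfView = when (camera.angleOfView) {
                      AngleOfView.FIRST -> AngleOfView.SECOND
                      AngleOfView.SECOND, AngleOfView.THIRD -> AngleOfView.THIRD
                  }
              }
              render(camera.capturePreview(), strokes)
          }

          private fun render(preview: Preview, strokes: List<Point>) {
              // Display the preview image in the second area and draw the strokes over it.
              println("preview at ${preview.angle} with ${strokes.size} stroke points")
          }
      }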
  • the electronic device may provide an intuitive application use experience to the user by changing the angle of view of the camera, the photographing angle, and/or the shape of the electronic device based on the drawing input.
  • the electronic device may provide convenience to the user through a preview image obtained based on intuitively changing conditions.
  • FIG. 1 is a block diagram of an electronic device in a network environment, according to various embodiments of the present disclosure
  • FIG. 2 is a block diagram illustrating a camera module according to various embodiments of the present disclosure
  • FIG. 3 is a block diagram of a display module according to various embodiments of the present disclosure.
  • FIG. 4 is a block diagram illustrating components of an electronic device according to various embodiments of the present disclosure.
  • FIG. 5 illustrates an electronic device that provides a preview image, according to various embodiments of the present disclosure.
  • FIG. 6 illustrates an electronic device for acquiring a preview image at various angles of view, according to various embodiments of the present disclosure.
  • FIG. 7 illustrates an electronic device for acquiring a preview image at various angles of view, according to various embodiments of the present disclosure.
  • FIG. 8 illustrates an electronic device for acquiring a preview image by rotating a camera, according to various embodiments of the present disclosure
  • FIG. 9 illustrates an electronic device for acquiring a preview image by rotating a camera, according to various embodiments of the present disclosure
  • FIG. 10 illustrates an electronic device acquiring a preview image by changing a folding angle, according to various embodiments of the present disclosure
  • FIG. 11 illustrates an electronic device for acquiring a preview image in various forms, according to various embodiments of the present disclosure.
  • FIG. 12 illustrates an electronic device for acquiring a preview image in various forms, according to various embodiments of the present disclosure.
  • FIG. 13 illustrates an electronic device displaying various types of drawing images on a display, according to various embodiments of the present disclosure
  • FIG. 1 is a block diagram of an electronic device 101 in a network environment 100 according to various embodiments.
  • Referring to FIG. 1, in the network environment 100, the electronic device 101 may communicate with the electronic device 102 through a first network 198 (e.g., a short-range wireless communication network), or with the electronic device 104 or the server 108 through a second network 199 (e.g., a long-range wireless communication network). According to an embodiment, the electronic device 101 may communicate with the electronic device 104 through the server 108.
  • According to an embodiment, the electronic device 101 may include a processor 120, a memory 130, an input module 150, a sound output module 155, a display module 160, an audio module 170, a sensor module 176, an interface 177, a connection terminal 178, a haptic module 179, a camera module 180, a power management module 188, a battery 189, a communication module 190, a subscriber identification module 196, or an antenna module 197.
  • In some embodiments, at least one of these components (e.g., the connection terminal 178) may be omitted from the electronic device 101, or one or more other components may be added.
  • In some embodiments, some of these components may be integrated into one component (e.g., the display module 160).
  • The processor 120 may, for example, execute software (e.g., a program 140) to control at least one other component (e.g., a hardware or software component) of the electronic device 101 connected to the processor 120, and may perform various data processing or operations. According to an embodiment, as at least part of the data processing or operations, the processor 120 may store commands or data received from another component (e.g., the sensor module 176 or the communication module 190) in the volatile memory 132, process the commands or data stored in the volatile memory 132, and store the resulting data in the non-volatile memory 134.
  • According to an embodiment, the processor 120 may include a main processor 121 (e.g., a central processing unit or an application processor) or an auxiliary processor 123 (e.g., a graphics processing unit, a neural processing unit (NPU), an image signal processor, a sensor hub processor, or a communication processor).
  • The auxiliary processor 123 may, for example, control at least some of the functions or states related to at least one of the components of the electronic device 101 (e.g., the display module 160, the sensor module 176, or the communication module 190), on behalf of the main processor 121 while the main processor 121 is in an inactive (e.g., sleep) state, or together with the main processor 121 while the main processor 121 is in an active (e.g., application-executing) state.
  • According to an embodiment, the auxiliary processor 123 (e.g., an image signal processor or a communication processor) may be implemented as part of another functionally related component (e.g., the camera module 180 or the communication module 190).
  • the auxiliary processor 123 may include a hardware structure specialized for processing an artificial intelligence model.
  • Artificial intelligence models can be created through machine learning. Such learning may be performed, for example, in the electronic device 101 itself on which artificial intelligence is performed, or may be performed through a separate server (eg, the server 108).
  • The learning algorithm may include, for example, supervised learning, unsupervised learning, semi-supervised learning, or reinforcement learning, but is not limited to the above examples.
  • the artificial intelligence model may include a plurality of artificial neural network layers.
  • The artificial neural network may be one of a deep neural network (DNN), a convolutional neural network (CNN), a recurrent neural network (RNN), a restricted Boltzmann machine (RBM), a deep belief network (DBN), a bidirectional recurrent deep neural network (BRDNN), a deep Q-network, or a combination of two or more of these, but is not limited to the above examples.
  • Additionally or alternatively, the artificial intelligence model may include a software structure in addition to the hardware structure.
  • the memory 130 may store various data used by at least one component of the electronic device 101 (eg, the processor 120 or the sensor module 176 ).
  • the data may include, for example, input data or output data for software (eg, the program 140 ) and instructions related thereto.
  • the memory 130 may include a volatile memory 132 or a non-volatile memory 134 .
  • the program 140 may be stored as software in the memory 130 , and may include, for example, an operating system 142 , middleware 144 , or an application 146 .
  • the input module 150 may receive a command or data to be used in a component (eg, the processor 120 ) of the electronic device 101 from the outside (eg, a user) of the electronic device 101 .
  • the input module 150 may include, for example, a microphone, a mouse, a keyboard, a key (eg, a button), or a digital pen (eg, a stylus pen).
  • the sound output module 155 may output a sound signal to the outside of the electronic device 101 .
  • the sound output module 155 may include, for example, a speaker or a receiver.
  • the speaker can be used for general purposes such as multimedia playback or recording playback.
  • the receiver may be used to receive an incoming call. According to one embodiment, the receiver may be implemented separately from or as part of the speaker.
  • the display module 160 may visually provide information to the outside (eg, a user) of the electronic device 101 .
  • The display module 160 may include, for example, a display, a hologram device, or a projector, and a control circuit for controlling the corresponding device.
  • the display module 160 may include a touch sensor configured to sense a touch or a pressure sensor configured to measure the intensity of a force generated by the touch.
  • The audio module 170 may convert sound into an electrical signal or, conversely, convert an electrical signal into sound. According to an embodiment, the audio module 170 may acquire sound through the input module 150, or may output sound through the sound output module 155 or an external electronic device (e.g., the electronic device 102, such as a speaker or headphones) connected directly or wirelessly with the electronic device 101.
  • The sensor module 176 may detect an operating state (e.g., power or temperature) of the electronic device 101 or an external environmental state (e.g., a user state), and may generate an electrical signal or data value corresponding to the detected state.
  • According to an embodiment, the sensor module 176 may include, for example, a gesture sensor, a gyro sensor, a barometric pressure sensor, a magnetic sensor, an acceleration sensor, a grip sensor, a proximity sensor, a color sensor, an infrared (IR) sensor, a biometric sensor, a temperature sensor, a humidity sensor, or an illuminance sensor.
  • the interface 177 may support one or more designated protocols that may be used by the electronic device 101 to directly or wirelessly connect with an external electronic device (eg, the electronic device 102 ).
  • the interface 177 may include, for example, a high definition multimedia interface (HDMI), a universal serial bus (USB) interface, an SD card interface, or an audio interface.
  • the connection terminal 178 may include a connector through which the electronic device 101 can be physically connected to an external electronic device (eg, the electronic device 102 ).
  • the connection terminal 178 may include, for example, an HDMI connector, a USB connector, an SD card connector, or an audio connector (eg, a headphone connector).
  • the haptic module 179 may convert an electrical signal into a mechanical stimulus (eg, vibration or movement) or an electrical stimulus that the user can perceive through tactile or kinesthetic sense.
  • the haptic module 179 may include, for example, a motor, a piezoelectric element, or an electrical stimulation device.
  • the camera module 180 may capture still images and moving images. According to an embodiment, the camera module 180 may include one or more lenses, image sensors, image signal processors, or flashes.
  • the power management module 188 may manage power supplied to the electronic device 101 .
  • the power management module 188 may be implemented as, for example, at least a part of a power management integrated circuit (PMIC).
  • the battery 189 may supply power to at least one component of the electronic device 101 .
  • battery 189 may include, for example, a non-rechargeable primary cell, a rechargeable secondary cell, or a fuel cell.
  • The communication module 190 may support establishing a direct (e.g., wired) communication channel or a wireless communication channel between the electronic device 101 and an external electronic device (e.g., the electronic device 102, the electronic device 104, or the server 108), and performing communication through the established communication channel.
  • the communication module 190 may include one or more communication processors that operate independently of the processor 120 (eg, an application processor) and support direct (eg, wired) communication or wireless communication.
  • According to an embodiment, the communication module 190 may include a wireless communication module 192 (e.g., a cellular communication module, a short-range wireless communication module, or a global navigation satellite system (GNSS) communication module) or a wired communication module 194 (e.g., a local area network (LAN) communication module or a power line communication module).
  • A corresponding one of these communication modules may communicate with the external electronic device 104 through the first network 198 (e.g., a short-range communication network such as Bluetooth, wireless fidelity (Wi-Fi) Direct, or infrared data association (IrDA)) or the second network 199 (e.g., a long-range communication network such as a legacy cellular network, a 5G network, a next-generation communication network, the Internet, or a computer network such as a LAN or WAN).
  • The wireless communication module 192 may identify or authenticate the electronic device 101 within a communication network, such as the first network 198 or the second network 199, using subscriber information (e.g., an international mobile subscriber identity (IMSI)) stored in the subscriber identification module 196.
  • The wireless communication module 192 may support a 5G network after a 4G network, and a next-generation communication technology, for example, new radio (NR) access technology.
  • The NR access technology may support high-speed transmission of high-capacity data (enhanced mobile broadband, eMBB), minimization of terminal power and access by multiple terminals (massive machine type communications, mMTC), or high reliability and low latency (ultra-reliable and low-latency communications, URLLC).
  • the wireless communication module 192 may support a high frequency band (eg, mmWave band) to achieve a high data rate, for example.
  • The wireless communication module 192 may support various technologies for securing performance in a high-frequency band, for example, beamforming, massive multiple-input and multiple-output (massive MIMO), full-dimensional MIMO (FD-MIMO), an array antenna, analog beamforming, or a large-scale antenna.
  • the wireless communication module 192 may support various requirements specified in the electronic device 101 , an external electronic device (eg, the electronic device 104 ), or a network system (eg, the second network 199 ).
  • The wireless communication module 192 may support a peak data rate for realizing eMBB (e.g., 20 Gbps or more), loss coverage for realizing mMTC (e.g., 164 dB or less), or U-plane latency for realizing URLLC (e.g., 0.5 ms or less for each of downlink (DL) and uplink (UL), or a round trip of 1 ms or less).
  • the antenna module 197 may transmit or receive a signal or power to the outside (eg, an external electronic device).
  • the antenna module 197 may include an antenna including a conductor formed on a substrate (eg, a PCB) or a radiator formed of a conductive pattern.
  • According to an embodiment, the antenna module 197 may include a plurality of antennas (e.g., an array antenna). In this case, at least one antenna suitable for a communication scheme used in a communication network, such as the first network 198 or the second network 199, may be selected from the plurality of antennas by, for example, the communication module 190. A signal or power may be transmitted or received between the communication module 190 and an external electronic device through the selected at least one antenna.
  • According to some embodiments, another component (e.g., a radio frequency integrated circuit (RFIC)) other than the radiator may additionally be formed as part of the antenna module 197.
  • the antenna module 197 may form a mmWave antenna module.
  • According to an embodiment, the mmWave antenna module may include a printed circuit board, an RFIC disposed on or adjacent to a first surface (e.g., the bottom surface) of the printed circuit board and capable of supporting a designated high-frequency band (e.g., the mmWave band), and a plurality of antennas (e.g., an array antenna) disposed on or adjacent to a second surface (e.g., the top or side surface) of the printed circuit board and capable of transmitting or receiving signals in the designated high-frequency band.
  • At least some of the above-described components may be connected to one another through a communication scheme between peripheral devices (e.g., a bus, general-purpose input and output (GPIO), serial peripheral interface (SPI), or mobile industry processor interface (MIPI)) and may exchange signals (e.g., commands or data) with one another.
  • the command or data may be transmitted or received between the electronic device 101 and the external electronic device 104 through the server 108 connected to the second network 199 .
  • Each of the external electronic devices 102 or 104 may be the same as or different from the electronic device 101 .
  • all or a part of operations executed in the electronic device 101 may be executed in one or more external electronic devices 102 , 104 , or 108 .
  • For example, when the electronic device 101 needs to perform a function or a service, the electronic device 101, instead of executing the function or service itself or in addition to doing so, may request one or more external electronic devices to perform at least part of the function or service.
  • One or more external electronic devices that have received the request may execute at least a part of the requested function or service, or an additional function or service related to the request, and transmit a result of the execution to the electronic device 101 .
  • The electronic device 101 may provide the result, as it is or after additional processing, as at least part of a response to the request.
  • To this end, for example, cloud computing, distributed computing, mobile edge computing (MEC), or client-server computing technology may be used.
  • the electronic device 101 may provide an ultra-low latency service using, for example, distributed computing or mobile edge computing.
  • the external electronic device 104 may include an Internet of things (IoT) device.
  • Server 108 may be an intelligent server using machine learning and/or neural networks.
  • the external electronic device 104 or the server 108 may be included in the second network 199 .
  • the electronic device 101 may be applied to an intelligent service (eg, smart home, smart city, smart car, or health care) based on 5G communication technology and IoT-related technology.
  • The electronic device according to various embodiments may be one of various types of devices.
  • the electronic device may include, for example, a portable communication device (eg, a smart phone), a computer device, a portable multimedia device, a portable medical device, a camera, a wearable device, or a home appliance device.
  • Terms such as "first", "second", or "first or second" may be used simply to distinguish a component from other corresponding components, and do not limit the components in other aspects (e.g., importance or order). When one (e.g., a first) component is referred to as being "coupled" or "connected" to another (e.g., a second) component, with or without the terms "functionally" or "communicatively", it means that the one component can be connected to the other component directly (e.g., by wire), wirelessly, or through a third component.
  • The term "module" used in various embodiments of this document may include a unit implemented in hardware, software, or firmware, and may be used interchangeably with terms such as logic, logic block, component, or circuit.
  • A module may be an integrally formed part, or a minimum unit or part thereof, that performs one or more functions.
  • the module may be implemented in the form of an application-specific integrated circuit (ASIC).
  • Various embodiments of this document may be implemented as software (e.g., the program 140) including one or more instructions stored in a storage medium readable by a machine (e.g., the electronic device 101). For example, a processor (e.g., the processor 120) of the machine may call at least one of the one or more stored instructions from the storage medium and execute it.
  • the one or more instructions may include code generated by a compiler or code executable by an interpreter.
  • the device-readable storage medium may be provided in the form of a non-transitory storage medium.
  • Here, "non-transitory" only means that the storage medium is a tangible device and does not include a signal (e.g., an electromagnetic wave); the term does not distinguish between a case where data is stored semi-permanently in the storage medium and a case where data is stored temporarily.
  • the method according to various embodiments disclosed in this document may be provided as included in a computer program product.
  • Computer program products may be traded between sellers and buyers as commodities.
  • According to an embodiment, the computer program product may be distributed in the form of a machine-readable storage medium (e.g., a compact disc read-only memory (CD-ROM)), or may be distributed (e.g., downloaded or uploaded) online, either through an application store (e.g., Play Store™) or directly between two user devices (e.g., smartphones).
  • a part of the computer program product may be temporarily stored or temporarily created in a machine-readable storage medium such as a memory of a server of a manufacturer, a server of an application store, or a relay server.
  • According to various embodiments, each of the above-described components (e.g., a module or a program) may include a single entity or a plurality of entities, and some of the plurality of entities may be separately disposed in another component.
  • one or more components or operations among the above-described corresponding components may be omitted, or one or more other components or operations may be added.
  • According to various embodiments, a plurality of components (e.g., modules or programs) may be integrated into a single component. In this case, the integrated component may perform one or more functions of each of the plurality of components identically or similarly to how they were performed by the corresponding component among the plurality of components before the integration.
  • According to various embodiments, operations performed by a module, program, or other component may be executed sequentially, in parallel, repeatedly, or heuristically; one or more of the operations may be executed in a different order or omitted; or one or more other operations may be added.
  • Referring to FIG. 2, the camera module 180 may include a lens assembly 210, a flash 220, an image sensor 230, an image stabilizer 240, a memory 250 (e.g., a buffer memory), or an image signal processor 260.
  • the lens assembly 210 may collect light emitted from a subject, which is an image to be captured.
  • the lens assembly 210 may include one or more lenses.
  • the camera module 180 may include a plurality of lens assemblies 210 . In this case, the camera module 180 may form, for example, a dual camera, a 360 degree camera, or a spherical camera.
  • Some of the plurality of lens assemblies 210 may have the same lens properties (e.g., angle of view, focal length, auto focus, f-number, or optical zoom), or at least one lens assembly may have one or more lens properties different from those of the other lens assemblies.
  • the lens assembly 210 may include, for example, a wide-angle lens or a telephoto lens.
  • the flash 220 may emit light used to enhance light emitted or reflected from the subject.
  • the flash 220 may include one or more light emitting diodes (eg, a red-green-blue (RGB) LED, a white LED, an infrared LED, or an ultraviolet LED), or a xenon lamp.
  • the image sensor 230 may acquire an image corresponding to the subject by converting light emitted or reflected from the subject and transmitted through the lens assembly 210 into an electrical signal.
  • The image sensor 230 may include, for example, one image sensor selected from among image sensors having different properties, such as an RGB sensor, a black-and-white (BW) sensor, an IR sensor, or a UV sensor, a plurality of image sensors having the same properties, or a plurality of image sensors having different properties.
  • Each image sensor included in the image sensor 230 may be implemented using, for example, a charged coupled device (CCD) sensor or a complementary metal oxide semiconductor (CMOS) sensor.
  • In response to movement of the camera module 180 or the electronic device 101 including it, the image stabilizer 240 may move at least one lens included in the lens assembly 210, or the image sensor 230, in a specific direction, or may control operation characteristics of the image sensor 230 (e.g., adjust the read-out timing). This makes it possible to compensate for at least some of the negative effects of the movement on the image being captured.
  • According to an embodiment, the image stabilizer 240 may detect such movement of the camera module 180 or the electronic device 101 using a gyro sensor (not shown) or an acceleration sensor (not shown) disposed inside or outside the camera module 180.
  • the image stabilizer 240 may be implemented as, for example, an optical image stabilizer.
  • The memory 250 may temporarily store at least a portion of the image acquired through the image sensor 230 for the next image processing operation. For example, when image acquisition is delayed by the shutter, or when a plurality of images are acquired at high speed, the acquired original image (e.g., a Bayer-patterned image or a high-resolution image) may be stored in the memory 250, and a corresponding copy image (e.g., a low-resolution image) may be previewed through the display module 160. A rough sketch of this buffering pattern is given below.
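  • The following Kotlin sketch of the buffering pattern is illustrative only; the Frame type and the one-byte-per-pixel subsampling are simplifying assumptions, not the camera module's actual data path.

      data class Frame(val width: Int, val height: Int, val pixels: ByteArray)

      class PreviewBuffer(private val capacity: Int = 8) {
          private val originals = ArrayDeque<Frame>()

          // Keep the full-resolution original for later processing and return a
          // low-resolution copy to be shown as the preview image.
          fun submit(original: Frame): Frame {
              if (originals.size == capacity) originals.removeFirst()
              originals.addLast(original)
              return downscale(original, factor = 4)
          }

          fun latestOriginal(): Frame? = originals.lastOrNull()

          // Naive subsampling, assuming one byte per pixel for simplicity.
          private fun downscale(frame: Frame, factor: Int): Frame {
              val w = frame.width / factor
              val h = frame.height / factor
              val out = ByteArray(w * h) { i ->
                  val x = (i % w) * factor
                  val y = (i / w) * factor
                  frame.pixels[y * frame.width + x]
              }
              return Frame(w, h, out)
          }
      }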
  • the memory 250 may be configured as at least a part of the memory 130 or as a separate memory operated independently of the memory 130 .
  • the image signal processor 260 may perform one or more image processing on an image acquired through the image sensor 230 or an image stored in the memory 250 .
  • The one or more image processes may include, for example, depth map generation, three-dimensional modeling, panorama generation, feature point extraction, image synthesis, or image compensation (e.g., noise reduction, resolution adjustment, brightness adjustment, blurring, sharpening, or softening).
  • Additionally or alternatively, the image signal processor 260 may control at least one (e.g., the image sensor 230) of the components included in the camera module 180, for example, by controlling the exposure time or the read-out timing.
  • the image processed by the image signal processor 260 is stored back in the memory 250 for further processing.
  • the image signal processor 260 may be configured as at least a part of the processor 120 or as a separate processor operated independently of the processor 120.
  • When the image signal processor 260 is configured as a processor separate from the processor 120, at least one image processed by the image signal processor 260 may be displayed through the display module 160 as it is, or after additional image processing, by the processor 120.
  • the electronic device 101 may include a plurality of camera modules 180 each having different properties or functions.
  • at least one of the plurality of camera modules 180 may be a wide-angle camera, and at least the other may be a telephoto camera.
  • at least one of the plurality of camera modules 180 may be a front camera, and at least the other may be a rear camera.
  • the display module 160 may include a display 310 and a display driver IC (DDI) 330 for controlling the display 310 .
  • the DDI 330 may include an interface module 331 , a memory 333 (eg, a buffer memory), an image processing module 335 , or a mapping module 337 .
  • The DDI 330 may receive, for example, image data, or image information including an image control signal corresponding to a command for controlling the image data, from another component of the electronic device 101 through the interface module 331.
  • For example, according to an embodiment, the image information may be received from the processor 120 (e.g., the main processor 121, such as an application processor) or from the auxiliary processor 123 (e.g., a graphics processing unit) operated independently of the function of the main processor 121.
  • the DDI 330 may communicate with the touch circuit 350 or the sensor module 176 through the interface module 331.
  • The DDI 330 may store at least a portion of the received image information in the memory 333, for example, in units of frames. The image processing module 335 may, for example, perform pre-processing or post-processing (e.g., resolution, brightness, or size adjustment) on at least a portion of the image data, based at least on characteristics of the image data or characteristics of the display 310.
  • The mapping module 337 may generate a voltage value or a current value corresponding to the image data pre-processed or post-processed through the image processing module 335.
  • According to an embodiment, the generation of the voltage value or the current value may be performed based at least in part on properties of the pixels of the display 310 (e.g., the arrangement of the pixels, such as an RGB stripe or PenTile structure, or the size of each sub-pixel). At least some pixels of the display 310 may be driven based at least in part on the voltage value or the current value, so that visual information (e.g., text, an image, or an icon) corresponding to the image data is displayed through the display 310.
  • the display module 160 may further include a touch circuit 350 .
  • the touch circuit 350 may include a touch sensor 351 and a touch sensor IC 353 for controlling the touch sensor 351 .
  • the touch sensor IC 353 may control the touch sensor 351 to sense, for example, a touch input or a hovering input for a specific location of the display 310 .
  • the touch sensor IC 353 may detect a touch input or a hovering input by measuring a change in a signal (eg, voltage, light amount, resistance, or electric charge amount) for a specific position of the display 310 .
  • the touch sensor IC 353 may provide information (eg, location, area, pressure, or time) regarding the sensed touch input or hovering input to the processor 120 .
  • At least a part of the touch circuit 350 (e.g., the touch sensor IC 353) may be included as a part of the display driver IC 330 or the display 310, or as a part of another component disposed outside the display module 160 (e.g., the auxiliary processor 123).
  • the display module 160 may further include at least one sensor (eg, a fingerprint sensor, an iris sensor, a pressure sensor, or an illuminance sensor) of the sensor module 176 , or a control circuit therefor.
  • the at least one sensor or a control circuit therefor may be embedded in a part of the display module 160 (eg, the display 310 or the DDI 330 ) or a part of the touch circuit 350 .
  • For example, when the sensor module 176 embedded in the display module 160 includes a biometric sensor (e.g., a fingerprint sensor), the biometric sensor may acquire biometric information (e.g., a fingerprint image) related to a touch input through a partial area of the display 310.
  • the pressure sensor may acquire pressure information related to a touch input through a part or the entire area of the display 310 .
  • the touch sensor 351 or the sensor module 176 may be disposed between pixels of the pixel layer of the display 310 , or above or below the pixel layer.
  • FIG. 4 is a block diagram 400 illustrating components of an electronic device 401 according to various embodiments of the present disclosure.
  • Referring to FIG. 4, the electronic device 401 (e.g., the electronic device 101 of FIG. 1) may include a processor 420 (e.g., the processor 120 of FIG. 1), a memory 430 (e.g., the memory 130 of FIG. 1), a display 460 (e.g., the display module 160 of FIG. 1), a camera 480 (e.g., the camera module 180 of FIG. 1), and a communication circuit 490 (e.g., the communication module 190 of FIG. 1).
  • the components shown in FIG. 4 are exemplary, and the electronic device 401 may further include components not shown or may not include some of the components shown.
  • Processor 420 may be operatively coupled to memory 430 , display 460 , camera 480 , and/or communication circuitry 490 .
  • the memory 430 may store one or more instructions that, when executed, cause the processor 420 to perform various operations of the electronic device 401 .
  • the display 460 may detect a drawing input.
  • display 460 may include pen input sensing circuitry (eg, pen input interface) and/or touch input sensing circuitry for detecting touch input (eg, pen input or touch input by a user).
  • The processor 420 may detect a hovering input from the external input device 402 (e.g., a stylus pen, digital pen, or digitizer pen) when the external input device 402 is spaced within a specified distance from the display 460.
  • The processor 420 may detect a pen input (e.g., a pen touch input or a pen contact input) from the external input device 402. A simple way to distinguish the two cases is sketched below.
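  • The following Kotlin sketch classifies a pen event as a hovering input or a pen (touch) input. It is an assumption-level illustration; the PenEvent shape and the 10 mm hover threshold are hypothetical values, not taken from the disclosure.

      // Hypothetical pen event as reported by a pen input interface.
      data class PenEvent(val distanceMm: Float, val inContact: Boolean)

      sealed interface PenInput {
          object Hovering : PenInput
          object Touch : PenInput
          object None : PenInput
      }

      // Classify a pen event as a hovering input or a pen (touch) input,
      // using a specified hover-distance threshold.
      fun classify(event: PenEvent, hoverThresholdMm: Float = 10f): PenInput = when {
          event.inContact -> PenInput.Touch
          event.distanceMm <= hoverThresholdMm -> PenInput.Hovering
          else -> PenInput.None
      }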
  • the display 460 may include various types of displays.
  • the display may include a flexible display.
  • the flexible display may refer to a display in which at least a part is flexible.
  • the flexible display may be folded or unfolded based on a hinge structure included in the electronic device 401 .
  • the display 460 may display different screens in each area of the display that is physically and/or logically separated under the control of the processor 420 .
  • the camera 480 may photograph the external environment from various angles and/or directions.
  • the processor 420 may photograph the external environment using the camera 480 set to the first angle of view, the second angle of view wider than the first angle of view, or the third angle of view wider than the second angle of view.
  • the processor 420 may flexibly change the angle of view of the camera 480 to photograph the external environment.
  • the camera 480 may rotate about a designated axis to photograph the external environment in various photographing directions.
  • the processor 420 may rotate the camera based on a specified axis to change the direction the camera faces to photograph the external environment.
  • the communication circuit 490 may be configured to support short-range wireless communication based on a Bluetooth protocol (eg, legacy Bluetooth and/or BLE) and/or a wireless LAN.
  • communication circuitry 490 may provide communication with a digital pen.
  • the electronic device 401 may further include components not shown in FIG. 4 .
  • the electronic device 401 may further include a housing.
  • the housing may include a magnetic pad for attachment of the external input device 402 and/or a slot for insertion of the external input device 402 .
  • the external input device 402 may be referred to as a stylus pen, a digital pen, or a digitizer pen.
  • the external input device 402 may receive an electromagnetic field signal (eg, a proximity signal) generated from a digitizer (eg, a pen input interface) of the electronic device 401 .
  • the external input device 402 may receive an electromagnetic field signal using a resonance circuit.
  • the external input device 402 may transmit an electromagnetic resonance (EMR) input signal to the electronic device 401 .
  • the external input device 402 may use at least one of an active electrostatic (AES) method and an electrically coupled resonance (ECR) method.
  • the external input device 402 may generate a signal using the electronic device 401 and capacitive coupling.
  • When the external input device 402 transmits a signal by the ECR method, the external input device 402 may generate a signal including a resonance frequency based on an electric field generated from a capacitive device of the electronic device 401.
  • the external input device 402 may include a communication circuit for communication with the electronic device 401 .
  • the external input device 402 may communicate with the electronic device 401 using short-range wireless communication (eg, at least one of Bluetooth, Bluetooth low energy (BLE), or wireless LAN).
  • The external input device 402 may include at least one button. When an input for the at least one button is received, the external input device 402 may transmit a signal corresponding to the button input to the electronic device 401 using a resonance circuit and/or a communication circuit, as illustrated in the sketch below.
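  • A loose illustration of that button-signal path in Kotlin follows; the transport abstractions and payload format are assumptions, not the pen's actual protocol.

      // Assumed transport abstractions for the two paths mentioned above.
      interface ResonanceCircuit { fun transmit(signal: Int) }
      interface PenCommunicationCircuit {
          val connected: Boolean
          fun send(payload: ByteArray)
      }

      class StylusButtonReporter(
          private val resonance: ResonanceCircuit,
          private val radio: PenCommunicationCircuit,
      ) {
          // Forward a button event to the electronic device: prefer the short-range
          // radio link (e.g., BLE) when connected, otherwise encode the event into
          // the resonance signal.
          fun onButtonPressed(buttonId: Int) {
              if (radio.connected) {
                  radio.send(byteArrayOf(0x01, buttonId.toByte()))
              } else {
                  resonance.transmit(signal = 0x100 or buttonId)
              }
          }
      }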
  • FIG. 5 illustrates an electronic device 501 that provides a preview image, according to various embodiments of the present disclosure.
  • the user may use the electronic device 501 by holding it in various shapes.
  • the user may use the electronic device 501 by holding the electronic device 501 in the form of a first type in which the electronic device 501 is folded from side to side.
  • the electronic device 501 may display screens corresponding to various functions on a flexible display (eg, the display 460 of FIG. 4 ).
  • the user may use the electronic device 501 by holding the electronic device 501 in a second type form in which the electronic device 501 is folded up and down.
  • The description of FIG. 5 below applies in common to components that have the same reference number in views 500a and 500b.
  • the electronic device 501 may display a user interface (UI) including a plurality of buttons for changing a drawing property of an input in the first area 560a of the flexible display.
  • The UI displayed in the first area 560a may include a first UI 551 for setting the type and/or color of the image generated based on the drawing input, and a second UI 553 for starting and/or stopping the photographing function of the camera 580. Based on an input to the first UI 551, the electronic device 501 may change the property of the image generated based on the drawing input.
  • the first UI 551 may include a corresponding icon for executing a pen type change or pen color change function based on a specified touch input.
  • the thickness and/or transparency of the pen input may be changed according to the change of the pen type.
  • the color of the pen input may be changed according to the change of the pen color.
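  • Purely as an illustration (the pen types, thickness, and transparency values below are assumptions, not values from the disclosure), a button press in the first UI might update the drawing properties applied to subsequent strokes roughly as follows:

      // Assumed pen types and their effect on stroke rendering.
      enum class PenType(val thickness: Float, val alpha: Float) {
          PENCIL(thickness = 2f, alpha = 1.0f),
          MARKER(thickness = 8f, alpha = 1.0f),
          HIGHLIGHTER(thickness = 12f, alpha = 0.4f),
      }

      data class DrawingProperties(
          val type: PenType = PenType.PENCIL,
          val colorArgb: Long = 0xFF000000, // opaque black
      )

      // Buttons of the first UI update the properties used for the next drawing input.
      sealed interface FirstUiButton
      data class SelectPenType(val type: PenType) : FirstUiButton
      data class SelectPenColor(val colorArgb: Long) : FirstUiButton

      fun applyButton(current: DrawingProperties, button: FirstUiButton): DrawingProperties =
          when (button) {
              is SelectPenType -> current.copy(type = button.type)   // changes thickness/transparency
              is SelectPenColor -> current.copy(colorArgb = button.colorArgb)
          }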
  • the electronic device 501 may photograph the external environment through the camera 580 based on the input to the second UI 553 .
  • the electronic device 501 may display at least one image acquired using the camera 580 on the second area 560b of the flexible display. For example, in response to an input to the second UI 553 of the first area 560a , the electronic device 501 displays at least one image acquired using the camera 580 to the second area 560b. can be displayed in
  • The electronic device 501 may change the direction in which the camera 580 faces by rotating the camera 580 about a specified axis. For example, if a designated input is detected while the photographing function is being executed with the camera 580 facing the first direction 521, the electronic device 501 may rotate the camera 580 so that the direction in which the camera 580 faces tracks the direction in which the designated input was detected.
  • FIG. 6 illustrates an electronic device 601 that acquires a preview image at various angles of view, according to various embodiments of the present disclosure.
  • The electronic device 601 may acquire at least one image using a camera 680 (e.g., the camera 480 of FIG. 4) that supports a plurality of angles of view.
  • the electronic device 601 may display a UI including a plurality of buttons for changing the drawing attribute of an input in the first area 660a of the flexible display.
  • the electronic device 601 may display the first image 610 obtained using the camera 680 set to the first angle of view on the second area 660b of the flexible display.
  • the electronic device 601 may identify a drawing input (eg, a touch input by a user and/or a pen touch input) sensed on the second area 660b of the flexible display.
  • the electronic device 601 may identify the position of the drawing input sensed on the flexible display by using a sensor module (eg, the sensor module 176 of FIG. 1 ).
  • According to an embodiment, the electronic device 601 may identify the location of the drawing input through wireless communication (e.g., Bluetooth communication) with the external input device. For example, the electronic device 601 may identify whether a drawing input sensed on the second area 660b of the flexible display is detected within the specified range 650. When a drawing input is detected in at least a portion of the second area 660b outside the specified range 650, the electronic device 601 may change the angle of view of the camera 680. As another example, the electronic device 601 may include a plurality of cameras.
  • In this case, the electronic device 601 may acquire at least one image using a camera having a relatively wider angle of view among the plurality of cameras.
  • An operation in which the electronic device 601 acquires an image by changing the angle of view of the camera 680 may be referred to as reference numeral 600b.
  • a second area 660b in which a drawing input is sensed is illustrated as an area corresponding to a right half of the flexible display with respect to the illustrated electronic device 601 , but is not limited thereto.
  • the second area 660b may be an area corresponding to the entire area of the flexible display.
  • the first region 660a may be at least one region included in the second region 660b.
  • When the electronic device 601 detects a drawing input in at least a portion of the flexible display (e.g., at least a portion of the first area 660a or the second area 660b) outside the specified range 650, the angle of view of the camera 680 may be changed to a second angle of view wider than the first angle of view.
  • the electronic device 601 may display the second image 620 obtained by using the camera 680 set to the second angle of view on the second area 660b.
  • Reference numerals 600a and 600b illustrate a first image 610 and a second image 620 obtained by a camera 680 set to the first angle of view and the second angle of view, respectively, but is not limited thereto.
  • According to an embodiment, the electronic device 601 may display a third image 630, acquired using the camera 680 set to the third angle of view, on the second area 660b. For example, when a drawing input is detected in at least a portion of the second area 660b displaying the second image 620, outside the specified range 650, the electronic device 601 may change the angle of view of the camera 680 to a third angle of view wider than the second angle of view and display the acquired third image 630 on the second area 660b.
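  • The gradual widening described for FIG. 6 amounts to stepping through the available angles of view (or, with a plurality of cameras, through the cameras ordered by angle of view). The following Kotlin sketch is an assumption-level illustration; the CameraInfo type and the example angles are hypothetical.

      // Assumed description of each available camera and its angle of view in degrees.
      data class CameraInfo(val id: String, val angleOfViewDeg: Float)

      class ViewAngleSelector(cameras: List<CameraInfo>) {
          // Cameras ordered from the narrowest to the widest angle of view.
          private val ordered = cameras.sortedBy { it.angleOfViewDeg }
          private var index = 0

          val current: CameraInfo get() = ordered[index]

          // When a drawing input lands outside the specified range, step to the
          // next wider angle of view (or the next wider camera).
          fun onInputOutsideRange(): CameraInfo {
              if (index < ordered.lastIndex) index++
              return current
          }
      }

      fun main() {
          val selector = ViewAngleSelector(
              listOf(
                  CameraInfo("tele", angleOfViewDeg = 30f),   // first angle of view
                  CameraInfo("wide", angleOfViewDeg = 80f),   // second angle of view
                  CameraInfo("ultra", angleOfViewDeg = 120f), // third angle of view
              )
          )
          println(selector.onInputOutsideRange().id) // wide
          println(selector.onInputOutsideRange().id) // ultra
      }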
  • FIG. 7 illustrates an electronic device 701 that acquires a preview image at various angles of view, according to various embodiments of the present disclosure.
  • the electronic device 701 may acquire at least one image using a camera set to various angles of view (eg, the camera 480 of FIG. 4 ).
  • The electronic device 701 may display a drawing image generated based on drawing inputs 711, 713, and 715 within a specified range of the second area by overlaying it on an image displayed in the second area.
  • the electronic device 701 may identify whether a drawing input sensed in one area of the flexible display (eg, the display 460 of FIG. 4 ) is within a specified range (eg, the specified range 650 of FIG. 6 ). .
  • The electronic device 601 may gradually change the angle of view of the camera to a wider angle and display at least one acquired image on the second area 660b.
  • the electronic device 701 may acquire at least one image using a camera supporting the first angle of view 721 , the second angle of view 723 , and the third angle of view 725 .
  • The electronic device 701 may change and set the camera's angle of view in the order of the first angle of view 721, the second angle of view 723, and the third angle of view 725.
  • the electronic device 701 may display at least one image obtained by changing the angle of view based on the drawing inputs 711, 713, and 715 sensed in the second area on the second area.
  • For example, the electronic device 701 may display, in the second area, a first image (e.g., the first image 610 of FIG. 6) acquired using the camera set to the first angle of view 721. Since the first drawing input 711 is detected only within the specified range, the electronic device 701 may display the image generated based on the first drawing input 711 superimposed on the first image without changing the angle of view.
  • the electronic device 701 when the second drawing input 713 is detected in the second area, the electronic device 701 changes the angle of view of the camera to a second angle of view 723 that is wider than the first angle of view 721 and is obtained A second image (eg, the second image 620 of FIG. 6 ) may be displayed on the second area. Since the second drawing input 713 is sensed within the specified range and at least a portion of the range except for the specified range, the electronic device 701 changes the angle of view of the camera to a second angle of view 723 wider than the first angle of view 721. An image generated based on the second drawing input 713 may be overlapped and displayed on the second image.
  • when the third drawing input 715 is sensed in the second area, the electronic device 701 may change the angle of view of the camera to a third angle of view 725 that is wider than the second angle of view 723 and display a third image (e.g., the third image 630 of FIG. 6) obtained at the changed angle of view on the second area. Since the third drawing input 715 is sensed both within the specified range and in at least a portion of the area except for the specified range, the electronic device 701 may change the angle of view of the camera to the third angle of view 725 wider than the second angle of view 723 and display an image generated based on the third drawing input 715 overlapped on the third image.
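The behavior described above for FIG. 7 (keeping the current angle of view while the drawing input stays inside the designated range, and stepping to the next wider angle of view once it crosses outside) can be summarized in a short sketch. The Kotlin snippet below is a minimal, hypothetical model written for illustration only; the `Point`, `Rect`, and `AngleOfView` types and the `widen` and `nextAngleOfView` helpers are assumptions, not part of the disclosed implementation.

```kotlin
// Minimal, hypothetical model of the angle-of-view progression described for FIG. 7.
// All types and helpers here are illustrative assumptions, not the disclosed implementation.
data class Point(val x: Float, val y: Float)

data class Rect(val left: Float, val top: Float, val right: Float, val bottom: Float) {
    fun contains(p: Point) = p.x in left..right && p.y in top..bottom
}

// Corresponds loosely to the first (721), second (723), and third (725) angles of view.
enum class AngleOfView { FIRST, SECOND, THIRD }

// Widen one step at a time: FIRST -> SECOND -> THIRD (THIRD is the widest supported).
fun widen(current: AngleOfView): AngleOfView = when (current) {
    AngleOfView.FIRST -> AngleOfView.SECOND
    AngleOfView.SECOND -> AngleOfView.THIRD
    AngleOfView.THIRD -> AngleOfView.THIRD
}

// Keep the current angle of view while every point of the drawing input stays inside
// the designated range; otherwise step to the next wider angle of view.
fun nextAngleOfView(
    current: AngleOfView,
    designatedRange: Rect,
    drawingInput: List<Point>,
): AngleOfView =
    if (drawingInput.all { designatedRange.contains(it) }) current else widen(current)

fun main() {
    val range = Rect(100f, 100f, 500f, 500f)
    val firstStroke = listOf(Point(150f, 200f), Point(300f, 300f))   // entirely inside the range
    val secondStroke = listOf(Point(450f, 450f), Point(620f, 480f))  // partly outside the range
    var angle = AngleOfView.FIRST
    angle = nextAngleOfView(angle, range, firstStroke)   // stays FIRST
    angle = nextAngleOfView(angle, range, secondStroke)  // widens to SECOND
    println(angle)
}
```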
  • FIG. 8 illustrates an electronic device 801 that acquires a preview image by rotating the camera 880, according to various embodiments of the present disclosure.
  • the electronic device 801 of FIG. 8 (e.g., the electronic device 101 of FIG. 1) may display a preview image in the first area 860a of the flexible display (e.g., the display 460 of FIG. 4) and a UI including a plurality of buttons for changing the drawing property of an input in the second area 860b.
  • the execution screens of the first area 860a and the second area 860b are exemplary, and various embodiments of the present document are not limited thereto, and the execution screens may be switched based on a specified input.
  • the electronic device 801 may classify regions of the flexible display and display various different execution screens in each region.
  • the electronic device 801 may display at least one image acquired using the camera 880 on the first area 860a of the flexible display.
  • the electronic device 801 may acquire at least one image by setting the camera 880 to a third angle of view (eg, the third angle of view 725 of FIG. 7 ).
  • the camera 880 may perform a photographing function in a state set to face the first direction 810 .
  • the electronic device 801 may display a UI including a plurality of buttons for changing the drawing property of an input on the second area 860b of the flexible display.
  • the electronic device 801 may identify a drawing input (e.g., a touch input by a user and/or a pen touch input) sensed on the flexible display. For example, the electronic device 801 may identify whether a drawing input sensed on the first area 860a of the flexible display is detected within a specified range 850. When a drawing input is sensed in at least a portion of the first area 860a except for the designated range 850 in a state in which photographing is being executed with the camera 880 set to the third angle of view, the electronic device 801 may rotate the camera 880 around a specified central axis.
  • by rotating the camera 880, the electronic device 801 may cause the direction in which the camera 880 faces to track the direction in which the drawing input is detected.
  • the electronic device 801 may display at least one image obtained by rotating the camera 880 to face the second direction 820 on the flexible display.
  • the at least one image displayed on the flexible display may extend beyond the first area 860a and be displayed in at least a portion of the second area 860b.
  • the UI that is being displayed may continue to be displayed in a reduced state.
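One way to picture the camera rotation described for FIG. 8 is a controller that converts the offset of the drawing input from the preview center into a pan angle about the designated axis. The sketch below is hypothetical; `RotatableCamera`, `panTo`, and the pinhole approximation using a focal length in pixels are illustrative assumptions rather than the disclosed mechanism.

```kotlin
import kotlin.math.atan2

// Hypothetical controller for the camera rotation of FIG. 8: the offset of a drawing
// input from the preview center is converted into a pan angle about the designated axis.
interface RotatableCamera {
    fun panTo(degrees: Float)
}

class DrawingDirectionTracker(
    private val camera: RotatableCamera,
    private val focalLengthPx: Float, // assumed effective focal length of the preview, in pixels
) {
    fun onDrawingOutsideRange(previewCenterX: Float, inputX: Float) {
        // Horizontal offset of the stroke from the preview center, mapped to a pan angle.
        val offset = inputX - previewCenterX
        val degrees = Math.toDegrees(atan2(offset.toDouble(), focalLengthPx.toDouble())).toFloat()
        camera.panTo(degrees)
    }
}
```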
  • FIG. 9 illustrates an electronic device 901 that acquires a preview image by rotating a camera, according to various embodiments of the present disclosure.
  • the electronic device 901 may rotate the camera so that the direction the camera faces may track the direction in which a drawing input is sensed.
  • the electronic device 901 may display a drawing image generated based on the drawing inputs 911, 913, and 915 (e.g., a touch input by a user and/or a touch input of the pen 902) within a designated range 950 of the second area 960b by superimposing it on the image displayed in the second area 960b.
  • the electronic device 901 may identify whether a drawing input sensed in one area of the flexible display (eg, the display 460 of FIG. 4 ) is within a specified range.
  • the electronic device 901 may acquire at least one image using a camera set to a third angle of view (eg, the third angle of view 725 of FIG. 7 ).
  • the electronic device 901 may superimpose and display an image generated based on the first drawing input 911 and the second drawing input 913 on at least one image acquired using the camera set to the third angle of view.
  • the electronic device 901 may detect the third drawing input 915.
  • the third drawing input 915 may be referred to as an input sensed in at least a portion of the second area 960b except for the designated range 950.
  • the electronic device 901 may rotate the camera so that the direction the camera faces may track the direction in which the third drawing input 915 is sensed.
  • the electronic device 901 may overlap and display an image generated based on the third drawing input 915 on at least one image 917 acquired using a camera set to the third angle of view.
  • FIG. 10 illustrates an electronic device 1001 acquiring a preview image by changing a folding angle, according to various embodiments of the present disclosure.
  • the user may hold and use the electronic device 1001 (eg, the electronic device 101 of FIG. 1 ) in various forms.
  • a user may hold and use the electronic device 1001 in a first type form (e.g., the form according to reference numeral 500a of FIG. 5) in which the flexible display (e.g., the display 460 of FIG. 4) is folded from side to side through a hinge structure 1070.
  • the user may also hold and use the electronic device 1001 in a second type form (e.g., the second type according to reference numeral 500b of FIG. 5).
  • the description of FIG. 10 below may apply commonly to the states indicated by reference numerals 1010 and 1020, which use the same reference numerals.
  • the electronic device 1001 may display different screens in the first area 1060a and the second area 1060b on the flexible display, respectively.
  • the electronic device 1001 may display, in the first area 1060a, at least one image (e.g., the first image 610, the second image 620, or the third image 630 of FIG. 6) acquired using a camera set to a specified angle of view (e.g., the first angle of view 721, the second angle of view 723, or the third angle of view 725 of FIG. 7).
  • the electronic device 1001 may display a user interface (UI) including a plurality of buttons for changing drawing properties in the second area 1060b.
  • division of regions of the flexible display may be a physical division based on the hinge structure 1070 .
  • the division of the flexible display area may be a logical division of the display area.
  • the electronic device 1001 may display, in the first area 1060a of the flexible display, an image obtained by using the camera 1080 set to a specified angle of view (e.g., a third angle of view), and may display an image generated based on a drawing input within the specified range 1050 of the first area 1060a by overlaying it on the image displayed in the first area 1060a.
  • the electronic device 1001 may rotate the camera 1080 so that the direction the camera 1080 faces tracks the direction in which the drawing input is detected.
  • the description of the operation of the electronic device 1001 rotating the camera 1080 may be replaced with the description of FIGS. 8 and 9.
  • the electronic device 1001 may change the direction in which the camera 1080 faces by folding the flexible display based on the hinge structure 1070 .
  • the electronic device 1001 may change the folding angle of the hinge structure 1070 so that the direction the camera 1080 faces tracks the direction in which the drawing input is sensed.
  • the electronic device 1001 may change the direction the camera 1080 faces from the first direction 1021 to the second direction 1023 by changing the folding angle of the flexible display that is folded based on the hinge structure 1070.
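The folding-angle control described for FIG. 10 can be sketched as a simple bounded step toward a target direction. The snippet below is an assumption-laden illustration: `HingeActuator` and the mapping between folding angle and camera direction are simplifications introduced only for this sketch.

```kotlin
// Hypothetical sketch for FIG. 10: the camera is fixed to one housing, so the device
// changes the folding angle of the hinge to steer the direction the camera faces.
interface HingeActuator {
    val foldingAngleDegrees: Float
    fun setFoldingAngle(degrees: Float)
}

// Move the hinge by a bounded step so the camera direction approaches the direction
// in which the drawing input was sensed.
fun trackWithHinge(
    hinge: HingeActuator,
    currentCameraDirectionDeg: Float,
    targetDirectionDeg: Float,
    maxStepDeg: Float = 3f,
) {
    val error = targetDirectionDeg - currentCameraDirectionDeg
    val step = error.coerceIn(-maxStepDeg, maxStepDeg)
    hinge.setFoldingAngle(hinge.foldingAngleDegrees + step)
}
```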
  • FIG. 11 illustrates an electronic device 1101 acquiring a preview image in various forms, according to various embodiments of the present disclosure.
  • the electronic device 1101 (e.g., the electronic device 101 of FIG. 1) may acquire various preview images by changing the angle of view of the camera 1180 (e.g., the camera 480 of FIG. 4) and/or the direction in which the camera 1180 faces.
  • the electronic device 1101 may acquire a preview image corresponding to the first direction 1121 by using the camera 1180.
  • the user may hold the electronic device 1101 in the form of a second type (e.g., a second type according to reference numeral 500b of FIG. 5) and cause the camera 1180 to perform the photographing function in a state facing the first direction 1121.
  • the electronic device may display an image acquired through the camera 1180 disposed to face the first direction 1121 on at least one area of the flexible display.
  • at least one image displayed by the electronic device 1101 according to reference number 1111 may be an image captured in a direction in which the user is present.
  • the electronic device 1101 may acquire a preview image corresponding to the second direction 1122 by using the camera 1180.
  • the user may hold the electronic device 1101 in a second type shape and perform a photographing function with the camera 1180 facing the second direction 1122 .
  • the electronic device may display an image acquired through the camera 1180 disposed to face the second direction 1122 on at least one area of the flexible display.
  • at least one image displayed by the electronic device 1101 according to reference number 1112 may be an image photographed in a direction in which the user looks.
  • the electronic device 1101 may acquire a preview image corresponding to the third direction 1123 using the camera 1180.
  • the user may hold the electronic device 1101 in a second type shape and perform a photographing function with the camera 1180 facing the third direction 1123 .
  • the electronic device may display an image acquired through the camera 1180 disposed to face the third direction 1123 on at least one area of the flexible display.
  • at least one image displayed by the electronic device 1101 according to reference number 1113 may be an image obtained by the camera 1180 in the direction of the lower end of the electronic device 1101 .
  • the electronic device 1101 may acquire a preview image corresponding to the fourth direction 1124 using the camera 1180.
  • the user may hold the electronic device 1101 in a second type shape and perform a photographing function with the camera 1180 facing the fourth direction 1124 .
  • the electronic device may display an image acquired through the camera 1180 disposed to face the fourth direction 1124 on at least one area of the flexible display.
  • at least one image displayed by the electronic device 1101 according to reference number 1114 may be an image obtained by the camera 1180 in the direction of the upper end of the electronic device 1101 .
  • FIG. 12 illustrates an electronic device 1201 that acquires a preview image in various forms, according to various embodiments of the present disclosure.
  • the electronic device 1201 (e.g., the electronic device 101 of FIG. 1) may display, in one area of the flexible display (e.g., the display 460 of FIG. 4), a user interface (UI) including a plurality of buttons for changing the drawing property of an input, and may display, in an area other than the one area, at least one image acquired using a camera (e.g., the camera 480 of FIG. 4).
  • the electronic device 1201 may switch the execution screens displayed in the first area 1260a and the second area 1260b.
  • for example, in response to a specified input, the electronic device 1201 may switch the execution screens displayed in the first area 1260a and the second area 1260b as indicated by reference numeral 1210b.
  • the designated input may be a hovering drag input by an external electronic device (eg, the external electronic device 402 of FIG. 4 ) or a drag input by a user.
  • the electronic device 1201 may display at least one image acquired using a camera on the second area 1260b.
  • the electronic device 1201 may expand and display the second area 1260b. For example, in a state in which the electronic device 1201 displays the first area 1260a and the second area 1260b at substantially the same ratio according to reference numeral 1220a, a drawing input may be sensed in at least a part of the second area 1260b except for the designated range 1250.
  • after detecting the drawing input, the electronic device 1201 may expand the display of at least one image acquired using the camera to a part of the first area 1260a in addition to the second area 1260b.
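The screen switching and the expansion of the preview area described for FIG. 12 amount to a small piece of layout state management. The following Kotlin sketch is hypothetical; the `AreaContent` and `LayoutState` names and the fixed expansion step are illustrative assumptions.

```kotlin
// Hypothetical layout-state sketch for FIG. 12: swap the execution screens of the two
// areas on a designated input, and enlarge the preview area when a drawing input is
// sensed outside the designated range.
enum class AreaContent { DRAWING_UI, CAMERA_PREVIEW }

data class LayoutState(
    val firstArea: AreaContent,
    val secondArea: AreaContent,
    val previewWeight: Float, // share of the display given to the preview, 0.0..1.0
)

fun onDesignatedInput(state: LayoutState): LayoutState =
    state.copy(firstArea = state.secondArea, secondArea = state.firstArea)

fun onDrawingOutsideRange(state: LayoutState): LayoutState =
    state.copy(previewWeight = (state.previewWeight + 0.25f).coerceAtMost(1.0f))

fun main() {
    var state = LayoutState(AreaContent.DRAWING_UI, AreaContent.CAMERA_PREVIEW, previewWeight = 0.5f)
    state = onDesignatedInput(state)      // execution screens swapped
    state = onDrawingOutsideRange(state)  // preview expands beyond its original area
    println(state)
}
```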
  • FIG. 13 illustrates an electronic device 1301 that displays various types of drawing images on a display, according to various embodiments of the present disclosure.
  • the electronic device 1301 (e.g., the electronic device 101 of FIG. 1) may display drawing images 1311, 1313, 1315, and 1317 of various shapes, as indicated by reference numerals 1310a and 1310b, on a flexible display (e.g., the display 460 of FIG. 4).
  • the electronic device 1301 may display a preview image obtained by using the camera 1380 (eg, the camera 480 of FIG. 4 ) facing the first direction 1305 on the flexible display.
  • the electronic device 1301 may display the preview image on both the first area 1360a and the second area 1360b of the flexible display.
  • the electronic device 1301 may display the preview image in some or all of each of the first area 1360a and the second area 1360b.
  • the electronic device 1301 may display an image generated based on an input to the flexible display.
  • the electronic device 1301 may receive a touch input through the external electronic device 1302 (eg, a digital pen) using a pen input interface located in the flexible display.
  • the electronic device 1301 may receive a user's touch input for the flexible display.
  • the electronic device 1301 may display the first drawing image 1311 and the second drawing image 1313 generated based on an input to the flexible display.
  • the first drawing image 1311 and the second drawing image 1313 may be referred to as drawing images corresponding to two-dimensional figures.
  • the electronic device 1301 may display a third drawing image 1315 and a fourth drawing image 1317 generated based on an input to the flexible display.
  • the third drawing image 1315 and the fourth drawing image 1317 may be referred to as drawing images (eg, augmented reality (AR) images) corresponding to 3D figures.
  • the third drawing image 1315 and the fourth drawing image 1317 may be referred to as 3D figures generated based on a specified input (eg, a touch and drag input).
  • the electronic device 1301 may display the third drawing image 1315 and the fourth drawing image 1317 based on a specified input sensed in at least a portion of an area where the first drawing image 1311 and the second drawing image 1313 are displayed.
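The promotion of a two-dimensional drawing image to a three-dimensional (AR) figure described for FIG. 13 could, in the simplest case, be modeled as an extrusion driven by the drag distance of the specified input. The sketch below is purely illustrative; the `Figure2D` and `Figure3D` types and the extrusion rule are assumptions, not the disclosed method.

```kotlin
// Purely illustrative sketch for FIG. 13: a 2-D drawing image is promoted to a 3-D (AR)
// figure when a specified touch-and-drag input is sensed on it.
sealed interface DrawingImage
data class Figure2D(val points: List<Pair<Float, Float>>) : DrawingImage
data class Figure3D(val points: List<Triple<Float, Float, Float>>) : DrawingImage

// Extrude the 2-D figure along the drag distance to obtain a simple 3-D figure.
fun promoteTo3D(figure: Figure2D, dragDistance: Float): Figure3D =
    Figure3D(figure.points.map { (x, y) -> Triple(x, y, dragDistance) })
```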
  • FIG. 14 illustrates a flowchart 1400 of an operation of an electronic device according to various embodiments of the present disclosure.
  • the electronic device may perform the operations of FIG. 14 .
  • the processor of the electronic device (e.g., the processor 120 of FIG. 1) may be set to perform the operations of FIG. 14 when instructions stored in the memory (e.g., the memory 130 of FIG. 1) are executed.
  • the electronic device may display a user interface (UI) and a first image (eg, the first image 610 of FIG. 6 ) on one area of the flexible display (eg, the display 460 of FIG. 4 ).
  • the UI may include a plurality of buttons for changing a drawing property of an input (eg, a touch input by a user or a pen touch input by an external electronic device) to an area of the flexible display.
  • the first image may correspond to at least one image acquired using a camera (eg, the camera 480 of FIG. 4 ) set to a designated angle of view (eg, the first angle of view 721 of FIG. 7 ).
  • the electronic device may display the UI and the first image in the first area (eg, the first area 560a of FIG. 5 ) and the second area (eg, the second area 560b of FIG. 5 ), respectively.
  • the electronic device may display the image generated based on the drawing input by superimposing it on the first image.
  • the electronic device may receive a drawing input (e.g., the first drawing input 711, the second drawing input 713, and/or the third drawing input 715 of FIG. 7) for one area of the flexible display.
  • a drawing image generated based on the drawing input may be displayed by overlaying the drawing image on the first image.
  • the electronic device may determine whether a drawing input is sensed in at least a part of the area except for the designated range (e.g., the designated range 550 of FIG. 5). When a drawing input is sensed in at least a portion except for the specified range, the electronic device may perform operation 1420. If the drawing input is not detected in at least a portion except for the specified range, the electronic device may repeatedly perform operation 1410.
  • the electronic device may acquire an image by changing the angle of view of the camera, rotating the camera based on a designated axis, or changing the folding angle, and may display the image on the second area.
  • the electronic device may include a camera supporting various angles of view (e.g., a first angle of view, a second angle of view, and a third angle of view). For example, in a state in which the electronic device is displaying a first image acquired using the camera set to the first angle of view, when a drawing input is sensed in at least a portion except for a specified range, the electronic device may change the angle of view of the camera to a second angle of view wider than the first angle of view and display the second image obtained at the changed angle of view on the second area.
  • when a drawing input is sensed in at least a part of the second area displaying the second image except for a designated range, the electronic device may change the angle of view of the camera to a third angle of view that is wider than the second angle of view and display the third image obtained at the changed angle of view in the second area.
  • the electronic device may change the direction the camera faces by rotating the camera based on a specified axis. For example, when a drawing input is detected in at least a part of the second area displaying the third image except for a specified range, the electronic device may rotate the camera so that the direction the camera faces tracks the direction in which the drawing input is detected.
  • the electronic device may display the fourth image obtained by rotating the camera through a part of the first area and the second area.
  • the electronic device may change the folding angle of the hinge structure (e.g., the hinge structure 1070 of FIG. 10) so that the camera may take pictures in various directions. For example, when a drawing input is sensed in at least a part of the second area displaying the fourth image except for a specified range, the electronic device may change the folding angle of the hinge structure so that the direction the camera faces tracks the direction in which the touch input is sensed.
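Taken together, the operations of FIG. 14 suggest an escalation order when a drawing input is sensed outside the designated range: widen the angle of view while wider settings remain, then rotate the camera, then change the folding angle. The sketch below encodes that order as a hypothetical decision function; the type names and the exact escalation rule are assumptions made for illustration, not the claimed method.

```kotlin
// Hypothetical decision function summarizing the escalation suggested by FIG. 14.
enum class AngleStep { FIRST, SECOND, THIRD }

sealed interface PreviewAdjustment
data class WidenAngle(val to: AngleStep) : PreviewAdjustment
object RotateCamera : PreviewAdjustment
object ChangeFoldingAngle : PreviewAdjustment
object NoChange : PreviewAdjustment

fun nextAdjustment(
    currentAngle: AngleStep,
    drawingOutsideRange: Boolean,
    cameraAtRotationLimit: Boolean,
): PreviewAdjustment {
    if (!drawingOutsideRange) return NoChange          // keep displaying the current preview (operation 1410)
    return when {                                      // adjust the preview (operation 1420)
        currentAngle == AngleStep.FIRST -> WidenAngle(AngleStep.SECOND)
        currentAngle == AngleStep.SECOND -> WidenAngle(AngleStep.THIRD)
        !cameraAtRotationLimit -> RotateCamera
        else -> ChangeFoldingAngle
    }
}
```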
  • a foldable electronic device (e.g., the electronic device 101 of FIG. 1) may include a camera supporting a first angle of view, a second angle of view, and a third angle of view, a first housing, a second housing, a hinge structure disposed between the first housing and the second housing, a flexible display, a processor, and a memory operatively connected to the processor.
  • the memory may store one or more instructions that, when executed, cause the processor to display, in a first area on the flexible display, a user interface (UI) including a plurality of buttons for changing a drawing property of an input, display, in a second area on the flexible display, a first image acquired using the camera set to the first angle of view, display an image generated based on a drawing input within a specified range of the second area by superimposing it on the first image displayed in the second area, and, when the drawing input is sensed in at least a portion of the second area except for the specified range, display, on the second area, a second image obtained by changing the angle of view of the camera to the second angle of view wider than the first angle of view.
  • when the processor detects the drawing input in at least a portion of the second area displaying the second image except for the specified range, the processor may display, on the second area, a third image obtained by changing the angle of view of the camera to the third angle of view wider than the second angle of view.
  • when the processor detects the drawing input in at least a part of the second area displaying the third image except for the specified range, the processor may rotate the camera so that the direction in which the camera faces tracks the direction in which the drawing input is sensed.
  • the one or more instructions, when executed, may cause the processor to display a fourth image obtained by rotating the camera through a part of the first area and the second area.
  • by changing the folding angle of the hinge structure, the direction the camera faces may track the direction in which the drawing input is sensed.
  • the foldable electronic device may further comprise a pen input interface configured to receive a pen input from a digital pen and a wireless communication circuit operatively connected to the processor, wherein the pen input interface is positioned within the flexible display.
  • the input may be an input by a digital pen communicating with the wireless communication circuit or an input by a user's touch.
  • the one or more instructions, when executed, may cause the processor to switch execution screens displayed in the first area and the second area in response to a specified input to the flexible display.
  • the designated input may be a hovering drag input by the digital pen or a drag input by a user.
  • the UI may include at least one of a button for selecting a color of an image generated based on the drawing input, a button for selecting a type of the image, and a shooting start button.
  • the one or more instructions, when executed, may cause the processor to display, on the second area, at least one image acquired using the camera when a touch input to the shooting start button is sensed.
  • a method for a foldable electronic device to display a preview image may include displaying, in a first area on the flexible display, a user interface (UI) including a plurality of buttons for changing a drawing property of an input, displaying, in a second area on the flexible display, a first image acquired using a camera set to a first angle of view, displaying an image generated based on a drawing input within a specified range of the second area by overlaying it on the first image displayed in the second area, and, when the drawing input is sensed in at least a portion of the second area except for the specified range, displaying, on the second area, a second image obtained by changing the angle of view of the camera to a second angle of view wider than the first angle of view.
  • when the drawing input is sensed in at least a portion of the second area displaying the second image except for the specified range, the method may further include displaying, on the second area, a third image obtained by changing the angle of view of the camera to the third angle of view wider than the second angle of view.
  • when the drawing input is sensed in at least a portion of the second area displaying the third image except for the specified range, the method may further include an operation of rotating the camera so that the direction in which the camera is directed tracks the direction in which the drawing input is sensed.
  • the method of displaying a preview image by the foldable electronic device may further include an operation of displaying a fourth image obtained by rotating the camera through a part of the first area and the second area.
  • when the drawing input is sensed in at least a portion of the second area displaying the fourth image except for the specified range, the method may further include an operation of changing the folding angle of the hinge structure so that the direction the camera faces tracks the direction in which the drawing input is sensed.
  • the drawing input may be an input by a user's touch or an input by a digital pen communicating with a wireless communication circuit included in the foldable electronic device.
  • the method may further include an operation of switching execution screens displayed on the first area and the second area in response to a specified input to the flexible display.
  • the designated input may be a hovering drag input by the digital pen or a drag input by a user.
  • the UI may include at least one of a button for selecting a color of an image generated based on the drawing input, a button for selecting a type of the image, and a shooting start button.
  • the method for displaying a preview image by the foldable electronic device may further include an operation of displaying, in the second area, at least one image acquired using the camera when a touch input to the shooting start button is detected.
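The drawing-property UI referred to in the method (color, type, and a shooting start button) can be sketched as a small state holder whose shutter button triggers capture of the preview image. The snippet below is hypothetical; `DrawingProperties`, `DrawingUi`, and the `onCapture` callback are illustrative assumptions.

```kotlin
// Hypothetical state holder for the UI described above: buttons for the color and type
// of the drawing image plus a shooting start button.
data class DrawingProperties(val color: Long = 0xFF000000, val lineType: String = "solid")

class DrawingUi(private val onCapture: () -> Unit) {
    var properties = DrawingProperties()
        private set

    fun onColorButton(color: Long) { properties = properties.copy(color = color) }
    fun onLineTypeButton(type: String) { properties = properties.copy(lineType = type) }

    // A touch input to the shooting start button causes at least one image acquired
    // using the camera to be captured and displayed.
    fun onShootingStartButton() = onCapture()
}
```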

Landscapes

  • Engineering & Computer Science (AREA)
  • Signal Processing (AREA)
  • Human Computer Interaction (AREA)
  • Multimedia (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

Disclosed herein is an electronic apparatus that comprises a camera, a first housing, a second housing, a hinge structure, a flexible display, a processor, and a memory. The electronic apparatus may: display, in a first region, a UI that comprises a plurality of buttons for changing drawing attributes of an input; display, in a second region, a first image obtained using a camera set to a first angle of view; display, superimposed on the first image displayed in the second region, an image generated on the basis of a drawing input within a designated area of the second region; and display, in the second region, a second image obtained by changing the angle of view of the camera to a second angle of view larger than the first angle of view, the second image being displayed when a drawing input is detected in at least a part of the second region other than the designated area. Various other embodiments understood from the description are also possible.
PCT/KR2021/012863 2020-09-24 2021-09-17 Procédé d'affichage d'image de prévisualisation et appareil électronique le prenant en charge WO2022065844A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR10-2020-0124257 2020-09-24
KR1020200124257A KR20220040935A (ko) 2020-09-24 2020-09-24 프리뷰 이미지를 표시하는 방법 및 이를 지원하는 전자 장치

Publications (1)

Publication Number Publication Date
WO2022065844A1 true WO2022065844A1 (fr) 2022-03-31

Family

ID=80845644

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2021/012863 WO2022065844A1 (fr) 2020-09-24 2021-09-17 Procédé d'affichage d'image de prévisualisation et appareil électronique le prenant en charge

Country Status (2)

Country Link
KR (1) KR20220040935A (fr)
WO (1) WO2022065844A1 (fr)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20170004532A (ko) * 2015-07-03 2017-01-11 엘지이노텍 주식회사 광각 촬영장치 및 이를 포함하는 모바일 기기
KR20170038365A (ko) * 2015-09-30 2017-04-07 삼성전자주식회사 전자장치의 이미지 처리장치 및 방법
WO2017183743A1 (fr) * 2016-04-19 2017-10-26 엘지전자 주식회사 Terminal mobile, stylet et procédé de commande associé
KR20200034528A (ko) * 2018-09-21 2020-03-31 엘지전자 주식회사 이동 단말기
US20200267326A1 (en) * 2019-02-19 2020-08-20 Samsung Electronics Co., Ltd. Electronic device and method for changing magnification of image using multiple cameras

Also Published As

Publication number Publication date
KR20220040935A (ko) 2022-03-31

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21872872

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21872872

Country of ref document: EP

Kind code of ref document: A1