WO2021256709A1 - Electronic device and method of operating electronic device - Google Patents


Info

Publication number
WO2021256709A1
Authority
WO
WIPO (PCT)
Prior art keywords
frame
frames
electronic device
pixel
speed
Prior art date
Application number
PCT/KR2021/006216
Other languages
English (en)
Korean (ko)
Inventor
김승훈
Original Assignee
Samsung Electronics Co., Ltd. (삼성전자 주식회사)
Priority date
Filing date
Publication date
Application filed by Samsung Electronics Co., Ltd. (삼성전자 주식회사)
Publication of WO2021256709A1

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00 Details of television systems
    • H04N5/76 Television signal recording
    • H04N5/91 Television signal processing therefor
    • H04N5/915 Television signal processing therefor for field- or frame-skip recording or reproducing
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/70 Circuitry for compensating brightness variation in the scene
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/70 Circuitry for compensating brightness variation in the scene
    • H04N23/73 Circuitry for compensating brightness variation in the scene by influencing the exposure time
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00 Details of television systems
    • H04N5/76 Television signal recording
    • H04N5/765 Interface circuits between an apparatus for recording and another apparatus
    • H04N5/77 Interface circuits between an apparatus for recording and another apparatus between a recording apparatus and a television camera

Definitions

  • A hyper-lapse (or time-lapse) video may mean an image, or a set of images, in which at least one frame generated by photographing an external object for a relatively long time is played back for a relatively short time. A result captured using a hyper-lapse (or time-lapse) technique may be called a hyper-lapse (or time-lapse) image.
  • A hyper-lapse video requires less storage space, since it displays the external object over a relatively short period of time compared to a general video, which is played back at the same speed at which the external object was recorded.
  • Electronic devices may provide a function of generating hyper-lapse images.
  • Various embodiments disclosed in this document are intended to provide an electronic device, and an operating method thereof, capable of preventing saturation in the result when generating a hyper-lapse image using long exposure.
  • Various embodiments disclosed in this document are intended to provide an electronic device capable of generating hyper-lapse images having different effects according to the method of synthesizing frames obtained by photographing an external object, and an operating method of the electronic device.
  • An electronic device may include a camera, a memory, and a processor operatively connected to the camera and the memory.
  • The memory may store instructions that cause the processor to: acquire a plurality of successive first frames by photographing an external object at a first speed through the camera; generate at least one second frame by synthesizing at least some of the plurality of first frames in units of a predetermined number, in the order in which they are acquired; and generate a hyper-lapse video in which the at least one second frame representing the movement of the external object is reproduced at a second speed different from the first speed.
  • According to various embodiments, the electronic device may include a camera, a memory, and a processor operatively connected to the camera and the memory.
  • The memory may store instructions that cause the processor to: acquire a plurality of successive first frames by photographing an external object at a first speed through the camera; generate a plurality of second frames by synthesizing the plurality of first frames in units of a predetermined number so that at least one first frame is common to adjacent units; and generate a time-lapse image in which the plurality of second frames representing the movement of the external object are reproduced at a second speed different from the first speed.
  • According to various embodiments, the method of operating an electronic device may include: acquiring a plurality of successive first frames by photographing an external object at a first speed through a camera; generating at least one second frame by synthesizing at least a portion of the plurality of first frames in units of a predetermined number in the order in which they are acquired; and generating a hyper-lapse image in which the at least one second frame representing the movement of the external object is played back at a second speed different from the first speed.
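The flow of acquiring first frames and synthesizing them in units of a predetermined number can be sketched as follows. This is an illustrative sketch only, not the patent's implementation: the function name, the `group_size` parameter, and the use of a per-pixel mean as the synthesis operation are assumptions (the document describes several synthesis modes later).

```python
import numpy as np

def synthesize_second_frames(first_frames, group_size, reduce=np.mean):
    """Group consecutive first frames in acquisition order and
    synthesize each group of `group_size` frames into one second frame.

    `reduce` stands in for the mode-dependent synthesis operation
    (here, a per-pixel mean); it is an illustrative assumption.
    """
    second_frames = []
    for i in range(0, len(first_frames) - group_size + 1, group_size):
        group = np.stack(first_frames[i:i + group_size])
        # Synthesize the group into a single frame (e.g., per-pixel mean).
        second_frames.append(reduce(group, axis=0).astype(first_frames[0].dtype))
    return second_frames

# 12 first frames captured at the first speed, synthesized in units of 3,
# yield 4 second frames that can be played back at a different second speed.
frames = [np.full((2, 2), v, dtype=np.uint8) for v in range(12)]
out = synthesize_second_frames(frames, group_size=3)
```

Playing the resulting second frames at a fixed frame rate yields the hyper-lapse effect: each second frame summarizes several first frames, so the playback (second) speed differs from the capture (first) speed.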
  • According to various embodiments, various hyper-lapse images may be generated according to the method of synthesizing the frames obtained by photographing an external object.
  • FIG. 1 is a block diagram of an electronic device 101 in a network environment 100 according to various embodiments.
  • FIG. 2 is a block diagram of an electronic device according to various embodiments of the present disclosure.
  • FIG. 3A is a diagram for describing an example in which an electronic device generates at least one second frame by synthesizing a plurality of first frames, according to various embodiments of the present disclosure.
  • FIG. 3B is a diagram for describing an example in which an electronic device generates at least one second frame by synthesizing a plurality of first frames, according to various embodiments of the present disclosure.
  • FIG. 3C is a diagram for describing an example in which an electronic device generates at least one second frame by synthesizing a plurality of first frames, according to various embodiments of the present disclosure.
  • FIG. 4 is a diagram for explaining a method of synthesizing at least two first frames in an electronic device in a first mode, a second mode, or a third mode, according to various embodiments of the present disclosure.
  • FIG. 5 is a diagram exemplarily showing weights indicating a blend ratio of an average frame and a maximum frame.
  • FIG. 6 is a flowchart illustrating an operation in which an electronic device generates a hyper-lapse image in a first mode, according to various embodiments of the present disclosure
  • FIG. 7 is a flowchart illustrating an operation in which an electronic device generates a hyper-lapse image in a second mode, according to various embodiments of the present disclosure
  • FIG. 8 is a flowchart illustrating an operation in which an electronic device generates a hyper-lapse image in a third mode, according to various embodiments of the present disclosure
  • FIG. 9 is a flowchart illustrating an operation in which an electronic device generates a hyper-lapse image based on mode information, according to various embodiments of the present disclosure
  • FIG. 10 is a flowchart illustrating an operation in which an electronic device generates at least one second frame by synthesizing a plurality of first frames in a first mode, according to various embodiments of the present disclosure
  • FIG. 11 is a flowchart illustrating an operation in which an electronic device generates at least one second frame by synthesizing a plurality of first frames in a second mode, according to various embodiments of the present disclosure
  • FIG. 12 is a flowchart illustrating an operation in which an electronic device generates at least one second frame by synthesizing a plurality of first frames in a third mode, according to various embodiments of the present disclosure
  • FIG. 1 is a block diagram of an electronic device 101 in a network environment 100 according to various embodiments.
  • The electronic device 101 may communicate with the electronic device 102 through a first network 198 (e.g., a short-range wireless communication network), or may communicate with the electronic device 104 or the server 108 through a second network 199 (e.g., a long-range wireless communication network). According to an embodiment, the electronic device 101 may communicate with the electronic device 104 through the server 108.
  • The electronic device 101 may include a processor 120, a memory 130, an input module 150, a sound output module 155, a display module 160, an audio module 170, a sensor module 176, an interface 177, a connection terminal 178, a haptic module 179, a camera module 180, a power management module 188, a battery 189, a communication module 190, a subscriber identification module 196, or an antenna module 197.
  • In some embodiments, at least one of these components (e.g., the connection terminal 178) may be omitted, or one or more other components may be added to the electronic device 101.
  • In some embodiments, some of these components may be integrated into one component (e.g., the display module 160).
  • The processor 120 may, for example, execute software (e.g., a program 140) to control at least one other component (e.g., a hardware or software component) of the electronic device 101 connected to the processor 120, and may perform various data processing or operations. According to an embodiment, as at least part of the data processing or operations, the processor 120 may store commands or data received from another component (e.g., the sensor module 176 or the communication module 190) in the volatile memory 132, process the commands or data stored in the volatile memory 132, and store the resulting data in the non-volatile memory 134.
  • According to an embodiment, the processor 120 may include a main processor 121 (e.g., a central processing unit or an application processor) or an auxiliary processor 123 (e.g., a graphics processing unit, a neural processing unit (NPU), an image signal processor, a sensor hub processor, or a communication processor).
  • The auxiliary processor 123 may, for example, control at least some of the functions or states related to at least one of the components of the electronic device 101 (e.g., the display module 160, the sensor module 176, or the communication module 190), on behalf of the main processor 121 while the main processor 121 is in an inactive (e.g., sleep) state, or together with the main processor 121 while the main processor 121 is in an active (e.g., application-executing) state.
  • According to an embodiment, the auxiliary processor 123 (e.g., an image signal processor or a communication processor) may be implemented as part of another functionally related component (e.g., the camera module 180 or the communication module 190).
  • the auxiliary processor 123 may include a hardware structure specialized for processing an artificial intelligence model.
  • Artificial intelligence models can be created through machine learning. Such learning may be performed, for example, in the electronic device 101 itself on which artificial intelligence is performed, or may be performed through a separate server (eg, the server 108).
  • The learning algorithm may include, for example, supervised learning, unsupervised learning, semi-supervised learning, or reinforcement learning, but is not limited to the above examples.
  • the artificial intelligence model may include a plurality of artificial neural network layers.
  • The artificial neural network may be one of a deep neural network (DNN), a convolutional neural network (CNN), a recurrent neural network (RNN), a restricted Boltzmann machine (RBM), a deep belief network (DBN), a bidirectional recurrent deep neural network (BRDNN), a deep Q-network, or a combination of two or more of the above, but is not limited to the above examples.
  • The artificial intelligence model may, additionally or alternatively, include a software structure in addition to the hardware structure.
  • the memory 130 may store various data used by at least one component of the electronic device 101 (eg, the processor 120 or the sensor module 176 ).
  • the data may include, for example, input data or output data for software (eg, the program 140 ) and instructions related thereto.
  • the memory 130 may include a volatile memory 132 or a non-volatile memory 134 .
  • the program 140 may be stored as software in the memory 130 , and may include, for example, an operating system 142 , middleware 144 , or an application 146 .
  • the input module 150 may receive a command or data to be used in a component (eg, the processor 120 ) of the electronic device 101 from the outside (eg, a user) of the electronic device 101 .
  • the input module 150 may include, for example, a microphone, a mouse, a keyboard, a key (eg, a button), or a digital pen (eg, a stylus pen).
  • the sound output module 155 may output a sound signal to the outside of the electronic device 101 .
  • the sound output module 155 may include, for example, a speaker or a receiver.
  • the speaker can be used for general purposes such as multimedia playback or recording playback.
  • the receiver may be used to receive an incoming call. According to one embodiment, the receiver may be implemented separately from or as part of the speaker.
  • the display module 160 may visually provide information to the outside (eg, a user) of the electronic device 101 .
  • the display module 160 may include, for example, a control circuit for controlling a display, a hologram device, or a projector and a corresponding device.
  • the display module 160 may include a touch sensor configured to sense a touch or a pressure sensor configured to measure the intensity of a force generated by the touch.
  • The audio module 170 may convert a sound into an electric signal or, conversely, convert an electric signal into a sound. According to an embodiment, the audio module 170 may acquire a sound through the input module 150, or may output a sound through the sound output module 155 or an external electronic device (e.g., the electronic device 102, such as a speaker or headphones) connected directly or wirelessly to the electronic device 101.
  • The sensor module 176 may detect an operating state (e.g., power or temperature) of the electronic device 101 or an external environmental state (e.g., a user state), and may generate an electrical signal or data value corresponding to the detected state.
  • The sensor module 176 may include, for example, a gesture sensor, a gyro sensor, a barometric pressure sensor, a magnetic sensor, an acceleration sensor, a grip sensor, a proximity sensor, a color sensor, an infrared (IR) sensor, a biometric sensor, a temperature sensor, a humidity sensor, or an illuminance sensor.
  • the interface 177 may support one or more designated protocols that may be used by the electronic device 101 to directly or wirelessly connect with an external electronic device (eg, the electronic device 102 ).
  • the interface 177 may include, for example, a high definition multimedia interface (HDMI), a universal serial bus (USB) interface, an SD card interface, or an audio interface.
  • the connection terminal 178 may include a connector through which the electronic device 101 can be physically connected to an external electronic device (eg, the electronic device 102 ).
  • the connection terminal 178 may include, for example, an HDMI connector, a USB connector, an SD card connector, or an audio connector (eg, a headphone connector).
  • the haptic module 179 may convert an electrical signal into a mechanical stimulus (eg, vibration or movement) or an electrical stimulus that the user can perceive through tactile or kinesthetic sense.
  • the haptic module 179 may include, for example, a motor, a piezoelectric element, or an electrical stimulation device.
  • the camera module 180 may capture still images and moving images. According to an embodiment, the camera module 180 may include one or more lenses, image sensors, image signal processors, or flashes.
  • the power management module 188 may manage power supplied to the electronic device 101 .
  • the power management module 188 may be implemented as, for example, at least a part of a power management integrated circuit (PMIC).
  • the battery 189 may supply power to at least one component of the electronic device 101 .
  • battery 189 may include, for example, a non-rechargeable primary cell, a rechargeable secondary cell, or a fuel cell.
  • The communication module 190 may support establishment of a direct (e.g., wired) communication channel or a wireless communication channel between the electronic device 101 and an external electronic device (e.g., the electronic device 102, the electronic device 104, or the server 108), and may support communication through the established communication channel.
  • the communication module 190 may include one or more communication processors that operate independently of the processor 120 (eg, an application processor) and support direct (eg, wired) communication or wireless communication.
  • According to an embodiment, the communication module 190 may include a wireless communication module 192 (e.g., a cellular communication module, a short-range wireless communication module, or a global navigation satellite system (GNSS) communication module) or a wired communication module 194 (e.g., a local area network (LAN) communication module or a power line communication module).
  • Among these communication modules, a corresponding communication module may communicate with the external electronic device 104 through the first network 198 (e.g., a short-range communication network such as Bluetooth, wireless fidelity (WiFi) direct, or infrared data association (IrDA)) or the second network 199 (e.g., a long-range communication network such as a legacy cellular network, a 5G network, a next-generation communication network, the Internet, or a computer network (e.g., a LAN or a WAN)).
  • The wireless communication module 192 may identify or authenticate the electronic device 101 within a communication network, such as the first network 198 or the second network 199, using subscriber information (e.g., an International Mobile Subscriber Identity (IMSI)) stored in the subscriber identification module 196.
  • The wireless communication module 192 may support a 5G network after a 4G network, and a next-generation communication technology, for example, a new radio (NR) access technology.
  • The NR access technology may support high-speed transmission of high-capacity data (enhanced mobile broadband (eMBB)), minimization of terminal power and access by multiple terminals (massive machine-type communications (mMTC)), or high reliability and low latency (ultra-reliable and low-latency communications (URLLC)).
  • The wireless communication module 192 may support a high-frequency band (e.g., a mmWave band) to achieve, for example, a high data rate.
  • The wireless communication module 192 may support various technologies for securing performance in a high-frequency band, for example, beamforming, massive multiple-input and multiple-output (MIMO), full-dimensional MIMO (FD-MIMO), an array antenna, analog beamforming, or a large-scale antenna.
  • the wireless communication module 192 may support various requirements specified in the electronic device 101 , an external electronic device (eg, the electronic device 104 ), or a network system (eg, the second network 199 ).
  • For example, the wireless communication module 192 may support a peak data rate (e.g., 20 Gbps or more) for realizing eMBB, loss coverage (e.g., 164 dB or less) for realizing mMTC, or U-plane latency (e.g., 0.5 ms or less for each of downlink (DL) and uplink (UL), or a round trip of 1 ms or less) for realizing URLLC.
  • the antenna module 197 may transmit or receive a signal or power to the outside (eg, an external electronic device).
  • the antenna module 197 may include an antenna including a conductor formed on a substrate (eg, a PCB) or a radiator formed of a conductive pattern.
  • According to an embodiment, the antenna module 197 may include a plurality of antennas (e.g., an array antenna). In this case, at least one antenna suitable for a communication method used in a communication network, such as the first network 198 or the second network 199, may be selected from the plurality of antennas by, for example, the communication module 190. A signal or power may be transmitted or received between the communication module 190 and an external electronic device through the selected at least one antenna.
  • According to some embodiments, other components (e.g., a radio frequency integrated circuit (RFIC)) may be additionally formed as part of the antenna module 197.
  • the antenna module 197 may form a mmWave antenna module.
  • According to an embodiment, the mmWave antenna module may include a printed circuit board, an RFIC disposed on or adjacent to a first surface (e.g., the bottom surface) of the printed circuit board and capable of supporting a designated high-frequency band (e.g., a mmWave band), and a plurality of antennas (e.g., an array antenna) disposed on or adjacent to a second surface (e.g., the top or side surface) of the printed circuit board and capable of transmitting or receiving signals of the designated high-frequency band.
  • At least some of the above-described components may be connected to each other through a communication method between peripheral devices (e.g., a bus, general-purpose input and output (GPIO), serial peripheral interface (SPI), or mobile industry processor interface (MIPI)) and may exchange signals (e.g., commands or data) with each other.
  • the command or data may be transmitted or received between the electronic device 101 and the external electronic device 104 through the server 108 connected to the second network 199 .
  • Each of the external electronic devices 102 or 104 may be the same as or different from the electronic device 101 .
  • all or a part of operations executed in the electronic device 101 may be executed in one or more external electronic devices 102 , 104 , or 108 .
  • Instead of executing the function or service itself, or in addition thereto, the electronic device 101 may request one or more external electronic devices to perform at least a part of the function or the service.
  • One or more external electronic devices that have received the request may execute at least a part of the requested function or service, or an additional function or service related to the request, and transmit a result of the execution to the electronic device 101 .
  • the electronic device 101 may process the result as it is or additionally and provide it as at least a part of a response to the request.
  • To this end, for example, cloud computing, distributed computing, mobile edge computing (MEC), or client-server computing technology may be used.
  • the electronic device 101 may provide an ultra-low latency service using, for example, distributed computing or mobile edge computing.
  • the external electronic device 104 may include an Internet of things (IoT) device.
  • Server 108 may be an intelligent server using machine learning and/or neural networks.
  • the external electronic device 104 or the server 108 may be included in the second network 199 .
  • the electronic device 101 may be applied to an intelligent service (eg, smart home, smart city, smart car, or health care) based on 5G communication technology and IoT-related technology.
  • FIG. 2 is a block diagram of an electronic device according to various embodiments of the present disclosure.
  • The electronic device 200 (e.g., the electronic device 101 of FIG. 1) may include a camera 210 (e.g., the camera module 180 of FIG. 1), a memory 220 (e.g., the memory 130 of FIG. 1), and a processor 230 (e.g., the processor 120 of FIG. 1).
  • the camera 210 may generate a hyper-lapse image by photographing an external object.
  • time-lapse may be used interchangeably with or equivalent to hyper-lapse.
  • The camera 210 may generate a plurality of frames including the external object by photographing the external object. According to an embodiment, the camera 210 may generate a plurality of first frames by photographing the external object at a predetermined speed based on mode information. According to an embodiment, the electronic device 200 may receive the mode information from the user. According to an embodiment, the electronic device 200 may generate the mode information based on a result of an internal operation of the processor 230, and may use the generated mode information to synthesize the plurality of first frames.
  • the mode information may include at least one of first mode information, second mode information, and third mode information.
  • When the mode information includes the first mode information, the camera 210 may acquire a plurality of first frames, and the processor 230 may synthesize the plurality of first frames in a first mode to generate a hyper-lapse image.
  • When the mode information includes the second mode information, the camera 210 may acquire a plurality of first frames, and the processor 230 may synthesize the plurality of first frames in a second mode to generate a hyper-lapse image.
  • When the mode information includes the third mode information, the camera 210 may acquire a plurality of first frames, and the processor 230 may synthesize the plurality of first frames in a third mode to generate a hyper-lapse image.
  • the camera 210 may include at least a part of the camera module 180 illustrated in FIG. 1 .
  • the memory 220 may store at least one program, application, data, or instructions executed by the processor 230 . According to an embodiment, the memory 220 may include at least a portion of the memory 130 illustrated in FIG. 1 .
  • The processor 230 may generate at least one second frame by synthesizing a plurality of first frames based on the mode information, and may generate a hyper-lapse image including the at least one second frame.
  • the processor 230 may generate a hyper-lapse image reproduced at a second speed different from a first speed at which a plurality of first frames are generated.
  • saturation may occur due to excessively received light.
  • The processor 230 may set the exposure time of the camera 210 to be longer than an appropriate exposure time in order to represent the trajectory of an external object, including a light-emitting element, that has moved for a long time.
  • the hyper-lapse image representing the light trajectory corresponding to the movement trajectory of the external object may be saturated.
  • The processor 230 may set the exposure time of the camera 210 to about 1.5 seconds in order to represent the movement trajectory of the external object including the light-emitting element.
  • saturation may occur in the hyper-lapse image generated using the plurality of first frames acquired through the camera 210 .
  • the processor 230 may prevent saturation that may occur in the hyper-lapse image.
  • the processor 230 may generate at least one second frame by synthesizing a plurality of first frames based on a predetermined blend (or blending) ratio.
  • saturation of the hyper-lapse image including at least one second frame generated by the processor 230 may be prevented.
  • The processor 230 according to an embodiment may generate the second frame by synthesizing three first frames, each corresponding to an individual exposure time of 0.5 seconds, based on the blend ratio.
  • The second frame generated by the processor 230 corresponds to a total exposure time of 1.5 seconds, but since it is generated by synthesizing first frames each having an exposure time of 0.5 seconds, saturation can be prevented.
  • Since the processor 230 according to an embodiment synthesizes the first frames based on the blend ratio, it is possible to generate a high-quality hyper-lapse image with significantly reduced saturation.
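FIG. 5 is described as showing weights indicating a blend ratio of an average frame and a maximum frame. A minimal sketch of such a blend, under the assumption of a simple per-pixel linear blend (the `weight` parameter, the linear form, and the clipping are illustrative, not taken from the document):

```python
import numpy as np

def blend_synthesis(group, weight):
    """Blend the per-pixel average frame and per-pixel maximum frame of a
    group of short-exposure first frames.

    `weight` is the blend ratio (0.0 = pure average, 1.0 = pure maximum);
    the name and the linear blend are assumptions, since the document
    states only that a predetermined blend ratio is used.
    """
    group = np.stack(group).astype(np.float32)
    avg_frame = group.mean(axis=0)   # averaging suppresses saturation
    max_frame = group.max(axis=0)    # the maximum preserves bright light trajectories
    blended = (1.0 - weight) * avg_frame + weight * max_frame
    return np.clip(blended, 0, 255).astype(np.uint8)

# Three 0.5 s first frames stand in for one 1.5 s exposure: simply summing
# them would clip (100 + 150 + 200 > 255), while the blend does not.
group = [np.full((2, 2), v, dtype=np.uint8) for v in (100, 150, 200)]
frame = blend_synthesis(group, weight=0.5)
```

The blend illustrates the trade-off the document describes: the average frame avoids the saturation a single long exposure would produce, while mixing in the maximum frame keeps the bright trajectory of a light-emitting object visible.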
  • the processor 230 may generate a high-speed hyper-lapse image or a high-quality hyper-lapse image with reduced saturation that can effectively represent a trajectory of an external object according to a frame synthesis method.
  • the processor 230 may generate at least one second frame by synthesizing the plurality of first frames generated by the camera 210 in the first mode.
  • the processor 230 may generate at least one second frame by synthesizing a plurality of first frames in units of a predetermined number in the order in which they are generated.
  • The processor 230 may determine the number of first frames to be synthesized based on the brightness of the photographing environment (or the photographed scene), and may generate at least one second frame by synthesizing the first frames, in the order in which they are acquired, according to the determined number.
  • for example, in a relatively bright photographing environment, the processor 230 may control the exposure time of one first frame (eg, the first time T1 of FIGS. 3A to 3C ) to be relatively short.
  • in a relatively dark photographing environment, the processor 230 may control the exposure time (eg, the first time T1 of FIGS. 3A to 3C ) of one first frame to be relatively long.
  • when the exposure time of one first frame is relatively short, the processor 230 may set the number of first frames synthesized to generate one second frame to a relatively large value.
  • when the exposure time of one first frame is relatively long, the processor 230 may set the number of first frames synthesized to generate one second frame to a relatively small value.
  • the electronic device may change the exposure time of the first frame while capturing an external object.
  • the electronic device may change the number of synthesizing the plurality of first frames in response to the changed exposure time. For example, it is assumed that the surrounding environment becomes bright or dark while the electronic device captures an external object in a dark environment.
  • the electronic device may change the exposure time of the first frame and change the first speed to correspond to the changed exposure time of the first frame. Accordingly, the electronic device may change the number of the plurality of first frames to be synthesized in response to the changed first speed in order to keep the second speed, which is the reproduction speed of the hyper-lapse image, constant.
  • the electronic device may change a predetermined number of units to the changed number, and may generate a second frame by synthesizing a plurality of first frames in an acquired order by the changed predetermined number of units.
  • the electronic device may prevent the generated hyper-lapse image from being saturated by varying the exposure time or the first speed of the first frame.
  • the electronic device 200 may receive an exposure effect for a hyper-lapse image from a user.
  • the user may input information (eg, 1 second) on the exposure effect into the electronic device 200 .
  • the processor 230 may determine the exposure time of one first frame. For example, in a relatively bright photographing environment, the processor 230 may determine the exposure time of one first frame to be 0.2 seconds. Accordingly, the processor 230 may generate one second frame having an exposure effect of 1 second desired by the user by synthesizing the five first frames.
  • the processor 230 may determine the exposure time of one first frame to be 0.5 seconds. Accordingly, the processor 230 may generate one second frame having an exposure effect of 1 second desired by the user by synthesizing the two first frames. According to an embodiment, the processor 230 may generate a hyper-lapse image including the generated at least one second frame.
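The relationship between the user's requested exposure effect and the number of first frames to synthesize can be sketched as follows; the helper name is an assumption, not something named in the original.

```python
def frames_to_synthesize(exposure_effect_sec, frame_exposure_sec):
    """Number of first frames whose combined exposure time matches the
    exposure effect requested by the user."""
    return max(1, round(exposure_effect_sec / frame_exposure_sec))

# bright scene: 0.2 s first frames -> five frames per second frame
# dark scene:   0.5 s first frames -> two frames per second frame
five = frames_to_synthesize(1.0, 0.2)
two = frames_to_synthesize(1.0, 0.5)
```

This mirrors the two examples above: a 1-second exposure effect requires five 0.2-second frames in a bright scene, or two 0.5-second frames in a dark one.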
  • the hyper-lapse image generated based on the first mode may represent an external object that moves at a relatively faster speed than when the external object is photographed.
  • the external object included in the hyper-lapse image generated based on the first mode may move at a relatively faster speed than the actual movement of the external object during photographing.
  • the hyper-lapse image generated to have an exposure effect of 1 second may represent the motion of the original external object at 30x speed.
  • the processor 230 may generate at least one second frame by synthesizing the plurality of first frames generated by the camera 210 in the second mode.
  • the processor 230 may generate at least two second frames by synthesizing in units of a predetermined number so that at least one overlaps among the plurality of first frames.
  • the processor 230 may determine the number of the plurality of first frames to be synthesized based on the brightness of a shooting environment (or a shooting scene), and may generate at least two second frames by synthesizing the plurality of first frames according to the determined number so that at least one first frame overlaps.
  • for example, in a relatively bright photographing environment, the processor 230 may control the exposure time of one first frame (eg, the first time T1 of FIGS. 3A to 3C ) to be relatively short.
  • in a relatively dark photographing environment, the processor 230 may control the exposure time (eg, the first time T1 of FIGS. 3A to 3C ) of one first frame to be relatively long.
  • when the exposure time of one first frame is relatively short, the processor 230 may set the number of first frames synthesized to generate one second frame to a relatively large value.
  • when the exposure time of one first frame is relatively long, the processor 230 may set the number of first frames synthesized to generate one second frame to a relatively small value.
  • the electronic device may change the exposure time of the first frame while capturing an external object.
  • the electronic device may change the number of synthesizing the plurality of first frames in response to the changed exposure time of the first frame. For example, it is assumed that the surrounding environment becomes bright or dark while the electronic device captures an external object in a dark environment.
  • the electronic device may change the exposure time of the first frame and change the first speed to correspond to the changed exposure time of the first frame. Accordingly, the electronic device may change the number of the plurality of first frames to be synthesized in response to the changed first speed in order to keep the second speed, which is the reproduction speed of the hyper-lapse image, constant.
  • the electronic device changes a predetermined number of units to the changed number, and generates a second frame by synthesizing a plurality of first frames in the changed predetermined number of units so that at least one first frame overlaps. can do.
  • the electronic device may prevent the generated hyper-lapse image from being saturated by varying the exposure time and the first speed of the first frame.
  • the electronic device 200 may receive an exposure effect for a hyper-lapse image from a user.
  • the user may input information (eg, 1 second) on the exposure effect into the electronic device 200 .
  • the processor 230 may determine the exposure time of one first frame. For example, in a relatively bright photographing environment, the processor 230 may determine the exposure time of one first frame to be 0.2 seconds. Accordingly, the processor 230 may generate one second frame having an exposure effect of 1 second desired by the user by synthesizing the five first frames.
  • the processor 230 for generating the hyper-lapse image in the second mode may obtain ten first frames such that at least one first frame overlaps, and may synthesize five of them at a time to generate two consecutive second frames. For example, in a relatively dark photographing environment, the processor 230 may determine the exposure time of one first frame to be 0.5 seconds. Accordingly, the processor 230 may generate one second frame having an exposure effect of 1 second desired by the user by synthesizing the two first frames.
  • the processor 230 for generating the hyper-lapse image of the second mode may obtain four first frames such that at least one first frame overlaps, and may synthesize two of them at a time to generate two consecutive second frames. According to an embodiment, assuming that the second speed corresponding to the playback speed of the hyper-lapse image is 30 fps, the hyper-lapse image generated to have an exposure effect of 1 second may represent the motion of the original external object at 30x speed.
  • the processor 230 may generate a hyper-lapse image including the generated at least one second frame. Since at least one frame among the plurality of first frames overlaps in the hyper-lapse image generated based on the second mode, the movement (or trajectory according to the movement) of the external object may be effectively represented.
  • the hyper-lapse image generated in the second mode may represent a smooth image with fewer inter-frame breaks than the hyper-lapse image generated in the first mode.
  • the processor 230 may generate at least one second frame by synthesizing the plurality of first frames generated by the camera 210 in the third mode.
  • the processor 230 may generate at least one second frame including a frame that is first generated among the plurality of first frames and a frame that is cumulatively synthesized in the order in which the plurality of first frames are generated.
  • the processor 230 may generate a hyper-lapse image including the generated at least one second frame.
  • the hyper-lapse image generated based on the third mode includes at least one second frame generated by accumulating a plurality of first frames obtained by photographing the external object, the movement of the external object (or the movement of the external object) trajectory) can be effectively represented.
  • the hyper-lapse image generated in the third mode may represent a smooth image with fewer inter-frame breaks than the hyper-lapse image generated in the first mode or the second mode.
  • the electronic device 200 may generate, according to the method of synthesizing the plurality of first frames acquired at the first speed by photographing the external object, a hyper-lapse image in which the speed of the external object is increased, or a hyper-lapse image that effectively represents the moving trajectory of the external object.
  • the electronic device 200 generates a hyper-lapse image by using all of the acquired first frames, instead of a sampling method that uses only some of the first frames acquired through the camera 210 . Therefore, the completed hyper-lapse image can provide the user with a smooth video feeling without interruption.
  • the electronic device 200 may generate at least one second frame having an effect corresponding to a relatively long exposure time by synthesizing first frames having a relatively short exposure time.
  • the hyper-lapse image including the second frame effectively expresses the movement trajectory of an external object and at the same time prevents saturation by light, so that it can have high quality.
  • FIG. 3A is a diagram for describing an example in which an electronic device generates at least one second frame by synthesizing a plurality of first frames, according to various embodiments of the present disclosure;
  • the electronic device (eg, the electronic device 101 of FIG. 1 or the electronic device 200 of FIG. 2 ) may generate at least one second frame by synthesizing a plurality of first frames acquired at a first speed in a first mode.
  • the electronic device may acquire a plurality of first frames ( 301 , 302 , 303 , 304 , 305 , 306 , 307 , 308 , ...) at the first speed by photographing an external object through a camera (eg, the camera module 180 of FIG. 1 or the camera 210 of FIG. 2 ).
  • FIG. 3A exemplarily illustrates a plurality of first frames 301 , 302 , 303 , 304 , 305 , 306 , 307 , 308 , ... .
  • the plurality of first frames ( 301 , 302 , 303 , 304 , 305 , 306 , 307 , 308 , ...) shown in FIG. 3A are only an example, and the number of the plurality of first frames is not construed as being limited to that shown in FIG. 3A .
  • the plurality of first frames ( 301 , 302 , 303 , 304 , 305 , 306 , 307 , 308 , ...) may include a 301 frame 301 , a 302 frame 302 , a 303 frame 303 , a 304 frame 304 , a 305 frame 305 , a 306 frame 306 , a 307 frame 307 , and a 308 frame 308 .
  • each of the plurality of first frames 301, 302, 303, 304, 305, 306, 307, 308, ... may represent an external object photographed through a camera.
  • each of the plurality of first frames 301 , 302 , 303 , 304 , 305 , 306 , 307 , 308 , ... may include a still image obtained by continuously photographing an external object.
  • an external object that changes with time may be represented.
  • the electronic device may receive light from the outside for a predetermined period of time through a camera, and may generate the plurality of first frames ( 301 , 302 , 303 , 304 , 305 , 306 , 307 , 308 , ...) using the received light.
  • the electronic device according to an embodiment may receive light reflected from an external object for a first time T1, and may generate data based on the received light using a sensor (eg, an image sensor) included in the camera.
  • the electronic device according to an embodiment may read data generated during the first time T1 during the second time T2 and reset a sensor included in the camera.
  • the electronic device according to an embodiment may further perform an operation of removing noise generated in data generated during the first time period T1 during the second time period T2.
  • since each of the plurality of first frames includes a result of photographing the external object at a predetermined time during the first time T1, the hyper-lapse image generated by synthesizing the plurality of first frames may express a change in the external object over the first time T1 of each frame.
  • the position of the external object appearing in each of the plurality of first frames may vary. Accordingly, when the hyper-lapse image is generated using the plurality of first frames, a change in the position of the external object may be expressed in the hyper-lapse image.
  • the electronic device may generate at least one second frame ( 351 , 352 , 353 , ...) by synthesizing the plurality of first frames ( 301 , 302 , 303 , 304 , 305 , 306 , 307 , 308 , ...) in the first mode.
  • FIG. 3A exemplarily shows at least one second frame 351 , 352 , 353 , ... .
  • the at least one second frame 351 , 352 , 353 , ... may include a 351 frame 351 , a 352 frame 352 , and a 353 frame 353 .
  • the at least one second frame ( 351 , 352 , 353 , ...) shown in FIG. 3A is only an example, and the number of the at least one second frame is not construed as being limited to that shown in FIG. 3A .
  • the hyper-lapse image may include at least one second frame 351 , 352 , 353 , ... .
  • the electronic device may generate a hyper-lapse image such that the second frames are reproduced at a second speed different from the first speed at which the first frames were acquired.
  • for example, when the hyper-lapse image is played at a rate of 30 frames per second (30 fps), the playback time of each second frame included in the hyper-lapse image may be 1/30 second (sec).
  • the third time T3 illustrated in FIG. 3A may correspond to a time at which one second frame is reproduced in the hyper-lapse image.
  • the third time T3 may correspond to 1/30 second (sec).
  • the electronic device may generate at least one second frame ( 351 , 352 , 353 , ...) by synthesizing a plurality of first frames ( 301 , 302 , 303 , 304 , 305 , 306 , 307 , 308 , ...) in the first mode.
  • the electronic device may generate the second frame in the first mode in which the first frames are synthesized in units of a predetermined number.
  • for example, the electronic device may generate at least one second frame by synthesizing the first frames in units of three.
  • the electronic device may generate the 351 frame 351 by synthesizing the 301 frame 301 , the 302 frame 302 , and the 303 frame 303 in the order obtained through the camera.
  • the electronic device may generate the 352 frame 352 by synthesizing the 304 frames 304 , 305 frames 305 , and 306 frames 306 in the order obtained through the camera.
  • the electronic device may generate the 353 frame 353 and subsequent second frames in the same manner.
  • a specific frame synthesis method will be described with reference to FIGS. 4 and 5 to be described later.
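The non-overlapping grouping of the first mode shown in FIG. 3A (frames 301-303 into frame 351, frames 304-306 into frame 352) can be sketched as follows; the helper name is an assumption for illustration.

```python
def first_mode_groups(first_frames, group_size):
    """Group first frames in acquisition order into non-overlapping
    units; each group is then synthesized into one second frame."""
    return [first_frames[i:i + group_size]
            for i in range(0, len(first_frames) - group_size + 1, group_size)]

# reference numerals stand in for the frames themselves
groups = first_mode_groups([301, 302, 303, 304, 305, 306], group_size=3)
# -> [[301, 302, 303], [304, 305, 306]], i.e. the 351 and 352 frames
```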
  • the hyper-lapse image generated in the first mode may provide a user with a sense of speed.
  • the hyper-lapse image generated in the first mode may represent an external object moving at a multiplied speed according to Equation 1 below.
  • [Equation 1] speed = N × (T1 + T2) × p
  • the speed of the hyper-lapse image generated in the first mode may be determined by Equation 1 above.
  • N may be a positive integer greater than or equal to 1
  • N may correspond to the number of a plurality of first frames synthesized to generate one second frame.
  • the sum of the first time T1 and the second time T2 may correspond to a time required to acquire one first frame.
  • the sum of the first time T1 and the second time T2 may correspond to a reciprocal of the first speed (eg, the speed at which the first frames are acquired).
  • p is a value corresponding to the playback speed of the hyper-lapse image generated in the first mode, and may correspond to the second speed described above. For example, assuming N is 3, T1+T2 is 0.5, and p is 30, a speed of the hyper-lapse image generated in the first mode may correspond to a speed of 45 times.
  • the hyper-lapse image according to an embodiment may be expressed as if the external object is moving at a speed 45 times faster than the original moving speed, and the user may feel a sense of speed.
  • the electronic device may acquire a plurality of first frames by photographing an external object at a first speed.
  • for example, when the first speed is 1 fps, the sum of the first time T1 and the second time T2 for one first frame may correspond to 1 second (sec).
  • the electronic device may generate a second frame by synthesizing the first frames in units of three, and may generate a hyper-lapse image in which the second frames are reproduced at a second speed.
  • the third time T3 during which one second frame is reproduced may correspond to 1/30 second (33 milliseconds (msec)).
  • since the hyper-lapse image according to the first mode reproduces the first frames corresponding to 3 seconds (sec) within 1/30 second (sec), the user can feel as if the external object is moving at 90 times the original speed. Accordingly, the hyper-lapse image according to the first mode may provide a fast sense of speed to the user.
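Equation 1 and the two worked examples in this section (45x and 90x) can be checked numerically; the function name is an assumption.

```python
def hyperlapse_speed(n, t1_plus_t2, p):
    """Equation 1: apparent speed-up of a first-mode hyper-lapse.

    n          -- number of first frames synthesized per second frame (N)
    t1_plus_t2 -- seconds needed to acquire one first frame (T1 + T2)
    p          -- playback speed of the hyper-lapse image in fps
    """
    return n * t1_plus_t2 * p

# N = 3, T1 + T2 = 0.5 s, p = 30 fps -> 45x speed
# N = 3, T1 + T2 = 1.0 s, p = 30 fps -> 90x speed
forty_five = hyperlapse_speed(3, 0.5, 30)
ninety = hyperlapse_speed(3, 1.0, 30)
```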
  • FIG. 3B is a diagram for describing an example in which an electronic device generates at least one second frame by synthesizing a plurality of first frames, according to various embodiments of the present disclosure
  • the electronic device may acquire a plurality of first frames ( 301 , 302 , 303 , 304 , 305 , 306 , 307 , 308 , ...) at a first speed through a camera (eg, the camera module 180 of FIG. 1 or the camera 210 of FIG. 2 ).
  • the electronic device may generate at least one second frame ( 351 , 352 , 353 , 354 , 355 , ...) by synthesizing the acquired plurality of first frames ( 301 , 302 , 303 , 304 , 305 , 306 , 307 , 308 , ...) in the second mode.
  • the plurality of first frames ( 301 , 302 , 303 , 304 , 305 , 306 , 307 , 308 , ...) may include a 301 frame 301 , a 302 frame 302 , a 303 frame 303 , a 304 frame 304 , a 305 frame 305 , a 306 frame 306 , a 307 frame 307 , and a 308 frame 308 .
  • the at least one second frame ( 351 , 352 , 353 , 354 , 355 , ...) may include a 351 frame 351 , a 352 frame 352 , a 353 frame 353 , a 354 frame 354 , and a 355 frame 355 .
  • the plurality of first frames ( 301 , 302 , 303 , 304 , 305 , 306 , 307 , 308 , ...) shown in FIG. 3B may correspond to the plurality of first frames ( 301 , 302 , 303 , 304 , 305 , 306 , 307 , 308 , ...) shown in FIG. 3A .
  • differences from those described with reference to FIG. 3A will be mainly described.
  • the electronic device may acquire a plurality of first frames ( 301 , 302 , 303 , 304 , 305 , 306 , 307 , 308 , ...) at a first speed through a camera, and may generate at least one second frame ( 351 , 352 , 353 , 354 , 355 , ...) by synthesizing the acquired plurality of first frames in the second mode.
  • the electronic device may generate a hyper-lapse image including at least one generated second frame 351 , 352 , 353 , 354 , 355 ... and reproduced at a second speed.
  • the electronic device may select the plurality of first frames to be synthesized such that at least one first frame overlaps (or is duplicated).
  • the electronic device may generate at least one second frame by synthesizing the first frames so that at least one first frame overlaps. For example, when the electronic device generates one second frame by synthesizing three first frames, the electronic device may generate the 351 frame 351 by synthesizing the 301 frame 301 , the 302 frame 302 , and the 303 frame 303 in the order obtained through the camera. Next, the electronic device may generate the 352 frame 352 by synthesizing the 302 frame 302 , the 303 frame 303 , and the 304 frame 304 in the order obtained through the camera.
  • the electronic device may generate the 353 frame 353 by synthesizing the 303 frame 303 , the 304 frame 304 , and the 305 frame 305 in the order obtained through the camera.
  • the electronic device may generate the 354 frame 354 by synthesizing the 304 frames 304 , 305 frames 305 , and 306 frames 306 in the order obtained through the camera.
  • the electronic device may generate the 355 frame 355 and subsequent second frames in the same manner.
  • a specific frame synthesis method will be described with reference to FIGS. 4 and 5 to be described later.
  • in FIG. 3B , one second frame is illustrated as being generated by synthesizing three first frames, two of which overlap, but this is only an example; the number of first frames synthesized to generate one second frame and the number of overlapping first frames are not construed as being limited to those illustrated in FIG. 3B .
  • two or more first frames may be synthesized to generate one second frame, and at least one first frame may be overlapped to generate second frames.
  • the second mode may include a synthesis method in which at least one of a plurality of first frames synthesized to generate different second frames is common.
  • 351 frames 351 and 352 frames 352 generated by synthesizing a plurality of first frames in the second mode may be generated by synthesizing 302 frames 302 and 303 frames 303 in common.
  • the 352 frame 352 and the 353 frame 353 may be generated by synthesizing the 303 frame 303 and the 304 frame 304 in common.
  • the hyper-lapse image according to the second mode can provide the user with a smoother video feeling with less stuttering than the hyper-lapse image according to the first mode.
  • the hyper-lapse image generated in the second mode may more effectively represent the movement trajectory of the external object than the hyper-lapse image according to the first mode.
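The overlapping grouping of the second mode in FIG. 3B (frames 301-303 into 351, frames 302-304 into 352, and so on) is a sliding window; a sketch with assumed names follows.

```python
def second_mode_groups(first_frames, group_size, stride=1):
    """Sliding-window grouping: consecutive second frames share
    group_size - stride first frames, which smooths the hyper-lapse
    and emphasizes motion trails."""
    return [first_frames[i:i + group_size]
            for i in range(0, len(first_frames) - group_size + 1, stride)]

# reference numerals stand in for the frames themselves
groups = second_mode_groups([301, 302, 303, 304, 305], group_size=3)
# -> [[301, 302, 303], [302, 303, 304], [303, 304, 305]]
```

With stride 1 and group size 3, adjacent groups share two first frames, matching the figure; the first mode is the special case stride == group_size (no overlap).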
  • FIG. 3C is a diagram for describing an example in which an electronic device generates at least one second frame by synthesizing a plurality of first frames, according to various embodiments of the present disclosure
  • the electronic device may acquire a plurality of first frames ( 301 , 302 , 303 , 304 , 305 , 306 , 307 , 308 , ...) at a first speed through a camera (eg, the camera module 180 of FIG. 1 or the camera 210 of FIG. 2 ).
  • the electronic device may generate at least one second frame ( 351 , 352 , 353 , 354 , 355 , 356 , 357 , 358 , ...) by synthesizing the obtained plurality of first frames ( 301 , 302 , 303 , 304 , 305 , 306 , 307 , 308 , ...) in the third mode.
  • the plurality of first frames ( 301 , 302 , 303 , 304 , 305 , 306 , 307 , 308 , ...) may include a 301 frame 301 , a 302 frame 302 , a 303 frame 303 , a 304 frame 304 , a 305 frame 305 , a 306 frame 306 , a 307 frame 307 , and a 308 frame 308 .
  • the plurality of first frames ( 301 , 302 , 303 , 304 , 305 , 306 , 307 , 308 , ...) and the at least one second frame ( 351 , 352 , 353 , 354 , 355 , 356 , 357 , 358 , ...) shown in FIG. 3C are only an example, and are not construed as being limited to those shown.
  • the plurality of first frames ( 301 , 302 , 303 , 304 , 305 , 306 , 307 , 308 , ...) shown in FIG. 3C may correspond to the plurality of first frames ( 301 , 302 , 303 , 304 , 305 , 306 , 307 , 308 , ...) shown in FIGS. 3A and 3B .
  • hereinafter, differences from those described with reference to FIGS. 3A and 3B will be mainly described.
  • the electronic device may acquire a plurality of first frames ( 301 , 302 , 303 , 304 , 305 , 306 , 307 , 308 , ...) at a first speed through a camera, and may generate at least one second frame ( 351 , 352 , 353 , 354 , 355 , 356 , 357 , 358 , ...) by synthesizing the acquired first frames in the third mode.
  • the electronic device may generate a hyper-lapse image that includes the generated at least one second frame ( 351 , 352 , 353 , 354 , 355 , 356 , 357 , 358 , ...) and is reproduced at a second speed.
  • the electronic device may cumulatively synthesize, according to the acquired order, the plurality of first frames that are synthesized to generate different second frames. For example, the electronic device may control the number of first frames to be synthesized to be different for each of the different second frames, and may generate at least one second frame by cumulatively synthesizing the different numbers of first frames in the order in which they are acquired.
  • the electronic device may generate at least one second frame by synthesizing a different number of first frames.
  • the electronic device may generate the 312 frame 312 by synthesizing the 301 frame 301 and the 302 frame 302 in an order obtained from among the plurality of first frames.
  • the electronic device may generate the 313 frame 313 by synthesizing the 301 frame 301 , the 302 frame 302 , and the 303 frame 303 in an order obtained from among the plurality of first frames.
  • the electronic device may generate the 313 frame 313 by synthesizing the previously generated 312 frames 312 and 303 frames 303 .
  • the electronic device generates the 314 frame 314 by synthesizing the 301 frame 301 , the 302 frame 302 , the 303 frame 303 , and the 304 frame 304 in the order obtained from among the plurality of first frames. can do.
  • the electronic device may generate the 314 frame 314 by synthesizing the previously generated 313 frame 313 and the 304 frame 304 .
  • At least one second frame 351 , 352 , 353 , 354 , 355 , 356 , 357 , 358 ... generated as a result of synthesizing a plurality of first frames is illustrated.
  • the electronic device may generate a hyper-lapse image in which the time at which each frame is reproduced corresponds to the third time T3 and which includes the 351 frame 351 , the 352 frame 352 , the 353 frame 353 , the 354 frame 354 , and the 355 frame 355 .
  • the 351 frame 351 may correspond to either the 311 frame 311 or the 301 frame 301 .
  • the electronic device may control the 301 frame 301 , acquired through the camera during the sum of the first time T1 and the second time T2 , to be reproduced in the hyper-lapse image for a third time T3 . For example, it is assumed that the electronic device acquires each of the plurality of first frames for 1 second (sec), and that the reproduction speed of the hyper-lapse image generated by the electronic device is 30 fps. In this case, the electronic device may control the 301 frame 301 acquired for 1 second (sec) to be reproduced for 1/30 second (sec) corresponding to the speed at which the hyper-lapse image is reproduced.
  • the 352 frame 352 may correspond to the 312 frame 312 .
  • the electronic device according to an embodiment may generate the 312 frame 312 by synthesizing the 301 frame 301 and the 302 frame 302 in the order obtained by the camera.
  • the electronic device according to an embodiment may generate the 312 frame 312 by synthesizing the previously generated 311 frame 311 and the 302 frame 302 .
  • for example, it is assumed that the electronic device acquires each of the plurality of first frames for 1 second (sec), and that the reproduction speed of the hyper-lapse image generated by the electronic device is 30 fps.
  • in this case, the electronic device may control the 301 frame 301 and the 302 frame 302 , acquired for 2 seconds (sec), to be played back for 1/30 second (sec) corresponding to the playback speed of the hyper-lapse image.
  • 353 frame 353 may correspond to 313 frame 313 .
  • the electronic device according to an embodiment may generate 313 frames 313 by synthesizing the 301 frames 301 , 302 frames 302 , and 303 frames 303 in the order obtained by the camera.
  • the electronic device according to an embodiment may generate 313 frame 313 by synthesizing previously generated 312 frames 312 and 303 frames 303 .
  • in this case, the electronic device according to an embodiment may control the 301 frame 301 , the 302 frame 302 , and the 303 frame 303 , acquired for 3 seconds (sec), to be played back for 1/30 second (sec) corresponding to the playback speed of the hyper-lapse image.
  • each of the at least one second frame included in the hyper-lapse image generated in the third mode may include a 301 frame 301 that is the first acquired first frame in common. Accordingly, the hyper-lapse image generated in the third mode may represent all the movement trajectories of the external object.
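The cumulative synthesis of the third mode can be sketched as follows, with scalar brightness values standing in for whole frames and equal-ratio averaging standing in for the blend; `max_frames` models a cap on the frames blended per second frame, such as the user-set maximum described in this section. All names are assumptions.

```python
def third_mode_frames(first_frames, max_frames=None):
    """Cumulative synthesis: the k-th second frame blends the first k
    first frames, so each output carries the whole trajectory so far
    (unless max_frames caps how many frames are kept)."""
    outputs, window = [], []
    for frame in first_frames:
        window.append(frame)
        if max_frames is not None and len(window) > max_frames:
            window.pop(0)  # drop the oldest frame once the cap is hit
        outputs.append(sum(window) / len(window))
    return outputs

cumulative = third_mode_frames([10, 20, 30])
# -> [10.0, 15.0, 20.0]: frame 1 alone, mean of 1-2, mean of 1-3
```

Without a cap, the earliest first frame is present in every output, matching the description that the 301 frame is included in common.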
  • the electronic device may receive the maximum number of first frames synthesized in one second frame from the user.
  • the electronic device may control the exposure time of one first frame to be short in order to prevent the hyper-lapse image from being saturated with excessive light.
  • the electronic device may generate one second frame by synthesizing a plurality of first frames acquired during a short exposure time.
  • the electronic device may change the exposure time of the first frame while capturing an external object in order to prevent the hyper-lapse image from being saturated with excessive light. For example, when the surrounding environment becomes bright while the electronic device is capturing an external object in a dark environment, the electronic device may change the exposure time of the first frame from a large value to a small value, and may change the first speed, which is the first frame acquisition speed, to correspond to the changed exposure time. According to an embodiment, even when the first speed is changed while capturing an external object, the electronic device may change the number of first frames to be synthesized in order to keep the second speed, which is the reproduction speed of the hyper-lapse image, constant.
  • for example, when the exposure time of one first frame is relatively long, the number of the plurality of first frames to be synthesized may be relatively small.
  • when the exposure time of one first frame is relatively short, the number of the plurality of first frames to be synthesized may be relatively large.
  • the electronic device according to an embodiment may control, according to the exposure time of the first frame or the photographing environment (eg, ambient illumination), the number of first frames synthesized to respectively generate two or more second frames included in the same hyper-lapse image.
  • the electronic device may generate a hyper-lapse image by combining two or more modes among the first mode, the second mode, and the third mode.
  • the electronic device may generate a plurality of second frames by synthesizing a plurality of first frames obtained by photographing an external object through a camera in a first mode.
  • the electronic device may generate at least one third frame by synthesizing the plurality of second frames in the first mode.
  • the electronic device may generate a hyper-lapse image including at least one third frame.
  • the electronic device may generate a plurality of second frames by synthesizing a plurality of first frames obtained by photographing an external object through a camera in a first mode.
  • the electronic device may generate at least one third frame by synthesizing the plurality of second frames in the second mode, and may generate a hyper-lapse image including the generated at least one third frame.
  • the electronic device may generate a plurality of second frames by synthesizing a plurality of first frames obtained by photographing an external object through a camera in a first mode.
  • the electronic device may generate at least one third frame by synthesizing the plurality of second frames in the third mode, and may generate a hyper-lapse image including the generated at least one third frame.
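Combining modes, as described above, chains one synthesis pass into another. The sketch below uses scalar stand-ins for frames and first-mode (non-overlapping) grouping at both stages; all names and the equal-ratio blend are assumptions.

```python
def group(frames, n):
    """Non-overlapping first-mode grouping in acquisition order."""
    return [frames[i:i + n] for i in range(0, len(frames) - n + 1, n)]

def blend(frames):
    """Equal-ratio blend, standing in for the synthesis step."""
    return sum(frames) / len(frames)

first_frames = [float(v) for v in range(1, 13)]              # 12 first frames
second_frames = [blend(g) for g in group(first_frames, 2)]   # 6 second frames
third_frames = [blend(g) for g in group(second_frames, 3)]   # 2 third frames
# the hyper-lapse image would then include the third frames
```

Replacing the second `group` call with a sliding window or a cumulative pass would model the first+second and first+third mode combinations instead.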
  • FIG. 4 is a diagram for explaining a method of synthesizing at least two first frames in an electronic device in a first mode, a second mode, or a third mode, according to various embodiments of the present disclosure
  • first frames 401 and 402 sequentially acquired by the electronic device through the camera are shown in a state where the position of the camera is fixed.
  • the first frames 401 and 402 may include an A frame 401 and a B frame 402 .
  • the A frame 401 and the B frame 402 may be the 301 frame 301 and the 302 frame 302 described with reference to FIGS. 3A to 3C , respectively.
  • the frame A 401 of FIG. 4 may be a frame acquired by the electronic device at a first time through a camera according to an embodiment.
  • the B frame 402 of FIG. 4 may be a frame acquired by the electronic device after the second time (eg, the second time T2 in FIGS. 3A to 3C) has elapsed from the first time point (eg, the first time T1 in FIGS. 3A to 3C).
  • the electronic device may acquire the first frames 401 and 402, photographed with the camera at the same position, in which the position of the external object has changed.
  • a frame acquired through a camera may include at least one pixel.
  • Each pixel may have a pixel value indicating the color or contrast of the pixel.
  • the pixel value may include an RGB value indicating at least one of a unique brightness or color of each pixel.
  • pixel values for pixels included in a frame expressed in color may be expressed as a combination of an R (red) value ranging from 0 to 255, a G (green) value ranging from 0 to 255, and a B (blue) value ranging from 0 to 255.
  • the first point 410 of the A frame 401 may include four pixels.
  • the first point 410 may include a first pixel, a second pixel, a third pixel, and a fourth pixel.
  • the first pixel may have pixel values corresponding to R(159), G(184), and B(244).
  • the second pixel may have pixel values corresponding to R(161), G(186), and B(246).
  • the third pixel may have pixel values corresponding to R(175), G(205), and B(251).
  • the fourth pixel may have pixel values corresponding to R(183), G(211), and B(255).
  • the first point 410 of the B frame 402 may be a point corresponding to the same position as the first point 410 of the A frame 401 .
  • since each frame has first coordinates (eg, coordinates corresponding to the x-axis) and second coordinates (eg, coordinates corresponding to the y-axis perpendicular to the x-axis), the first point 410 of the A frame 401 and the first point 410 of the B frame 402 may have the same coordinates.
  • the first point 410 of the B frame 402 may include a first pixel, a second pixel, a third pixel, and a fourth pixel.
  • Each pixel value may be different from the pixel value of the A frame 401 because the position of the external object changed between the first time T1 and the second time T2.
  • the first pixel may have pixel values corresponding to R(100), G(175), and B(169).
  • the second pixel may have pixel values corresponding to R(120), G(50), and B(84).
  • the third pixel may have pixel values corresponding to R(58), G(121), and B(76).
  • the fourth pixel may have pixel values corresponding to R(25), G(175), and B(152).
  • the electronic device may sequentially acquire the first frames 401 and 402, and may generate an average frame and a maximum frame using the acquired first frames 401 and 402.
  • the electronic device may obtain an average value by averaging pixel values corresponding to pixels at the same location.
  • the electronic device may generate an average frame 403 including a plurality of pixels having the acquired average value as a pixel value.
  • the average frame 403 may include pixels having an average pixel value as a pixel value.
  • the average pixel value may correspond to a value obtained by averaging the pixel values of pixels (eg, the first pixel) located at the same point (eg, the first point 410) of the A frame 401 and the B frame 402.
  • the first pixel included in the first point 410 of the average frame 403 may have pixel values corresponding to R(129 or 130), G(179 or 180), and B(206 or 207) as a result of averaging the first pixel of the A frame 401 and the first pixel of the B frame 402. The second pixel, the third pixel, and the fourth pixel may also have corresponding pixel values in a similar manner.
  • the electronic device may compare pixel values corresponding to pixels at the same position, and generate a maximum frame 404 including a plurality of pixels each having the largest of the compared values.
  • the maximum frame 404 may be composed of pixels having the maximum pixel value as the pixel value.
  • the maximum pixel value may be the greater of pixel values of pixels (eg, first pixel) located at the same point (eg, first point 410) of frame A 401 and frame B 402 .
  • the first pixel included in the first point 410 of the maximum frame 404 may have pixel values corresponding to R(159), G(184), and B(244) as a result of taking the maximum of the first pixel of the A frame 401 and the first pixel of the B frame 402.
  • the second pixel, the third pixel, and the fourth pixel may have corresponding pixel values in a similar manner.
  • the electronic device may obtain an average value by calculating pixel values corresponding to pixels at the same position in the three first frames, and may generate an average frame having the average value as a pixel value. Also, the electronic device may compare pixel values corresponding to pixels at the same position in the three first frames, and generate a maximum frame having a largest value among them as a pixel value.
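The per-pixel average and maximum computations described above can be sketched as follows (a minimal illustration, assuming the first frames are held as NumPy RGB arrays; the function names are hypothetical, not from the disclosure):

```python
import numpy as np

def average_frame(frames):
    # Per-pixel, per-channel mean across the first frames.
    return np.mean(np.stack(frames).astype(np.float64), axis=0)

def max_frame(frames):
    # Per-pixel, per-channel maximum across the first frames.
    return np.max(np.stack(frames), axis=0)

# 1x1 "frames" holding the first-pixel RGB values from FIG. 4:
a = np.array([[[159, 184, 244]]], dtype=np.uint8)  # A frame 401, first pixel
b = np.array([[[100, 175, 169]]], dtype=np.uint8)  # B frame 402, first pixel

avg = average_frame([a, b])  # [[[129.5, 179.5, 206.5]]]
mx = max_frame([a, b])       # [[[159, 184, 244]]]
```

The fractional averages (eg, 129.5) correspond to the "R(129 or 130)" rounding choice mentioned above for the average frame 403.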
  • the electronic device may blend (or synthesize) the average frame and the maximum frame according to a predetermined weight.
  • in the average frame, the trajectory of the bright light may not be expressed relatively well.
  • in the maximum frame, the trajectory of the bright light may be well expressed, but the trajectory of the dark external object may not be well expressed. Since the electronic device according to an embodiment generates a hyper-lapse image by using the result of blending the average frame and the maximum frame according to a predetermined weight, it is possible to effectively express both the trajectory of a bright light and the trajectory of a dark external object.
  • a weight according to an embodiment will be exemplarily described with reference to FIG. 5 to be described later.
  • the pixel values described with reference to FIG. 4 are only examples for convenience of description, and various embodiments of the present document are not limited to the embodiments according to the aforementioned pixel values.
  • FIG. 5 is a diagram exemplarily showing weights indicating a blending ratio of an average frame and a maximum frame.
  • the electronic device may use the pixel value of the maximum frame described with reference to FIG. 4 to determine, based on any one of the diagrams (501, 502, 503, 511, or 512) illustrated in FIG. 5, a weight corresponding to the ratio at which the maximum frame is blended with the average frame.
  • the first graph 510 of FIG. 5 is a graph of S-shaped sigmoid functions representing a first diagram 501, a second diagram 502, and a third diagram 503 for determining weights.
  • the x-axis represents a pixel value
  • the y-axis represents a weight.
  • the first diagram 501 may be a graph according to Equation 2 below.
  • the second diagram 502 may be a graph according to Equation 3 below.
  • the third diagram 503 may be a graph according to Equation 4 below.
  • Variable a in Equations 2 to 4 may correspond to the maximum value described with reference to FIG. 4 .
  • the variable a may correspond to a maximum value obtained by comparing pixel values corresponding to pixels at the same position in the plurality of first frames by the electronic device according to an embodiment.
  • the second graph 520 and the third graph 530 of FIG. 5 are graphs illustrating a fourth diagram 511 and a fifth diagram 512 for determining a weight.
  • an x-axis indicates a pixel value
  • a y-axis indicates a weight.
  • the fourth diagram 511 may be a graph according to Equation 5 below.
  • when the weight calculated by Equation 5 exceeds 1, the weight may be determined to be 1. In another embodiment, when the weight calculated by Equation 5 is less than 0, the weight may be determined to be 0.
  • b in Equation 5 may correspond to a pixel value.
  • the fifth diagram 512 may be a graph according to Equation 6 below.
  • when the weight calculated by Equation 6 exceeds 1, the weight may be determined to be 1. In another embodiment, when the weight calculated by Equation 6 is less than 0, the weight may be determined to be 0.
  • b in Equation 6 may correspond to a pixel value.
  • the weight may be determined according to various equations having an S-shaped (sigmoid) characteristic.
  • the electronic device may obtain the pixel value C of the second frame as a result of blending and synthesizing the average frame and the maximum frame according to Equation 7 below using the above-described weight.
  • the pixel value C of the second frame may be determined by Equation 7 from the weight (weight(r,c)) calculated by Equations 2 to 6 described above, the pixel value of the maximum frame (max(r,c)), and the pixel value of the average frame (average(r,c)).
  • the pixel value max(r,c) of the maximum frame may correspond to a pixel value corresponding to a pixel at a position determined by the row value (r) and the column value (c) among the pixels of the maximum frame.
  • the pixel value of the average frame (average(r,c)) may correspond to a pixel value corresponding to a pixel at a position determined by the row value (r) and the column value (c) among the pixels of the average frame.
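The blend of Equation 7, driven by an S-shaped weight, can be sketched as below. The exact constants of Equations 2 to 6 are shown only as figures in the publication, so the sigmoid midpoint and steepness used here are illustrative assumptions, not the patented values:

```python
import math

def sigmoid_weight(max_pixel, midpoint=128.0, steepness=0.05):
    # S-shaped mapping from the maximum-frame pixel value to a weight in
    # [0, 1]; assumed constants, clamped as the text describes for
    # Equations 5 and 6.
    w = 1.0 / (1.0 + math.exp(-steepness * (max_pixel - midpoint)))
    return min(1.0, max(0.0, w))

def blend(average_px, max_px, weight):
    # Equation 7: C(r,c) = weight(r,c)*max(r,c) + (1 - weight(r,c))*average(r,c)
    return weight * max_px + (1.0 - weight) * average_px
```

With a weight of 1 the result is the maximum frame's pixel (preserving bright light trails); with a weight of 0 it is the average frame's pixel (preserving dark moving objects).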
  • the electronic device (eg, the electronic device 101 of FIG. 1 or the electronic device 200 of FIG. 2) according to an embodiment of the present disclosure includes a camera (eg, the camera module 180 of FIG. 1 or the camera 210 of FIG. 2), a memory (eg, the memory 130 of FIG. 1 or the memory 220 of FIG. 2), and a processor (eg, the processor 120 of FIG. 1 or the processor 230 of FIG. 2) operatively coupled with the camera and the memory.
  • the memory may store instructions that, when executed, cause the processor to acquire a plurality of successive first frames by photographing an external object at a first speed through the camera, generate at least one second frame by synthesizing at least some of the plurality of first frames in units of a predetermined number in the order in which they are acquired, and generate a hyper-lapse video in which the at least one second frame representing the movement of the external object is reproduced at a second speed different from the first speed.
  • the instructions may cause the processor to calculate pixel values corresponding to at least one pixel included in each of the predetermined number of frames among the plurality of first frames to generate an average frame including at least one pixel having an average pixel value and a maximum frame including at least one pixel having a maximum pixel value, and to generate the second frame by synthesizing the average frame and the maximum frame.
  • a pixel value corresponding to at least one pixel included in each of the predetermined number of frames may include an RGB value indicating at least one of a brightness level or a color.
  • the instructions may cause the processor to generate the average frame including at least one pixel having, as the average pixel value, an RGB value representing an average of RGB values corresponding to at least one pixel included in each of the predetermined number of frames.
  • the instructions may cause the processor to generate the maximum frame including at least one pixel having, as the maximum pixel value, an RGB value representing a maximum among RGB values corresponding to at least one pixel included in each of the predetermined number of frames.
  • the instructions may cause the processor to determine a weight for the average frame and the maximum frame based on a pixel value corresponding to the at least one pixel, and to synthesize the average frame and the maximum frame according to the weight.
  • the first speed may be lower than the second speed.
  • the instructions may cause the processor to vary the first speed based on a brightness level of an external environment.
  • the electronic device (eg, the electronic device 101 of FIG. 1 or the electronic device 200 of FIG. 2) according to an embodiment of the present disclosure includes a camera (eg, the camera module 180 of FIG. 1 or the camera 210 of FIG. 2), a memory (eg, the memory 130 of FIG. 1 or the memory 220 of FIG. 2), and a processor (eg, the processor 120 of FIG. 1 or the processor 230 of FIG. 2) operatively coupled with the camera and the memory.
  • the memory may store instructions that, when executed, cause the processor to acquire a plurality of successive first frames by photographing an external object at a first speed through the camera, generate a plurality of second frames by synthesizing the plurality of first frames in units of a predetermined number such that at least one frame is common, and generate a hyper-lapse image in which the plurality of second frames representing the movement of the external object are reproduced at a second speed different from the first speed.
  • the instructions may cause the processor to calculate pixel values corresponding to at least one pixel included in each of the predetermined number of frames among the plurality of first frames to generate an average frame including at least one pixel having an average pixel value and a maximum frame including at least one pixel having a maximum pixel value, and to generate the plurality of second frames by synthesizing the average frame and the maximum frame.
  • a pixel value corresponding to at least one pixel included in each of the predetermined number of frames may include an RGB value indicating a brightness level.
  • the instructions may cause the processor to obtain an average value of pixel values corresponding to at least one pixel included in each of the predetermined number of frames and generate the average frame having the average value as the pixel value corresponding to the at least one pixel, and to obtain a maximum value of the pixel values and generate the maximum frame having the maximum value as the pixel value corresponding to the at least one pixel.
  • the instructions may cause the processor to determine weights for the average frame and the maximum frame, and combine the average frame and the maximum frame according to the weights.
  • the first speed may be lower than the second speed.
  • the instructions may cause the processor to vary the first speed based on a brightness level of an external environment.
  • FIG. 6 is a flowchart illustrating an operation in which an electronic device generates a hyper-lapse image in a first mode, according to various embodiments of the present disclosure.
  • the electronic device (eg, the electronic device 101 of FIG. 1 or the electronic device 200 of FIG. 2) may acquire a plurality of successive first frames by photographing an external object at a first speed through the camera (eg, the camera module 180 of FIG. 1 or the camera 210 of FIG. 2).
  • the electronic device may acquire a plurality of first frames by photographing an external object through the camera at a rate of one frame per second.
  • the electronic device may generate at least one second frame by synthesizing at least some of the plurality of first frames acquired in operation 610 in units of a predetermined number in the order in which they were acquired.
  • the electronic device may generate at least one second frame by synthesizing the plurality of first frames in the order in which they are acquired.
  • the electronic device may generate at least one second frame by synthesizing a plurality of non-overlapping first frames in units of three.
  • the electronic device according to an embodiment may generate two second frames by synthesizing six first frames acquired over 6 seconds in units of three.
  • the electronic device may generate a hyper-lapse image in which at least one second frame representing the movement of an external object is reproduced at a second speed different from the first speed.
  • the electronic device may generate a hyper-lapse image including at least one second frame generated in operation 620 .
  • the electronic device may control the hyper-lapse image to be reproduced at a second speed different from the first speed (eg, 1 frame per second) at which the plurality of first frames are acquired in operation 610 .
  • the electronic device may control the hyper-lapse image to be reproduced at a rate of 30 frames per second.
  • the electronic device may generate a hyper-lapse image including two second frames, and the generated hyper-lapse image may be reproduced at a rate of 30 frames per second. Accordingly, the playback time of the hyper-lapse image including the two second frames may correspond to 1/15 second.
  • the hyper-lapse image generated by the electronic device according to an embodiment may indicate a change (eg, a change in position) of an external object for 6 seconds for 1/15 second.
  • the hyper-lapse image of the first mode generated by the electronic device may provide a user with a sense of high speed.
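The first-mode grouping and timing described above can be checked with a short sketch (non-overlapping groups of N first frames; the function name is illustrative):

```python
def first_mode_groups(frames, n=3):
    # Split the first frames into non-overlapping groups of n, in
    # acquisition order; each group is synthesized into one second frame.
    return [frames[i:i + n] for i in range(0, len(frames) - n + 1, n)]

first_frames = list(range(6))             # six frames captured over 6 s at 1 fps
groups = first_mode_groups(first_frames)  # [[0, 1, 2], [3, 4, 5]] -> two second frames
playback_seconds = len(groups) / 30.0     # reproduced at 30 fps -> 1/15 s
```

Six seconds of capture thus collapse into a 1/15-second playback, which is the source of the first mode's sense of high speed.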
  • FIG. 7 is a flowchart illustrating an operation in which an electronic device generates a hyper-lapse image in a second mode, according to various embodiments of the present disclosure.
  • the electronic device (eg, the electronic device 101 of FIG. 1 or the electronic device 200 of FIG. 2) may acquire a plurality of successive first frames by photographing an external object at a first speed through the camera (eg, the camera module 180 of FIG. 1 or the camera 210 of FIG. 2).
  • the electronic device may acquire a plurality of first frames by photographing an external object through the camera at a rate of one frame per second.
  • the electronic device may generate at least two second frames by synthesizing the plurality of first frames acquired in operation 710 in units of a predetermined number such that at least one frame is common.
  • the electronic device may select three of the plurality of first frames according to an acquired order, and may generate at least one second frame by synthesizing the three selected first frames.
  • the electronic device may generate the following second frame by synthesizing at least one frame from among the three selected first frames (eg, the most recently acquired first frame among the three selected first frames) with two first frames acquired in the following order.
  • the electronic device may generate a hyper-lapse image in which at least one second frame representing the movement of an external object is reproduced at a second speed different from the first speed.
  • the electronic device may generate a hyper-lapse image including at least one second frame generated in operation 720 .
  • the electronic device may generate a hyper-lapse image to be reproduced at a second speed different from the first speed (eg, 1 frame per second) at which the plurality of first frames are acquired in operation 710 .
  • the electronic device may generate a hyper-lapse image to be reproduced at a rate of 30 frames per second.
  • when the hyper-lapse image generated by the electronic device in the second mode includes a vehicle moving with its light turned on, the user may effectively recognize the movement trajectory of the vehicle displayed in the hyper-lapse image.
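The second-mode grouping, in which the most recently acquired frame of one group is reused as the first frame of the next, might be sketched as follows (illustrative names; a sketch under the assumption of one shared frame per adjacent pair of groups):

```python
def second_mode_groups(frames, n=3):
    # Windows of n first frames in which the last (most recently acquired)
    # frame of each window is also the first frame of the next window.
    step = n - 1
    return [frames[i:i + n] for i in range(0, len(frames) - n + 1, step)]

groups = second_mode_groups(list(range(5)))  # [[0, 1, 2], [2, 3, 4]]
# frame 2 is common to both windows, so the motion trail stays continuous
```

Because adjacent windows share a frame, successive second frames overlap in content, which is what keeps the vehicle's light trail unbroken between them.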
  • FIG. 8 is a flowchart illustrating an operation in which an electronic device generates a hyper-lapse image in a third mode, according to various embodiments of the present disclosure.
  • the electronic device (eg, the electronic device 101 of FIG. 1 or the electronic device 200 of FIG. 2) may acquire a plurality of successive first frames by photographing an external object at a first speed through the camera (eg, the camera module 180 of FIG. 1 or the camera 210 of FIG. 2).
  • the electronic device may acquire a plurality of first frames by photographing an external object through the camera at a rate of one frame per second.
  • the electronic device may sequentially acquire a plurality of first frames including frames 1-1, 1-2, 1-3, and 1-4 by photographing an external object.
  • the electronic device may generate at least one second frame including the first frame among the plurality of first frames acquired in operation 810 and frames cumulatively synthesized in the order in which the plurality of first frames were acquired.
  • the electronic device may generate a plurality of second frames including the 2-1 frame, the 2-2 frame, and the 2-3 frame by cumulatively synthesizing the plurality of first frames in different numbers.
  • the electronic device may generate a 2-1 frame by synthesizing a 1-1 frame and a 1-2 frame.
  • the electronic device may generate a 2-2 frame by synthesizing the 1-1 frame, the 1-2 frame, and the 1-3 frame.
  • the electronic device may generate a 2-3 frame by synthesizing a 1-1 frame, a 1-2 frame, a 1-3 frame, and a 1-4 frame.
  • the hyper-lapse image including the at least one second frame can effectively express the trajectory of the moving external object.
  • the electronic device may generate a hyper-lapse image in which at least one second frame representing the movement of an external object is reproduced at a second speed different from the first speed.
  • the electronic device may generate a hyper-lapse image that includes the 1-1 frame, the 2-1 frame, the 2-2 frame, and the 2-3 frame and is reproduced at a second speed different from the first speed of operation 810.
  • the first speed may include a rate of 1 frame per second, and the second speed may include a rate of 30 frames per second.
  • when the hyper-lapse image generated by the electronic device in the third mode includes a vehicle moving with its light turned on, the user may effectively recognize the movement trajectory of the vehicle displayed in the hyper-lapse image.
  • the hyper-lapse image generated by the electronic device in the third mode may effectively represent an irregular movement trajectory.
  • a hyper-lapse image generated in the third mode in which a plurality of first frames are cumulatively synthesized can effectively represent the trajectory of an external object that moves irregularly with a light source (eg, vehicle light) included.
  • the hyper-lapse image generated in the third mode may effectively represent the trajectory of an external object having a small degree of movement.
  • FIG. 9 is a flowchart illustrating an operation in which an electronic device generates a hyper-lapse image based on mode information, according to various embodiments of the present disclosure;
  • the operations illustrated in FIG. 9 may include at least some of the operations of FIGS. 6 to 8 .
  • the electronic device receives a camera (eg, the camera module 180 of FIG. 1 or the electronic device 200 of FIG. 2 ).
  • the maximum exposure time of the camera 210) may be set.
  • the maximum exposure time may be set to a value included in the interval of 0.2 seconds to 1.5 seconds, but is not limited to the value of the interval.
  • the electronic device may receive step information from the user.
  • the electronic device may display a UI for receiving step information from the user through the display.
  • the step information may correspond to the number of first frames synthesized to generate one second frame or a speed at which a completed hyper-lapse image is reproduced.
  • operation 902 is an optional operation, and when operation 902 is omitted, the electronic device may designate a preset value as a value corresponding to the step information.
  • the electronic device may receive mode information from the user.
  • the electronic device may display a UI for receiving mode information from the user through the display.
  • the mode information may include at least one of first mode information, second mode information, and third mode information.
  • the electronic device may determine a first speed corresponding to the number of first frames to be acquired per unit second.
  • the electronic device may determine the number of first frames to be acquired per unit second based on at least one of the step information received in operation 902 or the brightness of the surrounding environment within a range not exceeding the maximum exposure time. For example, in a place where the surrounding environment is bright, the electronic device according to an embodiment may set the exposure time of the camera to be relatively short, and as a result, the number of first frames to be acquired per unit second may have a large value. .
  • the electronic device in a dark place, may set the exposure time of the camera to be relatively long, and as a result, the number of first frames to be acquired per second may have a small value. .
  • the electronic device according to an embodiment may vary the number of first frames to be acquired per second while photographing an external object by changing the exposure time of the camera when the illuminance of the surrounding environment changes.
  • the electronic device may receive a record start input from the user. After receiving a record start input from the user, the electronic device may acquire a plurality of first frames by photographing an external object through a camera.
  • the electronic device may determine mode information received from the user. For example, when the mode information includes the first mode information, the electronic device may perform operation 907 . For example, when the mode information includes the second mode information, the electronic device may perform operation 908 . For example, when the mode information includes the third mode information, the electronic device may perform operation 909 .
  • the electronic device may generate at least one second frame by synthesizing the first frames according to the first mode. For example, the electronic device determines the number (N) of the first frames to be synthesized based on any one of the step information received in operation 902 or the step information corresponding to a preset value and an exposure time according to a shooting environment, , by synthesizing the plurality of first frames to generate at least one second frame. Details of operation 907 will be described with reference to FIG. 10 to be described later.
  • the electronic device may determine whether a record end input is received from the user. For example, if the electronic device does not receive the record end input, the electronic device may re-perform operation 907 . For example, upon receiving the record end input, the electronic device may perform operation 910 .
  • the electronic device may generate at least one second frame by synthesizing the first frames according to the second mode. For example, the electronic device determines the number (N) of the first frames to be synthesized based on any one of the step information received in operation 902 or the step information corresponding to a preset value and an exposure time according to a shooting environment, , by synthesizing the plurality of first frames to generate at least one second frame.
  • N the number of the first frames to be synthesized based on any one of the step information received in operation 902 or the step information corresponding to a preset value and an exposure time according to a shooting environment.
  • the electronic device may determine whether a record end input is received from the user. For example, if the electronic device does not receive the record end input, the electronic device may perform operation 908 again. For example, upon receiving the record end input, the electronic device may perform operation 910 .
  • the electronic device may generate at least one second frame by synthesizing the first frames according to the third mode. Details of operation 909 will be described with reference to FIG. 12 to be described later.
  • the electronic device may determine whether a record end input is received from the user. For example, if the electronic device does not receive the record end input, the electronic device may re-perform operation 909 . For example, upon receiving a record end input, the electronic device may perform operation 910 .
  • the electronic device may use the generated at least one second frame to generate a hyper-lapse image including at least one second frame and reproduced at a second speed.
  • the second speed may have a different value from the first speed.
  • the second rate may include a rate of 30 frames per second.
  • the electronic device may generate a hyper-lapse image including a second frame including one first frame.
  • the hyper-lapse image generated in the first mode may represent an external object moving faster than the original speed, a user who recognizes the hyper-lapse image generated in the first mode may feel a sense of high speed.
  • the hyper-lapse image generated in the second mode may provide the user with a slower sense of speed than the first mode, but may effectively represent the trajectory of the external object.
  • the hyper-lapse image generated in the second mode among a plurality of first frames synthesized to respectively generate successive second frames, at least one first frame overlaps, so the trajectory of the moving external object can be expressed effectively.
  • the hyper-lapse image generated in the third mode may provide the user with a slower sense of speed than the first mode, but may most effectively represent the trajectory of the external object among the first to third modes.
  • operation 907 of FIG. 9 may include at least some of the operations of FIG. 10 .
  • the electronic device receives a camera (eg, the camera module 180 of FIG. 1 or the electronic device 200 of FIG. 2 ).
  • the first frame may be acquired at a first speed through the camera 210 .
  • the first speed may correspond to the number of first frames acquired by the camera of the electronic device per unit second.
  • the number of first frames acquired per second may be a positive integer of 1 or more.
  • the electronic device may store the acquired first frame in a memory (eg, the memory 130 of FIG. 1 or the memory 220 of FIG. 2 ).
  • a memory eg, the memory 130 of FIG. 1 or the memory 220 of FIG. 2 .
  • the electronic device may determine whether the number of stored first frames is equal to the number N of first frames used to generate one second frame. When the acquired number of first frames is not the same as the number (N) of first frames used to generate one second frame, the electronic device may perform operation 1010 again. When the obtained number of first frames is equal to the number (N) of first frames used to generate one second frame, the electronic device may perform operation 1040 .
  • the electronic device may generate a maximum frame and an average frame for each of the N first frames stored in the memory and store the generated frames in the memory.
  • the electronic device may determine a weight for the maximum frame and the average frame.
  • the weight may correspond to a ratio for blending the maximum frame and the average frame.
  • the electronic device may synthesize the N first frames according to the determined weight to generate at least one second frame and store it in the memory. For example, the electronic device may blend the maximum frame and the average frame with respect to the first frame according to the determined weight, and may generate one second frame by synthesizing the blended result.
  • the electronic device may delete the N first frames stored in the memory. For example, in order to store N first frames to be acquired in the following order, the electronic device may delete an area of a memory in which first frames synthesized in an already generated second frame are stored. For example, the electronic device may delete all information about the first frames stored in the memory.
  • operation 908 of FIG. 9 may include at least some of the operations of FIG. 11 .
  • the operations shown in FIG. 11 are the same as or substantially the same as those shown in FIG. 10 except for operation 1170; overlapping descriptions will therefore be omitted, and operation 1170 is described below.
  • the electronic device (eg, the electronic device 101 of FIG. 1 or the electronic device 200 of FIG. 2 ) may delete at least a portion of the N first frames stored in a memory (eg, the memory 130 of FIG. 1 or the memory 220 of FIG. 2 ). For example, in order to store the first frames to be acquired next, the electronic device may delete at least a part of the memory area in which the first frames already synthesized into a generated second frame are stored. According to an embodiment, the electronic device may generate different second frames by selecting at least one first frame to be shared between syntheses and synthesizing the plurality of first frames.
  • the electronic device does not delete all of the first frames already stored in the memory, but deletes a partial region of the memory in which at least one first frame is stored, and may then store the next acquired first frame in the deleted region of the memory.
  • at least one first frame that is left undeleted may be synthesized in common to generate different second frames.
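The partial deletion described above amounts to grouping the first frames into overlapping windows. A minimal sketch (the function name and parameters are illustrative):

```python
def overlapping_windows(first_frames, n, overlap):
    """Group first frames into windows of n frames in acquisition order,
    where consecutive windows share `overlap` frames (0 <= overlap < n).
    This mimics deleting only part of the buffer after each second frame,
    so the retained frames are synthesized into the next second frame too."""
    step = n - overlap
    return [first_frames[i:i + n]
            for i in range(0, len(first_frames) - n + 1, step)]
```

For example, with n = 4 and overlap = 2, frames 3 and 4 of one window are reused as the first two frames of the next window.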
  • operation 909 of FIG. 9 may include at least some of the operations of FIG. 12 .
  • the electronic device may generate at least one second frame by cumulatively synthesizing the acquired first frames in units of different numbers of frames according to the weight determined in operation 1250.
  • the electronic device may cumulatively synthesize the first frames in units of different numbers in the order in which they are acquired.
  • it is assumed that the number (N) of first frames used to generate one second frame is 4, and that the electronic device sequentially acquires the 1-1st frame, the 1-2nd frame, the 1-3rd frame, and the 1-4th frame.
  • the electronic device may synthesize the 1-1st frame and the 1-2nd frame according to the weights, in the acquired order. Next, the electronic device may synthesize the 1-1st, 1-2nd, and 1-3rd frames. Next, the electronic device may synthesize the 1-1st, 1-2nd, 1-3rd, and 1-4th frames.
  • the electronic device according to an embodiment may generate at least one second frame including frames generated as a result of synthesizing different numbers of first frames. According to an embodiment, since the electronic device generates the at least one second frame by cumulatively synthesizing a plurality of first frames, the generated at least one second frame may effectively represent the trajectory of a moving external object. For example, a hyper-lapse image generated by the electronic device in the third mode may effectively represent an irregular movement trajectory.
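The cumulative synthesis above can be sketched as follows. As an assumption for illustration, a per-pixel maximum is used as the synthesis step so that the bright trace of a moving object accumulates; the disclosure blends maximum and average frames according to a determined weight.

```python
import numpy as np

def cumulative_second_frames(first_frames):
    """For first frames f1..fN (uint8 arrays), emit one frame per step by
    cumulatively synthesizing f1..f2, then f1..f3, ..., then f1..fN."""
    out = []
    acc = first_frames[0].astype(np.float32)
    for frame in first_frames[1:]:
        acc = np.maximum(acc, frame.astype(np.float32))  # accumulate trajectory
        out.append(acc.astype(np.uint8))
    return out
```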
  • a method of operating an electronic device may include: acquiring a plurality of successive first frames by photographing an external object at a first speed through a camera (eg, the camera module 180 of FIG. 1 or the camera of FIG. 2 ); generating at least one second frame by synthesizing at least some of the plurality of first frames in units of a predetermined number in the order of acquisition; and generating a hyper-lapse image in which the at least one second frame representing the movement of the external object is reproduced at a second speed different from the first speed.
  • the generating of the at least one second frame may include: obtaining an average pixel value representing an average of pixel values corresponding to at least one pixel included in each of the predetermined number of frames among the plurality of first frames; generating an average frame including at least one pixel having the average pixel value; obtaining a maximum pixel value representing a maximum value among the pixel values corresponding to the at least one pixel; and generating a maximum frame including at least one pixel having the maximum pixel value.
  • the generating of the at least one second frame may include generating the at least one second frame by synthesizing the average frame and the maximum frame according to a predetermined weight. .
  • the first speed may be lower than the second speed.
  • the operation of acquiring the plurality of first frames may include the operation of varying the first speed based on a brightness level of an external environment.
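Varying the first speed based on the brightness level could be sketched as below. The thresholds and frame rates are purely illustrative assumptions (the disclosure does not specify values); a darker scene uses a slower capture rate, which permits a longer per-frame exposure.

```python
def choose_first_speed(brightness_level, dark_fps=5, bright_fps=30, threshold=0.5):
    """Pick the first speed (capture frames per second) from a normalized
    brightness level in [0, 1]: slower capture in dark environments,
    faster capture in bright environments."""
    return dark_fps if brightness_level < threshold else bright_fps
```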
  • the electronic device may have various types of devices.
  • the electronic device may include, for example, a portable communication device (eg, a smart phone), a computer device, a portable multimedia device, a portable medical device, a camera, a wearable device, or a home appliance device.
  • terms such as "first" and "second" may be used simply to distinguish one element from another, and do not limit the elements in other aspects (eg, importance or order). When one (eg, first) component is referred to as being "coupled" or "connected" to another (eg, second) component, with or without the terms "functionally" or "communicatively", it means that the one component can be connected to the other component directly (eg, by wire), wirelessly, or through a third component.
  • the term "module" used in various embodiments of this document may include a unit implemented in hardware, software, or firmware, and may be used interchangeably with terms such as logic, logic block, component, or circuit.
  • a module may be an integrally formed part, or a minimum unit that performs one or more functions, or a part thereof.
  • the module may be implemented in the form of an application-specific integrated circuit (ASIC).
  • various embodiments of this document may be implemented as software (eg, the program 140) including one or more instructions stored in a storage medium readable by a device (eg, the electronic device 101). For example, a processor (eg, the processor 120) of the device may invoke at least one of the one or more instructions stored in the storage medium and execute it.
  • the one or more instructions may include code generated by a compiler or code executable by an interpreter.
  • the device-readable storage medium may be provided in the form of a non-transitory storage medium.
  • 'non-transitory' only means that the storage medium is a tangible device and does not include a signal (eg, an electromagnetic wave); this term does not distinguish between a case where data is semi-permanently stored in the storage medium and a case where it is temporarily stored.
  • the method according to various embodiments disclosed in this document may be provided as included in a computer program product.
  • Computer program products may be traded between sellers and buyers as commodities.
  • the computer program product may be distributed in the form of a machine-readable storage medium (eg, compact disc read only memory (CD-ROM)), or may be distributed (eg, downloaded or uploaded) online via an application store (eg, Play Store™) or directly between two user devices (eg, smartphones).
  • a part of the computer program product may be temporarily stored or temporarily created in a machine-readable storage medium such as a memory of a server of a manufacturer, a server of an application store, or a relay server.
  • each of the above-described components (eg, a module or a program) may include a singular entity or a plurality of entities, and some of the plurality of entities may be separately disposed in other components.
  • one or more components or operations among the above-described corresponding components may be omitted, or one or more other components or operations may be added.
  • a plurality of components (eg, modules or programs) may be integrated into one component. In this case, the integrated component may perform one or more functions of each of the plurality of components identically or similarly to those performed by the corresponding component prior to the integration.
  • operations performed by a module, a program, or another component may be executed sequentially, in parallel, repeatedly, or heuristically; one or more of the operations may be executed in a different order or omitted, or one or more other operations may be added.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Studio Devices (AREA)

Abstract

An embodiment of the present invention relates to an electronic device that may comprise a camera, a memory, and a processor operatively connected to the camera and the memory. According to the embodiment, the memory may store instructions which, when executed, cause the processor to: photograph an external object at a first speed through the camera so as to acquire a plurality of consecutive first frames; generate at least one second frame by synthesizing, in the order of acquisition, at least some of the plurality of first frames in units of a predetermined number; and generate a hyper-lapse image that allows the at least one second frame representing the movement of the external object to be played back at a second speed different from the first speed. Various other embodiments identified in the description are also possible.
PCT/KR2021/006216 2020-06-17 2021-05-18 Electronic device and method of operating electronic device WO2021256709A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR1020200073523A KR20210155961A (ko) Electronic device and method of operating electronic device
KR10-2020-0073523 2020-06-17

Publications (1)

Publication Number Publication Date
WO2021256709A1 true WO2021256709A1 (fr) 2021-12-23

Family

ID=79176286

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2021/006216 WO2021256709A1 (fr) 2020-06-17 2021-05-18 Electronic device and method of operating electronic device

Country Status (2)

Country Link
KR (1) KR20210155961A (fr)
WO (1) WO2021256709A1 (fr)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2024053835A1 (fr) * 2022-09-08 2024-03-14 Samsung Electronics Co., Ltd. Electronic device and method for obtaining image based on frame synthesis

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20150090999A (ko) * 2014-01-30 2015-08-07 Casio Computer Co., Ltd. Image processing device for generating time-lapse moving image, image processing method, and recording medium
US20160094801A1 (en) * 2014-09-30 2016-03-31 Apple Inc. Time-Lapse Video Capture With Optimal Image Stabilization
KR20170074538A (ko) * 2015-12-22 2017-06-30 Samsung Electronics Co., Ltd. Apparatus and method for generating time-lapse image
US20170359548A1 (en) * 2015-05-08 2017-12-14 Microsoft Technology Licensing, Llc Real-time hyper-lapse video creation via frame selection
US20190306589A1 (en) * 2017-12-19 2019-10-03 Oath Inc. Method and system for generating a time-lapse video


Also Published As

Publication number Publication date
KR20210155961A (ko) 2021-12-24

Similar Documents

Publication Publication Date Title
WO2021086040A1 Method for providing preview and electronic device for displaying preview
WO2022114801A1 Electronic device comprising plurality of cameras, and method for controlling electronic device
WO2021230485A1 Method and apparatus for providing image
WO2022010122A1 Method for providing image, and electronic device supporting same
WO2022154387A1 Electronic device and operation method thereof
WO2021256709A1 Electronic device and method of operating electronic device
WO2022177166A1 Method for controlling refresh rate, and electronic device supporting same
WO2022149812A1 Electronic device comprising camera module, and method for operating electronic device
WO2022139238A1 Method for providing image, and electronic device supporting same
WO2022098204A1 Electronic device and method for providing virtual reality service
WO2022154166A1 Method for providing content creation function, and electronic device supporting same
WO2022145673A1 Electronic device and method of operating electronic device
WO2024090803A1 Method for obtaining image, and electronic device supporting same
WO2024085642A1 Electronic device comprising display, and operation method thereof
WO2022203211A1 Electronic device comprising camera module, and operating method of electronic device
WO2024053824A1 Electronic device for providing image, and operating method thereof
WO2024034856A1 Method for providing image, and electronic device supporting same
WO2023146290A1 Electronic device for controlling screen display, and operation method of electronic device
WO2023033396A1 Electronic device for processing continuous shooting input, and operating method thereof
WO2023038252A1 Electronic device for capturing moving image, and operation method thereof
WO2022025384A1 Electronic device for playing video, and method for playing video
WO2023085679A1 Electronic device and method for automatically generating edited video
WO2022154168A1 Electronic device capable of auto-focusing, and method for operating same
WO2023017981A1 Electronic device comprising under-display camera, and operation method thereof
WO2024014761A1 Method for correcting camera shake, and electronic device supporting same

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21824781

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21824781

Country of ref document: EP

Kind code of ref document: A1