WO2021162391A1 - Electronic device comprising a camera, and image capturing method - Google Patents

Electronic device comprising a camera, and image capturing method

Info

Publication number
WO2021162391A1
WO2021162391A1 (PCT/KR2021/001677)
Authority
WO
WIPO (PCT)
Prior art keywords
view
image
angle
preview
pipeline
Prior art date
Application number
PCT/KR2021/001677
Other languages
English (en)
Korean (ko)
Inventor
강창훈
최병근
성대현
정승환
김동현
천보현
Original Assignee
Samsung Electronics Co., Ltd. (삼성전자 주식회사)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Samsung Electronics Co., Ltd.
Publication of WO2021162391A1

Classifications

    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/62 Control of parameters via user interfaces
    • H04N23/63 Control of cameras or camera modules by using electronic viewfinders
    • H04N23/631 Graphical user interfaces [GUI] specially adapted for controlling image capture or setting capture parameters
    • H04N23/632 Graphical user interfaces [GUI] for displaying or modifying preview images prior to image capturing, e.g. variety of image resolutions or capturing parameters
    • H04N23/69 Control of means for changing angle of the field of view, e.g. optical zoom objectives or electronic zooming

Definitions

  • Various embodiments disclosed in this document relate to an electronic device including a camera and a photographing method using a camera included in the electronic device.
  • the camera of the electronic device may provide photographing at various angles of view through cropping or through a plurality of cameras.
  • When the camera of the electronic device supports photographing at various angles of view, users may not easily know which angle of view to use in a specific situation.
  • an electronic device including a camera capable of determining an optimal shooting angle of view according to a shooting situation and continuously providing a preview image without interruption while the angle of view is changed, and a shooting method are provided.
  • An electronic device may include a camera and a processor operatively connected to the camera. The processor may prepare to acquire images at two or more different angles of view through the camera, analyze the state of a subject included in an image obtained through the camera, determine a photographing angle of view based on the analysis, and provide a preview image and an image at the determined photographing angle of view.
  • A photographing method using a camera may include an operation of preparing to acquire images at two or more different angles of view through the camera, an operation of analyzing the state of a subject included in an image obtained through the camera, an operation of determining a photographing angle of view based on the analysis, and an operation of providing a preview image and an image at the determined photographing angle of view.
  • An electronic device may include a plurality of cameras having different angles of view and a processor operatively connected to the plurality of cameras. The processor may prepare to acquire images at two or more different angles of view, analyze the state of a subject included in an image obtained through one of the plurality of cameras, determine one of the plurality of cameras based on the analysis, and provide a preview image and an image through the determined camera.
  • a photographing angle of view may be changed to an optimal photographing angle of view according to a state of a subject present in the preview image.
  • FIG. 1 is a block diagram of an electronic device in a network environment, according to various embodiments of the present disclosure.
  • FIG. 2 is a block diagram of an electronic device according to various embodiments disclosed herein.
  • FIG. 3 is a flow diagram according to various embodiments disclosed herein.
  • FIG. 4 is a conceptual diagram of pipelines corresponding to various shooting angles according to various embodiments disclosed in this document.
  • FIG. 5 is a flowchart of a photographing method according to various embodiments disclosed herein.
  • FIGS. 6A to 6D are diagrams for explaining switching of a photographing angle of view according to various embodiments disclosed herein.
  • FIG. 7 is a flowchart of a photographing method according to various embodiments disclosed herein.
  • Phrases such as "A or B", "at least one of A and B", "at least one of A or B", "A, B, or C", "at least one of A, B, and C", and "at least one of A, B, or C" may include any one of, or all possible combinations of, the items listed together in the corresponding one of the phrases.
  • Terms such as "first" and "second" may simply be used to distinguish a component from other such components, and do not limit the components in other aspects (e.g., importance or order).
  • When one (e.g., first) component is referred to as being "coupled" or "connected" to another (e.g., second) component, with or without the terms "functionally" or "communicatively", it means that the one component can be connected to the other component directly (e.g., by wire), wirelessly, or through a third component.
  • FIG. 1 is a block diagram of an electronic device 101 in a network environment 100 according to various embodiments.
  • Referring to FIG. 1, the electronic device 101 may communicate with the electronic device 102 through a first network 198 (e.g., a short-range wireless communication network), or may communicate with the electronic device 104 or the server 108 through a second network 199 (e.g., a long-range wireless communication network). According to an embodiment, the electronic device 101 may communicate with the electronic device 104 through the server 108.
  • the electronic device 101 may include a processor 120, a memory 130, an input device 150, a sound output device 155, a display device 160, an audio module 170, a sensor module 176, an interface 177, a haptic module 179, a camera module 180, a power management module 188, a battery 189, a communication module 190, a subscriber identification module 196, or an antenna module 197. In some embodiments, at least one of these components (e.g., the display device 160 or the camera module 180) may be omitted, or one or more other components may be added to the electronic device 101. In some embodiments, some of these components may be implemented as one integrated circuit. For example, the sensor module 176 (e.g., a fingerprint sensor, an iris sensor, or an illuminance sensor) may be implemented while embedded in the display device 160 (e.g., a display).
  • the processor 120 may, for example, execute software (e.g., a program 140) to control at least one other component (e.g., a hardware or software component) of the electronic device 101 connected to the processor 120, and may perform various data processing or operations. According to an embodiment, as at least part of the data processing or operations, the processor 120 may load commands or data received from other components (e.g., the sensor module 176 or the communication module 190) into the volatile memory 132, process the commands or data stored in the volatile memory 132, and store the resulting data in the non-volatile memory 134.
  • the processor 120 may include a main processor 121 (e.g., a central processing unit or an application processor) and an auxiliary processor 123 (e.g., a graphics processing unit, an image signal processor, a sensor hub processor, or a communication processor) that can operate independently of or in conjunction with the main processor 121. Additionally or alternatively, the auxiliary processor 123 may be configured to use less power than the main processor 121 or to be specialized for a designated function. The auxiliary processor 123 may be implemented separately from, or as a part of, the main processor 121.
  • the auxiliary processor 123 may control, for example, on behalf of the main processor 121 while the main processor 121 is in an inactive (e.g., sleep) state, or together with the main processor 121 while the main processor 121 is in an active (e.g., application-executing) state, at least some of the functions or states related to at least one of the components of the electronic device 101 (e.g., the display device 160, the sensor module 176, or the communication module 190).
  • According to an embodiment, the auxiliary processor 123 (e.g., an image signal processor or a communication processor) may be implemented as a part of another component (e.g., the camera module 180 or the communication module 190) that is functionally related thereto.
  • the memory 130 may store various data used by at least one component (eg, the processor 120 or the sensor module 176 ) of the electronic device 101 .
  • the data may include, for example, input data or output data for software (eg, the program 140 ) and instructions related thereto.
  • the memory 130 may include a volatile memory 132 or a non-volatile memory 134 .
  • the program 140 may be stored as software in the memory 130 , and may include, for example, an operating system 142 , middleware 144 , or an application 146 .
  • the input device 150 may receive a command or data to be used by a component (eg, the processor 120 ) of the electronic device 101 from the outside (eg, a user) of the electronic device 101 .
  • the input device 150 may include, for example, a microphone, a mouse, a keyboard, or a digital pen (eg, a stylus pen). According to various embodiments, the input device 150 may recognize the user's voice.
  • the input device 150 may receive a command through a user's voice.
  • the input device 150 may be a multi-microphone device covering 360 degrees, so as to recognize a voice generated in the vicinity of the electronic device 101.
  • the sound output device 155 may output a sound signal to the outside of the electronic device 101 .
  • the sound output device 155 may include, for example, a speaker or a receiver.
  • the speaker can be used for general purposes such as multimedia playback or recording playback, and the receiver can be used to receive incoming calls. According to one embodiment, the receiver may be implemented separately from or as part of the speaker.
  • the display device 160 may visually provide information to the outside (eg, a user) of the electronic device 101 .
  • the display device 160 may include, for example, a display, a hologram device, or a projector and a control circuit for controlling the corresponding device.
  • the display device 160 may include touch circuitry configured to sense a touch, or a sensor circuit (e.g., a pressure sensor) configured to measure the intensity of a force generated by the touch.
  • the audio module 170 may convert a sound into an electrical signal or, conversely, convert an electrical signal into a sound. According to an embodiment, the audio module 170 may acquire a sound through the input device 150, or may output a sound through the sound output device 155 or an external electronic device (e.g., the electronic device 102, such as a speaker or headphones) connected directly or wirelessly to the electronic device 101.
  • the sensor module 176 may detect an operating state (e.g., power or temperature) of the electronic device 101 or an external environmental state (e.g., a user state), and may generate an electrical signal or data value corresponding to the sensed state.
  • the sensor module 176 may include, for example, a gesture sensor, a gyro sensor, a barometric pressure sensor, a magnetic sensor, an acceleration sensor, a grip sensor, a proximity sensor, a color sensor, an infrared (IR) sensor, a biometric sensor, a temperature sensor, a humidity sensor, or an illuminance sensor.
  • the interface 177 may support one or more designated protocols that may be used for the electronic device 101 to directly or wirelessly connect with an external electronic device (eg, the electronic device 102 ).
  • the interface 177 may include, for example, a high definition multimedia interface (HDMI), a universal serial bus (USB) interface, an SD card interface, or an audio interface.
  • the connection terminal 178 may include a connector through which the electronic device 101 can be physically connected to an external electronic device (eg, the electronic device 102 ).
  • the connection terminal 178 may include, for example, an HDMI connector, a USB connector, an SD card connector, a display port (DP), or an audio connector (eg, a headphone connector).
  • the haptic module 179 may convert an electrical signal into a mechanical stimulus (eg, vibration or movement) or an electrical stimulus that the user can perceive through tactile or kinesthetic sense.
  • the haptic module 179 may include, for example, a motor, a piezoelectric element, or an electrical stimulation device.
  • the camera module 180 may capture still images and moving images. According to an embodiment, the camera module 180 may include one or more lenses, image sensors, image signal processors, or flashes.
  • the power management module 188 may manage power supplied to the electronic device 101 .
  • the power management module 188 may be implemented as, for example, at least a part of a power management integrated circuit (PMIC).
  • Power supplied to the electronic device 101 may be supplied in a wired or wireless manner.
  • the electronic device 101 may include a wireless charging module (not shown) to receive power wirelessly.
  • the wireless charging module may be a device for receiving power in a magnetic induction method or a resonance induction method.
  • the wireless charging module may include a wireless charging coil in which a conductive metal wire is wound.
  • the battery 189 may supply power to at least one component of the electronic device 101 .
  • battery 189 may include, for example, a non-rechargeable primary cell, a rechargeable secondary cell, or a fuel cell.
  • the communication module 190 may support establishment of a direct (e.g., wired) communication channel or a wireless communication channel between the electronic device 101 and an external electronic device (e.g., the electronic device 102, the electronic device 104, or the server 108), and communication through the established communication channel.
  • the communication module 190 may include one or more communication processors that operate independently of the processor 120 (eg, an application processor) and support direct (eg, wired) communication or wireless communication.
  • the communication module 190 may include a wireless communication module 192 (e.g., a cellular communication module, a short-range wireless communication module, or a global navigation satellite system (GNSS) communication module) or a wired communication module 194 (e.g., a local area network (LAN) communication module or a power line communication module).
  • a corresponding communication module may communicate with an external electronic device via the first network 198 (e.g., a short-range communication network such as Bluetooth, WiFi Direct, or infrared data association (IrDA)) or the second network 199 (e.g., a long-range communication network such as a cellular network, the Internet, or a computer network (e.g., a LAN or WAN)).
  • These various types of communication modules may be integrated into one component (eg, a single chip) or may be implemented as a plurality of components (eg, multiple chips) separate from each other.
  • the wireless communication module 192 may identify and authenticate the electronic device 101 within a communication network such as the first network 198 or the second network 199 by using subscriber information (e.g., an International Mobile Subscriber Identity (IMSI)) stored in the subscriber identification module 196.
  • the antenna module 197 may transmit or receive a signal or power to the outside (eg, an external electronic device).
  • the antenna module may include one antenna including a radiator formed of a conductor or a conductive pattern formed on a substrate (e.g., a PCB).
  • the antenna module 197 may include a plurality of antennas. In this case, at least one antenna suitable for a communication method used in a communication network such as the first network 198 or the second network 199 may be selected from the plurality of antennas by, for example, the communication module 190. A signal or power may be transmitted or received between the communication module 190 and an external electronic device through the selected at least one antenna.
  • the antenna module 197 may transmit/receive a 5G communication signal so that the electronic device 101 may support 5G communication.
  • the antenna module 197 may transmit/receive signals of several gigahertz bands and tens to hundreds of gigahertz bands (eg, mmWave).
  • the antenna module may include a plurality of antennas (eg, a plurality of patch array antennas) to generate an RF beam.
  • At least some of the components may be connected to each other through a communication method between peripheral devices (e.g., a bus, general-purpose input and output (GPIO), serial peripheral interface (SPI), or mobile industry processor interface (MIPI)) and may exchange signals (e.g., commands or data) with each other.
  • the command or data may be transmitted or received between the electronic device 101 and the external electronic device 104 through the server 108 connected to the second network 199 .
  • Each of the electronic devices 102 and 104 may be of the same type as, or a different type from, the electronic device 101.
  • all or a part of operations executed in the electronic device 101 may be executed in one or more of the external electronic devices 102 , 104 , or 108 .
  • When the electronic device 101 needs to perform a function or service, the electronic device 101 may, instead of executing the function or service itself, request one or more external electronic devices to perform at least a part of the function or service.
  • the one or more external electronic devices that have received the request may execute at least a part of the requested function or service, or an additional function or service related to the request, and transmit a result of the execution to the electronic device 101 .
  • the electronic device 101 may process the received result as-is or additionally, and provide it as at least a part of a response to the request.
  • cloud computing, distributed computing, or client-server computing technology may be used.
  • FIG. 2 is a block diagram of an electronic device according to various embodiments disclosed herein.
  • Referring to FIG. 2, the electronic device (e.g., the electronic device 101 of FIG. 1) may include a camera 270 (e.g., the camera module 180 of FIG. 1) and a processor 230 (e.g., the processor 120 of FIG. 1).
  • the camera 270 may be disposed on the front and/or rear of the electronic device.
  • the camera 270 may generate an image signal by photographing a subject.
  • the camera 270 may include an image sensor including a device that converts light into an electrical signal (e.g., a complementary metal-oxide-semiconductor (CMOS) sensor or a charge-coupled device (CCD)).
  • the camera 270 may capture an image at various angles of view or field of view.
  • the angle of view may be understood as a range of an image captured by the camera 270 .
  • a wide angle of view may mean that an image is captured in a wide range
  • a narrow angle of view may mean that an image is captured in a narrow range.
  • the angle of view may be changed by adjusting the range of light incident on the image sensor through the lens, or may be changed by cropping the image. For example, when some of the image information received by the image sensor is cropped, an image having a narrower angle of view than the original image may be obtained. When the angle of view is adjusted by cropping the image, the resolution of the image may be changed. For example, when the image sensor of the camera 270 has a maximum resolution of 8000 X 6000, an image having a resolution of 7680 X 4320 may be obtained by cropping the image.
  • In this document, changing the resolution by cropping is described as equivalent to adjusting the angle of view.
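As an illustration of the crop-to-angle-of-view relationship described above, the following sketch computes the narrower effective angle of view produced by a center crop. The resolutions come from the text; the 80- and 60-degree full angles of view are assumptions for illustration only.

```python
import math

def cropped_angle_of_view(full_aov_deg, full_pixels, cropped_pixels):
    """Effective angle of view along one axis after a center crop.

    The tangent of the half angle scales with the portion of the
    sensor actually used, so cropping narrows the angle of view.
    """
    half = math.radians(full_aov_deg / 2.0)
    cropped_half = math.atan(math.tan(half) * cropped_pixels / full_pixels)
    return math.degrees(2.0 * cropped_half)

# Resolutions from the text: an 8000 x 6000 sensor cropped to 7680 x 4320.
h_aov = cropped_angle_of_view(80.0, 8000, 7680)  # slightly narrower than 80
v_aov = cropped_angle_of_view(60.0, 6000, 4320)  # noticeably narrower than 60
```

A crop of only 4 percent along the width barely changes the horizontal angle of view, while the 28 percent vertical crop narrows it substantially, matching the text's point that cropping trades resolution for a narrower angle of view.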
  • the processor 230 may process an image signal received from the camera 270 to display an image on the display 210 (eg, the display device 160 of FIG. 1 ).
  • the processor 230 may display a preview image (eg, the preview image 421 of FIG. 4 ) on the display 210.
  • the preview image may mean a real-time image captured by the camera 270 .
  • the user can check the preview image and check the area to be photographed, the degree of exposure, composition, etc.
  • the processor 230 may process the image signal received from the camera 270 in accordance with a predetermined standard (e.g., resolution) of the preview image so that it can be displayed on the display 210.
  • the processor 230 may convert (eg, resize) the image signal to a full HD (1920 X 1080) standard and display it on the display 210 .
  • the processor 230 may convert the image of the currently set angle of view into a preview image and display it.
  • the processor 230 may convert the image of the changed angle of view into a preview image and display it.
  • the processor 230 may generate a preview command and display a preview image on the display 210 according to the generated preview command.
  • the preview command may be generated by various triggers. For example, when a user executes an application related to the camera 270 , a preview command may be generated.
  • the processor 230 may generate and provide an image (eg, the first angle of view image 431 or the second angle of view image 441 of FIG. 4 ) with a currently set photographing angle of view.
  • the processor 230 may receive a user's various inputs (eg, a touch input, a voice input, a gesture input, etc.) and generate a shooting command.
  • the processor 230 may process the image signal received by the camera 270 at the point in time when a user input is received based on the photographing command to generate an image of the currently set photographing angle of view.
  • the processor 230 may analyze a state of a subject included in an image (eg, a preview image) acquired through the camera 270 .
  • the processor 230 may determine the photographing angle of view based on the state of the subject included in the image.
  • a preview image and an image may be provided according to the determined angle of view.
  • the processor 230 may prepare in advance so that the preview image can be continuously provided even when the photographing angle of view is changed.
  • the processor 230 may include an image configuring unit 231, an image providing unit 232, and an image analyzer 233.
  • the image configuring unit 231, the image providing unit 232, and the image analyzer 233 are separated merely for convenience of explanation and may not be physically separated.
  • the image configuring unit 231 may generate a preview command for displaying a preview image and a photographing command for generating an image.
  • the image providing unit 232 may receive an image signal from the camera 270 and provide an image under the control of the image configuring unit 231 .
  • the image analyzer 233 may determine a photographing angle of view by analyzing a state of a subject included in the image provided by the image providing unit 232 .
  • the subject analyzed by the image analyzer 233 may be, for example, a face of a person or a face of an animal.
  • the user may directly select a subject of interest.
  • the image analyzer 233 may analyze the subject selected by the user.
  • the electronic device may include an image signal processor (ISP) 250 .
  • the image signal processor 250 may post-process the image signal received from the image sensor of the camera 270 .
  • the image signal processor 250 may improve image quality by removing noise or the like from the original image signal received by the image sensor.
  • the image signal processor 250 may be configured as a separate integrated circuit (IC).
  • the image signal processor 250 may be included in the processor 230 or the camera 270 .
  • FIG. 3 is a flowchart according to various embodiments disclosed in this document.
  • FIG. 4 is a conceptual diagram of pipelines corresponding to various shooting angles according to various embodiments disclosed in this document.
  • the image configuring unit 231 may transmit a pipeline preparation request to the image providing unit 232 ( 311 ).
  • the image providing unit 232 may configure a pipeline by receiving a pipeline preparation request ( 312 ).
  • the image providing unit 232 may configure the preview pipeline 420 and the angle of view pipeline (eg, the first angle of view pipeline 430 and the second angle of view pipeline 440 ).
  • the pipeline is a series of data processing structures for generating a preview image 421 or an image (e.g., the first angle of view image 431 and the second angle of view image 441) by processing the image signal.
  • Data processing performed in the pipeline may be performed through hardware or software.
  • data processing may take place in image processing blocks that are physically connected (e.g., hard-wired) for a particular purpose, and in software (e.g., a camera-related application).
  • the image signal received from the image sensor of the camera may be converted into a preview image 421 or an image through various processing processes.
  • the image signal may be processed through at least one of a camera, a processor, and an image signal processor.
  • the image processing 410 of the pipeline may include corrections and encoding such as defective pixel correction, color interpolation, color correction, gamma correction, conversion of the color space from RGB to YCrCb, noise reduction, edge enhancement, exposure, contrast, gradation, and white balance.
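The stages listed above can be illustrated with a heavily simplified sketch. Each function stands in for a far more elaborate hardware or software block, and the gain and gamma values are illustrative assumptions, not values from the patent:

```python
def white_balance(pixels, gain=1.1):
    # Simplified white balance: apply a gain and clip to the valid range.
    return [min(p * gain, 1.0) for p in pixels]

def gamma_correct(pixels, gamma=2.2):
    # Gamma correction: map linear sensor values to display values.
    return [p ** (1.0 / gamma) for p in pixels]

def image_processing(pixels):
    # The pipeline applies its stages in a fixed order.
    for stage in (white_balance, gamma_correct):
        pixels = stage(pixels)
    return pixels

# Normalized single-channel pixel values standing in for an image signal.
out = image_processing([0.0, 0.25, 0.5, 1.0])
```

The point of the pipeline abstraction is that the stage order is fixed once at configuration time, so every frame flowing through it is processed identically.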
  • the image signal input to the pipeline may be converted into a preview image 421 or an image that may be displayed on the display of the electronic device through the above post processing through the pipeline.
  • the preview pipeline 420 prepared by the image providing unit 232 may process the preview image 421 that displays the image being photographed in real time.
  • the preview pipeline 420 may receive an image signal in real time and convert it into a preview image 421 .
  • the preview pipeline 420 may convert (eg, resize) an image to a preset resolution.
  • the angle of view pipeline prepared by the image providing unit 232 may receive an image signal and convert it into an image in the form of a still picture suitable for the captured angle of view.
  • the angle of view pipeline may include, for example, a first angle of view pipeline 430 and a second angle of view pipeline 440 .
  • the first angle of view may mean a wider angle of view than the second angle of view.
  • the first angle of view pipeline 430 may receive an image signal and convert it into a first angle of view image 431 .
  • the second angle of view pipeline 440 may receive an image signal and convert it into a second angle of view image 441 .
  • the first angle of view image 431 may be an image captured in a wider range than the second angle of view image 441 .
  • the electronic device may preconfigure the preview pipeline 420 and the angle-of-view pipelines 430 and 440 corresponding to the plurality of angles of view. Therefore, even if the photographing angle of view is changed, there is no need to remove the pipeline. Since the image signal can be continuously received through the preconfigured pipeline, the preview image 421 can be continuously output on the display even when the photographing angle of view is changed. For this reason, the electronic device according to various embodiments disclosed in this document may provide a seamless preview image 421 without deterioration in image quality even if the photographing angle of view is changed.
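The seamless-switching idea above can be sketched as follows. The class and method names are illustrative assumptions, not the patent's actual implementation, and frames are reduced to (width, height) tuples:

```python
class Pipeline:
    """Stand-in for a preconfigured processing pipeline; 'crop' is the
    fraction of the full frame this angle of view keeps."""
    def __init__(self, crop):
        self.crop = crop

    def process(self, frame):
        w, h = frame
        return (round(w * self.crop), round(h * self.crop))

class ImageProvider:
    def __init__(self):
        # All pipelines are configured up front (cf. operation 312), so a
        # later change of the photographing angle of view never has to
        # tear one down and rebuild it.
        self.preview = Pipeline(1.0)
        self.angle_pipelines = {"first": Pipeline(1.0), "second": Pipeline(0.7)}
        self.current = "first"

    def set_angle(self, angle):
        # Switching is only a selection among prebuilt pipelines; the
        # preview pipeline keeps receiving the image signal throughout.
        self.current = angle

    def preview_frame(self, frame):
        return self.preview.process(frame)

    def capture(self, frame):
        return self.angle_pipelines[self.current].process(frame)

provider = ImageProvider()
before = provider.preview_frame((8000, 6000))
provider.set_angle("second")                   # angle of view changes...
after = provider.preview_frame((8000, 6000))   # ...preview is uninterrupted
shot = provider.capture((8000, 6000))
```

Because `set_angle` is only a pointer swap among objects that already exist, no frame is dropped while the photographing angle of view changes, which is the seamlessness the paragraph above describes.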
  • the image providing unit 232 may transmit a pipeline preparation completion signal to the image configuring unit 231 ( 313 ).
  • the image configuration unit 231 may transmit a preview command to the image providing unit 232 ( 321 ).
  • the preview command can be generated by various triggers. For example, when a user executes a camera-related application, a preview command may be generated.
  • the preview command may include a currently set shooting angle.
  • the image providing unit 232 may receive the image signal and process the image signal through the preview pipeline 420 configured in advance to generate the preview image 421 ( 322 ). In this case, the image providing unit 232 may convert (eg, resize) the preview image 421 to suit the currently set shooting angle.
  • the preview image 421 generated by the image providing unit 232 may be provided to the image composing unit 231 ( 323 ) and may be provided to the image analyzing unit 233 ( 324 ).
  • the image configuration unit 231 may transmit a preview image 421 analysis request to the image analyzer 233 ( 331 ).
  • the image analyzer 233 may receive the preview image 421 analysis request and analyze the preview image 421 .
  • the image analyzer 233 may determine a photographing angle of view by analyzing the state of the subject included in the preview image 421 ( 332 ).
  • the image analyzer 233 may determine whether to change the photographing angle of view and provide the determined photographing angle of view to the image constructing unit 231 ( 333 ).
  • the image analyzer 233 may determine to change the photographing angle of view to the second angle of view.
  • the image analyzer 233 may determine a photographing angle of view as the second angle of view and provide the determined angle of view to the image constructing unit 231 .
  • the image configuration unit 231 may transmit a preview command to the image providing unit 232 according to the changed angle of view ( 325 ).
  • the image providing unit 232 may receive the image signal and process the image signal through the preview pipeline 420 configured in advance to generate the preview image 421 of the second angle of view ( 326 ).
  • the preview image 421 of the second angle of view generated by the image providing unit 232 may be provided to the image composing unit 231 ( 327 ) and may be provided to the image analyzing unit 233 ( 328 ).
  • when the image analysis unit 233 determines that the photographing angle of view does not need to be changed, the above-described operations of transmitting a preview command according to the changed photographing angle of view ( 325 ) and providing a preview image of the changed photographing angle of view ( 327 ) may be omitted.
  • the image configuration unit 231 may generate a photographing command and transmit it to the image providing unit 232 ( 341 ).
  • the photographing command may be generated by a user's various inputs (eg, a touch input, a voice input, a gesture input, etc.).
  • the image providing unit 232 may receive a photographing command and generate an image using a preconfigured angle of view pipeline (eg, the first angle of view pipeline 430 or the second angle of view pipeline 440 ) ( 342 ).
  • the image providing unit 232 may generate an image by using an angle of view pipeline corresponding to the current captured angle of view.
  • the first angle of view image 431 may be generated using the first angle of view pipeline 430 .
  • the second angle of view image 441 may be generated using the second angle of view pipeline 440 .
  • the image providing unit 232 may provide the generated image to the image configuring unit 231 ( 343 ).
  • the image composing unit 231 may store the image provided from the image providing unit 232 in a memory (eg, the memory 130 of FIG. 1 ) ( 344 ).
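The command flow among the three units (operations 321 through 344 above) can be condensed into a toy sequence. The function names, dict shapes, and the two-subject rule below are assumptions for illustration; only the ordering of the commands follows the description.

```python
# Toy condensation of the command flow (operations 321-344); function
# names, dict shapes, and the two-subject rule are assumptions.

def analyze(preview: dict) -> str:
    # Operation 332: derive a photographing angle from the subject state.
    return "wide" if preview["subjects"] >= 2 else "tele"

def shoot(signal: dict) -> dict:
    log = []
    angle = "tele"                          # currently set angle of view
    log.append(("preview_cmd", angle))      # 321: preview command
    preview = {"angle": angle, **signal}    # 322: build preview image
    new_angle = analyze(preview)            # 331-332: analysis request
    if new_angle != angle:                  # 333: change decided
        angle = new_angle
        log.append(("preview_cmd", angle))  # 325: re-preview (else omitted)
    log.append(("capture_cmd", angle))      # 341-342: photographing command
    image = {"stored": True, "angle": angle}  # 343-344: provide and store
    return {"log": log, "image": image}

print(shoot({"subjects": 3})["image"])  # captured at the wider angle
```

Note how the second preview command is skipped when the analysis keeps the current angle, matching the omission described for operations 325 and 327.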
  • FIG. 5 is a flowchart of a photographing method according to various embodiments disclosed in this document.
  • FIGS. 6A to 6D are diagrams for explaining switching of a photographing angle of view according to various embodiments disclosed in this document.
  • the same reference numerals used in FIG. 2 are used for the same components as those described in FIG. 2 .
  • the image analyzer 233 may determine the photographing angle of view in the order shown in FIG. 5 .
  • the order shown in FIG. 5 is only an example, and the image analyzer 233 may determine the photographing angle of view in various other orders.
  • the image analyzing unit 233 may receive a preview image (eg, the preview image 421 of FIG. 4 ) provided from the image providing unit 232 ( 510 ).
  • the image analysis unit 233 may receive an image analysis request transmitted from the image configuration unit 231 .
  • the user may select (560) whether to use the automatic angle-of-view change function in the setting of the camera-related application. When the user selects to use the automatic angle of view change function, an image analysis request may be transmitted along with the preview image reception.
  • the user may select whether to automatically change the angle of view through a user interface (UI) displayed together with the preview image.
  • an image analysis request may be transmitted from the image configuration unit 231 .
  • the operation 560 of receiving a selection of whether to automatically change the angle of view from the user may be omitted.
  • the image analyzer 233 may detect and recognize a subject included in the preview image ( 520 ).
  • the subject detection may refer to an operation of distinguishing a background from a subject in the preview image.
  • the image analyzer 233 may detect the number of subjects included in the preview image.
  • the subject recognition may refer to an operation of identifying a detected subject.
  • the image analyzer 233 may recognize the type of the subject or, if the subject is a person, recognize the person by face recognition.
  • the image analyzer 233 may analyze the state of the detected and recognized subject ( 530 ).
  • the state of the subject may include, for example, at least one of the number, size, location, and exposure degree of the subject.
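One way to represent the "state of the subject" named above (number, size, location, exposure degree) is a small record computed from detected bounding boxes. The field names and the normalized (x, y, w, h) box format below are assumptions for illustration, not from the patent.

```python
from dataclasses import dataclass

# Assumed representation: boxes are (x, y, w, h) in normalized frame
# coordinates, 0..1; all field names here are illustrative.

@dataclass
class SubjectState:
    count: int        # number of detected subjects
    max_size: float   # area of the largest subject, as a fraction of the frame
    at_edge: bool     # any subject box touching the frame border
    exposure: float   # mean subject luminance, 0 (dark) .. 1 (saturated)

def summarize(boxes, luminance):
    at_edge = any(x <= 0.0 or y <= 0.0 or x + w >= 1.0 or y + h >= 1.0
                  for (x, y, w, h) in boxes)
    max_size = max((w * h for (_, _, w, h) in boxes), default=0.0)
    return SubjectState(len(boxes), max_size, at_edge, luminance)

state = summarize([(0.1, 0.2, 0.3, 0.4), (0.7, 0.1, 0.3, 0.5)], 0.45)
print(state)
```

A record like this is what the later decision steps (number of subjects, edge position, exposure) would consume.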
  • the image analysis unit 233 may determine the change of the photographing angle of view and transmit the determined photographing angle of view to the image configuration unit 231 ( 540 ).
  • when the change is not necessary, the current photographing angle of view may be maintained and transmitted to the image configuration unit 231 ( 550 ).
  • the user may check the current shooting angle through the UI 630 displayed on the display 600 .
  • the UI 630 may include a first display 631 indicating that the current photographing angle of view is the first angle of view and a second display 633 indicating that the current photographing angle of view is the second angle of view.
  • the first angle of view may mean a wider angle of view than the second angle of view.
  • the image analyzer 233 may determine the photographing angle of view so that the photographing angle of view is widened. As shown in FIG. 6A , when the number of subjects is large, the image analyzer 233 may determine to change the photographing angle of view to a photographing angle of view wider (eg, the first angle of view) than the current photographing angle of view (eg, the second angle of view). For example, in the case of (a) of FIG. 6A , since the second display 633 is selected, it can be seen that the current photographing angle of view is the second angle of view. As a result of the analysis by the image analyzer 233 , the photographing angle of view may be changed.
  • the image analysis unit 233 may reflect the recognition result of the subject when changing the photographing angle of view. For example, when two or more subjects included in the preview image are all persons registered in the electronic device, the photographing angle of view may be changed to be wider. When the subject included in the preview image is a person not registered in the electronic device, the photographing angle of view may not be changed. Also, even if a plurality of subjects are detected, the angle of view may not be changed when the subjects move at a speed greater than a reference speed. In some cases, the photographing angle of view may be changed based on an analysis of the user's preferred images.
  • the image analyzer 233 may analyze the photos stored in the electronic device to confirm that the user prefers to photograph the subject in a larger size. In this case, the image analysis unit 233 may not change the photographing angle of view even when the subject is photographed somewhat larger because the current photographing angle of view is narrow.
  • the image analyzer 233 may change the photographing angle of view in consideration of the position of the subject in the preview image. As illustrated in FIG. 6B , the image analyzer 233 may change the photographing angle of view to be wider when a part of the subject is not included in the preview image. In the case of (a) of FIG. 6B , since the second display 633 is selected, it can be seen that the current photographing angle of view is the second angle of view. In the case of (b) of FIG. 6B in which the photographing angle of view is changed, since the first display 631 is selected, it can be seen that the current photographing angle of view is the first angle of view.
  • the image analyzer 233 may change the photographing angle of view to be narrower.
  • in the case of (a) of FIG. 6C , since the first display 631 is selected, it can be seen that the current photographing angle of view is the first angle of view.
  • in the case of (b) of FIG. 6C in which the photographing angle of view is changed, since the second display 633 is selected, it can be seen that the current photographing angle of view is the second angle of view. Distortion may occur at the edge or periphery of the image due to optical characteristics of the camera.
  • the peripheral portion may have lower image quality than the central portion.
  • the image analyzer 233 may induce the user to move the subject to the center of the preview image by changing the angle of view to be narrower.
  • the image analysis unit 233 may reflect the recognition result of the subject to change the photographing angle of view. For example, when the subject located in the periphery of the preview image is a person not registered in the electronic device, the photographing angle of view may not be changed.
  • the image analyzer 233 may change the photographing angle of view in consideration of the degree of exposure of the subject. For example, as illustrated in FIG. 6D , when a bright light source (eg, sun and lighting) exists together with the subject in the preview image, the exposure of the subject may be excessively reduced depending on the metering method of the camera, and the subject may be photographed darkly if photographing is performed in such a state. The image analyzer 233 may change the photographing angle of view to be narrower so that the bright light source existing in the preview image is removed from the preview image.
  • the image analysis unit 233 may change the photographing angle of view by applying a criterion having a higher priority according to a preset priority.
  • the image analysis unit 233 may display the determined photographing angle of view on the display. In this case, the photographing angle of view may or may not be changed according to the user's selection.
  • FIG. 7 is a flowchart of a photographing method according to various embodiments disclosed herein.
  • the same reference numerals used in FIG. 2 are used for the same components as those described in FIG. 2 .
  • the analysis of the subject state of the image analyzer 233 may be performed in the order shown in FIG. 7 .
  • the sequence shown in FIG. 7 is merely an example, and the image analyzer 233 may also analyze the state of the subject in various other procedures and methods.
  • the subject analyzed by the image analyzer 233 may be, for example, a face of a person or a face of an animal.
  • the user may directly select a subject of interest.
  • the image analyzer 233 may analyze the subject selected by the user.
  • the image analyzer 233 may determine whether the current preview image (eg, the preview image 421 of FIG. 4 ) is provided with the widest angle of view supported by the electronic device ( 710 ). When the current preview image is not the widest angle of view, the image analyzer 233 may check the number of subjects ( 721 ). When two or more subjects exist, the photographing angle of view may be enlarged ( 723 ) (eg, FIG. 6A ). Enlarging the photographing angle of view may mean widening the photographing angle of view. When there is only one subject, the image analyzer 233 may check whether the position of the subject exists at the edge ( 722 ). When the subject is at the edge of the preview image and a portion is cut off, the photographing angle of view may be enlarged ( 723 ) (eg, FIG. 6B ).
  • the image analyzer 233 may determine whether saturation due to overexposure occurs in the current preview image ( 730 ). Saturation may mean that light of a brightness level that the image sensor cannot distinguish exists in the preview image. If saturation occurs, the image analysis unit 233 may check whether a component causing saturation (eg, a bright light source) exists only in the currently set photographing angle of view ( 740 ).
  • it may be checked whether the subject exists at the edge ( 741 ). When the subject exists at the edge, distortion may occur in the subject.
  • the image analyzer 233 may reduce the photographing angle of view ( 751 ). Reducing the photographing angle of view may mean narrowing the photographing angle of view. Through the reduction of the photographing angle of view, the user may be induced to move the subject existing at the edge to the center (eg, FIG. 6C ).
  • the state (eg, cropping) of the subject according to the change of the photographing angle of view may be analyzed ( 750 ).
  • the photographing angle of view may be reduced ( 751 ) (eg, FIG. 6D ). If changing the photographing angle of view would change the state of the subject (eg, crop the subject), the photographing angle of view may be maintained.
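The branch order of operations 710 through 751 above can be sketched as a single decision function. Only the ordering follows the text; the boolean inputs and the returned labels ("enlarge"/"reduce"/"keep") are invented for the sketch.

```python
# Sketch of the branch order in FIG. 7 (operations 710-751). Only the
# ordering follows the text; the inputs and labels are assumptions.

def decide(angle_is_widest: bool, subjects: int, subject_at_edge: bool,
           saturated: bool, saturation_only_at_current_angle: bool) -> str:
    if not angle_is_widest:                   # 710: already at widest angle?
        if subjects >= 2:                     # 721: two or more subjects
            return "enlarge"                  # 723 (eg, FIG. 6A)
        if subject_at_edge:                   # 722: subject cut off at edge
            return "enlarge"                  # 723 (eg, FIG. 6B)
    if saturated:                             # 730: overexposure saturation
        if saturation_only_at_current_angle:  # 740: source leaves if narrowed
            return "reduce"                   # 751 (eg, FIG. 6D)
        if subject_at_edge:                   # 741: distortion risk at edge
            return "reduce"                   # 751 (eg, FIG. 6C)
    return "keep"

print(decide(False, 3, False, False, False))  # enlarge
print(decide(True, 1, True, True, False))     # reduce
```

Each return corresponds to one of the figure cases cited in the description, which makes the priority among the criteria explicit: widening checks run first, narrowing checks only afterwards.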
  • the photographing angle of view determined by the image analysis unit 233 is provided to the image configuration unit 231 , and the image configuration unit 231 transmits a photographing command according to the determined photographing angle of view to the image providing unit 232 .
  • the image providing unit 232 may convert the image signal received from the image sensor into an image of the determined angle of view (eg, the first angle of view image 431 or the second angle of view image 441 of FIG. 4 ) using the angle of view pipeline (eg, the first angle of view pipeline 430 or the second angle of view pipeline 440 of FIG. 4 ) corresponding to the determined photographing angle of view.
  • the image constructing unit 231 may obtain the converted image ( 760 ).
  • the electronic device may include a plurality of cameras having different angles of view.
  • the electronic device may include a standard angle of view camera, a wide angle camera, and a telephoto camera.
  • the processor of the electronic device may preconfigure an angle-of-view pipeline corresponding to the angles of view of a plurality of different cameras.
  • an angle-of-view pipeline corresponding to a standard-angle camera, an angle-of-view pipeline corresponding to a wide-angle camera, and an angle-of-view pipeline corresponding to a telephoto camera may be configured in advance.
  • the image analyzer may analyze the state of the subject and select one of the plurality of cameras based on the analyzed state of the subject.
  • the image configuration unit 231 may transmit a photographing command including the selected camera to the image providing unit 232 .
  • the image providing unit 232 may generate an image signal received from the selected camera as an image by using a preconfigured angle of view pipeline.
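For the multi-camera variant just described, the analyzer selects a physical camera rather than a zoom level. The following sketch uses invented camera names and an assumed selection rule; the patent only states that one of the cameras is chosen based on the analyzed subject state.

```python
# Illustrative only: camera names and the selection rule are assumptions;
# the patent just states that the analyzer picks one of the cameras.

CAMERAS = ("wide", "standard", "tele")

def select_camera(subjects: int, subject_size: float) -> str:
    if subjects >= 2:
        return "wide"      # fit multiple subjects into the frame
    # single subject: zoom in when it is small in the frame
    return "tele" if subject_size < 0.2 else "standard"

# one preconfigured pipeline per physical camera, never torn down
pipelines = {cam: f"{cam}-pipeline" for cam in CAMERAS}
cam = select_camera(subjects=1, subject_size=0.1)
print(pipelines[cam])  # tele-pipeline
```

As with the single-camera case, preparing one pipeline per camera in advance is what lets the photographing command route directly to the chosen camera's pipeline.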
  • An electronic device may include a camera and a processor operatively connected to the camera, and the processor may prepare to acquire images at at least two different angles of view through the camera, analyze the state of a subject included in an image obtained through the camera, determine a photographing angle of view based on the analysis, and provide a preview image and an image at the determined photographing angle of view.
  • the processor may include an image providing unit that receives an image signal from the camera and provides a preview image and an image having at least two angles of view based on at least one of a preview command and a photographing command; an image analyzer configured to determine a photographing angle of view by analyzing a state of a subject included in the preview image; and an image configuration unit configured to generate at least one of the preview command and the photographing command according to the photographing angle of view determined by the image analyzer.
  • the image configuration unit may prepare, in the image providing unit, a preview pipeline and a plurality of angle-of-view pipelines respectively corresponding to images of different angles of view; the preview pipeline may be a data processing structure that converts the image signal received from the camera into the preview image, and the plurality of angle-of-view pipelines may be data processing structures that convert the image signals received from the camera into images of different angles of view.
  • the image configuration unit may prepare a preview pipeline, a first angle of view pipeline, and a second angle of view pipeline for the image providing unit; the first angle of view pipeline may be a data processing structure that converts the image signal received from the camera into an image of a first angle of view, and the second angle of view pipeline may be a data processing structure that converts the image signal received from the camera into an image of a second angle of view different from the first angle of view.
  • the image providing unit may provide, according to the preview command, the preview image at at least one of the first angle of view and the second angle of view using the preview pipeline, and may provide, according to the photographing command, an image captured at at least one of the first angle of view and the second angle of view using at least one of the first angle of view pipeline and the second angle of view pipeline.
  • the image providing unit may change the data processing structure from at least one of the first angle of view pipeline and the second angle of view pipeline to the other of the first angle of view pipeline and the second angle of view pipeline based on the shooting command.
  • the preview image may be continuously provided by continuously receiving the image signal from the camera during the change process.
  • the image analyzer may determine the photographing angle of view based on at least one of the number, position, and size of the subjects included in the preview image.
  • the image analyzer may determine the photographing angle of view based on at least one of the number, location, and size of the registered subjects.
  • the image analyzer may determine the photographing angle of view based on an exposure degree of the subject included in the preview image.
  • A photographing method of a camera may include an operation of preparing to acquire images at at least two different angles of view through the camera, an operation of analyzing a state of a subject included in an image obtained through the camera, an operation of determining a photographing angle of view based on the analysis, and an operation of providing a preview image and an image at the determined photographing angle of view.
  • the image providing unit may receive an image signal from the camera and provide a preview image and an image having at least two angles of view based on at least one of a preview command and a photographing command; the image analysis unit may determine the photographing angle of view by analyzing the state of the subject included in the preview image; and the image configuration unit may generate at least one of the preview command and the photographing command according to the photographing angle of view determined by the image analysis unit.
  • the image configuration unit may prepare, in the image providing unit, a preview pipeline and a plurality of angle-of-view pipelines respectively corresponding to images of different angles of view; the preview pipeline may be a data processing structure that converts the image signal received from the camera into the preview image, and the plurality of angle-of-view pipelines may be data processing structures that convert the image signals received from the camera into images of different angles of view.
  • the image configuration unit may prepare a preview pipeline, a first angle of view pipeline, and a second angle of view pipeline for the image providing unit; the first angle of view pipeline may be a data processing structure that converts the image signal received from the camera into an image of a first angle of view, and the second angle of view pipeline may be a data processing structure that converts the image signal received from the camera into an image of a second angle of view different from the first angle of view.
  • the image providing unit may provide, according to the preview command, the preview image at at least one of the first angle of view and the second angle of view using the preview pipeline, and may provide, according to the photographing command, an image captured at at least one of the first angle of view and the second angle of view using at least one of the first angle of view pipeline and the second angle of view pipeline.
  • the image providing unit may change the data processing structure from at least one of the first angle of view pipeline and the second angle of view pipeline to the other of the first angle of view pipeline and the second angle of view pipeline based on the shooting command.
  • the preview image may be continuously provided by continuously receiving the image signal from the camera during the change process.
  • the image analyzer may determine the photographing angle of view based on at least one of the number, position, and size of the subjects included in the preview image.
  • the image analyzer may determine the photographing angle of view based on at least one of the number, location, and size of the registered subjects.
  • the image analyzer may determine the photographing angle of view based on an exposure degree of the subject included in the preview image.
  • An electronic device may include a plurality of cameras having different angles of view and a processor operatively connected to the plurality of cameras, and the processor may prepare to acquire images at at least two different angles of view through the plurality of cameras, analyze a state of a subject included in an image obtained through one of the plurality of cameras, determine one of the plurality of cameras based on the analysis, and provide a preview image and an image through the determined camera.
  • the processor may prepare a preview pipeline and a plurality of angle-of-view pipelines respectively corresponding to images of different angles of view; the preview pipeline may be a data processing structure that converts the image signal received from one of the plurality of cameras into a preview image, and the plurality of angle-of-view pipelines may be data processing structures that convert the image signals received from the plurality of cameras into images of different angles of view.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Human Computer Interaction (AREA)
  • Studio Devices (AREA)

Abstract

Various embodiments of the present invention relate to an electronic device that may include a camera and a processor operatively connected to the camera, the processor being able to: prepare to obtain an image, through the camera, at at least two different angles of view; analyze the state of a subject included in an image obtained through the camera; determine a photographing angle of view based on the analysis; and provide a preview image and an image at the determined photographing angle of view. Various other embodiments are possible.
PCT/KR2021/001677 2020-02-10 2021-02-09 Dispositif électronique comprenant une caméra, et procédé de prise de vue WO2021162391A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR1020200015898A KR20210101656A (ko) 2020-02-10 2020-02-10 카메라를 포함하는 전자 장치 및 촬영 방법
KR10-2020-0015898 2020-02-10

Publications (1)

Publication Number Publication Date
WO2021162391A1 true WO2021162391A1 (fr) 2021-08-19

Family

ID=77292875

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2021/001677 WO2021162391A1 (fr) 2020-02-10 2021-02-09 Dispositif électronique comprenant une caméra, et procédé de prise de vue

Country Status (2)

Country Link
KR (1) KR20210101656A (fr)
WO (1) WO2021162391A1 (fr)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2009212804A (ja) * 2008-03-04 2009-09-17 Fujifilm Corp 構図アシスト機能付き撮像装置及び該撮像装置における構図アシスト方法
JP2011103550A (ja) * 2009-11-10 2011-05-26 Olympus Imaging Corp 画像撮像装置及び画像撮像方法
JP2011199702A (ja) * 2010-03-23 2011-10-06 Casio Computer Co Ltd カメラ、カメラ制御プログラム及び撮影方法並びに被写体情報送受信システム
JP2014064134A (ja) * 2012-09-20 2014-04-10 Make Softwear:Kk 撮影遊技機、撮影遊技機の制御方法及びコンピュータプログラム
JP2015103945A (ja) * 2013-11-25 2015-06-04 キヤノン株式会社 撮像装置及び画像信号処理方法


Also Published As

Publication number Publication date
KR20210101656A (ko) 2021-08-19

Similar Documents

Publication Publication Date Title
WO2020032473A2 (fr) Dispositif électronique de floutage d'image obtenue par combinaison de plusieurs images sur la base d'informations de profondeur et procédé de pilotage du dispositif électronique
WO2020080845A1 (fr) Dispositif électronique et procédé pour obtenir des images
WO2020130654A1 (fr) Module de caméra ayant une structure multi-cellulaire et dispositif de communication portable le comprenant
WO2020116844A1 (fr) Dispositif électronique et procédé d'acquisition d'informations de profondeur à l'aide au moins de caméras ou d'un capteur de profondeur
WO2019035551A1 (fr) Appareil de composition d'objets à l'aide d'une carte de profondeur et procédé associé
WO2020197070A1 (fr) Dispositif électronique effectuant une fonction selon une entrée de geste et son procédé de fonctionnement
WO2019168374A1 (fr) Procédé de génération d'informations multiples à l'aide d'une caméra pour détecter une largeur de bande d'onde multiple et appareil associé
WO2019125074A1 (fr) Procédé de génération d'image composite à l'aide d'une pluralité d'images ayant différentes valeurs d'exposition, et dispositif électronique prenant le procédé en charge
WO2019160237A1 (fr) Dispositif électronique, et procédé de commande d'affichage d'images
WO2021137555A1 (fr) Dispositif électronique comprenant un capteur d'image et son procédé de fonctionnement
WO2020171492A1 (fr) Procédé de traitement d'image photographique et dispositif électronique associé
WO2020190008A1 (fr) Dispositif électronique pour fonction de focalisation auto, et procédé de commande correspondant
WO2020246710A1 (fr) Procédé de détermination de carte de profondeur et dispositif électronique auquel le même procédé est appliqué
WO2019054610A1 (fr) Dispositif électronique et procédé de commande d'une pluralité de capteurs d'image
WO2021162391A1 (fr) Dispositif électronique comprenant une caméra, et procédé de prise de vue
WO2021215795A1 (fr) Filtre couleur pour dispositif électronique, et dispositif électronique le comportant
WO2021235884A1 (fr) Dispositif électronique et procédé de génération d'image par réalisation d'un awb
WO2022030943A1 (fr) Appareil et procédé de segmentation d'image basés sur un apprentissage profond
WO2021096219A1 (fr) Dispositif électronique comprenant une caméra et son procédé
WO2020159115A1 (fr) Dispositif électronique à plusieurs lentilles, et son procédé de commande
WO2021125875A1 (fr) Dispositif électronique pour fournir un service de traitement d'image à travers un réseau
WO2021194161A1 (fr) Procédé de correction de tremblement au niveau d'un agrandissement élevé et dispositif électronique associé
WO2021080307A1 (fr) Procédé de commande de caméra et dispositif électronique correspondant
WO2021162241A1 (fr) Procédé et dispositif de commande d'un capteur d'image
WO2013103230A1 (fr) Procédé de fourniture d'une interface utilisateur et appareil de photographie d'image l'utilisant

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21753785

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21753785

Country of ref document: EP

Kind code of ref document: A1