US20230087406A1 - Method of controlling camera device and electronic device thereof


Info

Publication number
US20230087406A1
Authority
US
United States
Prior art keywords
camera
electronic device
processor
image
display
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US18/070,130
Inventor
Bonghak Choi
Jungyeob OH
Jongkee Lee
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Samsung Electronics Co Ltd
Original Assignee
Samsung Electronics Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Samsung Electronics Co Ltd filed Critical Samsung Electronics Co Ltd
Priority to US18/070,130
Publication of US20230087406A1
Priority to US18/460,976 (published as US20230412911A1)

Classifications

    • G PHYSICS
        • G06 COMPUTING; CALCULATING OR COUNTING
            • G06F ELECTRIC DIGITAL DATA PROCESSING
                • G06F1/00 Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
                    • G06F1/16 Constructional details or arrangements
                        • G06F1/1613 Constructional details or arrangements for portable computers
                            • G06F1/1626 Constructional details or arrangements for portable computers with a single-body enclosure integrating a flat display, e.g. Personal Digital Assistants [PDAs]
                            • G06F1/1633 Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
                                • G06F1/1684 Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675
                                    • G06F1/1686 Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675 the I/O peripheral being an integrated camera
                • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
                    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
                        • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
                            • G06F3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
                                • G06F3/0486 Drag-and-drop
                            • G06F3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
                                • G06F3/0488 Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, e.g. input of commands through traced gestures
                                    • G06F3/04883 Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser for inputting data by handwriting, e.g. gesture or text
                                    • G06F3/04886 Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
    • H ELECTRICITY
        • H04 ELECTRIC COMMUNICATION TECHNIQUE
            • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
                • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
                    • H04N23/60 Control of cameras or camera modules
                        • H04N23/61 Control of cameras or camera modules based on recognised objects
                        • H04N23/62 Control of parameters via user interfaces
                        • H04N23/63 Control of cameras or camera modules by using electronic viewfinders
                            • H04N23/631 Graphical user interfaces [GUI] specially adapted for controlling image capture or setting capture parameters
                            • H04N23/633 Control of cameras or camera modules by using electronic viewfinders for displaying additional information relating to control or operation of the camera
                                • H04N23/634 Warning indications
                    • H04N23/90 Arrangement of cameras or camera modules, e.g. multiple cameras in TV studios or sports stadiums
                • H04N5/23216
                • H04N5/23218
                • H04N5/232933
                • H04N5/232941
                • H04N5/247

Definitions

  • the present disclosure relates generally to an apparatus and a method for controlling a camera device in an electronic device.
  • portable electronic devices may provide various services such as broadcast services, wireless Internet services, camera services, and music playback services.
  • the electronic device may provide the camera services through a plurality of camera devices to meet various user demands.
  • the electronic device may acquire images or videos through a front camera device disposed on the front surface of the electronic device and a back camera device disposed on the back surface.
  • the electronic device may provide the camera service to a user of the electronic device by executing a camera application to control the plurality of camera devices.
  • the user of the electronic device may be inconvenienced by the multiple controls required to execute the camera application. For example, when the user uses the camera service on an electronic device in which a message application is being executed, the user must make the electronic device enter a standby mode through a first control and then execute the camera application through a second control. As another example, when the user uses the camera service on a locked electronic device, the user must unlock the electronic device through a first control and then execute the camera application through a second control.
  • an aspect of the present disclosure is to provide an electronic device and a method for easily controlling a camera device in an electronic device.
  • another aspect of the present disclosure is to provide an electronic device including a camera device disposed at a location overlapping at least a partial area of a display, and a method for controlling the camera device based on an area that includes the placement area of the camera device or is adjacent to it, so that a user of the electronic device can easily launch and control a camera application.
  • another aspect of the present disclosure is to provide an electronic device including a camera device disposed at a location overlapping at least a partial area of a display, and a method for displaying control information related to the camera device in an area that includes the placement area of the camera device or is adjacent to it, so that the user can easily recognize an operational state of the camera device and a photo can be taken while the user's eyes are drawn toward the camera lens.
  • according to an aspect of the present disclosure, an electronic device includes a touch screen, a camera disposed at a location overlapping a partial area of the touch screen and configured to capture an image through a hole formed in a layer of the touch screen, and a processor configured to receive, via the touch screen, a touch input on a portion of the touch screen, a contact area of the received touch input including at least a part of an area where the hole is formed, perform an operation related with the camera based on the touch input, and display an image received through the camera on the touch screen based on the touch input.
  • according to another aspect of the present disclosure, a method is provided for operating an electronic device comprising a camera configured to capture an image through a hole formed in a layer of a touch screen.
  • the method includes receiving, via the touch screen, a touch input on a portion of the touch screen, a contact area of the received touch input including at least a part of an area where the hole is formed, performing an operation related with the camera based on the touch input, and displaying an image received through the camera on the touch screen based on the touch input.
  • FIGS. 1 A and 1 B illustrate a configuration of an electronic device, according to an embodiment of the present disclosure
  • FIG. 2 is a block diagram of an electronic device in a network environment, according to an embodiment of the present disclosure
  • FIG. 3 is a flowchart of a process for controlling a camera device in an electronic device, according to an embodiment of the present disclosure
  • FIG. 4 is a flowchart of a process for controlling a camera device in an electronic device when a screen of the electronic device is turned off, according to an embodiment of the present disclosure
  • FIG. 5 is a flowchart of a process for detecting touch information of a camera control area in an electronic device, according to an embodiment of the present disclosure
  • FIGS. 6 A to 6 D illustrate a screen configuration for controlling a camera device in an electronic device when a screen of the electronic device is turned off, according to an embodiment of the present disclosure
  • FIG. 7 is a flowchart of a process for configuring a camera display area in an electronic device based on touch information of a camera control area, according to an embodiment of the present disclosure
  • FIG. 8 is a flowchart of a process for controlling a camera device in an electronic device when a camera application is linked with another application, according to an embodiment of the present disclosure
  • FIGS. 9 A to 9 E illustrate a screen configuration for controlling a camera device in an electronic device when a camera application is linked with another application, according to an embodiment of the present disclosure
  • FIG. 10 is a flowchart of a process for controlling a camera device in an electronic device based on touch maintaining information of a camera control area, according to an embodiment of the present disclosure
  • FIGS. 11 A and 11 B illustrate a screen configuration for controlling a camera device in an electronic device based on touch maintaining information of a camera control area, according to an embodiment of the present disclosure
  • FIG. 12 is a flowchart of a process for setting a timer of a camera device based on touch information of a camera control area in an electronic device, according to an embodiment of the present disclosure
  • FIGS. 13 A to 13 C illustrate a screen configuration for setting a timer of a camera device based on touch information of a camera control area in an electronic device, according to an embodiment of the present disclosure
  • FIG. 14 is a flowchart of a process for providing a flash effect in an electronic device, according to an embodiment of the present disclosure
  • FIGS. 15 A to 15 D illustrate a screen configuration for providing a flash effect in an electronic device, according to an embodiment of the present disclosure
  • FIGS. 17 A and 17 B illustrate a screen configuration for providing a camera service through an application in an electronic device, according to an embodiment of the present disclosure
  • FIG. 18 is a flowchart of a process for providing a multi-camera service in an electronic device, according to an embodiment of the present disclosure
  • FIGS. 19 A to 19 F illustrate a screen configuration for providing a multi-camera service in an electronic device, according to an embodiment of the present disclosure
  • FIG. 20 is a flowchart of a process for providing an automatic photographing service in an electronic device, according to an embodiment of the present disclosure
  • FIGS. 21 A to 21 C illustrate a screen configuration for providing an automatic photographing service in an electronic device, according to an embodiment of the present disclosure
  • FIG. 22 is a flowchart of a process for controlling a camera device in accordance with a camera application in an electronic device, according to an embodiment of the present disclosure
  • FIGS. 23 A and 23 B illustrate a screen configuration for controlling a camera device in accordance with a camera application in an electronic device, according to an embodiment of the present disclosure
  • FIG. 25 illustrates a screen configuration for controlling a camera device in accordance with photographing setting information of an image in an electronic device, according to an embodiment of the present disclosure
  • FIG. 26 is a flowchart of a process for capturing an image based on touch information of a camera control area in an electronic device, according to an embodiment of the present disclosure
  • FIG. 27 is a flowchart of a process in which an electronic device photographs video based on touch information of a camera control area in an electronic device, according to an embodiment of the present disclosure
  • FIG. 28 is a flowchart of a process for displaying camera driving limit information in an electronic device, according to an embodiment of the present disclosure
  • FIG. 29 illustrates a screen configuration for displaying camera driving limit information in an electronic device, according to an embodiment of the present disclosure
  • FIG. 30 is a flowchart of a process for providing a video call service in an electronic device, according to an embodiment of the present disclosure
  • FIG. 31 illustrates a screen configuration for providing a video call service in an electronic device, according to an embodiment of the present disclosure
  • FIGS. 33 A to 33 D illustrate a screen configuration for displaying human body recognition service information in an electronic device according to an embodiment of the present disclosure
  • FIG. 34 is a flowchart of a process for displaying pollution level information of a camera device in an electronic device, according to an embodiment of the present disclosure.
  • FIG. 35 illustrates a screen configuration for displaying pollution level information of a camera device in an electronic device, according to an embodiment of the present disclosure.
  • the expressions "A or B", "at least one of A and/or B", and "one or more of A and/or B" used in the various embodiments of the present disclosure include any and all combinations of the words enumerated with them. For example, "A or B", "at least one of A and B", or "at least one of A or B" means (1) including A, (2) including B, or (3) including both A and B.
  • although terms such as "first" and "second" may modify various elements of various embodiments, these terms do not limit the corresponding elements. For example, these terms do not limit an order and/or importance of the corresponding elements. These terms may be used for the purpose of distinguishing one element from another element.
  • a first user device and a second user device both indicate user devices and may indicate different user devices.
  • a first element may be referred to as a second element without departing from the scope of the present disclosure, and similarly, a second element may be referred to as a first element.
  • when an element (e.g., a first element) is referred to as being connected or coupled to another element (e.g., a second element), the first element may be directly connected or coupled to the second element, or there may be an intervening element (e.g., a third element) between the first element and the second element.
  • a processor configured to (set to) perform A, B, and C may be a dedicated processor, e.g., an embedded processor, for performing a corresponding operation, or a generic-purpose processor, e.g., a central processing unit (CPU) or an application processor (AP), capable of performing a corresponding operation by executing one or more software programs stored in a memory device.
  • An electronic device may include at least one of a smartphone, a tablet personal computer (PC), a mobile phone, a video phone, an electronic book (e-book) reader, a laptop PC, a netbook computer, a workstation, a personal digital assistant (PDA), a portable multimedia player (PMP), an MP3 player, a mobile medical appliance, a camera, and a wearable device (e.g., smart glasses, a head-mounted-device (HMD), electronic clothes, an electronic bracelet, an electronic necklace, an electronic appcessory, an electronic tattoo, a smart mirror, or a smart watch).
  • the electronic device may include at least one of various medical devices (e.g., various portable medical measuring devices (a blood glucose monitoring device, a heart rate monitoring device, a blood pressure measuring device, a body temperature measuring device, etc.)), a navigation device, a global positioning system (GPS) receiver, an event data recorder (EDR), a flight data recorder (FDR), a vehicle infotainment device, an electronic device for a ship (e.g., a navigation device, and a gyro-compass), an avionics device, a security device, or Internet of Things (IoT) device (e.g., a light bulb, various sensors, an electric or gas meter, a sprinkler device, a fire alarm, a thermostat, a streetlamp, a toaster, a sporting goods, a hot water tank, a heater, a boiler, etc.).
  • the term “user” may indicate a person who uses an electronic device or a device (e.g., an artificial intelligence electronic device) that uses an electronic device.
  • FIGS. 1 A and 1 B illustrate a configuration of an electronic device, according to an embodiment of the present disclosure.
  • an electronic device 100 is provided.
  • the electronic device 100 may be configured as one body.
  • the electronic device 100 may be an electronic device for communication including a speaker device 130 and a microphone device 140 for a voice call.
  • the electronic device 100 may have a front surface configured by a touch screen 110 .
  • a camera device 120 may be disposed on at least some areas of the touch screen 110 .
  • the speaker device 130 may be disposed on at least one surface adjacent to the touch screen 110 (for example, an upper side surface, lower side surface, left side surface, and right side surface).
  • the speaker device 130 may be disposed on the upper side surface adjacent to the touch screen 110 close to the user's ear for a voice call.
  • Control buttons (for example, a home button and a back button) for controlling the electronic device 100 may be displayed in a lower area of the touch screen 110 .
  • the touch screen 110 of the electronic device 100 may include a front window 140 , a touch panel 150 , a display module 160 , and a printed circuit board (PCB) 170 , as illustrated in FIG. 1 B .
  • the camera device (for example, a front camera device) 180 of the electronic device 100 may be mounted on the PCB.
  • the front window 140 may be a transparent material window film that forms an external surface of the touch screen 110 .
  • the PCB 170 may be a flexible PCB (FPCB), which is an electronic component made by forming a conductive circuit of a material having good electrical conductivity (e.g., copper) on an insulator.
  • the camera device 180 may be disposed at a position overlapping at least some areas 152 of the touch panel 150 .
  • at least some areas 152 of the touch panel 150 on which the camera device 180 is disposed may be perforated.
  • the touch screen 110 may have a limited touch recognition function in the area 152 on which the camera device 180 is disposed.
  • at least some areas 152 of the touch panel 150 on which the camera device 180 is disposed are not perforated, and a touch pattern for touch recognition may be omitted in the corresponding areas 152 .
  • the touch screen 110 may have a limited touch recognition function in the area 152 on which the camera device 180 is disposed.
  • the touch panel 150 on which the camera device 180 is disposed are not perforated, and the touch pattern for touch recognition may be set in the corresponding areas 152 .
  • the touch screen 110 may detect a touch input through the areas 152 on which the camera device 180 is disposed.
  • the touch pattern may include an electrode for the touch recognition.
  • the camera device 180 may be disposed at a position overlapping some areas 162 of the display module 160 .
  • at least some areas 162 of the display module 160 on which the camera device 180 is disposed may be perforated.
  • the touch screen 110 may have a limited display function on the areas 162 in which the camera device 180 is disposed.
  • at least some areas 162 of the display module 160 on which the camera device 180 is disposed are not perforated, and a display component may not be disposed in the corresponding areas 162 .
  • the touch screen 110 may have a limited display function on the areas 162 in which the camera device 180 is disposed.
  • At least some areas 162 of the display module 160 on which the camera device 180 is disposed may not be perforated, and a display component may be disposed.
  • the touch screen 110 may display information through the areas 162 in which the camera device 180 is disposed.
  • the electronic device 100 may form at least one hole in at least some areas (upper end) of the touch screen 110 and place the speaker device 130 for a voice call service in the at least one hole.
  • FIG. 2 is a block diagram of an electronic device in a network environment, according to an embodiment of the present disclosure.
  • the electronic device 201 may include a bus 210 , a camera device 220 , a processor 230 (e.g., including processing circuitry), a memory 240 , an input/output interface 260 (e.g., including input/output circuitry), a display 270 (e.g., including display circuitry), and a communication interface 280 (e.g., including communication circuitry).
  • the electronic device 201 may omit at least one of the elements, or may further include other elements.
  • the bus 210 is a circuit that interconnects the elements 220 to 280 and transfers communication (for example, control messages and/or data) between the elements.
  • the camera device 220 may collect image information of a subject.
  • the camera device 220 may include a plurality of camera devices included in the electronic device 201 .
  • the camera device 220 may include a first camera device (for example, front camera device) for performing photography in a selfie mode and a second camera device (for example, back camera device) for photographing a subject located in front of the user.
  • the camera device 220 may be disposed to be included in at least some areas of the display 270 .
  • an image sensor of the first camera device may be disposed in at least some areas of the display 270 .
  • the image sensor may use a charge-coupled device (CCD) image sensor or a complementary metal-oxide semiconductor (CMOS) image sensor.
  • the processor 230 may include one or more of a central processing unit (CPU), an application processor (AP), and a communication processor (CP). For example, the processor 230 may execute calculations or data processing about controls and/or communication of at least one other element of the electronic device 201 . The processor 230 may perform various functions of the electronic device 201 . Accordingly, the processor 230 may control the elements of the electronic device 201 .
  • the processor 230 may control the camera device 220 based on touch information of a preset camera control area to control the camera device 220 . For example, when some areas of the touch panel are perforated to place the camera device 220 , the processor 230 may set at least some areas adjacent to the placement areas of the camera device 220 on the touch panel as camera control areas. For example, when some areas of the touch panel corresponding to the placement areas of the camera device 220 are not perforated and a touch pattern is omitted in the corresponding areas, the processor 230 may set at least some areas adjacent to the placement areas of the camera device 220 on the touch panel as camera control areas.
  • the processor 230 may set the placement areas of the camera device 220 on the touch panel and at least some areas adjacent to the placement areas of the camera device 220 as camera control areas.
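To make the control-area geometry above concrete, the following sketch expands an assumed camera placement rectangle by an adjacent margin to form the camera control area and tests whether a touch coordinate falls inside it. This is a minimal Kotlin/Android sketch, not the disclosed implementation; the coordinates, the margin, and the helper name are illustrative assumptions.

```kotlin
import android.graphics.Rect

// Hypothetical camera hole location in touch-panel coordinates; the values
// below are illustrative assumptions, not taken from the disclosure.
val cameraPlacementArea = Rect(520, 40, 560, 80)

// Assumed margin: how far the "adjacent" part of the control area extends
// beyond the placement area of the camera device.
const val ADJACENT_MARGIN_PX = 48

// The camera control area covers the placement area plus its adjacent margin.
val cameraControlArea: Rect = Rect(cameraPlacementArea).apply {
    inset(-ADJACENT_MARGIN_PX, -ADJACENT_MARGIN_PX)   // negative inset expands
}

// True when a touch coordinate should be routed to the camera control logic.
fun isCameraControlTouch(x: Int, y: Int): Boolean = cameraControlArea.contains(x, y)
```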
  • the processor 230 may drive the camera device 220 based on a touch and a drag input in the camera control area. For example, when the touch input in the camera control area is detected, the processor 230 may control the display 270 to display camera activation information in a display area corresponding to the camera control area. For example, when the display 270 is deactivated, the processor 230 may maintain the touch recognition function of the camera control area in an active state. Accordingly, the processor 230 may detect the touch input in the camera control area in an inactive state of the display 270 . When the drag input of the camera activation information is detected, the processor 230 may execute a camera application to start a front camera mode.
  • the processor 230 may set a camera display area to display a service screen of the camera application based on a distance of the drag input.
  • the processor 230 may control the display 270 to display the service screen of the camera application (for example, a preview image acquired through the camera device 220 ) in the camera display area.
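The touch-and-drag flow just described could be wired to an Android touch listener roughly as follows. This sketch reuses isCameraControlTouch from the earlier snippet; the CameraUi hooks (showing the activation badge, resizing the preview, starting the front camera mode) are hypothetical names standing in for whatever the implementation provides.

```kotlin
import android.annotation.SuppressLint
import android.view.MotionEvent
import android.view.View
import kotlin.math.hypot

// Hypothetical hooks into the camera UI; the names are assumptions.
interface CameraUi {
    fun showCameraActivationInfo()               // badge drawn near the camera hole
    fun resizePreviewArea(dragDistancePx: Float) // grow the camera display area
    fun startFrontCameraMode()                   // execute the camera application
    fun dismissActivationInfo()
}

@SuppressLint("ClickableViewAccessibility")
fun attachCameraControlListener(view: View, ui: CameraUi) {
    var downX = 0f
    var downY = 0f
    view.setOnTouchListener { _, event ->
        when (event.actionMasked) {
            MotionEvent.ACTION_DOWN -> {
                // Only react to touches that begin inside the camera control area.
                val hit = isCameraControlTouch(event.x.toInt(), event.y.toInt())
                if (hit) {
                    downX = event.x; downY = event.y
                    ui.showCameraActivationInfo()
                }
                hit
            }
            MotionEvent.ACTION_MOVE -> {
                // The camera display area follows the drag distance.
                ui.resizePreviewArea(hypot(event.x - downX, event.y - downY))
                true
            }
            MotionEvent.ACTION_UP -> {
                ui.startFrontCameraMode()        // commit on release
                true
            }
            MotionEvent.ACTION_CANCEL -> {
                ui.dismissActivationInfo()
                true
            }
            else -> false
        }
    }
}
```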
  • the processor 230 may drive the camera device 220 based on a touch and a touch maintaining time in the camera control area. For example, when the touch input in the camera control area is detected, the processor 230 may control the display 270 to display camera activation information in the display area corresponding to the placement area of the camera device 220 . When the touch maintaining time of the camera activation information exceeds a reference time, the processor 230 may execute the camera application to start the front camera mode, for example. In this case, the processor 230 may control the display 270 to display the preview image acquired through the front camera device.
  • the processor 230 may control the camera application to be linked with another application. For example, when the touch and the drag input in the camera control area are detected in a state where a service screen of another application is displayed, the processor 230 may display the service screen of the camera application in at least some areas of the display 270 based on a distance of the drag input. That is, the processor 230 may divide the display 270 into a first area and a second area based on the distance of the drag input. The processor 230 may control the display 270 to display a service screen of another application in the first area of the display 270 and to display a service screen of the camera application in the second area. When an image is captured (or acquired) through the camera application, the processor 230 may determine whether the camera application can be linked with the other application.
  • the processor 230 may set the image captured through the camera application as contents to be controlled in the other application.
  • the processor 230 may store the image captured through the camera application in the memory 240 .
  • the processor 230 may end the camera application when the image is captured.
  • the processor 230 may set a timer of the camera device 220 to capture an image based on touch information (for example, at least one of the touch input and the drag input) in the camera control area. For example, when the drag input in the camera control area is detected while the camera application is executed (for example, in the front camera mode), the processor 230 may set the timer of the camera device 220 to correspond to a drag distance. That is, the processor 230 may set a time of the timer in proportion to the drag distance.
  • the processor 230 may control the display 270 to display timer information based on the placement area of the camera device 220 .
  • the processor 230 may continuously reduce a size of the timer information displayed on the display 270 in accordance with the elapsing of the time of the timer.
  • the processor 230 may capture an image. For example, when the touch input in the camera control area is detected while the camera application is executed (for example, in the front camera mode), the processor 230 may set the timer of the camera device 220 based on a touch position. That is, the processor 230 may set the time of the timer in proportion to a distance between the placement area of the camera device 220 and the touch position. For example, the timer of the camera device 220 may include a photographing timer of the camera device 220 to capture an image.
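Both timer rules above reduce to a linear map from a pixel distance to seconds: the drag distance in the first variant, the distance between the camera placement area and the touch position in the second. A minimal sketch, where the pixels-per-second scale and the 10-second cap are assumed tuning constants:

```kotlin
import kotlin.math.hypot
import kotlin.math.roundToInt

// Assumed scale: how many pixels correspond to one second of self-timer.
const val PX_PER_SECOND = 120f

// Timer proportional to the drag distance in the camera control area.
fun timerFromDrag(dragDistancePx: Float): Int =
    (dragDistancePx / PX_PER_SECOND).roundToInt().coerceIn(0, 10)

// Timer proportional to the distance between the camera placement area
// (cameraX, cameraY) and the touch position (touchX, touchY).
fun timerFromTouch(cameraX: Float, cameraY: Float, touchX: Float, touchY: Float): Int =
    timerFromDrag(hypot(touchX - cameraX, touchY - cameraY))
```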
  • the processor 230 may change a color of the display 270 to secure an amount of light to capture the image. For example, when the image is captured through the front camera device, the processor 230 may change the color of the display 270 into a bright color (for example, white) based on the placement area of the camera device 220 and provide a flash effect. For example, the processor 230 may apply various image effects by changing the color of the display 270 in accordance with a user input.
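One plausible realization of this display flash on Android is to show a white overlay view and push the window brightness to maximum around the capture. This is a sketch only; the full-screen overlay view and the synchronous capture callback are assumptions, not the disclosed implementation.

```kotlin
import android.app.Activity
import android.graphics.Color
import android.view.View

// Simulated front flash: whiten the screen and force maximum window brightness
// while the image is captured, then restore both. `flashOverlay` is an assumed
// full-screen View in the camera layout; `capture` triggers the actual shot.
fun Activity.withDisplayFlash(flashOverlay: View, capture: () -> Unit) {
    val attrs = window.attributes
    val previousBrightness = attrs.screenBrightness
    attrs.screenBrightness = 1.0f        // maximum brightness for this window only
    window.attributes = attrs
    flashOverlay.setBackgroundColor(Color.WHITE)
    flashOverlay.visibility = View.VISIBLE

    capture()

    flashOverlay.visibility = View.GONE
    attrs.screenBrightness = previousBrightness
    window.attributes = attrs
}
```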
  • the processor 230 may control the display 270 to display additional information for a camera service through the camera control area. For example, when the image is captured through the front camera device, the processor 230 may control the display 270 to display a graphic effect (for example, wavelength image) based on the placement area of the camera device 220 to induce a user's eyes to the front camera device. For example, when video is photographed through the back camera device, the processor 230 may control the display 270 to display audio input information based on the placement area of the camera device 220 . For example, the processor 230 may control a size of audio input information to correspond to a size of an audio signal collected through the microphone device 140 while the video is photographed.
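The audio input indicator could be driven by the recorder's peak amplitude, polled while the video is recorded. A sketch assuming a MediaRecorder-based recording path; the polling interval and the 2x scale cap are illustrative choices.

```kotlin
import android.media.MediaRecorder
import android.os.Handler
import android.os.Looper
import android.view.View

// Scales an indicator view drawn near the camera hole with the microphone's
// peak amplitude while video is recorded. Returns the poll task so the caller
// can remove its callbacks when recording stops.
fun animateAudioIndicator(recorder: MediaRecorder, indicator: View): Runnable {
    val handler = Handler(Looper.getMainLooper())
    val poll = object : Runnable {
        override fun run() {
            // getMaxAmplitude() reports the peak since the previous call (0..32767).
            val level = recorder.maxAmplitude / 32767f
            val scale = 1f + level           // grow the indicator up to 2x
            indicator.scaleX = scale
            indicator.scaleY = scale
            handler.postDelayed(this, 100)   // refresh ~10 times per second
        }
    }
    handler.post(poll)
    return poll
}
```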
  • the processor 230 may execute the camera application based on touch information of an application icon. For example, when at least one of a touch input and a drag input for the application icon is detected, the processor 230 may identify whether the application icon enters the camera control area. When the application icon enters the camera control area, the processor 230 may identify whether an application corresponding to the application icon is linked with the camera application. When the application corresponding to the application icon is linked with the camera application, the processor 230 may execute a camera function (for example, front camera mode) of the application corresponding to the application icon. For example, when the touch input for the application icon is released within the camera control area, the processor 230 may determine that the application icon enters the camera control area.
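A launcher-side sketch of the icon-drag interaction above: releasing an application icon inside the camera control area counts as entering it, and the dropped application's camera function is opened if one exists. The two helpers are hypothetical and passed in as parameters.

```kotlin
// `hasCameraFunction` asks whether the dropped application exposes a camera
// function linked with the camera application; `launchCameraFunction` opens it
// (for example, in the front camera mode). Both are assumed hooks.
fun onIconDragReleased(
    packageName: String,
    dropX: Int,
    dropY: Int,
    hasCameraFunction: (String) -> Boolean,
    launchCameraFunction: (String) -> Unit,
) {
    // Releasing the touch inside the control area counts as "entering" it.
    if (!isCameraControlTouch(dropX, dropY)) return
    if (hasCameraFunction(packageName)) launchCameraFunction(packageName)
}
```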
  • the processor 230 may execute a multi-camera mode based on touch information of the camera control area.
  • the processor 230 may provide a camera service of one of a plurality of camera devices.
  • the processor 230 may switch to the multi-camera mode in which the plurality of camera devices are simultaneously activated.
  • the processor 230 may additionally activate the back camera device and execute the multi-camera mode.
  • the processor 230 may overlap the preview images acquired through the front camera device and the back camera device on the display 270 , or display the preview images in different areas.
  • the processor 230 may switch positions of the preview images.
  • the processor 230 may control sizes of the preview images based on input information detected through the input/output interface 260 .
  • the processor 230 may provide an automatic photographing service based on at least one of a location and an angle of the electronic device 201 in the front camera mode. For example, when the automatic photographing mode is set, the processor 230 may display a camera image corresponding to the location and the angle of the electronic device 201 to be adjacent to the placement area of the camera device 220 . That is, the processor 230 may display the camera image corresponding to the location and the angle of the electronic device 201 to allow the user to control the location and the angle of the electronic device 201 to match photographing information. When the location and the angle of the electronic device 201 match the photographing information, the processor 230 may automatically capture the image. For example, the photographing information may be set by a user input or may include at least one of the location and the angle of the electronic device 201 that match the image acquired through the front camera mode.
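The automatic photographing condition above amounts to comparing the current device pose against target photographing information within a tolerance. A sketch with an assumed pose representation (azimuth/pitch/roll in degrees, e.g., as derived from SensorManager.getRotationMatrix followed by SensorManager.getOrientation) and an assumed 3-degree tolerance; the capture and guidance hooks are hypothetical parameters.

```kotlin
import kotlin.math.abs

// Assumed pose representation in degrees.
data class DevicePose(val azimuth: Float, val pitch: Float, val roll: Float)

const val POSE_TOLERANCE_DEG = 3f   // assumed matching tolerance

fun poseMatches(current: DevicePose, target: DevicePose): Boolean =
    abs(current.azimuth - target.azimuth) <= POSE_TOLERANCE_DEG &&
    abs(current.pitch - target.pitch) <= POSE_TOLERANCE_DEG &&
    abs(current.roll - target.roll) <= POSE_TOLERANCE_DEG

// Called on every sensor update in the automatic photographing mode: show
// guidance near the camera hole until the pose matches, then capture.
fun onPoseChanged(
    current: DevicePose,
    target: DevicePose,
    captureImage: () -> Unit,
    showPoseGuide: (DevicePose, DevicePose) -> Unit,
) {
    if (poseMatches(current, target)) captureImage() else showPoseGuide(current, target)
}
```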
  • the processor 230 may set the camera application to control the camera device 220 based on touch information of the camera control area. For example, when a drag input in a first direction (for example, a horizontal direction) in the camera control area is detected, the processor 230 may control the display 270 to display a camera application list installed in the electronic device 201 . The processor 230 may select one first camera application based on input information detected through the input/output interface 260 . The processor 230 may control the camera device 220 by executing the first camera application. That is, the processor 230 may drive the camera device 220 based on camera setting information set to the first camera application. In addition, the processor 230 may set the first camera application as a basic camera application.
  • the processor 230 may execute the first camera application.
  • the camera setting information may include at least one of a filter for photographing, a photographing mode, and a photographing setting value (for example, aperture, shutter speed, image size, and the like).
  • the processor 230 may control the camera device 220 in accordance with the camera setting information of the image.
  • the processor 230 may control the display 270 to display a list of images stored in the memory 240 .
  • the processor 230 may control the display 270 to display corresponding filter information in the image to which a filter is applied.
  • the processor 230 may identify whether the first image enters the camera control area.
  • the processor 230 may drive the camera device 220 in accordance with camera setting information of the first image. For example, when a touch input for the first image is released within the camera control area, the processor 230 may determine that the first image enters the camera control area.
  • the processor 230 may capture an image based on touch information of the camera control area. For example, when a double tap input in the camera control area is detected, the processor 230 may capture an image through the camera device 220 without executing the camera application.
  • the processor 230 may photograph video based on touch information of the camera control area. For example, when a touch maintaining time of the camera control area exceeds a reference time, the processor 230 may photograph video through the camera device 220 without executing the camera application. When the touch input in the camera control area is released, the processor 230 may end the photographing of the video. For example, when the touch maintaining time in the camera control area exceeds the reference time, the processor 230 may output notification information to allow the user to recognize the start of the video photographing.
  • the notification information may include at least one of a notification sound, a notification message, and a vibration.
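The two app-less shortcuts above (double tap to capture, hold to record until the touch is released) map naturally onto Android's GestureDetector. A sketch; the four constructor callbacks are hypothetical hooks into the capture pipeline, and the gesture-to-function mapping follows the paragraphs above.

```kotlin
import android.content.Context
import android.view.GestureDetector
import android.view.MotionEvent

class CameraControlGestures(
    context: Context,
    private val captureImage: () -> Unit,      // still capture, no camera app UI
    private val notifyVideoStart: () -> Unit,  // notification sound, message, or vibration
    private val startVideo: () -> Unit,
    private val stopVideo: () -> Unit,
) : GestureDetector.SimpleOnGestureListener() {

    private val detector = GestureDetector(context, this)
    private var recording = false

    override fun onDoubleTap(e: MotionEvent): Boolean {
        captureImage()
        return true
    }

    override fun onLongPress(e: MotionEvent) {
        notifyVideoStart()            // let the user know recording is starting
        startVideo()
        recording = true
    }

    // Route touch events from the camera control area through this method.
    fun onTouchEvent(event: MotionEvent): Boolean {
        if (event.actionMasked == MotionEvent.ACTION_UP && recording) {
            stopVideo()               // releasing the touch ends the recording
            recording = false
        }
        return detector.onTouchEvent(event)
    }
}
```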
  • when driving of the camera device 220 is limited, the processor 230 may display driving limit information adjacent to the placement area of the camera device 220 .
  • the processor 230 may execute a camera setting menu.
  • the processor 230 may display notification information of a communication service using an image to be adjacent to the placement area of the camera device 220 . For example, when a video call signal is received, the processor 230 may display video call notification information to be adjacent to the placement area of the camera device 220 . The processor 230 may determine whether to accept the video call based on touch information of an area where the video call notification information is displayed.
  • the processor 230 may display human body recognition information (for example, face recognition) to be adjacent to the placement area of the camera device 220 .
  • the processor 230 may display, based on the placement area of the camera device 220 , time information indicating how long the user should look at the camera device 220 for the iris recognition.
  • the processor 230 may further display progress time information of the iris recognition. For example, when the time information required for the iris recognition matches the progress time information of the iris recognition, the processor 230 may complete the iris recognition.
  • the processor 230 may display pollution level information of the camera device 220 to be adjacent to the placement area of the camera device 220 .
  • the processor 230 may estimate a pollution level of the image sensor of the camera device 220 by detecting the definition of the image acquired through the camera device 220 .
  • the processor 230 may display the pollution level information of the image sensor of the camera device 220 to be adjacent to the placement area of the camera device 220 .
  • the processor 230 may display a guide image to induce a touch of another area adjacent to the placement area of the camera device 220 .
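The disclosure estimates the pollution level from the definition (sharpness) of captured images. One common sharpness proxy, shown here as an assumption rather than as the disclosed method, is the variance of the Laplacian over a grayscale frame; a persistently low value across many scenes suggests a contaminated lens or sensor window.

```kotlin
// Variance of the Laplacian over a grayscale frame; only relative magnitude
// matters. Higher variance means sharper detail.
fun laplacianVariance(gray: Array<FloatArray>): Double {
    val h = gray.size
    val w = gray[0].size
    var sum = 0.0
    var sumSq = 0.0
    var n = 0
    for (y in 1 until h - 1) {
        for (x in 1 until w - 1) {
            val lap = gray[y - 1][x] + gray[y + 1][x] +
                      gray[y][x - 1] + gray[y][x + 1] - 4f * gray[y][x]
            sum += lap
            sumSq += lap.toDouble() * lap
            n++
        }
    }
    val mean = sum / n
    return sumSq / n - mean * mean
}

// Assumed threshold; in practice it would be tuned per sensor and averaged
// over many frames before warning the user near the camera hole.
fun looksContaminated(gray: Array<FloatArray>): Boolean = laplacianVariance(gray) < 10.0
```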
  • the memory 240 may include a volatile memory and/or a non-volatile memory.
  • the memory 240 may store instructions or data related to at least one other element of the electronic device 201 .
  • the memory 240 may store software and/or a program 250 .
  • the program 250 may include a kernel 251 , middleware 253 , an application programming interface (API) 255 , and an application program 257 .
  • At least some of the kernel 251 , the middleware 253 , and the API 255 may be referred to as an operating system (OS).
  • the input/output interface 260 may function as an interface that may transfer instructions or data input from a user or another external device to the other elements of the electronic device 201 . Furthermore, the input/output interface 260 may output instructions or data, which are received from the other elements of the electronic device 201 , to the user or the external device.
  • the input/output interface 260 may include a touch panel that detects a touch input or a hovering input using an electronic pen or a user's body part.
  • the input/output interface 260 may receive a gesture or a proximity input using an electronic pen or a user's body part.
  • the display 270 may display various types of contents (for example, text, images, videos, icons, symbols, or the like) to a user. For example, at least some areas (for example, upper areas) of the display 270 may be perforated for placement of the camera device 220 . Accordingly, the display 270 may limit the display function in the placement area of the camera device 220 . According to an embodiment, the display 270 may be implemented by a touch screen coupled with the touch panel of the input/output interface 260 .
  • the communication interface 280 may establish communication between the electronic device 201 and an external device.
  • the communication interface 280 may communicate with a first external electronic device 202 through short-range communication 284 or wired communication.
  • the communication interface 280 may be connected to a network 282 through wireless or wired communication to communicate with a second external electronic device 204 or a server 206 .
  • the network 282 may include at least one of a communication network, a computer network (for example, a LAN or a WAN), the Internet, and a telephone network.
  • FIG. 3 is a flowchart of a process for controlling a camera device in an electronic device, according to an embodiment of the present disclosure.
  • the electronic device may include the electronic device 201 or at least a part (for example, processor 230 ) of the electronic device 201 .
  • the electronic device 201 identifies whether a touch input for the camera control area related to the placement area of the camera device 220 is detected on the touch screen in operation 301 .
  • the camera control area may be disposed on at least some areas adjacent to the placement area of the camera device 220 on the touch screen.
  • the camera control area may be disposed on the placement area of the camera device 220 and at least some areas adjacent to the placement area of the camera device 220 on the touch screen.
  • the electronic device 201 detects a control function of the camera device 220 corresponding to the touch input in operation 303 .
  • the processor 230 detects the control function of the camera device 220 based on at least one of the number of touches in the camera control area, a drag distance (i.e., touch motion distance), a drag direction (i.e., touch motion direction), and a touch maintaining time.
  • the control function of the camera device 220 may include at least one of driving of the camera device 220 , selection of an application for driving the camera device 220 , camera setting information, image capturing, video photographing, timer setting, and camera mode switching.
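Operation 303 is essentially a dispatch from raw touch features to one of the listed control functions. The sketch below makes that mapping concrete; the enum, the thresholds, and the specific feature-to-function assignments are assumptions chosen to mirror examples elsewhere in the disclosure (double tap to capture, long hold for video, horizontal drag for the application list).

```kotlin
// Illustrative dispatch for operation 303; names and thresholds are assumed.
enum class CameraControl { DRIVE_CAMERA, CAPTURE_IMAGE, RECORD_VIDEO, SET_TIMER, LIST_APPS }

data class TouchFeatures(
    val tapCount: Int,
    val dragDistancePx: Float,
    val horizontalDrag: Boolean,
    val holdMillis: Long,
)

fun detectControlFunction(t: TouchFeatures): CameraControl = when {
    t.tapCount == 2 -> CameraControl.CAPTURE_IMAGE      // double tap
    t.holdMillis > 1000 -> CameraControl.RECORD_VIDEO   // long hold
    t.horizontalDrag -> CameraControl.LIST_APPS         // horizontal drag
    t.dragDistancePx > 0f -> CameraControl.SET_TIMER    // other drag
    else -> CameraControl.DRIVE_CAMERA                  // simple touch
}
```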
  • the electronic device 201 drives the camera device 220 based on the control function of the camera device 220 corresponding to the touch input in the camera control area in operation 305 .
  • the processor 230 controls the camera device 220 by executing the camera application in accordance with the control function of the camera device 220 .
  • FIG. 4 is a flowchart of a process for controlling a camera device in an electronic device when a screen of the electronic device is turned off, according to an embodiment of the present disclosure.
  • FIGS. 6 A to 6 D illustrate a screen configuration for controlling a camera device in an electronic device when a screen of the electronic device is turned off, according to an embodiment of the present disclosure.
  • the electronic device may include the electronic device 201 or at least a part (for example, processor 230 ) of the electronic device 201 .
  • the electronic device 201 determines whether a first touch input is detected through a camera control area set based on a placement area of the camera device 220 on the touch screen in operation 401 .
  • the processor 230 maintains a touch recognition function of the camera control area in an active state. The processor 230 determines whether a first type touch input is detected through the camera control area.
  • the first type touch input may correspond to a touch in which the user rubs the camera control area, i.e., a touch input having a continuously changing drag direction.
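The first-type "rubbing" touch can be recognized by counting direction reversals in the move stream. A minimal sketch tracking only the horizontal component; the number of reversals required is an assumed threshold.

```kotlin
import kotlin.math.sign

// Recognizes a rub: a drag whose horizontal direction keeps reversing over
// the camera control area. Feed it the x coordinate of each ACTION_MOVE.
class RubDetector(private val reversalsNeeded: Int = 3) {
    private var lastDx = 0f
    private var lastX = Float.NaN
    private var reversals = 0

    fun onMove(x: Float): Boolean {
        if (!lastX.isNaN()) {
            val dx = x - lastX
            if (dx != 0f && lastDx != 0f && sign(dx) != sign(lastDx)) reversals++
            if (dx != 0f) lastDx = dx
        }
        lastX = x
        return reversals >= reversalsNeeded   // true once the rub is recognized
    }
}
```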
  • the touch recognition function of the camera control area may be activated or deactivated based on the type of an application driven in the electronic device 201 .
  • when the first touch input is not detected through the camera control area, the electronic device 201 terminates the operation for controlling the driving of the camera device 220 .
  • when the first touch input is detected, the electronic device 201 displays camera activation information in a display area corresponding to the camera control area in operation 403.
  • the processor 230 may display camera activation information 620 based on the placement area of the camera device 220 , as illustrated in FIG. 6 B .
  • the electronic device 201 determines whether a second touch input for the camera activation information is detected in operation 405 .
  • the processor 230 determines whether a drag input 630 for the camera activation information 620 is detected within a reference time from a time point when the camera activation information 620 is displayed, as illustrated in FIG. 6 B .
  • when the second touch input is not detected within the reference time, the electronic device 201 may determine to not drive the camera device 220 . Accordingly, the electronic device 201 may terminate the operation for controlling driving of the camera device 220 .
  • when the second touch input for the camera activation information is detected, the electronic device 201 drives the camera device 220 in operation 407.
  • the processor 230 may display at least some of the service screen of the camera application in accordance with a distance of the drag input, as illustrated in FIG. 6 C .
  • the processor 230 may display the service screen of the camera application on the display 270 as indicated by reference numeral 650 in FIG. 6 D .
  • the processor 230 may display a preview image acquired through the front camera device 220 on the display 270 by executing the camera application.
  • the processor 230 may display the service screen (for example, a preview image) of the camera application in at least some areas of the display 270 in accordance with the drag input.
  • otherwise, the electronic device 201 may determine to not drive the camera device 220 . Accordingly, the electronic device 201 may terminate the service screen of the camera application, as illustrated in FIG. 6 A .
  • FIG. 5 is a flowchart of a process for detecting touch information of a camera control area in an electronic device, according to an embodiment of the present disclosure.
  • the electronic device may include the electronic device 201 or at least a part (for example, processor 230 ) of the electronic device 201 .
  • the electronic device 201 determines whether the display 270 is deactivated in operation 501 .
  • the processor 230 determines whether an operation state of the display 270 switches to an inactive state, for example, because the electronic device 201 operates in a low power mode.
  • the electronic device 201 When the display 270 is deactivated, the electronic device 201 maintains the touch recognition function of the camera control area in an active state in operation 503 .
  • the processor 230 maintains a touch recognition function of the camera control area in an active state.
  • the electronic device 201 determines whether a touch input is detected through the camera control area. For example, the processor 230 determines whether a touch input of the type in which the user rubs the touch screen is detected through the camera control area having the activated touch recognition function as illustrated in FIG. 6 A .
  • the electronic device 201 When the touch input is not detected through the camera control area, the electronic device 201 maintains the touch recognition function of the camera control area in the active state in operation 503 .
  • the electronic device 201 determines whether the touch input is detected in operation 507 . For example, when the display 270 is in the active state, the processor 230 maintains the touch recognition function of the touch panel corresponding to the display 270 in the active state. Accordingly, the processor 230 determines whether the touch input is detected through the touch panel in the active state.
  • the electronic device 201 determines whether the display 270 is deactivated again in operation 501 .
  • the electronic device 201 determines whether the touch input is detected through the camera control area in operation 509 .
  • the processor 230 determines whether a touch coordinate of the touch input is included in the camera control area.
  • FIG. 7 is a flowchart of a process for configuring a camera display area in an electronic device based on touch information of a camera control area, according to an embodiment of the present disclosure.
  • the electronic device may include the electronic device 201 or at least a part (for example, processor 230 ) of the electronic device 201 .
  • the electronic device 201 sets the camera display area based on a second touch input in operation 701 .
  • the processor 230 sets at least some areas of the display 270 as the camera display area for displaying the service screen of the camera application in accordance with a drag distance. For example, when the drag distance for the camera activation information 620 exceeds a reference distance, the processor 230 sets the entire area of the display 270 as the camera display area.
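A sketch of the display-area rule in operation 701: the preview region grows with the drag distance and snaps to the full screen once the drag exceeds a reference distance. Treating the reference distance as half the screen height is an assumption for illustration.

```kotlin
import android.graphics.Rect

// Maps a drag distance to the camera display area: the preview grows downward
// from the top edge and becomes full screen past the reference distance.
fun cameraDisplayArea(dragDistancePx: Int, screenWidth: Int, screenHeight: Int): Rect {
    val referenceDistance = screenHeight / 2   // assumed reference distance
    val height = if (dragDistancePx > referenceDistance) screenHeight
                 else dragDistancePx.coerceIn(0, screenHeight)
    return Rect(0, 0, screenWidth, height)
}
```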
  • the electronic device 201 may drive the camera device 220 based on the camera display area in operation 703 .
  • the processor 230 may display a preview image acquired through the front camera device in the camera display area of the display 270 by executing the camera application.
  • FIG. 8 is a flowchart of a process for controlling a camera device in an electronic device when a camera application is linked with another application, according to an embodiment of the present disclosure.
  • FIGS. 9 A to 9 E illustrate a screen configuration for controlling a camera device in an electronic device when a camera application is linked with another application, according to an embodiment of the present disclosure.
  • the electronic device may include the electronic device 201 or at least a part (for example, processor 230 ) of the electronic device 201 .
  • the electronic device 201 may drive a first application among at least one application installed in the electronic device 201 in operation 801 .
  • for example, when a messenger application is selected from at least one application installed in the electronic device 201 based on input information detected through the input/output interface 260 , the processor 230 displays a service screen 900 of the messenger application on the display 270 .
  • the electronic device 201 detects a first touch input for the camera control area in operation 803.
  • the processor 230 may detect a tap input through at least some areas of the touch screen set as the camera control area.
  • when the first touch input is not detected, the electronic device 201 determines to not drive the camera device 220 and terminates the operation for controlling driving of the camera device 220 .
  • the electronic device 201 displays camera activation information in at least some areas of the display corresponding to the camera control area in operation 805 .
  • the processor 230 displays camera activation information 920 based on the placement area of the camera device 220 .
  • the electronic device 201 determines whether a second touch input for the camera activation information is detected in operation 807 .
  • the processor 230 determines whether a drag input is detected through at least some areas of the touch screen where the camera activation information is displayed.
  • the electronic device 201 determines to not drive the camera device 220 . Accordingly, the electronic device 201 terminates the operation for controlling driving of the camera device 220 .
  • the electronic device 201 sets a camera display area in accordance with the second touch input in operation 809 .
  • the processor 230 sets at least some areas of the display 270 as the camera display area in accordance with a drag distance.
  • the electronic device 201 displays driving information (for example, service screen of the camera application) of the camera device 220 in the camera display area in operation 811 .
  • the processor 230 displays a preview image acquired through the front camera device in the camera display area set to at least some areas of the display 270 based on the drag distance as indicated by reference numeral 940 .
  • the processor 230 displays a photographing button 942 at a position where the drag input is released.
  • the electronic device 201 determines whether an event for capturing an image is generated through the camera application in operation 813 .
  • the processor 230 may determine whether a touch input for the photographing button 942 displayed in the camera display area is detected or whether a gesture input mapped to image capturing is detected.
  • When the event for capturing the image is not generated, the electronic device 201 maintains display of the camera driving information in the camera display area in operation 811.
  • the electronic device determines whether the camera application and a first application are linked to each other in operation 815 .
  • the processor 230 determines whether the first application provides a service using the image captured through the camera application.
  • the electronic device 201 links the image captured through the camera application with the first application in operation 817 .
  • the processor 230 may transmit the image captured through the camera application to a counterpart electronic device through a chat room of the messenger application as indicated by reference numeral 950 .
  • the processor 230 may store the image captured through the camera application in the memory 240 .
  • the electronic device 201 stores the image captured through the camera application in the memory 240 of the electronic device 201 in operation 819 .
  • the electronic device 201 terminates driving of the camera device 220 .
  • the processor 230 terminates the camera application, as illustrated in FIG. 9 E .
  • FIG. 10 is a flowchart of a process for controlling a camera device in an electronic device based on touch maintaining information of a camera control area, according to an embodiment of the present disclosure.
  • FIGS. 11 A and 11 B illustrate a screen configuration for controlling a camera device in an electronic device based on touch maintaining information of a camera control area, according to an embodiment of the present disclosure.
  • the electronic device may include the electronic device 201 or at least a part (for example, processor 230 ) of the electronic device 201 .
  • the electronic device 201 determines whether a touch input is detected through a camera control area set to at least some areas of the touch screen in operation 1001 .
  • the processor 230 may determine whether a hovering input for the camera control area set to be adjacent to the placement area of the camera device 220 is detected or whether a tap input for the camera control area of the touch screen is detected.
  • the electronic device 201 determines to not drive the camera device 220 and terminates the operation for controlling the camera device 220.
  • the electronic device 201 displays camera activation information in at least some areas of the display corresponding to the camera control area in operation 1003 .
  • the processor 230 displays camera activation information 1130 to be adjacent to the placement area of the camera device 220 , as illustrated in FIG. 11 A .
  • the processor 230 displays the camera activation information 1130 in at least some areas different from the placement area of the camera device 220 in order to prevent the image sensor of the front camera device from becoming dirty due to the touch input for controlling the camera device 220 . That is, the processor 230 displays the camera activation information 1130 in at least some areas different from the placement area of the camera device 220 to allow the user to touch at least some areas different from the placement area of the camera device 220 .
  • the electronic device 201 determines whether a touch maintaining time for the camera activation information exceeds a reference time in operation 1005 .
  • the processor 230 determines whether the touch maintaining time for the camera activation information 1130 exceeds the reference time, as illustrated in FIG. 11 A .
  • the electronic device 201 determines whether the touch input for the camera activation information is released in operation 1009 .
  • the processor 230 determines whether a touch input 1120 for the camera activation information 1130 is released, as illustrated in FIG. 11 A .
  • the electronic device 201 determines to not drive the camera device 220 and terminates the operation for controlling driving of the camera device 220 .
  • the electronic device 201 determines whether the touch maintaining time for the camera activation information exceeds the reference time again in operation 1005 .
  • the electronic device 201 drives the camera device 220 in operation 1007 .
  • the processor 230 displays the service screen of the camera application on the display 270 through an image effect that makes the service screen of the camera application spread from the placement area of the camera device 220 .
  • the service screen of the camera application may include a preview image acquired through the front camera device or the back camera device.
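  • The touch-maintaining decision of operations 1005 to 1009 can be summarized by the sketch below; the reference time value and the state names are assumptions made for illustration only:

```kotlin
// Hypothetical sketch: drive the camera once the touch on the camera
// activation information has been maintained past the reference time,
// cancel on early release, and otherwise keep re-checking.
const val REFERENCE_TIME_MS = 800L // assumed value; not fixed by the disclosure

sealed class CameraDecision
object DriveCamera : CameraDecision()       // operation 1007
object CancelActivation : CameraDecision()  // touch released before the reference time
object KeepWaiting : CameraDecision()       // re-check the maintaining time

fun decide(touchDownAt: Long, now: Long, released: Boolean): CameraDecision = when {
    now - touchDownAt > REFERENCE_TIME_MS -> DriveCamera
    released -> CancelActivation
    else -> KeepWaiting
}
```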
  • FIG. 12 is a flowchart of a process for setting a timer of a camera device based on touch information of a camera control area in an electronic device, according to an embodiment of the present disclosure.
  • FIGS. 13 A to 13 C illustrate a screen configuration for setting a timer of a camera device based on touch information of a camera control area in an electronic device, according to an embodiment of the present disclosure.
  • the electronic device may include the electronic device 201 or at least a part (for example, processor 230 ) of the electronic device 201 .
  • the electronic device 201 displays a service screen of a camera application in at least some areas of the display in operation 1201 .
  • the processor 230 executes the camera application based on touch information of a camera control area, as described in operations 401 to 407 of FIG. 4 or operations 1001 to 1009 of FIG. 10 .
  • the processor 230 executes the corresponding camera application.
  • the electronic device 201 determines whether the touch input for the camera control area is detected in operation 1203 .
  • the processor 230 displays a preview image of the front camera device on the display 270 by executing the camera application as indicated by reference numeral 1300 .
  • the processor 230 may determine whether a drag input 1310 is detected from the placement area of the camera device 220 in a state where the preview image of the front camera device is displayed.
  • the processor 230 determines whether a subsequent tap input for the camera control area is detected in the state where the preview image of the front camera device is displayed.
  • the electronic device 201 determines to not set a timer of the camera device. Accordingly, the electronic device terminates the operation for setting the timer of the camera device 220 .
  • the electronic device 201 sets the timer of the camera device in accordance with the touch input in operation 1205 .
  • the processor 230 sets the timer of the camera device 220 to a timer required time corresponding to a drag distance from the placement area of the camera device 220 , as illustrated in FIG. 13 A .
  • the processor 230 sets the timer of the camera device 220 to a time corresponding to a distance between the placement area of the camera device 220 and a position where a tap input is detected.
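  • A distance-to-time mapping of the kind described above might look like the following sketch; the pixel thresholds and the 3/5/10-second values are illustrative assumptions, since the disclosure does not fix concrete values:

```kotlin
// Hypothetical sketch: the farther the drag or tap is from the camera
// placement area, the longer the self-timer that is set.
fun timerSeconds(distanceFromCameraPx: Float): Int = when {
    distanceFromCameraPx < 150f -> 3
    distanceFromCameraPx < 300f -> 5
    else -> 10
}
```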
  • the electronic device 201 displays timer information of the camera device 220 set to correspond to the touch input on the display 270 in operation 1207 .
  • the processor 230 displays time information set to correspond to the drag distance from the placement area of the camera device 220 as indicated by reference numeral 1320 in FIG. 13 A .
  • the electronic device 201 determines whether the time set to correspond to the touch input has expired in operation 1209 .
  • the electronic device 201 displays timer information of the camera device 220 on the display 270 in operation 1207 .
  • the processor 230 updates the displayed time information in accordance with the elapse of time. That is, the processor 230 updates the timer display of the camera device 220 such that the time information becomes gradually smaller as time elapses from the time point when the timer was set.
  • the electronic device 201 captures an image by driving the camera device 220 in operation 1211 .
  • the processor 230 captures an image by using the front camera device. In this case, the processor 230 removes the display of the time information from the display 270 as illustrated in FIG. 13 C .
  • the processor 230 may secure the amount of light for the image capturing by changing a color of the display into a bright color.
  • FIG. 14 is a flowchart of a process for providing a flash effect in an electronic device, according to an embodiment of the present disclosure.
  • FIGS. 15 A to 15 D illustrate a screen configuration for providing a flash effect in an electronic device, according to an embodiment of the present disclosure.
  • the electronic device may include the electronic device 201 or at least a part (for example, processor 230 ) of the electronic device 201 of FIG. 2 .
  • the electronic device 201 determines whether a flash function is set. For example, when the time set for the timer of the camera device 220 expires, the processor 230 may determine that the image capturing event is generated. The processor 230 displays camera activation information 1510 based on the placement area of the camera device 220 to make the user's eyes face the front camera device in response to the generation of the image capturing event. In this case, the processor 230 determines whether a flash setting menu of the camera device 220 is set in an active state.
  • the electronic device 201 captures the image by driving the camera device 220 in operation 1405 .
  • the processor 230 captures the image by using the activated front camera device of the camera device 220 .
  • the electronic device 201 changes a color of the display 270 into a color set as the flash function in operation 1403 .
  • the processor 230 displays a background image such that a bright colored (for example, white) background image spreads across an entire area of the display 270 based on the placement area of the camera device 220 as indicated by reference numerals 1520 and 1530 .
  • the electronic device 201 captures the image by driving the camera device 220 while changing the color of the display 270 by the flash function in operation 1405 .
  • the electronic device 201 may change the color of the display 270 in accordance with an image effect which the user desires.
  • the processor 230 may set an image effect having a warm feeling based on the user's input information.
  • the processor 230 displays a background image such that a yellow background image, for example, spreads across an entire area of the display 270 based on the placement area of the camera device 220 while capturing the image.
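  • The screen-flash behavior can be sketched with hypothetical Display and Camera interfaces standing in for the components of the disclosure:

```kotlin
// Hypothetical sketch: paint the display with the configured flash color
// while the image is captured, then restore the previous screen.
interface Display { fun fillWith(color: Int); fun restore() }
interface Camera { fun capture(): ByteArray }

fun captureWithScreenFlash(display: Display, camera: Camera, flashColor: Int): ByteArray {
    display.fillWith(flashColor) // e.g. white for a neutral flash, yellow for a warm effect
    return try {
        camera.capture()         // expose while the display is lit
    } finally {
        display.restore()        // put the prior screen back after capture
    }
}
```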
  • the electronic device 201 may display audio input information to allow the user to identify a size of an audio signal input through the microphone device 140 .
  • the processor 230 may display audio input information 1540 corresponding to a size of an audio signal based on the placement area of the camera device 220 , as illustrated in FIG. 15 D .
  • FIG. 16 is a flowchart of a process for providing a camera service through an application in an electronic device, according to an embodiment of the present disclosure.
  • FIGS. 17 A and 17 B illustrate a screen configuration for providing a camera service through an application in an electronic device, according to an embodiment of the present disclosure.
  • the electronic device may include the electronic device 201 or at least a part (for example, processor 230 ) of the electronic device 201 .
  • the electronic device 201 displays a standby screen including an icon of at least one application in operation 1601 .
  • the processor 230 displays the standby screen including icons of applications installed in the electronic device 201 as indicated by reference numeral 1700 .
  • the electronic device 201 determines whether a touch input for the icon of the application is detected in the standby screen in operation 1603 .
  • the processor 230 determines whether a touch input for an icon of one of a plurality of applications displayed on the standby screen is detected, as illustrated in FIG. 17 A .
  • the electronic device 201 determines to not provide the camera service and terminates the operation for providing the camera service.
  • the electronic device 201 determines whether the application icon enters the camera control area by the touch input in operation 1605 .
  • the processor 230 determines whether a first application icon 1710 on which the touch input is detected enters the camera control area through a drag input 1720 on a standby screen 1700 as indicated by reference numeral 1730 in FIG. 17 A .
  • the processor 230 determines that the first application icon 1710 enters the camera control area.
  • the processor 230 determines whether a second touch input for determining a movement location of the first application icon 1710 is detected within the camera control area, as illustrated in FIG. 17 A .
  • the second touch input may be determined as being effective only when the corresponding touch input is detected within a reference time from a time point when the first touch input is detected.
  • the electronic device 201 determines whether the touch input for the application icon is released in operation 1611 .
  • the processor 230 determines whether the touch input for the first application icon 1710 is released outside the camera control area, as illustrated in FIG. 17 A .
  • the electronic device 201 determines whether the application icon enters the camera control area by the touch input in operation 1605 .
  • the electronic device 201 determines to not provide the camera service and terminates the operation for providing the camera service. For example, the processor 230 may change a location of the application icon to a location where the touch input for the application icon is released.
  • the electronic device 201 determines whether an application corresponding to the application icon can be linked with the camera application in operation 1607 .
  • the processor 230 determines whether the camera service can be provided through the application corresponding to the first application icon 1710 .
  • the electronic device 201 terminates the operation for providing the camera service.
  • the electronic device 201 displays link information between the application corresponding to the application icon and the camera device 220 on the display 270 in operation 1609 .
  • the processor 230 displays a camera service screen of the first application on the display 270 as indicated by reference numeral 1740 in FIG. 17 B.
  • FIG. 18 is a flowchart of a process for providing a multi-camera service in an electronic device, according to an embodiment of the present disclosure.
  • FIGS. 19 A to 19 E illustrate a screen configuration for providing a multi-camera service in an electronic device, according to an embodiment of the present disclosure.
  • the electronic device may include the electronic device 201 or at least a part (for example, processor 230 ) of the electronic device 201 .
  • the electronic device 201 displays a service screen of a camera application on the display 270 in operation 1801 .
  • the processor 230 displays a preview image collected through the front camera device on the display 270 as indicated by reference numeral 1900 .
  • the processor 230 displays the service screen of the camera application on the display 270 like operations 401 to 407 of FIG. 4 or operations 1001 to 1009 of FIG. 10 .
  • the electronic device 201 determines whether a touch input is detected through the camera control area. For example, the processor 230 determines whether a tap input 1910 for the camera control area adjacent to the placement area of the camera device 220 is detected in a state where a preview image of the front camera device is displayed, as illustrated in FIG. 19 A .
  • the electronic device 201 determines to not provide a multi-camera service. Accordingly, the electronic device 201 terminates the operation for providing the multi-camera service.
  • the electronic device 201 switches a camera mode of the electronic device 201 to a multi-camera mode in operation 1805 .
  • the processor 230 may additionally activate the back camera device.
  • the processor 230 may additionally activate the front camera device.
  • the electronic device 201 displays a service screen using multiple cameras on the display 270 based on the multi-camera mode in operation 1807 .
  • the processor 230 displays a preview image 1920 of the activated back camera device to overlap at least a part of the preview image 1910 of the front camera device based on the multi-camera mode.
  • the processor 230 controls a size of the preview image 1920 of the back camera device in accordance with a drag distance.
  • when a tap input for the displayed small preview image 1920 of the back camera device is detected, the processor 230 reverses display areas of the preview image 1910 of the front camera device and the preview image 1920 of the back camera device as indicated by reference numeral 1940.
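  • The swap of the main and inset preview areas can be modeled as below; the MultiCameraLayout type is a hypothetical stand-in for the display state:

```kotlin
// Hypothetical sketch: tapping the small inset preview exchanges which
// camera fills the screen and which one is shown as the inset.
data class MultiCameraLayout(val main: String, val inset: String)

fun onInsetTapped(layout: MultiCameraLayout): MultiCameraLayout =
    MultiCameraLayout(main = layout.inset, inset = layout.main)

fun main() {
    var layout = MultiCameraLayout(main = "front", inset = "back")
    layout = onInsetTapped(layout) // back camera now fills the screen
    println(layout)                // MultiCameraLayout(main=back, inset=front)
}
```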
  • the processor 230 updates a display location of the preview image of the front camera according to the drag input as indicated by reference numeral 1960 in FIG. 19 E .
  • the electronic device 201 determines whether the multi-camera mode ends in operation 1809 . For example, when the drag input for the displayed small preview image is detected to move outside the area of the display 270 as indicated by reference numeral 1970 in FIG. 19 E , the processor 230 determines that the multi-camera mode ends.
  • the electronic device 201 maintains the service screen using the multiple cameras displayed on the display 270 in operation 1807 .
  • the electronic device 201 switches the camera mode of the electronic device 201 to a single camera mode and displays a service screen of a single camera device on the display 270 in operation 1811 .
  • the processor 230 displays the preview screen of the back camera device on the display 270 , as indicated by reference numeral 1980 .
  • FIG. 20 is a flowchart of a process for providing an automatic photographing service in an electronic device, according to an embodiment of the present disclosure.
  • FIGS. 21 A to 21 C illustrate a screen configuration for providing an automatic photographing service in an electronic device, according to an embodiment of the present disclosure.
  • the electronic device may include the electronic device 201 or at least a part (for example, processor 230 ) of the electronic device 201 .
  • the electronic device 201 displays a service screen (for example, preview image) of a camera application on the display 270 in operation 2001 .
  • the processor 230 executes the camera application based on touch information of a camera control area through operations 401 to 407 of FIG. 4 or operations 1001 to 1009 of FIG. 10 .
  • the processor 230 displays the preview image of the front camera device on the display 270 as indicated by reference numeral 2100 .
  • the electronic device 201 determines whether an automatic photographing mode is set to the camera application in operation 2003 .
  • the processor 230 determines whether an automatic photographing menu is set in an activated state based on input information detected through the input/output interface 260 .
  • When the automatic photographing mode is not set to the camera application, the electronic device 201 terminates the operation for providing the automatic photographing service. In this case, the electronic device 201 captures an image based on a touch input on a photographing button displayed on the service screen of the camera application.
  • the electronic device 201 displays motion information of the camera device 220 in at least some areas of the display 270 in operation 2005 .
  • the processor 230 displays motion information of the camera device 220 corresponding to a location and an angle of the electronic device 201 based on the placement area of the camera device 220 as indicated by reference numeral 2110 in FIG. 21 A .
  • the electronic device 201 determines whether a motion of the camera device 220 that matches a capturing event is detected in operation 2007.
  • the processor 230 determines whether motion information of the electronic device 201 that matches the location and angle of the electronic device 201 preset for image capturing is detected.
  • the preset location and angle of the electronic device 201 may be set by a user's input or may include at least one of locations and angles of the electronic device 201 that match the image acquired through the front camera mode.
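  • The pose-matching test can be sketched as follows; the tolerance value is an assumption, as the disclosure only requires that the location and angle of the electronic device match the preset:

```kotlin
import kotlin.math.abs

// Hypothetical sketch: automatic capture fires when the current device pose
// is within a small tolerance of the pose preset for image capturing.
data class Pose(val pitchDeg: Float, val rollDeg: Float)

fun matchesPreset(current: Pose, preset: Pose, toleranceDeg: Float = 3f): Boolean =
    abs(current.pitchDeg - preset.pitchDeg) <= toleranceDeg &&
    abs(current.rollDeg - preset.rollDeg) <= toleranceDeg
```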
  • the electronic device 201 displays changed motion information of the camera device 220 in at least some areas of the display 270 in operation 2005 .
  • the processor 230 may change the motion information of the camera device 220 displayed based on the placement area of the camera device 220 according to a change in the location and angle of the electronic device 201 as indicated by reference numeral 2120 .
  • the electronic device 201 captures the image by driving the camera device (for example, front camera device) in operation 2009 .
  • the processor 230 displays matching information by the location and angle of the electronic device 201 to allow the user to recognize an automatic photographing time point as indicated by reference numeral 2130 .
  • the processor 230 captures the image by using the front camera device.
  • the processor 230 may secure an amount of light for the image capturing, or may change a color of the display 270 at the same time as the image capturing for an image effect.
  • FIG. 22 is a flowchart of a process for controlling a camera device in accordance with a camera application in an electronic device, according to an embodiment of the present disclosure.
  • FIGS. 23 A and 23 B illustrate a screen configuration for controlling a camera device in accordance with a camera application in an electronic device, according to an embodiment of the present disclosure.
  • the electronic device may include the electronic device 201 or at least a part (for example, processor 230 ) of the electronic device 201 .
  • the electronic device 201 determines whether a first type touch input for the camera control area is detected in operation 2201 .
  • the processor 230 determines whether a drag input 2310 in a right direction of the placement area of the camera device 220 is detected.
  • the electronic device 201 terminates the operation for setting the camera application. For example, when a drag input in a downward direction for the camera control area is detected, the electronic device 201 executes the camera application that is set as a basic application among a plurality of applications installed in the electronic device 201.
  • the electronic device 201 identifies at least one camera application installed in the electronic device 201 in operation 2203.
  • the processor 230 may extract camera application information stored in the memory 240 .
  • the electronic device 201 displays a camera application list including at least one camera application installed in the electronic device 201 .
  • the processor 230 displays icons of camera applications installed in the electronic device 201 on the display 270 such that the icons are output in the placement area of the camera device 220 .
  • the electronic device 201 determines whether a first camera application which is one of the applications included in the camera application list is selected. For example, the processor 230 determines whether a touch input for one of the icons of the camera applications displayed in the camera control area is detected as illustrated in FIG. 23 B .
  • When a selection input for the first camera application is not detected, the electronic device 201 maintains display of the camera application list in operation 2205. In addition, when an input for the camera application list is not detected until a reference time passes from the time point when the camera application list is displayed, the electronic device 201 determines to not select a camera application for controlling the camera device 220 and terminates the operation.
  • the electronic device 201 drives the camera device 220 based on the first camera application in operation 2209 .
  • the processor 230 controls the camera device 220 by executing the first camera application. Accordingly, the processor 230 performs initial settings on the camera device 220 based on camera setting information set to the first camera application. In addition, the processor 230 sets the first camera application as a basic camera application of the electronic device 201 .
  • FIG. 24 is a flowchart of a process for controlling a camera device in accordance with photographing setting information of an image in an electronic device, according to an embodiment of the present disclosure.
  • FIG. 25 illustrates a screen configuration for controlling a camera device in accordance with photographing setting information of an image in an electronic device, according to an embodiment of the present disclosure.
  • the electronic device may include the electronic device 201 or at least a part (for example, processor 230 ) of the electronic device 201 .
  • the electronic device 201 displays a list of at least one image stored in the memory 240 of the electronic device 201 on the display 270 in operation 2401 .
  • the processor 230 displays a thumbnail for at least one image stored in the memory 240 on the display 270 as indicated by reference numeral 2500 .
  • the processor 230 displays corresponding filter information 2510 on an image to which a filter for an image effect is applied.
  • the electronic device 201 determines whether a touch input for a first image in an image list displayed on the display 270 is detected in operation 2403 .
  • the processor 230 determines whether a touch input 2520 for the first image in the image list 2500 displayed on the display 270 is detected, as illustrated in FIG. 25 .
  • the electronic device 201 determines to not provide the camera service and terminates the operation for providing the camera service.
  • the electronic device 201 determines whether the first image enters the camera control area in operation 2405 .
  • the processor 230 determines whether the first image enters the camera control area through a drag input 2530 for the first image, as illustrated in FIG. 25 .
  • the processor 230 determines that the first image enters the camera control area.
  • the processor 230 determines whether a second touch input for determining a movement location of the first image is detected within the camera control area, as illustrated in FIG. 25 .
  • the electronic device 201 determines whether the touch input for the first image is released in operation 2411 .
  • the processor 230 determines whether the touch input 2520 for the first image is released outside the camera control area in FIG. 25 .
  • the electronic device 201 determines whether the first image enters the camera control area again by the touch input in operation 2405 .
  • the electronic device 201 determines to not provide the camera service and terminates the operation for providing the camera service. For example, the processor 230 changes a location of the first image to a location where the touch input for the first image is released.
  • the electronic device 201 determines setting information of the camera device 220 set to capture the first image in operation 2407 .
  • the setting information of the camera device 220 may include at least one of a filter for capturing the first image, image filter information, a photographing mode, and a photographing setting value (for example, aperture, shutter speed, and image size).
  • the electronic device 201 may update the setting information of the camera device 220 based on the setting information of the camera device 220 set to capture the first image in operation 2409 .
  • the processor 230 may perform initial settings on the camera device 220 in accordance with the setting information of the camera device 220 that has been set to capture the first image.
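  • Restoring the capture settings from a selected image can be sketched as below; CaptureSettings mirrors the fields listed above, and CameraDevice is a hypothetical interface rather than the disclosure's component:

```kotlin
// Hypothetical sketch: the settings stored with a previously captured image
// become the initial settings of the camera device for the next shot.
data class CaptureSettings(
    val filter: String?,                 // image filter applied to the image
    val mode: String,                    // photographing mode
    val aperture: Float,                 // photographing setting values
    val shutterSpeedSec: Float,
    val imageSize: Pair<Int, Int>
)

interface CameraDevice { fun applySettings(settings: CaptureSettings) }

fun restoreSettingsFromImage(camera: CameraDevice, imageSettings: CaptureSettings) {
    camera.applySettings(imageSettings)
}
```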
  • FIG. 26 is a flowchart of a process for capturing an image based on touch information of a camera control area in an electronic device, according to an embodiment of the present disclosure.
  • the electronic device may include the electronic device 201 or at least a part (for example, processor 230 ) of the electronic device 201 .
  • the electronic device 201 determines whether a touch input is detected through a camera control area set based on the placement area of the camera device 220 on the touch screen in operation 2601 .
  • the electronic device 201 determines whether image capturing matches the touch input detected through the camera control area in operation 2603.
  • the processor 230 determines whether a double tap input that matches an image capturing event is detected, based on touch input matching information stored in the memory 240.
  • the electronic device 201 determines to not perform the image capturing. Accordingly, the electronic device 201 terminates the operation for the image capturing.
  • the electronic device 201 captures the image through the camera device 220 without executing the camera application in operation 2605 .
  • the processor 230 captures the image through the camera device 220 (for example, back camera device) in a state where the service screen of the application displayed on the display 270 is maintained. That is, the processor 230 captures the image in a state where a preview image acquired through the camera device 220 is not displayed.
  • the processor 230 stores the captured image in the memory 240 .
  • the processor 230 displays image capturing information on a notification bar.
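  • The preview-less capture triggered by a double tap can be sketched as follows; the tap window and the captureSilently hook are assumptions made for illustration:

```kotlin
// Hypothetical sketch: a double tap on the camera control area triggers an
// immediate capture while the foreground application's screen stays as-is.
const val DOUBLE_TAP_WINDOW_MS = 300L // assumed; the actual mapping is stored in memory

class ControlAreaGestures(private val captureSilently: () -> Unit) {
    private var lastTapAt = 0L

    fun onTap(now: Long) {
        if (now - lastTapAt <= DOUBLE_TAP_WINDOW_MS) {
            captureSilently() // no preview is displayed for this capture
            lastTapAt = 0L
        } else {
            lastTapAt = now
        }
    }
}
```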
  • FIG. 27 is a flowchart of a process for photographing video based on touch information of a camera control area in an electronic device, according to an embodiment of the present disclosure.
  • the electronic device may include the electronic device 201 or at least a part (for example, processor 230 ) of the electronic device 201 .
  • the electronic device 201 determines whether a touch input is detected through a camera control area preset to control the camera device 220 on the touch screen in operation 2701 .
  • the camera control area may include at least some areas of the touch screen including the placement area of the camera device 220 .
  • the camera control area may include at least some areas of the touch screen adjacent to the placement area of the camera device 220 .
  • the electronic device 201 determines whether video photographing matches the touch input detected through the camera control area in operation 2703 .
  • the processor 230 determines whether a touch input having a touch maintaining time exceeding a reference time is detected through the camera control area based on touch input matching information stored in the memory 240 .
  • the electronic device 201 determines to not perform the video photographing. Accordingly, the electronic device 201 terminates the operation for the video photographing.
  • the electronic device 201 starts the video photographing through the camera device 220 without executing the camera application in operation 2705 .
  • the processor 230 photographs the video through the camera device 220 (for example, back camera device), while the service screen of the application displayed on the display 270 is maintained, from the time point when the touch maintaining time of the touch input detected through the camera control area exceeds the reference time.
  • the processor 230 outputs notification information to allow the user to recognize the video photographing operation.
  • the notification information may include at least one of a notification sound, a notification message, and a vibration.
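  • The hold-to-record behavior can be sketched with hypothetical start/stop hooks; the reference time is supplied by the caller, and none of these names come from the disclosure:

```kotlin
// Hypothetical sketch: video is recorded only while the touch on the camera
// control area is maintained, starting once the hold exceeds the reference time.
class HoldToRecord(
    private val referenceTimeMs: Long,
    private val startRecording: () -> Unit, // e.g. back camera, no preview shown
    private val stopRecording: () -> Unit   // store the video, then notify the user
) {
    private var recording = false

    // Called repeatedly while the touch is held, with the elapsed hold time.
    fun onHeld(elapsedMs: Long) {
        if (!recording && elapsedMs > referenceTimeMs) {
            startRecording()
            recording = true
        }
    }

    // Called when the touch matching the video photographing is released.
    fun onReleased() {
        if (recording) stopRecording()
        recording = false
    }
}
```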
  • the electronic device 201 determines whether the touch input that matches the video photographing is released in operation 2707.
  • the electronic device 201 may continuously photograph the video in operation 2705 .
  • the electronic device 201 terminates the video photographing.
  • the processor 230 may store the video photographed through the back camera device in the memory 240 .
  • the processor 230 displays video photographing information on the notification bar.
  • FIG. 28 is a flowchart of a process for displaying camera driving limit information in an electronic device, according to an embodiment of the present disclosure.
  • FIG. 29 illustrates a screen configuration for displaying camera driving limit information in an electronic device, according to an embodiment of the present disclosure.
  • the electronic device may include the electronic device 201 or at least a part (for example, processor 230 ) of the electronic device 201 .
  • the electronic device 201 determines whether driving of the camera device 220 is limited in operation 2801.
  • the processor 230 may identify whether the driving of the camera device 220 is limited based on the type of an application being executed in the electronic device 201 .
  • the processor 230 may identify whether the driving of the camera device 220 is limited based on a location of the electronic device 201 .
  • the processor 230 determines whether an operation mode of the camera device 220 is set as an inactive mode based on input information detected through the input/output interface 260 .
  • the electronic device 201 terminates the operation for displaying the camera driving limit information.
  • the electronic device 201 displays camera driving limit information in the camera control area in operation 2803 .
  • the processor 230 displays camera driving limit information 2900 (for example, red colored concentric circles) based on the placement area of the camera device 220 .
  • the electronic device 201 determines whether a touch input is detected through the camera control area in a state where driving the camera device is limited in operation 2805 .
  • the processor 230 determines whether a touch input for the camera driving limit information 2900 displayed in the camera control area is detected, as illustrated in FIG. 29 .
  • the electronic device 201 executes a camera setting menu in operation 2807 .
  • the processor 230 displays a camera setting menu for resetting a right of the camera device 220 on the display 270 .
  • FIG. 30 is a flowchart of a process for providing a video call service in an electronic device, according to an embodiment of the present disclosure.
  • FIG. 31 illustrates a screen configuration for providing a video call service in an electronic device, according to an embodiment of the present disclosure.
  • the electronic device may include the electronic device 201 or at least a part (for example, processor 230 ) of the electronic device 201 .
  • the electronic device 201 determines whether a call connection request signal for the video call is received in operation 3001 .
  • the processor 230 determines whether the corresponding call connection request signal is a call connection request signal corresponding to the video call service.
  • the electronic device 201 terminates the operation for providing the video call service.
  • the electronic device 201 displays video call reception information in at least some areas of the display corresponding to the camera control area in operation 3003.
  • the processor 230 displays video call reception information 3100 to be output from the placement area of the camera device 220 on the display 270 .
  • the electronic device 201 determines whether a touch input is detected through the camera control area in a state where the video call reception information is displayed. For example, the processor 230 determines whether a drag input in a first direction (for example, right direction) for the video call reception information 3100 displayed to be adjacent to the placement area of the camera device 220 is detected.
  • When the touch input is not detected, the electronic device 201 may maintain display of the video call reception information in at least some areas of the display 270 corresponding to the camera control area in operation 3003.
  • the electronic device 201 may determine to not accept the video call connection. In this case, the electronic device 201 displays video call connection failure information on the display 270 .
  • When the touch input is detected through the camera control area in a state where the video call reception information is displayed, the electronic device 201 activates the front camera device and provides the video call service in operation 3007.
  • the processor 230 determines that the user accepts the call connection for the video call. Accordingly, the processor 230 displays an image collected through the front camera device and an image received from a counterpart electronic device on the display 270 by executing the video call application.
  • the processor 230 may determine that the user does not accept the call connection for the video call. Accordingly, the processor 230 may block the call connection for the video call.
  • FIG. 32 is a flowchart of a process for displaying human body recognition service information in an electronic device, according to an embodiment of the present disclosure.
  • FIGS. 33 A to 33 D illustrate a screen configuration for displaying human body recognition service information in an electronic device, according to an embodiment of the present disclosure.
  • the electronic device may include the electronic device 201 or at least a part (for example, processor 230 ) of the electronic device 201 .
  • the electronic device 201 determines whether human body recognition is performed through the camera device 220 of the electronic device 201 in operation 3201 .
  • the processor 230 may determine whether an iris recognition menu for unlocking the electronic device 201 is selected using the camera device 220 (for example, front camera device).
  • the processor 230 may determine whether a face recognition menu for authenticating the user of the electronic device 201 is selected using the camera device 220 (for example, front camera device).
  • the electronic device 201 terminates the operation for providing the human body recognition service.
  • the electronic device 201 displays time information spent for the human body recognition in a display area corresponding to the camera control area in operation 3203 .
  • the processor 230 displays time information 3300 spent for the iris recognition based on the placement area of the camera device 220 .
  • the time spent for the human body recognition may include a minimum time during which the human body recognition (for example, iris recognition) can be completed through the camera device 220 .
  • the electronic device 201 determines whether the time spent for the human body recognition expires in operation 3205 .
  • the processor 230 determines whether an elapsed time from a time point when the human body recognition starts is the same as the time spent for the human body recognition.
  • the electronic device 201 displays elapsed time information for the human body recognition in the display area corresponding to the camera control area in operation 3211 .
  • the processor 230 displays elapsed time information 3310 of the iris recognition to overlap the time information 3300 spent for the iris recognition displayed based on the placement area of the camera device 220 .
  • the electronic device 201 determines again whether the time spent for the human body recognition expires in operation 3205 .
  • the electronic device 201 determines whether the human body recognition is successful in operation 3207 .
  • the processor 230 determines that the iris recognition is completed. Accordingly, the processor 230 determines whether the authentication of the user is successful based on a result of the iris recognition. For example, the processor 230 determines whether iris information detected through the iris recognition matches iris information preset in the memory 240 .
  • the electronic device 201 determines that the authentication of the user through the human body recognition fails. Accordingly, the electronic device 201 terminates the operation for providing the human body recognition service.
  • the electronic device 201 displays human body recognition failure information on the display 270 .
  • the electronic device 201 may unlock the electronic device 201 in operation 3209 .
  • the processor 230 releases a lock function of the electronic device 201 and displays a standby screen 3330 on the display 270 .
  • FIG. 34 is a flowchart of a process for displaying pollution level information of a camera device in an electronic device, according to an embodiment of the present disclosure.
  • FIG. 35 illustrates a screen configuration for displaying pollution level information of a camera device in an electronic device, according to an embodiment of the present disclosure.
  • the electronic device may include the electronic device 201 or at least a part (for example, processor 230 ) of the electronic device 201 .
  • the electronic device 201 drives the camera device 220 disposed on some areas (for example, upper area) of the display 270 in operation 3401 .
  • the processor 230 executes the camera application based on touch information of a camera control area through operations 401 to 407 of FIG. 4 or operations 1001 to 1009 of FIG. 10 .
  • the processor 230 controls the camera device 220 (for example, front camera device) through the camera application.
  • the electronic device 201 captures an image through the camera device 220 disposed on some areas of the display 270 in operation 3403 .
  • the processor 230 captures an image through the front camera device 120 disposed in the upper area of the display 270 .
  • the pollution level measuring event may be periodically generated or may be generated at a time point when the camera device is driven.
  • the electronic device 201 may detect a pollution level of a camera lens through the image capture through the camera device 220 in operation 3405 .
  • the processor 230 may estimate the definition of the image acquired through the front camera device.
  • the processor 230 may detect the pollution level of the camera lens corresponding to the definition of the image.
  • the electronic device 201 determines whether the pollution level of the camera lens exceeds a reference pollution level in operation 3407.
  • the reference pollution level may be set, by the user, as a reference value of a pollution level which can influence a quality of the image acquired through the camera device 220 or may include a fixed value.
  • the electronic device 201 determines that the pollution level of the camera lens does not influence the quality of the image acquired through the camera device 220 . Accordingly, the electronic device 201 terminates the operation for displaying the pollution information of the camera lens.
  • the electronic device 201 displays pollution information of the camera lens in the display area corresponding to the camera control area to allow the user to recognize the pollution level of the camera lens in operation 3409 .
  • the processor 230 displays pollution information 3500 of the camera lens based on the placement area of the camera device 220 to prompt the user to clean the camera lens.
  • the electronic device 201 may display an amount of the pollution level of the camera lens based on the placement area of the camera device 220 .
  • the processor 230 may display the amount of the pollution level of the camera lens through the number of concentric circles based on the placement area of the camera device 220 .
  • the processor 230 may increase the number of concentric circles displayed in the placement area of the camera device 220 as the pollution level of the camera lens becomes more severe.
  • when the pollution level of the camera lens is less than or equal to the reference pollution level, the processor 230 may not display the concentric circles indicating the pollution level of the camera lens.
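  • One way to realize the sharpness-based pollution estimate is sketched below; the Laplacian-variance metric and the circle-count thresholds are assumptions, since the disclosure does not specify how the definition of the image is measured:

```kotlin
// Hypothetical sketch: a dirty lens blurs the captured image, which lowers
// the variance of a Laplacian filter response over a grayscale image.
fun laplacianVariance(gray: Array<FloatArray>): Double {
    val h = gray.size
    val w = gray[0].size
    val responses = ArrayList<Double>()
    for (y in 1 until h - 1) {
        for (x in 1 until w - 1) {
            // 4-neighbour Laplacian response at (x, y).
            val lap = gray[y - 1][x] + gray[y + 1][x] +
                      gray[y][x - 1] + gray[y][x + 1] - 4 * gray[y][x]
            responses.add(lap.toDouble())
        }
    }
    val mean = responses.average()
    return responses.sumOf { (it - mean) * (it - mean) } / responses.size
}

// Hypothetical mapping: lower sharpness relative to a clean-lens baseline
// means more concentric circles shown in the camera placement area (0..3).
fun pollutionCircles(sharpness: Double, cleanBaseline: Double): Int = when {
    sharpness > 0.75 * cleanBaseline -> 0
    sharpness > 0.50 * cleanBaseline -> 1
    sharpness > 0.25 * cleanBaseline -> 2
    else -> 3
}
```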
  • An electronic device including a camera device disposed at a location overlapping at least a partial area of a display and an operation method thereof control the camera device based on an adjacent area including a placement area of the camera device or an adjacent area close to the placement area of the camera device, so that a user of the electronic device can easily drive and control a camera application.
  • An electronic device including a camera device disposed at a location overlapping at least a partial area of a display and an operation method thereof display control information related to the camera device in an adjacent area including a placement area of the camera device or an adjacent area close to the placement area of the camera device, so that a user of the electronic device can easily detect an operational state of the camera device and a natural photo can be taken by drawing the user's eyes toward the lens.
  • The term “module” may refer to a unit including one of hardware, software, and firmware, or a combination of two or more of them.
  • the term “module” may be interchangeably used with the term “unit”, “logic”, “logical block”, “component”, or “circuit”.
  • the “module” may be a minimum unit of an integrated component element or a part thereof.
  • the “module” may be a minimum unit for performing one or more functions or a part thereof.
  • the “module” may be mechanically or electronically implemented.
  • the “module” may include at least one of an application-specific integrated circuit (ASIC) chip, a field-programmable gate array (FPGA), and a programmable logic device for performing operations which are known or are to be developed hereafter.
  • At least some of the devices may be implemented by instructions stored in a computer-readable storage medium in a program module form.
  • the instructions when executed by the processor 230 , may cause the processor 230 to execute the function corresponding to the instruction.
  • the computer-readable storage medium may be the memory 240 .
  • the computer readable storage medium may include a hard disk, a floppy disk, magnetic media (for example, a magnetic tape), optical media (for example, a compact disc read only memory (CD-ROM) and a digital versatile disc (DVD)), magneto-optical media (for example, a floptical disk), a hardware device (for example, a read only memory (ROM), a random access memory (RAM), a flash memory), and the like.
  • the instructions may include high-level language code, which can be executed in a computer using an interpreter, as well as machine code produced by a compiler. Any of the hardware devices described above may be configured to work as one or more software modules in order to perform the operations according to various embodiments of the present disclosure, and vice versa.
  • modules or programming modules may include at least one of the above described elements, exclude some of the elements, or further include other additional elements.
  • the operations performed by the modules, programming module, or other elements according to various embodiments of the present disclosure may be executed in a sequential, parallel, repetitive, or heuristic manner. Further, some operations may be executed according to another order or may be omitted, or other operations may be added.

Abstract

An electronic device and a method for controlling a camera device in the electronic device are provided. The electronic device includes a display, a camera device disposed at a location overlapping a partial area of the display, and a processor configured to receive, via a touch screen, touch input on a portion of the touch screen, a contact area of the received touch input including at least a part of an area where a hole is formed, perform an operation related with the camera based on the touch input, and display an image received through the camera on the touch screen based on the touch input.

Description

    PRIORITY
  • This application is a Continuation application of U.S. patent application Ser. No. 17/104,980, filed Nov. 25, 2020, now U.S. Pat. No. 11,516,380, issued Nov. 29, 2022, which is a Continuation application of U.S. patent application Ser. No. 15/407,943, filed Jan. 17, 2017, which claims priority under 35 U.S.C. § 119(a) to Korean Patent Application Serial No. 10-2016-0005293, which was filed in the Korean Intellectual Property Office on Jan. 15, 2016, the entire content of each of which is incorporated herein by reference.
  • BACKGROUND 1. Field of the Disclosure
  • The present disclosure relates generally to an apparatus and a method for controlling a camera device in an electronic device.
  • 2. Description of the Related Art
  • With the development of information, communication, and semiconductor technologies, various types of electronic devices have developed into devices that provide various multimedia services. For example, portable electronic devices may provide various services such as broadcast services, wireless Internet services, camera services, and music playback services.
  • The electronic device may provide the camera services through a plurality of camera devices to meet various user demands. For example, the electronic device may acquire images or videos through a front camera device disposed on the front surface of the electronic device and a back camera device disposed on the back surface.
  • SUMMARY
  • The electronic device may provide the camera service to a user of the electronic device by executing a camera application to control the plurality of camera devices. However, the user of the electronic device may feel inconvenience due to multiple controls for execution of the camera application. For example, when the user of the electronic device uses the camera service through an electronic device in which a message application is being executed, the user may feel inconvenience in executing the camera application through a second control after making the electronic device enter a standby mode through a first control. As another example, when the user of the electronic device uses the camera service through a locked electronic device, the user may feel inconvenience in executing the camera application through a second control after unlocking the electronic device through a first control.
  • The present disclosure has been made to address at least the above-mentioned problems and/or disadvantages and to provide at least the advantages described below.
  • Accordingly, an aspect of the present disclosure is to provide an electronic device and a method for easily controlling a camera device in an electronic device.
  • Accordingly, another aspect of the present disclosure is to provide an electronic device including a camera device disposed at a location overlapping at least a partial area of a display and a method for controlling the camera device based on an adjacent area including a placement area of the camera device or an adjacent area close to the placement area of the camera device, so that a user of the electronic device can easily drive and control a camera application.
  • Accordingly, another aspect of the present disclosure is to provide an electronic device including a camera device disposed at a location overlapping at least a partial area of a display and a method for displaying control information related to the camera device in an adjacent area including a placement area of the camera device or an adjacent area close to the placement area of the camera device, so that a user of the electronic device can easily detect an operational state of the camera device and a photo can be taken by drawing the user's eyes toward the camera lens.
  • In accordance with an aspect of the present disclosure, an electronic device is provided. The electronic device includes a display, a camera device disposed at a location overlapping a partial area of the display, and a processor configured to receive, via a touch screen, touch input on a portion of the touch screen, a contact area of the received touch input including at least a part of an area where a hole is formed, perform an operation related with the camera based on the touch input, and display an image received through the camera on the touch screen based on the touch input.
  • In accordance with another aspect of the present disclosure, a method of operating an electronic device comprising a camera configured to capture an image through a hole formed in a layer of a touch screen is provided. The method includes receiving, via the touch screen, a touch input on a portion of the touch screen, a contact area of the received touch input including at least a part of an area where the hole is formed, performing an operation related with the camera based on the touch input, and displaying an image received through the camera on the touch screen based on the touch input.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • The above and other aspects, features, and advantages of the present disclosure will be more apparent from the following detailed description taken in conjunction with the accompanying drawings, in which:
  • FIGS. 1A and 1B illustrate a configuration of an electronic device, according to an embodiment of the present disclosure;
  • FIG. 2 is a block diagram of an electronic device in a network environment, according to an embodiment of the present disclosure;
  • FIG. 3 is a flowchart of a process for controlling a camera device in an electronic device, according to an embodiment of the present disclosure;
  • FIG. 4 is a flowchart of a process for controlling a camera device in an electronic device when a screen of the electronic device is turned off, according to an embodiment of the present disclosure;
  • FIG. 5 is a flowchart of a process for detecting touch information of a camera control area in an electronic device, according to an embodiment of the present disclosure;
  • FIGS. 6A to 6D illustrate a screen configuration for controlling a camera device in an electronic device when a screen of the electronic device is turned off, according to an embodiment of the present disclosure;
  • FIG. 7 is a flowchart of a process for configuring a camera display area in an electronic device based on touch information of a camera control area, according to an embodiment of the present disclosure;
  • FIG. 8 is a flowchart of a process for controlling a camera device in an electronic device when a camera application is linked with another application, according to an embodiment of the present disclosure;
  • FIGS. 9A to 9E illustrate a screen configuration for controlling a camera device in an electronic device when a camera application is linked with another application, according to an embodiment of the present disclosure;
  • FIG. 10 is a flowchart of a process for controlling a camera device in an electronic device based on touch maintaining information of a camera control area, according to an embodiment of the present disclosure;
  • FIGS. 11A and 11B illustrate a screen configuration for controlling a camera device in an electronic device based on touch maintaining information of a camera control area, according to an embodiment of the present disclosure;
  • FIG. 12 is a flowchart of a process for setting a timer of a camera device based on touch information of a camera control area in an electronic device, according to an embodiment of the present disclosure;
  • FIGS. 13A to 13C illustrate a screen configuration for setting a timer of a camera device based on touch information of a camera control area in an electronic device, according to an embodiment of the present disclosure;
  • FIG. 14 is a flowchart of a process for providing a flash effect in an electronic device, according to an embodiment of the present disclosure;
  • FIGS. 15A to 15D illustrate a screen configuration for providing a flash effect in an electronic device, according to an embodiment of the present disclosure;
  • FIG. 16 is a flowchart of a process for providing a camera service through an application in an electronic device, according to an embodiment of the present disclosure;
  • FIGS. 17A and 17B illustrate a screen configuration for providing a camera service through an application in an electronic device, according to an embodiment of the present disclosure;
  • FIG. 18 is a flowchart of a process for providing a multi-camera service in an electronic device, according to an embodiment of the present disclosure;
  • FIGS. 19A to 19F illustrate a screen configuration for providing a multi-camera service in an electronic device, according to an embodiment of the present disclosure;
  • FIG. 20 is a flowchart of a process for providing an automatic photographing service in an electronic device, according to an embodiment of the present disclosure;
  • FIGS. 21A to 21C illustrate a screen configuration for providing an automatic photographing service in an electronic device, according to an embodiment of the present disclosure;
  • FIG. 22 is a flowchart of a process for controlling a camera device in accordance with a camera application in an electronic device, according to an embodiment of the present disclosure;
  • FIGS. 23A and 23B illustrate a screen configuration for controlling a camera device in accordance with a camera application in an electronic device, according to an embodiment of the present disclosure;
  • FIG. 24 is a flowchart of a process for controlling a camera device in accordance with photographing setting information of an image in an electronic device, according to an embodiment of the present disclosure;
  • FIG. 25 illustrates a screen configuration for controlling a camera device in accordance with photographing setting information of an image in an electronic device, according to an embodiment of the present disclosure;
  • FIG. 26 is a flowchart of a process for capturing an image based on touch information of a camera control area in an electronic device, according to an embodiment of the present disclosure;
• FIG. 27 is a flowchart of a process for photographing video based on touch information of a camera control area in an electronic device, according to an embodiment of the present disclosure;
  • FIG. 28 is a flowchart of a process for displaying camera driving limit information in an electronic device, according to an embodiment of the present disclosure;
  • FIG. 29 illustrates a screen configuration for displaying camera driving limit information in an electronic device, according to an embodiment of the present disclosure;
  • FIG. 30 is a flowchart of a process for providing a video call service in an electronic device, according to an embodiment of the present disclosure;
  • FIG. 31 illustrates a screen configuration for providing a video call service in an electronic device, according to an embodiment of the present disclosure;
  • FIG. 32 is a flowchart of a process for displaying human body recognition service information in an electronic device, according to an embodiment of the present disclosure;
  • FIGS. 33A to 33D illustrate a screen configuration for displaying human body recognition service information in an electronic device according to an embodiment of the present disclosure;
  • FIG. 34 is a flowchart of a process for displaying pollution level information of a camera device in an electronic device, according to an embodiment of the present disclosure; and
  • FIG. 35 illustrates a screen configuration for displaying pollution level information of a camera device in an electronic device, according to an embodiment of the present disclosure.
  • DETAILED DESCRIPTION OF EMBODIMENTS OF THE PRESENT DISCLOSURE
  • Hereinafter, various embodiments of the present disclosure will be described with reference to the accompanying drawings. In the following description, specific details, such as detailed configuration and components, are merely provided to assist the overall understanding of these embodiments of the present disclosure. Therefore, it should be apparent to those skilled in the art that various changes and modifications of the embodiments described herein can be made without departing from the scope and spirit of the present disclosure. In addition, descriptions of well-known functions and constructions are omitted for clarity and conciseness. In describing the drawings, similar reference numerals may be used to designate similar elements.
  • The terms “have” or “include” used in describing the various embodiments of the present disclosure indicate the presence of disclosed corresponding functions, operations, elements, features, numbers, steps, and the like, and do not limit the addition of one or more functions, operations, elements, features, numbers, steps and the like. The terms “A or B”, “at least one of A or/and B” or “one or more of A or/and B” used in the various embodiments of the present disclosure include any and all combinations of words enumerated with it. For example, “A or B”, “at least one of A and B” or “at least one of A or B” means (1) including A, (2) including B, or (3) including both A and B.
  • Although terms such as “first” and “second” used herein may modify various elements of various embodiments, these terms do not limit the corresponding elements. For example, these terms do not limit an order and/or importance of the corresponding elements. These terms may be used for the purpose of distinguishing one element from another element. For example, a first user device and a second user device both indicate user devices and may indicate different user devices. For example, a first element may be referred to as a second element without departing from the scope of the present disclosure, and similarly, a second element may be referred to as a first element.
• It should be understood that when an element (e.g., a first element) is “connected” or “coupled” to another element (e.g., a second element), the first element may be directly connected or coupled to the second element, or there may be an intervening element (e.g., a third element) between the first element and the second element. On the contrary, it will be understood that when an element (e.g., a first element) is “directly connected” or “directly coupled” to another element (e.g., a second element), there is no intervening element (e.g., a third element) between the first element and the second element.
• The expressions “configured to” or “set to” used in describing various embodiments of the present disclosure may be used interchangeably with “suitable for”, “having the capacity to”, “designed to”, “adapted to”, “made to”, or “capable of” according to the situation. The terms “configured to” or “set to” do not necessarily mean “specifically designed to” at a hardware level. Instead, the expression “apparatus configured to . . . ” may mean that the apparatus is “capable of . . . ” along with other devices or parts in a certain situation. For example, “a processor configured to (set to) perform A, B, and C” may be a dedicated processor, e.g., an embedded processor, for performing a corresponding operation, or a general-purpose processor, e.g., a central processing unit (CPU) or an application processor (AP), capable of performing a corresponding operation by executing one or more software programs stored in a memory device.
  • The terms as used herein are used merely to describe certain embodiments and are not intended to limit the present disclosure. As used herein, singular forms may include plural forms as well unless the context explicitly indicates otherwise. Further, all the terms used herein, including technical and scientific terms, should be interpreted to have the same meanings as commonly understood by those skilled in the art to which the present disclosure pertains, and should not be interpreted to have ideal or excessively formal meanings unless explicitly defined in various embodiments of the present disclosure.
  • An electronic device according to various embodiments of the present disclosure may include at least one of a smartphone, a tablet personal computer (PC), a mobile phone, a video phone, an electronic book (e-book) reader, a laptop PC, a netbook computer, a workstation, a personal digital assistant (PDA), a portable multimedia player (PMP), an MP3 player, a mobile medical appliance, a camera, and a wearable device (e.g., smart glasses, a head-mounted-device (HMD), electronic clothes, an electronic bracelet, an electronic necklace, an electronic appcessory, an electronic tattoo, a smart mirror, or a smart watch).
• According to another embodiment, the electronic device may include at least one of various medical devices (e.g., various portable medical measuring devices (a blood glucose monitoring device, a heart rate monitoring device, a blood pressure measuring device, a body temperature measuring device, etc.)), a navigation device, a global positioning system (GPS) receiver, an event data recorder (EDR), a flight data recorder (FDR), a vehicle infotainment device, an electronic device for a ship (e.g., a navigation device and a gyro-compass), an avionics device, a security device, or an Internet of Things (IoT) device (e.g., a light bulb, various sensors, an electric or gas meter, a sprinkler device, a fire alarm, a thermostat, a streetlamp, a toaster, sporting goods, a hot water tank, a heater, a boiler, etc.).
• The electronic device according to some embodiments of the present disclosure may be a flexible device. Further, the electronic device according to an embodiment of the present disclosure is not limited to the aforementioned devices, and may include a new electronic device according to the development of technology.
  • Hereinafter, an electronic device according to various embodiments will be described with reference to the accompanying drawings. As used herein, the term “user” may indicate a person who uses an electronic device or a device (e.g., an artificial intelligence electronic device) that uses an electronic device.
  • FIGS. 1A and 1B illustrate a configuration of an electronic device, according to an embodiment of the present disclosure.
  • Referring to FIGS. 1A and 1B, an electronic device 100 is provided. The electronic device 100 may be configured as one body. For example, the electronic device 100 may be an electronic device for communication including a speaker device 130 and a microphone device 140 for a voice call.
  • The electronic device 100 may have a front surface configured by a touch screen 110. For example, a camera device 120 may be disposed on at least some areas of the touch screen 110.
  • The speaker device 130 may be disposed on at least one surface adjacent to the touch screen 110 (for example, an upper side surface, lower side surface, left side surface, and right side surface). For example, the speaker device 130 may be disposed on the upper side surface adjacent to the touch screen 110 close to the user's ear for a voice call.
  • Control buttons (for example, a home button and a back button) for controlling the electronic device 100 may be displayed in a lower area of the touch screen 110.
• The touch screen 110 of the electronic device 100 may include a front window 140, a touch panel 150, a display module 160, and a printed circuit board (PCB) 170, as illustrated in FIG. 1B. For example, the camera device (for example, a front camera device) 180 of the electronic device 100 may be mounted on the PCB. For example, the front window 140 may be a transparent material window film that forms an external surface of the touch screen 110. For example, the PCB 170 may use a flexible PCB (FPCB), which is an electronic component made by forming a conductive circuit having good electrical conductivity (e.g., copper) on an insulator.
  • According to an embodiment, the camera device 180 may be disposed at a position overlapping at least some areas 152 of the touch panel 150. For example, at least some areas 152 of the touch panel 150 on which the camera device 180 is disposed may be perforated. In this case, the touch screen 110 may have a limited touch recognition function in the area 152 on which the camera device 180 is disposed. Alternatively, at least some areas 152 of the touch panel 150 on which the camera device 180 is disposed are not perforated, and a touch pattern for touch recognition may be omitted in the corresponding areas 152. In this case, the touch screen 110 may have a limited touch recognition function in the area 152 on which the camera device 180 is disposed. Alternatively, at least some areas 152 of the touch panel 150 on which the camera device 180 is disposed are not perforated, and the touch pattern for touch recognition may be set in the corresponding areas 152. In this case, the touch screen 110 may detect a touch input through the areas 152 on which the camera device 180 is disposed. For example, the touch pattern may include an electrode for the touch recognition.
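• For illustration only, the following Kotlin sketch shows one way such a hit test might be performed: deciding whether a touch's contact area includes at least a part of the area where the hole for the camera device is formed. The TouchContact type, the hole bounds, and the circular contact approximation are assumptions made for this sketch, not part of the disclosed device.

```kotlin
import android.graphics.RectF

// Hypothetical sketch: test whether a touch contact area overlaps the
// camera hole area on the touch panel. Names and the circular contact
// approximation are illustrative assumptions.
data class TouchContact(val x: Float, val y: Float, val majorRadius: Float)

class CameraHoleRegion(private val holeBounds: RectF) {

    // True when the touch's approximate contact circle intersects the
    // hole bounds, i.e., the contact area includes at least a part of
    // the area where the hole is formed.
    fun touchCoversHole(contact: TouchContact): Boolean {
        val contactBounds = RectF(
            contact.x - contact.majorRadius,
            contact.y - contact.majorRadius,
            contact.x + contact.majorRadius,
            contact.y + contact.majorRadius
        )
        return RectF.intersects(contactBounds, holeBounds)
    }
}
```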
  • According to an embodiment, the camera device 180 may be disposed at a position overlapping some areas 162 of the display module 160. For example, at least some areas 162 of the display module 160 on which the camera device 180 is disposed may be perforated. In this case, the touch screen 110 may have a limited display function on the areas 162 in which the camera device 180 is disposed. Alternatively, at least some areas 162 of the display module 160 on which the camera device 180 is disposed are not perforated, and a display component may not be disposed in the corresponding areas 162. In this case, the touch screen 110 may have a limited display function on the areas 162 in which the camera device 180 is disposed. Alternatively, at least some areas 162 of the display module 160 on which the camera device 180 is disposed may not be perforated, and a display component may be disposed. In this case, the touch screen 110 may display information through the areas 162 in which the camera device 180 is disposed.
  • The electronic device 100 may form at least one hole in at least some areas (upper end) of the touch screen 110 and place the speaker device 130 for a voice call service in the at least one hole.
  • FIG. 2 is a block diagram of an electronic device in a network environment, according to an embodiment of the present disclosure.
  • Referring to FIG. 2 , an electronic device 201 is provided. The electronic device 201 may include a bus 210, a camera device 220, a processor 230 (e.g., including processing circuitry), a memory 240, an input/output interface 260 (e.g., including input/output circuitry), a display 270 (e.g., including display circuitry), and a communication interface 280 (e.g., including communication circuitry). In some embodiments, the electronic device 201 may omit at least one of the elements, or may further include other elements.
  • The bus 210 is a circuit that interconnects the elements 220 to 280 and transfers communication (for example, control messages and/or data) between the elements.
• The camera device 220 may collect image information of a subject. For example, the camera device 220 may include a plurality of camera devices included in the electronic device 201. For example, the camera device 220 may include a first camera device (for example, front camera device) for performing photography in a selfie mode and a second camera device (for example, back camera device) for photographing a subject located in front of the user. For example, the camera device 220 may be disposed to be included in at least some areas of the display 270. For example, an image sensor of the first camera device may be disposed in at least some areas of the display 270. For example, the image sensor may use a charge-coupled device (CCD) image sensor or a complementary metal-oxide semiconductor (CMOS) image sensor.
  • The processor 230 may include one or more of a central processing unit (CPU), an application processor (AP), and a communication processor (CP). For example, the processor 230 may execute calculations or data processing about controls and/or communication of at least one other element of the electronic device 201. The processor 230 may perform various functions of the electronic device 201. Accordingly, the processor 230 may control the elements of the electronic device 201.
• The processor 230 may control the camera device 220 based on touch information of a preset camera control area. For example, when some areas of the touch panel are perforated to place the camera device 220, the processor 230 may set at least some areas adjacent to the placement areas of the camera device 220 on the touch panel as camera control areas. For example, when some areas of the touch panel corresponding to the placement areas of the camera device 220 are not perforated and a touch pattern is omitted in the corresponding areas, the processor 230 may set at least some areas adjacent to the placement areas of the camera device 220 on the touch panel as camera control areas. Alternatively, when some areas of the touch panel corresponding to the placement areas of the camera device 220 are not perforated and the touch pattern is set in the corresponding areas, the processor 230 may set the placement areas of the camera device 220 on the touch panel and at least some areas adjacent to the placement areas of the camera device 220 as camera control areas, as sketched below.
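• As a minimal sketch of this idea, the camera control area can be modeled as the camera placement rectangle expanded by an adjacent margin. The margin value below is an illustrative assumption, not a value specified by the disclosure.

```kotlin
import android.graphics.RectF

// Illustrative sketch: derive a camera control area from the camera
// placement area by expanding it with an adjacent margin. The margin
// is an assumed value for illustration only.
fun cameraControlArea(placement: RectF, marginPx: Float = 48f): RectF =
    RectF(
        placement.left - marginPx,
        placement.top - marginPx,
        placement.right + marginPx,
        placement.bottom + marginPx
    )
```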
  • The processor 230 may drive the camera device 220 based on a touch and a drag input in the camera control area. For example, when the touch input in the camera control area is detected, the processor 230 may control the display 270 to display camera activation information in a display area corresponding to the camera control area. For example, when the display 270 is deactivated, the processor 230 may maintain the touch recognition function of the camera control area in an active state. Accordingly, the processor 230 may detect the touch input in the camera control area in an inactive state of the display 270. When the drag input of the camera activation information is detected, the processor 230 may execute a camera application to start a front camera mode. For example, the processor 230 may set a camera display area to display a service screen of the camera application based on a distance of the drag input. The processor 230 may control the display 270 to display the service screen of the camera application (for example, a preview image acquired through the camera device 220) in the camera display area.
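• A hedged sketch of the drag-to-reveal behavior described above: the height of the camera display area grows with the drag distance, and the whole display is used once the drag exceeds a reference distance. The linear mapping and the reference distance are assumptions for illustration.

```kotlin
// Sketch: map a drag distance from the camera activation information
// to a camera display area height; past the reference distance the
// full display height is used. The linear mapping is an assumption.
fun cameraDisplayHeight(
    dragDistancePx: Float,
    displayHeightPx: Float,
    referenceDistancePx: Float
): Float =
    if (dragDistancePx >= referenceDistancePx) displayHeightPx
    else (dragDistancePx / referenceDistancePx) * displayHeightPx
```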
  • The processor 230 may drive the camera device 220 based on a touch and a touch maintaining time in the camera control area. For example, when the touch input in the camera control area is detected, the processor 230 may control the display 270 to display camera activation information in the display area corresponding to the placement area of the camera device 220. When the touch maintaining time of the camera activation information exceeds a reference time, the processor 230 may execute the camera application to start the front camera mode, for example. In this case, the processor 230 may control the display 270 to display the preview image acquired through the front camera device.
  • The processor 230 may control the camera application to be linked with another application. For example, when the touch and the drag input in the camera control area are detected in a state where a service screen of another application is displayed, the processor 230 may display the service screen of the camera application in at least some areas of the display 270 based on a distance of the drag input. That is, the processor 230 may divide the display 270 into a first area and a second area based on the distance of the drag input. The processor 230 may control the display 270 to display a service screen of another application in the first area of the display 270 and to display a service screen of the camera application in the second area. When an image is captured (or acquired) through the camera application, the processor 230 may determine whether the camera application can be linked with the other application. When the camera application can be linked with the other application, the processor 230 may set the image captured through the camera application as contents to be controlled in the other application. When the camera application cannot be linked with the other application, the processor 230 may store the image captured through the camera application in the memory 240. For example, the processor 230 may end the camera application when the image is captured.
  • The processor 230 may set a timer of the camera device 220 to capture an image based on touch information (for example, at least one of the touch input and the drag input) in the camera control area. For example, when the drag input in the camera control area is detected while the camera application is executed (for example, in the front camera mode), the processor 230 may set the timer of the camera device 220 to correspond to a drag distance. That is, the processor 230 may set a time of the timer in proportion to the drag distance. The processor 230 may control the display 270 to display timer information based on the placement area of the camera device 220. The processor 230 may continuously reduce a size of the timer information displayed on the display 270 in accordance with the elapsing of the time of the timer. When the display of the timer information is removed from the display 270, the processor 230 may capture an image. For example, when the touch input in the camera control area is detected while the camera application is executed (for example, in the front camera mode), the processor 230 may set the timer of the camera device 220 based on a touch position. That is, the processor 230 may set the time of the timer in proportion to a distance between the placement area of the camera device 220 and the touch position. For example, the timer of the camera device 220 may include a photographing timer of the camera device 220 to capture an image.
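• The proportional timer can be sketched as follows; the scale factor and the upper bound are illustrative assumptions, since the disclosure only states that the time is set in proportion to the distance.

```kotlin
// Sketch: timer duration proportional to the drag distance (or to the
// distance between the camera placement area and the touch position).
// Scale factor and clamp are assumed values for illustration.
fun timerSeconds(
    distancePx: Float,
    secondsPerHundredPx: Float = 1f,
    maxSeconds: Float = 10f
): Float = ((distancePx / 100f) * secondsPerHundredPx).coerceIn(0f, maxSeconds)
```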
  • The processor 230 may change a color of the display 270 to secure an amount of light to capture the image. For example, when the image is captured through the front camera device, the processor 230 may change the color of the display 270 into a bright color (for example, white) based on the placement area of the camera device 220 and provide a flash effect. For example, the processor 230 may apply various image effects by changing the color of the display 270 in accordance with a user input.
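• One way the flash effect might look in code is sketched below: a full-screen overlay view is briefly animated to white around the capture moment to add light for the front camera. The overlay view and the animation timing are assumptions, not the disclosed implementation.

```kotlin
import android.animation.ObjectAnimator
import android.graphics.Color
import android.view.View

// Illustrative sketch of the display flash effect for the front camera:
// fade a full-screen overlay to white and back at capture time.
// The overlay and duration are assumptions.
fun flashDisplay(overlay: View, durationMs: Long = 300L) {
    overlay.setBackgroundColor(Color.WHITE)
    overlay.alpha = 0f
    overlay.visibility = View.VISIBLE
    ObjectAnimator.ofFloat(overlay, View.ALPHA, 0f, 1f, 0f)
        .setDuration(durationMs)
        .start()
}
```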
  • The processor 230 may control the display 270 to display additional information for a camera service through the camera control area. For example, when the image is captured through the front camera device, the processor 230 may control the display 270 to display a graphic effect (for example, wavelength image) based on the placement area of the camera device 220 to induce a user's eyes to the front camera device. For example, when video is photographed through the back camera device, the processor 230 may control the display 270 to display audio input information based on the placement area of the camera device 220. For example, the processor 230 may control a size of audio input information to correspond to a size of an audio signal collected through the microphone device 140 while the video is photographed.
  • The processor 230 may execute the camera application based on touch information of an application icon. For example, when at least one of a touch input and a drag input for the application icon is detected, the processor 230 may identify whether the application icon enters the camera control area. When the application icon enters the camera control area, the processor 230 may identify whether an application corresponding to the application icon is linked with the camera application. When the application corresponding to the application icon is linked with the camera application, the processor 230 may execute a camera function (for example, front camera mode) of the application corresponding to the application icon. For example, when the touch input for the application icon is released within the camera control area, the processor 230 may determine that the application icon enters the camera control area.
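• A minimal sketch of the icon-drop check, assuming a hypothetical AppIcon type and a launch callback that are not part of the disclosure:

```kotlin
import android.graphics.RectF

// Hypothetical sketch: when a dragged application icon is released
// inside the camera control area and the application is linked with the
// camera application, launch its camera function (e.g., front camera
// mode). AppIcon and the callback are illustrative assumptions.
data class AppIcon(val packageName: String, val hasCameraFunction: Boolean)

fun onIconDragReleased(
    releaseX: Float,
    releaseY: Float,
    controlArea: RectF,
    icon: AppIcon,
    launchCameraFunction: (String) -> Unit
) {
    if (controlArea.contains(releaseX, releaseY) && icon.hasCameraFunction) {
        launchCameraFunction(icon.packageName)
    }
}
```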
• The processor 230 may execute a multi-camera mode based on touch information of the camera control area. For example, the processor 230 may provide a camera service of one of a plurality of camera devices. When a tap input in the camera control area is detected while the camera service is provided, the processor 230 may switch to the multi-camera mode in which the plurality of camera devices are simultaneously activated. For example, when a tap input in the camera control area is detected while the front camera mode is executed, the processor 230 may additionally activate the back camera device and execute the multi-camera mode. In this case, the display 270 may display the preview images acquired through the front camera device and the back camera device so that they overlap each other, or may display the preview images in different areas. In addition, when a tap input in the camera control area is detected while the multi-camera mode is executed, the processor 230 may switch positions of the preview images. The processor 230 may control sizes of the preview images based on input information detected through the input/output interface 260.
  • The processor 230 may provide an automatic photographing service based on at least one of a location and an angle of the electronic device 201 in the front camera mode. For example, when the automatic photographing mode is set, the processor 230 may display a camera image corresponding to the location and the angle of the electronic device 201 to be adjacent to the placement area of the camera device 220. That is, the processor 230 may display the camera image corresponding to the location and the angle of the electronic device 201 to allow the user to control the location and the angle of the electronic device 201 to match photographing information. When the location and the angle of the electronic device 201 match the photographing information, the processor 230 may automatically capture the image. For example, the photographing information may be set by a user input or may include at least one of the location and the angle of the electronic device 201 that match the image acquired through the front camera mode.
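• The matching test for automatic photographing might be sketched as an angle comparison against the stored photographing information; the pitch/roll representation and the tolerance are assumptions for illustration.

```kotlin
import kotlin.math.abs

// Hedged sketch of the automatic photographing check: capture once the
// device pitch and roll (e.g., from orientation sensors) fall within a
// tolerance of the target angles in the photographing information.
// The tolerance value is an assumption.
fun shouldAutoCapture(
    pitchDeg: Float,
    rollDeg: Float,
    targetPitchDeg: Float,
    targetRollDeg: Float,
    toleranceDeg: Float = 3f
): Boolean =
    abs(pitchDeg - targetPitchDeg) <= toleranceDeg &&
        abs(rollDeg - targetRollDeg) <= toleranceDeg
```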
  • The processor 230 may set the camera application to control the camera device 220 based on touch information of the camera control area. For example, when a drag input in a first direction (for example, a horizontal direction) in the camera control area is detected, the processor 230 may control the display 270 to display a camera application list installed in the electronic device 201. The processor 230 may select one first camera application based on input information detected through the input/output interface 260. The processor 230 may control the camera device 220 by executing the first camera application. That is, the processor 230 may drive the camera device 220 based on camera setting information set to the first camera application. In addition, the processor 230 may set the first camera application as a basic camera application. Accordingly, when the camera device 220 is driven based on touch information of the camera control area, the processor 230 may execute the first camera application. For example, the camera setting information may include at least one of a filter for photographing, a photographing mode, and a photographing setting value (for example, aperture, shutter speed, image size, and the like).
  • The processor 230 may control the camera device 220 in accordance with the camera setting information of the image. For example, the processor 230 may control the display 270 to display a list of images stored in the memory 240. For example, when the image is acquired, the processor 230 may control the display 270 to display corresponding filter information in the image to which a filter is applied. When a touch and a drag input for a first image is detected in the image list, the processor 230 may identify whether the first image enters the camera control area. When the first image enters the camera control area, the processor 230 may drive the camera device 220 in accordance with camera setting information of the first image. For example, when a touch input for the first image is released within the camera control area, the processor 230 may determine that the first image enters the camera control area.
  • The processor 230 may capture an image based on touch information of the camera control area. For example, when a double tap input in the camera control area is detected, the processor 230 may capture an image through the camera device 220 without executing the camera application.
  • The processor 230 may photograph video based on touch information of the camera control area. For example, when a touch maintaining time of the camera control area exceeds a reference time, the processor 230 may photograph video through the camera device 220 without executing the camera application. When the touch input in the camera control area is released, the processor 230 may end the photographing of the video. For example, when the touch maintaining time in the camera control area exceeds the reference time, the processor 230 may output notification information to allow the user to recognize the start of the video photographing. Here, the notification information may include at least one of a notification sound, a notification message, and a vibration.
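• The two gestures above (a double tap to capture an image, and a hold past a reference time to record video until release) can be sketched with a small state holder; the timing constants and the callbacks are illustrative assumptions.

```kotlin
// Illustrative sketch of the camera control area gestures described
// above: a double tap captures a still image; holding past a reference
// time starts video recording, which stops when the touch is released.
// Timing constants and callbacks are assumptions.
class ControlAreaGestureHandler(
    private val captureImage: () -> Unit,
    private val startVideo: () -> Unit,
    private val stopVideo: () -> Unit,
    private val doubleTapWindowMs: Long = 300L,
    private val holdReferenceMs: Long = 800L
) {
    private var lastTapUpAtMs = 0L
    private var downAtMs = 0L
    private var recording = false

    fun onTouchDown(nowMs: Long) { downAtMs = nowMs }

    // Expected to be polled or scheduled while the touch is held.
    fun onTouchHeld(nowMs: Long) {
        if (!recording && nowMs - downAtMs >= holdReferenceMs) {
            startVideo() // e.g., accompanied by a notification sound or vibration
            recording = true
        }
    }

    fun onTouchUp(nowMs: Long) {
        when {
            recording -> { stopVideo(); recording = false }
            nowMs - lastTapUpAtMs <= doubleTapWindowMs -> captureImage()
            else -> lastTapUpAtMs = nowMs
        }
    }
}
```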
  • When driving of the camera device 220 is limited, the processor 230 may display driving limit information to be adjacent to the placement area of the camera device 220. For example, when the driving of the camera device 220 is limited by an application being executed in the electronic device 201, the processor 230 may display driving limit information to be adjacent to the placement area of the camera device 220. For example, when the driving of the camera device 220 is limited based on a position of the electronic device 201, the driving limit information may be displayed to be adjacent to the placement area of the camera device 220. In addition, when a touch input in the camera control area is detected in a state where the driving of the camera device 220 is limited, the processor 230 may execute a camera setting menu.
  • The processor 230 may display notification information of a communication service using an image to be adjacent to the placement area of the camera device 220. For example, when a video call signal is received, the processor 230 may display video call notification information to be adjacent to the placement area of the camera device 220. The processor 230 may determine whether to accept the video call based on touch information of an area where the video call notification information is displayed.
  • When a human body recognition service using the camera device 220 is provided, the processor 230 may display human body recognition information (for example, face recognition) to be adjacent to the placement area of the camera device 220. For example, when iris recognition is performed through the camera device 220, the processor 230 may display time information required for the iris recognition based on the placement area of the camera device 220 to allow the user to recognize the time information corresponding to a time during which the user should look at the camera device 220 for the iris recognition. The processor 230 may further display progress time information of the iris recognition. For example, when the time information required for the iris recognition matches the progress time information of the iris recognition, the processor 230 may complete the iris recognition.
  • The processor 230 may display pollution level information of the camera device 220 to be adjacent to the placement area of the camera device 220. For example, the processor 230 may estimate a pollution level of the image sensor of the camera device 220 by detecting the definition of the image acquired through the camera device 220. The processor 230 may display the pollution level information of the image sensor of the camera device 220 to be adjacent to the placement area of the camera device 220. For example, when the pollution level of the image sensor of the camera device 220 exceeds a reference value, the processor 230 may display pollution level information to be adjacent to the placement area of the camera device 220.
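• The disclosure does not specify how the definition of the image is measured; one common proxy, shown here only as an assumption, is the variance of a Laplacian response, where a persistently low value can suggest a smudged image sensor window.

```kotlin
// Hedged sketch: Laplacian variance as a proxy for image definition
// (sharpness). This measure is an assumption for illustration, not the
// algorithm disclosed above. Input is a grayscale image as rows of
// luminance values.
fun laplacianVariance(gray: Array<FloatArray>): Double {
    val h = gray.size
    val w = gray[0].size
    require(h >= 3 && w >= 3) { "image too small" }
    val responses = ArrayList<Double>()
    for (y in 1 until h - 1) {
        for (x in 1 until w - 1) {
            val lap = (gray[y - 1][x] + gray[y + 1][x] +
                gray[y][x - 1] + gray[y][x + 1] -
                4f * gray[y][x]).toDouble()
            responses.add(lap)
        }
    }
    val mean = responses.average()
    return responses.sumOf { (it - mean) * (it - mean) } / responses.size
}

// Corresponding "pollution level exceeds a reference value" check:
// low sharpness maps to a high estimated pollution level.
fun isSensorLikelyDirty(variance: Double, referenceSharpness: Double = 50.0): Boolean =
    variance < referenceSharpness
```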
  • When a user's hovering input is detected through the camera control area, the processor 230 may display a guide image to induce a touch of another area adjacent to the placement area of the camera device 220.
  • The memory 240 may include a volatile memory and/or a non-volatile memory. For example, the memory 240 may store instructions or data related to at least one other element of the electronic device 201. The memory 240 may store software and/or a program 250. For example, the program 250 may include a kernel 251, middleware 253, an application programming interface (API) 255, and an application program 257. At least some of the kernel 251, the middleware 253, and the API 255 may be referred to as an operating system (OS).
  • The input/output interface 260 may function as an interface that may transfer instructions or data input from a user or another external device to the other elements of the electronic device 201. Furthermore, the input/output interface 260 may output instructions or data, which are received from the other elements of the electronic device 201, to the user or the external device. For example, the input/output interface 260 may include a touch panel that detects a touch input or a hovering input using an electronic pen or a user's body part. For example, the input/output interface 260 may receive a gesture or a proximity input using an electronic pen or a user's body part.
  • The display 270 may display various types of contents (for example, text, images, videos, icons, symbols, or the like) to a user. For example, at least some areas (for example, upper areas) of the display 270 may be perforated for placement of the camera device 220. Accordingly, the display 270 may limit the display function in the placement area of the camera device 220. According to an embodiment, the display 270 may be implemented by a touch screen coupled with the touch panel of the input/output interface 260.
  • The communication interface 280 may establish communication between the electronic device 201 and an external device. For example, the communication interface 280 may communicate with a first external electronic device 202 through short-range communication 284 or wired communication. The communication interface 280 may be connected to a network 282 through wireless or wired communication to communicate with a second external electronic device 204 or a server 206.
  • According to an embodiment, the network 282 may include at least one of a communication network, a computer network (for example, a LAN or a WAN), the Internet, and a telephone network.
  • FIG. 3 is a flowchart of a process for controlling a camera device in an electronic device, according to an embodiment of the present disclosure. In the following description, the electronic device may include the electronic device 201 or at least a part (for example, processor 230) of the electronic device 201.
  • Referring to FIG. 3 , the electronic device 201 identifies whether a touch input for the camera control area related to the placement area of the camera device 220 is detected on the touch screen in operation 301. For example, when some areas of the touch panel are perforated for placement of the camera device 220, the camera control area may be disposed on at least some areas adjacent to the placement area of the camera device 220 on the touch screen. For example, when some areas of the touch panel corresponding to the placement area of the camera device 220 are not perforated and a touch pattern is omitted in the areas, the camera control area may be disposed on at least some areas adjacent to the placement area of the camera device 220 on the touch screen. For example, when some areas of the touch panel corresponding to the placement area of the camera device 220 are not perforated and the touch pattern is set in the areas, the camera control area may be disposed on the placement area of the camera device 220 and at least some areas adjacent to the placement area of the camera device 220 on the touch screen.
  • When the touch input for the camera control area is detected, the electronic device 201 detects a control function of the camera device 220 corresponding to the touch input in operation 303. For example, the processor 230 detects the control function of the camera device 220 based on at least one of the number of touches in the camera control area, a drag distance (i.e., touch motion distance), a drag direction (i.e., touch motion direction), and a touch maintaining time. For example, the control function of the camera device 220 may include at least one of driving of the camera device 220, selection of an application for driving the camera device 220, camera setting information, image capturing, video photographing, timer setting, and camera mode switching.
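• Operation 303 can be pictured as a dispatch from observed touch attributes to a control function, as in the following sketch; the enum values and the thresholds are illustrative assumptions.

```kotlin
// Minimal sketch of operation 303: classify touch information from the
// camera control area into a camera control function. Enum names and
// thresholds are assumptions for illustration.
enum class CameraControl { DRIVE_CAMERA, CAPTURE_IMAGE, RECORD_VIDEO, SET_TIMER }

data class TouchInfo(val tapCount: Int, val dragDistancePx: Float, val holdMs: Long)

fun classify(info: TouchInfo): CameraControl = when {
    info.tapCount >= 2 -> CameraControl.CAPTURE_IMAGE   // double tap
    info.holdMs > 800L -> CameraControl.RECORD_VIDEO    // long hold
    info.dragDistancePx > 0f -> CameraControl.SET_TIMER // drag
    else -> CameraControl.DRIVE_CAMERA                  // single touch
}
```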
  • The electronic device 201 drives the camera device 220 based on the control function of the camera device 220 corresponding to the touch input in the camera control area in operation 305. For example, the processor 230 controls the camera device 220 by executing the camera application in accordance with the control function of the camera device 220.
  • FIG. 4 is a flowchart of a process for controlling a camera device in an electronic device when a screen of the electronic device is turned off, according to an embodiment of the present disclosure. FIGS. 6A to 6D illustrate a screen configuration for controlling a camera device in an electronic device when a screen of the electronic device is turned off, according to an embodiment of the present disclosure. In the following description, the electronic device may include the electronic device 201 or at least a part (for example, processor 230) of the electronic device 201.
• Referring to FIG. 4 , an operation for controlling the camera device 220 of the electronic device 201 based on the screen configuration shown in FIGS. 6A to 6D will be described. The electronic device 201 determines whether a first touch input is detected through a camera control area set based on a placement area of the camera device 220 on the touch screen in operation 401. For example, referring to FIG. 6A, when the display 270 is deactivated, the processor 230 maintains a touch recognition function of the camera control area in an active state. The processor 230 determines whether a first type touch input is detected through the camera control area. For example, the first type touch input may correspond to the type of touch in which the user rubs the camera control area and include a touch input having a continuously changing drag direction. For example, when the display 270 is activated, the processor 230 determines whether the first type touch input is detected through the camera control area. For example, the touch recognition function of the camera control area may be activated or deactivated based on the type of an application driven in the electronic device 201.
  • When a first touch input is not detected through the camera control area, the electronic device 201 terminates the operation for controlling the driving of the camera device 220.
  • When the first touch input is detected through the camera control area, the electronic device 201 displays camera activation information in a display area corresponding to the camera control area in operation 403. For example, when the first type touch input is detected through the camera control area as indicated by reference numeral 610 in FIG. 6A, the processor 230 may display camera activation information 620 based on the placement area of the camera device 220, as illustrated in FIG. 6B.
  • The electronic device 201 determines whether a second touch input for the camera activation information is detected in operation 405. For example, the processor 230 determines whether a drag input 630 for the camera activation information 620 is detected within a reference time from a time point when the camera activation information 620 is displayed, as illustrated in FIG. 6B.
• When the second touch input is not detected before the reference time passes from the time point when the camera activation information is displayed, the electronic device 201 may determine not to drive the camera device 220. Accordingly, the electronic device 201 may terminate the operation for controlling driving of the camera device 220.
  • When the second touch input for the camera activation information is detected, the electronic device 201 drives the camera device 220 in operation 407. For example, when a drag input for the camera activation information 620 is detected as indicated by reference numeral 630, the processor 230 may display at least some of the service screen of the camera application in accordance with a distance of the drag input, as illustrated in FIG. 6C. When the drag distance exceeds a reference distance, the processor 230 may display the service screen of the camera application on the display 270 as indicated by reference numeral 650 in FIG. 6D. For example, the processor 230 may display a preview image acquired through the front camera device 220 on the display 270 by executing the camera application. For example, when the drag input for the camera activation information 620 is detected as indicated by reference numeral 630, the processor 230 may display the service screen (for example, a preview image) of the camera application in at least some areas of the display 270 in accordance with the drag input.
• When the touch input for the drag is released before the drag distance exceeds the reference distance, the electronic device 201 may determine not to drive the camera device 220. Accordingly, the electronic device 201 may remove the service screen of the camera application, as illustrated in FIG. 6A.
  • FIG. 5 is a flowchart of a process for detecting touch information of a camera control area in an electronic device, according to an embodiment of the present disclosure. In the following description, the electronic device may include the electronic device 201 or at least a part (for example, processor 230) of the electronic device 201.
• Referring to FIG. 5 , the operation for detecting a first touch input through the camera control area in operation 401 of FIG. 4 will be described. The electronic device 201 determines whether the display 270 is deactivated in operation 501. For example, the processor 230 determines whether the operation state of the display 270 has switched to an inactive state because the electronic device 201 is operating in a low power mode.
  • When the display 270 is deactivated, the electronic device 201 maintains the touch recognition function of the camera control area in an active state in operation 503. For example, when the display 270 is deactivated as indicated by reference numeral 600 in FIG. 6A, the processor 230 maintains a touch recognition function of the camera control area in an active state.
  • In operation 505, the electronic device 201 determines whether a touch input is detected through the camera control area. For example, the processor 230 determines whether a touch input of the type in which the user rubs the touch screen is detected through the camera control area having the activated touch recognition function as illustrated in FIG. 6A.
  • When the touch input is not detected through the camera control area, the electronic device 201 maintains the touch recognition function of the camera control area in the active state in operation 503.
  • When the display 270 is activated, the electronic device 201 determines whether the touch input is detected in operation 507. For example, when the display 270 is in the active state, the processor 230 maintains the touch recognition function of the touch panel corresponding to the display 270 in the active state. Accordingly, the processor 230 determines whether the touch input is detected through the touch panel in the active state.
  • When the touch input is not detected through the display in the active state, the electronic device 201 determines whether the display 270 is deactivated again in operation 501.
  • When the touch input is detected through the display 270 in the active state, the electronic device 201 determines whether the touch input is detected through the camera control area in operation 509. For example, the processor 230 determines whether a touch coordinate of the touch input is included in the camera control area.
  • FIG. 7 is a flowchart of a process for configuring a camera display area in an electronic device based on touch information of a camera control area, according to an embodiment of the present disclosure. In the following description, the electronic device may include the electronic device 201 or at least a part (for example, processor 230) of the electronic device 201.
  • Referring to FIG. 7 , an operation for driving the camera device in operation 407 of FIG. 4 will be described. When the touch input for driving the camera device is detected through the camera control area (i.e., operation 405 of FIG. 4 ), the electronic device 201 sets the camera display area based on a second touch input in operation 701. For example, when the drag input for the camera activation information 620 is detected, as illustrated in FIG. 6B, the processor 230 sets at least some areas of the display 270 as the camera display area for displaying the service screen of the camera application in accordance with a drag distance. For example, when the drag distance for the camera activation information 620 exceeds a reference distance, the processor 230 sets the entire area of the display 270 as the camera display area.
  • The electronic device 201 may drive the camera device 220 based on the camera display area in operation 703. For example, the processor 230 may display a preview image acquired through the front camera device in the camera display area of the display 270 by executing the camera application.
  • FIG. 8 is a flowchart of a process for controlling a camera device in an electronic device when a camera application is linked with another application, according to an embodiment of the present disclosure. FIGS. 9A to 9E illustrate a screen configuration for controlling a camera device in an electronic device when a camera application is linked with another application, according to an embodiment of the present disclosure. In the following description, the electronic device may include the electronic device 201 or at least a part (for example, processor 230) of the electronic device 201.
  • Referring to FIG. 8 , the electronic device 201 may drive a first application among at least one application installed in the electronic device 201 in operation 801. For example, referring to FIG. 9A, when a messenger application is selected from at least one application installed in the electronic device 201 based on input information detected through the input/output interface 260, the processor 230 displays a service screen 900 of the messenger application on the display 270.
• The electronic device 201 determines whether a first touch input for the camera control area is detected in operation 803. For example, referring to FIG. 9B, the processor 230 may detect a tap input through at least some areas of the touch screen set as the camera control area.
• When the first touch input for the camera control area is not detected, the electronic device 201 determines not to drive the camera device 220 and terminates the operation for controlling driving of the camera device 220.
  • When the first touch input for the camera control area is detected, the electronic device 201 displays camera activation information in at least some areas of the display corresponding to the camera control area in operation 805. For example, when a tap input for the camera control area is detected as indicated by reference numeral 910 in FIG. 9B, the processor 230 displays camera activation information 920 based on the placement area of the camera device 220.
  • The electronic device 201 determines whether a second touch input for the camera activation information is detected in operation 807. For example, the processor 230 determines whether a drag input is detected through at least some areas of the touch screen where the camera activation information is displayed.
• When the second touch input for the camera activation information is not detected before a reference time passes from the time point when the camera activation information is displayed, the electronic device 201 determines not to drive the camera device 220. Accordingly, the electronic device 201 terminates the operation for controlling driving of the camera device 220.
  • When the second touch input for the camera activation information is detected, the electronic device 201 sets a camera display area in accordance with the second touch input in operation 809. For example, referring to FIG. 9C, when the drag input for the camera activation information 920 is detected as indicated by reference numeral 930, the processor 230 sets at least some areas of the display 270 as the camera display area in accordance with a drag distance.
  • The electronic device 201 displays driving information (for example, service screen of the camera application) of the camera device 220 in the camera display area in operation 811. For example, referring to FIG. 9D, the processor 230 displays a preview image acquired through the front camera device in the camera display area set to at least some areas of the display 270 based on the drag distance as indicated by reference numeral 940. In addition, the processor 230 displays a photographing button 942 at a position where the drag input is released.
  • The electronic device 201 determines whether an event for capturing an image is generated through the camera application in operation 813. For example, the processor 230 may determine whether a touch input for the photographing button 942 displayed in the camera display area is detected or whether a gesture input mapped to image capturing is detected.
  • When the event for capturing the image is not generated, the electronic device 201 maintains display of the camera driving information of the camera display area in operation 811.
• When the event for capturing the image is generated, the electronic device 201 determines whether the camera application and the first application are linked to each other in operation 815. For example, the processor 230 determines whether the first application provides a service using the image captured through the camera application.
  • When the camera application and the first application are linked to each other, the electronic device 201 links the image captured through the camera application with the first application in operation 817. For example, referring to FIG. 9E, the processor 230 may transmit the image captured through the camera application to a counterpart electronic device through a chat room of the messenger application as indicated by reference numeral 950. In addition, the processor 230 may store the image captured through the camera application in the memory 240.
  • When the camera application and the first application are not linked to each other, the electronic device 201 stores the image captured through the camera application in the memory 240 of the electronic device 201 in operation 819.
  • After the image is captured through the camera application displayed in at least some areas of the display 270, the electronic device 201 terminates driving of the camera device 220. For example, after the image is captured through the camera application, the processor 230 terminates the camera application, as illustrated in FIG. 9E.
  • FIG. 10 is a flowchart of a process for controlling a camera device in an electronic device based on touch maintaining information of a camera control area, according to an embodiment of the present disclosure. FIGS. 11A and 11B illustrate a screen configuration for controlling a camera device in an electronic device based on touch maintaining information of a camera control area, according to an embodiment of the present disclosure. In the following description, the electronic device may include the electronic device 201 or at least a part (for example, processor 230) of the electronic device 201.
  • Referring to FIG. 10 , an operation for controlling the camera device 220 based on the screen configuration of FIGS. 11A and 11B will be described. The electronic device 201 determines whether a touch input is detected through a camera control area set to at least some areas of the touch screen in operation 1001. For example, referring to FIG. 11A, when the display 270 is in an active state as indicated by reference numeral 1100, the processor 230 may determine whether a hovering input for the camera control area set to be adjacent to the placement area of the camera device 220 is detected or whether a tap input for the camera control area of the touch screen is detected.
• When a touch input is not detected through the camera control area, the electronic device 201 determines not to drive the camera device 220 and terminates the operation for controlling the camera device 220.
• When the touch input is detected through the camera control area, the electronic device 201 displays camera activation information in at least some areas of the display corresponding to the camera control area in operation 1003. For example, when a hovering input is detected through the camera control area as indicated by reference numeral 1120, the processor 230 displays camera activation information 1130 to be adjacent to the placement area of the camera device 220, as illustrated in FIG. 11A. The processor 230 displays the camera activation information 1130 in at least some areas different from the placement area of the camera device 220 in order to prevent the image sensor of the front camera device from becoming dirty due to the touch input for controlling the camera device 220. That is, the processor 230 displays the camera activation information 1130 in at least some areas different from the placement area of the camera device 220 to allow the user to touch at least some areas different from the placement area of the camera device 220.
  • The electronic device 201 determines whether a touch maintaining time for the camera activation information exceeds a reference time in operation 1005. For example, the processor 230 determines whether the touch maintaining time for the camera activation information 1130 exceeds the reference time, as illustrated in FIG. 11A.
  • When the touch maintaining time for the camera activation information is shorter than the reference time, the electronic device 201 determines whether the touch input for the camera activation information is released in operation 1009. For example, the processor 230 determines whether a touch input 1120 for the camera activation information 1130 is released, as illustrated in FIG. 11A.
  • When the touch input for the camera activation information is released, the electronic device 201 determines not to drive the camera device 220 and terminates the operation for controlling the driving of the camera device 220.
  • When the touch input for the camera activation information is maintained, the electronic device 201 determines whether the touch maintaining time for the camera activation information exceeds the reference time again in operation 1005.
  • When the touch maintaining time for the camera activation information exceeds the reference time, the electronic device 201 drives the camera device 220 in operation 1007. For example, referring to FIG. 11B, when the touch maintaining time for the camera activation information 1130 exceeds the reference time, the processor 230 displays the service screen of the camera application on the display 270 through an image effect that makes the service screen of the camera application spread from the placement area of the camera device 220. For example, the service screen of the camera application may include a preview image acquired through the front camera device or the back camera device.
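  • The FIG. 10 flow can be sketched in code. The following Kotlin fragment is a non-authoritative illustration of operations 1001 to 1009, assuming a View-based Android implementation; the identifiers (CameraControlAreaListener, REFERENCE_TIME_MS, the callbacks) are illustrative and not part of the disclosure. A touch inside the control area shows the activation information; holding it past the reference time drives the camera, while an early release cancels.

      import android.graphics.Rect
      import android.os.Handler
      import android.os.Looper
      import android.view.MotionEvent
      import android.view.View

      class CameraControlAreaListener(
          private val controlArea: Rect,                  // area adjacent to the camera hole
          private val onShowActivationInfo: () -> Unit,   // e.g. draw the activation information 1130
          private val onDriveCamera: () -> Unit           // e.g. spread the camera service screen
      ) : View.OnTouchListener {

          private val handler = Handler(Looper.getMainLooper())
          private val driveCamera = Runnable { onDriveCamera() }

          override fun onTouch(v: View, event: MotionEvent): Boolean {
              val inside = controlArea.contains(event.x.toInt(), event.y.toInt())
              when (event.actionMasked) {
                  MotionEvent.ACTION_DOWN -> {
                      if (!inside) return false
                      onShowActivationInfo()
                      // Drive the camera only if the touch is maintained past the reference time.
                      handler.postDelayed(driveCamera, REFERENCE_TIME_MS)
                  }
                  MotionEvent.ACTION_UP, MotionEvent.ACTION_CANCEL ->
                      // Released early: do not drive the camera (operation 1009 -> end).
                      handler.removeCallbacks(driveCamera)
              }
              return true
          }

          companion object { const val REFERENCE_TIME_MS = 800L } // assumed threshold
      }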
  • FIG. 12 is a flowchart of a process for setting a timer of a camera device based on touch information of a camera control area in an electronic device, according to an embodiment of the present disclosure. FIGS. 13A to 13C illustrate a screen configuration for setting a timer of a camera device based on touch information of a camera control area in an electronic device, according to an embodiment of the present disclosure. In the following description, the electronic device may include the electronic device 201 or at least a part (for example, processor 230) of the electronic device 201.
  • Referring to FIG. 12 , an operation for setting a timer of a camera device using the screen configuration of FIGS. 13A to 13C will be described. The electronic device 201 displays a service screen of a camera application in at least some areas of the display in operation 1201. For example, the processor 230 executes the camera application based on touch information of a camera control area, as described in operations 401 to 407 of FIG. 4 or operations 1001 to 1009 of FIG. 10 . For example, when a touch input for an icon of the camera application is detected, the processor 230 executes the corresponding camera application.
  • The electronic device 201 determines whether the touch input for the camera control area is detected in operation 1203. For example, referring to FIG. 13A, the processor 230 displays a preview image of the front camera device on the display 270 by executing the camera application as indicated by reference numeral 1300. The processor 230 may determine whether a drag input 1310 is detected from the placement area of the camera device 220 in a state where the preview image of the front camera device is displayed. The processor 230 determines whether a subsequent tap input for the camera control area is detected in the state where the preview image of the front camera device is displayed.
  • When the touch input for the camera control area is not detected, the electronic device 201 determines not to set a timer of the camera device 220. Accordingly, the electronic device 201 terminates the operation for setting the timer of the camera device 220.
  • When the touch input is detected through the camera control area, the electronic device 201 sets the timer of the camera device in accordance with the touch input in operation 1205. For example, the processor 230 sets the timer of the camera device 220 to a timer required time corresponding to a drag distance from the placement area of the camera device 220, as illustrated in FIG. 13A. The processor 230 sets the timer of the camera device 220 to a time corresponding to a distance between the placement area of the camera device 220 and a position where a tap input is detected.
  • The electronic device 201 displays timer information of the camera device 220 set to correspond to the touch input on the display 270 in operation 1207. For example, the processor 230 displays time information set to correspond to the drag distance from the placement area of the camera device 220 as indicated by reference numeral 1320 in FIG. 13A.
  • The electronic device 201 determines whether the time set to correspond to the touch input has expired in operation 1209.
  • When the time set to correspond to the touch input has not expired, the electronic device 201 displays timer information of the camera device 220 on the display 270 in operation 1207. For example, referring to FIG. 13B, the processor 230 updates the displayed time information in accordance with the elapse of time. That is, the processor 230 updates the timer display of the camera device 220 such that the displayed time information becomes gradually smaller as time elapses from the time point when the timer is set.
  • When the time set to correspond to the touch input expires, the electronic device 201 captures an image by driving the camera device 220 in operation 1211. For example, when the time set to correspond to the drag distance expires, the processor 230 captures an image by using the front camera device. In this case, the processor 230 removes the display of the time information from the display 270 as illustrated in FIG. 13C. In addition, if it is determined that an amount of light for image capturing is insufficient, the processor 230 may acquire the amount of light for the image capturing by changing a color of the display into a bright color.
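  • As a sketch of the timer flow of FIG. 12, the drag distance from the camera placement area could be mapped to a countdown that updates the displayed time and captures on expiry. The Kotlin fragment below is illustrative; the distance-to-seconds mapping and the 10 s cap are assumptions, not values from the disclosure.

      import android.os.CountDownTimer
      import kotlin.math.hypot
      import kotlin.math.min

      // Operation 1205: map the drag distance from the camera placement area to a timer length.
      // Assumed mapping: one second per 100 px of drag, capped at 10 s.
      fun timerSecondsFor(holeX: Float, holeY: Float, dragX: Float, dragY: Float): Int =
          min(10, (hypot(dragX - holeX, dragY - holeY) / 100f).toInt())

      // Operations 1207 to 1211: display the remaining time, then capture on expiry.
      fun startCaptureTimer(seconds: Int, showRemaining: (Int) -> Unit, capture: () -> Unit) {
          object : CountDownTimer(seconds * 1000L, 1000L) {
              override fun onTick(millisUntilFinished: Long) =
                  showRemaining(((millisUntilFinished + 999) / 1000).toInt()) // e.g. shrink the digits
              override fun onFinish() = capture()                            // remove the display, capture
          }.start()
      }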
  • FIG. 14 is a flowchart of a process for providing a flash effect in an electronic device, according to an embodiment of the present disclosure. FIGS. 15A to 15D illustrate a screen configuration for providing a flash effect in an electronic device, according to an embodiment of the present disclosure.
  • Referring to FIG. 14, an operation for capturing an image using the screen configuration of FIGS. 15A to 15D, as in operation 1211 of FIG. 12, will be described. In the following description, the electronic device may include the electronic device 201 or at least a part (for example, processor 230) of the electronic device 201 of FIG. 2.
  • When an image capturing event is generated (operation 1209 of FIG. 12), the electronic device 201 determines whether a flash function is set in operation 1401. For example, when the time set for the timer of the camera device 220 expires, the processor 230 may determine that the image capturing event is generated. In response to the generation of the image capturing event, the processor 230 displays camera activation information 1510 based on the placement area of the camera device 220 to make the user's eyes face the front camera device, as illustrated in FIG. 15A. In this case, the processor 230 determines whether a flash setting menu of the camera device 220 is set in an active state.
  • When the flash function of the camera device 220 is not set, the electronic device 201 captures the image by driving the camera device 220 in operation 1405. For example, the processor 230 captures the image by using the activated front camera device of the camera device 220.
  • When the flash function of the camera device 220 is set, the electronic device 201 changes a color of the display 270 into the color set for the flash function in operation 1403. For example, referring to FIGS. 15B and 15C, when acquiring an amount of light for the image capturing, the processor 230 displays a background image such that a bright colored (for example, white) background image spreads across an entire area of the display 270 based on the placement area of the camera device 220 as indicated by reference numerals 1520 and 1530.
  • The electronic device 201 captures the image by driving the camera device 220 while changing the color of the display 270 by the flash function in operation 1405.
  • When the image is captured, the electronic device 201 may change the color of the display 270 in accordance with an image effect which the user desires. For example, the processor 230 may set an image effect having a warm feeling based on the user's input information. In this case, the processor 230 displays a background image such that a yellow background image, for example, spreads across an entire area of the display 270 based on the placement area of the camera device 220 while capturing the image.
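  • One plausible Android realization of this screen flash is to raise the window brightness and show a colored full-screen overlay around the capture call. The sketch below is illustrative, not the disclosed implementation: flashOverlay is an assumed view stacked above the preview, and since a real capture is asynchronous, the restoration would normally run in the capture completion callback rather than inline.

      import android.app.Activity
      import android.graphics.Color
      import android.view.View

      fun Activity.captureWithScreenFlash(flashOverlay: View, flashColor: Int = Color.WHITE,
                                          capture: () -> Unit) {
          val attrs = window.attributes
          val previousBrightness = attrs.screenBrightness
          attrs.screenBrightness = 1.0f               // raise the panel to full brightness
          window.attributes = attrs

          flashOverlay.setBackgroundColor(flashColor) // e.g. yellow for a warm image effect
          flashOverlay.visibility = View.VISIBLE

          capture()                                   // operation 1405: capture while the display is lit

          flashOverlay.visibility = View.GONE         // restore the previous screen state
          attrs.screenBrightness = previousBrightness
          window.attributes = attrs
      }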
  • When video is photographed, the electronic device 201 may display audio input information to allow the user to identify a size of an audio signal input through the microphone device 140. For example, when video is photographed through the back camera device, the processor 230 may display audio input information 1540 corresponding to a size of an audio signal based on the placement area of the camera device 220, as illustrated in FIG. 15D.
  • FIG. 16 is a flowchart of a process for providing a camera service through an application in an electronic device, according to an embodiment of the present disclosure. FIGS. 17A and 17B illustrate a screen configuration for providing a camera service through an application in an electronic device, according to an embodiment of the present disclosure. In the following description, the electronic device may include the electronic device 201 or at least a part (for example, processor 230) of the electronic device 201.
  • Referring to FIG. 16 , an operation for providing a camera service using the screen configuration of FIGS. 17A and 17B will be described. The electronic device 201 displays a standby screen including an icon of at least one application in operation 1601. For example, referring to FIG. 17A, the processor 230 displays the standby screen including icons of applications installed in the electronic device 201 as indicated by reference numeral 1700.
  • The electronic device 201 determines whether a touch input for the icon of the application is detected in the standby screen in operation 1603. For example, the processor 230 determines whether a touch input for an icon of one of a plurality of applications displayed on the standby screen is detected, as illustrated in FIG. 17A.
  • When the touch input for the icon of the application is not detected on the standby screen, the electronic device 201 determines not to provide the camera service and terminates the operation for providing the camera service.
  • When the touch input for the application icon included in the standby screen is detected, the electronic device 201 determines whether the application icon enters the camera control area by the touch input in operation 1605. For example, the processor 230 determines whether a first application icon 1710 on which the touch input is detected enters the camera control area through a drag input 1720 on a standby screen 1700 as indicated by reference numeral 1730 in FIG. 17A. For example, when the touch input of the first application icon 1710 is released within the camera control area, the processor 230 determines that the first application icon 1710 enters the camera control area. For example, after detecting a first touch input for selecting the first application icon 1710, the processor 230 determines whether a second touch input for determining a movement location of the first application icon 1710 is detected within the camera control area, as illustrated in FIG. 17A. For example, the second touch input may be determined as being effective only when the corresponding touch input is detected within a reference time from a time point when the first touch input is detected.
  • When the application icon does not enter the camera control area, the electronic device 201 determines whether the touch input for the application icon is released in operation 1611. For example, the processor 230 determines whether the touch input for the first application icon 1710 is released outside the camera control area, as illustrated in FIG. 17A.
  • When the touch input for the application icon is maintained, the electronic device 201 determines whether the application icon enters the camera control area by the touch input in operation 1605.
  • When the touch input for the application icon is released, the electronic device 201 determines not to provide the camera service and terminates the operation for providing the camera service. For example, the processor 230 may change a location of the application icon to a location where the touch input for the application icon is released.
  • When the application icon enters the camera control area, the electronic device 201 determines whether an application corresponding to the application icon can be linked with the camera application in operation 1607. For example, the processor 230 determines whether the camera service can be provided through the application corresponding to the first application icon 1710.
  • When the application corresponding to the application icon cannot be linked with the camera application, the electronic device 201 terminates the operation for providing the camera service.
  • When the application corresponding to the application icon can be linked with the camera application, the electronic device 201 displays link information between the application corresponding to the application icon and the camera device 220 on the display 270 in operation 1609. For example, when the camera service can be provided through the first application corresponding to the first application icon 1710, the processor 230 displays a camera service screen of the first application on the display 270 as indicated by reference numeral 1740 in FIG. 17B.
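  • The entry test of operation 1605 reduces to a hit test against the camera control area. A minimal Kotlin sketch, assuming a circular control area centered on the camera hole (the radius and all names are illustrative):

      import kotlin.math.hypot

      data class CameraControlArea(val centerX: Float, val centerY: Float, val radiusPx: Float) {
          /** True if the touch-release position falls inside the control area (operation 1605). */
          fun contains(x: Float, y: Float): Boolean =
              hypot(x - centerX, y - centerY) <= radiusPx
      }

      fun onIconDrop(area: CameraControlArea, dropX: Float, dropY: Float,
                     canLinkWithCamera: Boolean, showCameraService: () -> Unit) {
          // Operations 1605 to 1609: the icon "enters" the area when released inside it,
          // and the linked camera service screen is shown only if linking is possible.
          if (area.contains(dropX, dropY) && canLinkWithCamera) {
              showCameraService()
          }
      }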
  • FIG. 18 is a flowchart of a process for providing a multi-camera service in an electronic device, according to an embodiment of the present disclosure. FIGS. 19A to 19F illustrate a screen configuration for providing a multi-camera service in an electronic device, according to an embodiment of the present disclosure. In the following description, the electronic device may include the electronic device 201 or at least a part (for example, processor 230) of the electronic device 201.
  • Referring to FIG. 18, an operation for providing a multi-camera service using the screen configuration of FIGS. 19A to 19F will be described. The electronic device 201 displays a service screen of a camera application on the display 270 in operation 1801. For example, referring to FIG. 19A, when the electronic device 201 operates in a front camera mode, the processor 230 displays a preview image collected through the front camera device on the display 270 as indicated by reference numeral 1900. In this case, the processor 230 displays the service screen of the camera application on the display 270 as in operations 401 to 407 of FIG. 4 or operations 1001 to 1009 of FIG. 10.
  • In operation 1803, the electronic device 201 determines whether a touch input is detected through the camera control area. For example, the processor 230 determines whether a tap input 1910 for the camera control area adjacent to the placement area of the camera device 220 is detected in a state where a preview image of the front camera device is displayed, as illustrated in FIG. 19A.
  • When a touch input for the camera control area is not detected, the electronic device 201 determines not to provide a multi-camera service. Accordingly, the electronic device 201 terminates the operation for providing the multi-camera service.
  • When the touch input for the camera control area is detected, the electronic device 201 switches a camera mode of the electronic device 201 to a multi-camera mode in operation 1805. For example, when the tap input 1910 for the camera control area is detected while the service is provided through the front camera device, the processor 230 may additionally activate the back camera device. For example, when the tap input 1910 for the camera control area is detected while the service is provided through the back camera device, the processor 230 may additionally activate the front camera device.
  • The electronic device 201 displays a service screen using multiple cameras on the display 270 based on the multi-camera mode in operation 1807. For example, referring to FIG. 19B, the processor 230 displays a preview image 1920 of the activated back camera device to overlap at least a part of the preview image 1900 of the front camera device based on the multi-camera mode. In addition, when a drag input for an edge area of the preview image 1920 of the back camera device is detected, the processor 230 controls a size of the preview image 1920 of the back camera device in accordance with a drag distance. Referring to FIG. 19C, when a tap input for the displayed small preview image 1920 of the back camera device is detected, the processor 230 reverses display areas of the preview image 1900 of the front camera device and the preview image 1920 of the back camera device as indicated by reference numeral 1940. Referring to FIGS. 19D and 19E, when a drag input for the displayed small preview image of the front camera is detected as indicated by reference numeral 1950 in FIG. 19D, the processor 230 updates a display location of the preview image of the front camera according to the drag input as indicated by reference numeral 1960 in FIG. 19E.
  • The electronic device 201 determines whether the multi-camera mode ends in operation 1809. For example, when a drag input moving the displayed small preview image outside the area of the display 270 is detected as indicated by reference numeral 1970 in FIG. 19E, the processor 230 determines that the multi-camera mode ends.
  • When the multi-camera mode does not end, the electronic device 201 maintains the service screen using the multiple cameras displayed on the display 270 in operation 1807.
  • When the multi-camera mode ends, the electronic device 201 switches the camera mode of the electronic device 201 to a single camera mode and displays a service screen of a single camera device on the display 270 in operation 1811. For example, referring to FIG. 19F, when the small preview image acquired through the front camera device is dragged outside the area of the display 270, as indicated by reference numeral 1970 in FIG. 19E, the processor 230 displays the preview screen of the back camera device on the display 270, as indicated by reference numeral 1980.
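  • The mode transitions of FIGS. 19A to 19F can be summarized as a small state machine over which lens feeds the full-screen preview and which feeds the small overlapping preview. The Kotlin sketch below is illustrative; the Lens type and function names are assumptions, not terms from the disclosure.

      enum class Lens { FRONT, BACK }

      // main holds the full-screen preview; pip holds the small overlapping preview, if any.
      data class MultiCameraState(val main: Lens, val pip: Lens? = null) {
          fun enterMultiCamera() =                                      // operation 1805
              copy(pip = if (main == Lens.FRONT) Lens.BACK else Lens.FRONT)
          fun swapPreviews() =                                          // reference numeral 1940
              pip?.let { MultiCameraState(main = it, pip = main) } ?: this
          fun exitMultiCamera() = copy(pip = null)                      // operation 1811
      }

      fun main() {
          var state = MultiCameraState(main = Lens.FRONT)  // front camera mode (reference numeral 1900)
          state = state.enterMultiCamera()                 // tap in the control area adds the back preview
          state = state.swapPreviews()                     // tap on the small preview reverses the areas
          state = state.exitMultiCamera()                  // dragging the small preview off-screen ends the mode
          println(state)                                   // MultiCameraState(main=BACK, pip=null)
      }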
  • FIG. 20 is a flowchart of a process for providing an automatic photographing service in an electronic device, according to an embodiment of the present disclosure. FIGS. 21A to 21C illustrate a screen configuration for providing an automatic photographing service in an electronic device, according to an embodiment of the present disclosure. In the following description, the electronic device may include the electronic device 201 or at least a part (for example, processor 230) of the electronic device 201.
  • Referring to FIG. 20 , an operation for providing the automatic photographing service using the screen configuration of FIGS. 21A to 21C will be described. The electronic device 201 displays a service screen (for example, preview image) of a camera application on the display 270 in operation 2001. For example, the processor 230 executes the camera application based on touch information of a camera control area through operations 401 to 407 of FIG. 4 or operations 1001 to 1009 of FIG. 10 . Referring to FIG. 21A, the processor 230 displays the preview image of the front camera device on the display 270 as indicated by reference numeral 2100.
  • The electronic device 201 determines whether an automatic photographing mode is set to the camera application in operation 2003. For example, the processor 230 determines whether an automatic photographing menu is set in an activated state based on input information detected through the input/output interface 260.
  • When the automatic photographing mode is not set to the camera application, the electronic device 201 terminates the operation for providing the automatic photographing service. In this case, the electronic device 201 captures an image based on a touch input on a photographing button displayed on the service screen of the camera application.
  • When the automatic photographing mode is set, the electronic device 201 displays motion information of the camera device 220 in at least some areas of the display 270 in operation 2005. For example, when the automatic photographing mode is set, the processor 230 displays motion information of the camera device 220 corresponding to a location and an angle of the electronic device 201 based on the placement area of the camera device 220 as indicated by reference numeral 2110 in FIG. 21A.
  • The electronic device 201 determines whether a motion of the camera device 220 that matches a capturing event is detected in operation 2007. For example, the processor 230 determines whether motion information of the electronic device 201 that matches the location and angle of the electronic device 201 preset for image capturing is detected. For example, the preset location and angle of the electronic device 201 may be set by a user's input or may include at least one of the locations and angles of the electronic device 201 that match an image acquired through the front camera mode.
  • When the motion of the camera device 220 that matches the capturing event is not detected, the electronic device 201 displays changed motion information of the camera device 220 in at least some areas of the display 270 in operation 2005. For example, referring to FIG. 21B, the processor 230 may change the motion information of the camera device 220 displayed based on the placement area of the camera device 220 according to a change in the location and angle of the electronic device 201 as indicated by reference numeral 2120.
  • When the motion of the camera device 220 that matches the capturing event is detected, the electronic device 201 captures the image by driving the camera device (for example, front camera device) in operation 2009. For example, referring to FIG. 21C, when motion information of the electronic device 201 which matches the location and angle of the electronic device 201 preset for image capturing is detected, the processor 230 displays matching information based on the location and angle of the electronic device 201 to allow the user to recognize an automatic photographing time point as indicated by reference numeral 2130. The processor 230 captures the image by using the front camera device. For example, the processor 230 may change a color of the display 270 at the time of the image capturing to acquire an amount of light for the image capturing or to provide an image effect.
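  • A plausible Android sketch of operations 2005 to 2009 compares the pose derived from the rotation-vector sensor against the preset pose and fires a capture callback when they match within a tolerance. The target angles, tolerance, and callback names are assumptions for illustration.

      import android.hardware.Sensor
      import android.hardware.SensorEvent
      import android.hardware.SensorEventListener
      import android.hardware.SensorManager
      import kotlin.math.abs

      class AutoCaptureListener(
          private val targetPitchDeg: Float,       // preset by the user or taken from a matching image
          private val targetRollDeg: Float,
          private val toleranceDeg: Float = 3f,    // assumed matching tolerance
          private val onPoseMatched: () -> Unit    // operation 2009: capture through the front camera
      ) : SensorEventListener {

          private val rotationMatrix = FloatArray(9)
          private val orientation = FloatArray(3)  // azimuth, pitch, roll in radians

          override fun onSensorChanged(event: SensorEvent) {
              if (event.sensor.type != Sensor.TYPE_ROTATION_VECTOR) return
              SensorManager.getRotationMatrixFromVector(rotationMatrix, event.values)
              SensorManager.getOrientation(rotationMatrix, orientation)
              val pitch = Math.toDegrees(orientation[1].toDouble()).toFloat()
              val roll = Math.toDegrees(orientation[2].toDouble()).toFloat()
              // Operation 2007: the displayed motion guidance would track (pitch, roll) here.
              if (abs(pitch - targetPitchDeg) <= toleranceDeg &&
                  abs(roll - targetRollDeg) <= toleranceDeg) {
                  onPoseMatched()
              }
          }

          override fun onAccuracyChanged(sensor: Sensor, accuracy: Int) = Unit
      }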
  • FIG. 22 is a flowchart of a process for controlling a camera device in accordance with a camera application in an electronic device, according to an embodiment of the present disclosure.
  • FIGS. 23A and 23B illustrate a screen configuration for controlling a camera device in accordance with a camera application in an electronic device, according to an embodiment of the present disclosure. In the following description, the electronic device may include the electronic device 201 or at least a part (for example, processor 230) of the electronic device 201.
  • Referring to FIG. 22 , an operation for controlling the camera device 220 using the screen configuration of FIGS. 23A and 23B will be described. The electronic device 201 determines whether a first type touch input for the camera control area is detected in operation 2201. For example, referring to FIG. 23A, the processor 230 determines whether a drag input 2310 in a right direction of the placement area of the camera device 220 is detected.
  • When the first type touch input for the camera control area is not detected, the electronic device 201 terminates the operation for setting the camera application. For example, when the drag input in a down direction for the camera control area is detected, the electronic device 201 executes one camera application that is set as a basic application among a plurality of applications installed in the electronic device 201.
  • When the first type touch input for the camera control area is detected, the electronic device 201 determines at least one camera application installed in the electronic device 201 in operation 2203. For example, the processor 230 may extract camera application information stored in the memory 240.
  • In operation 2205, the electronic device 201 displays a camera application list including at least one camera application installed in the electronic device 201. For example, referring to FIG. 23B, the processor 230 displays icons of camera applications installed in the electronic device 201 on the display 270 such that the icons are output in the placement area of the camera device 220.
  • In operation 2207, the electronic device 201 determines whether a first camera application which is one of the applications included in the camera application list is selected. For example, the processor 230 determines whether a touch input for one of the icons of the camera applications displayed in the camera control area is detected as illustrated in FIG. 23B.
  • When a selection input for the first camera application is not detected, the electronic device 201 maintains display of the camera application list in operation 2205. In addition, when an input for the camera application list is not detected until a reference time passes from the time point when the camera application list is displayed, the electronic device 201 determines not to select a camera application for controlling the camera device 220 and terminates the operation.
  • When the selection input for the first camera application is detected, the electronic device 201 drives the camera device 220 based on the first camera application in operation 2209. For example, the processor 230 controls the camera device 220 by executing the first camera application. Accordingly, the processor 230 performs initial settings on the camera device 220 based on camera setting information set to the first camera application. In addition, the processor 230 sets the first camera application as a basic camera application of the electronic device 201.
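  • On Android, the camera application information of operation 2203 might be gathered by resolving activities that handle the image-capture intent; whether the disclosure gathers it this way is an assumption. A minimal sketch:

      import android.content.Context
      import android.content.Intent
      import android.provider.MediaStore

      // Returns the package names of applications that can act as a camera application;
      // their icons would then be displayed in the camera control area (operation 2205).
      fun installedCameraApplications(context: Context): List<String> =
          context.packageManager
              .queryIntentActivities(Intent(MediaStore.ACTION_IMAGE_CAPTURE), 0)
              .map { it.activityInfo.packageName }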
  • FIG. 24 is a flowchart of a process for controlling a camera device in accordance with photographing setting information of an image in an electronic device, according to an embodiment of the present disclosure. FIG. 25 illustrates a screen configuration for controlling a camera device in accordance with photographing setting information of an image in an electronic device, according to an embodiment of the present disclosure. In the following description, the electronic device may include the electronic device 201 or at least a part (for example, processor 230) of the electronic device 201.
  • Referring to FIG. 24, an operation for controlling the camera device using the screen configuration of FIG. 25 will be described. The electronic device 201 displays a list of at least one image stored in the memory 240 of the electronic device 201 on the display 270 in operation 2401. For example, referring to FIG. 25, when a gallery application is executed based on a user input, the processor 230 displays a thumbnail for at least one image stored in the memory 240 on the display 270 as indicated by reference numeral 2500. In addition, the processor 230 displays corresponding filter information 2510 on an image to which a filter for an image effect is applied.
  • The electronic device 201 determines whether a touch input for a first image in the image list displayed on the display 270 is detected in operation 2403. For example, the processor 230 determines whether a touch input 2520 for the first image in the image list 2500 displayed on the display 270 is detected, as illustrated in FIG. 25.
  • When the touch input for at least one image in the image list is not detected, the electronic device 201 determines not to provide the camera service and terminates the operation for providing the camera service.
  • When the touch input for the first image in the image list is detected, the electronic device 201 determines whether the first image enters the camera control area in operation 2405. For example, the processor 230 determines whether the first image enters the camera control area through a drag input 2530 for the first image, as illustrated in FIG. 25 . For example, when the touch input for the first image is released within the camera control area, the processor 230 determines that the first image enters the camera control area. For example, after detecting the first touch input 2520 for selecting the first image, the processor 230 determines whether a second touch input for determining a movement location of the first image is detected within the camera control area, as illustrated in FIG. 25 .
  • When the first image does not enter the camera control area, the electronic device 201 determines whether the touch input for the first image is released in operation 2411. For example, the processor 230 determines whether the touch input 2520 for the first image is released outside the camera control area in FIG. 25 .
  • When the touch input for the first image is maintained, the electronic device 201 determines again whether the first image enters the camera control area by the touch input in operation 2405.
  • When the touch input for the first image is released, the electronic device 201 determines not to provide the camera service and terminates the operation for providing the camera service. For example, the processor 230 changes a location of the first image to a location where the touch input for the first image is released.
  • When the first image enters the camera control area, the electronic device 201 determines setting information of the camera device 220 set to capture the first image in operation 2407. For example, the setting information of the camera device 220 may include at least one of image filter information applied to the first image, a photographing mode, and a photographing setting value (for example, aperture, shutter speed, and image size).
  • The electronic device 201 may update the setting information of the camera device 220 based on the setting information of the camera device 220 set to capture the first image in operation 2409. For example, the processor 230 may perform initial settings on the camera device 220 in accordance with the setting information of the camera device 220 that has been set to capture the first image.
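  • On Android, the photographing settings recorded with a stored image can be read from its EXIF data, which could then seed the initial settings of operation 2409. The sketch below uses the androidx ExifInterface; the PhotoSettings type is illustrative, and whether the disclosure stores its settings this way is an assumption.

      import androidx.exifinterface.media.ExifInterface

      data class PhotoSettings(val fNumber: Double?, val exposureTimeSec: Double?, val iso: Int?)

      // Operation 2407: recover the photographing setting values used for the first image.
      fun settingsFromImage(path: String): PhotoSettings {
          val exif = ExifInterface(path)
          return PhotoSettings(
              fNumber = exif.getAttributeDouble(ExifInterface.TAG_F_NUMBER, -1.0)
                  .takeIf { it > 0 },
              exposureTimeSec = exif.getAttributeDouble(ExifInterface.TAG_EXPOSURE_TIME, -1.0)
                  .takeIf { it > 0 },
              iso = exif.getAttributeInt(ExifInterface.TAG_ISO_SPEED_RATINGS, -1)
                  .takeIf { it > 0 }
          )
      }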
  • FIG. 26 is a flowchart of a process for capturing an image based on touch information of a camera control area in an electronic device, according to an embodiment of the present disclosure. In the following description, the electronic device may include the electronic device 201 or at least a part (for example, processor 230) of the electronic device 201.
  • Referring to FIG. 26 , the electronic device 201 determines whether a touch input is detected through a camera control area set based on the placement area of the camera device 220 on the touch screen in operation 2601.
  • When the touch input is detected through the camera control area, the electronic device 201 determines whether image capturing matches the touch input detected through the camera control area in operation 2603. For example, the processor 230 determines whether a double tap input that matches an image capturing event is detected based on touch input matching information stored in the memory 240.
  • When the touch input is not detected through the camera control area or when the image capturing does not match the touch input detected through the camera control area, the electronic device 201 determines not to perform the image capturing. Accordingly, the electronic device 201 terminates the operation for the image capturing.
  • When the image capturing matches the touch input detected through the camera control area, the electronic device 201 captures the image through the camera device 220 without executing the camera application in operation 2605. For example, when the double tap input is detected through the camera control area, the processor 230 captures the image through the camera device 220 (for example, back camera device) in a state where the service screen of the application displayed on the display 270 is maintained. That is, the processor 230 captures the image in a state where a preview image acquired through the camera device 220 is not displayed. The processor 230 stores the captured image in the memory 240. In addition, the processor 230 displays image capturing information on a notification bar.
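  • The touch-to-capture matching of FIG. 26 can be sketched with a gesture detector whose events are fed only from the camera control area. In this illustrative fragment the matching table of the memory 240 is reduced to a single assumed rule, double tap maps to capture:

      import android.content.Context
      import android.view.GestureDetector
      import android.view.MotionEvent

      fun backgroundCaptureDetector(context: Context,
                                    captureWithoutPreview: () -> Unit): GestureDetector =
          GestureDetector(context, object : GestureDetector.SimpleOnGestureListener() {
              override fun onDoubleTap(e: MotionEvent): Boolean {
                  captureWithoutPreview()  // operation 2605: the current application screen stays up
                  return true
              }
          })

      // Usage: feed MotionEvents originating in the camera control area
      // via detector.onTouchEvent(event).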
  • FIG. 27 is a flowchart of a process for photographing video based on touch information of a camera control area in an electronic device, according to an embodiment of the present disclosure. In the following description, the electronic device may include the electronic device 201 or at least a part (for example, processor 230) of the electronic device 201.
  • Referring to FIG. 27 , the electronic device 201 determines whether a touch input is detected through a camera control area preset to control the camera device 220 on the touch screen in operation 2701. The camera control area may include at least some areas of the touch screen including the placement area of the camera device 220. The camera control area may include at least some areas of the touch screen adjacent to the placement area of the camera device 220.
  • When the touch input is detected through the camera control area, the electronic device 201 determines whether video photographing matches the touch input detected through the camera control area in operation 2703. For example, the processor 230 determines whether a touch input having a touch maintaining time exceeding a reference time is detected through the camera control area based on touch input matching information stored in the memory 240.
  • When the touch input is not detected through the camera control area or when video photographing does not match the touch input detected through the camera control area, the electronic device 201 determines not to perform the video photographing. Accordingly, the electronic device 201 terminates the operation for the video photographing.
  • When the video photographing matches the touch input detected through the camera control area, the electronic device 201 starts the video photographing through the camera device 220 without executing the camera application in operation 2705. The processor 230 photographs the video through the camera device 220 (for example, back camera device) in a state where the service screen of the application displayed on the display 270 is maintained from a time point when the touch maintaining time of the touch input detected through the camera control area exceeds the reference time. When the video photographing is started, the processor 230 outputs notification information to allow the user to recognize the video photographing operation. Here, the notification information may include at least one of a notification sound, a notification message, and a vibration.
  • The electronic device 201 determines whether the touch input that matches the video photographing is released in operation 2707.
  • When the touch input that matches the video photographing is maintained, the electronic device 201 may continuously photograph the video in operation 2705.
  • When the touch input that matches the video photographing is released, the electronic device 201 terminates the video photographing. For example, the processor 230 may store the video photographed through the back camera device in the memory 240. In addition, the processor 230 displays video photographing information on the notification bar.
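  • The FIG. 27 behavior is a small press-and-hold state machine: recording starts once the touch has been maintained past the reference time and stops on release. The sketch below is illustrative; the callbacks and threshold are assumptions, and a production version would also arm a delayed message so the threshold fires even when the finger does not move.

      import android.view.MotionEvent

      class HoldToRecord(
          private val referenceTimeMs: Long,
          private val startRecording: () -> Unit,  // operation 2705, with a notification to the user
          private val stopRecording: () -> Unit    // store the video, update the notification bar
      ) {
          private var downTimeMs = 0L
          private var recording = false

          fun onTouchEvent(event: MotionEvent) {
              when (event.actionMasked) {
                  MotionEvent.ACTION_DOWN -> downTimeMs = event.eventTime
                  MotionEvent.ACTION_MOVE ->
                      // Start once the touch maintaining time exceeds the reference time.
                      if (!recording && event.eventTime - downTimeMs > referenceTimeMs) {
                          recording = true
                          startRecording()
                      }
                  MotionEvent.ACTION_UP, MotionEvent.ACTION_CANCEL ->
                      if (recording) { recording = false; stopRecording() }  // operation 2707
              }
          }
      }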
  • FIG. 28 is a flowchart of a process for displaying camera driving limit information in an electronic device, according to an embodiment of the present disclosure. FIG. 29 illustrates a screen configuration for displaying camera driving limit information in an electronic device, according to an embodiment of the present disclosure. In the following description, the electronic device may include the electronic device 201 or at least a part (for example, processor 230) of the electronic device 201.
  • Referring to FIG. 28, an operation for displaying the camera driving limit information using the screen configuration of FIG. 29 will be described. The electronic device 201 determines whether driving of the camera device 220 is limited in operation 2801. For example, the processor 230 may identify whether the driving of the camera device 220 is limited based on the type of an application being executed in the electronic device 201. As another example, the processor 230 may identify whether the driving of the camera device 220 is limited based on a location of the electronic device 201. For example, the processor 230 determines whether an operation mode of the camera device 220 is set as an inactive mode based on input information detected through the input/output interface 260.
  • When the driving of the camera device 220 is not limited, the electronic device 201 terminates the operation for displaying the camera driving limit information.
  • When the driving of the camera device 220 is limited, the electronic device 201 displays camera driving limit information in the camera control area in operation 2803. Referring to FIG. 29 , the processor 230 displays camera driving limit information 2900 (for example, red colored concentric circles) based on the placement area of the camera device 220.
  • The electronic device 201 determines whether a touch input is detected through the camera control area in a state where the driving of the camera device 220 is limited in operation 2805. For example, the processor 230 determines whether a touch input for the camera driving limit information 2900 displayed in the camera control area is detected, as illustrated in FIG. 29.
  • When the touch input is detected through the camera control area in a state where the driving of the camera device 220 is limited, the electronic device 201 executes a camera setting menu in operation 2807. For example, when a tap input for the camera driving limit information 2900 is detected, the processor 230 displays a camera setting menu for resetting an access right of the camera device 220 on the display 270.
  • FIG. 30 is a flowchart of a process for providing a video call service in an electronic device, according to an embodiment of the present disclosure. FIG. 31 illustrates a screen configuration for providing a video call service in an electronic device, according to an embodiment of the present disclosure. In the following description, the electronic device may include the electronic device 201 or at least a part (for example, processor 230) of the electronic device 201.
  • Referring to FIG. 30, an operation for providing the video call service using the screen configuration of FIG. 31 will be described. The electronic device 201 determines whether a call connection request signal for the video call is received in operation 3001. For example, when the call connection request signal is received through the communication interface 280, the processor 230 determines whether the corresponding call connection request signal is a call connection request signal corresponding to the video call service.
  • When the call connection request signal for the video call is not received, the electronic device 201 terminates the operation for providing the video call service.
  • When the call connection request signal for the video call is received, the electronic device 201 displays video call reception information in the display area corresponding to the camera control area in operation 3003. For example, referring to FIG. 31, the processor 230 displays video call reception information 3100 on the display 270 such that the information is output from the placement area of the camera device 220.
  • In operation 3005, the electronic device 201 determines whether a touch input is detected through the camera control area in a state where the video call reception information is displayed. For example, the processor 230 determines whether a drag input in a first direction (for example, right direction) for the video call reception information 3100 displayed to be adjacent to the placement area of the camera device 220 is detected.
  • When the touch input is not detected through the camera control area, the electronic device 201 may maintain display of the video call reception information in at least some areas of the display 270 corresponding to the camera control area in operation 3003. When the touch input is not detected through the camera control area until a reference time passes from the time point when the call connection request signal for the video call is received, the electronic device 201 may determine not to accept the video call connection. In this case, the electronic device 201 displays video call connection failure information on the display 270.
  • When the touch input is detected through the camera control area in a state where the video call reception information is displayed, the electronic device 201 activates the front camera device and provides the video call service in operation 3007. For example, when a drag input in a first direction (for example, right direction) for the video call reception information 3100 displayed in at least some areas of the display 270 is detected, the processor 230 determines that the user accepts the call connection for the video call. Accordingly, the processor 230 displays an image collected through the front camera device and an image received from a counterpart electronic device on the display 270 by executing the video call application.
  • When a drag input in a second direction (for example, left direction) for the video call reception information 3100 displayed in at least some areas of the display 270 is detected, the processor 230 may determine that the user does not accept the call connection for the video call. Accordingly, the processor 230 may block the call connection for the video call.
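  • The accept/reject gesture of operations 3005 to 3007 reduces to classifying the horizontal travel of the drag on the reception information 3100. A minimal Kotlin sketch, with an assumed minimum drag distance:

      import kotlin.math.abs

      enum class CallAction { ACCEPT, REJECT }

      // Classifies the release of a drag that started on the video call reception information.
      fun classifyCallGesture(downX: Float, upX: Float, minDragPx: Float = 150f): CallAction? {
          val dx = upX - downX
          return when {
              abs(dx) < minDragPx -> null    // too short: keep displaying the reception information
              dx > 0f -> CallAction.ACCEPT   // first direction (right): activate the front camera
              else -> CallAction.REJECT      // second direction (left): block the call connection
          }
      }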
  • FIG. 32 is a flowchart of a process for displaying human body recognition service information in an electronic device, according to an embodiment of the present disclosure. FIGS. 33A to 33D illustrate a screen configuration for displaying human body recognition service information in an electronic device, according to an embodiment of the present disclosure. In the following description, the electronic device may include the electronic device 201 or at least a part (for example, processor 230) of the electronic device 201.
  • Referring to FIG. 32, an operation for providing the human body recognition service using the screen configuration of FIGS. 33A to 33D will be described. The electronic device 201 determines whether human body recognition is performed through the camera device 220 of the electronic device 201 in operation 3201. For example, the processor 230 may determine whether an iris recognition menu for unlocking the electronic device 201 is selected using the camera device 220 (for example, front camera device). As another example, the processor 230 may determine whether a face recognition menu for authenticating the user of the electronic device 201 is selected using the camera device 220 (for example, front camera device).
  • When the human body recognition using the camera device 220 is not performed, the electronic device 201 terminates the operation for providing the human body recognition service.
  • When the human body recognition using the camera device 220 is performed, the electronic device 201 displays time information spent for the human body recognition in a display area corresponding to the camera control area in operation 3203. Referring to FIG. 33A, when the iris recognition is performed, the processor 230 displays time information 3300 spent for the iris recognition based on the placement area of the camera device 220. The time spent for the human body recognition may include a minimum time during which the human body recognition (for example, iris recognition) can be completed through the camera device 220.
  • The electronic device 201 determines whether the time spent for the human body recognition expires in operation 3205. For example, the processor 230 determines whether an elapsed time from a time point when the human body recognition starts is the same as the time spent for the human body recognition.
  • When the time spent for the human body recognition does not expire, the electronic device 201 displays elapsed time information for the human body recognition in the display area corresponding to the camera control area in operation 3211. For example, referring to FIG. 33B, the processor 230 displays elapsed time information 3310 of the iris recognition to overlap the time information 3300 spent for the iris recognition displayed based on the placement area of the camera device 220.
  • The electronic device 201 determines again whether the time spent for the human body recognition expires in operation 3205.
  • When the time spent for the human body recognition expires, the electronic device 201 determines whether the human body recognition is successful in operation 3207. Referring to FIG. 33C, when the elapsed time of the iris recognition is the same as the time spent for the iris recognition as indicated by reference numeral 3320, the processor 230 determines that the iris recognition is completed. Accordingly, the processor 230 determines whether the authentication of the user is successful based on a result of the iris recognition. For example, the processor 230 determines whether iris information detected through the iris recognition matches iris information preset in the memory 240.
  • When the human body recognition fails, the electronic device 201 determines that the authentication of the user through the human body recognition fails. Accordingly, the electronic device 201 terminates the operation for providing the human body recognition service. The electronic device 201 displays human body recognition failure information on the display 270.
  • When the human body recognition is successful, the electronic device 201 may unlock the electronic device 201 in operation 3209. Referring to FIG. 33D, when the user is authenticated through the iris recognition, the processor 230 releases a lock function of the electronic device 201 and displays a standby screen 3330 on the display 270.
  • FIG. 34 is a flowchart of a process for displaying pollution level information of a camera device in an electronic device, according to an embodiment of the present disclosure. FIG. 35 illustrates a screen configuration for displaying pollution level information of a camera device in an electronic device, according to an embodiment of the present disclosure. In the following description, the electronic device may include the electronic device 201 or at least a part (for example, processor 230) of the electronic device 201.
  • Referring to FIG. 34 , an operation for displaying the pollution level information of the camera device 220 using the screen configuration of FIG. 35 will be described. The electronic device 201 drives the camera device 220 disposed on some areas (for example, upper area) of the display 270 in operation 3401. For example, the processor 230 executes the camera application based on touch information of a camera control area through operations 401 to 407 of FIG. 4 or operations 1001 to 1009 of FIG. 10 . The processor 230 controls the camera device 220 (for example, front camera device) through the camera application.
  • The electronic device 201 captures an image through the camera device 220 disposed on some areas of the display 270 in operation 3403. For example, when a pollution level measuring event is generated, the processor 230 captures an image through the front camera device 120 disposed in the upper area of the display 270. For example, the pollution level measuring event may be periodically generated or may be generated at a time point when the camera device is driven.
  • The electronic device 201 may detect a pollution level of a camera lens through the image captured through the camera device 220 in operation 3405. For example, the processor 230 may estimate the definition of the image acquired through the front camera device. The processor 230 may detect the pollution level of the camera lens corresponding to the definition of the image.
  • The electronic device 201 determines whether the pollution level of the camera lens exceeds a reference pollution level in operation 3407. For example, the reference pollution level may be set, by the user, as a reference value of a pollution level which can influence a quality of the image acquired through the camera device 220 or may include a fixed value.
  • When the pollution level of the camera lens is less than or equal to the reference pollution level, the electronic device 201 determines that the pollution level of the camera lens does not influence the quality of the image acquired through the camera device 220. Accordingly, the electronic device 201 terminates the operation for displaying the pollution information of the camera lens.
  • When the pollution level of the camera lens exceeds the reference pollution level, the electronic device 201 displays pollution information of the camera lens in the display area corresponding to the camera control area to allow the user to recognize the pollution level of the camera lens in operation 3409. For example, when the pollution level of the camera lens exceeds the reference pollution level, the processor 230 displays pollution information 3500 of the camera lens based on the placement area of the camera device 220 to prompt the user to clean the camera lens.
  • The electronic device 201 may display an amount of the pollution level of the camera lens based on the placement area of the camera device 220. For example, the processor 230 may display the amount of the pollution level of the camera lens through the number of concentric circles displayed based on the placement area of the camera device 220. For example, the processor 230 may increase the number of concentric circles displayed in the placement area of the camera device 220 as the pollution level of the camera lens becomes more serious. In addition, when the camera lens is not polluted, the processor 230 may not display the concentric circles indicating the pollution level of the camera lens.
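  • The "definition" estimate of operation 3405 is not specified in detail; one common proxy is the variance of a Laplacian over the image's luma, which drops as a dirty lens blurs the frame. The Kotlin sketch below is an assumption of that kind, with the mapping to a threshold left to the reference pollution level of operation 3407. A production version would downscale the frame before scanning it.

      import android.graphics.Bitmap
      import android.graphics.Color

      // Variance of a 4-neighbour Laplacian over luma: low values suggest a polluted lens.
      fun definitionScore(bitmap: Bitmap): Double {
          var sum = 0.0
          var sumSq = 0.0
          var count = 0
          for (y in 1 until bitmap.height - 1) {
              for (x in 1 until bitmap.width - 1) {
                  val lap = luma(bitmap.getPixel(x - 1, y)) + luma(bitmap.getPixel(x + 1, y)) +
                            luma(bitmap.getPixel(x, y - 1)) + luma(bitmap.getPixel(x, y + 1)) -
                            4 * luma(bitmap.getPixel(x, y))
                  sum += lap
                  sumSq += lap.toDouble() * lap
                  count++
              }
          }
          if (count == 0) return 0.0
          val mean = sum / count
          return sumSq / count - mean * mean
      }

      private fun luma(argb: Int): Int =
          (299 * Color.red(argb) + 587 * Color.green(argb) + 114 * Color.blue(argb)) / 1000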
  • An electronic device including a camera device disposed at a location overlapping at least a partial area of a display and an operation method thereof control the camera device based on an adjacent area including a placement area of the camera device or an adjacent area close to the placement area of the camera device, so that a user of the electronic device can easily drive and control a camera application.
  • An electronic device including a camera device disposed at a location overlapping at least a partial area of a display and an operation method thereof display control information related to a camera device in an adjacent area including a placement area of the camera device or an adjacent area close to the placement area of the camera device, so that a user of the electronic device can easily detect an operational state of the camera device, and a natural photo can be taken by drawing the user's eyes toward the lens direction.
  • The term “module” as used herein may refer to a unit including one of hardware, software, and firmware or a combination of two or more of them. The term “module” may be interchangeably used with the term “unit”, “logic”, “logical block”, “component”, or “circuit”. The “module” may be a minimum unit of an integrated component element or a part thereof. The “module” may be a minimum unit for performing one or more functions or a part thereof. The “module” may be mechanically or electronically implemented. For example, the “module” according to the present disclosure may include at least one of an application-specific integrated circuit (ASIC) chip, a field-programmable gate array (FPGA), and a programmable-logic device for performing operations which have been known or are to be developed hereinafter.
  • According to an embodiment, at least some of the devices (for example, modules or functions thereof) or the method (for example, operations) may be implemented by instructions stored in a computer-readable storage medium in a program module form. The instructions, when executed by the processor 230, may cause the processor 230 to execute the function corresponding to the instruction. The computer-readable storage medium may be the memory 240.
  • The computer readable storage medium may include a hard disk, a floppy disk, magnetic media (for example, a magnetic tape), optical media (for example, a compact disc read only memory (CD-ROM) and a digital versatile disc (DVD)), magneto-optical media (for example, a floptical disk), a hardware device (for example, a read only memory (ROM), a random access memory (RAM), a flash memory), and the like. In addition, the instructions may include high-level language code, which can be executed in a computer by using an interpreter, as well as machine code generated by a compiler. Any of the hardware devices as described above may be configured to work as one or more software modules in order to perform the operations according to various embodiments of the present disclosure, and vice versa.
  • Any of the modules or programming modules according to various embodiments of the present disclosure may include at least one of the above described elements, exclude some of the elements, or further include other additional elements. The operations performed by the modules, programming module, or other elements according to various embodiments of the present disclosure may be executed in a sequential, parallel, repetitive, or heuristic manner. Further, some operations may be executed according to another order or may be omitted, or other operations may be added.
  • The embodiments disclosed herein are provided merely to easily describe technical details of the present disclosure and to help the understanding of the present disclosure, and are not intended to limit the scope of the present disclosure. Therefore, it should be construed that all modifications and changes or various other embodiments based on the technical idea of the present disclosure fall within the scope of the present disclosure. Therefore, the scope of the present disclosure is defined, not by the detailed description and embodiments, but by the appended claims and their equivalents.

Claims (20)

What is claimed is:
1. An electronic device comprising:
a touch screen including a layer having a hole formed in the layer;
a camera configured to capture an image through the hole; and
at least one processor configured to:
receive, via the touch screen, touch input on a portion of the touch screen, a contact area of the received touch input including at least a part of an area where the hole is formed;
perform an operation related with the camera based on the touch input; and
display an image received through the camera on the touch screen based on the touch input.
2. The electronic device of claim 1, wherein the hole of the layer is perforated or omits an element of the layer.
3. The electronic device of claim 1, wherein the at least one processor is further configured to display operational state information of the camera on the portion.
4. The electronic device of claim 1, wherein the touch input comprises a drag gesture that is initiated within the portion and is ended outside of the portion.
5. The electronic device of claim 4, wherein the at least one processor is further configured to display the image being received through the camera and having a size based on a position of the touch screen where the drag gesture is ended.
6. The electronic device of claim 1, wherein the at least one processor is further configured to:
perform the operation including an execution of the camera in response to the received touch input,
display activation information regarding the execution in the portion, and
display the image received through the camera on the touch screen based on a drag of the touch input from a first position to a second position on the touch screen.
7. The electronic device of claim 1, wherein the at least one processor is further configured to display the image in a first portion of the touch screen and a screen of an application in a second portion of the touch screen distinct from the first portion.
8. The electronic device of claim 7, wherein the at least one processor is further configured to configure, when the image is captured through the camera, the captured image as input data of the application.
9. The electronic device of claim 1, wherein the at least one processor is further configured to set a photographing timer of the camera to capture the image based on a drag distance of the touch input.
10. The electronic device of claim 1, wherein the at least one processor is further configured to change a color of the touch screen into a bright color based on the touch input simultaneously with capturing the image through the camera.
11. A method of operating an electronic device comprising a camera configured to capture an image through a hole formed in a layer of a touch screen, the method comprising:
receiving, via the touch screen, a touch input on a portion of the touch screen, a contact area of the received touch input including at least a part of an area where the hole is formed;
performing an operation related to the camera based on the touch input; and
displaying an image received through the camera on the touch screen based on the touch input.
12. The method of claim 11, further comprising:
displaying operational state information of the camera on the portion.
13. The method of claim 11, wherein the touch input comprises a drag gesture that is initiated within the portion and is ended outside of the portion.
14. The method of claim 13, wherein displaying the image further comprises:
displaying the image being received through the camera and having a size based on a position of the touch screen where the drag gesture is ended.
15. The method of claim 11, wherein performing the operation further comprises:
performing the operation including an execution of the camera in response to the received touch input, and
displaying activation information regarding the execution in the portion.
16. The method of claim 15, wherein displaying the image comprises:
displaying the image received through the camera on the touch screen based on a drag of the touch input from a first position to a second position on the touch screen.
17. The method of claim 11, wherein displaying the image comprises:
displaying the image in a first portion of the touch screen and a screen of an application in a second portion of the touch screen distinct from the first portion.
18. The method of claim 17, further comprising:
configuring, when the image is captured through the camera, the captured image as input data of the application.
19. The method of claim 11, further comprising:
setting a photographing timer of the camera to capture the image based on a drag distance of the touch input; and
displaying information on the photographing timer of the camera on the touch screen.
20. The method of claim 11, further comprising:
changing a color of the touch screen into a bright color based on the touch input simultaneously with capturing the image through the camera.
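Read together, the independent claims describe a single gesture pipeline: detect a touch whose contact area overlaps the display hole above the camera, start the camera and indicate its state at the hole, let a drag out of the hole size the live preview, optionally map the drag distance to a capture timer, and brighten the screen as a front fill light when the image is captured. The Kotlin sketch below illustrates one plausible reading of that flow on Android; CameraHoleGestureView, holeRect, CameraController, and the timer scaling are hypothetical names and assumed values chosen for illustration, not APIs or parameters taken from the patent.

```kotlin
import android.content.Context
import android.graphics.PointF
import android.graphics.RectF
import android.view.MotionEvent
import android.view.View
import kotlin.math.hypot

// Minimal stub so the sketch is self-contained; a real implementation
// would wrap Camera2/CameraX and a preview surface.
class CameraController {
    fun start() {}
    fun showStateIndicator() {}
    fun resizePreviewTo(x: Float, y: Float) {}
    fun captureAfter(seconds: Int, onCapture: () -> Unit) { onCapture() }
}

class CameraHoleGestureView(context: Context) : View(context) {

    // Assumed location of the display hole over the camera, in view coordinates.
    private val holeRect = RectF(520f, 0f, 560f, 40f)
    private val camera = CameraController()
    private var downPoint: PointF? = null

    override fun onTouchEvent(event: MotionEvent): Boolean {
        when (event.actionMasked) {
            MotionEvent.ACTION_DOWN -> {
                // Claims 1/11: the contact area must include at least part of the hole area.
                val contact = RectF(
                    event.x - event.touchMajor / 2, event.y - event.touchMinor / 2,
                    event.x + event.touchMajor / 2, event.y + event.touchMinor / 2
                )
                if (!RectF.intersects(contact, holeRect)) return false
                downPoint = PointF(event.x, event.y)
                camera.start()              // claims 6/15: execute the camera on the touch
                camera.showStateIndicator() // claims 3/12: show operational state at the hole
                return true
            }
            MotionEvent.ACTION_MOVE -> {
                // Claims 4-5/13-14: a drag out of the hole area sizes the live preview.
                if (downPoint != null) camera.resizePreviewTo(event.x, event.y)
                return true
            }
            MotionEvent.ACTION_UP -> {
                val start = downPoint ?: return false
                downPoint = null
                val dragDistance = hypot(event.x - start.x, event.y - start.y)
                // Claims 9/19: derive a capture timer from the drag distance
                // (the 200 px-per-second scale is an assumption).
                val timerSeconds = (dragDistance / 200f).toInt().coerceIn(0, 10)
                camera.captureAfter(timerSeconds) {
                    flashScreenBright() // claims 10/20: brighten the screen at capture
                }
                return true
            }
        }
        return super.onTouchEvent(event)
    }

    // One way to read claims 10/20: show a white, full-brightness frame
    // as a front-facing fill light while the image is captured.
    private fun flashScreenBright() {
        setBackgroundColor(0xFFFFFFFF.toInt())
        postDelayed({ setBackgroundColor(0x00000000) }, 150L)
    }
}
```

Note how the sketch gates every claimed behavior behind the single hole-area hit test, mirroring the claims' structure: each operation is conditioned on a touch whose contact area overlaps the camera hole, so an ordinary touch elsewhere on the screen never reaches the camera path.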
US18/070,130 2016-01-15 2022-11-28 Method of controlling camera device and electronic device thereof Abandoned US20230087406A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US18/070,130 US20230087406A1 (en) 2016-01-15 2022-11-28 Method of controlling camera device and electronic device thereof
US18/460,976 US20230412911A1 (en) 2016-01-15 2023-09-05 Method of controlling camera device and electronic device thereof

Applications Claiming Priority (5)

Application Number Priority Date Filing Date Title
KR1020160005293A KR102449593B1 (en) 2016-01-15 2016-01-15 Method for controlling camera device and electronic device thereof
KR10-2016-0005293 2016-01-15
US15/407,943 US20170208241A1 (en) 2016-01-15 2017-01-17 Method of controlling camera device and electronic device thereof
US17/104,980 US11516380B2 (en) 2016-01-15 2020-11-25 Method of controlling camera device in an electronic device in various instances and electronic device thereof
US18/070,130 US20230087406A1 (en) 2016-01-15 2022-11-28 Method of controlling camera device and electronic device thereof

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US17/104,980 Continuation US11516380B2 (en) 2016-01-15 2020-11-25 Method of controlling camera device in an electronic device in various instances and electronic device thereof

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US18/460,976 Continuation US20230412911A1 (en) 2016-01-15 2023-09-05 Method of controlling camera device and electronic device thereof

Publications (1)

Publication Number Publication Date
US20230087406A1 (en) 2023-03-23

Family

ID=59314854

Family Applications (4)

Application Number Title Priority Date Filing Date
US15/407,943 Abandoned US20170208241A1 (en) 2016-01-15 2017-01-17 Method of controlling camera device and electronic device thereof
US17/104,980 Active US11516380B2 (en) 2016-01-15 2020-11-25 Method of controlling camera device in an electronic device in various instances and electronic device thereof
US18/070,130 Abandoned US20230087406A1 (en) 2016-01-15 2022-11-28 Method of controlling camera device and electronic device thereof
US18/460,976 Pending US20230412911A1 (en) 2016-01-15 2023-09-05 Method of controlling camera device and electronic device thereof

Family Applications Before (2)

Application Number Title Priority Date Filing Date
US15/407,943 Abandoned US20170208241A1 (en) 2016-01-15 2017-01-17 Method of controlling camera device and electronic device thereof
US17/104,980 Active US11516380B2 (en) 2016-01-15 2020-11-25 Method of controlling camera device in an electronic device in various instances and electronic device thereof

Family Applications After (1)

Application Number Title Priority Date Filing Date
US18/460,976 Pending US20230412911A1 (en) 2016-01-15 2023-09-05 Method of controlling camera device and electronic device thereof

Country Status (2)

Country Link
US (4) US20170208241A1 (en)
KR (1) KR102449593B1 (en)

Families Citing this family (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102109883B1 * 2013-09-03 2020-05-12 Samsung Electronics Co., Ltd. Content transmission method and apparatus
CN108363537A (en) * 2018-01-24 2018-08-03 京东方科技集团股份有限公司 Mobile terminal
EP3525064B1 (en) * 2018-02-12 2022-07-13 Samsung Display Co., Ltd. Display device and method for fabricating the same
US10991774B2 (en) 2018-02-12 2021-04-27 Samsung Display Co., Ltd. Display device and method for fabricating the same
EP3962063A1 (en) * 2018-05-23 2022-03-02 Huawei Technologies Co., Ltd. Photographing method and terminal device
CN111524932A (en) * 2019-02-01 2020-08-11 Oppo广东移动通信有限公司 Electronic equipment, pixel structure and display device
US11308618B2 (en) 2019-04-14 2022-04-19 Holovisions LLC Healthy-Selfie(TM): a portable phone-moving device for telemedicine imaging using a mobile phone
CN112306314B (en) * 2019-07-31 2022-10-04 华为技术有限公司 Interface display method and electronic equipment
WO2022034938A1 * 2020-08-11 2022-02-17 LG Electronics Inc. Image capturing device and control method therefor
USD992593S1 (en) * 2021-01-08 2023-07-18 Samsung Electronics Co., Ltd. Display screen or portion thereof with graphical user interface
US20220269831A1 (en) * 2021-02-25 2022-08-25 Lenovo (Singapore) Pte. Ltd. Electronic privacy filter activation
EP4341793A2 (en) * 2021-05-17 2024-03-27 Apple Inc. Interacting with notes user interfaces
US11516434B1 (en) * 2021-08-26 2022-11-29 Motorola Mobility Llc Routing visual content from different camera systems to different applications during video call

Family Cites Families (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6154210A (en) * 1998-11-25 2000-11-28 Flashpoint Technology, Inc. Method and system for implementing button interface compatibility in touch-screen equipped digital imaging device
US8314859B2 (en) * 2008-05-29 2012-11-20 Lg Electronics Inc. Mobile terminal and image capturing method thereof
KR101653169B1 * 2010-03-02 2016-09-02 Samsung Display Co., Ltd. Apparatus for visible light communication and method thereof
JP5464083B2 (en) * 2010-07-07 2014-04-09 ソニー株式会社 Information processing apparatus, information processing method, and program
KR101375908B1 * 2012-07-10 2014-03-18 Pantech Co., Ltd. Photographing timer control apparatus and method
US20140133715A1 (en) * 2012-11-15 2014-05-15 Identity Validation Products, Llc Display screen with integrated user biometric sensing and verification system
KR102010955B1 * 2013-01-07 2019-08-14 Samsung Electronics Co., Ltd. Method for controlling preview of picture taken in camera and mobile terminal implementing the same
KR102020636B1 * 2013-06-07 2019-09-10 Samsung Electronics Co., Ltd. Method for controlling electronic device based on camera, machine-readable storage medium and electronic device
US9525811B2 (en) * 2013-07-01 2016-12-20 Qualcomm Incorporated Display device configured as an illumination source
KR102080746B1 2013-07-12 2020-02-24 LG Electronics Inc. Mobile terminal and control method thereof
US9294914B2 (en) * 2013-08-27 2016-03-22 Symbol Technologies, Llc Localized visible light communications among wireless communication devices
KR20160063875A * 2014-11-27 2016-06-07 LG Electronics Inc. Mobile terminal and method for controlling the same
JP6497549B2 (en) * 2015-03-05 2019-04-10 カシオ計算機株式会社 Electronic device, touch operation control method, and program
US9736383B2 (en) * 2015-10-30 2017-08-15 Essential Products, Inc. Apparatus and method to maximize the display area of a mobile device

Also Published As

Publication number Publication date
US20230412911A1 (en) 2023-12-21
US11516380B2 (en) 2022-11-29
KR102449593B1 (en) 2022-09-30
US20210084216A1 (en) 2021-03-18
KR20170085760A (en) 2017-07-25
US20170208241A1 (en) 2017-07-20

Similar Documents

Publication Publication Date Title
US20230087406A1 (en) Method of controlling camera device and electronic device thereof
KR102386398B1 (en) Method for providing different indicator for image based on photographing mode and electronic device thereof
KR102593824B1 (en) Method for controlling a camera and electronic device thereof
US10082998B2 (en) Electronic device and information sharing method thereof
KR102627244B1 (en) Electronic device and method for displaying image for iris recognition in electronic device
KR102361885B1 (en) Electronic apparatus and controlling method thereof
US10805522B2 (en) Method of controlling camera of device and device thereof
EP3057309B1 (en) Method for controlling camera system, electronic device, and storage medium
KR102377277B1 (en) Method and apparatus for supporting communication in electronic device
KR102620138B1 (en) Method for Outputting Screen and the Electronic Device supporting the same
KR102488563B1 (en) Apparatus and Method for Processing Differential Beauty Effect
CN107924314B (en) Apparatus and method for executing application
TWI673679B (en) Method, apparatus and computer-readable storage media for user interface
US10367978B2 (en) Camera switching method and electronic device supporting the same
CN105893078B (en) Method for controlling camera system, electronic device, and storage medium
EP3511817B1 (en) Electronic device having double-sided display and method for controlling application
US20170118402A1 (en) Electronic device and camera control method therefor
CN108427533B (en) Electronic device and method for determining environment of electronic device
US10057479B2 (en) Electronic apparatus and method for switching touch operations between states
KR20150122574A (en) Method and device for executing user instructions
US10685465B2 (en) Electronic device and method for displaying and generating panoramic image
KR102650189B1 (en) Electronic apparatus and controlling method thereof
KR102317624B1 (en) Electronic device and method for processing image of the same
CN114826799A (en) Information acquisition method, device, terminal and storage medium
CN110809256B (en) System acceleration method and device of terminal, storage medium and terminal

Legal Events

Date Code Title Description
STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION