US20230087406A1 - Method of controlling camera device and electronic device thereof - Google Patents
- Publication number
- US20230087406A1 (application Ser. No. 18/070,130)
- Authority
- US
- United States
- Prior art keywords
- camera
- electronic device
- processor
- image
- display
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
- H04N23/62—Control of parameters via user interfaces (under H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof, and H04N23/60—Control of cameras or camera modules)
- G06F1/1626—Constructional details or arrangements for portable computers with a single-body enclosure integrating a flat display, e.g. Personal Digital Assistants [PDAs]
- G06F1/1686—Constructional details or arrangements related to integrated I/O peripherals, the I/O peripheral being an integrated camera
- G06F3/0486—Drag-and-drop
- G06F3/04883—Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser for inputting data by handwriting, e.g. gesture or text
- G06F3/04886—Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, by partitioning the display area into independently controllable areas, e.g. virtual keyboards or menus
- H04N23/61—Control of cameras or camera modules based on recognised objects
- H04N23/631—Graphical user interfaces [GUI] specially adapted for controlling image capture or setting capture parameters
- H04N23/634—Warning indications
- H04N23/90—Arrangement of cameras or camera modules, e.g. multiple cameras in TV studios or sports stadiums
- Legacy codes: H04N5/23216, H04N5/23218, H04N5/232933, H04N5/232941, H04N5/247
Definitions
- the present disclosure relates generally to an apparatus and a method for controlling a camera device in an electronic device.
- portable electronic devices may provide various services such as broadcast services, wireless Internet services, camera services, and music playback services.
- the electronic device may provide the camera services through a plurality of camera devices to meet various user demands.
- the electronic device may acquire images or videos through a front camera device disposed on the front surface of the electronic device and a back camera device disposed on the back surface.
- the electronic device may provide the camera service to a user of the electronic device by executing a camera application to control the plurality of camera devices.
- the user of the electronic device may find the multiple controls required to execute the camera application inconvenient. For example, when the user wants to use the camera service on an electronic device in which a message application is being executed, the user must execute the camera application through a second control after making the electronic device enter a standby mode through a first control. As another example, when the user wants to use the camera service on a locked electronic device, the user must execute the camera application through a second control after unlocking the electronic device through a first control.
- an aspect of the present disclosure is to provide an electronic device and a method for easily controlling a camera device in an electronic device.
- another aspect of the present disclosure is to provide an electronic device including a camera device disposed at a location overlapping at least a partial area of a display, and a method for controlling the camera device based on an adjacent area that includes, or is close to, the placement area of the camera device, so that a user of the electronic device can easily launch and control a camera application.
- another aspect of the present disclosure is to provide an electronic device including a camera device disposed at a location overlapping at least a partial area of a display, and a method for displaying control information related to the camera device in an adjacent area that includes, or is close to, the placement area of the camera device, so that a user of the electronic device can easily detect an operational state of the camera device and a photo can be taken while the user's gaze is drawn toward the camera lens.
- an electronic device includes a touch screen including a display, a camera disposed at a location overlapping a partial area of the display and configured to capture an image through a hole formed in a layer of the touch screen, and a processor configured to receive, via the touch screen, a touch input on a portion of the touch screen, a contact area of the received touch input including at least a part of an area where the hole is formed, perform an operation related to the camera based on the touch input, and display an image received through the camera on the touch screen based on the touch input.
- a method of operating an electronic device comprising a camera configured to capture an image through a hole formed in a layer of a touch screen.
- the method includes receiving, via the touch screen, a touch input on a portion of the touch screen, a contact area of the received touch input including at least a part of an area where the hole is formed, performing an operation related with the camera based on the touch input, and displaying an image received through the camera on the touch screen based on the touch input.
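The core of the claimed flow is deciding whether a touch's contact area includes at least part of the area where the camera hole is formed. The following is an illustrative simulation, not code from the patent; the rectangle type, the hole coordinates, and the function names are all assumptions.

```python
from dataclasses import dataclass


@dataclass
class Rect:
    """Axis-aligned rectangle in screen pixels (hypothetical contact model)."""
    x: float
    y: float
    w: float
    h: float

    def intersects(self, other: "Rect") -> bool:
        # Standard axis-aligned overlap test.
        return (self.x < other.x + other.w and other.x < self.x + self.w and
                self.y < other.y + other.h and other.y < self.y + self.h)


# Assumed position of the camera hole within the touch screen (pixels).
CAMERA_HOLE = Rect(x=540, y=40, w=60, h=60)


def touch_targets_camera(contact: Rect) -> bool:
    """True when the touch's contact area includes part of the hole area."""
    return contact.intersects(CAMERA_HOLE)
```

A real implementation would take the contact geometry reported by the touch controller; here a rectangle merely stands in for the reported contact area.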
- FIGS. 1 A and 1 B illustrate a configuration of an electronic device, according to an embodiment of the present disclosure
- FIG. 2 is a block diagram of an electronic device in a network environment, according to an embodiment of the present disclosure
- FIG. 3 is a flowchart of a process for controlling a camera device in an electronic device, according to an embodiment of the present disclosure
- FIG. 4 is a flowchart of a process for controlling a camera device in an electronic device when a screen of the electronic device is turned off, according to an embodiment of the present disclosure
- FIG. 5 is a flowchart of a process for detecting touch information of a camera control area in an electronic device, according to an embodiment of the present disclosure
- FIGS. 6 A to 6 D illustrate a screen configuration for controlling a camera device in an electronic device when a screen of the electronic device is turned off, according to an embodiment of the present disclosure
- FIG. 7 is a flowchart of a process for configuring a camera display area in an electronic device based on touch information of a camera control area, according to an embodiment of the present disclosure
- FIG. 8 is a flowchart of a process for controlling a camera device in an electronic device when a camera application is linked with another application, according to an embodiment of the present disclosure
- FIGS. 9 A to 9 E illustrate a screen configuration for controlling a camera device in an electronic device when a camera application is linked with another application, according to an embodiment of the present disclosure
- FIG. 10 is a flowchart of a process for controlling a camera device in an electronic device based on touch maintaining information of a camera control area, according to an embodiment of the present disclosure
- FIGS. 11 A and 11 B illustrate a screen configuration for controlling a camera device in an electronic device based on touch maintaining information of a camera control area, according to an embodiment of the present disclosure
- FIG. 12 is a flowchart of a process for setting a timer of a camera device based on touch information of a camera control area in an electronic device, according to an embodiment of the present disclosure
- FIGS. 13 A to 13 C illustrate a screen configuration for setting a timer of a camera device based on touch information of a camera control area in an electronic device, according to an embodiment of the present disclosure
- FIG. 14 is a flowchart of a process for providing a flash effect in an electronic device, according to an embodiment of the present disclosure
- FIGS. 15 A to 15 D illustrate a screen configuration for providing a flash effect in an electronic device, according to an embodiment of the present disclosure
- FIGS. 17 A and 17 B illustrate a screen configuration for providing a camera service through an application in an electronic device, according to an embodiment of the present disclosure
- FIG. 18 is a flowchart of a process for providing a multi-camera service in an electronic device, according to an embodiment of the present disclosure
- FIGS. 19 A to 19 F illustrate a screen configuration for providing a multi-camera service in an electronic device, according to an embodiment of the present disclosure
- FIG. 20 is a flowchart of a process for providing an automatic photographing service in an electronic device, according to an embodiment of the present disclosure
- FIGS. 21 A to 21 C illustrate a screen configuration for providing an automatic photographing service in an electronic device, according to an embodiment of the present disclosure
- FIG. 22 is a flowchart of a process for controlling a camera device in accordance with a camera application in an electronic device, according to an embodiment of the present disclosure
- FIGS. 23 A and 23 B illustrate a screen configuration for controlling a camera device in accordance with a camera application in an electronic device, according to an embodiment of the present disclosure
- FIG. 25 illustrates a screen configuration for controlling a camera device in accordance with photographing setting information of an image in an electronic device, according to an embodiment of the present disclosure
- FIG. 26 is a flowchart of a process for capturing an image based on touch information of a camera control area in an electronic device, according to an embodiment of the present disclosure
- FIG. 27 is a flowchart of a process in which an electronic device photographs video based on touch information of a camera control area in an electronic device, according to an embodiment of the present disclosure
- FIG. 28 is a flowchart of a process for displaying camera driving limit information in an electronic device, according to an embodiment of the present disclosure
- FIG. 29 illustrates a screen configuration for displaying camera driving limit information in an electronic device, according to an embodiment of the present disclosure
- FIG. 30 is a flowchart of a process for providing a video call service in an electronic device, according to an embodiment of the present disclosure
- FIG. 31 illustrates a screen configuration for providing a video call service in an electronic device, according to an embodiment of the present disclosure
- FIGS. 33 A to 33 D illustrate a screen configuration for displaying human body recognition service information in an electronic device according to an embodiment of the present disclosure
- FIG. 34 is a flowchart of a process for displaying pollution level information of a camera device in an electronic device, according to an embodiment of the present disclosure.
- FIG. 35 illustrates a screen configuration for displaying pollution level information of a camera device in an electronic device, according to an embodiment of the present disclosure.
- "A or B", "at least one of A or/and B", or "one or more of A or/and B", as used in the various embodiments of the present disclosure, includes any and all combinations of the words enumerated with it.
- "A or B", "at least one of A and B", or "at least one of A or B" means (1) including A, (2) including B, or (3) including both A and B.
- although terms such as "first" and "second" may modify various elements of various embodiments, these terms do not limit the corresponding elements. For example, these terms do not limit an order and/or importance of the corresponding elements. These terms may be used for the purpose of distinguishing one element from another element.
- a first user device and a second user device both indicate user devices and may indicate different user devices.
- a first element may be referred to as a second element without departing from the scope of the present disclosure, and similarly, a second element may be referred to as a first element.
- first element may be directly connected or coupled to the second element, and there may be an intervening element (e.g., a third element) between the first element and second element.
- a processor configured to (set to) perform A, B, and C may be a dedicated processor, e.g., an embedded processor, for performing a corresponding operation, or a generic-purpose processor, e.g., a central processing unit (CPU) or an application processor (AP), capable of performing a corresponding operation by executing one or more software programs stored in a memory device.
- An electronic device may include at least one of a smartphone, a tablet personal computer (PC), a mobile phone, a video phone, an electronic book (e-book) reader, a laptop PC, a netbook computer, a workstation, a personal digital assistant (PDA), a portable multimedia player (PMP), an MP3 player, a mobile medical appliance, a camera, and a wearable device (e.g., smart glasses, a head-mounted-device (HMD), electronic clothes, an electronic bracelet, an electronic necklace, an electronic appcessory, an electronic tattoo, a smart mirror, or a smart watch).
- the electronic device may include at least one of various medical devices (e.g., various portable medical measuring devices (a blood glucose monitoring device, a heart rate monitoring device, a blood pressure measuring device, a body temperature measuring device, etc.)), a navigation device, a global positioning system (GPS) receiver, an event data recorder (EDR), a flight data recorder (FDR), a vehicle infotainment device, an electronic device for a ship (e.g., a navigation device, and a gyro-compass), an avionics device, a security device, or Internet of Things (IoT) device (e.g., a light bulb, various sensors, an electric or gas meter, a sprinkler device, a fire alarm, a thermostat, a streetlamp, a toaster, a sporting goods, a hot water tank, a heater, a boiler, etc.).
- the term “user” may indicate a person who uses an electronic device or a device (e.g., an artificial intelligence electronic device) that uses an electronic device.
- FIGS. 1 A and 1 B illustrate a configuration of an electronic device, according to an embodiment of the present disclosure.
- an electronic device 100 is provided.
- the electronic device 100 may be configured as one body.
- the electronic device 100 may be an electronic device for communication including a speaker device 130 and a microphone device 140 for a voice call.
- the electronic device 100 may have a front surface configured by a touch screen 110 .
- a camera device 120 may be disposed on at least some areas of the touch screen 110 .
- the speaker device 130 may be disposed on at least one surface adjacent to the touch screen 110 (for example, an upper side surface, lower side surface, left side surface, and right side surface).
- the speaker device 130 may be disposed on the upper side surface adjacent to the touch screen 110 close to the user's ear for a voice call.
- Control buttons (for example, a home button and a back button) for controlling the electronic device 100 may be displayed in a lower area of the touch screen 110.
- the touch screen 110 of the electronic device 100 may include a front window 140 , a touch panel 150 , a display module 160 , and a printed circuit board (PCB) 170 , as illustrated in FIG. 1 B .
- the camera device (for example, a front camera device) 180 of the electronic device 100 may be mounted on the PCB 170.
- the front window 140 may be a transparent material window film that forms an external surface of the touch screen 110 .
- the PCB 170 may be a flexible PCB (FPCB), which is an electronic component made by forming a conductive circuit of a highly conductive material (e.g., copper) on an insulator.
- the camera device 180 may be disposed at a position overlapping at least some areas 152 of the touch panel 150 .
- at least some areas 152 of the touch panel 150 on which the camera device 180 is disposed may be perforated.
- the touch screen 110 may have a limited touch recognition function in the area 152 on which the camera device 180 is disposed.
- at least some areas 152 of the touch panel 150 on which the camera device 180 is disposed are not perforated, and a touch pattern for touch recognition may be omitted in the corresponding areas 152 .
- the touch screen 110 may have a limited touch recognition function in the area 152 on which the camera device 180 is disposed.
- the touch panel 150 on which the camera device 180 is disposed are not perforated, and the touch pattern for touch recognition may be set in the corresponding areas 152 .
- the touch screen 110 may detect a touch input through the areas 152 on which the camera device 180 is disposed.
- the touch pattern may include an electrode for the touch recognition.
- the camera device 180 may be disposed at a position overlapping some areas 162 of the display module 160 .
- at least some areas 162 of the display module 160 on which the camera device 180 is disposed may be perforated.
- the touch screen 110 may have a limited display function on the areas 162 in which the camera device 180 is disposed.
- at least some areas 162 of the display module 160 on which the camera device 180 is disposed are not perforated, and a display component may not be disposed in the corresponding areas 162 .
- the touch screen 110 may have a limited display function on the areas 162 in which the camera device 180 is disposed.
- At least some areas 162 of the display module 160 on which the camera device 180 is disposed may not be perforated, and a display component may be disposed.
- the touch screen 110 may display information through the areas 162 in which the camera device 180 is disposed.
- the electronic device 100 may form at least one hole in at least some areas (upper end) of the touch screen 110 and place the speaker device 130 for a voice call service in the at least one hole.
- FIG. 2 is a block diagram of an electronic device in a network environment, according to an embodiment of the present disclosure.
- the electronic device 201 may include a bus 210 , a camera device 220 , a processor 230 (e.g., including processing circuitry), a memory 240 , an input/output interface 260 (e.g., including input/output circuitry), a display 270 (e.g., including display circuitry), and a communication interface 280 (e.g., including communication circuitry).
- the electronic device 201 may omit at least one of the elements, or may further include other elements.
- the bus 210 is a circuit that interconnects the elements 220 to 280 and transfers communication (for example, control messages and/or data) between the elements.
- the camera device 220 may collect image information of a subject.
- the camera device 220 may include a plurality of camera devices included in the electronic device 201 .
- the camera device 220 may include a first camera device (for example, front camera device) for performing photography in a selfie mode and a second camera device (for example, back camera device) for photographing a subject located in front of the user.
- the camera device 220 may be disposed to be included in at least some areas of the display 270 .
- an image sensor of the first camera device may be disposed in at least some areas of the display 270.
- the image sensor may use a charge-coupled device (CCD) image sensor or a complementary metal-oxide semiconductor (CMOS) image sensor.
- the processor 230 may include one or more of a central processing unit (CPU), an application processor (AP), and a communication processor (CP). For example, the processor 230 may execute calculations or data processing about controls and/or communication of at least one other element of the electronic device 201 . The processor 230 may perform various functions of the electronic device 201 . Accordingly, the processor 230 may control the elements of the electronic device 201 .
- the processor 230 may control the camera device 220 based on touch information of a preset camera control area. For example, when some areas of the touch panel are perforated to place the camera device 220, the processor 230 may set at least some areas adjacent to the placement areas of the camera device 220 on the touch panel as camera control areas. For example, when some areas of the touch panel corresponding to the placement areas of the camera device 220 are not perforated and a touch pattern is omitted in the corresponding areas, the processor 230 may set at least some areas adjacent to the placement areas of the camera device 220 on the touch panel as camera control areas.
- the processor 230 may set the placement areas of the camera device 220 on the touch panel and at least some areas adjacent to the placement areas of the camera device 220 as camera control areas.
- the processor 230 may drive the camera device 220 based on a touch and a drag input in the camera control area. For example, when the touch input in the camera control area is detected, the processor 230 may control the display 270 to display camera activation information in a display area corresponding to the camera control area. For example, when the display 270 is deactivated, the processor 230 may maintain the touch recognition function of the camera control area in an active state. Accordingly, the processor 230 may detect the touch input in the camera control area in an inactive state of the display 270 . When the drag input of the camera activation information is detected, the processor 230 may execute a camera application to start a front camera mode.
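The activation sequence above (a touch in the camera control area shows activation information; a subsequent drag launches the camera application, while a release returns to idle) can be modeled as a small state machine. This is a hypothetical sketch; the state and event names are invented for illustration and do not appear in the patent.

```python
from enum import Enum, auto


class CamState(Enum):
    IDLE = auto()              # display may be off; control area still senses touches
    ACTIVATION_SHOWN = auto()  # camera activation information displayed
    CAMERA_RUNNING = auto()    # camera application executing (front camera mode)


def handle_event(state: CamState, event: str) -> CamState:
    """Advance the activation state machine for one touch event."""
    if state is CamState.IDLE and event == "touch_in_control_area":
        return CamState.ACTIVATION_SHOWN   # show camera activation information
    if state is CamState.ACTIVATION_SHOWN and event == "drag":
        return CamState.CAMERA_RUNNING     # execute the camera application
    if state is CamState.ACTIVATION_SHOWN and event == "release":
        return CamState.IDLE               # touch ended without a drag
    return state                           # ignore other events in this sketch
```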
- the processor 230 may set a camera display area to display a service screen of the camera application based on a distance of the drag input.
- the processor 230 may control the display 270 to display the service screen of the camera application (for example, a preview image acquired through the camera device 220 ) in the camera display area.
- the processor 230 may drive the camera device 220 based on a touch and a touch maintaining time in the camera control area. For example, when the touch input in the camera control area is detected, the processor 230 may control the display 270 to display camera activation information in the display area corresponding to the placement area of the camera device 220 . When the touch maintaining time of the camera activation information exceeds a reference time, the processor 230 may execute the camera application to start the front camera mode, for example. In this case, the processor 230 may control the display 270 to display the preview image acquired through the front camera device.
- the processor 230 may control the camera application to be linked with another application. For example, when the touch and the drag input in the camera control area are detected in a state where a service screen of another application is displayed, the processor 230 may display the service screen of the camera application in at least some areas of the display 270 based on a distance of the drag input. That is, the processor 230 may divide the display 270 into a first area and a second area based on the distance of the drag input. The processor 230 may control the display 270 to display a service screen of another application in the first area of the display 270 and to display a service screen of the camera application in the second area. When an image is captured (or acquired) through the camera application, the processor 230 may determine whether the camera application can be linked with the other application.
- the processor 230 may set the image captured through the camera application as contents to be controlled in the other application.
- the processor 230 may store the image captured through the camera application in the memory 240 .
- the processor 230 may end the camera application when the image is captured.
- the processor 230 may set a timer of the camera device 220 to capture an image based on touch information (for example, at least one of the touch input and the drag input) in the camera control area. For example, when the drag input in the camera control area is detected while the camera application is executed (for example, in the front camera mode), the processor 230 may set the timer of the camera device 220 to correspond to a drag distance. That is, the processor 230 may set a time of the timer in proportion to the drag distance.
- the processor 230 may control the display 270 to display timer information based on the placement area of the camera device 220 .
- the processor 230 may continuously reduce a size of the timer information displayed on the display 270 in accordance with the elapsing of the time of the timer.
- the processor 230 may capture an image. For example, when the touch input in the camera control area is detected while the camera application is executed (for example, in the front camera mode), the processor 230 may set the timer of the camera device 220 based on a touch position. That is, the processor 230 may set the time of the timer in proportion to a distance between the placement area of the camera device 220 and the touch position. For example, the timer of the camera device 220 may include a photographing timer of the camera device 220 to capture an image.
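Both timer mappings described above (a timer time proportional to the drag distance, or proportional to the distance between the camera placement area and the touch position) can be sketched as follows. The scale factor and the clamp are assumed values; the disclosure only states that the time is set in proportion to the distance:

```python
import math

def timer_from_drag(drag_distance_px, seconds_per_100px=1.0, max_seconds=10.0):
    """Timer duration proportional to the drag distance, clamped to a maximum."""
    return min(max_seconds, drag_distance_px / 100.0 * seconds_per_100px)

def timer_from_touch_position(camera_xy, touch_xy,
                              seconds_per_100px=1.0, max_seconds=10.0):
    """Timer duration proportional to the distance between the camera
    placement area and the touch position."""
    distance = math.hypot(touch_xy[0] - camera_xy[0], touch_xy[1] - camera_xy[1])
    return min(max_seconds, distance / 100.0 * seconds_per_100px)
```

For example, a 200-pixel drag yields a 2-second timer, and a touch 500 pixels from the camera placement area yields a 5-second timer under these assumed scale factors.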
- the processor 230 may change a color of the display 270 to secure an amount of light to capture the image. For example, when the image is captured through the front camera device, the processor 230 may change the color of the display 270 into a bright color (for example, white) based on the placement area of the camera device 220 and provide a flash effect. For example, the processor 230 may apply various image effects by changing the color of the display 270 in accordance with a user input.
- the processor 230 may control the display 270 to display additional information for a camera service through the camera control area. For example, when the image is captured through the front camera device, the processor 230 may control the display 270 to display a graphic effect (for example, a wavelength image) based on the placement area of the camera device 220 to draw the user's eyes toward the front camera device. For example, when video is photographed through the back camera device, the processor 230 may control the display 270 to display audio input information based on the placement area of the camera device 220. For example, the processor 230 may control a size of the audio input information to correspond to a size of an audio signal collected through the microphone device 140 while the video is photographed.
- the processor 230 may execute the camera application based on touch information of an application icon. For example, when at least one of a touch input and a drag input for the application icon is detected, the processor 230 may identify whether the application icon enters the camera control area. When the application icon enters the camera control area, the processor 230 may identify whether an application corresponding to the application icon is linked with the camera application. When the application corresponding to the application icon is linked with the camera application, the processor 230 may execute a camera function (for example, front camera mode) of the application corresponding to the application icon. For example, when the touch input for the application icon is released within the camera control area, the processor 230 may determine that the application icon enters the camera control area.
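The icon-drop check described above, in which an application icon "enters" the camera control area when its touch input is released inside that area, can be sketched as a simple hit test. The rectangle representation and function names are illustrative:

```python
def icon_enters_control_area(release_point, control_area):
    """An icon 'enters' the control area if its touch is released inside it."""
    x, y = release_point
    ax, ay, aw, ah = control_area
    return ax <= x <= ax + aw and ay <= y <= ay + ah

def launch_on_icon_drop(release_point, control_area, app_links_camera):
    """Return the camera function to start when an app icon is dropped,
    or None when the icon misses the area or the app is not linked."""
    if icon_enters_control_area(release_point, control_area) and app_links_camera:
        return "front_camera_mode"
    return None
```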
- the processor 230 may execute a multi-camera mode based on touch information of the camera control area.
- the processor 230 may provide a camera service of one of a plurality of camera devices.
- the processor 230 may switch to the multi-camera mode in which the plurality of camera devices are simultaneously activated.
- the processor 230 may additionally activate the back camera device and execute the multi-camera mode.
- the display 270 may overlap the display of preview images acquired through the front camera device and the back camera device or display the preview images in different areas.
- the processor 230 may switch positions of the preview images.
- the processor 230 may control sizes of the preview images based on input information detected through the input/output interface 260 .
- the processor 230 may provide an automatic photographing service based on at least one of a location and an angle of the electronic device 201 in the front camera mode. For example, when the automatic photographing mode is set, the processor 230 may display a camera image corresponding to the location and the angle of the electronic device 201 to be adjacent to the placement area of the camera device 220 . That is, the processor 230 may display the camera image corresponding to the location and the angle of the electronic device 201 to allow the user to control the location and the angle of the electronic device 201 to match photographing information. When the location and the angle of the electronic device 201 match the photographing information, the processor 230 may automatically capture the image. For example, the photographing information may be set by a user input or may include at least one of the location and the angle of the electronic device 201 that match the image acquired through the front camera mode.
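The automatic photographing condition above (capture when the location and the angle of the electronic device 201 match the photographing information) can be sketched as a tolerance comparison. The per-axis angle representation and the tolerance value are assumptions:

```python
def matches_photographing_info(current_angle, target_angle, tol_degrees=3.0):
    """True when every axis of the device attitude is within tolerance
    of the target set by the photographing information."""
    return all(abs(c - t) <= tol_degrees
               for c, t in zip(current_angle, target_angle))

def auto_capture(current_angle, target_angle):
    """Capture automatically on a match; otherwise keep guiding the user
    via the camera image displayed next to the camera placement area."""
    if matches_photographing_info(current_angle, target_angle):
        return "capture"
    return "guide_user"
```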
- the processor 230 may set the camera application to control the camera device 220 based on touch information of the camera control area. For example, when a drag input in a first direction (for example, a horizontal direction) in the camera control area is detected, the processor 230 may control the display 270 to display a camera application list installed in the electronic device 201 . The processor 230 may select one first camera application based on input information detected through the input/output interface 260 . The processor 230 may control the camera device 220 by executing the first camera application. That is, the processor 230 may drive the camera device 220 based on camera setting information set to the first camera application. In addition, the processor 230 may set the first camera application as a basic camera application.
- the processor 230 may execute the first camera application.
- the camera setting information may include at least one of a filter for photographing, a photographing mode, and a photographing setting value (for example, aperture, shutter speed, image size, and the like).
- the processor 230 may control the camera device 220 in accordance with the camera setting information of the image.
- the processor 230 may control the display 270 to display a list of images stored in the memory 240 .
- the processor 230 may control the display 270 to display corresponding filter information in the image to which a filter is applied.
- the processor 230 may identify whether the first image enters the camera control area.
- the processor 230 may drive the camera device 220 in accordance with camera setting information of the first image. For example, when a touch input for the first image is released within the camera control area, the processor 230 may determine that the first image enters the camera control area.
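The flow above, where dragging a stored image into the camera control area drives the camera with that image's setting information, can be sketched as follows. The metadata dictionary stands in for whatever per-image record the device keeps (a real implementation might read EXIF); all names here are illustrative:

```python
def camera_settings_from_image(image_metadata):
    """Extract the fields the disclosure lists as camera setting information:
    filter, photographing mode, and setting values such as aperture,
    shutter speed, and image size."""
    keys = ("filter", "mode", "aperture", "shutter_speed", "image_size")
    return {k: image_metadata[k] for k in keys if k in image_metadata}

def apply_settings(camera_state, settings):
    """Drive the (simulated) camera with the settings of the dragged-in
    image, keeping any current values the image does not override."""
    new_state = dict(camera_state)
    new_state.update(settings)
    return new_state
```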
- the processor 230 may capture an image based on touch information of the camera control area. For example, when a double tap input in the camera control area is detected, the processor 230 may capture an image through the camera device 220 without executing the camera application.
- the processor 230 may photograph video based on touch information of the camera control area. For example, when a touch maintaining time of the camera control area exceeds a reference time, the processor 230 may photograph video through the camera device 220 without executing the camera application. When the touch input in the camera control area is released, the processor 230 may end the photographing of the video. For example, when the touch maintaining time in the camera control area exceeds the reference time, the processor 230 may output notification information to allow the user to recognize the start of the video photographing.
- the notification information may include at least one of a notification sound, a notification message, and a vibration.
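The app-less capture gestures above can be sketched as a small classifier: a double tap in the camera control area captures a still image, and a touch held longer than the reference time records video until release. The reference time value is an assumption, since the disclosure only calls it a "reference time":

```python
REFERENCE_TIME_S = 0.8  # assumed threshold; the disclosure does not give a value

def classify_capture_gesture(tap_count, hold_time_s):
    """Map a touch in the camera control area to a capture action,
    without executing the camera application."""
    if tap_count == 2:
        return "capture_image"          # double tap -> still image
    if hold_time_s > REFERENCE_TIME_S:
        # Also output notification information (sound, message, or vibration)
        # so the user recognizes that video photographing has started.
        return "record_video"           # recording ends when the touch is released
    return None
```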
- the processor 230 may display driving limit information to be adjacent to the placement area of the camera device 220 .
- the processor 230 may execute a camera setting menu.
- the processor 230 may display notification information of a communication service using an image to be adjacent to the placement area of the camera device 220 . For example, when a video call signal is received, the processor 230 may display video call notification information to be adjacent to the placement area of the camera device 220 . The processor 230 may determine whether to accept the video call based on touch information of an area where the video call notification information is displayed.
- the processor 230 may display human body recognition information (for example, face recognition) to be adjacent to the placement area of the camera device 220 .
- the processor 230 may display time information required for iris recognition based on the placement area of the camera device 220, allowing the user to recognize how long the user should look at the camera device 220 for the iris recognition.
- the processor 230 may further display progress time information of the iris recognition. For example, when the time information required for the iris recognition matches the progress time information of the iris recognition, the processor 230 may complete the iris recognition.
- the processor 230 may display pollution level information of the camera device 220 to be adjacent to the placement area of the camera device 220 .
- the processor 230 may estimate a pollution level of the image sensor of the camera device 220 by measuring the sharpness of the image acquired through the camera device 220.
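One common way to estimate sharpness, and hence a possible sensor contamination level, is the variance of a Laplacian filter: blurry (possibly smudged) images have low variance. This is an illustrative technique choice, not the patent's stated method, and the threshold is an assumed value:

```python
def laplacian_variance(gray):
    """Variance of a 4-neighbour Laplacian over a 2-D grayscale grid;
    low values suggest a blurry image and possibly a dirty lens/sensor."""
    h, w = len(gray), len(gray[0])
    values = []
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            lap = (gray[y - 1][x] + gray[y + 1][x]
                   + gray[y][x - 1] + gray[y][x + 1]
                   - 4 * gray[y][x])
            values.append(lap)
    mean = sum(values) / len(values)
    return sum((v - mean) ** 2 for v in values) / len(values)

def pollution_level(gray, sharp_threshold=100.0):
    """Map sharpness to a coarse pollution estimate for the notification
    displayed next to the camera placement area."""
    return "clean" if laplacian_variance(gray) >= sharp_threshold else "possibly_dirty"
```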
- the processor 230 may display the pollution level information of the image sensor of the camera device 220 to be adjacent to the placement area of the camera device 220 .
- the processor 230 may display a guide image to induce a touch of another area adjacent to the placement area of the camera device 220 .
- the memory 240 may include a volatile memory and/or a non-volatile memory.
- the memory 240 may store instructions or data related to at least one other element of the electronic device 201 .
- the memory 240 may store software and/or a program 250 .
- the program 250 may include a kernel 251 , middleware 253 , an application programming interface (API) 255 , and an application program 257 .
- At least some of the kernel 251 , the middleware 253 , and the API 255 may be referred to as an operating system (OS).
- the input/output interface 260 may function as an interface that may transfer instructions or data input from a user or another external device to the other elements of the electronic device 201 . Furthermore, the input/output interface 260 may output instructions or data, which are received from the other elements of the electronic device 201 , to the user or the external device.
- the input/output interface 260 may include a touch panel that detects a touch input or a hovering input using an electronic pen or a user's body part.
- the input/output interface 260 may receive a gesture or a proximity input using an electronic pen or a user's body part.
- the display 270 may display various types of contents (for example, text, images, videos, icons, symbols, or the like) to a user. For example, at least some areas (for example, upper areas) of the display 270 may be perforated for placement of the camera device 220 . Accordingly, the display 270 may limit the display function in the placement area of the camera device 220 . According to an embodiment, the display 270 may be implemented by a touch screen coupled with the touch panel of the input/output interface 260 .
- the communication interface 280 may establish communication between the electronic device 201 and an external device.
- the communication interface 280 may communicate with a first external electronic device 202 through short-range communication 284 or wired communication.
- the communication interface 280 may be connected to a network 282 through wireless or wired communication to communicate with a second external electronic device 204 or a server 206 .
- the network 282 may include at least one of a communication network, a computer network (for example, a LAN or a WAN), the Internet, and a telephone network.
- FIG. 3 is a flowchart of a process for controlling a camera device in an electronic device, according to an embodiment of the present disclosure.
- the electronic device may include the electronic device 201 or at least a part (for example, processor 230 ) of the electronic device 201 .
- the electronic device 201 identifies whether a touch input for the camera control area related to the placement area of the camera device 220 is detected on the touch screen in operation 301 .
- the camera control area may be disposed on at least some areas adjacent to the placement area of the camera device 220 on the touch screen.
- the camera control area may be disposed on the placement area of the camera device 220 and at least some areas adjacent to the placement area of the camera device 220 on the touch screen.
- the electronic device 201 detects a control function of the camera device 220 corresponding to the touch input in operation 303 .
- the processor 230 detects the control function of the camera device 220 based on at least one of the number of touches in the camera control area, a drag distance (i.e., touch motion distance), a drag direction (i.e., touch motion direction), and a touch maintaining time.
- the control function of the camera device 220 may include at least one of driving of the camera device 220 , selection of an application for driving the camera device 220 , camera setting information, image capturing, video photographing, timer setting, and camera mode switching.
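Operation 303 above, which maps the number of touches, drag distance, drag direction, and touch maintaining time onto a camera control function, can be sketched as a dispatcher. The specific thresholds and the exact gesture-to-function mapping are assumptions; the disclosure lists the inputs and the candidate functions but not a fixed mapping:

```python
def detect_control_function(touch_count, drag_distance, drag_direction, hold_time_s):
    """Illustrative mapping of camera-control-area touch attributes onto
    the control functions named in the disclosure."""
    if touch_count == 2:
        return "capture_image"              # double tap
    if hold_time_s > 0.8:
        return "record_video"               # long press (assumed threshold)
    if drag_direction == "horizontal" and drag_distance > 0:
        return "select_camera_application"  # horizontal drag shows the app list
    if drag_direction == "vertical" and drag_distance > 0:
        return "drive_camera"               # vertical drag opens the camera
    return "none"
```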
- the electronic device 201 drives the camera device 220 based on the control function of the camera device 220 corresponding to the touch input in the camera control area in operation 305 .
- the processor 230 controls the camera device 220 by executing the camera application in accordance with the control function of the camera device 220 .
- FIG. 4 is a flowchart of a process for controlling a camera device in an electronic device when a screen of the electronic device is turned off, according to an embodiment of the present disclosure.
- FIGS. 6 A to 6 D illustrate a screen configuration for controlling a camera device in an electronic device when a screen of the electronic device is turned off, according to an embodiment of the present disclosure.
- the electronic device may include the electronic device 201 or at least a part (for example, processor 230 ) of the electronic device 201 .
- the electronic device 201 determines whether a first touch input is detected through a camera control area set based on a placement area of the camera device 220 on the touch screen in operation 401 .
- the processor 230 maintains a touch recognition function of the camera control area in an active state. The processor 230 determines whether a first type touch input is detected through the camera control area.
- the first type touch input may correspond to a touch in which the user rubs the camera control area, and may include a touch input whose drag direction changes continuously.
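A "rub" of the type described above, a touch whose drag direction changes continuously, can be detected by counting direction reversals along the touch trace. The reversal threshold and the use of the horizontal axis only are assumptions:

```python
def is_rub_gesture(points, min_direction_changes=2):
    """Classify a touch trace (list of (x, y) samples) as a 'rub' when its
    horizontal drag direction reverses at least `min_direction_changes` times."""
    changes, prev_sign = 0, 0
    for (x0, _), (x1, _) in zip(points, points[1:]):
        dx = x1 - x0
        sign = (dx > 0) - (dx < 0)
        if sign != 0 and prev_sign != 0 and sign != prev_sign:
            changes += 1
        if sign != 0:
            prev_sign = sign
    return changes >= min_direction_changes
```

A straight swipe produces no reversals and is not classified as a rub, so it cannot wake the camera from the screen-off state by accident.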
- the processor 230 determines whether the first type touch input is detected through the camera control area.
- the touch recognition function of the camera control area may be activated or deactivated based on the type of an application driven in the electronic device 201 .
- When the first type touch input is not detected, the electronic device 201 terminates the operation for controlling the driving of the camera device 220.
- the electronic device 201 displays camera activation information in a display area corresponding to the camera control area in operation 403 .
- the processor 230 may display camera activation information 620 based on the placement area of the camera device 220 , as illustrated in FIG. 6 B .
- the electronic device 201 determines whether a second touch input for the camera activation information is detected in operation 405 .
- the processor 230 determines whether a drag input 630 for the camera activation information 620 is detected within a reference time from a time point when the camera activation information 620 is displayed, as illustrated in FIG. 6 B .
- When the drag input is not detected within the reference time, the electronic device 201 may determine not to drive the camera device 220. Accordingly, the electronic device 201 may terminate the operation for controlling driving of the camera device 220.
- the electronic device 201 drives the camera device 220 in operation 407 .
- the processor 230 may display at least some of the service screen of the camera application in accordance with a distance of the drag input, as illustrated in FIG. 6 C .
- the processor 230 may display the service screen of the camera application on the display 270 as indicated by reference numeral 650 in FIG. 6 D .
- the processor 230 may display a preview image acquired through the front camera device 220 on the display 270 by executing the camera application.
- the processor 230 may display the service screen (for example, a preview image) of the camera application in at least some areas of the display 270 in accordance with the drag input.
- the electronic device 201 may determine not to drive the camera device 220. Accordingly, the electronic device 201 may terminate the service screen of the camera application, as illustrated in FIG. 6A.
- FIG. 5 is a flowchart of a process for detecting touch information of a camera control area in an electronic device, according to an embodiment of the present disclosure.
- the electronic device may include the electronic device 201 or at least a part (for example, processor 230 ) of the electronic device 201 .
- the electronic device 201 determines whether the display 270 is deactivated in operation 501 .
- the processor 230 determines whether an operation state of the display 270 has switched to an inactive state because the electronic device 201 operates in a low power mode.
- When the display 270 is deactivated, the electronic device 201 maintains the touch recognition function of the camera control area in an active state in operation 503.
- the processor 230 maintains a touch recognition function of the camera control area in an active state.
- the electronic device 201 determines whether a touch input is detected through the camera control area. For example, the processor 230 determines whether a touch input of the type in which the user rubs the touch screen is detected through the camera control area having the activated touch recognition function as illustrated in FIG. 6 A .
- When the touch input is not detected through the camera control area, the electronic device 201 maintains the touch recognition function of the camera control area in the active state in operation 503.
- the electronic device 201 determines whether the touch input is detected in operation 507 . For example, when the display 270 is in the active state, the processor 230 maintains the touch recognition function of the touch panel corresponding to the display 270 in the active state. Accordingly, the processor 230 determines whether the touch input is detected through the touch panel in the active state.
- the electronic device 201 determines whether the display 270 is deactivated again in operation 501 .
- the electronic device 201 determines whether the touch input is detected through the camera control area in operation 509 .
- the processor 230 determines whether a touch coordinate of the touch input is included in the camera control area.
- FIG. 7 is a flowchart of a process for configuring a camera display area in an electronic device based on touch information of a camera control area, according to an embodiment of the present disclosure.
- the electronic device may include the electronic device 201 or at least a part (for example, processor 230 ) of the electronic device 201 .
- the electronic device 201 sets the camera display area based on a second touch input in operation 701 .
- the processor 230 sets at least some areas of the display 270 as the camera display area for displaying the service screen of the camera application in accordance with a drag distance. For example, when the drag distance for the camera activation information 620 exceeds a reference distance, the processor 230 sets the entire area of the display 270 as the camera display area.
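Operation 701 above, which grows the camera display area with the drag distance and switches to the entire display once the drag exceeds a reference distance, can be sketched as:

```python
def camera_display_area(drag_distance, display_height, reference_distance):
    """Height of the display region used for the camera service screen.

    Grows in proportion to the drag distance; once the drag exceeds the
    reference distance, the entire display becomes the camera display area.
    """
    if drag_distance > reference_distance:
        return display_height  # full-screen camera service screen
    return int(display_height * drag_distance / reference_distance)
```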
- the electronic device 201 may drive the camera device 220 based on the camera display area in operation 703 .
- the processor 230 may display a preview image acquired through the front camera device in the camera display area of the display 270 by executing the camera application.
- FIG. 8 is a flowchart of a process for controlling a camera device in an electronic device when a camera application is linked with another application, according to an embodiment of the present disclosure.
- FIGS. 9 A to 9 E illustrate a screen configuration for controlling a camera device in an electronic device when a camera application is linked with another application, according to an embodiment of the present disclosure.
- the electronic device may include the electronic device 201 or at least a part (for example, processor 230 ) of the electronic device 201 .
- the electronic device 201 may drive a first application among at least one application installed in the electronic device 201 in operation 801 .
- For example, when a messenger application is selected from among at least one application installed in the electronic device 201 based on input information detected through the input/output interface 260, the processor 230 displays a service screen 900 of the messenger application on the display 270.
- the electronic device 201 detects a first touch input for the camera control area in operation 803.
- the processor 230 may detect a tap input through at least some areas of the touch screen set as the camera control area.
- When the first touch input is not detected, the electronic device 201 determines not to drive the camera device 220 and terminates the operation for controlling driving of the camera device 220.
- the electronic device 201 displays camera activation information in at least some areas of the display corresponding to the camera control area in operation 805 .
- the processor 230 displays camera activation information 920 based on the placement area of the camera device 220 .
- the electronic device 201 determines whether a second touch input for the camera activation information is detected in operation 807 .
- the processor 230 determines whether a drag input is detected through at least some areas of the touch screen where the camera activation information is displayed.
- When the second touch input is not detected, the electronic device 201 determines not to drive the camera device 220. Accordingly, the electronic device 201 terminates the operation for controlling driving of the camera device 220.
- the electronic device 201 sets a camera display area in accordance with the second touch input in operation 809 .
- the processor 230 sets at least some areas of the display 270 as the camera display area in accordance with a drag distance.
- the electronic device 201 displays driving information (for example, service screen of the camera application) of the camera device 220 in the camera display area in operation 811 .
- the processor 230 displays a preview image acquired through the front camera device in the camera display area set to at least some areas of the display 270 based on the drag distance as indicated by reference numeral 940 .
- the processor 230 displays a photographing button 942 at a position where the drag input is released.
- the electronic device 201 determines whether an event for capturing an image is generated through the camera application in operation 813 .
- the processor 230 may determine whether a touch input for the photographing button 942 displayed in the camera display area is detected or whether a gesture input mapped to image capturing is detected.
- When the event for capturing the image is not generated, the electronic device 201 maintains display of the camera driving information in the camera display area in operation 811.
- the electronic device 201 determines whether the camera application and a first application are linked to each other in operation 815.
- the processor 230 determines whether the first application provides a service using the image captured through the camera application.
- the electronic device 201 links the image captured through the camera application with the first application in operation 817 .
- the processor 230 may transmit the image captured through the camera application to a counterpart electronic device through a chat room of the messenger application as indicated by reference numeral 950 .
- the processor 230 may store the image captured through the camera application in the memory 240 .
- the electronic device 201 stores the image captured through the camera application in the memory 240 of the electronic device 201 in operation 819 .
- the electronic device 201 terminates driving of the camera device 220 .
- the processor 230 terminates the camera application, as illustrated in FIG. 9 E .
- FIG. 10 is a flowchart of a process for controlling a camera device in an electronic device based on touch maintaining information of a camera control area, according to an embodiment of the present disclosure.
- FIGS. 11 A and 11 B illustrate a screen configuration for controlling a camera device in an electronic device based on touch maintaining information of a camera control area, according to an embodiment of the present disclosure.
- the electronic device may include the electronic device 201 or at least a part (for example, processor 230 ) of the electronic device 201 .
- the electronic device 201 determines whether a touch input is detected through a camera control area set to at least some areas of the touch screen in operation 1001 .
- the processor 230 may determine whether a hovering input for the camera control area set to be adjacent to the placement area of the camera device 220 is detected or whether a tap input for the camera control area of the touch screen is detected.
- When the touch input is not detected, the electronic device 201 determines not to drive the camera device 220 and terminates the operation for controlling the camera device 220.
- the electronic device 201 displays camera activation information in at least some areas of the display corresponding to the camera control area in operation 1003 .
- the processor 230 displays camera activation information 1130 to be adjacent to the placement area of the camera device 220 , as illustrated in FIG. 11 A .
- the processor 230 displays the camera activation information 1130 in at least some areas different from the placement area of the camera device 220 in order to prevent the image sensor of the front camera device from becoming dirty due to the touch input for controlling the camera device 220 . That is, the processor 230 displays the camera activation information 1130 in at least some areas different from the placement area of the camera device 220 to allow the user to touch at least some areas different from the placement area of the camera device 220 .
- the electronic device 201 determines whether a touch maintaining time for the camera activation information exceeds a reference time in operation 1005 .
- the processor 230 determines whether the touch maintaining time for the camera activation information 1130 exceeds the reference time, as illustrated in FIG. 11 A .
- the electronic device 201 determines whether the touch input for the camera activation information is released in operation 1009 .
- the processor 230 determines whether a touch input 1120 for the camera activation information 1130 is released, as illustrated in FIG. 11 A .
- When the touch input for the camera activation information is released, the electronic device 201 determines not to drive the camera device 220 and terminates the operation for controlling driving of the camera device 220.
- the electronic device 201 determines whether the touch maintaining time for the camera activation information exceeds the reference time again in operation 1005 .
- the electronic device 201 drives the camera device 220 in operation 1007 .
- the processor 230 displays the service screen of the camera application on the display 270 through an image effect that makes the service screen of the camera application spread from the placement area of the camera device 220 .
- the service screen of the camera application may include a preview image acquired through the front camera device or the back camera device.
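The long-press activation flow above (operations 1001 to 1009) can be sketched as a small decision function. This is an illustrative sketch, not from the patent: the function name and the reference time value are assumptions, since the patent only requires that the touch maintaining time exceed a reference time before the camera is driven.

```python
# Hypothetical sketch of the camera activation decision (operations
# 1001-1009). The 800 ms reference time is an assumed value.
REFERENCE_TIME_MS = 800


def activation_decision(held_ms: int, released: bool) -> str:
    """Decide the next step for a touch on the camera activation
    information: 'launch' once the hold exceeds the reference time,
    'cancel' if the finger lifts early, otherwise keep 'waiting'."""
    if held_ms > REFERENCE_TIME_MS:
        return "launch"   # drive the camera device (operation 1007)
    if released:
        return "cancel"   # touch released early: do not drive the camera
    return "waiting"      # keep checking the touch maintaining time
```

On "launch", the service screen of the camera application would then be displayed with the spreading image effect described above.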
- FIG. 12 is a flowchart of a process for setting a timer of a camera device based on touch information of a camera control area in an electronic device, according to an embodiment of the present disclosure.
- FIGS. 13 A to 13 C illustrate a screen configuration for setting a timer of a camera device based on touch information of a camera control area in an electronic device, according to an embodiment of the present disclosure.
- the electronic device may include the electronic device 201 or at least a part (for example, processor 230 ) of the electronic device 201 .
- the electronic device 201 displays a service screen of a camera application in at least some areas of the display in operation 1201 .
- the processor 230 executes the camera application based on touch information of a camera control area, as described in operations 401 to 407 of FIG. 4 or operations 1001 to 1009 of FIG. 10 .
- the processor 230 executes the corresponding camera application.
- the electronic device 201 determines whether the touch input for the camera control area is detected in operation 1203 .
- the processor 230 displays a preview image of the front camera device on the display 270 by executing the camera application as indicated by reference numeral 1300 .
- the processor 230 may determine whether a drag input 1310 is detected from the placement area of the camera device 220 in a state where the preview image of the front camera device is displayed.
- the processor 230 determines whether a subsequent tap input for the camera control area is detected in the state where the preview image of the front camera device is displayed.
- the electronic device 201 determines to not set a timer of the camera device. Accordingly, the electronic device terminates the operation for setting the timer of the camera device 220 .
- the electronic device 201 sets the timer of the camera device in accordance with the touch input in operation 1205 .
- the processor 230 sets the timer of the camera device 220 to a timer required time corresponding to a drag distance from the placement area of the camera device 220 , as illustrated in FIG. 13 A .
- the processor 230 sets the timer of the camera device 220 to a time corresponding to a distance between the placement area of the camera device 220 and a position where a tap input is detected.
- the electronic device 201 displays timer information of the camera device 220 set to correspond to the touch input on the display 270 in operation 1207 .
- the processor 230 displays time information set to correspond to the drag distance from the placement area of the camera device 220 as indicated by reference numeral 1320 in FIG. 13 A .
- the electronic device 201 determines whether the time set to correspond to the touch input has expired in operation 1209 .
- the electronic device 201 displays timer information of the camera device 220 on the display 270 in operation 1207 .
- the processor 230 updates the time information displayed in accordance with the elapse of time. That is, the processor 230 updates the timer of the camera device 220 such that display of the time information becomes gradually smaller in accordance with the elapse of time from a time point when the timer is set.
- the electronic device 201 captures an image by driving the camera device 220 in operation 1211 .
- the processor 230 captures an image by using the front camera device. In this case, the processor 230 removes the display of the time information from the display 270 as illustrated in FIG. 13 C .
- the processor 230 may acquire the amount of light for the image capturing by changing a color of the display into a bright color.
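The timer flow above maps a drag distance from the camera placement area to a timer required time. A minimal sketch of that mapping, assuming an illustrative step size and upper bound (the patent does not specify either):

```python
# Illustrative drag-distance-to-timer mapping (operations 1203-1211).
# The 50 px-per-second step and the 10 s cap are assumptions.
PIXELS_PER_SECOND = 50
MAX_TIMER_S = 10


def timer_from_drag(drag_distance_px: float) -> int:
    """Convert the drag distance from the placement area of the camera
    device into a self-timer delay in whole seconds, clamped to a cap."""
    seconds = int(drag_distance_px // PIXELS_PER_SECOND)
    return max(0, min(seconds, MAX_TIMER_S))
```

The displayed time information would then count down from this value, shrinking as the set time elapses, before the image is captured.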
- FIG. 14 is a flowchart of a process for providing a flash effect in an electronic device, according to an embodiment of the present disclosure.
- FIGS. 15 A to 15 D illustrate a screen configuration for providing a flash effect in an electronic device, according to an embodiment of the present disclosure.
- the electronic device may include the electronic device 201 or at least a part (for example, processor 230 ) of the electronic device 201 of FIG. 2 .
- the electronic device 201 determines whether a flash function is set. For example, when the time set for the timer of the camera device 220 expires, the processor 230 may determine that the image capturing event is generated. The processor 230 displays camera activation information 1510 based on the placement area of the camera device 220 to make the user's eyes face the front camera device in response to the generation of the image capturing event. In this case, the processor 230 determines whether a flash setting menu of the camera device 220 is set in an active state.
- the electronic device 201 captures the image by driving the camera device 220 in operation 1405 .
- the processor 230 captures the image by using the activated front camera device of the camera device 220 .
- the electronic device 201 changes a color of the display 270 into a color set as the flash function in operation 1403 .
- the processor 230 displays a background image such that a bright colored (for example, white) background image spreads across an entire area of the display 270 based on the placement area of the camera device 220 as indicated by reference numerals 1520 and 1530 .
- the electronic device 201 captures the image by driving the camera device 220 while changing the color of the display 270 by the flash function in operation 1405 .
- the electronic device 201 may change the color of the display 270 in accordance with an image effect which the user desires.
- the processor 230 may set an image effect having a warm feeling based on the user's input information.
- the processor 230 displays a background image such that a yellow background image, for example, spreads across an entire area of the display 270 based on the placement area of the camera device 220 while capturing the image.
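The "spreading" flash effect (reference numerals 1520 and 1530) grows a bright disc from the camera placement point until it covers the display. A geometric sketch of that animation, where the screen size, origin, and progress values are all illustrative:

```python
import math

# Sketch of the spreading flash fill: a disc centered on the camera
# placement point grows until it reaches the farthest display corner.
# Coordinates and sizes below are assumptions for illustration.


def flash_radius(progress: float, origin: tuple, size: tuple) -> float:
    """Radius of the flash disc at animation progress in [0, 1], where
    1.0 covers the farthest corner of the display from the origin."""
    ox, oy = origin
    w, h = size
    corners = [(0, 0), (w, 0), (0, h), (w, h)]
    full = max(math.hypot(cx - ox, cy - oy) for cx, cy in corners)
    return max(0.0, min(progress, 1.0)) * full
```

Drawing a white (or, for a warm effect, yellow) disc of this radius each frame reproduces the background image spreading across the entire display area.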
- the electronic device 201 may display audio input information to allow the user to identify a size of an audio signal input through the microphone device 140 .
- the processor 230 may display audio input information 1540 corresponding to a size of an audio signal based on the placement area of the camera device 220 , as illustrated in FIG. 15 D .
- FIG. 16 is a flowchart of a process for providing a camera service through an application in an electronic device, according to an embodiment of the present disclosure.
- FIGS. 17 A and 17 B illustrate a screen configuration for providing a camera service through an application in an electronic device, according to an embodiment of the present disclosure.
- the electronic device may include the electronic device 201 or at least a part (for example, processor 230 ) of the electronic device 201 .
- the electronic device 201 displays a standby screen including an icon of at least one application in operation 1601 .
- the processor 230 displays the standby screen including icons of applications installed in the electronic device 201 as indicated by reference numeral 1700 .
- the electronic device 201 determines whether a touch input for the icon of the application is detected in the standby screen in operation 1603 .
- the processor 230 determines whether a touch input for an icon of one of a plurality of applications displayed on the standby screen is detected, as illustrated in FIG. 17 A .
- the electronic device 201 determines to not provide the camera service and terminates the operation for providing the camera service.
- the electronic device 201 determines whether the application icon enters the camera control area by the touch input in operation 1605 .
- the processor 230 determines whether a first application icon 1710 on which the touch input is detected enters the camera control area through a drag input 1720 on a standby screen 1700 as indicated by reference numeral 1730 in FIG. 17 A .
- the processor 230 determines that the first application icon 1710 enters the camera control area.
- the processor 230 determines whether a second touch input for determining a movement location of the first application icon 1710 is detected within the camera control area, as illustrated in FIG. 17 A .
- the second touch input may be determined as being effective only when the corresponding touch input is detected within a reference time from a time point when the first touch input is detected.
- the electronic device 201 determines whether the touch input for the application icon is released in operation 1611 .
- the processor 230 determines whether the touch input for the first application icon 1710 is released outside the camera control area, as illustrated in FIG. 17 A .
- the electronic device 201 determines whether the application icon enters the camera control area by the touch input in operation 1605 .
- the electronic device 201 determines to not provide the camera service and terminates the operation for providing the camera service. For example, the processor 230 may change a location of the application icon to a location where the touch input for the application icon is released.
- the electronic device 201 determines whether an application corresponding to the application icon can be linked with the camera application in operation 1607 .
- the processor 230 determines whether the camera service can be provided through the application corresponding to the first application icon 1710 .
- the electronic device 201 terminates the operation for providing the camera service.
- the electronic device 201 displays link information between the application corresponding to the application icon and the camera device 220 on the display 270 in operation 1609 .
- the processor 230 displays a camera service screen of the first application on the display 270 as indicated by reference numeral 1740 in FIG. 17 B.
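The check of whether a dragged application icon "enters the camera control area" (operation 1605) is, in effect, a hit test against the area set around the camera placement point. A hypothetical sketch, modeling the control area as a circle with an assumed radius:

```python
import math

# Hypothetical hit test for operation 1605. The camera control area is
# modeled as a circle around the camera placement point; the 120 px
# radius is an assumption, not a value from the patent.
CONTROL_RADIUS_PX = 120


def icon_in_control_area(icon_pos, camera_pos, radius=CONTROL_RADIUS_PX):
    """True when the dragged icon's position lies inside the circular
    camera control area around the camera placement area."""
    return math.hypot(icon_pos[0] - camera_pos[0],
                      icon_pos[1] - camera_pos[1]) <= radius
```

When this returns true and the application can be linked with the camera application, the camera service screen of that application is displayed.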
- FIG. 18 is a flowchart of a process for providing a multi-camera service in an electronic device, according to an embodiment of the present disclosure.
- FIGS. 19 A to 19 E illustrate a screen configuration for providing a multi-camera service in an electronic device, according to an embodiment of the present disclosure.
- the electronic device may include the electronic device 201 or at least a part (for example, processor 230 ) of the electronic device 201 .
- the electronic device 201 displays a service screen of a camera application on the display 270 in operation 1801 .
- the processor 230 displays a preview image collected through the front camera device on the display 270 as indicated by reference numeral 1900 .
- the processor 230 displays the service screen of the camera application on the display 270 like operations 401 to 407 of FIG. 4 or operations 1001 to 1009 of FIG. 10 .
- the electronic device 201 determines whether a touch input is detected through the camera control area. For example, the processor 230 determines whether a tap input 1910 for the camera control area adjacent to the placement area of the camera device 220 is detected in a state where a preview image of the front camera device is displayed, as illustrated in FIG. 19 A .
- the electronic device 201 determines to not provide a multi-camera service. Accordingly, the electronic device 201 terminates the operation for providing the multi-camera service.
- the electronic device 201 switches a camera mode of the electronic device 201 to a multi-camera mode in operation 1805 .
- the processor 230 may additionally activate the back camera device.
- the processor 230 may additionally activate the front camera device.
- the electronic device 201 displays a service screen using multiple cameras on the display 270 based on the multi-camera mode in operation 1807 .
- the processor 230 displays a preview image 1920 of the activated back camera device to overlap at least a part of the preview image 1910 of the front camera device based on the multi-camera mode.
- the processor 230 controls a size of the preview image 1920 of the back camera device in accordance with a drag distance. Referring to FIG.
- when a tap input for the displayed small preview image 1920 of the back camera device is detected, the processor 230 reverses display areas of the preview image 1910 of the front camera device and the preview image 1920 of the back camera device as indicated by reference numeral 1940 .
- the processor 230 updates a display location of the preview image of the front camera according to the drag input as indicated by reference numeral 1960 in FIG. 19 E .
- the electronic device 201 determines whether the multi-camera mode ends in operation 1809 . For example, when the drag input for the displayed small preview image is detected to move outside the area of the display 270 as indicated by reference numeral 1970 in FIG. 19 E , the processor 230 determines that the multi-camera mode ends.
- the electronic device 201 maintains the service screen using the multiple cameras displayed on the display 270 in operation 1807 .
- the electronic device 201 switches the camera mode of the electronic device 201 to a single camera mode and displays a service screen of a single camera device on the display 270 in operation 1811 .
- the processor 230 displays the preview screen of the back camera device on the display 270 , as indicated by reference numeral 1980 .
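The multi-camera flow above (operations 1801 to 1811) can be summarized as a small state machine: a tap in the control area enters the multi-camera mode, a tap on the small preview swaps the two previews, and dragging the small preview off screen returns to the single camera mode. The event names below are illustrative, not from the patent:

```python
# Minimal state sketch of the multi-camera mode flow. Event names
# ("tap_control_area", etc.) are hypothetical labels for the touch
# inputs described in operations 1801-1811.
class CameraMode:
    def __init__(self):
        self.mode = "single"   # one preview fills the display
        self.main = "front"    # camera shown full screen

    def handle(self, event):
        if self.mode == "single" and event == "tap_control_area":
            self.mode = "multi"    # overlay the other camera's preview
        elif self.mode == "multi" and event == "tap_small_preview":
            # swap full-screen and overlay previews (numeral 1940)
            self.main = "back" if self.main == "front" else "front"
        elif self.mode == "multi" and event == "drag_off_screen":
            self.mode = "single"   # end multi-camera mode (numeral 1970)
```

In the single camera mode after the swap, the back camera preview remains on the display, matching reference numeral 1980.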
- FIG. 20 is a flowchart of a process for providing an automatic photographing service in an electronic device, according to an embodiment of the present disclosure.
- FIGS. 21 A to 21 C illustrate a screen configuration for providing an automatic photographing service in an electronic device, according to an embodiment of the present disclosure.
- the electronic device may include the electronic device 201 or at least a part (for example, processor 230 ) of the electronic device 201 .
- the electronic device 201 displays a service screen (for example, preview image) of a camera application on the display 270 in operation 2001 .
- the processor 230 executes the camera application based on touch information of a camera control area through operations 401 to 407 of FIG. 4 or operations 1001 to 1009 of FIG. 10 .
- the processor 230 displays the preview image of the front camera device on the display 270 as indicated by reference numeral 2100 .
- the electronic device 201 determines whether an automatic photographing mode is set to the camera application in operation 2003 .
- the processor 230 determines whether an automatic photographing menu is set in an activated state based on input information detected through the input/output interface 260 .
- When the automatic photographing mode is not set to the camera application, the electronic device 201 terminates the operation for providing the automatic photographing service. In this case, the electronic device 201 captures an image based on a touch input of a photographing button displayed on the service screen of the camera application.
- the electronic device 201 displays motion information of the camera device 220 in at least some areas of the display 270 in operation 2005 .
- the processor 230 displays motion information of the camera device 220 corresponding to a location and an angle of the electronic device 201 based on the placement area of the camera device 220 as indicated by reference numeral 2110 in FIG. 21 A .
- the electronic device 201 determines whether a motion of the camera device 220 that matches a capturing event is detected in operation 2007 .
- the processor 230 determines whether motion information of the electronic device 201 that matches the location and angle of the electronic device 201 preset for image capturing is detected.
- the preset location and angle of the electronic device 201 may be set by a user's input or may include at least one of locations and angles of the electronic device 201 that match the image acquired through the front camera mode.
- the electronic device 201 displays changed motion information of the camera device 220 in at least some areas of the display 270 in operation 2005 .
- the processor 230 may change the motion information of the camera device 220 displayed based on the placement area of the camera device 220 according to a change in the location and angle of the electronic device 201 as indicated by reference numeral 2120 .
- the electronic device 201 captures the image by driving the camera device (for example, front camera device) in operation 2009 .
- the processor 230 displays matching information by the location and angle of the electronic device 201 to allow the user to recognize an automatic photographing time point as indicated by reference numeral 2130 .
- the processor 230 captures the image by using the front camera device.
- the processor 230 may acquire an amount of light for the image capturing or perform the image capturing and change a color of the display 270 at the same time for an image effect.
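The automatic photographing trigger (operation 2007) fires when the current motion of the electronic device matches the preset location and angle. A sketch of that match check, where the tolerances are assumptions (the patent only requires that the detected motion match the preset values):

```python
# Sketch of the pose match that triggers automatic capture
# (operation 2007). The 3-degree and 20-pixel tolerances are assumed.
ANGLE_TOL_DEG = 3.0
POS_TOL_PX = 20.0


def pose_matches(cur_angle, cur_pos, target_angle, target_pos):
    """True when both the device angle and position are within
    tolerance of the location and angle preset for image capturing."""
    dx = cur_pos[0] - target_pos[0]
    dy = cur_pos[1] - target_pos[1]
    return (abs(cur_angle - target_angle) <= ANGLE_TOL_DEG
            and (dx * dx + dy * dy) ** 0.5 <= POS_TOL_PX)
```

While this returns false, the displayed motion information keeps updating (reference numeral 2120); once it returns true, matching information is shown and the front camera device captures the image.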
- FIG. 22 is a flowchart of a process for controlling a camera device in accordance with a camera application in an electronic device, according to an embodiment of the present disclosure.
- FIGS. 23 A and 23 B illustrate a screen configuration for controlling a camera device in accordance with a camera application in an electronic device, according to an embodiment of the present disclosure.
- the electronic device may include the electronic device 201 or at least a part (for example, processor 230 ) of the electronic device 201 .
- the electronic device 201 determines whether a first type touch input for the camera control area is detected in operation 2201 .
- the processor 230 determines whether a drag input 2310 in a right direction of the placement area of the camera device 220 is detected.
- the electronic device 201 terminates the operation for setting the camera application. For example, when the drag input in a down direction for the camera control area is detected, the electronic device 201 executes one camera application that is set as a basic application among a plurality of applications installed in the electronic device 201 .
- the electronic device 201 determines at least one camera application installed in the electronic device 201 in operation 2203 .
- the processor 230 may extract camera application information stored in the memory 240 .
- the electronic device 201 displays a camera application list including at least one camera application installed in the electronic device 201 .
- the processor 230 displays icons of camera applications installed in the electronic device 201 on the display 270 such that the icons are output in the placement area of the camera device 220 .
- the electronic device 201 determines whether a first camera application which is one of the applications included in the camera application list is selected. For example, the processor 230 determines whether a touch input for one of the icons of the camera applications displayed in the camera control area is detected as illustrated in FIG. 23 B .
- When a selection input for the first camera application is not detected, the electronic device 201 maintains display of the camera application list in operation 2205 . In addition, when an input for the camera application list is not detected until a reference time passes from a time point when the camera application list is displayed, the electronic device 201 determines to not select the camera application for controlling the camera device 220 and terminates the operation.
- the electronic device 201 drives the camera device 220 based on the first camera application in operation 2209 .
- the processor 230 controls the camera device 220 by executing the first camera application. Accordingly, the processor 230 performs initial settings on the camera device 220 based on camera setting information set to the first camera application. In addition, the processor 230 sets the first camera application as a basic camera application of the electronic device 201 .
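Selecting a camera application from the list (operations 2205 to 2209) applies that application's initial settings and records it as the basic camera application. A hypothetical sketch, where the application names and settings dictionaries are illustrative:

```python
# Hypothetical camera application launcher (operations 2201-2209).
# Application names and their setting dictionaries are illustrative.
class CameraLauncher:
    def __init__(self, apps, default):
        self.apps = apps        # name -> camera setting information
        self.default = default  # basic camera application

    def launch(self, name=None):
        """Drive the camera with the selected application: apply its
        initial settings and set it as the basic camera application."""
        name = name or self.default
        if name not in self.apps:
            raise KeyError(name)
        self.default = name     # remember as the basic application
        return self.apps[name]  # initial settings for the camera
```

A drag in the down direction, which runs the basic application directly, corresponds to calling `launch()` with no argument.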
- FIG. 24 is a flowchart of a process for controlling a camera device in accordance with photographing setting information of an image in an electronic device, according to an embodiment of the present disclosure.
- FIG. 25 illustrates a screen configuration for controlling a camera device in accordance with photographing setting information of an image in an electronic device, according to an embodiment of the present disclosure.
- the electronic device may include the electronic device 201 or at least a part (for example, processor 230 ) of the electronic device 201 .
- the electronic device 201 displays a list of at least one image stored in the memory 240 of the electronic device 201 on the display 270 in operation 2401 .
- the processor 230 displays a thumbnail for at least one image stored in the memory 240 on the display 270 as indicated by reference numeral 2500 .
- the processor 230 displays corresponding filter information 2510 on an image to which a filter for an image effect is applied.
- the electronic device 201 determines whether a touch input for a first image in an image list displayed on the display 270 is detected in operation 2403 .
- the processor 230 determines whether a touch input 2520 for the first image in the image list 2500 displayed on the display 270 is detected, as illustrated in FIG. 25 .
- the electronic device 201 determines to not provide the camera service and terminates the operation for providing the camera service.
- the electronic device 201 determines whether the first image enters the camera control area in operation 2405 .
- the processor 230 determines whether the first image enters the camera control area through a drag input 2530 for the first image, as illustrated in FIG. 25 .
- the processor 230 determines that the first image enters the camera control area.
- the processor 230 determines whether a second touch input for determining a movement location of the first image is detected within the camera control area, as illustrated in FIG. 25 .
- the electronic device 201 determines whether the touch input for the first image is released in operation 2411 .
- the processor 230 determines whether the touch input 2520 for the first image is released outside the camera control area in FIG. 25 .
- the electronic device 201 determines whether the first image enters the camera control area again by the touch input in operation 2405 .
- the electronic device 201 determines to not provide the camera service and terminates the operation for providing the camera service. For example, the processor 230 changes a location of the first image to a location where the touch input for the first image is released.
- the electronic device 201 determines setting information of the camera device 220 set to capture the first image in operation 2407 .
- the setting information of the camera device 220 may include at least one of a filter for capturing the first image, image filter information, a photographing mode, and a photographing setting value (for example, aperture, shutter speed, and image size).
- the electronic device 201 may update the setting information of the camera device 220 based on the setting information of the camera device 220 set to capture the first image in operation 2409 .
- the processor 230 may perform initial settings on the camera device 220 in accordance with the setting information of the camera device 220 that has been set to capture the first image.
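Updating the camera from the setting information of the dragged image (operations 2407 to 2409) amounts to overlaying the photographing settings recorded with that image onto the current camera settings. A sketch under assumed metadata keys, mirroring the examples in the text (filter, photographing mode, aperture, shutter speed, image size):

```python
# Hypothetical transfer of photographing settings from a stored image
# to the camera device (operations 2407-2409). The metadata keys below
# mirror the examples in the text but are otherwise assumptions.
SETTING_KEYS = ("filter", "mode", "aperture", "shutter_speed", "image_size")


def settings_from_image(metadata, current):
    """Overlay the settings used to capture the image onto the current
    camera settings, leaving unrelated settings untouched."""
    updated = dict(current)
    for key in SETTING_KEYS:
        if key in metadata:
            updated[key] = metadata[key]
    return updated
```

The camera device's initial settings would then be performed from the returned dictionary.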
- FIG. 26 is a flowchart of a process for capturing an image based on touch information of a camera control area in an electronic device, according to an embodiment of the present disclosure.
- the electronic device may include the electronic device 201 or at least a part (for example, processor 230 ) of the electronic device 201 .
- the electronic device 201 determines whether a touch input is detected through a camera control area set based on the placement area of the camera device 220 on the touch screen in operation 2601 .
- the electronic device 201 determines whether image capturing matches the touch input detected through the camera control area in operation 2603 .
- the processor 230 determines whether a double tap input that matches an image capturing event is detected based on touch input matching information stored in the memory 240 .
- the electronic device 201 determines to not perform the image capturing. Accordingly, the electronic device 201 terminates the operation for the image capturing.
- the electronic device 201 captures the image through the camera device 220 without executing the camera application in operation 2605 .
- the processor 230 captures the image through the camera device 220 (for example, back camera device) in a state where the service screen of the application displayed on the display 270 is maintained. That is, the processor 230 captures the image in a state where a preview image acquired through the camera device 220 is not displayed.
- the processor 230 stores the captured image in the memory 240 .
- the processor 230 displays image capturing information on a notification bar.
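The double-tap match of operation 2603 can be sketched as a simple timing check on successive taps in the camera control area. The double-tap window below is an assumed value; the patent only requires that the detected touch input match the stored image capturing event:

```python
# Sketch of the double-tap detection that triggers capture without
# executing the camera application (operation 2603). The 300 ms
# double-tap window is an assumption.
DOUBLE_TAP_MS = 300


def is_double_tap(tap_times_ms):
    """True when the last two taps in the camera control area fall
    within the double-tap window."""
    if len(tap_times_ms) < 2:
        return False
    return (tap_times_ms[-1] - tap_times_ms[-2]) <= DOUBLE_TAP_MS
```

When this returns true, the image is captured in the background, stored in the memory, and reported on the notification bar, with the current application's service screen left on the display.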
- FIG. 27 is a flowchart of a process for photographing video based on touch information of a camera control area in an electronic device, according to an embodiment of the present disclosure.
- the electronic device may include the electronic device 201 or at least a part (for example, processor 230 ) of the electronic device 201 .
- the electronic device 201 determines whether a touch input is detected through a camera control area preset to control the camera device 220 on the touch screen in operation 2701 .
- the camera control area may include at least some areas of the touch screen including the placement area of the camera device 220 .
- the camera control area may include at least some areas of the touch screen adjacent to the placement area of the camera device 220 .
- the electronic device 201 determines whether video photographing matches the touch input detected through the camera control area in operation 2703 .
- the processor 230 determines whether a touch input having a touch maintaining time exceeding a reference time is detected through the camera control area based on touch input matching information stored in the memory 240 .
- the electronic device 201 determines to not perform the video photographing. Accordingly, the electronic device 201 terminates the operation for the video photographing.
- the electronic device 201 starts the video photographing through the camera device 220 without executing the camera application in operation 2705 .
- the processor 230 photographs the video through the camera device 220 (for example, back camera device) in a state where the service screen of the application displayed on the display 270 is maintained from a time point when the touch maintaining time of the touch input detected through the camera control area exceeds the reference time.
- the processor 230 outputs notification information to allow the user to recognize the video photographing operation.
- the notification information may include at least one of a notification sound, a notification message, and a vibration.
- the electronic device 201 determines whether the touch input that matches the video photographing is released in operation 2707 .
- the electronic device 201 may continuously photograph the video in operation 2705 .
- the electronic device 201 terminates the video photographing.
- the processor 230 may store the video photographed through the back camera device in the memory 240 .
- the processor 230 displays video photographing information on the notification bar.
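The video flow above (operations 2701 to 2707) records while the touch is held: recording starts once the touch maintaining time exceeds the reference time and stops when the touch is released. A minimal sketch, with an assumed reference time:

```python
# Minimal sketch of touch-held video photographing (operations
# 2701-2707). The 500 ms reference time is an assumed value.
REFERENCE_TIME_MS = 500


class TouchVideoRecorder:
    def __init__(self):
        self.recording = False

    def on_hold(self, held_ms):
        """Start recording (without executing the camera application)
        once the touch maintaining time exceeds the reference time."""
        if not self.recording and held_ms > REFERENCE_TIME_MS:
            self.recording = True

    def on_release(self):
        """Stop recording on touch release; returns True if a video
        was being photographed (it would then be stored and notified)."""
        stopped = self.recording
        self.recording = False
        return stopped
```

In practice the start would also emit the notification sound, message, or vibration mentioned above so the user recognizes the video photographing operation.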
- FIG. 28 is a flowchart of a process for displaying camera driving limit information in an electronic device, according to an embodiment of the present disclosure.
- FIG. 29 illustrates a screen configuration for displaying camera driving limit information in an electronic device, according to an embodiment of the present disclosure.
- the electronic device may include the electronic device 201 or at least a part (for example, processor 230 ) of the electronic device 201 .
- the electronic device 201 determines whether driving of the camera device 220 is limited in operation 2801 .
- the processor 230 may identify whether the driving of the camera device 220 is limited based on the type of an application being executed in the electronic device 201 .
- the processor 230 may identify whether the driving of the camera device 220 is limited based on a location of the electronic device 201 .
- the processor 230 determines whether an operation mode of the camera device 220 is set as an inactive mode based on input information detected through the input/output interface 260 .
- the electronic device 201 terminates the operation for displaying the camera driving limit information.
- the electronic device 201 displays camera driving limit information in the camera control area in operation 2803 .
- the processor 230 displays camera driving limit information 2900 (for example, red colored concentric circles) based on the placement area of the camera device 220 .
- the electronic device 201 determines whether a touch input is detected through the camera control area in a state where driving the camera device is limited in operation 2805 .
- the processor 230 determines whether a touch input for the camera driving limit information 2900 displayed in the camera control area is detected, as illustrated in FIG. 29 .
- the electronic device 201 executes a camera setting menu in operation 2807 .
- the processor 230 displays a camera setting menu for resetting access rights of the camera device 220 on the display 270 .
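The limit check of operation 2801 combines three conditions named above: the type of the application being executed, the location of the electronic device, and an explicitly set inactive mode. A hypothetical policy sketch, where the restricted application and zone sets are illustrative:

```python
# Illustrative policy check for "driving of the camera is limited"
# (operation 2801). The restricted app and zone sets are hypothetical.
RESTRICTED_APPS = {"secure_payment"}
RESTRICTED_ZONES = {"lab", "theater"}


def camera_limited(foreground_app, zone, inactive_mode):
    """True when camera driving should be limited, in which case the
    limit indicator is shown in the camera control area."""
    return (inactive_mode
            or foreground_app in RESTRICTED_APPS
            or zone in RESTRICTED_ZONES)
```

When this returns true, the red concentric-circle indicator 2900 is displayed, and a touch on it opens the camera setting menu.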
- FIG. 30 is a flowchart of a process for providing a video call service in an electronic device, according to an embodiment of the present disclosure.
- FIG. 31 illustrates a screen configuration for providing a video call service in an electronic device, according to an embodiment of the present disclosure.
- the electronic device may include the electronic device 201 or at least a part (for example, processor 230 ) of the electronic device 201 .
- the electronic device 201 determines whether a call connection request signal for the video call is received in operation 3001 .
- the processor 230 determines whether the corresponding call connection request signal is a call connection request signal corresponding to the video call service.
- the electronic device 201 terminates the operation for providing the video call service.
- the electronic device 201 displays video call reception information in a display area corresponding to the camera control area in operation 3003 .
- the processor 230 displays video call reception information 3100 to be output from the placement area of the camera device 220 on the display 270 .
- the electronic device 201 determines whether a touch input is detected through the camera control area in a state where the video call reception information is displayed. For example, the processor 230 determines whether a drag input in a first direction (for example, right direction) for the video call reception information 3100 displayed to be adjacent to the placement area of the camera device 220 is detected.
- the electronic device 201 may maintain display of the video call reception information in at least some areas of the display 270 corresponding to the camera control area in operation 3003.
- the electronic device 201 may determine to not accept the video call connection. In this case, the electronic device 201 displays video call connection failure information on the display 270 .
- When the touch input is detected through the camera control area in a state where the video call reception information is displayed, the electronic device 201 activates the front camera device and provides the video call service in operation 3007.
- the processor 230 determines that the user accepts the call connection for the video call. Accordingly, the processor 230 displays an image collected through the front camera device and an image received from a counterpart electronic device on the display 270 by executing the video call application.
- the processor 230 may determine that the user does not accept the call connection for the video call. Accordingly, the processor 230 may block the call connection for the video call.
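The accept/reject decision described above can be sketched as follows. The function name and the returned action strings are illustrative assumptions; the disclosure only states that a drag in a first direction (for example, the right direction) on the reception information accepts the call, while a different input may reject it.

```python
def on_reception_info_drag(dx: int):
    """Decide how to handle an incoming video call from a drag on the
    video call reception information shown next to the camera placement
    area. dx is the horizontal displacement of the drag in pixels."""
    if dx > 0:        # first direction (e.g., right): accept the call
        return "activate_front_camera_and_start_call"
    if dx < 0:        # opposite direction: reject the call
        return "block_call_connection"
    return "keep_showing_reception_info"
```

A rightward drag would start the video call with the front camera active, while a leftward drag would block the connection, matching the two outcomes described above.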
- FIG. 32 is a flowchart of a process for displaying human body recognition service information in an electronic device, according to an embodiment of the present disclosure.
- FIGS. 33 A to 33 D illustrate a screen configuration for displaying human body recognition service information in an electronic device, according to an embodiment of the present disclosure.
- the electronic device may include the electronic device 201 or at least a part (for example, processor 230 ) of the electronic device 201 .
- the electronic device 201 determines whether human body recognition is performed through the camera device 220 of the electronic device 201 in operation 3201 .
- the processor 230 may determine whether an iris recognition menu for unlocking the electronic device 201 is selected using the camera device 220 (for example, front camera device).
- the processor 230 may determine whether a face recognition menu for authenticating the user of the electronic device 201 is selected using the camera device 220 (for example, front camera device).
- the electronic device 201 terminates the operation for providing the human body recognition service.
- the electronic device 201 displays time information spent for the human body recognition in a display area corresponding to the camera control area in operation 3203 .
- the processor 230 displays time information 3300 spent for the iris recognition based on the placement area of the camera device 220 .
- the time spent for the human body recognition may include a minimum time during which the human body recognition (for example, iris recognition) can be completed through the camera device 220 .
- the electronic device 201 determines whether the time spent for the human body recognition expires in operation 3205 .
- the processor 230 determines whether an elapsed time from a time point when the human body recognition starts is the same as the time spent for the human body recognition.
- the electronic device 201 displays elapsed time information for the human body recognition in the display area corresponding to the camera control area in operation 3211 .
- the processor 230 displays elapsed time information 3310 of the iris recognition to overlap the time information 3300 spent for the iris recognition displayed based on the placement area of the camera device 220 .
- the electronic device 201 determines again whether the time spent for the human body recognition expires in operation 3205 .
- the electronic device 201 determines whether the human body recognition is successful in operation 3207 .
- the processor 230 determines that the iris recognition is completed. Accordingly, the processor 230 determines whether the authentication of the user is successful based on a result of the iris recognition. For example, the processor 230 determines whether iris information detected through the iris recognition matches iris information preset in the memory 240 .
- the electronic device 201 determines that the authentication of the user through the human body recognition fails. Accordingly, the electronic device 201 terminates the operation for providing the human body recognition service.
- the electronic device 201 displays human body recognition failure information on the display 270 .
- the electronic device 201 may unlock the electronic device 201 in operation 3209 .
- the processor 230 releases a lock function of the electronic device 201 and displays a standby screen 3330 on the display 270 .
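The timed recognition flow above (operations 3205 to 3211) can be sketched as a loop that updates the elapsed-time display near the camera until the allotted time expires, then checks the recognition result. The helper parameters (`capture_iris`, `matches_enrolled`, `time_budget`) are assumptions; the patent describes the flow, not an API.

```python
import time

def run_iris_recognition(capture_iris, matches_enrolled, time_budget=5.0,
                         show_elapsed=lambda t: None, clock=time.monotonic):
    """Loop until the time spent for recognition expires, overlapping the
    elapsed-time display (3310) on the budget display (3300); then unlock
    on a successful match or report failure. A real implementation would
    sleep briefly between display updates."""
    start = clock()
    while True:
        elapsed = clock() - start
        if elapsed >= time_budget:          # time spent for recognition expired
            break
        show_elapsed(elapsed)               # operation 3211: update elapsed time
    sample = capture_iris()
    if sample is not None and matches_enrolled(sample):
        return "unlock"                     # release lock, show standby screen 3330
    return "recognition_failed"             # display recognition failure information
```

The `matches_enrolled` callback stands in for comparing detected iris information against iris information preset in the memory 240, as described above.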
- FIG. 34 is a flowchart of a process for displaying pollution level information of a camera device in an electronic device, according to an embodiment of the present disclosure.
- FIG. 35 illustrates a screen configuration for displaying pollution level information of a camera device in an electronic device, according to an embodiment of the present disclosure.
- the electronic device may include the electronic device 201 or at least a part (for example, processor 230 ) of the electronic device 201 .
- the electronic device 201 drives the camera device 220 disposed on some areas (for example, upper area) of the display 270 in operation 3401 .
- the processor 230 executes the camera application based on touch information of a camera control area through operations 401 to 407 of FIG. 4 or operations 1001 to 1009 of FIG. 10 .
- the processor 230 controls the camera device 220 (for example, front camera device) through the camera application.
- the electronic device 201 captures an image through the camera device 220 disposed on some areas of the display 270 in operation 3403 .
- the processor 230 captures an image through the front camera device 120 disposed in the upper area of the display 270 .
- the pollution level measuring event may be periodically generated or may be generated at a time point when the camera device is driven.
- the electronic device 201 may detect a pollution level of a camera lens through the image capture through the camera device 220 in operation 3405 .
- the processor 230 may estimate the definition of the image acquired through the front camera device.
- the processor 230 may detect the pollution level of the camera lens corresponding to the definition of the image.
- the electronic device 201 determines whether the pollution level of the camera lens exceeds a reference pollution level in operation 3407.
- the reference pollution level may be set, by the user, as a reference value of a pollution level which can influence a quality of the image acquired through the camera device 220 or may include a fixed value.
- the electronic device 201 determines that the pollution level of the camera lens does not influence the quality of the image acquired through the camera device 220 . Accordingly, the electronic device 201 terminates the operation for displaying the pollution information of the camera lens.
- the electronic device 201 displays pollution information of the camera lens in the display area corresponding to the camera control area to allow the user to recognize the pollution level of the camera lens in operation 3409 .
- the processor 230 displays pollution information 3500 of the camera lens based on the placement area of the camera device 220 to prompt the user to clean the camera lens.
- the electronic device 201 may display an amount of the pollution level of the camera lens based on the placement area of the camera device 220 .
- the processor 230 may display the amount of the pollution level of the camera lens through the number of concentric circles based on the placement area of the camera device 220 .
- the processor 230 may increase the number of concentric circles displayed in the placement area of the camera device 220 as the pollution level of the camera lens increases.
- the processor 230 may not display the concentric circle indicating the pollution level of the camera lens.
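The pollution check above (operations 3403 to 3409) can be sketched as follows. The disclosure derives the pollution level from the definition (sharpness) of a captured image; the variance of a Laplacian filter used here is one common sharpness proxy and an assumption, as are the threshold values.

```python
def laplacian_variance(gray):
    """Sharpness proxy: variance of a 4-neighbour Laplacian over a
    grayscale image given as a list of rows of pixel values."""
    vals = []
    for y in range(1, len(gray) - 1):
        for x in range(1, len(gray[0]) - 1):
            lap = (gray[y-1][x] + gray[y+1][x] + gray[y][x-1]
                   + gray[y][x+1] - 4 * gray[y][x])
            vals.append(lap)
    mean = sum(vals) / len(vals)
    return sum((v - mean) ** 2 for v in vals) / len(vals)

def pollution_level(gray, clean_sharpness):
    """0.0 (clean lens) .. 1.0 (fully blurred) relative to the sharpness
    of a reference image captured with a clean lens."""
    s = laplacian_variance(gray)
    return max(0.0, min(1.0, 1.0 - s / clean_sharpness))

def concentric_circles(level, reference=0.3, max_circles=3):
    """Number of concentric circles drawn around the camera placement
    area; no circle is displayed when the level does not exceed the
    reference pollution level."""
    if level <= reference:
        return 0
    step = (1.0 - reference) / max_circles
    return min(max_circles, 1 + int((level - reference) / step))
```

As described above, a more serious pollution level maps to more concentric circles, and a level at or below the reference produces no indicator at all.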
- An electronic device including a camera device disposed at a location overlapping at least a partial area of a display and an operation method thereof control the camera device based on an adjacent area including a placement area of the camera device or an adjacent area close to the placement area of the camera device, so that a user of the electronic device can easily drive and control a camera application.
- An electronic device including a camera device disposed at a location overlapping at least a partial area of a display and an operation method thereof display control information related to the camera device in an adjacent area including a placement area of the camera device or an adjacent area close to the placement area of the camera device, so that a user of the electronic device can easily detect an operational state of the camera device and a natural photo can be taken by guiding the user's eyes toward the camera lens.
- The term “module” may refer to a unit including one of hardware, software, and firmware, or a combination of two or more of them.
- the term “module” may be interchangeably used with the term “unit”, “logic”, “logical block”, “component”, or “circuit”.
- the “module” may be a minimum unit of an integrated component element or a part thereof.
- the “module” may be a minimum unit for performing one or more functions or a part thereof.
- the “module” may be mechanically or electronically implemented.
- the “module” may include at least one of an application-specific integrated circuit (ASIC) chip, a field-programmable gate array (FPGA), and a programmable-logic device for performing operations which have been known or are to be developed hereinafter.
- At least some of the devices may be implemented by instructions stored in a computer-readable storage medium in a program module form.
- The instructions, when executed by the processor 230, may cause the processor 230 to execute the function corresponding to the instruction.
- the computer-readable storage medium may be the memory 240 .
- the computer-readable storage medium may include a hard disk, a floppy disk, magnetic media (for example, a magnetic tape), optical media (for example, a compact disc read only memory (CD-ROM) and a digital versatile disc (DVD)), magneto-optical media (for example, a floptical disk), a hardware device (for example, a read only memory (ROM), a random access memory (RAM), a flash memory), and the like.
- the instructions may include high class language codes, which can be executed in a computer by using an interpreter, as well as machine codes made by a compiler. Any of the hardware devices as described above may be configured to work as one or more software modules in order to perform the operations according to various embodiments of the present disclosure, and vice versa.
- modules or programming modules may include at least one of the above described elements, exclude some of the elements, or further include other additional elements.
- the operations performed by the modules, programming module, or other elements according to various embodiments of the present disclosure may be executed in a sequential, parallel, repetitive, or heuristic manner. Further, some operations may be executed according to another order or may be omitted, or other operations may be added.
Abstract
An electronic device and a method for controlling a camera device in the electronic device are provided. The electronic device includes a display, a camera device disposed at a location overlapping a partial area of the display, and a processor configured to receive, via a touch screen, touch input on a portion of the touch screen, a contact area of the received touch input including at least a part of an area where a hole is formed, perform an operation related with the camera based on the touch input, and display an image received through the camera on the touch screen based on the touch input.
Description
- This application is a Continuation application of U.S. patent application Ser. No. 17/104,980, filed Nov. 25, 2020, now U.S. Pat. No. 11,516,380, issued Nov. 29, 2022, which is a Continuation application of U.S. patent application Ser. No. 15/407,943, filed Jan. 17, 2017, which claims priority under 35 U.S.C. § 119(a) to Korean Patent Application Serial No. 10-2016-0005293, which was filed in the Korean Intellectual Property Office on Jan. 15, 2016, the entire content of each of which is incorporated herein by reference.
- The present disclosure relates generally to an apparatus and a method for controlling a camera device in an electronic device.
- With the development of information, communication, and semiconductor technologies, various types of electronic devices have developed into devices that provide various multimedia services. For example, portable electronic devices may provide various services such as broadcast services, wireless Internet services, camera services, and music playback services.
- The electronic device may provide the camera services through a plurality of camera devices to meet various user demands. For example, the electronic device may acquire images or videos through a front camera device disposed on the front surface of the electronic device and a back camera device disposed on the back surface.
- The electronic device may provide the camera service to a user of the electronic device by executing a camera application to control the plurality of camera devices. However, the user of the electronic device may feel inconvenience due to multiple controls for execution of the camera application. For example, when the user of the electronic device uses the camera service through an electronic device in which a message application is being executed, the user may feel inconvenience in executing the camera application through a second control after making the electronic device enter a standby mode through a first control. As another example, when the user of the electronic device uses the camera service through a locked electronic device, the user may feel inconvenience in executing the camera application through a second control after unlocking the electronic device through a first control.
- The present disclosure has been made to address at least the above-mentioned problems and/or disadvantages and to provide at least the advantages described below.
- Accordingly, an aspect of the present disclosure is to provide an electronic device and a method for easily controlling a camera device in an electronic device.
- Accordingly, another aspect of the present disclosure is to provide an electronic device including a camera device disposed at a location overlapping at least a partial area of a display and a method for controlling the camera device based on an adjacent area including a placement area of the camera device or an adjacent area close to the placement area of the camera device, so that a user of the electronic device can easily drive and control a camera application.
- Accordingly, another aspect of the present disclosure is to provide an electronic device including a camera device disposed at a location overlapping at least a partial area of a display and a method for displaying control information related to a camera device in an adjacent area including a placement area of the camera device or an adjacent area close to the placement area of the camera device, so that a user of the electronic device can easily detect an operational state of the camera device and a photo can be taken by inducing a user's eyes in a direction of the camera lens.
- In accordance with an aspect of the present disclosure, an electronic device is provided. The electronic device includes a display, a camera device disposed at a location overlapping a partial area of the display, and a processor configured to receive, via a touch screen, touch input on a portion of the touch screen, a contact area of the received touch input including at least a part of an area where a hole is formed, perform an operation related with the camera based on the touch input, and display an image received through the camera on the touch screen based on the touch input.
- In accordance with another aspect of the present disclosure, a method of operating an electronic device comprising a camera configured to capture an image through a hole formed in a layer of a touch screen is provided. The method includes receiving, via the touch screen, a touch input on a portion of the touch screen, a contact area of the received touch input including at least a part of an area where the hole is formed, performing an operation related with the camera based on the touch input, and displaying an image received through the camera on the touch screen based on the touch input.
- The above and other aspects, features, and advantages of the present disclosure will be more apparent from the following detailed description taken in conjunction with the accompanying drawings, in which:
- FIGS. 1A and 1B illustrate a configuration of an electronic device, according to an embodiment of the present disclosure;
- FIG. 2 is a block diagram of an electronic device in a network environment, according to an embodiment of the present disclosure;
- FIG. 3 is a flowchart of a process for controlling a camera device in an electronic device, according to an embodiment of the present disclosure;
- FIG. 4 is a flowchart of a process for controlling a camera device in an electronic device when a screen of the electronic device is turned off, according to an embodiment of the present disclosure;
- FIG. 5 is a flowchart of a process for detecting touch information of a camera control area in an electronic device, according to an embodiment of the present disclosure;
- FIGS. 6A to 6D illustrate a screen configuration for controlling a camera device in an electronic device when a screen of the electronic device is turned off, according to an embodiment of the present disclosure;
- FIG. 7 is a flowchart of a process for configuring a camera display area in an electronic device based on touch information of a camera control area, according to an embodiment of the present disclosure;
- FIG. 8 is a flowchart of a process for controlling a camera device in an electronic device when a camera application is linked with another application, according to an embodiment of the present disclosure;
- FIGS. 9A to 9E illustrate a screen configuration for controlling a camera device in an electronic device when a camera application is linked with another application, according to an embodiment of the present disclosure;
- FIG. 10 is a flowchart of a process for controlling a camera device in an electronic device based on touch maintaining information of a camera control area, according to an embodiment of the present disclosure;
- FIGS. 11A and 11B illustrate a screen configuration for controlling a camera device in an electronic device based on touch maintaining information of a camera control area, according to an embodiment of the present disclosure;
- FIG. 12 is a flowchart of a process for setting a timer of a camera device based on touch information of a camera control area in an electronic device, according to an embodiment of the present disclosure;
- FIGS. 13A to 13C illustrate a screen configuration for setting a timer of a camera device based on touch information of a camera control area in an electronic device, according to an embodiment of the present disclosure;
- FIG. 14 is a flowchart of a process for providing a flash effect in an electronic device, according to an embodiment of the present disclosure;
- FIGS. 15A to 15D illustrate a screen configuration for providing a flash effect in an electronic device, according to an embodiment of the present disclosure;
- FIG. 16 is a flowchart of a process for providing a camera service through an application in an electronic device, according to an embodiment of the present disclosure;
- FIGS. 17A and 17B illustrate a screen configuration for providing a camera service through an application in an electronic device, according to an embodiment of the present disclosure;
- FIG. 18 is a flowchart of a process for providing a multi-camera service in an electronic device, according to an embodiment of the present disclosure;
- FIGS. 19A to 19F illustrate a screen configuration for providing a multi-camera service in an electronic device, according to an embodiment of the present disclosure;
- FIG. 20 is a flowchart of a process for providing an automatic photographing service in an electronic device, according to an embodiment of the present disclosure;
- FIGS. 21A to 21C illustrate a screen configuration for providing an automatic photographing service in an electronic device, according to an embodiment of the present disclosure;
- FIG. 22 is a flowchart of a process for controlling a camera device in accordance with a camera application in an electronic device, according to an embodiment of the present disclosure;
- FIGS. 23A and 23B illustrate a screen configuration for controlling a camera device in accordance with a camera application in an electronic device, according to an embodiment of the present disclosure;
- FIG. 24 is a flowchart of a process for controlling a camera device in accordance with photographing setting information of an image in an electronic device, according to an embodiment of the present disclosure;
- FIG. 25 illustrates a screen configuration for controlling a camera device in accordance with photographing setting information of an image in an electronic device, according to an embodiment of the present disclosure;
- FIG. 26 is a flowchart of a process for capturing an image based on touch information of a camera control area in an electronic device, according to an embodiment of the present disclosure;
- FIG. 27 is a flowchart of a process for capturing video based on touch information of a camera control area in an electronic device, according to an embodiment of the present disclosure;
- FIG. 28 is a flowchart of a process for displaying camera driving limit information in an electronic device, according to an embodiment of the present disclosure;
- FIG. 29 illustrates a screen configuration for displaying camera driving limit information in an electronic device, according to an embodiment of the present disclosure;
- FIG. 30 is a flowchart of a process for providing a video call service in an electronic device, according to an embodiment of the present disclosure;
- FIG. 31 illustrates a screen configuration for providing a video call service in an electronic device, according to an embodiment of the present disclosure;
- FIG. 32 is a flowchart of a process for displaying human body recognition service information in an electronic device, according to an embodiment of the present disclosure;
- FIGS. 33A to 33D illustrate a screen configuration for displaying human body recognition service information in an electronic device, according to an embodiment of the present disclosure;
- FIG. 34 is a flowchart of a process for displaying pollution level information of a camera device in an electronic device, according to an embodiment of the present disclosure; and
- FIG. 35 illustrates a screen configuration for displaying pollution level information of a camera device in an electronic device, according to an embodiment of the present disclosure.
- Hereinafter, various embodiments of the present disclosure will be described with reference to the accompanying drawings. In the following description, specific details, such as detailed configuration and components, are merely provided to assist the overall understanding of these embodiments of the present disclosure. Therefore, it should be apparent to those skilled in the art that various changes and modifications of the embodiments described herein can be made without departing from the scope and spirit of the present disclosure. In addition, descriptions of well-known functions and constructions are omitted for clarity and conciseness. In describing the drawings, similar reference numerals may be used to designate similar elements.
- The terms “have” or “include” used in describing the various embodiments of the present disclosure indicate the presence of disclosed corresponding functions, operations, elements, features, numbers, steps, and the like, and do not limit the addition of one or more functions, operations, elements, features, numbers, steps and the like. The terms “A or B”, “at least one of A or/and B” or “one or more of A or/and B” used in the various embodiments of the present disclosure include any and all combinations of words enumerated with it. For example, “A or B”, “at least one of A and B” or “at least one of A or B” means (1) including A, (2) including B, or (3) including both A and B.
- Although terms such as “first” and “second” used herein may modify various elements of various embodiments, these terms do not limit the corresponding elements. For example, these terms do not limit an order and/or importance of the corresponding elements. These terms may be used for the purpose of distinguishing one element from another element. For example, a first user device and a second user device both indicate user devices and may indicate different user devices. For example, a first element may be referred to as a second element without departing from the scope of the present disclosure, and similarly, a second element may be referred to as a first element.
- It should be understood that when an element (e.g., a first element) is “connected” or “coupled” to another element (e.g., a second element), the first element may be directly connected or coupled to the second element, or there may be an intervening element (e.g., a third element) between the first element and the second element. To the contrary, it will be understood that when an element (e.g., a first element) is “directly connected” or “directly coupled” to another element (e.g., a second element), there is no intervening element (e.g., a third element) between the first element and the second element.
- The expressions “configured to” or “set to” used in describing various embodiments of the present disclosure may be used interchangeably with “suitable for”, “having the capacity to”, “designed to”, “adapted to”, “made to”, or “capable of” according to the situation. The terms “configured to” or “set to” do not necessarily mean “specifically designed to” in a hardware level. Instead, the expression “apparatus configured to . . . ” may mean that the apparatus is “capable of . . . ” along with other devices or parts in a certain situation. For example, “a processor configured to (set to) perform A, B, and C” may be a dedicated processor, e.g., an embedded processor, for performing a corresponding operation, or a generic-purpose processor, e.g., a central processing unit (CPU) or an application processor (AP), capable of performing a corresponding operation by executing one or more software programs stored in a memory device.
- The terms as used herein are used merely to describe certain embodiments and are not intended to limit the present disclosure. As used herein, singular forms may include plural forms as well unless the context explicitly indicates otherwise. Further, all the terms used herein, including technical and scientific terms, should be interpreted to have the same meanings as commonly understood by those skilled in the art to which the present disclosure pertains, and should not be interpreted to have ideal or excessively formal meanings unless explicitly defined in various embodiments of the present disclosure.
- An electronic device according to various embodiments of the present disclosure may include at least one of a smartphone, a tablet personal computer (PC), a mobile phone, a video phone, an electronic book (e-book) reader, a laptop PC, a netbook computer, a workstation, a personal digital assistant (PDA), a portable multimedia player (PMP), an MP3 player, a mobile medical appliance, a camera, and a wearable device (e.g., smart glasses, a head-mounted-device (HMD), electronic clothes, an electronic bracelet, an electronic necklace, an electronic appcessory, an electronic tattoo, a smart mirror, or a smart watch).
- According to another embodiment, the electronic device may include at least one of various medical devices (e.g., various portable medical measuring devices (a blood glucose monitoring device, a heart rate monitoring device, a blood pressure measuring device, a body temperature measuring device, etc.)), a navigation device, a global positioning system (GPS) receiver, an event data recorder (EDR), a flight data recorder (FDR), a vehicle infotainment device, an electronic device for a ship (e.g., a navigation device, and a gyro-compass), an avionics device, a security device, or Internet of Things (IoT) device (e.g., a light bulb, various sensors, an electric or gas meter, a sprinkler device, a fire alarm, a thermostat, a streetlamp, a toaster, a sporting goods, a hot water tank, a heater, a boiler, etc.).
- The electronic device according to some embodiments of the present disclosure may be a flexible device. Further, the electronic device according to an embodiment of the present disclosure is not limited to the aforementioned devices, and may include a new electronic device according to the development of technology.
- Hereinafter, an electronic device according to various embodiments will be described with reference to the accompanying drawings. As used herein, the term “user” may indicate a person who uses an electronic device or a device (e.g., an artificial intelligence electronic device) that uses an electronic device.
-
FIGS. 1A and 1B illustrate a configuration of an electronic device, according to an embodiment of the present disclosure. - Referring to
FIGS. 1A and 1B , anelectronic device 100 is provided. Theelectronic device 100 may be configured as one body. For example, theelectronic device 100 may be an electronic device for communication including aspeaker device 130 and amicrophone device 140 for a voice call. - The
electronic device 100 may have a front surface configured by atouch screen 110. For example, acamera device 120 may be disposed on at least some areas of thetouch screen 110. - The
speaker device 130 may be disposed on at least one surface adjacent to the touch screen 110 (for example, an upper side surface, lower side surface, left side surface, and right side surface). For example, thespeaker device 130 may be disposed on the upper side surface adjacent to thetouch screen 110 close to the user's ear for a voice call. - Control buttons (for example, a home button and a back button) for controlling the
electronic device 100 may be displayed in a lower area of thetouch screen 110. - The
touch screen 110 of theelectronic device 100 may include afront window 140, atouch panel 150, adisplay module 160, and a printed circuit board (PCB) 170, as illustrated inFIG. 1B . For example, the camera device (for example, a front camera device) 180 of theelectronic device 100 may be mounted on the PCB. For example, thefront window 140 may be a transparent material window film that forms an external surface of thetouch screen 110. For example, thePCB 170 may use a flexible PCB (FPCB), which is an electronic component made by forming a conductive circuit having good electrical conductivity (e.g., cooper) on an insulator. - According to an embodiment, the
camera device 180 may be disposed at a position overlapping at least someareas 152 of thetouch panel 150. For example, at least someareas 152 of thetouch panel 150 on which thecamera device 180 is disposed may be perforated. In this case, thetouch screen 110 may have a limited touch recognition function in thearea 152 on which thecamera device 180 is disposed. Alternatively, at least someareas 152 of thetouch panel 150 on which thecamera device 180 is disposed are not perforated, and a touch pattern for touch recognition may be omitted in the correspondingareas 152. In this case, thetouch screen 110 may have a limited touch recognition function in thearea 152 on which thecamera device 180 is disposed. Alternatively, at least someareas 152 of thetouch panel 150 on which thecamera device 180 is disposed are not perforated, and the touch pattern for touch recognition may be set in the correspondingareas 152. In this case, thetouch screen 110 may detect a touch input through theareas 152 on which thecamera device 180 is disposed. For example, the touch pattern may include an electrode for the touch recognition. - According to an embodiment, the
camera device 180 may be disposed at a position overlapping some areas 162 of the display module 160. For example, at least some areas 162 of the display module 160 on which the camera device 180 is disposed may be perforated. In this case, the touch screen 110 may have a limited display function on the areas 162 in which the camera device 180 is disposed. Alternatively, at least some areas 162 of the display module 160 on which the camera device 180 is disposed are not perforated, and a display component may not be disposed in the corresponding areas 162. In this case, the touch screen 110 may have a limited display function on the areas 162 in which the camera device 180 is disposed. Alternatively, at least some areas 162 of the display module 160 on which the camera device 180 is disposed may not be perforated, and a display component may be disposed. In this case, the touch screen 110 may display information through the areas 162 in which the camera device 180 is disposed. - The
electronic device 100 may form at least one hole in at least some areas (for example, an upper end) of the touch screen 110 and place the speaker device 130 for a voice call service in the at least one hole. -
FIG. 2 is a block diagram of an electronic device in a network environment, according to an embodiment of the present disclosure. - Referring to
FIG. 2, an electronic device 201 is provided. The electronic device 201 may include a bus 210, a camera device 220, a processor 230 (e.g., including processing circuitry), a memory 240, an input/output interface 260 (e.g., including input/output circuitry), a display 270 (e.g., including display circuitry), and a communication interface 280 (e.g., including communication circuitry). In some embodiments, the electronic device 201 may omit at least one of the elements, or may further include other elements. - The
bus 210 is a circuit that interconnects the elements 220 to 280 and transfers communication (for example, control messages and/or data) between the elements. - The
camera device 220 may collect image information of a subject. For example, the camera device 220 may include a plurality of camera devices included in the electronic device 201. For example, the camera device 220 may include a first camera device (for example, a front camera device) for performing photography in a selfie mode and a second camera device (for example, a back camera device) for photographing a subject located in front of the user. For example, the camera device 220 may be disposed to be included in at least some areas of the display 270. For example, an image sensor of the first camera device may be disposed in at least some areas of the display 270. For example, the image sensor may be a charge-coupled device (CCD) image sensor or a complementary metal-oxide semiconductor (CMOS) image sensor. - The
processor 230 may include one or more of a central processing unit (CPU), an application processor (AP), and a communication processor (CP). For example, the processor 230 may execute calculations or data processing relating to control and/or communication of at least one other element of the electronic device 201. The processor 230 may perform various functions of the electronic device 201. Accordingly, the processor 230 may control the elements of the electronic device 201. - The
processor 230 may control the camera device 220 based on touch information of a preset camera control area. For example, when some areas of the touch panel are perforated to place the camera device 220, the processor 230 may set at least some areas adjacent to the placement areas of the camera device 220 on the touch panel as camera control areas. For example, when some areas of the touch panel corresponding to the placement areas of the camera device 220 are not perforated and a touch pattern is omitted in the corresponding areas, the processor 230 may set at least some areas adjacent to the placement areas of the camera device 220 on the touch panel as camera control areas. Alternatively, when some areas of the touch panel corresponding to the placement areas of the camera device 220 are not perforated and the touch pattern is set in the corresponding areas, the processor 230 may set the placement areas of the camera device 220 on the touch panel and at least some areas adjacent to the placement areas of the camera device 220 as camera control areas. - The
processor 230 may drive the camera device 220 based on a touch and a drag input in the camera control area. For example, when the touch input in the camera control area is detected, the processor 230 may control the display 270 to display camera activation information in a display area corresponding to the camera control area. For example, when the display 270 is deactivated, the processor 230 may maintain the touch recognition function of the camera control area in an active state. Accordingly, the processor 230 may detect the touch input in the camera control area in an inactive state of the display 270. When the drag input of the camera activation information is detected, the processor 230 may execute a camera application to start a front camera mode. For example, the processor 230 may set a camera display area to display a service screen of the camera application based on a distance of the drag input. The processor 230 may control the display 270 to display the service screen of the camera application (for example, a preview image acquired through the camera device 220) in the camera display area. - The
processor 230 may drive the camera device 220 based on a touch and a touch maintaining time in the camera control area. For example, when the touch input in the camera control area is detected, the processor 230 may control the display 270 to display camera activation information in the display area corresponding to the placement area of the camera device 220. For example, when the touch maintaining time of the camera activation information exceeds a reference time, the processor 230 may execute the camera application to start the front camera mode. In this case, the processor 230 may control the display 270 to display the preview image acquired through the front camera device. - The
processor 230 may control the camera application to be linked with another application. For example, when the touch and the drag input in the camera control area are detected in a state where a service screen of another application is displayed, the processor 230 may display the service screen of the camera application in at least some areas of the display 270 based on a distance of the drag input. That is, the processor 230 may divide the display 270 into a first area and a second area based on the distance of the drag input. The processor 230 may control the display 270 to display a service screen of another application in the first area of the display 270 and to display a service screen of the camera application in the second area. When an image is captured (or acquired) through the camera application, the processor 230 may determine whether the camera application can be linked with the other application. When the camera application can be linked with the other application, the processor 230 may set the image captured through the camera application as contents to be controlled in the other application. When the camera application cannot be linked with the other application, the processor 230 may store the image captured through the camera application in the memory 240. For example, the processor 230 may end the camera application when the image is captured. - The
processor 230 may set a timer of the camera device 220 to capture an image based on touch information (for example, at least one of the touch input and the drag input) in the camera control area. For example, when the drag input in the camera control area is detected while the camera application is executed (for example, in the front camera mode), the processor 230 may set the timer of the camera device 220 to correspond to a drag distance. That is, the processor 230 may set a time of the timer in proportion to the drag distance. The processor 230 may control the display 270 to display timer information based on the placement area of the camera device 220. The processor 230 may continuously reduce a size of the timer information displayed on the display 270 as the time of the timer elapses. When the display of the timer information is removed from the display 270, the processor 230 may capture an image. For example, when the touch input in the camera control area is detected while the camera application is executed (for example, in the front camera mode), the processor 230 may set the timer of the camera device 220 based on a touch position. That is, the processor 230 may set the time of the timer in proportion to a distance between the placement area of the camera device 220 and the touch position. For example, the timer of the camera device 220 may include a photographing timer of the camera device 220 to capture an image. - The
processor 230 may change a color of the display 270 to secure an amount of light to capture the image. For example, when the image is captured through the front camera device, the processor 230 may change the color of the display 270 into a bright color (for example, white) based on the placement area of the camera device 220 and provide a flash effect. For example, the processor 230 may apply various image effects by changing the color of the display 270 in accordance with a user input. - The
processor 230 may control the display 270 to display additional information for a camera service through the camera control area. For example, when the image is captured through the front camera device, the processor 230 may control the display 270 to display a graphic effect (for example, a wavelength image) based on the placement area of the camera device 220 to induce a user's eyes to the front camera device. For example, when video is photographed through the back camera device, the processor 230 may control the display 270 to display audio input information based on the placement area of the camera device 220. For example, the processor 230 may control a size of the audio input information to correspond to a size of an audio signal collected through the microphone device 140 while the video is photographed. - The
processor 230 may execute the camera application based on touch information of an application icon. For example, when at least one of a touch input and a drag input for the application icon is detected, the processor 230 may identify whether the application icon enters the camera control area. When the application icon enters the camera control area, the processor 230 may identify whether an application corresponding to the application icon is linked with the camera application. When the application corresponding to the application icon is linked with the camera application, the processor 230 may execute a camera function (for example, the front camera mode) of the application corresponding to the application icon. For example, when the touch input for the application icon is released within the camera control area, the processor 230 may determine that the application icon enters the camera control area. - The
processor 230 may execute a multi-camera mode based on touch information of the camera control area. For example, the processor 230 may provide a camera service of one of a plurality of camera devices. When a tap input in the camera control area is detected while the camera service is provided, the processor 230 may switch to the multi-camera mode in which the plurality of camera devices are simultaneously activated. For example, when a tap input in the camera control area is detected while the front camera mode is executed, the processor 230 may additionally activate the back camera device and execute the multi-camera mode. In this case, the display 270 may display the preview images acquired through the front camera device and the back camera device in an overlapping manner or in different areas. In addition, when a tap input in the camera control area is detected while the multi-camera mode is executed, the processor 230 may switch positions of the preview images. The processor 230 may control sizes of the preview images based on input information detected through the input/output interface 260. - The
processor 230 may provide an automatic photographing service based on at least one of a location and an angle of the electronic device 201 in the front camera mode. For example, when the automatic photographing mode is set, the processor 230 may display a camera image corresponding to the location and the angle of the electronic device 201 to be adjacent to the placement area of the camera device 220. That is, the processor 230 may display the camera image corresponding to the location and the angle of the electronic device 201 to allow the user to control the location and the angle of the electronic device 201 to match photographing information. When the location and the angle of the electronic device 201 match the photographing information, the processor 230 may automatically capture the image. For example, the photographing information may be set by a user input or may include at least one of the location and the angle of the electronic device 201 that match the image acquired through the front camera mode. - The
processor 230 may set the camera application to control the camera device 220 based on touch information of the camera control area. For example, when a drag input in a first direction (for example, a horizontal direction) in the camera control area is detected, the processor 230 may control the display 270 to display a list of camera applications installed in the electronic device 201. The processor 230 may select a first camera application based on input information detected through the input/output interface 260. The processor 230 may control the camera device 220 by executing the first camera application. That is, the processor 230 may drive the camera device 220 based on camera setting information set in the first camera application. In addition, the processor 230 may set the first camera application as a basic camera application. Accordingly, when the camera device 220 is driven based on touch information of the camera control area, the processor 230 may execute the first camera application. For example, the camera setting information may include at least one of a filter for photographing, a photographing mode, and a photographing setting value (for example, aperture, shutter speed, image size, and the like). - The
processor 230 may control the camera device 220 in accordance with the camera setting information of the image. For example, the processor 230 may control the display 270 to display a list of images stored in the memory 240. For example, when the image is acquired, the processor 230 may control the display 270 to display corresponding filter information in the image to which a filter is applied. When a touch and a drag input for a first image are detected in the image list, the processor 230 may identify whether the first image enters the camera control area. When the first image enters the camera control area, the processor 230 may drive the camera device 220 in accordance with camera setting information of the first image. For example, when the touch input for the first image is released within the camera control area, the processor 230 may determine that the first image enters the camera control area. - The
processor 230 may capture an image based on touch information of the camera control area. For example, when a double tap input in the camera control area is detected, the processor 230 may capture an image through the camera device 220 without executing the camera application. - The
processor 230 may photograph video based on touch information of the camera control area. For example, when a touch maintaining time of the camera control area exceeds a reference time, the processor 230 may photograph video through the camera device 220 without executing the camera application. When the touch input in the camera control area is released, the processor 230 may end the photographing of the video. For example, when the touch maintaining time in the camera control area exceeds the reference time, the processor 230 may output notification information to allow the user to recognize the start of the video photographing. Here, the notification information may include at least one of a notification sound, a notification message, and a vibration. - When driving of the
camera device 220 is limited, the processor 230 may display driving limit information to be adjacent to the placement area of the camera device 220. For example, when the driving of the camera device 220 is limited by an application being executed in the electronic device 201, the processor 230 may display driving limit information to be adjacent to the placement area of the camera device 220. For example, when the driving of the camera device 220 is limited based on a position of the electronic device 201, the driving limit information may be displayed to be adjacent to the placement area of the camera device 220. In addition, when a touch input in the camera control area is detected in a state where the driving of the camera device 220 is limited, the processor 230 may execute a camera setting menu. - The
processor 230 may display notification information of a communication service using an image to be adjacent to the placement area of the camera device 220. For example, when a video call signal is received, the processor 230 may display video call notification information to be adjacent to the placement area of the camera device 220. The processor 230 may determine whether to accept the video call based on touch information of an area where the video call notification information is displayed. - When a human body recognition service using the
camera device 220 is provided, the processor 230 may display human body recognition information (for example, face recognition) to be adjacent to the placement area of the camera device 220. For example, when iris recognition is performed through the camera device 220, the processor 230 may display, based on the placement area of the camera device 220, the time information required for the iris recognition, i.e., the time during which the user should look at the camera device 220 for the iris recognition. The processor 230 may further display progress time information of the iris recognition. For example, when the time information required for the iris recognition matches the progress time information of the iris recognition, the processor 230 may complete the iris recognition. - The
processor 230 may display pollution level information of the camera device 220 to be adjacent to the placement area of the camera device 220. For example, the processor 230 may estimate a pollution level of the image sensor of the camera device 220 by detecting the definition of the image acquired through the camera device 220. The processor 230 may display the pollution level information of the image sensor of the camera device 220 to be adjacent to the placement area of the camera device 220. For example, when the pollution level of the image sensor of the camera device 220 exceeds a reference value, the processor 230 may display pollution level information to be adjacent to the placement area of the camera device 220. - When a user's hovering input is detected through the camera control area, the
processor 230 may display a guide image to induce a touch of another area adjacent to the placement area of the camera device 220. - The
memory 240 may include a volatile memory and/or a non-volatile memory. For example, the memory 240 may store instructions or data related to at least one other element of the electronic device 201. The memory 240 may store software and/or a program 250. For example, the program 250 may include a kernel 251, middleware 253, an application programming interface (API) 255, and an application program 257. At least some of the kernel 251, the middleware 253, and the API 255 may be referred to as an operating system (OS). - The input/
output interface 260 may function as an interface that may transfer instructions or data input from a user or another external device to the other elements of the electronic device 201. Furthermore, the input/output interface 260 may output instructions or data, which are received from the other elements of the electronic device 201, to the user or the external device. For example, the input/output interface 260 may include a touch panel that detects a touch input or a hovering input using an electronic pen or a user's body part. For example, the input/output interface 260 may receive a gesture or a proximity input using an electronic pen or a user's body part. - The
display 270 may display various types of contents (for example, text, images, videos, icons, symbols, or the like) to a user. For example, at least some areas (for example, upper areas) of the display 270 may be perforated for placement of the camera device 220. Accordingly, the display 270 may limit the display function in the placement area of the camera device 220. According to an embodiment, the display 270 may be implemented by a touch screen coupled with the touch panel of the input/output interface 260. - The
communication interface 280 may establish communication between the electronic device 201 and an external device. For example, the communication interface 280 may communicate with a first external electronic device 202 through short-range communication 284 or wired communication. The communication interface 280 may be connected to a network 282 through wireless or wired communication to communicate with a second external electronic device 204 or a server 206. - According to an embodiment, the
network 282 may include at least one of a communication network, a computer network (for example, a LAN or a WAN), the Internet, and a telephone network. -
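Several of the processor 230 behaviors described above map a touch gesture quantity to a camera parameter; for example, the drag-distance-proportional photographing timer might be sketched as follows. The scale factor and the upper clamp are illustrative assumptions, not values given in the disclosure.

```python
# Sketch of the drag-distance-to-timer mapping described for the processor 230.
# SECONDS_PER_PIXEL and MAX_TIMER_SECONDS are assumed example values.
SECONDS_PER_PIXEL = 0.01
MAX_TIMER_SECONDS = 10.0

def timer_from_drag(drag_distance_px: float) -> float:
    """Return a photographing-timer duration proportional to the drag distance."""
    if drag_distance_px <= 0:
        return 0.0
    return min(drag_distance_px * SECONDS_PER_PIXEL, MAX_TIMER_SECONDS)
```

A longer drag thus yields a longer countdown, up to the assumed clamp.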
FIG. 3 is a flowchart of a process for controlling a camera device in an electronic device, according to an embodiment of the present disclosure. In the following description, the electronic device may include the electronic device 201 or at least a part (for example, the processor 230) of the electronic device 201. - Referring to
FIG. 3, the electronic device 201 identifies whether a touch input for the camera control area related to the placement area of the camera device 220 is detected on the touch screen in operation 301. For example, when some areas of the touch panel are perforated for placement of the camera device 220, the camera control area may be disposed on at least some areas adjacent to the placement area of the camera device 220 on the touch screen. For example, when some areas of the touch panel corresponding to the placement area of the camera device 220 are not perforated and a touch pattern is omitted in the areas, the camera control area may be disposed on at least some areas adjacent to the placement area of the camera device 220 on the touch screen. For example, when some areas of the touch panel corresponding to the placement area of the camera device 220 are not perforated and the touch pattern is set in the areas, the camera control area may be disposed on the placement area of the camera device 220 and at least some areas adjacent to the placement area of the camera device 220 on the touch screen. - When the touch input for the camera control area is detected, the
electronic device 201 detects a control function of the camera device 220 corresponding to the touch input in operation 303. For example, the processor 230 detects the control function of the camera device 220 based on at least one of the number of touches in the camera control area, a drag distance (i.e., touch motion distance), a drag direction (i.e., touch motion direction), and a touch maintaining time. For example, the control function of the camera device 220 may include at least one of driving of the camera device 220, selection of an application for driving the camera device 220, camera setting information, image capturing, video photographing, timer setting, and camera mode switching. - The
electronic device 201 drives the camera device 220 based on the control function of the camera device 220 corresponding to the touch input in the camera control area in operation 305. For example, the processor 230 controls the camera device 220 by executing the camera application in accordance with the control function of the camera device 220. -
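The mapping in operations 301 to 305 from touch attributes (number of touches, drag distance, touch maintaining time) to a control function can be sketched as follows. The threshold values, the priority order, and the function names are assumptions for illustration, not values taken from the disclosure.

```python
def detect_control_function(num_taps: int, drag_distance: float,
                            hold_time: float,
                            ref_distance: float = 100.0,
                            ref_time: float = 1.0) -> str:
    """Map touch attributes in the camera control area to a camera control
    function, as in operation 303. Thresholds and priority order are assumed."""
    if num_taps == 2:                    # double tap: capture without the app
        return "capture_image"
    if hold_time > ref_time:             # long press: photograph video
        return "record_video"
    if drag_distance > ref_distance:     # long drag: launch the camera app
        return "launch_camera_app"
    return "show_activation_info"        # plain touch: show activation info
```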
FIG. 4 is a flowchart of a process for controlling a camera device in an electronic device when a screen of the electronic device is turned off, according to an embodiment of the present disclosure. FIGS. 6A to 6D illustrate a screen configuration for controlling a camera device in an electronic device when a screen of the electronic device is turned off, according to an embodiment of the present disclosure. In the following description, the electronic device may include the electronic device 201 or at least a part (for example, the processor 230) of the electronic device 201. - Referring to
FIG. 4, an operation for controlling the camera device 220 of the electronic device 201 based on the screen configuration shown in FIGS. 6A to 6D will be described. The electronic device 201 determines whether a first touch input is detected through a camera control area set based on a placement area of the camera device 220 on the touch screen in operation 401. For example, referring to FIG. 6A, when the display 270 is deactivated, the processor 230 maintains a touch recognition function of the camera control area in an active state. The processor 230 determines whether a first type touch input is detected through the camera control area. For example, the first type touch input may correspond to a type of touch in which the user rubs the camera control area and may include a touch input having a continuously changing drag direction. For example, when the display 270 is activated, the processor 230 determines whether the first type touch input is detected through the camera control area. For example, the touch recognition function of the camera control area may be activated or deactivated based on the type of an application driven in the electronic device 201. - When a first touch input is not detected through the camera control area, the
electronic device 201 terminates the operation for controlling the driving of the camera device 220. - When the first touch input is detected through the
electronic device 201 displays camera activation information in a display area corresponding to the camera control area in operation 403. For example, when the first type touch input is detected through the camera control area, as indicated by reference numeral 610 in FIG. 6A, the processor 230 may display camera activation information 620 based on the placement area of the camera device 220, as illustrated in FIG. 6B. - The
electronic device 201 determines whether a second touch input for the camera activation information is detected in operation 405. For example, the processor 230 determines whether a drag input 630 for the camera activation information 620 is detected within a reference time from a time point when the camera activation information 620 is displayed, as illustrated in FIG. 6B. - When the second touch input is not detected before the reference time passes from the time point when the camera activation information is displayed, the
electronic device 201 may determine to not drive the camera. Accordingly, the electronic device 201 may terminate the operation for controlling driving of the camera device 220. - When the second touch input for the camera activation information is detected, the
electronic device 201 drives the camera device 220 in operation 407. For example, when a drag input for the camera activation information 620 is detected, as indicated by reference numeral 630, the processor 230 may display at least some of the service screen of the camera application in accordance with a distance of the drag input, as illustrated in FIG. 6C. When the drag distance exceeds a reference distance, the processor 230 may display the service screen of the camera application on the display 270, as indicated by reference numeral 650 in FIG. 6D. For example, the processor 230 may display a preview image acquired through the front camera device 220 on the display 270 by executing the camera application. For example, when the drag input for the camera activation information 620 is detected as indicated by reference numeral 630, the processor 230 may display the service screen (for example, a preview image) of the camera application in at least some areas of the display 270 in accordance with the drag input. - When the touch input for the drag is released before the drag distance exceeds the reference distance, the
electronic device 201 may determine to not drive the camera device 220. Accordingly, the electronic device 201 may terminate the service screen of the camera application, as illustrated in FIG. 6A. -
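The screen-off launch flow of FIG. 4 (a rub touch shows the activation information, a drag past a reference distance starts the front camera mode, and an early release cancels) can be modeled as a small state machine. The state names and the reference distance below are illustrative assumptions.

```python
class ScreenOffCameraLauncher:
    """Illustrative state machine for the FIG. 4 flow; REFERENCE_DISTANCE
    is an assumed value, not one given in the disclosure."""
    REFERENCE_DISTANCE = 200.0  # pixels (assumed)

    def __init__(self):
        self.state = "idle"

    def on_rub_touch(self):
        # first touch input: display the camera activation information
        if self.state == "idle":
            self.state = "activation_info_shown"

    def on_drag(self, distance: float):
        # second touch input: drag past the reference distance starts the camera
        if self.state == "activation_info_shown" and distance > self.REFERENCE_DISTANCE:
            self.state = "camera_running"

    def on_release(self, distance: float):
        # releasing before the reference distance cancels camera driving
        if self.state == "activation_info_shown" and distance <= self.REFERENCE_DISTANCE:
            self.state = "idle"
```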
FIG. 5 is a flowchart of a process for detecting touch information of a camera control area in an electronic device, according to an embodiment of the present disclosure. In the following description, the electronic device may include the electronic device 201 or at least a part (for example, the processor 230) of the electronic device 201. - Referring to
FIG. 5, the operation for detecting a first touch input through the camera control area in operation 401 of FIG. 4 will be described. The electronic device 201 determines whether the display 270 is deactivated in operation 501. For example, the processor 230 determines whether an operation state of the display 270 switches to an inactive state since the electronic device 201 operates in a low power mode. - When the
display 270 is deactivated, the electronic device 201 maintains the touch recognition function of the camera control area in an active state in operation 503. For example, when the display 270 is deactivated, as indicated by reference numeral 600 in FIG. 6A, the processor 230 maintains a touch recognition function of the camera control area in an active state. - In
operation 505, the electronic device 201 determines whether a touch input is detected through the camera control area. For example, the processor 230 determines whether a touch input of the type in which the user rubs the touch screen is detected through the camera control area having the activated touch recognition function, as illustrated in FIG. 6A. - When the touch input is not detected through the camera control area, the
electronic device 201 maintains the touch recognition function of the camera control area in the active state in operation 503. - When the
display 270 is activated, the electronic device 201 determines whether the touch input is detected in operation 507. For example, when the display 270 is in the active state, the processor 230 maintains the touch recognition function of the touch panel corresponding to the display 270 in the active state. Accordingly, the processor 230 determines whether the touch input is detected through the touch panel in the active state. - When the touch input is not detected through the display in the active state, the
electronic device 201 determines whether the display 270 is deactivated again in operation 501. - When the touch input is detected through the
display 270 in the active state, the electronic device 201 determines whether the touch input is detected through the camera control area in operation 509. For example, the processor 230 determines whether a touch coordinate of the touch input is included in the camera control area. -
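The coordinate check of operation 509 reduces to a point-in-region test; the assumption below that the camera control area is a single rectangle is for illustration only.

```python
def in_camera_control_area(x: float, y: float,
                           area: tuple[float, float, float, float]) -> bool:
    """Return True if the touch coordinate (x, y) falls inside the camera
    control area, given as (left, top, right, bottom) in screen coordinates."""
    left, top, right, bottom = area
    return left <= x <= right and top <= y <= bottom
```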
FIG. 7 is a flowchart of a process for configuring a camera display area in an electronic device based on touch information of a camera control area, according to an embodiment of the present disclosure. In the following description, the electronic device may include the electronic device 201 or at least a part (for example, the processor 230) of the electronic device 201. - Referring to
FIG. 7 , an operation for driving the camera device inoperation 407 ofFIG. 4 will be described. When the touch input for driving the camera device is detected through the camera control area (i.e.,operation 405 ofFIG. 4 ), theelectronic device 201 sets the camera display area based on a second touch input inoperation 701. For example, when the drag input for thecamera activation information 620 is detected, as illustrated inFIG. 6B , theprocessor 230 sets at least some areas of thedisplay 270 as the camera display area for displaying the service screen of the camera application in accordance with a drag distance. For example, when the drag distance for thecamera activation information 620 exceeds a reference distance, theprocessor 230 sets the entire area of thedisplay 270 as the camera display area. - The
electronic device 201 may drive thecamera device 220 based on the camera display area inoperation 703. For example, theprocessor 230 may display a preview image acquired through the front camera device in the camera display area of thedisplay 270 by executing the camera application. -
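One way to read operation 701 is as a mapping from drag distance to the height of the camera display area, with a full-screen fallback once the reference distance is exceeded. A minimal sketch, with illustrative pixel values that are assumptions rather than values from the specification:

```python
def camera_display_height(drag_distance, reference_distance, display_height):
    """Map the drag distance for the camera activation information to the
    height of the camera display area (operation 701). A drag at or past
    the reference distance selects the entire display."""
    if drag_distance >= reference_distance:
        return display_height
    # Otherwise the display area grows with the drag, capped at the display.
    return min(drag_distance, display_height)
```

For example, on a hypothetical 1920-pixel-tall display with an 800-pixel reference distance, a 300-pixel drag yields a 300-pixel camera display area, while a 900-pixel drag selects the whole screen.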
FIG. 8 is a flowchart of a process for controlling a camera device in an electronic device when a camera application is linked with another application, according to an embodiment of the present disclosure. FIGS. 9A to 9E illustrate a screen configuration for controlling a camera device in an electronic device when a camera application is linked with another application, according to an embodiment of the present disclosure. In the following description, the electronic device may include the electronic device 201 or at least a part (for example, processor 230) of the electronic device 201.
- Referring to FIG. 8, the electronic device 201 may drive a first application among at least one application installed in the electronic device 201 in operation 801. For example, referring to FIG. 9A, when a messenger application is selected from at least one application installed in the electronic device 201 based on input information detected through the input/output interface 260, the processor 230 displays a service screen 900 of the messenger application on the display 270.
- The
electronic device 201 detects a first touch input for the camera control area in operation 803. For example, referring to FIG. 9B, the processor 230 may detect a tap input through at least some areas of the touch screen set as the camera control area.
- When the first touch input for the camera control area is not detected, the electronic device 201 determines not to drive the camera device 220 and terminates the operation for controlling driving of the camera device 220.
- When the first touch input for the camera control area is detected, the electronic device 201 displays camera activation information in at least some areas of the display corresponding to the camera control area in operation 805. For example, when a tap input for the camera control area is detected, as indicated by reference numeral 910 in FIG. 9B, the processor 230 displays camera activation information 920 based on the placement area of the camera device 220.
- The
electronic device 201 determines whether a second touch input for the camera activation information is detected in operation 807. For example, the processor 230 determines whether a drag input is detected through at least some areas of the touch screen where the camera activation information is displayed.
- When the second touch input for the camera activation information is not detected before a reference time passes from the time point when the camera activation information is displayed, the electronic device 201 determines not to drive the camera device 220. Accordingly, the electronic device 201 terminates the operation for controlling driving of the camera device 220.
- When the second touch input for the camera activation information is detected, the electronic device 201 sets a camera display area in accordance with the second touch input in operation 809. For example, referring to FIG. 9C, when the drag input for the camera activation information 920 is detected, as indicated by reference numeral 930, the processor 230 sets at least some areas of the display 270 as the camera display area in accordance with a drag distance.
- The electronic device 201 displays driving information (for example, a service screen of the camera application) of the camera device 220 in the camera display area in operation 811. For example, referring to FIG. 9D, the processor 230 displays a preview image acquired through the front camera device in the camera display area, set to at least some areas of the display 270 based on the drag distance, as indicated by reference numeral 940. In addition, the processor 230 displays a photographing button 942 at a position where the drag input is released.
- The
electronic device 201 determines whether an event for capturing an image is generated through the camera application in operation 813. For example, the processor 230 may determine whether a touch input for the photographing button 942 displayed in the camera display area is detected or whether a gesture input mapped to image capturing is detected.
- When the event for capturing the image is not generated, the electronic device 201 maintains display of the camera driving information in the camera display area in operation 811.
- When the event for capturing the image is generated, the electronic device determines whether the camera application and the first application are linked to each other in operation 815. For example, the processor 230 determines whether the first application provides a service using the image captured through the camera application.
- When the camera application and the first application are linked to each other, the electronic device 201 links the image captured through the camera application with the first application in operation 817. For example, referring to FIG. 9E, the processor 230 may transmit the image captured through the camera application to a counterpart electronic device through a chat room of the messenger application, as indicated by reference numeral 950. In addition, the processor 230 may store the image captured through the camera application in the memory 240.
- When the camera application and the first application are not linked to each other, the electronic device 201 stores the image captured through the camera application in the memory 240 of the electronic device 201 in operation 819.
- After the image is captured through the camera application displayed in at least some areas of the display 270, the electronic device 201 terminates driving of the camera device 220. For example, after the image is captured through the camera application, the processor 230 terminates the camera application, as illustrated in FIG. 9E.
-
FIG. 10 is a flowchart of a process for controlling a camera device in an electronic device based on touch maintaining information of a camera control area, according to an embodiment of the present disclosure. FIGS. 11A and 11B illustrate a screen configuration for controlling a camera device in an electronic device based on touch maintaining information of a camera control area, according to an embodiment of the present disclosure. In the following description, the electronic device may include the electronic device 201 or at least a part (for example, processor 230) of the electronic device 201.
- Referring to FIG. 10, an operation for controlling the camera device 220 based on the screen configuration of FIGS. 11A and 11B will be described. The electronic device 201 determines whether a touch input is detected through a camera control area set to at least some areas of the touch screen in operation 1001. For example, referring to FIG. 11A, when the display 270 is in an active state, as indicated by reference numeral 1100, the processor 230 may determine whether a hovering input for the camera control area set to be adjacent to the placement area of the camera device 220 is detected or whether a tap input for the camera control area of the touch screen is detected.
- When a touch input is not detected through the camera control area, the electronic device 201 determines not to drive the camera device 220 and terminates the operation for controlling the camera device 220.
- When the touch input is detected through the camera control area, the electronic device 201 displays camera activation information in at least some areas of the display corresponding to the camera control area in operation 1003. For example, when a hovering input is detected through the camera control area, as indicated by reference numeral 1120, the processor 230 displays camera activation information 1130 adjacent to the placement area of the camera device 220, as illustrated in FIG. 11A. The processor 230 displays the camera activation information 1130 in at least some areas different from the placement area of the camera device 220 in order to prevent the image sensor of the front camera device from becoming dirty due to the touch input for controlling the camera device 220. That is, the processor 230 displays the camera activation information 1130 in at least some areas different from the placement area of the camera device 220 so that the user touches areas other than the placement area of the camera device 220.
- The
electronic device 201 determines whether a touch maintaining time for the camera activation information exceeds a reference time in operation 1005. For example, the processor 230 determines whether the touch maintaining time for the camera activation information 1130 exceeds the reference time, as illustrated in FIG. 11A.
- When the touch maintaining time for the camera activation information is shorter than the reference time, the electronic device 201 determines whether the touch input for the camera activation information is released in operation 1009. For example, the processor 230 determines whether a touch input 1120 for the camera activation information 1130 is released, as illustrated in FIG. 11A.
- When the touch input for the camera activation information is released, the electronic device 201 determines not to drive the camera device 220 and terminates the operation for controlling driving of the camera device 220.
- When the touch input for the camera activation information is maintained, the electronic device 201 again determines whether the touch maintaining time for the camera activation information exceeds the reference time in operation 1005.
- When the touch maintaining time for the camera activation information exceeds the reference time, the electronic device 201 drives the camera device 220 in operation 1007. For example, referring to FIG. 11B, when the touch maintaining time for the camera activation information 1130 exceeds the reference time, the processor 230 displays the service screen of the camera application on the display 270 through an image effect that makes the service screen of the camera application spread from the placement area of the camera device 220. For example, the service screen of the camera application may include a preview image acquired through the front camera device or the back camera device.
-
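The loop over operations 1005 and 1009 behaves like a long-press detector: drive the camera once the touch is held past the reference time, cancel if it is released earlier, and otherwise keep polling. The sketch below models one iteration as a pure function; the 800 ms reference time is an assumption for illustration only.

```python
def long_press_state(held_ms, released, reference_ms=800):
    """Evaluate one pass through operations 1005/1009: drive the camera
    once the touch has been held past the reference time, cancel when it
    is released earlier, and otherwise keep waiting and re-check."""
    if held_ms >= reference_ms:
        return "drive"    # operation 1007: drive the camera device
    if released:
        return "cancel"   # terminate without driving the camera
    return "wait"         # loop back to operation 1005
```

A touch handler would call this on every touch event until the result is "drive" or "cancel".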
FIG. 12 is a flowchart of a process for setting a timer of a camera device based on touch information of a camera control area in an electronic device, according to an embodiment of the present disclosure. FIGS. 13A to 13C illustrate a screen configuration for setting a timer of a camera device based on touch information of a camera control area in an electronic device, according to an embodiment of the present disclosure. In the following description, the electronic device may include the electronic device 201 or at least a part (for example, processor 230) of the electronic device 201.
- Referring to FIG. 12, an operation for setting a timer of a camera device using the screen configuration of FIGS. 13A to 13C will be described. The electronic device 201 displays a service screen of a camera application in at least some areas of the display in operation 1201. For example, the processor 230 executes the camera application based on touch information of a camera control area, as described in operations 401 to 407 of FIG. 4 or operations 1001 to 1009 of FIG. 10. For example, when a touch input for an icon of the camera application is detected, the processor 230 executes the corresponding camera application.
- The
electronic device 201 determines whether the touch input for the camera control area is detected in operation 1203. For example, referring to FIG. 13A, the processor 230 displays a preview image of the front camera device on the display 270 by executing the camera application, as indicated by reference numeral 1300. The processor 230 may determine whether a drag input 1310 is detected from the placement area of the camera device 220 in a state where the preview image of the front camera device is displayed. The processor 230 also determines whether a subsequent tap input for the camera control area is detected in the state where the preview image of the front camera device is displayed.
- When the touch input for the camera control area is not detected, the electronic device 201 determines not to set a timer of the camera device. Accordingly, the electronic device terminates the operation for setting the timer of the camera device 220.
- When the touch input is detected through the camera control area, the electronic device 201 sets the timer of the camera device in accordance with the touch input in operation 1205. For example, the processor 230 sets the timer of the camera device 220 to a timer required time corresponding to a drag distance from the placement area of the camera device 220, as illustrated in FIG. 13A. Alternatively, the processor 230 sets the timer of the camera device 220 to a time corresponding to a distance between the placement area of the camera device 220 and a position where a tap input is detected.
- The electronic device 201 displays timer information of the camera device 220, set to correspond to the touch input, on the display 270 in operation 1207. For example, the processor 230 displays time information set to correspond to the drag distance from the placement area of the camera device 220, as indicated by reference numeral 1320 in FIG. 13A.
- The electronic device 201 determines whether the time set to correspond to the touch input has expired in operation 1209.
- When the time set to correspond to the touch input has not expired, the electronic device 201 displays the timer information of the camera device 220 on the display 270 in operation 1207. For example, referring to FIG. 13B, the processor 230 updates the displayed time information in accordance with the elapse of time. That is, the processor 230 updates the timer of the camera device 220 such that the display of the time information becomes gradually smaller as time elapses from the time point when the timer is set.
- When the time set to correspond to the touch input expires, the electronic device 201 captures an image by driving the camera device 220 in operation 1211. For example, when the time set to correspond to the drag distance expires, the processor 230 captures an image by using the front camera device. In this case, the processor 230 removes the display of the time information from the display 270, as illustrated in FIG. 13C. In addition, if it is determined that the amount of light for image capturing is insufficient, the processor 230 may acquire the amount of light for the image capturing by changing a color of the display into a bright color.
-
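Operation 1205 above maps a drag (or tap) distance from the camera placement area to a countdown time. A minimal sketch of one such mapping; the pixels-per-second scale and the cap are assumptions for illustration, not values from the specification.

```python
def timer_seconds(distance_px, px_per_second=100, max_seconds=10):
    """Quantize the distance from the camera placement area into a timer
    value (operation 1205), clamped to at least 1 s and at most the cap."""
    return min(max(distance_px // px_per_second, 1), max_seconds)
```

Under these assumed constants, a 350-pixel drag sets a 3-second timer, a very short drag still yields the 1-second minimum, and a drag across the whole screen saturates at 10 seconds.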
FIG. 14 is a flowchart of a process for providing a flash effect in an electronic device, according to an embodiment of the present disclosure. FIGS. 15A to 15D illustrate a screen configuration for providing a flash effect in an electronic device, according to an embodiment of the present disclosure.
- Referring to FIG. 14, an operation for capturing an image using the screen configuration of FIG. 15, as in operation 1211 of FIG. 12, will be described. In the following description, the electronic device may include the electronic device 201 or at least a part (for example, processor 230) of the electronic device 201 of FIG. 2.
- When an image capturing event is generated (operation 1209 of FIG. 12), the electronic device 201 determines whether a flash function is set. For example, when the time set for the timer of the camera device 220 expires, the processor 230 may determine that the image capturing event is generated. The processor 230 displays camera activation information 1510 based on the placement area of the camera device 220 to make the user's eyes face the front camera device in response to the generation of the image capturing event. In this case, the processor 230 determines whether a flash setting menu of the camera device 220 is set in an active state.
- When the flash function of the
camera device 220 is not set, the electronic device 201 captures the image by driving the camera device 220 in operation 1405. For example, the processor 230 captures the image by using the activated front camera device of the camera device 220.
- When the flash function of the camera device 220 is set, the electronic device 201 changes a color of the display 270 into the color set for the flash function in operation 1403. For example, referring to FIGS. 15B and 15C, when acquiring an amount of light for the image capturing, the processor 230 displays a background image such that a bright colored (for example, white) background image spreads across the entire area of the display 270 based on the placement area of the camera device 220, as indicated by the reference numerals.
- The electronic device 201 captures the image by driving the camera device 220 while changing the color of the display 270 by the flash function in operation 1405.
- When the image is captured, the electronic device 201 may change the color of the display 270 in accordance with an image effect which the user desires. For example, the processor 230 may set an image effect having a warm feeling based on the user's input information. In this case, the processor 230 displays a background image such that a yellow background image, for example, spreads across the entire area of the display 270 based on the placement area of the camera device 220 while capturing the image.
- When video is photographed, the electronic device 201 may display audio input information to allow the user to identify the size of an audio signal input through the microphone device 140. For example, when video is photographed through the back camera device, the processor 230 may display audio input information 1540 corresponding to the size of an audio signal based on the placement area of the camera device 220, as illustrated in FIG. 15D.
-
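The flash branch of FIG. 14 substitutes the display itself for a flash unit: brighten the screen first, then capture so the screen supplies the light. A sketch of that control flow, with callbacks standing in for the real display and camera calls; the color value is illustrative.

```python
def capture_with_display_flash(flash_enabled, set_display_color, capture):
    """Operations 1401-1405: when the flash function is set, change the
    display to a bright color before driving the camera so the screen
    supplies the light for the shot; otherwise capture directly."""
    if flash_enabled:
        set_display_color("white")   # operation 1403: screen as flash
    return capture()                 # operation 1405: capture the image
```

The same shape accommodates the warm-tone variant described above by passing a different color (for example, "yellow") when an image effect is selected.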
FIG. 16 is a flowchart of a process for providing a camera service through an application in an electronic device, according to an embodiment of the present disclosure. FIGS. 17A and 17B illustrate a screen configuration for providing a camera service through an application in an electronic device, according to an embodiment of the present disclosure. In the following description, the electronic device may include the electronic device 201 or at least a part (for example, processor 230) of the electronic device 201.
- Referring to FIG. 16, an operation for providing a camera service using the screen configuration of FIGS. 17A and 17B will be described. The electronic device 201 displays a standby screen including an icon of at least one application in operation 1601. For example, referring to FIG. 17A, the processor 230 displays the standby screen including icons of applications installed in the electronic device 201, as indicated by reference numeral 1700.
- The
electronic device 201 determines whether a touch input for the icon of an application is detected on the standby screen in operation 1603. For example, the processor 230 determines whether a touch input for an icon of one of a plurality of applications displayed on the standby screen is detected, as illustrated in FIG. 17A.
- When the touch input for the icon of the application is not detected on the standby screen, the electronic device 201 determines not to provide the camera service and terminates the operation for providing the camera service.
- When the touch input for the application icon included in the standby screen is detected, the electronic device 201 determines whether the application icon enters the camera control area by the touch input in operation 1605. For example, the processor 230 determines whether a first application icon 1710 on which the touch input is detected enters the camera control area through a drag input 1720 on a standby screen 1700, as indicated by reference numeral 1730 in FIG. 17A. For example, when the touch input for the first application icon 1710 is released within the camera control area, the processor 230 determines that the first application icon 1710 has entered the camera control area. As another example, after detecting a first touch input for selecting the first application icon 1710, the processor 230 determines whether a second touch input for determining a movement location of the first application icon 1710 is detected within the camera control area, as illustrated in FIG. 17A. In this case, the second touch input may be determined to be effective only when the corresponding touch input is detected within a reference time from the time point when the first touch input is detected.
- When the application icon does not enter the camera control area, the
electronic device 201 determines whether the touch input for the application icon is released in operation 1611. For example, the processor 230 determines whether the touch input for the first application icon 1710 is released outside the camera control area, as illustrated in FIG. 17A.
- When the touch input for the application icon is maintained, the electronic device 201 determines whether the application icon enters the camera control area by the touch input in operation 1605.
- When the touch input for the application icon is released, the electronic device 201 determines not to provide the camera service and terminates the operation for providing the camera service. For example, the processor 230 may change the location of the application icon to the location where the touch input for the application icon is released.
- When the application icon enters the camera control area, the electronic device 201 determines whether an application corresponding to the application icon can be linked with the camera application in operation 1607. For example, the processor 230 determines whether the camera service can be provided through the application corresponding to the first application icon 1710.
- When the application corresponding to the application icon is not linked with the camera application, the electronic device 201 terminates the operation for providing the camera service.
- When the application corresponding to the application icon is linked with the camera application, the electronic device 201 displays link information between the application corresponding to the application icon and the camera device 220 on the display 270 in operation 1609. For example, when the camera service can be provided through the first application corresponding to the first application icon 1710, the processor 230 displays a camera service screen of the first application on the display 270, as indicated by reference numeral 1740 in FIG. 17B.
-
FIG. 18 is a flowchart of a process for providing a multi-camera service in an electronic device, according to an embodiment of the present disclosure. FIGS. 19A to 19E illustrate a screen configuration for providing a multi-camera service in an electronic device, according to an embodiment of the present disclosure. In the following description, the electronic device may include the electronic device 201 or at least a part (for example, processor 230) of the electronic device 201.
- Referring to FIG. 18, an operation for providing a multi-camera service using the screen configuration of FIG. 19 will be described. The electronic device 201 displays a service screen of a camera application on the display 270 in operation 1801. For example, referring to FIG. 19A, when the electronic device 201 operates in a front camera mode, the processor 230 displays a preview image collected through the front camera device on the display 270, as indicated by reference numeral 1900. In this case, the processor 230 displays the service screen of the camera application on the display 270 as in operations 401 to 407 of FIG. 4 or operations 1001 to 1009 of FIG. 10.
- In
operation 1803, the electronic device 201 determines whether a touch input is detected through the camera control area. For example, the processor 230 determines whether a tap input 1910 for the camera control area adjacent to the placement area of the camera device 220 is detected in a state where a preview image of the front camera device is displayed, as illustrated in FIG. 19A.
- When a touch input for the camera control area is not detected, the electronic device 201 determines not to provide a multi-camera service. Accordingly, the electronic device 201 terminates the operation for providing the multi-camera service.
- When the touch input for the camera control area is detected, the electronic device 201 switches a camera mode of the electronic device 201 to a multi-camera mode in operation 1805. For example, when the tap input 1910 for the camera control area is detected while the service is provided through the front camera device, the processor 230 may additionally activate the back camera device. Conversely, when the tap input 1910 for the camera control area is detected while the service is provided through the back camera device, the processor 230 may additionally activate the front camera device.
- The electronic device 201 displays a service screen using multiple cameras on the display 270 based on the multi-camera mode in operation 1807. For example, referring to FIG. 19B, the processor 230 displays a preview image 1920 of the activated back camera device to overlap at least a part of the preview image 1910 of the front camera device based on the multi-camera mode. In addition, when a drag input for an edge area of the preview image 1920 of the back camera device is detected, the processor 230 controls the size of the preview image 1920 of the back camera device in accordance with a drag distance. Referring to FIG. 19C, when a tap input for the displayed small preview image 1920 of the back camera device is detected, the processor 230 reverses the display areas of the preview image 1910 of the front camera device and the preview image 1920 of the back camera device, as indicated by reference numeral 1940. Referring to FIGS. 19D and 19E, when a drag input for the displayed small preview image of the front camera is detected, as indicated by reference numeral 1950 in FIG. 19D, the processor 230 updates the display location of the preview image of the front camera according to the drag input, as indicated by reference numeral 1960 in FIG. 19E.
- The
electronic device 201 determines whether the multi-camera mode ends in operation 1809. For example, when the drag input for the displayed small preview image is detected to move outside the area of the display 270, as indicated by reference numeral 1970 in FIG. 19E, the processor 230 determines that the multi-camera mode ends.
- When the multi-camera mode does not end, the electronic device 201 maintains the service screen using the multiple cameras displayed on the display 270 in operation 1807.
- When the multi-camera mode ends, the electronic device 201 switches the camera mode of the electronic device 201 to a single camera mode and displays a service screen of a single camera device on the display 270 in operation 1811. For example, referring to FIG. 19F, when an event corresponding to the type of the preview image acquired through the front camera device is detected, as indicated by reference numeral 1970 in FIG. 19E, the processor 230 displays the preview screen of the back camera device on the display 270, as indicated by reference numeral 1980.
-
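The tap handling of FIG. 19C amounts to swapping which camera occupies the full-screen preview and which occupies the small inset. A minimal sketch of that state change; representing the state as a (main, inset) pair of camera names is an assumption for illustration.

```python
def tap_small_preview(main, inset):
    """FIG. 19C: tapping the small inset preview reverses the display
    areas, so the inset camera becomes full screen and vice versa."""
    return inset, main
```

For example, starting from a full-screen front camera with a back-camera inset, a tap on the inset yields a full-screen back camera with a front-camera inset.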
FIG. 20 is a flowchart of a process for providing an automatic photographing service in an electronic device, according to an embodiment of the present disclosure. FIGS. 21A to 21C illustrate a screen configuration for providing an automatic photographing service in an electronic device, according to an embodiment of the present disclosure. In the following description, the electronic device may include the electronic device 201 or at least a part (for example, processor 230) of the electronic device 201.
- Referring to FIG. 20, an operation for providing the automatic photographing service using the screen configuration of FIGS. 21A to 21C will be described. The electronic device 201 displays a service screen (for example, a preview image) of a camera application on the display 270 in operation 2001. For example, the processor 230 executes the camera application based on touch information of a camera control area through operations 401 to 407 of FIG. 4 or operations 1001 to 1009 of FIG. 10. Referring to FIG. 21A, the processor 230 displays the preview image of the front camera device on the display 270, as indicated by reference numeral 2100.
- The
electronic device 201 determines whether an automatic photographing mode is set for the camera application in operation 2003. For example, the processor 230 determines whether an automatic photographing menu is set in an activated state based on input information detected through the input/output interface 260.
- When the automatic photographing mode is not set for the camera application, the electronic device 201 terminates the operation for providing the automatic photographing service. In this case, the electronic device 201 captures an image based on a touch input of a photographing button displayed on the service screen of the camera application.
- When the automatic photographing mode is set, the electronic device 201 displays motion information of the camera device 220 in at least some areas of the display 270 in operation 2005. For example, when the automatic photographing mode is set, the processor 230 displays motion information of the camera device 220 corresponding to a location and an angle of the electronic device 201 based on the placement area of the camera device 220, as indicated by reference numeral 2110 in FIG. 21A.
- The
electronic device 201 determines whether a motion of the camera device 220 that matches a capturing event is detected in operation 2007. For example, the processor 230 determines whether motion information of the electronic device 201 that matches the location and angle of the electronic device 201 preset for image capturing is detected. For example, the preset location and angle of the electronic device 201 may be set by a user's input or may include at least one of the locations and angles of the electronic device 201 that match the image acquired through the front camera mode.
- When the motion of the camera device 220 that matches the capturing event is not detected, the electronic device 201 displays changed motion information of the camera device 220 in at least some areas of the display 270 in operation 2005. For example, referring to FIG. 21B, the processor 230 may change the motion information of the camera device 220 displayed based on the placement area of the camera device 220 according to a change in the location and angle of the electronic device 201, as indicated by reference numeral 2120.
- When the motion of the camera device 220 that matches the capturing event is detected, the electronic device 201 captures the image by driving the camera device (for example, the front camera device) in operation 2009. For example, referring to FIG. 21C, when motion information of the electronic device 201 which matches the location and angle of the electronic device 201 preset for image capturing is detected, the processor 230 displays matching information on the location and angle of the electronic device 201 to allow the user to recognize the automatic photographing time point, as indicated by reference numeral 2130. The processor 230 then captures the image by using the front camera device. For example, the processor 230 may acquire an amount of light for the image capturing, or perform the image capturing and change a color of the display 270 at the same time for an image effect.
-
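Operation 2007 above compares the device's current pose against the preset capture pose. The sketch below checks each angle against a tolerance; the (azimuth, pitch, roll) representation and the 5-degree tolerance are assumptions for illustration, not values from the specification.

```python
def pose_matches(current, target, tolerance_deg=5.0):
    """Operation 2007: report a match when every angle of the current
    (azimuth, pitch, roll) pose is within the tolerance of the preset
    capture pose, which triggers automatic image capture."""
    return all(abs(c - t) <= tolerance_deg for c, t in zip(current, target))
```

While this returns False, the device keeps updating the on-screen motion guidance (operation 2005); once it returns True, the capture of operation 2009 fires.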
FIG. 22 is a flowchart of a process for controlling a camera device in accordance with a camera application in an electronic device, according to an embodiment of the present disclosure. -
FIGS. 23A and 23B illustrate a screen configuration for controlling a camera device in accordance with a camera application in an electronic device, according to an embodiment of the present disclosure. In the following description, the electronic device may include theelectronic device 201 or at least a part (for example, processor 230) of theelectronic device 201. - Referring to
FIG. 22 , an operation for controlling thecamera device 220 using the screen configuration ofFIGS. 23A and 23B will be described. Theelectronic device 201 determines whether a first type touch input for the camera control area is detected inoperation 2201. For example, referring toFIG. 23A , theprocessor 230 determines whether adrag input 2310 in a right direction of the placement area of thecamera device 220 is detected. - When the first type touch input for the camera control area is not detected, the
electronic device 201 terminates the operation for setting the camera application. For example, when the drag input in a down direction for the camera control area is detected, theelectronic device 201 executes one camera application that is set as a basic application among a plurality of applications installed in theelectronic device 201. - When the first type touch input for the camera control area is detected, the
electronic device 201 determines at least one camera application installed in theelectronic device 201 inoperation 2203. For example, theprocessor 230 may extract camera application information stored in thememory 240. - In
operation 2205, theelectronic device 201 displays a camera application list including at least one camera application installed in theelectronic device 201. For example, referring toFIG. 23B , theprocessor 230 displays icons of camera applications installed in theelectronic device 201 on thedisplay 270 such that the icons are output in the placement area of thecamera device 220. - In
operation 2207, theelectronic device 201 determines whether a first camera application which is one of the applications included in the camera application list is selected. For example, theprocessor 230 determines whether a touch input for one of the icons of the camera applications displayed in the camera control area is detected as illustrated inFIG. 23B . - When a selection input for the first camera application is not detected, the
electronic device 201 maintains display of the camera application list inoperation 2205. In addition, when an input for the camera application list is not detected until a reference time passes from a time point when the camera application list is displayed, theelectronic device 201 determines to not select the camera application for controlling thecamera device 220 and terminates the operation. - When the selection input for the first camera application is detected, the
electronic device 201 drives thecamera device 220 based on the first camera application inoperation 2209. For example, theprocessor 230 controls thecamera device 220 by executing the first camera application. Accordingly, theprocessor 230 performs initial settings on thecamera device 220 based on camera setting information set to the first camera application. In addition, theprocessor 230 sets the first camera application as a basic camera application of theelectronic device 201. -
FIG. 24 is a flowchart of a process for controlling a camera device in accordance with photographing setting information of an image in an electronic device, according to an embodiment of the present disclosure.FIG. 25 illustrates a screen configuration for controlling a camera device in accordance with photographing setting information of an image in an electronic device, according to an embodiment of the present disclosure. In the following description, the electronic device may include theelectronic device 201 or at least a part (for example, processor 230) of theelectronic device 201. - Referring to
FIG. 24 , an operation for controlling the camera device using the screen configuration ofFIG. 25 will be described. Theelectronic device 201 displays a list of at least one image stored in thememory 240 of theelectronic device 201 on thedisplay 270 inoperation 2401. For example, referring toFIG. 25 , when a gallery application is executed based on a user input, theprocessor 230 displays a thumbnail for at least one image stored in thememory 240 on thedisplay 270 as indicated byreference numeral 2500. In addition, theprocessor 230 displays correspondingfilter information 2510 on an image to which a filter for an image effect is applied. Theelectronic device 201 determines whether a touch input for a first image in an image list displayed on thedisplay 270 is detected inoperation 2403. For example, theprocessor 230 determines whether atouch input 2520 for the first image in theimage list 2500 displayed on thedisplay 270 is detected, as illustrated inFIG. 25 . - When the touch input for at least one image in the image list is not detected, the
electronic device 201 determines to not provide the camera service and terminates the operation for providing the camera service. - When the touch input for the first image in the image list is detected, the
electronic device 201 determines whether the first image enters the camera control area inoperation 2405. For example, theprocessor 230 determines whether the first image enters the camera control area through adrag input 2530 for the first image, as illustrated inFIG. 25 . For example, when the touch input for the first image is released within the camera control area, theprocessor 230 determines that the first image enters the camera control area. For example, after detecting thefirst touch input 2520 for selecting the first image, theprocessor 230 determines whether a second touch input for determining a movement location of the first image is detected within the camera control area, as illustrated inFIG. 25 . - When the first image does not enter the camera control area, the
electronic device 201 determines whether the touch input for the first image is released inoperation 2411. For example, theprocessor 230 determines whether thetouch input 2520 for the first image is released outside the camera control area inFIG. 25 . - When the touch input for the image is maintained, the
electronic device 201 determines whether the first image enters the camera control area again by the touch input inoperation 2405. - When the touch input for the first image is released, the
electronic device 201 determines to not provide the camera service and terminates the operation for providing the camera service. For example, theprocessor 230 changes a location of the first image to a location where the touch input for the first image is released. - When the first image enters the camera control area, the
electronic device 201 determines setting information of thecamera device 220 set to capture the first image inoperation 2407. For example, the setting information of thecamera device 220 may include at least one of a filter for capturing the first image, image filter information, a photographing mode, and a photographing setting value (for example, aperture, shutter speed, and image size). - The
electronic device 201 may update the setting information of thecamera device 220 based on the setting information of thecamera device 220 set to capture the first image inoperation 2409. For example, theprocessor 230 may perform initial settings on thecamera device 220 in accordance with the setting information of thecamera device 220 that has been set to capture the first image. -
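A minimal sketch of operations 2407 and 2409 — extracting the capture settings recorded with the dragged-in image and applying them to the camera configuration — might look like the following. The metadata keys are hypothetical stand-ins for EXIF-style fields; the embodiment itself does not specify a storage format.

```python
def settings_from_image(image_meta):
    """Pull the capture settings recorded with an image (operation 2407).
    Only recognized setting keys are carried over."""
    keys = ("filter", "mode", "aperture", "shutter_speed", "image_size")
    return {k: image_meta[k] for k in keys if k in image_meta}

def apply_to_camera(camera_config, image_meta):
    """Operation 2409: produce an updated camera configuration using the
    settings with which the selected image was captured."""
    updated = dict(camera_config)  # leave the original config untouched
    updated.update(settings_from_image(image_meta))
    return updated
```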
FIG. 26 is a flowchart of a process for capturing an image based on touch information of a camera control area in an electronic device, according to an embodiment of the present disclosure. In the following description, the electronic device may include theelectronic device 201 or at least a part (for example, processor 230) of theelectronic device 201. - Referring to
FIG. 26 , theelectronic device 201 determines whether a touch input is detected through a camera control area set based on the placement area of thecamera device 220 on the touch screen inoperation 2601. - When the touch input is detected through the camera control area, the
electronic device 201 determines whether image capturing matches the touch input detected through the camera control area in operation 2603. For example, the processor 230 determines whether a double tap input that matches an image capturing event is detected, based on touch input matching information stored in the memory 240. - When the touch input is not detected through the camera control area or when the image capturing does not match the touch input detected through the camera control area, the
electronic device 201 determines to not perform the image capturing. Accordingly, theelectronic device 201 terminates the operation for the image capturing. - When the image capturing matches the touch input detected through camera control area, the
electronic device 201 captures the image through the camera device 220 without executing the camera application in operation 2605. For example, when the double tap input is detected through the camera control area, the processor 230 captures the image through the camera device 220 (for example, back camera device) in a state where the service screen of the application displayed on the display 270 is maintained. That is, the processor 230 captures the image in a state where a preview image acquired through the camera device 220 is not displayed. The processor 230 stores the captured image in the memory 240. In addition, the processor 230 displays image capturing information on a notification bar. -
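The background capture of operations 2601 through 2605 reduces to a small gesture-to-action check, sketched here under assumptions: the matching table, camera object, and notification list are hypothetical simplifications of the stored touch input matching information, the camera device 220, and the notification bar.

```python
def on_control_area_touch(gesture, matching_table, camera, notifications):
    """Capture without launching the camera application when the gesture
    detected in the camera control area matches the stored image
    capturing event; no preview is displayed in this path."""
    if matching_table.get(gesture) != "capture_image":
        return False
    frame = camera.capture()               # current app screen stays on top
    camera.saved.append(frame)             # store captured image (memory 240)
    notifications.append("image captured")  # notification bar entry
    return True
```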
FIG. 27 is a flowchart of a process for photographing video based on touch information of a camera control area in an electronic device, according to an embodiment of the present disclosure. In the following description, the electronic device may include theelectronic device 201 or at least a part (for example, processor 230) of theelectronic device 201. - Referring to
FIG. 27 , theelectronic device 201 determines whether a touch input is detected through a camera control area preset to control thecamera device 220 on the touch screen inoperation 2701. The camera control area may include at least some areas of the touch screen including the placement area of thecamera device 220. The camera control area may include at least some areas of the touch screen adjacent to the placement area of thecamera device 220. - When the touch input is detected through the camera control area, the
electronic device 201 determines whether video photographing matches the touch input detected through the camera control area inoperation 2703. For example, theprocessor 230 determines whether a touch input having a touch maintaining time exceeding a reference time is detected through the camera control area based on touch input matching information stored in thememory 240. - When the touch input is not detected through the camera control area or when video photographing does not match the touch input detected through the camera control area, the
electronic device 201 determines to not perform the video photographing. Accordingly, theelectronic device 201 terminates the operation for the video photographing. - When the video photographing matches the touch input detected through the camera control area, the
electronic device 201 starts the video photographing through thecamera device 220 without executing the camera application inoperation 2705. Theprocessor 230 photographs the video through the camera device 220 (for example, back camera device) in a state where the service screen of the application displayed on thedisplay 270 is maintained from a time point when the touch maintaining time of the touch input detected through the camera control area exceeds the reference time. When the video photographing is started, theprocessor 230 outputs notification information to allow the user to recognize the video photographing operation. Here, the notification information may include at least one of a notification sound, a notification message, and a vibration. - The
electronic device 201 determines whether the touch input that the video photographing matches is released inoperation 2707. - When the touch input that the video photographing matches is maintained, the
electronic device 201 may continuously photograph the video inoperation 2705. - When the touch input that the video photographing matches is released, the
electronic device 201 terminates the video photographing. For example, theprocessor 230 may store the video photographed through the back camera device in thememory 240. In addition, theprocessor 230 displays video photographing information on the notification bar. -
FIG. 28 is a flowchart of a process for displaying camera driving limit information in an electronic device, according to an embodiment of the present disclosure.FIG. 29 illustrates a screen configuration for displaying camera driving limit information in an electronic device, according to an embodiment of the present disclosure. In the following description, the electronic device may include theelectronic device 201 or at least a part (for example, processor 230) of theelectronic device 201. - Referring to
FIG. 28 , an operation for displaying the camera driving limit information using the screen configuration of FIG. 29 will be described. The electronic device 201 determines whether driving of the camera device 220 is limited in operation 2801. For example, the processor 230 may identify whether the driving of the camera device 220 is limited based on the type of an application being executed in the electronic device 201. As another example, the processor 230 may identify whether the driving of the camera device 220 is limited based on a location of the electronic device 201. For example, the processor 230 determines whether an operation mode of the camera device 220 is set as an inactive mode based on input information detected through the input/output interface 260. - When the driving of the
camera device 220 is not limited, theelectronic device 201 terminates the operation for displaying the camera driving limit information. - When the driving of the
camera device 220 is limited, theelectronic device 201 displays camera driving limit information in the camera control area inoperation 2803. Referring toFIG. 29 , theprocessor 230 displays camera driving limit information 2900 (for example, red colored concentric circles) based on the placement area of thecamera device 220. - The
electronic device 201 determines whether a touch input is detected through the camera control area in a state where driving the camera device is limited inoperation 2805. For example, theprocessor 230 determines whether a touch input for the cameradriving limit information 2900 displayed in the camera control area is detected, as illustrated inFIG. 29 . - When the touch input is detected through the camera control area in a state where the driving of the
camera device 220 is limited, theelectronic device 201 executes a camera setting menu inoperation 2807. For example, when a tap input for the cameradriving limit information 2900 is detected, theprocessor 230 displays a camera setting menu for resetting a right of thecamera device 220 on thedisplay 270. -
FIG. 30 is a flowchart of a process for providing a video call service in an electronic device, according to an embodiment of the present disclosure.FIG. 31 illustrates a screen configuration for providing a video call service in an electronic device, according to an embodiment of the present disclosure. In the following description, the electronic device may include theelectronic device 201 or at least a part (for example, processor 230) of theelectronic device 201. - Referring to
FIG. 30 , an operation for providing the video call service using the screen configuration of FIG. 31 will be described. The electronic device 201 determines whether a call connection request signal for the video call is received in operation 3001. For example, when the call connection request signal is received through the communication interface 280, the processor 230 determines whether the corresponding signal is a call connection request corresponding to the video call service. - When the call connection request signal for the video call is not received, the
electronic device 201 terminates the operation for providing the video call service. - When the call connection request signal for the video call is received, the
electronic device 201 displays video call reception information of the display area corresponding to the camera control area inoperation 3003. For example, referring toFIG. 31 , theprocessor 230 displays videocall reception information 3100 to be output from the placement area of thecamera device 220 on thedisplay 270. - In
operation 3005, theelectronic device 201 determines whether a touch input is detected through the camera control area in a state where the video call reception information is displayed. For example, theprocessor 230 determines whether a drag input in a first direction (for example, right direction) for the videocall reception information 3100 displayed to be adjacent to the placement area of thecamera device 220 is detected. - When the touch input is not detected through the camera control area, the
electronic device 201 may maintain display of the video call reception information in at least some areas of the display 270 corresponding to the camera control area in operation 3003. When the touch input is not detected through the camera control area until a reference time passes from a time point when the call connection request signal for the video call is received, the electronic device 201 may determine to not accept the video call connection. In this case, the electronic device 201 displays video call connection failure information on the display 270. - When the touch input is detected through the camera control area in a state where the video call reception information is displayed, the
electronic device 201 activates the front camera device and provides the video call service inoperation 3007. For example, when a drag input in a first direction (for example, right direction) for the videocall reception information 3100 displayed in at least some areas of thedisplay 270 is detected, theprocessor 230 determines that the user accepts the call connection for the video call. Accordingly, theprocessor 230 displays an image collected through the front camera device and an image received from a counterpart electronic device on thedisplay 270 by executing the video call application. - When a drag input in a second direction (for example, left direction) for the video
call reception information 3100 displayed in at least some areas of thedisplay 270 is detected, theprocessor 230 may determine that the user does not accept the call connection for the video call. Accordingly, theprocessor 230 may block the call connection for the video call. -
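The accept/reject branching of operations 3005 and 3007 is, in essence, a direction check on the drag gesture over the reception indicator. The sketch below uses the example directions from the description (right to accept, left to reject); the string labels are illustrative, not the disclosed API.

```python
def handle_call_gesture(direction, accept_dir="right", reject_dir="left"):
    """Map a drag on the video-call reception indicator to a call action:
    first direction accepts (activating the front camera), second
    direction blocks the call, anything else keeps waiting."""
    if direction == accept_dir:
        return "activate_front_camera_and_connect"
    if direction == reject_dir:
        return "block_call"
    return "keep_showing_reception_info"
```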
FIG. 32 is a flowchart of a process for displaying human body recognition service information in an electronic device, according to an embodiment of the present disclosure.FIGS. 33A to 33D illustrate a screen configuration for displaying human body recognition service information in an electronic device, according to an embodiment of the present disclosure. In the following description, the electronic device may include theelectronic device 201 or at least a part (for example, processor 230) of theelectronic device 201. - Referring to
FIG. 32 , an operation for providing the human body recognition service using the screen configuration ofFIGS. 33A to 33D will be described. Theelectronic device 201 determines whether human body recognition is performed through thecamera device 220 of theelectronic device 201 inoperation 3201. Theprocessor 230 may determine whether an iris recognition menu for unlocking theelectronic device 201 is selected using the camera device 220 (for example, front camera device). As another example, theprocessor 230 may determine whether a face recognition menu for authenticating the user of theelectronic device 201 is selected using the camera device 220 (for example, front camera device). - When the human body recognition using the
camera device 220 is not performed, theelectronic device 201 terminates the operation for providing the human body recognition service. - When the human body recognition using the
camera device 220 is performed, the electronic device 201 displays time information spent for the human body recognition in a display area corresponding to the camera control area in operation 3203. Referring to FIG. 33A , when the iris recognition is performed, the processor 230 displays time information 3300 spent for the iris recognition based on the placement area of the camera device 220. The time spent for the human body recognition may include a minimum time during which the human body recognition (for example, iris recognition) can be completed through the camera device 220. - The
electronic device 201 determines whether the time spent for the human body recognition expires inoperation 3205. For example, theprocessor 230 determines whether an elapsed time from a time point when the human body recognition starts is the same as the time spent for the human body recognition. - When the time spent for the human body recognition does not expire, the
electronic device 201 displays elapsed time information for the human body recognition in the display area corresponding to the camera control area inoperation 3211. For example, referring toFIG. 33B , theprocessor 230 displays elapsedtime information 3310 of the iris recognition to overlap thetime information 3300 spent for the iris recognition displayed based on the placement area of thecamera device 220. - The
electronic device 201 determines again whether the time spent for the human body recognition expires inoperation 3205. - When the time spent for the human body recognition expires, the
electronic device 201 determines whether the human body recognition is successful inoperation 3207. Referring toFIG. 33C , when the elapsed time of the performance of the iris recognition is the same as the time information spent for the iris recognition as indicated byreference numeral 3320, theprocessor 230 determines that the iris recognition is completed. Accordingly, theprocessor 230 determines whether the authentication of the user is successful based on a result of the iris recognition. For example, theprocessor 230 determines whether iris information detected through the iris recognition matches iris information preset in thememory 240. - When the human body recognition fails, the
electronic device 201 determines that the authentication of the user through the human body recognition fails. Accordingly, theelectronic device 201 terminates the operation for providing the human body recognition service. Theelectronic device 201 displays human body recognition failure information on thedisplay 270. - When the human body recognition is successful, the
electronic device 201 may unlock theelectronic device 201 inoperation 3209. Referring toFIG. 33D , when the user is authenticated through the iris recognition, theprocessor 230 releases a lock function of theelectronic device 201 and displays astandby screen 3330 on thedisplay 270. -
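The state progression of operations 3203 through 3209 — show a progress overlay while the recognition time elapses, then unlock on a successful match or report failure — can be summarized as a small state function. This is an illustrative model; the fraction returned for the progress ring and the boolean match input are assumptions, not the disclosed interface.

```python
def recognition_state(elapsed, required, iris_match):
    """Report the UI state while human body (e.g. iris) recognition runs.
    `iris_match` stands in for the comparison of detected iris data
    against the iris information preset in memory."""
    if elapsed < required:
        # still scanning: overlay elapsed-time ring on the required-time ring
        return ("show_progress", elapsed / required)
    if iris_match:
        return ("unlock", 1.0)     # release the lock, show standby screen
    return ("show_failure", 1.0)   # authentication through recognition failed
```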
FIG. 34 is a flowchart of a process for displaying pollution level information of a camera device in an electronic device, according to an embodiment of the present disclosure.FIG. 35 illustrates a screen configuration for displaying pollution level information of a camera device in an electronic device, according to an embodiment of the present disclosure. In the following description, the electronic device may include theelectronic device 201 or at least a part (for example, processor 230) of theelectronic device 201. - Referring to
FIG. 34 , an operation for displaying the pollution level information of thecamera device 220 using the screen configuration ofFIG. 35 will be described. Theelectronic device 201 drives thecamera device 220 disposed on some areas (for example, upper area) of thedisplay 270 inoperation 3401. For example, theprocessor 230 executes the camera application based on touch information of a camera control area throughoperations 401 to 407 ofFIG. 4 oroperations 1001 to 1009 ofFIG. 10 . Theprocessor 230 controls the camera device 220 (for example, front camera device) through the camera application. - The
electronic device 201 captures an image through thecamera device 220 disposed on some areas of thedisplay 270 inoperation 3403. For example, when a pollution level measuring event is generated, theprocessor 230 captures an image through thefront camera device 120 disposed in the upper area of thedisplay 270. For example, the pollution level measuring event may be periodically generated or may be generated at a time point when the camera device is driven. - The
electronic device 201 may detect a pollution level of a camera lens through the image capture through thecamera device 220 inoperation 3405. For example, theprocessor 230 may estimate the definition of the image acquired through the front camera device. Theprocessor 230 may detect the pollution level of the camera lens corresponding to the definition of the image. - The
electronic device 201 determines whether the pollution level of the camera lens exceeds a reference pollution level in operation 3407. For example, the reference pollution level may be set, by the user, as a reference value of a pollution level which can influence the quality of the image acquired through the camera device 220, or may include a fixed value. - When the pollution level of the camera lens is less than or equal to the reference pollution level, the
electronic device 201 determines that the pollution level of the camera lens does not influence the quality of the image acquired through thecamera device 220. Accordingly, theelectronic device 201 terminates the operation for displaying the pollution information of the camera lens. - When the pollution level of the camera lens exceeds the reference pollution level, the
electronic device 201 displays pollution information of the camera lens in the display area corresponding to the camera control area to allow the user to recognize the pollution level of the camera lens in operation 3409. For example, when the pollution level of the camera lens exceeds the reference pollution level, the processor 230 displays pollution information 3500 of the camera lens based on the placement area of the camera device 220 to prompt the user to clean the camera lens. - The
electronic device 201 may display an amount of the pollution level of the camera lens based on the placement area of thecamera device 220. For example, theprocessor 230 may display the amount of the pollution level of the camera lens through the number of concentric circles based on the placement area of thecamera device 220. For example, theprocessor 230 may increase the number of concentric circles displayed in the placement area of thecamera device 220 as the pollution level of the camera lens is more serious. In addition, when the camera lens is not polluted, theprocessor 230 may not display the concentric circle indicating the pollution level of the camera lens. - An electronic device including a camera device disposed at a location overlapping at least a partial area of a display and an operation method thereof control the camera device based on an adjacent area including a placement area of the camera device or an adjacent area close to the placement area of the camera device, so that a user of the electronic device can easily drive and control a camera application.
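The pollution-level display of FIGS. 34 and 35 can be sketched as a mapping from estimated image definition to a count of concentric circles around the camera placement area. All numeric scales below (sharpness range, reference level, maximum ring count) are hypothetical; the embodiment only states that more serious pollution yields more circles and that no circle is shown below the reference level.

```python
def pollution_rings(sharpness, clean_sharpness=1.0, reference=0.3, max_rings=4):
    """Operations 3405-3409 as a pure function: lower estimated sharpness
    means a dirtier lens; above the reference pollution level, the ring
    count grows with the pollution amount, capped at max_rings."""
    pollution = max(0.0, clean_sharpness - sharpness)
    if pollution <= reference:
        return 0  # does not influence image quality: show nothing
    step = (clean_sharpness - reference) / max_rings
    return min(max_rings, 1 + int((pollution - reference) / step))
```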
- An electronic device including a camera device disposed at a location overlapping at least a partial area of a display and an operation method thereof display control information related to a camera device in an adjacent area including a placement area of the camera device or an adjacent area close to the placement area of the camera device, so that a user of the electronic device can easily detect an operational state of the camera device and a natural photo can be taken by inducing user's eyes to a lens direction.
- The term “module” as used herein may refer to a unit including one of hardware, software, and firmware or a combination of two or more of them. The term “module” may be interchangeably used with the term “unit”, “logic”, “logical block”, “component”, or “circuit”. The “module” may be a minimum unit of an integrated component element or a part thereof. The “module” may be a minimum unit for performing one or more functions or a part thereof. The “module” may be mechanically or electronically implemented. For example, the “module” according to the present disclosure may include at least one of an application-specific integrated circuit (ASIC) chip, a field-programmable gate arrays (FPGA), and a programmable-logic device for performing operations which has been known or are to be developed hereinafter.
- According to an embodiment, at least some of the devices (for example, modules or functions thereof) or the method (for example, operations) may be implemented by instructions stored in a computer-readable storage medium in a program module form. The instructions, when executed by the
processor 230, may cause theprocessor 230 to execute the function corresponding to the instruction. The computer-readable storage medium may be thememory 240. - The computer readable storage medium may include a hard disk, a floppy disk, magnetic media (for example, a magnetic tape), optical media (for example, a compact disc read only memory (CD-ROM) and a digital versatile disc (DVD)), magneto-optical media (for example, a floptical disk), a hardware device (for example, a read only memory (ROM), a random access memory (RAM), a flash memory), and the like. In addition, the instructions may include high class language codes, which can be executed in a computer by using an interpreter, as well as machine codes made by a compiler. Any of the hardware devices as described above may be configured to work as one or more software modules in order to perform the operations according to various embodiments of the present disclosure, and vice versa.
- Any of the modules or programming modules according to various embodiments of the present disclosure may include at least one of the above described elements, exclude some of the elements, or further include other additional elements. The operations performed by the modules, programming module, or other elements according to various embodiments of the present disclosure may be executed in a sequential, parallel, repetitive, or heuristic manner. Further, some operations may be executed according to another order or may be omitted, or other operations may be added.
- The embodiments disclosed herein are provided merely to easily describe technical details of the present disclosure and to help the understanding of the present disclosure, and are not intended to limit the scope of the present disclosure. Therefore, it should be construed that all modifications and changes or various other embodiments based on the technical idea of the present disclosure fall within the scope of the present disclosure. Therefore, the scope of the present disclosure is defined, not by the detailed description and embodiments, but by the appended claims and their equivalents.
Claims (20)
1. An electronic device comprising:
a touch screen including a layer having a hole formed in the layer;
a camera configured to capture an image through the hole; and
at least one processor configured to:
receive, via the touch screen, touch input on a portion of the touch screen, a contact area of the received touch input including at least a part of an area where the hole is formed;
perform an operation related with the camera based on the touch input; and
display an image received through the camera on the touch screen based on the touch input.
2. The electronic device of claim 1 , wherein the hole of the layer is perforated or omits an element of the layer.
3. The electronic device of claim 1 , wherein the at least one processor is further configured to display operational state information of the camera on the portion.
4. The electronic device of claim 1 , wherein the touch input comprises a drag gesture that is initiated within the portion and is ended outside of the portion.
5. The electronic device of claim 4 , wherein the at least one processor is further configured to display the image being received through the camera and having a size based on a position of the touch screen where the drag gesture is ended.
6. The electronic device of claim 1 , wherein the at least one processor is further configured to:
perform the operation including an execution of the camera in response to the received touch input,
display activation information regarding the execution in the portion, and
display the image received through the camera on the touch screen based on a drag of the touch input from a first position to a second position on the touch screen.
7. The electronic device of claim 1 , wherein the at least one processor is further configured to display the image in a first portion of the touch screen and a screen of an application in a second portion of the touch screen distinct from the first portion.
8. The electronic device of claim 7 , wherein the at least one processor is further configured to configure, when the image is captured through the camera, the captured image as input data of the application.
9. The electronic device of claim 1 , wherein the at least one processor is further configured to set a photographing timer of the camera to capture the image based on a drag distance of the touch input.
10. The electronic device of claim 1 , wherein the at least one processor is further configured to change a color of the touch screen into a bright color based on the touch input simultaneously with capturing the image through the camera.
11. A method of operating an electronic device comprising a camera configured to capture an image through a hole formed in a layer of a touch screen, the method comprising:
receiving, via the touch screen, a touch input on a portion of the touch screen, a contact area of the received touch input including at least a part of an area where the hole is formed;
performing an operation related with the camera based on the touch input; and
displaying an image received through the camera on the touch screen based on the touch input.
12. The method of claim 11 , further comprising:
displaying operational state information of the camera on the portion.
13. The method of claim 11 , wherein the touch input comprises a drag gesture that is initiated within the portion and is ended outside of the portion.
14. The method of claim 13 , wherein displaying the image further comprises:
displaying the image being received through the camera and having a size based on a position of the touch screen where the drag gesture is ended.
15. The method of claim 11 , wherein performing the operation further comprises:
performing the operation including an execution of the camera in response to the received touch input, and
displaying activation information regarding the execution in the portion.
16. The method of claim 15 , wherein displaying the image comprises:
displaying the image received through the camera on the touch screen based on a drag of the touch input from a first position to a second position on the touch screen.
17. The method of claim 11 , wherein displaying the image comprises:
displaying the image in a first portion of the touch screen and a screen of an application in a second portion of the touch screen distinct from the first portion.
18. The method of claim 17 , further comprising:
configuring, when the image is captured through the camera, the captured image as input data of the application.
19. The method of claim 11 , further comprising:
setting a photographing timer of the camera to capture the image based on a drag distance of the touch input; and
displaying information on the photographing timer of the camera on the touch screen.
20. The method of claim 11 , further comprising:
changing a color of the touch screen into a bright color based on the touch input simultaneously with capturing the image through the camera.
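The interaction recited in claims 1, 5, and 9 reduces to three checks: whether the contact area of a touch includes at least part of the hole area, how far a drag travels, and where it ends. The following is a minimal sketch of that logic in Python; all names, coordinates, and constants (`HOLE_REGION`, `px_per_second`) are hypothetical illustrations, not values taken from the patent:

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class Rect:
    """Axis-aligned rectangle in screen pixels."""
    left: float
    top: float
    right: float
    bottom: float

    def intersects(self, other: "Rect") -> bool:
        # True when the two rectangles share any area.
        return (self.left < other.right and other.left < self.right
                and self.top < other.bottom and other.top < self.bottom)


# Hypothetical location of the camera hole within the display layer.
HOLE_REGION = Rect(left=500, top=20, right=560, bottom=80)


def touch_activates_camera(contact: Rect, hole: Rect = HOLE_REGION) -> bool:
    """Claim 1: the contact area must include at least part of the hole area."""
    return contact.intersects(hole)


def timer_from_drag(drag_distance_px: float,
                    px_per_second: float = 100.0,
                    max_seconds: int = 10) -> int:
    """Claim 9: set a photographing timer based on the drag distance,
    capped at an assumed maximum."""
    return min(max_seconds, int(drag_distance_px // px_per_second))
```

Under these assumed constants, a contact patch overlapping the hole (for example `touch_activates_camera(Rect(540, 60, 580, 100))`) would start the camera, and a 350 px drag would map to a 3-second timer.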
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US18/070,130 US20230087406A1 (en) | 2016-01-15 | 2022-11-28 | Method of controlling camera device and electronic device thereof |
US18/460,976 US20230412911A1 (en) | 2016-01-15 | 2023-09-05 | Method of controlling camera device and electronic device thereof |
Applications Claiming Priority (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020160005293A KR102449593B1 (en) | 2016-01-15 | 2016-01-15 | Method for controlling camera device and electronic device thereof |
KR10-2016-0005293 | 2016-01-15 | ||
US15/407,943 US20170208241A1 (en) | 2016-01-15 | 2017-01-17 | Method of controlling camera device and electronic device thereof |
US17/104,980 US11516380B2 (en) | 2016-01-15 | 2020-11-25 | Method of controlling camera device in an electronic device in various instances and electronic device thereof |
US18/070,130 US20230087406A1 (en) | 2016-01-15 | 2022-11-28 | Method of controlling camera device and electronic device thereof |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US17/104,980 Continuation US11516380B2 (en) | 2016-01-15 | 2020-11-25 | Method of controlling camera device in an electronic device in various instances and electronic device thereof |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US18/460,976 Continuation US20230412911A1 (en) | 2016-01-15 | 2023-09-05 | Method of controlling camera device and electronic device thereof |
Publications (1)
Publication Number | Publication Date |
---|---|
US20230087406A1 true US20230087406A1 (en) | 2023-03-23 |
Family
ID=59314854
Family Applications (4)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/407,943 Abandoned US20170208241A1 (en) | 2016-01-15 | 2017-01-17 | Method of controlling camera device and electronic device thereof |
US17/104,980 Active US11516380B2 (en) | 2016-01-15 | 2020-11-25 | Method of controlling camera device in an electronic device in various instances and electronic device thereof |
US18/070,130 Abandoned US20230087406A1 (en) | 2016-01-15 | 2022-11-28 | Method of controlling camera device and electronic device thereof |
US18/460,976 Pending US20230412911A1 (en) | 2016-01-15 | 2023-09-05 | Method of controlling camera device and electronic device thereof |
Family Applications Before (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/407,943 Abandoned US20170208241A1 (en) | 2016-01-15 | 2017-01-17 | Method of controlling camera device and electronic device thereof |
US17/104,980 Active US11516380B2 (en) | 2016-01-15 | 2020-11-25 | Method of controlling camera device in an electronic device in various instances and electronic device thereof |
Family Applications After (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US18/460,976 Pending US20230412911A1 (en) | 2016-01-15 | 2023-09-05 | Method of controlling camera device and electronic device thereof |
Country Status (2)
Country | Link |
---|---|
US (4) | US20170208241A1 (en) |
KR (1) | KR102449593B1 (en) |
Families Citing this family (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR102109883B1 (en) * | 2013-09-03 | 2020-05-12 | 삼성전자주식회사 | Content transmission method and apparatus |
CN108363537A (en) * | 2018-01-24 | 2018-08-03 | 京东方科技集团股份有限公司 | Mobile terminal |
EP3525064B1 (en) * | 2018-02-12 | 2022-07-13 | Samsung Display Co., Ltd. | Display device and method for fabricating the same |
US10991774B2 (en) | 2018-02-12 | 2021-04-27 | Samsung Display Co., Ltd. | Display device and method for fabricating the same |
EP3962063A1 (en) * | 2018-05-23 | 2022-03-02 | Huawei Technologies Co., Ltd. | Photographing method and terminal device |
CN111524932A (en) * | 2019-02-01 | 2020-08-11 | Oppo广东移动通信有限公司 | Electronic equipment, pixel structure and display device |
US11308618B2 (en) | 2019-04-14 | 2022-04-19 | Holovisions LLC | Healthy-Selfie(TM): a portable phone-moving device for telemedicine imaging using a mobile phone |
CN112306314B (en) * | 2019-07-31 | 2022-10-04 | 华为技术有限公司 | Interface display method and electronic equipment |
WO2022034938A1 (en) * | 2020-08-11 | 2022-02-17 | 엘지전자 주식회사 | Image capturing device and control method therefor |
USD992593S1 (en) * | 2021-01-08 | 2023-07-18 | Samsung Electronics Co., Ltd. | Display screen or portion thereof with graphical user interface |
US20220269831A1 (en) * | 2021-02-25 | 2022-08-25 | Lenovo (Singapore) Pte. Ltd. | Electronic privacy filter activation |
EP4341793A2 (en) * | 2021-05-17 | 2024-03-27 | Apple Inc. | Interacting with notes user interfaces |
US11516434B1 (en) * | 2021-08-26 | 2022-11-29 | Motorola Mobility Llc | Routing visual content from different camera systems to different applications during video call |
Family Cites Families (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6154210A (en) * | 1998-11-25 | 2000-11-28 | Flashpoint Technology, Inc. | Method and system for implementing button interface compatibility in touch-screen equipped digital imaging device |
US8314859B2 (en) * | 2008-05-29 | 2012-11-20 | Lg Electronics Inc. | Mobile terminal and image capturing method thereof |
KR101653169B1 (en) * | 2010-03-02 | 2016-09-02 | 삼성디스플레이 주식회사 | Apparatus for visible light communication and method thereof |
JP5464083B2 (en) * | 2010-07-07 | 2014-04-09 | ソニー株式会社 | Information processing apparatus, information processing method, and program |
KR101375908B1 (en) * | 2012-07-10 | 2014-03-18 | 주식회사 팬택 | Photographing timer control apparatus and method |
US20140133715A1 (en) * | 2012-11-15 | 2014-05-15 | Identity Validation Products, Llc | Display screen with integrated user biometric sensing and verification system |
KR102010955B1 (en) * | 2013-01-07 | 2019-08-14 | 삼성전자 주식회사 | Method for controlling preview of picture taken in camera and mobile terminal implementing the same |
KR102020636B1 (en) * | 2013-06-07 | 2019-09-10 | 삼성전자주식회사 | Method for controlling electronic device based on camera, machine-readable storage medium and electronic device |
US9525811B2 (en) * | 2013-07-01 | 2016-12-20 | Qualcomm Incorporated | Display device configured as an illumination source |
KR102080746B1 (en) | 2013-07-12 | 2020-02-24 | 엘지전자 주식회사 | Mobile terminal and control method thereof |
US9294914B2 (en) * | 2013-08-27 | 2016-03-22 | Symbol Technologies, Llc | Localized visible light communications among wireless communication devices |
KR20160063875A (en) * | 2014-11-27 | 2016-06-07 | 엘지전자 주식회사 | Mobile terminal and method for controlling the same |
JP6497549B2 (en) * | 2015-03-05 | 2019-04-10 | カシオ計算機株式会社 | Electronic device, touch operation control method, and program |
US9736383B2 (en) * | 2015-10-30 | 2017-08-15 | Essential Products, Inc. | Apparatus and method to maximize the display area of a mobile device |
- 2016-01-15 KR KR1020160005293A patent/KR102449593B1/en active IP Right Grant
- 2017-01-17 US US15/407,943 patent/US20170208241A1/en not_active Abandoned
- 2020-11-25 US US17/104,980 patent/US11516380B2/en active Active
- 2022-11-28 US US18/070,130 patent/US20230087406A1/en not_active Abandoned
- 2023-09-05 US US18/460,976 patent/US20230412911A1/en active Pending
Also Published As
Publication number | Publication date |
---|---|
US20230412911A1 (en) | 2023-12-21 |
US11516380B2 (en) | 2022-11-29 |
KR102449593B1 (en) | 2022-09-30 |
US20210084216A1 (en) | 2021-03-18 |
KR20170085760A (en) | 2017-07-25 |
US20170208241A1 (en) | 2017-07-20 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20230087406A1 (en) | Method of controlling camera device and electronic device thereof | |
KR102386398B1 (en) | Method for providing different indicator for image based on photographing mode and electronic device thereof | |
KR102593824B1 (en) | Method for controlling a camera and electronic device thereof | |
US10082998B2 (en) | Electronic device and information sharing method thereof | |
KR102627244B1 (en) | Electronic device and method for displaying image for iris recognition in electronic device | |
KR102361885B1 (en) | Electronic apparatus and controlling method thereof | |
US10805522B2 (en) | Method of controlling camera of device and device thereof | |
EP3057309B1 (en) | Method for controlling camera system, electronic device, and storage medium | |
KR102377277B1 (en) | Method and apparatus for supporting communication in electronic device | |
KR102620138B1 (en) | Method for Outputting Screen and the Electronic Device supporting the same | |
KR102488563B1 (en) | Apparatus and Method for Processing Differential Beauty Effect | |
CN107924314B (en) | Apparatus and method for executing application | |
TWI673679B (en) | Method,apparatus and computer-readable storage media for user interface | |
US10367978B2 (en) | Camera switching method and electronic device supporting the same | |
CN105893078B (en) | Method for controlling camera system, electronic device, and storage medium | |
EP3511817B1 (en) | Electronic device having double-sided display and method for controlling application | |
US20170118402A1 (en) | Electronic device and camera control method therefor | |
CN108427533B (en) | Electronic device and method for determining environment of electronic device | |
US10057479B2 (en) | Electronic apparatus and method for switching touch operations between states | |
KR20150122574A (en) | Method and device for executing user instructions | |
US10685465B2 (en) | Electronic device and method for displaying and generating panoramic image | |
KR102650189B1 (en) | Electronic apparatus and controlling method thereof | |
KR102317624B1 (en) | Electronic device and method for processing image of the same | |
CN114826799A (en) | Information acquisition method, device, terminal and storage medium | |
CN110809256B (en) | System acceleration method and device of terminal, storage medium and terminal |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |