US9020278B2 - Conversion of camera settings to reference picture - Google Patents


Publication number
US9020278B2
Authority
US
United States
Prior art keywords
image
electronic device
photo
framed
setting information
Prior art date
Legal status
Expired - Fee Related, expires
Application number
US13/830,487
Other versions
US20130330007A1 (en)
Inventor
Byoungju KIM
Prashant Desai
Jesse Alvarez
Jinho Choi
TaeYoung Ha
Current Assignee
Samsung Electronics Co Ltd
Original Assignee
Samsung Electronics Co Ltd
Priority date
Filing date
Publication date
Application filed by Samsung Electronics Co Ltd
Priority to US13/830,487
Assigned to Samsung Electronics Co., Ltd. Assignors: Choi, Jinho; Ha, TaeYoung; Kim, Byoungju; Alvarez, Jesse; Desai, Prashant
Publication of US20130330007A1
Application granted
Publication of US9020278B2
Status: Expired - Fee Related

Classifications

    • G06K9/46
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 - Scenes; Scene-specific elements
    • G06V20/30 - Scenes; Scene-specific elements in albums, collections or shared content, e.g. social network photos or video
    • G06K9/00671
    • G06K9/00677
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 - Scenes; Scene-specific elements
    • G06V20/20 - Scenes; Scene-specific elements in augmented reality scenes
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 - Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 - Control of cameras or camera modules
    • H04N23/63 - Control of cameras or camera modules by using electronic viewfinders
    • H04N23/631 - Graphical user interfaces [GUI] specially adapted for controlling image capture or setting capture parameters
    • H04N23/632 - Graphical user interfaces [GUI] specially adapted for controlling image capture or setting capture parameters for displaying or modifying preview images prior to image capturing, e.g. variety of image resolutions or capturing parameters
    • H04N5/23293

Definitions

  • the image retrieval module 23 uses the information obtained from the object-based recognition module 12 and the location-based information module 13 to search for similar preexisting photo images as compared to the current photo frame of an image capture device on the mobile electronic device 20 via multiple sources, such as photo databases, photographer databases, cloud storage facilities, etc.
  • the mobile electronic device 20 obtains the preexisting photos and their respective camera settings metadata, pulling the information in via the transceiver 22 .
  • the retrieved images are displayed on the display 21 as thumbnails in, for example, a row on a display. The user may then select an “inspirational” photo image and view the camera settings on the display 21 to determine whether to apply the same settings to the currently framed image.
  • the camera setting translation module 14 analyzes the camera settings information from the preexisting photo metadata and determines whether the information needs to be translated to settings the camera module 15 understands and automatically changes the camera settings on the electronic mobile device 20 . The user may then capture the currently framed image using the settings based on the selected preexisting photo image.
  • a displayed current frame 200 shows an image, such as the famous “Hollywood” sign in California.
  • the current frame display 200 includes a live view tab 210 .
  • a user taps on the live view tab 210 using the touch screen 24 and a menu 220 appears.
  • the menu 220 includes options, such as different display modes (e.g., full, hide, guideline, and inspire).
  • FIG. 2C shows the display 200 after a user tapped and swiped the display using the touch screen 24 to highlight and select the inspire mode 225 .
  • FIG. 2D shows the display 200 that shows inspirational photo images 230 shown as a row of thumbnails on the lower portion of the display 200 . As illustrated, the first inspirational photo image 235 is tapped on for selection and showing photo settings.
  • FIG. 2E shows the camera settings 240 that were used in photographing the inspirational photo image 235 . The user may then determine whether to apply the camera settings from the inspirational photo image 235 to the current framed image shown on the display 200 .
  • the use of the inspire mode 225 may be made without having to leave the live view display 200 , which assists users in taking photos without having to look up information, determine settings based on lighting, angles, position, etc.
  • a user aims an image capture device of a mobile device (e.g., smartphone, tablet, smart device) including the imaging inspirational module 11 , towards a target object/subject, for example, an object, scene or person(s) at a physical location, such as a city center, attraction, event, etc. that the user is visiting and may use an inspirational photo image for automatically changing the camera settings for taking a new photo.
  • the photo from the camera application (e.g., camera module 15 ) of the mobile device 20 is displayed on a display 21 of the mobile device 20 .
  • the new photo image may then be shared (e.g., emailing, text messaging, uploading/pushing to a network, etc.) with others as desired.
  • FIG. 3 shows a flowchart of a process 300 of using reference photo setting information for taking a photo image of a current framed image, according to an embodiment.
  • Process block 310 comprises using an electronic device for framing an image in a live view display.
  • Process block 320 comprises identifying the location of the framed image.
  • Process block 321 comprises identifying the subject matter of the framed image using object-recognition.
  • Process block 322 comprises identifying the angle of the framed image (i.e., camera angle) using accelerometer/gyroscope information.
  • Process block 323 comprises searching photo image databases and sources for related photo images.
  • Process block 330 comprises presenting image thumbnails for the related preexisting photo images on the live view display showing the current framed subject matter.
  • Process block 331 comprises viewing camera settings metadata of a selected preexisting photo image and making a selection of whether to accept the camera settings for the currently viewed subject matter in the current frame.
  • Process block 340 comprises pulling the metadata information from the selected related photo image as current camera settings for the electronic device.
  • Process block 341 comprises translating the metadata camera settings for use by the image capture device of the electronic device.
  • Process block 342 comprises automatically changing the current camera settings of the electronic device based on the selected related photo image camera settings.
  • Process block 350 comprises capturing the currently framed subject matter with the image capture device of the electronic device.
  • Arrow 351 comprises storing the captured photo in a memory of the electronic device or sending the photo via text message, emailing, etc.
  • FIG. 4 is a high-level block diagram showing an information processing system comprising a computing system 500 implementing an embodiment.
  • the system 500 includes one or more processors 511 (e.g., ASIC, CPU, etc.), and can further include an electronic display device 512 (for displaying graphics, text, and other data), a main memory 513 (e.g., random access memory (RAM)), storage device 514 (e.g., hard disk drive), removable storage device 515 (e.g., removable storage drive, removable memory module, a magnetic tape drive, optical disk drive, computer-readable medium having stored therein computer software and/or data), user interface device 516 (e.g., keyboard, touch screen, keypad, pointing device), and a communication interface 517 (e.g., modem, wireless transceiver (such as WiFi, Cellular), a network interface (such as an Ethernet card), a communications port, or a PCMCIA slot and card).
  • the communication interface 517 allows software and data to be transferred between the computer system and external devices.
  • the system 500 further includes a communications infrastructure 518 (e.g., a communications bus, cross-over bar, or network) to which the aforementioned devices/modules 511 through 517 are connected.
  • the information transferred via communications interface 517 may be in the form of signals such as electronic, electromagnetic, optical, or other signals capable of being received by communications interface 517 , via a communication link that carries signals and may be implemented using wire or cable, fiber optics, a phone line, a cellular phone link, a radio frequency (RF) link, and/or other communication channels.
  • the system 500 further includes an image capture device such as a camera 15 .
  • the system 500 may further include application modules such as an MMS module 521 , SMS module 522 , email module 523 , social network interface (SNI) module 524 , audio/video (AV) player 525 , web browser 526 , image capture module 527 , etc.
  • the system 500 further includes an imaging and inspiration module 11 as described herein, according to an embodiment.
  • said imaging and inspiration module 11 along with an operating system 529 may be implemented as executable code residing in a memory of the system 500 .
  • in another embodiment, such modules may be implemented in firmware, etc.
  • the example architectures described above can be implemented in many ways, such as program instructions for execution by a processor, software modules, microcode, computer program products on computer-readable media, analog/logic circuits, application-specific integrated circuits, firmware, consumer electronic devices, AV devices, wireless/wired transmitters, wireless/wired receivers, networks, multi-media devices, etc.
  • Further, embodiments of said architectures can take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment containing both hardware and software elements.
  • Embodiments have been described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to one or more embodiments.
  • Each block of such illustrations/diagrams, or combinations thereof, can be implemented by computer program instructions.
  • the computer program instructions when provided to a processor produce a machine, such that the instructions, which execute via the processor, create means for implementing the functions/operations specified in the flowchart and/or block diagram.
  • Each block in the flowchart/block diagrams may represent a hardware and/or software module or logic, implementing one or more embodiments. In alternative implementations, the functions noted in the blocks may occur out of the order noted in the figures, concurrently, etc.
  • The terms "computer program medium," "computer usable medium," "computer readable medium," and "computer program product" are used to generally refer to media such as main memory, secondary memory, removable storage drive, and a hard disk installed in a hard disk drive. These computer program products are means for providing software to the computer system.
  • the computer readable medium allows the computer system to read data, instructions, messages or message packets, and other computer readable information from the computer readable medium.
  • the computer readable medium may include non-volatile memory, such as a floppy disk, ROM, flash memory, disk drive memory, a CD-ROM, and other permanent storage. It is useful, for example, for transporting information, such as data and computer instructions, between computer systems.
  • Computer program instructions may be stored in a computer readable medium that can direct a computer, other programmable data processing apparatus, or other devices to function in a particular manner, such that the instructions stored in the computer readable medium produce an article of manufacture including instructions which implement the function/act specified in the flowchart and/or block diagram block or blocks.
  • Computer program instructions representing the block diagram and/or flowcharts herein may be loaded onto a computer, programmable data processing apparatus, or processing devices to cause a series of operations performed thereon to produce a computer implemented process.
  • Computer programs (i.e., computer control logic) are stored in main memory and/or secondary memory. Computer programs may also be received via a communications interface. Such computer programs, when executed, enable the computer system to perform the features of one or more embodiments as discussed herein. In particular, the computer programs, when executed, enable the processor and/or multi-core processor to perform the features of the computer system.
  • Such computer programs represent controllers of the computer system.
  • a computer program product comprises a tangible storage medium readable by a computer system and storing instructions for execution by the computer system for performing a method of one or more embodiments.
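The flow described by process blocks 310 through 350 above can be sketched end to end. This is an illustrative sketch only; every callable, dictionary key, and the auto-selection of the first candidate are hypothetical stand-ins for the patent's modules, not part of the disclosure:

```python
def inspire_and_capture(frame, camera, search, translate):
    """End-to-end sketch of process 300 (names are hypothetical stand-ins).

    search    -- finds reference images related to the frame (blocks 320-323)
    translate -- converts reference metadata to device settings (blocks 340-342)
    camera    -- dict holding current settings and a capture callable (block 350)
    """
    candidates = search(frame)                 # locate related reference images
    if not candidates:
        # No references found: capture with the device's current settings.
        return camera["capture"](frame, camera["settings"])
    chosen = candidates[0]                     # stand-in for user selection (330-331)
    camera["settings"] = translate(chosen["metadata"])  # apply reference settings
    return camera["capture"](frame, camera["settings"])
```

A caller would supply a real search backend and a device-specific translation function in place of the stand-ins.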

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Signal Processing (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

A method of using reference photo setting information for taking a photo image of a current framed image comprises displaying a framed image from an image capture device of an electronic device, performing object recognition for the framed image on a display of the electronic device, identifying location information for the electronic device, presenting one or more reference images related to the framed image based on one or more of the identified location information and object recognition, selecting one of the reference images, and using photo setting information used for capturing the selected reference image for capturing the framed image.

Description

CROSS-REFERENCE TO RELATED APPLICATIONS
This application claims the priority benefit of U.S. Provisional Patent Application Ser. No. 61/657,642, filed Jun. 8, 2012, incorporated herein by reference in its entirety.
TECHNICAL FIELD
One or more embodiments relate generally to taking photos and in particular to using reference photo setting information for taking a photo image of a current framed image on an electronic device.
BACKGROUND
With the proliferation of electronic devices such as mobile electronic devices, users are using the electronic devices for photo taking and editing. Users who want to recreate previously taken photographs must experiment to attempt to achieve the same photo qualities.
SUMMARY
One or more embodiments relate generally to using reference photo setting information for taking a photo image of a current framed image. One embodiment provides using photo setting information used for capturing a selected reference image for capturing a current framed image.
In one embodiment, a method of using reference photo setting information for taking a photo image of a current framed image comprises displaying a framed image from an image capture device of an electronic device, performing object recognition for the framed image on a display of the electronic device, identifying location information for the electronic device, presenting one or more reference images related to the framed image based on one or more of the identified location information and object recognition, selecting one of the reference images, and using photo setting information used for capturing the selected reference image for capturing the framed image.
Another embodiment comprises an electronic device. The electronic device comprises an image capture device, a display and an imaging inspiration module. In one embodiment, the imaging inspiration module provides photo setting information used for capturing a reference photo image for capturing a current framed image via the image capture device of the electronic device. The imaging inspiration module identifies location information of the electronic device, performs object recognition of subject matter of the current framed image, searches for reference photo images related to the current framed image, and presents reference photo images found from the search for selection for providing the photo setting information used for capturing the reference photo image for capturing the current framed image.
One embodiment comprises a non-transitory computer-readable medium having instructions which when executed on a computer perform a method comprising displaying a framed image from an image capture device of an electronic device. Object recognition for the framed image is performed on a display of the electronic device. Location information for the electronic device is identified. One or more reference images related to the framed image are presented based on one or more of the identified location information and object recognition. One of the reference images is selected. Photo setting information used for capturing the selected reference image is used for capturing the framed image.
Another embodiment comprises a graphical user interface (GUI) displayed on a display of an electronic device. The GUI comprises one or more selectable reference images related to a framed image obtained by an image capture device of the electronic device based on one or more of identified location information and object recognition. Upon selection of one of the reference images, photo setting information is displayed on the GUI.
These and other aspects and advantages of the embodiments will become apparent from the following detailed description, which, when taken in conjunction with the drawings, illustrate by way of example the principles of the embodiments.
BRIEF DESCRIPTION OF THE DRAWINGS
For a fuller understanding of the nature and advantages of the embodiments, as well as a preferred mode of use, reference should be made to the following detailed description read in conjunction with the accompanying drawings, in which:
FIGS. 1A-1B show block diagrams of architecture on a system for using reference photo setting information for taking a photo image of a current framed image with an electronic device, according to an embodiment.
FIGS. 2A-E show examples of selecting a reference photo for using reference photo setting information for taking a photo image of a current framed image with an electronic device, according to an embodiment.
FIG. 3 shows a flowchart of a process for using reference photo setting information for taking a photo image of a current framed image with an electronic device, according to an embodiment.
FIG. 4 is a high-level block diagram showing an information processing system comprising a computing system implementing an embodiment.
DETAILED DESCRIPTION
The following description is made for the purpose of illustrating the general principles of the embodiments and is not meant to limit the inventive concepts claimed herein. Further, particular features described herein can be used in combination with other described features in each of the various possible combinations and permutations. Unless otherwise specifically defined herein, all terms are to be given their broadest possible interpretation including meanings implied from the specification as well as meanings understood by those skilled in the art and/or as defined in dictionaries, treatises, etc.
One or more embodiments relate generally to using reference photo setting information for taking a photo image of a current framed image with an electronic device. One embodiment provides multiple reference photo selections.
In one embodiment, the electronic device comprises a mobile electronic device capable of data communication over a communication link such as a wireless communication link. Examples of such mobile device include a mobile phone device, a mobile tablet device, smart mobile devices, etc.
FIG. 1A shows a functional block diagram of an embodiment of a reference photo settings selection system 10, which provides reference photo setting information for use in taking a photo image of a current framed image with an electronic device (such as mobile device 20 as shown in FIG. 1B), according to an embodiment.
The system 10 comprises an imaging inspiration module 11 including an object-based recognition module 12 (FIG. 1B), a location-based information module 13 (FIG. 1B), a camera setting translation module 14 (FIG. 1B) and an image retrieval module 23 (FIG. 1B). The imaging inspiration module 11 utilizes mobile device hardware functionality including one or more of: an image capture device such as, e.g., a camera module 15, global positioning satellite (GPS) receiver module 16, compass module 17, and accelerometer and gyroscope module 18.
The camera module 15 is used to capture images of objects, such as people, surroundings, places, etc. The GPS module 16 is used to identify a current location of the mobile device 20 (i.e., user). The compass module 17 is used to identify direction of the mobile device. The accelerometer and gyroscope module 18 is used to identify tilt of the mobile device.
The system 10 determines current location and recognizes the subject matter currently being framed, and presents a row of related photo images on the display 21 of the mobile device 20 for selection, so that the selected photo image's camera settings can be used for taking a photo of the subject matter currently framed using an image capture device of the mobile device 20. The system 10 provides a simple, fluid, and responsive user experience.
Providing reference photo setting information for use in taking a photo image of a current framed image with an electronic device (such as mobile device 20 as shown in FIG. 1B) comprises integrating information including camera settings data (e.g., F-stop data, flash data, shutter speed data, lighting data, etc.), location data, sensor data (e.g., magnetic field, accelerometer, rotation vector), etc. For example, Google Android mobile operating system application programming interface (API) components providing such information may be employed.
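The integration described above can be sketched as merging camera-settings metadata with location and sensor readings into one capture-context record. This is a minimal illustration, not the Android API; the field names (`FNumber`, `azimuth`, etc.) and the `integrate_capture_context` helper are assumptions for the sketch.

```python
def integrate_capture_context(exif, gps, sensors):
    """Merge camera settings, location, and sensor readings into one record.

    All field names here are illustrative stand-ins, not actual API keys.
    """
    return {
        "f_stop": exif.get("FNumber"),           # e.g. 2.8
        "shutter_s": exif.get("ExposureTime"),   # e.g. 1/125 s stored as 0.008
        "flash": bool(exif.get("Flash", 0)),
        "lat": gps["lat"],
        "lon": gps["lon"],
        "azimuth_deg": sensors["azimuth"],       # from magnetic field / compass
        "tilt_deg": sensors["tilt"],             # from accelerometer/gyroscope
    }

ctx = integrate_capture_context(
    {"FNumber": 2.8, "ExposureTime": 0.008, "Flash": 1},
    {"lat": 34.1341, "lon": -118.3215},
    {"azimuth": 270.0, "tilt": 12.5},
)
```

A record of this shape is what later steps would search on and translate from.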
In one embodiment, images, image photo metadata, location data, compass data, object information, recognition results, and keyword information are located and obtained from services 19 provided by sources such as cloud environments, networks, servers, clients, mobile devices, etc. In one embodiment, the object-based recognition module 12 performs object recognition for objects being viewed in a current frame based on, for example, shape, size, outline, etc., by comparison with known objects stored, for example, in a database or storage repository.
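The comparison-with-known-objects step can be sketched as a nearest-match over stored shape/size features. This is a toy stand-in for real object recognition, under the assumption that each known object has a small numeric feature vector; the names and feature values are hypothetical.

```python
def recognize_object(frame_features, known_objects):
    """Return the stored object whose shape/size features are closest
    (squared distance) to the framed object's features."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    best = min(known_objects, key=lambda obj: dist(frame_features, obj["features"]))
    return best["name"]

known = [
    {"name": "Hollywood sign", "features": [0.9, 0.1, 0.4]},
    {"name": "sports stadium", "features": [0.2, 0.8, 0.7]},
]
match = recognize_object([0.85, 0.15, 0.35], known)
# match == "Hollywood sign"
```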
In one embodiment, the location-based information module 13 obtains the location of the mobile device 20 using the GPS module 16 and the information from the object-based recognition module 12. For example, based on the GPS location information and object-recognition information, the location-based information module 13 may determine that the location and place of the current photo frame is a sports stadium (e.g., based on the GPS data and the recognized object, the venue may be determined). Similarly, if the current frame encompasses a famous statue, based on GPS data and object recognition, the statue may be recognized and its location details (including elevation, angle, lighting, time of day, etc.) may be determined.
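Determining the venue from a GPS fix can be sketched as a nearest-neighbor lookup against a known-venue database using great-circle (haversine) distance. The venue list and the 1 km cutoff are assumptions for illustration.

```python
import math

def nearest_venue(lat, lon, venues, max_km=1.0):
    """Return the known venue nearest the GPS fix, or None if nothing
    lies within max_km. Haversine distance on a spherical Earth."""
    def haversine_km(lat1, lon1, lat2, lon2):
        r = 6371.0  # mean Earth radius in km
        p1, p2 = math.radians(lat1), math.radians(lat2)
        dp = math.radians(lat2 - lat1)
        dl = math.radians(lon2 - lon1)
        a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
        return 2 * r * math.asin(math.sqrt(a))
    best = min(venues, key=lambda v: haversine_km(lat, lon, v["lat"], v["lon"]))
    if haversine_km(lat, lon, best["lat"], best["lon"]) <= max_km:
        return best["name"]
    return None

venues = [{"name": "Dodger Stadium", "lat": 34.0739, "lon": -118.2400}]
print(nearest_venue(34.0745, -118.2405, venues))  # a fix ~100 m away matches
```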
Additionally, rotational information from the accelerometer and gyroscope module 18 may be used to determine the position or angle of the image capture device of the electronic mobile device 20.
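The tilt determination can be sketched from a raw accelerometer reading, under the simplifying assumption that the device is held still so the sensor reads only gravity; the axis convention and function name are illustrative, not the Android sensor API.

```python
import math

def tilt_degrees(ax, ay, az):
    """Approximate device pitch and roll (degrees) from a gravity-only
    accelerometer sample (ax, ay, az in m/s^2)."""
    pitch = math.degrees(math.atan2(-ax, math.hypot(ay, az)))
    roll = math.degrees(math.atan2(ay, az))
    return pitch, roll

# Device flat on its back: gravity lies entirely along the z axis.
p, r = tilt_degrees(0.0, 0.0, 9.81)
# p == 0.0 and r == 0.0
```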
In one embodiment, the image retrieval module 23 uses the information obtained from the object-based recognition module 12 and the location-based information module 13 to search multiple sources, such as photo databases, photographer databases, cloud storage facilities, etc., for preexisting photo images similar to the current photo frame of the image capture device on the mobile electronic device 20. Once similar preexisting photos have been located, the mobile electronic device 20 obtains the preexisting photos and their respective camera settings metadata via the transceiver 22. The retrieved images are displayed on the display 21 as thumbnails, for example, in a row. The user may then select an "inspirational" photo image and view its camera settings on the display 21 to determine whether to apply the same settings to the currently framed image.
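The retrieval step can be sketched as ranking stored photos by a combined subject-match and proximity score, then returning the top few as a thumbnail row. The database schema, scoring weights, and file names are hypothetical.

```python
def find_reference_photos(subject, lat, lon, photo_db, limit=4):
    """Rank stored photos by subject match plus crude proximity,
    returning the best thumbnails for display as a row."""
    def score(p):
        subject_match = 1.0 if p["subject"] == subject else 0.0
        proximity = 1.0 / (1.0 + abs(p["lat"] - lat) + abs(p["lon"] - lon))
        return subject_match + proximity
    ranked = sorted(photo_db, key=score, reverse=True)
    return [p["thumb"] for p in ranked[:limit]]

db = [
    {"subject": "Hollywood sign", "lat": 34.134, "lon": -118.321, "thumb": "t1.jpg"},
    {"subject": "beach", "lat": 33.985, "lon": -118.473, "thumb": "t2.jpg"},
    {"subject": "Hollywood sign", "lat": 34.135, "lon": -118.320, "thumb": "t3.jpg"},
]
row = find_reference_photos("Hollywood sign", 34.134, -118.321, db, limit=2)
# row == ["t1.jpg", "t3.jpg"]
```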
In one embodiment, the camera setting translation module 14 analyzes the camera settings information from the preexisting photo metadata, determines whether the information needs to be translated to settings the camera module 15 understands, and automatically changes the camera settings on the electronic mobile device 20. The user may then capture the currently framed image using the settings based on the selected preexisting photo image.
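One plausible translation strategy, sketched below, is to snap each reference-photo setting to the nearest value the local camera actually supports and to drop settings the camera cannot apply. The setting names and supported value lists are illustrative assumptions, not any real camera's capability table.

```python
def translate_settings(reference, supported):
    """Snap each reference-photo setting to the nearest locally supported
    value; settings with no local equivalent are dropped."""
    def nearest(value, choices):
        return min(choices, key=lambda c: abs(c - value))
    return {k: nearest(v, supported[k]) for k, v in reference.items() if k in supported}

supported = {
    "iso": [100, 200, 400, 800, 1600],
    "f_stop": [1.8, 2.8, 4.0, 5.6, 8.0],
}
applied = translate_settings({"iso": 250, "f_stop": 3.2, "gps": None}, supported)
# applied == {"iso": 200, "f_stop": 2.8}
```

The nearest-value snap keeps the applied exposure as close as possible to the reference photo's when the exact values are unavailable.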
As illustrated in FIG. 2A, in one embodiment, a displayed current frame 200 shows an image, such as the famous “Hollywood” sign in California. The current frame display 200 includes a live view tab 210. In FIG. 2B, a user taps on the live view tab 210 using the touch screen 24 and a menu 220 appears. In one embodiment, the menu 220 includes options, such as different display modes (e.g., full, hide, guideline, and inspire).
FIG. 2C shows the display 200 after a user has tapped and swiped the display using the touch screen 24 to highlight and select the inspire mode 225. FIG. 2D shows the display 200 with inspirational photo images 230 displayed as a row of thumbnails on the lower portion of the display 200. As illustrated, the first inspirational photo image 235 is tapped on for selection and for showing photo settings. FIG. 2E shows the camera settings 240 that were used in photographing the inspirational photo image 235. The user may then determine whether to apply the camera settings from the inspirational photo image 235 to the current framed image shown on the display 200. The inspire mode 225 may be used without leaving the live view display 200, which assists users in taking photos without having to look up information or determine settings based on lighting, angles, position, etc.
In one embodiment, a user aims an image capture device of a mobile device (e.g., smartphone, tablet, smart device) including the imaging inspiration module 11 towards a target object/subject, for example, an object, scene, or person(s) at a physical location, such as a city center, attraction, event, etc. that the user is visiting, and may use an inspirational photo image for automatically changing the camera settings for taking a new photo. The photo from the camera application (e.g., camera module 15) is processed by the mobile device 20 and displayed on a display 21 of the mobile device 20. In one embodiment, the new photo image may then be shared (e.g., by emailing, text messaging, uploading/pushing to a network, etc.) with others as desired.
FIG. 3 shows a flowchart of a process 300 of using reference photo setting information for taking a photo image of a current framed image, according to an embodiment. Process block 310 comprises using an electronic device for framing an image in a live view display. Process block 320 comprises identifying the location of the framed image. Process block 321 comprises identifying the subject matter of the framed image using object recognition. Process block 322 comprises identifying the angle of the framed image (i.e., camera angle) using accelerometer/gyroscope information. Process block 323 comprises searching photo image databases and sources for related photo images. Process block 330 comprises presenting image thumbnails for the related preexisting photo images on the live view display showing the current framed subject matter. Process block 331 comprises viewing camera settings metadata of a selected preexisting photo image and selecting whether to accept the camera settings for the currently viewed subject matter in the current frame.
Process block 340 comprises pulling the metadata information from the selected related photo image as current camera settings for the electronic device. Process block 341 comprises translating the metadata camera settings for use by the image capture device of the electronic device. Process block 342 comprises automatically changing the current camera settings of the electronic device based on the camera settings of the selected related photo image. Process block 350 comprises capturing the currently framed subject matter with the image capture device of the electronic device. Arrow 351 comprises storing the captured photo in a memory of the electronic device or sending the photo via text message, email, etc.
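The process blocks above can be sketched end to end as one pipeline. The `StubDevice` and `StubSources` classes, and every method name on them, are hypothetical stand-ins so the flow is runnable; real implementations would wrap the camera, sensor, and network modules described earlier.

```python
class StubDevice:
    """Minimal stand-in for the electronic device; all methods hypothetical."""
    def frame_live_view(self): return "frame"
    def locate(self): return (34.134, -118.321)
    def recognize(self, frame): return "Hollywood sign"
    def tilt(self): return 12.5
    def present_thumbnails(self, refs): return refs[0]   # user picks the first
    def translate(self, metadata): return dict(metadata)
    def apply_settings(self, settings): self.settings = settings
    def capture(self, frame): return ("photo", self.settings)

class StubSources:
    """Stand-in for photo databases / cloud sources."""
    def search(self, subject, location, angle):
        return [{"thumb": "t1.jpg", "metadata": {"iso": 200}}]

def inspire_and_capture(device, sources):
    frame = device.frame_live_view()                      # block 310
    location = device.locate()                            # block 320
    subject = device.recognize(frame)                     # block 321
    angle = device.tilt()                                 # block 322
    refs = sources.search(subject, location, angle)       # block 323
    chosen = device.present_thumbnails(refs)              # blocks 330-331
    settings = device.translate(chosen["metadata"])       # blocks 340-341
    device.apply_settings(settings)                       # block 342
    return device.capture(frame)                          # block 350

photo, used = inspire_and_capture(StubDevice(), StubSources())
# used == {"iso": 200}
```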
FIG. 4 is a high-level block diagram showing an information processing system comprising a computing system 500 implementing an embodiment. The system 500 includes one or more processors 511 (e.g., ASIC, CPU, etc.), and can further include an electronic display device 512 (for displaying graphics, text, and other data), a main memory 513 (e.g., random access memory (RAM)), storage device 514 (e.g., hard disk drive), removable storage device 515 (e.g., removable storage drive, removable memory module, a magnetic tape drive, optical disk drive, computer-readable medium having stored therein computer software and/or data), user interface device 516 (e.g., keyboard, touch screen, keypad, pointing device), and a communication interface 517 (e.g., modem, wireless transceiver (such as WiFi, Cellular), a network interface (such as an Ethernet card), a communications port, or a PCMCIA slot and card). The communication interface 517 allows software and data to be transferred between the computer system and external devices. The system 500 further includes a communications infrastructure 518 (e.g., a communications bus, cross-over bar, or network) to which the aforementioned devices/modules 511 through 517 are connected.
The information transferred via communications interface 517 may be in the form of signals such as electronic, electromagnetic, optical, or other signals capable of being received by communications interface 517, via a communication link that carries signals and may be implemented using wire or cable, fiber optics, a phone line, a cellular phone link, a radio frequency (RF) link, and/or other communication channels.
In one implementation, in a mobile wireless device such as a mobile phone, the system 500 further includes an image capture device such as a camera 15. The system 500 may further include application modules such as an MMS module 521, an SMS module 522, an email module 523, a social network interface (SNI) module 524, an audio/video (AV) player 525, a web browser 526, an image capture module 527, etc.
The system 500 further includes an imaging inspiration module 11 as described herein, according to an embodiment. In one implementation, the imaging inspiration module 11, along with an operating system 529, may be implemented as executable code residing in a memory of the system 500. In another embodiment, such modules are implemented in firmware, etc.
As is known to those skilled in the art, the aforementioned example architectures can be implemented in many ways, such as program instructions for execution by a processor, software modules, microcode, a computer program product on computer-readable media, analog/logic circuits, application-specific integrated circuits, firmware, consumer electronic devices, AV devices, wireless/wired transmitters, wireless/wired receivers, networks, multi-media devices, etc. Further, embodiments of said architecture can take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment containing both hardware and software elements.
Embodiments have been described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to one or more embodiments. Each block of such illustrations/diagrams, or combinations thereof, can be implemented by computer program instructions. The computer program instructions when provided to a processor produce a machine, such that the instructions, which execute via the processor, create means for implementing the functions/operations specified in the flowchart and/or block diagram. Each block in the flowchart/block diagrams may represent a hardware and/or software module or logic, implementing one or more embodiments. In alternative implementations, the functions noted in the blocks may occur out of the order noted in the figures, concurrently, etc.
The terms “computer program medium,” “computer usable medium,” “computer readable medium”, and “computer program product,” are used to generally refer to media such as main memory, secondary memory, removable storage drive, a hard disk installed in hard disk drive. These computer program products are means for providing software to the computer system. The computer readable medium allows the computer system to read data, instructions, messages or message packets, and other computer readable information from the computer readable medium. The computer readable medium, for example, may include non-volatile memory, such as a floppy disk, ROM, flash memory, disk drive memory, a CD-ROM, and other permanent storage. It is useful, for example, for transporting information, such as data and computer instructions, between computer systems. Computer program instructions may be stored in a computer readable medium that can direct a computer, other programmable data processing apparatus, or other devices to function in a particular manner, such that the instructions stored in the computer readable medium produce an article of manufacture including instructions which implement the function/act specified in the flowchart and/or block diagram block or blocks.
Computer program instructions representing the block diagram and/or flowcharts herein may be loaded onto a computer, programmable data processing apparatus, or processing devices to cause a series of operations performed thereon to produce a computer implemented process. Computer programs (i.e., computer control logic) are stored in main memory and/or secondary memory. Computer programs may also be received via a communications interface. Such computer programs, when executed, enable the computer system to perform the features of one or more embodiments as discussed herein. In particular, the computer programs, when executed, enable the processor and/or multi-core processor to perform the features of the computer system. Such computer programs represent controllers of the computer system. A computer program product comprises a tangible storage medium readable by a computer system and storing instructions for execution by the computer system for performing a method of one or more embodiments.
While the embodiments have been described with reference to certain versions thereof, other versions are possible. Therefore, the spirit and scope of the appended claims should not be limited to the description of the preferred versions contained herein.

Claims (26)

What is claimed is:
1. A method of using reference photo setting information for taking a photo image of a current framed image, comprising:
displaying a framed image from an image capture device of an electronic device;
performing object recognition for the framed image on a display of the electronic device;
identifying location information for the electronic device;
presenting one or more reference images related to the framed image based on one or more of the identified location information and object recognition;
selecting one of the reference images; and
using photo setting information used for capturing the selected reference image for capturing the framed image.
2. The method of claim 1, further comprising:
determining angle information for the framed image; and
searching image databases for the one or more reference images based on the identified location information, object recognition and angle information.
3. The method of claim 2, wherein performing object recognition comprises:
identifying subject matter of the framed image.
4. The method of claim 3, further comprising:
pulling the photo setting information into the electronic device;
translating the photo setting information from the selected reference image to photo setting information for the image capture device;
automatically changing photo settings of the image capture device; and
capturing the framed image using the changed photo settings.
5. The method of claim 4, wherein the one or more reference images are displayed as a row of thumbnail images.
6. The method of claim 5, wherein the photo setting information comprises image metadata of the selected reference image.
7. The method of claim 4, further comprising displaying the image metadata of the selected reference image.
8. The method of claim 1, wherein the photo setting information comprises one or more of orientation, aperture, shutter speed, focal length, metering mode, and International Organization for Standardization (ISO) speed information.
9. The method of claim 1, wherein the electronic device comprises a mobile electronic device.
10. The method of claim 9, wherein the mobile electronic device comprises a mobile phone.
11. An electronic device, comprising:
an image capture device;
a display; and
an imaging inspiration processor configured to provide photo setting information used for capturing a reference photo image for capturing a current framed image via the image capture device of the electronic device;
wherein the imaging inspiration processor is further configured to:
identify location information of the electronic device, perform object recognition of subject matter of the current framed image, search for reference photo images related to the current framed image, and present reference photo images found from the search for selection for providing the photo setting information used for capturing the reference photo image for capturing the current framed image.
12. The electronic device of claim 11, wherein the imaging inspiration processor is further configured to determine angle information for the current framed image; and searches databases for reference images based on the identified location information, object recognition of subject matter, and angle information.
13. The electronic device of claim 12, wherein the imaging inspiration processor is further configured to translate the photo setting information from the selected reference image to photo setting information for the image capture device, automatically changes photo settings of the image capture device, and captures the current framed image using the changed photo settings.
14. The electronic device of claim 13, wherein the imaging inspiration processor is further configured to arrange the reference images as a row of thumbnails.
15. The electronic device of claim 13, wherein the photo setting information comprises image metadata of the selected reference image.
16. The electronic device of claim 15, wherein the image metadata of the reference image is provided on the display.
17. The electronic device of claim 11, wherein the electronic device comprises a mobile electronic device.
18. A non-transitory computer-readable medium having instructions which when executed on a computer perform a method comprising:
displaying a framed image from an image capture device of an electronic device;
performing object recognition for the framed image on a display of the electronic device;
identifying location information for the electronic device;
presenting one or more reference images related to the framed image based on one or more of the identified location information and object recognition;
selecting one of the reference images; and
using photo setting information used for capturing the selected reference image for capturing the framed image.
19. The medium of claim 18, further comprising:
determining angle information for the framed image; and
searching image databases for the one or more reference images based on the identified location information, object recognition, and angle information.
20. The medium of claim 19, wherein performing object recognition comprises identifying subject matter of the framed image.
21. The medium of claim 20, further comprising:
pulling the photo setting information into the electronic device;
translating the photo setting information from the selected reference image to photo setting information for the image capture device;
automatically changing photo settings of the image capture device; and
capturing the framed image using the changed photo settings.
22. The medium of claim 21, wherein the one or more reference images are displayed as a row of thumbnail images, and the photo setting information comprises image metadata of the selected reference image.
23. The medium of claim 22, further comprising displaying the image metadata of the reference image.
24. The medium of claim 18, wherein the electronic device comprises a mobile electronic device.
25. A method for presenting a graphical user interface (GUI) on a display of an electronic device, comprising:
framing an image prior to capturing by an image capture device of the electronic device;
determining one or more selectable reference images related to the framed image based on one or more of identified location information and object recognition;
selecting a particular reference image; and
upon selection of the particular reference image, displaying photo setting information on the GUI for use in capturing the framed image.
26. The method of claim 25, wherein the one or more selectable reference images are displayed as thumbnails on the framed image.
US13/830,487 2012-06-08 2013-03-14 Conversion of camera settings to reference picture Expired - Fee Related US9020278B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US13/830,487 US9020278B2 (en) 2012-06-08 2013-03-14 Conversion of camera settings to reference picture

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201261657642P 2012-06-08 2012-06-08
US13/830,487 US9020278B2 (en) 2012-06-08 2013-03-14 Conversion of camera settings to reference picture

Publications (2)

Publication Number Publication Date
US20130330007A1 US20130330007A1 (en) 2013-12-12
US9020278B2 true US9020278B2 (en) 2015-04-28

Family

ID=49715371

Family Applications (1)

Application Number Title Priority Date Filing Date
US13/830,487 Expired - Fee Related US9020278B2 (en) 2012-06-08 2013-03-14 Conversion of camera settings to reference picture

Country Status (1)

Country Link
US (1) US9020278B2 (en)

Families Citing this family (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101303166B1 (en) * 2012-01-26 2013-09-09 엘지전자 주식회사 Mobile terminal and photo searching method thereof
US9020278B2 (en) * 2012-06-08 2015-04-28 Samsung Electronics Co., Ltd. Conversion of camera settings to reference picture
EP2873314B1 (en) * 2013-11-19 2017-05-24 Honda Research Institute Europe GmbH Control system for an autonomous garden tool, method and apparatus
US9269009B1 (en) * 2014-05-20 2016-02-23 Amazon Technologies, Inc. Using a front-facing camera to improve OCR with a rear-facing camera
US9986149B2 (en) 2015-08-14 2018-05-29 International Business Machines Corporation Determining settings of a camera apparatus
US9767590B2 (en) * 2015-10-23 2017-09-19 Apple Inc. Techniques for transforming a multi-frame asset into a single image
EP3457042A4 (en) * 2016-05-11 2019-05-01 Mitsubishi Electric Corporation Air conditioning visualization system
JP6938232B2 (en) * 2017-06-09 2021-09-22 キヤノン株式会社 Information processing equipment, information processing methods and programs
CN107483830B (en) * 2017-09-13 2020-08-11 惠州Tcl移动通信有限公司 Photo shooting control method and system based on mobile terminal and storage medium
US10387487B1 (en) 2018-01-25 2019-08-20 Ikorongo Technology, LLC Determining images of interest based on a geographical location

Patent Citations (35)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6370568B1 (en) * 1998-10-02 2002-04-09 Jeffrey Garfinkle Digital real time postcards including information such as geographic location or landmark
US20050280661A1 (en) * 2002-07-31 2005-12-22 Canon Kabushiki Kaisha Information presentation apparatus and information processing method thereof
US20060002590A1 (en) * 2004-06-30 2006-01-05 Borak Jason M Method of collecting information for a geographic database for use with a navigation system
US20070268392A1 (en) * 2004-12-31 2007-11-22 Joonas Paalasmaa Provision Of Target Specific Information
US7953295B2 (en) * 2006-06-29 2011-05-31 Google Inc. Enhancing text in images
US20080002916A1 (en) * 2006-06-29 2008-01-03 Luc Vincent Using extracted image text
US8098934B2 (en) * 2006-06-29 2012-01-17 Google Inc. Using extracted image text
US8031940B2 (en) * 2006-06-29 2011-10-04 Google Inc. Recognizing text in images using ranging data
US20080069404A1 (en) * 2006-09-15 2008-03-20 Samsung Electronics Co., Ltd. Method, system, and medium for indexing image object
US20080240694A1 (en) * 2007-03-28 2008-10-02 Sony Corporation Electronic device
US20080239133A1 (en) * 2007-03-29 2008-10-02 Cazier Robb P Image manipulator for a camera
US20080268876A1 (en) * 2007-04-24 2008-10-30 Natasha Gelfand Method, Device, Mobile Terminal, and Computer Program Product for a Point of Interest Based Scheme for Improving Mobile Visual Searching Functionalities
US20100194931A1 (en) * 2007-07-23 2010-08-05 Panasonic Corporation Imaging device
US8294813B2 (en) * 2007-07-23 2012-10-23 Panasonic Corporation Imaging device with a scene discriminator
US8531514B2 (en) * 2007-09-20 2013-09-10 Nec Corporation Image providing system and image providing method
US20130138685A1 (en) * 2008-05-12 2013-05-30 Google Inc. Automatic Discovery of Popular Landmarks
US20090324103A1 (en) * 2008-06-27 2009-12-31 Natasha Gelfand Method, apparatus and computer program product for providing image modification
US20100149367A1 (en) * 2008-12-17 2010-06-17 Samsung Digital Imaging Co., Ltd. Digital image signal processing apparatus and method of displaying scene recognition
US20100245596A1 (en) * 2009-03-27 2010-09-30 Motorola, Inc. System and method for image selection and capture parameter determination
US8726324B2 (en) * 2009-03-27 2014-05-13 Motorola Mobility Llc Method for identifying image capture opportunities using a selected expert photo agent
US20110069201A1 (en) * 2009-03-31 2011-03-24 Ryouichi Kawanishi Image capturing device, integrated circuit, image capturing method, program, and recording medium
US20100290699A1 (en) * 2009-05-15 2010-11-18 Google Inc. Landmarks from Digital Photo Collections
US20130286244A1 (en) * 2010-03-23 2013-10-31 Motorola Mobility Llc System and Method for Image Selection and Capture Parameter Determination
US20110311140A1 (en) * 2010-06-18 2011-12-22 Google Inc. Selecting Representative Images for Establishments
US8379912B2 (en) * 2010-06-18 2013-02-19 Google Inc. Identifying establishments in images
US8385593B2 (en) * 2010-06-18 2013-02-26 Google Inc. Selecting representative images for establishments
US8532333B2 (en) * 2010-06-18 2013-09-10 Google Inc. Selecting representative images for establishments
US8265400B2 (en) * 2010-06-18 2012-09-11 Google Inc. Identifying establishments in images
US20120020565A1 (en) * 2010-06-18 2012-01-26 Google Inc. Selecting Representative Images for Establishments
US20120084323A1 (en) * 2010-10-02 2012-04-05 Microsoft Corporation Geographic text search using image-mined data
US20130018881A1 (en) * 2011-07-15 2013-01-17 Apple Inc. Geo-Tagging Digital Images
US20130202209A1 (en) * 2012-02-08 2013-08-08 Sony Corporation Image processing device, image processing method, computer program and computer-readable recording medium
US20130288719A1 (en) * 2012-04-27 2013-10-31 Oracle International Corporation Augmented reality for maintenance management, asset management, or real estate management
US8761811B2 (en) * 2012-04-27 2014-06-24 Oracle International Corporation Augmented reality for maintenance management, asset management, or real estate management
US20130330007A1 (en) * 2012-06-08 2013-12-12 Samsung Electronics Co., Ltd. Conversion of camera settings to reference picture

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Anonymous, "Flickr", Feb. 10, 2004, Wikipedia, pp. 1-13, United States [downloaded from http://en.wikipedia.org/wiki/Flickr on Dec. 17, 2014].
Anonymous, "Google Earth", Jun. 11, 2005, Wikipedia, pp. 1-36, United States [downloaded from http://en.wikipedia.org/wiki/Google-Earth on Dec. 17, 2014].

Also Published As

Publication number Publication date
US20130330007A1 (en) 2013-12-12

Legal Events

Date Code Title Description
AS Assignment

Owner name: SAMSUNG ELECTRONICS CO., LTD., KOREA, REPUBLIC OF

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:KIM, BYOUNGJU;DESAI, PRASHANT;ALVAREZ, JESSE;AND OTHERS;SIGNING DATES FROM 20130312 TO 20130313;REEL/FRAME:030006/0793

STCF Information on status: patent grant

Free format text: PATENTED CASE

FEPP Fee payment procedure

Free format text: MAINTENANCE FEE REMINDER MAILED (ORIGINAL EVENT CODE: REM.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

LAPS Lapse for failure to pay maintenance fees

Free format text: PATENT EXPIRED FOR FAILURE TO PAY MAINTENANCE FEES (ORIGINAL EVENT CODE: EXP.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

STCH Information on status: patent discontinuation

Free format text: PATENT EXPIRED DUE TO NONPAYMENT OF MAINTENANCE FEES UNDER 37 CFR 1.362

FP Lapsed due to failure to pay maintenance fee

Effective date: 20190428