WO2018055617A1 - Method and system for managing images - Google Patents

Method and system for managing images

Info

Publication number
WO2018055617A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
screen display
computer
storage medium
container
Application number
PCT/IL2017/051062
Other languages
French (fr)
Inventor
Zohar POIZNER
Yair STEINMETZ
Alexander POIZNER
Original Assignee
Flyve Group Ltd.
Application filed by Flyve Group Ltd.
Publication of WO2018055617A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04845 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range for image manipulation, e.g. dragging, rotation, expansion or change of colour
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/62 Control of parameters via user interfaces
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/63 Control of cameras or camera modules by using electronic viewfinders
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/63 Control of cameras or camera modules by using electronic viewfinders
    • H04N23/631 Graphical user interfaces [GUI] specially adapted for controlling image capture or setting capture parameters
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/63 Control of cameras or camera modules by using electronic viewfinders
    • H04N23/633 Control of cameras or camera modules by using electronic viewfinders for displaying additional information relating to control or operation of the camera
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04817 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance using icons

Definitions

  • FIG. 3A shows a flow diagram, which details computer-implemented processes in accordance with embodiments of the disclosed subject matter. Reference is also made to elements shown in FIGs. 1A, 1B and 2.
  • The process and subprocesses of FIG. 3A include computerized processes performed by the smartphone 110, for example, the system 110' thereof.
  • the aforementioned processes and subprocesses are, for example, performed automatically or manually, or a combination thereof, and, for example, in real time.
  • the process begins at the START block 300.
  • the user 111 installs the program, e.g., APP 120, of the present invention locally on his smartphone 110, if this has not been done previously.
  • The process moves to block 304, where the image, for example, a photographic image or photograph, as viewed through the lens of the camera 110c or other imaging device/system of the smartphone 110 and displayed on the screen display, e.g., the touch screen 110x, is isolated.
  • This isolated image 402 is shown, for example, on a touch screen 110x of a smartphone 110, in FIG. 4A. Should the image not yet be isolated, the process holds at block 304 until the image from the camera 110c is isolated.
  • The process moves to block 306, where the CPU 202 receives a signal that there is a contact 406 with the touch screen 110x of the smartphone 110, as shown, for example, in FIG. 4B, holding the captured image 407 as a preview image of the photographic image (image 402).
  • This contact, for example, by a finger or instrument, with the image 402 on the touch screen 110x of the smartphone 110, is known as, and referred to herein as, a "hold".
  • The hold on the image 402 is monitored to determine whether the hold is present and remains on the image 402.
  • This monitoring and determination as to the existence of the hold is performed, for example, continuously. This can be, for example, at regular intervals, such as every 0.1 second, determining whether the held image, e.g., photograph 407, is still being held (or whether the hold has been broken).
  • The location of the image with respect to a designated location, such as a "save" location on the screen display, e.g., the touch screen 110x of the smartphone 110, is determined at the point (location) where the hold was detected and/or determined to be broken.
  • the "save" location is, for example, a dedicated area, which, for example, includes a container (or tray) for receiving images, such as photographic images and videos, for example, in a stacking arrangement.
  • The dedicated area, for example, including the container, may be displayed on the screen display of the smartphone 110 as a display box 410 (FIG. 4A), or as one or more albums 485a-485c (FIGs. 4L and 4M).
  • While the "save" location typically displays as a rectangular area, it may be of other shapes. Typically, the "save" location is an area below a designated preprogrammed line (typically not visible) on the touch screen 110x of the smartphone 110.
  • Should the hold be broken while the image is not at, over, or within the predetermined proximity to the "save" location, the process moves to block 314a, where (in accordance with system 110' rules and policies) the image is dismissed, for example, automatically deleted, as the image was never saved in storage media, e.g., storage media 211, of the smartphone 110.
  • the process then moves to block 322, where it ends.
  • At block 312, should at least a portion of the image 402/407, which was previously held (e.g., instantaneously prior to the hold being broken, as shown in FIG. 4C), be at least one of at, on, over, or within a predetermined proximity (distance) to the designated location, such as the "save" location 410 (FIG. 4C) (e.g., in accordance with system 110' rules and policies), when the hold was released, such that the image was dropped, for example, at the "save" location, into a container or the like, the process moves to block 314b.
  • The image was previously moved to the drop location or point at or proximate to the "save" location (where, for example, saving of the image will occur) by dragging the image while holding it, as shown by the contacts 406 and 406a-406c of FIGs. 4B and 4C, respectively.
  • Upon the hold being broken (such that the image is dropped at or proximate to the "save" location, where, for example, the image is to be saved), with at least a portion of the image (e.g., photograph) or video on, at, over, or in proximity to the "save" location (e.g., a visual or graphical representation thereof, including visual or graphical representations of containers, trays and display boxes) on the device screen display, such as the smartphone 110 touch screen 110x, the image (e.g., photograph) is taken or otherwise recorded (by the camera 110c, imaging system, or the like). Similarly, for a video, the video is taken, recorded, or completed.
  • The image 407, now, for example, in the container, is temporarily saved, e.g., in temporary storage (e.g., storage media 211).
  • This image 407, e.g., in the form of a photograph, is, for example, added to a stack (if this image is not the initial or first image), which, for example, as shown in FIGs. 4A-4C, already includes one image in the "save" location or container 410, as indicated by the number "1" therein.
  • This dragged and dropped image 402/407 (e.g., with a portion of the image at least one of at, on, or in a predetermined proximity to the "save" location or container 410) is shown in FIG.
  • The process may move to one or more of blocks 316, 318 and 320, where optional subprocesses may be performed, prior to the process ending at block 322.
  • At block 316, the optional subprocess is such that the image may be used with various social media (e.g., WhatsApp™ Messenger, Facebook®, LinkedIn™, Snapchat™), or another sending application, such as email, to be transmitted to other online recipients.
  • The user contacts an icon on the touch screen 110x, to access a menu 420, known as "Cam Sharing", as shown in FIG. 4E.
  • This "Cam Sharing" menu 420 includes options for saving frames, at box 432, saving video, at box 433, or more options, at box 434, as also shown in FIG. 4E.
  • To share an image, e.g., an image or video, the user selects the menu option by contacting block 434, so that social media and email applications appear on the screen 110x, in a series of icons 440, as shown in FIG. 4F.
  • From block 316, the process may move directly to block 322, where it ends, or on to block 318, depending on whether the user wants to perform additional operations on the image.
  • At block 318, a permanent save of the image may be performed. If a permanent save of the image is performed, the process may move directly to block 322, where it ends, or on to block 320, depending on whether the user wants to perform additional operations on the image.
  • At block 320, the system detects whether a deletion of the image, e.g., in the "save" location, has been performed. Whether or not the aforementioned deletion has been performed, the process moves to block 322, where it ends.
  • blocks 300 and 304-322 are also performed for videos. This process is similar to that performed for images, e.g., photographic images or photographs, but adds block 307 between blocks 306 and 308, with the flow pathway in broken lines.
  • the image isolated is that from the instant video and serves as the preview image, which is subject to a hold, at block 306.
  • At block 307, a second contact 460 is made on the touch screen 110x of the smartphone 110, as shown in FIG. 4G. Also in FIG. 4G, the previous or first contact 406', from which the preview image of the video was made, is shown. This second contact 460 is held and, once detected by the system 110', causes the system 110' to respond by creating the video. Alternatively, instead of the aforementioned two contacts, a video mode may be selected by the user contacting a visible button 470 on the touch screen 110x of the smartphone 110, as shown in FIG. 4H.
  • The process of block 308, for video, is shown.
  • The hold 475 on the video preview image 476 of the video 462 is shown on the screen display, e.g., touch screen 110x, of the smartphone 110.
  • FIG. 4J shows the video 462 being held and dragged 478a-478c to the "save" location 480, for example, a container, for saving (e.g., by dropping or otherwise releasing the hold, with the video 462, on, at, or in proximity to the "save" location 480), in accordance with blocks 312 and 314b.
  • the video 462 is stacked, as there is a previous video or image (e.g., photograph) in the container of the "save” location, indicated by "1".
  • FIG. 4K shows the video 462 in the container of the "save” location 480 as the second item in the stack, as indicated by the number "2" in the container of the "save” location 480, in accordance with block 314b.
  • The process of blocks 300 and 304-322 for images, such as photographic images or photographs, including the alternative process for images as videos (block 307), may also be performed for albums, another type of container.
  • Optional blocks 302a and 302b are added to both processes.
  • the process which includes the albums begins at the START block 300 as detailed above, and then moves to block 302a.
  • At block 302a, it is determined whether an album should be selected, for subsequent photos and/or videos.
  • The albums for selection are stored, for example, in the storage media 227 (FIG. 2). If no, at block 302a, the process moves to block 304, from where it resumes as detailed above. If yes, at block 302a, the process moves to block 302b.
  • At block 302b, a new or existing album is activated.
  • This album activation is shown, for example, in FIG. 4L, where albums 485a-485c are presented on the screen display, e.g., touch screen 110x, of the smartphone 110, for selection.
  • the album is accessed by a contact, drag, slide, tap, mouse click (if a desktop or laptop computer), other activation, such as face or eye movement recognition, or the like.
  • The new album 485a, which has been selected by the user contacting 487 the album (album graphic) 485a on the screen display, e.g., touch screen 110x, of the smartphone 110, is detected by the system 110'.
  • the system 110' responds to this detection by activating an album (e.g., album 485a).
  • FIG. 3B details an example process for operating on the albums once they are created; a sketch of this sharing flow appears after this list.
  • The process begins once an album is complete.
  • the process moves to block 352, where the system 110' receives a selection of users to receive the album, for example, as a digital file.
  • the users are, for example, social media "friends", such as those from Facebook®.
  • the process moves to block 354, where the selected users are added to a list to receive the album.
  • the album is sent to the selected users, from the system 110' (of the smartphone 110).
  • FIG. 5 is a flow diagram of another process in accordance with the invention.
  • the process begins at the START block 500.
  • the user 111 installs the program, e.g., APP 120, of the present invention locally on his smartphone 110, if this has not been done previously.
  • the functionalities of the APP 120 may be downloadable code from portable storage media, e.g., compact discs (CDs) or other media, or downloadable storage media, from servers and the like, over networks 50.
  • The process moves to block 502, where the image, for example, a photographic image or photograph, as viewed by the camera 110c or other imaging device/system of the smartphone 110, is isolated, and is displayed on the touch screen 110x of the smartphone 110. Should the image not yet be isolated, the process holds at block 502 until the image from the camera 110c is isolated.
  • The process moves to block 504, where the CPU 202 receives a signal that there is a contact with the touch screen 110x of the smartphone 110, holding the captured image as a preview image of the photographic image.
  • This contact, for example, is by a finger or instrument, with the image on the touch screen 110x of the smartphone 110.
  • the hold on the image is monitored to determine whether the hold is present, and remains on the image.
  • The process moves to block 512b.
  • At block 512b, the image is permanently saved in the device storage 211 or in the cloud. From block 512b, the process moves to block 514, where it ends.
  • The process of blocks 500-514 is also performed for videos. This process is similar to that performed for images, e.g., photographic images or photographs, but adds block 505 between blocks 504 and 506, with the flow pathway in broken lines.
  • The image which is isolated is that from the instant video and serves as the preview image, which is subject to a hold, at block 504. The process then moves to block 505.
  • At block 505, a second contact is made on the touch screen of the mobile device 110, to create the video, similar to that discussed for block 307 above.
  • Alternatively, the selection of a video is made by the user contacting a visible button for a video mode or operation on the screen display, e.g., the touch screen 110x of the smartphone 110.
  • Implementation of the method and/or system of embodiments of the invention can involve performing or completing selected tasks manually, automatically, or a combination thereof. Moreover, according to actual instrumentation and equipment of embodiments of the method and/or system of the invention, several selected tasks could be implemented by hardware, by software or by firmware or by a combination thereof using an operating system.
  • For example, selected tasks could be performed by a data processor, such as a computing platform for executing a plurality of instructions.
  • the data processor includes a volatile memory for storing instructions and/or data and/or a non-volatile storage, for example, non-transitory storage media such as a magnetic hard-disk and/or removable media, for storing instructions and/or data.
  • A non-transitory computer readable (storage) medium may be utilized in accordance with the above-listed embodiments of the present invention.
  • the non-transitory computer readable (storage) medium may be a computer readable signal medium or a computer readable storage medium.
  • a computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing.
  • a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
  • a computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof.
  • a computer readable signal medium may be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
  • each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s).
  • processes and portions thereof can be performed by software, hardware and combinations thereof. These processes and portions thereof can be performed by computers, computer-type devices, workstations, processors, micro-processors, other electronic searching tools and memory and other non-transitory storage-type devices associated therewith.
  • the processes and portions thereof can also be embodied in programmable non-transitory storage media, for example, compact discs (CDs) or other discs including magnetic, optical, etc., readable by a machine or the like, or other computer usable storage media, including magnetic, optical, or semiconductor storage, or other source of electronic signals.
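As a companion to the FIG. 3B passage above (receiving a selection of users at block 352, adding them to a list at block 354, and then sending the completed album), the following Kotlin fragment sketches that sharing flow. It is only an illustrative sketch: the type and function names, and the single `Sender` abstraction standing in for social media, messaging, or email transports, are assumptions and not part of the disclosure.

```kotlin
// Sketch of the FIG. 3B flow described above: once an album is complete, the
// system receives a selection of recipients (block 352), adds them to a list
// (block 354), and then sends the album, e.g., as a digital file.
// All names, and the Sender abstraction, are illustrative assumptions.

data class Recipient(val displayName: String)          // e.g., a social media "friend"
data class AlbumFile(val title: String, val itemCount: Int)

fun interface Sender {
    fun send(album: AlbumFile, to: List<Recipient>)
}

class AlbumSharing(private val sender: Sender) {
    private val recipients = mutableListOf<Recipient>()

    fun onRecipientsSelected(selected: Collection<Recipient>) {  // block 352
        recipients.addAll(selected)                              // block 354
    }

    fun sendAlbum(album: AlbumFile) {                            // the album is sent from the system 110'
        sender.send(album, recipients.toList())
    }
}

fun main() {
    val sharing = AlbumSharing(Sender { album, to ->
        println("sending '${album.title}' (${album.itemCount} items) to ${to.map { it.displayName }}")
    })
    sharing.onRecipientsSelected(listOf(Recipient("Friend A"), Recipient("Friend B")))
    sharing.sendAlbum(AlbumFile("Holiday 2017", itemCount = 12))
}
```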

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

Methods and systems are disclosed which allow users of computers, including mobile computers, such as smartphones, to decide whether or not an image, for example, a photograph or a video, is to be saved, before it is actually saved in the storage media of the computer.

Description

METHOD AND SYSTEM FOR MANAGING IMAGES
CROSS REFERENCES TO RELATED APPLICATIONS
This application is related to and claims priority from commonly owned US Provisional Patent Application Serial No. 62/396,836, entitled: Method and System for Managing Images, filed on September 20, 2016, the disclosure of which is incorporated by reference in its entirety herein.
TECHNICAL FIELD
The present invention is directed to methods and systems for managing images, in particular on computers, such as mobile devices, including smartphones.
BACKGROUND
As the use of cell phones, and in particular smartphones, continues to grow, the camera feature is becoming ever more prominent. Photographs and videos are taken by the camera and stored in the storage media on the smartphone. This process is automatic. Once a photograph or video is taken, it is automatically stored in the storage media and accessible through the touch screen of the smartphone. This photograph or video storage takes up memory in the storage media.
While some users delete the photograph or video, if unwanted, most users typically do not bother to do so. They typically do not care, or simply do not know how to delete unwanted photographs or videos. Even after a photograph or video has been deleted, not all of the memory space is free again once the deletion is complete.
SUMMARY OF THE INVENTION
The present invention provides methods, systems, and products, such as software products, for allowing a user of a computer, for example, a mobile computer, such as a smartphone, to determine whether or not an image, for example, a photograph or a video, is to be saved, before it is actually saved in the storage media of the computer, e.g., smartphone.
Embodiments of the present invention are directed to a method for managing images. The method comprises: obtaining an image which displays on a screen display of a computer; determining whether the image is held on the screen display; determining the location on the screen display where the image is held; and, performing an action on the image based on the location of the image on the display screen when the hold on the image on the screen display is determined to be broken. The action performed is such that: if the hold on the image is broken while at least a portion of the image is over or in a predetermined proximity to a designated location on the screen display where the image is to be saved, saving the image; and, if the hold on the image is broken while at least a portion of the image is not over or in the predetermined proximity to the designated location on the screen display where the image is to be saved, dismissing the image.
Optionally, the hold on the image on the screen display includes at least one of: a contact on the screen display over the image, when the screen display includes a touch screen; or, a mouse click on the image on the screen display, when the screen display is not a touch screen.
Optionally, the designated location on the screen display includes a dedicated area for receiving images.
Optionally, the dedicated area includes at least one container.
Optionally, the at least one container is represented on the screen display graphically. Optionally, the dedicated area includes a portion of the screen display.
Optionally, the saving the image includes adding the image to a stack of images in the at least one container.
Optionally, the at least one container is represented on the screen display as an album.
Optionally, the saving the image includes adding the image to a stack of images in the at least one container represented on the screen display as an album.
Optionally, the at least one container includes a plurality of containers and each container of the plurality of containers is represented on the screen display as an album.
Optionally, the dismissing of the image is performed without saving the image in storage media of the computer.
Optionally, the dismissing the image includes deleting the image.
Optionally, the obtaining the image is performed by an imaging device associated with the computer.
Optionally, the imaging device includes a camera and the computer includes a mobile computing device.
Optionally, the mobile computing device includes a smartphone. Optionally, the image includes at least one of a photographic image or a video.
Optionally, the obtaining the image, determining whether the image is held on the screen display; determining the location on the screen display where the image is held; and, performing an action on the image, is in real time.
Optionally, the determining whether the image is held on the screen display is performed continuously.
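Taken together, the hold-and-release behaviour recited above amounts to a geometric test on the position of the image at the moment the hold is broken. The Kotlin fragment below is a minimal sketch of that test only; the `Rect` type, the `decideAction` function, and the pixel-based proximity threshold are illustrative assumptions rather than anything prescribed by the disclosure.

```kotlin
// Minimal sketch of the save-or-dismiss decision made when the hold on an image
// is broken: save if any part of the image is over, or within a predetermined
// proximity of, the designated "save" location; otherwise dismiss it.
// All names and the pixel threshold are illustrative assumptions.

data class Rect(val left: Int, val top: Int, val right: Int, val bottom: Int) {
    fun intersects(other: Rect): Boolean =
        left < other.right && other.left < right &&
        top < other.bottom && other.top < bottom

    // Grow the rectangle by px on every side; used for the proximity test.
    fun inflate(px: Int) = Rect(left - px, top - px, right + px, bottom + px)
}

enum class Action { SAVE, DISMISS }

fun decideAction(imageBounds: Rect, saveArea: Rect, proximityPx: Int = 48): Action =
    if (imageBounds.intersects(saveArea.inflate(proximityPx))) Action.SAVE
    else Action.DISMISS

fun main() {
    val saveArea = Rect(0, 1700, 1080, 1920)        // e.g., a container near the bottom of the screen
    val droppedNear = Rect(200, 1650, 700, 1800)    // overlaps / is near the save area -> SAVE
    val droppedAway = Rect(200, 300, 700, 800)      // far from the save area -> DISMISS
    println(decideAction(droppedNear, saveArea))
    println(decideAction(droppedAway, saveArea))
}
```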
Embodiments of the present invention are directed to a system for managing images. The system comprises: computer components; and, a processor for executing the computer components. The computer components comprise: a first module for obtaining an image which displays on a screen display of a computer; a second module for determining whether the image is held on the screen display; a third module for determining the location on the screen display where the image is held; and, a fourth module for performing an action on the image based on the location of the image on the display screen when the hold on the image on the screen display is determined to be broken, such that: if the hold on the image is broken while at least a portion of the image is over or in a predetermined proximity to a designated location on the screen display where the image is to be saved, saving the image; and, if the hold on the image is broken while at least a portion of the image is not over or in the predetermined proximity to the designated location on the screen display where the image is to be saved, dismissing the image.
Optionally, the system additionally comprises an imaging system for providing the image for the first module.
Optionally, the imaging system includes a camera.
Optionally, the computer components, the processor and the imaging system are part of a mobile computer.
Optionally, the system additionally comprises storage media for saving the image.
Optionally, the image includes at least one of a photographic image or a video.
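Read as software, the four modules of the system described above can be pictured as small interfaces driven by the processor. The following is only a structural sketch under assumed names (`ImageObtainer`, `HoldDetector`, `HoldLocator`, `ActionPerformer`); the disclosure does not prescribe any particular API.

```kotlin
// Structural sketch of the four modules of the system; all interface and type
// names are assumptions made for illustration.

data class Point(val x: Int, val y: Int)
data class Image(val id: Long)

interface ImageObtainer {              // first module: obtain the displayed image
    fun currentImage(): Image?
}

interface HoldDetector {               // second module: is the image currently held?
    fun isHeld(image: Image): Boolean
}

interface HoldLocator {                // third module: where on the screen is it held?
    fun holdLocation(image: Image): Point
}

interface ActionPerformer {            // fourth module: save or dismiss once the hold breaks
    fun onHoldBroken(image: Image, releasePoint: Point)
}

// The "processor" simply drives the modules.
class ImageManager(
    private val obtain: ImageObtainer,
    private val holds: HoldDetector,
    private val locate: HoldLocator,
    private val act: ActionPerformer,
) {
    private var lastLocation: Point? = null

    // Called repeatedly, e.g., on every input event or timer tick.
    fun tick() {
        val image = obtain.currentImage() ?: return
        if (holds.isHeld(image)) {
            lastLocation = locate.holdLocation(image)             // track the hold while it lasts
        } else {
            lastLocation?.let { act.onHoldBroken(image, it) }     // hold broken: save or dismiss
            lastLocation = null
        }
    }
}
```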
Embodiments of the present invention are directed to a computer usable non-transitory storage medium having a computer program embodied thereon for causing a suitable programmed system to manage images, by performing the following steps when such program is executed on the system. The steps comprise: obtaining an image which displays on a screen display of a computer; determining whether the image is held on the screen display; determining the location on the screen display where the image is held; and, performing an action on the image based on the location of the image on the display screen when the hold on the image on the screen display is determined to be broken, such that: if the hold on the image is broken while at least a portion of the image is over or in a predetermined proximity to a designated location on the screen display where the image is to be saved, saving the image; and, if the hold on the image is broken while at least a portion of the image is not over or in the predetermined proximity to the designated location on the screen display where the image is to be saved, dismissing the image.
Optionally, the hold on the image on the screen display includes at least one of: a contact on the screen display over the image, when the screen display includes a touch screen; or, a mouse click on the image on the screen display, when the screen display is not a touch screen.
Optionally, the designated location on the screen display includes a dedicated area for receiving images.
Optionally, the dedicated area includes at least one container.
Optionally, the at least one container is represented on the screen display graphically. Optionally, the dedicated area includes a portion of the screen display.
Optionally, the saving the image includes adding the image to a stack of images in the at least one container.
Optionally, the at least one container is represented on the screen display as an album.
Optionally, the saving the image includes adding the image to a stack of images in the at least one container represented on the screen display as an album.
Optionally, the at least one container includes a plurality of containers and each container of the plurality of containers is represented on the screen display as an album.
Optionally, the dismissing of the image is performed without saving the image in storage media of the computer.
Optionally, the dismissing the image includes deleting the image.
Optionally, the obtaining the image is performed by an imaging device associated with the computer.
Optionally, the imaging device includes a camera and the computer includes a mobile computing device.
Optionally, the mobile computing device includes a smartphone.
Optionally, the image includes at least one of a photographic image or a video. Optionally, the obtaining the image, the determining whether the image is held on the screen display; the determining the location on the screen display where the image is held; and, the performing an action on the image, is in real time.
Optionally, the determining whether the image is held on the screen display is performed continuously.
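The continuous determination of whether the image is held could be implemented as a simple polling loop; the detailed passages above give checking at regular intervals, such as every 0.1 second, as one example. The loop structure and function names in this Kotlin sketch are illustrative assumptions.

```kotlin
// Sketch of the "determined continuously" behaviour: poll the hold state at a
// regular interval (0.1 second is the example given above) and, when the hold
// is found to be broken, hand off to the save/dismiss step.
// The polling structure and function names are illustrative assumptions.

fun monitorHold(
    isHeld: () -> Boolean,                  // e.g., backed by touch-screen or mouse state
    currentLocation: () -> Pair<Int, Int>,  // where the image is currently held
    onHoldBroken: (Pair<Int, Int>) -> Unit, // decide save vs dismiss from the release point
    intervalMs: Long = 100,                 // every 0.1 second, per the example in the text
) {
    var lastLocation = currentLocation()
    while (isHeld()) {
        lastLocation = currentLocation()    // remember where the image is being held
        Thread.sleep(intervalMs)
    }
    onHoldBroken(lastLocation)              // the hold is broken at this location
}

fun main() {
    // Simulate a hold that is released after five polls.
    var polls = 0
    monitorHold(
        isHeld = { polls++ < 5 },
        currentLocation = { 540 to 1750 },
        onHoldBroken = { println("hold broken at $it") },
    )
}
```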
This document references terms that are used consistently or interchangeably herein. These terms, including variations thereof, are as follows.
A "computer" includes machines, computers, and computing or computer systems (for example, physically separate locations or devices), servers, computer, computing, and computerized devices, processors, processing systems, computing cores (for example, shared devices), and similar systems, workstations, modules and combinations of the aforementioned. The aforementioned "computer" may be in various types, such as a personal computer (e.g., laptop, desktop, or tablet computer), or any type of computing device, including mobile devices, mobile computing devices and mobile computers, that can be readily transported from one location to another location, such as smartphones (cellular and network linked), smart bands, smart watches, virtual and augmented reality headsets, personal digital assistants (PDA).
A "server" is typically a remote computer or remote computer system, or computer program therein, in accordance with the "computer" defined above, that is accessible over a communications medium, such as a communications network or other computer network, including the Internet. A "server" provides services to, or performs functions for, other computer programs (and their users), in the same or other computers. A server may also include a virtual machine, a software based emulation of a computer.
An "application" or "APP", includes executable software, and optionally, any graphical user interfaces (GUI), through which certain functionality may be implemented.
The terms "contact", "touch", "swipe" , "drag", and or other activation including face or eye movement recognition, involve the activation of an activatable location on a screen display, for example, a touch screen of a computer or computerized device, such as a smartphone, tablet computer and the like. The activation is similar to that of a mouse or other pointing device, caused by a "click" of the pointing device at an activatable location of the screen display. The screen displays are activatable, for example, by activating, e.g., contacting, touching, swiping, an activatable graphic area, location button, or icon, that causes an action of the various software and or hardware, including that for executing applications and supporting the computer screen display.
Unless otherwise defined herein, all technical and/or scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which the invention pertains. Although methods and materials similar or equivalent to those described herein may be used in the practice or testing of embodiments of the invention, exemplary methods and/or materials are described below. In case of conflict, the patent specification, including definitions, will control. In addition, the materials, methods, and examples are illustrative only and are not intended to be necessarily limiting.
BRIEF DESCRIPTION OF DRAWINGS
Some embodiments of the present invention are herein described, by way of example only, with reference to the accompanying drawings. With specific reference to the drawings in detail, it is stressed that the particulars shown are by way of example and for purposes of illustrative discussion of embodiments of the invention. In this regard, the description taken with the drawings makes apparent to those skilled in the art how embodiments of the invention may be practiced.
Attention is now directed to the drawings, where like reference numerals or characters indicate corresponding or like components. In the drawings:
FIG. 1A is a diagram of an exemplary environment for the system in which embodiments of the disclosed subject matter are performed;
FIG. 1B is a diagram of the smartphone of FIG. 1A;
FIG. 2 is a diagram of the architecture of the application, as downloaded and running on a smartphone;
FIG. 3A is a flow diagram of an example process in accordance with embodiments of the present invention;
FIG. 3B is a flow diagram of an example process associated with albums created by the alternative process in FIG. 3A;
FIGs. 4A-4M are screen shots of processes and subprocesses of the flow diagrams of FIG. 3 and FIG. 5; and,
FIG. 5 is a flow diagram of another embodiment of the invention.
DETAILED DESCRIPTION OF THE DRAWINGS
Before explaining at least one embodiment of the invention in detail, it is to be understood that the invention is not necessarily limited in its application to the details of construction and the arrangement of the components and/or methods set forth in the following description and/or illustrated in the drawings. The invention is capable of other embodiments or of being practiced or carried out in various ways.
As will be appreciated by one skilled in the art, aspects of the present invention may be embodied as a system, method or computer program product. Accordingly, aspects of the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment (including firmware, resident software, micro-code, etc.) or an embodiment combining software and hardware aspects that may all generally be referred to herein as a "circuit," "module" or "system." Furthermore, aspects of the present invention may take the form of a computer program product embodied in one or more non-transitory computer readable (storage) medium(s) having computer readable program code embodied thereon.
Throughout this document, numerous textual and graphical references are made to trademarks, and domain names. These trademarks and domain names are the property of their respective owners, and are referenced only for explanation purposes herein.
Reference is now made to FIGs. 1A and 1B. FIG. 1A shows an operating environment for a non-limiting example of the present invention. The operating environment includes one or more networks 50 or communications networks. The network(s) 50 is, for example, a communications network, such as a Local Area Network (LAN), or a Wide Area Network (WAN), including public networks such as the Internet. As shown in FIG. 1A, the network 50 is, for example, the Internet. The network(s) 50, although shown as a single network, may be a combination of networks and/or multiple networks including, for example, the Internet and cellular networks.
An application server 100 is linked to the network. The application server 100 stores an application 120 (APP) in accordance with the present invention. The application server 100 utilizes hardware, software, processors and various storage media for performing its operations. The APP 120 is downloadable by a mobile computer, such as the smartphone 110 of a user 111 (representative of multitudes of smartphone users), for example, via a cellular network (represented by cellular tower 102) or via WIFI® 103. "Linked" as used herein includes both wired or wireless links, either direct or indirect, and placing the computers, including, servers, components and the like, in electronic and/or data communications with each other.
The smartphone 110, for example, is an iPhone® (which uses an iOS operating system, from Apple of Cupertino, California), a Samsung Galaxy® (which uses an Android® operating system), or any other commercially available smartphone. The smartphone 110 is capable of downloading and running various applications, such as the APP 120, and includes a camera 110c for imaging and image processing functions, such as those associated with images, e.g., photographs, videos and the like. The camera 110c and the imaging and image processing functions are typically standard on most smartphones. The smartphone also includes a screen display, which is, for example, a touch screen 110x, as shown in FIG. 1B.
Embodiments of the present invention may also be stored on non-transient storage media, represented, for example, by compact discs and the like. In this manner the invention is adapted to be operated by the computers, such as personal computers (PCs), of the user 111.
FIG. 2 shows a block diagram of the smartphone 110 (also referred to herein as a computer, mobile computer, mobile computing device, and/or mobile device; these terms are used interchangeably herein), which is an exemplary computer of the user 111, after the program, from either the APP 120 or the non-transitory storage media, has been downloaded and/or installed on the smartphone 110. This installation renders the smartphone 110 a system for performing the processes of the present invention, referred to hereinafter as the "system" (with the element number 110'), and transforms the smartphone (e.g., mobile computer) 110 into a special purpose computer.
The smartphone 110 architecture of the system 110', on which the present invention is performed, is shown in FIG. 2. The smartphone includes a CPU 202, Storage/Memory 204, an Operating System (OS) 206, a Network connection module 208, an Application (APP) Interface 210, and camera and imaging functionality 110c, as provided by a camera and imaging system. There is also storage media 211 on the smartphone 110, which, when linked to and in communication with the application 120, serves as temporary and/or permanent storage for images, e.g., photographic images, video and the like. There is also a module 212 for the operation of the screen display of the smartphone 110, which is, for example, a touch screen. The application (APP) 120, as downloaded onto the smartphone 110, includes computer components, for example, modules for performing the invention. These modules include those for image isolation 221, image previews 222, and creating "save" location(s) 223 for images, including the creating of containers (also referred to herein as "trays") for receiving the images, e.g., in photographic and/or video form, on the touch screen of the smartphone 110 or other mobile computer, e.g., a tablet, or on a computer monitor which is not a touch screen. The containers, for example, receive and save the images, e.g., in photographic and/or video form, by stacking them.
The computer components also include, for example, a save detection/saving function module 224, and a release detection/releasing on-screen function module 225 (releasing of a hold, for example, caused by a contact, touch, swipe, drag, or other activation, including face and/or eye movement recognition, on the touch screen 110x of the smartphone 110, or a mouse click if using a computer monitor without a touch screen), which are also part of the system 110'. The downloaded application 120 also includes a module 226 for album creation and operation, and storage media 227 for storage of the aforementioned albums. Modules 221-226 operate with the screen display/touch screen operation module 212.
The application (APP) 120 and its modules 221-226, and the storage media 227, are controlled by the CPU 202. Should the invention be used with a desktop or laptop computer, the functionalities of the APP 120 may be provided as downloadable code from portable storage media, e.g., compact discs (CDs) or other media, or downloaded from storage media of servers and the like, over the network(s) 50.
Attention is now directed to FIG. 3A, which shows a flow diagram detailing computer-implemented processes in accordance with embodiments of the disclosed subject matter. Reference is also made to elements shown in FIGs. 1A, 1B and 2. The processes and subprocesses of FIG. 3A include computerized processes performed by the smartphone 110, for example, the system 110' thereof. The aforementioned processes and subprocesses are, for example, performed automatically or manually, or a combination thereof, and, for example, in real time.
The process begins at the START block 300. At this time, the user 111 installs the program, e.g., the APP 120 of the present invention, locally on his smartphone 110, if this has not been done previously. The process moves to block 304, where the image, for example, a photographic image or photograph, as viewed through the lens of the camera 110c or other imaging device/system of the smartphone 110 and displayed on the screen display, e.g., the touch screen 110x, is isolated. This isolated image 402 is shown, for example, on the touch screen 110x of a smartphone 110, in FIG. 4A. Should the image not yet be isolated, the process holds at block 304 until the image from the camera 110c is isolated.
From block 304, the process moves to block 306, where the CPU 202 receives a signal that there is a contact 406 with the touch screen 110x of the smartphone 110, as shown, for example, in FIG. 4B, holding the captured image 407 as a preview image of the photographic image (image 402). This contact, for example, by a finger or instrument, with the image 402 on the touch screen 110x of the smartphone 110, is known, and referred to herein, as a "hold".
Moving to block 308, the hold on the image 402 is monitored, to determine whether the hold is present and remains on the image 402. This monitoring and determination as to the existence of the hold is performed, for example, continuously. For example, it is checked at regular intervals, such as every 0.1 second, whether the held image, e.g., photograph 407, is still being held (i.e., whether the hold has not been broken).
At block 310, it is determined whether the hold being monitored has been broken. If the hold is not detected as broken, the process returns to block 308, from where it resumes. If yes, a break of the hold has been detected, and the process moves to block 312.
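By way of a non-limiting illustration only, the monitoring of blocks 308 and 310 could be modeled as a periodic check of a hold flag, polled, for example, every 0.1 second as noted above. The following Kotlin sketch is hypothetical; the names, the polling approach, and the callback are assumptions, and the touch-event plumbing of a real device is omitted.

```kotlin
// Hypothetical sketch of blocks 308-310: monitor a hold until it is broken.
// HoldState, the poll interval, and the callback are illustrative assumptions.

class HoldState {
    @Volatile var isHeld: Boolean = false   // set true on contact (block 306), false on release
    @Volatile var lastX: Float = 0f         // last known position of the contact
    @Volatile var lastY: Float = 0f
}

fun monitorHold(hold: HoldState, onHoldBroken: (x: Float, y: Float) -> Unit) {
    // Block 308: check at regular intervals (e.g., every 0.1 second)
    // whether the held image is still being held.
    while (hold.isHeld) {
        Thread.sleep(100L)                  // 0.1 second polling interval
    }
    // Block 310: the hold has been broken; report where it was released (used in block 312).
    onHoldBroken(hold.lastX, hold.lastY)
}
```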
At block 312, the location of the image with respect to a designated location, such as a "save" location on the screen display, e.g., the touch screen 110x of the smartphone 110, is determined, at the point (location) where the hold was detected and/or determined to be broken. The "save" location is, for example, a dedicated area, which, for example, includes a container (or tray) for receiving images, such as photographic images and videos, for example, in a stacking arrangement. The dedicated area, for example, including the container, may be displayed on the screen display of the smartphone 110 as a display box 410 (FIG. 4A), or as one or more albums 485a-485c (FIGs. 4L and 4M). While the "save" location typically displays as a rectangular area, it may be of other shapes. However, typically, the "save" location is an area below a designated preprogrammed line (typically not visible) on the touch screen 110x of the smartphone 110.
At block 312, should the image 402, or at least a portion thereof, not have been moved (e.g., by dragging) to, or otherwise dropped at, a location or position that is at least one of at, on, over, or within a predetermined proximity (distance) to the designated location, such as the "save" location 410, the process moves to block 314a, where (in accordance with system 110' rules and policies) the image is dismissed, for example, automatically deleted, as the image was never saved in storage media, e.g., storage media 211, of the smartphone 110. The process then moves to block 322, where it ends.
Alternately, at block 312, should at least a portion of the image 402/407 (FIG. 4C), which was previously held (e.g., instantaneously prior to the hold being broken, e.g., as shown in FIG. 4C), be at least one of at, on, over, or within a predetermined proximity (distance) to the designated location, such as the "save" location 410 (FIG. 4C) (e.g., in accordance with system 110' rules and policies), when the hold was released, such that the image was dropped, for example, at the "save" location, into a container or the like, the process moves to block 314b. The image was previously moved to the drop location or point at or proximate to the "save" location (where, for example, saving of the image will occur) by dragging the image while holding it, as shown by the contacts 406 and 406a-406c of FIGs. 4B and 4C, respectively. By the hold being broken (such that the image is dropped at or proximate to the "save" location, where, for example, the image is to be saved), with at least a portion of the image (e.g., photograph) or video at least one of on, at, over, or in proximity to the "save" location (e.g., a visual or graphical representation thereof, including visual or graphical representations of containers, trays and display boxes) on the device screen display, such as the smartphone 110 touch screen 110x, the image (e.g., photograph) is taken or otherwise recorded (by the camera 110c, imaging system, or the like). Similarly, for a video, the video is taken, recorded, or completed.
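The decision at block 312, namely whether the release point of the hold is at, on, over, or within the predetermined proximity of the "save" location, could be illustrated by a simple hit test, as in the hedged Kotlin sketch below. The rectangle type, the pixel-based proximity threshold, and the callback names are assumptions made only for this illustration, not the application's actual code.

```kotlin
import kotlin.math.hypot

// Hypothetical hit test for block 312: was the image dropped at, on, over,
// or within a predetermined proximity (distance) of the "save" location 410?

data class Region(val left: Float, val top: Float, val right: Float, val bottom: Float)

fun distanceToRegion(x: Float, y: Float, r: Region): Float {
    val dx = maxOf(r.left - x, 0f, x - r.right)   // horizontal distance outside the region
    val dy = maxOf(r.top - y, 0f, y - r.bottom)   // vertical distance outside the region
    return hypot(dx, dy)                          // zero when the point is inside the region
}

fun onHoldBroken(
    dropX: Float, dropY: Float,        // release point reported at block 310
    saveLocation: Region,              // the "save" location / container 410
    proximityPx: Float,                // predetermined proximity, assumed here in pixels
    save: () -> Unit,                  // block 314b: temporarily save (e.g., stack in container)
    dismiss: () -> Unit                // block 314a: dismiss (e.g., delete) the image
) {
    if (distanceToRegion(dropX, dropY, saveLocation) <= proximityPx) save() else dismiss()
}
```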
At block 314b, the image 407, now, for example, in the container, is temporarily saved, e.g., in temporary storage (e.g., storage media 211). This image 407, e.g., in the form of a photograph, is, for example, added to a stack (if this image is not the initial or first image), which, for example as shown in FIGs. 4A-4C, already includes one image in the "save" location or container 410, as indicated by the number "1" therein. This dragged and dropped image 402/407 at the "save" location or container 410 (e.g., with a portion of the image at least one of at, on, or in a predetermined proximity to it), as shown in FIG. 4D, now appears, for example, as the top image, e.g., photograph, on the stack (indicated by the number "2" on the image at the "save" location 410, as also shown in FIG. 4D). From block 314b, the process may move to block 322, where it ends. From this end, the process may start again at block 300.
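The stacking of block 314b, where the dropped image becomes the top item of the container and the number displayed on the "save" location increments (e.g., from "1" to "2" as in FIGs. 4A-4D), could be modeled as below. The class and property names are hypothetical and serve only to illustrate the stacking behavior.

```kotlin
// Hypothetical sketch of block 314b: temporarily store a dropped image by stacking it
// in the container; the number shown on the container 410 is simply the stack size.

class ImageContainer {
    private val stack = ArrayDeque<ByteArray>()   // images, newest on top
    val count: Int get() = stack.size             // badge displayed on the "save" location

    fun addToTop(image: ByteArray): Int {
        stack.addLast(image)                      // the dropped image appears as the top of the stack
        return count
    }
}

fun main() {
    val container = ImageContainer()
    container.addToTop(ByteArray(0))              // first image: container shows "1"
    val shown = container.addToTop(ByteArray(0))  // dragged-and-dropped image: container shows "2"
    println("Container badge: $shown")            // prints: Container badge: 2
}
```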
Alternately, at block 314b, the process may move to one or more of blocks 316, 318 and 320, where optional subprocesses may be performed, prior to the process ending at block 322.
At block 316, the optional subprocess is such that the image may be used with various social media (e.g., WhatsApp™ Messenger, Facebook®, LinkedIn™, Snapchat™), or another sending application, such as email, to be transmitted to other online recipients. For example, the user contacts an icon on the touch screen 110x to access a menu 420, known as "Cam Sharing", as shown in FIG. 4E. This "Cam Sharing" menu 420 includes options for saving frames, at box 432, saving video, at box 433, or more options, at box 434, as also shown in FIG. 4E. For example, when the user decides to send an image, e.g., a photograph or video, he selects the menu option, by contacting block 434, so that social media and email applications appear on the screen 110x, in a series of icons 440, as shown in FIG. 4F. The image, e.g., photograph, can be sent via any of these social media, e.g., WhatsApp™ Messenger, Facebook®, LinkedIn™, Snapchat™, and the like. If a social media/send application is employed, the process may move directly to block 322, where it ends, or on to block 318, depending on whether the user wants to perform additional operations on the image.
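On an Android device, for example, handing the image off to a social media or email application (block 316) could follow the platform's standard share pattern, sketched below. The MIME type, the image URI handling, and the chooser title are assumptions; the sketch shows a general pattern only, not the application's actual sharing code.

```kotlin
import android.content.Context
import android.content.Intent
import android.net.Uri

// Hypothetical Android sketch of block 316: send the saved image to a social
// media or email application selected from the icons 440.

fun shareImage(context: Context, imageUri: Uri) {
    val sendIntent = Intent(Intent.ACTION_SEND).apply {
        type = "image/jpeg"                               // assumed MIME type of the saved image
        putExtra(Intent.EXTRA_STREAM, imageUri)           // the image (e.g., photograph) to send
        addFlags(Intent.FLAG_GRANT_READ_URI_PERMISSION)   // let the receiving app read the URI
    }
    // Presents the installed share targets (messaging, email, social media, etc.).
    context.startActivity(Intent.createChooser(sendIntent, "Share image"))
}
```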
At block 318, a permanent save of the image may be performed. If a permanent save of the image is performed, the process may move directly to block 322, where it ends, or on to block 320, depending on whether the user wants to perform additional operations on the image.
At block 320, the system detects whether a deletion of the image, e.g., in the "save" location, has been performed. Whether or not the aforementioned deletion has been performed, the process moves to block 322, where it ends.
The process of blocks 300 and 304-322 is also performed for videos. This process is similar to that performed for images, e.g., photographic images or photographs, but adds block 307 between blocks 306 and 308, with the flow pathway in broken lines. At block 304, the image isolated is that from the instant video and serves as the preview image, which is subject to a hold, at block 306.
At block 307, a second contact 460 is made on the touch screen 110x of the smartphone 110, as shown in FIG. 4G. Also in FIG. 4G, the previous or first contact 406', from which the preview image of the video was made, is shown. This second contact 460 is held and, once detected by the system 110', causes the system 110' to respond by creating the video. Alternately, instead of the aforementioned two contacts, a video mode may be selected by the user contacting a visible button 470 on the touch screen 110x of the smartphone 110, as shown in FIG. 4H. With the two contacts 406', 460 detected by the system 110', or the selection of the video mode by the contacting (pressing) of the visible button 470 on the touch screen 110x (FIG. 4H) being detected by the system 110', the video is now created.
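As one possible, non-limiting illustration of block 307 on an Android touch screen, the second contact 460 could be recognized by watching the pointer count of incoming touch events while the preview is held, as sketched below. The listener is a simplified assumption; a real implementation would also track pointer ids, gesture cancellation, and the alternative visible video-mode button 470.

```kotlin
import android.view.MotionEvent
import android.view.View

// Hypothetical Android sketch of block 307: a second finger placed on the screen,
// while the preview image is already held, starts creation of the video.

class TwoContactVideoTrigger(private val startVideo: () -> Unit) : View.OnTouchListener {
    private var previewHeld = false

    override fun onTouch(v: View, event: MotionEvent): Boolean {
        when (event.actionMasked) {
            MotionEvent.ACTION_DOWN -> previewHeld = true                  // first contact 406' (the hold)
            MotionEvent.ACTION_POINTER_DOWN ->
                if (previewHeld && event.pointerCount == 2) startVideo()   // second contact 460
            MotionEvent.ACTION_UP, MotionEvent.ACTION_CANCEL -> previewHeld = false  // hold released
        }
        return true
    }
}
```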
The process then moves from block 307 to block 308, where it resumes from block 308 as detailed above, except the aforementioned video is being processed in the same or similar manner as the image (photographic image).
FIG. 4I shows the process of block 308 for video. Here, the hold 475 on the video preview image 476 of the video 462 is shown on the screen display, e.g., the touch screen 110x, of the smartphone 110.
FIG. 4J shows the video 462 being held and dragged 478a-478c to the "save" location 480, for example, a container, for saving (e.g., by dropping or otherwise releasing the hold, with the video 462 on, at, or in proximity to the "save" location 480), in accordance with blocks 312 and 314b. The video 462 is stacked, as there is a previous video or image (e.g., photograph) in the container of the "save" location, indicated by "1".
FIG. 4K shows the video 462 in the container of the "save" location 480 as the second item in the stack, as indicated by the number "2" in the container of the "save" location 480, in accordance with block 314b.
The process of blocks 300 and 304-322 for images, such as photographic images or photographs, including the alternative process for images as videos (including block 307), may also be performed for albums, another type of container. In this case, optional blocks 302a and 302b are added to both processes.
The process which includes the albums begins at the START block 300, as detailed above, and then moves to block 302a. At block 302a, it is determined whether an album should be selected for subsequent photos and/or videos. The albums for selection are stored, for example, in the storage media 227 (FIG. 2). If no, at block 302a, the process moves to block 304, from where it resumes as detailed above. If yes, at block 302a, the process moves to block 302b.
At block 302b, a new or existing album is activated. This album activation is shown, for example, in FIG. 4L, where albums 485a-485c are presented on the screen display, e.g., the touch screen 110x, of the smartphone 110, for selection. The album is accessed by a contact, drag, slide, tap, mouse click (if a desktop or laptop computer), or other activation, such as face or eye movement recognition, or the like. For example, as shown in FIG. 4M, the new album 485a, which has been selected by the user contacting 487 the album (album graphic) 485a on the screen display, e.g., the touch screen 110x, of the smartphone 110, is detected by the system 110'. The system 110' responds to this detection by activating the album (e.g., album 485a). With the album activated, the process moves to block 304, from where it resumes as detailed above.
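The album selection of blocks 302a and 302b could be modeled, for illustration only, as choosing one of the containers held in storage media 227 and marking it active, so that later drops at the "save" location (block 314b) are stacked into that album. The class and method names below are assumptions, not the application's actual code.

```kotlin
// Hypothetical sketch of blocks 302a/302b: activate a new or existing album
// (another type of container) so subsequent photos and videos are saved into it.

class Album(val title: String) {
    val items = mutableListOf<ByteArray>()          // stacked images and videos
}

class AlbumManager {
    private val albums = mutableListOf<Album>()     // e.g., albums 485a-485c (storage media 227)
    var activeAlbum: Album? = null                  // null until block 302b activates one
        private set

    fun createAndActivate(title: String): Album =
        Album(title).also { albums += it; activeAlbum = it }

    fun activateExisting(title: String): Album? =
        albums.firstOrNull { it.title == title }?.also { activeAlbum = it }

    // Called from block 314b when an image is dropped at the "save" location.
    fun saveToActiveAlbum(image: ByteArray) {
        activeAlbum?.items?.add(image)
    }
}
```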
FIG. 3B details an example process for operating on the albums once they are created. At block 350 an album is complete. The process moves to block 352, where the system 110' receives a selection of users to receive the album, for example, as a digital file. The users are, for example, social media "friends", such as those from Facebook®. The process moves to block 354, where the selected users are added to a list to receive the album. At block 356, the album is sent to the selected users, from the system 110' (of the smartphone 110).
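On an Android device, for example, sending the completed album to the selected users (blocks 352-356) could reuse the platform's multi-item share pattern, as in the hedged sketch below; the selection of recipients (e.g., social media "friends") is outside the sketch, and the function name and MIME type are assumptions.

```kotlin
import android.content.Context
import android.content.Intent
import android.net.Uri

// Hypothetical Android sketch of blocks 352-356: send a finished album, as a set of
// image/video URIs, to an application that delivers it to the selected users.

fun sendAlbum(context: Context, albumItemUris: ArrayList<Uri>) {
    val intent = Intent(Intent.ACTION_SEND_MULTIPLE).apply {
        type = "image/*"                                                 // assumed MIME type for the album items
        putParcelableArrayListExtra(Intent.EXTRA_STREAM, albumItemUris)  // the album contents
        addFlags(Intent.FLAG_GRANT_READ_URI_PERMISSION)
    }
    context.startActivity(Intent.createChooser(intent, "Send album"))
}
```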
FIG. 5 is a flow diagram of another process in accordance with the invention.
The process begins at the START block 500. At this time, the user 111 installs the program, e.g., the APP 120 of the present invention, locally on his smartphone 110, if this has not been done previously. Should the computer not be a smartphone, but a desktop or laptop computer, the functionalities of the APP 120 may be provided as downloadable code from portable storage media, e.g., compact discs (CDs) or other media, or downloaded from storage media of servers and the like, over the network(s) 50.
The process moves to block 502, where the image, for example, a photographic image or photograph, as viewed by the camera 110c or other imaging device/system of the smartphone 110, is isolated, and is displayed on the touch screen 110x of the smartphone 110. Should the image not yet be isolated, the process holds at block 502 until the image from the camera 110c is isolated.
From block 502, the process moves to block 504, where the CPU 202 receives a signal that there is a contact with the touch screen 110x of the smartphone 110, holding the captured image as a preview image of the photographic image. This contact, for example, is by a finger or instrument, with the image on the touch screen 110x of the smartphone 110.
Moving to block 506, the hold on the image is monitored to determine whether the hold is present, and remains on the image. At block 508, it is determined whether the hold being monitored has been broken. If no, the process returns to block 506, from where it resumes. If yes, the hold is broken and the process moves to block 510.
At block 510, should the image, or at least a portion thereof, not have been moved (e.g., by dragging) to, and dropped at, a location on the screen display, e.g., the smartphone 110 touch screen 110x, which is at, on, over or within a predetermined proximity (distance) of the "save" location on the screen display, the process moves to block 512a. At block 512a, the image, e.g., photographic image, is dismissed, for example, automatically deleted, as it was never saved in the storage media 211 (FIG. 2) of the smartphone 110. From block 512a, the process moves to block 514, where it ends.
Alternately, at block 510, should at least a portion of the image, which was previously held (e.g., instantaneously prior to the hold being broken), be, or be dropped, at least one of at, on, over or within a predetermined proximity (distance) to the "save" location (e.g., in accordance with system 110' rules and policies), for example, having been dragged to this position on the touch screen 110x by the hold, and the image subsequently dropped, the process moves to block 512b. At block 512b, the image is permanently saved in the device storage 211 or on the cloud. From block 512b, the process moves to block 514, where it ends.
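The permanent save of block 512b, placing the dropped image in the device storage 211 (or, for example, uploading it to a cloud service), could be sketched as a simple write into a storage directory, as below. The directory, the file naming scheme, and the omission of the cloud path are assumptions for illustration; a real implementation might instead use the platform's media store or a cloud SDK.

```kotlin
import java.io.File

// Hypothetical sketch of block 512b: persist the dropped image permanently
// in the device's storage (cloud upload omitted from this sketch).

fun saveImagePermanently(imageBytes: ByteArray, storageDir: File): File {
    val target = File(storageDir, "img_${System.currentTimeMillis()}.jpg")  // assumed file name
    target.writeBytes(imageBytes)                                           // write into device storage (e.g., storage 211)
    return target
}
```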
The process of blocks 500-514 is also performed for videos. This process is similar to that performed for images, e.g., photographic images or photographs, but adds block 505 between blocks 504 and 506, with the flow pathway in broken lines. At block 502, the image, which is isolated, is that from the instant video and serves as the preview image, which is subject to a hold, at block 504. The process then moves to block 505.
At block 505, a second contact is made on the touch screen of the mobile device 110, to create the video, similar to that discussed for block 307 above. Alternately, also similar to block 307 above, the selection of a video is made by the user contacting a visible button for a video mode or operation on the screen display, e.g., the touch screen 110x of the smartphone 110. With the two contacts, or the video mode contact selection, having been detected by the system 110', in accordance with block 307 above, the process moves from block 505 to block 506, where it resumes as detailed above, with the aforementioned video being processed in the same or similar manner as that for the image (photograph or photographic image).
Implementation of the method and/or system of embodiments of the invention can involve performing or completing selected tasks manually, automatically, or a combination thereof. Moreover, according to actual instrumentation and equipment of embodiments of the method and/or system of the invention, several selected tasks could be implemented by hardware, by software or by firmware or by a combination thereof using an operating system.
For example, hardware for performing selected tasks according to embodiments of the invention could be implemented as a chip or a circuit. As software, selected tasks according to embodiments of the invention could be implemented as a plurality of software instructions being executed by a computer using any suitable operating system. In an exemplary embodiment of the invention, one or more tasks according to exemplary embodiments of method and/or system as described herein are performed by a data processor, such as a computing platform for executing a plurality of instructions. Optionally, the data processor includes a volatile memory for storing instructions and/or data and/or a non-volatile storage, for example, non-transitory storage media such as a magnetic hard-disk and/or removable media, for storing instructions and/or data.
For example, any combination of one or more non-transitory computer readable (storage) medium(s) may be utilized in accordance with the above -listed embodiments of the present invention. The non-transitory computer readable (storage) medium may be a computer readable signal medium or a computer readable storage medium. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples (a non-exhaustive list) of the computer readable storage medium would include the following: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this document, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. A computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
As will be understood with reference to the paragraphs and the referenced drawings, provided above, various embodiments of computer-implemented methods are provided herein, some of which can be performed by various embodiments of apparatuses and systems described herein and some of which can be performed according to instructions stored in non-transitory computer-readable storage media described herein. Still, some embodiments of computer- implemented methods provided herein can be performed by other apparatuses or systems and can be performed according to instructions stored in computer-readable storage media other than that described herein, as will become apparent to those having skill in the art with reference to the embodiments described herein. Any reference to systems and computer-readable storage media with respect to the following computer-implemented methods is provided for explanatory purposes, and is not intended to limit any of such systems and any of such non-transitory computer-readable storage media with regard to embodiments of computer-implemented methods described above. Likewise, any reference to the following computer-implemented methods with respect to systems and computer-readable storage media is provided for explanatory purposes, and is not intended to limit any of such computer-implemented methods disclosed herein.
The flowchart and block diagrams in the Figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present invention. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s).
It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems that perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The descriptions of the various embodiments of the present invention have been presented for purposes of illustration, but are not intended to be exhaustive or limited to the embodiments disclosed. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the described embodiments. The terminology used herein was chosen to best explain the principles of the embodiments, the practical application or technical improvement over technologies found in the marketplace, or to enable others of ordinary skill in the art to understand the embodiments disclosed herein.
As used herein, the singular form "a", "an" and "the" include plural references unless the context clearly dictates otherwise.
It is appreciated that certain features of the invention, which are, for clarity, described in the context of separate embodiments, may also be provided in combination in a single embodiment. Conversely, various features of the invention, which are, for brevity, described in the context of a single embodiment, may also be provided separately or in any suitable subcombination or as suitable in any other described embodiment of the invention. Certain features described in the context of various embodiments are not to be considered essential features of those embodiments, unless the embodiment is inoperative without those elements.
The above-described processes including portions thereof can be performed by software, hardware and combinations thereof. These processes and portions thereof can be performed by computers, computer-type devices, workstations, processors, micro-processors, other electronic searching tools and memory and other non-transitory storage-type devices associated therewith. The processes and portions thereof can also be embodied in programmable non-transitory storage media, for example, compact discs (CDs) or other discs including magnetic, optical, etc., readable by a machine or the like, or other computer usable storage media, including magnetic, optical, or semiconductor storage, or other source of electronic signals.
The processes (methods) and systems, including components thereof, herein have been described with exemplary reference to specific hardware and software. The processes (methods) have been described as exemplary, whereby specific steps and their order can be omitted and/or changed by persons of ordinary skill in the art to reduce these embodiments to practice without undue experimentation. The processes (methods) and systems have been described in a manner sufficient to enable persons of ordinary skill in the art to readily adapt other hardware and software as may be needed to reduce any of the embodiments to practice without undue experimentation and using conventional techniques.
Although the invention has been described in conjunction with specific embodiments thereof, it is evident that many alternatives, modifications and variations will be apparent to those skilled in the art. Accordingly, it is intended to embrace all such alternatives, modifications and variations that fall within the spirit and broad scope of the appended claims.

Claims:
1. A method for managing images comprising:
obtaining an image which displays on a screen display of a computer;
determining whether the image is held on the screen display;
determining the location on the screen display where the image is held; and,
performing an action on the image based on the location of the image on the display screen when the hold on the image on the screen display is determined to be broken, such that:
if the hold on the image is broken while at least a portion of the image is over or in a predetermined proximity to a designated location on the screen display where the image is to be saved, saving the image; and,
if the hold on the image is broken while at least a portion of the image is not over or in the predetermined proximity to the designated location on the screen display where the image is to be saved, dismissing the image.
2. The method of claim 1, wherein the hold on the image on the screen display includes at least one of:
a contact on the screen display over the image, when the screen display includes a touch screen; or,
a mouse click on the image on the screen display, when the screen display is not a touch screen.
3. The method of claim 2, wherein the designated location on the screen display includes a dedicated area for receiving images.
4. The method of claim 3, wherein the dedicated area includes at least one container.
5. The method of claim 4, wherein the at least one container is represented on the screen display graphically.
6. The method of claim 3, wherein the dedicated area includes a portion of the screen display.
7. The method of claim 4, wherein the saving the image includes adding the image to a stack of images in the at least one container.
8. The method of claim 4, wherein the at least one container is represented on the screen display as an album.
9. The method of claim 8, wherein the saving the image includes adding the image to a stack of images in the at least one container represented on the screen display as an album.
10. The method of claim 9, wherein the at least one container includes a plurality of containers and each container of the plurality of containers is represented on the screen display as an album.
11. The method of claim 1, wherein the dismissing of the image is performed without saving the image in storage media of the computer.
12. The method of claim 11, wherein the dismissing the image includes deleting the image.
13. The method of claim 1, wherein the obtaining the image is performed by an imaging device associated with the computer.
14. The method of claim 13, wherein the imaging device includes a camera and the computer includes a mobile computing device.
15. The method of claim 14, wherein the mobile computing device includes a smartphone.
16. The method of claim 1, wherein the image includes at least one of a photographic image or a video.
17. The method of claim 1, wherein the obtaining the image, the determining whether the image is held on the screen display, the determining the location on the screen display where the image is held, and the performing an action on the image, are in real time.
18. The method of claim 2, wherein the determining whether the image is held on the screen display is performed continuously.
19. A system for managing images comprising:
computer components; and,
a processor for executing the computer components, the computer components comprising:
a first module for obtaining an image which displays on a screen display of a computer;
a second module for determining whether the image is held on the screen display;
a third module for determining the location on the screen display where the image is held; and,
a fourth module for performing an action on the image based on the location of the image on the display screen when the hold on the image on the screen display is determined to be broken, such that:
if the hold on the image is broken while at least a portion of the image is over or in a predetermined proximity to a designated location on the screen display where the image is to be saved, saving the image; and,
if the hold on the image is broken while at least a portion of the image is not over or in the predetermined proximity to the designated location on the screen display where the image is to be saved, dismissing the image.
20. The system of claim 19, additionally comprising an imaging system for providing the image for the first module.
21. The system of claim 20, wherein the imaging system includes a camera.
22. The system of claim 20, wherein the computer components, the processor and the imaging system are part of a mobile computer.
23. The system of claim 22, additionally comprising storage media for saving the image.
24. The system of claim 19, wherein the image includes at least one of a photographic image or a video.
25. A computer usable non-transitory storage medium having a computer program embodied thereon for causing a suitably programmed system to manage images, by performing the following steps when such program is executed on the system, the steps comprising:
obtaining an image which displays on a screen display of a computer;
determining whether the image is held on the screen display;
determining the location on the screen display where the image is held; and,
performing an action on the image based on the location of the image on the display screen when the hold on the image on the screen display is determined to be broken, such that:
if the hold on the image is broken while at least a portion of the image is over or in a predetermined proximity to a designated location on the screen display where the image is to be saved, saving the image; and,
if the hold on the image is broken while at least a portion of the image is not over or in the predetermined proximity to the designated location on the screen display where the image is to be saved, dismissing the image.
26. The computer usable non-transitory storage medium of claim 25, wherein the hold on the image on the screen display includes at least one of:
a contact on the screen display over the image, when the screen display includes a touch screen; or,
a mouse click on the image on the screen display, when the screen display is not a touch screen.
27. The computer usable non-transitory storage medium of claim 26, wherein the designated location on the screen display includes a dedicated area for receiving images.
28. The computer usable non-transitory storage medium of claim 27, wherein the dedicated area includes at least one container.
29. The computer usable non-transitory storage medium of claim 28, wherein the at least one container is represented on the screen display graphically.
30. The computer usable non-transitory storage medium of claim 27, wherein the dedicated area includes a portion of the screen display.
31. The computer usable non-transitory storage medium of claim 28, wherein the saving the image includes adding the image to a stack of images in the at least one container.
32. The computer usable non-transitory storage medium of claim 28, wherein the at least one container is represented on the screen display as an album.
33. The computer usable non-transitory storage medium of claim 32, wherein the saving the image includes adding the image to a stack of images in the at least one container represented on the screen display as an album.
34. The computer usable non-transitory storage medium of claim 33, wherein the at least one container includes a plurality of containers and each container of the plurality of containers is represented on the screen display as an album.
35. The computer usable non-transitory storage medium of claim 25, wherein the dismissing of the image is performed without saving the image in storage media of the computer.
36. The computer usable non-transitory storage medium of claim 35, wherein the dismissing the image includes deleting the image.
37. The computer usable non-transitory storage medium of claim 25, wherein the obtaining the image is performed by an imaging device associated with the computer.
38. The computer usable non-transitory storage medium of claim 37, wherein the imaging device includes a camera and the computer includes a mobile computing device.
39. The computer usable non-transitory storage medium of claim 38, wherein the mobile computing device includes a smartphone.
40. The computer usable non-transitory storage medium of claim 25, wherein the image includes at least one of a photographic image or a video.
41. The computer usable non-transitory storage medium of claim 25, wherein the obtaining the image, the determining whether the image is held on the screen display, the determining the location on the screen display where the image is held, and the performing an action on the image, are in real time.
42. The computer usable non-transitory storage medium of claim 26, wherein the determining whether the image is held on the screen display is performed continuously.
PCT/IL2017/051062 2016-09-20 2017-09-19 Method and system for managing images WO2018055617A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201662396836P 2016-09-20 2016-09-20
US62/396,836 2016-09-20

Publications (1)

Publication Number Publication Date
WO2018055617A1 true WO2018055617A1 (en) 2018-03-29

Family

ID=61689382

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IL2017/051062 WO2018055617A1 (en) 2016-09-20 2017-09-19 Method and system for managing images

Country Status (1)

Country Link
WO (1) WO2018055617A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP4274217A4 (en) * 2020-12-30 2024-06-12 Vivo Mobile Communication Co., Ltd. Display control method and apparatus, electronic device, and medium

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20060059427A1 (en) * 2002-01-06 2006-03-16 Glenn Reid Digital image albums
US20100062803A1 (en) * 2008-09-05 2010-03-11 Lg Electronics Inc. Mobile terminal with touch screen and method of capturing image using the same
US20140036131A1 (en) * 2012-08-06 2014-02-06 Beijing Xiaomi Technology Co.,Ltd. Method of capturing an image in a device and the device thereof
US20140071323A1 (en) * 2012-09-11 2014-03-13 Lg Electronics Inc. Mobile terminal and method for controlling of the same
US20150339035A1 (en) * 2012-10-24 2015-11-26 Huizhou Tcl Mobile Communication Co., Ltd. Mobile terminal-based photograph deletion method and mobile terminal
WO2016061634A1 (en) * 2014-10-24 2016-04-28 Beezbutt Pty Limited Camera application


Similar Documents

Publication Publication Date Title
US20170318080A1 (en) Work environment for information sharing and collaboration
EP2993568B1 (en) Electronic device including touch sensitive display and method for operating the same
US10635371B2 (en) Method and apparatus for providing lock-screen
KR102311221B1 (en) operating method and electronic device for object
CA2729478C (en) Gestures on a touch-sensitive display
US11385788B2 (en) Sharing a file with a single contact
US8464184B1 (en) Systems and methods for gesture-based distribution of files
US10430047B2 (en) Managing content on an electronic device
EP2892208A1 (en) Method and apparatus for operating electronic device
CN106462834A (en) Locating of event on timeline
US20140068632A1 (en) Disabling the self-referential appearance of a mobile application in an intent via a background registration
AU2013290458A1 (en) Image identification and organisation according to a layout without user|intervention
US11036792B2 (en) Method for designating and tagging album of stored photographs in touchscreen terminal, computer-readable recording medium, and terminal
EP2983074B1 (en) Method and apparatus for displaying a screen in electronic devices
US20140006967A1 (en) Cross-application transfers of user interface objects
US10459965B2 (en) Method and apparatus for displaying images
US20190220170A1 (en) Method and apparatus for creating group
CN108476242A (en) The device and method that file is sent and received in the wireless communication system for supporting cloud storage service
US20160124599A1 (en) Method for controlling multi display and electronic device thereof
US20140258886A1 (en) Method for transferring a file from a device
CN108476152B (en) Techniques for attaching media captured by a mobile computing device to an electronic document
US20110199516A1 (en) Method of showing video on a touch-sensitive display
US20140033066A1 (en) Method and device for uploading and downloading file
WO2018055617A1 (en) Method and system for managing images
US20170192650A1 (en) Selecting a target application based on content

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17852539

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 19.07.2019)

122 Ep: pct application non-entry in european phase

Ref document number: 17852539

Country of ref document: EP

Kind code of ref document: A1