US20160170585A1 - Display control device, method and computer program product


Info

Publication number
US20160170585A1
Authority
US
Grant status
Application
Prior art keywords
display
proximity
menu item
response
area
Legal status
Pending
Application number
US15050920
Inventor
Ryoko Amano
Takashi Nunomaki
Kenzo Nishikawa
Current Assignee
Sony Corp
Original Assignee
Sony Corp

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING; COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481: Interaction based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/0482: Interaction with lists of selectable items, e.g. menus
    • G06F 3/04812: Interaction techniques based on cursor appearance or behaviour being affected by the presence of displayed objects, e.g. visual feedback during interaction, constrained movement, or attraction/repulsion with respect to a displayed object
    • G06F 3/0484: Interaction for the control of specific functions or operations, e.g. selecting or manipulating an object or an image, setting a parameter value or selecting a range
    • G06F 3/04842: Selection of a displayed object
    • G06F 3/0487: Interaction using specific features provided by the input device or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488: Interaction using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/04886: Interaction by partitioning the screen or tablet into independently controllable areas, e.g. virtual keyboards, menus
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 5/00: Details of television systems
    • H04N 5/222: Studio circuitry; studio devices; studio equipment; cameras comprising an electronic image sensor, e.g. digital cameras, video cameras, camcorders, webcams, camera modules for embedding in other devices such as mobile phones, computers or vehicles
    • H04N 5/225: Television cameras; cameras comprising an electronic image sensor
    • H04N 5/232: Devices for controlling television cameras, e.g. remote control; control of cameras comprising an electronic image sensor
    • H04N 5/23216: Control of parameters, e.g. field/angle of view of camera, via graphical user interface, e.g. touchscreen
    • H04N 5/23293: Electronic viewfinder, e.g. displaying the image signal provided by an electronic image sensor and optionally additional information related to control or operation of the camera

Abstract

A graphical user interface apparatus, method and computer program product cooperate to control a display. An item is displayed at a periphery of the display, and a detector detects when an object is proximate to the item. When proximity is detected, the display displays a relation item. Then, when the object is detected moving to be proximate to the relation item, a controller changes a displayed form of the relation item in response to the detected movement.

Description

    CROSS-REFERENCE TO RELATED APPLICATIONS
  • The present application is a continuation of U.S. application Ser. No. 13/281,490 filed Oct. 26, 2011, which contains subject matter related to that disclosed in Japanese Priority Patent Application JP 2010-291081 filed in the Japan Patent Office on Dec. 27, 2010, the entire contents of both of which are hereby incorporated by reference.
  • BACKGROUND
  • The present technology relates to a display control device, a display control method, and a computer program product, and in particular to a display control device, method, and computer program product suitable for use when display control is performed on a display device that can be operated by proximity.
  • In the past, a technique has been proposed for a car navigation device in which, when a user brings a finger into proximity to the display screen, a predetermined menu is superimposed on a map in the neighborhood of the finger, and the user touches the displayed menu to select an item. Such a technique is disclosed, for example, in Japanese Unexamined Patent Application Publication No. 2002-358162.
  • SUMMARY
  • However, with the technology described in Japanese Unexamined Patent Application Publication No. 2002-358162, an erroneous operation may occur in which a user accidentally touches an unintended item in the menu, and undesired processing is consequently executed.
  • The present technology has been made in view of such a situation. It is desirable to enable a user to reliably select a desired item from among items displayed on a display device that can be operated by proximity.
  • According to one non-limiting embodiment, an apparatus includes a controller that causes a display to display a relation item in response to receiving an indication of a detection of an object's proximity to a displayed item, the displayed item being displayed on a periphery of the display, and changes a displayed form of a portion of the relation item that is proximate to or in contact with the object as the object moves along the relation item.
  • According to one aspect, the controller is configured to change the displayed form of the relation item when a touch position of the object coincides with the relation item.
  • According to another aspect, the controller recognizes a proximity response area that is larger than a size of the item, the proximity response area being an area that is a criterion used when detecting if the object is proximate to the item.
  • According to another aspect, the proximity response area is a rectangular area that surrounds the item.
  • According to another aspect, the controller causes the display to highlight at least the item when the object is detected to be proximate to the item.
  • According to another aspect, the controller causes an options menu to be displayed when the object is detected as being proximate to the item.
  • According to another aspect, the options menu is translucently superimposed on an image displayed on the display, and the relation item includes the options menu.
  • According to another aspect, the proximity response area at least includes an area occupied by the options menu and the item.
  • According to another aspect, the options menu includes a plurality of icons having respective contact response areas, and the proximity response area at least including an area occupied by the respective contact response areas.
  • According to another aspect, the controller causes a selected icon of the plurality of icons to be highlighted on the display when the object is positioned proximate to the selected icon.
  • According to another aspect, the controller also causes a function guide to be displayed; the function guide explains the functions assigned to respective menu options on the options menu.
  • According to another aspect, the options menu is translucently displayed so that a background image remains at least partially visible on the display.
  • According to another aspect, the display includes an electrostatic touch panel.
  • According to another aspect, the controller highlights the item by at least one of changing a color, size, shape, design and brightness of the item.
  • According to another aspect, the options menu includes option icons that are lower in rank than the item.
  • According to another aspect, the options menu includes user-settable option icons.
  • According to another aspect, when the object is not within a predetermined detection range of the display, a selection state of the item is not changed.
  • According to another aspect, the controller changes at least one of the plurality of icons to another icon when the object touches the display.
  • According to another aspect, the plurality of icons remain present when the object touches the display and the object is then removed from the display by a distance further than a proximity detection range.
  • According to another aspect, the options menu includes a plurality of icons having respective contact response areas, at least one of the contact response areas being smaller in area than an area occupied by a corresponding icon.
  • According to another aspect, the controller recognizes a selection state of a function associated with the item when the object is detected proximate to the item, and the controller maintains the selection state even when the object is not moved to be proximate to the options menu.
  • In a method embodiment the method includes receiving an indication of a detection of an object's proximity to a displayed item; causing a display to display a relation item in response to the receiving, the displayed item being displayed on a periphery of the display, and changing a displayed form of a portion of the relation item that is proximate to or in contact with the object as the object moves along the relation item.
  • In a non-transitory computer storage device embodiment that has instructions stored therein that when executed by a processing circuit implement a process that includes receiving an indication of a detection of an object's proximity to a displayed item; causing a display to display a relation item in response to the receiving, the displayed item being displayed on a periphery of the display, and changing a displayed form of a portion of the relation item that is proximate to or in contact with the object as the object moves along the relation item.
  • According to an embodiment of the present technology, it is possible to reliably select a desired item from among items displayed on a display device that can be operated by proximity.
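The interaction summarized in the aspects above (a proximity response area larger than the item, an options menu revealed on proximity, and highlighting of whichever portion of the relation item the object is over) can be sketched roughly as follows. This Python sketch is illustrative only: the class names, the margin value, and the event methods are assumptions for this illustration, not part of the application.

```python
from dataclasses import dataclass

@dataclass
class Rect:
    x: float
    y: float
    w: float
    h: float

    def contains(self, px, py):
        return self.x <= px <= self.x + self.w and self.y <= py <= self.y + self.h

    def expanded(self, margin):
        # The proximity response area: a rectangle larger than the item itself.
        return Rect(self.x - margin, self.y - margin,
                    self.w + 2 * margin, self.h + 2 * margin)

class MenuController:
    """Hypothetical controller: reveals an options menu (the relation item)
    when an object hovers near a peripheral item, then highlights whichever
    option icon the object is over as it moves along the menu."""
    def __init__(self, item, option_areas):
        self.item = item
        self.options = option_areas   # contact response areas of the menu icons
        self.menu_visible = False
        self.highlighted = None       # index of the option under the object

    def on_proximity(self, px, py):
        if not self.menu_visible and self.item.expanded(margin=10).contains(px, py):
            self.menu_visible = True  # display the relation item
        if self.menu_visible:
            self.highlighted = next(
                (i for i, r in enumerate(self.options) if r.contains(px, py)),
                None)

    def on_leave(self):
        # Object moved out of the proximity detection range.
        self.menu_visible = False
        self.highlighted = None
```

Hovering near the item makes the menu appear, and moving along the menu highlights the option under the finger; this mirrors the claimed behavior only schematically.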
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 is a block diagram illustrating an example of a configuration of an embodiment of a digital camera to which the present technology is applied;
  • FIGS. 2A and 2B are perspective views illustrating examples of appearance configurations of the digital camera;
  • FIG. 3 is a block diagram illustrating an example of a configuration of a function realized by a CPU in the digital camera;
  • FIG. 4 is a diagram illustrating a display example of an image capturing standby screen;
  • FIG. 5 is a flowchart for explaining display control processing executed by the digital camera;
  • FIG. 6 is a flowchart for explaining display control processing executed by the digital camera;
  • FIG. 7 is a diagram illustrating examples of a proximity response area and a contact response area of an image capturing mode icon;
  • FIG. 8 is a diagram illustrating a display example of an options menu;
  • FIG. 9 is a diagram illustrating examples of an enlarged proximity response area of the image capturing mode icon and a contact response area of an options icon;
  • FIG. 10 is a diagram illustrating a display example of an image capturing standby screen when a finger is caused to be in proximity to the options icon; and
  • FIG. 11 is a diagram illustrating an example of an image capturing mode setting screen.
  • DETAILED DESCRIPTION OF EMBODIMENTS
  • Hereinafter, a mode for implementing the present technology (hereinafter referred to as the "embodiment") will be described, in the following order.
  • 1. Embodiment
  • 2. Modifications
  • 1. Embodiment
  • [Example of Configuration of Digital Camera 1]
  • FIG. 1 is a block diagram illustrating an example of the configuration of an embodiment of a digital camera (digital still camera) 1 to which the present technology is applied.
  • A lens unit 11 includes an image-capturing lens, an aperture stop, a focus lens, and the like, and light entering the lens unit 11 is incident on an imaging element 12.
  • For example, the imaging element 12 includes a charge coupled device (CCD), a complementary metal oxide semiconductor (CMOS) imager, or the like, subjects the light from the lens unit 11 to photoelectric conversion, and supplies, to an analog signal processing unit 13, an analog image signal obtained as the result thereof.
  • The analog signal processing unit 13 subjects the image signal from the imaging element 12 to analog signal processing such as correlated double sampling processing, automatic gain adjustment processing, and the like, and supplies the image signal subjected to the analog signal processing to an analog/digital (A/D) conversion unit 14.
  • The A/D conversion unit 14 A/D-converts the image signal from the analog signal processing unit 13, and supplies, to a digital signal processing unit 15, the digital image data obtained as the result thereof.
  • The digital signal processing unit 15 subjects the image data from the A/D conversion unit 14 to digital signal processing such as white balance adjustment processing, noise reduction processing, necessary compression-encoding processing (for example, Joint Photographic Experts Group (JPEG) encoding or the like), and the like, and supplies the image data to a display unit 17 in an input-output panel 18 and a recording device 19.
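The signal chain just described (imaging element, analog signal processing, A/D conversion, digital signal processing) can be pictured as a simple function pipeline. The sketch below is purely illustrative; the gain value, bit depth, and stage contents are placeholder assumptions, not values from the application.

```python
def analog_signal_processing(signal):
    # Stands in for correlated double sampling and automatic gain adjustment;
    # the fixed gain of 1.5 is an arbitrary placeholder.
    return [s * 1.5 for s in signal]

def a_d_conversion(signal):
    # Quantize each analog sample to an 8-bit integer (assumed bit depth).
    return [max(0, min(255, int(s))) for s in signal]

def digital_signal_processing(samples):
    # White balance adjustment, noise reduction, and JPEG encoding would go
    # here; this placeholder passes the data through unchanged.
    return samples

def capture_pipeline(raw_signal):
    # Compose the stages in the order described in the text.
    return digital_signal_processing(a_d_conversion(analog_signal_processing(raw_signal)))
```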
  • The input-output panel 18 includes an input unit 16 and a display unit 17.
  • The input unit 16 includes a device having a function for receiving (detecting) an input from the outside thereof, namely, for example, an electrostatic touch panel or the like, a set of a light source for light irradiation and a sensor for receiving the reflection of the light, reflected from an object, or the like.
  • The input unit 16 detects the proximity and touch (contact) of an object (for example, a finger of a user, a stylus pen used by the user, or the like) from the outside, and supplies a signal (hereinafter, referred to as “operation signal”) indicating the proximity or touched position thereof to a CPU 23.
  • The display unit 17 includes a device (display device) for displaying an image, for example, a liquid crystal panel or the like, and displays an image in accordance with image data or the like supplied from the digital signal processing unit 15.
  • In the input-output panel 18, the input unit 16 and the display unit 17, described above, are integrated with each other; it is possible to display an image on the display unit 17, and the input unit 16 can receive an operation input (here, both the proximity and the touch) from the outside with respect to an image displayed on the display unit 17.
  • In addition, hereinafter, an object such as a finger, a stylus pen, or the like used for an input (the proximity or the touch) to the input unit 16 in the input-output panel 18 is called an input tool.
  • In addition to the finger and the stylus pen, an arbitrary object may be used as the input tool, as long as it can be brought into proximity to or contact with the input unit 16 in the input-output panel 18 so as to specify a position.
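The "operation signal" that the input unit 16 supplies to the CPU 23 could, for example, be modeled as an event carrying the kind of input (proximity or touch) and its position. The following sketch is hypothetical: the type names and the threshold values are assumptions for this illustration, not details from the application.

```python
from dataclasses import dataclass
from enum import Enum

class InputKind(Enum):
    PROXIMITY = "proximity"
    TOUCH = "touch"

@dataclass(frozen=True)
class OperationSignal:
    """Hypothetical shape of the operation signal: the kind of input event
    plus its position on the panel."""
    kind: InputKind
    x: int
    y: int

def classify(distance_mm, x, y, touch_threshold=0.5, proximity_range=20.0):
    """Map a sensed object distance to an event: a touch at (near) zero
    distance, a proximity event within the detection range, and no event
    beyond it. The threshold values are arbitrary example numbers."""
    if distance_mm <= touch_threshold:
        return OperationSignal(InputKind.TOUCH, x, y)
    if distance_mm <= proximity_range:
        return OperationSignal(InputKind.PROXIMITY, x, y)
    return None
```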
  • For example, a disk such as a Digital Versatile Disc (DVD), a removable semiconductor memory (not illustrated) such as a memory card, or another removable recording medium (not illustrated) can be attached to and detached from the recording device 19. In addition, the recording device 19 performs recording and reproduction control on the attached recording medium. Namely, the recording device 19 records, in the recording medium, the image data from the digital signal processing unit 15, and reads out and supplies the image data recorded in the recording medium to the digital signal processing unit 15.
  • An actuator 20 is a motor that adjusts the focus lens and the aperture stop in the lens unit 11, and is driven by a motor drive 21.
  • The motor drive 21 drives the actuator 20 in accordance with the control of the central processing unit (CPU) 23.
  • A TG (Timing Generator) 22 supplies, to the imaging element 12, a timing signal used for adjusting an exposure time and the like in accordance with the control of the CPU 23.
  • By executing a program recorded in a program read only memory (ROM) 26 and furthermore a program recorded in an electrically erasable programmable ROM (EEPROM) 25 as necessary, the CPU 23 controls individual blocks included in the digital camera 1.
  • An operation unit 24 is a physical button or the like operated by the user, and supplies, to the CPU 23, a signal (hereinafter, referred to as “operation signal”) corresponding to the operation of the user.
  • For example, the EEPROM 25 stores a program and data, such as imaging parameters set by the user by operating the operation unit 24 or the like, that need to be retained even when the power source of the digital camera 1 is turned off.
  • The program ROM 26 stores therein a program to be executed by the CPU 23, or the like.
  • A RAM 27 temporarily stores therein data and a program, necessary for the operation of the CPU 23.
  • In the digital camera 1 configured as described above, the CPU 23 executes a program recorded in the program ROM 26 or the like, thereby controlling individual portions of the digital camera 1.
  • On the other hand, the light entering the lens unit 11 is subjected to photoelectric conversion in the imaging element 12, and an image signal obtained as the result thereof is supplied to the analog signal processing unit 13. In the analog signal processing unit 13, the image signal from the imaging element 12 is subjected to analog signal processing, and is supplied to the A/D conversion unit 14.
  • In the A/D conversion unit 14, the image signal from the analog signal processing unit 13 is A/D-converted, and a digital image signal obtained as the result thereof is supplied to the digital signal processing unit 15.
  • In the digital signal processing unit 15, the image data from the A/D conversion unit 14 is subjected to digital signal processing, and is supplied to the display unit 17 in the input-output panel 18, thereby displaying a corresponding image, namely, a so-called through-the-lens image.
  • In addition, the CPU 23 executes predetermined processing in accordance with an operation signal from the input unit 16 in the input-output panel 18 or the operation unit 24.
  • Namely, for example, when the input unit 16 in the input-output panel 18 or the operation unit 24 is operated so that image-capturing is performed, the CPU 23 performs processing for capturing a still image as a photograph, causes the digital signal processing unit 15 to subject the image data from the A/D conversion unit 14 to compression-encoding processing, and causes the image data to be recorded in a recording medium through the recording device 19.
  • In addition, for example, when the input unit 16 in the input-output panel 18 or the operation unit 24 is operated so that reproduction is performed, the CPU 23 controls the digital signal processing unit 15, thereby causing the image data to be read out from the recording medium through the recording device 19.
  • Furthermore, the CPU 23 causes the digital signal processing unit 15 to decompress the image data read out from the recording medium, and supplies the image data to the input-output panel 18, thereby causing the image data to be displayed.
  • In addition, in response to the operation signal or the like from the input unit 16 in the input-output panel 18 or the operation unit 24, the CPU 23 supplies, to the input-output panel 18 through the digital signal processing unit 15, the image of a graphical user interface (GUI) used for performing the operation of the digital camera 1, the confirmation of various kinds of information, or the like.
  • In addition, for example, data used for the GUI is stored in the EEPROM 25 or the program ROM 26.
  • In addition, for example, the program to be executed by the CPU 23 can be installed from a removable recording medium to the digital camera 1, or can be downloaded through a network and installed to the digital camera 1.
  • FIGS. 2A and 2B are perspective views illustrating examples of appearance configurations of the digital camera 1 in FIG. 1.
  • Namely, FIG. 2A is the perspective view of the front surface side of the digital camera 1 (a side directed toward a subject at the time of image-capturing), and FIG. 2B is the perspective view of the back surface side of the digital camera 1.
  • A lens cover 31 is provided so as to cover the front surface of the digital camera 1, and can move up and down.
  • When the lens cover 31 is located on an upper side, the lens unit 11 and the like are put into states in which the lens unit 11 and the like are covered. In addition, when the lens cover 31 is located on a lower side, the lens unit 11 and the like are exposed and the digital camera 1 is put into a state in which image-capturing can be performed.
  • In FIG. 2A, the lens cover 31 is located on the lower side, and the lens unit 11 is exposed.
  • An AF illuminator 32 is provided on the left side of the lens unit 11. For example, when the subject is dark and it is difficult to focus on the subject with an AF function, the AF illuminator 32 emits light (auxiliary light) used to illuminate the subject.
  • In addition, when a self-timer is used for image-capturing, the AF illuminator 32 also functions as a self-timer lamp that emits light so as to notify the user of image capturing timing based on the self-timer.
  • On the upper portion of the digital camera 1, a power button 33, a play button 34, a shutter button (release button) 35, and a zoom lever 36 are provided that are included in the operation unit 24 in FIG. 1.
  • The power button 33 is operated when the power source of the digital camera 1 is switched on and off, and the play button 34 is operated when image data recorded in the recording medium attached to the recording device 19 (in FIG. 1) is reproduced.
  • The shutter button (release button) 35 is operated when image data is recorded in the recording medium attached to the recording device 19 (in FIG. 1) (a photograph (still image) is captured), and the zoom lever 36 is operated when zoom is adjusted.
  • On the rear face of the digital camera 1, the input-output panel 18 is provided. On the input-output panel 18, a through-the-lens image, a GUI, or the like is displayed.
  • By causing the input tool to be in proximity to the input-output panel 18 or touching the input tool to the input-output panel 18, the user can supply various kinds of (operation) inputs to the digital camera 1.
  • [Example of Configuration of Function Realized by CPU 23]
  • FIG. 3 is a block diagram illustrating an example of a configuration of a portion of a function realized by the CPU 23 in FIG. 1 by executing a predetermined control program. The CPU 23 executes the predetermined control program, thereby realizing a function including functions in which the CPU 23 operates as a display control device 51 and an information processing device 52.
  • In response to an operation on the input unit 16 in the input-output panel 18 (the proximity or touch of the input tool) and an operation on the operation unit 24, the display control device 51 causes the GUI to be displayed on the display unit 17 in the input-output panel 18 through the digital signal processing unit 15, in order to perform the operation of the digital camera 1, the confirmation of various kinds of information, or the like.
  • The display control device 51 is configured so as to include an input detection unit 61 and a display control unit 62.
  • On the basis of the operation signal from the input unit 16 in the input-output panel 18, the input detection unit 61 detects an input (the proximity or touch of the input tool) from the outside to the input unit 16 in the input-output panel 18 and a position or the like on the input unit 16 subjected to the proximity or touch of the input tool. In addition, the input detection unit 61 supplies, to the display control unit 62, information including these detection results as operation information indicating an operation performed by the user on the input unit 16 in the input-output panel 18.
  • In response to an operation signal from the operation unit 24 and a user operation indicated by the operation information from the input detection unit 61, the display control unit 62 causes the GUI to be displayed on the display unit 17 in the input-output panel 18 through the digital signal processing unit 15, on the basis of data or the like stored in the EEPROM 25 or the program ROM 26.
  • In addition, on the basis of the operation information from the input detection unit 61, the display control unit 62 recognizes the function of the digital camera 1, the execution of the function being ordered by the user by operating the input unit 16 with respect to the GUI displayed on the display unit 17 in the input-output panel 18. In addition, the display control unit 62 notifies the information processing device 52 of the recognized function.
  • The information processing device 52 controls the individual portions of the digital camera 1 and causes them to execute the function notified by the display control unit 62.
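The division of labor in FIG. 3 (the input detection unit 61 producing operation information, the display control unit 62 recognizing the ordered function, and the information processing device 52 executing it) might be wired together as below. The class APIs, the tuple signal format, and the hit-map representation are invented here for illustration; they are not from the application.

```python
class InputDetectionUnit:
    """Sketch of input detection unit 61: turns an operation signal from the
    input unit into operation information (kind of input plus position)."""
    def detect(self, operation_signal):
        kind, x, y = operation_signal
        return {"kind": kind, "position": (x, y)}

class DisplayControlUnit:
    """Sketch of display control unit 62: recognizes which function the user
    ordered by comparing the operation position against the GUI layout, and
    notifies the information processing device (here, a plain callback)."""
    def __init__(self, hit_map, notify):
        self.hit_map = hit_map    # {(x0, y0, x1, y1): function name}
        self.notify = notify

    def handle(self, operation_info):
        if operation_info["kind"] != "touch":
            return                # proximity alone does not execute a function
        x, y = operation_info["position"]
        for (x0, y0, x1, y1), function_name in self.hit_map.items():
            if x0 <= x <= x1 and y0 <= y <= y1:
                self.notify(function_name)

# Wiring the units together, roughly as in FIG. 3:
executed = []
display_control_unit = DisplayControlUnit({(0, 0, 50, 50): "set_capture_mode"},
                                          executed.append)
display_control_unit.handle(InputDetectionUnit().detect(("touch", 10, 10)))
```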
  • [Display Example of Input-Output Panel 18]
  • FIG. 4 is a diagram illustrating a display example of the display screen of the display unit 17 in the input-output panel 18.
  • FIG. 4 illustrates a display example of the display screen of the display unit 17 in the input-output panel 18 in a state in which the digital camera 1 is in an image capturing standby state and the input unit 16 is not subjected to the proximity or touch of the input tool.
  • Here, the image capturing standby state means a state in which, if the shutter button 35 (in FIG. 2) is operated, the capturing of a photograph (still image) is performed (a state in which an image is captured and recorded in the recording medium attached to the recording device 19 (in FIG. 1)).
  • In the display screen in FIG. 4, within a central area AC other than vertically long strip-shaped areas AL and AR located on both sides, a through-the-lens image is displayed. In addition, icons 101 to 105 to which predetermined functions are assigned are displayed in tandem with predetermined sizes at predetermined positions within the area AL. Furthermore, icons 106 and 107 to which predetermined functions are assigned and an icon group 108 indicating the state or the like of the digital camera 1 are displayed in tandem with predetermined sizes at predetermined positions within the area AR.
  • In addition, for example, the icon group 108 includes icons indicating the remaining amount of a battery in the digital camera 1, the number of photographs (still images) that can be recorded in a recording medium, an image capturing mode already set, and the like.
  • For example, by causing the input tool to be in proximity to or touch the icons 101 to 107 on the display screen in FIG. 4, the user can cause the digital camera 1 to execute functions assigned to individual icons.
  • In addition, hereinafter, the display screen in FIG. 4 is referred to as an image capturing standby screen.
  • In addition, hereinafter, it is assumed that the lateral directions of the input unit 16 and the display unit 17 in the input-output panel 18 are x-axis directions and the vertical directions thereof are y-axis directions. Accordingly, for example, the lateral direction of the image capturing standby screen in FIG. 4 is the x-axis direction and the vertical direction thereof is the y-axis direction.
  • Furthermore, hereinafter, in some cases, the descriptions of the input unit 16 and the display unit 17 in the input-output panel 18 will be omitted, and the input unit 16 and the display unit 17 in the input-output panel 18 will be simply described as the input-output panel 18. For example, in some cases, the phrase “the display unit 17 in the input-output panel 18 displays an image” is described as “the input-output panel 18 displays an image”, and the phrase “the input unit 16 in the input-output panel 18 is subjected to the proximity or touch of the input tool” is described as “the input-output panel 18 is subjected to the proximity or touch of the input tool”.
  • [Display Control Processing when Image Capturing Mode Icon is Operated]
  • Next, with reference to flowcharts in FIG. 5 and FIG. 6, focusing on the icon 106 (hereinafter, referred to as an image capturing mode icon 106) used for setting an image capturing mode in the image capturing standby screen in FIG. 4, display control processing will be described that is executed by the digital camera 1 when the image capturing mode icon 106 is operated.
  • In addition, for example, this processing is started when the power source of the digital camera 1 is turned on or an operation for causing the input-output panel 18 to display the image capturing standby screen is performed on the input-output panel 18 or the operation unit 24.
  • In addition, hereinafter, for convenience of description, the description of processing other than processing relating to the operation performed on the image capturing mode icon 106 (for example, processing performed when an icon other than the image capturing mode icon 106 is operated, or the like) will be omitted.
  • In Step S1, the digital camera 1 displays the image capturing standby screen. Specifically, the imaging element 12 supplies, to the analog signal processing unit 13, an image signal obtained as the result of image capturing. The analog signal processing unit 13 subjects the image signal from the imaging element 12 to analog signal processing, and supplies the image signal to the A/D conversion unit 14. The A/D conversion unit 14 A/D-converts the image signal from the analog signal processing unit 13, and supplies, to the digital signal processing unit 15, the digital image data obtained as the result thereof. The digital signal processing unit 15 subjects the image data from the A/D conversion unit 14 to digital signal processing, and supplies the image data to the input-output panel 18. On the basis of the image data from the digital signal processing unit 15, the input-output panel 18 displays a through-the-lens image in the area AC in the image capturing standby screen.
  • In addition, the display control unit 62 causes the icons 101 to 105 to be displayed in tandem in the area AL in the image capturing standby screen through the digital signal processing unit 15, and causes the image capturing mode icon 106, the icon 107, and the icon group 108 to be displayed in tandem in the area AR in the image capturing standby screen.
  • In Step S2, the display control unit 62 sets a proximity response area and a contact response area of the image capturing mode icon 106.
  • Here, the proximity response area is an area to be a criterion used for detecting the proximity of the input tool to individual items such as an icon, a character string, a thumbnail, and the like, displayed on the input-output panel 18. In addition, the contact response area is an area to be a criterion used for detecting the contact of the input tool with individual items displayed on the input-output panel 18.
  • FIG. 7 illustrates examples of a proximity response area Rc and a contact response area Rt1, set with respect to the image capturing mode icon 106 in processing performed in Step S2. The proximity response area Rc is slightly larger than the physical appearance of the image capturing mode icon 106, and is set to a rectangular area surrounding the image capturing mode icon 106. In addition, the contact response area Rt1 is set to the same area as the proximity response area Rc.
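  • The relationship between an icon's on-screen bounds and its response areas may be sketched as follows (a hypothetical illustration only, not part of the patent disclosure; the coordinates, the margin value, and the helper names are assumptions):

```python
# Minimal sketch of deriving a response area that is "slightly larger than
# the physical appearance" of an icon, as with Rc and Rt1 in FIG. 7.
# All coordinate values and the margin are hypothetical.
from collections import namedtuple

Rect = namedtuple("Rect", "x y w h")  # top-left corner, width, height

def response_area(icon, margin):
    """Return a rectangular area surrounding the icon, enlarged by `margin` on every side."""
    return Rect(icon.x - margin, icon.y - margin,
                icon.w + 2 * margin, icon.h + 2 * margin)

icon_106 = Rect(x=300, y=80, w=32, h=32)   # hypothetical bounds of the image capturing mode icon
Rc = response_area(icon_106, margin=8)     # proximity response area
Rt1 = Rc                                   # contact response area: same area as Rc (Step S2)
```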
  • In Step S3, the input detection unit 61 performs the detection of the proximity position of the input tool on the basis of the operation signal from the input-output panel 18. The input detection unit 61 supplies, to the display control unit 62, operation information including the detection result of the proximity position of the input tool.
  • In Step S4, the display control unit 62 determines whether or not the input tool is caused to be in proximity to the proximity response area Rc of the image capturing mode icon 106. In addition, when the proximity of the input tool to the input-output panel 18 is not detected or the detected proximity position is out of the proximity response area Rc, it is determined that the input tool is not caused to be in proximity to the proximity response area Rc of the image capturing mode icon 106, and the processing returns to Step S3.
  • After that, in Step S4, until it is determined that the input tool is caused to be in proximity to the proximity response area Rc of the image capturing mode icon 106, processing operations in Steps S3 and S4 are repeatedly executed.
  • On the other hand, when, in Step S4, the proximity of the input tool to the input-output panel 18 is detected, and the detected proximity position is located within the proximity response area Rc, it is determined that the input tool is caused to be in proximity to the proximity response area Rc of the image capturing mode icon 106, and the processing proceeds to Step S5.
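  • The determination repeated in Steps S3 and S4 amounts to a point-in-rectangle test on the detected proximity position. A minimal sketch (the patent does not specify an implementation; the function names and coordinates are assumptions):

```python
from collections import namedtuple

Rect = namedtuple("Rect", "x y w h")  # top-left corner, width, height

def contains(rect, pos):
    """True if the position (x, y) lies within the rectangle."""
    x, y = pos
    return rect.x <= x < rect.x + rect.w and rect.y <= y < rect.y + rect.h

def proximity_hit(rc, proximity_pos):
    """Step S4: the test fails when no proximity is detected (None) or when
    the detected proximity position is out of the proximity response area."""
    return proximity_pos is not None and contains(rc, proximity_pos)

Rc = Rect(292, 72, 48, 48)  # hypothetical proximity response area of icon 106
```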
  • In Step S5, the display control unit 62 causes the display form of the image capturing mode icon 106 to be changed through the digital signal processing unit 15, and causes an options menu to be displayed.
  • FIG. 8 illustrates an example of the display screen displayed on the input-output panel 18 in the processing in Step S5 when a finger 121 as the input tool is brought close to the image capturing mode icon 106.
  • In the display screen in FIG. 8, the image capturing mode icon 106 and the surrounding area thereof are displayed highlighted, and a cursor 131 is displayed so as to surround the image capturing mode icon 106. Accordingly, by bringing the finger 121 close to the image capturing mode icon 106, the user can intuitively recognize that the image capturing mode icon 106 is selected (the image capturing mode icon 106 reacts to the finger 121).
  • In addition, highlighted display and cursor display are examples, and it may also be indicated in another form that the image capturing mode icon 106 is selected. For example, the color, size, design, brightness, or the like of the image capturing mode icon 106 may also be changed. In addition, the combination of a plurality of the changes of display forms may also be adopted.
  • In addition, above the image capturing mode icon 106, as information relating to the image capturing mode icon 106, a function guide 132 is displayed that explains a function assigned to the image capturing mode icon 106. Accordingly, the user can easily and swiftly recognize the detail of the function assigned to the image capturing mode icon 106.
  • Furthermore, a landscape-oriented options menu 133 extends in a direction perpendicular to the area AR, on the left side of the image capturing mode icon 106, and is displayed so as to be superimposed on the through-the-lens image of the area AC. In addition, options icons 141 to 143 are displayed abreast with predetermined sizes at predetermined positions within the options menu 133.
  • The options icons 141 to 143 are icons relating to the image capturing mode icon 106. More specifically, the options icons 141 to 143 are icons belonging to the lower rank of the image capturing mode icon 106 in the menu system of the digital camera 1. In addition, a function for setting one of the image capturing modes of the digital camera 1 is assigned to each of the options icons 141 to 143.
  • In addition, the user can freely set the options icons 141 to 143 caused to be displayed on the options menu 133. For example, in order to swiftly set a frequently used image capturing mode, the user can set an icon corresponding to the image capturing mode to an options icon, and cause the icon to be displayed on the options menu 133.
  • Furthermore, the function guide 132 and the options menu 133 are translucently displayed so that the through-the-lens image of the background can be seen transparently. Accordingly, even in a state in which the function guide 132 and the options menu 133 are displayed, it is possible to certainly view the through-the-lens image of the background.
  • In such a way as described above, the options menu 133 is displayed that includes the options icons 141 to 143 relating to the image capturing mode icon 106 displayed at the periphery of the proximity position to which the input tool is in proximity.
  • In Step S6, the display control unit 62 enlarges the proximity response area of the image capturing mode icon 106, and sets the contact response areas of the options icons 141 to 143.
  • FIG. 9 illustrates the proximity response area Rc′ of the image capturing mode icon 106 and the contact response areas Rt2 to Rt4 of the options icons 141 to 143, set in the processing performed in Step S6. In addition, in order to make the drawing understandable, the illustration of the cursor 131 and the function guide 132 and the highlighted display of the image capturing mode icon 106 are omitted. Furthermore, the area located above the options menu 133 and indicated by a two-dot chain line is an area in which the function guides of the options icons 141 to 143 are displayed, as described later with reference to FIG. 10.
  • The proximity response area Rc′ of the image capturing mode icon 106 is larger than the proximity response area Rc in FIG. 7, and is set to a rectangular area including the proximity response area Rc, the options menu 133, and a display area for the function guides of the options icons 141 to 143.
  • In addition, part or all of the display area for the function guides may also be excluded from the proximity response area Rc′.
  • The contact response areas Rt2 to Rt4 are slightly larger than the physical appearances of the options icons 141 to 143, respectively, and are set to rectangular areas surrounding the options icons 141 to 143, respectively.
  • In addition, the contact response area Rt1 of the image capturing mode icon 106 and the contact response areas Rt2 to Rt4 of the options icons 141 to 143 are set to almost the same size.
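  • The enlargement in Step S6 amounts to taking a rectangle that covers the original area Rc, the options menu 133, and the function-guide display area (FIG. 9). A hypothetical sketch, with assumed coordinates and helper names rather than the patent's actual implementation:

```python
from collections import namedtuple

Rect = namedtuple("Rect", "x y w h")  # top-left corner, width, height

def bounding_union(*rects):
    """Smallest rectangle covering all of the given rectangles."""
    x1 = min(r.x for r in rects)
    y1 = min(r.y for r in rects)
    x2 = max(r.x + r.w for r in rects)
    y2 = max(r.y + r.h for r in rects)
    return Rect(x1, y1, x2 - x1, y2 - y1)

# Hypothetical layout: Rc' covers the original Rc, the options menu 133,
# and the display area for the option icons' function guides.
Rc = Rect(292, 72, 48, 48)
menu_133 = Rect(100, 76, 190, 40)
guide_area = Rect(100, 36, 190, 36)
Rc_prime = bounding_union(Rc, menu_133, guide_area)
```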
  • In Step S7, the input detection unit 61 performs the detection of the contact position of the input tool on the basis of the operation signal of the input-output panel 18. The input detection unit 61 supplies, to the display control unit 62, operation information including the detection result of the contact position of the input tool.
  • In Step S8, the display control unit 62 determines whether or not the input tool touches the input-output panel 18, on the basis of the detection result due to the input detection unit 61. When it is determined that the input tool does not touch the input-output panel 18, the processing proceeds to Step S9.
  • In Step S9, in the same way as in the processing in Step S3, the detection of the proximity position of the input tool is performed, and the operation information including the detection result of the proximity position of the input tool is supplied from the input detection unit 61 to the display control unit 62.
  • In Step S10, the display control unit 62 determines whether or not the input tool is caused to be in proximity to the enlarged proximity response area Rc′ of the image capturing mode icon 106. In addition, when the proximity of the input tool to the input-output panel 18 is detected and the detected proximity position is located within the proximity response area Rc′, it is determined that the input tool is caused to be in proximity to the enlarged proximity response area Rc′, and the processing proceeds to Step S11.
  • In Step S11, in response to the proximity position of the input tool, the display control unit 62 changes the display forms or the like of individual icons through the digital signal processing unit 15.
  • Specifically, when the x coordinate of the proximity position of the input tool is located within a range in an x-axis direction of one of the contact response areas Rt1 to Rt4, the display control unit 62 determines that an icon corresponding to the contact response area is under selection.
  • In addition, hereinafter, an area defined by a range in the x-axis direction of each of the contact response areas Rt1 to Rt4 and a range in the y-axis direction of the proximity response area Rc′ is called a selection determination area. For example, the selection determination area of the image capturing mode icon 106 turns out to be an area defined by the range in the x-axis direction of the contact response area Rt1 and the range in the y-axis direction of the proximity response area Rc′.
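  • The selection determination just described can be sketched as a hit test that combines the x range of each contact response area with the y range of Rc′ (a hypothetical illustration; the layout values and names are assumptions):

```python
from collections import namedtuple

Rect = namedtuple("Rect", "x y w h")  # top-left corner, width, height

def selected_icon(proximity_pos, contact_areas, rc_prime):
    """Return the icon whose selection determination area contains the
    proximity position: the x range of that icon's contact response area
    combined with the y range of the enlarged proximity response area Rc'."""
    x, y = proximity_pos
    if not (rc_prime.y <= y < rc_prime.y + rc_prime.h):
        return None                          # outside Rc' in the y direction
    for name, rt in contact_areas.items():
        if rt.x <= x < rt.x + rt.w:          # within the x range of this contact area
            return name
    return None                              # in a gap between icons

# Hypothetical layout values
Rc_prime = Rect(100, 36, 240, 84)
areas = {"Rt1": Rect(292, 72, 48, 48), "Rt2": Rect(110, 80, 50, 32),
         "Rt3": Rect(170, 80, 50, 32), "Rt4": Rect(230, 80, 50, 32)}
```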
  • In addition, the display control unit 62 changes the display form of an icon determined to be under selection, and causes a function guide for explaining a function assigned to the icon to be displayed.
  • For example, a case will be described in which, from the state illustrated in FIG. 8, the finger 121 is moved while staying in proximity to the input-output panel 18 and is positioned on the options icon 143 as illustrated in FIG. 10.
  • In this case, the x coordinate of the proximity position of the finger 121 is included in the selection determination area of the options icon 143, and hence it is determined that the options icon 143 is under selection. In addition, the options icon 143 is displayed highlighted, and the cursor 131 is displayed so as to surround the options icon 143. Namely, with the movement of the finger 121 that is the input tool, the display form of the options icon 143 is changed that is displayed at the periphery of the proximity position of the finger 121.
  • Accordingly, by moving the finger 121 while causing it to stay in proximity to the input-output panel 18, the user can intuitively recognize that the selection of an icon is switched and an icon displayed in the proximity of the finger 121 is selected (the icon reacts to the finger 121).
  • In addition, in the same way as in the case of the image capturing mode icon 106 in FIG. 8, it may also be indicated that each options icon is selected by using a form other than highlighted display and cursor display, or by combining a plurality of changes of display forms.
  • In addition, above the options menu 133, a function guide 151 is displayed that explains the name and the content of a function assigned to the options icon 143. Accordingly, the user can easily and swiftly recognize the detail of the function assigned to the options icon 143.
  • Furthermore, the function guide 151 is translucently displayed so that the through-the-lens image of the background can be seen transparently. Accordingly, even in a state in which the function guide 151 is displayed, it is possible to certainly view the through-the-lens image of the background.
  • In addition, when the proximity position of the input tool is not included in any one of selection determination areas of the icons, the selection state of the icon is not changed and is maintained without change.
  • For example, when, from the state illustrated in FIG. 10, the finger 121 is moved from a position above the options icon 143 to a position above the options icon 142, a state in which the proximity position of the finger 121 is not included in any one of the selection determination areas of the icons occurs on the way. At this time, the selection state of the options icon 143 is maintained without change, and the display screen in FIG. 10 is continuously displayed without change. In addition, when the proximity position of the finger 121 enters the selection determination area of the options icon 142, it is determined that the options icon 142 is under selection, the display form of the options icon 142 is changed, and the function guide of the options icon 142 is displayed.
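  • The behavior described above — keeping the previous selection while the proximity position passes through a gap between selection determination areas — can be sketched as a small state holder (a hypothetical illustration; the class and method names are assumptions):

```python
class SelectionState:
    """Keeps the last selected icon when the hit test returns None, so the
    selection does not flicker while the finger crosses a gap between icons."""
    def __init__(self):
        self.current = None

    def update(self, hit):
        """`hit` is the icon found by the hit test, or None when the proximity
        position is in no selection determination area (selection is kept)."""
        if hit is not None:
            self.current = hit
        return self.current
```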
  • Returning to FIG. 5, after that, the processing returns to Step S7. In addition, until, in Step S8, it is determined that the input tool touches the input-output panel 18 or in Step S10, it is determined that the input tool is not caused to be in proximity to the proximity response area Rc′, the processing operations from Step S7 to Step S11 are repeatedly executed.
  • Accordingly, even if the user does not cause the input tool to touch the input-output panel 18, the user can freely change the selection among the image capturing mode icon 106 and the options icons 141 to 143 by moving the input tool within the proximity response area Rc′ while causing it to stay in proximity to the input-output panel 18. In addition, with the change of the selection of an icon, the display form of the selected icon is changed, and the display of the function guide is switched.
  • On the other hand, when, in Step S10, the proximity of the input tool to the input-output panel 18 is not detected or the detected proximity position is out of the proximity response area Rc′, it is determined that the input tool is not caused to be in proximity to the proximity response area Rc′, and the processing proceeds to Step S12.
  • In Step S12, the display control unit 62 restores the response areas of individual icons. Namely, the display control unit 62 reduces the proximity response area Rc′ of the image capturing mode icon 106 to the proximity response area Rc, and cancels the settings of the contact response areas Rt2 to Rt4 of the options icons 141 to 143. Accordingly, the states of the response areas of individual icons are restored from the states in FIG. 9 to the states in FIG. 7.
  • In Step S13, the display control unit 62 sets the options menu 133 to nondisplay through the digital signal processing unit 15. Namely, the options menu 133 is closed, the function guide and the cursor 131 are eliminated, and the display screen of the input-output panel 18 is restored to the state in FIG. 4.
  • After that, the processing returns to Step S3, and processing subsequent to Step S3 is executed.
  • On the other hand, in Step S8, when it is determined that the input tool is caused to touch the input-output panel 18, the processing proceeds to Step S14.
  • In Step S14, the display control unit 62 determines whether or not the input tool is caused to touch within the enlarged proximity response area Rc′ of the image capturing mode icon 106. In addition, when the detected contact position of the input tool is located within the proximity response area Rc′, it is determined that the input tool is caused to touch within the proximity response area Rc′, and the processing proceeds to Step S15.
  • In Step S15, the display control unit 62 causes the display forms of individual icons or the like to be changed through the digital signal processing unit 15, in response to the contact position of the input tool.
  • In addition, in Step S15, basically the same processing as in Step S11 is performed, with the proximity position simply replaced with the contact position. Namely, using the same determination method as in the case in which the proximity position is used, an icon under selection is detected on the basis of the contact position of the input tool. In addition, the display form of the icon under selection is changed, and a function guide corresponding to the icon is displayed.
  • In Step S16, in the same way as the processing in Step S7, the detection of the contact position of the input tool is performed, and the operation information including the detection result of the contact position of the input tool is supplied from the input detection unit 61 to the display control unit 62.
  • In Step S17, on the basis of the detection result due to the input detection unit 61, the display control unit 62 determines whether or not the input tool moves away from the input-output panel 18. When it is determined that the input tool does not move away from the input-output panel 18, in other words, it is determined that the input tool continues to be caused to touch the input-output panel 18, the processing returns to Step S14.
  • After that, the processing operations from Step S14 to S17 are repeatedly executed until, in Step S14, it is determined that the input tool is not caused to touch within the proximity response area Rc′ of the image capturing mode icon 106 or, in Step S17, it is determined that the input tool moves away from the input-output panel 18.
  • Accordingly, by moving the input tool within the proximity response area Rc′ while causing it to touch the input-output panel 18, the user can freely change the selection among the image capturing mode icon 106 and the options icons 141 to 143. In addition, with the change of the selection of an icon, the display form of the selected icon is changed, and the display of the function guide is switched.
  • On the other hand, when, in Step S17, it is determined that the input tool moves away from the input-output panel 18, the processing proceeds to Step S18. This corresponds to a case in which the input tool moves away from the input-output panel 18 after having continued to touch within the proximity response area Rc′.
  • In Step S18, the digital camera 1 executes a function assigned to the selected icon. Specifically, first, the display control unit 62 confirms the selection of an icon that has been selected immediately before the input tool moves away from the input-output panel 18. For example, when the input tool moves away from the input-output panel 18 during the selection of the image capturing mode icon 106, the selection of the image capturing mode icon 106 is confirmed. In the same way, when the input tool moves away from the input-output panel 18 during the selection of any one of the options icons 141 to 143, the selection of the options icon under selection is confirmed. Accordingly, on the basis of the contact position immediately before the input tool moves away from the input-output panel 18, the selection of the icon is confirmed.
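  • The confirmation rule in Step S18 — the icon under selection immediately before the release is the one confirmed — can be sketched as follows (a hypothetical illustration; the class and method names are assumptions, not the patent's implementation):

```python
class TouchTracker:
    """Confirms, on release, the icon that was selected immediately before
    the input tool moved away from the panel (Step S18)."""
    def __init__(self):
        self.last_selected = None

    def on_contact(self, icon):
        """Called while the input tool touches the panel; `icon` is the icon
        under the contact position, or None in a gap (selection is kept)."""
        if icon is not None:
            self.last_selected = icon

    def on_release(self):
        """Called when the input tool moves away; returns the confirmed icon."""
        return self.last_selected
```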
  • In addition, when the selection of any one of the options icons 141 to 143 is confirmed, the display control unit 62 notifies the information processing device 52 of the function assigned to the icon, for example. By controlling individual portions of the digital camera 1, the information processing device 52 causes the notified function to be executed.
  • On the other hand, when the selection of the image capturing mode icon 106 is confirmed, the display control unit 62 causes the input-output panel 18 to display a display screen used for setting the image capturing mode, through the digital signal processing unit 15, for example.
  • FIG. 11 illustrates an example of the display screen (hereinafter, referred to as “image capturing mode setting screen”) displayed on the input-output panel 18.
  • In the image capturing mode setting screen in FIG. 11, icons 201 to 208, to each of which a function for setting one of image capturing modes of the digital camera 1 is assigned, are arrayed in a grid pattern. In addition, for example, the options icons 141 to 143 displayed on the options menu 133 are selected by the user from among the icons 201 to 208.
  • In addition, an icon 209, operated when the description of the function of each icon is displayed, and an icon 210, operated when the image capturing mode setting screen is closed and the image capturing standby screen in FIG. 4 is displayed, are displayed in an upper right corner.
  • In the same way as the operation for the options menu 133, by causing the input tool to be in proximity to or touch the input-output panel 18, the user can select a desired icon from among the icons 201 to 210 on the image capturing mode setting screen, and confirm the selection thereof.
  • In addition, the display control unit 62 notifies the information processing device 52 of the function assigned to the icon whose selection is confirmed. By controlling individual portions of the digital camera 1, the information processing device 52 causes the notified function to be executed.
  • After that, the display control processing is terminated.
  • On the other hand, when, in Step S14, the detected contact position of the input tool is out of the proximity response area Rc′, it is determined that the input tool touches outside of the proximity response area Rc′, and the processing proceeds to Step S19.
  • In Step S19, in the same way as the processing in Step S12, the response areas of individual icons are restored.
  • In Step S20, in the same way as the processing in Step S13, the options menu 133 is set to nondisplay.
  • In Step S21, in the same way as the processing in Step S7, the detection of the contact position of the input tool is performed.
  • In Step S22, in the same way as the processing in Step S17, it is determined whether or not the input tool moves away from the input-output panel 18, and when it is determined that the input tool does not move away from the input-output panel 18, the processing returns to Step S21. After that, the processing operations in Steps S21 and S22 are repeatedly executed until it is determined that the input tool moves away from the input-output panel 18.
  • In addition, when, in Step S22, it is determined that the input tool moves away from the input-output panel 18, the processing returns to Step S3, and the processing subsequent to Step S3 is executed.
  • In such a way as described above, just by causing the input tool to be in proximity to the image capturing mode icon 106, the user can cause the options menu 133 to be displayed. Furthermore, just by then causing the input tool to touch the periphery of one of the options icons 141 to 143, the user can cause a function assigned to the touched icon to be executed.
  • Accordingly, for example, compared with a case where an operation is performed in which, after the image capturing mode icon 106 is touched and the options menu 133 is displayed, one of the options icons 141 to 143 is touched, the number of operation steps is reduced. Therefore, it is possible to swiftly set an image capturing mode, and it is possible to perform image-capturing in a desired image capturing mode without missing a photo opportunity. In addition, it is possible to reduce the burden the user feels in the setting operation of an image capturing mode.
  • In addition, just by moving the input tool while causing it to stay in proximity to the input-output panel 18, the selection of an icon is switched and the display form of the selected icon is changed. Therefore, the user can reliably recognize the selected icon before touching the input-output panel 18, and it is possible to reduce erroneous operations such as the execution of an undesired function due to an erroneous touch on a different icon.
  • Furthermore, after causing the input tool to touch the input-output panel 18, it is also possible to freely select an icon until the input tool is moved away from the input-output panel 18. Therefore, even if the input tool is caused to touch an erroneous icon, it is possible to select a desired icon.
  • In addition, by translucently displaying the options menu 133 that is long and thin in the lateral direction, it is possible to reduce the size of the portion in which the through-the-lens image displayed in the area AC is hidden from view. Therefore, it is possible to set an image capturing mode while fully keeping the subject in view. Accordingly, the user can set an image capturing mode while recognizing changes in the subject, and perform image-capturing without missing a photo opportunity.
  • In addition, a situation does not occur in which, in order to set an image capturing mode, the whole screen is switched and the through-the-lens image is hidden from view, and hence it is possible to reduce stress the user feels owing to the change of the screen during image-capturing.
  • Furthermore, when the input tool is caused to be in proximity to the image capturing mode icon 106, the proximity response area of the image capturing mode icon 106 is enlarged. Therefore, even if the position of the input tool is slightly displaced during the operation thereof, the options menu 133 is continuously displayed without being closed. Accordingly, a situation in which the options menu 133 is erroneously closed is reduced, and the user can swiftly set the image capturing mode without stress.
  • 2. Modifications
  • In addition, an embodiment of the present technology is not limited to the above-mentioned embodiment, and various modifications may be made without departing from the scope of the present technology.
  • First Example of Modification
  • For example, when the image capturing mode icon 106 (contact response area Rt1) is touched, the types of options icons displayed on the options menu 133 may be changed. Namely, the content of the options menu 133 may also be changed depending on whether the input tool is caused to be in proximity to or touch the image capturing mode icon 106.
  • Second Example of Modification
  • In addition, for example, after the image capturing mode icon 106 (contact response area Rt1) is touched, the display of the options menu 133 may also be fixed. Namely, when the image capturing mode icon 106 is touched during the display of the options menu 133, even if the input tool is moved away from the input-output panel 18 after that, the display of the options menu 133 may also be continued. Consequently, the user can carefully select an image capturing mode from within the options menu 133. In addition, for example, when the selection of an icon is confirmed or a portion located outside of the proximity response area Rc′ is touched, the options menu 133 may also be closed.
  • Third Example of Modification
  • Furthermore, for example, when the input tool is brought close to the image capturing mode icon 106, the image capturing mode setting screen in FIG. 11 may also be displayed in place of the options menu 133. In this case, while the through-the-lens image is temporarily hidden from view, fewer operation steps are necessary for selecting an icon, in the same way as when the options menu 133 is operated, so it is possible to swiftly select an image capturing mode from among more candidates and cause the through-the-lens image to be redisplayed.
  • Fourth Example of Modification
  • In addition, for example, the selection of an icon may be confirmed when the input tool touches the input-output panel 18, rather than when the input tool is moved away from the panel (released). Accordingly, it is possible to set an image capturing mode even more swiftly.
  • Fifth Example of Modification
  • Furthermore, for example, when the proximity position or contact position of the input tool moves out of the proximity response area Rc′, the options menu 133 need not be closed immediately; instead, a standby time may be provided. Consequently, even if the proximity position or contact position of the input tool temporarily leaves the proximity response area Rc′, it is possible to keep the options menu 133 displayed by swiftly returning the input tool to within the proximity response area Rc′.
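The standby time behaves like a grace-period timer that is cancelled if the tool returns in time. A minimal deterministic sketch, assuming an explicit clock and an illustrative 1.0 s standby value (the patent does not specify one):

```python
# Hypothetical sketch of the fifth modification's standby time: when the
# input tool leaves the proximity response area Rc', the menu is closed
# only after a grace period expires, so a quick return keeps it open.

STANDBY_SEC = 1.0  # assumed grace period; not specified in the source

class MenuCloser:
    def __init__(self, standby=STANDBY_SEC):
        self.standby = standby
        self.left_at = None   # time the tool left Rc', or None
        self.visible = True

    def update(self, inside_area: bool, now: float):
        if inside_area:
            self.left_at = None          # returned in time: keep menu
        elif self.left_at is None:
            self.left_at = now           # start the standby timer
        elif now - self.left_at >= self.standby:
            self.visible = False         # timer expired: close menu

c = MenuCloser()
c.update(inside_area=False, now=0.0)
c.update(inside_area=True, now=0.5)   # returned before 1.0 s elapsed
print(c.visible)                      # True
c.update(inside_area=False, now=2.0)
c.update(inside_area=False, now=3.5)  # 1.5 s outside the area
print(c.visible)                      # False
```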
  • Sixth Example of Modification
  • In addition, in the above description, an example has been illustrated in which the area used for determining whether or not the options menu 133 is hidden (the proximity response area Rc′) is the same at the time of proximity and at the time of contact; however, the area may be changed between the two. For example, since the position of the input tool with respect to the input-output panel 18 is recognized more easily at the time of contact than at the time of proximity, the proximity response area Rc′ may be made smaller at the time of contact than at the time of proximity.
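This modification can be sketched as choosing the response-area margin from the event type: a precisely sensed contact needs less slack than an imprecisely sensed hover. The margin values below are illustrative assumptions, not values from the source.

```python
# Hypothetical sketch of the sixth modification: the margin by which the
# response area Rc' extends beyond the icon is larger for proximity
# (hover position is imprecise) than for contact (touch is precise).

PROXIMITY_MARGIN = 40  # px, generous slack for hovering (assumed value)
CONTACT_MARGIN = 10    # px, tighter area for touching (assumed value)

def response_margin(event: str) -> int:
    """Margin for the response area, chosen per event type."""
    return CONTACT_MARGIN if event == "contact" else PROXIMITY_MARGIN

def in_response_area(icon, pos, event):
    # icon = (left, top, right, bottom); pos = (x, y)
    m = response_margin(event)
    left, top, right, bottom = icon
    x, y = pos
    return (left - m) <= x <= (right + m) and (top - m) <= y <= (bottom + m)

icon = (200, 0, 260, 60)
print(in_response_area(icon, (290, 30), "proximity"))  # True  (within 40 px)
print(in_response_area(icon, (290, 30), "contact"))    # False (beyond 10 px)
```

The same position can therefore keep the menu open while hovering yet fall outside the area once the tool touches down, matching the rationale given above.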
  • Seventh Example of Modification
  • In the same way, the selection determination area of each icon may be changed between the time of proximity and the time of contact. For example, since, as described above, the position of the input tool with respect to the input-output panel 18 is recognized more easily at the time of contact than at the time of proximity, the selection determination area may be made smaller at the time of contact than at the time of proximity.
  • Eighth Example of Modification
  • In addition, while, in the above description, an example has been illustrated in which the selection determination area of each icon is made larger than its contact response area, the contact response area may also be used as the selection determination area without change.
  • Ninth Example of Modification
  • Furthermore, the contact response area (or the selection determination area) of each icon does not necessarily include the entire display area of the corresponding icon, and may include only part of it. For example, when icons are closely spaced, setting the contact response area (or the selection determination area) to an area smaller than the display area of each icon makes it possible to prevent an adjacent icon from being erroneously selected.
  • In addition, the sizes of the various kinds of response areas may also be set differently for each icon.
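The ninth modification is the inverse of the enlargement idea: insetting each hit rectangle leaves a dead zone between closely spaced neighbors. A minimal sketch under assumed geometry (all names and pixel values are illustrative):

```python
# Hypothetical sketch: a contact response area smaller than the icon's
# display area. With two icons displayed side by side with no gap, an
# edge touch inside one icon's display area selects neither icon once
# the hit rectangles are inset, preventing erroneous adjacent selection.

def inset(rect, inset_px):
    left, top, right, bottom = rect
    return (left + inset_px, top + inset_px,
            right - inset_px, bottom - inset_px)

def hit(rect, pos):
    left, top, right, bottom = rect
    x, y = pos
    return left <= x <= right and top <= y <= bottom

icon_a = (0, 0, 50, 50)     # display areas of two adjacent icons
icon_b = (50, 0, 100, 50)
touch = (51, 25)            # barely inside icon_b's display area

print(hit(icon_b, touch))            # True: full display area responds
print(hit(inset(icon_b, 5), touch))  # False: inset area ignores edge touch
print(hit(inset(icon_a, 5), touch))  # False: the neighbor is not selected
```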
  • Tenth Example of Modification
  • Furthermore, while, in the above description, an example has been illustrated in which the selection state of an icon is maintained when the proximity position or contact position of the input tool is not included in the selection determination area of any icon, the selection of an icon may instead be canceled while the options menu 133 remains displayed.
  • Eleventh Example of Modification
  • In addition, the arrangement of the options icons within the options menu 133 is not limited to the above-mentioned example, and may also be arbitrarily set. For example, it is possible to arrange the options icons in tandem or two-dimensionally.
  • Furthermore, the shape, size, display position, display direction, or the like of the options menu 133 may be changed depending on the position of the image capturing mode icon 106, the direction or position of the sequence of icons including the image capturing mode icon 106, the content of the options menu 133, or the like.
  • Twelfth Example of Modification
  • In addition, while, in the above description, a menu configuration has been illustrated in which the image capturing mode icon 106 serving as the trigger of the display of the options menu 133 is itself selectable, the present technology may also be applied to a menu configuration in which the icon serving as the trigger of menu display is not selectable. For example, this corresponds to a menu configuration in which the image capturing mode icon 106 is used only for displaying the options menu 133 and only the options icons 141 to 143 are selectable.
  • Thirteenth Example of Modification
  • Furthermore, the present technology may also be applied to cases in which an operation other than the setting of the image capturing mode is performed. For example, a case may be considered in which the shortcut functions of various kinds of operations are assigned to a plurality of shortcut icons and a menu including these shortcut icons is displayed when the input tool is brought into proximity to a predetermined icon.
  • Fourteenth Example of Modification
  • In addition, for example, the present technology may also be applied to a case in which items other than icons, such as a character string, a thumbnail of a moving image or a still image, and the like, are selected. Accordingly, for example, it is also possible to configure an item within the options menu 133, using a character string indicating a function name or the like.
  • Fifteenth Example of Modification
  • Furthermore, for example, as information relating to an item under selection, it is also possible to display information other than functions, such as a file name, an image capturing date, an updating date, or the like.
  • Sixteenth Example of Modification
  • In addition, the present technology may also be applied to cases in which a plurality of items can be selected at one time. For example, the present technology may be applied to a case in which, by causing the input tool to be in proximity to or touch a portion located between a plurality of items, the plurality of items at the periphery thereof are selected at one time, or a case in which, because the ranges of the detected proximity position and contact position are large, a plurality of items can be selected at one time. In this case, for example, it is only necessary to change the display forms of the plural items under selection as described above.
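The multiple-selection case above can be sketched as collecting every item whose (expanded) area contains the detected position. The geometry, margin, and names below are illustrative assumptions:

```python
# Hypothetical sketch of the sixteenth modification: when the detected
# position lies between several items, every item whose expanded area
# contains the position is treated as selected at one time, and each
# such item would then change its display form.

def expand(rect, margin):
    left, top, right, bottom = rect
    return (left - margin, top - margin, right + margin, bottom + margin)

def selected_items(items, pos, margin=10):
    """Return names of all items whose expanded area contains pos."""
    x, y = pos
    hits = []
    for name, rect in items.items():
        l, t, r, b = expand(rect, margin)
        if l <= x <= r and t <= y <= b:
            hits.append(name)
    return hits

items = {"icon_a": (0, 0, 50, 50), "icon_b": (60, 0, 110, 50)}
# A proximity position in the gap between the two icons selects both.
print(selected_items(items, (55, 25)))  # ['icon_a', 'icon_b']
```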
  • Seventeenth Example of Modification
  • Furthermore, the present technology may also be applied to devices other than digital cameras, each of which includes a display device operable by proximity, or by proximity and touch, as well as to devices that perform the display control of such display devices. For example, the present technology may also be applied to personal computers (PCs), personal digital assistants (for example, mobile phones, portable music players, electronic book readers, and the like), digital video cameras, game consoles, navigation devices, television receivers, displays, controllers (including remote controllers) for various kinds of devices having display functions, and the like.
  • Specifically, for example, the present technology may also be applied to a case in which, when images are edited using a PC or the like, a menu used for selecting the type of editing is displayed while the images under editing continue to be displayed. In addition, for example, the present technology may also be applied to a case in which, when a map is browsed using a PC, a navigation device, or the like, a menu used for selecting the display form of the map (satellite photograph display, map display, landmark display, or the like) is displayed.
  • The present disclosure contains subject matter related to that disclosed in Japanese Priority Patent Application JP 2010-291081 filed in the Japan Patent Office on Dec. 27, 2010, the entire contents of which are hereby incorporated by reference.
  • It should be understood by those skilled in the art that various modifications, combinations, sub-combinations and alterations may occur depending on design requirements and other factors insofar as they are within the scope of the appended claims or the equivalents thereof.

Claims (31)

  1. (canceled)
  2. An image processing apparatus, comprising:
    circuitry configured to
    display a plurality of menu items, including a first menu item, on a display, each menu item being displayed on the display in association with a distinct proximity response area, the plurality of menu items being displayed on the display in response to a first user operation;
    change a display state of the first menu item in response to detection of proximity of an object to a first proximity response area associated with the first menu item; and
    perform a function associated with the first menu item in response to a touch detected within the first proximity response area,
    wherein the first proximity response area includes at least a display area of the first menu item,
    the change in display state includes at least a change in a display color of the first menu item, and
    the plurality of displayed items are displayed over a captured image.
  3. The image processing apparatus according to claim 2, wherein the first proximity response area includes regions on the display other than the display area of the first menu item.
  4. The image processing apparatus according to claim 2, wherein the display of a function guide related to the first menu item is on a region other than a display area of any of the plurality of menu items.
  5. The image processing apparatus according to claim 2, wherein each of the plurality of menu items has a predetermined shape and a predetermined size.
  6. The image processing apparatus according to claim 2, wherein the first user operation is performed during a detection of the proximity of the object relative to the display.
  7. The image processing apparatus according to claim 6, wherein a position of the plurality of menu items is related to a position of proximity of the object on the display.
  8. The image processing apparatus according to claim 2, wherein the circuitry is further configured to restore the display state of the first menu item in response to the proximity of the object to a display region outside the first proximity response area.
  9. The image processing apparatus according to claim 2, wherein the circuitry is further configured to restore the display state of the first menu item in response to detection of the object as being in proximity to a display region outside the first proximity response area.
  10. The image processing apparatus according to claim 2, wherein the circuitry is further configured to provide a standby time to prolong the display state of the first menu item in response to the proximity of the object to a display region outside the first proximity response area.
  11. The image processing apparatus according to claim 10, wherein the prolonged display state of the first menu item is continued by returning the proximity of the object within the first proximity response area before the standby time expires.
  12. The image processing apparatus according to claim 2, wherein at least part of the plurality of menu items is partially translucent so that the captured image remains at least partially visible on the display.
  13. The image processing apparatus according to claim 2, wherein the display includes a touch panel.
  14. The image processing apparatus according to claim 2, wherein the object is a stylus provided along with the image processing apparatus.
  15. The image processing apparatus according to claim 2, wherein the object is a finger of the user.
  16. The image processing apparatus according to claim 2, wherein the circuitry is further configured to capture images.
  17. The image processing apparatus according to claim 2, wherein the change in the display state further includes at least one of a change in display size, shape, design, function guide display, or brightness of the first menu item.
  18. The image processing apparatus according to claim 2, wherein each of the plurality of menu items is associated with distinct contact response areas.
  19. The image processing apparatus according to claim 2, wherein the circuitry is further configured to remove the display of the plurality of menu items in response to a touch detected outside all proximity response areas associated with the plurality of menu items.
  20. The image processing apparatus according to claim 2, wherein the circuitry is further configured to remove the display of the plurality of menu items prior to performing the function associated with the first menu item in response to the touch detected within the first proximity response area.
  21. An image capturing apparatus, comprising:
    an input-output interface including a touch display and a stylus; and
    circuitry configured to
    capture image data,
    detect proximity of the stylus over the touch display,
    display a cursor on a position of the touch display where the proximity of the stylus is detected,
    display a plurality of menu items, including a first menu item, on the touch display, each menu item being displayed on the touch display in association with a distinct proximity response area, the plurality of menu items being displayed on the touch display in response to a first user operation,
    change a display state of the first menu item in response to a detection of proximity of an object to a first proximity response area associated with the first menu item, and
    perform a function associated with the first menu item in response to a touch detected within the first proximity response area,
    wherein the first proximity response area includes at least a display area of the first menu item,
    the change in display state includes at least a change in a display color of the first menu item and a display of a function guide associated with the first menu item, and
    the plurality of displayed items are displayed over the captured image data.
  22. An image processing method, comprising:
    displaying, with circuitry, a plurality of menu items, including a first menu item, on a display, each menu item being displayed on the display in association with a distinct proximity response area, the plurality of menu items being displayed on the display in response to a first user operation;
    changing, with the circuitry, a display state of the first menu item in response to detection of proximity of an object to a first proximity response area associated with the first menu item; and
    performing, with the circuitry, a function associated with the first menu item in response to a touch detected within the first proximity response area,
    wherein the first proximity response area includes at least a display area of the first menu item,
    the change in display state includes at least a change in a display color of the first menu item, and
    the plurality of displayed items are displayed over a captured image.
  23. The image processing method according to claim 22, wherein the first proximity response area includes regions on the display other than the display area of the first menu item.
  24. The image processing method according to claim 22, wherein the display of a function of the first menu item is on a region other than a display area of any of the plurality of menu items.
  25. The image processing method according to claim 22, wherein the first user operation is performed during a detection of the proximity of the object relative to the display.
  26. The image processing method according to claim 25, wherein a position of the plurality of menu items is related to a position of proximity of the object on the display.
  27. A non-transitory computer-readable medium encoded with computer-readable instructions that, when executed by processing circuitry, cause the processing circuitry to perform an image processing method, comprising:
    displaying a plurality of menu items, including a first menu item, on a display, each menu item being displayed on the display in association with a distinct proximity response area, the plurality of menu items being displayed on the display in response to a first user operation;
    changing a display state of the first menu item in response to detection of proximity of an object to a first proximity response area associated with the first menu item; and
    performing a function associated with the first menu item in response to a touch detected within the first proximity response area,
    wherein the first proximity response area includes at least a display area of the first menu item,
    the change in display state includes at least a change in a display color of the first menu item, and
    the plurality of displayed items are displayed over a captured image.
  28. The non-transitory computer-readable medium according to claim 27, wherein the first proximity response area includes regions on the display other than the display area of the first menu item.
  29. The non-transitory computer-readable medium according to claim 27, wherein the display of a function of the first menu item is on a region other than a display area of any of the plurality of menu items.
  30. The non-transitory computer-readable medium according to claim 27, wherein the first user operation is performed during a detection of the proximity of the object relative to the display.
  31. The non-transitory computer-readable medium according to claim 30, wherein a position of the plurality of menu items is related to a position of proximity of the object on the display.
US15050920 2010-12-27 2016-02-23 Display control device, method and computer program product Pending US20160170585A1 (en)

Priority Applications (4)

Application Number Priority Date Filing Date Title
JP2010-291081 2010-12-27
JP2010291081A JP5652652B2 (en) 2010-12-27 2010-12-27 Display control apparatus and method
US13281490 US9329776B2 (en) 2010-12-27 2011-10-26 Display control device, method and computer program product
US15050920 US20160170585A1 (en) 2010-12-27 2016-02-23 Display control device, method and computer program product

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
US15050920 US20160170585A1 (en) 2010-12-27 2016-02-23 Display control device, method and computer program product

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US13281490 Continuation US9329776B2 (en) 2010-12-27 2011-10-26 Display control device, method and computer program product

Publications (1)

Publication Number Publication Date
US20160170585A1 US20160170585A1 (en) 2016-06-16

Family

ID=45442856

Family Applications (2)

Application Number Title Priority Date Filing Date
US13281490 Active 2034-06-18 US9329776B2 (en) 2010-12-27 2011-10-26 Display control device, method and computer program product
US15050920 Pending US20160170585A1 (en) 2010-12-27 2016-02-23 Display control device, method and computer program product

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US13281490 Active 2034-06-18 US9329776B2 (en) 2010-12-27 2011-10-26 Display control device, method and computer program product

Country Status (6)

Country Link
US (2) US9329776B2 (en)
EP (1) EP2469396A3 (en)
JP (1) JP5652652B2 (en)
KR (1) KR101860571B1 (en)
CN (1) CN102591570B (en)
RU (1) RU2519481C2 (en)

Families Citing this family (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5906097B2 (en) * 2012-01-31 2016-04-20 キヤノン株式会社 Electronic apparatus, a control method, a program, and a recording medium
JP5911321B2 (en) * 2012-02-02 2016-04-27 キヤノン株式会社 Control method for a display control apparatus and a display control unit
DE102012014910A1 (en) * 2012-07-27 2014-01-30 Volkswagen Aktiengesellschaft Operator interface method for displaying an operation of a user interface facilitating information and program
KR20140021821A (en) * 2012-08-09 2014-02-21 삼성전자주식회사 Image capturing apparatus and image capturing method
JP5812054B2 (en) * 2012-08-23 2015-11-11 株式会社デンソー Operation device
CN102902406B (en) * 2012-08-24 2015-11-18 深圳雷柏科技股份有限公司 A touch-type terminal and control system of the electronic device
CN102902476B (en) * 2012-08-24 2015-08-19 深圳雷柏科技股份有限公司 A method for touch-terminal control apparatus and the electronic system
JP5854280B2 (en) * 2012-10-03 2016-02-09 ソニー株式会社 The information processing apparatus, information processing method, and program
KR101401480B1 (en) * 2012-10-31 2014-05-29 길상복 Mask display apparatus and method for learning
US9164609B2 (en) * 2013-03-13 2015-10-20 Amazon Technologies, Inc. Managing sensory information of a user device
KR20140112915A (en) * 2013-03-14 2014-09-24 삼성전자주식회사 User device and operating method thereof
JP5820414B2 (en) * 2013-03-19 2015-11-24 株式会社Nttドコモ Information processing apparatus and information processing method
US9570019B2 (en) * 2014-03-20 2017-02-14 Dell Products, Lp System and method for coordinating image capture in a camera hidden behind a display device
CN104182313B (en) * 2014-08-14 2018-09-04 小米科技有限责任公司 Delay photographing method and apparatus
US9641737B2 (en) 2014-08-14 2017-05-02 Xiaomi Inc. Method and device for time-delay photographing
JP2016046676A (en) * 2014-08-22 2016-04-04 株式会社リコー Imaging apparatus and imaging method
WO2016121329A1 (en) * 2015-01-29 2016-08-04 パナソニックIpマネジメント株式会社 Image processing device, stylus, and image processing method
CN106033295A (en) * 2015-03-09 2016-10-19 阿里巴巴集团控股有限公司 A menu display method and device and a mobile terminal
KR20160133781A (en) * 2015-05-13 2016-11-23 엘지전자 주식회사 Mobile terminal and method for controlling the same
CN105278815A (en) * 2015-06-30 2016-01-27 维沃移动通信有限公司 Method for checking next level of menu by terminal, and terminal
CN105094553B (en) * 2015-07-24 2018-05-18 广州久邦世纪科技有限公司 Seeds effects menu bar implementation method

Citations (70)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5363481A (en) * 1992-06-22 1994-11-08 Tektronix, Inc. Auto selecting scrolling device
US5664133A (en) * 1993-12-13 1997-09-02 Microsoft Corporation Context sensitive menu system/menu behavior
US5745717A (en) * 1995-06-07 1998-04-28 Vayda; Mark Graphical menu providing simultaneous multiple command selection
US5828376A (en) * 1996-09-23 1998-10-27 J. D. Edwards World Source Company Menu control in a graphical user interface
US5850218A (en) * 1997-02-19 1998-12-15 Time Warner Entertainment Company L.P. Inter-active program guide with default selection control
US20010054183A1 (en) * 2000-03-31 2001-12-20 Curreri Matthew R. Program surf grid
US20020089543A1 (en) * 2000-12-15 2002-07-11 Christian Ostergaard Recovering managent in a communication unit terminal
US6456892B1 (en) * 1998-07-01 2002-09-24 Sony Electronics, Inc. Data driven interaction for networked control of a DDI target device over a home entertainment network
US20030115167A1 (en) * 2000-07-11 2003-06-19 Imran Sharif Web browser implemented in an Internet appliance
US6707475B1 (en) * 2000-09-19 2004-03-16 Honeywell International Inc. System for selecting and displaying flight management system procedures
US20040051804A1 (en) * 2002-09-12 2004-03-18 Eastman Kodak Company Mutual display support for a digital information/imaging system
US6717600B2 (en) * 2000-12-15 2004-04-06 International Business Machines Corporation Proximity selection of selectable item in a graphical user interface
US20040095395A1 (en) * 1995-06-06 2004-05-20 Silicon Graphics, Inc. Method and apparatus for producing, controlling and displaying menus
US6999066B2 (en) * 2002-06-24 2006-02-14 Xerox Corporation System for audible feedback for touch screen displays
US20060112353A1 (en) * 2004-11-19 2006-05-25 Nintendo Co., Ltd. Image processing apparatus and storage medium storing image processing program
US20060129908A1 (en) * 2003-01-28 2006-06-15 Markel Steven O On-content streaming media enhancement
US20060158426A1 (en) * 2005-01-14 2006-07-20 Alps Electric Co., Ltd. Display apparatus with input devices
US20060187212A1 (en) * 2005-02-24 2006-08-24 Samsung Electronics Co., Ltd. User interface apparatus and method
US7158123B2 (en) * 2003-01-31 2007-01-02 Xerox Corporation Secondary touch contextual sub-menu navigation for touch screen interface
US20070236476A1 (en) * 2006-04-06 2007-10-11 Alps Electric Co., Ltd. Input device and computer system using the input device
US20070250786A1 (en) * 2006-04-19 2007-10-25 Byeong Hui Jeon Touch screen device and method of displaying and selecting menus thereof
US20070247446A1 (en) * 2006-04-25 2007-10-25 Timothy James Orsley Linear positioning input device
US20080062127A1 (en) * 2006-09-11 2008-03-13 Apple Computer, Inc. Menu overlay including context dependent menu icon
US20080158172A1 (en) * 2007-01-03 2008-07-03 Apple Computer, Inc. Proximity and multi-touch sensor detection and demodulation
US20080180549A1 (en) * 2007-01-31 2008-07-31 Samsung Electronics Co., Ltd. Multifunctional video apparatus and method of providing user interface thereof
US20090007020A1 (en) * 2007-06-28 2009-01-01 Sony Corporation Image display device, image pickup apparatus, image display method, and program thereof
US20090119708A1 (en) * 2007-11-07 2009-05-07 Comcast Cable Holdings, Llc User interface display without output device rendering
US20090157513A1 (en) * 2007-12-17 2009-06-18 Bonev Robert Communications system and method for serving electronic content
US20090201248A1 (en) * 2006-07-05 2009-08-13 Radu Negulescu Device and method for providing electronic input
US20090239588A1 (en) * 2008-03-21 2009-09-24 Lg Electronics Inc. Mobile terminal and screen displaying method thereof
US7600201B2 (en) * 2004-04-07 2009-10-06 Sony Corporation Methods and apparatuses for viewing choices and making selections
US20090289895A1 (en) * 2008-01-25 2009-11-26 Toru Nakada Electroencephalogram interface system, electroencephalogram interface apparatus, method, and computer program
US20100060597A1 (en) * 2008-09-10 2010-03-11 Samsung Digital Imaging Co., Ltd. Method and apparatus for displaying and selecting icons on a touch screen
US20100079405A1 (en) * 2008-09-30 2010-04-01 Jeffrey Traer Bernstein Touch Screen Device, Method, and Graphical User Interface for Moving On-Screen Objects Without Using a Cursor
US7697827B2 (en) * 2005-10-17 2010-04-13 Konicek Jeffrey C User-friendlier interfaces for a camera
US20100107099A1 (en) * 2008-10-27 2010-04-29 Verizon Data Services, Llc Proximity interface apparatuses, systems, and methods
US20100162168A1 (en) * 2008-12-24 2010-06-24 Research In Motion Limited Methods and systems for managing memory and processing resources for the control of a display screen to fix displayed positions of selected items on the display screen
US20100194682A1 (en) * 2009-01-30 2010-08-05 Research In Motion Limited Method for tap detection and for interacting with a handheld electronic device, and a handheld electronic device configured therefor
US20100251152A1 (en) * 2009-03-31 2010-09-30 Seong Yoon Cho Mobile terminal and controlling method thereof
US20110016390A1 (en) * 2009-07-14 2011-01-20 Pantech Co. Ltd. Mobile terminal to display menu information according to touch signal
US20110109577A1 (en) * 2009-11-12 2011-05-12 Samsung Electronics Co., Ltd. Method and apparatus with proximity touch detection
US20110113371A1 (en) * 2009-11-06 2011-05-12 Robert Preston Parker Touch-Based User Interface User Error Handling
US20110163969A1 (en) * 2010-01-06 2011-07-07 Freddy Allen Anzures Device, Method, and Graphical User Interface with Content Display Modes and Display Rotation Heuristics
US20110209093A1 (en) * 2010-02-19 2011-08-25 Microsoft Corporation Radial menus with bezel gestures
US20110209097A1 (en) * 2010-02-19 2011-08-25 Hinckley Kenneth P Use of Bezel as an Input Mechanism
US20110298732A1 (en) * 2010-06-03 2011-12-08 Sony Ericsson Mobile Communications Japan, Inc. Information processing apparatus and information processing method
US20120120002A1 (en) * 2010-11-17 2012-05-17 Sony Corporation System and method for display proximity based control of a touch screen user interface
US8219152B2 (en) * 2008-09-03 2012-07-10 Lg Electronics Inc. Mobile terminal and control method thereof
US8219936B2 (en) * 2007-08-30 2012-07-10 Lg Electronics Inc. User interface for a mobile device using a user's gesture in the proximity of an electronic device
US20120260164A1 (en) * 2009-12-18 2012-10-11 Honda Motor Co., Ltd. Morphable Pad for Tactile Control
US20120266069A1 (en) * 2009-12-28 2012-10-18 Hillcrest Laboratories, Inc. TV Internet Browser
US8335997B2 (en) * 2008-09-18 2012-12-18 Chi Mei Communication Systems, Inc. Electronic device and method for sorting menu options of a program menu in the electronic device
US20120327009A1 (en) * 2009-06-07 2012-12-27 Apple Inc. Devices, methods, and graphical user interfaces for accessibility using a touch-sensitive surface
US8381135B2 (en) * 2004-07-30 2013-02-19 Apple Inc. Proximity detector in handheld device
US20130073286A1 (en) * 2011-09-20 2013-03-21 Apple Inc. Consolidating Speech Recognition Results
US8458591B2 (en) * 2006-11-06 2013-06-04 Sony Corporation Image pickup apparatus, method for controlling display of image pickup apparatus, and computer program for executing method for controlling display of image pickup apparatus
US20130207989A1 (en) * 2006-04-28 2013-08-15 Sony Corporation Character highlighting control apparatus, display apparatus, highlighting display control method, and computer program
US20130263055A1 (en) * 2009-09-25 2013-10-03 Apple Inc. Device, Method, and Graphical User Interface for Manipulating User Interface Objects
US20140129941A1 (en) * 2011-11-08 2014-05-08 Panasonic Corporation Information display processing device
US20140143698A1 (en) * 2012-11-19 2014-05-22 Samsung Electronics Co., Ltd. Method and apparatus for providing user interface through proximity touch input
US20140165006A1 (en) * 2010-04-07 2014-06-12 Apple Inc. Device, Method, and Graphical User Interface for Managing Folders with Multiple Pages
US20140204279A1 (en) * 2010-08-06 2014-07-24 Samsung Electronics Co., Ltd. Display apparatus and control method thereof
US20150212628A1 (en) * 2008-06-24 2015-07-30 Lg Electronics Inc. Mobile terminal capable of sensing proximity touch
US9201520B2 (en) * 2011-02-11 2015-12-01 Microsoft Technology Licensing, Llc Motion and context sharing for pen-based computing inputs
US9239677B2 (en) * 2004-05-06 2016-01-19 Apple Inc. Operation of a computer with touch screen interface
US9261985B2 (en) * 2013-03-11 2016-02-16 Barnes & Noble College Booksellers, Llc Stylus-based touch-sensitive area for UI control of computing device
US9395914B2 (en) * 2009-10-26 2016-07-19 Samsung Electronics Co., Ltd. Method for providing touch screen-based user interface and portable terminal adapted to the method
US20160370982A1 (en) * 2015-06-18 2016-12-22 Apple Inc. Device, Method, and Graphical User Interface for Navigating Media Content
US20170098290A1 (en) * 2005-12-14 2017-04-06 Harold W. Milton, Jr. System for preparing a patent application
US20170097747A1 (en) * 2005-12-14 2017-04-06 Harold W. Milton, Jr. System for preparing a patent application

Family Cites Families (31)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9023A (en) * 1852-06-15 Carriage-axle
US5119079A (en) * 1990-09-17 1992-06-02 Xerox Corporation Touch screen user interface with expanding touch locations for a reprographic machine
JPH05197494A (en) * 1991-10-11 1993-08-06 Fuji Photo Film Co Ltd Touch panel input device
US5798760A (en) 1995-06-07 1998-08-25 Vayda; Mark Radial graphical menuing system with concentric region menuing
US7730401B2 (en) * 2001-05-16 2010-06-01 Synaptics Incorporated Touch screen with user interface enhancement
JP2002358162A (en) 2001-06-01 2002-12-13 Sony Corp Picture display device
US6886138B2 (en) * 2001-07-05 2005-04-26 International Business Machines Corporation Directing users' attention to specific icons being approached by an on-screen pointer on user interactive display interfaces
KR100520664B1 (en) * 2003-02-06 2005-10-10 삼성전자주식회사 Method for changing setting of user setting menu in mobile terminal
US7103852B2 (en) * 2003-03-10 2006-09-05 International Business Machines Corporation Dynamic resizing of clickable areas of touch screen applications
JP4555818B2 (en) * 2003-04-08 2010-10-06 フェイヴァリット システムズ エーエス Window system including a computer device and control thereof
US20050115816A1 (en) * 2003-07-23 2005-06-02 Neil Gelfond Accepting user control
US20050071761A1 (en) * 2003-09-25 2005-03-31 Nokia Corporation User interface on a portable electronic device
JP2005328379A (en) * 2004-05-14 2005-11-24 Toshiba Corp Input guide display operating system
EP1766502A2 (en) * 2004-06-29 2007-03-28 Philips Electronics N.V. Multi-layered display of a graphical user interface
US7239249B2 (en) * 2005-03-17 2007-07-03 Xerox Corporation Menu sign system
US7758799B2 (en) * 2005-04-01 2010-07-20 3D Systems, Inc. Edge smoothness with low resolution projected images for use in solid imaging
US7644374B2 (en) * 2005-04-14 2010-01-05 Microsoft Corporation Computer input control for specifying scope with explicit exclusions
JP2006303714A (en) * 2005-04-18 2006-11-02 Olympus Imaging Corp Mobile image display apparatus and camera including image display apparatus
US8645863B2 (en) * 2007-06-29 2014-02-04 Microsoft Corporation Menus with translucency and live preview
US20110012856A1 (en) * 2008-03-05 2011-01-20 Rpo Pty. Limited Methods for Operation of a Touch Input Device
US8576181B2 (en) * 2008-05-20 2013-11-05 Lg Electronics Inc. Mobile terminal using proximity touch and wallpaper controlling method thereof
EP2131272A3 (en) * 2008-06-02 2014-05-07 LG Electronics Inc. Mobile communication terminal having proximity sensor and display controlling method therein
JP2010019643A (en) * 2008-07-09 2010-01-28 Toyota Motor Corp Information terminal, navigation apparatus, and option display method
US20100026640A1 (en) * 2008-08-01 2010-02-04 Samsung Electronics Co., Ltd. Electronic apparatus and method for implementing user interface
KR101531192B1 (en) * 2008-11-14 2015-06-25 엘지전자 주식회사 Mobile terminal and map display method using the same
KR101544475B1 (en) * 2008-11-28 2015-08-13 엘지전자 주식회사 Input and output control via touch
KR20100069842A (en) * 2008-12-17 2010-06-25 삼성전자주식회사 Electronic apparatus implementing user interface and method thereof
KR101055924B1 (en) * 2009-05-26 2011-08-09 주식회사 팬택 User interface apparatus and method in a touch device
US8386965B2 (en) * 2010-01-15 2013-02-26 Apple Inc. Techniques and systems for enhancing touch screen device accessibility through virtual containers and virtually enlarged boundaries
JP5617603B2 (en) * 2010-12-21 2014-11-05 ソニー株式会社 Display control device, display control method, and program

Patent Citations (73)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5363481A (en) * 1992-06-22 1994-11-08 Tektronix, Inc. Auto selecting scrolling device
US5664133A (en) * 1993-12-13 1997-09-02 Microsoft Corporation Context sensitive menu system/menu behavior
US20040095395A1 (en) * 1995-06-06 2004-05-20 Silicon Graphics, Inc. Method and apparatus for producing, controlling and displaying menus
US5745717A (en) * 1995-06-07 1998-04-28 Vayda; Mark Graphical menu providing simultaneous multiple command selection
US5828376A (en) * 1996-09-23 1998-10-27 J. D. Edwards World Source Company Menu control in a graphical user interface
US5850218A (en) * 1997-02-19 1998-12-15 Time Warner Entertainment Company L.P. Inter-active program guide with default selection control
US6456892B1 (en) * 1998-07-01 2002-09-24 Sony Electronics, Inc. Data driven interaction for networked control of a DDI target device over a home entertainment network
US20010054183A1 (en) * 2000-03-31 2001-12-20 Curreri Matthew R. Program surf grid
US20030115167A1 (en) * 2000-07-11 2003-06-19 Imran Sharif Web browser implemented in an Internet appliance
US6707475B1 (en) * 2000-09-19 2004-03-16 Honeywell International Inc. System for selecting and displaying flight management system procedures
US6717600B2 (en) * 2000-12-15 2004-04-06 International Business Machines Corporation Proximity selection of selectable item in a graphical user interface
US20020089543A1 (en) * 2000-12-15 2002-07-11 Christian Ostergaard Recovering management in a communication unit terminal
US6999066B2 (en) * 2002-06-24 2006-02-14 Xerox Corporation System for audible feedback for touch screen displays
US20040051804A1 (en) * 2002-09-12 2004-03-18 Eastman Kodak Company Mutual display support for a digital information/imaging system
US20060129908A1 (en) * 2003-01-28 2006-06-15 Markel Steven O On-content streaming media enhancement
US7158123B2 (en) * 2003-01-31 2007-01-02 Xerox Corporation Secondary touch contextual sub-menu navigation for touch screen interface
US7600201B2 (en) * 2004-04-07 2009-10-06 Sony Corporation Methods and apparatuses for viewing choices and making selections
US9239677B2 (en) * 2004-05-06 2016-01-19 Apple Inc. Operation of a computer with touch screen interface
US8381135B2 (en) * 2004-07-30 2013-02-19 Apple Inc. Proximity detector in handheld device
US7428708B2 (en) * 2004-11-19 2008-09-23 Nintendo Co., Ltd. Image processing apparatus and storage medium storing image processing program
US20060112353A1 (en) * 2004-11-19 2006-05-25 Nintendo Co., Ltd. Image processing apparatus and storage medium storing image processing program
US20060158426A1 (en) * 2005-01-14 2006-07-20 Alps Electric Co., Ltd. Display apparatus with input devices
US20060187212A1 (en) * 2005-02-24 2006-08-24 Samsung Electronics Co., Ltd. User interface apparatus and method
US7697827B2 (en) * 2005-10-17 2010-04-13 Konicek Jeffrey C User-friendlier interfaces for a camera
US20170098290A1 (en) * 2005-12-14 2017-04-06 Harold W. Milton, Jr. System for preparing a patent application
US20170097747A1 (en) * 2005-12-14 2017-04-06 Harold W. Milton, Jr. System for preparing a patent application
US20070236476A1 (en) * 2006-04-06 2007-10-11 Alps Electric Co., Ltd. Input device and computer system using the input device
US20070250786A1 (en) * 2006-04-19 2007-10-25 Byeong Hui Jeon Touch screen device and method of displaying and selecting menus thereof
US20070247446A1 (en) * 2006-04-25 2007-10-25 Timothy James Orsley Linear positioning input device
US20130207989A1 (en) * 2006-04-28 2013-08-15 Sony Corporation Character highlighting control apparatus, display apparatus, highlighting display control method, and computer program
US20090201248A1 (en) * 2006-07-05 2009-08-13 Radu Negulescu Device and method for providing electronic input
US20080062127A1 (en) * 2006-09-11 2008-03-13 Apple Computer, Inc. Menu overlay including context dependent menu icon
US8458591B2 (en) * 2006-11-06 2013-06-04 Sony Corporation Image pickup apparatus, method for controlling display of image pickup apparatus, and computer program for executing method for controlling display of image pickup apparatus
US20080158172A1 (en) * 2007-01-03 2008-07-03 Apple Computer, Inc. Proximity and multi-touch sensor detection and demodulation
US20080180549A1 (en) * 2007-01-31 2008-07-31 Samsung Electronics Co., Ltd. Multifunctional video apparatus and method of providing user interface thereof
US20090007020A1 (en) * 2007-06-28 2009-01-01 Sony Corporation Image display device, image pickup apparatus, image display method, and program thereof
US8219936B2 (en) * 2007-08-30 2012-07-10 Lg Electronics Inc. User interface for a mobile device using a user's gesture in the proximity of an electronic device
US20090119708A1 (en) * 2007-11-07 2009-05-07 Comcast Cable Holdings, Llc User interface display without output device rendering
US20090157513A1 (en) * 2007-12-17 2009-06-18 Bonev Robert Communications system and method for serving electronic content
US20090289895A1 (en) * 2008-01-25 2009-11-26 Toru Nakada Electroencephalogram interface system, electroencephalogram interface apparatus, method, and computer program
US20090239588A1 (en) * 2008-03-21 2009-09-24 Lg Electronics Inc. Mobile terminal and screen displaying method thereof
US20150212628A1 (en) * 2008-06-24 2015-07-30 Lg Electronics Inc. Mobile terminal capable of sensing proximity touch
US8219152B2 (en) * 2008-09-03 2012-07-10 Lg Electronics Inc. Mobile terminal and control method thereof
US20100060597A1 (en) * 2008-09-10 2010-03-11 Samsung Digital Imaging Co., Ltd. Method and apparatus for displaying and selecting icons on a touch screen
US8335997B2 (en) * 2008-09-18 2012-12-18 Chi Mei Communication Systems, Inc. Electronic device and method for sorting menu options of a program menu in the electronic device
US20100079405A1 (en) * 2008-09-30 2010-04-01 Jeffrey Traer Bernstein Touch Screen Device, Method, and Graphical User Interface for Moving On-Screen Objects Without Using a Cursor
US20100107099A1 (en) * 2008-10-27 2010-04-29 Verizon Data Services, Llc Proximity interface apparatuses, systems, and methods
US20100162168A1 (en) * 2008-12-24 2010-06-24 Research In Motion Limited Methods and systems for managing memory and processing resources for the control of a display screen to fix displayed positions of selected items on the display screen
US20100194682A1 (en) * 2009-01-30 2010-08-05 Research In Motion Limited Method for tap detection and for interacting with a handheld electronic device, and a handheld electronic device configured therefor
US20100251152A1 (en) * 2009-03-31 2010-09-30 Seong Yoon Cho Mobile terminal and controlling method thereof
US20120327009A1 (en) * 2009-06-07 2012-12-27 Apple Inc. Devices, methods, and graphical user interfaces for accessibility using a touch-sensitive surface
US20110016390A1 (en) * 2009-07-14 2011-01-20 Pantech Co. Ltd. Mobile terminal to display menu information according to touch signal
US20130263055A1 (en) * 2009-09-25 2013-10-03 Apple Inc. Device, Method, and Graphical User Interface for Manipulating User Interface Objects
US9395914B2 (en) * 2009-10-26 2016-07-19 Samsung Electronics Co., Ltd. Method for providing touch screen-based user interface and portable terminal adapted to the method
US20110113371A1 (en) * 2009-11-06 2011-05-12 Robert Preston Parker Touch-Based User Interface User Error Handling
US20110109577A1 (en) * 2009-11-12 2011-05-12 Samsung Electronics Co., Ltd. Method and apparatus with proximity touch detection
US20120260164A1 (en) * 2009-12-18 2012-10-11 Honda Motor Co., Ltd. Morphable Pad for Tactile Control
US20120266069A1 (en) * 2009-12-28 2012-10-18 Hillcrest Laboratories, Inc. TV Internet Browser
US20110163969A1 (en) * 2010-01-06 2011-07-07 Freddy Allen Anzures Device, Method, and Graphical User Interface with Content Display Modes and Display Rotation Heuristics
US20110209097A1 (en) * 2010-02-19 2011-08-25 Hinckley Kenneth P Use of Bezel as an Input Mechanism
US20110209093A1 (en) * 2010-02-19 2011-08-25 Microsoft Corporation Radial menus with bezel gestures
US9310994B2 (en) * 2010-02-19 2016-04-12 Microsoft Technology Licensing, Llc Use of bezel as an input mechanism
US9367205B2 (en) * 2010-02-19 2016-06-14 Microsoft Technology Licensing, Llc Radial menus with bezel gestures
US20140165006A1 (en) * 2010-04-07 2014-06-12 Apple Inc. Device, Method, and Graphical User Interface for Managing Folders with Multiple Pages
US20110298732A1 (en) * 2010-06-03 2011-12-08 Sony Ericsson Mobile Communications Japan, Inc. Information processing apparatus and information processing method
US20140204279A1 (en) * 2010-08-06 2014-07-24 Samsung Electronics Co., Ltd. Display apparatus and control method thereof
US20120120002A1 (en) * 2010-11-17 2012-05-17 Sony Corporation System and method for display proximity based control of a touch screen user interface
US9201520B2 (en) * 2011-02-11 2015-12-01 Microsoft Technology Licensing, Llc Motion and context sharing for pen-based computing inputs
US20130073286A1 (en) * 2011-09-20 2013-03-21 Apple Inc. Consolidating Speech Recognition Results
US20140129941A1 (en) * 2011-11-08 2014-05-08 Panasonic Corporation Information display processing device
US20140143698A1 (en) * 2012-11-19 2014-05-22 Samsung Electronics Co., Ltd. Method and apparatus for providing user interface through proximity touch input
US9261985B2 (en) * 2013-03-11 2016-02-16 Barnes & Noble College Booksellers, Llc Stylus-based touch-sensitive area for UI control of computing device
US20160370982A1 (en) * 2015-06-18 2016-12-22 Apple Inc. Device, Method, and Graphical User Interface for Navigating Media Content

Also Published As

Publication number Publication date Type
US20120162242A1 (en) 2012-06-28 application
EP2469396A3 (en) 2016-08-31 application
KR101860571B1 (en) 2018-05-23 grant
US9329776B2 (en) 2016-05-03 grant
JP2012138012A (en) 2012-07-19 application
CN102591570A (en) 2012-07-18 application
RU2519481C2 (en) 2014-06-10 grant
KR20120074215A (en) 2012-07-05 application
EP2469396A2 (en) 2012-06-27 application
CN102591570B (en) 2017-08-08 grant
JP5652652B2 (en) 2015-01-14 grant
RU2011151852A (en) 2013-06-27 application

Similar Documents

Publication Publication Date Title
US20110074671A1 (en) Image display apparatus and control method thereof, and computer program
US20050184972A1 (en) Image display apparatus and image display method
US20070146528A1 (en) Image capturing apparatus with through image display function
US20100053355A1 (en) Information processing apparatus, information processing method, and program
US20110109581A1 (en) Digital image processing device and associated methodology of performing touch-based image scaling
US20100060597A1 (en) Method and apparatus for displaying and selecting icons on a touch screen
US20080186285A1 (en) Mobile equipment with display function
US20110018827A1 (en) Information processing apparatus, display method, and display program
US20110149138A1 (en) Variable rate browsing of an image collection
US20120198386A1 (en) Causing display of thumbnail images
US20090002516A1 (en) Image capturing apparatus, shooting control method, and program
JP2011039990A (en) Information processing apparatus, control method therefor, program and recording medium
US20060022961A1 (en) Reproduction apparatus, camera, and display switch method of reproduction apparatus
US20110063236A1 (en) Information processing device, display method and program
US20120113056A1 (en) Input device, input method, and computer readable storage device
US20040008210A1 (en) Electronic device, digital still camera and display control method
US20110063491A1 (en) Digital photographing apparatus and method of controlling the same
US20110025711A1 (en) Image processing apparatus, image processing method, and program
US20100026643A1 (en) Information processing apparatus, method, and program
US20110221948A1 (en) Image pickup apparatus and its control method
US20140375862A1 (en) Method for photographing control and electronic device thereof
US20130027570A1 (en) Display control system, display control apparatus and control method therefor
US20120154305A1 (en) Image display control apparatus and image display control method
US20120154307A1 (en) Image display control apparatus and image display control method
US20110061023A1 (en) Electronic apparatus including touch panel and displaying method of the electronic apparatus