JP5648844B2 - Image display control apparatus and image display control method - Google Patents

Image display control apparatus and image display control method

Info

Publication number
JP5648844B2
Authority
JP
Japan
Prior art keywords
display
state
proximity
control
unit
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
JP2010284322A
Other languages
Japanese (ja)
Other versions
JP2012133524A5 (en)
JP2012133524A (en)
Inventor
布巻 崇 (Takashi Nunomaki)
Original Assignee
ソニー株式会社 (Sony Corporation)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by ソニー株式会社 (Sony Corporation)
Priority to JP2010284322A
Publication of JP2012133524A
Publication of JP2012133524A5
Application granted
Publication of JP5648844B2
Application status: Active
Anticipated expiration


Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING; COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03 - Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/041 - Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING; COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 1/00 - Details not covered by groups G06F 3/00 - G06F 13/00 and G06F 21/00
    • G06F 1/26 - Power supply means, e.g. regulation thereof
    • G06F 1/32 - Means for saving power
    • G06F 1/3203 - Power management, i.e. event-based initiation of power-saving mode
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING; COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 1/00 - Details not covered by groups G06F 3/00 - G06F 13/00 and G06F 21/00
    • G06F 1/26 - Power supply means, e.g. regulation thereof
    • G06F 1/32 - Means for saving power
    • G06F 1/3203 - Power management, i.e. event-based initiation of power-saving mode
    • G06F 1/3206 - Monitoring of events, devices or parameters that trigger a change in power modality
    • G06F 1/3231 - Monitoring the presence, absence or movement of users
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING; COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 1/00 - Details not covered by groups G06F 3/00 - G06F 13/00 and G06F 21/00
    • G06F 1/26 - Power supply means, e.g. regulation thereof
    • G06F 1/32 - Means for saving power
    • G06F 1/3203 - Power management, i.e. event-based initiation of power-saving mode
    • G06F 1/3234 - Power saving characterised by the action undertaken
    • G06F 1/325 - Power saving in peripheral device
    • G06F 1/3262 - Power saving in digitizer or tablet
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING; COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/04883 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for entering handwritten data, e.g. gestures, text
    • Y - GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 - TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02D - CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D 10/00 - Energy efficient computing
    • Y02D 10/10 - Reducing energy consumption at the single machine level, e.g. processors, personal computers, peripherals or power supply
    • Y02D 10/17 - Power management
    • Y02D 10/173 - Monitoring user presence

Description

  The present invention relates to an image display control apparatus and an image display control method, and more particularly to an image display control apparatus and an image display control method that enable the user to restore the display with a simpler operation.

  Some electronic devices having a display panel have a function of detecting that the user has not used the device for a certain period of time and then turning the display off or setting it to a low-luminance state, for power saving or to protect the pixels of the display panel. In such cases, the display is restored using, for example, a key operation by the user or a tap on the display panel as a trigger. There is also a method in which the user inputs a figure (gesture) registered in advance to cancel a screen saver (see, for example, Patent Document 1).

JP 2002-82734 A

  However, in the method that uses a tap on the display panel as the trigger for restoring the display, one tap on the display panel is required just for the restoration. If, for example, the operation the user actually wants to perform is selecting a predetermined operation button displayed on the display panel, two operations are required (a tap to restore the display and a tap to select the operation button), which is troublesome. Moreover, a user who does not know that the display is restored by a tap operation cannot restore the display at all.

  The present invention has been made in view of such circumstances, and makes it possible to restore the display with a simpler operation by the user.

An image display control apparatus according to one aspect of the present invention includes: a proximity detection control unit configured to detect the proximity of an object to a display unit; and a display control unit configured to perform a first control and a second control, each of which changes the display state of the display unit from a first display state to a second display state when there is no proximity of the object for a certain time, and returns the second display state to the first display state when proximity of the object is detected. In the first control, the first display state is a steady state in which operations corresponding to user operations can be executed, and the second display state is a button-off state in which a screen obtained by erasing the GUI operation buttons from the steady-state screen is displayed. In the second control, the first display state is the steady state, and the second display state is a low-luminance state in which the display luminance is lower than in the steady state, or an unlit state in which the display is completely turned off.

An image display control method according to one aspect of the present invention is a method for an image display control apparatus that detects the proximity of an object to a display unit and controls the display unit. The method includes a step of performing a first control and a second control, each of which changes the display state of the display unit from a first display state to a second display state when there is no proximity of the object for a certain time, and returns the second display state to the first display state when proximity of the object is detected. In the first control, the first display state is a steady state in which operations corresponding to user operations can be executed, and the second display state is a button-off state in which a screen obtained by erasing the GUI operation buttons from the steady-state screen is displayed. In the second control, the first display state is the steady state, and the second display state is a low-luminance state in which the display luminance is lower than in the steady state, or an unlit state in which the display is completely turned off.

In one aspect of the present invention, a first control and a second control are performed, each of which changes the display state of the display unit from the first display state to the second display state when the object does not approach the display unit for a certain time, and returns the second display state to the first display state when the proximity of the object is detected. In the first control, the first display state is a steady state in which operations corresponding to the user's operations can be executed, and the second display state is a button-off state in which a screen obtained by erasing the GUI operation buttons from the steady-state screen is displayed. In the second control, the first display state is the steady state, and the second display state is a low-luminance state in which the display luminance is lower than in the steady state, or an unlit state in which the display is completely turned off.

  The image display control device may be an independent device, or may be an internal block constituting one device.

  According to one aspect of the present invention, the display can be restored with a simpler operation by the user.

FIG. 1 is a block diagram showing a configuration example of an imaging apparatus as an embodiment of the image display control apparatus to which the present invention is applied. FIG. 2 is a perspective view showing a configuration example of the appearance of the imaging apparatus of FIG. 1. FIG. 3 is a diagram showing a display example of screens under the first display control of the imaging apparatus of FIG. 1. FIG. 4 is a flowchart explaining the display change process of the first display control. FIG. 5 is a flowchart explaining the display return process of the first display control. FIG. 6 is a diagram showing a display example of screens under the second display control of the imaging apparatus of FIG. 1. FIG. 7 is a flowchart explaining the second display control process. FIG. 8 is a block diagram showing a configuration example of an embodiment of a computer to which the present invention is applied.

[Configuration example of imaging device]
FIG. 1 is a block diagram illustrating a configuration example of an imaging apparatus as an embodiment of an image display control apparatus to which the present invention is applied.

  The imaging apparatus 1 in FIG. 1 is configured to include components ranging from a lens unit 11 through a RAM 27.

  The lens unit 11 includes a photographing lens, a diaphragm, a focus lens, and the like. An imaging element 12 such as a CCD (Charge Coupled Device) is disposed on the optical path of subject light incident through the lens unit 11.

  The image sensor 12, the analog signal processing unit 13, the A / D (Analog / Digital) conversion unit 14, and the digital signal processing unit 15 are connected in that order.

  A display unit 17 and a recording device 19 are also connected to the digital signal processing unit 15. A touch panel 16 is provided on the image display surface of the display unit 17, and a touch screen 18 is configured by the touch panel 16 and the display unit 17. The display unit 17 is configured by, for example, a liquid crystal display (LCD).

  The lens unit 11 is connected to an actuator 20 for adjusting a diaphragm constituting the lens unit 11 and moving a focus lens. A motor driver 21 is also connected to the actuator 20. The motor driver 21 performs drive control of the actuator 20.

  A CPU (Central Processing Unit) 23 controls the entire imaging apparatus 1. For this reason, the CPU 23 includes an analog signal processing unit 13, an A / D conversion unit 14, a digital signal processing unit 15, a motor driver 21, a TG (Timing Generator) 22, an operation unit 24, an EEPROM (Electrically Erasable Programmable ROM) 25, A program ROM (Read Only Memory) 26, a RAM (Random Access Memory) 27, and a touch panel 16 are connected.

  The touch panel 16 is, for example, a capacitive touch panel; it detects a tap (contact) on the touch panel 16 by the user's finger and outputs the tapped position on the touch panel 16 to the CPU 23. In addition, from the change (level) in capacitance, the touch panel 16 can also detect a state in which the user's finger is not touching the touch panel 16 but has approached within a predetermined distance (hereinafter referred to as proximity, as appropriate). The distance at which the touch panel 16 can detect proximity is, for example, within about 20 mm of the screen; within 10 mm, the position on the touch panel 16 being approached can also be recognized. Note that the object whose contact or proximity the touch panel 16 detects is not limited to the user's finger and may be any similar conductive object; the following description, however, assumes detection of the user's finger.
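  As an illustration of this two-level detection, the following minimal Python sketch classifies a normalized capacitance reading into contact, proximity, or no event. It is a sketch under assumed conditions: the threshold values, the normalization, and the function name are illustrative and are not specified by the patent.

    # Illustrative only: threshold values are assumptions, not from the patent.
    CONTACT_LEVEL = 0.8    # strong capacitance change: finger touching the panel
    PROXIMITY_LEVEL = 0.3  # weaker change: finger hovering within roughly 20 mm

    def classify_touch_event(capacitance_level):
        """Map a normalized capacitance reading (0.0 to 1.0) to an event type."""
        if capacitance_level >= CONTACT_LEVEL:
            return "contact"
        if capacitance_level >= PROXIMITY_LEVEL:
            return "proximity"
        return None  # nothing within detection range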

  The recording device 19 is a removable recording medium such as an optical disc, for example a DVD (Digital Versatile Disc), or a semiconductor memory such as a memory card. The recording device 19 records the image (more precisely, its signal) obtained by imaging, and is detachable from the main body of the imaging apparatus 1.

  The EEPROM 25 stores various kinds of setting information, as well as other information that should be retained even when the power is turned off.

  The program ROM 26 stores a program executed by the CPU 23 and data necessary for executing the program.

  The RAM 27 temporarily stores programs and data necessary as a work area when the CPU 23 executes various processes.

  The outline of the operation of the entire imaging apparatus 1 having the configuration shown in FIG. 1 will be described below.

  The CPU 23 controls each unit constituting the imaging apparatus 1 by executing a program recorded in the program ROM 26. Then, the CPU 23 executes predetermined processes such as an imaging process and an image display control process on the display unit 17 in accordance with a signal from the touch panel 16 or a signal from the operation unit 24.

  The operation unit 24 is operated by a user and provides a signal corresponding to the operation to the CPU 23. The operation unit 24 includes, for example, a zoom lever (TELE / WIDE) 41 and a shutter button 42 which will be described later with reference to FIG.

  By driving the actuator 20, the lens unit 11 is extended from or retracted into the housing of the imaging apparatus 1. Driving the actuator 20 also adjusts the diaphragm constituting the lens unit 11 and moves the focus lens constituting the lens unit 11.

  The TG 22 provides a timing signal to the image sensor 12 based on the control of the CPU 23. The exposure time and the like in the image sensor 12 are controlled by the timing signal.

  The image sensor 12 operates based on the timing signal provided from the TG 22 to receive subject light incident through the lens unit 11 and perform photoelectric conversion. The image sensor 12 provides an analog image signal corresponding to the amount of received light to the analog signal processing unit 13. At this time, the motor driver 21 drives the actuator 20 based on the control of the CPU 23.

  The analog signal processing unit 13 performs analog signal processing such as amplification on the analog image signal provided from the image sensor 12 based on the control of the CPU 23. The resulting analog image signal is provided from the analog signal processing unit 13 to the A / D conversion unit 14.

  The A / D conversion unit 14 performs A / D conversion on the analog image signal from the analog signal processing unit 13 based on the control of the CPU 23. The resulting digital image signal is provided from the A / D converter 14 to the digital signal processor 15.

  The digital signal processing unit 15 performs digital signal processing such as noise removal processing on the digital image signal provided from the A / D conversion unit 14 based on the control of the CPU 23. The digital signal processing unit 15 causes the display unit 17 to display an image corresponding to the digital image signal.

  The digital signal processing unit 15 compresses and encodes the digital image signal provided from the A / D conversion unit 14 according to a predetermined compression encoding method such as JPEG (Joint Photographic Experts Group). The digital signal processing unit 15 causes the recording device 19 to record the compression-coded digital image signal.

  The digital signal processing unit 15 also reads out a digital image signal that has been compression-encoded from the recording device 19 and decompresses and decodes the digital image signal according to an expansion-decoding method corresponding to a predetermined compression-encoding method. The digital signal processing unit 15 causes the display unit 17 to display an image corresponding to the digital image signal.

  In addition, under the control of the CPU 23, the digital signal processing unit 15 generates images such as the AF frame used to provide the AF (autofocus) function and menu buttons, and displays them on the display unit 17.

  An image captured by the image sensor 12 is displayed on the display unit 17. In this case, an AF frame is set on the image displayed on the display unit 17. Focus is controlled based on the image inside the AF frame.

  As described above, the imaging apparatus 1 has an AF function. The imaging device 1 also has an AE (Automatic Exposure) function and an AWB (Auto White Balance) function. These functions are realized by the CPU 23 reading and executing the program in the program ROM 26. Furthermore, the AF function, the AE function, and the AWB function are merely examples of functions that the imaging apparatus 1 has. That is, the imaging device 1 has various functions related to shooting.

  FIG. 2 is a perspective view illustrating a configuration example of the appearance of the imaging device 1 of FIG.

  Hereinafter, of the surfaces of the imaging apparatus 1, the surface facing the subject when the user captures the subject, that is, the surface on which the lens unit 11 is arranged is referred to as the front surface. On the other hand, of the surfaces of the imaging device 1, the surface facing the user when the user captures the subject, that is, the surface opposite to the front surface is referred to as the rear surface. Of the surfaces of the imaging apparatus 1, when the user photographs a subject, the surface disposed on the upper side is referred to as the upper surface, and the surface disposed on the lower side is referred to as the lower surface.

  FIG. 2A is a perspective view illustrating a configuration example of the appearance of the front surface of the imaging apparatus 1. FIG. 2B is a perspective view illustrating a configuration example of the appearance of the rear surface of the imaging apparatus 1.

  The front surface of the imaging apparatus 1 can be covered with a lens cover 47. When the lens cover 47 is opened downward in the figure, the state shown in FIG. 2A is obtained. As shown in FIG. 2A, the photographing lens 45 and the AF illuminator 46 included in the lens unit 11 are arranged in that order from the right side in the upper part of the front surface uncovered by the lens cover 47.

  The AF illuminator 46 also serves as a self-timer lamp. A zoom lever (TELE/WIDE) 41, a shutter button 42, a playback button 43, and a power button 44 are arranged in this order from the left in the figure. The zoom lever 41, the shutter button 42, the playback button 43, and the power button 44 are included in the operation unit 24 of FIG. 1.

  As illustrated in FIG. 2B, a touch screen 18 is disposed on the entire rear surface of the imaging apparatus 1.

  On the touch screen 18, the image captured by the image sensor 12 is displayed in the shooting mode for shooting a subject, and an image recorded on the recording device 19 is displayed in the playback mode for displaying captured images. The touch screen 18 also displays, as GUIs (Graphical User Interfaces), a menu button for setting (changing) various setting items of the imaging apparatus 1, a flash mode selection button, a self-timer button, a playback display button, and the like.

[First Embodiment of Image Display Control]
The imaging device 1 has a function of switching the display state of the touch screen 18 for power saving or the like when no user operation is performed for a certain period of time. With reference to FIG. 3 to FIG. 5, the first display control of the imaging apparatus 1 (CPU 23) as the first embodiment of the image display control of the present invention will be described.

[Example of screen by first display control]
FIG. 3 shows a display example of a screen displayed on the touch screen 18 by the first display control of the imaging apparatus 1.

  In the shooting mode, when the user performs some operation related to shooting on the imaging apparatus 1, the touch screen 18 displays the shooting standby screen P1 shown in FIG. 3 as the steady state, in which operations corresponding to all user operations can be executed. On the shooting standby screen P1, the image captured by the image sensor 12 is displayed in the center of the screen, and a menu button, a flash mode selection button, a self-timer button, a playback display button, and the like are displayed on the left and right sides of the screen. Hereinafter, the predetermined operation buttons displayed on the shooting standby screen P1 are collectively referred to as GUI operation buttons.

  In the steady state, with the shooting standby screen P1 displayed on the display unit 17, when no user operation is performed for a certain time, the imaging apparatus 1 changes the display state to the low-luminance state, in which the display luminance of the display unit 17 is lower than in the steady state. The low-luminance screen P2 shown in FIG. 3 is the screen displayed on the display unit 17 in the low-luminance state; it is identical to the shooting standby screen P1 except for the luminance level.

  When the proximity of the user's finger is detected in the low luminance state in which the low luminance screen P2 is displayed on the display unit 17, the imaging device 1 changes the display state of the display unit 17 to a steady state. That is, the imaging apparatus 1 returns the display on the display unit 17 to the shooting standby screen P1.

  When the user still performs no operation for a certain time in the low-luminance state, the imaging apparatus 1 changes the display state to the unlit state, in which the display of the display unit 17 is completely turned off. The display-off screen P3 in FIG. 3 shows the screen displayed on the display unit 17 in the unlit state.

  When the proximity of the user's finger is detected in the unlit state, the imaging apparatus 1 changes the display state of the display unit 17 to the steady state. That is, the imaging apparatus 1 returns the display on the display unit 17 to the shooting standby screen P1.

[Flowchart of first display control]
FIG. 4 is a flowchart for explaining a display change process, which is a process when no user operation is performed, in the first display control described with reference to FIG. 3.

  First, in step S1, the imaging device 1 determines whether a certain time has passed without any user operation. The user operations here include operations on the operation unit 24 such as the zoom lever 41 and the shutter button 42 in addition to operations on the touch panel 16.

  The process in step S1 is repeated until it is determined that a certain time has elapsed. If it is determined in step S1 that the certain time has elapsed, the process proceeds to step S2.

  In step S2, the imaging apparatus 1 determines whether the current display state of the display unit 17 is the steady state, the low-luminance state, or the unlit state.

  If it is determined in step S2 that the current display state is a steady state, the process proceeds to step S3, and the imaging device 1 changes the display state of the display unit 17 to a low luminance state.

  On the other hand, if it is determined in step S2 that the current display state is the low-luminance state, the process proceeds to step S4, and the imaging apparatus 1 changes the display state of the display unit 17 to the unlit state.

  If it is determined in step S2 that the current display state is the unlit state, the process simply returns to step S1.

  The above display change process is executed until the power of the imaging apparatus 1 is turned off.
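  The display change process of FIG. 4 amounts to a timeout-driven demotion of the display state. The following Python sketch is one hedged reading of the flowchart; the state names, the helper function, and the 60-second timeout are illustrative assumptions (the patent speaks only of a "certain time"), not the device's actual firmware.

    import time

    STEADY, LOW_LUMINANCE, UNLIT = "steady", "low_luminance", "unlit"

    def display_change_step(state, last_user_op_time, timeout_s=60.0):
        """One pass of steps S1 to S4: demote the display state after inactivity."""
        if time.monotonic() - last_user_op_time < timeout_s:
            return state                 # S1: the certain time has not yet elapsed
        if state == STEADY:              # S2 -> S3
            return LOW_LUMINANCE
        if state == LOW_LUMINANCE:       # S2 -> S4
            return UNLIT
        return state                     # S2: already unlit, back to S1 as-is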

  FIG. 5 is a flowchart for explaining the display return process, in which proximity to the touch panel 16 is detected and the display state is returned to the steady state, in the first display control described with reference to FIG. 3.

  In step S21, the imaging apparatus 1 determines whether contact or proximity of the user's finger with respect to the touch panel 16 has been detected.

  The process of step S21 is repeated until either contact or proximity of the user's finger is detected. If it is determined in step S21 that contact of the user's finger has been detected, the process proceeds to step S22, and the imaging apparatus 1 executes a predetermined process (tap process) defined in advance for the finger contact. Thereafter, the process returns to step S21.

  On the other hand, if it is determined in step S21 that the proximity of the user's finger has been detected, the process proceeds to step S23, and the imaging apparatus 1 determines whether the current display state of the display unit 17 is the steady state, the low-luminance state, or the unlit state.

  If it is determined in step S23 that the current display state is a steady state, the process returns to step S21.

  On the other hand, if it is determined in step S23 that the current display state is either the low-luminance state or the unlit state, the process proceeds to step S24, and the imaging apparatus 1 changes the display state of the display unit 17 to the steady state. After the process of step S24, the process returns to step S21, and the subsequent processes are repeated.

  The display return process described above is executed until the power of the imaging apparatus 1 is turned off.
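  Correspondingly, the display return process of FIG. 5 can be sketched as follows, reusing the state names of the previous snippet. Again this is an illustrative reading of the flowchart; run_tap_process is a hypothetical placeholder for the device's predetermined tap handling.

    def display_return_step(state, event):
        """One pass of steps S21 to S24; event is 'contact', 'proximity', or None."""
        if event == "contact":                       # S21 -> S22: ordinary tap
            run_tap_process()
            return state
        if event == "proximity" and state in (LOW_LUMINANCE, UNLIT):
            return STEADY                            # S23 -> S24: restore the display
        return state                                 # steady state, or nothing detected

    def run_tap_process():
        pass  # placeholder for the predetermined processing of a finger contact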

[Second Embodiment of Image Display Control]
Next, the second display control of the imaging device 1 (CPU 23) as the second embodiment of the image display control of the present invention will be described.

[Example of screen by second display control]
FIG. 6 shows a display example of a screen displayed on the touch screen 18 by the second display control of the imaging apparatus 1.

  In the steady state, the shooting standby screen P11 is displayed on the touch screen 18. The shooting standby screen P11 has the same screen configuration as the shooting standby screen P1 of FIG.

  When the user performs no operation for a certain time while the shooting standby screen P11 is displayed on the touch screen 18, the imaging apparatus 1 changes the display state from the steady state to the button-off state. The display unit 17 in the button-off state shows the button-off screen P12, in which the GUI operation buttons of the shooting standby screen P11 are erased.

When the proximity of the user's finger is detected on the touch panel 16 in the button-off state, the imaging device 1 changes the display state of the display unit 17 to a steady state. That is, the imaging apparatus 1 returns the display on the display unit 17 to the shooting standby screen P11.

[Second Display Control Flowchart]
FIG. 7 is a flowchart illustrating the second display control process described with reference to FIG.

  First, in step S41, the imaging apparatus 1 determines whether contact or proximity of the user's finger with respect to the touch panel 16 has been detected.

  If it is determined in step S41 that the finger contact of the user has been detected, the process proceeds to step S42, and the imaging apparatus 1 executes a predetermined process (tap process) determined in advance for the finger contact. Then, the process returns to step S41.

  On the other hand, if neither contact nor proximity of the user's finger has been detected in step S41, the process proceeds to step S43, and the imaging apparatus 1 determines whether a certain time has passed without any user operation. As in the first display control, the user operations here include operations on the operation unit 24.

  If it is determined in step S43 that the predetermined time has not yet elapsed, the process returns to step S41, and the process of step S41 is executed again.

  On the other hand, if it is determined in step S43 that the certain time has passed without any user operation, the process proceeds to step S44, and the imaging apparatus 1 determines whether the current display state of the display unit 17 is the steady state or the button-off state.

  If it is determined in step S44 that the current display state is the button off state, the process returns to step S41.

  On the other hand, when it is determined in step S44 that the current display state is the steady state, the process proceeds to step S45, and the imaging device 1 changes the display state of the display unit 17 to the button-off state. That is, the imaging apparatus 1 switches the display from the shooting standby screen P11 to the button-off screen P12 in which the GUI operation button display is deleted. After the process of step S45, the process returns to step S41, and the processes after step S41 are executed again. The processes in steps S41, S43 to S45 described above correspond to the display change process when no user operation is performed in the first display control process.

  On the other hand, if it is determined in step S41 described above that the proximity of the user's finger has been detected, the process proceeds to step S46, and the imaging apparatus 1 determines whether the current display state of the display unit 17 is the steady state or the button-off state.

  If it is determined in step S46 that the current display state is a steady state, the process returns to step S41.

  On the other hand, if it is determined in step S46 that the current display state is the button-off state, the process proceeds to step S47, and the imaging device 1 changes the display state of the display unit 17 to a steady state. That is, the imaging apparatus 1 switches the display from the button-off screen P12 in which the display of the GUI operation buttons is deleted to the shooting standby screen P11. After the process of step S47, the process returns to step S41, and the processes after step S41 are executed again. The processes in steps S41, S46, and S47 described above correspond to the display return process in the first display control process that detects the user's proximity operation and returns the display state to the steady state.

  The above processing is executed until the power of the imaging apparatus 1 is turned off.
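  Taken together, the second display control of FIG. 7 is a two-state machine in which inactivity erases the GUI operation buttons and proximity alone brings them back. Below is a hedged Python sketch in the same style as the earlier snippets; the names are again illustrative assumptions.

    BUTTONS_SHOWN, BUTTONS_OFF = "steady", "button_off"

    def second_display_control_step(state, event, idle_time_elapsed):
        """One pass of steps S41 to S47 of FIG. 7 (illustrative)."""
        if event == "contact":                           # S41 -> S42
            run_tap_process()
            return state
        if event == "proximity":                         # S41 -> S46 / S47
            return BUTTONS_SHOWN if state == BUTTONS_OFF else state
        if idle_time_elapsed and state == BUTTONS_SHOWN:  # S43 -> S44 / S45
            return BUTTONS_OFF                           # erase the GUI operation buttons
        return state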

  As described above, the CPU 23 of the imaging apparatus 1 controls the display state of the touch panel 16 from the first display state to the second display state when there is no user proximity operation for a certain period of time, When the proximity operation is detected, control is performed so that the second display state is returned to the first display state.

  In the first embodiment described above, the first display state is a steady state, and the second display state is a low luminance state or a light-off state. In the second embodiment, the first display state is a steady state, and the second display state is a button-off state. Needless to say, the user's finger is a typical example of a detection target for detecting proximity and may be a palm or other conductive object.

  With such image display control by the imaging apparatus 1, the display returns to the steady state even if the user does not touch the touch screen 18; therefore, an easier-to-understand way of restoring the display can be provided even to users who do not know the operation for restoring it.

  In addition, when the operation to be performed is selecting a GUI operation button displayed on the display unit 17, the finger necessarily passes through the proximity state in the course of moving to touch the touch screen 18. Therefore, with the image display control of the imaging apparatus 1, the two operations that were conventionally required (a tap to restore the display and a tap to select the GUI operation button) are accomplished in a single operation, and operability improves. That is, the display can be restored with a simpler operation by the user.

  Further, since the touch screen 18 does not have to be touched to return to the steady state, it is possible to prevent fingerprints and dirt from being attached by touching and operating.

  In the above-described embodiment, the example of the shooting mode screen has been described. However, the same display control process is executed on the playback mode screen.

[Modification]
The embodiments of the present invention are not limited to the above-described embodiments, and various modifications can be made without departing from the scope of the present invention.

  For example, from the proximity state (distance) of the user's finger to the touch panel 16, it is possible to detect not only whether the finger is in proximity to the touch panel 16 but also the position (coordinates) on the screen of the approaching finger.

  Therefore, the imaging apparatus 1 (CPU 23) may detect not only the presence or absence of proximity but also a user operation (gesture) of drawing a predetermined figure with the approaching finger, and return the display to the steady state only when the detected gesture is the same as a gesture registered in advance. This is effective when proximity detection by itself would trigger too often. Whether the display restoration operation detects proximity alone or requires a proximity gesture can be switched by a setting. The gesture to be detected may be registered in the imaging apparatus 1 in advance, or the user may register a desired gesture.
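  One naive way of deciding whether a detected proximity gesture is "the same as" a registered one is a point-wise comparison of normalized coordinate paths, sketched below. This is purely illustrative: the patent does not specify a matching algorithm, and a practical implementation would resample the paths and use a more robust matcher such as dynamic time warping.

    import math

    def gesture_matches(traced, registered, tolerance=0.15):
        """Compare two proximity paths given as equal-length lists of normalized
        (x, y) points; the tolerance value is an assumed threshold."""
        if len(traced) != len(registered):
            return False
        return all(math.dist(p, q) <= tolerance
                   for p, q in zip(traced, registered))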

  For example, when a proximity gesture is used as the display restoration operation and the display does not return from the unlit state even though the user performs the gesture, the user may be unable to tell whether the display stays off because the battery is exhausted or because the gesture was wrong. As a countermeasure, when the user's proximity is detected, a predetermined GUI display can be shown, such as a cursor (a plus mark) or the text "proximity detected". This lets the user easily understand that the battery is not exhausted and that the display was not restored because the gesture was wrong.

[Computer configuration example]
The series of processes described above can be executed by hardware, but can also be executed by software.

  In this case, the series of processes may be executed by the imaging apparatus 1 in FIG. 1 or, for example, by the personal computer shown in FIG. 8.

  In FIG. 8, the CPU 101 executes various processes according to a program recorded in a ROM (Read Only Memory) 102 or a program loaded from a storage unit 108 to a RAM (Random Access Memory) 103. The RAM 103 also appropriately stores data necessary for the CPU 101 to execute various processes.

  The CPU 101, ROM 102, and RAM 103 are connected to each other via a bus 104. An input / output interface 105 is also connected to the bus 104.

  Connected to the input/output interface 105 are an input unit 106 including a keyboard and a mouse, an output unit 107 including a touch-panel display and a speaker, a storage unit 108 including a hard disk, and a communication unit 109 including a modem and a terminal adapter. The communication unit 109 controls communication with other devices (not shown) via a network such as the Internet.

  A drive 110 is connected to the input/output interface 105 as necessary; a removable medium 111 such as a magnetic disk, an optical disc, a magneto-optical disc, or a semiconductor memory is mounted as appropriate; and computer programs read from the medium are installed in the storage unit 108 as necessary.

  When the series of processes is executed by software, the programs constituting the software are installed, from a network or a recording medium, onto a computer built into dedicated hardware or onto, for example, a general-purpose personal computer capable of executing various functions when various programs are installed.

  In this specification, the steps describing a program recorded on a recording medium include not only processes performed in time series in the described order but also processes that are executed in parallel or individually without necessarily being processed in time series.

  As described above, the display unit 17 whose display is controlled by the image display control device to which the present invention is applied is, for example, a liquid crystal display. However, the present invention is applicable not only to liquid crystal displays but also to the following display devices: devices in which display is instructed in units of frames or fields constituting a moving image (hereinafter such a unit is referred to as a frame), in which one frame is composed of a plurality of display elements corresponding to its pixels, and in which at least some of the display elements can hold a display for a predetermined time. Hereinafter, such a display element is referred to as a hold-type display element, and a display device whose screen is composed of such elements is referred to as a hold-type display device. The liquid crystal display device is merely an example of a hold-type display device, and the present invention is applicable to hold-type display devices in general.

  Furthermore, the present invention is applicable not only to hold-type display devices but also to, for example, flat self-luminous display devices using organic EL (Electro Luminescent) elements as light-emitting elements. That is, the present invention is applicable to display devices in general that include display elements displaying the pixels that compose an image. Such a display device is referred to as a pixel-type display device. In a pixel-type display device, one display element does not necessarily correspond to one pixel.

  In other words, the display device whose display is controlled by the image display control device to which the present invention is applied may be any display device capable of executing the series of processes described above.

  In the above-described embodiments, the case where the present invention is applied to an imaging device (for example, a digital camera) including a display device (display unit) has been described. However, the image display control of the present invention is also applicable to other electronic devices including a display device, such as PDAs (Personal Digital Assistants), mobile phones, portable game devices, portable playback devices, and television receivers.

  1 imaging device, 16 touch panel, 17 display unit, 18 touch screen, 23 CPU

Claims (9)

  1. An image display control apparatus comprising:
    proximity detection control means for detecting the proximity of an object to a display unit; and
    display control means for performing a first control and a second control, each of which controls the display state of the display unit so as to change from a first display state to a second display state when there is no proximity of the object for a certain time, and returns the second display state to the first display state when proximity of the object is detected, wherein
    in the first control,
    the first display state is a steady state in which an operation corresponding to a user operation can be executed, and
    the second display state is a button-off state in which a screen obtained by erasing the display of the GUI operation buttons from the steady-state screen is displayed, and
    in the second control,
    the first display state is the steady state, and
    the second display state is a low-luminance state in which the display luminance is lower than in the steady state, or an unlit state in which the display is completely turned off.
  2. The image display control apparatus according to claim 1, wherein, in the second control, the display control means controls the display state of the display unit to the low-luminance state when there is no proximity of the object for a certain time in the steady state, and further controls the display state of the display unit to the unlit state when there is no proximity of the object for a certain time in the low-luminance state.
  3. The image display control apparatus according to claim 1, wherein the display unit is a part of an imaging device, and
    the first display state is a state in which the imaging device is on standby for shooting.
  4. The image display control apparatus according to claim 1, wherein the proximity detection control means detects a user gesture made in proximity as the proximity of the object, and
    the display control means controls the second display state so as to return to the first display state when the detected proximity gesture is the same as a gesture registered in advance.
  5. The image display control apparatus according to claim 4, further comprising registration means for registering the gesture detected by the proximity detection control means.
  6. The image display control apparatus according to claim 4, wherein the display control means performs a predetermined GUI display on the display unit when the proximity detection control means detects the proximity of the object.
  7. The image display control apparatus according to claim 1, further comprising:
      the display unit; and
      a proximity detection unit that detects the proximity of the object under the control of the proximity detection control means.
  8. The image display control apparatus according to claim 1, further comprising function execution control means for controlling execution of a function assigned to a GUI operation button in response to detection of the user's contact with the GUI operation button displayed on the display unit.
  9. An image display control method for an image display control apparatus that detects the proximity of an object to a display unit and controls the display unit, the method comprising:
    performing a first control and a second control, each of which controls the display state of the display unit so as to change from a first display state to a second display state when there is no proximity of the object for a certain time, and returns the second display state to the first display state when proximity of the object is detected, wherein
    in the first control,
    the first display state is a steady state in which an operation corresponding to a user operation can be executed, and
    the second display state is a button-off state in which a screen obtained by erasing the display of the GUI operation buttons from the steady-state screen is displayed, and
    in the second control,
    the first display state is the steady state, and
    the second display state is a low-luminance state in which the display luminance is lower than in the steady state, or an unlit state in which the display is completely turned off.
JP2010284322A 2010-12-21 2010-12-21 Image display control apparatus and image display control method Active JP5648844B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2010284322A JP5648844B2 (en) 2010-12-21 2010-12-21 Image display control apparatus and image display control method

Applications Claiming Priority (8)

Application Number Priority Date Filing Date Title
JP2010284322A JP5648844B2 (en) 2010-12-21 2010-12-21 Image display control apparatus and image display control method
US13/292,769 US9703403B2 (en) 2010-12-21 2011-11-09 Image display control apparatus and image display control method
EP11190876.0A EP2469376A3 (en) 2010-12-21 2011-11-28 Image display control apparatus and image display control method
TW100145040A TWI444854B (en) 2010-12-21 2011-12-07 Image display control apparatus and image display control method
KR1020110133677A KR20120070502A (en) 2010-12-21 2011-12-13 Image display control apparatus and image display control method
CN201110415883.9A CN102568433B (en) 2010-12-21 2011-12-14 Image display control apparatus and image display control method
RU2011151186/08A RU2506628C2 (en) 2010-12-21 2011-12-14 Image display control apparatus and image display control method
BRPI1105545-6A BRPI1105545A2 (en) 2010-12-21 2011-12-14 Image display control apparatus and method

Publications (3)

Publication Number Publication Date
JP2012133524A JP2012133524A (en) 2012-07-12
JP2012133524A5 JP2012133524A5 (en) 2013-12-12
JP5648844B2 true JP5648844B2 (en) 2015-01-07

Family

ID=45445764

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2010284322A Active JP5648844B2 (en) 2010-12-21 2010-12-21 Image display control apparatus and image display control method

Country Status (8)

Country Link
US (1) US9703403B2 (en)
EP (1) EP2469376A3 (en)
JP (1) JP5648844B2 (en)
KR (1) KR20120070502A (en)
CN (1) CN102568433B (en)
BR (1) BRPI1105545A2 (en)
RU (1) RU2506628C2 (en)
TW (1) TWI444854B (en)

Families Citing this family (22)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103380394B (en) 2010-12-20 2017-03-22 追踪有限公司 Systems and methods for MEMS light modulator arrays with reduced acoustic emission
JP5686244B2 (en) 2010-12-21 2015-03-18 ソニー株式会社 Display control apparatus, display control method, and program
JP2012248066A (en) * 2011-05-30 2012-12-13 Canon Inc Image processing device, control method of the same, control program and imaging apparatus
US9213479B2 (en) * 2012-02-16 2015-12-15 Samsung Medison Co., Ltd. Method and apparatus for displaying image
JP5579780B2 (en) * 2012-06-06 2014-08-27 パナソニック インテレクチュアル プロパティ コーポレーション オブ アメリカPanasonic Intellectual Property Corporation of America Input device, input support method, and program
CN103514837B (en) * 2012-06-27 2017-04-12 中兴通讯股份有限公司 Terminal screen backlight control method and apparatus
US9711090B2 (en) 2012-08-09 2017-07-18 Panasonic Intellectual Property Corporation Of America Portable electronic device changing display brightness based on acceleration and distance
KR101985505B1 (en) * 2012-08-16 2019-06-03 엘지전자 주식회사 Terminal and control method thereof
US8886372B2 (en) 2012-09-07 2014-11-11 The Boeing Company Flight deck touch-sensitive hardware controls
US9170421B2 (en) * 2013-02-05 2015-10-27 Pixtronix, Inc. Display apparatus incorporating multi-level shutters
US20140225904A1 (en) * 2013-02-13 2014-08-14 Pixtronix, Inc. Shutter assemblies fabricated on multi-height molds
CN104243790B (en) * 2013-06-19 2019-01-25 深圳富泰宏精密工业有限公司 Electronic device camera unit method for closing and system
WO2015022498A1 (en) * 2013-08-15 2015-02-19 Elliptic Laboratories As Touchless user interfaces
JP6412778B2 (en) * 2014-11-19 2018-10-24 東芝映像ソリューション株式会社 Video apparatus, method, and program
JP2017005312A (en) * 2015-06-04 2017-01-05 キヤノン株式会社 Imaging apparatus and control method thereof
JP6532306B2 (en) * 2015-06-04 2019-06-19 キヤノン株式会社 Image pickup apparatus and control method thereof
US10432858B2 (en) 2015-06-04 2019-10-01 Canon Kabushiki Kaisha Image pickup apparatus and control method therefor having power control to control power state
KR101747355B1 (en) * 2015-12-23 2017-06-14 엘지전자 주식회사 Input device and air conditioner including the same
JP2018019302A (en) 2016-07-29 2018-02-01 シャープ株式会社 Image forming apparatus, control program, and control method
JP6422472B2 (en) * 2016-08-31 2018-11-14 キヤノン株式会社 Imaging apparatus, control method, and program
CN106486088B (en) * 2016-12-14 2019-03-29 深圳市邦华电子有限公司 A kind of screen luminance adjustment method, device and a kind of intelligent terminal
CN109388312A (en) * 2017-08-05 2019-02-26 益富可视精密工业(深圳)有限公司 Electronic device and its application execution method

Family Cites Families (28)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
SU555412A1 (en) * 1975-09-03 1977-04-25 Институт Оптики Атмосферы Со Ан Ссср Next scanner
SU682880A1 (en) * 1977-05-18 1979-08-30 Предприятие П/Я А-1705 Television search and tracing apparatus
JPH06119090A (en) * 1992-10-07 1994-04-28 Hitachi Ltd Power economization control system
JPH06230860A (en) * 1993-02-08 1994-08-19 Fujitsu Ltd Power supply control system for display device
JP2002082734A (en) 2000-09-06 2002-03-22 Sony Corp Device and method for information processing and program storage medium
JP4395614B2 (en) * 2003-03-28 2010-01-13 カシオ計算機株式会社 Electronic device, power supply control method and program
BRPI0508822A (en) * 2004-03-18 2007-08-07 Koninkl Philips Electronics Nv scanning display apparatus, and, method for operating the same
KR100597798B1 (en) * 2005-05-12 2006-06-30 삼성전자주식회사 Method for offering to user motion recognition information in portable terminal
JP2007163891A (en) 2005-12-14 2007-06-28 Sony Corp Display apparatus
JP4635957B2 (en) * 2006-05-12 2011-02-23 株式会社デンソー In-vehicle operation system
KR101434199B1 (en) * 2006-10-02 2014-08-28 삼성전자주식회사 Terminal and display method for the same
US7747293B2 2006-10-17 2010-06-29 Marvell World Trade Ltd. Display control for cellular phone
US8970501B2 (en) * 2007-01-03 2015-03-03 Apple Inc. Proximity and multi-touch sensor detection and demodulation
KR20080064274A (en) 2007-01-04 2008-07-09 엘지전자 주식회사 Method of controlling a touch pad in a mobile communication terminal and the mobile communication terminal thereof
KR20080097553A (en) 2007-05-02 2008-11-06 (주)멜파스 Sleep mode wake-up method and sleep mode wake-up apparatus using touch sensitive pad for use in an electronic device
CN101952792B (en) 2007-11-19 2014-07-02 瑟克公司 Touchpad combined with a display and having proximity and touch sensing capabilities
US8432372B2 (en) 2007-11-30 2013-04-30 Microsoft Corporation User input using proximity sensing
US8199006B2 (en) 2007-11-30 2012-06-12 Hewlett-Packard Development Company, L.P. Computing device that detects hand presence in order to automate the transition of states
TWI379655B (en) 2007-12-21 2012-12-21 Wistron Corp Digital photo frame with power saving function and related power saving method
KR101495164B1 (en) * 2008-04-10 2015-02-24 엘지전자 주식회사 Mobile terminal and method for processing screen thereof
JP2010067104A (en) * 2008-09-12 2010-03-25 Olympus Corp Digital photo-frame, information processing system, control method, program, and information storage medium
TWI475885B (en) 2008-11-10 2015-03-01 Wistron Corp Control method for backlight module and application thereof
JP2010152573A (en) * 2008-12-25 2010-07-08 Sony Corp Display apparatus and display method
JP2010244132A (en) * 2009-04-01 2010-10-28 Mitsubishi Electric Corp User interface device with touch panel, method and program for controlling user interface
JP5426242B2 (en) 2009-06-11 2014-02-26 京楽産業.株式会社 Amusement center medal circulation system
US20120212440A1 (en) * 2009-10-19 2012-08-23 Sharp Kabushiki Kaisha Input motion analysis method and information processing device
US8350820B2 (en) * 2009-11-06 2013-01-08 Bose Corporation Touch-based user interface user operation accuracy enhancement
KR101596842B1 (en) * 2009-12-04 2016-02-23 엘지전자 주식회사 Mobile terminal with an image projector and method for controlling thereof

Also Published As

Publication number Publication date
US20120154307A1 (en) 2012-06-21
BRPI1105545A2 (en) 2013-04-02
US9703403B2 (en) 2017-07-11
TWI444854B (en) 2014-07-11
CN102568433B (en) 2017-12-01
TW201235886A (en) 2012-09-01
KR20120070502A (en) 2012-06-29
RU2011151186A (en) 2013-06-20
JP2012133524A (en) 2012-07-12
CN102568433A (en) 2012-07-11
EP2469376A3 (en) 2013-08-07
RU2506628C2 (en) 2014-02-10
EP2469376A2 (en) 2012-06-27

Similar Documents

Publication Publication Date Title
US10168829B2 (en) Information processing apparatus, control method therefor, program, and recording medium
US9160921B2 (en) Portable electronic equipment with automatic control to keep display turned on and method
US9525797B2 (en) Image capturing device having continuous image capture
CN102043536B Touch detection device, electronic device and control method for a touch detecting device
US20100095205A1 (en) Portable Terminal and Control Method Therefor
US20100053342A1 (en) Image edit method and apparatus for mobile terminal
US20110019239A1 (en) Image Reproducing Apparatus And Image Sensing Apparatus
JP2011028345A (en) Condition change device, camera, mobile apparatus and program
JP2007158919A (en) Image display apparatus and image display method
WO2011037222A1 (en) Mobile terminal device, method for controlling mobile terminal device, and program
JP5306266B2 (en) Imaging apparatus and control method thereof
JP4670860B2 (en) Recording / playback device
EP2469394A1 (en) Information Processing Device, Method of Processing Information, and Computer Program Storage Device
JP4127982B2 (en) Portable electronic devices
KR20120113714A (en) Information processing device, display method, and program
JP2010055599A (en) Information processing apparatus and method, and program
KR101545883B1 (en) Method for controlling camera of terminal and terminal thereof
US20050184972A1 (en) Image display apparatus and image display method
CN102572271B (en) Image display control apparatus and image display control method
JP6083987B2 (en) Imaging apparatus, control method thereof, and program
US9443476B2 (en) Image pickup device and image pickup method
US7750968B2 (en) Image processing apparatus, image processing method, program, and storage medium
US8514313B2 (en) Imaging device and method for switching mode of imaging device
CN101996044B (en) Method and apparatus for controlling zoom using a touch screen
US20040201772A1 (en) Portable electronic apparatus and power source control method therefor

Legal Events

Date Code Title Description
A621 Written request for application examination

Free format text: JAPANESE INTERMEDIATE CODE: A621

Effective date: 20131028

A521 Written amendment

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20131028

A977 Report on retrieval

Free format text: JAPANESE INTERMEDIATE CODE: A971007

Effective date: 20140528

A131 Notification of reasons for refusal

Free format text: JAPANESE INTERMEDIATE CODE: A131

Effective date: 20140603

A521 Written amendment

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20140711

A131 Notification of reasons for refusal

Free format text: JAPANESE INTERMEDIATE CODE: A131

Effective date: 20140819

A521 Written amendment

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20140930

TRDD Decision of grant or rejection written
A01 Written decision to grant a patent or to grant a registration (utility model)

Free format text: JAPANESE INTERMEDIATE CODE: A01

Effective date: 20141016

A61 First payment of annual fees (during grant procedure)

Free format text: JAPANESE INTERMEDIATE CODE: A61

Effective date: 20141029

R151 Written notification of patent or utility model registration

Ref document number: 5648844

Country of ref document: JP

Free format text: JAPANESE INTERMEDIATE CODE: R151

R250 Receipt of annual fees

Free format text: JAPANESE INTERMEDIATE CODE: R250

R250 Receipt of annual fees

Free format text: JAPANESE INTERMEDIATE CODE: R250