WO2023002676A1 - Information processing device, information processing method, and program - Google Patents
Information processing device, information processing method, and program
- Publication number
- WO2023002676A1 (application PCT/JP2022/010642)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- date
- time
- images
- time information
Classifications
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F16/58—Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually
- G06F16/538—Presentation of query results
- G06F16/54—Browsing; Visualisation therefor
- G06F16/5866—Retrieval characterised by using metadata, using information manually generated, e.g. tags, keywords, comments, manually generated location and time information
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/04845—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range, for image manipulation, e.g. dragging, rotation, expansion or change of colour
- G06F3/04847—Interaction techniques to control parameter settings, e.g. interaction with sliders or dials
- G06F3/14—Digital output to display device; Cooperation and interconnection of the display device with other functional units
Definitions
- This technology relates to an information processing device, an information processing method, and a program, and particularly to a technology for retrieving images.
- In such an apparatus, the imaging time is associated with each image when the image is stored.
- In an information processing apparatus, for example, it is therefore possible to arrange and display images in the order in which they were captured, based on the time at which they were captured.
- An information processing apparatus has also been proposed that displays a list of images captured on a date and during a time period designated by the user.
- The purpose of this technology is to improve usability when searching for images.
- An information processing apparatus according to the present technology includes an image search unit that searches for an image corresponding to specified date and time information from among a plurality of images stored in association with their imaging times, and a display control unit that causes a display unit to display an image based on the search result.
- For example, the information processing apparatus can compare the designated date and time information with the imaging times associated with the images, and determine as the search result an image whose imaging time is close to the designated date and time information.
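As a rough illustration of this comparison, the following is a minimal Python sketch under assumed data structures, not the patent's actual implementation; the function name, identifiers, and imaging times are hypothetical:

```python
from datetime import datetime

def find_closest_image(images, target):
    """Return the (image_id, imaging_time) pair whose imaging time is
    closest to the designated date and time information."""
    return min(images, key=lambda pair: abs(pair[1] - target))

# Hypothetical stored images, each associated with an imaging time.
images = [
    ("still_1", datetime(2021, 7, 1, 15, 20, 0)),
    ("still_6", datetime(2021, 7, 1, 15, 40, 10)),
    ("still_8", datetime(2021, 7, 1, 15, 50, 15)),
]

result = find_closest_image(images, datetime(2021, 7, 1, 15, 44, 0))
# result is ("still_6", ...) because 15:40:10 differs from the
# designated 15:44 by less than the other imaging times do.
```
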
- The drawings include a diagram showing the playback modes; for each of specific examples 1 to 5 and 7 to 10, a diagram explaining the image search and display processing and a flowchart showing the flow of that processing; and a diagram explaining the date and time input image.
- <Configuration of Imaging Device> FIGS. 1 and 2 are diagrams showing the appearance of an imaging device 1 as an example of an information processing device according to an embodiment of the present technology.
- FIG. 2 shows the camera housing 2 with the lens barrel 3 removed.
- The information processing apparatus shown in FIGS. 1 and 2 is one example of the present technology, and a part of it may be included in an external device connected to the information processing apparatus wirelessly or by wire.
- A processing unit other than the units shown may also be provided.
- The imaging apparatus 1 includes a camera housing 2, in which the necessary parts are arranged inside and outside, and a lens barrel 3 attached to the front surface 2a of the camera housing 2. The lens barrel 3 being detachable as a so-called interchangeable lens is only an example; it may instead be a lens barrel that cannot be removed from the camera housing 2.
- A rear monitor 4 is arranged on the rear surface portion 2b of the camera housing 2.
- The rear monitor 4 displays live view images, reproduced images of recorded images, and the like.
- The rear monitor 4 is configured by a display device such as a liquid crystal display (LCD) or an organic EL (Electro-Luminescence) display.
- The rear monitor 4 is rotatable with respect to the camera housing 2. For example, the lower end portion of the rear monitor 4 can be rotated rearward about a rotation axis along its upper end portion.
- The right end or left end of the rear monitor 4 may instead serve as the rotation axis, and the monitor may be rotatable about a plurality of axes.
- An EVF 5 is arranged on the upper surface portion 2c of the camera housing 2.
- The EVF 5 includes an EVF monitor 5a and a frame-shaped enclosure 5b projecting rearward so as to surround the upper portion and the left and right sides of the EVF monitor 5a.
- The EVF monitor 5a is formed using an LCD, an organic EL display, or the like.
- An optical viewfinder (OVF) may be provided instead of the EVF monitor 5a.
- The operators 6 include, for example, a shutter button (release button), a playback menu start button, an enter button, a cross key, a cancel button, a zoom key, a slide key, and the like.
- The operators 6 take various forms such as buttons, dials, and composite operators that can be both pressed and rotated.
- These various operators 6 enable shutter operation, menu operation, reproduction operation, mode selection operation, focus operation, zoom operation, and parameter change operation.
- The parameters include, for example, the shutter speed, the aperture value (F-number), the ISO sensitivity, and the like.
- The operators 6 also include a touch panel provided on the front surface of the rear monitor 4.
- FIG. 3 is a diagram showing the internal configuration of the imaging device 1. In the imaging device 1, light from a subject enters an imaging element unit 12 via an imaging optical system 11.
- The imaging optical system 11 is provided with various lenses such as a zoom lens, a focus lens, and a condenser lens, a diaphragm mechanism, a drive mechanism for the zoom lens, and a drive mechanism for the focus lens.
- The imaging optical system 11 may also be provided with a mechanical shutter (for example, a focal plane shutter).
- The imaging element unit 12 includes an image sensor such as a CMOS (Complementary Metal Oxide Semiconductor) type or CCD (Charge Coupled Device) type sensor.
- The imaging element unit 12 performs, for example, CDS (Correlated Double Sampling) processing and AGC (Automatic Gain Control) processing on the electrical signal obtained by photoelectrically converting the light received by the image sensor, and further performs A/D (Analog/Digital) conversion processing. The imaging element unit 12 then outputs the imaging signal as digital data to the signal processing unit 13.
- The signal processing unit 13 is configured as an image processing processor such as a DSP (Digital Signal Processor).
- The signal processing unit 13 performs various kinds of signal processing on the input imaging signal, for example, preprocessing, synchronization processing, YC generation processing, resolution conversion processing, and file formation processing.
- In the preprocessing, clamp processing for clamping the black levels of R, G, and B to a predetermined level and correction processing among the R, G, and B color channels are performed on the imaging signal from the imaging element unit 12.
- In the synchronization processing, color separation processing is performed so that the image data for each pixel has all of the R, G, and B color components; for example, demosaic processing is performed as the color separation processing.
- In the YC generation processing, a luminance (Y) signal and a color (C) signal are generated (separated) from the R, G, and B image data.
- In the resolution conversion processing, resolution conversion is performed on the image data that has been subjected to the various signal processing.
- In the file formation processing, the image data that has been subjected to the various processes described above undergoes, for example, compression encoding for recording or communication, formatting, and generation or addition of metadata, thereby generating a file for recording or communication.
- For example, an image file in a format such as JPEG (Joint Photographic Experts Group), TIFF (Tagged Image File Format), or GIF (Graphics Interchange Format) is generated as a still image file.
- The signal processing unit 13 generates the metadata so that it includes information on the processing parameters in the signal processing unit 13, various control parameters acquired from the control unit 17, information indicating the operating states of the imaging optical system 11 and the imaging element unit 12, mode setting information, and information such as the date, time, and location.
- For example, the metadata includes the imaging time of the image data; camera information such as the model of the imaging device 1, the manufacturer, and the serial number; the data format; the frame rate (in the case of a moving image); the data size; angle-of-view information; imaging-time parameter information; and the like.
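The kinds of metadata fields listed above can be pictured as a simple record. The following Python sketch uses hypothetical field names and values purely for illustration; it is not the actual metadata format of the image file:

```python
from datetime import datetime

# Illustrative metadata record associated with one image file.
# All field names and values here are hypothetical examples.
metadata = {
    "imaging_time": datetime(2021, 7, 1, 15, 40, 10),  # when the image was captured
    "model": "ImagingDevice-1",     # model of the imaging device
    "manufacturer": "ExampleCorp",
    "serial_number": "00000001",
    "data_format": "JPEG",
    "frame_rate": None,             # used in the case of a moving image
    "data_size": (6000, 4000),      # pixels
    "angle_of_view_deg": 84.0,      # angle-of-view information
    "shutter_speed": "1/250",       # imaging-time parameter information
    "f_number": 2.8,
    "iso_sensitivity": 400,
}

# An image search can then read the imaging time directly from the metadata:
imaging_time = metadata["imaging_time"]
```
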
- The storage unit 14 is, for example, a non-volatile memory, and stores the image files processed by the signal processing unit 13.
- Hereinafter, image data and image files may be simply referred to as images.
- The display unit 15 provides various displays to the photographer; it is, for example, the rear monitor 4 and the EVF monitor 5a arranged on the camera housing 2 of the imaging apparatus 1 as shown in FIG. 1.
- The display unit 15 executes various displays on the display screen based on instructions from the control unit 17.
- For example, the display unit 15 displays images based on the image files stored in the storage unit 14.
- The display unit 15 also displays various operation menus, icons, messages, and the like, that is, a GUI (Graphical User Interface), based on instructions from the control unit 17.
- In this example, the imaging device 1 includes the display unit 15, but an external device (not shown) connected to the imaging device 1 by wire or wirelessly may include the display unit 15. That is, the display unit 15 of the external device may display the GUI based on instructions from the control unit 17 of the imaging device 1.
- The communication unit 16 performs wired or wireless data communication and network communication with external devices. For example, the communication unit 16 transmits an image file to an external information processing device, display device, recording device, playback device, or the like.
- The communication unit 16 can also perform various kinds of network communication over, for example, the Internet, a home network, or a LAN (Local Area Network), and can transmit and receive various data to and from servers, terminals, and the like on the network.
- The control unit 17 is composed of a microcomputer (arithmetic processing unit) equipped with a CPU (Central Processing Unit), a ROM (Read Only Memory), and a RAM (Random Access Memory).
- The control unit 17 is an information processing device that controls the operation of the imaging device 1.
- The RAM is used for the temporary storage of data, programs, and the like as a work area for the various data processing of the CPU.
- The ROM is used to store an OS (Operating System) with which the CPU controls each part, application programs for various operations, firmware, various setting information, and the like.
- The various setting information includes communication setting information, setting information related to the imaging operation, setting information related to image processing, and the like.
- The setting information related to the imaging operation includes the shutter speed, the F-number, the ISO sensitivity, the curtain speed setting of the mechanical or electronic shutter, the frame rate, and the like.
- The control unit 17 has functions as an imaging control unit 31, a display control unit 32, and an image search unit 33.
- The imaging control unit 31 performs various controls when an image is captured.
- For example, the imaging control unit 31 controls various signal processing instructions in the signal processing unit 13, the imaging operation, the recording operation, and the like according to the user's operations.
- The imaging control unit 31 also performs operation control of the diaphragm mechanism, shutter speed control of the imaging element unit 12, autofocus control, drive control of the focus lens in response to a manual focus operation, drive control of the zoom lens in response to a zoom operation, exposure timing control, and the like.
- The display control unit 32 performs display control of the display unit 15 (the rear monitor 4 and the EVF monitor 5a). For example, the display control unit 32 causes the rear monitor 4 to display a captured image, or a GUI for changing various settings. The display control unit 32 also causes the rear monitor 4 to display an image based on the search result of the image search unit 33.
- As will be described later in detail, the image search unit 33 searches for an image corresponding to the date and time information specified via the operators 6 from among the plurality of images stored in the storage unit 14 in association with their imaging times.
- The imaging time relates to the date and time at which an image was captured and indicates, for example, the "year", "month", "day", "hour", and "minute" of capture. The processing executed by the display control unit 32 and the image search unit 33 (the image search and display processing) will be described later.
- In this example, the imaging device 1 has the operators 6, but an external device (not shown) connected to the imaging device 1 by wire or wirelessly may also have operators. That is, the image search unit 33 of the imaging device 1 may search for an image corresponding to date and time information specified via the operators of the external device.
- A driver unit 18 and an audio output device 19 are also connected to the control unit 17.
- The driver unit 18 is provided with, for example, a motor driver for the zoom lens drive motor, a motor driver for the focus lens drive motor, a motor driver for the diaphragm mechanism motor, and the like. These motor drivers apply drive currents to the corresponding motors according to instructions from the imaging control unit 31 to move the focus lens and the zoom lens, open and close the diaphragm blades of the diaphragm mechanism, and the like.
- The audio output device 19 is a device that outputs audio, such as a speaker or a piezoelectric element.
- FIG. 4 is a diagram showing the playback modes.
- The imaging apparatus 1 provides a one-image playback mode and a list playback mode as methods for displaying the images stored in the storage unit 14 on the rear monitor 4.
- The one-image playback mode and the list playback mode can be switched by a predetermined operation on the operators 6.
- In the one-image playback mode, the display control unit 32 displays one of the plurality of images stored in the storage unit 14 on the rear monitor 4. When an operation for switching the displayed image is performed via the operators 6, the display control unit 32 switches the displayed image according to that operation and displays it on the rear monitor 4.
- In the list playback mode, the display control unit 32 displays a list of a plurality of the images stored in the storage unit 14 (nine images in the example in the lower part of FIG. 4).
- The list is displayed on the rear monitor 4 in order of imaging time, earliest first.
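The "earliest imaging time first" ordering of the list display can be sketched as follows; this is an illustrative Python sketch, and the data layout and names are assumptions rather than the patent's implementation:

```python
from datetime import datetime

# Hypothetical stored images with their associated imaging times.
stored = [
    ("still_3", datetime(2021, 7, 1, 15, 20, 12)),
    ("still_1", datetime(2021, 7, 1, 15, 20, 0)),
    ("still_8", datetime(2021, 7, 1, 15, 50, 15)),
    ("still_5", datetime(2021, 7, 1, 15, 39, 50)),
]

# Sort by imaging time, earliest first, as in the list playback mode.
list_order = sorted(stored, key=lambda pair: pair[1])

# Show up to nine thumbnails per list page.
page = [image_id for image_id, _ in list_order[:9]]
```
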
- The display control unit 32 places a cursor CS so that one of the displayed images is highlighted. When an operation for moving the cursor CS is performed via the operators 6, the display control unit 32 moves the cursor CS according to that operation, thereby switching the highlighted image.
- When the cursor CS is moved beyond the currently displayed list of images, the display control unit 32 switches the list displayed on the rear monitor 4 to other images stored in the storage unit 14.
- When a predetermined operation is performed, the display control unit 32 switches to the one-image playback mode and displays the image on which the cursor CS is placed on the rear monitor 4.
- The imaging apparatus 1 may be able to execute all of the specific examples described below, or only some of them. The images given in the specific examples are merely examples; any images may be used, and the number of stored images does not matter.
- The figures referenced below explain the image search and display processing in specific example 1.
- In specific example 1, it is assumed that nine still images 1 to 9 are stored in the storage unit 14, as shown in the upper part of FIG. 5. Still images 1 to 4 were captured between 15:20:00 and 15:20:15 on July 1, 2021; still images 5 to 7 between 15:39:50 and 15:40:50 on the same day; and still images 8 and 9 between 15:50:15 and 15:50:20 on the same day.
- The date and time input image 41 includes a character string prompting the input of date and time information for retrieving images, and an input field 42 for inputting the "year", "month", "day", "hour", and "minute".
- The "year", "month", "day", "hour", and "minute" provided in the input field 42 may be referred to as input items.
- The input items may also include "seconds".
- As a default, the input field 42 may display the current date and time, the imaging time of the image displayed immediately before the image search and display processing, no date and time at all, or some other date and time. If the input field 42 displays the current date and time or the imaging time of the previously displayed image as a default, the user can input the date and time more easily than when there is no default display.
- A triangular mark is displayed to indicate that a value can be changed by operation. In the example on the lower left side of FIG. 5, triangular marks are displayed above and below the input item "year".
- While the date and time input image 41 is displayed on the rear monitor 4, if an up or down operation is performed on the operators 6, the display control unit 32 changes the displayed value of the input item according to that operation. For example, when a down operation is performed, the display control unit 32 changes the value of the input item to the next value.
- When a left or right operation is performed, the display control unit 32 changes which input item can be modified. For example, when a right operation is performed, the display control unit 32 moves the changeable input item one position to the right and displays triangular marks above and below the newly selected input item.
- The image search unit 33 then searches for an image corresponding to the input items displayed in the input field 42, that is, the date and time information specified by the user.
- Specifically, the image search unit 33 compares the date and time information specified by the user with the imaging time in the metadata associated with each of the plurality of images stored in the storage unit 14, thereby searching for an image corresponding to the specified date and time information.
- For example, the image search unit 33 searches for images captured at "15:40 on July 1, 2021", the date and time information specified by the user, and extracts still images 6 and 7, whose imaging times match that date and time information. When a plurality of images are extracted in this way, the image search unit 33 determines the image with the earliest imaging time among them (still image 6) as the search result.
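This selection rule — collect the images whose imaging time matches the specified date and time down to the minute, then take the earliest — can be sketched as follows. This is an illustrative Python sketch; the function name, data layout, and the exact seconds in the imaging times are hypothetical:

```python
from datetime import datetime

def search_exact_minute(images, year, month, day, hour, minute):
    """Return the earliest image whose imaging time matches the given
    date and time information down to the minute, or None if none match."""
    matches = [
        (image_id, t) for image_id, t in images
        if (t.year, t.month, t.day, t.hour, t.minute)
           == (year, month, day, hour, minute)
    ]
    if not matches:
        return None
    return min(matches, key=lambda pair: pair[1])  # earliest imaging time wins

# Imaging times loosely modeled on the example in the text.
images = [
    ("still_5", datetime(2021, 7, 1, 15, 39, 50)),
    ("still_6", datetime(2021, 7, 1, 15, 40, 10)),
    ("still_7", datetime(2021, 7, 1, 15, 40, 50)),
]

hit = search_exact_minute(images, 2021, 7, 1, 15, 40)
# Still images 6 and 7 match "15:40 on July 1, 2021"; still image 6,
# the earliest of the matches, becomes the search result.
```
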
- The display control unit 32 then sets the one-image playback mode and displays the retrieved image (still image 6) on the entire surface of the rear monitor 4.
- At this time, the display control unit 32 continues to display the date and time input image 41, superimposing it in front of the retrieved image.
- The date and time input image 41 may be displayed with a predetermined transparency, such as semi-transparency, so that the retrieved image can be seen through it.
- This allows the user to confirm the retrieved image and, if it differs from the intended image, to immediately re-enter date and time information in the date and time input image 41.
- When a predetermined operation is performed, the display control unit 32 hides the date and time input image 41, as shown in the lower right part of the figure.
- As another example, the image search unit 33 searches for images captured at "15:44 on July 1, 2021", the date and time information specified by the user. If no image stored in the storage unit 14 has an imaging time that matches the specified date and time information, the image search unit 33 determines as the search result the image whose imaging time is closest to the specified date and time information, that is, the image with the smallest difference between the date and time information and the imaging time (still image 7).
- The display control unit 32 then sets the one-image playback mode and displays the image retrieved by the image search unit 33 (still image 7) on the entire surface of the rear monitor 4. At this time, the display control unit 32 continues to display the date and time input image 41, superimposing it in front of the retrieved image.
- When a predetermined operation is performed, the display control unit 32 hides the date and time input image 41, as shown in the lower right part of the figure.
- In this way, the image search unit 33 searches for an image corresponding to the date and time information designated by the user via the date and time input image 41, and the display control unit 32 displays the retrieved image on the rear monitor 4.
- Here, the image corresponding to the date and time information specified by the user means either the image with the earliest imaging time among the images whose imaging times match the specified date and time information, or the image whose imaging time is closest to the specified date and time information; the same applies to the following description.
- Specific example 1 is particularly useful when the user knows the date and time information to search for. For example, it is useful when, during a break at an event such as a wedding ceremony, the user wants to display on the rear monitor 4 a predetermined image from among the images captured so far.
- FIG. 7 is a flowchart showing the flow of image search and display processing in specific example 1.
- When the image search and display process in specific example 1 is started, the display control unit 32 displays the date and time input image 41 in the center of the rear monitor 4 in step S1.
- In step S3, the image search unit 33 determines whether a predetermined time has passed since the date and time information was last changed. If the predetermined time has not passed (No in step S3), the process returns to step S2.
- In step S4, the image search unit 33 searches for an image corresponding to the specified date and time information from among the plurality of images stored in the storage unit 14.
- In step S5, the display control unit 32 sets the single-image reproduction mode and displays the image searched by the image search unit 33 on the entire surface of the rear monitor 4. At this time, the display control unit 32 continues to display the date and time input image 41, superimposed on the front of the searched image.
- step S6 the display control unit 32 determines whether an operation for hiding the date and time input image 41 has been performed on the operator 6, and the operation for hiding the date and time input image 41 is performed. If not (No in step S6), the process returns to step S2. On the other hand, if an operation is performed to hide the date and time input image 41 (Yes in step S6), the display control unit 32 hides the date and time input image 41 and terminates the image search and display process.
- FIG. 8 is a diagram for explaining image search and display processing in specific example 2. In specific example 2, as shown in the upper part of FIG. 8, it is assumed that images similar to those in specific example 1 are stored in the storage unit 14.
- In specific example 2, the single-image reproduction mode is set, and as shown in the lower left part of FIG. 8, one image (here, the still image 1) is displayed on the rear monitor 4.
- the display control unit 32 superimposes the date and time input image 41 on the still image 1 and displays it in the center of the rear monitor 4, as shown in the lower center of FIG. 8.
- At this time, the display control unit 32 displays the date and time input image 41 with the imaging time of the still image 1 entered in the input field 42, based on the metadata of the still image 1 displayed on the rear monitor 4.
- When the input item in the input field 42 is changed according to the user's operation of the operator 6 and the operator 6 is then not operated for a predetermined time (for example, 5 seconds), the image search unit 33 searches for an image corresponding to the input item displayed in the input field 42, that is, the date and time information specified by the user. Note that the search method is the same as in specific example 1.
- the display control unit 32 switches to the image (still image 6) searched by the image search unit 33 and displays it on the entire surface of the rear monitor 4 in the single-image reproduction mode, as shown in the lower right of FIG. 8. At this time, the display control unit 32 continues displaying the date and time input image 41.
- Then, when an operation for hiding the date and time input image 41 is performed, the display control unit 32 hides the date and time input image 41.
- FIG. 9 is a flowchart showing the flow of image search and display processing in specific example 2.
- Note that the same reference numerals are assigned to the same processes as those of specific example 1, and the description thereof is omitted.
- In step S11, the display control unit 32 displays the date and time input image 41 with the imaging time of the image displayed on the rear monitor 4 in the single-image reproduction mode entered in the input field 42.
- In step S12, the display control unit 32 switches the image displayed on the rear monitor 4 to the image searched by the image search unit 33. At this time, the display control unit 32 continues to display the date and time input image 41, superimposed on the front of the searched image. After that, the process of step S6 is performed.
- Specific example 2 is particularly useful when the user knows how many minutes after the currently displayed image the scene that the user wants to see occurred. For example, it is useful when the intervals between scoring scenes in sports are known, or when the time schedule of a given event is known.
- FIGS. 10A and 10B are diagrams for explaining image search and display processing in specific example 3.
- In specific example 3, the list reproduction mode is set, and as shown in the lower left part of FIG. 10, a plurality of images is displayed on the rear monitor 4. Further, the display control unit 32 puts the cursor CS on, for example, the still image 1.
- the display control unit 32 superimposes the date and time input image 41 for designating date and time information on the still image 1 and displays it in the center of the rear monitor 4, as shown in the lower center of FIG. 10.
- At this time, the display control unit 32 displays the date and time input image 41 with the imaging time of the still image 1 entered in the input field 42, based on the metadata of the still image 1 on which the cursor CS is placed among the plurality of images displayed on the rear monitor 4.
- the image search unit 33 searches for an image corresponding to the input item displayed in the input field 42, that is, the date and time information specified by the user. Note that the search method is the same as in specific example 1.
- the display control unit 32 moves the cursor CS to the image (here, the still image 8) searched by the image search unit 33, as shown in the lower right part of FIG. 10. At this time, the display control unit 32 continues displaying the date and time input image 41.
- Then, when an operation for hiding the date and time input image 41 is performed, the display control unit 32 hides the date and time input image 41.
- FIG. 11 is a flowchart showing the flow of image search and display processing in specific example 3.
- Note that the same reference numerals are assigned to the same processes as those of specific example 1, and the description thereof is omitted.
- In step S21, the display control unit 32 displays the date and time input image 41 with the imaging time of the image on which the cursor CS is placed, among the plurality of images displayed on the rear monitor 4 in the list reproduction mode, entered in the input field 42.
- In step S22, the display control unit 32 displays the image searched by the image search unit 33 with the cursor CS placed on it. At this time, the display control unit 32 continues to display the date and time input image 41, superimposed on the front of the searched image. After that, the process of step S6 is performed.
- Specific example 3, like specific example 2, is particularly useful when the user knows how many minutes after the currently displayed image the scene that the user wants to see occurred.
- Note that, in step S22, the image on which the cursor CS is placed may be moved to the upper left of the rear monitor 4 and displayed.
- FIGS. 12A and 12B are diagrams for explaining image search and display processing in specific example 4.
- In specific example 4, as shown in the upper part of FIG. 12, it is assumed that images similar to those in specific example 2 are stored in the storage unit 14. Further, in specific example 4, it is assumed that still images 1 to 4 are grouped, still images 5 to 7 are grouped, and still images 8 and 9 are grouped.
- Here, grouping refers to associating a plurality of images captured by one continuous imaging or interval imaging. Group information for identifying the grouped images is stored in the storage unit 14 in association with the image data of the grouped images. That is, whether or not image data stored in the storage unit 14 belongs to a group can be identified based on the group information stored in the storage unit 14.
- continuous imaging means that still images are continuously captured at predetermined intervals while the shutter button as the operator 6 is being pressed.
- Interval imaging means automatically capturing still images at predetermined time intervals.
- In specific example 4, as in specific example 2, the single-image reproduction mode is set, and as shown in the lower left part of FIG. 12, the date and time input image 41 is superimposed on the still image 1 and displayed in the center of the rear monitor 4.
- the image search unit 33 searches for an image (here, the still image 6) corresponding to the input item displayed in the input field 42, that is, the date and time information specified by the user.
- When the searched image is grouped, the image search unit 33 determines one of the images in the group including that image as the search result. For example, the image search unit 33 determines the image with the earliest imaging time (the still image 5) from among the images of the group including the still image 6 (the still images 5 to 7) as the search result.
- the display control unit 32 switches to the image finally searched by the image search unit 33 (here, the still image 5) and displays it on the entire surface of the rear monitor 4, as shown on the lower right side of FIG. 12. At this time, the display control unit 32 continues displaying the date and time input image 41.
- Then, when an operation for hiding the date and time input image 41 is performed, the display control unit 32 hides the date and time input image 41.
- FIG. 13 is a flowchart showing the flow of image search and display processing in specific example 4.
- Note that the same reference numerals are assigned to the same processes as those of specific example 2, and the description thereof is omitted.
- In step S31, the image search unit 33 determines whether the searched image is grouped based on the group information. If it is not grouped (No in step S31), the process proceeds to step S12. That is, if the image is not grouped, in step S12 the display control unit 32 displays the image corresponding to the date and time information specified by the user that was searched in step S4 (among the images whose imaging times match the specified date and time information, the image with the earliest imaging time, or the image whose imaging time is closest to the specified date and time information).
- On the other hand, if the searched image is grouped (Yes in step S31), in step S32 the image search unit 33 determines, as the search result, the image with the earliest imaging time from the group to which the image corresponding to the date and time information specified by the user, searched in step S4, belongs. After that, the processes of steps S12 and S6 are performed.
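The grouped search of steps S31 and S32 can be sketched as follows. The dict-based image model (one dict of imaging times, one dict of group ids) is an assumption made for illustration only.

```python
from datetime import datetime

# Sketch of steps S31/S32: if the searched image belongs to a group,
# return the earliest image of that group instead. The dict-based
# model of images and group information is an assumption.
def grouped_search(times, groups, target):
    # Step S4: find the image whose imaging time is closest to the target.
    hit = min(times, key=lambda name: abs(times[name] - target))
    gid = groups.get(gid := None) if False else groups.get(hit)
    if gid is None:          # Step S31 "No": the image is not grouped
        return hit
    members = [n for n, g in groups.items() if g == gid]
    # Step S32: earliest imaging time within the group.
    return min(members, key=lambda n: times[n])

times = {"still 5": datetime(2021, 7, 1, 15, 39),
         "still 6": datetime(2021, 7, 1, 15, 40),
         "still 7": datetime(2021, 7, 1, 15, 41)}
groups = {"still 5": 1, "still 6": 1, "still 7": 1}
# The hit is still 6, but the group's earliest image (still 5) is returned.
print(grouped_search(times, groups, datetime(2021, 7, 1, 15, 40)))  # still 5
```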
- In specific example 5, when the searched image is grouped, the image search unit 33 likewise determines one of the images in the group including that image as the search result. At this time, however, the image search unit 33 determines, as the search result, the image whose imaging time is closest to the specified date and time information from among the images in the group (here, the still image 6).
- the display control unit 32 switches to the image searched by the image search unit 33 (here, the still image 6) and displays it on the entire surface of the rear monitor 4, as shown on the lower right side of FIG. 14. At this time, the display control unit 32 continues displaying the date and time input image 41.
- Then, when an operation for hiding the date and time input image 41 is performed, the display control unit 32 hides the date and time input image 41.
- FIG. 15 is a flowchart showing the flow of image search and display processing in specific example 5.
- When the image search and display process in specific example 5 is started, the processes of steps S11, S2 to S4, and S31 are performed. If the searched image is grouped (Yes in step S31), in step S41 the image search unit 33 determines, as the search result, the image whose imaging time is closest to the specified date and time information from among the images in the group. After that, the processes of steps S12 and S6 are performed.
- Note that the image search unit 33 may search for an image corresponding to the designated date and time information and determine, as search results, both the image with the earliest imaging time and the image whose imaging time is closest to the date and time information from the group including the searched image. In this case, the display control unit 32 can display, for example, the image with the earliest imaging time as the main image on the front of the rear monitor 4 and the image whose imaging time is closest to the date and time information as a sub-image at the edge of the rear monitor 4.
- In specific example 6, the image search unit 33 searches for an image corresponding to the designated date and time information, and if there are images captured under the same imaging conditions as the searched image, determines, as the search result, the image with the earliest imaging time or the image whose imaging time is closest to the date and time information from among those images. The other processes are the same as in specific examples 4 and 5.
- Here, the same imaging conditions refer to at least one imaging condition in the metadata associated with the image, such as shutter speed, F-number, ISO sensitivity, or zoom magnification.
- For example, the image search unit 33 searches for an image corresponding to the specified date and time information, and if there are other continuously captured images having the same ISO sensitivity as the searched image, determines, as the search result, the image with the earliest imaging time or the image whose imaging time is closest to the date and time information from among the images having the same ISO sensitivity as the searched image.
- the term “same” as used herein includes not only the case where the values are completely the same, but also the case where the values are within the same range.
- FIGS. 16A and 16B are diagrams for explaining image search and display processing in specific example 7.
- In specific examples 1 to 6, a case where still images are stored in the storage unit 14 has been described.
- In specific example 7, a case where a still image and a moving image are stored in the storage unit 14 will be described.
- In specific example 7, the display control unit 32 superimposes the date and time input image 41 for designating date and time information on the still image 1 and displays it in the center of the rear monitor 4, as shown in the lower center of FIG. 16.
- the image search unit 33 searches for an image corresponding to the input item displayed in the input field 42, that is, the date and time information specified by the user.
- When the image corresponding to the date and time information specified by the user is a moving image, the image search unit 33 searches for a frame matching the specified date and time information from among the plurality of frames constituting the moving image. Specifically, the image search unit 33 reads the imaging time and the frame rate from the metafile associated with the moving image. The image search unit 33 then subtracts the imaging time from the designated date and time information and, based on the result of the subtraction and the frame rate, obtains the number of the frame, counted from the start of imaging, that matches the date and time information designated by the user. The image search unit 33 then determines the frame corresponding to the obtained frame number as the search result.
- Note that whether the image corresponding to the date and time information specified by the user is a moving image can be determined by whether the specified date and time information is later than the imaging time (imaging start time) of the moving image and earlier than the time obtained by adding the recording time of the moving image to the imaging time.
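The frame lookup and the moving-image check described above can be sketched as follows. Modeling the clip by a start time, frame rate, and frame count is an assumption for illustration; real metadata layouts differ.

```python
from datetime import datetime, timedelta

# Sketch of the frame lookup above: decide whether the target time falls
# within the clip, then convert the elapsed time into a frame number.
# Modeling the clip by (start, fps, n_frames) is an assumption.
def matching_frame(target, start, fps, n_frames):
    end = start + timedelta(seconds=n_frames / fps)
    if not (start <= target <= end):
        return None  # the target time is outside the moving image
    elapsed = (target - start).total_seconds()
    return round(elapsed * fps)  # frame number from the start of imaging

start = datetime(2021, 7, 1, 15, 39, 50)
# 10 seconds into a 30 fps, 1800-frame (60 s) clip -> frame 300
print(matching_frame(datetime(2021, 7, 1, 15, 40, 0), start, 30, 1800))  # 300
```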
- the display control unit 32 switches to the image (frame) searched by the image search unit 33 and displays it on the entire surface of the rear monitor 4, as shown in the lower right part of FIG. 16. At this time, the display control unit 32 continues displaying the date and time input image 41.
- Then, when an operation for hiding the date and time input image 41 is performed, the display control unit 32 hides the date and time input image 41.
- FIG. 17 is a flowchart showing the flow of image search and display processing in specific example 7.
- When the image search and display process in specific example 7 is started, the processes of steps S11 and S2 to S4 are performed. In step S51, the image search unit 33 determines whether the searched image is a moving image. If the image is not a moving image (No in step S51), the process proceeds to step S12. On the other hand, if it is a moving image (Yes in step S51), in step S52 the image search unit 33 determines, as the search result, the frame that matches the specified date and time information from among the plurality of frames constituting the searched moving image. After that, the processes of steps S12 and S6 are performed.
- In specific example 7, when the image corresponding to the date and time information specified by the user is a moving image, a frame matching the specified date and time information is determined as the search result from among the plurality of frames constituting the moving image. Alternatively, the first frame of the plurality of frames constituting the moving image may be determined as the search result.
- FIG. 18 is a diagram for explaining image search and display processing in specific example 8.
- In specific examples 1 to 6, a case where still images are stored in the storage unit 14 has been described.
- In specific example 8, a case will be described in which a dual recording function is provided that can capture a still image when, for example, the shutter button is pressed during moving-image capture. Therefore, in specific example 8, still images and moving images are stored in the storage unit 14.
- In specific example 8, the display control unit 32 superimposes the date and time input image 41 for designating date and time information on, for example, the still image 1 and displays it in the center of the rear monitor 4, as shown in the lower center of FIG. 18.
- the image search unit 33 searches for an image corresponding to the input item displayed in the input field 42, that is, the date and time information specified by the user.
- At this time, the image search unit 33 compares the specified date and time information with the imaging time (imaging start time) of each moving image and the imaging time of each still image, and searches for the image (still image or moving image) corresponding to the date and time information specified by the user. Specifically, for the moving image 1, the imaging time of 15:39:50 on July 1, 2021 is used as the comparison target. Here, therefore, the still image 6, whose imaging time is closest to the specified date and time information, is retrieved as the image corresponding to the date and time information specified by the user. The other processes are the same as those of specific examples 1 to 6 and are therefore omitted.
- FIGS. 19A and 19B are diagrams for explaining image search and display processing in specific example 9.
- In specific example 9, as shown in the lower part of FIG. 19, it is assumed that a plurality of images (still images) captured on the same date at different times is stored in the storage unit 14.
- In specific example 9, the display control unit 32 superimposes a date and time input image 43 for designating date and time information on the still image 1 and displays it in the center of the rear monitor 4, as shown on the upper left side of FIG. 19.
- The date and time input image 43 is provided with an input field 44 in which only "year", "month", and "day" are input items; "hour" and "minute" input items are not provided.
- When the input item in the input field 44 is changed according to the user's operation of the operator 6 and the operator 6 is then not operated for a predetermined time (for example, 5 seconds), an image corresponding to the input item displayed in the input field 44, that is, the date and time information (here, only the date) specified by the user, is searched for. In this case, since the search is performed only by date, a rougher, lower-accuracy search is performed than in specific examples 1 to 8.
- Specifically, the image search unit 33 searches for images of the date input by the user, that is, the date and time information specified by the user, based on the imaging times in the metadata. Then, the image search unit 33 determines, as the search result, the image captured at the earliest time in the group of images captured most concentratedly among the images captured on that date.
- More specifically, the image search unit 33 counts the number of images per unit time (every hour) for the images corresponding to the date input by the user. The image search unit 33 then obtains image groups separated by time zones (for example, the 15:00 hour) in which the counted number of images is 0. In the example in the lower part of FIG. 19, two image groups surrounded by dashed lines are obtained.
- The image search unit 33 then counts the number of images in each image group and extracts the image group with the largest number of images as the image group captured most concentratedly. Here, for example, the group of images captured between 16:00 and 18:00 is extracted as the image group captured most concentratedly.
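The coarse date search described above (count images per hour, split the day at empty hours, pick the densest group, and return its earliest image) can be sketched as follows. Representing the images of the specified date as a plain list of imaging times is an assumption for illustration.

```python
from collections import Counter
from datetime import datetime

# Sketch of the coarse date search: count images per hour, split at hours
# with zero images, take the densest group, and return its earliest image.
# The plain list-of-datetimes input is an assumption.
def densest_group_earliest(times):
    per_hour = Counter(t.hour for t in times)
    groups, current = [], []
    for hour in range(24):
        if per_hour[hour]:
            current.extend(t for t in times if t.hour == hour)
        elif current:
            groups.append(current)  # an empty hour closes the current group
            current = []
    if current:
        groups.append(current)
    densest = max(groups, key=len)  # the most concentrated image group
    return min(densest)             # earliest image in that group

times = [datetime(2021, 7, 1, 13, 5), datetime(2021, 7, 1, 14, 10),
         datetime(2021, 7, 1, 16, 2), datetime(2021, 7, 1, 16, 30),
         datetime(2021, 7, 1, 17, 45)]
# The 16:00-18:00 run holds 3 images, so its earliest image (16:02) wins.
print(densest_group_earliest(times))  # 2021-07-01 16:02:00
```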
- the display control unit 32 superimposes the image finally searched by the image search unit 33 and the date and time input image 41 on the rear monitor 4 and displays them, as shown in the upper center of FIG. 19. That is, the display is switched from the date and time input image 43 to the date and time input image 41, and the imaging time of the image displayed on the rear monitor 4 is entered in the input field 42 of the date and time input image 41.
- After that, the image search unit 33 searches for an image corresponding to the input item displayed in the input field 42, that is, the date and time information (here, including hour and minute) specified by the user. In this case, images are searched for with higher accuracy (more finely) than when searching only by date.
- the display control unit 32 switches to the image searched by the image search unit 33 and displays it on the entire surface of the rear monitor 4, as shown on the upper right side of FIG. At this time, the display control unit 32 continues displaying the date and time input image 41 .
- Then, when an operation for hiding the date and time input image 41 is performed, the display control unit 32 hides the date and time input image 41.
- FIG. 20 is a flowchart showing the flow of image search and display processing in specific example 9.
- the display control unit 32 displays the date and time input image 43 including the input field 44 in step S61.
- In step S62, the display control unit 32 changes and displays the input item in the input field 44.
- In step S63, the image search unit 33 determines whether a predetermined time has passed since the date and time information (date) was last changed. If the predetermined time has not passed (No in step S63), the process returns to step S62.
- In step S64, the image search unit 33 searches for images corresponding to the specified date and time information (date). Then, when there is a plurality of images corresponding to the specified date and time information, the image search unit 33 counts the number of images per unit time.
- In step S65, the image search unit 33 obtains image groups separated by time zones (for example, the 15:00 hour) in which the counted number of images is 0.
- In step S66, the image search unit 33 counts the number of images in each image group and extracts the image group with the largest number of images as the image group captured most concentratedly. Then, the image search unit 33 determines, as the search result, the image captured at the earliest time in the image group captured most concentratedly.
- In step S67, the display control unit 32 superimposes the image finally searched by the image search unit 33 and the date and time input image 41 on the rear monitor 4 and displays them.
- After that, the processes of steps S2 to S6 are executed. Note that, in specific example 9, image groups separated by time zones in which the counted number of images is 0 were obtained, but image groups separated at each hour (13:00, 14:00, 15:00, and so on) may be obtained instead. In the example shown in the lower part of FIG. 19, when each hour is used as a segment, the number of images in the group captured between 17:00 and 18:00 is the largest, so in step S67 of FIG. 20, the earliest image in that group is displayed on the rear monitor 4.
- FIGS. 21A and 21B are diagrams for explaining image search and display processing in specific example 10.
- FIG. 22 is a flowchart showing the flow of image search and display processing in specific example 10. Specific example 10 differs from specific example 9 in that the processes of steps S71 to S73 are executed instead of the processes of steps S66 and S67.
- In step S65, the image search unit 33 obtains image groups separated by time zones (for example, the 15:00 hour) in which the counted number of images is 0.
- In step S71, the display control unit 32 displays, on the rear monitor 4, a list of the images captured at the earliest time in each image group, as shown in FIG. 21.
- Note that the method of arranging the images with the earliest imaging times in the respective image groups is not limited to arranging them vertically; they may be arranged horizontally or displayed as thumbnails arranged both vertically and horizontally.
- In step S72, when one of the listed images is selected via the operator 6, the display control unit 32 displays the selected image on the rear monitor 4 in step S73, with the date and time input image 41 superimposed and the imaging time of the image entered in the input field 42.
- Note that the display control unit 32 may select the single-image reproduction mode or the list display mode based on the scene identified by image analysis processing. For example, when the scene is sports, the list display mode is selected because images are likely to have been captured continuously. When the scene is a portrait, the single-image reproduction mode is set so that the image can be viewed carefully.
- the imaging device 1 is connected to an external device.
- the external device is a computer having a CPU such as a smart phone or a personal computer.
- In this case, a date and time input image 45 as shown in FIG. 23 is displayed on the display of the external device. In the date and time input image 45, the "month and day", "hour", and "minute" items are scrolled up and down, and the user's swipe operation (touch operation) is accepted.
- Then, when the imaging device 1 receives the date and time information input via the date and time input image 45 on the external device, it searches for an image as in specific examples 1 to 10 and transmits the searched image to the external device. Thereby, the imaging device 1 can transmit the image corresponding to the date and time information specified by the user to the external device.
- a plurality of imaging devices 1 are interconnected via a network.
- When date and time information is input on one imaging device 1, the date and time information is transmitted to the other imaging devices 1.
- Then, images corresponding to the date and time information specified by the user are searched for in all the imaging devices 1, and the searched images are transmitted to the imaging device 1 on which the date and time information was input.
- the embodiments are not limited to the examples described above, and various modifications can be made.
- the imaging device 1 has been described as an example of the information processing device, various devices such as a computer, a game machine, and a television receiver may be used as the information processing device.
- In the above description, date and time information is input to the date and time input image 41. However, a difference date and time input image for inputting a difference (for example, 15 minutes later) from the imaging time of the image displayed on the rear monitor 4 may be displayed on the rear monitor 4, and when the image searched by the image search unit 33 is displayed on the rear monitor 4, the difference date and time input image may also be displayed.
- the information processing apparatus as an embodiment includes an image searching unit 33 that searches for an image corresponding to designated date and time information from among a plurality of images stored in association with imaging times. and a display control unit 32 that displays an image based on the search result of the image search unit 33 on the display unit 15 (rear monitor 4).
- the information processing apparatus can compare the designated date and time information with the imaging times associated with the images, and determine images whose imaging times are close to the designated date and time information as search results. Therefore, the information processing apparatus can improve usability when searching for images.
- When a plurality of images whose imaging times match the designated date and time information is associated, it is conceivable that the image search unit 33 determines the image with the earliest imaging time among the plurality of associated images as the search result. This makes it possible to display the image most suitable for the date and time information specified by the user.
- the image search unit 33 may determine an image whose imaging time is closest to the designated date and time information as a search result. As a result, even if there is no image associated with an imaging time that matches the date and time information specified by the user, by displaying an image that is close to the date and time information, there is a high possibility that the image is the image desired by the user. Images can be displayed.
- It is conceivable that the display control unit 32 displays a date and time input image for allowing the user to input date and time information on the display unit 15 and continues to display the date and time input image when displaying the image searched by the image search unit 33. As a result, when an image that the user does not desire is displayed, date and time information can be input again immediately.
- It is also conceivable that the display control unit 32 displays a date and time input image for allowing the user to input date and time information on the display unit 15 and displays, on the date and time input image, the imaging time associated with the image displayed on the display unit 15.
- It is also conceivable that the display control unit 32 displays, on the display unit 15, a date and time input image for inputting a difference from the imaging time associated with the image displayed on the display unit 15, and continues to display the date and time input image when displaying the image searched by the image search unit 33.
- It is conceivable that the plurality of images includes a still image and a moving image and that, when the image corresponding to the designated date and time information is a moving image, the display control unit 32 displays the first frame of the moving image on the display unit 15. This allows the user to see the leading frame of the moving image corresponding to the designated date and time information.
- the plurality of images include still images and moving images, and if the image corresponding to the designated date and time information is a moving image, the display control unit 32 matches the designated date and time information in the moving image. It is conceivable to display frames of time on the display unit 15 . Thus, even if the image corresponding to the designated date and time information is a moving image, the image (still image) corresponding to the date and time information designated by the user can be shown to the user.
- the plurality of images may be grouped, and if the image corresponding to the designated date and time information is a grouped image, the image search unit 33 may determine as the search result the image with the earliest imaging time from among the images in the group containing the image corresponding to the designated date and time information. This makes it possible to cue up an image from a single set of continuously captured images.
- the plurality of images may include one or more grouped images, and if the image corresponding to the specified date and time information is a grouped image, the image search unit 33 may determine as the search result the image whose imaging time is closest to the specified date and time information from among the images in the group containing the image corresponding to the specified date and time information. This makes it possible to display the image the user is most likely to want from among the continuously captured images.
- the plurality of images may include one or more grouped images, and if the image corresponding to the specified date and time information is a grouped image, the image search unit 33 may determine as the search results both the image with the earliest imaging time and the image whose imaging time is closest to the specified date and time information from among the images in the group containing the image corresponding to the specified date and time information. This makes it possible to display, side by side, the cued image and the image the user is most likely to want from a single set of continuously captured images.
- the display control unit 32 may simultaneously display the image whose imaging time is closest to the designated date and time information as a main image and the image with the earliest imaging time as a sub-image. This makes it possible to display the cued image and the image the user is most likely to want in an even easier-to-view manner.
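The group lookup described in the statements above can be sketched as follows. This is a minimal illustration, not the patent's implementation; the record layout (a dictionary with hypothetical `id`, `time`, and `group` fields) is assumed for the example:

```python
from datetime import datetime

def search_in_group(images, hit):
    """Given the image `hit` that matches the specified date and time,
    return (cue_image, main_image): the earliest image of its burst
    group and the hit itself. Ungrouped images are returned as-is."""
    group = hit.get("group")
    if group is None:
        return hit, hit
    members = [img for img in images if img.get("group") == group]
    # cue image: earliest imaging time within the group
    cue = min(members, key=lambda img: img["time"])
    return cue, hit

images = [
    {"id": "still5", "time": datetime(2021, 7, 1, 15, 39, 50), "group": "burst2"},
    {"id": "still6", "time": datetime(2021, 7, 1, 15, 39, 57), "group": "burst2"},
    {"id": "still7", "time": datetime(2021, 7, 1, 15, 40, 5), "group": "burst2"},
]
cue, main = search_in_group(images, images[1])
print(cue["id"], main["id"])  # still5 still6
```

Returning both images mirrors the main/sub display described above: the closest match as the main image and the cue image of the burst as the sub-image.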
- the image search unit 33 may search for images corresponding to date and time information specified by an external device. As a result, it becomes possible to transmit an image stored in the imaging device 1 to an external device by an operation from the external device.
- images may be stored in association with information indicating imaging conditions, and the image search unit 33 may determine as the search result any image captured under the same imaging conditions as those associated with the image corresponding to the designated date and time information. This makes it possible to retrieve an arbitrary image from a single set of continuously captured images.
- the image search unit 33 may determine as the search result the image with the earliest imaging time from among the images captured under the same imaging conditions as those associated with the image corresponding to the specified date and time information. This makes it possible to cue up an image from a single set of continuously captured images.
- the image search unit 33 can search for images by different methods depending on the input accuracy of the designated date and time information. As a result, it is possible to perform a detailed search after performing a rough search at first, and it is possible to provide a user interface that is more user-friendly for the user.
- the image search unit 33 may search for images using date and time information with relatively low input precision and then, based on those results, search for images using date and time information with relatively high input precision. This makes it possible to perform a rough search first and a detailed search afterward, providing a user interface that is even easier to use.
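The two-stage search described above can be sketched as a date-level filter followed by a time-level refinement. The function and its arguments are illustrative assumptions, not names from the patent:

```python
from datetime import datetime

def coarse_then_fine(images, day, exact_time=None):
    """Two-stage search: first narrow candidates by date (low input
    precision), then pick the closest imaging time within that day
    once an exact time is supplied (high input precision).

    `images` is a list of (imaging_time, image_id) tuples."""
    same_day = [img for img in images if img[0].date() == day]
    if not same_day:
        return None
    if exact_time is None:
        # coarse stage: just cue up the earliest image of that day
        return min(same_day, key=lambda img: img[0])
    # fine stage: closest imaging time among the day's candidates
    return min(same_day, key=lambda img: abs(img[0] - exact_time))

images = [
    (datetime(2021, 7, 1, 15, 20, 0), "still1"),
    (datetime(2021, 7, 1, 15, 39, 50), "still5"),
    (datetime(2021, 7, 2, 9, 0, 0), "other"),
]
coarse = coarse_then_fine(images, datetime(2021, 7, 1).date())
fine = coarse_then_fine(images, datetime(2021, 7, 1).date(),
                        datetime(2021, 7, 1, 15, 40, 0))
print(coarse[1], fine[1])  # still1 still5
```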
- the display control unit 32 may display on the display unit a date and time input image for allowing the user to input date and time information by touch operation. This makes it possible to provide a user interface that allows the user to intuitively input date and time information.
- the information processing apparatus searches for an image corresponding to specified date and time information from among a plurality of images stored in association with imaging times, and displays the searched image on the display unit.
- the program causes a computer to function as an image search unit that searches for an image corresponding to specified date and time information from among a plurality of images stored in association with imaging times, and a display control unit that displays the image retrieved by the image search unit on a display unit.
- Such an information processing method and program can also obtain the same effect as the information processing apparatus.
- Such a program can be recorded in advance in an HDD as a recording medium built in a device such as a personal computer, or in a ROM, flash memory, or the like in a microcomputer having a CPU.
- alternatively, the program can be stored (recorded) temporarily or permanently on a removable recording medium such as a flexible disc, CD-ROM (Compact Disc Read Only Memory), MO (Magneto-Optical) disc, DVD, Blu-ray disc, magnetic disc, semiconductor memory, or memory card.
- Such removable recording media can be provided as so-called package software.
- the program can also be downloaded from a download site via a network such as a LAN (Local Area Network) or the Internet.
- the present technology can also adopt the following configurations.
- (1) An information processing device comprising: an image search unit that searches, from among a plurality of images stored with associated imaging times, for an image corresponding to specified date and time information; and a display control unit that displays an image based on a search result by the image search unit on a display unit.
- (2) The information processing device according to (1), wherein, when there are a plurality of images associated with imaging times that match the specified date and time information, the image search unit determines the image with the earliest imaging time among them as the search result.
- (3) The information processing device according to (1) or (2), wherein, when there is no image associated with an imaging time that matches the specified date and time information, the image search unit determines the image whose imaging time is closest to the specified date and time information as the search result.
- (4) The information processing device according to any one of (1) to (3), wherein the display control unit displays on the display unit a date and time input image for allowing the user to input the date and time information, and also displays the date and time input image when displaying the image retrieved by the image search unit.
- (5) The information processing device according to any one of (1) to (4), wherein the display control unit displays on the display unit a date and time input image for allowing the user to input the date and time information, and displays in the date and time input image the imaging time associated with the image displayed on the display unit.
- (6) The information processing device according to any one of (1) to (5), wherein the display control unit displays on the display unit a difference date and time input image for inputting a difference from the imaging time associated with the image displayed on the display unit, and also displays the difference date and time input image when displaying the image retrieved by the image search unit.
- (7) The information processing device according to any one of (1) to (6), wherein the plurality of images include still images and moving images, and, when the image corresponding to the specified date and time information is a moving image, the display control unit displays the first frame of the moving image on the display unit.
- (8) The information processing device according to any one of (1) to (7), wherein the plurality of images include still images and moving images, and, when the image corresponding to the specified date and time information is a moving image, the display control unit displays on the display unit the frame of the moving image at the time matching the specified date and time information.
- (9) The information processing device according to any one of (1) to (8), wherein the plurality of images are grouped, and, when the image corresponding to the specified date and time information is a grouped image, the image search unit determines as the search result the image with the earliest imaging time from among the images in the group containing the image corresponding to the specified date and time information.
- (10) The information processing device according to any one of (1) to (9), wherein the plurality of images include one or more grouped images, and, when the image corresponding to the specified date and time information is a grouped image, the image search unit determines as the search result the image whose imaging time is closest to the specified date and time information from among the images in the group containing the image corresponding to the specified date and time information.
- (11) The information processing device according to any one of (1) to (10), wherein the plurality of images include one or more grouped images, and, when the image corresponding to the specified date and time information is a grouped image, the image search unit determines as the search results the image with the earliest imaging time and the image whose imaging time is closest to the specified date and time information from among the images in the group containing the image corresponding to the specified date and time information.
- (12) The information processing device according to (11), wherein the display control unit displays the image whose imaging time is closest to the specified date and time information as a main image and the image with the earliest imaging time as a sub-image.
- (13) The information processing device according to any one of (1) to (12), wherein the image search unit searches for an image corresponding to date and time information specified by an external device.
- (14) The information processing device according to any one of (1) to (13), wherein the images are stored in association with information indicating imaging conditions, and the image search unit determines as the search result any of the images having the same imaging conditions as those associated with the image corresponding to the specified date and time information.
- (15) The information processing device according to (14), wherein the image search unit determines as the search result the image with the earliest imaging time from among the images having the same imaging conditions as those associated with the image corresponding to the specified date and time information.
- (16) The information processing device according to any one of (1) to (15), wherein the image search unit searches for images by different methods depending on the input precision of the specified date and time information.
- (17) The information processing device according to (16), wherein the image search unit searches for images using date and time information with relatively low input precision and then, based on the search results, searches for images using date and time information with relatively high input precision.
- (18) The information processing device according to any one of (1) to (17), wherein the display control unit displays on the display unit a date and time input image for allowing the user to input the date and time information by touch operation.
- (19) An information processing method in which an information processing device searches, from among a plurality of images stored with associated imaging times, for an image corresponding to specified date and time information, and displays the retrieved image on a display unit.
- (20) A program that causes a computer to function as: an image search unit that searches, from among a plurality of images stored with associated imaging times, for an image corresponding to specified date and time information; and a display control unit that displays the image retrieved by the image search unit on a display unit.
- 1 imaging device, 4 rear monitor, 6 operator, 15 display unit, 17 control unit, 32 display control unit, 33 image search unit
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Data Mining & Analysis (AREA)
- Databases & Information Systems (AREA)
- Human Computer Interaction (AREA)
- Library & Information Science (AREA)
- Studio Devices (AREA)
- Information Retrieval, Db Structures And Fs Structures Therefor (AREA)
Abstract
Description
This allows the information processing device to compare the specified date and time information with the imaging times associated with the images, and to determine as a search result an image whose imaging time is close to the specified date and time information.
<1. Configuration of the Imaging Device>
<2. Playback Modes>
<3. Image Search and Display Processing>
<4. Modifications>
<5. Summary>
<6. Present Technology>
FIGS. 1 and 2 show the appearance of an imaging device 1 as an example of an information processing device according to an embodiment of the present technology. FIG. 2 shows the camera housing 2 with the lens barrel 3 removed. In the following description, the subject side is referred to as the front and the photographer (user) side as the rear. The information processing device shown in FIGS. 1 and 2 is one example of the present technology; some of its components may be provided in an external device connected to the information processing device by wire or wirelessly, and it may include processing units other than those described below.
The lens barrel 3 being detachable as a so-called interchangeable lens is merely an example; it may instead be a lens barrel that cannot be removed from the camera housing 2.
The rear monitor 4 is constituted by a display device such as a liquid crystal display (LCD) or an organic EL (Electro-Luminescence) display.
The rear monitor 4 is rotatable relative to the camera housing 2. For example, it can pivot about its upper edge so that its lower edge swings rearward. The right or left edge of the rear monitor 4 may instead serve as the pivot axis, and the monitor may be rotatable about multiple axes.
The EVF monitor 5a is formed using an LCD, an organic EL display, or the like. An optical viewfinder (OVF) may be provided instead of the EVF monitor 5a.
The operators 6 come in various forms, including buttons, dials, and composite operators that can be pressed and rotated. They enable, for example, shutter operations, menu operations, playback operations, mode selection, focus operations, zoom operations, and parameter change operations. Parameters include, for example, shutter speed, aperture value (F-number), and ISO sensitivity.
The operators 6 also include a touch panel provided on the front face of the rear monitor 4.
In the imaging device 1, light from the subject enters the image sensor unit 12 via the imaging optical system 11.
The image sensor unit 12 performs, for example, CDS (Correlated Double Sampling) processing and AGC (Automatic Gain Control) processing on the electrical signal obtained by photoelectrically converting the light received by the image sensor, and further performs A/D (Analog/Digital) conversion. The image sensor unit 12 then outputs the imaging signal as digital data to the signal processing unit 13.
In the synchronization processing, color separation processing is applied so that the image data for each pixel has all of the R, G, and B color components. For an image sensor using a Bayer-array color filter, demosaicing is performed as the color separation processing.
In the YC generation processing, a luminance (Y) signal and a chrominance (C) signal are generated (separated) from the R, G, B image data.
In the resolution conversion processing, resolution conversion is performed on the image data that has undergone the various signal processing steps.
For example, still image files are generated in formats such as JPEG (Joint Photographic Experts Group), TIFF (Tagged Image File Format), or GIF (Graphics Interchange Format). Image files may also be generated in the MP4 format used for recording MPEG-4 compliant video and audio.
Image files may also be generated as RAW image data.
The metadata includes, for example, the imaging time of the image data, camera information such as the model, manufacturer, and serial number of the imaging device 1, the data format, the frame rate (for moving images), the data size, angle-of-view information, and parameter information at the time of imaging.
The display unit 15 performs various kinds of display on its screen based on instructions from the control unit 17.
For example, the display unit 15 displays images based on image files stored in the storage unit 14.
The display unit 15 also displays various operation menus, icons, messages, and the like, i.e., a GUI (Graphical User Interface), based on instructions from the control unit 17. In this embodiment the imaging device 1 includes the display unit 15, but an external device (not shown) connected to the imaging device 1 by wire or wirelessly may include the display unit 15 instead. That is, the display unit 15 of the external device may display the GUI based on instructions from the control unit 17 of the imaging device 1.
For example, the communication unit 16 transmits and outputs image files to external information processing devices, display devices, recording devices, playback devices, and the like.
As a network communication unit, the communication unit 16 performs various kinds of network communication over, for example, the Internet, a home network, or a LAN (Local Area Network), and can exchange various data with servers, terminals, and the like on the network.
The ROM is used to store the OS (Operating System) with which the CPU controls each unit, application programs for various operations, firmware, and various setting information.
The setting information includes communication settings, settings related to imaging operations, and settings related to image processing. Settings related to imaging operations include shutter speed, F-number, ISO sensitivity, curtain speed settings for the mechanical or electronic shutter, and frame rate.
The imaging control unit 31 performs various kinds of control when capturing images. For example, the imaging control unit 31 instructs the signal processing unit 13 on various signal processing and controls imaging and recording operations in response to user operations.
The imaging control unit 31 also controls the operation of the aperture mechanism, the shutter speed of the image sensor unit 12, the driving of the focus lens and zoom lens in response to autofocus control, manual focus operations, zoom operations, and the like, and the exposure timing.
In this embodiment the imaging device 1 includes the operators 6, but an external device (not shown) connected to the imaging device 1 by wire or wirelessly may include the operators 6 instead. That is, the image search unit 33 of the imaging device 1 may search for an image corresponding to date and time information specified via the operators 6 of the external device.
These motor drivers apply drive currents to the corresponding drivers in response to instructions from the imaging control unit 31, causing the focus lens and zoom lens to move, the aperture blades of the aperture mechanism to open and close, and so on.
FIG. 4 shows the playback modes. The imaging device 1 provides a single-image playback mode and a list playback mode as methods of displaying images stored in the storage unit 14 on the rear monitor 4. These modes can be switched by a predetermined operation on the operators 6.
Next, the image search and display processing is described with specific examples. The imaging device 1 may be capable of executing all of the specific examples described below, or only some of them. The images in the examples are merely illustrative; any images may be used, and any number of them may be stored.
FIGS. 5 and 6 illustrate the image search and display processing in specific example 1. In specific example 1, as shown in the upper parts of FIGS. 5 and 6, nine still images 1 to 9 are stored in the storage unit 14. Still images 1 to 4 were captured between 15:20:00 and 15:20:15 on July 1, 2021; still images 5 to 7 between 15:39:50 and 15:40:05 on July 1, 2021; and still images 8 and 9 between 15:50:15 and 15:50:20 on July 1, 2021.
At this time, the display control unit 32 keeps displaying the date and time input image 41, superimposing it in front of the retrieved image. The date and time input image 41 may be displayed with a predetermined transparency, e.g., semi-transparent, so that the retrieved image shows through.
FIG. 8 illustrates the image search and display processing in specific example 2. In specific example 2, as also shown in the upper part of FIG. 8, the same images as in specific example 1 are stored in the storage unit 14.
FIG. 10 illustrates the image search and display processing in specific example 3. In specific example 3, as also shown in the upper part of FIG. 10, the same images as in specific example 1 are stored in the storage unit 14.
FIG. 12 illustrates the image search and display processing in specific example 4. In specific example 4, as also shown in the upper part of FIG. 12, the same images as in specific example 2 are stored in the storage unit 14. In specific example 4, still images 1 to 4 are grouped, still images 5 to 7 are grouped, and still images 8 and 9 are grouped.
Here, grouping associates a plurality of images captured in one session of continuous shooting or interval shooting. Group information identifying the grouped images is stored in the storage unit 14 linked to the image data of the grouped images. That is, the group information stored in the storage unit 14 makes it possible to determine whether image data stored in the storage unit 14 belongs to a group.
For example, the image search unit 33 determines as the search result the image with the earliest imaging time (still image 5) from among the images in the group that contains still image 6 (still images 5 to 7).
FIG. 14 illustrates the image search and display processing in specific example 5. In specific example 5, as also shown in the upper part of FIG. 14, the same images and group information as in specific example 4 are stored in the storage unit 14.
In specific example 5, the image search unit 33 determines as the search result the image whose imaging time is closest to the specified date and time information (here, still image 6) from among the images in the group that contains still image 6 (still images 5 to 7).
In specific examples 4 and 5, an image corresponding to the specified date and time information is searched for, and if group information is stored for the retrieved image, the image with the earliest imaging time, or the image whose imaging time is closest to the date and time information, is determined as the search result from among the images in the group that contains the retrieved image.
FIG. 16 illustrates the image search and display processing in specific example 7. Specific examples 1 to 6 dealt with the case where still images are stored in the storage unit 14. Specific example 7 deals with the case where both still images and moving images are stored in the storage unit 14.
Specifically, the image search unit 33 reads the imaging time and frame rate from the metafile associated with the moving image. The image search unit 33 then subtracts the imaging time from the specified date and time information and, based on the result and the frame rate, determines which frame from the start of imaging corresponds to the time matching the date and time information specified by the user. The image search unit 33 then determines the frame corresponding to that frame number as the search result.
Whether the image corresponding to the date and time information specified by the user is a moving image can be determined by whether the date and time information is later than the imaging time (imaging start time) of the moving image and earlier than the time obtained by adding the recording duration to that imaging time.
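The frame computation just described (containment check, then offset times frame rate) can be sketched as follows. This is an illustrative sketch; the function name and parameters are assumptions, not identifiers from the patent:

```python
from datetime import datetime, timedelta

def frame_for_datetime(start, duration_s, fps, target):
    """Return the 0-based frame index of `target` inside a clip that
    starts at `start`, lasts `duration_s` seconds, and runs at `fps`
    frames per second, or None when `target` falls outside the clip."""
    end = start + timedelta(seconds=duration_s)
    if not (start <= target <= end):
        return None  # the specified time is not within this moving image
    # subtract the imaging start time, then convert seconds to frames
    offset = (target - start).total_seconds()
    return int(offset * fps)

start = datetime(2021, 7, 1, 15, 39, 50)
idx = frame_for_datetime(start, 30, 30, datetime(2021, 7, 1, 15, 40, 0))
print(idx)  # 300
```

Here a time 10 seconds into a 30 fps clip maps to frame 300, and a time outside the clip's duration yields None, matching the containment test described above.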
FIG. 18 illustrates the image search and display processing in specific example 8. Specific examples 1 to 6 dealt with the case where still images are stored in the storage unit 14. Specific example 8 deals with the case where a dual-recording function is provided that can capture a still image when, for example, the shutter button is pressed during moving-image capture. Accordingly, in specific example 8, both still images and moving images are stored in the storage unit 14.
Specifically, for moving image 1, its imaging time of 15:39:50 on July 1, 2021 is used for comparison. Here, therefore, still image 6, which has the imaging time closest to the specified date and time information, is retrieved as the image corresponding to the date and time information specified by the user. The other processing is the same as in specific examples 1 to 6 and is omitted.
FIG. 19 illustrates the image search and display processing in specific example 9. In specific example 9, as shown in the lower part of FIG. 19, a plurality of images (still images) captured at different times on the same date are stored in the storage unit 14.
Here, the date and time input image 43 has an input field 44 whose input items are only "year", "month", and "day"; there are no input items for "hour" or "minute".
FIG. 21 illustrates the image search and display processing in specific example 10. FIG. 22 is a flowchart showing the flow of the image search and display processing in specific example 10. Specific example 10 differs in that steps S71 to S73 are executed in place of steps S66 and S67 of specific example 9.
In specific example 11, predetermined image analysis processing is performed on the image retrieved by the image search unit 33 that corresponds to the date and time information specified by the user, and the scene in which the retrieved image was captured is identified. Various known image analysis techniques can be applied to this processing, so a detailed description is omitted here.
In specific example 12, the imaging device 1 is connected to an external device. The external device is a computer having a CPU, such as a smartphone or a personal computer. When the external device is a smartphone, a date and time input image 45 as shown in FIG. 23 is displayed on the display unit of the external device. In the date and time input image 45, "month/day", "hour", and "minute" are displayed as vertically scrollable items, accepting the user's swipe (touch) operations.
In specific example 13, a plurality of imaging devices 1 are connected to each other via a network. When date and time information is input on any of the imaging devices 1, for example via the date and time input image 41, that date and time information is transmitted to the other imaging devices 1. All of the imaging devices 1 then search for an image corresponding to the date and time information specified by the user and transmit the retrieved image to the imaging device 1 on which the date and time information was input.
The embodiments are not limited to the examples described above, and various modifications are possible.
For example, although the imaging device 1 has been described as an example of the information processing device, the information processing device may be any of various devices such as a computer, a game console, or a television receiver.
As described above, the information processing device (imaging device 1) according to the embodiment includes an image search unit 33 that searches, from among a plurality of images stored with associated imaging times, for an image corresponding to specified date and time information, and a display control unit 32 that displays an image based on the search result of the image search unit 33 on the display unit 15 (rear monitor 4).
This allows the information processing device to compare the specified date and time information with the imaging times associated with the images, and to determine as a search result an image whose imaging time is close to the specified date and time information.
The information processing device can therefore improve usability when searching for images.
This makes it possible to display the image best suited to the date and time information specified by the user.
Even when no image has an associated imaging time that matches the date and time information specified by the user, displaying an image close to that date and time information makes it possible to show an image that is likely to be the one the user wants.
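The fallback behavior described here (prefer an exact match, otherwise take the nearest imaging time) can be sketched as follows. This is a minimal illustration under assumed data shapes, not the patent's implementation; with coarser input precision several images may match exactly, in which case the earliest would be chosen:

```python
from datetime import datetime

def search_by_datetime(images, target):
    """Return the (imaging_time, image_id) tuple best matching `target`:
    an exact match if one exists, otherwise the image whose imaging
    time is closest to the specified date and time."""
    exact = [img for img in images if img[0] == target]
    if exact:
        # on multiple exact matches, take the earliest imaging time
        return min(exact, key=lambda img: img[0])
    # no exact match: smallest absolute difference wins
    return min(images, key=lambda img: abs(img[0] - target))

images = [
    (datetime(2021, 7, 1, 15, 20, 0), "still1"),
    (datetime(2021, 7, 1, 15, 39, 50), "still5"),
    (datetime(2021, 7, 1, 15, 50, 15), "still8"),
]
result = search_by_datetime(images, datetime(2021, 7, 1, 15, 40, 0))
print(result[1])  # still5
```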
This allows the user to immediately input date and time information when an undesired image is displayed.
This provides a user interface that makes it easy for the user to specify date and time information when the user knows how many minutes after the displayed image the desired scene occurred.
This allows the user to easily input that difference when the user knows how many minutes after the displayed image the desired scene occurred.
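The difference input described here amounts to adding a user-entered offset to the imaging time of the currently displayed image to obtain an absolute search time. A minimal sketch, with a hypothetical function name:

```python
from datetime import datetime, timedelta

def target_from_difference(displayed_time, minutes=0, seconds=0):
    """Turn a user-entered offset (e.g. "+3 min") relative to the
    displayed image's imaging time into an absolute search time."""
    return displayed_time + timedelta(minutes=minutes, seconds=seconds)

shown = datetime(2021, 7, 1, 15, 20, 0)
target = target_from_difference(shown, minutes=3)
print(target)  # 2021-07-01 15:23:00
```

The resulting absolute time can then be passed to the same date-and-time search used for direct input.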
This allows the user to be shown the first frame of the moving image corresponding to the specified date and time information.
Even if the image corresponding to the specified date and time information is a moving image, the image (still image) corresponding to the date and time information specified by the user can be shown to the user.
This makes it possible to cue up an image from a single set of continuously captured images.
This makes it possible to display, from a single set of continuously captured images, the image the user is most likely to want.
This makes it possible to display, side by side, the cued image and the image the user is most likely to want from a single set of continuously captured images.
This makes it possible to display the cued image and the image the user is most likely to want from a single set of continuously captured images in an even easier-to-view manner.
This makes it possible to transmit an image stored in the imaging device 1 to an external device by an operation from the external device.
This makes it possible to retrieve an arbitrary image from a single set of continuously captured images.
This makes it possible to cue up an image from a single set of continuously captured images.
This makes it possible, for example, to search roughly at first and then in detail, providing a user interface that is even easier for the user to use.
This makes it possible, for example, to search roughly at first and then in detail, providing a user interface that is even easier for the user to use.
This provides a user interface that allows the user to input date and time information intuitively.
The program causes a computer to function as an image search unit that searches, from among a plurality of images stored with associated imaging times, for an image corresponding to specified date and time information, and a display control unit that displays the image retrieved by the image search unit on a display unit.
Such an information processing method and program can obtain the same effects as the information processing device.
Alternatively, the program can be stored (recorded) temporarily or permanently on a removable recording medium such as a flexible disc, CD-ROM (Compact Disc Read Only Memory), MO (Magneto-Optical) disc, DVD, Blu-ray disc, magnetic disc, semiconductor memory, or memory card. Such removable recording media can be provided as so-called package software.
Besides being installed on a personal computer or the like from a removable recording medium, the program can also be downloaded from a download site via a network such as a LAN (Local Area Network) or the Internet.
Claims (20)
- An information processing device comprising: an image search unit that searches, from among a plurality of images stored with associated imaging times, for an image corresponding to specified date and time information; and a display control unit that displays an image based on a search result by the image search unit on a display unit.
- The information processing device according to claim 1, wherein, when there are a plurality of images associated with imaging times that match the specified date and time information, the image search unit determines the image with the earliest imaging time among them as the search result.
- The information processing device according to claim 1, wherein, when there is no image associated with an imaging time that matches the specified date and time information, the image search unit determines the image whose imaging time is closest to the specified date and time information as the search result.
- The information processing device according to claim 1, wherein the display control unit displays on the display unit a date and time input image for allowing the user to input the date and time information, and also displays the date and time input image when displaying the image retrieved by the image search unit.
- The information processing device according to claim 1, wherein the display control unit displays on the display unit a date and time input image for allowing the user to input the date and time information, and displays in the date and time input image the imaging time associated with the image displayed on the display unit.
- The information processing device according to claim 1, wherein the display control unit displays on the display unit a difference date and time input image for inputting a difference from the imaging time associated with the image displayed on the display unit, and also displays the difference date and time input image when displaying the image retrieved by the image search unit.
- The information processing device according to claim 1, wherein the plurality of images include still images and moving images, and, when the image corresponding to the specified date and time information is a moving image, the display control unit displays the first frame of the moving image on the display unit.
- The information processing device according to claim 1, wherein the plurality of images include still images and moving images, and, when the image corresponding to the specified date and time information is a moving image, the display control unit displays on the display unit the frame of the moving image at the time matching the specified date and time information.
- The information processing device according to claim 1, wherein the plurality of images are grouped, and, when the image corresponding to the specified date and time information is a grouped image, the image search unit determines as the search result the image with the earliest imaging time from among the images in the group containing the image corresponding to the specified date and time information.
- The information processing device according to claim 1, wherein the plurality of images include one or more grouped images, and, when the image corresponding to the specified date and time information is a grouped image, the image search unit determines as the search result the image whose imaging time is closest to the specified date and time information from among the images in the group containing the image corresponding to the specified date and time information.
- The information processing device according to claim 1, wherein the plurality of images include one or more grouped images, and, when the image corresponding to the specified date and time information is a grouped image, the image search unit determines as the search results the image with the earliest imaging time and the image whose imaging time is closest to the specified date and time information from among the images in the group containing the image corresponding to the specified date and time information.
- The information processing device according to claim 11, wherein the display control unit displays the image whose imaging time is closest to the specified date and time information as a main image and the image with the earliest imaging time as a sub-image.
- The information processing device according to claim 1, wherein the image search unit searches for an image corresponding to date and time information specified by an external device.
- The information processing device according to claim 1, wherein the images are stored in association with information indicating imaging conditions, and the image search unit determines as the search result any of the images having the same imaging conditions as those associated with the image corresponding to the specified date and time information.
- The information processing device according to claim 14, wherein the image search unit determines as the search result the image with the earliest imaging time from among the images having the same imaging conditions as those associated with the image corresponding to the specified date and time information.
- The information processing device according to claim 1, wherein the image search unit searches for images by different methods depending on the input precision of the specified date and time information.
- The information processing device according to claim 16, wherein the image search unit searches for images using date and time information with relatively low input precision and then, based on the search results, searches for images using date and time information with relatively high input precision.
- The information processing device according to claim 1, wherein the display control unit displays on the display unit a date and time input image for allowing the user to input the date and time information by touch operation.
- An information processing method in which an information processing device searches, from among a plurality of images stored with associated imaging times, for an image corresponding to specified date and time information, and displays the retrieved image on a display unit.
- A program that causes a computer to function as: an image search unit that searches, from among a plurality of images stored with associated imaging times, for an image corresponding to specified date and time information; and a display control unit that displays the image retrieved by the image search unit on a display unit.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2023536605A JPWO2023002676A1 (ja) | 2021-07-21 | 2022-03-10 | |
CN202280049569.XA CN117678217A (zh) | 2021-07-21 | 2022-03-10 | 信息处理设备、信息处理方法和程序 |
EP22845616.6A EP4376427A1 (en) | 2021-07-21 | 2022-03-10 | Information processing device, information processing method, and program |
US18/570,524 US20240273135A1 (en) | 2021-07-21 | 2022-03-10 | Information processing device, information processing method, and program |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2021120863 | 2021-07-21 | ||
JP2021-120863 | 2021-07-21 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2023002676A1 true WO2023002676A1 (ja) | 2023-01-26 |
Family
ID=84979868
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2022/010642 WO2023002676A1 (ja) | 2021-07-21 | 2022-03-10 | 情報処理装置、情報処理方法、プログラム |
Country Status (5)
Country | Link |
---|---|
US (1) | US20240273135A1 (ja) |
EP (1) | EP4376427A1 (ja) |
JP (1) | JPWO2023002676A1 (ja) |
CN (1) | CN117678217A (ja) |
WO (1) | WO2023002676A1 (ja) |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2002300521A (ja) * | 2001-03-30 | 2002-10-11 | Sony Corp | 情報処理装置及び情報処理方法、プログラム並びに記録媒体 |
JP2004220420A (ja) * | 2003-01-16 | 2004-08-05 | Fuji Photo Film Co Ltd | 画像検索方法および装置並びにプログラム |
JP2006166193A (ja) * | 2004-12-09 | 2006-06-22 | Casio Comput Co Ltd | 電子カメラ |
JP2007312310A (ja) | 2006-05-22 | 2007-11-29 | Fujifilm Corp | 画像表示装置、表示制御プログラム及び撮影装置 |
JP2016004379A (ja) * | 2014-06-16 | 2016-01-12 | 大日本印刷株式会社 | 画像プリント装置及び注文受付端末 |
JP2020194302A (ja) * | 2019-05-27 | 2020-12-03 | キヤノン株式会社 | 情報処理装置、検索システム、情報処理装置の制御方法、プログラム |
Also Published As
Publication number | Publication date |
---|---|
CN117678217A (zh) | 2024-03-08 |
EP4376427A1 (en) | 2024-05-29 |
US20240273135A1 (en) | 2024-08-15 |
JPWO2023002676A1 (ja) | 2023-01-26 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6324063B2 (ja) | 画像再生装置及びその制御方法 | |
JP5056061B2 (ja) | 撮像装置 | |
US20050134719A1 (en) | Display device with automatic area of importance display | |
JP2008263538A (ja) | 撮影装置、方法およびプログラム | |
KR101739379B1 (ko) | 디지털 촬영 장치 및 이의 제어 방법 | |
KR101626002B1 (ko) | 디지털 촬영 장치, 그 제어 방법, 및 컴퓨터 판독가능 저장매체 | |
KR101737086B1 (ko) | 디지털 촬영 장치 및 이의 제어 방법 | |
CN101076086B (zh) | 场景选择画面生成装置 | |
WO2023002676A1 (ja) | 情報処理装置、情報処理方法、プログラム | |
JP5066878B2 (ja) | カメラ及び表示システム | |
JP4887167B2 (ja) | 画像表示装置、画像表示プログラム及び撮影装置 | |
WO2021255975A1 (ja) | 撮像装置、撮像制御装置、撮像装置の制御方法、プログラム | |
KR101436325B1 (ko) | 동영상 대표 이미지 설정 방법 및 장치 | |
JP2008054128A (ja) | 撮像装置、画像表示装置及びそのプログラム | |
WO2022158203A1 (ja) | 撮像装置、撮像制御方法、プログラム | |
JP2005278003A (ja) | 画像処理装置 | |
KR101946574B1 (ko) | 영상 재생 장치, 방법, 및 컴퓨터 판독가능 저장매체 | |
JP6249771B2 (ja) | 画像処理装置、画像処理方法、プログラム | |
JP2017135606A (ja) | 撮像装置 | |
WO2022201948A1 (ja) | 情報処理装置、情報処理方法、プログラム | |
WO2022158201A1 (ja) | 画像処理装置、画像処理方法、プログラム | |
JP2012029119A (ja) | 表示制御装置、カメラ、および、表示装置 | |
JP2004312299A (ja) | デジタルカメラ | |
JP6594134B2 (ja) | 撮像装置、情報処理方法及びプログラム | |
JP2020043604A (ja) | 撮像装置 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase |
Ref document number: 2023536605 Country of ref document: JP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 18570524 Country of ref document: US |
|
WWE | Wipo information: entry into national phase |
Ref document number: 202280049569.X Country of ref document: CN |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2022845616 Country of ref document: EP |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
ENP | Entry into the national phase |
Ref document number: 2022845616 Country of ref document: EP Effective date: 20240221 |