CN116634121A - Electronic device, control method of electronic device, and storage medium


Info

Publication number: CN116634121A
Application number: CN202310130508.2A
Authority: CN (China)
Prior art keywords: frame, specified, reference position, image, frame image
Legal status: Pending
Other languages: Chinese (zh)
Inventor: 清水瑠璃果
Current Assignee: Canon Inc
Original Assignee: Canon Inc
Priority claimed from: JP2022195646A (JP2023121126A)
Application filed by: Canon Inc
Publication of: CN116634121A


Classifications

    • H04N 13/204: Stereoscopic video systems; Image signal generators using stereoscopic image cameras
    • H04N 13/296: Stereoscopic video systems; Image signal generators; Synchronisation thereof; Control thereof
    • H04N 5/144: Details of television systems; Picture signal circuitry for video frequency region; Movement detection

Abstract

The invention relates to an electronic device, a control method of the electronic device, and a storage medium. The electronic device sequentially displays a plurality of frame images of moving image content on a screen, and displays a timeline area corresponding to a reproduction period of the plurality of frame images on the screen. The electronic device calculates a position on a trajectory connecting the reference positions of a first frame image and a second frame image and sets the calculated position as the reference position of a third frame image, and, for a plurality of fourth frame images existing in a specified period designated by a user, sets one reference position designated by the user so that the reference position is maintained throughout the specified period. The electronic device displays the specified period and other periods differently in the timeline area, and sequentially displays, on the screen, regions corresponding to the reference positions set for the respective frame images.

Description

Electronic device, control method of electronic device, and storage medium
Technical Field
The invention relates to an electronic device, a control method of the electronic device and a storage medium.
Background
There are platforms and devices that can render and share VR content (e.g., omnidirectional images, omnidirectional panoramic images).
Here, in order to convert VR content into an image that can be handled more easily, a method of extracting, from the VR content, an area having a narrower viewing angle than the VR content is known. The extraction range may be specified for each frame by a user operation.
Japanese Patent Application Laid-Open No. 2014-165763 and Japanese Patent Application Laid-Open No. 2005-223416 disclose techniques with which an arbitrary position in VR content can be designated by a user operation and an image in a range (narrower than the original VR content) centered on that position can be extracted (selected).
However, if the above-described technique is used to extract (select) regions for all frames of VR content, the user must specify positions for all frames. This requires many operating steps by the user and is time consuming.
Disclosure of Invention
In view of the above, the present invention enables a user to easily select a partial area from content.
An aspect of the present invention is an electronic apparatus for reproducing moving image content, the electronic apparatus including: a display control unit configured to control to sequentially display a plurality of frame images constituting the moving image content on a screen, and to display a timeline area corresponding to a reproduction period of the plurality of frame images on the screen; a calculation unit configured to calculate, for a third frame image which exists between a first frame image and a second frame image (two frame images for which reference positions have been set) and for which a reference position has not yet been set, a position on a trajectory connecting a first reference position of the first frame image and a second reference position of the second frame image; a setting unit configured to automatically set the calculated position as a third reference position of the third frame image, and to automatically set, for a plurality of fourth frame images existing in a specified period specified in the timeline area according to an operation by a user, one fourth reference position specified according to the operation by the user so that the one fourth reference position is maintained; and a control unit configured to control to display the specified period and other periods differently in the timeline area on the screen, and to control to reproduce the moving image content by sequentially displaying, on the screen, regions corresponding to the reference positions set for the plurality of frame images, respectively.
An aspect of the present invention is a control method of an electronic apparatus for reproducing moving image content, the control method including the steps of: controlling to sequentially display a plurality of frame images constituting the moving image content on a screen, and to display a timeline area corresponding to a reproduction period of the plurality of frame images on the screen; calculating, for a third frame image which exists between a first frame image and a second frame image (two frame images for which reference positions have been set) and for which a reference position has not yet been set, a position on a trajectory connecting a first reference position of the first frame image and a second reference position of the second frame image; automatically setting the calculated position as a third reference position of the third frame image; automatically setting, for a plurality of fourth frame images existing in a specified period specified in the timeline area according to an operation by a user, one fourth reference position specified according to the operation by the user so that the one fourth reference position is maintained; controlling to display the specified period and other periods differently in the timeline area on the screen; and controlling to reproduce the moving image content by sequentially displaying, on the screen, regions corresponding to the reference positions set for the plurality of frame images, respectively.
Further features of the invention will become apparent from the following description of exemplary embodiments with reference to the attached drawings.
Drawings
Fig. 1A to 1C are diagrams depicting a digital camera;
fig. 2A and 2B are diagrams depicting a display control apparatus;
fig. 3A to 3C are diagrams and tables for explaining display control processing;
fig. 4 is a flowchart showing a control process;
fig. 5 is a flowchart of the moving image generation process;
fig. 6A to 6D are diagrams for explaining the speed control process; and
fig. 7 is a flowchart of the speed control process.
Detailed Description
One conceivable technique is to systematically and automatically set a position that serves as an extraction reference (extraction reference position). For example, the extraction reference position at each timing between a first timing and a second timing is set so that the extraction reference position gradually changes from the extraction reference position at the first timing to the extraction reference position at the second timing. With this technique, however, in a case where the extraction reference position should remain at the same position for every timing between the first timing and the second timing, the user has to manually set that same extraction reference position for both the first timing and the second timing. Therefore, in the following embodiments, a technique for more easily setting the extraction reference position between two timings to the same position is described.
Embodiments of the present invention will be described with reference to the accompanying drawings. However, the following embodiments are not intended to limit the invention according to the claims, and not all combinations of features described in the present embodiments are necessary as solutions to the problems disclosed in the present invention. In the following description, the same constituent elements are denoted by the same reference numerals. The various embodiments of the invention described below may be implemented alone or as a combination of multiple embodiments or features of such embodiments where desired or where it is beneficial to combine elements or features of the various embodiments into one embodiment.
Embodiment 1
Embodiment 1 of the present invention will be described with reference to the accompanying drawings. Fig. 1A is a front perspective view (external view) of a digital camera 100 (image pickup apparatus) as an electronic device. Fig. 1B is a rear perspective view (external view) of the digital camera 100. The digital camera 100 is an omnidirectional camera.
The barrier 102a is a protective window of the imaging lens 103a used for the "camera section a" whose imaging range is on the front side of the digital camera 100. The barrier 102a may be an outer side surface of the imaging lens 103a itself. The "camera portion a" is a wide-angle camera whose imaging range is a wide range (at least 180 ° in the up-down-left-right direction) of the front side of the digital camera 100. The barrier 102B is a protective window of the imaging lens 103B used for the "camera section B" whose imaging range is on the rear side of the digital camera 100. The barrier 102b may be an outer side surface of the imaging lens 103b itself. The "camera section B" is a wide-angle camera whose imaging range is a wide range (at least 180 ° in the up-down-left-right direction) of the rear side of the digital camera 100.
The display unit 28 is a display unit for displaying various information. The shutter button 61 is an operation unit to instruct shooting of an image. The mode selection switch 60 is an operation unit for switching various modes. The connection I/F25 is a connector between a connection cable to connect with an external device (e.g., a smart phone, a personal computer, a TV) and the digital camera 100. The operation unit 70 is an operation unit constituted by an operation member (e.g., various switches, buttons, dials, touch sensors) to accept various operations from a user. The power switch 72 is a push button for switching the power supply ON/OFF (ON/OFF).
The light emitting unit 21 is a light emitting member such as a Light Emitting Diode (LED). The light emitting unit 21 notifies the user of various states of the digital camera 100 by a light emitting pattern or a light emitting color. The fixing unit 40 is, for example, a tripod screw hole, and is a member to mount the digital camera 100 to a fixing device such as a tripod.
Fig. 1C is a block diagram depicting a structural example of the digital camera 100. The barrier 102a covers the image pickup system of the digital camera 100 including the "camera section a" of the image pickup lens 103a to prevent contamination and damage of the image pickup system (including the image pickup lens 103a, the shutter 101a, and the image pickup unit 22 a). The image pickup lens 103a is a lens group, and includes a zoom lens and a focus lens. The imaging lens 103a is, for example, a wide-angle lens. The shutter 101a is a shutter having a diaphragm function to adjust the amount of object light entering the image capturing unit 22 a. The image pickup unit 22a is an image pickup element composed of a CCD or CMOS element or the like that converts an optical image into an electric signal. The a/D converter 23a converts an analog signal output from the image pickup unit 22a into a digital signal.
The barrier 102B covers the image pickup system of the "camera section B" of the digital camera 100 including the image pickup lens 103B to prevent contamination and damage of the image pickup system (including the image pickup lens 103B, the shutter 101B, and the image pickup unit 22B). The image pickup lens 103b is a lens group, and includes a zoom lens and a focus lens. The imaging lens 103b is, for example, a wide-angle lens. The shutter 101b is a shutter having a diaphragm function to adjust the amount of object light entering the imaging unit 22 b. The image pickup unit 22b is an image pickup element composed of a CCD or CMOS element or the like that converts an optical image into an electric signal. The a/D converter 23b converts an analog signal output from the image pickup unit 22b into a digital signal.
VR images are captured by the imaging units 22a and 22 b. VR images are images that can be VR displayed. The VR image includes an omnidirectional image captured by an omnidirectional camera, and a panoramic image having an image range (effective imaging range) wider than a display range that can be displayed on the display unit all at once. VR images include not only still images but also moving images and live view images (images acquired from cameras in a near real-time manner). The VR image has an image range (effective image range) of a field of view of 360 ° at maximum in the vertical direction (vertical angle, angle from the top, elevation angle, depression angle, and altitude angle) and 360 ° at maximum in the horizontal direction (horizontal angle, azimuth angle). Further, even if the imaging range is vertically less than 360 ° and horizontally less than 360 °, the VR image includes an image having a wide viewing angle (field of view range) wider than the viewing angle that can be captured by a standard camera, or an image having an image range (effective imaging range) wider than the display range that can be displayed all at once on the display unit. For example, an image captured by an omnidirectional camera that can capture an object existing in a field of view (angle of view) of 360 ° in the horizontal direction (horizontal angle, azimuth angle) and 210 ° in a vertical angle centered on the zenith is a VR image.
Further, for example, an image captured by a camera that can capture an object existing in a field of view (angle of view) of 180 ° in the horizontal direction (horizontal angle, azimuth angle) and 180 ° in a vertical angle centered on the horizontal direction is a VR image. In other words, an image having an image range in which the field of view is at least 160 ° (±80°) in the vertical direction and the horizontal direction, respectively, and having an image range wider than a range that can be recognized by the human eye all at once is a VR image. If VR display is performed on the VR image (the VR image is displayed in the display mode "VR view") and the posture of the display device is changed in the horizontal rotation direction, an omnidirectional image seamless in the horizontal direction (horizontal rotation direction) can be seen. In the case of the vertical direction (vertical rotation direction), a seamless omnidirectional image can be viewed within ±105° from the position directly above (zenith), but a range exceeding 105 ° from the position directly above becomes a blank region. VR images may be defined as images whose image range is in at least a portion of a virtual space (VR space).
VR display ("VR view") is a display method (display mode) in which the display range of a VR image can be changed so that an image in the field of view corresponding to the posture of the display device is displayed. In the case of viewing with a head mounted display (HMD) worn as the display device, an image in the field of view corresponding to the direction of the user's face is displayed. For example, assume that an image centered on a viewing angle of 0° horizontally (a specific direction, such as north) and 90° vertically (90° from the zenith, i.e., the horizontal direction) is currently displayed. If the posture of the display unit is reversed front to back (for example, the display surface is changed from facing south to facing north), the display range of the same VR image is changed to an image centered on a viewing angle of 180° horizontally (the opposite direction, such as south) and 90° vertically (the horizontal direction). In the case where the user is viewing while wearing the HMD, if the user turns their face from north to south (that is, if the user turns around), the image displayed on the HMD also changes from an image of the north to an image of the south. With such VR display, the user can visually experience a sensation as if they were actually in the location shown in the VR image (within the VR space). A smartphone mounted in VR goggles (a head mounted adapter) can be regarded as a kind of HMD.
The method for displaying VR images is not limited to the above-described method, but the display range may be moved (scrolled) not according to a change in posture but according to an operation of a touch panel, a direction button, or the like by a user. Also during display on the VR display (in the VR view mode), the display range can be changed not only by changing the posture but also by a touch moving operation to the touch panel and a drag operation with an operation member such as a mouse.
The image processing unit 24 performs resizing processing (e.g., predetermined pixel interpolation, reduction) and color conversion processing on the data from the A/D converter 23a and the A/D converter 23b or on the data from the memory control unit 15. The image processing unit 24 also performs predetermined arithmetic processing using the captured image data. The system control unit 50 performs exposure control and ranging control based on the arithmetic operation result acquired by the image processing unit 24. Thereby, through-the-lens (TTL) autofocus (AF) processing, auto exposure (AE) processing, and pre-flash emission (EF) processing are performed. The image processing unit 24 also performs predetermined arithmetic processing using the captured image data, and performs TTL auto white balance (AWB) processing based on the obtained arithmetic operation result.
The image processing unit 24 also performs basic image processing on the two images (fisheye images) acquired from the A/D converter 23a and the A/D converter 23b, and then synthesizes the two images (performs image connection processing) to generate a single VR image. In the image connection processing performed on the two images, the image processing unit 24 calculates, for each of the two images, the amount of deviation between the reference image and the comparison image in each region by pattern matching processing, and detects the connection position. Then, taking into account the detected connection position and the lens characteristics of each optical system, the image processing unit 24 performs distortion correction on each of the two images by geometric conversion and converts them into an omnidirectional image format. By blending these two images in the omnidirectional image format, the image processing unit 24 finally generates one omnidirectional image (VR image). The generated omnidirectional image (VR image) is, for example, an image generated by equidistant cylindrical (equirectangular) projection, in which the position of each pixel can be associated with coordinates on the surface of a sphere. When a VR image is displayed in live view or reproduced, processing for VR-displaying the VR image, such as image extraction processing, enlargement processing, and distortion correction, is performed, and rendering to draw the VR image in the VRAM of the memory 32 is also performed.
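The patent does not give the pixel-to-sphere mapping explicitly, but for an equidistant cylindrical (equirectangular) image the correspondence between pixel positions and spherical coordinates is the standard one sketched below; the axis conventions are assumptions for illustration only.

```python
import math

def equirect_pixel_to_sphere(x, y, width, height):
    """Map a pixel (x, y) of an equirectangular image to spherical angles.

    Returns (azimuth, elevation) in radians: azimuth in [-pi, pi),
    elevation in [-pi/2, pi/2]. The conventions are assumptions,
    not taken from the patent.
    """
    azimuth = (x / width) * 2.0 * math.pi - math.pi      # longitude
    elevation = math.pi / 2.0 - (y / height) * math.pi   # latitude
    return azimuth, elevation

def sphere_to_equirect_pixel(azimuth, elevation, width, height):
    """Inverse mapping from spherical angles back to pixel coordinates."""
    x = (azimuth + math.pi) / (2.0 * math.pi) * width
    y = (math.pi / 2.0 - elevation) / math.pi * height
    return x, y
```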
The output data from the a/D converters 23a and 23b are written to the memory 32 via the image processing unit 24 and the memory control unit 15 or only via the memory control unit 15. The memory 32 stores image data acquired by the image pickup units 22a and 22b and converted into digital data by the a/D converters 23a and 23b, and image data output to an external display via the connection I/F25. The memory 32 has a storage capacity sufficient to store a predetermined number of still images and moving images and sound of a predetermined duration.
The memory 32 also functions as a memory (video memory) for displaying images. The data for image display stored in the memory 32 can be output to an external display via the connection I/F25. VR images (VR images captured by the imaging units 22a and 22b, generated by the image processing unit 24, and stored in the memory 32) are sequentially transferred to the display, and the VR images are displayed on the display. By this processing, live view display (LV display) of VR images is realized. The image displayed by live view is hereinafter referred to as "LV image". The live view display can also be performed by transmitting the VR image stored in the memory 32 to an external device (for example, a smart phone) wirelessly connected via the communication unit 54, and displaying the VR image on the external device side (remote LV display).
The nonvolatile memory 56 is an electrically erasable/recordable memory. For the nonvolatile memory 56, for example, an EEPROM is used. In the nonvolatile memory 56, constants, programs, and the like used for the operation of the system control unit 50 are stored. The "program" herein refers to a computer program for executing the processing steps of various flowcharts to be described later.
The system control unit 50 is a control unit constituted by at least one processor or circuit, and generally controls the digital camera 100. The system control unit 50 realizes the processes of the embodiments by executing programs recorded in the nonvolatile memory 56. For the system memory 52, for example, RAM is used. In the system memory 52, constants and variables for causing the system control unit 50 to operate, programs read from the nonvolatile memory 56, and the like are developed. The system control unit 50 also performs display control by controlling the memory 32, the image processing unit 24, and the memory control unit 15.
The system timer 53 is a timer unit to measure time (time to perform various controls and time of an internal clock).
The mode selection switch 60, the shutter button 61, and the operation unit 70 are operation members for inputting various operation instructions to the system control unit 50. The mode selection switch 60 switches the operation mode of the system control unit 50 to a still image recording mode, a moving image shooting mode, a reproduction mode, a communication connection mode, or the like. The still image recording mode includes an automatic image capturing mode, an automatic scene determination mode, a manual mode, a diaphragm priority mode (Av mode), a shutter speed priority mode (Tv mode), and a program AE mode. It also includes various scene modes with image capturing settings for each image capturing scene, and a custom mode. The user can switch directly to one of these modes using the mode selection switch 60. Alternatively, the user may first switch to a list screen of the image capturing modes using the mode selection switch 60, then select one of the plurality of modes displayed on the display unit 28, and switch to the selected mode using another operation member. In the same manner, the moving image shooting mode may include a plurality of modes.
The first shutter switch 62 becomes on halfway in the operation of the shutter button 61 arranged in the digital camera 100, that is, in a half-pressed state (image capturing preparation instruction), and generates a first shutter switch signal SW1. By generating the first shutter switch signal SW1, the system control unit 50 starts an image capturing preparation operation such as an Auto Focus (AF) process, an Auto Exposure (AE) process, an Auto White Balance (AWB) process, or a pre-flash Emission (EF) process.
The second shutter switch 64 becomes on when the operation of the shutter button 61 is completed, that is, in a fully pressed state (imaging instruction), and generates a second shutter switch signal SW2. By generating the second shutter switch signal SW2, the system control unit 50 starts a series of image capturing processing operations from the step of reading the signals from the image capturing units 22a and 22b until the step of writing the image data to the recording medium 90.
The shutter button 61 is not limited to a shutter button that can be operated in two stages of full-press and half-press, but may be an operation member that can be pressed in one stage. In this case, the image capturing preparation operation and the image capturing operation are continuously performed by one-stage pressing. This operation is equivalent to an operation in the case where the shutter button that can be fully pressed and half pressed is fully pressed (an operation in the case where the SW1 signal and the SW2 signal are generated almost simultaneously).
The respective operation members of the operation unit 70 can be used as various function buttons assigned with appropriate functions according to scenes by selecting from various function icons and options displayed on the display unit 28. The function buttons are, for example, an end button, a back button, a forward button, a jump button, a filter button, and an attribute change button. For example, if a menu button is pressed, a menu screen that can make various settings is displayed on the display unit 28. By operating the operation unit 70 while checking the menu screen displayed on the display unit 28, the user can intuitively make various settings.
The power supply control unit 80 is constituted by a battery detection circuit, a DC-DC converter, a switching circuit (a circuit for switching blocks to be energized), and the like. The power control unit 80 detects whether a battery is mounted, the type of the battery, and the remaining power of the battery. The power supply control unit 80 also controls the DC-DC converter based on the detection result and the instruction from the system control unit 50, and supplies a required voltage to each unit (including the recording medium 90) in a required period of time. The power supply unit 30 is constituted by a primary battery (e.g., an alkaline battery, a lithium battery), a secondary battery (e.g., a NiCd battery, a NiMH battery, a lithium battery), an AC adapter, and the like.
The recording medium I/F18 is an interface with the recording medium 90 (e.g., memory card, hard disk). The recording medium 90 is a recording medium such as a memory card for recording a photographed image. The recording medium 90 is constituted by a semiconductor memory, an optical disk, a magnetic disk, or the like. The recording medium 90 may be a removable recording medium that is removable from the digital camera 100, or may be an internal recording medium of the digital camera 100.
The communication unit 54 is connected to an external device wirelessly or via a cable, and transmits/receives video signals and audio signals. The communication unit 54 may also be connected to a wireless LAN or the internet. The communication unit 54 may transmit the image (including the LV image) captured by the imaging unit 22a or the imaging unit 22b and the image recorded in the recording medium 90. The communication unit 54 may also receive images and various other information from external devices.
The posture detecting unit 55 detects the posture of the digital camera 100 with respect to the gravitational direction. Based on the posture detected by the posture detecting unit 55, it is possible to determine whether the images captured by the image capturing units 22a and 22b are images captured by the horizontally held digital camera 100 or images captured by the vertically held digital camera 100. Further, it is possible to determine how much the imaging units 22a and 22b are tilted in three axis directions (yaw, pitch, roll) when the imaging units 22a and 22b capture images. The system control unit 50 may attach posture information according to the posture detected by the posture detection unit 55 to the image file of the VR images captured by the image capturing units 22a and 22 b. The system control unit 50 may also record an image in a rotated state (by adjusting the direction of the image to correct the inclination) according to the detected posture. For the posture detection unit 55, at least one or a combination of an acceleration sensor, a gyro sensor, a geomagnetic sensor, an azimuth sensor, a height sensor, and the like may be used. Using the posture detecting unit (acceleration sensor, gyro sensor, orientation sensor), the movement (e.g., panning, tilting, lifting, holding stationary) of the digital camera 100 can be detected.
The microphone 20 is a microphone for collecting sound around the digital camera 100 to be recorded as the sound of a moving image of a VR image. The connection I/F 25 is provided with a connection plug for a cable (such as a USB cable) so that images can be transmitted to or received from a connected external device.
Fig. 2A shows an example of an external view of a display control apparatus 200 as one type of electronic apparatus. The display 205 is a display unit to display images and various information. As described later, the display 205 is integrated with the touch panel 206 a. Thereby, the display control apparatus 200 can detect a touch operation to the display surface of the display 205. The display control apparatus 200 may display and reproduce VR images (VR contents) in VR format on the display 205.
The operation unit 206 includes a touch panel 206a and operation units 206b, 206c, 206d, and 206e. The operation unit 206b is a power button that accepts an operation to switch the power of the display control apparatus 200 on/off. The operation unit 206c and the operation unit 206d are volume buttons to increase/decrease the volume of sound output from the sound output unit 212. The operation unit 206e is a home button to display a home screen on the display 205. The sound output terminal 212a is a headphone jack, and is a terminal for outputting sound to headphones, an external speaker, or the like. The speaker 212b is an internal speaker of the main unit to output sound.
Fig. 2B shows an example of the structure of the display control apparatus 200. The display control device 200 may be configured using such a display device as a smart phone. The CPU 201, memory 202, nonvolatile memory 203, image processing unit 204, display 205, operation unit 206, storage medium I/F207, external I/F209, and communication I/F210 are connected to an internal bus 250. Further, the sound output unit 212 and the posture detection unit 213 are also connected to the internal bus 250. The units connected to the internal bus 250 can exchange data with each other via the internal bus 250.
The CPU 201 is a control unit that generally controls the display control apparatus 200, and is constituted by at least one processor or circuit. The memory 202 is constituted by, for example, a RAM (for example, a volatile memory using semiconductor elements). The CPU 201 controls each unit of the display control apparatus 200 using the memory 202 as a work memory, for example, according to a program stored in the nonvolatile memory 203. In the nonvolatile memory 203, image data, audio data, other data, various programs for operation of the CPU 201, and the like are stored. The nonvolatile memory 203 is constituted by, for example, a flash memory, a ROM, or the like.
Based on the control of the CPU 201, the image processing unit 204 performs various types of image processing on images (for example, images stored in the nonvolatile memory 203 and the storage medium 208, video signals acquired via the external I/F209, images acquired via the communication I/F210). The image processing performed by the image processing unit 204 includes a/D conversion processing, D/a conversion processing, encoding processing for image data, compression processing, decoding processing, enlargement/reduction processing (resizing), noise reduction processing, and color conversion processing. The image processing unit 204 also performs various other types of image processing such as panorama development, mapping processing, and conversion on VR images as wide-range images (including but not limited to omni-directional images) having data in a wide range. The image processing unit 204 may be constituted by a dedicated circuit block for performing specific image processing. Depending on the type of image processing, the CPU 201 can perform image processing according to a program without using the image processing unit 204.
The display 205 displays an image, a Graphical User Interface (GUI) screen constituting a GUI, and the like based on control of the CPU 201. The CPU 201 generates a display control signal according to a program, and controls each unit of the display control device 200 (controls so that a video signal to display an image on the display 205 is generated and output to the display 205). The display 205 displays an image based on the video signal. The display control apparatus 200 itself may be provided with only a unit up to an interface for outputting a video signal for displaying an image on the display 205, and the display 205 may be constituted by an externally connected monitor (e.g., TV).
The operation unit 206 is an input device for accepting a user operation. The operation unit 206 includes a text information input device (e.g., a keyboard), a pointing device (e.g., a mouse, a touch panel), buttons, dials, a joystick, a touch sensor, and a touch pad. The touch panel is a planar input device that is superimposed on the display 205, and from which coordinate information according to a touch position is output.
The storage medium I/F207 may be mounted with a storage medium 208 (memory card, CD, DVD). Based on the control of the CPU201, the storage medium I/F207 reads data from the mounted storage medium 208 or writes data to the storage medium 208. The external I/F209 is an interface to connect with an external device wirelessly or via a cable, and input/output video signals and audio signals. The communication I/F210 is an interface to communicate with an external device or the network 211 or the like, and transmits/receives various data such as files and commands.
The sound output unit 212 outputs sounds of moving images and music data, operation sounds, bell sounds, and various notification sounds. The sound output unit 212 includes a sound output terminal 212a (a terminal for connecting headphones or the like) and a speaker 212b. The sound output unit 212 may output sound via wireless communication.
The posture detection unit 213 detects the posture of the display control apparatus 200 with respect to the gravitational direction and the inclination of the posture with respect to each axis of yaw, roll, and pitch. Based on the posture detected by the posture detecting unit 213, whether the display control apparatus 200 is held horizontally, held vertically, flipped up, flipped down, or in a tilted posture, or the like, can be detected. For the posture detection unit 213, at least one of an acceleration sensor, a gyro sensor, a geomagnetic sensor, an azimuth sensor, a height sensor, and the like may be used, or a combination of a plurality of such sensors may be used.
The operation unit 206 includes a touch panel 206a. The CPU 201 can detect the following operation or the state thereof on the touch panel 206a.
A finger or pen that was not touching the touch panel 206a newly touches the touch panel 206a, i.e., the start of a touch (hereinafter referred to as Touch-Down)
A finger or pen is touching the touch panel 206a (hereinafter referred to as Touch-On)
A finger or pen moves while touching the touch panel 206a (hereinafter referred to as Touch-Move)
A finger or pen touching the touch panel 206a is released from the touch panel 206a, i.e., the end of the touch (hereinafter referred to as Touch-Up)
Nothing is touching the touch panel 206a (hereinafter referred to as Touch-Off)
When Touch-Down is detected, Touch-On is also detected at the same time. After Touch-Down, Touch-On normally continues to be detected until Touch-Up is detected. When Touch-Move is detected, Touch-On is also detected at the same time. Even if Touch-On is detected, Touch-Move is not detected unless the touch position moves. After it is detected that all touching fingers or the pen have been released (Touch-Up), Touch-Off is detected.
These operations and states, as well as the position coordinates of the point on the touch panel 206a that the finger or pen is touching, are notified to the CPU 201 via the internal bus. Based on the notified information, the CPU 201 determines which operation (touch operation) was performed on the touch panel 206a. For Touch-Move, the movement direction of the finger or pen moving on the touch panel 206a is also determined for the vertical component and the horizontal component on the touch panel 206a, respectively, based on the change in the position coordinates. When it is determined that Touch-Move has been performed for at least a predetermined distance, it is determined that a slide operation has been performed. An operation of quickly moving a finger a certain distance while touching the touch panel 206a and then releasing it is called a "flick". In other words, a flick is an operation of quickly flicking a finger across the touch panel 206a. When Touch-Move for at least a predetermined distance at a predetermined speed or faster is detected and Touch-Up is detected immediately thereafter, it is determined that a flick has been performed (it is determined that a flick occurred following a slide operation).
Further, a touch operation of touching a plurality of points (for example, two points) simultaneously and bringing the touch positions closer to each other is referred to as "pinch-in", and a touch operation of moving the touch positions away from each other is referred to as "pinch-out". Pinch-in and pinch-out are collectively referred to as a pinch operation (or simply a "pinch"). As the touch panel 206a, any of various types of touch panels may be used, such as resistive film, electrostatic capacitance, surface acoustic wave, electromagnetic induction, image recognition, and photoelectric sensor types. Some types detect a touch when the touch panel is actually contacted, while other types detect a touch when a finger or pen approaches the touch panel; either type may be used.
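As a rough illustration of how the slide/flick decision described above could be made from the notified touch events, the following sketch classifies a completed touch; the distance and speed thresholds are hypothetical values, not taken from the patent, and the speed here is simplified to the average over the whole movement.

```python
import math

SLIDE_DISTANCE = 20.0   # pixels; hypothetical threshold for a slide
FLICK_SPEED = 0.5       # pixels per millisecond; hypothetical threshold for a flick

def classify_touch(track):
    """Classify a completed touch as 'tap', 'slide', or 'flick'.

    `track` is a list of (timestamp_ms, x, y) samples from Touch-Down to
    Touch-Up. A flick is a sufficiently long movement that is also fast.
    """
    if len(track) < 2:
        return "tap"
    (t0, x0, y0), (t1, x1, y1) = track[0], track[-1]
    distance = math.hypot(x1 - x0, y1 - y0)
    if distance < SLIDE_DISTANCE:
        return "tap"
    duration_ms = max(t1 - t0, 1)        # avoid division by zero
    speed = distance / duration_ms
    return "flick" if speed >= FLICK_SPEED else "slide"
```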
The storage medium 208 stores data such as an image to be displayed on the display 205. The CPU 201 performs recording/reading of data with respect to the storage medium 208 via the storage medium I/F207.
The external I/F209 is an interface to perform data communication with an external device by enabling a USB cable or the like to be inserted into the display control device 200. The communication I/F210 is an interface to perform data communication with the external network 211 via wireless communication.
The sound output unit 212 outputs, for example, sound of content reproduced in the display control apparatus 200. The posture detection unit 213 detects the posture of the display control apparatus 200, and notifies the CPU 201 of the posture information.
Now, a display control process for setting an extraction reference position (viewpoint position) used to extract (select) a partial view angle area from VR content acquired by image capturing with the digital camera 100 will be described with reference to Figs. 3A to 3C and Fig. 4. Here, the VR content may be an omnidirectional image or an omnidirectional panoramic image. The VR content may also be an image captured by a camera that can capture an object in a field of view (angle of view) of 180° or less in the horizontal direction (horizontal angle, azimuth angle) and 180° or less in the vertical direction centered on the horizontal direction. The VR content may be a still image or a moving image as long as an image (frame image) corresponds to each reproduction time (frame). In the following description, it is assumed that the VR content is moving image content including a plurality of frame images. Each frame image corresponds to a reproduction time in the reproduction period (reproduction duration) of the VR content. The VR content is reproduced by sequentially displaying the plurality of frame images on a screen according to the reproduction time.
Data about the extraction reference position associated with each frame (each reproduction time) of VR content has been stored in the position setting data. For example, as shown in fig. 3C, the position setting data is data in a table format. The position setting data is stored in the memory 202. In the position setting data, a predetermined initial value (for example, the center position of VR content) is associated with each frame as an extraction reference position even in the case where the extraction reference position is not specified by the user.
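A minimal sketch of the position setting data described above, assuming the extraction reference position is held as two-dimensional coordinates and the content center is used as the predetermined initial value (both are assumptions for illustration):

```python
def make_position_setting_data(num_frames, initial_position=(0.0, 0.0)):
    """Create position setting data: one extraction reference position per frame.

    Frames are numbered f1..f{num_frames}. Every frame starts with a
    predetermined initial value (here a nominal content center).
    """
    return {frame: initial_position for frame in range(1, num_frames + 1)}

# Example: 100-frame VR content (frames f1 to f100), all at the initial value.
position_setting_data = make_position_setting_data(100)
position_setting_data[10] = (30.0, -5.0)   # a user-specified position for frame f10
```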
Display control screen
Fig. 3A is a display control screen 300 displayed on the display 205 of the display control apparatus 200 for performing display control processing.
The display control screen 300 includes a reproduction display area 301 and a timeline area 302. The reproduction display area 301 displays a view angle area generated by extracting (selecting) a range (a part of a frame image) centered on the extraction reference position from a frame image (one frame image at the current reproduction time) of VR content currently being reproduced. The timeline area 302 corresponds to a reproduction period of VR content and represents temporal variations (variations according to the passage of reproduction time) of VR content in at least a portion of the reproduction period.
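How the view angle area is extracted around the extraction reference position is not spelled out in the patent; the following is a hedged sketch that simply crops an equirectangular frame around the reference position, wrapping horizontally. The frame format, crop size, and pixel-coordinate reference position are assumptions.

```python
import numpy as np

def extract_view_region(frame, center_xy, view_w, view_h):
    """Crop a view angle region of size (view_w, view_h) centered on center_xy.

    `frame` is an equirectangular image as an H x W x 3 array. The crop
    wraps around horizontally (the image is seamless in azimuth) and is
    clamped vertically. Illustrative only; not the patent's actual
    extraction processing.
    """
    h, w = frame.shape[:2]
    cx, cy = int(center_xy[0]), int(center_xy[1])
    xs = np.arange(cx - view_w // 2, cx + view_w // 2) % w          # wrap in azimuth
    ys = np.clip(np.arange(cy - view_h // 2, cy + view_h // 2), 0, h - 1)
    return frame[np.ix_(ys, xs)]
```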
In the reproduction display area 301, a frame image corresponding to the reproduction time of VR content is displayed, and if an extraction reference position specified by the user (the extraction reference position specified by the user is hereinafter referred to as a "specified position") is set for the frame image (frame), an indicator 303 is also displayed at the specified position. Instead of setting one designated position in one frame (frame image), the user may set one common designated position for one period (block) made up of a plurality of continuous frame images. In a period in which one specified position is set (referred to as a "specified period"), extraction reference positions associated with all frames included in the specified period are fixed to the specified position.
Here, in Embodiment 1, the indicator 303 is displayed at the center position of the extracted image (view angle region), but it may instead be displayed at the coordinates of a plurality of positions, such as two opposite corners of the view angle region. The indicator 303 may be any display item as long as the item can indicate, in the reproduction display area 301, that the frame currently being reproduced is included in a specified period.
In the timeline area 302, for example, one frame (reproduction time) is selected from VR content for every predetermined number of frames (predetermined period of time), and these images (frame images) of the selected plurality of frames are displayed in the order of frames (in the order of reproduction time). By arranging the images of the plurality of frames like this, it is possible to express temporal variation of VR content in at least a part of the reproduction period. In the timeline area 302, for an image of a selected frame, only a view angle area centered on an extraction reference position associated with the frame is displayed. In the timeline area 302, the more right the display position is, the more rearward frame images are displayed. In other words, in the timeline area 302, the more right the display position is, the later the period of reproduction time is.
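A small sketch of how frames could be sampled for the timeline area, pairing each selected frame with its extraction reference position; the sampling step stands in for the "predetermined number of frames" and is an assumption:

```python
def timeline_entries(position_setting_data, step):
    """Select one frame out of every `step` frames, in reproduction order.

    Each entry pairs a frame number with its extraction reference position;
    only the view angle region centered on that position would be drawn in
    the timeline area.
    """
    frames = sorted(position_setting_data)
    return [(frame, position_setting_data[frame]) for frame in frames[::step]]

# Example with a 100-frame content whose positions are all at a nominal center.
position_setting_data = {frame: (0.0, 0.0) for frame in range(1, 101)}
entries = timeline_entries(position_setting_data, 10)   # f1, f11, ..., f91
```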
In the case where there is a period (designated period) in which the extraction reference position is set in the timeline area 302, the designated period is displayed so that the period can be distinguished from a period (referred to as "unspecified period") in which the extraction reference position is not set. In embodiment 1, the display control apparatus 200 fills the specified period 304 as shown in fig. 3A such that the frame image corresponding to the specified period 304 appears transparent, but the specified period 304 may be highlighted by a different method. The display control apparatus 200 may highlight the non-designated time period instead of the designated time period 304.
Further, on the display control screen 300, the display control apparatus 200 may display an add button 305, the add button 305 being used to set a specified position for a frame (reproduction frame) being displayed and reproduced in the reproduction display area 301. In the case where a specified position is set in the reproduction frame in the reproduction display area 301 on the display control screen 300, the display control apparatus 200 may display the copy button 306 to reflect the specified position in other frames. For example, a specified period of time is set in the timeline area 302 according to a user operation, and by touching the copy button 306, a specified position is copied and set for a plurality of frames (frame images) included in the specified period of time 304.
Here, it is assumed that a designated position is set for a frame (reproduction frame) being displayed and reproduced in the reproduction display area 301 according to a user operation, and different designated positions are set for different reproduction frames. In this case, for a plurality of frames (frame images) existing between the two frames, positions of the frames on a trajectory connecting the specified positions of the two frames (frame images) are calculated for each frame, and the positions are automatically set.
The display control apparatus 200 may also display a generation button 307 on the display control screen 300, the generation button 307 being for generating a moving image by extracting a viewing angle region of each frame from VR content. By touching various buttons (an add button 305, a copy button 306, a generate button 307) displayed on the display 205, the user can send instructions corresponding to the various buttons to the display control apparatus 200. Further, after a plurality of specified periods are selected, upon pressing a specific button, the display control apparatus 200 (CPU 201) may control the durations of the selected plurality of specified periods to be the same length. Further, after a plurality of specified periods are selected, when a specific button is pressed, the CPU 201 may control specified positions associated with the selected plurality of specified periods to the same position.
Further, for example, the range of the specified period 304 may be changeable according to a pinch operation (pinch-in or pinch-out) performed on the specified period 304 displayed in the timeline area 302. Further, the designated position associated with the frame of the frame image being displayed in the reproduction display area 301 may be changeable according to a Touch-Move performed on the indicator 303. In this case, when the specified position associated with the frame is changed, the CPU 201 changes the specified position in the same manner for all frames in the specified period to which the frame belongs.
Display control processing
Fig. 4 is a flowchart showing the display control process. In the case where it is determined that a predetermined operation has been performed on the operation unit 206, the processing of this flowchart starts. The predetermined operation here may be an operation of pressing the add button 305 to set a new designated position. The predetermined operation may also be an operation of pressing the copy button 306 to set information on a specified position, which has been set for a certain period (frame), to another period (frame). The predetermined operation may also be a pinch operation performed on a specified period 304 displayed in the timeline area 302, or a Touch-Move performed on the indicator 303.
The processing of this flowchart is realized by the CPU 201 executing a program stored in the nonvolatile memory 203.
Hereinafter, it is assumed that the VR content is a moving image of 100 frames whose reproduction period runs from frame f1 to frame f100. In other words, the start frame of the VR content is frame f1 and the end frame of the VR content is frame f100.
In step S401, the CPU 201 acquires data (operation data) according to a user operation. Here, the operation data includes data related to a specified period of time (a period of time during which the extraction reference position is maintained) and data related to a specified position associated with the specified period of time. The first frame (initial frame) in the specified period is referred to as the "start frame", and the last frame in the specified period is referred to as the "end frame". Since the range of the specified period can be identified thereby, the data of the specified period may be data relating to both the start frame and the end frame.
For example, if the user performs a touch movement on the pointer 303 and changes the specified position in the reproduction frame (the frame image currently in the reproduction display area 301), the CPU 201 acquires data related to the updated specified position. Then, when the user touches the copy button 306 in the timeline area, data related to a specified period of time from the start frame to the end frame corresponding to the frame image is acquired.
In step S401, a plurality of operation data may be acquired. For example, in the case where an operation to change the specified position and the specified period of time associated with VR content in batch (for example, an operation to set the specified positions associated with a plurality of specified periods of time to the same position) is performed, the CPU 201 acquires a plurality of pieces of operation data.
In steps S402 to S415 below, the CPU 201 repeats the processing independently for each piece of operation data acquired in step S401. The operation data currently being processed is hereinafter referred to as "target data". The target data is operation data whose specified period is "start frame ft to end frame ft'".
In step S402, the CPU 201 stores the target data in the specified data group. As shown in Fig. 3B, the specified data group here may include one or more pieces of specified data (for example, the specified data 311 to 313 in the case of Fig. 3B). Each piece of specified data includes data related to a specified position and data related to a specified period during which the extraction reference position is maintained at that specified position. Since the specified position and the specified period are associated with each other and stored as specified data in the specified data group, the data is set such that the extraction reference position is maintained (fixed) during the specified period. The specified data group may be data in a data table format (specified data table) as shown in Fig. 3B, or may be data in an array format (specified data array).
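A minimal sketch of the specified data group, assuming each piece of specified data holds a start frame, an end frame, and a two-dimensional specified position; the concrete values below are illustrative, not those of Fig. 3B:

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class SpecifiedData:
    start_frame: int                  # first frame of the specified period
    end_frame: int                    # last frame of the specified period
    position: Tuple[float, float]     # specified extraction reference position (kept fixed)

# Example specified data group (values are illustrative).
specified_data_group: List[SpecifiedData] = [
    SpecifiedData(start_frame=10, end_frame=15, position=(30.0, -5.0)),
    SpecifiedData(start_frame=20, end_frame=25, position=(90.0, 10.0)),
]
```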
In step S403, the CPU 201 determines whether the specified data group includes previous data. Here, "previous data" refers to specified data whose start frame is a frame before the start frame ft of the target data. For example, if the start frame ft of the target data is frame f14, the specified data group shown in Fig. 3B includes the specified data 311 (whose start frame f10 precedes the start frame ft) as previous data. If it is determined that the specified data group includes previous data, the process advances to step S404. If it is determined that the specified data group does not include previous data, the process advances to step S408.
In step S404, the CPU 201 determines whether any of the specified periods of the previous data included in the specified data group overlaps with the specified period of the target data. Here, "two specified periods overlap" refers to a state in which the two specified periods share at least one frame. For example, the specified period of frames f5 to f10 and the specified period of frames f10 to f15 share frame f10, and thus these specified periods overlap. If the specified period of the target data is frames f14 to f18 and the previous data is the specified data 311 (whose specified period is f10 to f15), the two specified periods overlap. If it is determined that any of the specified periods of the previous data overlaps with the specified period of the target data, the process advances to step S405. If it is determined that none of the specified periods of the previous data overlaps with the specified period of the target data, the process advances to step S406.
The process may proceed to step S405 not only in the case where one of the specified periods of the previous data overlaps with the specified period of the target data, but also in the case where one of them is continuous with the specified period of the target data. For example, the specified period of frames f5 to f9 and the specified period of frames f10 to f15 are continuous because no other frame exists between the two specified periods.
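The overlap and continuity checks of steps S403 and S404 can be sketched as follows, treating a specified period as an inclusive frame range (a sketch, not the patent's implementation):

```python
def periods_overlap(start_a, end_a, start_b, end_b):
    """True if two specified periods share at least one frame.

    Example: frames f5-f10 and f10-f15 overlap because they share f10.
    """
    return start_a <= end_b and start_b <= end_a

def periods_continuous(start_a, end_a, start_b, end_b):
    """True if no frame lies between the two specified periods.

    Example: frames f5-f9 and f10-f15 are continuous because f9 is
    immediately followed by f10.
    """
    return end_a + 1 == start_b or end_b + 1 == start_a
```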
In step S405, the CPU 201 shifts the start frame ft of the target data backward by a predetermined number of frames to eliminate the overlap (or continuity) between the specified period of the target data and the specified period of the previous data. For example, in the case where the specified period of the target data, frames f14 to f20, overlaps with the specified period of the previous data (specified data 311), frames f10 to f15, the CPU 201 shifts the start frame ft of the target data backward by three frames. In other words, the CPU 201 changes the start frame ft of the target data to frame f17. The CPU 201 then updates the specified data corresponding to the target data stored in the specified data group according to the target data whose start frame (specified period) has been changed.
Alternatively, the CPU 201 may shift the end frame of the specified period of the previous data that overlaps (or is continuous) with the specified period of the target data ("overlapping previous data") forward by a predetermined number of frames. The overlapping previous data is stored in the specified data group, and thus the specified period (end frame) of the overlapping previous data in the specified data group is updated.
Here, which of the two is changed (the start frame ft of the target data or the end frame of the overlapping previous data) and the predetermined number of frames may be determined in advance or may be set by an operation. The CPU 201 may keep shifting the start frame ft of the target data or the end frame of the overlapping previous data until the overlap (or continuity) between the specified period of the target data and the specified period of the previous data is eliminated. In Embodiment 1, the CPU 201 changes either one of the two specified periods, but both periods may be changed, or neither may be changed. Further, the CPU 201 may perform control such that the user cannot set operation data (target data) that overlaps (or is continuous) with any of the specified data in the specified data group.
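A sketch of the step S405 variant that shifts the start frame of the target data backward (later) until the overlap or continuity is eliminated, reusing the helper checks from the previous sketch; the shift step of three frames follows the example above but is otherwise an assumption:

```python
def resolve_overlap(target, previous, step=3):
    """Shift the target's start frame later until its specified period no
    longer overlaps with, and is not continuous with, the previous one.

    `target` and `previous` are SpecifiedData-like objects. In the example
    of Embodiment 1, a shift of three frames moves the start frame from
    f14 to f17 when the previous specified period is f10 to f15.
    """
    while (periods_overlap(target.start_frame, target.end_frame,
                           previous.start_frame, previous.end_frame)
           or periods_continuous(target.start_frame, target.end_frame,
                                 previous.start_frame, previous.end_frame)):
        target.start_frame += step
    return target
```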
In step S406, the CPU 201 calculates (determines) the extraction reference position of each frame ("intermediate frame") between the end frame of the overlapping previous data and the start frame ft of the target data. Here, the CPU 201 calculates, for example, the extraction reference position of each intermediate frame such that, from the end frame of the overlapping previous data to the start frame ft (with the lapse of reproduction time), the extraction reference position moves at a constant speed on a trajectory connecting the specified position of the overlapping previous data and the specified position of the target data. Alternatively, the CPU 201 may calculate the extraction reference position of each intermediate frame such that the extraction reference position moves at a constant speed on this trajectory from the first intermediate frame to the last intermediate frame (with the lapse of reproduction time). The moving speed of the extraction reference position is not limited to a constant speed and may be arbitrarily set by the user.
In step S407, the CPU 201 stores each extraction reference position of the intermediate frame calculated in step S406 in the position setting data. In other words, the CPU 201 updates each extraction reference position of the intermediate frame in the position setting data to each extraction reference position calculated in step S406.
In step S408, the CPU 201 calculates (determines) the extraction reference position of each frame from the start frame f1 of the VR content until the frame ft-1 (the frame immediately before the start frame ft of the target data). Here, the extraction reference position of each frame may be set as a specified position of the target data, or may be an extraction reference position of the frame f1 stored in advance in the position setting data. The CPU 201 may calculate each extraction reference position of the frames f1 to ft-1, respectively, such that the extraction reference position moves from the extraction reference position of the frame f1 stored in the position setting data to a specified position of the target data at a constant speed from the frames f1 to ft-1. Then, the CPU 201 updates each extraction reference position of the frames f1 to ft-1 in the position setting data as the calculated extraction reference position, respectively.
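As an illustration of the constant-speed calculation in steps S406 and S408, the sketch below linearly interpolates the extraction reference position between two known positions. The dictionary-based position setting data and the coordinate values are assumed representations used only for this example.

```python
def interpolate_positions(start_frame, end_frame, start_pos, end_pos):
    """Return {frame: (x, y)} for the frames strictly between start_frame and
    end_frame, so that the extraction reference position moves at a constant
    speed along the straight trajectory from start_pos to end_pos."""
    positions = {}
    span = end_frame - start_frame
    for frame in range(start_frame + 1, end_frame):
        t = (frame - start_frame) / span  # fraction of the trajectory covered
        positions[frame] = (start_pos[0] + (end_pos[0] - start_pos[0]) * t,
                            start_pos[1] + (end_pos[1] - start_pos[1]) * t)
    return positions

# Example: the previous data ends at f15 with specified position (100, 100) and
# the target data starts at f17 with specified position (200, 200).
print(interpolate_positions(15, 17, (100, 100), (200, 200)))  # {16: (150.0, 150.0)}
```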
In step S409, the CPU 201 determines whether the specified data group includes subsequent data. Here, "subsequent data" refers to specified data whose end frame is a frame after the end frame ft' of the target data. For example, if the end frame ft' of the target data is the frame f30, the specified data group shown in fig. 3B includes the specified data 313 (specified data whose end frame f33 follows the end frame ft') as the subsequent data. If it is determined that the specified data group includes the subsequent data, the process advances to step S410. If it is determined that the specified data group does not include the subsequent data, the process advances to step S414.
In step S410, the CPU 201 determines whether any of the specified periods of the subsequent data included in the specified data group overlaps with the specified period of the target data. If it is determined that any of the specified periods of the subsequent data overlaps with the specified period of the target data, the process advances to step S411. If it is determined that none of the specified periods of the subsequent data overlaps with the specified period of the target data, the process advances to step S412.
The process may proceed to S411 not only in the case where any of the specified periods of the subsequent data overlaps with the specified period of the target data but also in the case where any of the specified periods of the subsequent data is continuous with the specified period of the target data.
In step S411, the CPU 201 shifts the end frame ft' of the target data forward by a predetermined number of frames to eliminate the overlap (or continuity) between the specified period of the target data and the specified period of the subsequent data. Here, the CPU 201 updates the specified data (the specified data corresponding to the target data) stored in the specified data group according to the target data whose end frame ft' (specified period) has been changed.
Alternatively, the CPU 201 may shift the start frame of the specified period of the subsequent data that overlaps with the specified period of the target data ("overlapping subsequent data") backward by a predetermined number of frames. The overlapping subsequent data is stored in the specified data group, and thus the specified period (start frame) of the overlapping subsequent data in the specified data group is updated.
Here, the target to be changed (the end frame ft' of the target data or the start frame of the overlapping subsequent data) and the predetermined number of frames may be determined in advance or may be set by an operation. In embodiment 1, the CPU 201 changes either one of the two specified periods, but both periods may be changed instead, or neither period may be changed. Further, the CPU 201 may perform control such that the user cannot set operation data (target data) that overlaps (or is continuous) with any of the specified data in the specified data group.
In step S412, the CPU 201 calculates the extraction reference position of each frame (intermediate frame) between the end frame ft' of the target data and the start frame of the overlapping subsequent data. Here, the CPU 201 calculates the extraction reference position of each intermediate frame such that, from the end frame ft' to the start frame of the overlapping subsequent data, the extraction reference position moves at a constant speed on a trajectory connecting the specified position of the target data and the specified position of the overlapping subsequent data. Alternatively, the CPU 201 may calculate the extraction reference position of each intermediate frame such that the extraction reference position moves at a constant speed on this trajectory from the first intermediate frame to the last intermediate frame (with the lapse of reproduction time). The moving speed of the extraction reference position may be arbitrarily set by the user.
In step S413, the CPU 201 stores each extraction reference position of the intermediate frame calculated in step S412 in the position setting data. In other words, the CPU 201 updates each extraction reference position of the intermediate frame in the position setting data to each extraction reference position calculated in step S412.
In step S414, the CPU 201 calculates the extraction reference position of each frame from the frame ft'+1 of the VR content (the frame following the end frame ft' of the target data) to the end frame f100. Here, the extraction reference position of each frame may be the specified position of the target data, or may be the extraction reference position of the frame f100 stored in advance in the position setting data. The CPU 201 may also calculate the extraction reference positions of the frames ft'+1 to f100 such that, from the end frame ft' to the frame f100, the extraction reference position moves at a constant speed on a trajectory connecting the specified position of the target data to the extraction reference position of the frame f100 stored in advance. Then, the CPU 201 updates each extraction reference position of the frames ft'+1 to f100 in the position setting data to the calculated extraction reference position.
In step S415, the CPU 201 updates the extraction reference position of each frame from the start frame ft to the end frame ft' in the position setting data to the specified position of the target data. For example, assume that the specified period of the target data is the frames f14 to f18 and the specified position of the target data is the coordinates (200, 200). In this case, the CPU 201 updates the extraction reference positions associated with the frames f14 to f18 in the position setting data to the coordinates (200, 200).
When the processing up to step S415 ends for the current target data, if operation data for which the processing in steps S402 to S415 has not yet been executed still remains, the CPU 201 sets one of the remaining operation data as the new target data. Then, the CPU 201 executes the processing in steps S402 to S415 on the new target data. In this way, the CPU 201 executes the processing in steps S402 to S415 on all the acquired operation data.
In step S416, based on the specified data group and the position setting data, the CPU 201 updates the display of the reproduction display area 301 and the pointer 303 on the display control screen 300 displayed on the display 205. For example, the CPU 201 acquires the extraction reference position of the reproduction frame from the position setting data, and displays, in the reproduction display area 301, the view angle region of the VR content corresponding to that extraction reference position. The CPU 201 may first perform zenith correction on the frame image of each reproduction frame included in the specified period, and then display the reproduction image. Thus, even if the posture of the image capturing apparatus changes while the VR content is captured (particularly during the image capturing corresponding to the specified period), the view angle region at the same position on the celestial sphere can be reproduced and displayed. Zenith correction may be performed on the frame images of all the reproduction frames of the VR content when a specified period exists in the VR content, and may be omitted when no specified period exists. In any case, it is preferable to perform zenith correction on the frame image of each reproduction frame included in the specified period.
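A rough sketch of zenith correction for one equirectangular frame is shown below. The roll/pitch attitude inputs, the rotation and sign conventions, and the use of NumPy/OpenCV remapping are assumptions made for illustration only and do not reflect the actual implementation in the embodiment.

```python
import numpy as np
import cv2

def zenith_correct(equirect, roll_deg, pitch_deg):
    """Resample an equirectangular frame so that its zenith points straight up,
    given the camera attitude (sketch; conventions depend on the sensor)."""
    h, w = equirect.shape[:2]
    # Longitude/latitude of every output pixel.
    lon = (np.arange(w) / w - 0.5) * 2 * np.pi
    lat = (0.5 - (np.arange(h) + 0.5) / h) * np.pi
    lon, lat = np.meshgrid(lon, lat)
    # Unit direction vectors on the celestial sphere.
    vec = np.stack([np.cos(lat) * np.cos(lon),
                    np.cos(lat) * np.sin(lon),
                    np.sin(lat)], axis=-1)
    # Rotate the sampling directions by the camera attitude (pitch about x,
    # roll about y), so that sampling undoes the tilt of the capture.
    r, p = np.deg2rad(roll_deg), np.deg2rad(pitch_deg)
    rot_pitch = np.array([[1, 0, 0],
                          [0, np.cos(p), -np.sin(p)],
                          [0, np.sin(p), np.cos(p)]])
    rot_roll = np.array([[np.cos(r), 0, np.sin(r)],
                         [0, 1, 0],
                         [-np.sin(r), 0, np.cos(r)]])
    vec = vec @ (rot_roll @ rot_pitch).T
    # Convert the rotated directions back to source pixel coordinates.
    src_lon = np.arctan2(vec[..., 1], vec[..., 0])
    src_lat = np.arcsin(np.clip(vec[..., 2], -1.0, 1.0))
    map_x = (((src_lon / (2 * np.pi) + 0.5) % 1.0) * w).astype(np.float32)
    map_y = np.clip((0.5 - src_lat / np.pi) * h, 0, h - 1).astype(np.float32)
    return cv2.remap(equirect, map_x, map_y, cv2.INTER_LINEAR)
```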
In the case where the reproduction frame is included in the specified period within any one of the specified data group, the CPU 201 displays the pointer 303. In the case where the reproduction frame is not included in the specified period within any one of the specified data group, the CPU 201 does not display the pointer 303. The CPU 201 updates the display of the specified period in the timeline area 302 based on the specified data group.
Moving image generation processing
Now, with reference to fig. 5, a moving image generation process, which generates a moving image extracted (selected) at a part of the view angle from VR content for which a specified period has been set by the display control process of fig. 4, will be described. Fig. 5 is a flowchart of the moving image generation process.
In the case where it is determined that the generation button 307 (request to generate a moving image extracted at a part of the view angle region from VR content) is pressed, the processing in the flowchart of fig. 5 starts. The processing of this flowchart is realized by the CPU 201 executing a program stored in the nonvolatile memory 203.
Here, the CPU 201 repeatedly executes the processing in steps S501 and S502 one frame at a time, from the start frame f1, for the number of frames of the VR content. In other words, the CPU 201 executes the processing in steps S501 and S502 for each frame from the start frame f1 to the end frame f100. Hereinafter, the frame to be processed in steps S501 and S502 is referred to as a "target frame".
In step S501, the CPU 201 acquires data related to the extraction reference position of the target frame from the position setting data.
In step S502, based on the data on the extraction reference position acquired in step S501, the CPU 201 extracts (selects) a part of the view angle region from the target frame (frame image of the target frame) of the VR content. The CPU 201 stores the extracted view angle region as an extracted image in the memory 202.
At the end of the processing in steps S501 and S502, the CPU 201 performs the processing in steps S501 and S502 with the next frame of the target frame as a new target frame unless the target frame is the last frame. If the target frame is the last frame, the CPU 201 performs the processing in step S503.
In step S503, the CPU 201 acquires an extracted image of each frame of VR content from the memory 202. Then, the CPU 201 generates a moving image by concatenating all the acquired extracted images in the order of frames (in the order of reproduction time).
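The sketch below condenses steps S501 to S503 into a single routine. The list-of-frames input, the {frame: (x, y)} position setting data, the output parameters, and the plain crop standing in for the true view-angle extraction are assumptions for illustration (a perspective-projection extraction is sketched later in this description).

```python
import cv2

def extract_view_region(frame, ref_pos, size=(640, 480)):
    """Simplified stand-in for view-angle extraction: crop a fixed-size region
    centred on the extraction reference position."""
    x, y = int(ref_pos[0]), int(ref_pos[1])
    w, h = size
    top = max(0, min(frame.shape[0] - h, y - h // 2))
    left = max(0, min(frame.shape[1] - w, x - w // 2))
    return frame[top:top + h, left:left + w]

def generate_extracted_movie(vr_frames, position_setting_data, out_path, fps=30.0):
    # Steps S501-S502: extract one view-angle region per frame (frames keyed f1..fN).
    extracted = [extract_view_region(frame, position_setting_data[i])
                 for i, frame in enumerate(vr_frames, start=1)]
    # Step S503: concatenate the extracted images in reproduction order.
    h, w = extracted[0].shape[:2]
    writer = cv2.VideoWriter(out_path, cv2.VideoWriter_fourcc(*"mp4v"), fps, (w, h))
    for image in extracted:
        writer.write(image)
    writer.release()
```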
Speed control process
As described above, in the case where a specified position is not set for a certain period (non-specified period), in steps S406 and S412 of the display control process of fig. 4 the CPU 201 calculates the extraction reference positions for that non-specified period based on the specified positions of the specified periods before and after the non-specified period. Further, in the non-specified period, the extraction reference position may be calculated such that it moves from a certain position (coordinates) to another position as the reproduction time elapses. Hereinafter, a speed control process to control the speed at which the extraction reference position moves in the non-specified period will be described with reference to figs. 6A to 6D and fig. 7.
Fig. 6A shows a display control screen 600 displayed on the display 205 for performing the speed control process. The display control screen 600 in fig. 6A has substantially the same display configuration as the display control screen 300 shown in fig. 3A. Therefore, only the display elements relevant to the speed control process will be described below.
On the display control screen 600, a timeline area 602 that displays the temporal change of the VR content (the change with the passage of reproduction time) is displayed. In the timeline area 602, a specified period 604 for which a specified position is set (that is, for which the specified position is maintained) is highlighted. The non-specified periods may be highlighted instead of the specified period 604.
In the timeline area 602, a moving speed graph 608 (a graph representing the moving speed of the extraction reference position in a non-specified period) is displayed superimposed on the non-specified period. In the case where the moving speed of the extraction reference position in the non-specified period is a constant speed, the moving speed graph 608 is represented by a straight diagonal line, as shown in fig. 6A. In the case where the moving speed of the extraction reference position in the non-specified period is not a constant speed, the moving speed graph 608 is represented by a curve, as shown in figs. 6B to 6D. A different display method representing the moving speed of the extraction reference position in the non-specified period may be used instead of the moving speed graph. The moving speed graph 608 may be displayed on all the non-specified periods displayed in the timeline area 602, or may be displayed only on a non-specified period selected by the user.
Fig. 7 is a flowchart of the speed control process to control the moving speed of the extraction reference position based on the moving speed graph. When it is determined that an operation to select one of the moving speed graphs has been performed, the speed control process starts. The operation to select a moving speed graph may be a touch on the graph or a press of a designated button for selecting it. The processing of this flowchart is realized by the CPU 201 executing a program stored in the nonvolatile memory 203.
In step S701, the CPU 201 acquires data on the selected moving speed graph and data on the non-specified period on which the moving speed graph is superimposed.
In step S702, the CPU 201 calculates the extraction reference position in each frame of the non-specified period based on the shape of the moving speed graph and the non-specified period. For example, in the case where a moving speed graph having the shape shown in fig. 6A is acquired, the CPU 201 determines the extraction reference position of each frame so that the moving speed of the extraction reference position in the non-specified period becomes a constant speed.
Further, in the case where a moving speed graph having one of the shapes shown in figs. 6B to 6D is acquired, the CPU 201 sets the moving speed of the extraction reference position higher, according to the inclination, in portions of the non-specified period where the inclination of the graph is large. Conversely, the CPU 201 sets the moving speed lower, according to the inclination, in portions where the inclination of the graph is small. Then, the CPU 201 determines the extraction reference position of each frame by adjusting the moving speed of the extraction reference position so as not to exceed the number of frames in the non-specified period.
For example, the moving speed graph shown in fig. 6B shows that the moving speed of the extraction reference position is fastest in the first (initial) frame of the non-specified period and sequentially decreases in the following frames. On the other hand, the moving speed graph shown in fig. 6C shows that the moving speed of the extraction reference position is slowest in the first (initial) frame of the non-specified period and sequentially increases in the following frames, which is the opposite of the case of fig. 6B. The moving speed graph shown in fig. 6D shows the moving speed becoming faster or slower according to the shape of the graph. The user can freely change the shape of the moving speed graph by performing a touch-move on the graph.
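As an illustration of step S702, the sketch below derives per-frame extraction reference positions from a moving speed graph by accumulating per-frame speed values and normalising the accumulated amount over the trajectory. The list-based graph representation and the sample values are assumptions made for this example.

```python
def positions_from_speed_graph(frames, speeds, start_pos, end_pos):
    """frames: frame indices of the non-specified period, in reproduction order.
    speeds: one non-negative speed sample per frame (the height of the graph)."""
    total = sum(speeds)
    positions = {}
    travelled = 0.0
    for frame, speed in zip(frames, speeds):
        travelled += speed
        t = travelled / total if total > 0 else 0.0  # fraction of the trajectory covered
        positions[frame] = (start_pos[0] + (end_pos[0] - start_pos[0]) * t,
                            start_pos[1] + (end_pos[1] - start_pos[1]) * t)
    return positions

# A flat graph (fig. 6A) yields a constant speed; a decaying graph (fig. 6B)
# moves quickly in the first frames and slows down afterwards.
print(positions_from_speed_graph([16, 17, 18, 19], [4, 3, 2, 1], (100, 100), (200, 200)))
```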
In step S703, the CPU 201 updates the extraction reference position of each frame of the unspecified period in the position setting data to the extraction reference position of each frame calculated in S702.
In step S704, the CPU 201 updates the display of the reproduction display area on the display control screen 600 displayed on the display 205 based on the specified data group and the position setting data, as in step S416. The CPU 201 also updates the display of the timeline area 602 based on the specified data group.
In embodiment 1, an example in which the display control apparatus 200 performs display control processing, moving image generation processing, and speed control processing is described. However, the digital camera 100 may include at least a part of the structure described in the description of the display control apparatus 200, and perform these processing steps.
The CPU 201 may extract the viewing angle region after performing perspective projection transformation (coordinate transformation such that a near object appears large and a far object appears small as in the case of human vision) on VR content and mapping the transformed VR content on a plane. The CPU 201 can extract the view angle region without performing such image processing on VR contents. Instead of VR contents, the present embodiment is also applicable to images having a wide area (angle of view) or the like.
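The sketch below shows one common way of extracting a perspective view-angle region from an equirectangular frame. The yaw/pitch extraction reference direction, the field of view, and the output size are illustrative assumptions rather than values used in the embodiment.

```python
import numpy as np
import cv2

def extract_perspective(equirect, yaw_deg, pitch_deg, fov_deg=90.0, out_size=(640, 480)):
    """Render the view-angle region seen in the extraction reference direction
    as a perspective (pinhole) image."""
    src_h, src_w = equirect.shape[:2]
    out_w, out_h = out_size
    f = 0.5 * out_w / np.tan(np.deg2rad(fov_deg) / 2)  # focal length in pixels
    # Ray direction of each output pixel in the camera frame (z points forward).
    xs, ys = np.meshgrid(np.arange(out_w) - out_w / 2, np.arange(out_h) - out_h / 2)
    dirs = np.stack([xs, ys, np.full_like(xs, f)], axis=-1)
    dirs /= np.linalg.norm(dirs, axis=-1, keepdims=True)
    # Rotate the rays toward the extraction reference direction.
    yaw, pitch = np.deg2rad(yaw_deg), np.deg2rad(pitch_deg)
    rot_pitch = np.array([[1, 0, 0],
                          [0, np.cos(pitch), -np.sin(pitch)],
                          [0, np.sin(pitch), np.cos(pitch)]])
    rot_yaw = np.array([[np.cos(yaw), 0, np.sin(yaw)],
                        [0, 1, 0],
                        [-np.sin(yaw), 0, np.cos(yaw)]])
    dirs = dirs @ (rot_yaw @ rot_pitch).T
    # Map each ray to equirectangular (longitude/latitude) pixel coordinates.
    lon = np.arctan2(dirs[..., 0], dirs[..., 2])
    lat = np.arcsin(np.clip(dirs[..., 1], -1.0, 1.0))
    map_x = (((lon / (2 * np.pi) + 0.5) % 1.0) * src_w).astype(np.float32)
    map_y = np.clip((lat / np.pi + 0.5) * src_h, 0, src_h - 1).astype(np.float32)
    return cv2.remap(equirect, map_x, map_y, cv2.INTER_LINEAR)
```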
As described above, according to the present embodiment, the user can easily set the specified period as the period for which the extraction reference position is maintained. Further, the designated time period and the non-designated time period are displayed differently in the timeline area, so the user can easily recognize the designated time period, and can easily set or change the designated time period.
This makes it easy for the user to set the extraction reference position to a desired position. As a result, the user can easily select a part of the area (area to be extracted) from the content.
As long as an area in a range that exists in a direction serving as the extraction reference in the virtual space can be extracted (selected) from the VR content, an "extraction reference direction" may be used instead of the "extraction reference position". In this case, as the specified direction, the user specifies a direction to be maintained over the specified period, in place of the specified position. In this way, any reference may be used instead of the "extraction reference position" as long as the reference can be used to determine the view angle region in the VR content.
According to the present invention, a user can easily select a part of an area of content.
Although the present invention has been described based on the preferred embodiments thereof, the present invention is not limited to these specific embodiments, and various modes within the scope not departing from the spirit of the present invention are also included in the present invention. Further, part of the above embodiments may be combined as necessary.
In the above description, "the process proceeds to step S1 in the case where a is B or more, and the process proceeds to step S2 if a is smaller (lower) than B" may be interpreted as "the process proceeds to step S1 in the case where a is greater (higher) than B, and the process proceeds to step S2 in the case where a is B or less". In contrast, "the process proceeds to step S1 in the case where a is greater (higher) than B, and proceeds to step S2 in the case where a is B or less" may be interpreted as "the process proceeds to step S1 in the case where a is B or more, and proceeds to step S2 in the case where a is less (lower) than B". In other words, "a or more" may be interpreted as "greater than (higher; longer; more)" than a, and "a or less" may be interpreted as "smaller than (lower; shorter; less)" than a, as long as no inconsistency occurs. Further, "greater (higher; longer; more)" than a may be interpreted as "a or more", and "smaller (lower; shorter; less)" than a may be interpreted as "a or less".
Other embodiments
The embodiments of the present invention can also be realized by supplying software (a program) that implements the functions of the above embodiments to a system or apparatus through a network or various storage media, and causing a computer, or a central processing unit (CPU), a micro processing unit (MPU), or the like, of the system or apparatus to read out and execute the program.
Although the present invention has been described with reference to exemplary embodiments, it is to be understood that the invention is not limited to the disclosed exemplary embodiments. The scope of the following claims is to be accorded the broadest interpretation so as to encompass all such modifications and equivalent structures and functions.

Claims (16)

1. An electronic device for reproducing moving image content, the electronic device comprising:
a display control unit configured to control to sequentially display a plurality of frame images including the moving image content on a screen, and to display a timeline area corresponding to a reproduction period of the plurality of frame images on the screen;
a calculation unit configured to calculate, for a third frame image existing between a first frame image and a second frame image, a position on a locus connecting a first reference position of the first frame image and a second reference position of the second frame image, wherein the first frame image and the second frame image are two frame images for which reference positions have been set, and a reference position of the third frame image has not yet been set;
A setting unit configured to automatically set the calculated position as a third reference position of the third frame image, and automatically set one fourth reference position specified according to an operation by a user so as to be maintained for a plurality of fourth frame images existing in a specified period of time specified in the timeline area according to the operation by the user; and
a control unit configured to control to display the specified period of time and the other period of time differently in the timeline area on the screen, and control to reproduce the moving image content by sequentially displaying areas corresponding to reference positions set for the plurality of frame images, respectively, on the screen.
2. The electronic device of claim 1, wherein,
there are a plurality of specified time periods in the moving image content, wherein the first frame image is a last frame image of a first specified time period, and the second frame image is a first frame image of a second specified time period after the first specified time period, and
the first reference position as a fourth reference position of the first frame image is different from the second reference position as a fourth reference position of the second frame image.
3. The electronic device according to claim 1 or 2, wherein,
the setting unit sets a reference position in one of the plurality of frame images displayed on the screen according to an operation by a user, and
the first frame image and the second frame image are displayed on the screen, and reference positions in the first frame image and the second frame image are set according to a user operation.
4. The electronic device according to claim 1 or 2, wherein,
in the case where the first frame image is present before the second frame image, the third reference position changes from the first reference position to the second reference position as the reproduction time of the moving image content passes.
5. The electronic device according to claim 1 or 2, wherein,
displaying a graph representing a speed of change of the reference position in the other time period on the other time period in the time line region, and
the setting unit determines a reference position of the frame image corresponding to each reproduction time in the other time period based on the shape of the graph.
6. The electronic device of claim 5, wherein a shape of the graph is changeable according to a user's operation.
7. The electronic device according to claim 1 or 2, wherein,
in a state where maintenance of the reference position is set for a plurality of specified time periods, the setting unit changes the lengths of at least two time periods selected by the user from the plurality of specified time periods to the same length when the user performs the first operation.
8. The electronic device according to claim 1 or 2, wherein,
the control unit selects a plurality of reproduction times by selecting one reproduction time for each predetermined time in a reproduction period of the moving image content, and
the control unit performs control such that areas respectively corresponding to the reproduction times of the plurality of reproduction times selected from the moving image content are arranged in the order of reproduction times and displayed in the timeline area.
9. The electronic device of claim 8, wherein,
the region corresponding to each of the plurality of reproduction times is a region corresponding to a reference position associated with each of the plurality of frames corresponding to the plurality of reproduction times.
10. The electronic device according to claim 1 or 2, wherein,
The control unit controls such that an area corresponding to a reference position of a current reproduction time, which is a part of the area of the moving image content, is displayed in a predetermined area, and
the control unit controls such that a predetermined display item is displayed in the area in the case where the current reproduction time is included in the specified period.
11. The electronic device according to claim 1 or 2, wherein,
the moving image content is a moving image of an omnidirectional image or a moving image of a panoramic image.
12. The electronic device according to claim 1 or 2, wherein,
the plurality of frame images including the moving image content are omnidirectional images, and
the electronic device further includes a correction unit configured to perform zenith correction for each frame image.
13. The electronic device of claim 12, wherein,
the zenith correction is performed on the fourth frame image of the plurality of frame images including the moving image content.
14. The electronic device according to claim 1 or 2, further comprising a generation unit configured to:
extracting, from the plurality of frame images including the moving image content, an area corresponding to a reference position set for each of the plurality of frame images, and
New moving image content including images of the extracted areas in time series is generated.
15. A control method of an electronic apparatus for reproducing moving image content, the control method comprising the steps of:
control is performed to sequentially display a plurality of frame images including the moving image content on a screen, and a timeline area corresponding to a reproduction period of the plurality of frame images is displayed on the screen;
calculating, for a third frame image existing between a first frame image and a second frame image, a position on a locus connecting a first reference position of the first frame image and a second reference position of the second frame image, wherein the first frame image and the second frame image are two frame images for which reference positions have been set, and a reference position of the third frame image has not yet been set;
automatically setting the calculated position as a third reference position of the third frame image;
for a plurality of fourth frame images existing in a specified period of time specified in the timeline area according to an operation by a user, one fourth reference position specified according to the operation by the user is automatically set so that the one fourth reference position is maintained;
Control is performed to display the specified time period and other time periods differently in the timeline area on the screen; and
control is performed to reproduce the moving image content by sequentially displaying areas corresponding to reference positions set for the plurality of frame images, respectively, on the screen.
16. A computer-readable storage medium storing a program which, when executed on a computer, causes the computer to function as the units of the electronic device according to any one of claims 1 to 14.
CN202310130508.2A 2022-02-18 2023-02-17 Electronic device, control method of electronic device, and storage medium Pending CN116634121A (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
JP2022-023654 2022-02-18
JP2022195646A JP2023121126A (en) 2022-02-18 2022-12-07 Electronic apparatus, method for controlling electronic apparatus, and program
JP2022-195646 2022-12-07

Publications (1)

Publication Number Publication Date
CN116634121A (en) 2023-08-22

Family

ID=87640547

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310130508.2A Pending CN116634121A (en) 2022-02-18 2023-02-17 Electronic device, control method of electronic device, and storage medium

Country Status (1)

Country Link
CN (1) CN116634121A (en)

Similar Documents

Publication Publication Date Title
CN110691187B (en) Electronic device, control method of electronic device, and computer-readable medium
CN111385470B (en) Electronic device, control method of electronic device, and computer-readable medium
CN110881097B (en) Display control apparatus, control method, and computer-readable medium
JP6978826B2 (en) Display control device and its control method, program, and storage medium
US11079898B2 (en) Electronic device for controlling display of VR image, control method of electronic device, and non-transitory computer readable medium
JP7267764B2 (en) ELECTRONIC DEVICE, ELECTRONIC DEVICE CONTROL METHOD, PROGRAM, AND STORAGE MEDIUM
JP2021174317A (en) Electronic apparatus and control method therefor
JP7204511B2 (en) Electronic device, electronic device control method, program
CN116634121A (en) Electronic device, control method of electronic device, and storage medium
US20230269360A1 (en) Electronic device and method for controlling electronic device
US11558599B2 (en) Electronic apparatus, control method for electronic apparatus, and non-transitory computer-readable storage medium
US11252328B2 (en) Electronic device and method for controlling the same
JP2023121126A (en) Electronic apparatus, method for controlling electronic apparatus, and program
US20230283844A1 (en) Information processing apparatus, control method of information processing apparatus, non-transitory computer readable medium, and system
CN110881102B (en) Image capturing apparatus, control method of image capturing apparatus, and computer readable medium
JP7086762B2 (en) Display control device
JP2020205554A (en) Display control device, control method of the same, program, and storage medium
JP2020191599A (en) Imaging apparatus and control method of the same
JP2019118058A (en) Electronic apparatus

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination