WO2014156257A1 - Display control device, display control method, and recording medium - Google Patents
- Publication number
- WO2014156257A1 (PCT/JP2014/051514)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- display control
- virtual object
- imaging
- captured image
- image
- Prior art date
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/222—Studio circuitry; Studio devices; Studio equipment
- H04N5/262—Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects ; Cameras specially adapted for the electronic generation of special effects
- H04N5/2628—Alteration of picture size, shape, position or orientation, e.g. zooming, rotation, rolling, perspective, translation
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/63—Control of cameras or camera modules by using electronic viewfinders
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/69—Control of means for changing angle of the field of view, e.g. optical zoom objectives or electronic zooming
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/222—Studio circuitry; Studio devices; Studio equipment
- H04N5/262—Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects ; Cameras specially adapted for the electronic generation of special effects
- H04N5/265—Mixing
Definitions
- the present disclosure relates to a display control device, a display control method, and a recording medium.
- in existing techniques, the ratio of the movement amount of the virtual object to the movement amount of the imaging range in real space is constant, independent of the imaging magnification.
- however, if the ratio of the movement amount of the virtual object to the movement amount of the imaging range in real space is constant, the usability for the user is unlikely to improve. Therefore, when displaying a virtual object corresponding to an imaging location, it is desirable to realize a technique that can improve the user's experience.
- according to the present disclosure, there is provided a display control device including: an image acquisition unit that acquires a captured image in which an imaging range is captured; and a display control unit that controls a display unit so that a virtual object corresponding to an imaging target shown in the captured image is displayed, wherein the display control unit displays the virtual object so that the ratio of the movement amount of the virtual object in a virtual image to the movement amount of the imaging range in real space changes according to the imaging magnification of the captured image.
- according to the present disclosure, there is also provided a display control method including: acquiring a captured image in which an imaging range is captured; controlling a display unit so that a virtual object corresponding to an imaging target shown in the captured image is displayed; and displaying the virtual object so that the ratio of the movement amount of the virtual object in a virtual image to the movement amount of the imaging range in real space changes according to the imaging magnification of the captured image.
- according to the present disclosure, there is further provided a computer-readable recording medium recording a program for causing a computer to function as a display control device including: an image acquisition unit that acquires a captured image in which an imaging range is captured; and a display control unit that controls a display unit so that a virtual object corresponding to an imaging target shown in the captured image is displayed, wherein the display control unit displays the virtual object so that the ratio of the movement amount of the virtual object in a virtual image to the movement amount of the imaging range in real space changes according to the imaging magnification of the captured image.
- FIG. 2 is a diagram illustrating a functional configuration example of a display control device according to an embodiment of the present disclosure.
- FIG. 3 is a diagram illustrating a functional configuration example of an information processing apparatus according to the embodiment.
- FIG. 4 is a diagram for describing an overview of an information processing system according to the embodiment.
- FIG. 5 is a diagram for describing an operation example of the information processing system according to the embodiment.
- FIG. 6 is a diagram for describing extraction of a virtual object according to the imaging magnification.
- FIG. 7 is a diagram for describing an example of the relationship between the movement amount of the imaging range in real space and the movement amount of the virtual object in a virtual image.
- FIG. 8 is a flowchart illustrating an operation example of the information processing system according to the embodiment.
- FIG. 9 is a diagram for describing an example of the relationship between the change amount of the imaging range, the imaging magnification, and the display content.
- diagrams illustrating hardware configuration examples of the display control device and the information processing apparatus according to the embodiment are also provided.
- a plurality of constituent elements having substantially the same functional configuration may be distinguished by attaching different letters or numbers after the same reference numeral.
- when it is not necessary to distinguish each of a plurality of constituent elements having substantially the same functional configuration, only the same reference numeral is given.
- 1. Embodiment
- 1-1. Configuration example of information processing system
- 1-2. Functional configuration example of display control apparatus
- 1-3. Functional configuration example of information processing apparatus
- 1-4. Overview of information processing system
- 1-5. Functional details of information processing system
- 1-6.
- FIG. 1 is a diagram illustrating a configuration example of an information processing system 1 according to an embodiment of the present disclosure.
- the information processing system 1 includes a display control device 10 and an information processing device 20.
- the display control device 10 and the information processing device 20 can communicate via the network 30.
- the display control device 10 has a function of performing display control.
- a case where the display control device 10 is applied to a digital camera will be described as an example.
- the display control device 10 may be applied to a device other than a digital camera.
- the display control device 10 may also be applied to a video camera, a smartphone with a camera function, a PDA (Personal Digital Assistant), a PC (Personal Computer), a mobile phone, a portable music playback device, a portable video processing device, a portable game device, a telescope, binoculars, and the like.
- the information processing apparatus 20 can perform information processing in accordance with a request from the display control apparatus 10 and return a processing result to the display control apparatus 10.
- the case where the information processing system 1 includes the information processing apparatus 20 will be described as an example. However, some or all of the functions of the information processing apparatus 20 may instead be provided by the display control device 10. For example, when the display control device 10 has the functional configuration described below (recognition unit 211 and extraction unit 212), the information processing system 1 need not include the information processing apparatus 20.
- the network 30 is a wired or wireless transmission path for information transmitted from a connected device.
- the network 30 may include a public line network such as the Internet, a telephone line network, and a satellite communication network, various LANs (Local Area Network) including Ethernet (registered trademark), WAN (Wide Area Network), and the like.
- the network may include a dedicated line network such as an IP-VPN (Internet Protocol-Virtual Private Network).
- FIG. 2 is a diagram illustrating a functional configuration example of the display control apparatus 10 according to the embodiment of the present disclosure.
- the display control apparatus 10 includes a control unit 110, an operation unit 120, an imaging unit 130, a storage unit 140, a communication unit 150, and a display unit 160.
- the control unit 110 corresponds to a processor such as a CPU (Central Processing Unit).
- the control unit 110 implements its various functions by executing a program stored in the storage unit 140 or another storage medium.
- the control unit 110 includes an image acquisition unit 111 and a display control unit 112. The functions of the image acquisition unit 111 and the display control unit 112 will be described later.
- the operation unit 120 detects an operation by the user and outputs it to the control unit 110.
- a user operation corresponds to, for example, an operation of pressing a button.
- the operation unit 120 may be configured by hardware other than buttons (for example, a touch panel).
- the operation unit 120 is integrated with the display control device 10, but the operation unit 120 may be configured separately from the display control device 10.
- the imaging unit 130 acquires a captured image by imaging the imaging range, and outputs the captured image to the control unit 110.
- the imaging unit 130 performs imaging with an imaging magnification set according to an operation input to the operation unit 120.
- the adjustment of the imaging magnification can be realized by a zoom function, but this zoom function is not particularly limited, and may be an optical zoom function or an electronic zoom function.
- the imaging unit 130 is integrated with the display control device 10, but the imaging unit 130 may be configured separately from the display control device 10.
- the storage unit 140 stores a program for operating the control unit 110 using a storage medium such as a semiconductor memory or a hard disk. Further, for example, the storage unit 140 can also store various data (for example, various setting information, content, etc.) used by the program. In the example illustrated in FIG. 2, the storage unit 140 is integrated with the display control device 10, but the storage unit 140 may be configured separately from the display control device 10.
- the communication unit 150 can communicate with the information processing apparatus 20.
- the communication format by the communication unit 150 is not particularly limited, and the communication by the communication unit 150 may be wireless communication or wired communication.
- the communication unit 150 is integrated with the display control device 10, but the communication unit 150 may be configured separately from the display control device 10.
- the communication unit 150 may not exist.
- the display unit 160 displays various information according to the control by the display control unit 112.
- the display unit 160 includes, for example, an LCD (Liquid Crystal Display), an organic EL (Electroluminescence) display device, or the like.
- the display unit 160 is integrated with the display control device 10, but the display unit 160 may be configured separately from the display control device 10.
- FIG. 3 is a diagram illustrating a functional configuration example of the information processing apparatus 20 according to the embodiment of the present disclosure.
- the information processing apparatus 20 includes a control unit 210, a storage unit 220, and a communication unit 230.
- the control unit 210 corresponds to a processor such as a CPU (Central Processing Unit), for example.
- the control unit 210 implements its various functions by executing a program stored in the storage unit 220 or another storage medium.
- the control unit 210 includes a recognition unit 211 and an extraction unit 212. The functions of the recognition unit 211 and the extraction unit 212 will be described later.
- the storage unit 220 stores a program for operating the control unit 210 using a storage medium such as a semiconductor memory or a hard disk.
- the storage unit 220 can store various data (for example, various setting information, content, and the like) used by the program.
- the storage unit 220 is integrated with the information processing apparatus 20, but the storage unit 220 may be configured separately from the information processing apparatus 20.
- the communication unit 230 can communicate with the display control device 10.
- the communication format by the communication unit 230 is not particularly limited, and the communication by the communication unit 230 may be wireless communication or wired communication.
- the communication unit 230 is integrated with the information processing apparatus 20, but the communication unit 230 may be configured separately from the information processing apparatus 20.
- FIG. 4 is a diagram for describing an overview of the information processing system 1 according to the embodiment of the present disclosure.
- a captured image is acquired by the image acquisition unit 111.
- since the imaging target 40 exists within the imaging range, the imaging target 40 appears in the captured image.
- a case where the imaging target 40 is a painting will be described as an example, but the imaging target 40 may be an object other than a painting.
- FIG. 5 is a diagram for describing an operation example of the information processing system 1 according to the embodiment of the present disclosure.
- the imaging target 40 is recognized from the captured image 60 by the recognition unit 211.
- the recognition unit 211 recognizes the imaging target 40 and its position and orientation in real space from the captured image 60.
- the recognition unit 211 can recognize the imaging target 40 by collating a feature amount determined from the captured image 60 with feature amounts of real objects registered in advance.
- more specifically, the recognition unit 211 determines the feature amount of the imaging target 40 shown in the captured image 60 according to a feature amount determination method such as the SIFT method or the Random Ferns method, and compares the determined feature amount with the feature amounts of real objects registered in advance. The recognition unit 211 then identifies the real object identification information associated with the feature amount that best matches the feature amount of the imaging target 40, as well as the position and orientation of the imaging target 40 in real space.
- the recognition unit 211 uses a feature amount dictionary in which feature amounts of real objects are associated with real object identification information. The feature amount dictionary may be stored in the storage unit 140, or may be received by the communication unit 150 from a server.
- the feature amount of a real object may be, for example, a set of feature amounts determined from a learning image of the real object according to the SIFT method or the Random Ferns method.
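As an illustrative sketch only (the disclosure does not mandate any particular library, descriptor format, or distance metric), the collation of a determined feature amount against a pre-registered feature amount dictionary can be modeled as a nearest-neighbor search. The object IDs and three-element descriptors below are hypothetical placeholders; a real system would hold sets of SIFT or Random Ferns descriptors per object.

```python
import math

# Hypothetical feature-amount dictionary: real object identification
# information -> a registered descriptor vector (placeholder values).
feature_dictionary = {
    "painting-001": [0.9, 0.1, 0.3],
    "painting-002": [0.2, 0.8, 0.5],
}

def recognize(query_descriptor):
    """Return the real object ID whose registered feature amount best
    matches the query descriptor (smallest Euclidean distance)."""
    def distance(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(feature_dictionary,
               key=lambda obj_id: distance(feature_dictionary[obj_id],
                                           query_descriptor))

print(recognize([0.85, 0.15, 0.25]))  # best match: painting-001
```

In practice the "best match" step would also enforce a minimum match quality before accepting the recognition, so that unknown scenes are rejected rather than mapped to the nearest registered object.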
- the recognition of the imaging target 40 is not limited to this example.
- the recognition unit 211 may indirectly recognize the imaging target 40 by recognizing a known figure or symbol associated with the imaging target 40, an artificial marker (for example, a two-dimensional barcode such as CyberCode (registered trademark) or QR Code (registered trademark)), a natural marker, or the like.
- the recognition unit 211 may also recognize the imaging target 40 and then recognize its position and orientation from the size and shape of the imaging target 40 in the captured image 60.
- in the example described above, the recognition unit 211 recognizes the position and orientation of the imaging target 40 included in the captured image 60 by image processing, but the recognition method is not limited to image processing.
- for example, the recognition unit 211 can detect the orientation of the imaging target 40 and its current position, and estimate the position and orientation of the imaging target 40 in the captured image 60 based on the detection result.
- a virtual image 50 corresponding to the imaging target 40 is acquired by the recognition unit 211.
- when a virtual image is stored in the storage unit 220 in advance in association with real object identification information, the recognition unit 211 obtains from the storage unit 220 the virtual image 50 associated with the real object identification information of the imaging target 40.
- the virtual object 51 is extracted from the virtual image 50 by the extraction unit 212.
- the extraction unit 212 extracts the virtual object 51 corresponding to the imaging location 41 in the imaging target 40 from the virtual image 50. More specifically, as illustrated in FIG. 5, the extraction unit 212 determines the position of the virtual object 51 corresponding to the imaging location 41 according to the position and orientation of the imaging target 40, and extracts the virtual object 51 existing at the determined position from the virtual image 50.
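A minimal sketch of this extraction step, under the simplifying assumptions (not stated in the disclosure) that the imaging target and virtual image are axis-aligned and that the recognized pose reduces to a normalized center position and region size for the imaging location:

```python
def extract_region(virtual_image_size, center, region_size):
    """Compute the crop rectangle (left, top, right, bottom), in virtual
    image pixels, for the virtual object corresponding to the imaging
    location. `center` and `region_size` are normalized to [0, 1] and
    the rectangle is clamped to the virtual image bounds."""
    w, h = virtual_image_size
    cx, cy = center
    rw, rh = region_size
    left = max(0.0, cx - rw / 2) * w
    top = max(0.0, cy - rh / 2) * h
    right = min(1.0, cx + rw / 2) * w
    bottom = min(1.0, cy + rh / 2) * h
    return (left, top, right, bottom)

# Imaging location centered on the target, covering half of it:
print(extract_region((800, 600), (0.5, 0.5), (0.5, 0.5)))
# (200.0, 150.0, 600.0, 450.0)
```

A full implementation would instead apply the recognized pose as a homography between the captured image and the virtual image; the axis-aligned crop above only illustrates the data flow.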
- the display control unit 112 controls the display unit 160 so that the virtual object 51 is displayed on the display unit 160.
- the display control unit 112 may display the virtual object 51 instead of the captured image 60 or may display a composite image in which the virtual object 51 is superimposed on the captured image 60.
- for example, as shown in FIG. 5, the display control unit 112 may adjust the position and orientation of the virtual object 51 to match the position and orientation of the imaging target 40, and display the resulting composite image, superimposed on the captured image 60, on the display unit 160.
- the display control unit 112 may display the virtual object 51 extracted by the extraction unit 212 as it is, or may display the virtual object 51 after performing some kind of processing.
- the display control unit 112 may perform a filtering process according to the captured image 60 on the virtual object 51, and control the display so that the virtual object after the filtering process is displayed.
- the filtering process may be a process of bringing the pixel value of the virtual object closer to the pixel value of the captured image.
- techniques such as lighting and color transfer can be used.
- when the imaging magnification is changed, a virtual object 51 corresponding to the changed imaging magnification is extracted from the virtual image 50 by the extraction unit 212.
- the display control unit 112 controls display of the extracted virtual object 51.
- the extraction unit 212 may extract the virtual object 51 in a narrow area from the virtual image 50 as the imaging magnification increases.
- FIG. 6 is a diagram for explaining extraction of the virtual object 51 according to the imaging magnification.
- the imaging magnification py is higher than the imaging magnification px.
- the virtual object 51 extracted according to the imaging magnification py may therefore be a narrower region of the virtual image 50 than the virtual object 51 extracted according to the imaging magnification px.
- the sharpness of the captured image 60 decreases as the imaging magnification increases.
- since the virtual object 51 corresponding to the imaging magnification is extracted and displayed in this manner, the virtual object 51 can be displayed without a loss of sharpness even when the imaging magnification is increased.
- as shown in FIG. 6, in the case of the imaging magnification py, a narrow area of the virtual image 50 is extracted and displayed as the virtual object 51, but the sharpness of the virtual object 51 need not decrease.
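One natural reading of FIG. 6 is that the extracted region shrinks in inverse proportion to the imaging magnification. The disclosure does not specify the exact mapping, so the inverse-proportional form below is an assumption used purely for illustration:

```python
def extracted_region_size(base_size, magnification):
    """Side length of the virtual-image region extracted as the virtual
    object: a higher imaging magnification yields a narrower region
    (assumed inverse-proportional law)."""
    if magnification <= 0:
        raise ValueError("magnification must be positive")
    return base_size / magnification

px, py = 2.0, 4.0  # py is higher than px, as in FIG. 6
print(extracted_region_size(400.0, px))  # 200.0
print(extracted_region_size(400.0, py))  # 100.0 -- narrower at py
```

Because the region is cropped from the full-resolution virtual image rather than digitally zoomed from the captured image, its sharpness is independent of the magnification, matching the observation above.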
- when the imaging range is moved by the user, the virtual object 51 must be moved in the virtual image 50 along with the movement of the imaging range.
- however, if the ratio of the movement amount of the virtual object 51 in the virtual image 50 to the movement amount of the imaging range in real space is constant regardless of the imaging magnification, the usability for the user is unlikely to improve.
- FIG. 7 is a diagram for explaining an example of the relationship between the movement amount of the imaging range in the real space and the movement amount of the virtual object 51 in the virtual image 50.
- the imaging magnification py is higher than the imaging magnification px.
- the imaging locations 41-y1 and 41-y2 at the imaging magnification py are smaller areas than the imaging locations 41-x1 and 41-x2 at the imaging magnification px.
- therefore, when imaging is performed at the imaging magnification py, the movement amount of the imaging range relative to the size of the imaging location is larger than when imaging is performed at the imaging magnification px. If this relative movement amount increases, it may become difficult for the user to keep the desired location within the imaging range.
- the display control unit 112 may therefore change the ratio of the movement amount of the virtual object 51 in the virtual image 50 to the movement amount m of the imaging range in real space according to the imaging magnification of the captured image 60, and display the virtual object 51 accordingly.
- the extraction unit 212 may change the ratio of the movement amount of the virtual object 51 in the virtual image 50 to the movement amount m of the imaging range in the real space according to the imaging magnification of the captured image 60. Then, the extraction unit 212 may move the virtual object 51 in the virtual image 50 by the movement amount calculated based on the changed ratio.
- the display control unit 112 may display the virtual object 51 after movement.
- if the movement amount of the virtual object 51 in the virtual image 50 is smaller when imaging at the imaging magnification py than when imaging at the imaging magnification px, the user's experience is expected to improve.
- that is, the movement amount ny of the virtual object 51 in the virtual image 50 when captured at the imaging magnification py is preferably smaller than the movement amount nx of the virtual object 51 when captured at the imaging magnification px.
- the display control unit 112 may display the virtual object 51 such that the larger the imaging magnification of the captured image 60, the smaller the ratio of the movement amount of the virtual object 51 in the virtual image 50 to the movement amount of the imaging range in real space.
- the extraction unit 212 may reduce the ratio of the movement amount of the virtual object 51 in the virtual image 50 to the movement amount of the imaging range in the real space as the imaging magnification of the captured image 60 is larger.
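A hedged sketch of this ratio control. The disclosure only requires that the ratio decrease as the imaging magnification increases; the inverse-proportional law and the `base_ratio` parameter below are assumptions chosen for illustration:

```python
def virtual_object_movement(real_movement, magnification, base_ratio=1.0):
    """Movement amount of the virtual object in the virtual image for a
    given movement of the imaging range in real space. The ratio shrinks
    as the imaging magnification grows (assumed inverse-proportional)."""
    ratio = base_ratio / magnification
    return real_movement * ratio

m = 10.0           # movement amount of the imaging range in real space
px, py = 2.0, 4.0  # py is higher than px, as in FIG. 7
nx = virtual_object_movement(m, px)  # 5.0
ny = virtual_object_movement(m, py)  # 2.5 -- ny < nx, as preferred above
```

Any monotonically decreasing mapping from magnification to ratio would satisfy the behavior described for FIG. 7; the inverse law has the convenient property that the on-screen speed of the virtual object stays roughly constant across zoom levels.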
- the virtual object before the movement is denoted by 51-1 and the virtual object after the movement is denoted by 51-2.
- FIG. 8 is a flowchart illustrating an operation example of the information processing system 1 according to the embodiment of the present disclosure.
- the image acquisition unit 111 acquires a captured image captured by the imaging unit 130 (S12).
- the recognition unit 211 recognizes the imaging target from the captured image (S13), and acquires a virtual image corresponding to the imaging target (S14).
- the display control unit 112 may perform control so that the captured image is displayed.
- the extraction unit 212 extracts a virtual object from the virtual image (S15), and the display control unit 112 controls the display unit 160 so that the virtual object is displayed on the display unit 160.
- the display unit 160 displays the virtual object according to the control by the display control unit 112 (S16).
- when the imaging magnification is changed, the extraction unit 212 acquires the imaging magnification (S18) and extracts a virtual object corresponding to the imaging magnification from the virtual image (S19).
- the display control unit 112 controls the display unit 160 so that the virtual object is displayed on the display unit 160.
- the display unit 160 displays the virtual object again according to the control by the display control unit 112 (S20). If the imaging magnification has not been changed (“No” in S17), the operation may proceed to S21.
- when a predetermined condition is satisfied, the extraction unit 212 determines a movement amount of the virtual object in the virtual image (S22) and moves the virtual object in the virtual image (S23).
- the display control unit 112 controls the display unit 160 so that the virtual object after being moved is displayed on the display unit 160.
- the display unit 160 displays the virtual object again under the control of the display control unit 112 (S24).
- the predetermined conditions are not particularly limited.
- the predetermined condition may be a condition that the imaging range has moved, or may be another condition.
- the predetermined condition will be described later in detail.
- the operation after S24 is not particularly limited. For example, while the imaging target is recognized by the recognition unit 211, the operations from S17 onward may be repeated. On the other hand, when another imaging target is recognized by the recognition unit 211, the operations from S14 onward may be repeated. When the recognition unit 211 no longer recognizes the imaging target, the display control unit 112 may control the display unit 160 so that the captured image is displayed.
- FIG. 9 is a diagram for explaining an example of the relationship between the change amount of the imaging range, the imaging magnification, and the display content.
- when the imaging magnification exceeds the upper limit value of the imaging magnification at which the virtual object can be displayed, the display control unit 112 may control the display unit 160 so that the captured image is displayed. Similarly, the display control unit 112 may control the display unit 160 so that the captured image is displayed when the imaging magnification is smaller than the lower limit value of the imaging magnification at which the virtual object can be displayed.
- the upper limit value and the lower limit value of the imaging magnification at which a virtual object can be displayed may be determined in advance. This is because if the imaging magnification is too large or too small, the size of the virtual object may not be appropriate.
- the range from the lower limit value to the upper limit value corresponds to the imaging magnification range in which the virtual object can be displayed.
- the maximum value corresponds to the maximum value that can be set as the imaging magnification of the imaging unit 130, and the minimum value corresponds to the minimum value that can be set.
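The magnification-range logic above can be sketched as a simple mode selector. The specific limit values below are placeholders (the disclosure leaves them to be determined in advance):

```python
LOWER_LIMIT = 1.5  # assumed lower limit for virtual-object display
UPPER_LIMIT = 8.0  # assumed upper limit for virtual-object display

def display_mode(magnification):
    """Choose what the display unit 160 shows for a given imaging
    magnification: the virtual object inside the displayable range,
    otherwise the captured image."""
    if LOWER_LIMIT <= magnification <= UPPER_LIMIT:
        return "virtual object"
    return "captured image"

print(display_mode(4.0))   # within range  -> virtual object
print(display_mode(10.0))  # above upper limit -> captured image
```

The rationale stated above is preserved here: outside the range, the extracted region would be too large or too small for the virtual object to be displayed at an appropriate size, so the device falls back to the captured image.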
- when the imaging magnification is in the range in which the virtual object can be displayed, the virtual object may be moved in the virtual image as the imaging range moves. However, as described above, the virtual object may be moved only when a predetermined condition is satisfied. For example, when the change amount of the imaging range is less than the first threshold, the imaging range is assumed to have changed because of the user's camera shake; in such a case, the display control unit 112 may prohibit the movement of the virtual object in the virtual image. In the example illustrated in FIG. 9, the first threshold at the lower limit of the imaging magnification is indicated as the threshold Th1.
- the change amount of the imaging range may be acquired by any technique.
- the display control unit 112 may measure the amount of movement of the feature point in the captured image as the amount of change in the imaging range. Further, the display control unit 112 may acquire the amount of movement of the imaging unit 130 detected by a predetermined sensor as the amount of change in the imaging range.
- the type of the predetermined sensor is not particularly limited. For example, when the predetermined sensor is an acceleration sensor, the acceleration measured by the acceleration sensor may be acquired as the amount of change in the imaging range.
- the first threshold value may be constant regardless of the imaging magnification, but may be changed according to the imaging magnification. That is, the display control unit 112 may control the first threshold according to the imaging magnification of the captured image. For example, as described above, the larger the imaging magnification of the captured image, the narrower the imaging location, and the greater the amount of movement of the imaging range relative to the size of the imaging location. Therefore, the display control unit 112 may increase the first threshold value as the imaging magnification of the captured image increases. In this way, even if the change of the imaging range becomes large to some extent, it is expected that the movement of the virtual object is prohibited and the user's feeling of use is improved.
- the line indicating the first threshold value draws a curve, but the shape of this curve is not limited.
- the line indicating the first threshold value may be a straight line. That is, the first threshold value may change linearly as the imaging magnification changes, or may change nonlinearly.
- when the change amount of the imaging range is equal to or greater than the first threshold, the display control unit 112 may move the virtual object in the virtual image.
- in the example illustrated in FIG. 9, the second threshold value is indicated as the threshold Th2.
- when the change amount of the imaging range exceeds the second threshold, the display control unit 112 may prohibit the movement of the virtual object in the virtual image.
- the second threshold may be constant regardless of the imaging magnification, but may be changed according to the imaging magnification. That is, the display control unit 112 may control the second threshold according to the imaging magnification of the captured image. For example, as described above, the larger the imaging magnification of the captured image, the narrower the imaging location, and the greater the amount of movement of the imaging range relative to the size of the imaging location. Therefore, the display control unit 112 may increase the second threshold value as the imaging magnification of the captured image increases. In this way, even if the change in the imaging range becomes large to some extent, it is expected that the virtual object is moved and the user's feeling of use is improved.
- the line indicating the second threshold value draws a curve, but the shape of this curve is not limited.
- the line indicating the second threshold value may be a straight line. That is, the second threshold value may change linearly as the imaging magnification changes, or may change nonlinearly.
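The two thresholds described above can be combined into one decision function. The linear dependence of Th1 and Th2 on the imaging magnification below is only one of the shapes the disclosure allows (it explicitly notes that the threshold curves may be linear or nonlinear), and the coefficients are placeholders:

```python
def th1(magnification):
    """First threshold: changes below it are treated as camera shake,
    so the virtual object is not moved. Assumed to grow linearly with
    the imaging magnification."""
    return 0.5 + 0.1 * magnification

def th2(magnification):
    """Second threshold: changes above it prohibit movement of the
    virtual object. Also assumed to grow with the magnification."""
    return 5.0 + 1.0 * magnification

def should_move_virtual_object(change_amount, magnification):
    """Move the virtual object only when the change amount of the
    imaging range lies between the first and second thresholds."""
    return th1(magnification) <= change_amount <= th2(magnification)

print(should_move_virtual_object(0.3, 2.0))   # camera shake -> False
print(should_move_virtual_object(3.0, 2.0))   # normal pan   -> True
print(should_move_virtual_object(20.0, 2.0))  # too abrupt   -> False
```

Raising both thresholds with the magnification mirrors the reasoning above: at high zoom, a given hand movement produces a larger relative change of the imaging range, so a wider band of changes should be tolerated before the device reacts.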
- FIG. 10 is a diagram for explaining an example of the relationship between the sharpness of a captured image and display content.
- The virtual object corresponding to the imaging target is displayed when the imaging target is recognized from the captured image, whereas the captured image itself is displayed when the imaging target is not recognized from the captured image.
- Switching between displaying the virtual object and displaying the captured image may also be performed according to the sharpness of the captured image. For example, when the sharpness of the captured image is above a predetermined threshold, the captured image can be assumed to be suitable for viewing by the user, whereas when the sharpness is below the threshold, the captured image can be assumed to be unsuitable for viewing.
- the display control unit 112 may control the captured image to be displayed when the sharpness of the captured image exceeds a predetermined threshold.
- the display control unit 112 may perform control so that the virtual object is displayed when the sharpness of the captured image is below a predetermined threshold.
- In the illustrated example, the sharpness ca of the captured image is higher than the threshold Th3, while the sharpness cb of the captured image is lower than the threshold Th3.
- Therefore, the display control unit 112 may perform control so that the captured image is displayed in the case of the sharpness ca, and so that the virtual object is displayed in the case of the sharpness cb.
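The sharpness-based switching described above can be sketched as a simple selector; the threshold value and the return labels are hypothetical illustrations:

```python
def choose_display(sharpness, th3=0.6):
    """Pick the content to show based on captured-image sharpness.

    Above the (hypothetical) threshold Th3 the captured image is suitable
    for viewing; below it, the virtual object is shown instead."""
    return "captured_image" if sharpness > th3 else "virtual_object"
```

With Th3 = 0.6, a sharpness of 0.8 selects the captured image and 0.4 selects the virtual object.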
- FIG. 11 is a diagram illustrating a display example of an imaging magnification range in which a virtual object can be displayed.
- The range from the lower limit to the upper limit of the imaging magnification corresponds to the range of imaging magnifications at which the virtual object can be displayed. Therefore, if the display control unit 112 performs control so that this range is displayed, convenience for the user is expected to improve.
- In the illustrated example, the display control unit 112 performs control so that the range of imaging magnifications at which the virtual object can be displayed is shown as the range from its lower limit to its upper limit.
- the display control unit 112 may perform control so that a range from the minimum value to the maximum value that can be set as the imaging magnification of the imaging unit 130 is displayed. Further, as shown in FIG. 11, the display control unit 112 may perform control so that the current imaging magnification pn is displayed.
- If the display control unit 112 performs control so that the recommended imaging magnification associated with the virtual image is displayed, the user can easily view the virtual image by adjusting the imaging magnification to the recommended value.
- Similarly, if the display control unit 112 performs control so that the recommended imaging magnification associated with each region of the virtual image is displayed, the user can easily view each region of the virtual image by adjusting the imaging magnification to the corresponding recommended value.
- FIG. 12 is a diagram showing a display example of the recommended imaging magnification associated with each area of the virtual image.
- In the example shown in FIG. 12, the recommended imaging magnifications associated with the regions of the virtual image are displayed by the display control unit 112 as "recommended imaging magnification: 5x", "recommended imaging magnification: 10x", "recommended imaging magnification: 3x", and so on. Note that the recommended imaging magnifications may be registered in advance in association with the virtual image.
- the display control unit 112 may adjust the orientation of the virtual object so as to match the orientation of the imaging target and display it on the display unit 160.
- Alternatively, the virtual object may be displayed on the display unit 160 without adjusting its orientation.
- When the imaging magnification is high, the virtual object is displayed large on the screen, so its posture need not be adjusted. When the imaging magnification is low, the virtual object is displayed small on the screen, so its posture may be adjusted.
- Thus, the display control unit 112 may display the virtual object so that the posture of the virtual object changes according to the imaging magnification of the captured image. For example, when the imaging magnification is smaller than a predetermined threshold, the display control unit 112 may adjust the orientation of the virtual object to match the orientation of the imaging target before displaying it, and when the imaging magnification is larger than the predetermined threshold, it may display the virtual object without adjusting its posture.
- FIG. 13 is a diagram for explaining an example in which the posture of the virtual object is changed in accordance with the imaging magnification.
- In the example shown in FIG. 13, the imaging magnification px is smaller than the threshold Th4, and the imaging magnification py is larger than the threshold Th4. Therefore, when an image is captured at the imaging magnification px, the display control unit 112 may adjust the orientation of the virtual object 51 to match the orientation of the imaging target before displaying it, and when an image is captured at the imaging magnification py, it may display the virtual object 51 without adjusting its posture.
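The magnification-dependent posture adjustment can be sketched as follows. The threshold value Th4 and the reduction of "posture" to a single rotation angle are hypothetical simplifications for illustration:

```python
def virtual_object_angle(target_angle_deg, magnification, th4=4.0):
    """Return the display rotation of the virtual object in degrees.

    Below the (hypothetical) threshold Th4 the object appears small on
    screen, so align it with the imaging target's orientation; above the
    threshold it appears large, so leave it unadjusted (angle 0)."""
    return target_angle_deg if magnification < th4 else 0.0
```

At 2x zoom the object is rotated to match a 30-degree target, while at 8x zoom it is shown without adjustment.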
- FIG. 14 is a diagram illustrating a hardware configuration example of the display control apparatus 10 according to the embodiment of the present disclosure.
- the hardware configuration example illustrated in FIG. 14 is merely an example of the hardware configuration of the display control apparatus 10. Therefore, the hardware configuration of the display control apparatus 10 is not limited to the example shown in FIG.
- The display control device 10 includes a CPU (Central Processing Unit) 801, a ROM (Read Only Memory) 802, a RAM (Random Access Memory) 803, a sensor 804, an input device 808, an output device 810, a storage device 811, a drive 812, an imaging device 813, and a communication device 815.
- the CPU 801 functions as an arithmetic processing device and a control device, and controls the overall operation in the display control device 10 according to various programs. Further, the CPU 801 may be a microprocessor.
- the ROM 802 stores programs used by the CPU 801, calculation parameters, and the like.
- the RAM 803 temporarily stores programs used in the execution of the CPU 801, parameters that change as appropriate during the execution, and the like. These are connected to each other by a host bus including a CPU bus.
- the sensor 804 includes various detection sensors such as a terminal state detection sensor for detecting the state of the display control device 10 and its peripheral circuits.
- Examples of the sensor 804 include an inclination sensor, an acceleration sensor, an orientation sensor, a temperature sensor, a humidity sensor, and an illuminance sensor.
- a detection signal from the sensor 804 is sent to the CPU 801. Thereby, the CPU 801 can know the state of the display control device 10 (tilt, acceleration, direction, temperature, humidity, illuminance, etc.).
- The input device 808 includes input means for the user to input information, such as a mouse, keyboard, touch panel, buttons, microphone, switches, and levers, and an input control circuit that generates an input signal based on the user's input and outputs it to the CPU 801.
- the user of the display control device 10 can input various data and instruct processing operations to the display control device 10 by operating the input device 808.
- the output device 810 includes a display device such as a liquid crystal display (LCD) device, an OLED (Organic Light Emitting Diode) device, and a lamp. Furthermore, the output device 810 includes an audio output device such as a speaker and headphones. For example, the display device displays a captured image or a generated image. On the other hand, the audio output device converts audio data or the like into audio and outputs it.
- the storage device 811 is a data storage device configured as an example of a storage unit of the display control device 10.
- the storage device 811 may include a storage medium, a recording device that records data on the storage medium, a reading device that reads data from the storage medium, a deletion device that deletes data recorded on the storage medium, and the like.
- the storage device 811 stores programs executed by the CPU 801 and various data.
- the drive 812 is a reader / writer for a storage medium, and is built in or externally attached to the display control device 10.
- The drive 812 reads information recorded on a mounted removable storage medium 71 such as a magnetic disk, optical disk, magneto-optical disk, or semiconductor memory, and outputs the information to the RAM 803.
- the drive 812 can also write information into the removable storage medium 71.
- the imaging device 813 includes an imaging optical system such as a photographing lens and a zoom lens that collects light, and a signal conversion element such as a CCD (Charge Coupled Device) or a CMOS (Complementary Metal Oxide Semiconductor).
- the imaging optical system collects light emitted from the subject and forms a subject image in the signal conversion unit, and the signal conversion element converts the formed subject image into an electrical image signal.
- the communication device 815 is a communication interface configured with, for example, a communication device for connecting to a network.
- The communication device 815 may be a wireless LAN (Local Area Network) compatible communication device, an LTE (Long Term Evolution) compatible communication device, or a wired communication device that performs wired communication.
- the communication device 815 can communicate with the information processing device 20 via the network 30.
- FIG. 15 is a diagram illustrating a hardware configuration example of the information processing apparatus 20 according to the embodiment of the present disclosure.
- the hardware configuration example illustrated in FIG. 15 is merely an example of the hardware configuration of the information processing apparatus 20. Therefore, the hardware configuration of the information processing apparatus 20 is not limited to the example illustrated in FIG.
- The information processing apparatus 20 includes a CPU (Central Processing Unit) 901, a ROM (Read Only Memory) 902, a RAM (Random Access Memory) 903, a storage device 911, a drive 912, and a communication device 915.
- the CPU 901 functions as an arithmetic processing device and a control device, and controls the overall operation in the information processing device 20 according to various programs. Further, the CPU 901 may be a microprocessor.
- the ROM 902 stores programs used by the CPU 901, calculation parameters, and the like.
- the RAM 903 temporarily stores programs used in the execution of the CPU 901, parameters that change as appropriate during the execution, and the like. These are connected to each other by a host bus including a CPU bus.
- the storage device 911 is a data storage device configured as an example of a storage unit of the information processing device 20.
- the storage device 911 may include a storage medium, a recording device that records data on the storage medium, a reading device that reads data from the storage medium, a deletion device that deletes data recorded on the storage medium, and the like.
- the storage device 911 stores programs executed by the CPU 901 and various data.
- the drive 912 is a reader / writer for a storage medium, and is built in or externally attached to the information processing apparatus 20.
- The drive 912 reads information recorded on a mounted removable storage medium 71 such as a magnetic disk, optical disk, magneto-optical disk, or semiconductor memory, and outputs the information to the RAM 903.
- the drive 912 can also write information into the removable storage medium 71.
- the communication device 915 is a communication interface configured by a communication device for connecting to a network, for example.
- The communication device 915 may be a wireless LAN (Local Area Network) compatible communication device, an LTE (Long Term Evolution) compatible communication device, or a wired communication device that performs wired communication.
- For example, the communication device 915 can communicate with the display control device 10 via the network 30.
- As described above, according to the embodiment of the present disclosure, there is provided a display control device 10 including an image acquisition unit 111 that acquires a captured image in which an imaging range is captured, and a display control unit 112 that performs control so that a virtual object extracted from a virtual image corresponding to an imaging target appearing in the captured image is displayed.
- The display control unit 112 displays the virtual object so that the ratio of the amount of movement of the virtual object in the virtual image to the amount of movement of the imaging range in real space changes according to the imaging magnification of the captured image. With such a configuration, the user experience can be improved.
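One possible realization of a movement ratio that shrinks as the imaging magnification grows is an inverse-proportional scaling. This specific formula is an assumption for illustration, not one given in the disclosure:

```python
def virtual_object_shift(range_shift, magnification):
    """Map a movement of the imaging range in real space to a movement of
    the virtual object in the virtual image.

    The ratio of object movement to range movement is 1 / magnification,
    so the ratio decreases as the zoom increases (one possible scheme)."""
    return range_shift / magnification
```

For example, a range shift of 10 units moves the object by 5 units at 2x zoom but only by 2 units at 5x zoom, keeping the on-screen motion comparable.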
- (1) A display control device including: an image acquisition unit that acquires a captured image in which an imaging range is captured; and a display control unit that controls a display unit so that a virtual object corresponding to an imaging target appearing in the captured image is displayed, wherein the display control unit displays the virtual object so that a ratio of an amount of movement of the virtual object in a virtual image to an amount of movement of the imaging range in real space changes according to an imaging magnification of the captured image.
- (2) The display control device according to (1), wherein the display control unit controls the display unit so that a virtual object extracted from a virtual image corresponding to the imaging target is displayed.
- (3) The display control device according to (1), wherein the display control unit displays the virtual object so that the ratio of the amount of movement of the virtual object in the virtual image to the amount of movement of the imaging range in real space decreases as the imaging magnification of the captured image increases.
- (4) The display control device according to any one of (1) to (3), wherein the display control unit prohibits movement of the virtual object in the virtual image when the amount of change in the imaging range falls below a first threshold.
- (5) The display control device according to (4), wherein the display control unit controls the first threshold according to the imaging magnification of the captured image.
- (6) The display control device according to (5), wherein the display control unit increases the first threshold as the imaging magnification of the captured image increases.
- (7) The display control device according to any one of (4) to (6), wherein the display control unit prohibits movement of the virtual object in the virtual image when the amount of change in the imaging range exceeds a second threshold.
- (8) The display control device according to (7), wherein the display control unit controls the second threshold according to the imaging magnification of the captured image.
- (9) The display control device according to any one of (1) to (8), wherein the display control unit performs control so that the captured image is displayed when the sharpness of the captured image is above a predetermined threshold, and so that the virtual object is displayed when the sharpness of the captured image is below the predetermined threshold.
- (10) The display control device according to any one of (1) to (9), wherein the display control unit performs control so that the range of imaging magnifications at which the virtual object can be displayed is displayed.
- (11) The display control device according to any one of (1) to (10), wherein the display control unit performs control so that a recommended imaging magnification associated with the virtual image is displayed.
- (12) The display control device according to (11), wherein the display control unit performs control so that a recommended imaging magnification associated with each region of the virtual image is displayed.
- (13) The display control device according to any one of (1) to (12), wherein the display control unit performs control so that a virtual object that has undergone filter processing according to the captured image is displayed.
- (14) The display control device according to any one of (1) to (13), wherein the display control unit displays the virtual object so that the posture of the virtual object appears to change according to the imaging magnification of the captured image.
- (15) A display control method including: acquiring a captured image in which an imaging range is captured; controlling a display unit so that a virtual object corresponding to an imaging target appearing in the captured image is displayed; and displaying the virtual object so that a ratio of an amount of movement of the virtual object in a virtual image to an amount of movement of the imaging range in real space changes according to an imaging magnification of the captured image.
- (16) A computer-readable recording medium recording a program for causing a computer to function as a display control device including: an image acquisition unit that acquires a captured image in which an imaging range is captured; and a display control unit that controls a display unit so that a virtual object corresponding to an imaging target appearing in the captured image is displayed, wherein the display control unit displays the virtual object so that a ratio of an amount of movement of the virtual object in a virtual image to an amount of movement of the imaging range in real space changes according to an imaging magnification of the captured image.
- DESCRIPTION OF SYMBOLS
1 Information processing system
10 Display control device
20 Information processing apparatus
30 Network
40 Imaging target
41 Imaging location
60 Captured image
50 Virtual image
51 Virtual object
110 Control unit
111 Image acquisition unit
112 Display control unit
120 Operation unit
130 Imaging unit
140 Storage unit
150 Communication unit
160 Display unit
210 Control unit
211 Recognition unit
212 Extraction unit
220 Storage unit
230 Communication unit
221 Recognition unit
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Controls And Circuits For Display Device (AREA)
- Studio Devices (AREA)
Abstract
Description
1. Embodiment
1-1. Configuration example of the information processing system
1-2. Functional configuration example of the display control device
1-3. Functional configuration example of the information processing apparatus
1-4. Overview of the information processing system
1-5. Functional details of the information processing system
1-6. Hardware configuration example
2. Conclusion
First, an embodiment of the present disclosure will be described.
To begin with, a configuration example of the information processing system 1 according to the embodiment of the present disclosure will be described. FIG. 1 is a diagram illustrating a configuration example of the information processing system 1 according to the embodiment of the present disclosure. As shown in FIG. 1, the information processing system 1 includes a display control device 10 and an information processing apparatus 20. The display control device 10 and the information processing apparatus 20 can communicate with each other via a network 30.
Next, a functional configuration example of the display control device 10 according to the embodiment of the present disclosure will be described. FIG. 2 is a diagram illustrating a functional configuration example of the display control device 10 according to the embodiment of the present disclosure. As shown in FIG. 2, the display control device 10 includes a control unit 110, an operation unit 120, an imaging unit 130, a storage unit 140, a communication unit 150, and a display unit 160.
Next, a functional configuration example of the information processing apparatus 20 according to the embodiment of the present disclosure will be described. FIG. 3 is a diagram illustrating a functional configuration example of the information processing apparatus 20 according to the embodiment of the present disclosure. As shown in FIG. 3, the information processing apparatus 20 includes a control unit 210, a storage unit 220, and a communication unit 230.
Next, an overview of the information processing system 1 according to the embodiment of the present disclosure will be described. FIG. 4 is a diagram for explaining the overview of the information processing system 1 according to the embodiment of the present disclosure. As shown in FIG. 4, when the imaging range is captured by the imaging unit 130, a captured image is acquired by the image acquisition unit 111. In the example shown in FIG. 4, the imaging target 40 exists within the imaging range, so the imaging target 40 appears in the captured image. In the following description, the case where the imaging target 40 is a painting is described as an example, but the imaging target 40 may be an object other than a painting.
Next, functional details of the information processing system 1 according to the embodiment of the present disclosure will be described. First, an operation example of the information processing system 1 according to the embodiment of the present disclosure will be described. FIG. 8 is a flowchart illustrating an operation example of the information processing system 1 according to the embodiment of the present disclosure.
Next, a hardware configuration example of the display control device 10 according to the embodiment of the present disclosure will be described. FIG. 14 is a diagram illustrating a hardware configuration example of the display control device 10 according to the embodiment of the present disclosure. However, the hardware configuration example shown in FIG. 14 is merely one example of the hardware configuration of the display control device 10. Therefore, the hardware configuration of the display control device 10 is not limited to the example shown in FIG. 14.
As described above, according to the embodiment of the present disclosure, there is provided a display control device 10 including an image acquisition unit 111 that acquires a captured image in which an imaging range is captured, and a display control unit 112 that performs control so that a virtual object extracted from a virtual image corresponding to an imaging target appearing in the captured image is displayed. The display control unit 112 displays the virtual object so that the ratio of the amount of movement of the virtual object in the virtual image to the amount of movement of the imaging range in real space changes according to the imaging magnification of the captured image. With such a configuration, the user experience can be improved.
(1)
A display control device including:
an image acquisition unit that acquires a captured image in which an imaging range is captured; and
a display control unit that controls a display unit so that a virtual object corresponding to an imaging target appearing in the captured image is displayed,
wherein the display control unit displays the virtual object so that a ratio of an amount of movement of the virtual object in a virtual image to an amount of movement of the imaging range in real space changes according to an imaging magnification of the captured image.
(2)
The display control device according to (1), wherein the display control unit controls the display unit so that a virtual object extracted from a virtual image corresponding to the imaging target is displayed.
(3)
The display control device according to (1), wherein the display control unit displays the virtual object so that the ratio of the amount of movement of the virtual object in the virtual image to the amount of movement of the imaging range in real space decreases as the imaging magnification of the captured image increases.
(4)
The display control device according to any one of (1) to (3), wherein the display control unit prohibits movement of the virtual object in the virtual image when the amount of change in the imaging range falls below a first threshold.
(5)
The display control device according to (4), wherein the display control unit controls the first threshold according to the imaging magnification of the captured image.
(6)
The display control device according to (5), wherein the display control unit increases the first threshold as the imaging magnification of the captured image increases.
(7)
The display control device according to any one of (4) to (6), wherein the display control unit prohibits movement of the virtual object in the virtual image when the amount of change in the imaging range exceeds a second threshold.
(8)
The display control device according to (7), wherein the display control unit controls the second threshold according to the imaging magnification of the captured image.
(9)
The display control device according to any one of (1) to (8), wherein the display control unit performs control so that the captured image is displayed when the sharpness of the captured image is above a predetermined threshold, and so that the virtual object is displayed when the sharpness of the captured image is below the predetermined threshold.
(10)
The display control device according to any one of (1) to (9), wherein the display control unit performs control so that the range of imaging magnifications at which the virtual object can be displayed is displayed.
(11)
The display control device according to any one of (1) to (10), wherein the display control unit performs control so that a recommended imaging magnification associated with the virtual image is displayed.
(12)
The display control device according to (11), wherein the display control unit performs control so that a recommended imaging magnification associated with each region of the virtual image is displayed.
(13)
The display control device according to any one of (1) to (12), wherein the display control unit performs control so that a virtual object that has undergone filter processing according to the captured image is displayed.
(14)
The display control device according to any one of (1) to (13), wherein the display control unit displays the virtual object so that the posture of the virtual object appears to change according to the imaging magnification of the captured image.
(15)
A display control method including:
acquiring a captured image in which an imaging range is captured;
controlling a display unit so that a virtual object corresponding to an imaging target appearing in the captured image is displayed; and
displaying the virtual object so that a ratio of an amount of movement of the virtual object in a virtual image to an amount of movement of the imaging range in real space changes according to an imaging magnification of the captured image.
(16)
A computer-readable recording medium recording a program for causing a computer to function as a display control device including: an image acquisition unit that acquires a captured image in which an imaging range is captured; and a display control unit that controls a display unit so that a virtual object corresponding to an imaging target appearing in the captured image is displayed, wherein the display control unit displays the virtual object so that a ratio of an amount of movement of the virtual object in a virtual image to an amount of movement of the imaging range in real space changes according to an imaging magnification of the captured image.
10 Display control device
20 Information processing apparatus
30 Network
40 Imaging target
41 Imaging location
60 Captured image
50 Virtual image
51 Virtual object
110 Control unit
111 Image acquisition unit
112 Display control unit
120 Operation unit
130 Imaging unit
140 Storage unit
150 Communication unit
160 Display unit
210 Control unit
211 Recognition unit
212 Extraction unit
220 Storage unit
230 Communication unit
221 Recognition unit
Claims (16)
- A display control device comprising: an image acquisition unit that acquires a captured image in which an imaging range is captured; and a display control unit that controls a display unit so that a virtual object corresponding to an imaging target appearing in the captured image is displayed, wherein the display control unit displays the virtual object so that a ratio of an amount of movement of the virtual object in a virtual image to an amount of movement of the imaging range in real space changes according to an imaging magnification of the captured image.
- The display control device according to claim 1, wherein the display control unit controls the display unit so that a virtual object extracted from a virtual image corresponding to the imaging target is displayed.
- The display control device according to claim 1, wherein the display control unit displays the virtual object so that the ratio of the amount of movement of the virtual object in the virtual image to the amount of movement of the imaging range in real space decreases as the imaging magnification of the captured image increases.
- The display control device according to claim 1, wherein the display control unit prohibits movement of the virtual object in the virtual image when the amount of change in the imaging range falls below a first threshold.
- The display control device according to claim 4, wherein the display control unit controls the first threshold according to the imaging magnification of the captured image.
- The display control device according to claim 5, wherein the display control unit increases the first threshold as the imaging magnification of the captured image increases.
- The display control device according to claim 4, wherein the display control unit prohibits movement of the virtual object in the virtual image when the amount of change in the imaging range exceeds a second threshold.
- The display control device according to claim 7, wherein the display control unit controls the second threshold according to the imaging magnification of the captured image.
- The display control device according to claim 1, wherein the display control unit performs control so that the captured image is displayed when the sharpness of the captured image is above a predetermined threshold, and so that the virtual object is displayed when the sharpness of the captured image is below the predetermined threshold.
- The display control device according to claim 1, wherein the display control unit performs control so that the range of imaging magnifications at which the virtual object can be displayed is displayed.
- The display control device according to claim 1, wherein the display control unit performs control so that a recommended imaging magnification associated with the virtual image is displayed.
- The display control device according to claim 11, wherein the display control unit performs control so that a recommended imaging magnification associated with each region of the virtual image is displayed.
- The display control device according to claim 1, wherein the display control unit performs control so that a virtual object that has undergone filter processing according to the captured image is displayed.
- The display control device according to claim 1, wherein the display control unit displays the virtual object so that the posture of the virtual object appears to change according to the imaging magnification of the captured image.
- A display control method including: acquiring a captured image in which an imaging range is captured; controlling a display unit so that a virtual object corresponding to an imaging target appearing in the captured image is displayed; and displaying the virtual object so that a ratio of an amount of movement of the virtual object in a virtual image to an amount of movement of the imaging range in real space changes according to an imaging magnification of the captured image.
- A computer-readable recording medium recording a program for causing a computer to function as a display control device comprising: an image acquisition unit that acquires a captured image in which an imaging range is captured; and a display control unit that controls a display unit so that a virtual object corresponding to an imaging target appearing in the captured image is displayed, wherein the display control unit displays the virtual object so that a ratio of an amount of movement of the virtual object in a virtual image to an amount of movement of the imaging range in real space changes according to an imaging magnification of the captured image.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2015508122A JP6304238B2 (ja) | 2013-03-29 | 2014-01-24 | 表示制御装置、表示制御方法および記録媒体 |
US14/778,047 US9992419B2 (en) | 2013-03-29 | 2014-01-24 | Display control apparatus for displaying a virtual object |
EP14774412.2A EP2981060A4 (en) | 2013-03-29 | 2014-01-24 | DISPLAY CONTROL DEVICE, DISPLAY CONTROL METHOD, AND RECORDING MEDIUM |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2013074221 | 2013-03-29 | ||
JP2013-074221 | 2013-03-29 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2014156257A1 true WO2014156257A1 (ja) | 2014-10-02 |
Family
ID=51623260
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2014/051514 WO2014156257A1 (ja) | 2013-03-29 | 2014-01-24 | 表示制御装置、表示制御方法および記録媒体 |
Country Status (4)
Country | Link |
---|---|
US (1) | US9992419B2 (ja) |
EP (1) | EP2981060A4 (ja) |
JP (1) | JP6304238B2 (ja) |
WO (1) | WO2014156257A1 (ja) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2021117799A1 (ja) * | 2019-12-11 | 2021-06-17 | 株式会社Cygames | 既知の画像を用いた画像撮影の安定化のための方法、プログラム、電子装置 |
Families Citing this family (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP6492571B2 (ja) * | 2014-11-20 | 2019-04-03 | 株式会社リコー | 情報処理システム、情報処理装置、画面表示方法及びプログラム |
US10244175B2 (en) * | 2015-03-09 | 2019-03-26 | Apple Inc. | Automatic cropping of video content |
US10509556B2 (en) * | 2017-05-02 | 2019-12-17 | Kyocera Document Solutions Inc. | Display device |
Citations (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH0378373A (ja) * | 1989-08-22 | 1991-04-03 | Fuji Photo Optical Co Ltd | テレビカメラ操作装置 |
JP2001281523A (ja) * | 2000-03-31 | 2001-10-10 | Fuji Photo Optical Co Ltd | 制御装置 |
JP2002051253A (ja) * | 2000-07-31 | 2002-02-15 | Fuji Photo Film Co Ltd | 電子撮像装置 |
JP2003264740A (ja) * | 2002-03-08 | 2003-09-19 | Cad Center:Kk | 展望鏡 |
JP2004128701A (ja) * | 2002-09-30 | 2004-04-22 | Fuji Photo Film Co Ltd | 撮影補助方法およびシステム |
JP2008301230A (ja) * | 2007-05-31 | 2008-12-11 | Olympus Imaging Corp | 撮像システム及び撮像装置 |
JP2009060339A (ja) * | 2007-08-31 | 2009-03-19 | Nikon Corp | 電子カメラ |
JP2012024772A (ja) | 2010-07-20 | 2012-02-09 | Amada Co Ltd | レーザ加工ヘッド |
JP2012165447A (ja) * | 2012-04-19 | 2012-08-30 | Olympus Corp | カメラ |
Family Cites Families (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5625765A (en) * | 1993-09-03 | 1997-04-29 | Criticom Corp. | Vision systems including devices and methods for combining images for extended magnification schemes |
RU2010123652A (ru) * | 2010-06-10 | 2011-12-20 | Корпорация "САМСУНГ ЭЛЕКТРОНИКС Ко., Лтд." (KR) | Система и способ визуализации стереоизображений и многовидовых изображений для управления восприятием глубины стереоскопического изображения, создаваемого телевизионным приемником |
US20120256906A1 (en) * | 2010-09-30 | 2012-10-11 | Trident Microsystems (Far East) Ltd. | System and method to render 3d images from a 2d source |
JP5762718B2 (ja) * | 2010-10-20 | 2015-08-12 | シャープ株式会社 | 画像形成装置 |
JP5960796B2 (ja) * | 2011-03-29 | 2016-08-02 | クアルコム,インコーポレイテッド | ローカルマルチユーザ共同作業のためのモジュール式のモバイル接続ピコプロジェクタ |
TWI518436B (zh) * | 2012-01-17 | 2016-01-21 | 明基電通股份有限公司 | 影像擷取裝置及影像處理方法 |
JP2013161390A (ja) | 2012-02-08 | 2013-08-19 | Sony Corp | サーバ、クライアント端末、システム、およびプログラム |
US9310611B2 (en) * | 2012-09-18 | 2016-04-12 | Qualcomm Incorporated | Methods and systems for making the use of head-mounted displays less obvious to non-users |
NL2010302C2 (en) * | 2013-02-14 | 2014-08-18 | Optelec Dev B V | A system for determining a recommended magnification factor for a magnifier such as a loupe or an electronic magnifier to be used by a person. |
US9213403B1 (en) * | 2013-03-27 | 2015-12-15 | Google Inc. | Methods to pan, zoom, crop, and proportionally move on a head mountable display |
2014
- 2014-01-24 EP EP14774412.2A patent/EP2981060A4/en not_active Ceased
- 2014-01-24 US US14/778,047 patent/US9992419B2/en active Active
- 2014-01-24 JP JP2015508122A patent/JP6304238B2/ja active Active
- 2014-01-24 WO PCT/JP2014/051514 patent/WO2014156257A1/ja active Application Filing
Patent Citations (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH0378373A (ja) * | 1989-08-22 | 1991-04-03 | Fuji Photo Optical Co Ltd | テレビカメラ操作装置 |
JP2001281523A (ja) * | 2000-03-31 | 2001-10-10 | Fuji Photo Optical Co Ltd | 制御装置 |
JP2002051253A (ja) * | 2000-07-31 | 2002-02-15 | Fuji Photo Film Co Ltd | 電子撮像装置 |
JP2003264740A (ja) * | 2002-03-08 | 2003-09-19 | Cad Center:Kk | 展望鏡 |
JP2004128701A (ja) * | 2002-09-30 | 2004-04-22 | Fuji Photo Film Co Ltd | 撮影補助方法およびシステム |
JP2008301230A (ja) * | 2007-05-31 | 2008-12-11 | Olympus Imaging Corp | 撮像システム及び撮像装置 |
JP2009060339A (ja) * | 2007-08-31 | 2009-03-19 | Nikon Corp | 電子カメラ |
JP2012024772A (ja) | 2010-07-20 | 2012-02-09 | Amada Co Ltd | レーザ加工ヘッド |
JP2012165447A (ja) * | 2012-04-19 | 2012-08-30 | Olympus Corp | カメラ |
Non-Patent Citations (1)
Title |
---|
See also references of EP2981060A4 |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2021117799A1 (ja) * | 2019-12-11 | 2021-06-17 | 株式会社Cygames | 既知の画像を用いた画像撮影の安定化のための方法、プログラム、電子装置 |
JP2021092993A (ja) * | 2019-12-11 | 2021-06-17 | 株式会社Cygames | 既知の画像を用いた画像撮影の安定化のための方法、プログラム、電子装置 |
Also Published As
Publication number | Publication date |
---|---|
EP2981060A1 (en) | 2016-02-03 |
JP6304238B2 (ja) | 2018-04-04 |
EP2981060A4 (en) | 2016-09-28 |
US9992419B2 (en) | 2018-06-05 |
JPWO2014156257A1 (ja) | 2017-02-16 |
US20160295117A1 (en) | 2016-10-06 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN109635621B (zh) | 用于第一人称视角中基于深度学习识别手势的系统和方法 | |
US11188187B2 (en) | Information processing apparatus, information processing method, and recording medium | |
US9811910B1 (en) | Cloud-based image improvement | |
JP5765019B2 (ja) | 表示制御装置、表示制御方法、およびプログラム | |
US9836886B2 (en) | Client terminal and server to determine an overhead view image | |
CN109076159B (zh) | 电子设备及其操作方法 | |
US9626076B2 (en) | Display apparatus for displaying images and method thereof | |
JP5776903B2 (ja) | 画像処理装置及び画像処理方法並びにプログラム | |
US20120275648A1 (en) | Imaging device and imaging method and program | |
US20090227283A1 (en) | Electronic device | |
US9269009B1 (en) | Using a front-facing camera to improve OCR with a rear-facing camera | |
US20180240213A1 (en) | Information processing system, information processing method, and program | |
US9854174B2 (en) | Shot image processing method and apparatus | |
CN108399349A (zh) | 图像识别方法及装置 | |
JP6304238B2 (ja) | 表示制御装置、表示制御方法および記録媒体 | |
US9628700B2 (en) | Imaging apparatus, imaging assist method, and non-transitory recoding medium storing an imaging assist program | |
US10623625B2 (en) | Focusing control device, imaging device, focusing control method, and nontransitory computer readable medium | |
WO2015068447A1 (ja) | 情報処理装置、情報処理方法および情報処理システム | |
JP6044633B2 (ja) | 情報処理装置、情報処理方法およびプログラム | |
JP2013004001A (ja) | 表示制御装置、表示制御方法、およびプログラム | |
WO2017126216A1 (ja) | 撮像制御装置、撮像制御方法及びコンピュータプログラム | |
JP2013080266A (ja) | 入力装置 | |
KR102605451B1 (ko) | 이미지 내에 포함된 복수의 외부 객체들 각각에 대응하는 복수의 서비스들을 제공하는 전자 장치 및 방법 | |
JP2013207356A (ja) | 情報処理装置、情報処理方法およびプログラム | |
JP6064995B2 (ja) | 情報処理装置、情報処理方法およびプログラム |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 14774412 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2015508122 Country of ref document: JP Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2014774412 Country of ref document: EP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 14778047 Country of ref document: US |
|
NENP | Non-entry into the national phase |
Ref country code: DE |