WO2016157951A1 - Display control device, display control method, and recording medium - Google Patents
Display control device, display control method, and recording medium
- Publication number: WO2016157951A1 (PCT/JP2016/051790)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- display control
- display area
- designated position
- control unit
- display
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/0485—Scrolling or panning
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/0346—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/013—Eye tracking input arrangements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/0304—Detection arrangements using opto-electronic means
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/04845—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range for image manipulation, e.g. dragging, rotation, expansion or change of colour
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G5/00—Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G5/00—Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
- G09G5/02—Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators characterised by the way in which colour is displayed
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G5/00—Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
- G09G5/14—Display of multiple viewports
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G5/00—Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
- G09G5/34—Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators for rolling or scrolling
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G5/00—Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
- G09G5/36—Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators characterised by the display of a graphic pattern, e.g. using an all-points-addressable [APA] memory
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/048—Indexing scheme relating to G06F3/048
- G06F2203/04803—Split screen, i.e. subdividing the display area or the window area into separate subareas
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/048—Indexing scheme relating to G06F3/048
- G06F2203/04806—Zoom, i.e. interaction techniques or interactors for controlling the zooming operation
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2356/00—Detection of the display position w.r.t. other display screens
Definitions
- The present disclosure relates to a display control device, a display control method, and a recording medium.
- There is known a technique in which an intersection between a user-designated vector and a display area is detected as a designated position, and display control is performed according to the designated position. For example, a technique is disclosed in which a user's hand region is detected from an image captured by a camera, a shadow portion is extracted from the hand region, a plurality of edges of the shadow portion are detected as line segments by a Hough transform, and the position of an intersection at which the detected line segments form an acute angle is detected as the position designated by the user (see, for example, Patent Document 1).
- According to the present disclosure, there is provided a display control device including: a detection unit that detects, as a designated position, an intersection of a user-designated vector and a plane including a display area; and a display control unit that performs display control based on the designated position, the display control unit performing predetermined display control when the designated position is outside the display area.
- According to the present disclosure, there is provided a display control method including: detecting, as a designated position, an intersection of a user-designated vector and a plane including a display area; and performing display control based on the designated position, the method including performing predetermined display control when the designated position is outside the display area.
- According to the present disclosure, there is provided a computer-readable recording medium recording a program for causing a computer to function as a display control device including: a detection unit that detects, as a designated position, an intersection of a user-designated vector and a plane including a display area; and a display control unit that performs display control based on the designated position, the display control unit performing predetermined display control when the designated position is outside the display area.
- FIG. 20 is a diagram illustrating a hardware configuration example of a display control device according to an embodiment of the present disclosure.
- In this specification and the drawings, a plurality of constituent elements having substantially the same functional configuration may be distinguished by appending different letters or numbers to the same reference numeral. However, when it is not necessary to distinguish each of such constituent elements, only the same reference numeral is given.
- 1. Embodiment
- 1-1. Overview of information processing system
- 1-2. Functional configuration example of display control device
- 1-3. Example of designated vector calculation
- 1-4. Calibration data
- 1-5. Examples of display control according to the designated position
- 1-6. Other examples of designated vector calculation
- 1-7. Hardware configuration example
- 2. Conclusion
- FIG. 1 is a diagram for describing an overview of an information processing system 10 according to an embodiment of the present disclosure.
- the information processing system 10 includes a display control device 100, a detection device 120, and a display device 170.
- the detection device 120 is a camera module that captures an image.
- the detection device 120 images an actual space using an image sensor such as a charge coupled device (CCD) or a complementary metal oxide semiconductor (CMOS) to generate an image.
- An image generated by the detection device 120 is output to the display control device 100.
- the detection device 120 is configured separately from the display control device 100, but the detection device 120 may be integrated with the display control device 100.
- the display device 170 displays various information in the display area 171 according to the control by the display control device 100.
- the display device 170 includes, for example, an LCD (Liquid Crystal Display), an organic EL (Electroluminescence) display device, or the like.
- the display device 170 is configured separately from the display control device 100, but the display device 170 may be integrated with the display control device 100.
- the display device 170 has a display area 171, and the display control apparatus 100 displays contents C1 to C6 in the display area 171.
- Here, a technique is known in which an intersection between a designated vector of the user U and the display area 171 is detected as a designated position, and display control is performed according to the designated position. For example, when the intersection of the designated vector and the display area 171 remains within any of the contents C1 to C6 for a predetermined time, the content in which the intersection has remained for the predetermined time may be enlarged, or a slide show of that content may be started.
- The display control device 100 may be applied to, for example, a PC (Personal Computer), a video camera, a digital camera, a PDA (Personal Digital Assistant), a tablet terminal, a smartphone, a mobile phone, a portable music player, a portable video processing device, a portable game device, a television device, digital signage, and the like.
- FIG. 2 is a diagram illustrating a functional configuration example of the display control apparatus 100 according to the embodiment of the present disclosure.
- the display control device 100 includes a control unit 110 and a storage unit 130.
- the control unit 110 corresponds to a processor such as a CPU (Central Processing Unit) or a DSP (Digital Signal Processor).
- The control unit 110 performs its various functions by executing a program stored in the storage unit 130 or another storage medium.
- the control unit 110 includes functional blocks such as a detection unit 111, a display control unit 112, and an execution unit 113. The functions of these functional blocks will be described later.
- the storage unit 130 stores a program for operating the control unit 110 using a storage medium such as a semiconductor memory or a hard disk.
- the storage unit 130 can also store various data (for example, images) used by the program.
- the storage unit 130 is integrated with the display control device 100, but the storage unit 130 may be configured separately from the display control device 100.
- FIG. 3 is a diagram illustrating an example in which the direction from the elbow to the wrist of the user U is given as a designated vector by the user U.
- For example, the detection unit 111 three-dimensionally recognizes the skeleton information of the user U from an image captured by the detection device 120. Then, when the positions of the elbow b1 and the wrist b2 are acquired from the three-dimensionally recognized skeleton information, the detection unit 111 can detect the direction from the elbow b1 to the wrist b2 as the designated vector v.
- FIG. 4 is a diagram showing an example in which the direction from the wrist of the user U to the fingertip is given as a designated vector by the user U.
- For example, the detection unit 111 three-dimensionally recognizes the skeleton information of the user U from an image captured by the detection device 120. Then, when the positions of the wrist b2 and the fingertip b3 are acquired from the three-dimensionally recognized skeleton information, the detection unit 111 can detect the direction from the wrist b2 to the fingertip b3 as the designated vector v.
- When the direction from the wrist b2 to the fingertip b3 is detected as the designated vector v, it is considered that the designated vector v can be calculated with higher accuracy than when the direction from the elbow b1 to the wrist b2 is detected as the designated vector v.
- Note that either the direction from the wrist b2 to the fingertip b3 or the direction from the elbow b1 to the wrist b2 may be used alone as the designated vector v, or an average of the direction from the wrist b2 to the fingertip b3 and the direction from the elbow b1 to the wrist b2 may be used as the designated vector v. A sketch of such a calculation follows.
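- As an illustrative sketch of how such a designated vector might be computed from skeleton joints, the following Python snippet normalizes the elbow-to-wrist and wrist-to-fingertip directions and averages them; the function name, joint representation, and equal weighting are assumptions for illustration, not details taken from this disclosure.

```python
import numpy as np

def designated_vector(elbow, wrist, fingertip=None):
    """Compute the designated vector v from 3D skeleton joints.

    Uses the elbow->wrist direction; if a fingertip position is also
    available, averages it with the wrist->fingertip direction.
    """
    v1 = wrist - elbow
    v1 = v1 / np.linalg.norm(v1)
    if fingertip is None:
        return v1
    v2 = fingertip - wrist
    v2 = v2 / np.linalg.norm(v2)
    v = v1 + v2                      # average of the two unit directions
    return v / np.linalg.norm(v)
```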
- the detecting unit 111 detects an intersection between the designated vector v and a plane including the display area 171 as a designated position.
- FIG. 5 is a diagram for explaining an example in which the intersection point between the designated vector v and the plane including the display area 171 is detected as the designated position.
- For example, let t be a scale variable, let p be the coordinate of the wrist b2, and let P be the projection matrix from the detection device 120 to the display device 170. The detection unit 111 can then calculate the intersection x of the designated vector v and the plane including the display area 171 by the following equation (1):

  x = P(p + tv) … (1)
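- In substance, this calculation is a standard ray-plane intersection of the ray p + tv with the plane containing the display area 171, followed by the projection P into display coordinates. The following Python sketch illustrates the intersection step under that reading; the plane parameterization (a point and a normal) and the numerical tolerance are illustrative assumptions rather than the patent's notation.

```python
import numpy as np

def intersect_display_plane(p, v, plane_point, plane_normal):
    """Intersect the designated ray p + t*v with the plane of the display area.

    Returns the 3D intersection point x, or None if the ray is parallel to
    the plane or points away from it.
    """
    denom = float(np.dot(plane_normal, v))
    if abs(denom) < 1e-9:            # ray parallel to the display plane
        return None
    t = float(np.dot(plane_normal, plane_point - p)) / denom
    if t < 0:                        # plane is behind the user's hand
        return None
    return p + t * v                 # x = p + t*v, to be mapped by P
```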
- When the relative positional relationship between the detection device 120 and the display device 170 is fixed (for example, when the detection device 120 is incorporated at a predetermined position of the display device 170), a projection matrix P determined in advance may be used. On the other hand, when the relative positional relationship between the detection device 120 and the display device 170 is not fixed (for example, when the display device 170 and the detection device 120 are installed separately, or when the detection device 120 is embedded in a projector), calibration may be executed by the display control device 100 (the projection matrix P may be calculated).
- For example, the user U may point in order at a total of five points, namely the four corners of the display area 171 and the center of the display area 171, and calibration may be performed based on the pointing by the user U toward those five points.
- an object displayed in the display area 171 may be read by a camera fixed to the detection device 120, and calibration may be performed based on the position of the read object.
- the calibration data once calculated can be used continuously as long as the positional relationship between the detection device 120 and the display device 170 does not change.
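- One way such calibration could be realized is to estimate a planar homography from correspondences between the five pointed-at targets and where the detected designated positions actually landed. The sketch below uses OpenCV's findHomography as the solver; the tooling, the 1920x1080 target coordinates, and the function names are illustrative assumptions, not details specified in this disclosure.

```python
import numpy as np
import cv2

# Screen targets pointed at in order: four corners, then the center.
TARGETS_PX = np.array([[0, 0], [1920, 0], [1920, 1080], [0, 1080], [960, 540]],
                      dtype=np.float32)

def calibrate(observed_xy):
    """Estimate a homography mapping detected designated positions
    (in detection-device coordinates) to display coordinates.

    observed_xy: (5, 2) array of detected positions, one per target.
    The result stays valid while the device/display geometry is unchanged.
    """
    H, _ = cv2.findHomography(np.asarray(observed_xy, dtype=np.float32),
                              TARGETS_PX)
    return H
```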
- the display control unit 112 performs display control based on the designated position.
- the display control unit 112 performs predetermined display control when the designated position is outside the display area 171. With this configuration, when the position designated by the user U is outside the display area 171, it can be fed back to the user U that the outside of the display area 171 has been designated.
- the predetermined display control is not particularly limited. Hereinafter, an example of predetermined display control will be described.
- FIG. 6 is a diagram illustrating a first display control example in a case where the designated position is outside the display area 171.
- the center of the display area 171 is shown as the center position Pc.
- For example, the display control unit 112 may display an object in the display area 171 when the designated position is outside the display area 171. In this way, it is possible to visually notify the user U that the outside of the display area 171 has been designated.
- In the example shown in FIG. 6, since the designated position Pt1, which is outside the display area 171, is designated by the user U, the display control unit 112 displays the object B1 in the display area 171.
- the object displayed in the display area 171 only needs to be visible to the user U, and the color, size, shape, and the like of the object are not particularly limited.
- the display position of the object is not particularly limited.
- the display control unit 112 may display the object at the end of the display area 171 when the specified position is outside the display area 171. Then, it becomes possible to inform the user U more intuitively that the outside of the display area 171 has been designated. In the example shown in FIG. 6, since the designated position Pt1 is outside the display area 171, the display control unit 112 displays the object B1 at the end of the display area 171.
- Further, when the designated position is outside the display area 171, the display control unit 112 may display the object at the intersection of the end of the display area 171 and a line segment connecting a predetermined position of the display area 171 (for example, the center position Pc) and the designated position. By doing so, it is possible to inform the user U in which direction the designated position lies with respect to the display area 171. In the example shown in FIG. 6, since the designated position Pt1 is outside the display area 171, the display control unit 112 displays the object B1 at the intersection T1 of the end of the display area 171 and the line segment connecting the predetermined position of the display area 171 (for example, the center position Pc) and the designated position Pt1.
- In the example shown in FIG. 6, the predetermined position of the display area 171 is the center position Pc, but the predetermined position of the display area 171 is not limited to the center position Pc.
- Note that the display control unit 112 may display the same object in the display area 171 regardless of the designated position, or may change the object according to the designated position. For example, the display control unit 112 may change the size of the object according to the designated position. Referring to FIG. 6, since the designated position Pt2 is also outside the display area 171, the display control unit 112 displays the object B2 at the intersection T2 of the end of the display area 171 and the line segment connecting the center position Pc of the display area 171 and the designated position Pt2. Here, because the distance D1 from the intersection T1 to the designated position Pt1 is smaller than the distance D2 from the intersection T2 to the designated position Pt2, the display control unit 112 makes the size of the object B1 larger than the size of the object B2.
- In this example, the display control unit 112 increases the size of the object as the distance from the intersection (between the end of the display area 171 and the line segment connecting the center position Pc of the display area 171 and the designated position) to the designated position decreases. However, how the size of the object is changed is not limited to this; the display control unit 112 may instead reduce the size of the object as the distance from the intersection to the designated position decreases. A sketch of this geometry follows.
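- The geometry of this first example can be sketched in a few lines: find where the segment from the center position Pc to the designated position leaves the display rectangle, then size the object inversely with the remaining distance. This is a minimal sketch, assuming a display area spanning [0, width] x [0, height]; the base size and falloff constants are illustrative.

```python
import math

def edge_intersection(center, target, width, height):
    """Point where the segment center->target crosses the display boundary
    (the intersection T1/T2 in FIG. 6)."""
    dx, dy = target[0] - center[0], target[1] - center[1]
    if dx == 0 and dy == 0:
        return center                # degenerate: target at the center
    candidates = []
    if dx > 0: candidates.append((width - center[0]) / dx)
    if dx < 0: candidates.append(-center[0] / dx)
    if dy > 0: candidates.append((height - center[1]) / dy)
    if dy < 0: candidates.append(-center[1] / dy)
    s = min(c for c in candidates if c > 0)
    return (center[0] + s * dx, center[1] + s * dy)

def object_size(edge_point, designated, base=64.0, falloff=0.01):
    """Smaller distance from the edge intersection to the designated
    position -> larger object, as in the FIG. 6 example."""
    d = math.dist(edge_point, designated)
    return base / (1.0 + falloff * d)
```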
- FIG. 7 is a diagram illustrating a second display control example in the case where the designated position is outside the display area 171.
- the display control unit 112 may change the shape of the object according to the designated position.
- Referring to FIG. 7, because the distance D1 from the intersection T1 to the designated position Pt1 is smaller than the distance D2 from the intersection T2 to the designated position Pt2, the display control unit 112 makes the degree of deformation of the object B1 smaller than that of the object B2 (the shape of the object B1 remains semicircular, as in the case where the designated position is inside the display area 171, while the shape of the object B2 is a semi-ellipse).
- In this example, the display control unit 112 reduces the degree of deformation of the object as the distance from the intersection (between the end of the display area 171 and the line segment connecting the center position Pc of the display area 171 and the designated position) to the designated position decreases. However, the display control unit 112 may instead increase the degree of deformation of the object as that distance decreases.
- FIG. 8 is a diagram illustrating a third display control example in the case where the designated position is outside the display area 171.
- In the examples described above, the display control unit 112 changes the size or shape of the object according to the designated position. Alternatively, the display control unit 112 may change the color of the object according to the designated position.
- Referring to FIG. 8, because the distance D1 from the intersection T1 to the designated position Pt1 is smaller than the distance D2 from the intersection T2 to the designated position Pt2, the display control unit 112 makes the color of the object B1 lighter than the color of the object B2.
- In this example, the display control unit 112 lightens the color of the object as the distance from the intersection (between the end of the display area 171 and the line segment connecting the center position Pc of the display area 171 and the designated position) to the designated position decreases. However, how the color of the object is changed is not limited; the display control unit 112 may instead darken the color of the object as that distance decreases, and the change in the color of the object need not be a change in color intensity.
- FIG. 9 is a flowchart showing an example of the flow of operations for displaying an object in the display area 171 when the designated position is outside the display area 171. Note that the operation of displaying an object in the display area 171 when the designated position is outside the display area 171 is not limited to the example shown in the flowchart of FIG.
- the detection unit 111 calculates a user-designated vector based on the image captured by the detection device 120 (S11). Subsequently, the detection unit 111 detects an intersection point between a user-designated vector and a plane including the display area 171 as a designated position (S12). Subsequently, the display control unit 112 determines whether or not the designated position is outside the display area 171 (S13). If the display control unit 112 determines that the designated position is not outside the display area 171 (S13: No), the display control unit 112 shifts the operation to S15. On the other hand, when it is determined that the designated position is outside the display area 171 (S13: Yes), the object is displayed in the display area 171 (S14), and the operation is shifted to S15.
- the execution unit 113 determines whether or not a predetermined operation has been performed by the user (S15). When it is determined that the user has not performed the predetermined operation (S15: No), the execution unit 113 ends the operation. On the other hand, if the execution unit 113 determines that a predetermined operation has been performed by the user (S15: Yes), the execution unit 113 executes a process corresponding to the designated position (S16) and ends the operation.
- the process corresponding to the designated position is not particularly limited, but as described above, the process may be a process of enlarging and displaying the content in which the intersection of the designated vector by the user U and the display area 171 exists over a predetermined time.
- FIG. 10 is a diagram illustrating a fourth display control example in the case where the designated position is outside the display area 171.
- For example, the display control unit 112 may correct the designated position when the designated position is outside the display area 171. With this configuration, even when the designated position is outside the display area 171, the operation can continue at the corrected designated position.
- For example, as shown in FIG. 10, when the designated position Pd is outside the display area 171, the display control unit 112 may correct the designated position Pd to the intersection Pe between the line segment connecting the center position Pc and the designated position Pd and the end of the display area 171. If the designated position Pd is corrected to the intersection Pe, the user can easily understand where the corrected designated position is.
- a correction area Ar may be provided around the display area 171. The display control unit 112 may correct the designated position Pd when the designated position Pd is inside the correction area Ar.
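- A minimal sketch of this correction, reusing the edge_intersection helper sketched earlier: positions inside the display area pass through unchanged, positions inside the correction area Ar are pulled back to the edge intersection Pe, and positions beyond Ar are treated as no designation. The margin parameter standing in for the extent of Ar is an illustrative assumption.

```python
def correct_position(pd, center, width, height, margin=100.0):
    """Correct a designated position Pd that falls outside the display area
    but inside the correction area Ar (cf. FIG. 10)."""
    x, y = pd
    inside = 0 <= x <= width and 0 <= y <= height
    in_ar = -margin <= x <= width + margin and -margin <= y <= height + margin
    if inside:
        return pd                                        # nothing to correct
    if not in_ar:
        return None                                      # outside Ar: ignore
    return edge_intersection(center, pd, width, height)  # corrected to Pe
```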
- FIG. 11 is a flowchart showing an example of the flow of operations for correcting the designated position when the designated position is outside the display area 171. Note that the operation of correcting the designated position when the designated position is outside the display area 171 is not limited to the example shown in the flowchart of FIG.
- the detection unit 111 calculates a user-designated vector based on the image captured by the detection device 120 (S11). Subsequently, the detection unit 111 detects an intersection point between a user-designated vector and a plane including the display area 171 as a designated position (S12). Subsequently, the display control unit 112 determines whether or not the designated position is outside the display area 171 (S13). If the display control unit 112 determines that the designated position is not outside the display area 171 (S13: No), the display control unit 112 shifts the operation to S15. On the other hand, when it is determined that the designated position is outside the display area 171 (S13: Yes), the designated position is corrected (S21), and the operation is shifted to S15.
- the execution unit 113 determines whether or not a predetermined operation has been performed by the user (S15). When it is determined that the user has not performed the predetermined operation (S15: No), the execution unit 113 ends the operation. On the other hand, if the execution unit 113 determines that a predetermined operation has been performed by the user (S15: Yes), the execution unit 113 executes a process corresponding to the designated position (S16) and ends the operation.
- the process corresponding to the designated position is not particularly limited, but as described above, the process may be a process of enlarging and displaying the content in which the intersection of the designated vector by the user U and the display area 171 exists over a predetermined time.
- FIG. 12 is a diagram illustrating a fifth display control example in the case where the designated position is outside the display area 171.
- the display control unit 112 may scroll the content of the display area 171 based on the designated position. With this configuration, it is possible to increase the scrollable amount of content.
- The scroll direction of the content is not limited; for example, as illustrated in FIG. 12, when the designated position Pt is outside the display area 171, the display control unit 112 may scroll the content based on the direction of the designated position Pt with reference to the center position Pc. If the content is scrolled in this direction, the user can intuitively specify the scroll direction of the content.
- FIG. 12 shows an example in which map data is the content, but the content may be other than map data. For example, the content may be photo data (which may be displayed by a photo viewer).
- The content scroll speed is also not limited; for example, when the designated position Pt is outside the display area 171, the display control unit 112 may scroll the content at a speed corresponding to the distance D between the designated position Pt and a reference position of the display area 171 (for example, the intersection of the end of the display area 171 and the line segment connecting the center position Pc of the display area 171 and the designated position Pt). If the content is scrolled at such a speed, the user can intuitively specify the scroll speed of the content.
- More specifically, the display control unit 112 may scroll the content at a higher speed as the distance D between the reference position of the display area 171 and the designated position increases.
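- Putting the direction and speed rules of this example together, a per-frame scroll velocity can be derived as below, again reusing the edge_intersection helper sketched earlier; the gain constant and the pixels-per-frame unit are illustrative assumptions.

```python
import math

def scroll_velocity(pt, center, width, height, gain=0.5):
    """Scroll in the direction of Pt as seen from Pc, at a speed
    proportional to the distance D from the edge reference point to Pt."""
    ref = edge_intersection(center, pt, width, height)
    d = math.dist(pt, ref)                    # the distance D
    dx, dy = pt[0] - center[0], pt[1] - center[1]
    norm = math.hypot(dx, dy) or 1.0
    speed = gain * d                          # larger D -> faster scroll
    return (speed * dx / norm, speed * dy / norm)
```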
- In the above, the example has been described in which the display control unit 112 scrolls the content of the display area 171 based on the designated position when the designated position is outside the display area 171. Alternatively, the display control unit 112 may switch the content of the display area 171 based on the designated position. For example, the display control unit 112 may switch the content based on the direction of the designated position Pt with respect to the center position Pc when the designated position Pt is outside the display area 171.
- The content switching speed is likewise not limited; for example, when the designated position Pt is outside the display area 171, the display control unit 112 may switch the content at a speed corresponding to the distance D between the designated position Pt and a reference position of the display area 171 (for example, the intersection of the end of the display area 171 and the line segment connecting the center position Pc of the display area 171 and the designated position Pt).
- FIG. 13 is a flowchart illustrating an example of the flow of operations for scrolling content based on a specified position when the specified position is outside the display area 171. Note that the operation of scrolling the content based on the designated position when the designated position is outside the display area 171 is not limited to the example shown in the flowchart of FIG.
- the detection unit 111 calculates a user-designated vector based on the image captured by the detection device 120 (S11). Subsequently, the detection unit 111 detects an intersection point between a user-designated vector and a plane including the display area 171 as a designated position (S12). Subsequently, the display control unit 112 determines whether or not the designated position is outside the display area 171 (S13). If the display control unit 112 determines that the designated position is not outside the display area 171 (S13: No), the operation is terminated. On the other hand, when it is determined that the designated position is outside the display area 171 (S13: Yes), the content is scrolled based on the designated position (S31), and the operation is terminated.
- FIG. 14 is a diagram illustrating a sixth display control example in the case where the designated position is outside the display area 171.
- For example, the display control unit 112 may perform a drag operation based on the movement of the designated position when the designated position is moved from the inside to the outside of the display area 171. With this configuration, the range of the drag operation can be expanded. The direction and magnitude of the drag operation are not limited; for example, as shown in FIG. 14, when the designated position has been moved from the position Pt1 inside the display area 171 to the position Pt2 outside it, the display control unit 112 may perform the drag operation according to the direction and magnitude of the movement from the position Pt1 to the position Pt2. If the drag operation is performed according to this direction and magnitude, the user can perform the drag operation intuitively.
- FIG. 15 is a diagram illustrating a seventh display control example when the designated position is outside the display area 171.
- For example, the display control unit 112 may perform a pinch-out operation based on the movement of each of a plurality of designated positions when each of the plurality of designated positions is moved from the inside to the outside of the display area 171. With this configuration, the range of the pinch-out operation can be expanded. The process executed by the pinch-out operation may be, for example, enlargement of content. For example, as shown in FIG. 15, when one designated position is moved from the position Pt1 inside the display area 171 to the position Pt2 outside it and another designated position is moved from the position Pt3 inside the display area 171 to the position Pt4 outside it, the display control unit 112 may enlarge the content displayed in the display area 171. If the content is enlarged by such a pinch-out operation, the user can enlarge the content intuitively.
- FIG. 16 is a diagram illustrating an eighth display control example in the case where the designated position is outside the display area 171.
- For example, the display control unit 112 may perform a pinch-in operation based on the movement of each of a plurality of designated positions when each of the plurality of designated positions is moved from the outside to the inside of the display area 171. With this configuration, the range of the pinch-in operation can be expanded. The process executed by the pinch-in operation may be, for example, reduction of content. For example, as shown in FIG. 16, when one designated position is moved from the position Pt1 outside the display area 171 to the position Pt2 inside it and another designated position is moved from the position Pt3 outside the display area 171 to the position Pt4 inside it, the display control unit 112 may reduce the content displayed in the display area 171. If the content is reduced by such a pinch-in operation, the user can reduce the content intuitively.
- FIG. 17 is a flowchart illustrating an example of a flow of operations for performing processing on content based on movement of a designated position. Note that the operation for processing the content based on the movement of the designated position is not limited to the example shown in the flowchart of FIG.
- the detection unit 111 calculates a user-designated vector based on the image captured by the detection device 120 (S11). Subsequently, the detection unit 111 detects an intersection point between a user-designated vector and a plane including the display area 171 as a designated position (S12). Subsequently, the display control unit 112 determines whether or not the designated position has been moved from the inside to the outside of the display area 171 (S40).
- When the display control unit 112 determines that the designated position has been moved from the inside to the outside of the display area 171 (S40: Yes), the display control unit 112 performs a drag operation (S41) and ends the operation. On the other hand, when the display control unit 112 determines that the designated position has not been moved from the inside to the outside of the display area 171 (S40: No), the display control unit 112 determines whether each of a plurality of designated positions has been moved from the outside to the inside of the display area 171 (S42). When the display control unit 112 determines that each of the plurality of designated positions has been moved from the outside to the inside of the display area 171 (S42: Yes), it performs a pinch-in operation (S43) and ends the operation. When the display control unit 112 determines that each of the plurality of designated positions has not been moved from the outside to the inside of the display area 171 (S42: No), it determines whether each of the plurality of designated positions has been moved from the inside to the outside of the display area 171 (S44). When the display control unit 112 determines that each of the plurality of designated positions has been moved from the inside to the outside of the display area 171 (S44: Yes), it performs a pinch-out operation (S45) and ends the operation. On the other hand, when it is determined that the plurality of designated positions have not been moved from the inside to the outside of the display area 171 (S44: No), the display control unit 112 ends the operation.
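- The S40-S45 decision flow above can be expressed compactly as a classifier over the tracked designated positions. The sketch below is one reading of that flow, assuming each designated position is reported as a (start, end) pair and that an inside() predicate for the display area is available; the names are illustrative.

```python
from enum import Enum, auto

class Gesture(Enum):
    DRAG = auto()
    PINCH_IN = auto()
    PINCH_OUT = auto()
    NONE = auto()

def classify(moves, inside):
    """moves: list of (start, end) points, one per designated position."""
    in_to_out = [inside(s) and not inside(e) for s, e in moves]
    out_to_in = [not inside(s) and inside(e) for s, e in moves]
    if len(moves) == 1 and in_to_out[0]:
        return Gesture.DRAG              # S40 -> S41
    if len(moves) > 1 and all(out_to_in):
        return Gesture.PINCH_IN          # S42 -> S43
    if len(moves) > 1 and all(in_to_out):
        return Gesture.PINCH_OUT         # S44 -> S45
    return Gesture.NONE
```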
- the designation vector by the user U may be given in any way.
- For example, the designated vector may be given to the detection unit 111 based on sensor data detected by a sensor.
- FIG. 18 is a diagram for explaining an example in which a designated vector is given based on sensor data detected by a sensor. As shown in FIG. 18, the user U can operate the sensor R.
- the detection unit 111 calculates a designated vector based on the sensor data detected by the sensor R, and detects an intersection between the designated vector and a plane including the display area 171 as a designated position.
- the sensor data detected by the sensor R may be a movement of the sensor R.
- the sensor data may be an acceleration detected by an acceleration sensor or an angular velocity detected by a gyro sensor.
- a method disclosed in International Publication No. 2009/008372 can be adopted as a method for calculating the intersection between the designated vector and the display area 171.
- the designated vector may be given by the user's line of sight.
- FIG. 19 is a diagram for explaining an example in which a designated vector is given by a user's line of sight. As shown in FIG. 19, the user U can send a line of sight to the display area 171.
- the detection unit 111 detects the line of sight of the user U.
- the method for detecting the line of sight of the user U is not particularly limited.
- the detection unit 111 may detect the line of sight of the user U based on an imaging result obtained by imaging the eye area of the user U.
- For example, when an infrared camera is used as the imaging device, an infrared irradiation device that irradiates the eye region of the user U with infrared rays may be provided. In that case, the infrared rays reflected by the eye region of the user U can be captured by the infrared camera and used for line-of-sight detection.
- Alternatively, when the user U wears an HMD (Head Mounted Display), the detection unit 111 may detect the line of sight of the user U based on the orientation of the HMD.
- the detection unit 111 may detect the line of sight of the user U based on the myoelectricity detected by the myoelectric sensor. Then, the detection unit 111 calculates the line of sight of the user U as a designated vector, and detects an intersection between the designated vector and a plane including the display area 171 as a designated position.
- FIG. 20 is a diagram illustrating a hardware configuration example of the display control apparatus 100 according to the embodiment of the present disclosure.
- the hardware configuration example illustrated in FIG. 20 is merely an example of the hardware configuration of the display control apparatus 100. Therefore, the hardware configuration of the display control apparatus 100 is not limited to the example illustrated in FIG.
- As illustrated in FIG. 20, the display control device 100 includes a CPU (Central Processing Unit) 801, a ROM (Read Only Memory) 802, a RAM (Random Access Memory) 803, an input device 808, an output device 810, a storage device 811, a drive 812, an imaging device 813, and a communication device 815.
- the CPU 801 functions as an arithmetic processing device and a control device, and controls the overall operation in the display control device 100 according to various programs. Further, the CPU 801 may be a microprocessor.
- the ROM 802 stores programs used by the CPU 801, calculation parameters, and the like.
- the RAM 803 temporarily stores programs used in the execution of the CPU 801, parameters that change as appropriate during the execution, and the like. These are connected to each other by a host bus including a CPU bus.
- The input device 808 includes an input unit for the user to input information, such as a mouse, a keyboard, a touch panel, a button, a microphone, a switch, or a lever, and an input control circuit that generates an input signal based on the user's input and outputs the input signal to the CPU 801.
- the user of the display control device 100 can input various data and instruct processing operations to the display control device 100 by operating the input device 808.
- the output device 810 includes a display device such as a liquid crystal display (LCD) device, an OLED (Organic Light Emitting Diode) device, and a lamp. Furthermore, the output device 810 includes an audio output device such as a speaker and headphones. For example, the display device displays a captured image or a generated image. On the other hand, the audio output device converts audio data or the like into audio and outputs it.
- the storage device 811 is a data storage device configured as an example of a storage unit of the display control device 100.
- the storage device 811 may include a storage medium, a recording device that records data on the storage medium, a reading device that reads data from the storage medium, a deletion device that deletes data recorded on the storage medium, and the like.
- the storage device 811 stores programs executed by the CPU 801 and various data.
- the drive 812 is a reader / writer for a storage medium, and is built in or externally attached to the display control device 100.
- the drive 812 reads out information recorded on a removable storage medium such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory, and outputs the information to the RAM 803.
- the drive 812 can also write information in a removable storage medium.
- the imaging device 813 includes an imaging optical system such as a photographing lens and a zoom lens that collects light, and a signal conversion element such as a CCD (Charge Coupled Device) or a CMOS (Complementary Metal Oxide Semiconductor).
- the imaging optical system collects light emitted from the subject and forms a subject image in the signal conversion unit, and the signal conversion element converts the formed subject image into an electrical image signal.
- the communication device 815 is a communication interface configured with, for example, a communication device for connecting to a network.
- the communication device 815 may be a wireless LAN (Local Area Network) compatible communication device, an LTE (Long Term Evolution) compatible communication device, or a wire communication device that performs wired communication.
- the communication device 815 can communicate with other devices via a network, for example.
- As described above, according to the embodiment of the present disclosure, there is provided the display control device 100 including the detection unit 111 that detects, as the designated position, the intersection of the designated vector of the user U and the plane including the display area 171, and the display control unit 112 that performs display control based on the designated position, the display control unit 112 performing predetermined display control when the designated position is outside the display area 171. With such a configuration, when the position designated by the user is outside the display area 171, it is possible to feed back to the user that the outside of the display area 171 has been designated.
- For example, the display control unit 112 may suppress vibration of the object by filtering the object displayed in the display area 171. Alternatively, by adding an afterimage to the object displayed in the display area 171, it is possible to make the user feel as if the vibration of the object were suppressed.
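- One simple way to realize such filtering is an exponential moving average over the object's displayed position, which damps frame-to-frame jitter at the cost of a little lag. This is a minimal sketch assuming per-frame updates; the smoothing constant is an illustrative choice.

```python
class SmoothedPosition:
    """Low-pass (exponential moving average) filter for an object's
    on-screen position."""

    def __init__(self, alpha=0.2):
        self.alpha = alpha               # smaller alpha -> stronger smoothing
        self.value = None

    def update(self, raw_xy):
        if self.value is None:
            self.value = raw_xy
        else:
            a = self.alpha
            self.value = (self.value[0] + a * (raw_xy[0] - self.value[0]),
                          self.value[1] + a * (raw_xy[1] - self.value[1]))
        return self.value
```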
- the operation of the display control apparatus 100 does not necessarily have to be performed in time series in the order described in the flowchart.
- the operations of the display control apparatus 100 may be performed in an order different from the order described in the flowchart, or at least a part of the operations described in the flowchart may be performed in parallel.
- (1) A display control device including: a detection unit that detects, as a designated position, an intersection of a designated vector of a user and a plane including a display area; and a display control unit that performs display control based on the designated position, wherein the display control unit performs predetermined display control when the designated position is outside the display area.
- (2) The display control device according to (1), wherein the display control unit displays a predetermined object in the display area when the designated position is outside the display area.
- (3) The display control device according to (2), wherein the display control unit displays the predetermined object at an end of the display area when the designated position is outside the display area.
- (4) The display control device according to (3), wherein the display control unit displays the predetermined object at an intersection of the end of the display area and a line segment connecting a predetermined position of the display area and the designated position when the designated position is outside the display area.
- (5) The display control device according to any one of (2) to (4), wherein the display control unit changes the predetermined object according to the designated position.
- (6) The display control device according to (5), wherein the display control unit changes a size of the predetermined object according to the designated position.
- (7) The display control device according to (5), wherein the display control unit changes a shape of the predetermined object according to the designated position.
- (8) The display control device according to (5), wherein the display control unit changes a color of the predetermined object according to the designated position.
- (9) The display control device according to (1), wherein the display control unit scrolls content of the display area based on the designated position when the designated position is outside the display area.
- (10) The display control device according to (9), wherein the display control unit scrolls the content based on a direction of the designated position with reference to a predetermined position of the display area when the designated position is outside the display area.
- (11) The display control device according to (9), wherein the display control unit scrolls the content at a speed corresponding to a distance between a predetermined position of the display area and the designated position when the designated position is outside the display area.
- (12) The display control device according to (1), wherein the display control unit switches content of the display area based on the designated position when the designated position is outside the display area.
- (13) The display control device according to (1), wherein the display control unit performs a drag operation based on movement of the designated position when the designated position is moved from the inside to the outside of the display area.
- (14) The display control device according to (1), wherein the display control unit performs a pinch-in operation or a pinch-out operation based on movement of each of a plurality of designated positions when each of the plurality of designated positions is moved between the inside and the outside of the display area.
- (15) The display control device according to (1), wherein the display control unit corrects the designated position when the designated position is outside the display area.
- (16) The display control device according to (15), wherein the display control unit corrects the designated position to an intersection of a line segment connecting a predetermined position of the display area and the designated position and an end of the display area when the designated position is outside the display area.
- (17) A display control method including: detecting, as a designated position, an intersection of a designated vector of a user and a plane including a display area; and performing display control based on the designated position, the method including performing predetermined display control when the designated position is outside the display area.
- (18) A computer-readable recording medium recording a program for causing a computer to function as a display control device including: a detection unit that detects, as a designated position, an intersection of a designated vector of a user and a plane including a display area; and a display control unit that performs display control based on the designated position, wherein the display control unit performs predetermined display control when the designated position is outside the display area.
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Human Computer Interaction (AREA)
- Computer Hardware Design (AREA)
- Controls And Circuits For Display Device (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
Description
100 Display control device
110 Control unit
111 Detection unit
112 Display control unit
113 Execution unit
120 Detection device
130 Storage unit
170 Display device
171 Display area
Ar Correction area
B1, B2 Objects
C1 to C6 Content
D (D1, D2) Distance
Pc Center position
Pd Designated position
Pe Intersection
Pt1 to Pt4 Designated positions
T1 Intersection
T2 Intersection
U User
v Designated vector
x Intersection
Claims (18)
- A display control device including:
a detection unit configured to detect, as a designated position, an intersection of a designation vector specified by a user with a plane containing a display area; and
a display control unit configured to perform display control based on the designated position,
wherein the display control unit performs predetermined display control in a case where the designated position is outside the display area.
- The display control device according to claim 1, wherein the display control unit causes a predetermined object to be displayed in the display area in a case where the designated position is outside the display area.
- The display control device according to claim 2, wherein the display control unit causes the predetermined object to be displayed at an edge portion of the display area in a case where the designated position is outside the display area.
- The display control device according to claim 3, wherein the display control unit causes the predetermined object to be displayed at an intersection of the edge portion of the display area and a line segment connecting a predetermined position in the display area and the designated position, in a case where the designated position is outside the display area.
- The display control device according to claim 2, wherein the display control unit changes the predetermined object in accordance with the designated position.
- The display control device according to claim 5, wherein the display control unit changes a size of the predetermined object in accordance with the designated position.
- The display control device according to claim 5, wherein the display control unit changes a shape of the predetermined object in accordance with the designated position.
- The display control device according to claim 5, wherein the display control unit changes a color of the predetermined object in accordance with the designated position.
- The display control device according to claim 1, wherein the display control unit scrolls content in the display area based on the designated position in a case where the designated position is outside the display area.
- The display control device according to claim 9, wherein the display control unit scrolls the content based on a direction of the designated position relative to a predetermined position in the display area, in a case where the designated position is outside the display area.
- The display control device according to claim 9, wherein the display control unit scrolls the content at a speed corresponding to a distance between a predetermined position in the display area and the designated position, in a case where the designated position is outside the display area.
- The display control device according to claim 1, wherein the display control unit switches content in the display area based on the designated position in a case where the designated position is outside the display area.
- The display control device according to claim 1, wherein the display control unit performs a drag operation based on movement of the designated position in a case where the designated position has been moved from the inside to the outside of the display area.
- The display control device according to claim 1, wherein the display control unit performs a pinch-in operation or a pinch-out operation based on respective movements of a plurality of designated positions in a case where each of the plurality of designated positions has been moved between the inside and the outside of the display area.
- The display control device according to claim 1, wherein the display control unit corrects the designated position in a case where the designated position is outside the display area.
- The display control device according to claim 15, wherein the display control unit corrects the designated position to an intersection of a line segment connecting a predetermined position in the display area and the designated position with the edge portion of the display area, in a case where the designated position is outside the display area.
- A display control method including:
detecting, as a designated position, an intersection of a designation vector specified by a user with a plane containing a display area; and
performing display control based on the designated position,
the method further including performing predetermined display control in a case where the designated position is outside the display area.
- A computer-readable recording medium having recorded thereon a program for causing a computer to function as a display control device including:
a detection unit configured to detect, as a designated position, an intersection of a designation vector specified by a user with a plane containing a display area; and
a display control unit configured to perform display control based on the designated position,
wherein the display control unit performs predetermined display control in a case where the designated position is outside the display area.
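Claims 13 and 14 describe gesture interpretation when designated positions cross the display boundary: a single designated position moving from the inside to the outside of the display area continues as a drag, and two designated positions moving between the inside and the outside drive a pinch-in or pinch-out operation according to their changing separation. The classifier below is a speculative sketch of that logic; the event names and return values are assumptions, not part of the publication.

```python
import math

def classify_gesture(prev_pts, curr_pts, inside):
    """prev_pts / curr_pts: lists of (x, y); inside(p) -> True if p is in the display area."""
    if len(curr_pts) == 1:
        p0, p1 = prev_pts[0], curr_pts[0]
        if inside(p0) and not inside(p1):
            # Designated position left the display area: continue as a drag.
            return ("drag", (p1[0] - p0[0], p1[1] - p0[1]))
    elif len(curr_pts) == 2:
        d_prev = math.dist(prev_pts[0], prev_pts[1])
        d_curr = math.dist(curr_pts[0], curr_pts[1])
        if d_prev > 0.0 and d_curr != d_prev:
            # Separation grew: pinch-out; separation shrank: pinch-in.
            return ("pinch-out" if d_curr > d_prev else "pinch-in", d_curr / d_prev)
    return (None, None)
```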
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US15/559,873 US20180059811A1 (en) | 2015-03-31 | 2016-01-22 | Display control device, display control method, and recording medium |
JP2017509318A JPWO2016157951A1 (ja) | 2015-03-31 | 2016-01-22 | Display control device, display control method, and recording medium |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2015-073743 | 2015-03-31 | ||
JP2015073743 | 2015-03-31 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2016157951A1 true WO2016157951A1 (ja) | 2016-10-06 |
Family
ID=57004121
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2016/051790 WO2016157951A1 (ja) | 2015-03-31 | 2016-01-22 | 表示制御装置、表示制御方法および記録媒体 |
Country Status (3)
Country | Link |
---|---|
US (1) | US20180059811A1 (ja) |
JP (1) | JPWO2016157951A1 (ja) |
WO (1) | WO2016157951A1 (ja) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2020509439A (ja) * | 2017-03-20 | 2020-03-26 | Google LLC | Stabilization of held objects in virtual reality |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP7247519B2 (ja) * | 2018-10-30 | 2023-03-29 | Seiko Epson Corporation | Display device and control method for display device |
CN115004132A (zh) * | 2020-01-29 | 2022-09-02 | Sony Group Corporation | Information processing device, information processing system, and information processing method |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2009072504A1 (ja) * | 2007-12-07 | 2009-06-11 | Sony Corporation | Control device, input device, control system, control method, and handheld device |
JP2010282408A (ja) * | 2009-06-04 | 2010-12-16 | Sony Corp | Control device, input device, control system, handheld device, and control method |
US20140267142A1 (en) * | 2013-03-15 | 2014-09-18 | Qualcomm Incorporated | Extending interactive inputs via sensor fusion |
JP2015028690A (ja) * | 2013-07-30 | 2015-02-12 | Sony Corporation | Information processing device, information processing method, and program |
Family Cites Families (20)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7046242B2 (en) * | 2000-06-05 | 2006-05-16 | Namco Ltd. | Game system, program and image generating method |
US7487462B2 (en) * | 2002-02-21 | 2009-02-03 | Xerox Corporation | Methods and systems for indicating invisible contents of workspace |
US6892352B1 (en) * | 2002-05-31 | 2005-05-10 | Robert T. Myers | Computer-based method for conveying interrelated textual narrative and image information |
US8225224B1 (en) * | 2003-02-25 | 2012-07-17 | Microsoft Corporation | Computer desktop use via scaling of displayed objects with shifts to the periphery |
US7415352B2 (en) * | 2005-05-20 | 2008-08-19 | Bose Corporation | Displaying vehicle information |
US8037421B2 (en) * | 2005-10-11 | 2011-10-11 | Research In Motion Limited | System and method for organizing application indicators on an electronic device |
KR20110020642A (ko) * | 2009-08-24 | 2011-03-03 | Samsung Electronics Co., Ltd. | Apparatus and method for providing a GUI that recognizes and responds to user approach |
JP5372091B2 (ja) * | 2011-09-15 | 2013-12-18 | Wacom Co., Ltd. | Electronic device and display screen control method for electronic device |
WO2013190538A1 (en) * | 2012-06-20 | 2013-12-27 | Pointgrab Ltd. | Method for touchless control of a device |
US20140046923A1 (en) * | 2012-08-10 | 2014-02-13 | Microsoft Corporation | Generating queries based upon data points in a spreadsheet application |
JP6036065B2 (ja) * | 2012-09-14 | 2016-11-30 | Fujitsu Limited | Gaze position detection device and gaze position detection method |
US8589818B1 (en) * | 2013-01-03 | 2013-11-19 | Google Inc. | Moveable viewport for indicating off-screen content |
US9405982B2 (en) * | 2013-01-18 | 2016-08-02 | GM Global Technology Operations LLC | Driver gaze detection system |
US9448979B2 (en) * | 2013-04-10 | 2016-09-20 | International Business Machines Corporation | Managing a display of results of a keyword search on a web page by modifying attributes of DOM tree structure |
US20140375541A1 (en) * | 2013-06-25 | 2014-12-25 | David Nister | Eye tracking via depth camera |
US10281987B1 (en) * | 2013-08-09 | 2019-05-07 | Leap Motion, Inc. | Systems and methods of free-space gestural interaction |
US9880711B2 (en) * | 2014-01-22 | 2018-01-30 | Google Llc | Adaptive alert duration |
US9665240B2 (en) * | 2014-01-27 | 2017-05-30 | Groupon, Inc. | Learning user interface having dynamic icons with a first and second visual bias |
US10339141B2 (en) * | 2014-10-03 | 2019-07-02 | The Regents Of The University Of Michigan | Detecting at least one predetermined pattern in stream of symbols |
US10222927B2 (en) * | 2014-10-24 | 2019-03-05 | Microsoft Technology Licensing, Llc | Screen magnification with off-screen indication |
2016
- 2016-01-22 WO PCT/JP2016/051790 patent/WO2016157951A1/ja active Application Filing
- 2016-01-22 US US15/559,873 patent/US20180059811A1/en not_active Abandoned
- 2016-01-22 JP JP2017509318A patent/JPWO2016157951A1/ja active Pending
Also Published As
Publication number | Publication date |
---|---|
JPWO2016157951A1 (ja) | 2018-01-25 |
US20180059811A1 (en) | 2018-03-01 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20230245262A1 (en) | | Display device, computer program, and computer-implemented method |
JP5869177B1 (ja) | | Virtual reality space image display method and program |
WO2014188798A1 (ja) | | Display control device, display control method, and recording medium |
JP7005161B2 (ja) | | Electronic device and control method therefor |
US20110019239A1 (en) | | Image Reproducing Apparatus And Image Sensing Apparatus |
EP3154255A1 (en) | | Imaging device and video generation method by imaging device |
JP2014127184A (ja) | | Information processing device and display control method |
JP6255838B2 (ja) | | Display device, display control method, and program |
JP7154789B2 (ja) | | Display control device, control method therefor, program, and storage medium |
US20150063785A1 (en) | | Method of overlappingly displaying visual object on video, storage medium, and electronic device |
JP6145738B2 (ja) | | Display device and computer program |
WO2015136952A1 (ja) | | Gesture recognition device, head-mounted display, and mobile terminal |
JP5220157B2 (ja) | | Information processing device, control method therefor, program, and storage medium |
WO2016157951A1 (ja) | | Display control device, display control method, and recording medium |
JP2016224173A (ja) | | Control device and control method |
JP5919570B2 (ja) | | Image display device and image display method |
JP2017059196A (ja) | | Virtual reality space image display method and program |
KR102278229B1 (ko) | | Electronic device and control method therefor |
JP7005160B2 (ja) | | Electronic device and control method therefor |
JP5957619B2 (ja) | | Method and device for sensing a shape change of a device |
JP6614516B2 (ja) | | Display device and computer program |
JP5170300B1 (ja) | | Information processing terminal, information processing method, and program |
WO2019102885A1 (ja) | | Electronic device capable of changing the displayed portion of an image |
JP6344670B2 (ja) | | Display device and computer program |
KR102070925B1 (ko) | | Pan, tilt, and zoom control device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 16771837; Country of ref document: EP; Kind code of ref document: A1 |
ENP | Entry into the national phase | Ref document number: 2017509318; Country of ref document: JP; Kind code of ref document: A |
WWE | Wipo information: entry into national phase | Ref document number: 15559873; Country of ref document: US |
NENP | Non-entry into the national phase | Ref country code: DE |
122 | Ep: pct application non-entry in european phase | Ref document number: 16771837; Country of ref document: EP; Kind code of ref document: A1 |