WO2017195299A1 - Simulation system - Google Patents

Simulation system

Info

Publication number
WO2017195299A1
WO2017195299A1 (PCT/JP2016/064021)
Authority
WO
WIPO (PCT)
Prior art keywords
pointer
size
article
unit
user
Prior art date
Application number
PCT/JP2016/064021
Other languages
French (fr)
Japanese (ja)
Inventor
幸宏 陽奥
鈴木 達也
遠藤 康浩
Original Assignee
富士通株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 富士通株式会社
Priority to PCT/JP2016/064021
Publication of WO2017195299A1

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481: Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance

Definitions

  • the present invention relates to a simulation system.
  • an information input device for a user to control a graphic cursor displayed on a display.
  • a first sensor that generates first sensor data in response to a first type of user motion, which is a motion of a part of the user's body, and a second sensor that generates second sensor data in response to a second type of user motion, which is a finer motion of a part of the user's body than the first type.
  • the apparatus further includes at least one processor that calculates a hybrid cursor movement signal for moving the graphic cursor, the signal having a wide-range movement component corresponding to the first type of user motion and a high-precision movement component that corresponds to the second type of user motion and represents finer movement than the wide-range component.
  • the at least one processor calculates the wide-range movement component based on a first sensitivity parameter that represents the sensitivity of the first sensor to the first type of user motion and is determined by adding the second sensor data to the first sensor data.
  • the at least one processor calculates the high-precision movement component based on a second sensitivity parameter that represents the sensitivity of the second sensor to the second type of user motion and is determined by adding the first sensor data to the second sensor data (see, for example, Patent Document 1).
  • when both types of user motion are executed, the at least one processor sets the first sensitivity parameter smaller as the second type of user motion becomes more intense, thereby suppressing the wide-range movement component, and sets the second sensitivity parameter smaller as the first type of user motion becomes more intense, thereby suppressing the high-precision movement component.
  • the graphic cursor includes at least a first cursor and a second cursor that is located within the first cursor and is smaller than the first cursor, and the at least one processor further has a large first sensitivity parameter.
  • because the conventional information input device requires operating two cursors, the first cursor and the second cursor, its usability is poor.
  • an object of the present invention is therefore to provide a simulation system that is easy to use.
  • a simulation system includes: a display unit that displays an image of an article based on article data representing the shape and position of the article, together with a pointer whose position is manipulated by a user; a data storage unit that stores the article data; a first detection unit that detects an instruction operation by which the user indicates the position of the pointer; a second detection unit that detects the position of the pointer in the coordinate system of the display unit based on the instruction operation detected by the first detection unit; an area generation unit that generates a detection area containing the pointer and larger than the pointer; a size setting unit that sets the size of the pointer according to the type of article at least partially located within the detection area; and an output unit that causes the display unit to display the pointer at the size set by the size setting unit.
  • a simulation system with good usability can be provided.
  • FIG. 1 is a perspective view of a computer system to which a processing apparatus according to an embodiment is applied. Further figures include a block diagram of the main part of the main body of the computer system, a diagram showing shape data, and a diagram showing an example of an image of articles.
  • FIG. 25 is a diagram illustrating a locus of a pointer displayed by the simulation system when the instruction operation described in FIG. 24 is performed. A further figure shows the result of performing the instruction operation.
  • FIG. 1 is a diagram illustrating a simulation system 100 according to an embodiment.
  • FIG. 2 is a diagram illustrating a configuration of the processing device 120 of the simulation system 100.
  • the simulation system 100 includes a screen 110A, a projection device 110B, 3D (3-dimensional) glasses 110C, a processing device 120, and a position measurement device 140.
  • the simulation system 100 can be applied to an assembly support system in order to grasp assembly workability in a virtual space, for example.
  • an operation of assembling an electronic component such as a CPU (Central Processing Unit) module, a memory module, a communication module, or a connector on a motherboard or the like can be performed in the virtual space.
  • simulation system 100 can be applied not only to the assembly support system but also to various systems for confirming workability in a three-dimensional space.
  • a projector screen can be used as the screen 110A.
  • the size of the screen 110A may be set as appropriate according to the application.
  • An image projected by the projection device 110B is displayed on the screen 110A.
  • images of the article 111, the buttons 111A and 111B, and the pointer 130A are displayed on the screen 110A.
  • the pointer 130A is displayed in the direction in which the user 1 points the hand toward the screen 110A.
  • the user 1 may perform this with the fingers of the right hand open or closed.
  • the pointer 130A is displayed on the screen 110A in the direction in which the user 1 moves the right arm and points with the hand.
  • An operation in which the user 1 points with the right arm to move the pointer 130A is referred to as an instruction operation.
  • the projection device 110B may be any device that can project an image on the screen 110A.
  • a projector can be used.
  • the projection device 110B is connected to the processing device 120 via a cable 110B1, and projects an image input from the processing device 120 onto the screen 110A.
  • the projection device 110B is of a type that can project a 3D image (stereoscopic image) onto the screen 110A.
  • the screen 110A and the projection device 110B are examples of a display unit.
  • the user 1 who uses the simulation system 100 wears the 3D glasses 110C.
  • the 3D glasses 110C may be any glasses that can convert an image projected on the screen 110A by the projection device 110B into a 3D image.
  • for example, polarized glasses that polarize incident light, or liquid crystal shutter glasses having a liquid crystal shutter, can be used.
  • a liquid crystal display panel may be used instead of the screen 110A and the projection device 110B.
  • the 3D glasses 110C may not be used.
  • a head mounted display capable of viewing a 3D image may be used instead of the screen 110A and the projection device 110B.
  • the processing device 120 includes a human body detection unit 121, a position detection unit 122, a detection region generation unit 123, an operation detection unit 124, an object determination unit 125, a pointer generation unit 126, a data holding unit 127, and a video output unit 128.
  • the processing device 120 is realized by a computer having a memory, for example.
  • the human body detection unit 121 determines whether or not the body of the user 1 exists based on data, input from the position measurement device 140, that three-dimensionally represents the position and shape of the body of the user 1. When the body exists, the unit obtains the coordinates indicating the position of each part of the body of the user 1. As an example, the position of each part of the body of the user 1 is represented by the position of the skeleton of the user 1, which includes, for example, the positions of the head, shoulders, elbows, wrists, and hands.
  • the human body detection unit 121 is an example of a second detection unit together with the position detection unit 122.
  • the position detection unit 122 obtains coordinates P(Px, Py, Pz) based on the coordinates representing the position of each part of the body of the user 1 input from the human body detection unit 121.
  • the position detection unit 122 is an example of a second detection unit together with the human body detection unit 121.
  • the position detection unit 122 obtains a straight line connecting the right shoulder and the right wrist of the user 1 based on the coordinates representing the position of each part of the body of the user 1 input from the human body detection unit 121, and finds the coordinates of the intersection of that straight line with the screen 110A.
  • the position detection unit 122 converts the coordinate value of the intersection of the straight line and the screen 110A into coordinates in the image projected on the screen 110A and outputs them as coordinates P(Px, Py, Pz). Note that the position measurement device 140 may instead detect the coordinates P(Px, Py, Pz).
  • an X axis is defined in the horizontal direction parallel to the screen 110A
  • a Y axis is defined in the vertical direction
  • a Z axis is defined in the horizontal direction perpendicular to the screen 110A.
  • the magnitude of the vector SH is given by the following equation (2) using the right shoulder coordinates S(Sx, Sy, Sz) and the right wrist coordinates H(Hx, Hy, Hz).
  • a coordinate P1(P1x, P1y, P1z) represented by the following expression (3) is obtained, where ΔL is the amount (offset amount) by which the user 1 is offset from the screen 110A in the Z-axis direction.
  • the coordinates P1(P1x, P1y, P1z) are obtained by equations (1), (2), and (3), and represent the intersection of the straight line connecting the right shoulder and the right wrist of the user 1 with the screen 110A.
  • the coordinates of this intersection are coordinates in real space.
  • the position detection unit 122 converts the intersection coordinates P1(P1x, P1y, P1z) into coordinates in the image projected on the screen 110A and outputs them as coordinates P(Px, Py, Pz).
  • the position detection unit 122 outputs the coordinates P(Px, Py, Pz) obtained by converting the real-space intersection coordinates P1(P1x, P1y, P1z) into values in the coordinate system of the virtual space.
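Equations (1) to (3) are referenced above but not reproduced in this excerpt. The construction they describe can be sketched as a ray-plane intersection, assuming the screen lies in the plane Z = 0 with the axes defined below; the function name, the ΔL parameter handling, and the sample coordinates are illustrative assumptions, not the patent's identifiers.

```python
import math

def pointer_position(shoulder, wrist, delta_l=0.0):
    """Project the shoulder-to-wrist ray onto the screen plane.

    shoulder, wrist: (x, y, z) real-space coordinates, Z axis perpendicular
    to the screen (screen assumed at Z = delta_l, where delta_l plays the
    role of the offset amount ΔL in the text).
    Returns the intersection P1 of the extended arm with that plane.
    """
    sx, sy, sz = shoulder
    hx, hy, hz = wrist
    # Direction vector SH from the right shoulder to the right wrist.
    dx, dy, dz = hx - sx, hy - sy, hz - sz
    if dz == 0:
        raise ValueError("arm is parallel to the screen; no intersection")
    # Parameter t at which the ray S + t*SH reaches Z = delta_l.
    t = (delta_l - sz) / dz
    return (sx + t * dx, sy + t * dy, delta_l)

# Shoulder 2 m from the screen, wrist 1.5 m from it and offset in X and Y.
p1 = pointer_position((0.0, 1.4, 2.0), (0.3, 1.3, 1.5))
```

Converting P1 to image or virtual-space coordinates, as the position detection unit 122 does, would then be a separate affine transform of this result.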
  • the detection area generation unit 123 generates a detection region 130B centered on the coordinates P(Px, Py, Pz).
  • the detection area generation unit 123 is an example of an area generation unit.
  • the pointer 130A is a sphere centered on the coordinates P(Px, Py, Pz), and the radius of the detection area 130B is larger than the radius of the pointer 130A; the detection region 130B is therefore arranged concentrically with the pointer 130A.
  • the detection area 130B is the region outside the surface of the pointer 130A and inside a sphere of predetermined radius centered on the coordinates P(Px, Py, Pz).
  • the detection region 130B includes a spherical surface defined by a predetermined radius.
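As an illustrative sketch (the function name and radii below are assumptions, not the patent's identifiers), membership in the concentric pointer and detection spheres can be decided by comparing squared distances, with both boundary surfaces counting as inside, as the text specifies:

```python
def region_of(point, center, pointer_radius, detection_radius):
    """Classify a point relative to the concentric pointer 130A and
    detection area 130B spheres centered on P = center.

    Returns "pointer", "detection", or "outside"; boundary surfaces
    count as inside, matching the text.
    """
    d2 = sum((p - c) ** 2 for p, c in zip(point, center))
    if d2 <= pointer_radius ** 2:
        return "pointer"
    if d2 <= detection_radius ** 2:
        return "detection"
    return "outside"
```

A point just off the pointer surface but within the detection radius would classify as "detection", which is exactly the shell the detection area 130B occupies.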
  • the pointer 130A and the detection area 130B are the same in that they move according to the user's instruction operation.
  • the pointer 130A is displayed as an image on the screen 110A, but the detection area 130B is not displayed on the screen 110A.
  • the pointer 130A is used to determine contact with the article 111 or the buttons 111A and 111B displayed on the screen 110A, whereas the detection area 130B is used to determine the presence of the article 111 or the buttons 111A and 111B around the pointer 130A.
  • the motion detection unit 124 detects the motion of the user 1 based on the coordinates representing the position of each part of the body of the user 1 input from the human body detection unit 121.
  • the operation of the user 1 is a gesture performed by the user 1; examples here are an operation for resetting the size of the pointer 130A, an operation for confirming an input to the button 111A or 111B, and an operation for canceling an input to the button 111A or 111B.
  • the operation for confirming input to the button 111A or 111B is an operation of tapping the respective button, and the operation of canceling the input to the button 111A is a sweeping motion made while tracing the button 111A.
  • specifically, the input to the button 111A is canceled by moving the pointer 130A leftward while the pointer 130A is touching the button 111A.
  • likewise, the operation of canceling the input to the button 111B is a sweeping motion made while tracing the button 111B.
  • specifically, the input to the button 111B is canceled by moving the pointer 130A rightward while the pointer 130A is touching the button 111B.
  • the operation for resetting the size of the pointer 130A is an operation of crossing both arms.
  • the motion detection unit 124 is an example of a first determination unit and a second determination unit.
  • the object determination unit 125 determines the type of article at least a part of which is located inside the detection area 130B.
  • the type of article is, for example, the kind of article 111 or button 111A or 111B.
  • whether at least a part of an article is located inside the detection area 130B may be determined by whether the detection area 130B and the display area of the article 111 or the button 111A or 111B have an intersection. The case where the article lies on the outer peripheral surface (boundary) of the detection area 130B also counts as being at least partly inside.
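The intersection test described above can be sketched, for a Cuboid article, as a standard sphere-versus-axis-aligned-box check. The function below is an illustrative assumption, not the patent's implementation; touching at the boundary counts as intersecting, matching the text.

```python
def sphere_intersects_box(center, radius, box_min, box_max):
    """Check whether a sphere (e.g. the detection area 130B) and an
    axis-aligned box (an article's display area) have an intersection.

    box_min/box_max are the minimum and maximum corner coordinates of
    the article; boundary contact counts as an intersection.
    """
    # Squared distance from the sphere center to the closest box point.
    d2 = 0.0
    for c, lo, hi in zip(center, box_min, box_max):
        nearest = min(max(c, lo), hi)  # clamp center onto the box
        d2 += (c - nearest) ** 2
    return d2 <= radius ** 2
```

The same test with the smaller pointer radius would serve for the contact determination between the pointer 130A and an article.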
  • the pointer generation unit 126 generates the pointer 130A as a spherical image centered on the coordinates P(Px, Py, Pz).
  • the radius of the pointer 130A is smaller than the radius of the detection area 130B. Since the pointer 130A and the detection area 130B are arranged concentrically, the detection area 130B exists around the pointer 130A.
  • the pointer generation unit 126 is an example of a size setting unit.
  • the pointer 130A moves according to the user's instruction operation, and is used to determine contact with the article 111 or the button 111A or 111B displayed on the screen 110A.
  • whether the pointer 130A is in contact with the article 111 or the button 111A or 111B may be determined by whether at least a part of the display area of the article or button is included within the pointer 130A, a sphere of predetermined radius centered on the coordinates P(Px, Py, Pz).
  • the interior of the pointer 130A includes the surface of the pointer 130A.
  • whether at least a part of the display area of the article 111 or the button 111A or 111B is included within the pointer 130A may be judged by whether the pointer 130A and that display area have an intersection.
  • the case where the display area of the article 111 or the button 111A or 111B lies on the surface (boundary) of the pointer 130A is also included.
  • in these cases, the pointer 130A is in contact with the article 111 or the button 111A or 111B, and the contact may be conveyed to the user 1 by, for example, changing the color of the pointer 130A.
  • the size of the pointer 130A generated by the pointer generation unit 126 is set based on data representing the size of the pointer 130A held in the data holding unit 127. The pointer generation unit 126 also sets the size of the pointer 130A according to the type of article existing inside the detection area 130B, and according to the operation of the user 1. A specific method for setting the size of the pointer 130A will be described later.
  • the data holding unit 127 holds data such as article data representing the coordinates and shape of the article 111 or the button 111A or 111B, image data of the pointer 130A, and data representing the size of the pointer 130A.
  • the data holding unit 127 is realized by a memory and is an example of a data storage unit.
  • the output terminal of the video output unit 128 is connected to the projection device 110B by a cable 110B1.
  • the video output unit 128 outputs an image specified by the article data of the article 111 held in the data holding unit 127 to the projection device 110B and displays it on the screen 110A.
  • the video output unit 128 displays the pointer 130A on the projection device 110B.
  • the image of the pointer 130A is generated by the pointer generator 126.
  • the position measuring device 140 is a device that acquires data that three-dimensionally represents the position and shape of the body of the user 1.
  • the position measuring device 140 is installed above the screen 110A and has a detection range 142 in front of the screen 110A.
  • the detection range 142 extends from the camera 140A of the position measurement device 140 to the front of the screen 110A.
  • the position measuring device 140 is connected to the processing device 120 by a cable 141.
  • the position measuring device 140 is, for example, a device that calculates the distance (depth) to each point included in an image based on the time between irradiating the subject with an infrared laser and receiving the reflected light.
  • the position measurement device 140 acquires an image of the user 1 performing an instruction operation toward the screen 110A, and acquires three-dimensional distance image data representing the posture, gestures, and the like of the user 1.
  • the position measurement device 140 transmits the acquired three-dimensional data to the processing device 120 via the cable 141.
  • FIG. 3 is a perspective view of a computer system to which the processing device 120 of the embodiment is applied.
  • a computer system 10 shown in FIG. 3 includes a main body 11, a display 12, a keyboard 13, a mouse 14, and a modem 15.
  • the main unit 11 includes a CPU (Central Processing Unit), an HDD (Hard Disk Drive), a disk drive, and the like.
  • the display 12 displays an analysis result or the like on the screen 12A according to an instruction from the main body 11.
  • the display 12 may be a liquid crystal monitor, for example.
  • the keyboard 13 is an input unit for inputting various information to the computer system 10.
  • the mouse 14 is an input unit that designates an arbitrary position on the screen 12A of the display 12.
  • the modem 15 accesses an external database or the like and downloads a program or the like stored in another computer system.
  • a program for causing the computer system 10 to function as the processing device 120 is stored in a portable recording medium such as the disk 17, or is downloaded from the recording medium 16 of another computer system using a communication device such as the modem 15, and is input to the computer system 10 and compiled.
  • a program that causes the computer system 10 to have a function as the processing device 120 causes the computer system 10 to operate as the processing device 120.
  • This program may be stored in a computer-readable recording medium such as the disk 17.
  • the computer-readable recording medium is not limited to a portable recording medium such as the disk 17, an IC card memory, a magnetic disk such as a floppy (registered trademark) disk, a magneto-optical disk, a CD-ROM, or a USB (Universal Serial Bus) memory.
  • the computer-readable recording medium includes various recording media accessible by a computer system connected via a communication device such as a modem 15 or a LAN.
  • FIG. 4 is a block diagram illustrating a configuration of a main part in the main body 11 of the computer system 10.
  • the main body 11 includes a CPU 21 connected by a bus 20, a memory unit 22 including a RAM or a ROM, a disk drive 23 for the disk 17, and a hard disk drive (HDD) 24.
  • the display 12, the keyboard 13, and the mouse 14 are connected to the CPU 21 via the bus 20, but these may be directly connected to the CPU 21.
  • the display 12 may be connected to the CPU 21 via a known graphic interface (not shown) that processes input / output image data.
  • the keyboard 13 and the mouse 14 are input units of the processing device 120.
  • the display 12 is a display unit that displays input contents and the like for the processing device 120 on the screen 12A.
  • the computer system 10 is not limited to the configuration shown in FIGS. 3 and 4, and various known elements may be added or alternatively used.
  • FIG. 5 is a diagram showing shape data.
  • the article data is data representing the coordinates and shape of the article displayed on the screen 110A.
  • the article data has an article ID, a shape type, reference coordinates, a size, and a rotation angle.
  • the shape type represents the outer shape of the article.
  • the shape types indicate Cuboid (cuboid) and Cylinder (cylindrical body).
  • the reference coordinate indicates the coordinate value of a point that serves as a reference for coordinates representing the entire article.
  • the unit of the coordinate value is meter (m).
  • An XYZ coordinate system is used as the coordinate system.
  • the size represents the length of the article in the X-axis direction, the length in the Y-axis direction, and the length in the Z-axis direction.
  • the unit is meters (m).
  • the length in the X-axis direction represents the width,
  • the length in the Y-axis direction represents the height,
  • and the length in the Z-axis direction represents the depth.
  • the rotation angle is represented by rotation angles θx, θy, and θz about the X-axis, Y-axis, and Z-axis directions, respectively.
  • the unit is degree (deg.).
  • the rotation angle θx is an angle for rotating the article about the X axis as a rotation axis.
  • the rotation angles θy and θz are angles at which the article is rotated about the Y axis and the Z axis as rotation axes, respectively.
  • the positive directions of the rotation angles θx, θy, and θz may be determined in advance.
  • an image specified by the article data can be represented in the same manner as the article image displayed by the CAD data.
  • the article data is stored in the data holding unit 127 of the processing device 120.
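The article data described above (article ID, shape type, reference coordinates, size, rotation angle) can be modeled as one record per article. The class below is an illustrative assumption; the three sample records reproduce the values given for FIG. 6.

```python
from dataclasses import dataclass
from typing import Tuple

@dataclass
class ArticleData:
    """One row of the article data in FIG. 5 (field layout assumed)."""
    article_id: str                         # e.g. "001"
    shape_type: str                         # "Cuboid" or "Cylinder"
    reference: Tuple[float, float, float]   # reference coordinates, meters
    size: Tuple[float, float, float]        # X/Y/Z lengths, meters
    rotation: Tuple[float, float, float]    # rotation angles θx, θy, θz, degrees

# The three articles described for FIG. 6.
articles = [
    ArticleData("001", "Cuboid",   (0.0, 0.0, 0.0), (0.8, 0.2, 0.4), (0.0, 0.0, 0.0)),
    ArticleData("002", "Cuboid",   (0.6, 0.2, 0.0), (0.2, 0.2, 0.1), (0.0, 0.0, 0.0)),
    ArticleData("003", "Cylinder", (0.8, 0.3, 0.1), (0.2, 1.0, 0.3), (0.0, 0.0, 90.0)),
]
```

Such records would live in the data holding unit 127 and drive both rendering and the intersection tests described earlier.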
  • FIG. 6 is a diagram illustrating an example of an image of an article.
  • FIG. 6 shows three articles represented by the article data in FIG.
  • An article with an article ID of 001 has a shape type of Cuboid, reference coordinates (X, Y, Z) of (0.0, 0.0, 0.0), a size of (0.8, 0.2, 0.4), and rotation angles θx, θy, θz of (0.0, 0.0, 0.0).
  • An article with an article ID of 002 has a shape type of Cuboid, reference coordinates (X, Y, Z) of (0.6, 0.2, 0.0), a size of (0.2, 0.2, 0.1), and rotation angles θx, θy, θz of (0.0, 0.0, 0.0).
  • the article with the article ID 002 is arranged on the article with the article ID 001.
  • the article with the article ID 003 has a shape type of Cylinder, reference coordinates (X, Y, Z) of (0.8, 0.3, 0.1), a size of (0.2, 1.0, 0.3), and rotation angles θx, θy, θz of (0.0, 0.0, 90.0).
  • the article with the article ID 003 is connected to the positive X-axis side of the article with the article ID 002, rotated 90 degrees about the Z axis.
  • the article data having the article ID, shape type, reference coordinates, size, and rotation angle shown in FIG. 5 define the coordinates and shape of the article in the image projected on the screen 110A.
  • the coordinates of the eight vertices can be obtained by adding or subtracting the lengths in the X-axis, Y-axis, and Z-axis directions to or from the reference coordinates.
  • the coordinates of the eight vertices represent the coordinates of the corner of the article whose shape type is Cuboid.
  • the expression representing the 12 sides is an expression representing the coordinates of the edges of the article whose shape type is Cuboid.
  • once the expressions representing the eight vertices and/or the 12 sides are obtained, the expressions representing the six surfaces of the article whose shape type is Cuboid can be obtained, and from these the coordinates of the surfaces can be obtained.
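A minimal sketch of the vertex computation for a Cuboid article, assuming the reference coordinate is the minimum corner (the text does not state which corner it is):

```python
from itertools import product

def cuboid_vertices(reference, size):
    """Enumerate the eight vertices of a Cuboid article by adding the
    X/Y/Z lengths to the reference coordinates, as described in the text.
    Assumes the reference point is the minimum corner of the cuboid."""
    rx, ry, rz = reference
    sx, sy, sz = size
    return [(rx + i * sx, ry + j * sy, rz + k * sz)
            for i, j, k in product((0, 1), repeat=3)]
```

The 12 edges are the vertex pairs differing in exactly one axis, and the 6 faces the vertex quadruples sharing one fixed axis value, so both follow directly from this enumeration.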
  • when the shape type is Cylinder, an expression representing the circle (or ellipse) at each end can be obtained.
  • if the expressions representing the circles (or ellipses) at both ends and the reference coordinates are used, expressions representing the coordinates of the circles (or ellipses) at both ends can be obtained.
  • the coordinates of the side surface of the cylinder can be obtained by using an expression representing the coordinates of the circles (or ellipses) at both ends.
  • FIG. 7 is a diagram showing size data.
  • the size data is data in a table format in which the article ID of the article displayed on the screen 110A is associated with the pointer size of the pointer 130A.
  • the pointer size (Xp, Yp, Zp) represents the width Xp in the X-axis direction, the height Yp in the Y-axis direction, and the depth Zp in the Z-axis direction.
  • the pointer 130A is displayed as an ellipsoid having the width Xp, the height Yp, and the depth Zp.
  • the pointer size (Xp, Yp, Zp) associated with the article having the article ID 001 is (0.05, 0.02, 0.05).
  • the pointer size (Xp, Yp, Zp) associated with the article having the article ID 002 is (0.01, 0.01, 0.01).
  • the pointer size (Xp, Yp, Zp) associated with the article having the article ID 003 is (0.015, 0.05, 0.015).
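The size data of FIG. 7 amounts to a lookup table from article ID to pointer size. The sketch below is an illustrative assumption: the third row is read as article 003 (since 002 already appears above it), and the default size stands in for the initial value, which the text sets from the screen size and the user's position.

```python
# Size data of FIG. 7 as a lookup table: article ID -> (Xp, Yp, Zp).
SIZE_DATA = {
    "001": (0.05, 0.02, 0.05),
    "002": (0.01, 0.01, 0.01),
    "003": (0.015, 0.05, 0.015),  # third row assumed to be article 003
}

def pointer_size_for(article_id, default=(0.02, 0.02, 0.02)):
    """Look up the pointer size for an article; the default is a
    placeholder for the initial value described in the text."""
    return SIZE_DATA.get(article_id, default)
```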
  • FIGS. 8 to 16 are diagrams showing a method for determining the size of the pointer 130A according to the size of the article.
  • the surface (ellipsoidal surface) of the pointer 130A is represented by Expression (4).
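Expression (4) is not reproduced in this excerpt. For an ellipsoidal pointer with semi-axes a, b, c centered on the coordinates P(Px, Py, Pz), the surface would take the standard form (an assumption consistent with the ellipsoid described above):

$$\frac{(x - P_x)^2}{a^2} + \frac{(y - P_y)^2}{b^2} + \frac{(z - P_z)^2}{c^2} = 1$$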
  • the parameter k is 0.05.
  • Article 111-2 has a shape in which three L-shaped blocks are connected.
  • the size of the pointer 130A is a size that can enter the gap between the three blocks of the article 111-2.
  • the parameter k is 0.05.
  • the article 111-3 is provided with a rectangular parallelepiped hole in the rectangular parallelepiped 111-31, and a prismatic portion 111-32 is provided in the hole.
  • the size of the pointer 130A is a size that can enter the gap between the hole of the rectangular parallelepiped 111-31 and the prism portion 111-32.
  • the parameters a, b, and c of the ellipsoid may be set to the value l1 obtained by the following equation (5).
  • the size of the pointer 130A obtained by Expression (6) is a size that can enter the gap between the three blocks of the article 111-2.
  • the size of the pointer 130A obtained by Expression (7) is a size that can enter the gap between the hole of the rectangular parallelepiped 111-31 and the prism portion 111-32.
  • the parameter k is 0.5.
  • X2A is the size of the gap between the three L-shaped blocks of the article 111-2 and is the minimum value of the outer dimensions of the article 111-2.
  • the parameters a, b, and c may be set to a value that is half the minimum value of the outer dimensions of the article 111-2.
  • the size of the pointer 130A obtained in this way is a size that can enter the gap between the three blocks of the article 111-2.
  • the parameter k is 0.5.
  • Y3A is the height in the Y-axis direction of the rectangular column part 111-32 of the rectangular parallelepiped 111-31, and is the minimum value of the outer dimensions of the article 111-3.
  • the parameters a, b, and c may be set to a value that is half the minimum value of the outer dimensions of the article 111-3.
  • the size of the pointer 130A thus obtained is a size that can enter the gap between the hole of the rectangular parallelepiped 111-31 and the prism portion 111-32.
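The sizing rule in the bullets above, where the ellipsoid parameters a, b, and c are set to k times the minimum outer dimension of the article (with k = 0.5 giving half the minimum), can be sketched as follows; the function name is an illustrative assumption.

```python
def ellipsoid_params(outer_dims, k=0.5):
    """Set the ellipsoid parameters a, b, c of the pointer 130A to k times
    the minimum outer dimension of the article (k = 0.5 gives half the
    minimum, as in the text), so the pointer can enter the article's gaps."""
    r = k * min(outer_dims)
    return (r, r, r)
```

For the article 111-2, the minimum outer dimension is the gap X2A between its three L-shaped blocks, so the resulting pointer fits inside that gap by construction.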
  • FIG. 17 is a diagram showing size data in which an article ID is associated with a pointer size.
  • the article IDs of the buttons 111A and 111B and of the article 111 illustrated in FIG. 1, and the pointer sizes associated with those article IDs, will now be described.
  • the article ID of the button 111A is 011,
  • the article ID of the button 111B is 012,
  • and the article ID of the article 111 is 013.
  • the pointer size (Xp, Yp, Zp) for the button 111A with the article ID 011 is (0.01, 0.01, 0.01).
  • the pointer size (Xp, Yp, Zp) for the button 111B with the article ID 012 is (0.01, 0.01, 0.01).
  • the pointer size (Xp, Yp, Zp) for the article 111 with the article ID 013 is (0.04, 0.04, 0.04).
  • FIGS. 18 and 19 are diagrams showing the relationship between the article and the size of the pointer 130A.
  • the case where the pointer 130A is displayed based on the size data shown in FIG. 17, with the image of the article 111 and the buttons 111A and 111B displayed on the screen 110A as shown in FIG. 1, will be described.
  • a pointer 130A is displayed on the screen 110A, and a detection area 130B is set around the pointer 130A.
  • the detection area 130B is not displayed on the screen 110A.
  • the size of the pointer 130A is set to an initial value.
  • the initial value may be set according to, for example, the size of the screen 110A and the appropriate position of the user 1 with respect to the screen 110A.
  • buttons 111A and 111B enter the detection area 130B
  • the pointer 130A is displayed using the minimum value among the plurality of pointer sizes associated with the plurality of articles.
  • the pointer size (X p , Y p , Z p ) is set to (0.01, 0.01, 0.01) here.
  • buttons 111A and 111B are relatively small articles, the user 1 can easily select the buttons 111A and 111B by making the pointer 130A smaller than the initial value.
  • the pointer size (X p , Y p , Z p ) becomes (0.04, 0.04, 0.04).
  • the article 111 is larger than the buttons 111A and 111B and is a relatively large article, so that the user 1 can easily select the article 111 by making the pointer 130A larger than the initial value.
  • the simulation system 100 changes the size of the pointer 130A according to the size of the article existing in the detection area 130B. This is to improve the usability of the simulation system 100 by making it easy for the user 1 to see the pointer 130A.
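The selection rule described above can be sketched as follows (illustrative Python; the initial value is an assumed placeholder): with several articles in the detection area 130B, the smallest associated pointer size wins, and with none the initial value is used.

```python
def select_pointer_size(sizes_in_area, initial=(0.02, 0.02, 0.02)):
    """Pick the pointer size: the minimum of the sizes associated with
    the articles inside the detection area, or the initial value when
    the area is empty.  Sizes here are uniform (X, Y, Z) tuples, so
    tuple comparison picks the smallest."""
    if not sizes_in_area:
        return initial
    return min(sizes_in_area)

# Buttons 111A/111B in the area -> the small size (0.01, ...) is used.
size = select_pointer_size([(0.01, 0.01, 0.01), (0.04, 0.04, 0.04)])
```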
  • FIG. 20 is a flowchart illustrating processing executed by the processing device 120 according to the embodiment.
  • the flowchart shown in FIG. 20 shows processing for setting the pointer size of the pointer 130A and displaying the pointer 130A on the screen 110A.
  • Processing device 120 starts processing after power is turned on (start).
  • the processing apparatus 120 acquires article data from the data holding unit 127 (step S1).
  • the processing device 120 generates a video signal using the article data, and causes the projection device 110B to project an image (step S2).
  • an image of the stereoscopic model of the article 111 and the buttons 111A and 111B is displayed on the screen 110A.
  • the image of the article 111 and the buttons 111A and 111B displayed on the screen 110A represents a virtual object existing in the virtual space.
  • steps S1 and S2 are performed by the video output unit 128.
  • the processing device 120 acquires data representing the position and shape of the body of the user 1 three-dimensionally from the position measurement device 140 (step S3).
  • the process of step S3 is performed by the human body detection unit 121.
  • the processing device 120 determines whether the body of the user 1 exists based on the data acquired in step S3 (step S4).
  • the process of step S4 is performed by the human body detection unit 121.
  • if the processing device 120 determines that the body of the user 1 exists (S4: YES), it obtains coordinates representing the position of each part of the body of the user 1 (step S5).
  • the process of step S5 is performed by the human body detection unit 121.
  • the processing device 120 detects the coordinates P (P x , P y , P z ) (step S6).
  • the coordinates P (P x , P y , P z ) are obtained by converting the coordinates of the intersection point of the straight line connecting the right shoulder and the right wrist of the user 1 and the screen 110A to the coordinates in the image projected on the screen 110A.
  • the coordinates are obtained by the position detector 122.
  • the process of step S6 is performed by the position detection unit 122.
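The computation of the coordinates P can be sketched as a ray-plane intersection (an illustrative assumption: the screen 110A is taken to lie in the plane z = 0 of the measurement coordinate system):

```python
def screen_intersection(shoulder, wrist):
    """Intersect the straight line through the right shoulder and the
    right wrist with the screen plane z = 0 and return the point of
    intersection (the basis of the coordinates P)."""
    sx, sy, sz = shoulder
    wx, wy, wz = wrist
    t = sz / (sz - wz)  # parameter at which the line reaches z = 0
    return (sx + t * (wx - sx), sy + t * (wy - sy), 0.0)

# Shoulder 2 m from the screen, wrist 1.5 m: the shoulder-to-wrist
# direction is extrapolated four times as far to reach the screen.
p = screen_intersection((0.0, 1.4, 2.0), (0.3, 1.2, 1.5))
```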
  • the processing device 120 generates the detection area 130B (step S7).
  • the processing in step S7 is performed by the detection area generation unit 123.
  • the detection area generator 123 generates a detection area 130B having a predetermined radius centered on the coordinates P (P x , P y , P z ).
  • the detection area 130B is an area outside the surface of the pointer 130A and included in a sphere defined by a predetermined radius with the coordinates P (P x , P y , P z ) as the center.
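A membership test for this detection area can be sketched as follows (treating both the pointer 130A and the detection area 130B as spheres centred on P, which matches the description here):

```python
import math

def in_detection_area(point, center, pointer_radius, detection_radius):
    """True when the point lies outside the surface of the pointer but
    inside the sphere of the predetermined detection radius."""
    d = math.dist(point, center)
    return pointer_radius < d <= detection_radius
```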
  • the processing device 120 determines whether or not an article exists in the detection area 130B (step S8).
  • the process of step S8 is performed by the object determination unit 125.
  • the object determination unit 125 determines whether or not the detection area 130B intersects the display area of the article 111 or the button 111A or 111B, that is, whether or not there is an article at least partially located inside the detection area 130B. When an article exists inside the detection area 130B, the object determination unit 125 determines the type of the article.
  • if no article exists in the detection area 130B (S8: NO), the processing device 120 sets the pointer size to an initial value (step S9).
  • the processing in step S9 is performed by the pointer generator 126.
  • the initial value of the pointer size is held in the data holding unit 127.
  • the processing device 120 displays the pointer 130A on the screen 110A (step S10).
  • the processing in step S10 is performed by the video output unit 128.
  • the video output unit 128 causes the projection device 110B to display the image of the pointer 130A generated by the pointer generation unit 126 on the screen 110A.
  • if the processing device 120 determines that an article exists in the detection area 130B (S8: YES), it acquires the article ID of the article existing in the detection area 130B (step S11).
  • the process of step S11 is performed by the object determination unit 125.
  • step S11 when there are a plurality of articles in the detection area 130B, a plurality of article IDs are acquired.
  • the object determination unit 125 acquires the article ID of the article 111 or the button 111A or 111B having an intersection with the detection area 130B.
  • the processing device 120 reads the pointer size corresponding to the article ID acquired in step S11 from the size data (see FIGS. 7 and 18) (step S12).
  • the processing in step S12 is performed by the pointer generation unit 126.
  • the pointer generation unit 126 reads a plurality of pointer sizes.
  • when the processing device 120 has read out a plurality of pointer sizes in step S12, it selects the smallest pointer size among them (step S13). The processing in step S13 is performed by the pointer generation unit 126. Note that, when the number of article IDs acquired in step S11 is one, the pointer generation unit 126 performs no particular process in step S13.
  • the processing device 120 determines whether the pointer size is smaller than a predetermined lower limit (step S14).
  • the processing in step S14 is performed by the pointer generator 126.
  • the pointer generation unit 126 reads the predetermined lower limit value of the pointer size from the data holding unit 127, and compares the single pointer size read in step S12, or the pointer size selected in step S13, with that lower limit value.
  • the reason why the pointer size is compared with the predetermined lower limit in this way is to prevent the pointer 130A from becoming too small when the pointer size is reduced in the process of step S109 described later.
  • the predetermined lower limit value may be set according to the size of the screen 110A, the appropriate position of the user 1 with respect to the screen 110A, and the like.
  • if the pointer size is smaller than the predetermined lower limit value (S14: YES), the processing device 120 corrects the pointer size to the predetermined lower limit value (step S15).
  • the processing in step S15 is performed by the pointer generator 126. If the pointer size is smaller than the predetermined lower limit value, it is difficult for the user 1 to see, and therefore the pointer size is corrected to be increased to the lower limit value before being displayed on the screen 110A.
  • the processing device 120 displays the pointer 130A having the pointer size corrected to the lower limit value on the screen 110A (step S10).
  • the processing in step S10 is performed by the video output unit 128.
  • if the pointer size is not smaller than the predetermined lower limit value (S14: NO), the processing device 120 displays the pointer 130A having the pointer size set in step S12 or S13 on the screen 110A (step S10).
  • the pointer 130A having the pointer size set by the processing device 120 is displayed on the screen 110A.
  • the flow shown in FIG. 20 is repeatedly executed.
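The sizing steps of FIG. 20 (S8 to S15) can be condensed into one function (a sketch; the initial value and lower limit are assumed placeholders, and the size data is the FIG. 17 example):

```python
SIZE_DATA = {"011": (0.01, 0.01, 0.01),
             "012": (0.01, 0.01, 0.01),
             "013": (0.04, 0.04, 0.04)}
INITIAL_SIZE = (0.02, 0.02, 0.02)    # assumed initial value (step S9)
LOWER_LIMIT = (0.005, 0.005, 0.005)  # assumed lower limit (step S14)

def displayed_pointer_size(article_ids_in_area):
    """S8/S9: no article -> initial value.  S11-S13: read the sizes for
    the acquired article IDs and keep the smallest.  S14/S15: correct
    the size up to the lower limit if it is too small to see."""
    if not article_ids_in_area:
        return INITIAL_SIZE
    size = min(SIZE_DATA[i] for i in article_ids_in_area)
    return max(size, LOWER_LIMIT)
```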
  • FIG. 21 is a diagram showing the relationship between the article and the size of the pointer 130A.
  • images of the article 111 and the buttons 111A and 111B are displayed on the screen 110A.
  • a case will be described in which the determination operation on the button 111B is erroneously performed, the operation to cancel the input to the button 111B is then performed, and the size of the pointer 130A is changed accordingly.
  • the operation for determining the input to the buttons 111A and 111B and the operation for canceling the input to the buttons 111A and 111B are determined as operations that can be detected by the operation detection unit 124.
  • the size of the pointer 130A is set to an initial value.
  • the pointer size (X p , Y p , Z p ) becomes (0.01, 0.01, 0.01).
  • the user 1 wants to tap the button 111A in this state, but the right hand 2 shakes, the pointer 130A moves from the point B3 to the point B4, and the pointer 130A ends up touching the button 111B instead of the button 111A.
  • the input to the button 111B is thereby determined. Since the user 1 wanted to determine the input to the button 111A, the user 1 performs a cancel operation. Specifically, the user 1 performs an operation of flicking the right hand 2 to the right side with the pointer 130A touching the button 111B. The cancel operation is thereby detected by the operation detection unit 124.
  • the pointer size (X p , Y p , Z p ) associated with the button 111B for which the cancel operation has been performed is reduced to 90%. That is, the pointer 130A becomes 10% smaller.
  • such a function of reducing the pointer size in response to the cancel operation is referred to as a learning function.
  • User 1 moves the pointer 130A from point B4 to point B5, and the pointer 130A is in a state of touching the button 111A.
  • the user 1 may tap the button 111A in this state. Since the pointer 130A is 10% smaller, it is easier to select the button 111A, and erroneous input can be suppressed.
  • FIG. 22 is a diagram showing size data in which an article ID is associated with a pointer size. Here, a change in size data before and after the cancel operation is performed will be described.
  • the size data shown on the left side of FIG. 22 is the size data before the cancel operation is performed, and is the same as the size data shown in FIG.
  • when the cancel operation is performed, the pointer size associated with the article ID 012 corresponding to the button 111B is reduced by 10%, as shown on the right side of FIG. 22.
  • the pointer size is reduced only when the cancel operation is performed within a predetermined time on the same article as the one for which the determination operation was performed. This is because, when the user 1 notices an erroneous input, the cancel operation is considered to be performed within a short time after the determination operation. Conversely, if the cancel operation is performed after a certain amount of time has elapsed since the determination operation, it is considered to be not the correction of an erroneous input but an intentional cancellation.
  • the predetermined time for determining whether or not the canceling operation is performed may be set to an appropriate value through experiments or simulations.
  • the predetermined time is 5 seconds.
  • the determination operation is an operation (selection operation) for determining selection of the button 111A or 111B.
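The learning function can be sketched as follows (illustrative Python; the 5-second window and 90% factor are the values given above, the data layout is an assumption):

```python
PREDETERMINED_TIME = 5.0  # seconds within which a cancel counts as an erroneous input

def apply_learning(size_data, determined_id, canceled_id, elapsed_s):
    """If the cancel operation targets the same article as the stored
    determination operation and occurs within the predetermined time,
    shrink that article's pointer size to 90% (i.e. 10% smaller)."""
    if canceled_id == determined_id and elapsed_s <= PREDETERMINED_TIME:
        size_data[canceled_id] = tuple(0.9 * v for v in size_data[canceled_id])
    return size_data

sizes = {"012": (0.01, 0.01, 0.01)}
apply_learning(sizes, "012", "012", 2.0)  # cancel 2 s after determination
```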
  • FIG. 23 is a flowchart illustrating processing executed by the processing device 120 according to the embodiment.
  • FIG. 23 a case will be described in which images of the article 111 and the buttons 111A and 111B are displayed on the screen 110A as shown in FIG.
  • Processing device 120 starts processing after power is turned on (start).
  • the processing device 120 displays the pointer 130A on the screen 110A (step S101).
  • the process of step S101 is the process shown in FIG.
  • the processing device 120 detects the operation of the user 1 based on the coordinates representing the position of each part of the body of the user 1 input from the human body detection unit 121 (step S102).
  • the process in step S102 is performed by the operation detection unit 124.
  • the operation of the user 1 is an operation such as a gesture performed by the user 1, and here, as an example, an operation for resetting the size of the pointer 130A, an operation for determining an input to the button 111A or 111B, and The operation for canceling the input to the button 111A or 111B is determined in advance as an operation that can be detected by the operation detection unit 124.
  • the processing device 120 determines whether or not the operation of the user 1 detected in step S102 is a reset operation (step S103).
  • the process in step S103 is performed by the operation detection unit 124.
  • if the operation is not a reset operation (S103: NO), the processing device 120 determines whether or not the operation of the user 1 detected in step S102 is a determination operation (step S104).
  • the process in step S104 is performed by the operation detection unit 124.
  • if the processing device 120 determines that the operation is not a determination operation (S104: NO), it determines whether the operation of the user 1 detected in step S102 is a cancel operation (step S105). The process in step S105 is performed by the operation detection unit 124.
  • if the processing device 120 determines that the operation is not a cancel operation (S105: NO), the series of processing ends (end). That is, if the operation is none of the reset operation, the determination operation, and the cancel operation, the process is terminated. While the power is on, the process is repeated from the start.
  • if the operation is a determination operation (S104: YES), the processing device 120 stores the article ID of the article for which the determination operation has been performed (step S106).
  • the processing in step S106 is performed by the pointer generator 126. For example, when the article for which the determination operation has been performed is the button 111A, the article ID of the button 111A is stored.
  • the processing device 120 then ends the series of processing (end). While the power is on, the process is repeated from the start.
  • if the operation is a cancel operation (S105: YES), the processing device 120 determines whether or not the cancel operation has been performed on the article whose article ID is already stored (step S107).
  • the processing in step S107 is performed by the pointer generator 126.
  • if the processing device 120 determines that the article is the one whose article ID is already stored (S107: YES), it determines whether the elapsed time from when the article ID of the article for which the determination operation was performed was stored until the cancel operation was performed is within a predetermined time (step S108).
  • step S108 is performed by the pointer generator 126. Whether or not the elapsed time is within the predetermined time may be determined by whether or not the elapsed time is equal to or shorter than the predetermined time.
  • when determining that the elapsed time is within the predetermined time (S108: YES), the processing device 120 reduces the pointer size associated with the canceled article to 90% of the pointer size included in the size data (step S109). The processing in step S109 is performed by the pointer generator 126.
  • for example, the pointer size (X p , Y p , Z p ) associated with the canceled button 111B is reduced to 90%.
  • the processing device 120 ends the series of processing (end). If the power is on, the process is repeated from the start.
  • if the operation is a reset operation (S103: YES), the processing device 120 sets the pointer size to the initial value (step S110).
  • the processing in step S110 is performed by the pointer generation unit 126.
  • the initial value of the pointer size is held in the data holding unit 127 similarly to the size data shown in FIG.
  • FIG. 24 is a diagram showing a display on the screen 110A when an instruction operation is actually performed in the simulation system 100.
  • on the screen 110A, nine markers are arranged in three rows and three columns. Each marker is displayed as a sphere like the pointer 130A, and the center marker is the marker C1.
  • the interval between the nine markers is 100 pixels; when the width of the screen 110A in the X-axis direction is 3 meters, this 100-pixel interval corresponds to 150 mm.
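The pixel-to-millimetre relation quoted here implies a horizontal resolution of 2000 pixels across the 3-metre screen (an inference from the stated numbers, sketched below):

```python
SCREEN_WIDTH_M = 3.0     # width of the screen 110A in the X-axis direction
SCREEN_WIDTH_PX = 2000   # implied by 100 px corresponding to 150 mm

def px_to_mm(pixels):
    """Convert a horizontal distance in pixels to millimetres."""
    return pixels * SCREEN_WIDTH_M * 1000.0 / SCREEN_WIDTH_PX
```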
  • the radius of the marker is 30 mm, and the radius of the detection region 130B is 100 mm.
  • the initial value of the radius of the pointer 130A is 40 mm, and the radius of the pointer 130A when the marker exists in the detection area 130B is 20 mm.
  • the radius of the pointer 130A shown in FIG. 24 is 20 mm.
  • FIG. 25 is a diagram illustrating a locus of the pointer 130A displayed by the simulation system 100 when the instruction operation described with reference to FIG. 24 is performed. That is, FIG. 25 shows the locus of the coordinates P (P x , P y , P z ) of the pointer 130A obtained by the simulation system 100.
  • the points surrounded by the broken-line circle indicate the coordinates P (P x , P y , P z ) when an instruction operation is performed to keep the pointer 130A aligned with the marker C1 after the pointer 130A has reached the marker C1.
  • in FIG. 25, the points scattered from the upper right to the lower left toward the broken-line circle indicate the locus along which the pointer 130A is moved from the start point A6 shown in FIG. 24 to the marker C1.
  • the numerical values (pixel values) of the XY coordinates used in the calculation are shown.
  • the position of the points surrounded by the broken-line circle fluctuates because the right hand 2 shakes while the user tries to keep the pointer 130A aligned with the marker C1 after the pointer 130A has reached the marker C1.
  • FIG. 26 shows the result of the instruction operation for moving the pointer 130A to the marker C1 in this way.
  • FIG. 26 is a diagram showing a result of an instruction operation for moving the pointer 130A to the marker C1.
  • the case where, when a marker is in the detection area 130B, the process of changing the radius of the pointer 130A from the initial value of 40 mm to 20 mm is performed is described as "with processing".
  • in addition, the result when the pointer 130A is further reduced to 90% size by the learning function is shown as "with processing & learning function".
  • the radius of the pointer 130A in the case of “with processing & learning function” is 18 mm.
  • the number of frames in which the pointer 130A touches only the marker C1 is 582, and the number of frames in which the pointer 130A touches a marker other than the marker C1 is 59. For this reason, the probability of contacting a marker other than the marker C1 was 9%.
  • compared with the case of "without processing", the probability that the pointer 130A touches a marker other than the marker C1 is significantly reduced.
  • the number of frames in which the pointer 130A contacts only the marker C1 is 540, and the number of frames in which the pointer 130A contacts a marker other than the marker C1 is 35. For this reason, the probability of contacting a marker other than the marker C1 was 6%.
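The percentages quoted for these measurements follow from the frame counts (a quick check; the pairing of counts to conditions follows the order of the surrounding text):

```python
def wrong_contact_percent(frames_only_c1, frames_other):
    """Percentage of contact frames in which the pointer touched a
    marker other than C1, rounded to the nearest whole percent."""
    return round(100 * frames_other / (frames_only_c1 + frames_other))

rate_a = wrong_contact_percent(582, 59)  # 59 / 641 frames -> 9%
rate_b = wrong_contact_percent(540, 35)  # 35 / 575 frames -> 6%
```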
  • the size of the pointer 130A is changed according to the type of the article.
  • when the article is relatively small, the pointer size is made smaller than the initial value, and when the article is relatively large, the pointer size is made larger than the initial value.
  • the pointer size is set according to the size of the article existing around the pointer 130A, the usability of the simulation system 100 is greatly improved.
  • the pointer size is reduced to 90% by the learning function.
  • when the user 1 tries to move the pointer 130A to the same article again, it is therefore easy to move the pointer 130A to the desired article.
  • if the cancel operation is performed again, the pointer size is reduced to 81%. In this way, each time the cancel operation is performed on the same article, the size decreases by a further 10%.
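Because each qualifying cancel multiplies the stored size by 0.9, n cancels on the same article leave 0.9**n of the original size (a quick arithmetic sketch):

```python
def size_after_cancels(initial_size, n_cancels):
    """Pointer size after n qualifying cancel operations on one article:
    each cancel multiplies the stored size by 0.9."""
    return initial_size * 0.9 ** n_cancels

# One cancel -> 90% of the original; two cancels -> 81%.
```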
  • the operability when the user 1 operates the pointer 130A after the learning function has been applied is thus improved.
  • the determination operation and the cancel operation have been described as modes of determining or canceling an input to the button 111A or 111B, but they may also be performed on the article 111. For example, a determination operation may be performed on the article 111, the article 111 may then be moved together with the pointer 130A, and a cancel operation may be performed after moving the article.
  • the pointer size associated with the article ID 011 corresponding to the button 111A in the detection area 130B may be reduced by 10%.
  • the pointer size associated with the article IDs 011 and 012 corresponding to the buttons 111A and 111B in the detection area 130B is reduced by 10%.
  • the pointer 130A is displayed as an ellipsoid.
  • the pointer 130A may be displayed as an image having a shape other than the ellipsoid.
  • the detection area 130B is an ellipsoidal area, but the detection area 130B may be an area having a shape other than an ellipsoid. Further, although the detection area 130B has been described as an area that exists outside the pointer 130A, it may be an area that also includes the inside of the pointer 130A.

Abstract

Provided is a simulation system that is easy to use. The simulation system comprises: a display unit for displaying a pointer having a position that can be manipulated by a user and an image of an article based on article data representing the shape and the position of the article; a data storage unit for storing the article data; a first detection unit that detects an indication operation in which the user indicates the position of the pointer; a second detection unit that detects the position of the pointer in the coordinate system of the display unit on the basis of the indication operation detected by the first detection unit; an area generation unit for generating a detection area that includes the pointer and that is larger than the pointer; a size setting unit that sets the size of the pointer in accordance with the type of an article at least partially positioned within the detection area; and an output unit that causes the display unit to display the pointer at the size set by the size setting unit.

Description

Simulation system
The present invention relates to a simulation system.
Conventionally, there are information input devices with which a user controls a graphic cursor displayed on a display. One such device includes a first sensor that generates first sensor data in response to a first type of user motion, which is a motion of a part of the user's body, and a second sensor that generates second sensor data in response to a second type of user motion, which is a finer motion of a part of the user's body than the first type of user motion.
The device further includes at least one processor that calculates a hybrid cursor movement signal, a signal for moving the graphic cursor, which has a wide-range movement component corresponding to the first type of user motion and a high-precision range movement component that corresponds to the second type of user motion and represents higher-precision movement than the wide-range movement component.
The at least one processor calculates the wide-range movement component based on a first sensitivity parameter that represents the sensitivity of the first sensor to the first type of user motion and is determined by taking the second sensor data into account in addition to the first sensor data.
The at least one processor calculates the high-precision range movement component based on a second sensitivity parameter that represents the sensitivity of the second sensor to the second type of user motion and is determined by taking the first sensor data into account in addition to the second sensor data (see, for example, Patent Document 1).
When both the first type of user motion and the second type of user motion are being performed, the at least one processor sets the first sensitivity parameter smaller as the second type of user motion becomes more intense, thereby suppressing the wide-range movement component, and sets the second sensitivity parameter smaller as the first type of user motion becomes more intense, thereby suppressing the high-precision range movement component.
The graphic cursor includes at least a first cursor and a second cursor that is located inside the first cursor and is smaller than the first cursor, and the at least one processor further sets the size of the first cursor smaller as the first sensitivity parameter becomes larger or as the second sensitivity parameter becomes smaller.
JP 2014-528625 A
With the conventional information input device, however, the user has to operate two cursors, the first cursor and the second cursor, so there is a problem that the usability is not good.
Therefore, an object is to provide a simulation system with good usability.
A simulation system according to an embodiment of the present invention includes: a display unit that displays an image of an article based on article data representing the shape and position of the article, together with a pointer whose position is operated by a user; a data storage unit that stores the article data; a first detection unit that detects an instruction operation by which the user indicates the position of the pointer; a second detection unit that detects the position of the pointer in the coordinate system of the display unit based on the instruction operation detected by the first detection unit; an area generation unit that generates a detection area that includes the pointer and is larger than the pointer; a size setting unit that sets the size of the pointer according to the type of article at least partially located inside the detection area; and an output unit that causes the display unit to display the pointer at the size set by the size setting unit.
A simulation system with good usability can be provided.
FIG. 1 is a diagram showing the simulation system of the embodiment.
FIG. 2 is a diagram showing the configuration of the processing device of the simulation system.
FIG. 3 is a perspective view of a computer system to which the processing device of the embodiment is applied.
FIG. 4 is a block diagram illustrating the configuration of the main part in the main body of the computer system.
FIG. 5 is a diagram showing shape data.
FIG. 6 is a diagram showing an example of an image of an article.
FIG. 7 is a diagram showing size data.
FIGS. 8 to 16 are diagrams showing a method of determining the size of the pointer according to the size of the article.
FIG. 17 is a diagram showing size data in which an article ID is associated with a pointer size.
FIGS. 18 and 19 are diagrams showing the relationship between an article and the size of the pointer.
FIG. 20 is a flowchart showing processing executed by the processing device 120 of the embodiment.
FIG. 21 is a diagram showing the relationship between an article and the size of the pointer.
FIG. 22 is a diagram showing size data in which an article ID is associated with a pointer size.
FIG. 23 is a flowchart showing processing executed by the processing device of the embodiment.
FIG. 24 is a diagram showing the display on the screen when an instruction operation is actually performed in the simulation system.
FIG. 25 is a diagram showing the locus of the pointer displayed by the simulation system when the instruction operation described in FIG. 24 is performed.
FIG. 26 is a diagram showing the result of an instruction operation for moving the pointer to a marker.
Hereinafter, an embodiment to which the simulation system of the present invention is applied will be described.
<Embodiment>
FIG. 1 is a diagram illustrating the simulation system 100 of the embodiment. FIG. 2 is a diagram illustrating the configuration of the processing device 120 of the simulation system 100.
The simulation system 100 includes a screen 110A, a projection device 110B, 3D (3-dimensional) glasses 110C, a processing device 120, and a position measurement device 140.
The simulation system 100 of the embodiment can be applied, for example, to an assembly support system for grasping assembly workability in a virtual space. In the assembly support system, for example, the work of assembling electronic components such as a CPU (Central Processing Unit) module, a memory module, a communication module, or a connector onto a motherboard can be simulated in the virtual space.
However, the simulation system 100 of the embodiment is not limited to the assembly support system and can be applied to various systems for confirming workability in a three-dimensional space.
 As the screen 110A, for example, a projector screen can be used. The size of the screen 110A may be set as appropriate according to the application. An image projected by the projection device 110B is displayed on the screen 110A. Here, it is assumed that images of an article 111, buttons 111A and 111B, and a pointer 130A are displayed on the screen 110A.
 In the simulation system 100, as an example, the pointer 130A is displayed in the direction in which the user 1 points a hand toward the screen 110A. The user 1 may keep the fingertips of the right hand either open or clenched. The pointer 130A is displayed on the screen 110A in the direction in which the user 1 points with the hand by moving the right arm. The motion in which the user 1 points with the right arm in order to move the pointer 130A is referred to as an instruction operation.
 The projection device 110B may be any device that can project an image onto the screen 110A; for example, a projector can be used. The projection device 110B is connected to the processing device 120 via a cable 110B1, and projects an image input from the processing device 120 onto the screen 110A. Here, the projection device 110B is of a type that can project a 3D image (stereoscopic image) onto the screen 110A.
 The screen 110A and the projection device 110B are an example of a display unit.
 The 3D glasses 110C are worn by the user 1 who uses the simulation system 100. The 3D glasses 110C may be any glasses that can convert the image projected onto the screen 110A by the projection device 110B into a 3D image; for example, polarized glasses that polarize incident light, or liquid crystal shutter glasses having liquid crystal shutters, can be used.
 Instead of the screen 110A and the projection device 110B, for example, a liquid crystal display panel may be used. When the 3D glasses 110C are unnecessary, the 3D glasses 110C may be omitted. Further, instead of the screen 110A and the projection device 110B, a head-mounted display capable of showing a 3D image may be used.
 The processing device 120 includes a human body detection unit 121, a position detection unit 122, a detection region generation unit 123, a motion detection unit 124, an object determination unit 125, a pointer generation unit 126, a data holding unit 127, and a video output unit 128. The processing device 120 is realized, for example, by a computer having a memory.
 The human body detection unit 121 determines whether the body of the user 1 is present, based on data input from the position measurement device 140 that three-dimensionally represents the position, shape, and the like of the body of the user 1, and, when it is present, obtains coordinates representing the positions of the parts of the body of the user 1. As an example, the positions of the parts of the body of the user 1 are represented by the positions of the skeleton of the user 1. The skeleton positions include, for example, the positions of the head, shoulders, elbows, wrists, and hands. The human body detection unit 121, together with the position detection unit 122, is an example of a second detection unit.
 The position detection unit 122 obtains coordinates P(Px, Py, Pz) based on the coordinates representing the positions of the parts of the body of the user 1 input from the human body detection unit 121. The position detection unit 122, together with the human body detection unit 121, is an example of a second detection unit.
 The position detection unit 122 obtains a straight line connecting the right shoulder and the right wrist of the user 1 based on the coordinates representing the positions of the parts of the body of the user 1 input from the human body detection unit 121, and obtains the coordinates of the intersection of the straight line with the screen 110A.
 The position detection unit 122 converts the coordinate values of the intersection of the straight line with the screen 110A into coordinates in the image projected onto the screen 110A, and outputs them as coordinates P(Px, Py, Pz). Note that the coordinates P(Px, Py, Pz) may instead be detected by the position measurement device 140.
 Here, the X axis is defined in the horizontal direction parallel to the screen 110A, the Y axis in the vertical direction, and the Z axis in the horizontal direction perpendicular to the screen 110A. When the coordinates of the right shoulder of the user 1 are S(Sx, Sy, Sz) and the coordinates of the right wrist are H(Hx, Hy, Hz), the vector SH in the direction connecting the right shoulder and the right wrist is expressed by the following equation (1). The vector SH represents the direction in which the user 1 indicates the position of the pointer 130A with the right hand.
 SH = H − S = (Hx − Sx, Hy − Sy, Hz − Sz)   (1)
 The magnitude of the vector SH, using the right-shoulder coordinates S(Sx, Sy, Sz) and the right-wrist coordinates H(Hx, Hy, Hz), is expressed by the following equation (2).
 |SH| = √((Hx − Sx)² + (Hy − Sy)² + (Hz − Sz)²)   (2)
 With ΔL denoting the amount (offset amount) by which the user 1 is offset from the screen 110A in the Z-axis direction, the coordinates P1(P1x, P1y, P1z) expressed by the following equation (3) are obtained.
 P1 = S + (ΔL / (Sz − Hz)) · SH   (3)
 The coordinates P1(P1x, P1y, P1z) are the coordinates obtained by equations (1), (2), and (3), and are the coordinates of the intersection of the straight line connecting the right shoulder and the right wrist of the user 1 with the screen 110A. The coordinates of this intersection are coordinates in the real space.
 The position detection unit 122 converts the intersection coordinates P1(P1x, P1y, P1z) into coordinates in the image projected onto the screen 110A, and outputs them as coordinates P(Px, Py, Pz). The coordinates P(Px, Py, Pz) are the coordinates obtained by converting the intersection coordinates P1(P1x, P1y, P1z) into coordinates in the image projected onto the screen 110A.
 In this manner, the position detection unit 122 obtains the coordinates P(Px, Py, Pz) by converting the intersection coordinates P1(P1x, P1y, P1z) of the real space into values in the coordinate system of the virtual space.
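 As an illustration only, the computation of equations (1) through (3) can be sketched as follows. This is a minimal sketch under the assumption that the screen lies in the plane z = 0 and that ΔL is the user's Z-axis distance from that plane; the function and variable names are not part of the embodiment.

```python
import math

def pointing_position(S, H, delta_L):
    """Intersection P1 of the shoulder-wrist line with the screen.

    S: right-shoulder coordinates (Sx, Sy, Sz)
    H: right-wrist coordinates (Hx, Hy, Hz)
    delta_L: offset of the user from the screen in the Z-axis direction
    Assumption: the screen plane is z = 0 and Sz corresponds to delta_L.
    """
    # Equation (1): vector SH from the shoulder toward the wrist.
    SH = (H[0] - S[0], H[1] - S[1], H[2] - S[2])
    # Equation (2): magnitude of SH.
    mag = math.sqrt(SH[0] ** 2 + SH[1] ** 2 + SH[2] ** 2)
    if mag == 0.0 or SH[2] == 0.0:
        return None  # no arm direction, or arm parallel to the screen
    # Equation (3): extend SH from S until the line reaches the screen plane.
    t = delta_L / (S[2] - H[2])
    return (S[0] + t * SH[0], S[1] + t * SH[1], S[2] + t * SH[2])
```

For example, with the shoulder at (0.0, 1.5, 1.0), the wrist at (0.1, 1.4, 0.6), and ΔL = 1.0, the line reaches the screen at approximately (0.25, 1.25, 0.0).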
 The detection region generation unit 123 generates a detection region 130B centered on the coordinates P(Px, Py, Pz). The detection region generation unit 123 is an example of a region generation unit. The pointer 130A is a sphere centered on the coordinates P(Px, Py, Pz), and the radius of the detection region 130B is larger than the radius of the pointer 130A. The detection region 130B is therefore arranged concentrically with the pointer 130A.
 The detection region 130B is the region outside the surface of the pointer 130A and contained in the sphere of a predetermined radius centered on the coordinates P(Px, Py, Pz). The detection region 130B includes the surface of the sphere defined by the predetermined radius.
 The pointer 130A and the detection region 130B are alike in that both move according to the user's instruction operation. However, while the pointer 130A is displayed on the screen 110A as an image, the detection region 130B is not displayed on the screen 110A.
 The pointer 130A is used to determine contact with the article 111 or the button 111A or 111B displayed on the screen 110A, whereas the detection region 130B is used to determine the presence of the article 111, the button 111A or 111B, or the like around the pointer 130A.
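 The concentric relationship between the pointer 130A and the detection region 130B described above can be sketched as follows. This is an illustrative data structure; the class name and the radius values are assumptions, not part of the embodiment.

```python
from dataclasses import dataclass

@dataclass
class PointerState:
    """Pointer 130A and detection region 130B share the same center P."""
    center: tuple          # coordinates P (Px, Py, Pz)
    pointer_radius: float  # radius of the displayed pointer 130A
    detect_radius: float   # radius of the invisible detection region 130B

    def __post_init__(self):
        # The detection region must enclose the pointer:
        # same center, strictly larger radius.
        assert self.detect_radius > self.pointer_radius

    def move_to(self, new_center):
        # Both regions follow the user's instruction operation together.
        self.center = new_center
```

A caller would move both regions at once, e.g. `PointerState((0.0, 0.0, 0.0), 0.02, 0.1).move_to(p)` for each new pointing position p.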
 The motion detection unit 124 detects motions of the user 1 based on the coordinates representing the positions of the parts of the body of the user 1 input from the human body detection unit 121. The motions of the user 1 are motions such as gestures performed by the user 1; here, as examples, there are a motion for resetting the size of the pointer 130A, a motion for confirming an input to the button 111A or 111B, and a motion for canceling an input to the button 111A or 111B.
 The motions for confirming inputs to the buttons 111A and 111B are motions of tapping the buttons 111A and 111B, respectively. The motion for canceling an input to the button 111A is a motion of swiping across the button 111A. Here, since the user 1 operates with the right hand 2, more specifically, the motion for canceling an input to the button 111A is a motion of moving the pointer 130A leftward while the pointer 130A is touching the button 111A.
 Likewise, the motion for canceling an input to the button 111B is a motion of swiping across the button 111B. Since the user 1 operates with the right hand 2, more specifically, the motion for canceling an input to the button 111B is a motion of moving the pointer 130A rightward while the pointer 130A is touching the button 111B.
 The motion for resetting the size of the pointer 130A is a motion of crossing both arms. The motion detection unit 124 is an example of a first determination unit and a second determination unit.
 The object determination unit 125 determines the type of an article at least a part of which is located inside the detection region 130B. The type of article is, as an example, the type of the article 111 or the button 111A or 111B.
 Whether at least a part of an article is located inside the detection region 130B may be determined by whether the detection region 130B and the display region of the article 111 or the button 111A or 111B have an intersection. The case where at least a part of an article is located inside the detection region 130B includes the case where the article is located on the outer peripheral surface (boundary) of the detection region 130B.
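 For a Cuboid article, the intersection test described above can be written with a standard closest-point check. This is a sketch under the assumption of an axis-aligned box; it is not taken from the original.

```python
def sphere_intersects_box(center, radius, box_min, box_max):
    """True if the sphere (e.g. detection region 130B) and the axis-aligned
    box (display region of an article) share at least one point.
    The boundary of the sphere counts as inside."""
    d2 = 0.0
    for c, lo, hi in zip(center, box_min, box_max):
        nearest = min(max(c, lo), hi)  # closest box point to the center
        d2 += (c - nearest) ** 2
    return d2 <= radius ** 2
```

The same test, run with the radius of the pointer 130A instead of the detection-region radius, decides contact between the pointer 130A and the article.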
 The pointer generation unit 126 generates the pointer 130A as a spherical image centered on the coordinates P(Px, Py, Pz). The radius of the pointer 130A is smaller than the radius of the detection region 130B. Since the pointer 130A and the detection region 130B are arranged concentrically, the detection region 130B exists around the pointer 130A. The pointer generation unit 126 is an example of a size setting unit.
 The pointer 130A moves according to the user's instruction operation, and is used to determine contact with the article 111 or the button 111A or 111B displayed on the screen 110A.
 Whether the pointer 130A and the article 111 or the button 111A or 111B are in contact may be determined by whether at least a part of the display region of the article 111 or the button 111A or 111B is contained inside the pointer 130A, which is centered on the coordinates P(Px, Py, Pz) and has a predetermined radius. The interior of the pointer 130A includes the surface of the pointer 130A.
 Whether at least a part of the display region of the article 111 or the button 111A or 111B is contained inside the pointer 130A may be determined by whether the pointer 130A and the display region of the article 111 or the button 111A or 111B have an intersection.
 The case where at least a part of the display region of the article 111 or the button 111A or 111B is contained inside the pointer 130A includes the case where the display region of the article 111 or the button 111A or 111B is located on the surface (boundary) of the pointer 130A.
 Although this embodiment does not describe in detail how contact between the pointer 130A and the article 111 or the button 111A or 111B is handled, the contact may be conveyed to the user 1 by, for example, changing the color of the pointer 130A while the pointer 130A is in contact with the article 111 or the button 111A or 111B.
 The size of the pointer 130A generated by the pointer generation unit 126 is set based on data representing the size of the pointer 130A held in the data holding unit 127. The pointer generation unit 126 also sets the size of the pointer 130A according to the type of article present inside the detection region 130B, and according to the motion of the user 1. A specific method for setting the size of the pointer 130A will be described later.
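 The size-setting behaviour described above can be sketched as follows. This is illustrative only: `size_data` stands in for the table held in the data holding unit 127, the reset size is an assumed default, and the rule of taking the first article found in the region is an assumption for the sketch.

```python
DEFAULT_SIZE = (0.05, 0.05, 0.05)  # assumed reset size (Xp, Yp, Zp)

def select_pointer_size(article_ids_in_region, size_data, reset_gesture=False):
    """Return the pointer size (Xp, Yp, Zp) for the current situation.

    article_ids_in_region: IDs of articles at least partly inside region 130B
    size_data: mapping of article ID -> pointer size, as in FIG. 7
    reset_gesture: True when the user has crossed both arms
    """
    if reset_gesture or not article_ids_in_region:
        return DEFAULT_SIZE
    # Use the size associated with an article found in the detection region.
    return size_data[article_ids_in_region[0]]
```

For example, when article 001 enters the detection region, the pointer would be resized to the (0.05, 0.02, 0.05) entry of the table; crossing both arms restores the default.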
 The data holding unit 127 holds data such as article data representing the coordinates and shapes of the article 111 and the buttons 111A and 111B, image data of the pointer 130A, and data representing the size of the pointer 130A. The data holding unit 127 is realized by a memory and is an example of a data storage unit.
 The output terminal of the video output unit 128 is connected to the projection device 110B by the cable 110B1. The video output unit 128 outputs the image specified by the article data of the article 111 held in the data holding unit 127 to the projection device 110B and causes it to be displayed on the screen 110A.
 The video output unit 128 also causes the projection device 110B to display the pointer 130A. The image of the pointer 130A is generated by the pointer generation unit 126.
 The position measurement device 140 is a device that acquires data three-dimensionally representing the position, shape, and the like of the body of the user 1. The position measurement device 140 is installed above the screen 110A and has a detection range 142 in front of the screen 110A. The detection range 142 extends from the camera 140A of the position measurement device 140 to the front of the screen 110A. The position measurement device 140 is connected to the processing device 120 by a cable 141.
 More specifically, the position measurement device 140 is, for example, a device that irradiates a subject with an infrared laser and calculates the distance (depth) to each point in the image based on the time until the reflected light is received. The position measurement device 140 acquires images of the user 1 performing instruction operations toward the screen 110A, and acquires three-dimensional range-image data representing the posture, gestures, and the like of the user 1. The position measurement device 140 transmits the acquired three-dimensional data to the processing device 120 via the cable 141.
 FIG. 3 is a perspective view of a computer system to which the processing device 120 of the embodiment is applied. The computer system 10 shown in FIG. 3 includes a main body 11, a display 12, a keyboard 13, a mouse 14, and a modem 15.
 The main body 11 incorporates a CPU (Central Processing Unit), an HDD (Hard Disk Drive), a disk drive, and the like. The display 12 displays analysis results and the like on a screen 12A according to instructions from the main body 11; the display 12 may be, for example, a liquid crystal monitor. The keyboard 13 is an input unit for inputting various kinds of information to the computer system 10. The mouse 14 is an input unit for designating an arbitrary position on the screen 12A of the display 12. The modem 15 accesses an external database or the like and downloads programs and the like stored in other computer systems.
 A program that gives the computer system 10 the functions of the processing device 120 is stored in a portable recording medium such as a disk 17, or is downloaded from a recording medium 16 of another computer system using a communication device such as the modem 15, and is input to the computer system 10 and compiled.
 A program that gives the computer system 10 the functions of the processing device 120 causes the computer system 10 to operate as the processing device 120. This program may be stored in a computer-readable recording medium such as the disk 17. The computer-readable recording medium is not limited to portable recording media such as the disk 17, an IC card memory, a magnetic disk such as a floppy (registered trademark) disk, a magneto-optical disk, a CD-ROM, or a USB (Universal Serial Bus) memory. The computer-readable recording medium also includes various recording media accessible from a computer system connected via a communication device such as the modem 15 or a LAN.
 FIG. 4 is a block diagram illustrating the configuration of the main part inside the main body 11 of the computer system 10. The main body 11 includes a CPU 21, a memory unit 22 including a RAM, a ROM, or the like, a disk drive 23 for the disk 17, and a hard disk drive (HDD) 24, connected by a bus 20. In the embodiment, the display 12, the keyboard 13, and the mouse 14 are connected to the CPU 21 via the bus 20, but they may be connected directly to the CPU 21. The display 12 may also be connected to the CPU 21 via a known graphics interface (not shown) that processes input/output image data.
 In the computer system 10, the keyboard 13 and the mouse 14 are input units of the processing device 120. The display 12 is a display unit that displays input contents and the like for the processing device 120 on the screen 12A.
 The computer system 10 is not limited to the configurations shown in FIGS. 3 and 4; various known elements may be added or used alternatively.
 FIG. 5 is a diagram showing shape data.
 The article data is data representing the coordinates and shape of an article displayed on the screen 110A. The article data includes an article ID, a shape type, reference coordinates, a size, and rotation angles.
 The shape type represents the outer shape of the article. In FIG. 5, as examples, the shape types Cuboid (rectangular parallelepiped) and Cylinder (circular cylinder) are shown.
 The reference coordinates indicate the coordinate values of the point serving as the reference for the coordinates representing the entire article. The unit of the coordinate values is meters (m). An XYZ coordinate system is used as the coordinate system.
 The size represents the length of the article in the X-axis direction, the length in the Y-axis direction, and the length in the Z-axis direction. The unit is meters (m). As an example, the length in the X-axis direction represents the width, the length in the Y-axis direction represents the height, and the length in the Z-axis direction represents the depth.
 The rotation angles are expressed as rotation angles θx, θy, and θz about the X-axis, Y-axis, and Z-axis directions. The unit is degrees (deg.). The rotation angle θx is the angle by which the article is rotated about the X axis as the rotation axis. Similarly, the rotation angles θy and θz are the angles by which the article is rotated about the Y axis and the Z axis as rotation axes, respectively. The positive directions of the rotation angles θx, θy, and θz may be determined in advance.
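 As an illustration, applying one of these rotation angles, here θz about the Z axis, to a point of an article can be written as follows. The positive direction (counter-clockwise when viewed from the +Z side) is an assumption, since the text leaves it to be determined in advance.

```python
import math

def rotate_z(point, theta_z_deg):
    """Rotate a point about the Z axis by the rotation angle θz in degrees.
    Positive direction assumed counter-clockwise viewed from +Z."""
    t = math.radians(theta_z_deg)
    x, y, z = point
    return (x * math.cos(t) - y * math.sin(t),
            x * math.sin(t) + y * math.cos(t),
            z)  # rotation about Z leaves the Z coordinate unchanged
```

For example, θz = 90.0 carries the point (1, 0, 0) to approximately (0, 1, 0), as with the 90-degree rotation of the Cylinder article described below.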
 Using such article data, an image specified by the article data can be represented in the same manner as an image of an article displayed from CAD data.
 The article data is stored in the data holding unit 127 of the processing device 120.
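 For illustration, the article data described above can be represented as follows; the field names are assumptions mirroring the columns just described, and the three entries transcribe the values given in the text.

```python
from dataclasses import dataclass

@dataclass
class ArticleData:
    article_id: str   # e.g. "001"
    shape_type: str   # "Cuboid" or "Cylinder"
    reference: tuple  # reference coordinates (X, Y, Z) in meters
    size: tuple       # lengths along the X, Y, Z axes in meters
    rotation: tuple   # rotation angles (θx, θy, θz) in degrees

# The three articles described for FIG. 6:
articles = [
    ArticleData("001", "Cuboid", (0.0, 0.0, 0.0), (0.8, 0.2, 0.4), (0.0, 0.0, 0.0)),
    ArticleData("002", "Cuboid", (0.6, 0.2, 0.0), (0.2, 0.2, 0.1), (0.0, 0.0, 0.0)),
    ArticleData("003", "Cylinder", (0.8, 0.3, 0.1), (0.2, 1.0, 0.3), (0.0, 0.0, 90.0)),
]
```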
 FIG. 6 is a diagram illustrating an example of images of articles.
 FIG. 6 shows the three articles represented by the article data of FIG. 8.
 The article with article ID 001 has the shape type Cuboid, reference coordinates (X, Y, Z) of (0.0, 0.0, 0.0), a size of (0.8, 0.2, 0.4), and rotation angles θx, θy, θz of (0.0, 0.0, 0.0).
 Since the reference coordinates (X, Y, Z) are (0.0, 0.0, 0.0), one vertex of the article with article ID 001 coincides with the origin O of the XYZ coordinate system.
 The article with article ID 002 has the shape type Cuboid, reference coordinates (X, Y, Z) of (0.6, 0.2, 0.0), a size of (0.2, 0.2, 0.1), and rotation angles θx, θy, θz of (0.0, 0.0, 0.0).
 The article with article ID 002 is therefore arranged on top of the article with article ID 001.
 The article with article ID 003 has the shape type Cylinder, reference coordinates (X, Y, Z) of (0.8, 0.3, 0.1), a size of (0.2, 1.0, 0.3), and rotation angles θx, θy, θz of (0.0, 0.0, 90.0).
 The article with article ID 003 is therefore connected to the X-axis positive direction side of the article with article ID 002 in a state rotated 90 degrees about the Z axis as the rotation axis.
 As described above, in the embodiment, the coordinates and shape of an article in the image projected onto the screen 110A are defined using the article data having the article ID, shape type, reference coordinates, size, and rotation angles shown in FIG. 8.
 For example, when the shape type is Cuboid, the coordinates of the eight vertices can be obtained by adding or subtracting, with respect to the reference coordinates, the lengths of the article in the X-axis, Y-axis, and Z-axis directions represented by the size. The coordinates of the eight vertices represent the coordinates of the Corners of an article whose shape type is Cuboid.
 Once the coordinates of the eight vertices are obtained, expressions representing the twelve edges can be obtained. The expressions representing the twelve edges are expressions representing the coordinates of the Edges of an article whose shape type is Cuboid.
 Further, from the coordinates of the eight vertices and/or the expressions representing the twelve edges, expressions representing the six surfaces of an article whose shape type is Cuboid can be obtained, and the coordinates of the Surfaces can thus be obtained.
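 The vertex computation described above can be sketched as follows, for the unrotated case (θx = θy = θz = 0); the function name is an assumption.

```python
from itertools import product

def cuboid_vertices(reference, size):
    """Eight Corner coordinates of an unrotated Cuboid article, obtained by
    adding the X/Y/Z lengths of the size to the reference coordinates."""
    rx, ry, rz = reference
    sx, sy, sz = size
    return [(rx + dx * sx, ry + dy * sy, rz + dz * sz)
            for dx, dy, dz in product((0, 1), repeat=3)]
```

For the article with article ID 001, this yields the eight vertices spanning from the origin O to (0.8, 0.2, 0.4); the twelve Edge expressions and six Surface expressions follow from these vertices.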
 When the shape type is Cylinder, expressions representing the circles (or ellipses) at both ends of the cylinder can be obtained based on the lengths of the article in the X-axis, Y-axis, and Z-axis directions represented by the size. Using the expressions representing the circles (or ellipses) at both ends together with the reference coordinates, expressions representing the coordinates of the circles (or ellipses) at both ends can be obtained. The coordinates of the side surface of the cylinder can be obtained by using the expressions representing the coordinates of the circles (or ellipses) at both ends.
 Although articles whose shape types are Cuboid and Cylinder have been described here, the coordinates and shapes in the image projected onto the screen 110A can be obtained in the same manner for articles of various other shapes, such as a sphere, a triangular pyramid, or a rectangular parallelepiped having a recess.
 FIG. 7 is a diagram showing size data.
 The size data is table-format data in which the article IDs of the articles displayed on the screen 110A are associated with pointer sizes of the pointer 130A. The pointer size (Xp, Yp, Zp) represents the width Xp in the X-axis direction, the height Yp in the Y-axis direction, and the depth Zp in the Z-axis direction.
 The pointer 130A is displayed as an ellipsoid having the width Xp, the height Yp, and the depth Zp. As an example, the pointer size (Xp, Yp, Zp) associated with the article with article ID 001 is (0.05, 0.02, 0.05). The pointer size (Xp, Yp, Zp) associated with the article with article ID 002 is (0.01, 0.01, 0.01). The pointer size (Xp, Yp, Zp) associated with the article with article ID 003 is (0.015, 0.05, 0.015).
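 Transcribed as a mapping, the size data reads as follows (an illustrative form of the table of FIG. 7; the third row is read as belonging to article ID 003).

```python
# Pointer sizes (Xp, Yp, Zp) in meters, keyed by article ID, as in FIG. 7.
size_data = {
    "001": (0.05, 0.02, 0.05),
    "002": (0.01, 0.01, 0.01),
    "003": (0.015, 0.05, 0.015),  # assumed to be the entry for article 003
}
```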
 The reason why the pointer size is set in accordance with the article in this way will be described with reference to FIGS. 8 to 16.
 FIGS. 8 to 16 are diagrams showing methods for determining the size of the pointer 130A in accordance with the size of an article.
 The surface (ellipsoidal surface) of the pointer 130A is represented by Expression (4):

 x^2/a^2 + y^2/b^2 + z^2/c^2 = 1 … (4)

 Half the values of the width Xp, the height Yp, and the depth Zp are the parameters a, b, and c of Expression (4), respectively. That is, a = Xp/2, b = Yp/2, and c = Zp/2.
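Expression (4) can be illustrated with a minimal sketch: given a pointer size (Xp, Yp, Zp), the semi-axes are half those values, and a point (x, y, z) relative to the pointer center lies on or inside the pointer surface when the left-hand side of Expression (4) does not exceed 1. The function name is illustrative.

```python
def inside_pointer(x, y, z, Xp, Yp, Zp):
    """Return True when (x, y, z), measured from the pointer center,
    lies on or inside the ellipsoid of Expression (4)."""
    # Semi-axes are half the pointer width, height, and depth.
    a, b, c = Xp / 2.0, Yp / 2.0, Zp / 2.0
    return (x / a) ** 2 + (y / b) ** 2 + (z / c) ** 2 <= 1.0
```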
 As shown in FIG. 8, when an article 111-1 having a width X1, a height Y1, and a depth Z1 is displayed on the screen 110A, the ellipsoid parameters a, b, and c are set, as an example, to a = kX1, b = kY1, c = kZ1, and the parameter k is set to 0.05.
 As shown in FIG. 9, when an article 111-2 having a width X2, a height Y2, and a depth Z2 is displayed on the screen 110A, the ellipsoid parameters a, b, and c are set, as an example, to a = kX2, b = kY2, c = kZ2. The parameter k is 0.05.
 The article 111-2 has a shape in which three L-shaped blocks are connected. The size of the pointer 130A is a size that allows it to enter the gaps between the three blocks of the article 111-2.
 As shown in FIG. 10, when an article 111-3 having a width X3, a height Y3, and a depth Z3 is displayed on the screen 110A, the ellipsoid parameters a, b, and c are set, as an example, to a = kX3, b = kY3, c = kZ3. The parameter k is 0.05.
 In the article 111-3, a rectangular-parallelepiped hole is provided inside a rectangular parallelepiped 111-31, and a prism portion 111-32 is provided inside the hole. The size of the pointer 130A is a size that allows it to enter the gap between the hole of the rectangular parallelepiped 111-31 and the prism portion 111-32.
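The per-axis rule of FIGS. 8 to 10 can be sketched as follows. The helper name is illustrative; the default k = 0.05 follows the examples in the text.

```python
def axis_scaled_params(width, height, depth, k=0.05):
    """Per-axis sizing of FIGS. 8-10: each ellipsoid parameter is the
    corresponding article dimension scaled by k (a = kX, b = kY, c = kZ)."""
    return (k * width, k * height, k * depth)
```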
 Also, as shown in FIG. 11, when the article 111-1 having the width X1, the height Y1, and the depth Z1 is displayed on the screen 110A, the ellipsoid parameters a, b, and c may, for example, be set to the value l1 obtained by Expression (5). In this case, the parameter k is 0.05 and a = b = c = l1.
Figure JPOXMLDOC01-appb-M000005
 Also, as shown in FIG. 12, when the article 111-2 having the width X2, the height Y2, and the depth Z2 is displayed on the screen 110A, the ellipsoid parameters a, b, and c may, for example, be set to the value l2 obtained by Expression (6). In this case, the parameter k is 0.05 and a = b = c = l2.
 l2 = k * sqrt(X2^2 + Y2^2 + Z2^2) … (6)
 The size of the pointer 130A obtained by Expression (6) is a size that allows it to enter the gaps between the three blocks of the article 111-2.
 Also, as shown in FIG. 13, when the article 111-3 having the width X3, the height Y3, and the depth Z3 is displayed on the screen 110A, the ellipsoid parameters a, b, and c may, for example, be set to the value l3 obtained by Expression (7). In this case, the parameter k is 0.05 and a = b = c = l3.
 l3 = k * sqrt(X3^2 + Y3^2 + Z3^2) … (7)
 The size of the pointer 130A obtained by Expression (7) is a size that allows it to enter the gap between the hole of the rectangular parallelepiped 111-31 and the prism portion 111-32.
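The uniform-size rule of FIGS. 11 to 13 derives a single length l from the article dimensions and uses it for all three ellipsoid parameters (a = b = c = l). The equation images for Expressions (5) to (7) are not reproduced in this extract, so the concrete formula below, k times the article's space diagonal, is an assumption made purely for illustration.

```python
import math

def uniform_param(width, height, depth, k=0.05):
    """Assumed form of Expressions (5)-(7): a single length
    l = k * sqrt(X^2 + Y^2 + Z^2), used as a = b = c = l."""
    return k * math.hypot(width, height, depth)
```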
 Also, as shown in FIG. 14, when the article 111-1 is displayed on the screen 110A, the ellipsoid parameters a, b, and c may, as an example, be set to a = b = c = kX1, with the parameter k set to 0.5. That is, the parameters a, b, and c may be set to half the minimum of the width X1, the height Y1, and the depth Z1.
 Also, as shown in FIG. 15, when the article 111-2 is displayed on the screen 110A, the ellipsoid parameters a, b, and c may, as an example, be set to a = b = c = kX2A. The parameter k is 0.5.
 X2A is the dimension of the gaps between the three L-shaped blocks of the article 111-2, and is the minimum of the outer dimensions of the article 111-2. In this way, the parameters a, b, and c may be set to half the minimum of the outer dimensions of the article 111-2.
 The size of the pointer 130A obtained in this way is a size that allows it to enter the gaps between the three blocks of the article 111-2.
 Also, as shown in FIG. 16, when the article 111-3 is displayed on the screen 110A, the ellipsoid parameters a, b, and c may, as an example, be set to a = b = c = kY3A. The parameter k is 0.5.
 Y3A is the height in the Y-axis direction of the prism portion 111-32 of the rectangular parallelepiped 111-31, and is the minimum of the outer dimensions of the article 111-3. In this way, the parameters a, b, and c may be set to half the minimum of the outer dimensions of the article 111-3.
 The size of the pointer 130A obtained in this way is a size that allows it to enter the gap between the hole of the rectangular parallelepiped 111-31 and the prism portion 111-32.
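The minimum-dimension rule of FIGS. 14 to 16 can be sketched as follows; the helper name is illustrative, and the default k = 0.5 follows the examples in the text (half the smallest outer dimension, such as a gap width or hole height, so that the pointer fits into the gap).

```python
def min_dimension_param(*dimensions, k=0.5):
    """Uniform sizing of FIGS. 14-16: a = b = c = k times the smallest
    of the article's outer dimensions."""
    return k * min(dimensions)
```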
 FIG. 17 is a diagram showing size data in which article IDs are associated with pointer sizes. Here, an example of the article IDs of the buttons 111A and 111B and the article 111 shown in FIG. 1, and of the pointer sizes associated with those article IDs, will be described.
 Suppose that the article ID of the button 111A is 011, the article ID of the button 111B is 012, and the article ID of the article 111 is 013.
 For the button 111A with article ID 011, the pointer size (Xp, Yp, Zp) is (0.01, 0.01, 0.01). For the button 111B with article ID 012, the pointer size is (0.01, 0.01, 0.01). For the article 111 with article ID 013, the pointer size is (0.04, 0.04, 0.04).
 FIGS. 18 and 19 are diagrams showing the relationship between articles and the size of the pointer 130A. Here, as an example, a case will be described in which, when images of the article 111 and the buttons 111A and 111B are displayed on the screen 110A as shown in FIG. 1, the pointer 130A is displayed based on the size data shown in FIG. 17.
 The pointer 130A is displayed on the screen 110A, and a detection area 130B is set around the pointer 130A. The detection area 130B itself is not displayed on the screen 110A.
 As shown in FIG. 18, when the pointer 130A is at the point A1, the size of the pointer 130A is set to an initial value. The initial value is, as an example, (Xp, Yp, Zp) = (0.03, 0.03, 0.03). The initial value may be set in accordance with, for example, the size of the screen 110A and the appropriate position of the user 1 with respect to the screen 110A.
 When the pointer 130A moves from the point A1 to the point B1 and the buttons 111A and 111B enter the detection area 130B, the pointer size (Xp, Yp, Zp) is set to (0.01, 0.01, 0.01).
 In this way, when a plurality of articles (the buttons 111A and 111B) are inside the detection area 130B, the pointer 130A is displayed using the minimum of the pointer sizes associated with those articles. Since the pointer sizes associated with the buttons 111A and 111B are equal, the pointer size (Xp, Yp, Zp) is set to (0.01, 0.01, 0.01) here.
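The minimum-size rule for multiple articles can be sketched as follows. The text only states that the minimum of the associated pointer sizes is used; comparing sizes by volume is an assumption of this sketch.

```python
def select_pointer_size(sizes):
    """When several articles are inside the detection area, use the
    smallest of their associated pointer sizes (compared by volume)."""
    return min(sizes, key=lambda s: s[0] * s[1] * s[2])
```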
 Because the buttons 111A and 111B are relatively small articles, making the pointer 130A smaller than the initial value makes it easier for the user 1 to select the buttons 111A and 111B.
 Also, as shown in FIG. 19, when the pointer 130A moves from the point A1 to the point B2 and the article 111 enters the detection area 130B, the pointer size (Xp, Yp, Zp) is set to (0.04, 0.04, 0.04).
 Because the article 111 is larger than the buttons 111A and 111B and is a relatively large article, making the pointer 130A larger than the initial value makes it easier for the user 1 to select the article 111.
 In this way, the simulation system 100 changes the size of the pointer 130A in accordance with the size of the articles present in the detection area 130B. Making the pointer 130A easier for the user 1 to see improves the usability of the simulation system 100.
 FIG. 20 is a flowchart showing processing executed by the processing device 120 of the embodiment. The flowchart shown in FIG. 20 shows the processing of setting the pointer size of the pointer 130A and displaying the pointer 130A on the screen 110A.
 Here, as an example, a case will be described in which images of the article 111 and the buttons 111A and 111B are displayed on the screen 110A as shown in FIG. 1.
 The processing device 120 starts processing after power-on (start).
 The processing device 120 acquires article data from the data holding unit 127 (step S1).
 The processing device 120 generates a video signal using the article data and causes the projection device 110B to project an image (step S2). As a result, images of stereoscopic models of the article 111 and the buttons 111A and 111B are displayed on the screen 110A. The images of the article 111 and the buttons 111A and 111B displayed on the screen 110A represent virtual objects existing in a virtual space.
 The processing of steps S1 and S2 is performed by the video output unit 128.
 The processing device 120 acquires, from the position measurement device 140, data three-dimensionally representing the position, shape, and the like of the body of the user 1 (step S3). The processing of step S3 is performed by the human body detection unit 121.
 The processing device 120 determines whether the body of the user 1 is present based on the data acquired in step S3 (step S4). The processing of step S4 is performed by the human body detection unit 121.
 When the processing device 120 determines that the body of the user 1 is present (S4: YES), it obtains coordinates representing the positions of the respective parts of the body of the user 1 (step S5). The processing of step S5 is performed by the human body detection unit 121.
 The processing device 120 detects coordinates P (Px, Py, Pz) (step S6). The coordinates P (Px, Py, Pz) are obtained by converting the coordinates of the intersection of the screen 110A and the straight line connecting the right shoulder and the right wrist of the user 1 into coordinates in the image projected onto the screen 110A, and are obtained by the position detection unit 122. The processing of step S6 is performed by the position detection unit 122.
 The processing device 120 generates the detection area 130B (step S7). The processing of step S7 is performed by the detection area generation unit 123. The detection area generation unit 123 generates a detection area 130B of a predetermined radius centered on the coordinates P (Px, Py, Pz).
 The detection area 130B is an area outside the surface of the pointer 130A that is included in a sphere centered on the coordinates P (Px, Py, Pz) and defined by the predetermined radius.
 The processing device 120 determines whether an article is present inside the detection area 130B (step S8). The processing of step S8 is performed by the object determination unit 125. The object determination unit 125 determines whether the detection area 130B intersects the display area of the article 111 or of the button 111A or 111B, that is, whether there is an article at least part of which is located inside the detection area 130B. When an article is present inside the detection area 130B, the object determination unit 125 determines the type of the article.
 When the processing device 120 determines that no article is present inside the detection area 130B (S8: NO), it sets the pointer size to the initial value (step S9). The processing of step S9 is performed by the pointer generation unit 126. The initial value of the pointer size is held in the data holding unit 127.
 The processing device 120 displays the pointer 130A on the screen 110A (step S10). The processing of step S10 is performed by the video output unit 128. The video output unit 128 causes the projection device 110B to display the image of the pointer 130A generated by the pointer generation unit 126 on the screen 110A.
 When the pointer 130A is displayed on the screen 110A, the series of processing ends (end).
 On the other hand, when the processing device 120 determines that an article is present inside the detection area 130B (S8: YES), it acquires the article ID of the article present inside the detection area 130B (step S11). The processing of step S11 is performed by the object determination unit 125. In step S11, when a plurality of articles are present inside the detection area 130B, a plurality of article IDs are acquired.
 The object determination unit 125 acquires the article ID of the article 111, or of the button 111A or 111B, that intersects the detection area 130B.
 The processing device 120 reads the pointer size corresponding to the article ID acquired in step S11 from the size data (see FIGS. 7 and 17) (step S12). The processing of step S12 is performed by the pointer generation unit 126. When a plurality of article IDs were acquired in step S11, the pointer generation unit 126 reads a plurality of pointer sizes.
 This corresponds to, for example, reading the two pointer sizes associated with the article IDs 011 and 012 shown in FIG. 17 when the buttons 111A and 111B are present in the detection area 130B as shown in FIG. 18.
 When a plurality of pointer sizes have been read in step S12, the processing device 120 selects the smallest of them (step S13). The processing of step S13 is performed by the pointer generation unit 126. When only one article ID was acquired in step S11, the pointer generation unit 126 performs no particular processing in step S13.
 The processing device 120 determines whether the pointer size is smaller than a predetermined lower limit (step S14). The processing of step S14 is performed by the pointer generation unit 126.
 The pointer generation unit 126 reads the predetermined lower limit of the pointer size from the data holding unit 127, and compares the single pointer size read in step S12, or the pointer size selected in step S13, with that lower limit.
 The pointer size is compared with the predetermined lower limit in this way so that the pointer 130A does not become too small as the pointer size is reduced by the processing of step S109 described later. The predetermined lower limit may be set in accordance with the size of the screen 110A, the appropriate position of the user 1 with respect to the screen 110A, and the like.
 When the processing device 120 determines that the pointer size is smaller than the predetermined lower limit (S14: YES), it corrects the pointer size to the predetermined lower limit (step S15). The processing of step S15 is performed by the pointer generation unit 126. A pointer size smaller than the predetermined lower limit is difficult for the user 1 to see, so the pointer size is corrected up to the lower limit before being displayed on the screen 110A.
 The processing device 120 displays the pointer 130A, with the pointer size corrected to the lower limit, on the screen 110A (step S10). The processing of step S10 is performed by the video output unit 128.
 On the other hand, when the processing device 120 determines that the pointer size is not smaller than the predetermined lower limit (S14: NO), it displays the pointer 130A with the pointer size set in step S12 or S13 on the screen 110A (step S10).
 When the pointer 130A is displayed on the screen 110A, the series of processing ends (end).
 As described above, the pointer 130A with the pointer size set by the processing device 120 is displayed on the screen 110A. The flow shown in FIG. 20 is executed repeatedly.
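The sizing branch of the flow above (steps S8 to S15) can be sketched as follows. This is a hedged illustration: the lower-limit value is hypothetical, and comparing and clamping sizes by volume and per component are assumptions of this sketch, since the text does not specify how three-component sizes are ordered.

```python
INITIAL_SIZE = (0.03, 0.03, 0.03)
LOWER_LIMIT = (0.005, 0.005, 0.005)  # hypothetical lower-limit value

def decide_pointer_size(sizes_in_area):
    """Sizing branch of FIG. 20 (steps S8-S15), sketched under assumptions."""
    if not sizes_in_area:             # S8: NO -> S9: use the initial value
        return INITIAL_SIZE
    # S12/S13: read the associated sizes and pick the smallest
    # (compared here by volume, an assumption of this sketch).
    size = min(sizes_in_area, key=lambda s: s[0] * s[1] * s[2])
    # S14/S15: if below the predetermined lower limit, correct it upward.
    if any(c < lim for c, lim in zip(size, LOWER_LIMIT)):
        return LOWER_LIMIT
    return size
```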
 FIG. 21 is a diagram showing the relationship between articles and the size of the pointer 130A. In FIG. 21, as in FIGS. 18 and 19, images of the article 111 and the buttons 111A and 111B are displayed on the screen 110A.
 Here, as an example, a case will be described in which the size of the pointer 130A is changed when the user 1, wishing to decide an input to the button 111A, erroneously performs a decision operation on the button 111B and then performs an operation to cancel the input to the button 111B.
 As described above, the operation for deciding an input to the button 111A or 111B and the operation for canceling an input to the button 111A or 111B are defined as operations detectable by the operation detection unit 124.
 First, when the pointer 130A is at the point A2, the size of the pointer 130A is set to the initial value. The initial value is, as an example, (Xp, Yp, Zp) = (0.03, 0.03, 0.03).
 When the pointer 130A moves from the point A2 to the point B3 and the buttons 111A and 111B enter the detection area 130B, the pointer size (Xp, Yp, Zp) is set to (0.01, 0.01, 0.01).
 The user 1 wants to tap the button 111A in this state, but the right hand 2 wavers, the pointer 130A moves from the point B3 to the point B4, and the pointer 130A ends up touching the button 111B rather than the button 111A.
 If the user 1 taps in this state, an input to the button 111B is decided. Since the user 1 wants to decide an input to the button 111A, the user 1 performs a cancel operation. Specifically, with the pointer 130A touching the button 111B, the user 1 performs an operation of sweeping the right hand 2 to the right. The cancel operation is thereby detected by the operation detection unit 124.
 When the cancel operation is performed, the pointer size (Xp, Yp, Zp) associated with the button 111B on which the cancel operation was performed is set to 90% of its size. That is, the pointer 130A becomes 10% smaller. This function of reducing the pointer size in response to a cancel operation is referred to as a learning function.
 The user 1 moves the pointer 130A from the point B4 to the point B5, and the pointer 130A comes into contact with the button 111A. The user 1 then only has to tap the button 111A in this state. Since the pointer 130A is 10% smaller, the button 111A is easier to select, and erroneous input can be suppressed.
 FIG. 22 is a diagram showing size data in which article IDs are associated with pointer sizes. Here, the change in the size data before and after a cancel operation is performed will be described.
 The size data shown on the left side of FIG. 22 is the size data before the cancel operation is performed, and is the same as the size data shown in FIG. 17.
 As described with reference to FIG. 21, when the input to the button 111B is canceled, the pointer size associated with the article ID 012 corresponding to the button 111B is reduced by 10%, as shown on the right side of FIG. 22.
 In this way, when a cancel operation is performed, the pointer size is reduced so that erroneous input does not occur thereafter.
 Note that the pointer size is reduced in response to a cancel operation only when the cancel operation is performed within a predetermined time on the same button on which the decision operation was performed. This is because, when the user 1 notices an erroneous input, the cancel operation is considered to follow the decision operation before much time has passed. Conversely, a cancel operation performed a considerably long time after the decision operation is considered to be performed deliberately, with the intention to cancel, rather than to undo an erroneous input.
 For this reason, the predetermined time used to judge whether an operation cancels an erroneous input may be set to an appropriate value by experiment, simulation, or the like. For example, the predetermined time is 5 seconds. The decision operation is an operation (selection operation) for deciding the selection of the button 111A or 111B.
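The learning function described above can be sketched as follows: when the article that received the decision operation receives a cancel operation within the predetermined time (5 seconds in the example), the pointer size associated with that article is reduced to 90%. The function name, argument shapes, and timestamp representation are assumptions of this sketch.

```python
CANCEL_WINDOW_S = 5.0  # the example predetermined time from the text

def apply_cancel(size_data, decided_id, decided_at, cancel_id, cancelled_at):
    """Shrink the pointer size for an article by 10% when its input is
    cancelled within the predetermined time of the decision operation."""
    same_article = (decided_id == cancel_id)
    within_window = (cancelled_at - decided_at) <= CANCEL_WINDOW_S
    if same_article and within_window:
        x, y, z = size_data[cancel_id]
        size_data[cancel_id] = (0.9 * x, 0.9 * y, 0.9 * z)
    return size_data
```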
 FIG. 23 is a flowchart showing processing executed by the processing device 120 of the embodiment. Here, as an example, a case will be described in which images of the article 111 and the buttons 111A and 111B are displayed on the screen 110A as shown in FIG. 1.
 The processing device 120 starts processing after power-on (start).
 The processing device 120 displays the pointer 130A on the screen 110A (step S101). The processing of step S101 is the processing shown in FIG. 20.
 The processing device 120 detects an operation of the user 1 based on the coordinates, input from the human body detection unit 121, representing the positions of the respective parts of the body of the user 1 (step S102). The processing of step S102 is performed by the operation detection unit 124.
 An operation of the user 1 is an action such as a gesture performed by the user 1. Here, as an example, an operation for resetting the size of the pointer 130A, an operation for deciding an input to the button 111A or 111B, and an operation for canceling an input to the button 111A or 111B are defined in advance as operations detectable by the operation detection unit 124.
 The processing device 120 determines whether the operation of the user 1 detected in step S102 is a reset operation (step S103). The processing of step S103 is performed by the operation detection unit 124.
 When the processing device 120 determines that the operation is not a reset operation (S103: NO), it determines whether the operation of the user 1 detected in step S102 is a decision operation (step S104). The processing of step S104 is performed by the operation detection unit 124.
 When the processing device 120 determines that the operation is not a decision operation (S104: NO), it determines whether the operation of the user 1 detected in step S102 is a cancel operation (step S105). The processing of step S105 is performed by the operation detection unit 124.
 When the processing device 120 determines that the operation is not a cancel operation (S105: NO), the series of processing ends (end). That is, when the operation is determined to be none of a reset operation, a decision operation, and a cancel operation, the processing ends, and if the power remains on, the processing is executed repeatedly from the start.
 When the processing device 120 determines that the operation is a decision operation (S104: YES), it stores the article ID of the article on which the decision operation was performed (step S106). The processing of step S106 is performed by the pointer generation unit 126. For example, when the article on which the decision operation was performed is the button 111A, the article ID of the button 111A is stored.
 When the processing device 120 has stored the article ID, the series of processing ends (end). If the power remains on, the processing is executed repeatedly from the start.
When the processing device 120 determines that the operation is a cancel operation (S105: YES), it determines whether the cancel operation was performed on the same article as the already stored article ID (step S107). The processing of step S107 is performed by the pointer generation unit 126.
When the processing device 120 determines that the article is the same as the already stored article ID (S107: YES), it determines whether the time elapsed from storing the article ID of the article on which the determination operation was performed until the cancel operation was performed is within a predetermined time (step S108).
The processing of step S108 is performed by the pointer generation unit 126. Whether the elapsed time is within the predetermined time may be determined by checking whether the elapsed time is less than or equal to the predetermined time.
When the processing device 120 determines that the elapsed time is within the predetermined time (S108: YES), it reduces to 90% the pointer size that is included in the size data and associated with the article on which the cancel operation was performed (step S109). The processing of step S109 is performed by the pointer generation unit 126.
This is because user 1 is considered to have performed the cancel operation to undo an erroneous operation. For example, as described with reference to FIG. 21, when user 1 erroneously determines an input to the button 111B and then cancels that input, the pointer size (Px, Py, Pz) associated with the button 111B is set to 90% of its size.
After reducing the pointer size to 90%, the processing device 120 ends the series of processing (END). If the power remains on, the processing is repeated from the start.
When the processing device 120 determines that the article is not the same as the already stored article ID (S107: NO), or that the elapsed time is not within the predetermined time (S108: NO), it ends the series of processing (END). If the power remains on, the processing is repeated from the start.
When it is determined that the article is not the same as the already stored article ID (S107: NO), the situation is considered different from user 1 cancelling an erroneous operation, so the series of processing ends.
Likewise, when it is determined that the elapsed time is not within the predetermined time (S108: NO), the situation is considered different from user 1 cancelling an erroneous operation, so the series of processing ends.
When the processing device 120 determines that the operation is a reset operation (S103: YES), it sets the pointer size to its initial value (step S110). The processing of step S110 is performed by the pointer generation unit 126. The initial value of the pointer size is held in the data holding unit 127, as is the size data shown in FIG. 17.
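The flow of steps S103 through S110 can be sketched as follows. This is an illustrative reconstruction, not the patent's implementation: the class, attribute names, and the 3-second value standing in for the "predetermined time" are all assumptions.

```python
import time

class PointerSizeController:
    """Sketch of steps S103-S110: reset, determination, and cancel handling."""

    def __init__(self, initial_sizes, shrink_ratio=0.9, cancel_window_s=3.0):
        # cancel_window_s stands in for the patent's unspecified "predetermined time".
        self.initial_sizes = dict(initial_sizes)   # article ID -> pointer size (mm)
        self.sizes = dict(initial_sizes)
        self.shrink_ratio = shrink_ratio
        self.cancel_window_s = cancel_window_s
        self.last_decided = None                   # (article ID, timestamp) from S106

    def handle(self, operation, article_id=None, now=None):
        now = time.monotonic() if now is None else now
        if operation == "reset":                            # S103: YES -> S110
            self.sizes = dict(self.initial_sizes)
        elif operation == "determine":                      # S104: YES -> S106
            self.last_decided = (article_id, now)
        elif operation == "cancel" and self.last_decided:   # S105: YES -> S107/S108
            decided_id, decided_at = self.last_decided
            same_article = decided_id == article_id                  # S107
            in_window = (now - decided_at) <= self.cancel_window_s   # S108
            if same_article and in_window:                  # S109: shrink to 90%
                self.sizes[article_id] *= self.shrink_ratio
```

For example, a determination on article "012" followed within the window by a cancel on the same article would shrink that article's pointer size from 20.0 to 18.0 mm, while other articles keep their sizes.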
FIG. 24 is a diagram showing the display on the screen 110A when an instruction operation is actually performed in the simulation system 100. On the screen 110A, nine markers are arranged in three rows and three columns. Each marker is displayed as a sphere like the pointer 130A, and the center marker is designated marker C1.
The positional deviation between the marker C1 and the pointer 130A was measured while user 1 performed an instruction operation, indicated by the arrow, of moving the right hand 2 from the upper right to the lower left so as to move the pointer 130A from the start point A6 to the marker C1.
The interval between the nine markers is 100 pixels. When the width of the screen 110A in the X-axis direction is 3 meters, 100 pixels corresponds to a marker interval of 150 mm. The radius of each marker is 30 mm, and the radius of the detection region 130B is 100 mm. The initial value of the radius of the pointer 130A is 40 mm, and the radius of the pointer 130A when a marker exists inside the detection region 130B is 20 mm. The radius of the pointer 130A shown in FIG. 24 is 20 mm.
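The pixel-to-millimeter conversion implied by these figures can be checked as follows. The horizontal resolution of 2,000 pixels is inferred from the stated values (3 m width, 100 px corresponding to 150 mm); it is not given explicitly in the text.

```python
SCREEN_WIDTH_MM = 3000.0   # screen 110A is 3 m wide along the X axis
SCREEN_WIDTH_PX = 2000.0   # inferred: 3000 mm / (150 mm per 100 px) * 100 px

def px_to_mm(px: float) -> float:
    """Convert a horizontal distance on screen 110A from pixels to millimeters."""
    return px * SCREEN_WIDTH_MM / SCREEN_WIDTH_PX

marker_interval_mm = px_to_mm(100)   # marker spacing: 100 px -> 150 mm
```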
FIG. 25 is a diagram showing the trajectory of the pointer 130A displayed by the simulation system 100 when the instruction operation described with reference to FIG. 24 is performed. That is, FIG. 25 shows the trajectory of the coordinates P (Px, Py, Pz) of the pointer 130A obtained by the simulation system 100.
In FIG. 25, the points enclosed by the broken-line circle show the trajectory of the coordinates P (Px, Py, Pz) while an instruction operation is being performed to keep the pointer 130A aligned with the marker C1 after the pointer 130A has reached the marker C1.
Accordingly, in FIG. 25, the points scattered from the upper right toward the broken-line circle at the lower left show the trajectory while the pointer 130A is being moved from the start point A6 shown in FIG. 24 to the marker C1. FIG. 25 also shows the XY coordinate values (pixel values) used in the calculation.
As shown in FIG. 25, the positions of the points enclosed by the broken-line circle fluctuate because the right hand 2 shakes as user 1 tries to keep the pointer 130A aligned with the marker C1 after it has reached the marker C1.
FIG. 26 shows the result of the instruction operation of moving the pointer 130A to the marker C1 in this manner.
FIG. 26 is a diagram showing the result of the instruction operation of moving the pointer 130A to the marker C1. Here, the case in which the radius of the pointer 130A is changed from its initial value of 40 mm to 20 mm when a marker exists in the detection region 130B is denoted "with processing".
For comparison, the case in which the radius of the pointer 130A is fixed at its initial value of 40 mm even when a marker exists in the detection region 130B is denoted "without processing".
In addition, the result for the case in which a cancel operation is performed while the radius of the pointer 130A is set to 20 mm, so that the learning function sets the pointer 130A to 90% of its size, is also shown, denoted "with processing & learning function". In this case, the radius of the pointer 130A is 18 mm.
Note that the learning function is not used in the "with processing" and "without processing" cases.
For each of these three cases, while the pointer 130A was moved to the marker C1 and then kept aligned with the marker C1, the frames were divided into frames in which the pointer 130A contacted only the marker C1 and frames in which the pointer 130A contacted a marker other than the marker C1 (one of the eight markers surrounding the marker C1).
As a result, in the "without processing" case, the number of frames in which the pointer 130A contacted only the marker C1 was 633, and the number of frames in which the pointer 130A contacted a marker other than the marker C1 was 196. The probability of contacting a marker other than the marker C1 was therefore 24%.
In the "with processing" case, the number of frames in which the pointer 130A contacted only the marker C1 was 582, and the number of frames in which the pointer 130A contacted a marker other than the marker C1 was 59. The probability of contacting a marker other than the marker C1 was therefore 9%.
Thus, in the "with processing" case, the probability of the pointer 130A contacting a marker other than the marker C1 was greatly reduced compared with the "without processing" case.
In the "with processing & learning function" case, the number of frames in which the pointer 130A contacted only the marker C1 was 540, and the number of frames in which the pointer 130A contacted a marker other than the marker C1 was 35. The probability of contacting a marker other than the marker C1 was therefore 6%.
Thus, in the "with processing & learning function" case, the probability of the pointer 130A contacting a marker other than the marker C1 was greatly reduced compared with the "without processing" case, and was improved even further than in the "with processing" case.
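The reported percentages follow from the frame counts, taking each probability as the share of frames in which the pointer contacted a marker other than C1:

```python
def error_rate(frames_only_c1: int, frames_other: int) -> int:
    """Percentage of frames contacting a marker other than C1, as a whole percent."""
    return round(100 * frames_other / (frames_only_c1 + frames_other))

results = {
    "without processing":          error_rate(633, 196),  # 24%
    "with processing":             error_rate(582, 59),   # 9%
    "with processing & learning":  error_rate(540, 35),   # 6%
}
```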
As described above, according to the embodiment, when at least part of an article comes to exist in the detection region 130B, the size of the pointer 130A is changed according to the type of the article. When the article is relatively small, the pointer size is made smaller than its initial value; when the article is relatively large, the pointer size is made larger than its initial value.
Because the pointer size is thus set according to the size of the articles present around the pointer 130A, the usability of the simulation system 100 is greatly improved.
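A minimal sketch of this size-data lookup follows. The article types and radii are illustrative (the actual values of FIG. 17 are not reproduced here), and how sizes are combined when several articles lie in the detection region is not specified by the text; this sketch simply takes the smallest so that small targets stay selectable.

```python
# Illustrative size data: article type -> pointer radius (mm).
SIZE_DATA = {"button": 20.0, "panel": 60.0}
INITIAL_RADIUS = 40.0  # radius used when no article is in the detection region

def pointer_radius(article_types_in_region):
    """Pick the pointer radius from the size data for articles in the detection region."""
    if not article_types_in_region:
        return INITIAL_RADIUS  # no article nearby: predetermined (initial) size
    # Assumption: with several articles present, use the smallest associated size.
    return min(SIZE_DATA.get(t, INITIAL_RADIUS) for t in article_types_in_region)
```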
In addition, when a cancel operation is performed within the predetermined time after a determination operation on an article such as the button 111A or 111B, the learning function reduces the pointer size to 90%, which makes it easier for user 1 to move the pointer 130A to the desired article when trying to move the pointer 130A to the same article again.
If a cancel operation is performed again after the learning function has been applied once, the pointer size is reduced to 81%. In this way, each time a cancel operation is performed on the same article, the size is reduced by a further 10%.
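These repeated 10% reductions form a geometric sequence. Combined with the lower limit described in claim 8 (whose actual value the text does not give; the 0.5 used here is an assumption), the scale factor after n cancel operations can be sketched as:

```python
def learned_scale(n_cancels: int, ratio: float = 0.9, lower_limit: float = 0.5) -> float:
    """Scale factor applied to the pointer size after n cancel operations.

    The 0.5 lower limit is illustrative; the text only says a predetermined
    lower limit exists (claim 8), not what its value is.
    """
    return max(ratio ** n_cancels, lower_limit)
```

One cancel gives 90%, two give 81%, and further cancels keep shrinking the pointer until the lower limit is reached.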
Because this reduction of the pointer size by the learning function is performed in accordance with the movement characteristics of user 1, the operability for user 1 when operating the pointer 130A is further improved after the learning function has been applied.
A simulation system 100 with greatly improved usability can therefore be provided.
The above description used the article 111 or the buttons 111A and 111B. In particular, the determination operation and the cancel operation were described as inputting to, or cancelling input to, the button 111A or 111B.
However, the determination operation may also be performed, for example, when selecting the article 111 in order to move it using the pointer 130A. After the determination operation is performed, the article 111 may be allowed to move together with the pointer 130A. A cancel operation may then be performed to deselect the article 111 after it has been moved in this way. With this arrangement, the simulation system 100 can realize an even more convenient assembly support system.
The above description also covered the form in which, when the input to the button 111B is cancelled, the pointer size associated with the article ID 012 corresponding to the button 111B is reduced by 10%, as shown on the right side of FIG. 22.
However, when the input to the button 111B is cancelled, the pointer size associated with the article ID 011 corresponding to the button 111A, which is inside the detection region 130B, may also be reduced by 10%. In this case, the pointer sizes associated with the article IDs 011 and 012 corresponding to the buttons 111A and 111B inside the detection region 130B are both reduced by 10%.
The above description also covered the form in which whether at least part of an article is located inside the detection region 130B is determined by whether the detection region 130B and the display region of the article 111 or the button 111A or 111B intersect.
However, whether at least part of an article is located inside the detection region 130B may instead be determined by whether the display region of the article 111 or the button 111A or 111B is contained in the detection region 130B centered on the coordinates P (Px, Py, Pz).
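Treating both regions as spheres for simplicity (the text allows ellipsoids and other shapes, so this is a simplifying assumption), the two tests, intersection versus containment, can be sketched as:

```python
import math

def distance(p, q):
    """Euclidean distance between two 3-D points."""
    return math.dist(p, q)

def intersects(center_p, r_detect, center_a, r_article):
    """Detection region and article display region share at least one point."""
    return distance(center_p, center_a) <= r_detect + r_article

def contains(center_p, r_detect, center_a, r_article):
    """Article display region lies entirely inside the detection region."""
    return distance(center_p, center_a) + r_article <= r_detect
```

With the radii from the experiment (detection region 100 mm, marker 30 mm), a marker 120 mm from P would intersect the detection region but not be contained in it, illustrating how the two criteria differ.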
In the above description, the pointer 130A is displayed as an ellipsoid, but the pointer 130A may also be displayed as an image having a shape other than an ellipsoid.
Likewise, the detection region 130B was described as an ellipsoidal region, but the detection region 130B may be a region of a shape other than an ellipsoid. Furthermore, although the detection region 130B was described as a region existing outside the pointer 130A, it may be a region that also includes the interior of the pointer 130A.
The simulation system of an exemplary embodiment of the present invention has been described above, but the present invention is not limited to the specifically disclosed embodiment, and various modifications and changes are possible without departing from the scope of the claims.
10 Computer system
11 Main body
12 Display
13 Keyboard
14 Mouse
15 Modem
100 Simulation system
110A Screen
110B Projection device
110C 3D glasses
120 Processing device
121 Human body detection unit
122 Position detection unit
123 Detection region generation unit
124 Motion detection unit
125 Object determination unit
126 Pointer generation unit
127 Data holding unit
128 Video output unit
130A Pointer
130B Detection region
140 Position measurement device

Claims (8)

  1.  A simulation system comprising:
      a display unit that displays an image of an article based on article data representing the shape and position of the article, and a pointer whose position is operated by a user;
      a data storage unit that stores the article data;
      a first detection unit that detects an instruction operation by which the user indicates the position of the pointer;
      a second detection unit that detects the position of the pointer in the coordinate system of the display unit based on the instruction operation detected by the first detection unit;
      a region generation unit that generates a detection region that contains the pointer and is larger than the pointer;
      a size setting unit that sets the size of the pointer according to the type of an article at least part of which is located inside the detection region; and
      an output unit that causes the display unit to display the pointer at the size set by the size setting unit.
  2.  The simulation system according to claim 1, wherein the data storage unit further stores size data in which article types are associated with pointer sizes, and the size setting unit sets the size of the pointer to the size that corresponds, in the size data, to the type of the article at least part of which is located inside the detection region.
  3.  The simulation system according to claim 1 or 2, wherein the pointer is displayed with the set size centered on the position detected by the second detection unit, and the detection region is a region that is centered on the position detected by the second detection unit, contains the pointer, and is larger than the pointer.
  4.  The simulation system according to claim 3, wherein the pointer is displayed as an ellipsoid centered on the position detected by the second detection unit.
  5.  The simulation system according to any one of claims 1 to 4, wherein the size setting unit sets the size of the pointer to a predetermined size when no article is located, even in part, inside the detection region.
  6.  The simulation system according to any one of claims 1 to 5, further comprising:
      a first determination unit that determines whether a selection operation on the article by the pointer has been performed; and
      a second determination unit that determines, when the first determination unit determines that the selection operation has been performed, whether the selection operation has been cancelled,
      wherein the size setting unit reduces the size of the pointer by a predetermined ratio when the second determination unit determines that the selection operation has been cancelled.
  7.  The simulation system according to claim 6, wherein the size setting unit reduces the size of the pointer by the predetermined ratio when the second determination unit determines that the selection operation was cancelled within a predetermined time after the first determination unit determined that the selection operation was performed.
  8.  The simulation system according to claim 7, wherein, when the size of the pointer has been reduced by the predetermined ratio and the reduced size is smaller than a predetermined lower limit, the size setting unit sets the size of the pointer to the lower limit.
PCT/JP2016/064021 2016-05-11 2016-05-11 Simulation system WO2017195299A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2016/064021 WO2017195299A1 (en) 2016-05-11 2016-05-11 Simulation system


Publications (1)

Publication Number Publication Date
WO2017195299A1 true WO2017195299A1 (en) 2017-11-16

Family

ID=60267552


Country Status (1)

Country Link
WO (1) WO2017195299A1 (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH10161841A (en) * 1996-12-02 1998-06-19 Mitsubishi Heavy Ind Ltd Pointer display control unit
JPH10254675A (en) * 1997-03-14 1998-09-25 Matsushita Electric Ind Co Ltd Data input method and data input device using the method
WO2013098869A1 (en) * 2011-12-26 2013-07-04 株式会社日立製作所 Computer that repositions object, and method and program that reposition object
JP2013143144A (en) * 2012-01-09 2013-07-22 Samsung Electronics Co Ltd Display device and method for selecting item thereof
JP2013152697A (en) * 2011-12-28 2013-08-08 Alps Electric Co Ltd Input device and electronic apparatus



Legal Events

Date Code Title Description
NENP: Non-entry into the national phase (Ref country code: DE)
121: Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 16901647; Country of ref document: EP; Kind code of ref document: A1)
122: Ep: pct application non-entry in european phase (Ref document number: 16901647; Country of ref document: EP; Kind code of ref document: A1)
NENP: Non-entry into the national phase (Ref country code: JP)