WO2009147806A1 - Remote Operation Device and Remote Operation Method (遠隔操作装置及び遠隔操作方法) - Google Patents
Remote Operation Device and Remote Operation Method
- Publication number
- WO2009147806A1 (PCT/JP2009/002380)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- user
- image
- display screen
- dimensional
- unit
- Prior art date
- 2008-06-02
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T15/00—3D [Three Dimensional] image rendering
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T15/00—3D [Three Dimensional] image rendering
- G06T15/10—Geometric effects
- G06T15/20—Perspective computation
Definitions
- The present invention relates to a remote operation device and a remote operation method for remotely operating an apparatus that generates a three-dimensional human body shape model of a user (hereinafter simply referred to as a "three-dimensional model") and displays a user image obtained by three-dimensionally rendering the generated three-dimensional model from a viewpoint at an arbitrary position.
- There is an electronic mirror device, as shown in FIG. 1, that generates a three-dimensional model of a user and displays the user from an arbitrary viewpoint.
- the electronic mirror device generates a three-dimensional model of the user 101 using measurement data measured by the three-dimensional human body shape measuring device 102 (a camera in FIG. 1). Then, the electronic mirror device generates a user projection image 104 by rendering the three-dimensional model, and displays the generated user projection image 104 on the display screen 103.
- A method can be considered in which the user 101, located away from the display screen 103, remotely operates the electronic mirror device by selecting and determining a menu image displayed on the display screen 103.
- the electronic mirror device is remotely operated using a projection image of the user displayed on the display screen.
- the electronic mirror device determines that the menu image 204 that overlaps the projection image 203 of the user 201 displayed on the display screen 202 is the menu image “selected” by the user 201 (step S1).
- Next, among the menu images "selected" by the user 201, the electronic mirror device determines that the menu image containing the position where the user's movement amount in the direction perpendicular to the display screen 202 is largest is the menu image "determined" by the user 201 (step S2).
- An example in which such a remote operation method is applied to a projector is disclosed in Patent Document 1.
- the above conventional technique has a problem that a device for measuring information in the depth direction is required in addition to a device for displaying a projected image.
- the present invention has been made in view of the above-described problems, and an object thereof is to provide a remote operation device that can realize remote operation of an electronic mirror device or the like with a relatively simple configuration.
- One aspect of the remote operation device of the present invention is a remote operation device for outputting operation information corresponding to a menu image that is displayed on a display screen and is designated by a user at a position away from the display screen. The device includes: a three-dimensional model generation unit that generates first and second three-dimensional models, which are three-dimensional models showing the appearance of the user's shape at a first time and at a second time after the first time; a three-dimensional rendering unit that generates first and second user images by three-dimensionally rendering the first and second three-dimensional models generated by the three-dimensional model generation unit; a first operation determination unit that determines whether or not the menu image and the second user image overlap when the second user image generated by the three-dimensional rendering unit is displayed on the display screen; a second operation determination unit that calculates the amount of movement of the second user image in the direction perpendicular to the display screen and, based on the calculated movement amount, specifies coordinates on the display screen indicating the position designated by the user; and an operation information output unit that outputs operation information corresponding to the menu image when the first operation determination unit determines that the menu image and the second user image overlap and the coordinates specified by the second operation determination unit are included in the area where the menu image is displayed. The second operation determination unit refers to a depth buffer that stores depth information, generated when the first and second three-dimensional models are three-dimensionally rendered, indicating the relationship between coordinates on the projection plane and depth values, and calculates the difference between the depth value of the first user image and that of the second user image as the movement amount.
- According to this configuration, by referring to the depth buffer, the movement amount of the user image in the direction perpendicular to the projection plane can be calculated, so operation information can be output without detecting the distance between the user and the display screen using a device such as a distance sensor. That is, remote operation of an electronic mirror device or the like can be realized with a relatively simple configuration.
- Preferably, the three-dimensional rendering unit generates the first and second user images by three-dimensionally rendering the first and second three-dimensional models using a first projection matrix, further generates a third user image by three-dimensionally rendering the second three-dimensional model using a second projection matrix different from the first projection matrix, and the third user image generated by the three-dimensional rendering unit is displayed on the display screen.
- According to this configuration, since the menu image designated by the user can be determined using an operation user image different from the display user image, the user can designate the menu image with a consistent operation feeling regardless of the user's position or size.
- the first projection matrix is an orthographic projection matrix and the second projection matrix is a perspective projection matrix.
- According to this configuration, since the operation user image is generated using the orthographic projection matrix, the user can designate the menu image with a consistent operation feeling regardless of the positional relationship between the user and the display screen.
- the first projection matrix is preferably a projection matrix that changes the size of the user image in accordance with the size of the user and the size of the display screen.
- According to this configuration, the user can designate the menu image with a consistent operation feeling regardless of the relative sizes of the user and the display screen.
- one aspect of the integrated circuit of the present invention is an integrated circuit for outputting operation information corresponding to a menu image displayed on a display screen and specified by a user at a position away from the display screen.
- The integrated circuit includes: a three-dimensional model generation unit that generates first and second three-dimensional models, which are three-dimensional models showing the appearance of the user's shape at a first time and at a second time after the first time; a three-dimensional rendering unit that generates first and second user images by three-dimensionally rendering the first and second three-dimensional models generated by the three-dimensional model generation unit; a first operation determination unit that determines whether or not the menu image and the second user image overlap when the second user image generated by the three-dimensional rendering unit is displayed on the display screen; a second operation determination unit that calculates the amount of movement of the second user image in the direction perpendicular to the display screen and, based on the calculated movement amount, specifies coordinates on the display screen indicating the position designated by the user; and an operation information output unit that outputs operation information corresponding to the menu image when the first operation determination unit determines that the menu image and the second user image overlap and the coordinates specified by the second operation determination unit are included in the area where the menu image is displayed. The second operation determination unit refers to a depth buffer that stores depth information, generated when the first and second three-dimensional models are three-dimensionally rendered, indicating the relationship between coordinates on the projection plane and depth values, and calculates the difference between the depth value of the first user image and that of the second user image as the movement amount.
- The present invention can be realized not only as such a remote operation device, but also as a remote operation method whose steps correspond to the operations of the characteristic components of such a remote operation device, and as a program for causing a computer to execute those steps.
- According to the present invention, the movement amount of the user image in the direction perpendicular to the projection plane can be calculated by referring to the depth buffer, so operation information can be output without detecting the distance between the user and the display screen using a device such as a distance sensor. That is, remote operation of an electronic mirror device or the like can be realized with a relatively simple configuration.
- FIG. 1 is a diagram showing an outline of a conventional electronic mirror device.
- FIG. 2 is a diagram for explaining a preferred remote operation method in the conventional electronic mirror device.
- FIG. 3 is a block diagram illustrating a functional configuration of the electronic mirror device including the remote control device according to the first embodiment of the present invention.
- FIG. 4A is a flowchart showing a flow of processing by the remote control device according to Embodiment 1 of the present invention.
- FIG. 4B is a flowchart showing a flow of processing by the remote control device according to Embodiment 1 of the present invention.
- FIG. 5 is a diagram for explaining the flow of processing by the remote control device according to the first embodiment of the present invention.
- FIG. 6 is a block diagram showing a functional configuration of an electronic mirror device provided with a remote control device according to Embodiment 2 of the present invention.
- FIG. 7 is a flowchart showing a flow of processing by the remote control device according to the second embodiment of the present invention.
- FIG. 8 is a diagram for explaining the flow of processing by the remote control device according to the second embodiment of the present invention.
- FIG. 9 is a diagram illustrating the difference in the projected image depending on the positional relationship between the user and the display screen.
- FIG. 10 is a diagram for explaining the perspective projection.
- FIG. 11 is a diagram for explaining orthographic projection.
- FIG. 12 is a diagram for explaining a modification of the second embodiment of the present invention.
- FIG. 13 is a diagram for explaining a modification of the present invention.
- FIG. 3 is a block diagram illustrating a functional configuration of the electronic mirror device including the remote control device according to the first embodiment of the present invention.
- the electronic mirror device 10 includes a three-dimensional human body shape measurement unit 11, a display unit 12, an electronic mirror control unit 13, and a remote operation device 20.
- the 3D human body shape measuring unit 11 measures the 3D human body shape of the user and outputs the measurement data to the 3D model generating unit 21 of the remote control device 20.
- the three-dimensional human body shape measurement unit 11 may measure the user's three-dimensional human body shape using a method generally used when measuring a three-dimensional human body shape, such as a stereo measurement method or a light cutting method.
- the display unit 12 has a display screen such as a liquid crystal display, and displays the display image output by the display image composition unit 26.
- the electronic mirror control unit 13 controls the operation of each component constituting the electronic mirror device 10.
- the electronic mirror control unit 13 controls the operation of each component according to the operation information output by the operation information output unit 25. For example, when the operation information indicates the luminance UP, the electronic mirror control unit 13 increases the luminance of the image to be displayed by a predetermined value.
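As a rough, hedged illustration of this dispatch (the operation-information names and the step value below are assumptions for the sketch, not taken from the patent), the control unit's reaction could look like:

```python
BRIGHTNESS_STEP = 10  # the "predetermined value" for the luminance change; assumed

class ElectronicMirrorControl:
    """Toy stand-in for the electronic mirror control unit 13."""

    def __init__(self) -> None:
        self.brightness = 128  # current display luminance (0-255)

    def on_operation_info(self, operation_info: str) -> None:
        # Dispatch on the operation information output by the
        # operation information output unit 25.
        if operation_info == "BRIGHTNESS_UP":
            self.brightness = min(255, self.brightness + BRIGHTNESS_STEP)
        elif operation_info == "BRIGHTNESS_DOWN":
            self.brightness = max(0, self.brightness - BRIGHTNESS_STEP)
```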
- the remote operation device 20 is a device that outputs operation information corresponding to the menu image displayed on the display screen according to the movement of the user at a position away from the display screen.
- the remote operation device 20 includes a three-dimensional model generation unit 21, a data storage unit 22, a three-dimensional rendering unit 23, an operation determination unit 24, an operation information output unit 25, a display image synthesis unit 26, and a control unit 27.
- The 3D model generation unit 21 sequentially generates a 3D model of the user from the measurement data measured by the 3D human body shape measurement unit 11. Specifically, the three-dimensional model generation unit 21 generates the three-dimensional model using a method generally used for this purpose, such as Delaunay triangulation or the marching cubes method.
- the data storage unit 22 is a storage medium such as a memory, and stores rendering parameters including a projection matrix.
- The three-dimensional rendering unit 23 generates a two-dimensional projection image (hereinafter simply referred to as a "user image") by three-dimensionally rendering the three-dimensional model generated by the three-dimensional model generation unit 21, using the rendering parameters stored in the data storage unit 22. Specifically, the three-dimensional rendering unit 23 generates the user image by executing a series of rendering processes such as modeling transformation, light source calculation, projection transformation, viewport transformation, and texture mapping. During this rendering process, the three-dimensional rendering unit 23 stores color information and depth information in a color buffer and a depth buffer, which are buffer memories (not shown).
- the color information stored in the color buffer is information indicating the relationship between the position on the projection plane and the color when the three-dimensional model is projected onto the projection plane.
- the depth information stored in the depth buffer is information indicating the relationship between the position and the depth value when the three-dimensional model is projected onto the projection plane.
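To make the two buffers concrete, here is a minimal sketch of a rendering pass filling a color buffer and a depth buffer with a z-test. It is a toy point-splat renderer assuming points already in normalized device coordinates, not the patent's rendering pipeline:

```python
import numpy as np

W, H = 320, 240  # projection-plane resolution (assumed)

def render(points, colors):
    """Project 3D points in normalized device coordinates ([-1, 1]^3)
    onto a W x H projection plane, keeping the nearest depth per pixel
    (standard z-test). Returns the color buffer and depth buffer."""
    color_buf = np.zeros((H, W, 3), dtype=np.float32)      # initial color: 0
    depth_buf = np.full((H, W), np.inf, dtype=np.float32)  # initially "far"
    for (x, y, z), c in zip(points, colors):
        px = int((x * 0.5 + 0.5) * (W - 1))  # viewport transform
        py = int((y * 0.5 + 0.5) * (H - 1))
        if 0 <= px < W and 0 <= py < H and z < depth_buf[py, px]:
            depth_buf[py, px] = z   # depth information (position -> depth value)
            color_buf[py, px] = c   # color information (position -> color)
    return color_buf, depth_buf
```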
- The operation determination unit 24 is an example of the first and second operation determination units, and executes the first operation determination and the second operation determination by referring to the color information and depth information stored in the color buffer and the depth buffer by the three-dimensional rendering unit 23.
- Specifically, the operation determination unit 24 determines whether or not the menu image and the user image overlap when the user image generated by the three-dimensional rendering unit 23 is displayed on the display screen, using the color information stored in the color buffer (first operation determination).
- In addition, the operation determination unit 24 calculates the movement amount of the user image in the direction perpendicular to the display screen when the user image generated by the three-dimensional rendering unit 23 is displayed on the display screen. Then, based on the calculated movement amount, the operation determination unit 24 specifies the coordinates on the display screen designated by the user (second operation determination). Specifically, the operation determination unit 24 specifies, for example, the coordinates on the display screen corresponding to the position on the projection plane with the largest movement amount as the coordinates designated by the user. To obtain the movement amount, the operation determination unit 24 refers to the depth buffer and calculates the difference between the depth values of the current user image and the past user image.
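A hedged sketch of this second operation determination, assuming depth buffers shaped like those of the toy renderer above (smaller value = closer to the screen) and an assumed noise threshold:

```python
import numpy as np

MOVE_THRESHOLD = 0.05  # assumed threshold against unintended small motions

def second_operation_determination(depth_prev, depth_curr):
    """Compute the per-pixel movement toward the screen as the difference
    between past and current depth values, and return the projection-plane
    coordinates with the largest movement, or None if nothing moved enough."""
    valid = np.isfinite(depth_prev) & np.isfinite(depth_curr)
    movement = np.where(valid, depth_prev - depth_curr, 0.0)  # >0: moved closer
    py, px = np.unravel_index(np.argmax(movement), movement.shape)
    if movement[py, px] <= MOVE_THRESHOLD:
        return None
    return (px, py), float(movement[py, px])
```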
- When the operation determination unit 24 determines that the menu image and the user image overlap, and the coordinates specified by the operation determination unit 24 are included in the area where the menu image is displayed, the operation information output unit 25 outputs the operation information corresponding to that menu image to the display image composition unit 26, the electronic mirror control unit 13, and the like.
- the operation information corresponding to the menu image may be acquired from, for example, an operation information table in which the menu image and the operation information are stored in association with each other.
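Such a table could be as simple as a dictionary keyed by menu image identifier; the identifiers and operation names below are hypothetical:

```python
from typing import Optional

# Hypothetical operation information table: menu image ID -> operation info.
OPERATION_TABLE = {
    "menu_brightness_up": "BRIGHTNESS_UP",
    "menu_brightness_down": "BRIGHTNESS_DOWN",
    "menu_toggle_viewpoint": "VIEWPOINT_TOGGLE",
}

def lookup_operation_info(menu_id: str) -> Optional[str]:
    return OPERATION_TABLE.get(menu_id)
```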
- the display image composition unit 26 generates a display menu image based on the operation information output by the operation information output unit 25. Then, the display image synthesis unit 26 generates a display image by synthesizing the generated display menu image and the user image generated by the three-dimensional rendering unit 23. Then, the display image composition unit 26 outputs the generated display image to the display unit 12.
- the control unit 27 controls the operation of each component of the remote operation device 20.
- the 3D model generated by the 3D model generation unit 21 or the user image generated by the 3D rendering unit 23 may be stored in a storage unit (not shown). In that case, each component reads the three-dimensional model or user image stored in the storage unit, and executes processing using the read three-dimensional model or user image.
- FIG. 4A and FIG. 4B are flowcharts showing the flow of processing by the remote control device according to Embodiment 1 of the present invention.
- FIG. 5 is a diagram for explaining the flow of processing by the remote control device according to Embodiment 1 of the present invention.
- the three-dimensional model generation unit 21 generates a three-dimensional model from the user's three-dimensional human body measurement data (S101).
- the three-dimensional rendering unit 23 generates a user image to be displayed on the display screen by three-dimensionally rendering the three-dimensional model.
- When generating the user image, the three-dimensional rendering unit 23 generates color information and depth information, and stores them in the rendering buffer 401 (S102).
- The rendering buffer 401 consists of a color buffer 402, which stores color information constituting the user's projected image on the display screen, and a depth buffer 403, which stores depth information representing the user's distance from the display screen.
- the operation determination unit 24 selects a menu image that is an operation determination target and has not yet been selected (S103).
- the operation determination unit 24 performs an operation determination process using the color information stored in the color buffer 402 and the depth information stored in the depth buffer 403 (S104). Details of the operation determination process will be described later with reference to FIG. 4B.
- Next, the operation determination unit 24 determines whether or not all menu images displayed on the display screen have been selected in step S103 (S105). If not all menu images have been selected (No in S105), the operation determination unit 24 repeats steps S103 to S104. When all menu images have been selected (Yes in S105), the operation information output unit 25 outputs the operation information corresponding to the menu image "determined" by the user, as indicated by the determination information (S106).
- the display image composition unit 26 generates a display menu image 404 based on the output operation information. Specifically, the display image composition unit 26 generates a display menu image 404 in which the color of the menu image indicated by the operation information is changed, for example. Then, the display image synthesis unit 26 outputs the display image 405 obtained by synthesizing the generated display menu image 404 and the user image to the display unit 12 (S107). Finally, the display unit 12 displays the display image 405 on the display screen (S108).
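As one possible form of the composition in S107 (a plain alpha blend; the alpha value and the equal image shapes are assumptions of the sketch):

```python
import numpy as np

def compose_display_image(user_img: np.ndarray,
                          menu_img: np.ndarray,
                          menu_alpha: float = 0.6) -> np.ndarray:
    """Blend the display menu image over the user image to produce the
    display image; both inputs are H x W x 3 float arrays."""
    return menu_alpha * menu_img + (1.0 - menu_alpha) * user_img
```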
- In the operation determination process (S104), the rendering buffer 401 is referred to.
- the operation determination process can be divided into a first operation determination for specifying a menu image “selected” by the user and a second operation determination for specifying a menu image “decided” by the user.
- First, the operation determination unit 24 acquires the color information of the display area of the menu image selected in step S103 from the color buffer 402 (S111). Next, the operation determination unit 24 determines whether or not a user image is included in the display area of the menu image (S112). Specifically, for example, when the color information at a position corresponding to the display area of the menu image has a value different from the initial value, the operation determination unit 24 determines that the user image is included in the display area of the menu image.
- If the user image is included in the display area of the menu image, the operation determination unit 24 determines that the menu image is selected by the user, and executes the second operation determination. Otherwise, the operation determination unit 24 determines that the menu image is not selected by the user, and does not execute the second operation determination.
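A minimal sketch of this first operation determination, assuming a color buffer cleared to a known initial value and a menu display area given as an axis-aligned rectangle:

```python
import numpy as np

def first_operation_determination(color_buf, menu_rect, initial_color=0.0):
    """Return True if any pixel inside the menu's display area differs from
    the buffer's initial (cleared) color, i.e. the rendered user image
    overlaps the menu image. menu_rect = (x0, y0, x1, y1) in pixels."""
    x0, y0, x1, y1 = menu_rect
    region = color_buf[y0:y1, x0:x1]
    return bool(np.any(region != initial_color))
```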
- the operation determination unit 24 acquires the depth information of the display area of the menu image selected in step S103 from the depth buffer 403 (S121).
- the operation determination unit 24 acquires current and past depth information from the depth buffer 403.
- For example, the operation determination unit 24 acquires from the depth buffer 403 the depth information generated from the three-dimensional model generated in step S101, and the depth information generated from the three-dimensional model generated immediately before it.
- the operation determination unit 24 calculates the difference value of the depth value indicated by the acquired depth information for each position on the projection plane (S122). Subsequently, the operation determination unit 24 determines whether or not the calculated maximum value of the difference value is larger than the maximum value of the difference value already calculated in another menu image (S123).
- If the calculated maximum difference value is larger, the operation determination unit 24 regards this menu image as the menu image "determined" by the user, and updates the determination information to information indicating this menu image (S124).
- As described above, the remote operation device 20 can specify the menu image "selected" and "determined" by the user using the color information and the depth information stored in the color buffer and the depth buffer. That is, since both the information horizontal to the display screen used in the first operation determination and the information perpendicular to the display screen used in the second operation determination are obtained, as the color information and depth information stored in the color buffer and the depth buffer, from the three-dimensional rendering process that generates the display image, no separate measurement or recognition process is required. In other words, the remote operation device 20 can realize remote operation of the electronic mirror device 10 with a relatively simple configuration.
- The color information referred to by the operation determination unit 24 may include not only visible color values (such as RGB or HSV) but also control information for the pixel at the corresponding position, such as an alpha value (A) representing transparency.
- When the control information in the color information at a position corresponding to the display area of the menu image has a value different from the initial value, the operation determination unit 24 may determine that the user image is included in the display area of the menu image.
- Note that the operation determination unit 24 may refrain from updating the determination information when the maximum difference value of the depth values does not reach a predetermined threshold, or when the user image does not move toward the display screen. This allows the remote operation device to avoid erroneous operations not intended by the user.
- the display image composition unit 26 may change the color of the menu image or the user image according to the difference value of the depth value calculated by the operation determination unit 24. Thereby, the user can confirm the menu image determined by the user.
- The remote operation device 40 according to the present embodiment differs from the remote operation device 20 according to the first embodiment in that it separately generates a user image for executing the operation determination process and a user image for display on the display screen.
- FIG. 6 is a block diagram illustrating a functional configuration of the electronic mirror device 30 including the remote control device according to the second embodiment of the present invention.
- the remote operation device 40 according to the present embodiment and the remote operation device 20 according to the first embodiment include a part of the processing of the three-dimensional rendering unit and the display image composition unit and a part of the data stored in the data storage unit. Although different, the other parts are the same.
- FIG. 6 the same components as those in FIG. 3 are denoted by the same reference numerals, and description thereof is omitted.
- The data storage unit 42 stores a first rendering parameter including an orthographic projection matrix (hereinafter referred to as the "operation rendering parameter") and a second rendering parameter including a perspective projection matrix (hereinafter referred to as the "display rendering parameter").
- The three-dimensional rendering unit 43 generates color information and depth information by three-dimensionally rendering the three-dimensional model generated by the three-dimensional model generation unit 21, using the operation rendering parameters stored in the data storage unit 42. The three-dimensional rendering unit 43 then stores the color information in the color buffer and the depth information in the depth buffer.
- The three-dimensional rendering unit 43 also generates a display user image by three-dimensionally rendering the same three-dimensional model using the display rendering parameters stored in the data storage unit 42.
- The three-dimensional rendering unit 43 may be configured as two rendering units: a first three-dimensional rendering unit that generates the display user image and a second three-dimensional rendering unit that generates the operation user image. Alternatively, the control unit 27 may time-share a single rendering unit between the process of generating the display user image and the process of generating the operation user image.
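A minimal sketch of this dual-pass flow (numpy, column-vector matrix convention; rasterization into the respective buffers is elided):

```python
import numpy as np

def project(model_pts: np.ndarray, proj: np.ndarray) -> np.ndarray:
    """Apply a 4x4 projection matrix to N x 3 model points and perform
    the divide by w', yielding normalized device coordinates."""
    homo = np.hstack([model_pts, np.ones((len(model_pts), 1))])
    clip = homo @ proj.T
    return clip[:, :3] / clip[:, 3:4]

def render_frame(model_pts, op_proj, disp_proj):
    """Steps S201/S202 of Embodiment 2: the same three-dimensional model
    is projected twice per frame, once with the orthographic operation
    matrix (toward the operation rendering buffer 601) and once with the
    perspective display matrix (toward the display rendering buffer 604)."""
    ndc_operation = project(model_pts, op_proj)   # S201
    ndc_display = project(model_pts, disp_proj)   # S202
    return ndc_operation, ndc_display
```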
- the display image composition unit 46 generates a display menu image based on the operation information output by the operation information output unit 25. Then, the display image synthesis unit 46 generates a display image by synthesizing the generated display menu image and the display user image generated by the three-dimensional rendering unit 43. Then, the display image synthesis unit 46 outputs the generated display image to the display unit 12.
- FIG. 7 is a flowchart showing a flow of processing by the remote control device according to the second embodiment of the present invention.
- FIG. 8 is a diagram for explaining the flow of processing by the remote operation device according to Embodiment 2 of the present invention.
- the same processes as those in FIG. 4A are denoted by the same reference numerals, and the description thereof is omitted.
- After step S101 is executed, the three-dimensional rendering unit 43 generates the operation user image (color information and depth information) by three-dimensionally rendering the three-dimensional model using the operation rendering parameters (S201), and stores the generated color information and depth information in the operation rendering buffer 601. Note that the three-dimensional rendering unit 43 generates the operation user image using a projection matrix different from the one used to generate the display user image in step S202.
- the operation determination unit 24 and the operation information output unit 25 execute the processes of steps S103 to S106 as in the first embodiment.
- the operation determination unit 24 executes the operation determination process using the color information and the depth information stored in the operation rendering buffer 601. Specifically, the operation determination unit 24 performs the first operation determination using color information stored in the operation color buffer 602. In addition, the operation determination unit 24 performs the second operation determination using the depth information stored in the operation depth buffer 603.
- Next, the three-dimensional rendering unit 43 generates the display user image by three-dimensionally rendering, using the display rendering parameters, the same three-dimensional model used to generate the operation user image in step S201 (S202). The three-dimensional rendering unit 43 then stores the generated color information and depth information in the display rendering buffer 604. Note that the three-dimensional rendering unit 43 generates the display user image using a projection matrix different from the one used to generate the operation user image in step S201.
- The display image composition unit 46 generates a display menu image 605 based on the output operation information. Then, the display image composition unit 46 outputs the display image 606, obtained by synthesizing the generated display menu image 605 and the display user image, to the display unit 12 (S203). Finally, the display unit 12 displays the display image 606 on the display screen.
- As described above, the remote operation device 40 according to the present embodiment generates the operation user image and the display user image from the same three-dimensional model. The reason why this is preferable is explained below with reference to FIGS. 9 to 11.
- As shown in FIG. 9, the size of the user image displayed on the display screen changes depending on the distance between the user and the display screen. This is because the displayed user image is generally generated by a perspective projection that takes perspective into account.
- As shown in FIG. 10, in perspective projection, the projection image 805 is generated at the positions where the projection lines 804 connecting the viewpoint 801 and the object 802 intersect the projection plane 803. Therefore, the size of the projection image 805 changes according to the distance between the object 802 and the projection plane 803.
- The projection formula using the perspective projection matrix is expressed as shown in Formula (1):

$$
\begin{pmatrix} x' \\ y' \\ z' \\ w' \end{pmatrix}
=
\begin{pmatrix}
\frac{2n}{r-l} & 0 & \frac{r+l}{r-l} & 0 \\
0 & \frac{2n}{t-b} & \frac{t+b}{t-b} & 0 \\
0 & 0 & -\frac{f+n}{f-n} & -\frac{2fn}{f-n} \\
0 & 0 & -1 & 0
\end{pmatrix}
\begin{pmatrix} x \\ y \\ z \\ 1 \end{pmatrix}
\qquad (1)
$$

- Here, x, y, and z are the three-dimensional coordinates before projection; x′, y′, z′, and w′ are the homogeneous coordinates after projection; and the constants l, r, b, t, n, and f represent the boundaries of the three-dimensional space (left, right, bottom, top, near, and far, respectively).
- As Formula (1) shows, after the divide by w′ = −z, the coordinates in the directions parallel to the screen (the XY directions) depend on the direction perpendicular to the screen (the Z direction).
- In contrast, with orthographic projection, the size of the user image does not change according to the distance between the user and the display screen.
- As shown in FIG. 11, in orthographic projection, the projection image 810 is generated at the positions where the projection lines 809, which run from the viewpoint 806 toward the object 807 perpendicular to the projection plane 808 and parallel to one another, intersect the projection plane 808. Therefore, the size of the projection image 810 is constant regardless of the distance between the object 807 and the projection plane 808.
- The projection formula based on the orthographic projection matrix is expressed as shown in Formula (2):

$$
\begin{pmatrix} x' \\ y' \\ z' \\ w' \end{pmatrix}
=
\begin{pmatrix}
\frac{2}{r-l} & 0 & 0 & -\frac{r+l}{r-l} \\
0 & \frac{2}{t-b} & 0 & -\frac{t+b}{t-b} \\
0 & 0 & -\frac{2}{f-n} & -\frac{f+n}{f-n} \\
0 & 0 & 0 & 1
\end{pmatrix}
\begin{pmatrix} x \\ y \\ z \\ 1 \end{pmatrix}
\qquad (2)
$$

- In Formula (2), the coordinates in the directions parallel to the screen (the XY directions) do not depend on the direction perpendicular to the screen (the Z direction). Therefore, the horizontal information and the perpendicular information are cleanly separated and stored in the operation color buffer 602 and the operation depth buffer 603, respectively.
- In this way, since the three-dimensional rendering unit 43 generates the color information and depth information stored in the operation rendering buffer using an orthographic projection matrix, the user can designate a menu image with a consistent operation feeling regardless of the user's position in the direction perpendicular to the display screen.
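The following sketch builds both matrices in the form of Formulas (1) and (2) (frustum bounds chosen arbitrarily) and shows that under the perspective matrix the projected XY coordinates of a point shrink as it moves away along Z, while under the orthographic matrix they are unchanged:

```python
import numpy as np

def perspective(l, r, b, t, n, f):
    # Formula (1): perspective (frustum) projection matrix.
    return np.array([
        [2*n/(r-l), 0,         (r+l)/(r-l),   0],
        [0,         2*n/(t-b), (t+b)/(t-b),   0],
        [0,         0,         -(f+n)/(f-n), -2*f*n/(f-n)],
        [0,         0,         -1,            0]], dtype=float)

def orthographic(l, r, b, t, n, f):
    # Formula (2): orthographic projection matrix.
    return np.array([
        [2/(r-l), 0,       0,        -(r+l)/(r-l)],
        [0,       2/(t-b), 0,        -(t+b)/(t-b)],
        [0,       0,       -2/(f-n), -(f+n)/(f-n)],
        [0,       0,       0,         1]], dtype=float)

def to_ndc(proj, point):
    """Project a 3D point and divide by w' to get screen coordinates."""
    x, y, z, w = proj @ np.append(point, 1.0)
    return np.array([x, y, z]) / w

P = perspective(-1, 1, -1, 1, 1, 10)
O = orthographic(-1, 1, -1, 1, 1, 10)
near_pt, far_pt = [0.5, 0.5, -2.0], [0.5, 0.5, -8.0]
print(to_ndc(P, near_pt)[:2], to_ndc(P, far_pt)[:2])  # XY shrinks with depth
print(to_ndc(O, near_pt)[:2], to_ndc(O, far_pt)[:2])  # XY stays the same
```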
- In the above description, the three-dimensional rendering unit 43 generates the operation user image using an orthographic projection matrix, but it may generate the operation user image using another projection matrix.
- FIG. 12 is a diagram for explaining a modification of the second embodiment of the present invention. As shown in FIG. 12, depending on the size relationship between the user 901 and the display screen 902, even if the three-dimensional rendering unit 43 generates the operation user image using the orthographic projection matrix, it may be difficult for the user to designate a menu image because the user image cannot reach the menu image, or touches it too easily.
- Therefore, the three-dimensional rendering unit 43 may generate the operation user image using a projection matrix that changes the size of the user image 903 according to the sizes of the user 901 and the display screen 902 (for example, their measured heights or widths).
- For example, the three-dimensional rendering unit 43 may generate the operation user image using a projection matrix that enlarges or reduces the user image 903 by a factor of s around a point (Cx, Cy).
- The projection formula based on a projection matrix that enlarges or reduces the user image by a factor of s around the point (Cx, Cy) is expressed as shown in Formula (3):

$$
\begin{pmatrix} x' \\ y' \\ z' \\ w' \end{pmatrix}
=
\begin{pmatrix}
s & 0 & 0 & (1-s)\,C_x \\
0 & s & 0 & (1-s)\,C_y \\
0 & 0 & 1 & 0 \\
0 & 0 & 0 & 1
\end{pmatrix}
\begin{pmatrix} x \\ y \\ z \\ 1 \end{pmatrix}
\qquad (3)
$$
- In this way, since the remote operation device generates the operation user image using a projection matrix that enlarges or reduces the user image by a factor of s, it can always generate a user image 904 with which the user 901 can easily designate the menu image, regardless of the sizes of the user 901 and the display screen 902.
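A small sketch of the Formula (3) scaling matrix; the height-ratio choice of s below is an illustrative assumption, not prescribed by the patent:

```python
import numpy as np

def scale_about(s: float, cx: float, cy: float) -> np.ndarray:
    # Formula (3): enlarge or reduce the XY coordinates by a factor s
    # about the point (cx, cy); x' = s*(x - cx) + cx, likewise for y.
    return np.array([
        [s, 0, 0, (1 - s) * cx],
        [0, s, 0, (1 - s) * cy],
        [0, 0, 1, 0],
        [0, 0, 0, 1]], dtype=float)

# Example: user 1.8 m tall, screen 1.2 m tall -> shrink the user image.
s = 1.2 / 1.8
p = np.array([0.5, 0.9, -2.0, 1.0])  # homogeneous point after projection
print(scale_about(s, 0.0, 0.0) @ p)  # XY scaled toward (0, 0)
```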
- Note that the projection matrix for generating the operation user image shown in the present embodiment is merely an example, and the remote operation device according to the present invention may generate the operation user image using another projection matrix.
- When the remote operation device generates a user image by three-dimensionally rendering the user's three-dimensional model from a viewpoint at an arbitrary position, or generates a horizontally mirrored user image, the projection matrix used to generate the display user image changes. Even in such cases, the projection matrix used to generate the operation user image may be kept constant. This allows the user to designate the menu image with the same operation feeling throughout.
- The color information generated when generating the operation user image need not indicate the user's actual colors; it may simply indicate the presence or absence of the user image (for example, as 1-bit data), because the operation user image is never displayed on the display unit 12. This allows the remote operation device to reduce the memory area used.
- Since the display user image shown on the display screen differs from the operation user image used in the operation determination process, the outline of the operation user image may be displayed, or the operation user image may be overlaid translucently. This makes the operation easier for the user to understand.
- In the above embodiments, the remote operation device executes the second operation determination after the first operation determination, but it may instead execute the first operation determination after the second operation determination.
- In that case, the remote operation device may identify coordinates where the movement amount in the direction perpendicular to the display screen is equal to or greater than a predetermined threshold, and then determine whether those coordinates are included in the display area of a menu image. The threshold may be a fixed value determined in advance, or may be the maximum value of the calculated movement amounts.
- The remote operation device in the above embodiments generates, through the three-dimensional rendering process, a user image similar to what the display screen would show if it were an actual mirror; however, it may instead generate and output to the display unit a user image different from the mirror image an actual mirror would reflect. For example, the remote operation device may generate a user image showing the user's side by performing three-dimensional rendering with the line of sight parallel to the display screen.
- each functional block included in the remote control device of the above embodiment is typically realized as an LSI (Large Scale Integration) which is an integrated circuit. That is, as shown in FIG. 3 or FIG. 6, the remote control device typically includes an LSI 51 or an LSI 52. These functional blocks may be individually made into one chip, or may be made into one chip so as to include a part or all of them. Further, the data storage unit may be provided inside or outside the integrated circuit, and may be constituted by a single memory or a plurality of memories. Although referred to as LSI here, it may be referred to as IC (Integrated Circuit), system LSI, super LSI, or ultra LSI depending on the degree of integration.
- the method of circuit integration is not limited to LSI's, and implementation using dedicated circuitry or general purpose processors is also possible.
- An FPGA (Field Programmable Gate Array) that can be programmed after LSI manufacture, or a reconfigurable processor in which the connections and settings of circuit cells inside the LSI can be reconfigured, may also be used.
- Further, if integrated circuit technology that replaces LSI emerges from advances in semiconductor technology or other derivative technologies, the functional blocks may naturally be integrated using that technology. Application of biotechnology is one possibility.
- In the above embodiments, the electronic mirror device includes the remote operation device, but another device having the function of generating a user's three-dimensional model and displaying it from an arbitrary viewpoint may include the remote operation device.
- various devices such as a television 1104, a recorder / player 1105, or a game machine 1106 may include a remote control device.
- Further, each device may include a module substrate 1102 on which a system LSI 1101 implementing the functional blocks of the remote operation device is mounted.
- the present invention may be realized as a remote operation method in which operations of characteristic components included in the remote operation device are used as steps.
- the steps included in such a remote operation method may be realized as a program that is executed by a computer including a CPU (Central Processing Unit), a memory, and the like.
- a program may be distributed via a recording medium such as a CD-ROM or a transmission medium such as the Internet.
- The remote operation device of the present invention can be used as a user interface for devices that generate a three-dimensional model of a user and display user images from arbitrary viewpoints based on the generated model, such as an electronic mirror device, a television, a recorder/player, or a game machine.
Description
The entire disclosure of the specification, drawings, and claims of Japanese Patent Application No. 2008-144458, filed on June 2, 2008, is incorporated herein by reference.
11 Three-dimensional human body shape measurement unit
12 Display unit
13 Electronic mirror control unit
20, 40 Remote operation device
21 Three-dimensional model generation unit
22, 42 Data storage unit
23, 43 Three-dimensional rendering unit
24 Operation determination unit
25 Operation information output unit
26, 46 Display image composition unit
27 Control unit
51, 52 LSI
101, 201, 701, 705, 901 User
102 Three-dimensional human body shape measurement device
103, 202, 702, 706, 902 Display screen
104, 203 User projection image
204, 704, 708 Menu image
401 Rendering buffer
402 Color buffer
403 Depth buffer
404, 605 Display menu image
405, 606 Display image
601 Operation rendering buffer
602 Operation color buffer
603 Operation depth buffer
604 Display rendering buffer
703, 707, 903, 904 User image
801, 806 Viewpoint
802, 807 Object
803, 808 Projection plane
804, 809 Projection line
805, 810 Projection image
1101 System LSI
1102 Module substrate
1104 Television
1105 Recorder/player
1106 Game machine
Claims (7)
- 1. A remote operation device for outputting operation information corresponding to a menu image that is displayed on a display screen and is designated by a user at a position away from the display screen, the remote operation device comprising:
a three-dimensional model generation unit that generates first and second three-dimensional models, which are three-dimensional models showing the appearance of the shape of the user at a first time and at a second time after the first time;
a three-dimensional rendering unit that generates first and second user images by three-dimensionally rendering the first and second three-dimensional models generated by the three-dimensional model generation unit;
a first operation determination unit that determines, when the second user image generated by the three-dimensional rendering unit is displayed on the display screen, whether or not the menu image and the second user image overlap;
a second operation determination unit that calculates an amount of movement of the second user image in a direction perpendicular to the display screen when the second user image generated by the three-dimensional rendering unit is displayed on the display screen, and specifies, based on the calculated amount of movement, coordinates on the display screen indicating a position designated by the user; and
an operation information output unit that outputs operation information corresponding to the menu image when the first operation determination unit determines that the menu image and the second user image overlap and the coordinates specified by the second operation determination unit are included in an area where the menu image is displayed,
wherein the second operation determination unit refers to a depth buffer that stores depth information indicating a relationship between coordinates on a projection plane and depth values, generated when the first and second three-dimensional models are three-dimensionally rendered, and calculates a difference value between a depth value of the first user image and a depth value of the second user image as the amount of movement.
- 2. The remote operation device according to claim 1, wherein the three-dimensional rendering unit generates the first and second user images by three-dimensionally rendering the first and second three-dimensional models using a first projection matrix, and further generates a third user image by three-dimensionally rendering the second three-dimensional model using a second projection matrix different from the first projection matrix, and
the third user image generated by the three-dimensional rendering unit is displayed on the display screen.
- 3. The remote operation device according to claim 2, wherein the first projection matrix is an orthographic projection matrix, and the second projection matrix is a perspective projection matrix.
- 4. The remote operation device according to claim 2, wherein the first projection matrix is a projection matrix that changes a size of the user image according to a size of the user and a size of the display screen.
- 5. A remote operation method for outputting operation information corresponding to a menu image that is displayed on a display screen and is designated by a user at a position away from the display screen, the remote operation method comprising:
a three-dimensional model generation step of generating first and second three-dimensional models, which are three-dimensional models showing the appearance of the shape of the user at a first time and at a second time after the first time;
a three-dimensional rendering step of generating first and second user images by three-dimensionally rendering the first and second three-dimensional models generated in the three-dimensional model generation step;
a first operation determination step of determining, when the second user image generated in the three-dimensional rendering step is displayed on the display screen, whether or not the menu image and the second user image overlap;
a second operation determination step of calculating an amount of movement of the second user image in a direction perpendicular to the display screen when the second user image generated in the three-dimensional rendering step is displayed on the display screen, and specifying, based on the calculated amount of movement, coordinates on the display screen indicating a position designated by the user; and
an operation information output step of outputting operation information corresponding to the menu image when it is determined in the first operation determination step that the menu image and the second user image overlap and the coordinates specified in the second operation determination step are included in an area where the menu image is displayed,
wherein, in the second operation determination step, a depth buffer that stores depth information indicating a relationship between coordinates on a projection plane and depth values, generated when the first and second three-dimensional models are three-dimensionally rendered, is referred to, and a difference value between a depth value of the first user image and a depth value of the second user image is calculated as the amount of movement.
- 6. An integrated circuit for outputting operation information corresponding to a menu image that is displayed on a display screen and is designated by a user at a position away from the display screen, the integrated circuit comprising:
a three-dimensional model generation unit that generates first and second three-dimensional models, which are three-dimensional models showing the appearance of the shape of the user at a first time and at a second time after the first time;
a three-dimensional rendering unit that generates first and second user images by three-dimensionally rendering the first and second three-dimensional models generated by the three-dimensional model generation unit;
a first operation determination unit that determines, when the second user image generated by the three-dimensional rendering unit is displayed on the display screen, whether or not the menu image and the second user image overlap;
a second operation determination unit that calculates an amount of movement of the second user image in a direction perpendicular to the display screen when the second user image generated by the three-dimensional rendering unit is displayed on the display screen, and specifies, based on the calculated amount of movement, coordinates on the display screen indicating a position designated by the user; and
an operation information output unit that outputs operation information corresponding to the menu image when the first operation determination unit determines that the menu image and the second user image overlap and the coordinates specified by the second operation determination unit are included in an area where the menu image is displayed,
wherein the second operation determination unit refers to a depth buffer that stores depth information indicating a relationship between coordinates on a projection plane and depth values, generated when the first and second three-dimensional models are three-dimensionally rendered, and calculates a difference value between a depth value of the first user image and a depth value of the second user image as the amount of movement.
- 7. A program for outputting operation information corresponding to a menu image that is displayed on a display screen and is designated by a user at a position away from the display screen, the program causing a computer to execute:
a three-dimensional model generation step of generating first and second three-dimensional models, which are three-dimensional models showing the appearance of the shape of the user at a first time and at a second time after the first time;
a three-dimensional rendering step of generating first and second user images by three-dimensionally rendering the first and second three-dimensional models generated in the three-dimensional model generation step;
a first operation determination step of determining, when the second user image generated in the three-dimensional rendering step is displayed on the display screen, whether or not the menu image and the second user image overlap;
a second operation determination step of calculating an amount of movement of the second user image in a direction perpendicular to the display screen when the second user image generated in the three-dimensional rendering step is displayed on the display screen, and specifying, based on the calculated amount of movement, coordinates on the display screen indicating a position designated by the user; and
an operation information output step of outputting operation information corresponding to the menu image when it is determined in the first operation determination step that the menu image and the second user image overlap and the coordinates specified in the second operation determination step are included in an area where the menu image is displayed,
wherein, in the second operation determination step, a depth buffer that stores depth information indicating a relationship between coordinates on a projection plane and depth values, generated when the first and second three-dimensional models are three-dimensionally rendered, is referred to, and a difference value between a depth value of the first user image and a depth value of the second user image is calculated as the amount of movement.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2010515752A JP5340280B2 (ja) | 2008-06-02 | 2009-05-29 | 遠隔操作装置及び遠隔操作方法 |
US12/671,532 US8432391B2 (en) | 2008-06-02 | 2009-05-29 | Remote control device and remote control method |
CN2009801000735A CN101784980B (zh) | 2008-06-02 | 2009-05-29 | 遥控操作装置及遥控操作方法 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2008144458 | 2008-06-02 | ||
JP2008-144458 | 2008-06-02 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2009147806A1 (ja) | 2009-12-10 |
Family
ID=41397890
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2009/002380 WO2009147806A1 (ja) | 2008-06-02 | 2009-05-29 | 遠隔操作装置及び遠隔操作方法 |
Country Status (4)
Country | Link |
---|---|
US (1) | US8432391B2 (ja) |
JP (1) | JP5340280B2 (ja) |
CN (1) | CN101784980B (ja) |
WO (1) | WO2009147806A1 (ja) |
Families Citing this family (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP5659510B2 (ja) * | 2010-03-10 | 2015-01-28 | ソニー株式会社 | 画像処理装置、画像処理方法及びプログラム |
JP5585505B2 (ja) * | 2011-03-17 | 2014-09-10 | セイコーエプソン株式会社 | 画像供給装置、画像表示システム、画像供給装置の制御方法、画像表示装置、及び、プログラム |
US9438890B2 (en) * | 2011-08-25 | 2016-09-06 | Panasonic Intellectual Property Corporation Of America | Image processor, 3D image capture device, image processing method, and image processing program |
US9628843B2 (en) * | 2011-11-21 | 2017-04-18 | Microsoft Technology Licensing, Llc | Methods for controlling electronic devices using gestures |
US9842425B2 (en) * | 2012-09-21 | 2017-12-12 | Euclideon Pty Ltd. | System and method for rendering three-dimensional scenes by a computer graphics processor using orthogonal projection |
KR102307530B1 (ko) * | 2012-11-23 | 2021-09-30 | 카덴스 메디컬 이미징 아이엔씨 | 제 1 랜더링된 투영 및 제 2 랜더링된 투영간의 전이를 사용자에게 디스플레이하는 방법 및 시스템 |
FR3034078B1 (fr) * | 2015-03-27 | 2017-03-24 | Airbus Helicopters | Procede et dispositif pour signaler au sol un aeronef en vol, et aeronef muni de ce dispositif |
KR20180035434A (ko) | 2016-09-29 | 2018-04-06 | 삼성전자주식회사 | 디스플레이 장치 및 그의 제어 방법 |
JP7320352B2 (ja) * | 2016-12-28 | 2023-08-03 | パナソニック インテレクチュアル プロパティ コーポレーション オブ アメリカ | 三次元モデル送信方法、三次元モデル受信方法、三次元モデル送信装置及び三次元モデル受信装置 |
US11733781B2 (en) * | 2019-04-02 | 2023-08-22 | Project Dasein Llc | Leveraging machine learning and fractal analysis for classifying motion |
CN113855287B (zh) * | 2021-07-06 | 2023-09-26 | 上海优医基医疗影像设备有限公司 | 一种带评估种植精度的口腔种植手术机器人及控制方法 |
Family Cites Families (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN1841290A (zh) * | 2003-03-28 | 2006-10-04 | 精工爱普生株式会社 | 信息显示系统及其信息处理装置、指示装置和标记显示法 |
JP2004318823A (ja) * | 2003-03-28 | 2004-11-11 | Seiko Epson Corp | 情報表示システム、情報処理装置、ポインティング装置および情報表示システムにおけるポインタマーク表示方法 |
JP2005322055A (ja) | 2004-05-10 | 2005-11-17 | Nippon Telegr & Teleph Corp <Ntt> | 作業支援用情報提示装置、作業支援用情報提示方法及び作業支援用情報を提示するためのプログラム |
JP4711223B2 (ja) * | 2005-08-02 | 2011-06-29 | 株式会社セガ | 画像生成プログラム、記憶媒体、画像処理方法及び画像処理装置 |
2009
- 2009-05-29 CN CN2009801000735A patent/CN101784980B/zh not_active Expired - Fee Related
- 2009-05-29 WO PCT/JP2009/002380 patent/WO2009147806A1/ja active Application Filing
- 2009-05-29 US US12/671,532 patent/US8432391B2/en not_active Expired - Fee Related
- 2009-05-29 JP JP2010515752A patent/JP5340280B2/ja not_active Expired - Fee Related
Patent Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2000075991A (ja) * | 1998-08-28 | 2000-03-14 | Aqueous Research:Kk | 情報入力装置 |
JP2004265222A (ja) * | 2003-03-03 | 2004-09-24 | Nippon Telegr & Teleph Corp <Ntt> | インタフェース方法、装置、およびプログラム |
Also Published As
Publication number | Publication date |
---|---|
JP5340280B2 (ja) | 2013-11-13 |
CN101784980B (zh) | 2013-09-18 |
US20110018864A1 (en) | 2011-01-27 |
CN101784980A (zh) | 2010-07-21 |
JPWO2009147806A1 (ja) | 2011-10-20 |
US8432391B2 (en) | 2013-04-30 |
Legal Events

Date | Code | Title | Description
---|---|---|---
| WWE | WIPO information: entry into national phase | Ref document number: 200980100073.5; Country of ref document: CN |
| WWE | WIPO information: entry into national phase | Ref document number: 2010515752; Country of ref document: JP |
| WWE | WIPO information: entry into national phase | Ref document number: 12671532; Country of ref document: US |
| 121 | EP: the EPO has been informed by WIPO that EP was designated in this application | Ref document number: 09758073; Country of ref document: EP; Kind code of ref document: A1 |
| NENP | Non-entry into the national phase | Ref country code: DE |
| 122 | EP: PCT application non-entry in European phase | Ref document number: 09758073; Country of ref document: EP; Kind code of ref document: A1 |