JP6548923B2 - Command execution system and position measurement device

Command execution system and position measurement device

Info

Publication number
JP6548923B2
JP6548923B2
Authority
JP
Japan
Prior art keywords
command
object
user
area
setting unit
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
JP2015049493A
Other languages
Japanese (ja)
Other versions
JP2016170596A (en)
Inventor
信策 阿部
Original Assignee
株式会社ミツトヨ
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社ミツトヨ
Priority to JP2015049493A
Publication of JP2016170596A
Application granted
Publication of JP6548923B2
Application status: Active
Anticipated expiration

Description

  The present invention relates to a command execution system and a position measurement device that receive a command selected by a user and execute it on a computer.

  Window systems have become the mainstream computer interface. The user refers to the content displayed in windows on the screen while performing various operations such as editing and file management.

  Patent Document 1 discloses a portable terminal device in which a background image on the screen is associated with items, and an operation is performed on the item focused by the user. Patent Document 2 discloses a graphical user interface for accessing information in an electronic file system. Patent Document 3 discloses, for an image containing individual image areas, adapting functions of a device menu to at least one of those image areas.

Patent Document 1: Japanese Unexamined Patent Application Publication No. 2007-041641
Patent Document 2: Japanese Patent Application Publication No. 10-507020
Patent Document 3: Japanese Patent Application Publication No. 2008-511895

  However, in the technology described in Patent Document 1, the functional relationship between the background image and the items is fixed; as the number of functions to be associated grows, the number and hierarchy of background images increase, and selecting a desired item becomes difficult. In addition, the user cannot assign or customize items, so settings cannot be made to match the user's preferences, nor can the image and function settings required by a particular application.

  In the technology described in Patent Document 2, only links to electronic files are assigned to graphics, and execution of arbitrary commands on the computer is not supported. In addition, as the number of associated items increases, so does the number of sub-images.

  The technique described in Patent Document 3 amounts to arranging icons on a grid within an image, making it hard to know which function is assigned to which image. Moreover, the assignment of functions to images offers little room for customization.

  As described above, in all of these technologies the assignment of functions is not easy for the user and the degree of customization is low, so it takes a long time to become accustomed to the correspondence between images and functions. In addition, common menu operations cannot be shared among applications, so the user has to remember a separate operation for each application.

  An object of the present invention is to provide a command execution system and a position measurement device that allow commands to be assigned in a way that matches the user's sense, so that functions are easy to associate.

  In order to solve the above problems, a command execution system according to the present invention receives a command selected by a user and causes a computing unit of a computer to execute the command. It comprises: an allocation area setting unit that sets an allocation area in the information display area of the computer according to the user's instruction; an object setting unit that sets objects in the allocation area according to the user's instruction; a command setting unit that sets an association between an object and a command; a selection receiving unit that receives the user's selection of an object; and a command execution unit that causes the computing unit to execute the command associated with the object received by the selection receiving unit.

  According to such a configuration, the allocation area designated by the user and the objects associated with commands are set within the information display area of the computer. The allocation area, the objects, and the commands are all set arbitrarily according to the user's preference. The user can therefore freely assign easy-to-understand commands to desired objects displayed in the allocation area.

  In the command execution system according to the present invention, the command setting unit may associate a plurality of applications with one command, and the command execution unit may cause the currently active application among the plurality of applications to execute processing according to the command.

  According to such a configuration, a command that performs the same processing in a plurality of applications can be associated with one object. Even when the applications differ, the same object is associated with the command that performs the same processing, so the user can select the same object to execute the same processing regardless of the application.

  In the command execution system of the present invention, the object may be displayed two-dimensionally or three-dimensionally in the allocation area. According to such a configuration, the user can execute a command by selecting a flat pattern (a photograph, illustration, image, or the like) or a three-dimensional pattern (a hologram, a solid image in a virtual reality space, or the like).

  In the command execution system according to the present invention, the object setting unit may set a plurality of objects and set information on their upper/lower overlap in a specific direction within the allocation area, and when the user designates an area where a plurality of objects overlap, the selection receiving unit may accept the selection of the upper object based on the overlap information set by the object setting unit. According to such a configuration, commands can be associated with objects even when a plurality of objects overlap one another.

  The command execution system of the present invention may further include a display moving unit that moves the position of the allocation area within the display area according to the user's instruction, and when the display moving unit moves the allocation area, the object setting unit may move the positions of the objects in accordance with the movement of the allocation area.

  According to such a configuration, when the user moves the allocation area within the display area, the objects move together with it, so the relative positional relationship between the allocation area and the objects is maintained.

  In the command execution system according to the present invention, the command setting unit may associate a plurality of commands, together with their processing order, with one object. According to such a configuration, the user can execute a plurality of commands merely by selecting one object.

  In the command execution system of the present invention, the allocation area may be the output area of one image file, and the object may be a specific pattern in the image file. According to such a configuration, a command can be assigned to a specific pattern within a single image file.

  A position measurement apparatus according to the present invention includes a measurement head that detects the position of a measurement object in a predetermined coordinate system, and a control device that calculates and outputs the detection results of the measurement head, the control device including the above command execution system. According to such a configuration, when performing measurement, the user can send a command to the control device by selecting an object in an allocation area set up to the user's preference. As a result, a user-friendly interface can be realized even for a measurement device that requires specialized handling.

  In the position measurement device of the present invention, the command setting unit may associate, as a command, at least a part of the series of processes from calculation of the detection results to their output with an object. According to such a configuration, a plurality of processes can be performed merely by selecting one object.

FIG. 1 is a block diagram illustrating the command execution system according to a first embodiment.
FIG. 2 is a schematic diagram showing an example of a panel and objects.
FIG. 3 is a diagram illustrating a correspondence table between objects and commands.
FIG. 4 is a schematic diagram showing an application example of an object.
FIGS. 5A to 5C are schematic diagrams showing application examples of objects.
FIG. 6 is a processing flowchart of the command execution system.
FIG. 7 is a processing flowchart of the command execution system.
FIG. 8 is a processing flowchart of the command execution system.
FIG. 9 is a schematic diagram illustrating a position measurement device according to a second embodiment.
FIGS. 10A to 10C are schematic diagrams showing display examples of the display of the position measurement device.

  Hereinafter, embodiments of the present invention will be described with reference to the drawings. In the following description, the same members are denoted by the same reference numerals, and descriptions of members already described are omitted as appropriate.

First Embodiment
FIG. 1 is a block diagram illustrating a command execution system according to the first embodiment.
The command execution system according to the present embodiment is a system that receives a command selected by the user and causes the CPU (Central Processing Unit) 10 of the computer PC to execute the command.

  The computer PC includes a CPU 10, a storage device 20, an arithmetic unit 30, a display control unit 40, an input/output control unit 50, and an external device control unit 60. The CPU 10 executes various programs. The storage device 20 includes a main storage unit and a secondary storage unit. The arithmetic unit 30 is hardware that performs various operations. The display control unit 40 controls the display 45. The input/output control unit 50 controls the input and output of information via the keyboard 51 and the touch panel 52.

  In the present embodiment, the command execution system is realized by a program executed by the CPU 10 of the computer PC. The command execution system includes a panel generation unit 71, a panel operation unit 72, and a command execution unit 73.

  The panel generation unit 71 includes an allocation area setting unit 711, an object setting unit 712, and a command setting unit 713. The allocation area setting unit 711 sets an allocation area in the information display area of the computer PC according to the user's instruction; for example, a predetermined region of the display area of the display 45 is set as the allocation area. In this embodiment, this allocation area is called a panel. The allocation area setting unit 711 sets the panel in the display area of the display 45 according to an instruction from the user, and the panel can be set arbitrarily according to the user's preference.

  The object setting unit 712 sets objects in the panel according to the user's instructions; for example, it places a pattern such as an image or a photograph in the panel. An object is set at a predetermined position in the panel in accordance with an instruction from the user. The object setting unit 712 stores the position information of the object's display area and the object's identification information. Objects are selected arbitrarily according to the user's preference and placed at arbitrary positions in the panel.

  The command setting unit 713 sets associations between objects and commands. In accordance with an instruction from the user, it associates a predetermined command with an object; the command executed by the CPU 10 is thereby tied to the object set by the object setting unit 712. The user can assign any command to any desired object.
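
To make the relationship among the panel, the objects, and the commands concrete, here is a minimal sketch in Python. The names (Panel, TargetObject, command_table) and the rectangle representation are assumptions made for illustration, not structures from the patent itself.

```python
from dataclasses import dataclass, field
from typing import Callable, Dict, List

@dataclass
class TargetObject:
    object_id: str   # identification information of the object
    x: int           # position inside the panel (relative to the panel origin)
    y: int
    width: int
    height: int
    z: int = 0       # stacking order; a larger z means an "upper" object

@dataclass
class Panel:
    x: int           # position of the allocation area in the display area
    y: int
    width: int
    height: int
    objects: List[TargetObject] = field(default_factory=list)

# Table mapping identification information to commands.
command_table: Dict[str, Callable[[], None]] = {}

def set_object(panel: Panel, obj: TargetObject) -> None:
    """Object setting unit 712: place an object at a position in the panel."""
    panel.objects.append(obj)

def set_command(object_id: str, command: Callable[[], None]) -> None:
    """Command setting unit 713: associate a command with an object."""
    command_table[object_id] = command

# Example: assign an easy-to-understand command to a desired object.
panel = Panel(100, 100, 400, 300)
set_object(panel, TargetObject("IMG-1", 10, 10, 120, 90, z=2))
set_command("IMG-1", lambda: print("open file"))
```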

  The selection receiving unit 721 receives the user's selection of an object. The user selects an object in the panel using the touch panel 52 or a pointing device such as a mouse. From the position selected by the user, the selection receiving unit 721 detects the identification information of the object containing that position.

  The display moving unit 722 moves the position of the panel within the display area according to the user's instruction. The user can move the panel to a preferred position by operating the touch panel 52 or a mouse. The display moving unit 722 receives the operation and moves the panel; as the panel moves, the object setting unit 712 moves the objects in accordance with the panel's movement. The relative positions of the panel and the objects therefore do not change, and both can be moved together within the display area.
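
Because the sketch above stores object positions relative to the panel origin, the display moving unit reduces to shifting the panel itself; a hedged sketch:

```python
def move_panel(panel: Panel, dx: int, dy: int) -> None:
    """Display moving unit 722: move the allocation area within the display
    area. Object coordinates are panel-relative in this sketch, so the
    relative positions of panel and objects are preserved automatically."""
    panel.x += dx
    panel.y += dy
```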

  The command execution unit 73 causes the CPU 10 to execute the command associated with the object received by the selection receiving unit 721. When the user selects an object, the selection receiving unit 721 detects the identification information of the selected object and sends it to the command execution unit 73, which selects the command associated with that identification information and sends it to the CPU 10. As a result, command operations on the operating system and on various application software are executed.
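
Continuing the sketch, the hand-off from the selection receiving unit to the command execution unit can be pictured as a lookup in the command table (how a selected position is resolved to identification information, including the overlapping case, is sketched below after the description of FIG. 2):

```python
def execute_selected(object_id: str) -> None:
    """Command execution unit 73: run the command associated with the
    identification information reported by the selection receiving unit."""
    command = command_table.get(object_id)
    if command is not None:
        command()  # in the device, this would be handed to the CPU 10
```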

FIG. 2 is a schematic view showing an example of a panel and an object.
As shown in FIG. 2, a display area DR is provided on the display 45 with the touch panel 52. The panel PR is set at a predetermined position in the display area DR. The user designates the position and size of the panel PR by operating the touch panel 52 or a mouse. Receiving this designation, the allocation area setting unit 711 stores in the storage device 20 the coordinate information of the region of the display area DR to which the panel PR belongs.

  The user can also place desired objects in the panel PR. In the example shown in FIG. 2, three images IMG-1, IMG-2, and IMG-3 are arranged in the panel PR as objects. Through the user's operation of the touch panel 52 or a mouse, the object setting unit 712 stores in the storage device 20 the coordinate information of the regions to which the images IMG-1, IMG-2, and IMG-3 belong.

Here, a plurality of objects may be laid out so that they partly overlap one another. In this case, the object setting unit 712 sets upper/lower overlap information for the objects in a specific direction of the panel PR (for example, the direction orthogonal to the panel PR). In the example shown in FIG. 2, the images are stacked, from top to bottom, in the order IMG-1, IMG-3, IMG-2; where images overlap, the upper image takes priority.
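
A sketch of how the selection receiving unit could resolve a selected position under this upper/lower rule, continuing the earlier Panel sketch; the top-to-bottom scan is an assumed implementation, not the patent's wording:

```python
from typing import Optional

def hit_test(panel: Panel, px: int, py: int) -> Optional[str]:
    """Selection receiving unit 721: map a point in display coordinates to
    the identification information of the upper-most object containing it."""
    rx, ry = px - panel.x, py - panel.y        # panel-relative coordinates
    # Scan the objects from top to bottom (largest z first); where objects
    # overlap, the upper one is found first and therefore takes priority.
    for obj in sorted(panel.objects, key=lambda o: o.z, reverse=True):
        if obj.x <= rx < obj.x + obj.width and obj.y <= ry < obj.y + obj.height:
            return obj.object_id
    return None
```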

  A command is associated with each of the images IMG-1, IMG-2, and IMG-3 serving as objects. The user sends the command setting unit 713 instructions to assign a command to each of the images IMG-1, IMG-2, and IMG-3 laid out on the panel PR, and the command setting unit 713 associates the images with the commands according to those instructions.

FIG. 3 is a diagram exemplifying a correspondence table between an object and a command.
The command setting unit 713 sets the association between an object and a command using a table TB as shown in FIG. 3. For example, command C1 is associated with image IMG-1, command C2 with image IMG-2, and command C3 with image IMG-3. A command may be an instruction to the operating system, an instruction to a given piece of application software, input of an arbitrary character string, and so on.

  A plurality of commands may be associated with one object together with their processing order. A command associated with one object may also be associated with several pieces of application software. For example, commands common to different application software, such as file operations (overwrite save, save as, print, etc.), are associated with one object as common commands.
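
A sketch of this variant, in which the table value becomes an ordered list of logical command names that the currently active application interprets; the Application stand-in and the command names are illustrative assumptions:

```python
from typing import Callable, Dict, List

class Application:
    """Stand-in for application software that understands common commands."""
    def __init__(self, name: str, handlers: Dict[str, Callable[[], None]]):
        self.name = name
        self.handlers = handlers  # command name -> handler function

    def perform(self, command_name: str) -> None:
        handler = self.handlers.get(command_name)
        if handler is not None:
            handler()

# One object bound to several commands with a set processing order.
sequence_table: Dict[str, List[str]] = {"IMG-1": ["overwrite save", "print"]}

def execute_sequence(object_id: str, active_app: Application) -> None:
    """Run the commands bound to one object in their processing order; the
    active application resolves what each common command means for it."""
    for command_name in sequence_table.get(object_id, []):
        active_app.perform(command_name)
```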

  By associating objects with commands in this manner, the user can issue a command to the CPU 10 of the computer PC by selecting a desired object in the panel PR.

  In the present embodiment, the panel PR, the objects, and the commands can all be set freely according to the user's preference. The user can therefore assign easy-to-understand commands to desired objects displayed in the panel PR (for example, images IMG-1, IMG-2, and IMG-3) and operate the commands by association with the contents of those images.

FIG. 4 and FIG. 5 are schematic views showing application examples of an object.
In the example shown in FIG. 4, photographs PH-1, PH-2, and PH-3 are used as the objects. By associating the photographs with commands, the user can intuitively recall and operate the commands from the contents of photographs PH-1, PH-2, and PH-3.

  For example, the command "open file" is associated with the picture PH-1 of the scene where the door is opened. Also, a command of "print" is associated with the photograph PH-2 of the printer. As described above, by associating commands intuitively understood from the contents of the photos PH-1 and PH-2, accurate command operations can be performed by any user.

  A user's favorite image may also be associated with a command; for example, the "overwrite save" command may be assigned to photograph PH-3 of a dog. An object that is easy for a specific user to understand tends to leave an impression: a photograph PH-3 of the user's pet dog, for example, is very memorable. By associating photograph PH-3 with a specific command, the user can accurately select and execute the command by association.

  In the example shown in FIG. 5A, a map image MAP is used as the object. For example, in an image MAP of a map of Japan, the single image MAP is divided into a plurality of areas (for example, by prefecture), and a command is associated with each area. In this example, a command can be assigned to each prefecture.

  The correspondence between each area (prefecture) and a command can be set arbitrarily by the user. For example, a certain user may picture "Hokkaido" as a storehouse of food and link that image to a frequently used command (for example, "overwrite save"). This makes it possible to intuitively select and execute a command through the user's own associations.

  The panel PR on which the map image MAP is arranged can be displayed in the display area DR whenever needed, regardless of the application software. As a result, whichever application software is in use, a command that performs the same processing can be executed simply by selecting the desired prefecture on the same map image MAP.

  In the command execution system according to the present embodiment, the user can set any desired object and associate commands with it arbitrarily. In the example shown in FIG. 5B, an image GP of a popular group is set as the object. For example, the part of the image GP occupied by each member is divided into an area, and a command is associated with each area. A command that opens each member's blog or home page may be associated with that member's area. By selecting the area of the desired member in the image GP, the browser (application software) executes a command to open that member's blog or home page.

  A group photograph of friends or family may also be used as the image GP, with a command that sends an e-mail to each member's address associated with that member's area. By selecting in the group photograph GP the member to whom an e-mail is to be sent, the user launches the e-mail software with that member's address automatically entered as the destination.

  In the example shown in FIG. 5C, an image STR of a constellation is set as the object. The object can thus be set arbitrarily according to the user's preference: by choosing an object that matches the tastes of the user who will use it, the association with the command can be made strongly memorable.

  In the above examples a rectangular area is shown as the panel PR, but the panel may have a non-rectangular shape. Also, although the examples place objects inside the panel PR, the outline of an object may instead coincide with the panel PR itself.

  In the command execution system according to the present embodiment, user-friendly commands can be associated with objects according to the user's preference. Even complex or hard-to-remember command processing can therefore be selected intuitively through its association with a desired object.

(Processing flowchart)
Next, processing flowcharts of the command execution system according to the present embodiment will be described.
FIGS. 6 to 8 are processing flowcharts.
FIG. 6 shows the object setting process, FIG. 7 shows the command setting process, and FIG. 8 shows the command execution process.

  First, the object setting process will be described with reference to FIG. 6. This process is performed by the object setting unit 712. Here, the case where the outline of the object coincides with the panel PR is taken as an example. First, as shown in step S101, an object is selected. Examples of objects include figures and photographs displayed on the display 45. The object is selected arbitrarily by the user.

  Next, as shown in step S102, the selection region of the object is calculated: a predetermined region is cut out of the object selected earlier. The cutting may be performed automatically, or a region arbitrarily designated by the user may be cut out.

  Next, as shown in step S103, the regions are arranged; the region cut out earlier can be placed at an arbitrary position according to the user's instruction. Then, as shown in step S104, the regions are adjusted: it is determined whether plural regions overlap, and if they do, their upper/lower relationship is set.

  Next, as shown in step S105, it is determined whether another object is to be added. If so, the processing of steps S101 to S104 is repeated; if not, the process proceeds to step S106 and the object settings are stored.
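
The loop of steps S101 to S106 can be summarized in code; passing the interactive steps in as plain functions is an assumption of this sketch rather than the patent's structure:

```python
def set_objects_flow(select_object, cut_region, place_region,
                     adjust_overlap, add_more) -> list:
    """Sketch of FIG. 6: collect object regions until the user stops adding."""
    regions = []
    while True:
        source = select_object()          # S101: user picks a figure or photo
        region = cut_region(source)       # S102: calculate the selection region
        region = place_region(region)     # S103: arrange the region
        adjust_overlap(regions, region)   # S104: set upper/lower relations
        regions.append(region)
        if not add_more():                # S105: another object to add?
            return regions                # S106: store the settings
```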

  Next, the command setting process will be described with reference to FIG. 7. This process is performed by the command setting unit 713. First, as shown in step S201, the previously set objects are displayed; for example, the display 45 shows the objects and their cut-out regions.

  Next, as shown in step S202, the region of the object to which a command is to be assigned is selected by the user's operation of the touch panel 52 or a mouse. Then, as shown in step S203, the command to be assigned to the selected region is accepted, establishing the correspondence between the object's region and the command.

  Next, as shown in step S204, it is determined whether another region remains to be assigned a command. If so, the processing of steps S201 to S203 is repeated; if not, the process proceeds to step S205, and the correspondences between regions and commands set so far are stored as a table TB like the one shown in FIG. 3.

  Next, the command execution process will be described with reference to FIG. 8. This process is performed by the command execution unit 73. First, as shown in step S301, the panel PR is displayed: an instruction from the user is received, and the designated panel PR and objects are displayed on the display 45.

  Next, as shown in step S302, it is determined whether an object (region) has been selected. When an object (region) is selected, the command associated with it is executed, as shown in step S303: based on the table TB shown in FIG. 3, the command associated with the selected object (region) is sent to the CPU 10. The command is thus executed by the selection of the object (region).

  Next, as shown in step S304, it is determined whether to end the process. If the process does not end, the processes of steps S301 to S303 are repeated.
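
Combining the earlier hit_test and execute_selected sketches, the loop of FIG. 8 might look as follows; get_selection and should_end stand in for the real event source and are assumptions of this sketch:

```python
def run_panel(panel: Panel, get_selection, should_end) -> None:
    """Sketch of FIG. 8: dispatch commands for selections until told to end."""
    # S301: the panel PR and its objects are displayed (drawing itself is
    # outside the scope of this sketch).
    while not should_end():                        # S304: end check
        point = get_selection()                    # S302: selection made?
        if point is None:
            continue
        object_id = hit_test(panel, *point)        # resolve the region
        if object_id is not None:
            execute_selected(object_id)            # S303: run the command
```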

  The processes in the flowcharts of FIGS. 6 to 8 are realized by a program executed by the CPU 10 of the computer PC. This program may be recorded on various recording media or distributed via a network.

Second Embodiment
Next, a position measurement device according to a second embodiment will be described.
FIG. 9 is a schematic view illustrating the position measurement device according to the present embodiment.
The position measurement device 1 according to the present embodiment is an arm-type three-dimensional measuring machine. It includes a measurement head 100 attached to the tip of an articulated arm 15, and a control device 110 that calculates and outputs the results detected from the measurement object OB by the measurement head 100. The control device 110 may be a computer PC including the display 45.

  A rod-shaped measurement probe 11 is provided at the tip of the measurement head 100, and a spherical probe 11a is provided at the tip of the measurement probe 11. The coordinates of a contact point can be determined by bringing the probe 11a into contact with a detection point on the measurement object OB.

  The measurement head 100 is also provided with a handle 12, which the user US grips when holding the measurement head 100. The handle 12 carries a button 12a; by pressing it, the user US can acquire the coordinates of the probe 11a at that moment or instruct the start of various processes.

  Measurement heads 100 come in contact and non-contact types. Contact-type measurement heads 100 include a touch-trigger probe, which acquires a position signal when the probe 11a of the measurement probe 11 contacts an object, and a hard probe, which acquires a position signal when the button 12a is pressed while the probe 11a is in contact with the object. A laser probe is an example of a non-contact measurement head 100: instead of the rod-shaped measurement probe 11, it has a laser light source and a light-receiving element, and when the button 12a is pressed, laser light is emitted from the light source onto the measurement object OB and the reflected light is photoelectrically converted by the light-receiving element to obtain a position signal. The measurement head 100 used in the present embodiment is a hard probe or a laser probe, which acquires a position signal when the button 12a is pressed.

  The articulated arm 15 is installed, for example, on a surface plate STG. It supports the measurement head 100 and allows the measurement head 100 to be moved to various positions and orientations as the user US holding the handle 12 moves it. The articulated arm 15 can rotate about, for example, six axes, and an angle sensor 151 is provided for each axis. When the user US holding the handle 12 moves the measurement head 100, the angle sensor 151 of each axis outputs that axis's angle.

  The control device 110 calculates and outputs the coordinates of the detection point designated by the measurement head 100 based on the angle values output from the angle sensors 151 and other data. In the present embodiment, coordinates in a three-dimensional X, Y, Z coordinate system are calculated. Specifically, when the probe 11a of the measurement head 100 contacts a detection position on the measurement object OB, the X, Y, Z coordinates of the contact point of the probe 11a are calculated from the angles output by the angle sensors 151, the lengths of the arm segments, and so on.
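
As a worked illustration of this calculation, the sketch below chains one homogeneous transform per joint: each angle-sensor reading contributes a rotation and each arm segment a fixed translation, and the probe-tip coordinates fall out of the product. The alternating rotation axes and the uniform link lengths are illustrative assumptions, not the actual geometry of the device.

```python
import numpy as np

def rot_z(theta):
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, 0, 0], [s, c, 0, 0], [0, 0, 1, 0], [0, 0, 0, 1]])

def rot_x(theta):
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[1, 0, 0, 0], [0, c, -s, 0], [0, s, c, 0], [0, 0, 0, 1]])

def trans_z(d):
    t = np.eye(4)
    t[2, 3] = d  # translate along the link by its length
    return t

def probe_tip(angles, link_lengths):
    """Probe-tip X, Y, Z from the six angle-sensor readings (radians)."""
    T = np.eye(4)
    for i, (theta, length) in enumerate(zip(angles, link_lengths)):
        # alternate the rotation axis joint by joint (illustrative geometry)
        T = T @ (rot_z(theta) if i % 2 == 0 else rot_x(theta)) @ trans_z(length)
    return (T @ np.array([0.0, 0.0, 0.0, 1.0]))[:3]

print(probe_tip(angles=[0.1] * 6, link_lengths=[0.3] * 6))
```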

  In the position measurement device 1 of the present embodiment, the control device 110 includes the command execution system according to the first embodiment. Processing executed by commands, such as control of the position measurement device 1, arithmetic processing on the measurement results, and various file operations, is thus performed through the command execution system of the first embodiment.

FIGS. 10A to 10C are schematic views showing display examples of the display of the position measurement device.
FIGS. 10A to 10C show display examples of measurement application software executed by the control device 110 of the position measurement device 1; each shows a different piece of application software.

  Even when different application software is running, a common panel PR (and its objects) is displayed, and the commands assigned to the objects of the panel PR can be kept common across the application software. For example, when an image MAP of a map of Japan is set as the object, the image MAP is displayed regardless of the application software, and selecting a desired area of the image MAP (for example, a prefecture) executes the command associated with that area.

  For example, when a "print" command is assigned to "Hokkaido", "print" can be executed by selecting "Hokkaido" regardless of which application software is being executed.
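
A self-contained sketch of this behaviour, assuming a registry of applications of which exactly one is active; the application names and handlers are invented for illustration:

```python
from typing import Callable, Dict

# Each application maps common command names to its own handlers.
applications: Dict[str, Dict[str, Callable[[], None]]] = {
    "part-program editor": {"print": lambda: print("printing part program")},
    "measurement report":  {"print": lambda: print("printing report")},
}
active_application = "measurement report"

def run_common_command(command_name: str) -> None:
    """Route a panel command such as "print" to the active application."""
    handler = applications[active_application].get(command_name)
    if handler is not None:
        handler()

run_common_command("print")  # e.g. triggered by selecting "Hokkaido"
```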

  Further, at least a part of the series of processes performed by the position measurement device 1, from calculation of the detection results to their output, may be associated with an object as a command. Even complicated processing of the position measurement device 1 can then be executed merely by selecting the object. For example, by associating the user US's favorite image with a command, the user US can select the command intuitively, improving the user interface.

  As described above, the command execution system and the position measurement device 1 according to the present embodiments make it possible to assign commands so that functions match the user's sense and are easy to associate, and to execute processing accordingly.

  Although embodiments have been described above, the present invention is not limited to these examples. For example, in the command execution system the panel PR is not limited to a planar area; it may be a three-dimensional area, or a planar or three-dimensional area in a virtual reality space. Likewise, the object is not limited to a planar image or photograph and may be a hologram, a three-dimensional image in a virtual reality space, or the like. Embodiments in which those skilled in the art appropriately add, delete, or redesign components of the above embodiments, or appropriately combine features of the embodiments, are also included in the scope of the present invention as long as they have the gist of the present invention.

  In addition to arm-type three-dimensional measuring machines, the present invention can suitably be used for other measurement devices, such as three-dimensional measuring machines and image measuring machines that use a wireless measurement head 100 without an arm.

1: Position measurement device
10: CPU
11: Measurement probe
11a: Measuring element (probe)
12: Handle
12a: Button
15: Articulated arm
20: Storage device
30: Arithmetic unit
40: Display control unit
45: Display
50: Input/output control unit
51: Keyboard
52: Touch panel
60: External device control unit
71: Panel generation unit
72: Panel operation unit
73: Command execution unit
100: Measurement head
110: Control device
151: Angle sensor
711: Allocation area setting unit
712: Object setting unit
713: Command setting unit
721: Selection receiving unit
722: Display moving unit
DR: Display area
GP, IMG-1, IMG-2, IMG-3, MAP, STR: Images
OB: Measurement object
PH-1, PH-2, PH-3: Photographs
PR: Panel
STG: Surface plate
TB: Table
US: User

Claims (11)

  1. A command execution system that receives a command selected by a user and causes a computing unit of a computer to execute the command, the system comprising:
    an allocation area setting unit configured to set an allocation area in an information display area of the computer according to an instruction from the user;
    an object setting unit configured to set objects in the allocation area according to an instruction from the user;
    a command setting unit configured to set an association between an object and a command;
    a selection receiving unit configured to receive the user's selection of an object; and
    a command execution unit configured to cause the computing unit to execute the command associated with the object received by the selection receiving unit,
    wherein the object setting unit sets a plurality of the objects and sets upper/lower overlap information for the plurality of objects in a specific direction in the allocation area, and
    when the user designates an area where the plurality of objects overlap, the selection receiving unit accepts the selection of the upper object based on the overlap information set by the object setting unit.
  2. The command execution system according to claim 1, wherein the object is displayed three-dimensionally in the allocation area.
  3. A command execution system that receives a command selected by a user and causes a computing unit of a computer to execute the command, the system comprising:
    an allocation area setting unit configured to set an allocation area in an information display area of the computer according to an instruction from the user;
    an object setting unit configured to set objects in the allocation area according to an instruction from the user;
    a command setting unit configured to set an association between an object and a command;
    a selection receiving unit configured to receive the user's selection of an object; and
    a command execution unit configured to cause the computing unit to execute the command associated with the object received by the selection receiving unit,
    wherein the object setting unit sets a plurality of the objects and sets upper/lower overlap information for the plurality of objects in a specific direction in the allocation area,
    when the user designates an area where the plurality of objects overlap, the selection receiving unit accepts the selection of the upper object based on the overlap information set by the object setting unit, and
    the object is displayed three-dimensionally in the display area.
  4. The command execution system according to claim 3, wherein the object is displayed as a hologram in the information display area of the computer.
  5. The command execution system according to claim 3, wherein the object is displayed as a three-dimensional image in a virtual reality space in the information display area of the computer.
  6. The command execution system according to any one of claims 1 to 5, wherein the command setting unit associates a plurality of applications with one command, and the command execution unit causes a currently active application among the plurality of applications to execute processing according to the command.
  7. The command execution system according to any one of claims 1 to 6, further comprising a display moving unit configured to move the position of the allocation area within the display area according to an instruction from the user, wherein, when the display moving unit moves the position of the allocation area, the object setting unit moves the position of the object in accordance with the movement of the allocation area.
  8. The command execution system according to any one of claims 1 to 7, wherein the command setting unit associates a plurality of commands, together with a processing order of the plurality of commands, with one object.
  9. The command execution system according to any one of claims 1 to 8, wherein the allocation area is an output area of one image file, and the object is a specific pattern in the image file.
  10. A position measurement device comprising:
    a measurement head configured to detect the position of a measurement object in a predetermined coordinate system; and
    a control device configured to calculate and output detection results of the measurement head,
    wherein the control device includes the command execution system according to any one of claims 1 to 9.
  11. The position measurement device according to claim 10, wherein the command setting unit associates, as the command, at least a part of a series of processes from calculation of the detection results to their output with an object.
JP2015049493A 2015-03-12 2015-03-12 Command execution system and position measurement device Active JP6548923B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2015049493A JP6548923B2 (en) 2015-03-12 2015-03-12 Command execution system and position measurement device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP2015049493A JP6548923B2 (en) 2015-03-12 2015-03-12 Command execution system and position measurement device

Publications (2)

Publication Number Publication Date
JP2016170596A JP2016170596A (en) 2016-09-23
JP6548923B2 true JP6548923B2 (en) 2019-07-24

Family

ID=56983865

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2015049493A Active JP6548923B2 (en) 2015-03-12 2015-03-12 Command execution system and position measurement device

Country Status (1)

Country Link
JP (1) JP6548923B2 (en)

Family Cites Families (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003067048A (en) * 1998-04-08 2003-03-07 Hitachi Ltd Information processor
JPH07151512A (en) * 1993-10-05 1995-06-16 Mitsutoyo Corp Operating device of three dimensional measuring machine
JPH09325845A (en) * 1996-06-05 1997-12-16 Alpine Electron Inc Operation instruction method using touch panel
JP2005108041A (en) * 2003-09-30 2005-04-21 Toshiba Corp Method for displaying menu screen on portable terminal and portable terminal
JP4317774B2 (en) * 2004-02-26 2009-08-19 任天堂株式会社 Game device and game program using touch panel
JPWO2008010276A1 (en) * 2006-07-20 2009-12-17 株式会社ナビタイムジャパン Map display system, map display device, map display method, and map distribution server
KR101382504B1 (en) * 2007-05-21 2014-04-07 삼성전자주식회사 Apparatus and method for making macro
JP4024835B2 (en) * 2007-06-25 2007-12-19 富士通株式会社 Icon management method, icon usage method and icon usage program
DE112011100302T5 (en) * 2010-01-20 2012-10-25 Faro Technologies, Inc. Portable articulated arm coordinate measuring machine with multiple communication channels
JP2013092988A (en) * 2011-10-27 2013-05-16 Kyocera Corp Device, method, and program
JP6184053B2 (en) * 2012-01-18 2017-08-23 キヤノン株式会社 Information terminal, display control method, and program
WO2013137191A1 (en) * 2012-03-12 2013-09-19 株式会社エヌ・ティ・ティ・ドコモ Remote control system, remote control method, communication device and program
JP6082554B2 (en) * 2012-09-26 2017-02-15 京セラ株式会社 Mobile terminal device, program, and control method for mobile terminal device
JP5905417B2 (en) * 2013-07-29 2016-04-20 京セラ株式会社 Mobile terminal and display control method

Also Published As

Publication number Publication date
JP2016170596A (en) 2016-09-23

Similar Documents

Publication Publication Date Title
KR101541928B1 (en) visual feedback display
TWI438661B (en) User interface device and method for in response to an input event
CN101970184B (en) Operation teaching system and operation teaching method
US20100313143A1 (en) Method for transmitting content with intuitively displaying content transmission direction and device using the same
US7539563B2 (en) System and method for identifying objects in a space
KR20100041006A (en) A user interface controlling method using three dimension multi-touch
US20130198653A1 (en) Method of displaying input during a collaboration session and interactive board employing same
US20150128077A1 (en) Method and apparatus for editing touch display
US9075444B2 (en) Information input apparatus, information input method, and computer program
RU2421776C2 (en) Method of controllig position of control point in command region and device control method
KR20040069984A (en) Utility object for specialized data entry
US20120089938A1 (en) Information Processing Apparatus, Information Processing Method, and Program
DE102014006318A1 (en) Physical object detection and touchscreen interaction
JP2014512530A (en) Coordinate positioning device
US9335860B2 (en) Information processing apparatus and information processing system
JP2009282857A (en) Portable terminal and area specifying processing performing method
US20140053102A1 (en) Terminal and method for providing user interface
US9323422B2 (en) Spatially-aware projection pen display
CN102591564B (en) Information processing apparatus and information processing method
JP4389090B2 (en) Information display device
US20110216091A1 (en) Bimanual interactions on digital paper using a pen and a spatially-aware mobile projector
US8799821B1 (en) Method and apparatus for user inputs for three-dimensional animation
JP2007116270A (en) Terminal and apparatus control system
WO2006106173A1 (en) A method and a device for visual management of metadata
BRPI1100175A2 (en) information processing apparatus and method, and, program

Legal Events

Date Code Title Description
A621 Written request for application examination

Free format text: JAPANESE INTERMEDIATE CODE: A621

Effective date: 20180206

A977 Report on retrieval

Free format text: JAPANESE INTERMEDIATE CODE: A971007

Effective date: 20181019

A131 Notification of reasons for refusal

Free format text: JAPANESE INTERMEDIATE CODE: A131

Effective date: 20181030

A521 Written amendment

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20181214

A02 Decision of refusal

Free format text: JAPANESE INTERMEDIATE CODE: A02

Effective date: 20190312

A521 Written amendment

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20190528

A911 Transfer of reconsideration by examiner before appeal (zenchi)

Free format text: JAPANESE INTERMEDIATE CODE: A911

Effective date: 20190604

TRDD Decision of grant or rejection written
A01 Written decision to grant a patent or to grant a registration (utility model)

Free format text: JAPANESE INTERMEDIATE CODE: A01

Effective date: 20190625

A61 First payment of annual fees (during grant procedure)

Free format text: JAPANESE INTERMEDIATE CODE: A61

Effective date: 20190626

R150 Certificate of patent or registration of utility model

Ref document number: 6548923

Country of ref document: JP

Free format text: JAPANESE INTERMEDIATE CODE: R150