WO2021157379A1 - Information processing device, information processing method, and program - Google Patents

Information processing device, information processing method, and program

Info

Publication number
WO2021157379A1
WO2021157379A1 (PCT/JP2021/002173)
Authority
WO
WIPO (PCT)
Prior art keywords
user
work area
area
work
information processing
Prior art date
Application number
PCT/JP2021/002173
Other languages
English (en)
Japanese (ja)
Inventor
保乃花 尾崎
健太郎 井田
青木 悠
Original Assignee
ソニーグループ株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by ソニーグループ株式会社
Publication of WO2021157379A1

Classifications

    • G05B 19/418: Total factory control, i.e. centrally controlling a plurality of machines, e.g. direct or distributed numerical control [DNC], flexible manufacturing systems [FMS], integrated manufacturing systems [IMS] or computer integrated manufacturing [CIM]
    • G06F 13/00: Interconnection of, or transfer of information or other signals between, memories, input/output devices or central processing units
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/0346: Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks, with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
    • G06F 3/0481: Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/0488: Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G09G 5/02: Control arrangements or circuits for visual indicators characterised by the way in which colour is displayed
    • G09G 5/08: Cursor circuits
    • G09G 5/36: Control arrangements or circuits for visual indicators characterised by the display of a graphic pattern, e.g. using an all-points-addressable [APA] memory
    • G09G 5/38: Control arrangements or circuits for visual indicators characterised by the display of a graphic pattern, with means for controlling the display position
    • Y02P 90/02: Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]

Definitions

  • The present technology relates to an information processing device, an information processing method, and a program, and is particularly suitable for use when, for example, a plurality of users in remote locations collaborate in a virtual space.
  • Patent Document 1 proposes collaborative work using a pen.
  • The present technology has been made in view of such a situation, and makes it possible to avoid collisions of actions and to perform work when collaborating with a worker in a remote place.
  • The information processing device of one aspect of the present technology includes a display control unit that displays, in a first work area, work content performed by a first user on the first work area and information on the behavior of a second user, located in an area different from the first user, at times when the second user is not performing work on a second work area corresponding to the first work area.
  • In the information processing method of one aspect of the present technology, an information processing apparatus displays, in the first work area, work content performed by a first user on a first work area and information on the behavior of a second user, located in a different area, at times when the second user is not performing work on a second work area corresponding to the first work area.
  • The program of one aspect of the present technology causes a computer to execute processing including a step of displaying, in the first work area, work content performed by a first user on a first work area and information on the behavior of a second user, located in a different area, at times when the second user is not performing work on a second work area corresponding to the first work area.
  • In the information processing device, the information processing method, and the program of one aspect of the present technology, work content performed by the first user on the first work area and information on the behavior of the second user, located in a different area, at times when the second user is not working on the second work area corresponding to the first work area, are displayed in the first work area.
  • the information processing device may be an independent device or an internal block constituting one device.
  • the program can be provided by transmitting via a transmission medium or by recording on a recording medium.
  • FIG. 1 is a diagram showing a configuration example of an information processing system. FIG. 2 is a diagram showing the configuration of an embodiment of an information processing device to which the present technology is applied. FIG. 3 is a diagram for explaining a working state. FIG. 4 is a diagram for explaining the pen tip state mark. FIG. 5 is a diagram for explaining the pen tip shadow mark. FIG. 6 is a diagram for explaining the person mark. FIG. 7 is a diagram for explaining the processing of the information processing device. FIG. 8 is a flowchart for explaining the processing of the pen tip recognition unit. FIG. 9 is a diagram for explaining behavior data. FIGS. 10 and 11 are diagrams for explaining the setting of the predicted drawing area. A further figure is a diagram for explaining a recording medium.
  • FIG. 1 is a diagram showing a configuration example of an information processing system to which the present technology is applied.
  • the information processing system 10 is configured so that the information processing device 21A and the information processing device 21B can exchange data with each other via the network 22.
  • The information processing device 21A is a device installed in room A and used by user A, and the information processing device 21B is a device installed in room B and used by user B. Room A and room B may be remote locations or adjacent rooms; the following description assumes that they are remote locations, or at least that user A and user B are positioned so that they cannot see each other's actions.
  • The information processing system 10 is used, for example, when user A and user B work together on a virtual work area. For example, when user A is a teacher and user B is a student, it is used when user A writes a problem in the work area and user B writes the solution to that problem in the work area. It is also used when user A and user B draw a picture in the work area.
  • Since user A and user B are in different spaces and cannot recognize each other's actions, when user A is about to write, user B cannot recognize that action.
  • As a result, for example, one user may write in the very area where the other user is about to write.
  • To prevent such collisions, information on the behavior of user B, for example information on the area where user B is trying to work, is presented so that user A can recognize the behavior of user B. By recognizing the presented behavior of user B, user A can take a workaround so that their actions do not collide.
  • Likewise, user B can recognize the behavior of user A and take a workaround so that their actions do not collide.
  • FIG. 2 is a diagram showing a configuration example of the information processing device 21.
  • The information processing device 21 shown in FIG. 2 includes a processing unit 31, a sensor unit 32, an operation unit 33, and a display unit 34.
  • The information processing device 21 is composed of the processing unit 31, which performs recognition and generates display data based on data received from a pen-type device (the digital pen 111 in FIG. 2) and from the sensor unit 32, the sensor unit 32, which senses the user, the operation unit 33, which accepts input from the user, and the display unit 34, which presents information to the user.
  • The processing unit 31 includes an I/F unit (interface unit) 51, a pen tip recognition unit 52, a pen recognition unit 53, an environment recognition unit 54, a user recognition unit 55, a map management unit 56, a data processing unit 57, a timer 58, a storage unit 59, and a communication unit 60.
  • the data processing unit 57 includes a drawing data generation unit 71, an action data generation unit 72, and a display data generation unit 73.
  • the sensor unit 32 includes a motion sensor 91, an acceleration sensor 92, a depth sensor 93, a microphone 94, a camera 95, a gyro sensor 96, and a geomagnetic sensor 97.
  • the sensor unit 32 may include one or more of these sensors, and may not be configured to include all the sensors of the motion sensor 91 to the geomagnetic sensor 97. Further, the configuration may include sensors other than the sensors listed here.
  • the operation unit 33 includes a digital pen 111, a touch panel 112, and a keyboard 113.
  • the operation unit 33 may include all of the digital pen 111, the touch panel 112, and the keyboard 113, or may include one or more operation units among them. Further, the configuration may include other operation units such as a mouse.
  • The description will be continued using an example in which the worker uses the digital pen 111 as an operating body, but an operating body other than the digital pen 111, the touch panel 112, and the keyboard 113 mentioned here may be used.
  • For example, the worker's finger or a pointer can be used as an operating body.
  • The operating body may be anything capable of designating a specific area of the work area.
  • the display unit 34 includes a projector 131, a display 132, a speaker 133, and a unidirectional speaker 134.
  • the display unit 34 may be a projector 131 or a display 132, and may not include both the projector 131 and the display 132.
  • the speaker 133 or the unidirectional speaker 134 can be used in combination with the projector 131 or the display 132 to output the sound. When the audio is not output, the speaker 133 and the unidirectional speaker 134 may not be provided.
  • the I / F unit 51 of the processing unit 31 exchanges data with the sensor unit 32, the operation unit 33, the display unit 34, the data processing unit 57, and the communication unit 60.
  • the pen tip recognition unit 52 recognizes the position of the pen tip of the digital pen 111 by using the sensor information obtained from the sensor unit 32.
  • the pen recognition unit 53 recognizes the position and orientation of the digital pen 111.
  • the environment recognition unit 54 recognizes the size, brightness, loudness of environmental sound, etc. of the work space (for example, room A or room B).
  • the user recognition unit 55 estimates the position and orientation of the user.
  • the map management unit 56 recognizes the drawing target surface (work area).
  • the drawing data generation unit 71 of the data processing unit 57 calculates the trajectory of the pen at the time of drawing from the movement of the digital pen 111 with respect to the input surface (work area), and creates data.
  • the behavior data generation unit 72 estimates the behavior of the user using the information processing device 21, for example, in the case of the information processing device 21A, the behavior of the user A is estimated.
  • The display data generation unit 73 generates data that allows user A to recognize behavior estimated on the side of the information processing device 21B used by another user, for example the behavior of user B.
  • the action data generation unit 72 estimates the user's behavior, for example, information indicating whether or not the user is trying to work, an area where the user is trying to work, and the like from the state of the digital pen 111 and the user.
  • Data related to the behavior of user B (referred to as behavior data) generated by the behavior data generation unit 72B of the information processing device 21B is supplied to the information processing device 21A on the user A side, where it is supplied to the display data generation unit 73A.
  • the display data generation unit 73A uses the behavior data of the user B supplied from the information processing device 21B to generate drawing data representing the behavior of the user B, for example, a work area in which the user B is trying to work.
  • The display data generation unit 73 generates display data to be displayed on the display unit 34 based on the drawing data generated by the drawing data generation unit 71 and the behavior data generated by the behavior data generation unit 72 of the information processing device 21 on the other user's side and received via the communication unit 60.
  • The data processed by the data processing unit 57 is supplied to the display unit 34 via the I/F unit 51, or is supplied to another information processing device 21 via the communication unit 60, as necessary.
  • the timer 58 generates time information as needed. For example, when the content of the work performed by the user using the digital pen 111 is stored in the storage unit 59 with the passage of time, the timer 58 generates time information, associates it with the work content, and stores it in the storage unit 59.
  • the time information from the timer 58 may be the time at the time of work or the elapsed time from the time at which the work is started.
  • the storage unit 59 stores various data as needed. As described above, for example, data in which time information and work contents are associated with each other is stored.
  • The sensor unit 32 includes sensors for detecting the coordinates and orientation of the digital pen 111 and of the user, as well as information about the surroundings.
  • For example, the coordinates and orientation of the digital pen 111 can be obtained by measuring the distance from a predetermined position to the digital pen 111 with the depth sensor 93 and using the measurement result.
  • the acceleration sensor 92 and the gyro sensor 96 may be incorporated in the digital pen 111 so that the digital pen 111 detects its own coordinates and orientation.
  • Similarly, the coordinates and orientation of the user can be obtained by measuring the distance from a predetermined position to the user with the depth sensor 93 and using the measurement result. The position and orientation of the user can also be detected by analyzing the image captured by the camera 95.
  • the user may wear a wearable device, and the coordinates and orientation of the user may be detected from the data obtained by the wearable device.
  • When the coordinates and orientation of the user are detected by a wearable device, more detailed information such as the line of sight can also be detected, and such detailed detection results can likewise be used in the processing described later.
  • the coordinates may be, for example, three-dimensional coordinates having a predetermined position in the room A as the origin, or may be three-dimensional coordinates having the predetermined position on the work surface as the origin.
  • the display unit 34 presents the drawing data and action data generated by the processing unit 31 to the user.
  • the operation unit 33 receives input information from the digital pen 111 and operation information from the touch panel 112 and the keyboard 113.
  • FIG. 3 shows a state in which processing is being performed by the information processing device 21A on the user A side.
  • the work area 201A can be, for example, an area projected on the wall surface by the projector 131A. Further, the work area 201A may be the display 132A. Here, the description will be continued by taking as an example the case where the work area 201A is an area projected on the wall surface by the projector 131A and is larger than the user A.
  • the state shown in FIG. 3 is a state in which the user A draws the line 221 with the digital pen 111A in the work area 201A.
  • the user B is working in the work area 201B provided by the information processing device 21B by using the information processing device 21B located at a different location from the user A.
  • the drawing data drawn by the user B in the work area 201B using the digital pen 111B is transmitted from the information processing device 21B to the information processing device 21A via the network 22 (FIG. 1).
  • When the information processing device 21A receives the drawing data of the line 222 drawn with the digital pen 111B from the information processing device 21B, the information processing device 21A displays the line 222 in the work area 201A based on the drawing data.
  • the line 222 is displayed at the position and size on the work area 201A on the user A side corresponding to the position and size drawn by the user B on the work area 201B on the user B side.
  • When the digital pen 111B is located away from the work area 201B, in other words, when it is not in the drawing state, the information processing device 21B on the user B side generates position information of the digital pen 111B and transmits it to the information processing device 21A.
  • Based on this position information, the display data generation unit 73A of the information processing device 21A generates display data for displaying, for example, the pen tip state mark 223 shown in FIG. 3.
  • the pen tip status mark 223 is a mark that pseudo-displays the position of the pen tip of the digital pen 111B held by the user B on the work area 201A of the user A.
  • The pen tip state mark 223 is a mark that allows user A to recognize in which area of the work area the pen tip of the digital pen 111B held by user B is located while user B is not drawing.
  • User A can grasp the position of user B by referring to the pen tip status mark 223. By referring to the pen tip state mark 223, the user A can roughly grasp the area where the user B is trying to work.
  • the predicted drawing area 224 and the pen tip shadow mark 225 are displayed.
  • the predicted drawing area 224 represents an area estimated as an area in which user B is trying to perform work.
  • the pen tip shadow mark 225 is displayed substantially in the center of the predicted drawing area 224.
  • The pen tip shadow mark 225 is displayed at the position on the work area 201A corresponding to the coordinates where a line extending from the pen tip of user B's digital pen 111B, in the direction in which the pen tip is facing, intersects the work area 201B.
  • Drawing data related to the predicted drawing area 224 and the pen tip shadow mark 225 is also generated on the information processing device 21B side and transmitted from the information processing device 21B to the information processing device 21A.
  • The display data generation unit 73A of the information processing device 21A generates display data for displaying, for example, the predicted drawing area 224 and the pen tip shadow mark 225 shown in FIG. 3.
  • By referring to the predicted drawing area 224, user A can grasp the area in which user B is about to draw and can therefore prevent a work conflict from occurring.
  • All of the pen tip state mark 223, the predicted drawing area 224, and the pen tip shadow mark 225 may be displayed, or only one or two of these marks (areas) may be displayed.
  • From the pen tip state mark 223 and the pen tip shadow mark 225, user A can recognize the approximate distance between user B's digital pen 111B and the work area 201B.
  • When the pen tip state mark 223 and the pen tip shadow mark 225 are far apart, the digital pen 111B is away from the work area 201B.
  • When the pen tip state mark 223 and the pen tip shadow mark 225 are close to each other, the digital pen 111B and the work area 201B are close to each other.
  • Since the pen tip state mark 223 represents the pen tip of the digital pen 111B held by user B, when the pen tip of the digital pen 111B moves, for example in a direction approaching the work area 201B, the pen tip state mark 223 presented to user A also moves in accordance with that movement. Therefore, when the pen tip state mark 223 approaches the pen tip shadow mark 225, user B is bringing the digital pen 111B closer to the work area 201B.
  • In this way, the distance between the digital pen 111 and the work area 201 can be recognized.
  • This distance is an index indicating whether or not the user B is trying to draw (whether or not there is an intention to start drawing).
  • When user B is trying to draw, user B brings the digital pen 111B closer to the work area 201B, and when user B does not intend to draw, user B is considered to keep the digital pen 111B away from the work area 201B. Therefore, by displaying the pen tip state mark 223 and the pen tip shadow mark 225, it is possible to indicate to user A whether or not user B is about to draw.
  • When the digital pen 111B comes into contact with the work area 201B, that is, when drawing has started, the displayed pen tip state mark 223 and pen tip shadow mark 225 are erased, and the drawn trajectory (the line 222, etc.) is displayed instead.
  • The predicted drawing area 224 represents, for example, an area of a predetermined range centered on the pen tip shadow mark 225 that is set as an area in which user B may draw.
  • The size of the predicted drawing area 224 may be a preset size (fixed value), or a size corresponding to the distance between the pen tip state mark 223 and the pen tip shadow mark 225 (variable value).
  • When the pen tip state mark 223 and the pen tip shadow mark 225 are close to each other, in other words, when user B has brought the digital pen 111B close to the work area 201B and is about to draw, a small area may be displayed as the predicted drawing area 224.
  • When the pen tip state mark 223 and the pen tip shadow mark 225 are far apart, in other words, when user B is not trying to draw and the digital pen 111B has not been brought close to the work area 201B, a large area may be displayed as the predicted drawing area 224.
  • When the digital pen 111B is close to the work area 201B, the range in which user B will draw can be predicted with relatively high accuracy, so the predicted drawing area 224 can be made small.
  • In this way, the size of the predicted drawing area 224 may express whether or not user B intends to perform work.
  • the intention of user B (cooperative worker) to start work can be expressed by the distance between the pen tip state mark 223 and the pen tip shadow mark 225. Further, the intention of the user B (cooperative worker) to start the work can be expressed by the size of the predicted drawing area 224.
  • these two displays may be displayed at the same time, or only one of them may be displayed.
  • Alternatively, the size of the predicted drawing area 224 may be kept constant, and a gauge may be displayed at the edge of the work area 201A to indicate the collaborating worker's intention to start work.
  • the gauge corresponds to, for example, the distance between the digital pen 111 and the work area 201, and corresponds to the distance between the pen tip state mark 223 and the pen tip shadow mark 225.
  • the display corresponding to the distance between the pen tip state mark 223 and the pen tip shadow mark 225 may be expressed by a gauge.
  • such a gauge and the pen tip state mark 223 may be displayed, or the gauge and the pen tip shadow mark 225 may be displayed. Further, the gauge may be displayed together with the pen tip state mark 223, the predicted drawing area 224, and the pen tip shadow mark 225.
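  • As an illustrative sketch only (not part of the original disclosure), the relationship described above between the pen-to-surface distance, a gauge value, and the size of the predicted drawing area 224 could be expressed as follows; the function names, value ranges, and linear scaling are assumptions.

```python
def drawing_intent_gauge(pen_to_surface_mm: float, max_range_mm: float = 300.0) -> float:
    """Return a 0..1 gauge value: 1.0 when the pen touches the surface, 0.0 at max range."""
    clamped = min(max(pen_to_surface_mm, 0.0), max_range_mm)
    return 1.0 - clamped / max_range_mm


def predicted_area_radius(pen_to_surface_mm: float,
                          min_radius_mm: float = 30.0,
                          max_radius_mm: float = 200.0) -> float:
    """Shrink the predicted drawing area as the pen approaches the work area,
    reflecting the higher-accuracy prediction described for a nearby pen."""
    gauge = drawing_intent_gauge(pen_to_surface_mm)
    return max_radius_mm - gauge * (max_radius_mm - min_radius_mm)


# Example: a pen 50 mm from the work area yields a high gauge and a small area.
print(drawing_intent_gauge(50.0))    # ~0.83
print(predicted_area_radius(50.0))   # ~58.3 mm
```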
  • The working state shown in FIG. 3 is an example. The description with reference to FIG. 3 has taken the work area 201A presented by the information processing device 21A on the user A side as an example, but basically the same working state is presented in the work area 201B presented by the information processing device 21B on the user B side. On the user B side, the position and direction of the pen tip of the digital pen 111A held by user A, the predicted drawing area, and the like are displayed.
  • FIG. 4 is a diagram showing an example of the pen tip state mark 223.
  • The pen tip state mark 223 shown in A of FIG. 4 has a comet-like shape. When it is divided into a core part and a tail part like a comet, the core part (the circular part toward the lower left in the figure) represents the pen tip, and the tail part is shaped so that the direction in which the pen tip is moving can be grasped intuitively.
  • the pen tip state mark 223 shown in B of FIG. 4 is an arrow.
  • The pen tip state mark 223 shown in C of FIG. 4 has a triangular shape. Both are shapes suitable for intuitively recognizing the position of the pen tip and the direction in which the pen tip is facing, and such shapes can be used as the pen tip state mark 223.
  • the pen tip state mark 223 shown in FIG. 4 is an example and does not indicate a limitation.
  • the pen tip state mark 223 may have a shape that imitates the digital pen 111.
  • FIG. 5 is a diagram showing an example of the pen tip shadow mark 225.
  • the pen tip shadow mark 225 shown in A of FIG. 5 has an elliptical shape.
  • an elliptical shape is shown as the pen tip shadow mark 225, but other shapes such as a circle or a polygon may be used.
  • When the elliptical pen tip shadow mark 225 shown in A of FIG. 5 is used, a display indicating the distance between the digital pen 111 and the work area 201 may be given by the shade of the ellipse's color, the transparency of the color, or the size of the ellipse, as shown in B to D of FIG. 5.
  • the pen tip shadow mark 225 shown in FIG. 5B is an elliptical mark, and is an example of a case where the distance between the digital pen 111 and the work area 201 is represented by the shade of the mark. The darker the color density in the ellipse, the closer the distance between the digital pen 111 and the work area 201.
  • the pen tip shadow mark 225 shown in FIG. 5C is an elliptical mark, and is an example of a case where the distance between the digital pen 111 and the work area 201 is represented by its transparency. The lower the transparency of the color in the ellipse, the closer the distance between the digital pen 111 and the work area 201.
  • the pen tip shadow mark 225 shown in D of FIG. 5 is an elliptical mark, and is an example of a case where the distance between the digital pen 111 and the work area 201 is represented by its size. The larger the size of the ellipse, the closer the distance between the digital pen 111 and the work area 201.
  • the pen tip shadow mark 225 may be used as a display indicating the distance between the digital pen 111 and the work area 201.
  • the predicted drawing area 224 may be represented by an elliptical shape as shown in FIG. 5A. Further, although an elliptical shape is shown as the predicted drawing area 224, other shapes such as a circle or a polygon may be used.
  • Likewise, when the elliptical predicted drawing area 224 shown in A of FIG. 5 is used, a display indicating the distance between the digital pen 111 and the work area 201 may be given by the shade of the ellipse's color, the transparency of the color, or the size of the ellipse, as shown in B to D of FIG. 5.
  • the predicted drawing area 224 shown in FIG. 5B is an elliptical mark, and is an example of a case where the distance between the digital pen 111 and the work area 201 is represented by the shade of the mark. The darker the color density in the ellipse, the closer the distance between the digital pen 111 and the work area 201.
  • the predicted drawing area 224 shown in FIG. 5C is an elliptical mark, and is an example of a case where the distance between the digital pen 111 and the work area 201 is represented by its transparency. The lower the transparency of the color in the ellipse, the closer the distance between the digital pen 111 and the work area 201.
  • the predicted drawing area 224 shown in D of FIG. 5 is an elliptical mark, and is an example of a case where the distance between the digital pen 111 and the work area 201 is represented by its size. The larger the size of the ellipse, the closer the distance between the digital pen 111 and the work area 201.
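  • A minimal sketch of the three encodings described for FIG. 5, mapping the distance between the digital pen 111 and the work area 201 to color density, transparency, and ellipse size; the value ranges and field names are assumptions, not taken from the patent.

```python
def shadow_mark_style(pen_to_surface_mm: float, max_range_mm: float = 300.0) -> dict:
    """Map the pen-to-surface distance to the encodings of B to D of FIG. 5:
    darker color, lower transparency, and larger size as the pen gets closer."""
    closeness = 1.0 - min(max(pen_to_surface_mm, 0.0), max_range_mm) / max_range_mm
    return {
        "color_density": closeness,              # 0.0 (far, pale) .. 1.0 (near, dark)
        "alpha": 0.2 + 0.8 * closeness,          # less transparent when near
        "radius_px": 20 + int(60 * closeness),   # larger ellipse when near
    }


print(shadow_mark_style(30.0))   # near the work area: dark, opaque, large
print(shadow_mark_style(250.0))  # far from the work area: pale, transparent, small
```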
  • the state of the user B can be recognized by the user A by the pen tip state mark 223, the predicted drawing area 224, and the pen tip shadow mark 225. That is, the user A can know the state of the user B who is not present by looking at the pen tip state mark 223, the predicted drawing area 224, and the pen tip shadow mark 225. Therefore, the user A can take a workaround such as drawing outside the prediction drawing area 224 so that the work conflict with the user B does not occur.
  • Since user A can be made to recognize the state of user B even if not all of the pen tip state mark 223, the predicted drawing area 224, and the pen tip shadow mark 225 are displayed, only one or two of these marks (areas) may be displayed.
  • A mark that imitates user B may be displayed instead of the pen tip state mark 223, the predicted drawing area 224, and the pen tip shadow mark 225, or together with them.
  • the mark shown in FIG. 6 is a person mark 226 that imitates a human figure.
  • the person mark 226 is a mark indicating the position where the user B is present. By looking at the person mark 226, the user A can recognize that the user B is in the (area) at that position.
  • the person mark 226 shown in A of FIG. 6 is a transparent mark, and is a mark having a shape that imitates a person.
  • The shape that imitates a person may reproduce the shape of user B as it is, or may be a simplified shape that can be recognized as a person, as shown in A of FIG. 6. An image obtained by photographing user B may also be used as it is.
  • the person mark 226 shown in B of FIG. 6 is a shadow mark, which is a mark having a shape that imitates the shadow of a person.
  • The shape may reproduce the shadow of user B as it is, or may be a simplified shape that can be recognized as the shadow of a person, as shown in B of FIG. 6.
  • the person mark 226 shown in C of FIG. 6 is a bone mark, which is a mark having a shape that imitates a human bone.
  • The shape may reproduce the skeletal proportions of user B (user B's height, shoulder width, and so on), or may be a simplified shape that can be recognized as a human skeleton, as shown in C of FIG. 6.
  • When the position of user B is represented by displaying the person mark 226, the mark may be displayed large when user B is close to the work area 201B and small when user B is away from it. That is, the distance between user B and the work area 201B may be represented by the size of the person mark 226.
  • FIG. 7 is a diagram for explaining the processing flow of the information processing device 21. From the information obtained by the sensor unit 32, the coordinates and orientation of the digital pen 111 and the user, the surrounding situation (the situation of the room A), and the like are detected. The information obtained by the sensor unit 32 is supplied to the pen tip recognition unit 52, the pen recognition unit 53, the environment recognition unit 54, and the user recognition unit 55.
  • the pen tip recognition unit 52 detects the position of the pen tip of the digital pen 111 and supplies the detection result to the drawing data generation unit 71 and the action data generation unit 72.
  • the pen recognition unit 53 detects the position and orientation of the digital pen 111, and supplies the detection result to the action data generation unit 72.
  • the environment recognition unit 54 detects the size, brightness, loudness of the environmental sound, etc. of the working space, and supplies the detection result to the behavior data generation unit 72.
  • the user recognition unit 55 detects the position and orientation of the user and supplies the detection result to the action data generation unit 72.
  • the pen tip recognition unit 52 determines whether or not the drawing mode is set based on the position of the pen tip.
  • the drawing mode is a mode in which the pen tip of the digital pen 111 comes into contact with the work area 201 and lines, characters, and the like are drawn (a mode in which the actual work is executed).
  • FIG. 8 is a flowchart for explaining the processing in the pen tip recognition unit 52.
  • In step S11, it is determined whether or not the pen tip of the digital pen 111 has been detected. The process of step S11 is repeated until it is determined that the pen tip of the digital pen 111 has been detected.
  • If it is determined in step S11 that the pen tip of the digital pen 111 has been detected, the process proceeds to step S12. In step S12, it is determined whether or not the drawing mode is active.
  • the pen tip recognition unit 52 can determine whether or not the drawing mode is being activated by determining whether or not the pen tip of the digital pen 111 is in contact with the work area 201.
  • A switch may be provided at the tip of the digital pen 111 so that whether or not the drawing mode is active can be determined from the pressed state of the switch. In this case, when the switch is pressed, the digital pen 111 is being pressed against the work area 201, and it is determined that the drawing mode is active.
  • The digital pen 111 may be provided with an acceleration sensor (the acceleration sensor 92 included in the sensor unit 32 may be built into the digital pen 111), and whether or not the drawing mode is active may be determined from the moving direction of the digital pen 111. In the drawing mode, the digital pen 111 moves along the work area 201, so it can be determined whether or not the drawing mode is active by monitoring whether the digital pen 111 is moving along the work area 201.
  • An IMU (Inertial Measurement Unit) may also be used. The IMU includes, for example, a three-axis gyro and a three-axis accelerometer, and can obtain three-dimensional angular velocity and acceleration. If it is estimated from the information obtained from the IMU that the digital pen 111 is located away from the work area 201, it can be determined that the drawing mode is not active.
  • Whether or not the drawing mode is active may also be determined from the orientation of the pen tip of the digital pen 111 and the orientation of the face of the user holding the digital pen 111.
  • In the drawing mode, the user faces the work area 201 and moves the digital pen 111 on the work area 201.
  • Therefore, when the user's face is directed toward the work area 201 and the pen tip of the digital pen 111 is also directed toward the work area 201, it can be determined that the drawing mode is active.
  • the determination as to whether or not the drawing mode is activated may be made by a method other than those illustrated here.
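  • The drawing-mode determinations described above (the tip switch, and the orientations of the pen tip and the user's face) could be combined as in the following sketch; the angular threshold, the vector inputs, and the combination logic are illustrative assumptions rather than the patent's implementation.

```python
import numpy as np


def is_drawing_mode(pen_switch_pressed: bool,
                    pen_direction: np.ndarray,
                    face_direction: np.ndarray,
                    surface_normal: np.ndarray,
                    facing_threshold_deg: float = 45.0) -> bool:
    """Return True when the tip switch is pressed, or when both the pen tip and the
    user's face are oriented toward the work area within an angular threshold."""
    if pen_switch_pressed:
        return True
    n = surface_normal / np.linalg.norm(surface_normal)
    cos_limit = np.cos(np.radians(facing_threshold_deg))

    def faces_surface(direction: np.ndarray) -> bool:
        d = direction / np.linalg.norm(direction)
        # A direction "faces" the work area when it points against the outward normal.
        return float(np.dot(d, -n)) >= cos_limit

    return faces_surface(pen_direction) and faces_surface(face_direction)


# Example: the switch is not pressed, but both the pen and the face point at the wall.
print(is_drawing_mode(False,
                      pen_direction=np.array([0.0, 0.1, -1.0]),
                      face_direction=np.array([0.0, 0.0, -1.0]),
                      surface_normal=np.array([0.0, 0.0, 1.0])))   # True
```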
  • If it is determined in step S12 that the drawing mode is active, the process proceeds to step S13.
  • In step S13, pen tip information is acquired from the pen tip detection camera.
  • A camera for detecting the pen tip and a camera for detecting behavior, described later, may be provided separately, or a single camera, for example the camera 95 (FIG. 2), may serve both purposes.
  • In step S14, the pen tip information acquired in step S13 is used to perform a two-dimensional calculation of the pen tip position.
  • The process reaches step S14 in the drawing mode, that is, when work such as text or a figure is being drawn on the work area 201 with the digital pen 111. Therefore, such drawn data is acquired (calculated) in step S14; specifically, the coordinates (two-dimensional coordinates) at which the digital pen 111 is located on the work area 201 are calculated.
  • In step S15, the calculated two-dimensional pen tip coordinates are passed to the drawing data generation unit 71, which is instructed to create drawing data.
  • If it is determined in step S12 that the drawing mode is not active, the process proceeds to step S16.
  • In step S16, pen tip information is acquired from the behavior detection camera. Behavior detection, described later, detects the user's actions while the user is not drawing.
  • In step S17, the pen tip information acquired in step S16 is used to perform a three-dimensional calculation of the pen tip position.
  • The process reaches step S17 when the drawing mode is not active, for example when the user is holding the digital pen 111 at a position away from the work area 201. In such a state, processing is executed to generate the pen tip state mark 223, the predicted drawing area 224, and the pen tip shadow mark 225, which indicate whether or not the user intends to draw.
  • In step S17, the three-dimensional coordinates of the digital pen 111 are calculated as the position information of the pen tip. Since the distance between the digital pen 111 and the work area 201 is calculated and the predicted drawing area is set from the three-dimensional coordinates of the digital pen 111, what is calculated in step S17 is the position of the pen tip of the digital pen 111, in other words the three-dimensional coordinates of the pen tip in the work space.
  • In step S18, the three-dimensional coordinates of the pen tip are passed to the behavior data generation unit 72, which is instructed to generate behavior data.
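  • The branching of FIG. 8 can be summarized by the following sketch, in which the pen tip, the drawing-mode test, and the two generators are passed in as hypothetical stand-ins; it is a paraphrase of steps S11 to S18 under assumed conventions, not the patent's implementation.

```python
from typing import Callable, Optional, Tuple

Point3D = Tuple[float, float, float]


def pen_tip_recognition_step(pen_tip: Optional[Point3D],
                             surface_distance_of: Callable[[Point3D], float],
                             on_drawing: Callable[[Tuple[float, float]], None],
                             on_behavior: Callable[[Point3D], None],
                             contact_threshold: float = 2.0) -> None:
    """One pass of the FIG. 8 flow: skip when no pen tip is detected (S11), branch on
    the drawing mode (S12), then hand two-dimensional coordinates to drawing-data
    generation (S13-S15) or three-dimensional coordinates to behavior-data
    generation (S16-S18)."""
    if pen_tip is None:                                      # S11: nothing detected yet
        return
    if surface_distance_of(pen_tip) <= contact_threshold:    # S12: drawing mode
        x, y, _ = pen_tip
        on_drawing((x, y))                                   # S13-S15
    else:
        on_behavior(pen_tip)                                 # S16-S18


# Example wiring with trivial stand-ins for the two generators.
pen_tip_recognition_step(
    (120.0, 80.0, 0.5),
    surface_distance_of=lambda p: abs(p[2]),     # height above the work area plane
    on_drawing=lambda uv: print("drawing data at", uv),
    on_behavior=lambda xyz: print("behavior data at", xyz),
)
```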
  • FIG. 9 is a table in which behavior data, input values, and display data are associated.
  • the behavior data is data generated by the behavior data generation unit 72.
  • the input value is the data (information) required to generate the behavior data.
  • the display data is information presented to the user based on the behavior data, and is data for displaying, for example, the pen tip state mark 223.
  • the action data includes "three-dimensional position of the pen", “user's position”, “predicted drawing area”, “intention to start drawing”, “work content” and the like.
  • the input value required to generate the "three-dimensional position of the pen" of the behavior data is the "coordinates of the pen tip".
  • For example, the pen tip is imaged by a plurality of cameras, the pen tip is detected in the images obtained from the plurality of cameras, and the three-dimensional position of the pen tip can be specified using the detection results.
  • Alternatively, an infrared LED (Light Emitting Diode) may be provided on the digital pen 111 and detected by an infrared camera so that the three-dimensional position of the pen tip can be specified.
  • the position of the pen tip may be specified by a method other than those illustrated here.
  • the display data generated from the "three-dimensional position of the pen" of the action data is the data for displaying the pen tip state mark 223 and the pen tip shadow mark 225.
  • the input value required to generate the "user's position" of the behavior data is the "user's coordinates".
  • The coordinates of the user can be obtained in the same way as the three-dimensional coordinates of the pen tip: for example, the user is imaged by a plurality of cameras, the user is detected in the images obtained from the plurality of cameras, and the three-dimensional position of the user can be specified using the detection results.
  • the display data generated from the "user's position" of the behavior data is the data for displaying the person mark 226.
  • the input values required to generate the "predictive drawing area" of the behavior data are "pen tip coordinates", “pen acceleration”, and "user coordinates".
  • the input values required when generating the behavior data of the three-dimensional position of the pen and the position of the user can be used.
  • For the acceleration of the pen, an acceleration sensor can be incorporated into the digital pen 111 and the data obtained from the acceleration sensor can be used. Alternatively, the pen tip of the digital pen 111 may be detected in images captured at predetermined intervals, the moving distance of the pen tip obtained, and the acceleration of the digital pen 111 calculated from the moving distance and the predetermined time corresponding to the shooting interval.
  • the display data generated from the "predicted drawing area" of the behavior data is the data for displaying the predicted drawing area 224.
  • the process for generating the behavior data of the predicted drawing area will be described later with reference to FIGS. 10 and 11.
  • The input values required to generate the "intention to start drawing" of the behavior data are the "pen tip coordinates", the "pen acceleration", and the "user coordinates". These are the same as the input values required to generate the "predicted drawing area" of the behavior data, so the input values can be shared.
  • The display data generated from the "intention to start drawing" of the behavior data is data related to the shape change and color expression of the pen tip state mark 223 and the pen tip shadow mark 225.
  • The shape change and color expression are, as described with reference to FIGS. 4 and 5, the changes made to the pen tip state mark 223 and the pen tip shadow mark 225 according to, for example, the distance between the digital pen 111 and the work area 201.
  • For the "intention to start drawing" of the behavior data, it may be determined that the user intends to start drawing when, for example, the distance between the digital pen 111 and the work area 201 is equal to or less than a threshold value, and behavior data related to the intention to start drawing may be generated accordingly.
  • When the digital pen 111 and the work area 201 are far apart, it is considered that the user has no intention of working, or at least it can be determined that work will not be started immediately, so the pen tip state mark 223 and the pen tip shadow mark 225 may be prevented from being displayed. To make such a determination, it is decided whether or not to generate the "intention to start drawing" of the behavior data, and this decision may be made, as described above, by determining whether or not the distance between the digital pen 111 and the work area 201 is equal to or less than the threshold value.
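  • A minimal sketch of the threshold test described above, computing the pen-tip-to-work-area distance as a point-to-plane distance and comparing it with a threshold to decide whether the "intention to start drawing" behavior data (and hence the marks 223 and 225) should be generated; the coordinate conventions and the threshold value are assumptions.

```python
import numpy as np


def drawing_start_intention(pen_tip_xyz, surface_point, surface_normal,
                            intent_threshold_mm: float = 300.0):
    """Compute the pen-tip-to-work-area distance as a point-to-plane distance and
    compare it with a threshold; returns (has_intention, distance_mm)."""
    p = np.asarray(pen_tip_xyz, dtype=float)
    q = np.asarray(surface_point, dtype=float)
    n = np.asarray(surface_normal, dtype=float)
    n = n / np.linalg.norm(n)
    distance = abs(float(np.dot(p - q, n)))
    return distance <= intent_threshold_mm, distance


has_intent, d = drawing_start_intention((100.0, 200.0, 150.0),
                                        (0.0, 0.0, 0.0),
                                        (0.0, 0.0, 1.0))
print(has_intent, d)   # True, 150.0 -> marks 223 and 225 would be displayed
```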
  • the input value required to generate the "tool information" of the behavior data is the "information of the tool used”.
  • the display data generated from the "tool information" of the action data is data related to the shape change and color expression in the pen tip state mark 223 and the pen tip shadow mark 225.
  • tools such as line type (thick, thin, dotted line, arrow, etc.), fill, eraser, color, etc. when drawing with the digital pen 111 may be prepared.
  • Information about such a tool may be expressed by the shape and color of the pen tip state mark 223 and the pen tip shadow mark 225.
  • The predicted drawing area or the like may also be estimated using information about the tool the user is using. For example, when a tool other than a pen, for example a pointing stick, is used, it can be estimated that the area in the direction pointed to by the pointing stick, in other words the direction pointed to by the pen tip of the digital pen 111, is the area the user is paying attention to.
  • Such behavior data is generated by the behavior data generation unit 72 as needed.
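  • The behavior data items of the FIG. 9 table could be carried in a simple container such as the following sketch; the field names, types, and example values are assumptions for illustration and do not reflect the patent's actual data format.

```python
from dataclasses import dataclass
from typing import Optional, Tuple


@dataclass
class BehaviorData:
    """Container for the behavior data items listed in the FIG. 9 table."""
    pen_tip_xyz: Optional[Tuple[float, float, float]] = None     # three-dimensional position of the pen
    user_xyz: Optional[Tuple[float, float, float]] = None        # user's position
    predicted_area: Optional[Tuple[float, float, float]] = None  # center (u, v) and radius on the work area
    intends_to_start_drawing: bool = False                       # intention to start drawing
    tool_info: Optional[str] = None                              # tool information (line type, eraser, color, ...)


# Example: the packet a device such as 21B might transmit to the collaborating side.
packet = BehaviorData(pen_tip_xyz=(0.42, 1.10, 0.15),
                      user_xyz=(0.50, 1.00, 0.80),
                      predicted_area=(0.42, 1.10, 0.20),
                      intends_to_start_drawing=True,
                      tool_info="thin line, red")
print(packet)
```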
  • The predicted drawing area is an area that is judged to have a high possibility of drawing (work) being started within a certain time, based on the orientation and moving direction of the digital pen 111 while the digital pen 111 is located in a specific region whose distance from the work area 201 is equal to or less than the threshold value.
  • the first setting method of the predicted drawing area will be described with reference to FIG.
  • An intersection P1 between the line L1 extending parallel to the direction in which the pen tip of the digital pen 111 is facing and the work area 201 is obtained.
  • a region having a predetermined size centered on the intersection P1 is set as the predicted drawing region A1.
  • The size of the predicted drawing area A1 may be set according to the distance between the digital pen 111 and the work area 201, or may be set to a preset size.
  • the second setting method of the predicted drawing area will be described with reference to FIG.
  • An intersection P2 between the line L2 extending parallel to the direction in which the pen tip of the digital pen 111 is moving and the work area 201 is obtained.
  • a region having a predetermined size centered on the intersection P2 is set as the predicted drawing region A2.
  • The size of the predicted drawing area A2 may be set according to the distance between the digital pen 111 and the work area 201, or may be set to a preset size.
  • The predicted drawing area may be set by applying either the first setting method or the second setting method, or by combining the two.
  • For example, when the two methods are combined, an area of a predetermined size centered on a point P3 midway between the center (intersection P1) of the area set by the first setting method and the center (intersection P2) of the area set by the second setting method may be set as the predicted drawing area.
  • Alternatively, when the first setting method and the second setting method are combined, an area that includes the areas set by each method may be set as the predicted drawing area.
  • The predicted drawing area may also be set by other setting methods, and setting methods other than the first and second setting methods may be added so that the predicted drawing area is set by combining a plurality of setting methods. When the predicted drawing area is set by combining a plurality of setting methods, one predicted drawing area may be selected by assigning priorities among the plurality of predicted drawing areas that have been set.
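  • The first and second setting methods both reduce to intersecting a line from the pen tip with the plane of the work area. The following sketch shows that computation, where the direction argument is the pen-tip orientation for the first method or the movement direction for the second; the radius value and coordinate conventions are assumptions.

```python
import numpy as np


def predicted_drawing_area(pen_tip, direction, surface_point, surface_normal,
                           radius_mm: float = 100.0):
    """Extend a line from the pen tip along `direction`, intersect it with the
    work-area plane, and return (intersection, radius) or None if there is no hit."""
    p = np.asarray(pen_tip, dtype=float)
    d = np.asarray(direction, dtype=float)
    q = np.asarray(surface_point, dtype=float)
    n = np.asarray(surface_normal, dtype=float)
    n = n / np.linalg.norm(n)
    denom = float(np.dot(d, n))
    if abs(denom) < 1e-9:
        return None                      # the line runs parallel to the work area
    t = float(np.dot(q - p, n)) / denom
    if t < 0:
        return None                      # the pen points away from the work area
    return p + t * d, radius_mm          # P1 (method 1) or P2 (method 2)


# Method 1: use the direction the pen tip is facing.
print(predicted_drawing_area((100.0, 150.0, 300.0), (0.0, 0.2, -1.0),
                             (0.0, 0.0, 0.0), (0.0, 0.0, 1.0)))
# -> (array([100., 210.,   0.]), 100.0)
```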
  • In step S18, the behavior data generation unit 72 is instructed to generate behavior data. The behavior data generation unit 72 then generates the behavior data described with reference to FIG. 9 using the data obtained from the pen tip recognition unit 52, the pen recognition unit 53, the environment recognition unit 54, and the user recognition unit 55.
  • In the drawing mode, the pen tip recognition unit 52 issues an instruction to the drawing data generation unit 71, and drawing data is generated.
  • When the drawing mode is not active, the pen tip recognition unit 52 issues an instruction to the behavior data generation unit 72, and behavior data is generated. In this way, the data that is generated differs depending on whether or not the drawing mode is active.
  • Alternatively, when the pen tip recognition unit 52 acquires the pen tip data, all values may be processed three-dimensionally: if the three-dimensional coordinates lie on the drawing target surface (the surface assumed to be the work area 201), the data may be determined to be drawing data, and if they do not lie on the drawing target surface, the data may be determined to be behavior data.
  • In this case, the process of step S17 is executed without determining in step S12 (FIG. 8) whether or not the drawing mode is active. That is, when the pen tip is detected, a three-dimensional calculation is performed based on the pen tip information, and it is determined whether or not the result of that calculation, that is, the three-dimensional coordinates, lies on the drawing target surface.
  • If the coordinates lie on the drawing target surface, drawing data is generated on the assumption that the drawing mode is active.
  • If the coordinates do not lie on the drawing target surface, behavior data is generated on the assumption that the drawing mode is not active.
  • When the pen tip recognition unit 52 determines that the drawing mode is active, it supplies two-dimensional data on the drawing target surface to the drawing data generation unit 71 as data on the position of the pen tip of the digital pen 111.
  • When the pen tip recognition unit 52 determines that the drawing mode is not active, it supplies three-dimensional data in the work space to the behavior data generation unit 72 as data on the position of the pen tip of the digital pen 111.
  • the drawing data generation unit 71 generates drawing data related to text, lines, etc. drawn by the digital pen 111 displayed in the work area 201, and supplies the drawing data to the display data generation unit 73.
  • The drawing data from the drawing data generation unit 71 is also supplied to the information processing device 21 of the other collaborating user and is used on that device to display the work content.
  • the behavior data generation unit 72 generates behavior data and supplies it to the communication unit 60. For example, when each part shown in FIG. 7 is the information processing device 21B used by the user B, the action data is transmitted to the information processing device 21A used by the user A. On the information processing device 21A side, the action data received by the communication unit 60A is supplied to the display data generation unit 73A.
  • When each unit shown in FIG. 7 belongs to the information processing device 21B used by user B, the behavior data transmitted from the information processing device 21A used by user A is received by the communication unit 60B and supplied to the display data generation unit 73B.
  • What is supplied to the display data generation unit 73 is the action data received by the communication unit 60, and the display data generation unit 73 processes the action data related to the actions of other users.
  • The display data generation unit 73 generates the display data described with reference to FIG. 9 based on the supplied action data. Further, the display data generation unit 73 generates display data for the display to be presented to the user based on that generated data and the drawing data from the drawing data generation unit 71, and supplies the display data to the display unit 34.
  • The display unit 34 performs display based on the display data from the display data generation unit 73. Further, the display data from the display data generation unit 73 is stored in the storage unit 59 after time information is added by the timer 58.
  • As a result, the display as shown in FIG. 3 is presented to the user. That is, the work content that the user himself performed on the work area 201 (line 221), the work content that the collaborative worker performed on the work area 201 on the collaborative worker's side (line 222), and information on the behavior of the collaborative worker while not working (the pen tip state mark 223, etc.) are displayed.
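  • Purely as a sketch of how these three kinds of content might be composed by the display data generation unit 73, the following Python snippet builds a layered display description; the layer names and data structure are assumptions, not taken from the disclosure.

```python
def generate_display_data(own_drawing, partner_drawing, partner_action):
    """Compose the three kinds of content shown in the work area: the user's own
    strokes, the collaborator's strokes, and marks derived from the collaborator's
    behavior.  The layer structure is purely illustrative."""
    layers = []
    layers.append({"layer": "own_work", "strokes": own_drawing})          # e.g. the user's own line 221
    layers.append({"layer": "partner_work", "strokes": partner_drawing})  # e.g. the collaborator's line 222
    if partner_action is not None:
        layers.append({
            "layer": "partner_behavior",  # e.g. pen tip state mark 223, predicted drawing area 224
            "pen_tip": partner_action.get("pen_tip_xyz"),
            "predicted_area": partner_action.get("predicted_area"),
        })
    return layers

display_data = generate_display_data(
    own_drawing=[[(0.10, 0.10), (0.20, 0.15)]],
    partner_drawing=[[(0.50, 0.40), (0.55, 0.45)]],
    partner_action={"pen_tip_xyz": (0.60, 0.50, 0.10), "predicted_area": (0.60, 0.50, 0.05)},
)
print([layer["layer"] for layer in display_data])
```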
  • The user A can recognize the state of the user B, whose operation the user A cannot directly see, for example whether or not the user B is about to start drawing, by looking at the pen tip state mark 223, the predicted drawing area 224, and the pen tip shadow mark 225 displayed on the work area 201. Therefore, the user A can work in such a way that the work does not collide with that of the user B.
  • Note that the display data generation unit 73 may delay the output of the display data generated from the action data to the display unit 34 by a predetermined time.
  • Alternatively, the display unit 34 may delay the display based on the display data from the display data generation unit 73 by a predetermined time. That is, the display of information about other users, such as the pen tip state mark 223 and the predicted drawing area 224, may be performed in real time or may be delayed.
  • Information regarding the state of another user (here, the user B), such as the pen tip state mark 223 and the predicted drawing area 224, may be displayed while the user A has interrupted the drawing work, and may not be displayed while the user A is performing the drawing work.
  • The user A can be regarded as having interrupted the drawing work when, for example, the user A is away from the work area 201, when the digital pen 111 is not moving, and so on.
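  • The timing control described above could look like the following sketch; the thresholds used to decide that the drawing work is interrupted and the fixed delay are illustrative assumptions.

```python
import time

def should_show_partner_info(user_distance_to_area, pen_speed,
                             distance_threshold=0.8, speed_threshold=0.01):
    """Show marks such as the pen tip state mark 223 only while the drawing work is
    interrupted (the user is away from the work area or the pen is barely moving).
    Both thresholds are made-up values."""
    return user_distance_to_area > distance_threshold or pen_speed < speed_threshold

class DelayedDisplay:
    """Optionally delay the output of display data by a fixed number of seconds."""

    def __init__(self, delay_s=0.5):
        self.delay_s = delay_s
        self._pending = []

    def push(self, display_item):
        self._pending.append((time.monotonic() + self.delay_s, display_item))

    def pop_ready(self):
        """Return the items whose delay has elapsed and keep the rest pending."""
        now = time.monotonic()
        ready = [item for due, item in self._pending if due <= now]
        self._pending = [(due, item) for due, item in self._pending if due > now]
        return ready

print(should_show_partner_info(user_distance_to_area=1.2, pen_speed=0.0))  # -> True
```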
  • The shading of displays such as the predicted drawing area 224 may be controlled according to the relative distance between the user and the collaborative worker (the user A and the user B).
  • A work collision occurs when the user A and the user B try to work on the same or nearby areas. In other words, when the user A and the user B work on different areas, particularly areas far apart, the possibility of a work collision is low.
  • Therefore, the display of the predicted drawing area 224 and the like may be changed according to how high the possibility of a work collision is, in other words, according to the relative distance between the user and the collaborative worker.
  • When the possibility of a work collision is low, the predicted drawing area 224 and the like may not be displayed, or may be displayed in a light color.
  • When the possibility of a work collision is high, the predicted drawing area 224 and the like may be displayed in a dark color, or may be displayed blinking, for example, to call attention.
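  • One possible mapping from the relative distance between the two users to the appearance of the predicted drawing area 224 is sketched below; the thresholds, the opacity formula, and the blinking rule are assumptions for illustration.

```python
def predicted_area_style(own_position, partner_position, near=0.3, far=1.5):
    """Map the distance between the two users' working positions to a display style
    for the predicted drawing area 224.  All numeric values are made up."""
    dx = own_position[0] - partner_position[0]
    dy = own_position[1] - partner_position[1]
    distance = (dx * dx + dy * dy) ** 0.5

    if distance >= far:
        # Far apart: low possibility of collision, so hide the area.
        return {"visible": False, "opacity": 0.0, "blink": False}
    if distance <= near:
        # High possibility of a work collision: dark and blinking to call attention.
        return {"visible": True, "opacity": 1.0, "blink": True}
    # In between: fade the display with distance.
    opacity = 1.0 - (distance - near) / (far - near)
    return {"visible": True, "opacity": round(opacity, 2), "blink": False}

print(predicted_area_style((0.20, 0.20), (0.25, 0.22)))  # close together -> dark, blinking
print(predicted_area_style((0.20, 0.20), (1.80, 0.20)))  # far apart -> hidden
```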
  • Different displays may also be presented depending on whether or not the three-dimensional position of the user B (the collaborative worker), the position of the digital pen 111B, and the like can be acquired accurately. In other words, the display may be performed according to the accuracy of the input values used when the action data is generated.
  • When it is determined that the accuracy of the input values is high, the display color of the pen tip state mark 223 and the person mark 226 may be darkened, and when it is determined that the accuracy of the input values is low, control may be performed such that the display color of the pen tip state mark 223 and the person mark 226 is lightened.
  • Alternatively, when it is determined that the accuracy of the input values is high, the displayed person mark 226 may be, for example, a mark as shown in A of FIG. 6, and when it is determined that the accuracy of the input values is low, display control may be performed such that the person mark 226 is a shadow-like mark as shown in B of FIG. 6, for example.
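  • The accuracy-dependent display could, for example, be driven by a confidence value as in the following sketch; the confidence scale and thresholds are hypothetical and only illustrate the idea of switching between a solid mark and a shadow-like mark.

```python
def person_mark_style(position_confidence, high=0.8, low=0.4):
    """Choose how to draw the person mark 226 depending on how accurately the
    collaborator's position could be acquired.  The confidence value in [0, 1]
    and both thresholds are assumptions, not values from the disclosure."""
    if position_confidence >= high:
        return {"mark": "person", "color_alpha": 1.0}   # solid, dark mark (FIG. 6 A style)
    if position_confidence <= low:
        return {"mark": "shadow", "color_alpha": 0.3}   # shadow-like mark (FIG. 6 B style)
    return {"mark": "person", "color_alpha": 0.6}       # intermediate: lighter color

print(person_mark_style(0.9))
print(person_mark_style(0.2))
```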
  • The sensor unit 32, the display unit 34 (FIG. 2), and the like may be wearable devices.
  • When the sensor unit 32 is a wearable device, the position of the pen tip of the digital pen 111 may be detected at the time of drawing by the sensor unit 32 installed in the wearable device. Further, the locus may be saved only while the digital pen 111 is within the viewing angle of the wearable device, thereby reducing the amount of data.
  • In the above description, the information of the collaborative worker is presented to the worker (corresponding to the user A) by displaying the pen tip state mark 223 and the like as an example.
  • The information of the collaborative worker may also be presented to the worker by the following methods.
  • The information of the collaborative worker may be presented to the worker by an output using hearing.
  • As the output using hearing, it is possible to apply, for example, an output in which the position of the collaborative worker is expressed by a footstep sound whose volume corresponds to that position, or an output expressed by the writing sound of the pen.
  • The information of the collaborative worker may also be presented to the worker by an output using the sense of touch.
  • For example, the position of the collaborative worker may be expressed by a vibration whose magnitude corresponds to that position, or by a vibration of the pen.
  • Further, when work is about to collide, the pen may vibrate so that a vibration expressing the feeling of a collision is given to the worker.
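  • The auditory and tactile presentations above might be driven by simple distance-based mappings such as the ones sketched below; the linear volume law, the maximum distance, and the collision radius are all assumptions.

```python
def footstep_volume(own_xy, partner_xy, max_distance=3.0):
    """Auditory presentation: the closer the collaborative worker, the louder the
    footstep sound.  The linear mapping and the 3 m cut-off are made-up values."""
    dx, dy = own_xy[0] - partner_xy[0], own_xy[1] - partner_xy[1]
    distance = (dx * dx + dy * dy) ** 0.5
    return max(0.0, 1.0 - distance / max_distance)

def pen_vibration(own_pen_xy, partner_pen_xy, collision_radius=0.1):
    """Tactile presentation: when the two pens are about to collide, return a strong
    vibration amplitude so the worker feels the (virtual) collision."""
    dx = own_pen_xy[0] - partner_pen_xy[0]
    dy = own_pen_xy[1] - partner_pen_xy[1]
    distance = (dx * dx + dy * dy) ** 0.5
    return 1.0 if distance < collision_radius else 0.0

print(footstep_volume((0.0, 0.0), (1.0, 0.0)))     # partner 1 m away -> about 0.67
print(pen_vibration((0.50, 0.50), (0.55, 0.52)))   # pens nearly touching -> 1.0
```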
  • Note that the work area 201 described above does not have to be a flat surface.
  • The present technology can also be applied to work such as editing a 3D model.
  • The information about the collaborative worker is transmitted from the information processing device 21B to the information processing device 21A via the network 22. If there is a delay in this communication, the expression method may be devised, for example by adjusting the reproduction speed, so that the worker does not perceive the delay.
  • The work contents (drawing data) of the collaborative worker may be supplied to the worker side and reproduced in real time, or may be supplied and reproduced at a certain point in time.
  • That is, the locus drawn by the collaborative worker may be supplied to the worker (the user A) and displayed in real time as it is drawn, or the loci may be displayed all at once.
  • The display intensity of the locus may be changed according to the distance from the drawing target surface (the work area 201). For example, a locus drawn when the user is close to the drawing target surface may be displayed strongly, and a locus drawn when the user is away from the drawing target surface may be displayed weakly.
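  • A minimal sketch of this distance-dependent intensity, assuming a simple linear fade and an arbitrary 0.5 m cut-off, is shown below.

```python
def stroke_opacity(pen_height_above_surface, max_height=0.5):
    """Display a locus more strongly when it was drawn close to the drawing target
    surface (the work area 201) and more weakly when drawn away from it.
    The linear fade and the 0.5 m cut-off are assumptions."""
    h = max(0.0, float(pen_height_above_surface))
    return max(0.0, 1.0 - h / max_height)

# A stroke drawn on the surface is fully opaque; one drawn 0.4 m above it is faint.
print(stroke_opacity(0.0))   # -> 1.0
print(stroke_opacity(0.4))   # -> approximately 0.2
```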
  • The display of the locus drawn by the collaborative worker does not have to be exactly the same as the actual drawing, and may be changed according to the situation of the worker. For example, regardless of the drawing position and drawing size on the collaborative worker's side, the locus drawn by the collaborative worker may be displayed at a position and in a size that are easy for the worker to see.
  • A timer 58 (FIG. 2) is provided, and, for example, the display data generated by the display data generation unit 73 may be associated with time information and stored in the storage unit 59.
  • By using the display data stored in the storage unit 59, the work contents and the like performed in the past may be reproduced as a time lapse.
  • Cooperative work may also be performed on the time-lapse reproduced display.
  • For example, a conference can be reviewed by time-lapse reproducing the data related to the work performed with the digital pen 111. It is also possible to perform work such as adding a new memo while looking back at the meeting through time-lapse reproduction.
  • When adding a memo, the memo can be added without the action colliding with the past data already stored.
  • The present technology can also be applied to a case where, in movie shooting or the like, instructions to a performer are recorded in advance and, at the time of shooting, the instructions are reproduced in accordance with the movement of the performer to support the shooting.
  • When it is desired to add or change an instruction while repeating a rehearsal, the instruction can be added without conflicting with the past data already recorded.
  • A function of setting a tag when recording a trajectory may also be added.
  • This enables desired forms of reproduction, such as reproducing only specific data at the time of reproduction.
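  • The recording, tagging, and time-lapse reproduction described above could be organized roughly as in the following sketch; the class, method names, and the way tags filter the reproduction are assumptions made for illustration.

```python
import time
from dataclasses import dataclass, field

@dataclass
class RecordedStroke:
    timestamp: float
    points: list
    tags: set = field(default_factory=set)

class TrajectoryStore:
    """Store display data with time information (cf. timer 58 / storage unit 59)
    and replay it later, optionally filtered by tag."""

    def __init__(self):
        self._strokes = []

    def record(self, points, tags=()):
        self._strokes.append(RecordedStroke(time.time(), list(points), set(tags)))

    def replay(self, speed=1.0, tag=None):
        """Yield strokes in chronological order; speed > 1 compresses the original
        timing for time-lapse reproduction, and tag selects specific data only."""
        strokes = [s for s in self._strokes if tag is None or tag in s.tags]
        strokes.sort(key=lambda s: s.timestamp)
        previous = None
        for stroke in strokes:
            if previous is not None:
                time.sleep(max(0.0, (stroke.timestamp - previous) / speed))
            previous = stroke.timestamp
            yield stroke

store = TrajectoryStore()
store.record([(0.1, 0.1), (0.2, 0.2)], tags={"meeting"})
store.record([(0.3, 0.3)], tags={"memo"})
for stroke in store.replay(speed=10.0, tag="memo"):
    print(stroke.points)
```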
  • The series of processes described above can be executed by hardware or by software.
  • When the series of processes is executed by software, the programs constituting the software are installed on a computer.
  • Here, the computer includes a computer incorporated in dedicated hardware and, for example, a general-purpose personal computer capable of executing various functions by installing various programs.
  • FIG. 12 is a block diagram showing a configuration example of the hardware of a computer that executes the above-described series of processes by means of a program.
  • In the computer, a CPU (Central Processing Unit) 501, a ROM (Read Only Memory) 502, and a RAM (Random Access Memory) 503 are connected to one another by a bus 504.
  • An input / output interface 505 is further connected to the bus 504.
  • An input unit 506, an output unit 507, a storage unit 508, a communication unit 509, and a drive 510 are connected to the input / output interface 505.
  • The input unit 506 includes a keyboard, a mouse, a microphone, and the like.
  • The output unit 507 includes a display, a speaker, and the like.
  • The storage unit 508 includes a hard disk, a non-volatile memory, and the like.
  • The communication unit 509 includes a network interface and the like.
  • The drive 510 drives a removable recording medium 511 such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory.
  • In the computer configured as described above, the CPU 501 loads the program stored in the storage unit 508 into the RAM 503 via the input / output interface 505 and the bus 504 and executes it, whereby the above-described series of processes is performed.
  • The program executed by the computer (CPU 501) can be recorded and provided on a removable recording medium 511 as a package medium or the like, for example.
  • The program can also be provided via a wired or wireless transmission medium such as a local area network, the Internet, or digital satellite broadcasting.
  • The program can be installed in the storage unit 508 via the input / output interface 505 by mounting the removable recording medium 511 in the drive 510. Further, the program can be received by the communication unit 509 and installed in the storage unit 508 via a wired or wireless transmission medium. In addition, the program can be pre-installed in the ROM 502 or the storage unit 508.
  • The program executed by the computer may be a program in which processing is performed in chronological order according to the order described in this specification, or a program in which processing is performed in parallel or at a necessary timing, such as when a call is made.
  • In this specification, a system means the entirety of an apparatus composed of a plurality of devices.
  • The present technology can also have the following configurations.
  • (1) An information processing device including a display control unit that displays, in a first work area, the content of work performed by a first user on the first work area and information regarding the behavior of a second user when the second user is located in an area different from the first work area and is not working on a second work area corresponding to the first work area.
  • (2) The information processing device according to (1) above, wherein the display control unit further displays, on the first work area, the work content performed by the second user on the second work area.
  • (3) The first work area and the second work area are virtual work areas installed at different positions.
  • (8) The information processing device according to any one of (1) to (7) above, wherein a prediction area predicted as the area where the second user works is displayed in the first work area.
  • (9) The information processing device according to (8) above, wherein the prediction area is an area of a predetermined size centered on a point where a line extending in the direction in which the operating body is facing intersects the second work area.
  • (10) The information processing device according to (8) above, wherein the prediction area is an area of a predetermined size centered on a point where a line extending in the direction in which the operating body is moving intersects the second work area.
  • (11) The information processing device according to (8) above, wherein the prediction area is displayed in a size corresponding to the distance between the operating body and the second work area.
  • (12) The information processing device according to any one of (7) to (11) above, wherein information indicating the position corresponding to the intersection of the direction in which the operating body is facing and the second work area is displayed in a color corresponding to the distance between the operating body and the second work area.
  • (13) The information processing device according to (1) above, wherein, as the information regarding the behavior of the second user, a form copying the second user is displayed on the first work area at a position corresponding to the position where the second user is located with respect to the second work area.
  • (14) The information processing device according to any one of (1) to (13) above, which stores display data displayed in the first work area.
  • the information processing apparatus which generates coordinate data.
  • An information processing method in which an information processing device displays, in the first work area, the work done by the first user on the first work area and information regarding the behavior of the second user when the second user is located in an area different from the first work area and is not working on the second work area corresponding to the first work area.
  • A program for causing a computer to execute a process including a step of displaying, in the first work area, the work done by the first user on the first work area and information regarding the behavior of the second user when the second user is located in an area different from the first work area and is not working on the second work area corresponding to the first work area.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • Human Computer Interaction (AREA)
  • Manufacturing & Machinery (AREA)
  • Quality & Reliability (AREA)
  • Automation & Control Theory (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present technology relates to an information processing device, an information processing method, and a program that prevent the operations of workers performing cooperative work in a virtual work area from colliding with one another. This information processing device is provided with a display control unit that displays, in a first work area, the content of work performed in the first work area by a first user, and information regarding the behavior of a second user when the second user is located in an area different from the first work area and is not performing work in a second work area corresponding to the first work area. The present invention can be applied, for example, to an information processing device that performs control when a worker at a remote location performs work in a virtual work area.
PCT/JP2021/002173 2020-02-07 2021-01-22 Dispositif de traitement d'informations, procédé de traitement d'informations et programme WO2021157379A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2020-019647 2020-02-07
JP2020019647 2020-02-07

Publications (1)

Publication Number Publication Date
WO2021157379A1 true WO2021157379A1 (fr) 2021-08-12

Family

ID=77199322

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2021/002173 WO2021157379A1 (fr) 2020-02-07 2021-01-22 Dispositif de traitement d'informations, procédé de traitement d'informations et programme

Country Status (1)

Country Link
WO (1) WO2021157379A1 (fr)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2011180690A (ja) * 2010-02-26 2011-09-15 Brother Industries Ltd Display control device, display control method, and display control program
JP2012185823A (ja) * 2011-03-07 2012-09-27 Ricoh Co Ltd Provision of position information in a collaborative environment
JP2014126965A (ja) * 2012-12-26 2014-07-07 Ricoh Co Ltd Information processing system, server device, and information processing apparatus
JP2016515325A (ja) * 2013-02-20 2016-05-26 Microsoft Technology Licensing, LLC Providing a tele-immersive experience using a mirror metaphor
JP2016170613A (ja) * 2015-03-12 2016-09-23 Konica Minolta, Inc. Conference support device, conference support system, conference support program, and conference support method
WO2019244437A1 (ja) * 2018-06-18 2019-12-26 Sony Corporation Information processing device, information processing method, and program

Similar Documents

Publication Publication Date Title
US11494000B2 (en) Touch free interface for augmented reality systems
US9778815B2 (en) Three dimensional user interface effects on a display
US9417763B2 (en) Three dimensional user interface effects on a display by using properties of motion
CN107810465B (zh) 用于产生绘制表面的系统和方法
US11755122B2 (en) Hand gesture-based emojis
US11030796B2 (en) Interfaces and techniques to retarget 2D screencast videos into 3D tutorials in virtual reality
US20190312917A1 (en) Resource collaboration with co-presence indicators
CN113821124B (zh) 用于触摸检测的imu
WO2021157379A1 (fr) Dispositif de traitement d'informations, procédé de traitement d'informations et programme
WO2023250361A1 (fr) Génération d'interfaces utilisateur affichant des graphiques de réalité augmentée
US11281337B1 (en) Mirror accessory for camera based touch detection
US20240319867A1 (en) Systems, methods and user interfaces for object tracing
CN117616365A (zh) 用于动态选择对象的操作模态的方法和设备

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21750622

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21750622

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP