WO2022034745A1 - Information processing device for superimposing write screen image - Google Patents

Information processing device for superimposing write screen image

Info

Publication number
WO2022034745A1
WO2022034745A1 (PCT/JP2021/023540)
Authority
WO
WIPO (PCT)
Prior art keywords
screen
application
image
display
writing
Prior art date
Application number
PCT/JP2021/023540
Other languages
French (fr)
Japanese (ja)
Inventor
Kentaro Ida
Honoka Ozaki
Original Assignee
Sony Group Corporation
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sony Group Corporation
Priority to JP2022542592A (JPWO2022034745A1)
Publication of WO2022034745A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G5/00Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G09G5/36Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators characterised by the display of a graphic pattern, e.g. using an all-points-addressable [APA] memory
    • G09G5/37Details of the operation on graphic patterns
    • G09G5/377Details of the operation on graphic patterns for mixing or overlaying two or more graphic patterns

Definitions

  • This disclosure relates to information processing devices, information processing methods, and programs.
  • In Patent Document 1, a handwriting image input into the real space with a digital pen is saved, read out, and projected (displayed) into the real space; however, superimposing the handwriting image on an image presented by another application, such as an existing application, is not considered.
  • An information processing device is proposed that includes a control unit that displays and outputs a first screen presented by a first application and a second screen, having a transparent background, presented by a second application that accepts writing input.
  • The control unit performs control to superimpose and display the second screen, including the display of the writing input, on the first screen.
  • Also proposed is an information processing method in which a processor displays and outputs a first screen presented by a first application and a second screen with a transparent background presented by a second application that accepts writing input, and performs control to superimpose and display the second screen, including the display of the writing input, on the first screen.
  • Further proposed is a program that causes a computer to function as a control unit that displays and outputs a first screen presented by a first application and a second screen with a transparent background presented by a second application that accepts writing input, and that performs control to superimpose and display the second screen, including the display of the writing input, on the first screen.
  • FIG. 1 is a diagram illustrating an outline of an information processing system according to an embodiment of the present disclosure.
  • the information processing system according to the present embodiment includes a projector 210, a camera 310, a digital pen 400, and an information processing device 100.
  • the projector 210 is a projection device that projects an image on an arbitrary place in real space.
  • the projector 210 projects an image into the real space included in the projection angle of view.
  • the projected angle of view means a projectable range and is also called a projection area.
  • the projection area 211 is defined by the position of the projector 210, the projection direction, and the angle of the projectable range about the projection direction as the central axis.
  • the image projected by the projector 210 is also referred to as a projected image. It is assumed that the projected image is projected on the entire projection area 211, and drawing information corresponding to the operation of the digital pen 400 by the user is mapped into the projected image. In the example shown in FIG. 1, a handwriting image L showing the movement locus of the digital pen 400 on the drawing surface (corresponding to the projection area 211) is projected as the drawing information.
  • the projector 210 is an example of a display device
  • the projection area is an example of a display area
  • the projected image is an example of a display image.
  • the camera 310 is an imaging device that captures images in real space.
  • the camera 310 has a lens system, a drive system, and an image sensor, and captures images (still images or moving images). The camera 310 may be, for example, an IR (infrared) camera or an RGB (red, green, blue) camera.
  • the camera 310 images the real space included in the image pickup angle of view 311.
  • the image pickup angle of view 311 means an image pickup range, and is defined by the installation position of the camera 310, the image pickup direction, and the angle of the image pickup range centered on the image pickup direction.
  • the image captured by the camera 310 is also referred to as a captured image.
  • the image pickup angle of view 311 of the camera 310 according to the present embodiment may be a range including at least the projection area 211.
  • the digital pen 400 is an input device in which a light emitting unit such as an IR (infrared light) LED (Light Emitting Diode) is mounted on the pen tip.
  • the light emitting unit emits light when, for example, a button or switch provided on the digital pen 400 is operated, the pen tip is pressed against the contact surface, or the pen is swung. Further, the digital pen 400 may transmit detection information, such as a user operation of a button or switch provided on the digital pen 400 or a movement of the digital pen 400, to the information processing apparatus 100.
  • the information processing apparatus 100 detects a light emitting point from the captured image captured by the camera 310 and recognizes the position of the digital pen 400.
  • the imaging angle of view 311 is preset in a range including at least the projection region 211. Alternatively, the projection area 211 may be divided and photographed by a plurality of cameras 310.
  • the information processing apparatus 100 continuously recognizes, that is, tracks the position of the digital pen 400 based on the captured image continuously captured by the camera 310. Further, the information processing apparatus 100 wirelessly connects to the digital pen 400, transmits a light emitting command of the digital pen 400, and receives information indicating that the switch operation is performed by the digital pen 400.
  • when the information processing apparatus 100 receives information from the digital pen 400 indicating that the pen tip is pressed against the contact surface (switch ON state), it performs control to display the movement trajectory of the digital pen 400 as handwriting on the projected image. As a result, the user can draw as if actually writing in the real space with a pen or the like.
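  • As an illustration of the recognition and tracking described above, the following is a minimal sketch (an assumption, not code from the patent) that detects the IR bright spot of the pen tip in each camera frame with OpenCV and accumulates the positions as a movement locus; the threshold value and camera index are placeholders.

```python
# Minimal sketch (assumed, not from the patent): detect the IR bright spot
# of the pen tip in each frame and accumulate its positions as a locus.
import cv2
import numpy as np

def detect_bright_spot(gray: np.ndarray, threshold: int = 200):
    """Return the (x, y) centroid of pixels above the threshold, or None."""
    _, binary = cv2.threshold(gray, threshold, 255, cv2.THRESH_BINARY)
    m = cv2.moments(binary)
    if m["m00"] == 0:          # no bright pixels: the pen is not emitting
        return None
    return (m["m10"] / m["m00"], m["m01"] / m["m00"])

track = []                      # movement locus of the digital pen
cap = cv2.VideoCapture(0)       # assumed index of the IR camera
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    spot = detect_bright_spot(gray)
    if spot is not None:
        track.append(spot)      # continuous recognition, i.e. tracking
cap.release()
```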
  • FIG. 2 is a diagram showing an example of the configuration of the information processing system according to the present embodiment.
  • the information processing system according to the present embodiment includes an information processing device 100, a display device 200, a sensor 300, and a digital pen 400.
  • the display device 200 displays an image that reflects the drawing information input by the digital pen 400.
  • as the display device 200, a projector 210 that projects an image into the real space is used as shown in FIG. 1, but a display may also be used.
  • the display device 200 may be realized by a screen and a rear projector that illuminates the screen from the opposite side (back side).
  • the projector may be, for example, a device having a drive mechanism and capable of projecting in any direction. By having such a mechanism, it is possible to display an image not only in one place but also in various places, and it is possible to realize an interactive function such as projecting in the direction in which the user is. Further, a plurality of display devices 200 may be provided.
  • by displaying an image reflecting drawing information on each display device 200, more users can enjoy drawing at the same time. For example, if three users can draw on the image projected by one projector 210a (that is, input from three digital pens 400 is possible), another projector 210b may be set up so that three more users can draw on the image it projects. By projecting the images from the projector 210a and the projector 210b side by side on a wall or the like in the real space, six users can enjoy drawing at the same time.
  • the display device 200 may include a component capable of output other than the display.
  • the display device 200 may be combined with a sound output device such as a speaker.
  • the speaker may be a unidirectional speaker capable of forming directivity in a single direction.
  • the unidirectional speaker outputs sound in the direction in which the user is, for example.
  • the sound output device may be provided in a place different from the display device 200.
  • the sensor 300 detects a user operation on the projection area.
  • a camera 310 that captures a projection region is used as shown in FIG.
  • the camera 310 detects the light emission of the digital pen 400 held by the user.
  • the light emission of the digital pen 400 is assumed to be the light emission of the IR LED provided on the pen tip as described later.
  • the camera 310 may be configured to capture images through a visible light cut filter (for example, an optical filter that mainly passes light in the IR wavelength band); that is, an IR camera may be used as the camera 310.
  • the number of cameras 310 may be plural, and may further include, for example, an RGB camera.
  • the camera 310 is assumed to be a camera capable of observing the area displayed by the display device 200, and in the present embodiment it is assumed that the camera 310 is arranged near the projector 210 with its optical axis parallel to that of the projector 210.
  • the arrangement position of the projector 210 and the camera 310 is not particularly limited, but the rotation component can be suppressed by making the arrangement positions of the projector 210 and the camera 310 as close and parallel as possible.
  • the rotation component is, for example, a conversion amount generated when performing coordinate conversion from projector coordinates to screen coordinates.
  • a projection transformation matrix between the camera 310 and the projector 210 may be generated in advance, and the projection region may be cut out from the captured image or the projected image may be deformed using the projection transformation matrix.
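  • The coordinate conversion just mentioned can be sketched as follows, assuming a homography computed once from four calibration point pairs and then applied to every detected pen position; the point values and resolution are illustrative.

```python
# Hypothetical sketch of the camera-to-projector coordinate conversion: a
# projection (homography) matrix is computed once from calibration point
# pairs, then applied to each detected pen position.
import cv2
import numpy as np

# Corresponding points observed during calibration (assumed values).
camera_pts = np.array([[100, 80], [540, 90], [530, 420], [110, 410]], np.float32)
projector_pts = np.array([[0, 0], [1280, 0], [1280, 720], [0, 720]], np.float32)

H, _ = cv2.findHomography(camera_pts, projector_pts)

def to_projector_coords(camera_xy):
    """Convert one bright-spot position from camera to display coordinates."""
    src = np.array([[camera_xy]], dtype=np.float32)   # shape (1, 1, 2)
    dst = cv2.perspectiveTransform(src, H)
    return tuple(dst[0, 0])

print(to_projector_coords((320, 250)))
```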
  • the sensor 300 may include other sensors that sense various information in addition to the camera 310.
  • the sensor 300 may sense information such as the position of the user and the height of the user in addition to the user's operation on the projection area.
  • the sensor 300 may further include a depth sensor, a microphone, and the like.
  • the depth sensor is a device that acquires depth information, such as an infrared distance measuring device, an ultrasonic distance measuring device, LiDAR (Light Detection and Ranging), or a stereo camera.
  • the depth sensor may be a ToF (Time Of Flight) camera capable of acquiring a highly accurate distance image.
  • a microphone is a device that collects ambient sound and outputs audio data converted into a digital signal via an amplifier and an ADC (Analog Digital Converter).
  • the microphone may be an array microphone.
  • the sensor 300 may sense the user operation with respect to the projection area by a sensor other than the camera 310. Therefore, for example, the sensor 300 may further include a touch sensor or the like provided in the projection area.
  • the digital pen 400 is an example of an input device used by a user.
  • the digital pen 400 is used when the user draws on the projection area 211.
  • the digital pen 400 is formed by a pen-shaped device as an example.
  • the digital pen 400 includes a communication module 410, a control unit 420, an IR LED 430, and a switch 440.
  • the IR LED 430 is an example of a light emitting unit and emits infrared light.
  • the IR LED 430 lights up under the control of the control unit 420.
  • the emission of such infrared light (the bright spot of the IR LED) is detected by the camera 310, and the position can be recognized by the information processing apparatus 100.
  • the IR LED 430 is provided at the tip (pen tip) of the digital pen 400 formed as a pen-shaped device. As a result, the drawing position when the pen tip touches the drawing surface and drawing is performed can be detected by the camera 310.
  • a pen-shaped device is used here as an example of the input device, but the present disclosure is not limited to this; the input device may be formed in a different shape as long as a bright spot of light emission indicating the drawing position can be detected. For example, it may be something the user sprays with, such as a spray can, or a device that can be worn on the limbs.
  • the switch 440 detects whether or not drawing is being performed with the digital pen 400 (that is, the input state). For example, the digital pen 400 has a mechanism in which, when the pen tip is pressed against a physical object (drawing surface) such as a wall, the pen tip is pushed in and the switch is turned on.
  • the control unit 420 transmits the ON / OFF state of the switch to the information processing apparatus 100.
  • a mechanism in which the switch is turned on by pressing the pen tip against the drawing surface (contact surface) has been described here, but the present disclosure is not limited to this; the switch may be turned on by detecting operation of a button or switch provided on the digital pen 400, by detecting motion of the pen body using a motion sensor, or the like.
  • the communication module 410 is connected to the information processing device 100 by wire or wirelessly to transmit and receive data.
  • the communication module 410 is connected to the information processing device 100 by, for example, a wired / wireless LAN (Local Area Network), Wi-Fi (registered trademark), Bluetooth (registered trademark), ZigBee (registered trademark), short-range wireless communication, or the like.
  • the control unit 420 functions as an arithmetic processing unit and a control device, and controls the overall operation in the digital pen 400 according to various programs. Further, the control unit 420 is realized by an electronic circuit such as a CPU (Central Processing Unit), a microprocessor, or a microcontroller. Further, the control unit 420 may include a ROM (Read Only Memory) for storing programs to be used, calculation parameters, and the like, and a RAM (Random Access Memory) for temporarily storing parameters and the like that change as appropriate.
  • the control unit 420 performs control to transmit the ON / OFF state of the switch 440 and the operation state of other switches and buttons (not shown) from the communication module 410 to the information processing device 100. Further, the control unit 420 controls the lighting of the IR LED 430 according to a lighting control command for the IR LED 430 received from the information processing apparatus 100 via the communication module 410.
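  • As a rough sketch of this pen-side behavior, the following hypothetical message handling shows a switch-state report sent to the information processing device and the handling of a received IR LED lighting command; the JSON format and field names are assumptions, since the patent does not define a wire format.

```python
# Hypothetical pen-side message handling; format and names are assumed.
import json

class IrLed:
    """Stand-in for the pen-tip IR LED driver."""
    def set(self, on: bool) -> None:
        print("IR LED", "on" if on else "off")

def make_switch_report(pen_id: str, switch_on: bool) -> bytes:
    # Sent to the information processing device when the pen tip is
    # pressed against (or released from) the drawing surface.
    return json.dumps({"pen_id": pen_id, "switch_on": switch_on}).encode()

def handle_command(payload: bytes, led: IrLed) -> None:
    cmd = json.loads(payload)
    if cmd.get("type") == "ir_led":
        led.set(on=bool(cmd["on"]))   # light the IR LED per the command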
  • the operation using the digital pen 400 according to the present embodiment is not limited to drawing input; various other operations are also possible, such as erasing a displayed locus image or clicking on a displayed image.
  • the digital pen 400 may be further provided with a color LED.
  • the color LED lights up in any color under the control of the control unit 420.
  • the control unit 420 may control the lighting of the color LED 450 according to a lighting control command for the color LED received from the information processing apparatus 100 via the communication module 410.
  • the color LED is provided, for example, at the tip of the digital pen 400.
  • the color of the color LED may be controlled to be the same as the color of the handwriting of the handwriting image displaying the locus drawn by the digital pen 400.
  • the color of the handwriting is determined and shared by the information processing apparatus 100 or the digital pen 400. By controlling the color of the pen tip and the color of the handwriting in the same way, the entertainment of the drawing experience in the real space is further enhanced.
  • the color LED may be a visible light LED.
  • the information processing apparatus 100 includes an I / F (Interface) unit 110, a control unit 120, an operation input unit 130, a display unit 140, and a storage unit 150.
  • the information processing device 100 is realized by, for example, a smartphone, a tablet terminal, a PC (personal computer), or the like. Further, the information processing apparatus 100 may be arranged in the same space as the projector 210, the camera 310, and the digital pen 400, or may be a server on the Internet.
  • the information processing apparatus 100 may be realized by, for example, an edge server, an intermediate server, a cloud server, or the like.
  • the I / F unit 110 is a connection device for connecting the information processing device 100 and other devices.
  • the I / F unit 110 is realized by, for example, a USB (Universal Serial Bus) connector, a wired / wireless LAN (Local Area Network), Wi-Fi (registered trademark), Bluetooth (registered trademark), ZigBee (registered trademark), or a mobile communication network (LTE (Long Term Evolution), 3G (third-generation mobile communication method), 4G (fourth-generation mobile communication method), 5G (fifth-generation mobile communication method)), or the like.
  • the I / F unit 110 inputs / outputs information to / from the projector 210 included in the display device 200, the camera 310 included in the sensor 300, and the digital pen 400.
  • the control unit 120 functions as an arithmetic processing unit and a control device, and controls the overall operation in the information processing device 100 according to various programs.
  • the control unit 120 is realized by an electronic circuit such as a CPU (Central Processing Unit) or a microprocessor. Further, the control unit 120 may include a ROM (Read Only Memory) for storing programs to be used, calculation parameters, and the like, and a RAM (Random Access Memory) for temporarily storing parameters and the like that change as appropriate.
  • the control unit 120 functions as an application execution unit 121, a display image generation unit 122, and a display control unit 123.
  • the application execution unit 121 starts an application (application program) and executes various command execution processes according to the application program.
  • the storage unit 150 stores data of one or more applications, and the application execution unit 121 calls a specific application from the storage unit 150 and executes it according to an instruction. In the present embodiment, a case where the first application and the second application different from the first application are executed by the application execution unit 121 will be described.
  • the second application is, for example, a writing (drawing input) application that outputs, to the display device 200, the content input (drawn) by the user on a screen with a transparent background. More specifically, the writing application uses the sensor 300 to detect the position of the digital pen 400 (its position on the drawing surface in the real space), which is an example of the input device, and generates a handwriting image displaying the locus drawn on the drawing surface by the digital pen 400. The writing application may also identify the positions of a plurality of digital pens 400 and generate a plurality of handwriting images. Further, the writing application maps the one or more handwriting images and outputs a screen whose background is transparent (a writing screen image).
  • the second application may be software that performs image processing as appropriate based on the writing screen image saved after the writing is completed and the data of the screen (written screen image) presented by the first application.
  • the second application may generate an image (composite process) in which the writing screen image is superimposed on the written screen image after the writing is completed (after the writing screen image is saved).
  • the first application is an application that presents an image (display image) to be displayed on the display device 200. More specifically, the first application may be a general application such as a text creation application, a presentation application, a spreadsheet application, an application for viewing / editing electronic documents generated in a predetermined format, or a photo or video viewing application. Further, the first application may be software capable of performing some processing (image processing) based on the writing screen image, such as acquiring the writing screen image generated and saved by the second application and performing image composition based on it.
  • the application execution unit 121 performs control to superimpose the writing screen image (second screen) presented by the second application on the display image (first screen) presented by the first application. Such control may be performed according to the second application or the first application, or according to an operation input by the administrator who operates the information processing apparatus 100.
  • by superimposing a writing screen image with a transparent background on the display image presented by the first application, the writing (drawing input) state can be presented on the image presented by the first application without implementing a drawing input function for an input device such as the digital pen 400 on the first application side. Specific examples will be described later with reference to FIGS. 4 and 5.
  • the display image generation unit 122 generates a display image (projected image) to be projected by the projector 210.
  • the display image generation unit 122 is in a state in which a writing screen image with a transparent background presented by the second application is superimposed on the display image presented by the first application executed by the application execution unit 121. Generate a display image.
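  • A minimal sketch of this superimposition, assuming both screens are available as same-sized RGBA images (the file names are placeholders): alpha compositing leaves the first application's screen visible everywhere the writing screen is transparent.

```python
# Minimal sketch, assuming both screens exist as same-sized RGBA images;
# the file names are placeholders.
from PIL import Image

first_screen = Image.open("app_screen.png").convert("RGBA")        # first application
writing_screen = Image.open("writing_screen.png").convert("RGBA")  # transparent background

# Alpha compositing: the first screen stays visible wherever the writing
# screen is transparent, so only the handwriting is overlaid.
display_image = Image.alpha_composite(first_screen, writing_screen)
display_image.save("projected_image.png")
```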
  • the display control unit 123 controls the display image generated by the display image generation unit 122 to be output to the projector 210 via the I / F unit 110 and projected by the projector 210.
  • the operation input unit 130 has a function of receiving an operation input from the user.
  • the operation input unit 130 may be realized by an input device such as a keyboard, a mouse, a touch panel, a button, or a switch. It may be assumed that the operation input to the information processing apparatus 100 is performed by an administrator who is a user different from one or more users who write on the drawing surface using the digital pen 400. The administrator may operate the first application and the second application, start / end the projection by the projector 210, and the like.
  • the display unit 140 is a display that displays various operation screens and the like for operating the information processing apparatus 100.
  • the display unit 140 is realized by a display device such as a liquid crystal display (LCD) or an organic EL (Electro Luminescence) display. The display unit 140 may also display the display image projected by the projector 210.
  • the storage unit 150 is realized by a ROM (Read Only Memory) that stores programs and arithmetic parameters used for processing of the control unit 120, and a RAM (Random Access Memory) that temporarily stores parameters and the like that change as appropriate.
  • the storage unit 150 stores various information input from the external device by the I / F unit 110 and various information calculated and generated by the control unit 120.
  • various information generated by each application executed by the application execution unit 121 may be stored in the storage unit 150 and appropriately read out.
  • the configuration of the information processing apparatus 100 is not limited to the example shown in FIG.
  • the information processing device 100 may be composed of a plurality of devices, or at least a part of the information processing device 100 may be provided in the projector 210 or the camera 310. Further, at least a part of the configuration of the information processing apparatus 100 may be provided in the server.
  • the information processing apparatus 100 may have a function of detecting the position of a user who is writing with the digital pen 400 based on the sensing data received from the sensor 300. For example, when an RGB-IR camera in which an RGB camera and an IR camera are combined is used, the information processing apparatus 100 can acquire a visible light image and an IR image at the same time. The control unit 120 of the information processing apparatus 100 can recognize the user's position and the like from the visible light image. Further, the information processing apparatus 100 may recognize the user's position or the like based on the sensing data detected by the depth sensor such as a stereo camera or a ToF (Time Of Flight) camera.
  • the writing application 1210 mainly functions as a light emission recognition unit 1211, a coordinate conversion unit 1212, an operation detection unit 1213, and a writing screen image generation unit 1214.
  • the light emission recognition unit 1211 performs a recognition process of detecting the bright spot of the IR LED as the position of the digital pen 400 (input device) based on the captured image of the projection area 211 (display area) captured by the camera 310 (imaging device).
  • the detected position coordinates of the bright spot (the light emitting position of the IR LED detected in the camera coordinate system) are stored in the storage unit 150. Further, the light emission recognition unit 1211 continuously detects the position of the detected bright spot based on the captured image continuously captured by the camera 310, that is, tracks the position of the digital pen 400.
  • the coordinate conversion unit 1212 converts the position coordinates of the bright spot recognized by the light emission recognition unit 1211 into the display coordinate system of the projector 210. Specifically, the coordinate conversion unit 1212 converts (calculates) the position coordinates using the projection matrix. The converted position coordinates are stored in the storage unit 150. Further, the coordinate conversion unit 1212 continuously converts the position coordinates based on the position coordinates of the bright spot continuously recognized by the light emission recognition unit 1211.
  • the operation detection unit 1213 detects that an operation has been performed on the digital pen 400 based on the information received from the digital pen 400. For example, the operation detection unit 1213 detects that the switch 440 provided on the digital pen 400 is turned on (that is, it is in a drawing state).
  • the writing screen image generation unit 1214 generates a writing screen image in which writing on the drawing surface by the digital pen 400 is reflected. For example, when the operation detection unit 1213 detects that the digital pen 400 is in the drawing state, the writing screen image generation unit 1214 generates an image to which drawing information (a handwriting image) is mapped, based on the bright spot (IR light position) converted by the coordinate conversion unit 1212. As a result, for example, when the user touches the pen tip of the digital pen 400 to the drawing surface and draws a locus, an image to which a handwriting image showing the locus is mapped is generated. The writing screen image generation unit 1214 generates a screen image that reflects the writing on the drawing surface by the digital pen 400 and that has a transparent background.
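  • One way to picture this generation step is the following sketch: a fully transparent RGBA canvas onto which converted pen positions are drawn as strokes while the pen switch is ON. The resolution, stroke color, and sample data are assumptions.

```python
# Hypothetical sketch: a transparent RGBA canvas onto which converted pen
# positions are drawn as strokes while the pen switch is ON.
from PIL import Image, ImageDraw

WIDTH, HEIGHT = 1280, 720
canvas = Image.new("RGBA", (WIDTH, HEIGHT), (0, 0, 0, 0))  # transparent background
draw = ImageDraw.Draw(canvas)
stroke = []                                   # points of the current stroke

def on_pen_sample(point, switch_on, color=(255, 64, 64, 255)):
    """Handle one converted pen position; draw only in the drawing state."""
    if not switch_on:
        stroke.clear()                        # pen lifted: end the stroke
        return
    if stroke:
        draw.line([stroke[-1], point], fill=color, width=4)
    stroke.append(point)

for xy, pressed in [((100, 100), True), ((160, 130), True), ((220, 180), True)]:
    on_pen_sample(xy, pressed)
canvas.save("writing_screen.png")             # keeps the alpha channel
```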
  • the writing application can identify the positions of a plurality of digital pens 400 and generate a handwriting image showing a locus drawn by each of the digital pens 400. That is, it is possible for a plurality of people to write on the drawing surface at the same time.
  • the identification of the position of the digital pen 400 can be realized, for example, by associating the ID received from the digital pen 400 with the bright spot detected from the image captured by the camera 310.
  • the writing application 1210 communicates with the digital pen 400 via the I / F unit 110 of the information processing apparatus 100 to acquire an ID.
  • the writing application 1210 transmits an IR light emission command to a digital pen 400 connected by communication, then regards a bright spot newly detected in the captured image as the position of that digital pen 400 and associates it with the ID of the digital pen 400.
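  • The association step could look like the following sketch, with hypothetical helper names: after commanding one pen to start emitting, a bright spot that newly appears in the captured image is taken as that pen's position and mapped to its ID.

```python
# Sketch with hypothetical helper names: after commanding one pen to start
# emitting, a newly appearing bright spot is taken as that pen's position.
import time

def associate_pen_id(pen, detect_spots, known_spots, timeout=1.0):
    """pen: object with pen_id and send_ir_emit_command() (assumed API).
    detect_spots(): set of bright-spot positions currently in the image."""
    pen.send_ir_emit_command()
    deadline = time.time() + timeout
    while time.time() < deadline:
        new_spots = detect_spots() - known_spots
        if new_spots:
            return {pen.pen_id: new_spots.pop()}   # ID -> position mapping
        time.sleep(0.01)
    return {}                                      # pen not seen in time
```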
  • FIG. 4 is a diagram showing an example of a projected image in which the writing screen image according to the present embodiment is superimposed and displayed.
  • a projected image 50 is projected that includes a handwriting image 511, showing a locus drawn by a user on the drawing surface (projection area 211) in real space using the digital pen 400, and bubble images 501 to 503 representing bubbles.
  • the bubble images 501 to 503 are examples of depictions that clearly indicate the places (areas) where the user should write on the drawing surface; their color, shape, size, number, and method of expression are not particularly limited as long as the display can distinguish the areas. For example, an announcement such as "Please write inside a bubble" may be made to the user in advance. Further, when the writing is completed, an animation in which the written-on bubble images 501 to 503 float randomly on the screen may be displayed in the projection area 211 (details will be described later with reference to FIG. 8).
  • the presentation of the handwriting image 511 by the digital pen 400 can be performed by the writing application which is an example of the second application.
  • the presentation of the bubble images 501 to 503 and the subsequent presentation of the animation of the bubble images 501 to 503 can be performed by the written application which is an example of the first application.
  • the "written application” is an application on which the writing screen image presented by the writing application is superimposed and displayed. Although writing is not actually performed, in the present specification, it appears as if writing is being performed when the background presented by the writing application is located on the back side of the transparent writing screen. Called "written application".
  • the written application may be a general application or an application having a function of cooperating with the writing application.
  • the projected image 50 projected on the projection area 211 has a configuration in which two images are superimposed, as shown in FIG.
  • FIG. 5 is a diagram illustrating superimposition of two images according to the present embodiment.
  • the first image 500 shown in FIG. 5 is the bubble-image screen presented by the written application,
  • and the second image 510 is the writing screen presented by the writing application. Since the background of the second image 510 (writing screen) is transparent, when the second image 510 is superimposed on the first image 500, it can be shown as if writing were being performed on the bubble-image screen, as in FIG. 4.
  • both the first image 500 and the second image 510 are displayed in full screen, but the present embodiment is not limited to this.
  • at least a part of the area of the second image may be superimposed on at least a part of the area of the first image.
  • a plurality of second images may be superimposed on the area of the first image.
  • the plurality of second images in this case may be at least partially overlapped with each other, may be separated from each other, or may be arranged adjacent to each other.
  • FIG. 6 is a flowchart showing an example of the overall flow of the first operation process according to the present embodiment. The first operation process covers the case where there is no cooperation between applications (when the interprocess communication or the like described later is not performed).
  • the application execution unit 121 starts the first application (written application) (step S103).
  • the application execution unit 121 starts a second application (writing application) (step S106).
  • the starting order of the first application and the second application is not particularly limited, and they can each be started at any timing. Specifically, each may be started in response to an operation input by the administrator (for example, double-clicking its icon displayed on the display unit 140). Further, the second application (writing application) may be started by the application execution unit 121 in response to a specific event occurring on the digital pen 400, such as operation of a button provided on the digital pen 400. In this case, the application execution unit 121 may be a function of a service program started separately from the first and second applications, and the service program monitors the information received from the digital pen 400 via the I / F unit 110. In this way, when there is no cooperation between applications, each application (the first application and the second application) is started based on a manual operation by a user or an administrator.
  • the display image generation unit 122 generates a display image in which the screen presented by the second application is superimposed on the screen presented by the first application, and the display control unit 123 outputs the display image from the I / F unit 110 to the display device 200 and controls its display (step S109).
  • the display image generation unit 122 and the display control unit 123 may be functions of the service program.
  • a writing process using the digital pen 400 is performed (step S112).
  • the writing application generates a handwriting image showing a locus drawn on a drawing surface (projection area 211) in real space by a digital pen 400, maps it to a writing screen image, and projects it on the projection area 211.
  • the writing screen image has a transparent background and is displayed superimposed on the screen presented by the written application. Therefore, the user effectively writes on a transparent screen (drawing surface) superimposed on the screen presented by the written application, the writing is reflected on that transparent screen, and the writing state can be visually recognized on the screen presented by the written application.
  • the writing application may notify the user that writing with the digital pen 400 is possible (that is, that the writing application is running) by displaying an icon indicating that pen input is possible on the screen it presents, or by playing voice guidance or music from a speaker (not shown).
  • the writing application performs a process of saving the written image (writing screen image) in the storage unit 150 (step S115).
  • the operation of ending the writing may be performed by the user who is writing or the administrator who is operating the information processing apparatus 100.
  • the administrator may end the writing by operating a specific key on the keyboard (an example of the operation input unit 130), or the user may end it by operating an end icon displayed on the projected screen with the digital pen 400. Alternatively, the writing may be ended automatically by a timer, or when a predetermined end event occurs on the digital pen 400, such as operation of a button provided on the digital pen 400. The writing may also be ended when the service program detects that the written application, which presents the screen displayed behind the screen presented by the writing application, has terminated, and notifies the writing application of the termination.
  • the writing screen image can be saved as independent data, separate from the image on the written side (the written image, on which writing is not actually performed but appears to be overlaid). That is, the writing screen image is managed by the second application, separate from the first application that handles the written image. Specifically, for example, the second image 510 shown in FIG. 5 is saved as the writing screen image. At this time, the writing application may add writing date / time information and information about the writing application to the saved writing screen image. Further, at this point, no change (writing) has been made to the data of the written image (the display screen presented by the written application, for example, the first image 500 shown in FIG. 5).
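  • Saving the writing screen image as independent data, tagged with the writing date/time and application information as described above, might look like this sketch (the metadata keys are illustrative):

```python
# Minimal sketch: save the transparent writing screen image as independent
# PNG data tagged with the writing date/time; metadata keys are assumed.
from datetime import datetime
from PIL import Image, PngImagePlugin

def save_writing_screen(canvas: Image.Image, path: str) -> None:
    meta = PngImagePlugin.PngInfo()
    meta.add_text("written_at", datetime.now().isoformat())
    meta.add_text("generator", "writing application")
    canvas.save(path, pnginfo=meta)   # PNG preserves the transparent background
```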
  • the written application may have a function of performing various processes, such as composition, based on the writing screen image saved by the writing application.
  • the written application acquires the saved writing screen image through an operation by the administrator or the like, and performs a process of compositing the writing screen image with the written image (step S118).
  • the writing screen image is an image with a transparent background, so the written application can generate a composited image by superimposing the writing screen image on the written image.
  • the control unit 120 may execute such a synthesis process by another application.
  • any application, even one that does not implement a function for linking with the writing application, can be used as the written application.
  • the user or the administrator can easily perform pen input to any application simply by starting the application for which pen input is desired (the first application) and the pen input application (the second application).
  • FIG. 7 is a sequence diagram showing an example of the operation processing of the message display application and the writing application according to the present embodiment.
  • the message display application is started by a start operation by an administrator or the like (for example, clicking the icon of the message display application arranged on the screen displayed on the display unit 140) (step S123).
  • when the message display application is started, a menu screen or the like presented by the message display application is displayed on the display unit 140.
  • the administrator or the like operates the operation input unit 130 to cause the projector 210 to project the screen presented by the message display application.
  • the screen presented by the message display application may be automatically projected by the projector 210 connected to the information processing apparatus 100. At this time, the screen displayed on the display unit 140 may be projected, or a part of the screen displayed on the display unit 140 may be projected.
  • when the writing function that cooperates with the writing application is turned on (an execution instruction is given) by the administrator or the like (step S129), the message display application controls the start of the writing application (executes a process to start it) (step S132).
  • the light emission recognition unit 1211 included in the functional configuration of the writing application detects the bright spot due to the IR light emission of the digital pen 400 based on the captured image of the projection area 211 captured by the camera 310, and recognizes it as the position of the digital pen 400. (Step S138). Then, the light emission recognition unit 1211 starts tracking (a process of continuously recognizing the bright spot) of the recognized bright spot (position of the digital pen 400).
  • the coordinate conversion unit 1212 included in the functional configuration of the writing application performs coordinate conversion for converting the bright spot position (camera coordinate system) detected from the captured image into the display coordinate system of the projector 210 (step S141).
  • when the writing screen image generation unit 1214 included in the functional configuration of the writing application acquires, from the digital pen 400, information indicating that the switch 440 has been turned on by pressing the digital pen 400 against the drawing surface, it maps a handwriting image showing the trajectory drawn on the drawing surface by the digital pen 400 (the movement trajectory of the recognized bright spot) and generates a writing screen image with a transparent background (step S144).
  • the writing application controls to display (project) the generated writing screen image from the projector 210 (step S147). More specifically, as a function of the display control unit 123, display (projection) control of the writing screen image can be performed.
  • the message display application performs control to display (project) the written image (step S150).
  • the display of the written image may be performed in conjunction with the writing function being turned ON as shown in step S129. More specifically, display (projection) control of the written image is performed as a function of the display control unit 123.
  • the written image is an image including a display that clearly indicates the writing area, generated on the assumption of on-screen writing with the digital pen 400 as realized by the writing application (for example, the first image 500 of FIG. 5).
  • on the written image projected on the projection area 211, the user writes with the digital pen 400 in the area designated as the writing area (for example, within the bubble images 501 to 503 shown in FIG. 4). Writing with the digital pen 400 is detected by the function of the writing application and reflected in the projection area 211; specifically, the writing screen image is superimposed and displayed on the written image.
  • the message display application turns on the writing timer when a writing time is set (step S153).
  • the write timer may be counted after displaying the image to be written, for example.
  • the message display application may display a countdown of the remaining time on the written image.
  • for example, a countdown display (the same countdown shown in several places) may be performed near each of the bubble images 501 to 503 shown in FIG. 4.
  • the display positions of the bubble images 501 to 503 do not change, and an animation may be added such that the frame portion expressing the bubbles and the shadow of the bubbles are slightly swayed.
  • when the countdown ends, the message display application performs control to end the writing (step S159). Specifically, the message display application notifies the writing application of the end of writing, for example, by interprocess communication.
  • here, writing on the written image is ended by a countdown, but the present embodiment is not limited to this; for example, the administrator may end the writing at any timing.
  • when the writing application receives the notification (for example, by interprocess communication) from the message display application, the writing application saves the writing screen image superimposed on the written image in the storage unit 150 (step S162).
  • the data format used when saving the writing screen image may be set arbitrarily. Further, the saved writing screen image is assumed to be one screen onto which a plurality of handwriting images are mapped, for example, like the second image 510 shown in FIG. 5.
  • the operation of the writing application ends.
  • the end operation may be an end operation by the administrator, or may be a notification (for example, by interprocess communication) from the message display application. Further, after the writing screen image is saved, the writing application may end automatically if it is not set to continue with the next writing. When set to continue with the next writing, the writing application saves the writing screen image, displays a new unwritten writing screen, and repeats the above steps S138 to S162.
  • the message display application acquires the writing screen image saved by the writing application from the storage unit 150 (step S168) and generates a composite image (step S171).
  • here, a case is described in which the application that started the writing application as shown in step S132 has a function of generating a composite image with the written image based on the writing screen image.
  • the message display application performs control to display the generated composite image (group message image) (step S174).
  • the composite image may be displayed with an animation in which it floats around the bubble images 501 to 503 newly displayed for the next writing, and the above steps S150 to S174 may be repeated.
  • the composite image floating in the background may be displayed for a certain period of time and deleted after a certain period of time.
  • by presenting a display in which composite images float around one after another each time writing is performed, a group message written by a large number of people can be realized. Since the writing screen image is saved separately, even if a composite image is deleted without being saved, the message display application can read the writing screen image at any time to regenerate and display the composite image.
  • FIG. 8 is a diagram illustrating a case where a composite image with a written image is generated based on a writing screen image.
  • the message display application performs a process of separating the writing screen image using the layout information of the written image displayed on the background side (the first image 500, as an example).
  • the layout information includes, for example, information on the position and contour of the bubble image that clearly indicates the range of the writing area included in the first image 500.
  • the written application separates at least a portion of the writing screen image, for example along the contour of a bubble image indicating the extent of the writing area contained in the first image 500. More specifically, as shown in FIG. 8, separated images 510a to 510c, each separated along the contour of a bubble image, are extracted from the second image 510.
  • the message display application generates composite images 520a to 520c in which separated images are combined with each bubble image cut out from the first image 500.
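  • The separation and composition steps can be sketched as follows, assuming the layout information describes each bubble as a circle (center and radius); the writing screen is masked by the bubble contour and then alpha-composited onto the bubble image cut out of the first screen.

```python
# Sketch assuming the layout information gives each bubble as a circle
# (center, radius); all images share one canvas size here.
from PIL import Image, ImageDraw

def separate_by_bubble(writing_screen, center, radius):
    """Keep only the writing inside one bubble; everything else transparent."""
    mask = Image.new("L", writing_screen.size, 0)
    ImageDraw.Draw(mask).ellipse(
        [center[0] - radius, center[1] - radius,
         center[0] + radius, center[1] + radius], fill=255)
    separated = Image.new("RGBA", writing_screen.size, (0, 0, 0, 0))
    separated.paste(writing_screen, (0, 0), mask)
    return separated

def composite_bubble(bubble_image, separated):
    """bubble_image: the bubble cut out of the first image, same canvas size."""
    return Image.alpha_composite(bubble_image.convert("RGBA"), separated)
```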
  • the message display application controls to display the composite images 520a to 520c on the projected image (first image 500b) projected by the projector 210.
  • the message display application may present an animation in which composite images 520a to 520c are randomly floated on the screen.
  • the message display application can appropriately adjust the size, lightness, hue, saturation, transparency, etc. of the composite images 520a to 520c.
  • bubble images 501 to 503, in which the next writing is to be performed, are also displayed.
  • the composite images 520a to 520c may be displayed, for example, around the bubble images 501 to 503 or as a background image.
  • when the end operation is performed by the administrator or the like (step S177 / Yes), the operation of the written application ends.
  • in the above example, the generated composite image is presented as a group message, but the present disclosure is not limited to this.
  • the written application may be any application that can enhance the entertainment value of pen input in cooperation with the writing application.
  • FIG. 9 is a diagram showing an example of a notification of a pen input state and an end icon according to the present embodiment.
  • the projected image 50 may display an icon 540 indicating that writing by the digital pen 400 is possible, and an end icon 542 for ending pen input.
  • when the end icon 542 is operated, the writing application receives the instruction to end writing and performs the writing screen image saving process shown in step S162. The message display application may also acquire the input of the writing end instruction and perform the process of step S159 above, or the processes of step S168 and subsequent steps.
  • the writing application may notify the user that writing with the digital pen 400 is possible by displaying an image 541 showing a person's silhouette, a character, CG, a person in a live-action image, or the like in the act of writing.
  • the image 541 may be a still image or a moving image. Further, the writing application may notify by text display that writing with the digital pen 400 is possible.
  • the writing application may notify that writing by the digital pen 400 is possible by playing voice guidance or music from a speaker (not shown). Further, the writing application may notify that writing by the digital pen 400 is possible by lighting a color LED provided on the digital pen 400.
  • examples of general applications include text creation applications, presentation applications, spreadsheet applications, applications for viewing / editing electronic documents generated in a predetermined format, and photo and video viewing applications.
  • FIG. 10 is a flowchart showing an example of operation processing of a general application and a writing application according to the present embodiment.
  • the application execution unit 121 of the information processing apparatus 100 activates a general application (written application) (step S203).
  • the general application may be started by the operation input by the administrator.
  • the administrator may be the same person as the user who writes using the digital pen 400.
  • when the writing function set in the general application is turned on (step S206 / Yes), the writing application is started (step S209).
  • the writing function is turned on by the administrator, for example, from the menu screen of the general application. The writing application may be started from the general application by interprocess communication or the like.
  • the control unit 120 controls to superimpose the screen of the writing application on the screen of the general application and display (project) it on the projection area 211 or the like (step S212).
  • the screen of the writing application is a screen with a transparent background. Although details are omitted, as in each operation process described above, the writing application presents a screen with a transparent background onto which a handwriting image showing the trajectory drawn with the digital pen 400 on the drawing surface in the real space (the projection area 211 or the like) is mapped. The user can use the digital pen 400 to write over the screen of the general application projected on the projection area 211 or the like.
  • the general application determines whether or not to save the contents written by the linked writing application (step S218). Whether or not to save the written contents may be set in advance by the administrator or the user, or a pop-up display or the like may be displayed to confirm with the administrator or the user before terminating the general application.
  • when the written contents are to be saved (step S218 / Yes), the general application requests the writing application to save the writing screen image by interprocess communication or the like.
  • the writing application saves the writing screen image in the storage unit 150 and ends the operation.
  • the operation of the general application is also terminated (step S224).
  • when the written contents are not to be saved (step S218 / No), the general application ends its operation without making a save request to the writing application (step S221).
  • the general application may send the operation end notification to the writing application by interprocess communication or the like.
  • upon receiving the notification, the writing application saves the writing screen image and then ends (step S230).
  • the saving of the writing screen image in this case may be temporary. For example, even if the writing application is terminated first, it is determined whether or not to save the written contents when the general application is terminated. When the written contents are not to be saved, the general application may request the writing application, by interprocess communication or the like, to discard the temporarily saved written contents; when they are to be saved, no discard request is made.
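  • The save/discard notifications between the two applications might be sketched as below, using a local connection as a stand-in for the interprocess communication the text mentions; the address, authkey, and message names are assumptions.

```python
# Minimal IPC sketch; address, authkey, and message names are assumed.
from multiprocessing.connection import Listener, Client

ADDRESS = ("localhost", 6000)
AUTHKEY = b"writing-demo"

def request_save():
    """General (written) application side: ask to save the writing screen."""
    with Client(ADDRESS, authkey=AUTHKEY) as conn:
        conn.send({"cmd": "save_writing_screen"})

def serve_once():
    """Writing application side: act on one save/discard request."""
    with Listener(ADDRESS, authkey=AUTHKEY) as listener:
        with listener.accept() as conn:
            msg = conn.recv()
            if msg["cmd"] == "save_writing_screen":
                print("saving writing screen image")      # save to storage here
            elif msg["cmd"] == "discard_temporary":
                print("discarding temporarily saved content")
```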
  • a pen input application (writing application) can be started by an operation from the general application.
  • the user or the administrator does not need to load the image data or the like into the pen input application in advance, and the troublesome operation is reduced.
  • writing is performed on the superimposed screen, so that the convenience is further improved.
  • since the written content is saved separately from the data on the written side, it can be read at any time to check the writing state.
  • further, the written content can easily be hidden by a screen-switching operation that brings the written application to the foreground.
  • In the above description, the notification from the general application to the writing application is performed by inter-process communication or the like (inter-application communication), but the present disclosure is not limited to this.
  • For example, the notification may be given via a service program separately started by the application execution unit 121.
  • The eraser function according to the present embodiment includes a function of erasing only the trajectory drawn on the writing application side, and a function of erasing the information displayed on the side of the application to be written together with the trajectory.
  • FIG. 11 is a diagram illustrating the two types of eraser functions according to the present embodiment. The eraser function can be executed using, for example, the digital pen 400.
  • When the handwriting written over the bubble image is traced with the digital pen 400 with the eraser function turned on, as shown in screen example 550 of FIG. 11, only the handwriting is erased, as shown in screen example 551.
  • Alternatively, the handwriting and the background bubble image may both be erased.
  • Since the background bubble image is presented by the application to be written while the handwriting is presented by the writing application, only the handwriting can be erased by, for example, reflecting the eraser input only in the writing application.
  • Specifically, the writing application may perform an operation process of erasing the handwriting displayed at the recognized position of the digital pen 400.
  • To erase the background as well, the writing application may notify the application to be written of the eraser input information (eraser input coordinates, etc.) by inter-process communication, or a method of painting over the target portion with the background color on the writing application side can be considered.
  • FIG. 12 is a sequence diagram showing the operation process when the eraser function is linked according to the present embodiment. In FIG. 12, it is assumed that the eraser linkage function has been added to the application to be written.
  • When the writing application detects an eraser input by the digital pen 400 that also targets the background (step S303), the writing application notifies the application to be written of the eraser input coordinates by inter-process communication or the like (step S306).
  • The writing application side also performs the process of erasing the handwriting corresponding to the eraser input coordinates.
  • The application to be written receives the eraser input coordinates from the writing application and performs a process of erasing the image (for example, a bubble image) corresponding to the eraser input coordinates (step S309).
  • That is, the application to be written has an added function for receiving the eraser input coordinates from the writing application and performing the corresponding processing (erasing processing), as in the sketch below.
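A minimal Python sketch of this linkage follows, assuming the two applications exchange messages over a local IPC channel; the message format and the circular bubble model are illustrative assumptions.

    # Application-to-be-written side of the eraser linkage in FIG. 12:
    # receive eraser input coordinates (step S306) and erase any bubble
    # image they hit (step S309). Message names are hypothetical.
    from multiprocessing.connection import Listener

    def run_written_app_listener(bubbles):
        # bubbles: list of dicts like {"x": 100.0, "y": 200.0, "r": 40.0}
        with Listener(("localhost", 6001), authkey=b"eraser") as server:
            with server.accept() as conn:
                while True:
                    msg = conn.recv()
                    if msg["type"] == "eraser_input":
                        ex, ey = msg["x"], msg["y"]
                        # Keep only bubbles the eraser point does not hit.
                        bubbles[:] = [
                            b for b in bubbles
                            if (b["x"] - ex) ** 2 + (b["y"] - ey) ** 2
                            > b["r"] ** 2
                        ]
                    elif msg["type"] == "quit":
                        break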
  • FIG. 13 is a flowchart showing the operation process when the eraser function is not linked according to the present embodiment. As shown in FIG. 13, the application to be written, which has no linkage function, is activated according to an operation by the administrator or the like and displays an image (step S313).
  • The writing application is also started according to an operation by the administrator or the like, and the writing screen image with the transparent background is superimposed on the image of the application to be written and displayed.
  • Next, the writing application detects an eraser input by the digital pen 400 that also targets the background (step S316).
  • The writing application then performs a process of filling the handwriting corresponding to the eraser input coordinates with the background color (step S319).
  • Here, the background color is the background color of the screen presented by the application to be written; in this example, white is the background color. The state of screen example 552 can be realized by painting the portion traced by the digital pen 400 with white, the background color of the back-side screen (the screen presented by the application to be written), making it look as if both the handwriting and the background have disappeared, as sketched below.
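The filling process can be sketched as follows, assuming the writing screen image is an RGBA image handled with Pillow; the stroke radius and the white background color are illustrative assumptions.

    # Non-linked eraser (FIG. 13): paint the traced portion of the
    # transparent overlay with an opaque background color (white here), so
    # the handwriting and the back-side screen both appear to vanish.
    from PIL import Image, ImageDraw

    def fill_with_background(overlay: Image.Image, trace_points,
                             radius: int = 12,
                             background=(255, 255, 255, 255)) -> Image.Image:
        draw = ImageDraw.Draw(overlay)
        for x, y in trace_points:
            # Opaque disks along the eraser trajectory hide what is below.
            draw.ellipse((x - radius, y - radius, x + radius, y + radius),
                         fill=background)
        return overlay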
  • When an end operation by the administrator or the like is performed in the application to be written (step S322), the application to be written saves the latest data in the storage unit 150 (by overwriting, etc.) and ends its operation (step S325). If there is no update from the already saved data, the saving process may be skipped.
  • Next, the writing application acquires the saved data of the application to be written (step S328) and performs a process of compositing the writing screen image, including the background-color filling (eraser processing), with the acquired data (step S331).
  • The compositing process saves the acquired data in a state in which the transparent-background writing screen image, including the eraser processing, is superimposed on it (the result may be saved in an image format).
  • The composited data is stored in the storage unit 150. Alternatively, only the written contents (the writing screen image) including the eraser processing may be stored in the storage unit 150; by superimposing it on the image of the application to be written at any time, the state in which the writing, including the eraser processing, was performed can be viewed.
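As one possible realization of steps S328 to S331, the compositing can be sketched with Pillow as follows; the file paths and the assumption that the saved data is available as an image are illustrative.

    # Composite the transparent-background writing screen image (including
    # the eraser processing) over the saved data of the application to be
    # written, and store the result in an image format (step S331).
    from PIL import Image

    def composite_and_save(saved_page_path: str, overlay_path: str,
                           out_path: str) -> None:
        page = Image.open(saved_page_path).convert("RGBA")
        overlay = Image.open(overlay_path).convert("RGBA")
        overlay = overlay.resize(page.size)        # align the two screens
        Image.alpha_composite(page, overlay).save(out_path)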
  • Note that the writing application may monitor the display screen (projected image) to detect a page feed (a change in the back screen).
  • In that case, the writing application may save the writing screen image (page information may be added) when a page feed is performed, and display a new writing screen image on the display screen.
  • Alternatively, if the writing application side or another application is provided with a function for reflecting the information of the writing screen image, including the eraser processing, in the data format of the application to be written, an editing process that reflects that information in the data format of the application to be written may be performed as the compositing process.
  • The function of erasing only the handwriting and the function of erasing the handwriting together with the background may be switched appropriately as needed.
  • For example, the function may be selected by the user using a GUI (graphical user interface).
  • FIG. 14 is a diagram showing an example of a GUI for selecting the eraser function according to the present embodiment.
  • In the illustrated example, an icon 561 for selecting the function of erasing the handwriting and the background, and an icon 562 for selecting the function of erasing only the handwriting, are displayed.
  • Each icon makes it possible to select the eraser function intuitively.
  • Each icon may be selected using the digital pen 400 or by touching the icon by hand.
  • The touch operation can be detected by a touch sensor, a depth sensor, or the like.
  • FIG. 15 is a diagram illustrating selection of the eraser function by switching drawing tools according to the present embodiment.
  • Screen example 570 of FIG. 15 shows an example in which the function of erasing only the handwriting is executed when the user rubs with a finger, and screen example 572 shows the function of erasing the handwriting and the background when the user erases with the digital pen 400.
  • In this way, when a hand or finger is used, only the handwriting may be erased, and when the digital pen 400 is used, the background may also be erased.
  • As a method of recognizing a finger, finger recognition using a depth sensor or a method using a distance measuring sensor such as LiDAR can be considered.
  • Alternatively, a touch sensor may be provided on the drawing surface.
  • The eraser function may also be switched according to the drawing tool used; for example, the function of erasing only the handwriting may be executed in one case, and the function of erasing the background may be executed when a mouse operation is performed.
  • Further, the eraser function may be switched depending on the operation method, as in the sketch below. For example, when a specific place is traced once within a unit time, the function of erasing only the handwriting may be executed, and when the tracing is repeated a plurality of times within a certain time, the function of erasing the background may be executed. More specifically, when the same place is traced once within one second, the function of erasing only the handwriting may be executed, and when the tracing is repeated a plurality of times, the function of erasing the background may be executed.
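A minimal Python sketch of this operation-based switching follows; the one-second window, the grid used to decide "the same place", and the mode names are illustrative assumptions.

    # Decide the eraser mode from how often the same spot is traced within
    # a time window: one pass erases the handwriting only, repeated passes
    # also erase the background.
    import time

    class EraserModeSelector:
        def __init__(self, window_s: float = 1.0, cell: int = 40):
            self.window_s = window_s
            self.cell = cell      # grid size approximating "the same place"
            self.hits = {}        # grid cell -> recent input timestamps

        def on_eraser_input(self, x: float, y: float) -> str:
            key = (int(x) // self.cell, int(y) // self.cell)
            now = time.monotonic()
            recent = [t for t in self.hits.get(key, [])
                      if now - t < self.window_s]
            recent.append(now)
            self.hits[key] = recent
            return "handwriting_only" if len(recent) == 1 else "background_too"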
  • FIG. 16 is a diagram illustrating adjustment of the display positions of the bubble images 581 to 583 according to the positions of the users' heads according to the present embodiment.
  • As shown in FIG. 16, the bubble images 581 to 583 are displayed in the projected image 52.
  • The bubble images 581 to 583 are examples of images presented by the application to be written that clearly indicate the places (areas) where the user should write.
  • For example, the application to be written performs control to divide the screen vertically and display one bubble image in each divided area, in consideration of writing by a plurality of people. Furthermore, the position and height of the head of each writing user are detected, and each bubble image is controlled to be displayed at a height within reach (see the sketch after this list).
  • In the example shown in FIG. 16, the bubble images 581 to 583 are displayed at appropriate heights according to the positions and head heights of the users U1 to U3.
  • To detect the head position, for example, an RGB camera, a depth sensor, a line sensor for measuring height, or the like may be used. The height may also be calculated by measuring the distance from overhead with an ultrasonic sensor.
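The placement logic can be sketched as follows; the vertical division, the reach offset below the detected head position, and pixel coordinates with a top-left origin are illustrative assumptions.

    # Divide the screen into vertical strips, one bubble per strip, and
    # place each bubble at a height the corresponding user can reach.
    def layout_bubbles(screen_w: int, screen_h: int, head_positions,
                       reach_offset: int = 150):
        """head_positions: list of (head_x_px, head_y_px), one per user."""
        n = len(head_positions)
        strip_w = screen_w / n
        bubbles = []
        for i, (_, head_y) in enumerate(sorted(head_positions)):
            cx = strip_w * i + strip_w / 2                  # strip center
            cy = min(head_y + reach_offset, screen_h - 1)   # within reach
            bubbles.append((cx, cy))
        return bubbles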
  • The writable area on the screen presented by the writing application may also be changed in conjunction with UI elements that have meaning on the screen presented by the application to be written.
  • That is, when the screen of the application to be written changes dynamically, the writable area of the writing application may also change dynamically.
  • Here, the writable area is the area in which the handwriting image written using the digital pen 400 is displayed. For example, if the writable area is linked in real time to the contour of a bubble image presented by the application to be written, the writing application may be made unable to write outside the contour of the bubble image (the handwriting image is not displayed there).
  • Further, at the time of compositing, the second screen may be matched to the contour of the bubble image.
  • In that case, a handwriting image protruding outside the contour of the bubble image may be excluded from the compositing, as in the sketch below.
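Restricting the writable area to a contour can be sketched with a mask image; Pillow and the mask itself (255 inside the bubble contour, 0 outside) are illustrative assumptions.

    # Keep only the handwriting pixels that fall inside the writable area;
    # anything outside the mask is neither displayed nor composited.
    from PIL import Image

    def clip_to_writable_area(handwriting: Image.Image,
                              mask: Image.Image) -> Image.Image:
        clipped = Image.new("RGBA", handwriting.size, (0, 0, 0, 0))
        clipped.paste(handwriting, (0, 0), mask)  # copy inside the mask only
        return clipped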
  • The display in which the writing screen image is superimposed on the image to be written is not limited to display by projection; a display device other than the projector 210 may be used, for example, a TV device, a display, a tablet terminal, a smartphone, a mobile phone, a PC (personal computer), a touch panel display, an HMD (head mounted display), a wearable device, or the like.
  • The present technology can also have the following configurations.
  • (1) An information processing device including a control unit that displays and outputs a first screen presented by a first application and a second screen with a transparent background presented by a second application that accepts write input, wherein the control unit performs control to superimpose and display the second screen on the first screen, and performs image processing using the second screen, including the display of the write input, and the first screen.
  • (2) The information processing device according to (1) above, wherein the control unit performs a process of generating a composite image in which the second screen is combined with the first screen.
  • The information processing device, wherein the control unit performs a process of generating a composite image in which an image obtained by separating at least a part from the second screen according to layout information extracted from the first screen is combined with a partial image of the first screen corresponding to the layout information.
  • The information processing device, wherein the layout information includes information on the contour of an image that clearly indicates a writing area included in the first screen.
  • The information processing device according to any one of (1) to (4) above, wherein the second application recognizes the position of an input device, accepts the trajectory drawn on the drawing surface by the input device as the write input, and displays a handwriting image showing the trajectory on the second screen.
  • (6) The information processing device according to (5) above, wherein the control unit performs control to display, on the drawing surface, a display screen in which the second screen is superimposed on the first screen.
  • The information processing device according to (11) above, wherein the second application, when receiving an input for erasing the display of the write input together with the display of the corresponding first screen, performs a process of filling the target portion with the background color of the first screen.
  • The information processing device according to (11) above, wherein the second application, when receiving an input for erasing the display of the write input together with the display of the corresponding first screen, notifies the first application of coordinate information indicating the target portion of the erasing input.
  • An information processing method in which a processor displays and outputs a first screen presented by a first application and a second screen with a transparent background presented by a second application that accepts write input, and performs control to superimpose and display the second screen on the first screen.
  • (15) A program causing a computer to function as a control unit that displays and outputs a first screen presented by a first application and a second screen with a transparent background presented by a second application that accepts write input, wherein the control unit performs control to superimpose and display the second screen on the first screen.
  • 100 Information processing device, 110 I/F unit, 120 Control unit, 121 Application execution unit, 122 Display image generation unit, 123 Display control unit, 130 Operation input unit, 140 Display unit, 150 Storage unit, 200 Display device, 210 Projector, 300 Sensor, 310 Camera, 400 Digital pen, 410 Communication module, 420 Control unit, 430 IR LED, 440 Switch

Abstract

[Problem] To provide an information processing device, an information processing method, and a program which can increase the convenience of user input by displaying information about the user input in a superimposed manner on screens presented by different applications. [Solution] This information processing device includes a control unit which displays and outputs a first screen presented by a first application, and a second screen with a transparent background presented by a second application that accepts write input. The control unit performs control to display the second screen on the first screen in a superimposed manner, and performs image processing that uses the first screen and the second screen including the display of the write input.

Description

[Name of the invention determined by the ISA based on Rule 37.2] Information processing device that superimposes a writing screen image
This disclosure relates to information processing devices, information processing methods, and programs.
In recent years, drawing systems have appeared in which, when a user moves a drawing tool over a drawing surface in real space, information such as a line indicating the trajectory of the drawing tool (hereinafter also referred to as drawing information) is projected onto the drawing surface by a projection device such as a projector. For example, Patent Document 1 below discloses a technique that detects infrared light emitted from a digital pen, a drawing tool used in such a drawing system, recognizes the handwriting drawn by the digital pen, and projects a handwriting image onto the object (drawing surface) that the pen tip of the digital pen is touching.
Patent Document 1: International Publication No. 2019/111465
In the above Patent Document 1, the handwriting image input into real space with the digital pen is saved, read out, and projected (displayed) into real space; however, displaying it superimposed on a display image developed by another application, such as an existing application, is not considered.
According to the present disclosure, an information processing device is proposed that includes a control unit that displays and outputs a first screen presented by a first application and a second screen with a transparent background presented by a second application that accepts write input, wherein the control unit performs control to superimpose and display the second screen on the first screen, and performs image processing using the second screen, including the display of the write input, and the first screen.
According to the present disclosure, an information processing method is proposed in which a processor displays and outputs a first screen presented by a first application and a second screen with a transparent background presented by a second application that accepts write input, performs control to superimpose and display the second screen on the first screen, and performs image processing using the second screen, including the display of the write input, and the first screen.
According to the present disclosure, a program is proposed that causes a computer to function as a control unit that displays and outputs a first screen presented by a first application and a second screen with a transparent background presented by a second application that accepts write input, wherein the control unit performs control to superimpose and display the second screen on the first screen, and performs image processing using the second screen, including the display of the write input, and the first screen.
FIG. 1 is a diagram illustrating an outline of an information processing system according to an embodiment of the present disclosure.
FIG. 2 is a diagram showing an example of the configuration of the information processing system according to the embodiment.
FIG. 3 is a diagram illustrating the functional configuration of the writing application according to the embodiment.
FIG. 4 is a diagram showing an example of a projected image on which the writing screen image according to the embodiment is superimposed and displayed.
FIG. 5 is a diagram illustrating the superimposition of two images according to the embodiment.
FIG. 6 is a flowchart showing operation processing of the information processing device according to the embodiment.
FIG. 7 is a sequence diagram showing an example of operation processing of the group-writing display application and the writing application according to the embodiment.
FIG. 8 is a diagram illustrating a case where a composite image with the image to be written is generated based on the writing screen image according to the embodiment.
FIG. 9 is a diagram showing an example of a pen input state notification and an end icon according to the embodiment.
FIG. 10 is a flowchart showing an example of operation processing of the general application and the writing application according to the embodiment.
FIG. 11 is a diagram illustrating the two types of eraser functions according to the embodiment.
FIG. 12 is a sequence diagram showing operation processing when the eraser function is linked according to the embodiment.
FIG. 13 is a flowchart showing operation processing when the eraser function is not linked according to the embodiment.
FIG. 14 is a diagram showing an example of a GUI for selecting the eraser function according to the embodiment.
FIG. 15 is a diagram illustrating selection of the eraser function by switching drawing tools according to the embodiment.
FIG. 16 is a diagram illustrating adjustment of the display positions of the bubble images according to the positions of the users' heads according to the embodiment.
Hereinafter, preferred embodiments of the present disclosure will be described in detail with reference to the accompanying drawings. In the present specification and drawings, components having substantially the same functional configuration are denoted by the same reference numerals, and duplicate description is omitted.
The description will be given in the following order.
1. Outline of the information processing system according to an embodiment of the present disclosure
2. Configuration example
 2-1. Display device 200
 2-2. Sensor 300
 2-3. Digital pen 400
 2-4. Information processing device 100
 (2-4-1. Functional configuration of the writing application)
 (2-4-2. Superimposed display of the writing screen image)
3. Operation processing
 3-1. First operation processing example
 3-2. Second operation processing example
 3-3. Third operation processing example
4. Eraser function
 4-1. When the eraser function is linked
 4-2. When the eraser function is not linked
 4-3. Switching the eraser function
5. Others
 5-1. Adjusting the position of the writing area display
 5-2. Changing the writable area
6. Supplement
<<1. Outline of the information processing system according to an embodiment of the present disclosure>>
FIG. 1 is a diagram illustrating an outline of an information processing system according to an embodiment of the present disclosure. As shown in FIG. 1, the information processing system according to the present embodiment includes a projector 210, a camera 310, a digital pen 400, and an information processing device 100.
The projector 210 is a projection device that projects an image onto an arbitrary place in real space. The projector 210 projects an image into the real space included in its projection angle of view. The projection angle of view means the projectable range and is also called the projection area. The projection area 211 is defined by the position of the projector 210, the projection direction, and the angle of the projectable range about the projection direction as the central axis. The image projected by the projector 210 is also referred to as a projected image. The projected image is assumed to be an image projected onto the entire projection area 211, and drawing information corresponding to the user's operation of the digital pen 400 is mapped into the projected image and projected. In the example shown in FIG. 1, a handwriting image L showing the movement trajectory of the digital pen 400 on the drawing surface (corresponding to the projection area 211) is projected as the drawing information. In the present embodiment, the projector 210 is an example of a display device, the projection area is an example of a display area, and the projected image is an example of a display image.
The camera 310 is an imaging device that captures the real space. The camera 310 has a lens system, a drive system, and an image sensor, such as an IR (infrared) camera or an RGB (red, green, blue) camera, and captures images (still images or moving images). The camera 310 images the real space included in its imaging angle of view 311. The imaging angle of view 311 means the range that can be imaged and is defined by the installation position of the camera 310, the imaging direction, and the angle of the imageable range about the imaging direction as the central axis. The image captured by the camera 310 is also referred to as a captured image. The imaging angle of view 311 of the camera 310 according to the present embodiment may be a range including at least the projection area 211.
The digital pen 400 is an input device in which a light emitting unit such as an IR (infrared) LED (Light Emitting Diode) is mounted on the pen tip. The light emitting unit emits light in response to operation of a button or switch provided on the digital pen 400, pressing of the pen tip against the contact surface, swinging of the pen, or the like. The digital pen 400 may also transmit detection information, such as user operations of a button or switch provided on the digital pen 400 or the movement of the digital pen 400, to the information processing device 100.
The information processing device 100 detects the light emitting point in the image captured by the camera 310 and recognizes the position of the digital pen 400. The imaging angle of view 311 is preset to a range including at least the projection area 211. Alternatively, the projection area 211 may be divided and captured by a plurality of cameras 310. The information processing device 100 continuously recognizes, that is, tracks, the position of the digital pen 400 based on images continuously captured by the camera 310. The information processing device 100 also connects to the digital pen 400 by wireless communication, transmits light emission commands to the digital pen 400, and receives information indicating that a switch operation is being performed on the digital pen 400. Further, when the information processing device 100 receives information from the digital pen 400 indicating that the pen tip is pressed against the contact surface (switch ON state), it performs control to display the movement trajectory of the digital pen 400 as handwriting in the projected image. This allows the user to perform drawing operations as if actually writing in real space with a pen.
(Summary of issues)
Conventionally, when performing pen input, such as inputting handwritten characters with a digital pen on a touch display, input could be made only on content displayed by a dedicated application capable of acquiring pen input information (a display image developed by the dedicated application). It was therefore necessary to load the material to be written on into the dedicated application in advance as image data or the like. Moreover, since the content written onto the loaded image data is integrated with that image data, it was difficult to process only the written content.
When pen input is desired in an existing application, a method of making pen input appear like mouse events is conceivable, but it has been difficult to fully utilize the pen input functions. It has also been difficult to perform pen input in existing applications that do not support pen input.
Therefore, the present disclosure proposes a mechanism that can enhance the convenience of user input by displaying user input information superimposed on a screen presented by a different application.
<<2. Configuration example>>
FIG. 2 is a diagram showing an example of the configuration of the information processing system according to the present embodiment. As shown in FIG. 2, the information processing system according to the present embodiment includes an information processing device 100, a display device 200, a sensor 300, and a digital pen 400.
<2-1. Display device 200>
The display device 200 displays an image reflecting the drawing information input with the digital pen 400. In the present embodiment, a projector 210 that projects an image into real space is used as an example, as shown in FIG. 1, but a display may also be used. The display device 200 may also be realized by a screen and a rear projector that illuminates the screen from the opposite (back) side. The projector may also be a device that has a drive mechanism, for example, and can project in any direction. With such a mechanism, images can be displayed not only in one place but in various places, and interactive functions such as projecting toward the direction where the user is can be realized. A plurality of display devices 200 may also be provided. By having each display device 200 display an image reflecting drawing information, more users can enjoy drawing at the same time. For example, when one projector 210a is set up so that three users can each draw on the image it projects (when input with three digital pens 400 is possible), another projector 210b may be prepared and set up so that three more users can draw on the image projected by the projector 210b. By projecting the images from the projector 210a and the projector 210b side by side on a wall or the like in real space, six users can enjoy drawing at the same time.
The display device 200 may also include components capable of output other than display. For example, the display device 200 may be combined with a sound output device such as a speaker. The speaker may be a unidirectional speaker capable of forming directivity in a single direction. A unidirectional speaker outputs sound, for example, in the direction where the user is. The sound output device may be provided in a place separate from the display device 200.
<2-2. Sensor 300>
The sensor 300 detects user operations on the projection area. In the present embodiment, a camera 310 that images the projection area is used as an example, as shown in FIG. 1. The camera 310 detects the light emission of the digital pen 400 held by the user. The light emission of the digital pen 400 is assumed to be that of the IR LED provided at the pen tip, as described later. In this case, in order to avoid malfunction due to light other than the IR emission of the digital pen 400, the camera 310 may be configured to capture images through a visible-light cut filter (for example, an optical filter that mainly passes only light in the IR wavelength band). That is, an IR camera may be used as the camera 310. There may also be a plurality of cameras 310, which may further include an RGB camera, for example.
The camera 310 is assumed to be a camera capable of observing the area displayed by the display device 200, and in the present embodiment, it is assumed to be arranged parallel to and near the optical axis direction of the projector 210. The arrangement positions of the projector 210 and the camera 310 are not particularly limited, but by placing them as close and as parallel as possible, the rotation component can be suppressed. The rotation component is, for example, the amount of transformation that occurs when converting from projector coordinates to screen coordinates. In this system, a projective transformation matrix between the camera 310 and the projector 210 may be generated in advance, and the projective transformation matrix may be used to cut out the projection area from a captured image or to deform the image to be projected.
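The pre-generated projective transformation can be sketched with OpenCV as follows; the four corresponding points (for example, projected calibration markers observed by the camera) and the 1280x720 projector resolution are illustrative assumptions.

    import cv2
    import numpy as np

    # Corresponding points: where four projector-corner markers appear in
    # the camera image (illustrative values), and their projector coordinates.
    camera_pts = np.float32([[102, 85], [1180, 90], [1175, 695], [98, 700]])
    projector_pts = np.float32([[0, 0], [1280, 0], [1280, 720], [0, 720]])

    # Projective transformation (homography) from camera to projector.
    H = cv2.getPerspectiveTransform(camera_pts, projector_pts)

    def cut_out_projection_area(captured_bgr):
        # Warp a captured frame into projector coordinates to cut out the
        # projection area.
        return cv2.warpPerspective(captured_bgr, H, (1280, 720))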
In addition to the camera 310, the sensor 300 may include other sensors that sense various information. For example, the sensor 300 may sense information such as the user's position and the user's height, in addition to user operations on the projection area. For example, the sensor 300 may further include a depth sensor, a microphone, and the like. The depth sensor is a device that acquires depth information, such as an infrared ranging device, an ultrasonic ranging device, LiDAR (Laser Imaging Detection and Ranging), or a stereo camera. The depth sensor may also be a ToF (Time of Flight) camera capable of acquiring highly accurate distance images. The microphone is a device that picks up ambient sound and outputs audio data converted into a digital signal via an amplifier and an ADC (Analog Digital Converter). The microphone may be an array microphone. The sensor 300 may also sense user operations on the projection area with a sensor other than the camera 310; for example, the sensor 300 may further include a touch sensor or the like provided on the projection area.
<2-3. Digital pen 400>
The digital pen 400 is an example of an input device used by a user. The digital pen 400 is used when the user draws on the projection area 211. As an example, the digital pen 400 is formed as a pen-shaped device. As shown in FIG. 2, the digital pen 400 includes a communication module 410, a control unit 420, an IR LED 430, and a switch 440.
The IR LED 430 is an example of the light emitting unit and emits infrared light. The IR LED 430 lights up under the control of the control unit 420. The infrared light emission (the bright spot of the IR LED) is detected by the camera 310, and its position can be recognized by the information processing device 100. The IR LED 430 is provided at the tip (pen tip) of the digital pen 400, which is formed as a pen-shaped device. Thus, the drawing position while the pen tip is in contact with the drawing surface can be detected by the camera 310. In the present embodiment, a pen-shaped device is used as an example of the input device, but the present disclosure is not limited to this; the input device may be formed as a device of a different shape as long as a light-emission bright spot indicating the drawing position can be detected. For example, it may be something the user sprays, like a spray can, or a device worn on the limbs.
The switch 440 is used to determine whether drawing is being performed with the digital pen 400 (that is, the input state). For example, the digital pen 400 has a mechanism in which, when the pen tip is pressed against a physical object (drawing surface) such as a wall, the pen tip is pushed in and the switch turns ON, and when the pen tip is separated from the drawing surface, the push is released and the switch returns to OFF. The control unit 420 transmits the ON/OFF state of the switch to the information processing device 100. Here, a mechanism in which the switch turns ON when the pen tip is pressed against the drawing surface (contact surface) has been described as an example, but the present disclosure is not limited to this; the switch may be turned ON by detecting an operation of a button or switch provided on the digital pen 400, by detecting an operation on the operating body using a motion sensor of the operating body, or the like.
The communication module 410 connects to the information processing device 100 by wire or wirelessly and transmits and receives data. The communication module 410 communicates with the information processing device 100 by, for example, wired/wireless LAN (Local Area Network), Wi-Fi (registered trademark), Bluetooth (registered trademark), ZigBee (registered trademark), short-range wireless communication, or the like.
The control unit 420 functions as an arithmetic processing device and a control device, and controls the overall operation within the digital pen 400 according to various programs. The control unit 420 is realized by an electronic circuit such as a CPU (Central Processing Unit), a microprocessor, or a microcontroller. The control unit 420 may also include a ROM (Read Only Memory) that stores programs, calculation parameters, and the like to be used, and a RAM (Random Access Memory) that temporarily stores parameters and the like that change as appropriate.
The control unit 420 according to the present embodiment performs control to transmit the ON/OFF state of the switch 440, the operation state of other switches and buttons (not shown), and the like, from the communication module 410 to the information processing device 100. The control unit 420 also controls the lighting of the IR LED according to lighting control commands received from the information processing device 100 via the communication module 410.
In the present embodiment, the case of performing drawing input to draw a trajectory using the digital pen 400 has mainly been described as an example, but operations using the digital pen 400 according to the present embodiment are not limited to drawing input; various operations such as an operation to erase a displayed trajectory image and a click operation on a displayed image are also possible.
The digital pen 400 may further be provided with a color LED. The color LED lights up in an arbitrary color under the control of the control unit 420. The control unit 420 may control the lighting of the color LED according to color LED lighting control commands received from the information processing device 100 via the communication module 410. The color LED is provided, for example, at the tip of the digital pen 400. The color of the color LED may be controlled to be the same as the handwriting color of the handwriting image displaying the trajectory drawn with the digital pen 400. The handwriting color is determined by the information processing device 100 or the digital pen 400 and shared between them. By controlling the pen tip color and the handwriting color to be the same, the entertainment value of the drawing experience in real space is further enhanced. The color LED may be any visible light LED.
<2-4. Information processing device 100>
The information processing device 100 includes an I/F (Interface) unit 110, a control unit 120, an operation input unit 130, a display unit 140, and a storage unit 150. The information processing device 100 is realized by, for example, a smartphone, a tablet terminal, a PC (personal computer), or the like. The information processing device 100 may be arranged in the same space as the projector 210, the camera 310, and the digital pen 400, or may be a server on the Internet. The information processing device 100 may also be realized by, for example, an edge server, an intermediate server, a cloud server, or the like.
(I/F unit 110)
The I/F unit 110 is a connection device for connecting the information processing device 100 to other equipment. The I/F unit 110 is realized by, for example, a USB (Universal Serial Bus) connector, a wired/wireless LAN (Local Area Network), Wi-Fi (registered trademark), Bluetooth (registered trademark), ZigBee (registered trademark), a mobile communication network (LTE (Long Term Evolution), 3G (third-generation mobile communication system), 4G (fourth-generation mobile communication system), 5G (fifth-generation mobile communication system)), or the like. The I/F unit 110 inputs and outputs information to and from the projector 210 included in the display device 200, the camera 310 included in the sensor 300, and the digital pen 400.
(Control unit 120)
The control unit 120 functions as an arithmetic processing device and a control device, and controls the overall operation within the information processing device 100 according to various programs. The control unit 120 is realized by an electronic circuit such as a CPU (Central Processing Unit) or a microprocessor. The control unit 120 may also include a ROM (Read Only Memory) that stores programs, calculation parameters, and the like to be used, and a RAM (Random Access Memory) that temporarily stores parameters and the like that change as appropriate.
The control unit 120 according to the present embodiment functions as an application execution unit 121, a display image generation unit 122, and a display control unit 123.
・Application execution unit 121
The application execution unit 121 starts applications (application programs) and executes various commands according to each application's program. The storage unit 150 stores data for one or more applications, and the application execution unit 121 calls a specific application from the storage unit 150 and executes it according to instructions. In the present embodiment, a case is described in which the application execution unit 121 executes a first application and a second application different from the first application.
The second application is, for example, a writing (drawing input) application that outputs the contents input (drawn) by the user to the display device 200 as a screen with a transparent background. More specifically, the writing application detects the position of the digital pen 400 (its position on the drawing surface in real space), an example of the input device, with the sensor 300, and generates a handwriting image displaying the trajectory drawn on the drawing surface by the digital pen 400. The writing application can also identify the positions of a plurality of digital pens 400 and generate a plurality of handwriting images. Further, the writing application maps one or more handwriting images and outputs a screen with a transparent background (the writing screen image). The second application may also be software that appropriately performs image processing based on the writing screen image saved after writing is completed and the data of the screen presented by the first application (the screen image to be written on). For example, after writing is completed (after the writing screen image is saved), the second application may generate an image in which the writing screen image is superimposed on the screen image to be written on (compositing processing).
The first application is an application that presents an image (display image) to be displayed on the display device 200. More specifically, the first application may be a general application such as a document creation application, a presentation application, a spreadsheet application, a viewing/editing application for electronic documents generated in a predetermined format, or a photo/video viewing application. The first application may also be software that can acquire the writing screen image generated and saved by the second application and perform some processing (image processing) based on it, such as image compositing.
The application execution unit 121 performs control to superimpose the transparent-background writing screen image (second screen) presented by the second application on the display image (first screen) presented by the first application. This control may be performed according to the second application or according to the first application, or it may be performed according to an operation input by the administrator operating the information processing device 100. By superimposing the transparent-background writing screen image on the display image presented by the first application, the writing (drawing input) state over the image presented by the first application can be presented without implementing a drawing input function for an input device such as the digital pen 400 on the first application side, as in the sketch below. Specific examples will be described later with reference to FIGS. 4 and 5.
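The superimposition itself reduces to a per-pixel "over" operation in which the second screen's alpha channel decides what shows through. A minimal numpy sketch, with illustrative array shapes, follows.

    import numpy as np

    def over(first_rgb: np.ndarray, second_rgba: np.ndarray) -> np.ndarray:
        # first_rgb: HxWx3 uint8; second_rgba: HxWx4 uint8 whose background
        # pixels have alpha 0, so the first screen shows through there.
        alpha = second_rgba[..., 3:4].astype(np.float32) / 255.0
        blended = (second_rgba[..., :3].astype(np.float32) * alpha
                   + first_rgb.astype(np.float32) * (1.0 - alpha))
        return blended.astype(np.uint8)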
・Display image generation unit 122
The display image generation unit 122 generates the display image (projected image) to be projected by the projector 210. For example, the display image generation unit 122 generates a display image in which the transparent-background writing screen image presented by the second application is superimposed on the display image presented by the first application executed by the application execution unit 121.
・Display control unit 123
The display control unit 123 outputs the display image generated by the display image generation unit 122 to the projector 210 via the I/F unit 110, and performs control to have the projector 210 project it.
(Operation input unit 130)
The operation input unit 130 has a function of accepting operation input from the user. The operation input unit 130 may be realized by an input device such as a keyboard, a mouse, a touch panel, a button, or a switch. It may be assumed that operation input to the information processing device 100 is performed by an administrator, that is, a user different from the one or more users who write on the drawing surface using the digital pen 400. The administrator can operate the first application and the second application, start and end projection by the projector 210, and so on.
(Display unit 140)
The display unit 140 is a display that shows various operation screens and the like for operating the information processing device 100. The display unit 140 is realized by a display device such as a liquid crystal display (LCD: Liquid Crystal Display) or an organic EL (Electro Luminescence) display. The display image projected by the projector 210 may also be shown on the display unit 140.
(Storage unit 150)
The storage unit 150 is realized by a ROM (Read Only Memory) that stores programs, calculation parameters, and the like used in the processing of the control unit 120, and a RAM (Random Access Memory) that temporarily stores parameters and the like that change as appropriate. For example, the storage unit 150 stores various information input from external devices via the I/F unit 110 and various information calculated and generated by the control unit 120.
More specifically, various information generated by each application executed by the application execution unit 121 may be stored in the storage unit 150 and read out as appropriate.
 以上、本実施形態による情報処理装置100の構成について説明したが、情報処理装置100の構成は図2に示す例に限定されない。例えば、情報処理装置100は、複数の装置により構成されていてもよいし、少なくとも一部の構成がプロジェクタ210やカメラ310に設けられていてもよい。また、情報処理装置100の少なくとも一部の構成が、サーバに設けられていてもよい。 Although the configuration of the information processing apparatus 100 according to the present embodiment has been described above, the configuration of the information processing apparatus 100 is not limited to the example shown in FIG. For example, the information processing device 100 may be composed of a plurality of devices, or at least a part of the information processing device 100 may be provided in the projector 210 or the camera 310. Further, at least a part of the configuration of the information processing apparatus 100 may be provided in the server.
 また、情報処理装置100は、センサ300から受信するセンシングデータに基づいて、デジタルペン400を用いて書き込みを行っているユーザの位置等を検出する機能を有していてもよい。例えば、RGBカメラとIRカメラが組み合わされたRGB-IRカメラを用いた場合、情報処理装置100は、可視光画像とIR画像を同時に取得し得る。情報処理装置100の制御部120は、可視光画像からユーザの位置等を認識し得る。また、情報処理装置100は、ステレオカメラやToF(Time Of Flight)カメラ等のデプスセンサにより検出されるセンシングデータに基づいて、ユーザの位置等を認識してもよい。 Further, the information processing apparatus 100 may have a function of detecting the position of a user who is writing with the digital pen 400 based on the sensing data received from the sensor 300. For example, when an RGB-IR camera in which an RGB camera and an IR camera are combined is used, the information processing apparatus 100 can acquire a visible light image and an IR image at the same time. The control unit 120 of the information processing apparatus 100 can recognize the user's position and the like from the visible light image. Further, the information processing apparatus 100 may recognize the user's position or the like based on the sensing data detected by the depth sensor such as a stereo camera or a ToF (Time Of Flight) camera.
(2-4-1. Functional configuration of writing application)
The functional configuration of the writing application, which is an example of the second application executed by the application execution unit 121, will be described with reference to FIG.
As shown in FIG. 3, the writing application 1210 mainly functions as a light emission recognition unit 1211, a coordinate conversion unit 1212, an operation detection unit 1213, and a writing screen image generation unit 1214.
-Light emission recognition unit 1211
The light emission recognition unit 1211 detects the bright spot of the IR LED as the position of the digital pen 400 (input device), based on the captured image of the projection area 211 (display area) captured by the camera 310 (imaging device) (recognition process). The position coordinates of the detected bright spot (the light emission position of the IR LED detected in the camera coordinate system) are stored in the storage unit 150. Further, the light emission recognition unit 1211 continuously detects the position of the bright spot based on the images continuously captured by the camera 310, that is, it tracks the position of the digital pen 400.
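As a concrete illustration, the following is a minimal Python sketch of such a bright-spot detection step, assuming a simple threshold-and-centroid approach; the embodiment does not specify the detection algorithm, so the function name, threshold value, and synthetic test frame are illustrative assumptions only.

```python
import numpy as np

def detect_ir_bright_spot(ir_image: np.ndarray, threshold: int = 200):
    """Return the (x, y) centroid of pixels at or above `threshold`,
    i.e., a candidate IR LED bright spot, or None if none is visible."""
    ys, xs = np.nonzero(ir_image >= threshold)
    if xs.size == 0:
        return None  # the pen's IR LED is not visible in this frame
    return float(xs.mean()), float(ys.mean())

# Synthetic 480x640 IR frame with one bright spot near (301, 101).
frame = np.zeros((480, 640), dtype=np.uint8)
frame[100:104, 300:304] = 255
print(detect_ir_bright_spot(frame))  # -> (301.5, 101.5)
```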
-Coordinate conversion unit 1212
The coordinate conversion unit 1212 converts the position coordinates of the bright spot recognized by the light emission recognition unit 1211 into the display coordinate system of the projector 210. Specifically, the coordinate conversion unit 1212 converts (calculates) the position coordinates using a projection matrix. The converted position coordinates are stored in the storage unit 150. Further, the coordinate conversion unit 1212 continuously converts the position coordinates of the bright spot as they are continuously recognized by the light emission recognition unit 1211.
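A minimal sketch of this conversion step, assuming the projection matrix is a 3x3 homography obtained by a prior camera-projector calibration (the matrix values below are placeholders, not calibration data from this embodiment):

```python
import numpy as np

def camera_to_display(point_xy, homography: np.ndarray):
    """Map a bright-spot position from camera coordinates to the
    projector's display coordinate system via a projective transform."""
    x, y = point_xy
    px, py, pw = homography @ np.array([x, y, 1.0])
    return px / pw, py / pw  # perspective division

# Placeholder calibration result (assumed values for illustration).
H = np.array([[1.20, 0.02, -15.0],
              [0.01, 1.19,  -8.0],
              [1e-5, 2e-5,   1.0]])
print(camera_to_display((301.5, 101.5), H))
```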
-Operation detection unit 1213
The operation detection unit 1213 detects that an operation has been performed on the digital pen 400 based on the information received from the digital pen 400. For example, the operation detection unit 1213 detects that the switch 440 provided on the digital pen 400 has been turned on (that is, that the pen is in a drawing state).
-Writing screen image generation unit 1214
The writing screen image generation unit 1214 generates a writing screen image in which the writing performed on the drawing surface with the digital pen 400 is reflected. For example, when the operation detection unit 1213 detects that the digital pen 400 is in the drawing state, the writing screen image generation unit 1214 generates an image onto which drawing information (a handwriting image) is mapped, based on the bright spot positions (IR light positions) converted by the coordinate conversion unit 1212. Thus, for example, when the user places the pen tip of the digital pen 400 against the drawing surface and draws a locus, an image onto which a handwriting image showing that locus is mapped is generated. Note that the writing screen image generation unit 1214 generates a screen image in which the writing on the drawing surface with the digital pen 400 is reflected and whose background is transparent.
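The following sketch illustrates one way such a background-transparent writing screen image could be represented, as an RGBA buffer whose alpha channel is zero everywhere except where ink has been laid down; the disc-stamping stroke model is an assumption for illustration.

```python
import numpy as np

def new_writing_screen(width: int, height: int) -> np.ndarray:
    """RGBA buffer; alpha 0 everywhere means a fully transparent background."""
    return np.zeros((height, width, 4), dtype=np.uint8)

def draw_stroke(screen: np.ndarray, points, color=(0, 0, 0), radius=2):
    """Map a handwriting trace onto the screen by stamping an opaque
    disc at each converted pen position (display coordinates)."""
    h, w, _ = screen.shape
    yy, xx = np.mgrid[0:h, 0:w]
    for px, py in points:
        mask = (xx - px) ** 2 + (yy - py) ** 2 <= radius ** 2
        screen[mask, :3] = color   # ink color
        screen[mask, 3] = 255      # opaque only where ink exists

screen = new_writing_screen(640, 480)
draw_stroke(screen, [(100, 100), (102, 101), (104, 103)])
```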
The functional configuration of the writing application has been described above. Note that the writing application according to the present embodiment can identify the positions of a plurality of digital pens 400 and generate a handwriting image showing the locus drawn with each digital pen 400. That is, a plurality of people can write on the drawing surface at the same time. The position of each digital pen 400 can be identified, for example, by associating the ID received from the digital pen 400 with a bright spot detected in the image captured by the camera 310. Specifically, the writing application 1210 establishes a communication connection with the digital pen 400 via the I/F unit 110 of the information processing apparatus 100 and acquires its ID. Next, the writing application 1210 transmits an IR emission command to the connected digital pen 400, then regards a bright spot newly detected in the captured image as the position of that digital pen 400, and associates the ID of the digital pen 400 with it.
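The association step just described can be sketched as follows; the `DemoPen` class and the `detect_new_bright_spot` callback are hypothetical stand-ins for the pen's communication interface and the recognition processing, which this embodiment does not define at the code level.

```python
class DemoPen:
    """Hypothetical stand-in for a communication-connected digital pen."""
    def __init__(self, pen_id: str):
        self.pen_id = pen_id
    def send_ir_on(self):
        pass  # stand-in for transmitting the IR emission command

def associate_pen(pen, detect_new_bright_spot, tracked: dict) -> dict:
    """Command one pen to emit IR, then attribute the newly appearing
    bright spot to that pen's ID, as described above."""
    pen.send_ir_on()
    spot = detect_new_bright_spot()  # first spot not yet in `tracked`
    if spot is not None:
        tracked[pen.pen_id] = spot
    return tracked

tracked = associate_pen(DemoPen("pen-1"), lambda: (301.5, 101.5), {})
print(tracked)  # {'pen-1': (301.5, 101.5)}
```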
(2-4-2. Superimposed display of the writing screen image)
In the present embodiment, even if a writing function for an input device such as the digital pen 400 is not implemented on the first application side, superimposing the background-transparent writing screen image presented by the second application makes it possible to make it appear as if writing were being performed on the image (screen) presented by the first application. Such superimposed display of the writing screen image will be described in detail below with reference to FIGS. 4 and 5.
FIG. 4 is a diagram showing an example of a projected image on which the writing screen image according to the present embodiment is superimposed. As shown in FIG. 4, a projected image 50 is projected onto the projection area 211; it includes a handwriting image 511 showing a locus drawn by a user on the real-space drawing surface (projection area 211) using the digital pen 400, and bubble images 501 to 503 representing bubbles. The bubble images 501 to 503 are an example of a depiction that clearly indicates the places (areas) where the user should write on the drawing surface; their color, shape, size, number, and manner of expression are not particularly limited as long as the display allows the areas to be distinguished. For example, an announcement such as "Please write inside a bubble" may be made to the user in advance. Further, when writing is finished, an animation in which the written-on bubble images 501 to 503 drift randomly across the screen may be displayed in the projection area 211 (details will be described later with reference to FIG. 8).
Here, the presentation of the handwriting image 511 drawn with the digital pen 400 can be performed by the writing application, which is an example of the second application. The presentation of the bubble images 501 to 503 and the subsequent animation of the bubble images 501 to 503 can be performed by the written application, which is an example of the first application. The "written application" is the application onto whose screen the writing screen image presented by the writing application is superimposed. Although writing is not actually performed on it, its screen appears to be written on when it is located behind the background-transparent writing screen presented by the writing application; it is therefore referred to herein as the "written application." The written application may be a general application, or an application having a function of cooperating with the writing application.
More specifically, as shown in FIG. 5, the projected image 50 projected onto the projection area 211 is composed of two superimposed images.
FIG. 5 is a diagram illustrating the superimposition of the two images according to the present embodiment. The first image 500 shown in FIG. 5 is the bubble-image screen presented by the written application, and the second image 510 is the writing screen presented by the writing application. Since the background of the second image 510 (writing screen) is transparent, displaying it over the first image 500 makes it appear, as shown in FIG. 4, as if writing were being performed on the bubble-image screen. In the example shown in FIG. 5, both the first image 500 and the second image 510 are displayed full screen, but the present embodiment is not limited to this. For example, at least a partial area of the second image may be superimposed on at least a partial area of the first image. Further, a plurality of second images may be superimposed on the area of the first image. In this case, the plurality of second images may at least partially overlap each other, be separated from each other, or be arranged adjacent to each other.
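A minimal sketch of this superimposition, as standard alpha compositing of the RGBA second image (writing screen) over the RGB first image; the embodiment does not mandate a particular blending implementation, so this is one plausible realization.

```python
import numpy as np

def superimpose(first_rgb: np.ndarray, second_rgba: np.ndarray) -> np.ndarray:
    """Composite the background-transparent writing screen (RGBA) over
    the written application's screen (RGB) to form the display image."""
    alpha = second_rgba[..., 3:4].astype(np.float32) / 255.0
    blended = (first_rgb.astype(np.float32) * (1.0 - alpha)
               + second_rgba[..., :3].astype(np.float32) * alpha)
    return blended.astype(np.uint8)

first = np.full((480, 640, 3), 255, dtype=np.uint8)   # e.g., bubble screen
second = np.zeros((480, 640, 4), dtype=np.uint8)      # transparent screen
second[100:104, 300:304] = (0, 0, 0, 255)             # a bit of handwriting
projected = superimpose(first, second)                # image to project
```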
<< 3. Operation processing >>
Subsequently, the flow of the operation processing of the information processing system according to the present embodiment will be described with reference to flowcharts.
<3-1. First operation processing example>
FIG. 6 is a flowchart showing an example of the overall flow of the first operation process according to the present embodiment. The first operation process covers the case where there is no cooperation between the applications (where the interprocess communication and the like described later are not performed).
As shown in FIG. 6, first, the application execution unit 121 starts the first application (written application) (step S103).

Next, the application execution unit 121 starts the second application (writing application) (step S106).
The starting order of the first application and the second application is not particularly limited, and each can be started at its own timing. Specifically, each may be started in response to an operation input by the administrator (for example, double-clicking the corresponding icon displayed on the display unit 140). The second application (writing application) may also be started by the application execution unit 121 in response to a specific event that occurs on the digital pen 400, such as the operation of a button provided on the digital pen 400. The application execution unit 121 may be a function of a service program launched separately from the first and second applications; this service program monitors the information received from the digital pen 400 via the I/F unit 110, as sketched below. In this way, when there is no cooperation between the applications, each application (the first application and the second application) is started individually based on a manual operation by the user or the administrator.
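A minimal sketch of such a separately running service program, assuming a hypothetical event queue fed by the pen's I/F and an assumed executable name for the writing application:

```python
import queue
import subprocess

def service_loop(pen_events: "queue.Queue[str]"):
    """Monitor pen events and launch the writing application when the
    pen's button is pressed; 'writing_app' is an assumed command name."""
    while True:
        event = pen_events.get()  # blocks until the pen reports an event
        if event == "button_pressed":
            subprocess.Popen(["writing_app"])  # start the second application
        elif event == "shutdown":
            break
```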
Next, the display image generation unit 122 generates a display image in which the screen presented by the second application is superimposed on the screen presented by the first application, and the display control unit 123 outputs it from the I/F unit 110 to the display device 200 and controls its display (step S109). The display image generation unit 122 and the display control unit 123 may be functions of the above service program.
Next, as a function of the writing application, a writing process using the digital pen 400 is performed (step S112). As described above, the writing application generates a handwriting image showing the locus drawn with the digital pen 400 on the real-space drawing surface (projection area 211), maps it onto the writing screen image, and projects it onto the projection area 211. This writing screen image has a transparent background and is displayed superimposed on the screen presented by the written application. The user therefore, in effect, writes on a transparent screen (drawing surface) superimposed on the screen presented by the written application, and since the writing is reflected on that transparent screen, the user can visually perceive it as writing on the screen presented by the written application.
Note that the writing application may notify the user that writing with the digital pen 400 is possible (that is, that the writing application is running) by displaying, on the screen it presents, an icon indicating that pen input is possible, or by playing voice guidance or music from a speaker (not shown).
Subsequently, when a writing-end operation is performed, the writing application performs a process of saving the written image (writing screen image) in the storage unit 150 (step S115). The writing-end operation may be performed by the user who is writing or by the administrator operating the information processing apparatus 100. For example, the administrator may end writing by operating a specific key on the keyboard (an example of the operation input unit 130), or the user may end it by operating an end icon displayed on the projected screen with the digital pen 400. Alternatively, writing may be ended automatically by a timer. Writing may also be ended when a predetermined end event occurs on the digital pen 400, such as the operation of a button provided on the digital pen 400. Writing may also be ended when the service program detects that the written application, which presents the screen displayed behind the screen presented by the writing application, has terminated, and notifies the writing application accordingly.
Further, in the present embodiment, the writing screen image can be saved as independent data, unrelated to the image on the written side (the written image, which is not actually written to but appears on screen as if it had been written on). That is, the writing screen image is managed by the second application, which is different from the first application that handles the written image. Specifically, for example, the second image 510 shown in FIG. 5 is saved as the writing screen image. At this time, the writing application may add writing date/time information and information on the written application to the saved writing screen image. Also, at this point, no change (writing) whatsoever has been made to the data of the written image (the display screen presented by the written application, for example, the first image 500 shown in FIG. 5).
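A minimal sketch of saving the writing screen image as independent RGBA data with the date/time and written-application information attached, here using Pillow's PNG text metadata; the metadata key names are illustrative assumptions.

```python
from datetime import datetime

import numpy as np
from PIL import Image
from PIL.PngImagePlugin import PngInfo

def save_writing_screen(screen_rgba: np.ndarray, path: str, written_app: str):
    """Save the background-transparent writing screen as its own PNG,
    independent of the written application's data."""
    meta = PngInfo()
    meta.add_text("write_datetime", datetime.now().isoformat())
    meta.add_text("written_application", written_app)  # assumed key names
    Image.fromarray(screen_rgba, mode="RGBA").save(path, pnginfo=meta)

save_writing_screen(np.zeros((480, 640, 4), dtype=np.uint8),
                    "writing_screen.png", "bubble_display_app")
```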
Note that the written application may have a function of performing various processes, such as compositing, based on the writing image saved by the writing application. In this case, the written application acquires the saved writing image through an operation by the administrator or the like, and performs a process of compositing the writing image with the written image (step S118). Since the writing image is an image with a transparent background, the written application can generate a written image that incorporates the writing by superimposing the writing image on the written image. The control unit 120 may also have such a compositing process executed by another application.
The first operation process according to the present embodiment has been described above. With this process, any application can be used as the written application, even one that does not implement a cooperation function with the writing application. The user or administrator can easily perform pen input on an arbitrary application simply by starting the application on which pen input is desired (the first application) and the pen input application (the second application).
<3-2. Second operation processing example>
Next, the operation processing for the case where there is cooperation between the applications (where the interprocess communication and the like described later are performed) will be described.
Here, as the written application (first application), a group message display application is used, which realizes entertainment-oriented writing as shown in FIG. 4 in cooperation with the writing application (second application). This will be described in detail below with reference to FIG. 7.
FIG. 7 is a sequence diagram showing an example of the operation processing of the group message display application and the writing application according to the present embodiment. As shown in FIG. 7, first, the group message display application is started by a start operation by the administrator or the like (such as clicking the icon of the group message display application arranged on the screen displayed on the display unit 140) (step S123). When the group message display application starts, a menu screen or the like presented by the group message display application is displayed on the display unit 140. The administrator or the like performs, from the operation input unit 130, control for projecting the screen presented by the group message display application with the projector 210. Alternatively, the screen presented by the group message display application may be automatically projected by the projector 210 connected to the information processing apparatus 100. At this time, the whole screen displayed on the display unit 140 may be projected, or only a part of it may be projected.
Next, when the administrator or the like turns on the writing function that operates in cooperation with the writing application (issues an execution instruction) (step S129), the group message display application controls the launch of the writing application (executes the process that starts it) (step S132).
Next, the writing application starts (step S135). The light emission recognition unit 1211 included in the functional configuration of the writing application detects the bright spot produced by the IR emission of the digital pen 400, based on the captured image of the projection area 211 captured by the camera 310, and recognizes it as the position of the digital pen 400 (step S138). The light emission recognition unit 1211 then starts tracking the recognized bright spot (the position of the digital pen 400), that is, the process of continuously recognizing the bright spot.
Next, the coordinate conversion unit 1212 included in the functional configuration of the writing application performs coordinate conversion that converts the bright spot position detected in the captured image (camera coordinate system) into the display coordinate system of the projector 210 (step S141).
Next, when the writing screen image generation unit 1214 included in the functional configuration of the writing application acquires from the digital pen 400 the information that the switch 440 has been turned on, for example by the digital pen 400 being pressed against the drawing surface, it maps a handwriting image showing the locus drawn on the drawing surface with the digital pen 400 (the movement locus of the recognized bright spot) and generates a writing screen image whose background is transparent (step S144).
Next, the writing application performs control for displaying (projecting) the generated writing screen image from the projector 210 (step S147). More specifically, display (projection) control of the writing screen image can be performed as a function of the display control unit 123.
Meanwhile, the group message display application performs control for displaying (projecting) the written image (step S150). The display of the written image may be performed in conjunction with turning on the writing function shown in step S129. More specifically, display (projection) control of the written image is performed as a function of the display control unit 123. The written image is an image that includes a display clearly indicating the writing areas, generated in anticipation of on-screen writing with the digital pen 400 realized by the writing application (for example, the first image 500 of FIG. 5). In the written image projected onto the projection area 211, the user writes with the digital pen 400 inside the areas indicated as writing areas (for example, inside the areas of the bubble images 501 to 503 shown in FIG. 4). Writing with the digital pen 400 is detected by the functions of the writing application and reflected in the projection area 211. Specifically, the writing screen image is displayed superimposed on the written image.
Next, if a writing time is set, the group message display application turns on a writing timer (step S153). The writing timer may be counted, for example, from when the written image is displayed. When the remaining time falls below a threshold, such as 30 seconds remaining, the group message display application may display a countdown of the remaining time on the written image. For example, a countdown display (showing the same countdown) may be placed near each of the bubble images 501 to 503 shown in FIG. 4. During the writing time, the display positions of the bubble images 501 to 503 do not change, and an animation in which the frame portions expressing the bubbles and the shadows of the bubbles sway slightly may be added.
Next, when the predetermined time has elapsed (step S156/Yes), the group message display application performs control to end writing (step S159). Specifically, the group message display application notifies the writing application of the end of writing, for example by interprocess communication. Although writing on the written image is ended here by a countdown as an example, the present embodiment is not limited to this; for example, the administrator may end writing at an arbitrary timing.
Next, upon receiving the notification from the group message display application (for example, by interprocess communication), the writing application saves the writing image that was superimposed on the written image in the storage unit 150 (step S162). The data format used when saving the writing image can be set arbitrarily. The saved writing image is assumed to be a single screen onto which a plurality of handwriting images are mapped, for example like the second image 510 shown in FIG. 5.
Next, when an end operation is performed (step S165/Yes), the operation of the writing application ends. The end operation may be an end operation by the administrator, or a notification from the group message display application (for example, by interprocess communication). Alternatively, after saving the writing image, the writing application may end automatically if it is not set to continue with the next round of writing. If it is set to continue, the writing application saves the writing image, displays a new, blank writing screen, and repeats steps S138 to S162 above.
Meanwhile, the group message display application acquires the writing screen image saved by the writing application from the storage unit 150 (step S168) and generates a composite image (step S171). In the present embodiment, as examples of the cooperation functions of the group message display application, the launching of the writing application shown in step S132 and a function of generating a composite image with the written image based on the writing screen image are described.
Next, the group message display application performs control for displaying the generated composite image (group message image) (step S174). For example, if it is set to continue with the next round of writing, the composite image may be presented with an animation in which it drifts around the bubble images 501 to 503 newly displayed for the next round, and steps S150 to S174 above may be repeated. The composite images drifting in the background may be displayed for a certain period of time and deleted after that period elapses. By presenting a display in which composite images drift around one after another each time writing is performed, a group message written by many people can be displayed. Since the writing screen images are saved separately, even after a composite image is deleted without being saved, the group message display application can read the writing screen images out at any time and generate and display the composite image (group message image) again.
Here, the generation and display of the composite image (group message image) will be described in detail with reference to FIG. 8. FIG. 8 is a diagram illustrating the case where a composite image with the written image is generated based on the writing screen image. As shown in FIG. 8, upon acquiring the writing screen image saved by the writing application (as an example, the second image 510), the group message display application performs a process of separating it using meaningful layout information of the written image that was displayed on the background side at the same time (as an example, the first image 500). The layout information includes, for example, information on the positions and contours of the bubble images that indicate the extent of the writing areas contained in the first image 500. The written application separates at least a part of the writing screen image along, for example, the contours of the bubble images indicating the extent of the writing areas contained in the first image 500. More specifically, as shown in FIG. 8, separated images 510a to 510c, cut out along the contour of each bubble image, are extracted from the second image 510.
Next, the group message display application generates composite images 520a to 520c by combining each separated image with the corresponding bubble image cut out from the first image 500.
Next, the group message display application performs control for displaying the composite images 520a to 520c on the projected image (first image 500b) projected by the projector 210. The group message display application may present an animation in which the composite images 520a to 520c float randomly across the screen. At this time, the group message display application can appropriately adjust the size, lightness, hue, saturation, transparency, and the like of the composite images 520a to 520c. The first image 500b also shows bubble images 501 to 503 for the next round of writing. The composite images 520a to 520c may be displayed, for example, around the bubble images 501 to 503 or as a background image.
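A minimal sketch of this separation-and-compositing step, assuming circular bubble regions whose centers and radii come from the written image's layout information; the circle model and the concrete values are illustrative assumptions.

```python
import numpy as np

def circular_mask(h: int, w: int, center, radius: float) -> np.ndarray:
    yy, xx = np.mgrid[0:h, 0:w]
    return (xx - center[0]) ** 2 + (yy - center[1]) ** 2 <= radius ** 2

def separate_and_composite(bubble_rgb: np.ndarray, writing_rgba: np.ndarray,
                           center, radius: float) -> np.ndarray:
    """Cut the writing screen image along one bubble contour and paste
    the separated handwriting onto the cut-out bubble image."""
    h, w, _ = writing_rgba.shape
    separated = writing_rgba.copy()
    separated[~circular_mask(h, w, center, radius)] = 0  # outside: discard
    composite = bubble_rgb.copy()
    ink = separated[..., 3] > 0          # where handwriting exists
    composite[ink] = separated[ink, :3]  # overwrite with the ink color
    return composite

bubble = np.full((200, 200, 3), 230, dtype=np.uint8)   # cut-out bubble image
writing = np.zeros((200, 200, 4), dtype=np.uint8)
writing[90:94, 95:120] = (0, 0, 0, 255)                # handwriting inside
composite = separate_and_composite(bubble, writing, (100, 100), 60.0)
```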
Then, when an end operation is performed by the administrator or the like (step S177/Yes), the operation of the written application ends.
The operation processing of the writing application according to the present embodiment and of the group message display application cooperating with it has been described above. With this arrangement, when a group message is to be written by pen input, the pen input application (writing application) can be started at any time by an operation from within the group message display application. The user or administrator does not need to load image data or the like into the pen input application in advance, which reduces the burden of operation. Further, combining the pen input application with other applications can enhance the entertainment value of pen input.
In the present embodiment, the generated composite image has been described as a "group message image," but the present disclosure is not limited to this. The written application may, as one example, be any application that can cooperate with the writing application to enhance the entertainment value of pen input.
(Pen input status notification and end icon display)
A notification indicating that writing with the digital pen 400 is possible, and an end icon for accepting a writing-end operation, may be displayed on the screen presented by the writing application. FIG. 9 is a diagram showing an example of the pen input state notification and the end icon according to the present embodiment. As shown in FIG. 9, the projected image 50 may display an icon 540 indicating that writing with the digital pen 400 is possible, and an end icon 542 for ending pen input. When the end icon 542 is selected with the digital pen 400 or the like, the writing application accepts the writing-end instruction and performs the writing image saving process shown in step S162 above. The group message display application may also be informed that the writing-end instruction has been input, and may then perform the processing of step S159 above or the processing from step S168 above onward.
The writing application may also give notice that writing with the digital pen 400 is possible by displaying an image 541 showing a person's silhouette, a character, CG, a person in a live-action image, or the like in the act of writing. The image 541 may be a still image or a moving image. The writing application may also give notice that writing with the digital pen 400 is possible by means of a text display.
The writing application may also give notice that writing with the digital pen 400 is possible by playing voice guidance or music from a speaker (not shown), or by lighting a color LED provided on the digital pen 400.
The above examples can likewise be applied to the other operation processing examples according to the present embodiment.
<3-3. Third operation processing example>
Next, the operation processing for the case where the written application cooperating with the writing application is a general application will be described. General-purpose applications (hereinafter referred to as general applications) include, for example, document creation applications, presentation applications, spreadsheet applications, applications for viewing/editing electronic documents generated in a predetermined format, and photo and video viewing applications.
FIG. 10 is a flowchart showing an example of the operation processing of a general application and the writing application according to the present embodiment.
As shown in FIG. 10, first, the application execution unit 121 of the information processing apparatus 100 starts a general application (written application) (step S203). The general application may be started by an operation input by the administrator. The administrator may be the same person as the user who writes with the digital pen 400.
Next, when the writing function set in the general application is turned on (step S206/Yes), the writing application is started (step S209). The writing function is turned on, for example, by the administrator from a menu screen or the like of the general application. The writing application may be started by interprocess communication or the like from the general application.
Next, the control unit 120 performs control for superimposing the screen of the writing application on the screen of the general application and displaying (projecting) the result onto the projection area 211 or the like (step S212). The screen of the writing application is a screen with a transparent background. Although details are omitted, as in the operation processes described above, the writing application presents a background-transparent screen onto which a handwriting image showing the locus drawn with the digital pen 400 on the real-space drawing surface (projection area 211 or the like) is mapped. The user can thus use the digital pen 400 to write on the screen of the general application projected onto the projection area 211 or the like.
Next, when there is an operation to terminate the general application (step S215/Yes), the general application determines whether or not to save the content written by the cooperating writing application (step S218). Whether or not to save the written content may be set in advance by the administrator or the user, or a pop-up display or the like may be shown to confirm with the administrator or the user before the general application terminates.
Next, when the written content is to be saved (step S218/Yes), the general application requests the writing application, by interprocess communication or the like, to save the writing screen image. The writing application saves the writing screen image in the storage unit 150 and ends its operation. The general application also ends its operation (step S224).
Next, when the written content is not to be saved (step S218/No), the general application ends its operation without making a save request to the writing application (step S221). The general application may notify the writing application of the end of its operation by interprocess communication or the like.
Meanwhile, when there is an operation to terminate the writing application (step S227/Yes), the writing application saves the writing screen image and then terminates (step S230). This saving of the writing screen image may be temporary. For example, even if the writing application terminates first, whether or not to save the written content is determined when the general application terminates. If the written content is not to be saved, the general application may request the writing application, by interprocess communication or the like, to discard the temporarily saved written content. If the written content is to be saved, no discard request is made.
The cooperation between a general application and the writing application has been described above. As described, in the present embodiment, even for a general application, when pen input is desired, the pen input application (writing application) can be started by an operation from within the general application. The user or administrator does not need to load image data or the like into the pen input application in advance, which reduces the burden of operation. Further, even if the general application is set to be non-writable, writing is performed on the superimposed screen, which further improves convenience. Also, since the written content is saved separately from the data on the written side, the written content can be read out at any time to check the writing state. In addition, the written content can easily be hidden by a screen-switching operation that brings the written application to the foreground.
In the operation processing described above, notifications from the general application to the writing application are made by interprocess communication or the like (inter-application communication), but the present disclosure is not limited to this. For example, the notifications may be made via a service program launched separately by the application execution unit 121.
<< 4. Eraser function >>
Subsequently, the eraser function according to the present embodiment will be described. The eraser function according to the present embodiment includes a function of erasing only the locus drawn on the writing application side, and a function of erasing the information displayed on the written application side together with the locus.
FIG. 11 is a diagram illustrating the two types of eraser function according to the present embodiment. The eraser function can be executed using, for example, the digital pen 400. When handwriting written on a bubble image, as in screen example 550 of FIG. 11, is traced with the digital pen 400 while the eraser function is on, either only the written handwriting may be erased, as shown in screen example 551, or the handwriting together with the background bubble image may be erased, as shown in screen example 552.
Since the background bubble image is an image presented by the written application while the handwriting is an image presented by the writing application, it is possible to erase only the handwriting by, for example, reflecting the input information from the eraser function in the writing application alone. When eraser input is performed with the digital pen 400, the writing application can perform an operation process of erasing the handwriting displayed at the recognized position of the digital pen 400.
On the other hand, when the image presented by the written application in the background is also to be erased, conceivable methods include having the writing application notify the written application of the input information from the eraser function (such as the eraser input coordinates) by interprocess communication or the like, and having the writing application fill the area with the background color. A sketch contrasting the two behaviors is given below, and they are described in detail with reference to FIGS. 12 and 13.
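As a concrete illustration, the two behaviors can be contrasted on the writing application's RGBA screen as follows: clearing the alpha channel lets the written application's image show through again (only the handwriting is erased), whereas filling with an opaque background color also hides the written application's image. The white background color is an assumption matching the example of FIG. 11.

```python
import numpy as np

def erase_handwriting_only(screen_rgba: np.ndarray, mask: np.ndarray):
    """Make the traced region transparent again: the written
    application's image (e.g., the bubble) shows through unchanged."""
    screen_rgba[mask] = 0

def erase_with_background(screen_rgba: np.ndarray, mask: np.ndarray,
                          bg_color=(255, 255, 255)):
    """Fill the traced region with the written application's background
    color as an opaque layer, so the background also appears erased."""
    screen_rgba[mask, :3] = bg_color
    screen_rgba[mask, 3] = 255

screen = np.zeros((480, 640, 4), dtype=np.uint8)
screen[100:104, 300:304] = (0, 0, 0, 255)      # some handwriting
traced = np.zeros((480, 640), dtype=bool)
traced[98:106, 298:306] = True                 # region traced by the eraser
erase_handwriting_only(screen, traced)         # or erase_with_background(...)
```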
<4-1. When there is an eraser function cooperation>
FIG. 12 is a sequence diagram showing the operation processing for the case where there is eraser function cooperation according to the present embodiment. FIG. 12 assumes a case where eraser function cooperation can be added to the written application.
As shown in FIG. 12, when the writing application detects an eraser input with the digital pen 400 that also targets the background (step S303), it notifies the written application of the eraser input coordinates by interprocess communication or the like (step S306). Although omitted from FIG. 12, the writing application side naturally also performs the process of erasing the handwriting corresponding to the eraser input coordinates.
Next, the written application receives the eraser input coordinates from the writing application and performs a process of erasing the image (for example, a bubble image) corresponding to the eraser input coordinates (step S309). The written application has been given an additional function for receiving the eraser input coordinates from the writing application and performing the corresponding process (erasing).
This makes it possible to erase, in real time, the background image presented by the written application, which is distinct from the writing application, together with the handwriting.
<4-2. When there is no cooperation with the eraser function>
On the other hand, the eraser function for the case where the written application is an application whose data cannot be edited, an application to which an erasing process is difficult to add, or an application to which no cooperation function with the writing application has been added at all, will be described with reference to FIG. 13.
FIG. 13 is a flowchart showing the operation processing for the case where there is no eraser function cooperation according to the present embodiment. As shown in FIG. 13, the written application without a cooperation function is started according to an operation by the administrator or the like and displays an image (step S313).
Meanwhile, the writing application is also started according to an operation by the administrator or the like, and displays the background-transparent writing screen image superimposed on the image of the written application. In addition to drawing input with the digital pen 400, the writing application can also detect eraser input that targets the background (step S316).
Next, the writing application performs a process of filling the handwriting corresponding to the eraser input coordinates with the background color (step S319). The background color is the background color of the screen presented by the written application. In the example shown in FIG. 11, for instance, the background color is white. The state of screen example 552, for example, can be realized by filling the portion traced with the digital pen 400 with white, the background color of the rear-side screen (the screen presented by the written application), making both the handwriting and the background appear to have been erased.
Next, when an end operation is performed by the administrator or the like on the written application (step S322), the written application saves the latest data in the storage unit 150 (by overwriting or the like) and ends its operation (step S325). If nothing has been updated relative to the already saved data, the saving process need not be performed.
Next, the writing application acquires the saved data of the written application (step S328) and performs a process of compositing the writing screen image, including the background-color fill (eraser processing), with the acquired data (step S331). The compositing process means saving the acquired data with the background-transparent writing screen image, including the eraser processing, overlaid on it (it may be saved in an image format). This makes it possible to reflect the written content, including the eraser processing, in the written application's data regardless of its format. The composited data is stored in the storage unit 150. Alternatively, only the written content including the eraser processing (the writing screen image) may be stored in the storage unit 150; by superimposing it on the image of the written application at any time, the state in which the writing, including the eraser processing, was performed can be viewed.
When the written application is a general application consisting of a plurality of pages, the writing application may monitor the display screen (projected image) to detect page turning (a change in the rear-side screen). When a page is turned, the writing application may save the current writing screen image (page information may be added) and display a new writing screen image on the display screen.
Further, when the writing application, or another application, is provided with a function of reflecting the information of the writing screen image, including the eraser processing, in the data format of the written application, the compositing process may be an editing process that reflects the information of the writing screen image, including the eraser processing, directly in the written application's data format.
<4-3. Switching the eraser function>
 In the eraser function according to the present embodiment, the function of erasing only the handwriting and the function of erasing both the handwriting and the background may be switched as needed.
 For example, the writing application may let the user select the function via a GUI (graphical user interface). FIG. 14 shows an example of a selection GUI for the eraser function according to the present embodiment. In screen example 560 of FIG. 14, an icon 561 for selecting the function that erases both the handwriting and the background and an icon 562 for selecting the function that erases only the handwriting are displayed, so the eraser function can be selected intuitively. Each icon may be selected with the digital pen 400 or by a hand touch operation; the touch operation can be detected by a touch sensor, a depth sensor, or the like.
 The eraser function may also be switched according to the drawing tool. FIG. 15 illustrates selection of the eraser function by switching drawing tools according to the present embodiment. Screen example 570 of FIG. 15 shows a case in which only the handwriting is erased when the user rubs with a finger, and screen example 572 shows a case in which both the handwriting and the background are erased when the user erases with the digital pen 400. In this way, only the handwriting may be erased when a hand or finger is used, while the background is also erased when the digital pen 400 is used. As means for detecting a hand or finger, finger recognition using a depth sensor or a method using a ranging sensor such as LiDAR is conceivable; for example, LiDAR can continuously monitor a plane parallel to the drawing surface and detect the presence of a finger when the drawing surface is touched. A touch sensor may also be provided on the drawing surface.
 The eraser function may also be switched using a drawing tool other than the digital pen 400. For example, the function that erases only the handwriting may be executed when the digital pen 400 is used, and the function that also erases the background may be executed when a mouse operation is performed.
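 A minimal sketch of this tool-based switching, assuming the input event already carries an identifier of the detected tool. The enum values, tool strings, and the specific mapping (the publication describes both pen-erases-background and pen-erases-handwriting-only variants) are illustrative.

```python
from enum import Enum, auto

class EraseMode(Enum):
    HANDWRITING_ONLY = auto()   # erase strokes, keep the back-side screen visible
    WITH_BACKGROUND = auto()    # also paint over with the background color

def erase_mode_for_tool(tool: str) -> EraseMode:
    """Map the detected drawing tool to an eraser function.

    'finger' could come from depth-sensor/LiDAR detection, 'pen' from the
    digital pen 400, and 'mouse' from an ordinary mouse event. The pen/mouse
    assignment can be swapped, as in the alternative described above.
    """
    if tool == "finger":
        return EraseMode.HANDWRITING_ONLY
    if tool in ("pen", "mouse"):
        return EraseMode.WITH_BACKGROUND
    return EraseMode.HANDWRITING_ONLY  # conservative default
```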
 The eraser function may also be switched depending on the operation method. For example, the function that erases only the handwriting may be executed when a specific place is traced once within a unit time, and the function that also erases the background may be executed when the place is traced repeatedly within a certain time. More specifically, tracing the same place once within one second may erase only the handwriting, while tracing it repeatedly may also erase the background.
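 The following sketch illustrates one way to realize the timing rule above. The one-second window and the grid-based notion of "same place" are assumptions made for the example.

```python
import time
from collections import defaultdict

class RepeatTraceDetector:
    """Count how often the same region is traced within a time window
    (e.g. 1 s) to choose between the two eraser functions."""

    def __init__(self, window_s: float = 1.0, cell_px: int = 32):
        self.window_s = window_s
        self.cell_px = cell_px
        self._hits = defaultdict(list)  # grid cell -> recent timestamps

    def erase_background(self, x: float, y: float) -> bool:
        cell = (int(x) // self.cell_px, int(y) // self.cell_px)
        now = time.monotonic()
        hits = [t for t in self._hits[cell] if now - t <= self.window_s]
        hits.append(now)
        self._hits[cell] = hits
        # One trace within the window: handwriting only; repeated: background too.
        return len(hits) > 1
```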
<<5. Others>>
<5-1. Adjusting the position of the writing area display>
 In the present embodiment, when an image that clearly indicates the place (area) where the user writes is presented by the application to be written, usability can be improved by adjusting the display position of that image according to the profile of the user who writes.
 FIG. 16 illustrates adjustment of the display positions of bubble images 581 to 583 according to the positions of the users' heads according to the present embodiment. In FIG. 16, bubble images 581 to 583 are displayed in projected image 52. The bubble images 581 to 583 are presented by the application to be written and are an example of images that clearly indicate the place (area) where a user writes.
 For example, when the application to be written divides the screen vertically into areas to accommodate writing by a plurality of people and displays one bubble image in each divided area, it detects the head position and height of each user who writes and controls each bubble image to be displayed at a height within that user's reach. In the example shown in FIG. 16, bubble images 581 to 583 are displayed at heights appropriate to the head positions and heights of users U1 to U3. This enables display control suited to each user, whether an adult, a child, or a user seated in a wheelchair.
 As means for sensing the user's head position, height, and so on, an RGB camera, a depth sensor, a line sensor for height measurement, or the like may be used. The height may also be calculated by measuring the distance from overhead with an ultrasonic sensor.
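 A minimal sketch of this position adjustment, assuming a sensed head height is available; the 0.85 factor and the clamping bounds are invented for the example and are not taken from the publication.

```python
def bubble_display_height(head_height_cm: float,
                          min_cm: float = 60.0, max_cm: float = 180.0) -> float:
    """Choose the display height (distance from the floor) of a bubble image
    so that it stays within the writing user's reach.

    head_height_cm: head position sensed e.g. with a depth sensor or an
    overhead ultrasonic distance measurement.
    """
    # Place the bubble slightly below eye level, clamped to the projected area.
    target = head_height_cm * 0.85
    return max(min_cm, min(target, max_cm))
```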
<5-2. Changing the writable area>
 In the present embodiment, the writable area of the screen presented by the writing application may be changed in conjunction with UI elements that carry meaning on the screen presented by the application to be written. When the UI of the application to be written changes dynamically, the writable area of the writing application may change dynamically as well.
 The writable area is the area in which a handwriting image written with the digital pen 400 is displayed. For example, when the writable area is linked in real time to the contour of a bubble image presented by the application to be written, the writing application may disallow writing outside the contour of the bubble image (no handwriting image is displayed there).
 Even in a state in which writing outside the contour of the bubble image is possible in real time (the entire handwriting image is displayed), handwriting that protrudes beyond the contour may be excluded from combination when an image is separated from the second image 510 and combined along the contour of the bubble image, as described with reference to FIG. 8.
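 A sketch of this clipping, assuming the bubble contour is available as a binary mask image; the names and the use of NumPy are illustrative choices.

```python
import numpy as np

def clip_to_contour(overlay_rgba: np.ndarray,
                    contour_mask: np.ndarray) -> np.ndarray:
    """Zero the alpha channel of handwriting outside the bubble contour so it
    is not displayed (real-time case) or not combined (saving case).

    overlay_rgba: H x W x 4 writing screen image with transparent background.
    contour_mask: H x W boolean array, True inside the bubble image.
    """
    clipped = overlay_rgba.copy()
    clipped[..., 3] = np.where(contour_mask, clipped[..., 3], 0)
    return clipped
```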
<<6. Supplement>>
 As described above, the information processing device according to the embodiment of the present disclosure can enhance the convenience of user input by superimposing user-input information on a screen presented by a different application.
 Although preferred embodiments of the present disclosure have been described in detail above with reference to the accompanying drawings, the present technology is not limited to these examples. It is evident that a person with ordinary knowledge in the technical field of the present disclosure could conceive of various changes or modifications within the scope of the technical ideas set forth in the claims, and these are naturally understood to belong to the technical scope of the present disclosure.
 For example, the display in which the writing image is superimposed on the image to be written is not limited to display by projection; a display device other than the projector 210 may be used, such as a TV device, a display, a tablet terminal, a smartphone, a mobile phone, a PC (personal computer), a touch panel display, an HMD (head-mounted display), or a wearable device.
 It is also possible to create a computer program for causing hardware such as the CPU, ROM, and RAM built into the information processing device 100 or the digital pen 400 described above to exhibit the functions of the information processing device 100 or the digital pen 400. A computer-readable storage medium storing the computer program is also provided.
 The effects described in the present specification are merely explanatory or illustrative and are not limiting. That is, the technology according to the present disclosure may exhibit other effects apparent to those skilled in the art from the description of the present specification, in addition to or instead of the above effects.
 The present technology can also have the following configurations.
(1)
 An information processing device comprising a control unit that displays and outputs a first screen presented by a first application and a second screen, with a transparent background, presented by a second application that accepts writing input,
 wherein the control unit
  performs control to superimpose and display the second screen on the first screen, and
  performs image processing using the second screen, including the display of the writing input, and the first screen.
(2)
 The information processing device according to (1), wherein the control unit performs a process of generating a composite image in which the second screen is combined with the first screen.
(3)
 The information processing device according to (1) or (2), wherein the control unit performs a process of generating a composite image by combining an image obtained by separating at least a part of the second screen, according to layout information extracted from the first screen, with a partial image of the first screen corresponding to the layout information.
(4)
 The information processing device according to (3), wherein the layout information includes contour information of an image that clearly indicates a writing area included in the first screen.
(5)
 The information processing device according to any one of (1) to (4), wherein the second application recognizes the position of an input device, accepts a locus drawn on a drawing surface with the input device as the writing input, and displays a handwriting image showing the locus on the second screen.
(6)
 The information processing device according to (5), wherein the control unit performs control to display, on the drawing surface, a display screen in which the second screen is superimposed on the first screen.
(7)
 The information processing device according to (6), wherein the control unit performs control to project the display screen onto the drawing surface.
(8)
 The information processing device according to any one of (1) to (7), wherein the first application controls activation of the second application.
(9)
 The information processing device according to (8), wherein the first application controls saving of the second screen presented by the second application.
(10)
 The information processing device according to any one of (1) to (9), wherein the control unit performs the image processing using the second screen and the first screen after the writing input is completed and the second screen is saved.
(11)
 The information processing device according to any one of (1) to (10), wherein the second application accepts an input for erasing the display of the writing input on the second screen.
(12)
 The information processing device according to (11), wherein, upon accepting an input for erasing the display of the writing input and the corresponding display of the first screen, the second application performs a process of filling the target portion with the background color of the first screen.
(13)
 The information processing device according to (11), wherein, upon accepting an input for erasing the display of the writing input and the corresponding display of the first screen, the second application notifies the first application of coordinate information indicating the target portion of the erasing input.
(14)
 An information processing method comprising, by a processor:
  displaying and outputting a first screen presented by a first application and a second screen, with a transparent background, presented by a second application that accepts writing input;
  performing control to superimpose and display the second screen on the first screen; and
  performing image processing using the second screen, including the display of the writing input, and the first screen.
(15)
 A program causing a computer to function as a control unit that displays and outputs a first screen presented by a first application and a second screen, with a transparent background, presented by a second application that accepts writing input,
 wherein the control unit
  performs control to superimpose and display the second screen on the first screen, and
  performs image processing using the second screen, including the display of the writing input, and the first screen.
 100 Information processing device
  110 I/F unit
  120 Control unit
   121 Application execution unit
   122 Display image generation unit
   123 Display control unit
  130 Operation input unit
  140 Display unit
  150 Storage unit
 200 Display device
  210 Projector
 300 Sensor
  310 Camera
 400 Digital pen
  410 Communication module
  420 Control unit
  430 IR LED
  440 Switch

Claims (15)

  1. An information processing device comprising a control unit that displays and outputs a first screen presented by a first application and a second screen, with a transparent background, presented by a second application that accepts writing input,
     wherein the control unit
      performs control to superimpose and display the second screen on the first screen, and
      performs image processing using the second screen, including the display of the writing input, and the first screen.
  2. The information processing device according to claim 1, wherein the control unit performs a process of generating a composite image in which the second screen is combined with the first screen.
  3. The information processing device according to claim 1, wherein the control unit performs a process of generating a composite image by combining an image obtained by separating at least a part of the second screen, according to layout information extracted from the first screen, with a partial image of the first screen corresponding to the layout information.
  4. The information processing device according to claim 3, wherein the layout information includes contour information of an image that clearly indicates a writing area included in the first screen.
  5. The information processing device according to claim 1, wherein the second application recognizes the position of an input device, accepts a locus drawn on a drawing surface with the input device as the writing input, and displays a handwriting image showing the locus on the second screen.
  6. The information processing device according to claim 5, wherein the control unit performs control to display, on the drawing surface, a display screen in which the second screen is superimposed on the first screen.
  7. The information processing device according to claim 6, wherein the control unit performs control to project the display screen onto the drawing surface.
  8. The information processing device according to claim 1, wherein the first application controls activation of the second application.
  9. The information processing device according to claim 8, wherein the first application controls saving of the second screen presented by the second application.
  10. The information processing device according to claim 1, wherein the control unit performs the image processing using the second screen and the first screen after the writing input is completed and the second screen is saved.
  11. The information processing device according to claim 1, wherein the second application accepts an input for erasing the display of the writing input on the second screen.
  12. The information processing device according to claim 11, wherein, upon accepting an input for erasing the display of the writing input and the corresponding display of the first screen, the second application performs a process of filling the target portion with the background color of the first screen.
  13. The information processing device according to claim 11, wherein, upon accepting an input for erasing the display of the writing input and the corresponding display of the first screen, the second application notifies the first application of coordinate information indicating the target portion of the erasing input.
  14. An information processing method comprising, by a processor:
      displaying and outputting a first screen presented by a first application and a second screen, with a transparent background, presented by a second application that accepts writing input;
      performing control to superimpose and display the second screen on the first screen; and
      performing image processing using the second screen, including the display of the writing input, and the first screen.
  15. A program causing a computer to function as a control unit that displays and outputs a first screen presented by a first application and a second screen, with a transparent background, presented by a second application that accepts writing input,
     wherein the control unit
      performs control to superimpose and display the second screen on the first screen, and
      performs image processing using the second screen, including the display of the writing input, and the first screen.
PCT/JP2021/023540 2020-08-11 2021-06-22 Information processing device for superimposing write screen image WO2022034745A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2022542592A JPWO2022034745A1 (en) 2020-08-11 2021-06-22

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2020-135562 2020-08-11
JP2020135562 2020-08-11

Publications (1)

Publication Number Publication Date
WO2022034745A1 true WO2022034745A1 (en) 2022-02-17

Family

ID=80247808

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2021/023540 WO2022034745A1 (en) 2020-08-11 2021-06-22 Information processing device for superimposing write screen image

Country Status (2)

Country Link
JP (1) JPWO2022034745A1 (en)
WO (1) WO2022034745A1 (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH06124183A (en) * 1992-10-13 1994-05-06 Toshiba Corp Multiwindow system
JP2001134415A (en) * 1999-11-09 2001-05-18 Oki Electric Ind Co Ltd Annotation method for display of arbitrary application
JP2016015098A (en) * 2014-07-03 2016-01-28 シャープ株式会社 Image processing apparatus and image processing method


Also Published As

Publication number Publication date
JPWO2022034745A1 (en) 2022-02-17


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21855824

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2022542592

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21855824

Country of ref document: EP

Kind code of ref document: A1