WO2022201413A1 - Drawing processing device, drawing processing system, drawing processing method, and drawing processing program - Google Patents


Info

Publication number
WO2022201413A1
Authority
WO
WIPO (PCT)
Prior art keywords
buffer
target
drawing target
unit
information
Prior art date
Application number
PCT/JP2021/012491
Other languages
French (fr)
Japanese (ja)
Inventor
智史 櫻井
Original Assignee
三菱電機株式会社 (Mitsubishi Electric Corporation)
Priority date
Filing date
Publication date
Application filed by 三菱電機株式会社 (Mitsubishi Electric Corporation)
Priority to PCT/JP2021/012491
Priority to JP2023508301A (patent JP7471512B2)
Publication of WO2022201413A1

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 15/00: 3D [Three Dimensional] image rendering

Definitions

  • The present disclosure relates to a drawing processing device that performs processing for drawing drawing information, a drawing processing system using the drawing processing device, a drawing processing method, and a drawing processing program.
  • A drawing target, such as a 2D or 3D model, may be displayed on a display device such as a display.
  • A two-dimensional drawing target is drawn as it is into the drawing destination buffer, whereas a three-dimensional drawing target is first projected into two dimensions and then drawn into the drawing destination buffer.
  • When the center of a pixel in the drawing destination buffer is included in the drawing target, that pixel is filled in and the drawing target is drawn.
  • Drawing targets come in various sizes: for example, there are small drawing targets of less than one pixel and drawing targets that span several pixels.
  • As a technique for drawing such drawing targets, the technique of Patent Document 1 is disclosed.
  • Japanese Patent Application Laid-Open No. 2002-200000 discloses a technique in which a drawing processing device performs high-resolution rendering only of specific drawing targets, such as small characters and thin lines, and renders the other drawing targets at low resolution.
  • Here, high-resolution rendering means performing processing on a pixel-by-pixel basis for all pixel positions, while low-resolution rendering means performing processing on a multi-pixel basis, handling only pixel positions whose X and Y coordinate values are even. Therefore, for example, a small drawing target with a drawing size of less than one pixel disappears from the display screen without being drawn if it is not positioned at the center of a pixel.
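The disappearance just described can be reproduced with a minimal center-sampling rasterizer. The following Python sketch is illustrative only (the grid size and triangle coordinates are assumptions, not taken from Patent Document 1): a pixel is filled only when its center lies inside the triangle, so a sub-pixel triangle that covers no pixel center produces no output at all.

```python
def point_in_triangle(px, py, tri):
    # Sign of the cross product tells which side of each edge the point is on;
    # the point is inside when all signs agree (or some are zero, on an edge).
    def side(ax, ay, bx, by):
        return (bx - ax) * (py - ay) - (by - ay) * (px - ax)
    (x1, y1), (x2, y2), (x3, y3) = tri
    d1 = side(x1, y1, x2, y2)
    d2 = side(x2, y2, x3, y3)
    d3 = side(x3, y3, x1, y1)
    has_neg = d1 < 0 or d2 < 0 or d3 < 0
    has_pos = d1 > 0 or d2 > 0 or d3 > 0
    return not (has_neg and has_pos)

def rasterize_centers(tri, width=8, height=8):
    # A pixel is filled only if its center (x + 0.5, y + 0.5) is inside.
    return [(x, y) for y in range(height) for x in range(width)
            if point_in_triangle(x + 0.5, y + 0.5, tri)]

# A triangle smaller than one pixel, sitting between pixel centers:
tiny = [(3.1, 3.1), (3.4, 3.1), (3.25, 3.4)]
print(rasterize_centers(tiny))  # -> [] : the drawing target disappears
```

With a target large enough to cover at least one pixel center, the same test fills pixels normally; the failure mode is specific to sub-pixel geometry.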
  • An object of the present disclosure is to provide a drawing processing device, a drawing processing system, a drawing processing method, and a drawing processing program with which a drawing target can be drawn and displayed on the screen even when it is not positioned at the center of a pixel.
  • To achieve this, the drawing processing device includes a dot plotting unit that plots a dot on the pixel in a buffer corresponding to the position of a drawing target of two or more dimensions, thereby drawing the drawing target into the buffer, and an output control unit that controls an output device so as to output the scene drawn in the buffer.
  • A drawing processing system includes an input device that receives input of a drawing target of two or more dimensions, a drawing processing device that performs drawing processing of drawing the drawing target as a scene in a buffer, and an output device that outputs the scene.
  • The drawing processing device includes a dot plotting unit that plots dots on the pixels in the buffer corresponding to the position of the drawing target, thereby drawing the drawing target into the buffer, and an output control unit that controls the output device so as to output the scene.
  • The drawing processing method includes plotting a dot on the pixel in a buffer corresponding to the position of a drawing target of two or more dimensions, thereby drawing the drawing target into the buffer, and controlling an output device so as to output the scene drawn in the buffer.
  • The drawing processing program causes a computer to execute a process of plotting a dot on the pixel in a buffer corresponding to the position of a drawing target of two or more dimensions, thereby drawing the drawing target into the buffer, and a process of controlling an output device so as to output the scene drawn in the buffer.
  • According to the present disclosure, the dot plotting unit plots a dot on the pixel in the buffer corresponding to the position of a drawing target of two or more dimensions, thereby drawing the drawing target into the buffer, and the output control unit controls the output device so as to output the scene drawn in the buffer. Therefore, even when a drawing target of two or more dimensions is not positioned at the center of a pixel, the drawing target is drawn and can be displayed on the display screen.
  • FIG. 1 is a block diagram of a drawing processing system including a drawing processing device according to Embodiment 1.
  • FIG. 2 is a hardware configuration diagram of the drawing processing system according to Embodiment 1.
  • FIG. 3 is a flowchart showing the operation of the drawing processing system according to Embodiment 1.
  • FIG. 4 is an example of drawing information according to Embodiment 1.
  • FIG. 5 is a flowchart showing the operation of the drawing processing device according to Embodiment 1.
  • FIG. 6 is a diagram of the drawing destination buffer and the detail buffer according to Embodiment 1.
  • FIG. 7 is a diagram in which the drawing result of drawing the non-applicable polygon model in the drawing destination buffer and that polygon model are superimposed.
  • FIG. 8 is a diagram showing the drawing result of drawing the non-applicable polygon model in the drawing destination buffer.
  • FIG. 9 is a conceptual diagram showing rough estimation of the area of a drawing target by projecting a three-dimensional drawing target into two dimensions using a bounding box.
  • FIG. 10 is a diagram in which the drawing result of drawing an applicable polygon model in the detail buffer and a plurality of polygon models are superimposed.
  • FIG. 11 is a diagram showing the drawing result of drawing an applicable polygon model in the detail buffer.
  • FIG. 12 is a diagram showing the drawing result of drawing another applicable polygon model in the detail buffer.
  • FIG. 13 is a diagram in which the drawing result of drawing an applicable polygon model in the detail buffer, the drawing result of drawing another applicable polygon model in the detail buffer, and a plurality of polygon models are superimposed.
  • FIG. 14 is a diagram showing the drawing result of drawing an applicable polygon model in the detail buffer and the drawing result of drawing another applicable polygon model in the detail buffer.
  • FIG. 15 is a diagram in which the drawing results of drawing the applicable polygon models in the detail buffer and a plurality of polygon models are superimposed.
  • FIG. 16 is a diagram showing the drawing results of drawing the applicable polygon models, including a further applicable polygon model, in the detail buffer.
  • FIG. 17 is a flowchart showing the operation of a drawing processing system according to Embodiment 2 of the present disclosure.
  • FIG. 18 is a flowchart showing the operation of a drawing processing device according to Embodiment 2 of the present disclosure.
  • FIG. 19 is a block diagram of a drawing processing system including a drawing processing device according to Embodiment 3.
  • FIG. 20 is a flowchart showing the operation of a drawing processing device according to Embodiment 3 of the present disclosure.
  • FIG. 21 is a diagram showing a contrast between a detail buffer corresponding to a polygon model to be drawn according to Embodiment 3 and the detail buffer according to Embodiment 1.
  • FIG. 22 is a diagram schematically showing position information, which is information indicating the position in the drawing destination buffer to which a detail buffer corresponds.
  • FIG. 23 is a diagram schematically showing position information indicating the position in the drawing destination buffer to which another detail buffer corresponds.
  • FIG. 24 is a diagram schematically showing position information indicating the position in the drawing destination buffer to which a further detail buffer corresponds.
  • FIG. 25 is a diagram schematically showing position information that summarizes, for each detail buffer, the position in the drawing destination buffer to which it corresponds.
  • FIG. 26 is a diagram in which the drawing result of drawing an applicable polygon model in a detail buffer, the drawing result of drawing another applicable polygon model in another detail buffer, and a plurality of applicable polygon models are superimposed.
  • FIG. 27 is a diagram showing the drawing result of drawing an applicable polygon model in a detail buffer and the drawing result of drawing another applicable polygon model in another detail buffer.
  • In Embodiment 1, a case will be described below in which the present disclosure is applied when drawing information, which is information for drawing an entire scene, contains a drawing target that is not positioned at the center of a pixel.
  • In Embodiment 1, drawing targets are divided into applicable drawing targets, to which detailed drawing processing is applied before they are drawn into the drawing destination buffer, and non-applicable drawing targets, to which detailed drawing processing is not applied and which are drawn into the drawing destination buffer as they are.
  • The drawing processing system draws the non-applicable drawing targets as they are into the drawing destination buffer, calculates the area of each applicable drawing target, and determines from the calculated area an alpha value representing the transparency of that applicable drawing target.
  • The drawing processing system draws the applicable drawing targets into a detail buffer, which is a buffer separate from the drawing destination buffer, and acquires the number of drawn pixels.
  • Based on the number of drawn pixels acquired for each applicable drawing target, the drawing processing system identifies the lost applicable drawing targets, that is, the applicable drawing targets that disappeared because they were not drawn in the detail buffer, plots a dot on the pixel in the detail buffer corresponding to the position of each lost applicable drawing target, and thereby draws the lost applicable drawing target into the detail buffer.
  • The drawing processing system then draws the applicable drawing targets, including the lost ones, into the drawing destination buffer by alpha-blending their RGB values with their alpha values at the pixels corresponding to the applicable drawing targets drawn in the detail buffer. By doing so, even if a drawing target is not positioned at the center of a pixel, the drawing target can be drawn and displayed on the display screen. Details will be described below.
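The processing just outlined can be sketched in a few lines of Python. This is an illustrative sketch only: the names (`Target`, `draw_with_rescue`) and the area-to-alpha rule are assumptions made for this example, not the claimed implementation.

```python
class Target:
    def __init__(self, position, rgb, area):
        self.position = position  # (x, y) position in buffer coordinates
        self.rgb = rgb            # RGB values of the drawing target
        self.area = area          # projected area in pixels

def alpha_from_area(area_in_pixels):
    # A target covering less than one pixel becomes partially transparent;
    # larger targets stay opaque (an assumed rule, for illustration only).
    return min(area_in_pixels, 1.0)

def draw_with_rescue(target, detail_buffer, rasterize):
    """Draw into the detail buffer; if the drawn-pixel count is zero, the
    target is a lost applicable drawing target, so plot one dot for it."""
    drawn = rasterize(target, detail_buffer)
    if not drawn:  # disappeared: no pixel center fell inside the target
        x = min(int(target.position[0]), len(detail_buffer[0]) - 1)
        y = min(int(target.position[1]), len(detail_buffer) - 1)
        detail_buffer[y][x] = target.rgb  # plot a single dot
        drawn = [(x, y)]
    return drawn

# A sub-pixel target that a center-sampling rasterizer misses entirely:
detail = [[None] * 8 for _ in range(8)]
tiny = Target(position=(3.2, 3.4), rgb=(255, 0, 0), area=0.05)
pixels = draw_with_rescue(tiny, detail, lambda t, b: [])
print(pixels, alpha_from_area(tiny.area))  # one rescued pixel, alpha 0.05
```

The rescued dot carries the target's alpha value, so when it is later blended into the drawing destination buffer its small area is still reflected as partial transparency.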
  • FIG. 1 is a block diagram of a drawing processing system 1 including a drawing processing device 100 according to the first embodiment.
  • The drawing processing system 1 includes an input device 4 that receives input of drawing information and of tag information indicating, for each drawing target included in the drawing information, whether or not detailed drawing processing is to be applied; a drawing processing device 100 that draws the drawing information using the drawing information and the tag information; and an output device 5 that outputs the scene processed by the drawing processing device 100.
  • The input device 4 receives, for example, the drawing information of an entire scene, including information on drawing targets such as polygon models, input by the user.
  • The user tags each drawing target as either an applicable drawing target or a non-applicable drawing target based on its drawing size, brightness, importance, and the like.
  • The input device 4 receives the user's tagging of each drawing target as tag information.
  • The drawing processing device 100 performs drawing processing that draws the drawing information as a scene into the drawing destination buffer. Specifically, the drawing processing device 100 draws into the drawing destination buffer the non-applicable drawing targets tagged as not subject to detailed drawing processing. The drawing processing device 100 calculates the area of each applicable drawing target tagged as subject to detailed drawing processing, and determines from the calculated area an alpha value representing the transparency of that applicable drawing target. The drawing processing device 100 draws the applicable drawing targets into the detail buffer, which is a buffer separate from the drawing destination buffer, and acquires the number of drawn pixels. From the number of drawn pixels acquired for each applicable drawing target, the drawing processing device 100 identifies the lost applicable drawing targets, that is, the applicable drawing targets that disappeared without being drawn in the detail buffer.
  • The drawing processing device 100 draws each lost applicable drawing target into the drawing destination buffer by alpha-blending its RGB values with its alpha value at the pixels corresponding to the lost applicable drawing target drawn in the detail buffer. In this way, by plotting a dot on the pixel in the detail buffer corresponding to the position of the lost applicable drawing target and thereby drawing it into the detail buffer, the drawing target is drawn and can be displayed on the display screen even when it is not positioned at the center of a pixel.
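The alpha blend into the drawing destination buffer can be sketched as the standard source-over operation. This is a hedged illustration; the patent does not fix the exact blending equation, so the conventional formula is assumed here.

```python
def alpha_blend(dst_rgb, src_rgb, alpha):
    # Source-over compositing: out = src * alpha + dst * (1 - alpha).
    return tuple(s * alpha + d * (1.0 - alpha)
                 for s, d in zip(src_rgb, dst_rgb))

# A target covering a quarter of a pixel contributes a quarter of its color:
print(alpha_blend((0.0, 0.0, 0.0), (1.0, 1.0, 1.0), 0.25))  # -> (0.25, 0.25, 0.25)
```

Because the alpha value was derived from the target's area, a sub-pixel target tints its pixel in proportion to how much of the pixel it would actually cover, instead of either vanishing or filling the pixel completely.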
  • The output device 5 outputs the scene drawn in the drawing destination buffer by the drawing processing device 100.
  • The output device 5 is, for example, a display device such as a display, and displays the scene drawn in the drawing destination buffer by the drawing processing device 100.
  • The drawing processing device 100 includes: an acquisition unit 101 that acquires the drawing information, including the drawing targets, and the tag information from the input device 4; a creation unit 102 that creates the drawing destination buffer and the detail buffer; an application determination unit 106 that uses the tag information to determine whether each drawing target is a non-applicable drawing target, which is drawn into the drawing destination buffer, or an applicable drawing target, which is drawn into the detail buffer, a buffer separate from the drawing destination buffer; a non-applicable drawing target drawing unit 107 that draws into the drawing destination buffer the non-applicable drawing targets tagged as not subject to detailed drawing processing; a calculation unit 104 that calculates the area of each applicable drawing target; a determination unit 105 that determines the alpha value of each applicable drawing target from its calculated area; an applicable drawing target drawing unit 108 that draws the applicable drawing targets into the detail buffer and acquires the number of drawn pixels; a dot plotting unit 112 that, from the number of drawn pixels acquired for each applicable drawing target, identifies the lost applicable drawing targets that disappeared without being drawn in the detail buffer, plots a dot on the pixel in the detail buffer corresponding to the position of each lost applicable drawing target, and thereby draws it into the detail buffer; a blend processing unit 110 that draws the applicable drawing targets, including the lost ones, into the drawing destination buffer by alpha-blending their RGB values with their alpha values; an output control unit 111 that controls the output device so as to output the scene drawn in the drawing destination buffer; a drawing destination buffer storage unit 103 that stores the drawing destination buffer; and a detail buffer storage unit 109 that stores the detail buffer.
  • The drawing destination buffer corresponds to the first buffer, and the detail buffer corresponds to the second buffer.
  • The non-applicable drawing target corresponds to the first drawing target, and the applicable drawing target corresponds to the second drawing target.
  • The non-applicable drawing target drawing unit 107 corresponds to the first drawing target drawing unit, and the applicable drawing target drawing unit 108 corresponds to the second drawing target drawing unit.
  • FIG. 2 is a hardware configuration diagram of the drawing processing system 1 according to the first embodiment.
  • The hardware configuration of the drawing processing system 1 including the drawing processing device 100 according to the first embodiment will be described with reference to FIG. 2.
  • The drawing processing system 1 includes a bus 41, a CPU (Central Processing Unit) 42, a main memory 43, a GPU (Graphics Processing Unit) 44, a graphic memory 45, an input interface 46, and an output interface 47.
  • The bus 41 is a signal path that electrically connects the devices and exchanges data between them.
  • The devices of the drawing processing system 1 are connected via communication.
  • The communication used may be wired or wireless.
  • The CPU 42 reads and executes programs stored in the main memory 43. Specifically, the CPU 42 loads at least part of the OS (Operating System) stored in the main memory 43 and executes programs while executing the OS.
  • The CPU 42 is connected to the other devices via the bus 41 and controls those devices.
  • The CPU 42 may be an IC (Integrated Circuit) that performs processing, such as a microprocessor, a microcomputer, or a DSP (Digital Signal Processor).
  • The main memory 43 stores the OS and programs describing software, firmware, or a combination of software and firmware.
  • The main memory 43 also stores various other information.
  • The main memory 43 is, for example, a volatile semiconductor memory such as RAM (Random Access Memory); a non-volatile semiconductor memory such as ROM (Read Only Memory), flash memory, EPROM (Erasable Programmable Read Only Memory), or EEPROM (Electrically Erasable Programmable Read Only Memory); an HDD (Hard Disk Drive); or a portable recording medium such as a magnetic disk, a flexible disk, an optical disc, a compact disc, a mini disc, or a DVD (Digital Versatile Disc).
  • The GPU 44 reads programs stored in the graphic memory 45 and executes drawing processing. Specifically, the GPU 44 loads at least part of the OS stored in the graphic memory 45 and executes the drawing processing program while executing the OS.
  • The functions of the acquisition unit 101, the creation unit 102, the application determination unit 106, the non-applicable drawing target drawing unit 107, the calculation unit 104, the determination unit 105, the applicable drawing target drawing unit 108, the dot plotting unit 112, the blend processing unit 110, and the output control unit 111 are implemented by the GPU 44 reading and executing programs loaded into the graphic memory 45.
  • The graphic memory 45 stores the OS and programs describing software, firmware, or a combination of software and firmware, as well as various other information.
  • The graphic memory 45 is, for example, a volatile semiconductor memory such as RAM; a non-volatile semiconductor memory such as ROM, flash memory, EPROM, or EEPROM; an HDD; or a portable recording medium such as a magnetic disk, a flexible disk, an optical disc, a compact disc, a mini disc, or a DVD.
  • The blend processing unit 110 and the output control unit 111 are implemented by programs stored in the graphic memory 45.
  • The drawing destination buffer storage unit 103, which stores the drawing destination buffer, and the detail buffer storage unit 109, which stores the detail buffer, are implemented by the graphic memory 45.
  • The functions of the acquisition unit 101, the creation unit 102, the application determination unit 106, the non-applicable drawing target drawing unit 107, the calculation unit 104, the determination unit 105, the applicable drawing target drawing unit 108, the dot plotting unit 112, the blend processing unit 110, and the output control unit 111 may be realized by separate memories.
  • Likewise, the drawing destination buffer storage unit 103 and the detail buffer storage unit 109 may be realized by separate memories.
  • The processing that includes the drawing processing performed by the drawing processing device 100 is performed by the GPU 44 and the graphic memory 45, and the other processing is performed by the CPU 42 and the main memory 43.
  • The drawing processing system 1 may instead be provided with only one of the CPU 42 and the GPU 44, which then realizes all of the processing. Likewise, the drawing processing system 1 may be provided with only one of the main memory 43 and the graphic memory 45, which then realizes the functions of the other memory as well.
  • The information of each unit may be stored in either the main memory 43 or the graphic memory 45, or distributed between the two.
  • The input interface 46 is a device that accepts input of the drawing information and the tag information.
  • The input interface 46 accepts, for example, input from a keyboard, a mouse, or voice input, as well as information received via communication.
  • The input device 4 is implemented by the input interface 46.
  • The output interface 47 is a device that outputs the drawn scene.
  • The output interface 47 is, for example, a display device such as a display or a projector.
  • The output device 5 is implemented by the output interface 47.
  • Each function of the drawing processing system 1 can be realized by hardware, software, firmware, or a combination thereof.
  • Each function of the drawing processing system 1 may be partly realized by dedicated hardware and partly realized by software or firmware.
  • A part of the drawing processing system 1 may realize its functions by a processing circuit as dedicated hardware, while the remaining part realizes its functions by a CPU reading and executing programs stored in memory.
  • Software, firmware, or a combination of software and firmware is described in the form of programs.
  • FIG. 3 is a flow chart showing the operation of the drawing processing system 1 according to the first embodiment. The operation of the drawing processing system 1 will be described below with reference to FIG.
  • In step S1, the input device 4 accepts input of drawing information including drawing targets.
  • For example, the user specifies drawing information using a mouse, a keyboard, or the like and inputs it to the input device 4, which receives the input.
  • The drawing information received by the input device 4 need not be specified by the user; it may be any information, such as drawing information received via communication, drawing information automatically extracted as satisfying a predetermined criterion, or drawing information extracted by machine learning.
  • Here, the drawing targets are three-dimensional polygon models, and the drawing information includes information on the drawing targets, which are polygon models.
  • The information on a drawing target, which is a polygon model, includes RGB values, position information, and the like.
  • FIG. 4 is an example of drawing information according to the first embodiment.
  • the drawing information includes a polygon model 51, a polygon model 52, a polygon model 53, and a polygon model 54 to be drawn.
  • In FIG. 4, the polygon models 51 to 54 are displayed as if drawn on a buffer of the same size as the drawing destination buffer so that the drawing flow is easy to follow.
  • The input device 4 receives, together with the drawing information, tag information, which is information indicating whether each drawing target included in the received drawing information is an applicable drawing target or a non-applicable drawing target.
  • Each piece of tag information corresponds to one drawing target.
  • The user determines whether each polygon model to be drawn is large enough that detailed drawing processing is unnecessary, or small enough that detailed drawing processing should be applied.
  • Using the mouse, keyboard, voice input, or the like, the user tags the drawing targets judged to be large as non-applicable drawing targets, and those judged to be small as applicable drawing targets.
  • The tag information is input to the input device 4, which receives the input.
  • Assume that, for the drawing information of FIG. 4, the user judges that the polygon models 51 to 53 are small enough that detailed drawing processing should be applied, and that the polygon model 54 is large enough that detailed drawing processing is unnecessary.
  • Accordingly, assume that the input device 4 receives, as input by the user, tag information tagging the polygon model 51 as an applicable drawing target, tag information tagging the polygon model 52 as an applicable drawing target, tag information tagging the polygon model 53 as an applicable drawing target, and tag information tagging the polygon model 54 as a non-applicable drawing target.
  • The criterion for determining whether a drawing target is an applicable or non-applicable drawing target is not limited to the size of the drawing target.
  • The user may make the determination based on brightness, importance, or the like.
  • The user may also make the determination based on a combination of drawing size, brightness, importance, and the like.
  • For example, the user may set drawing targets with a high degree of importance as applicable drawing targets and drawing targets with a low degree of importance as non-applicable drawing targets.
  • Alternatively, the user may regard particular drawing targets of interest as applicable drawing targets and the other drawing targets as non-applicable drawing targets. Whether a drawing target is an applicable or non-applicable drawing target may be determined according to the user's preference.
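The manual tagging described above could also be approximated programmatically. The sketch below is a hypothetical heuristic (the threshold, field names, and the importance rule are all assumptions, not part of the disclosure, which leaves tagging to the user): targets whose projected drawing size falls below a few pixels, or whose importance is high, are tagged as applicable.

```python
def tag_drawing_target(projected_area_px, importance=0.0, area_threshold=4.0):
    """Return 'applicable' (detailed drawing processing) or 'non-applicable'.

    Small targets risk vanishing under center sampling, and important
    targets warrant the detail-buffer path; everything else is drawn as-is.
    """
    if projected_area_px < area_threshold or importance > 0.8:
        return "applicable"
    return "non-applicable"

print(tag_drawing_target(0.5))    # sub-pixel model: applicable
print(tag_drawing_target(40.0))   # large model: non-applicable
```

Applied to the example of FIG. 4, such a rule would reproduce the user's tagging: the small polygon models 51 to 53 come out applicable and the large polygon model 54 non-applicable.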
  • The input device 4 transmits the drawing information and the tag information to the drawing processing device 100, and the process proceeds to step S2.
  • In step S2, the drawing processing device 100 receives the drawing information and the tag information from the input device 4.
  • The drawing processing device 100 performs drawing processing using the received drawing information and tag information. The details will be described later.
  • The drawing processing device 100 controls the output device 5 so as to output the scene drawn in the drawing destination buffer, and the process proceeds to step S3.
  • In step S3, under the control of the drawing processing device 100, the output device 5 outputs the scene drawn in the drawing destination buffer to a display device such as a display.
  • After step S3 is executed, the process returns to step S1, and the above processing is repeated until a trigger to end it occurs, such as the power being turned off or an end operation being performed. Although the above processing is repeated here, it may instead be performed only once.
  • FIG. 5 is a flowchart showing the operation of the drawing processing device 100 according to the first embodiment. The operation of the drawing processing device 100 will be described below with reference to FIG. 5.
  • In step S11, the acquisition unit 101 acquires the drawing information and the tag information from the input device 4.
  • The acquisition unit 101 transmits the drawing information and the tag information to the creation unit 102, and the process proceeds to step S12.
  • In step S12, the creation unit 102 receives the drawing information and the tag information from the acquisition unit 101 and creates the drawing destination buffer in the drawing destination buffer storage unit 103. The creation unit 102 also creates the detail buffer in the detail buffer storage unit 109. All applicable drawing targets are drawn in the detail buffer. Note that the detail buffer has the same resolution as the drawing destination buffer.
  • FIG. 6 is a diagram of the drawing destination buffer 55 and the detail buffer 56 according to the first embodiment.
  • The drawing destination buffer 55 is 8 pixels high by 8 pixels wide, for a total of 64 pixels.
  • The detail buffer 56 is also 8 pixels high by 8 pixels wide, for a total of 64 pixels. That is, the detail buffer 56 has the same resolution as the drawing destination buffer 55.
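The two equally sized buffers of FIG. 6 can be sketched as plain pixel arrays. The RGBA layout and the clear color are assumptions for illustration; the patent only requires that the two buffers have the same resolution.

```python
def create_buffer(width, height, clear=(0, 0, 0, 0)):
    # One RGBA value per pixel, initialized to the clear color.
    return [[clear for _ in range(width)] for _ in range(height)]

drawing_destination_buffer = create_buffer(8, 8)  # 8 x 8 = 64 pixels
detail_buffer = create_buffer(8, 8)               # same resolution as above
```

Matching resolutions keep the mapping between the two buffers trivial: pixel (x, y) in the detail buffer corresponds directly to pixel (x, y) in the drawing destination buffer when the blend is applied later.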
  • the creation unit 102 transmits the drawing information and the tag information to the application determination unit 106, and proceeds to step S13.
  • In step S13, the application determination unit 106 receives the drawing information and the tag information from the creation unit 102, and uses the tag information to determine whether each drawing target is an applicable drawing target or a non-applicable drawing target.
  • the application determination unit 106 transmits the drawing information and the tag information of every drawing target that is a non-applicable drawing target to the non-applicable drawing target drawing unit 107.
  • the non-applicable drawing target drawing unit 107 receives the drawing information and the tag information of the non-applicable drawing targets from the application determination unit 106, and draws the non-applicable drawing targets in the drawing destination buffer of the drawing destination buffer storage unit 103.
  • Specifically, the application determination unit 106 transmits the drawing information of FIG. 4 and the tag information of the polygon model 54 tagged as a non-applicable drawing target to the non-applicable drawing target drawing unit 107. The non-applicable drawing target drawing unit 107 receives them from the application determination unit 106 and draws the polygon model 54 in the drawing destination buffer 55 of the drawing destination buffer storage unit 103.
  • FIG. 7 is a diagram in which the drawing result 57 obtained by drawing the non-applicable drawing target polygon model 54 in the drawing destination buffer 55 and the non-applicable drawing target polygon model 54 are superimposed.
  • the non-applicable drawing target drawing unit 107 fills the pixels in the drawing destination buffer 55 with the RGB values associated with the drawing target included in the drawing information so as to reproduce the shape of the polygon model 54.
  • FIG. 8 is a diagram showing the drawing result 57 obtained by drawing the non-applicable drawing target polygon model 54 in the drawing destination buffer 55.
  • FIG. 8 omits the superimposed display of the polygon model 54 of FIG. 7 and shows only the actual drawing result 57.
  • when the non-applicable drawing target drawing unit 107 finishes drawing the non-applicable drawing target in the drawing destination buffer, it transmits a notification of the end of drawing to the drawing destination buffer to the application determination unit 106 and the blend processing unit 110, and the process proceeds to step S14.
  • In step S14, when the application determination unit 106 receives the notification from the non-applicable drawing target drawing unit 107 that drawing to the drawing destination buffer is complete, the application determination unit 106 transmits the drawing information and the tag information of every drawing target that is an applicable drawing target to the calculation unit 104.
  • Specifically, the application determination unit 106 transmits the drawing information of FIG. 4, the tag information of the polygon model 51 tagged as an applicable drawing target, the tag information of the polygon model 52 tagged as an applicable drawing target, and the tag information of the polygon model 53 tagged as an applicable drawing target to the calculation unit 104.
  • the calculation unit 104 receives the drawing information and the tag information of the drawing targets that are applicable drawing targets from the application determination unit 106, and calculates the area of each applicable drawing target.
  • as for the area calculation method, the area of each applicable drawing target may be calculated accurately, or it may be calculated roughly by approximation using a bounding box or the like.
  • the area of the applied drawing target is calculated after the applied drawing target is projected and transformed into two dimensions.
  • FIG. 9 is a conceptual diagram showing how a bounding box 80 is used to roughly estimate the area of a three-dimensional drawing target 81 when it is projected and transformed into two dimensions.
  • the three-dimensional drawing target 81, shaped like a house, is surrounded by the bounding box 80, and the area 82 of the face of the bounding box 80 corresponding to the surface to be projected and transformed into two dimensions is taken as an approximate value of the area of the drawing target 81 when the three-dimensional drawing target 81 is projected and transformed into two dimensions.
  • here, one face of the bounding box 80 exactly overlaps the surface to be projected and transformed into two dimensions.
  • thereby, an approximate value of the area of the drawing target 81 when the three-dimensional drawing target 81 is projected and transformed into two dimensions can be obtained.
  • parallel projection has been described with reference to FIG. 9, but perspective projection or the like is also applicable.
  • In this way, the bounding box 80 is used to project the three-dimensional drawing target 81 into two dimensions and roughly calculate the area of the drawing target 81, so the drawing processing load can be reduced compared to obtaining the exact area. Note that the more complicated the shape of the drawing target 81, the higher the load of calculating its exact area, so roughly estimating the area increases the effect of reducing the drawing processing load.
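The rough area estimate using a bounding box can be sketched as follows, assuming parallel projection along the z axis onto the x-y plane; the function name and the axis-aligned bounding box are assumptions for illustration.

```python
# Illustrative sketch: approximate the projected 2-D area of a 3-D drawing
# target by the face of its axis-aligned bounding box that corresponds to the
# projection surface (parallel projection along the z axis is assumed).

def approx_projected_area(vertices):
    xs = [v[0] for v in vertices]
    ys = [v[1] for v in vertices]
    # Area of the bounding-box face seen from the projection direction.
    return (max(xs) - min(xs)) * (max(ys) - min(ys))

# A unit cube projects onto a 1 x 1 square, so the estimate is 1.0.
unit_cube = [(x, y, z) for x in (0.0, 1.0) for y in (0.0, 1.0) for z in (0.0, 1.0)]
```

For a slanted triangle the estimate is coarse by design; the point is that the min/max scan is far cheaper than an exact projected-area computation for a complicated shape.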
  • Specifically, the calculation unit 104 receives from the application determination unit 106 the drawing information, the tag information of the polygon model 51 tagged as an applicable drawing target, the tag information of the polygon model 52 tagged as an applicable drawing target, and the tag information of the polygon model 53 tagged as an applicable drawing target.
  • the calculation unit 104 calculates the areas of the polygon models 51-53. For example, assume that the calculation unit 104 calculates the area of the polygon model 51 to be 0.4 pixels, the area of the polygon model 52 to be 0.7 pixels, and the area of the polygon model 53 to be 1.2 pixels.
  • Hereinafter, the information indicating the area of an applicable drawing target calculated by the calculation unit 104 will be referred to as area information.
  • the calculating unit 104 transmits the drawing information, the tag information of the drawing target that is the applicable drawing target, and the area information of the applicable drawing target to the determining unit 105, and proceeds to step S15.
  • In step S15, the determination unit 105 receives from the calculation unit 104 the drawing information, the tag information of the drawing targets that are applicable drawing targets, and the area information of the applicable drawing targets.
  • the determination unit 105 determines an alpha value representing the transparency of each applicable drawing target using the area information. Specifically, if the area information indicates that the area of the applicable drawing target is 1.0 pixel or less, the determination unit 105 adopts that area as the alpha value of the applicable drawing target. On the other hand, if the area information indicates that the area of the applicable drawing target is larger than 1.0 pixel, the determination unit 105 sets the alpha value of the applicable drawing target to 1.0.
  • Specifically, the determination unit 105 receives from the calculation unit 104 the drawing information, the tag information of the polygon model 51 tagged as an applicable drawing target, the tag information of the polygon model 52 tagged as an applicable drawing target, the tag information of the polygon model 53 tagged as an applicable drawing target, area information indicating that the area of the polygon model 51 is 0.4 pixels, area information indicating that the area of the polygon model 52 is 0.7 pixels, and area information indicating that the area of the polygon model 53 is 1.2 pixels.
  • the determination unit 105 determines the alpha value of the polygon model 51 using the area information of the polygon model 51. Specifically, since the area information indicates that the area of the polygon model 51 is 0.4 pixels, which is 1.0 pixel or less, the determination unit 105 determines the alpha value of the polygon model 51 to be 0.4.
  • similarly, since the area information indicates that the area of the polygon model 52 is 0.7 pixels, which is 1.0 pixel or less, the determination unit 105 determines the alpha value of the polygon model 52 to be 0.7.
  • since the area information indicates that the area of the polygon model 53 is 1.2 pixels, which is larger than 1.0 pixel, the determination unit 105 determines the alpha value of the polygon model 53 to be 1.0.
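The alpha value determination rule of step S15 (the area itself for areas of 1.0 pixel or less, clamped to 1.0 otherwise) can be written compactly; the function name is illustrative, not from the disclosure.

```python
def decide_alpha(area_in_pixels):
    # Area of 1.0 pixel or less becomes the alpha value directly;
    # larger areas are clamped to fully opaque.
    return area_in_pixels if area_in_pixels <= 1.0 else 1.0
```

Applied to the example: the polygon models 51, 52, and 53 with areas 0.4, 0.7, and 1.2 pixels receive alpha values 0.4, 0.7, and 1.0, respectively.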
  • Hereinafter, the information indicating the alpha value of an applicable drawing target determined by the determination unit 105 will be referred to as alpha value information.
  • the determining unit 105 transmits the drawing information, the tag information of the drawing target that is the applicable drawing target, and the alpha value information of the applicable drawing target to the applicable drawing target drawing unit 108, and proceeds to step S16.
  • In step S16, the applicable drawing target drawing unit 108 receives the drawing information, the tag information of the drawing targets that are applicable drawing targets, and the alpha value information of the applicable drawing targets from the determination unit 105, and draws the applicable drawing targets in the detail buffer. At this time, the alpha value information is also stored in the detail buffer. In addition, the applicable drawing target drawing unit 108 acquires the number of pixels drawn in the detail buffer each time an applicable drawing target is drawn in the detail buffer.
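Step S16 can be sketched as follows for a triangular applicable drawing target: a pixel is filled only when its center lies inside the target, the alpha value is stored alongside the RGB values, and the number of filled pixels is returned. The names and the edge-function coverage test are illustrative assumptions, not the disclosed implementation.

```python
# Illustrative sketch of step S16: pixel-center coverage rasterization into
# the detail buffer, storing (rgb, alpha) and counting the filled pixels.

def draw_to_detail_buffer(detail_buffer, triangle, rgb, alpha):
    def edge(a, b, p):
        # Signed area test: which side of edge a->b the point p lies on.
        return (b[0] - a[0]) * (p[1] - a[1]) - (b[1] - a[1]) * (p[0] - a[0])

    filled = 0
    for row in range(len(detail_buffer)):
        for col in range(len(detail_buffer[0])):
            center = (col + 0.5, row + 0.5)  # pixel center
            d0 = edge(triangle[0], triangle[1], center)
            d1 = edge(triangle[1], triangle[2], center)
            d2 = edge(triangle[2], triangle[0], center)
            # The center is inside when all edge tests agree in sign.
            if (d0 >= 0 and d1 >= 0 and d2 >= 0) or (d0 <= 0 and d1 <= 0 and d2 <= 0):
                detail_buffer[row][col] = (rgb, alpha)
                filled += 1
    return filled

detail_buffer = [[None] * 8 for _ in range(8)]
# A small triangle that covers the center of exactly one pixel.
filled = draw_to_detail_buffer(detail_buffer, [(0.2, 0.2), (0.9, 0.2), (0.5, 0.9)], (255, 0, 0), 0.4)
# A sliver that covers no pixel center at all: it vanishes, like the polygon model 52.
vanished = draw_to_detail_buffer(detail_buffer, [(1.1, 1.1), (1.4, 1.1), (1.2, 1.4)], (0, 0, 255), 0.7)
```

The second call returning 0 is exactly the "pixel number 0" case that the dot printing unit later repairs in step S17.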
  • Specifically, the applicable drawing target drawing unit 108 receives from the determination unit 105 the drawing information, the tag information of the polygon model 51 tagged as an applicable drawing target, the tag information of the polygon model 52 tagged as an applicable drawing target, the tag information of the polygon model 53 tagged as an applicable drawing target, alpha value information indicating that the alpha value of the polygon model 51 is 0.4, alpha value information indicating that the alpha value of the polygon model 52 is 0.7, and alpha value information indicating that the alpha value of the polygon model 53 is 1.0.
  • the applicable drawing target drawing unit 108 draws the polygon model 51 in the detail buffer 56 of the detail buffer storage unit 109.
  • FIG. 10 is a diagram in which the drawing result 58 obtained by drawing the applicable drawing target polygon model 51 in the detail buffer 56 and the polygon models 51 to 53 are superimposed. Since the polygon model 51 includes the center of one pixel of the detail buffer 56, the polygon model 51 is drawn by filling that pixel with the RGB values associated with the polygon model 51 included in the drawing information. The drawing result of the polygon model 51 is the one-pixel drawing result 58 in FIG. 10.
  • FIG. 11 is a diagram showing the drawing result 58 of drawing the applicable drawing target polygon model 51 in the detail buffer 56.
  • FIG. 11 omits the superimposed display of the polygon models 51 to 53 of FIG. 10 and shows only the actual drawing result 58.
  • the applicable drawing target drawing unit 108 stores the alpha value of 0.4 for the polygon model 51 in the pixel of the drawing result 58.
  • the values in the pixels of FIG. 11 represent the stored alpha values.
  • after performing the process of drawing the polygon model 51 in the detail buffer 56, the applicable drawing target drawing unit 108 acquires the number of pixels of the detail buffer 56 actually filled by the polygon model 51.
  • the applicable drawing target drawing unit 108 acquires information that the number of pixels actually drawn in the detail buffer 56 is 1 pixel.
  • the applicable drawing target drawing unit 108 draws the polygon model 52 in the detail buffer 56 of the detail buffer storage unit 109.
  • after performing the process of drawing the polygon model 52 in the detail buffer 56, the applicable drawing target drawing unit 108 acquires the number of pixels of the detail buffer 56 actually filled by the polygon model 52.
  • here, not even one pixel of the detail buffer 56 is filled by the polygon model 52, so the applicable drawing target drawing unit 108 acquires information that the number of pixels is 0.
  • the applicable drawing target drawing unit 108 draws the polygon model 53 in the detail buffer 56 of the detail buffer storage unit 109.
  • FIG. 12 is a diagram in which the drawing result 58 obtained by drawing the applicable drawing target polygon model 51 in the detail buffer 56, the drawing result 59 obtained by drawing another applicable drawing target polygon model 53 in the detail buffer 56, and the polygon models 51 to 53 are superimposed. Since the polygon model 53 includes the center of one pixel of the detail buffer 56, the polygon model 53 is drawn by filling that pixel with the RGB values associated with the polygon model 53 included in the drawing information. The drawing result of the polygon model 53 is the one-pixel drawing result 59 in FIG. 12.
  • FIG. 13 is a diagram showing the drawing result 58 obtained by drawing the applicable drawing target polygon model 51 in the detail buffer 56 and the drawing result 59 obtained by drawing another applicable drawing target polygon model 53 in the detail buffer 56.
  • FIG. 13 omits the superimposed display of the polygon models 51 to 53 of FIG. 12 and shows only the actual drawing results 58 and 59.
  • the applicable drawing target drawing unit 108 stores the alpha value of 1.0 for the polygon model 53 in the pixel of the drawing result 59.
  • the value in each pixel in FIG. 13 represents the stored alpha value.
  • after performing the process of drawing the polygon model 53 in the detail buffer 56, the applicable drawing target drawing unit 108 acquires the number of pixels of the detail buffer 56 actually filled by the polygon model 53.
  • the applicable drawing target drawing unit 108 acquires the information that the number of pixels actually drawn in the detail buffer 56 is 1 pixel.
  • Hereinafter, the information indicating the number of pixels drawn in the detail buffer by an applicable drawing target, acquired by the applicable drawing target drawing unit 108, will be referred to as pixel number information.
  • the applicable drawing target drawing unit 108 transmits the drawing information, the tag information of the drawing targets that are applicable drawing targets, the alpha value information, and the pixel number information to the dot printing unit 112, and the process proceeds to step S17.
  • In step S17, the dot printing unit 112 receives the drawing information, the tag information of the drawing targets that are applicable drawing targets, the alpha value information, and the pixel number information from the applicable drawing target drawing unit 108.
  • the dot printing unit 112 identifies, from the pixel number information, any erasure-applied drawing target, that is, an applicable drawing target that disappeared without being drawn by the applicable drawing target drawing unit 108, dots the corresponding pixels, and thereby draws the erasure-applied drawing target in the detail buffer. At this time, the alpha value of the alpha value information is also stored in the detail buffer.
  • Specifically, the dot printing unit 112 receives from the applicable drawing target drawing unit 108 the drawing information, the tag information of the polygon model 51 tagged as an applicable drawing target, the tag information of the polygon model 52 tagged as an applicable drawing target, the tag information of the polygon model 53 tagged as an applicable drawing target, alpha value information indicating that the alpha value of the polygon model 51 is 0.4, alpha value information indicating that the alpha value of the polygon model 52 is 0.7, alpha value information indicating that the alpha value of the polygon model 53 is 1.0, pixel number information indicating that the number of pixels drawn in the detail buffer 56 by the polygon model 51 is 1, pixel number information indicating that the number of pixels drawn in the detail buffer 56 by the polygon model 52 is 0, and pixel number information indicating that the number of pixels drawn in the detail buffer 56 by the polygon model 53 is 1.
  • the dot printing unit 112 identifies an applicable drawing target whose pixel number information is 0 as an erasure-applied drawing target that disappeared without being drawn by the applicable drawing target drawing unit 108. Specifically, since the pixel number information of the polygon model 52 is 0 pixels, the dot printing unit 112 identifies the polygon model 52 as the erasure-applied drawing target.
  • the dot printing unit 112 dots the pixels in the detail buffer 56 corresponding to the position of the erasure-applied polygon model 52, thereby drawing the erasure-applied drawing target in the detail buffer. Specifically, the dot printing unit 112 dots the pixels in the detail buffer 56 corresponding to the positions of the vertices of the polygon model 52.
  • when drawing a drawing target of two or more dimensions in a buffer, a pixel is filled, and the drawing target is drawn, only if the center of that pixel is included in the drawing target.
  • in contrast, when drawing a point, the pixel at which the point is struck is filled in the buffer and the drawing target is drawn.
  • therefore, the dot printing unit 112 can draw the erasure-applied polygon model 52 in the detail buffer 56 by dotting the pixels in the detail buffer 56 corresponding to the positions of the vertices of the polygon model 52.
  • note that the dot printing unit 112 approximates the drawing target with a polygon and places dots at the positions of the vertices of that polygon.
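The dot printing of step S17 can be sketched as follows: for a target whose pixel number information is 0, the pixel containing each vertex is filled directly, together with the target's alpha value. All names are illustrative assumptions.

```python
# Illustrative sketch of step S17: draw a vanished target by dotting the
# pixels that contain its vertices (a point fills its pixel unconditionally).

def dot_vanished_target(detail_buffer, vertices, rgb, alpha):
    dotted = 0
    for x, y in vertices:
        col, row = int(x), int(y)  # pixel containing the vertex
        if 0 <= row < len(detail_buffer) and 0 <= col < len(detail_buffer[0]):
            detail_buffer[row][col] = (rgb, alpha)
            dotted += 1
    return dotted

buffer8 = [[None] * 8 for _ in range(8)]
# Three vertices of a sliver triangle that covered no pixel centers.
count = dot_vanished_target(buffer8, [(1.2, 1.1), (3.8, 2.9), (2.5, 4.4)], (0, 255, 0), 0.7)
```

With three vertices in three different pixels, this yields a three-pixel result, matching the three-pixel drawing result 60 of the polygon model 52.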
  • FIG. 14 is a diagram in which the drawing result 58 obtained by drawing the applicable drawing target polygon model 51 in the detail buffer 56, the drawing result 59 obtained by drawing another applicable drawing target polygon model 53 in the detail buffer 56, the drawing result 60 obtained by drawing yet another applicable drawing target polygon model 52 in the detail buffer 56, and the polygon models 51 to 53 are superimposed. Since the polygon model 52 has three vertices, the dot printing unit 112 dots the pixels in the detail buffer 56 corresponding to the positions of the three vertices of the polygon model 52.
  • the drawing result of the polygon model 52 is the three-pixel drawing result 60 in FIG. 14.
  • FIG. 15 is a diagram showing the drawing result 58 obtained by drawing the applicable drawing target polygon model 51 in the detail buffer 56, the drawing result 59 obtained by drawing another applicable drawing target polygon model 53 in the detail buffer 56, and the drawing result 60 obtained by drawing yet another applicable drawing target polygon model 52 in the detail buffer 56.
  • FIG. 15 omits the superimposed display of the polygon models 51 to 53 of FIG. 14 and shows only the actual drawing results 58 to 60.
  • the dot printing unit 112 stores the alpha value of 0.7 for the polygon model 52 in each pixel constituting the drawing result 60.
  • the value in each pixel in FIG. 15 represents the stored alpha value.
  • the dot printing unit 112 then transmits a notification of the end of drawing to the detail buffer to the blend processing unit 110, and the process proceeds to step S18. If there is no erasure-applied drawing target, the dot printing unit 112 does nothing, because the applicable drawing targets have already been drawn in the detail buffer; it simply transmits the notification of the end of drawing to the detail buffer to the blend processing unit 110, and the process proceeds to step S18.
  • In step S18, the blend processing unit 110 receives the notification of completion of drawing to the drawing destination buffer from the non-applicable drawing target drawing unit 107 and the notification of completion of drawing to the detail buffer from the dot printing unit 112. The blend processing unit 110 then alpha-blends the RGB values of the detail buffer, using the stored alpha values, into the drawing destination buffer stored in the drawing destination buffer storage unit 103, thereby drawing the applicable drawing targets in the drawing destination buffer.
  • Specifically, the blend processing unit 110 alpha-blends the RGB values of the detail buffer 56 of FIG. 15 into the drawing destination buffer 55, thereby drawing the polygon models 51 to 53 in the drawing destination buffer 55.
  • Specifically, for the drawing destination buffer 55, the blend processing unit 110 alpha-blends the RGB values of the polygon model 51 with the alpha value of 0.4 stored in the one pixel of the drawing result 58 of the polygon model 51 of FIG. 15.
  • Also, for the drawing destination buffer 55, the blend processing unit 110 alpha-blends the RGB values of the polygon model 52 with the alpha value of 0.7 stored in each pixel of the drawing result 60 of the polygon model 52 of FIG. 15.
  • Furthermore, for the drawing destination buffer 55, the blend processing unit 110 alpha-blends the RGB values of the polygon model 53 with the alpha value of 1.0 stored in the one pixel of the drawing result 59 of the polygon model 53 of FIG. 15.
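The per-pixel blend of step S18 follows the standard source-over formula, out = src × α + dst × (1 − α); the helper name below is illustrative.

```python
def alpha_blend_pixel(dst_rgb, src_rgb, alpha):
    # Blend the source (detail buffer) RGB over the destination
    # (drawing destination buffer) RGB with the stored alpha value.
    return tuple(round(s * alpha + d * (1.0 - alpha)) for s, d in zip(src_rgb, dst_rgb))
```

With alpha 0.4 a small target only tints its pixel (the semi-transparent drawing result 61), while alpha 1.0 replaces the destination pixel entirely, as for the polygon model 53.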
  • FIG. 16 is a diagram showing the result of drawing all the polygon models in the drawing destination buffer 55.
  • a drawing result 61 is obtained by drawing the polygon model 51 in the drawing destination buffer 55.
  • the drawing result 61 is the result of alpha blending the RGB values of the polygon model 51 with the alpha value of 0.4 for the one pixel of the drawing result 58 of the polygon model 51 of FIG. 15 into the drawing destination buffer 55; in FIG. 16 this result is shown filled with vertical stripes for illustration.
  • a drawing result 62 is obtained by drawing the polygon model 52 in the drawing destination buffer 55 .
  • the drawing result 62 is the result of alpha blending the RGB values of the polygon model 52 with the alpha value of 0.7 for each of the three pixels of the drawing result 60 of the polygon model 52 of FIG. 15 into the drawing destination buffer 55; in FIG. 16 this result is shown shaded with slanted lines.
  • a drawing result 63 is obtained by drawing the polygon model 53 in the drawing destination buffer 55 .
  • the drawing result 63 is the result of alpha blending the RGB values of the polygon model 53 with the alpha value of 1.0 for the one pixel of the drawing result 59 of the polygon model 53 of FIG. 15 into the drawing destination buffer 55; in FIG. 16 this result is shown filled with horizontal stripes for illustration.
  • a drawing result 57 is obtained by drawing the polygon model 54 in the drawing destination buffer 55 .
  • the drawing result 57 in FIG. 16 is the same as the drawing result 57 in FIG.
  • the blend processing unit 110 transmits a notification of the end of alpha blending to the output control unit 111, and proceeds to step S19.
  • In step S19, the output control unit 111 receives the alpha blend end notification from the blend processing unit 110 and controls the output device 5 to output the drawing destination buffer stored in the drawing destination buffer storage unit 103.
  • Specifically, the output control unit 111 controls the display device, which is the output device 5, to output the scene drawn in the drawing destination buffer 55 stored in the drawing destination buffer storage unit 103.
  • Even after executing step S19, the process returns to step S11, and the above processing is repeated until there is a trigger to end it, such as the power being turned off or an end operation being performed. Note that the processing may instead be performed only once.
  • As described above, in Embodiment 1, the drawing target determined as a non-applicable drawing target by the application determination unit 106 is drawn by the non-applicable drawing target drawing unit 107 in the drawing destination buffer of the drawing destination buffer storage unit 103.
  • the calculation unit 104 calculates the area of the drawing target determined by the application determining unit 106 as the applicable drawing target, and the determining unit 105 determines the alpha value of the applicable drawing target from the calculated area.
  • the applicable drawing target drawing unit 108 draws the applicable drawing target in the detail buffer, which is a buffer different from the drawing destination buffer, and the dot printing unit 112 dots the pixels in the detail buffer corresponding to the position of an applicable drawing target that disappeared without being drawn.
  • the blend processing unit 110 alpha-blends the detail buffer into the drawing destination buffer with the alpha values of the applicable drawing targets, and the output control unit 111 controls the output device 5 to output the scene drawn in the drawing destination buffer. Therefore, even a drawing target that does not cover the center of any pixel is drawn, and the drawing target can be displayed on the display screen.
  • Embodiment 2. In Embodiment 1, in the drawing processing system 1, the input device 4 receives input of drawing information and tag information, the acquisition unit 101 acquires the drawing information and the tag information from the input device 4, and the application determination unit 106 uses the tag information to determine whether a drawing target is an applicable drawing target or a non-applicable drawing target.
  • In Embodiment 2, as shown in FIGS. 17 to 19, in the drawing processing system 2, the input device 6 receives input of only the drawing information, the acquisition unit 201 acquires the drawing information from the input device 6, and the application determination unit 202 determines from the drawing information whether a drawing target is an applicable drawing target or a non-applicable drawing target.
  • the drawing processing system 2 thus automatically determines whether a drawing target is an applicable drawing target or a non-applicable drawing target, so the user's load of tagging drawing targets can be reduced.
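One way the application determination unit 202 could decide automatically is to compare an estimated projected area against a threshold; this criterion and the names below are assumptions made for illustration only, since the passage above does not fix a specific rule.

```python
def is_applicable(estimated_area_in_pixels, threshold_in_pixels=1.0):
    # Hypothetical automatic criterion: a target whose estimated projected
    # area is at most the threshold risks vanishing during rasterization,
    # so route it through the detailed drawing process (the detail buffer).
    return estimated_area_in_pixels <= threshold_in_pixels
```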
  • Also, whereas the drawing processing system 1 collectively draws the non-applicable drawing targets and then the applicable drawing targets, the drawing processing system 2 draws the drawing targets one by one. Other than that, it is the same as Embodiment 1.
  • the same configurations and operations as those described in Embodiment 1 are denoted by the same reference numerals, and duplicate descriptions are omitted.
  • the present disclosure is applied to the case where the drawing information, which is the information for drawing the entire scene, includes a drawing target that is not positioned at the center of the pixel.
  • FIG. 17 is a block diagram of the drawing processing system 2 including the drawing processing device 200 according to the second embodiment.
  • the drawing processing system 2 includes an input device 6 to which drawing information is input, a drawing processing device 200 that uses the drawing information to draw the drawing information, and an output device 5 that outputs the scene processed by the drawing processing device 200.
  • Compared with the drawing processing system 1 of FIG. 1, the input device 4, the acquisition unit 101, the application determination unit 106, the non-applicable drawing target drawing unit 107, the calculation unit 104, the determination unit 105, the applicable drawing target drawing unit 108, the dot printing unit 112, and the blend processing unit 110 are replaced by the input device 6, the acquisition unit 201, the application determination unit 202, the non-applicable drawing target drawing unit 203, the calculation unit 204, the determination unit 205, the applicable drawing target drawing unit 209, the dot printing unit 210, and the blend processing unit 207, and an end determination unit 206 and an accumulation storage unit 208 are added as components of the functional block diagram.
  • the input device 6 accepts only drawing information, for example drawing information of the entire scene input by the user, which includes information on drawing targets such as polygon models.
  • the drawing processing device 200 includes: an acquisition unit 201 that acquires the drawing information from the input device 6; a creation unit 102 that creates the drawing destination buffer and the detail buffer; an application determination unit 202 that determines from the drawing information whether a drawing target is a non-applicable drawing target, to which the detailed drawing processing is not applied and which is drawn in the drawing destination buffer, or an applicable drawing target, to which the detailed drawing processing is applied and which is drawn in the detail buffer, a buffer different from the drawing destination buffer; a non-applicable drawing target drawing unit 203 that draws the non-applicable drawing targets in the drawing destination buffer; a calculation unit 204 that calculates the area of each applicable drawing target; a determination unit 205 that determines the alpha value of the applicable drawing target from the calculated area; an applicable drawing target drawing unit 209 that draws the applicable drawing target in the detail buffer and acquires the number of drawn pixels; a dot printing unit 210 that identifies, from the number of drawn pixels, whether the applicable drawing target is an erasure-applied drawing target that disappeared without being drawn in the detail buffer, dots the pixels in the detail buffer corresponding to the position of the erasure-applied drawing target, and thereby draws the erasure-applied drawing target in the detail buffer; an end determination unit 206 that determines whether to end based on whether the drawing information includes a drawing target not yet drawn in the drawing destination buffer or the detail buffer; a blend processing unit 207 that alpha-blends the detail buffer into the drawing destination buffer to draw the applicable drawing targets in the drawing destination buffer; an output control unit 111 that controls output of the scene drawn in the drawing destination buffer to the output device 5; a drawing destination buffer storage unit 103 that stores the drawing destination buffer; a detail buffer storage unit 109 that stores the detail buffer; and an accumulation storage unit 208 that accumulates and stores the information drawn in the detail buffer.
  • Note that the non-applicable drawing target drawing unit 203 corresponds to the first drawing target drawing unit, and the applicable drawing target drawing unit 209 corresponds to the second drawing target drawing unit.
  • a hardware configuration diagram of the drawing processing system 2 is the same as that of FIG. 2 of the first embodiment.
  • the hardware configuration of the input device 6 is the same as that of the input device 4.
  • the hardware configuration of the acquisition unit 201 is the same as that of the acquisition unit 101.
  • the hardware configuration of the application determination unit 202 is the same as that of the application determination unit 106.
  • the hardware configuration of the non-applicable drawing target drawing unit 203 is the same as that of the non-applicable drawing target drawing unit 107.
  • the hardware configuration of the calculation unit 204 is the same as that of the calculation unit 104.
  • the hardware configuration of the determination unit 205 is the same as that of the determination unit 105.
  • the hardware configuration of the applicable drawing target drawing unit 209 is the same as that of the applicable drawing target drawing unit 108.
  • the hardware configuration of the dot printing unit 210 is the same as that of the dot printing unit 112.
  • the hardware configuration of the blend processing unit 207 is the same as that of the blend processing unit 110.
  • the termination determination unit 206 is implemented by a program that is stored in the graphic memory 45 and is read and executed by the GPU 44.
  • the accumulation storage unit 208 is implemented by the graphic memory 45.
  • the functions of the end determination unit 206 and the functions of the other units may be implemented in separate graphic memories. Also, the accumulation storage unit 208 and the other storage units may be realized by separate memories. Other than that, the hardware configuration is the same as that of the first embodiment.
  • FIG. 18 is a flowchart showing the operation of the drawing processing system 2 according to Embodiment 2 of the present disclosure. The operation of the drawing processing system 2 will be described below with reference to FIG.
  • in step S4, the input device 6 accepts input of drawing information.
  • for example, a user specifies drawing information using a mouse, a keyboard, or the like and inputs it to the input device 6, and the input device 6 receives the input.
  • the drawing information received by the input device 6 need not be specified by the user; it may be any drawing information, such as drawing information received via communication, drawing information automatically extracted because it satisfies a predetermined criterion, or drawing information extracted by machine learning.
  • the rendering target is a three-dimensional polygon model and the rendering information includes information on the rendering target, which is the polygon model.
  • in Embodiment 2, as in Embodiment 1, a case where the present disclosure is applied to FIG. 4 will be described below.
  • the input device 6 transmits the drawing information to the drawing processing device 200, and proceeds to step S5.
  • in step S5, the drawing processing device 200 receives the drawing information from the input device 6.
  • the drawing processing device 200 performs drawing processing using the received drawing information. Details will be described later.
  • the drawing processing device 200 controls the output device 5 to output the scene drawn in the drawing destination buffer, and proceeds to step S6.
  • in step S6, the output device 5 outputs the scene drawn in the drawing destination buffer to a display device such as a display under the control of the drawing processing device 200, as in step S3 of the first embodiment.
  • even after executing step S6, the process returns to step S4, and the above processing is repeated until there is a trigger to end the processing, such as turning off the power or performing an end operation. Alternatively, the above processing may be performed only once instead of being repeated.
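The flow of steps S4 to S6 can be sketched as a simple loop. This is an illustrative sketch only, not the patented implementation; the callable names (`accept_input`, `render`, `output`) and the use of `None` as the end trigger are assumptions introduced for illustration:

```python
def run_system(accept_input, render, output):
    """Top-level loop of steps S4-S6: accept drawing information,
    perform drawing processing, output the scene, and repeat until
    an end trigger (here modeled as accept_input() returning None)."""
    while True:
        drawing_info = accept_input()   # step S4: accept drawing information
        if drawing_info is None:        # end trigger (power off, end operation)
            break
        scene = render(drawing_info)    # step S5: drawing processing
        output(scene)                   # step S6: output the drawn scene
```

Each argument is a callable standing in for the corresponding unit (input device 6, drawing processing device 200, output device 5).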
  • FIG. 19 is a flowchart showing the operation of the drawing processing device 200 according to Embodiment 2 of the present disclosure. The operation of the rendering processing device 200 will be described below with reference to FIG.
  • in step S21, the acquisition unit 201 acquires the drawing information from the input device 6.
  • the acquisition unit 201 transmits the drawing information to the creation unit 102, and proceeds to step S22.
  • in step S22, the creation unit 102 receives the drawing information from the acquisition unit 201.
  • the creation unit 102 transmits the drawing information to the application determination unit 202, and proceeds to step S23.
  • step S22 is the same as step S12 of the first embodiment.
  • in the first embodiment, all applicable drawing targets are drawn in the detail buffer, but in the second embodiment, applicable drawing targets are drawn one at a time.
  • in step S23, the application determination unit 202 receives the drawing information from the creation unit 102.
  • the application determination unit 202 uses the drawing information to determine whether one of the drawing targets included in the drawing information is an applicable drawing target or an unapplied drawing target.
  • the application determination unit 202 calculates the area s of the object to be drawn and determines whether or not the area s is less than the threshold value X that is determined and stored in advance.
  • the threshold X may be set to a size, such as 1.0 pixels or 5.0 pixels, that is large enough that detailed drawing processing need not be performed on a drawing target at or above it, and small enough that detailed drawing processing should be performed on a drawing target below it.
  • the area s of the drawing target may be calculated exactly from the target itself, or an approximate value may be estimated.
  • assume that, in FIG. 4, the approximate value of the area of the polygon model 51 is 0.8 pixels, the approximate value of the area of the polygon model 52 is 1.4 pixels, the approximate value of the area of the polygon model 53 is 2.4 pixels, and the approximate value of the area of the polygon model 54 is 16.8 pixels.
  • the predetermined threshold value X is 3.0 pixels.
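The applicability determination of step S23 can be sketched as follows. This is an illustrative Python sketch, not the patented implementation; the names `is_applicable` and `THRESHOLD_X` are assumptions:

```python
# Sketch of the applicability check of step S23: a drawing target whose
# approximate on-screen area is below the predetermined threshold X is
# an applicable drawing target (detailed drawing processing is applied);
# otherwise it is a non-applicable drawing target.

THRESHOLD_X = 3.0  # predetermined threshold X, in pixels

def is_applicable(area_px, threshold=THRESHOLD_X):
    """True if the target is small enough to need detailed drawing processing."""
    return area_px < threshold

# Approximate areas of the polygon models in the running example (FIG. 4)
areas = {"polygon_model_51": 0.8, "polygon_model_52": 1.4,
         "polygon_model_53": 2.4, "polygon_model_54": 16.8}
tags = {name: ("applicable" if is_applicable(a) else "non-applicable")
        for name, a in areas.items()}
```

With the example values, the polygon models 51 to 53 are tagged applicable and the polygon model 54 non-applicable, matching the determinations described below.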
  • if the area s is equal to or greater than the threshold X, the application determination unit 202 determines No in step S23.
  • the application determination unit 202 determines that detailed drawing processing is not necessary because the drawing size of the drawing target is large, and generates tag information indicating that the drawing target is a non-applicable drawing target.
  • the application determination unit 202 transmits the drawing information and the tag information of the drawing target that is the non-applicable drawing target to the non-applicable drawing target drawing unit 203, and proceeds to step S24.
  • the approximate value of the area of the polygon model 54 is 16.8 pixels, which is equal to or greater than the predetermined threshold value of 3.0 pixels.
  • the application determination unit 202 determines that detailed drawing processing is unnecessary because the drawing size of the polygon model 54 is large, and generates tag information indicating that the polygon model 54 is a non-applicable drawing target.
  • the application determination unit 202 transmits the drawing information and the tag information of the polygon model 54 tagged as the non-applicable drawing target to the non-applicable drawing target drawing unit 203, and proceeds to step S24.
  • if the area s is less than the threshold X, the application determination unit 202 determines Yes in step S23.
  • the application determining unit 202 generates tag information indicating that the drawing target is an applicable drawing target, assuming that the drawing size of the drawing target is small and therefore detailed drawing processing needs to be performed.
  • the application determination unit 202 transmits the drawing information and the tag information of the drawing target, which is the applicable drawing target, to the calculation unit 204, and proceeds to step S25.
  • since the approximate value of the area of the polygon model 51 is 0.8 pixels, which is less than the predetermined threshold of 3.0 pixels, step S23: Yes.
  • the application determination unit 202 determines that detailed drawing processing needs to be performed because the drawing size of the polygon model 51 is small, and generates tag information indicating that the polygon model 51 is an applicable drawing target.
  • the application determination unit 202 transmits the drawing information and the tag information of the polygon model 51 tagged as the application drawing target to the calculation unit 204, and proceeds to step S25.
  • the approximate value of the area of the polygon model 52 is 1.4 pixels, which is less than the predetermined threshold of 3.0 pixels, so step S23: Yes.
  • the application determination unit 202 determines that detailed drawing processing needs to be performed because the drawing size of the polygon model 52 is small, and generates tag information indicating that the polygon model 52 is an applicable drawing target.
  • the application determination unit 202 transmits the drawing information and the tag information of the polygon model 52 tagged as the application drawing target to the calculation unit 204, and proceeds to step S25.
  • the approximate value of the area of the polygon model 53 is 2.4 pixels, which is less than the predetermined threshold of 3.0 pixels, so step S23: Yes.
  • the application determination unit 202 determines that detailed drawing processing needs to be performed because the drawing size of the polygon model 53 is small, and generates tag information indicating that the polygon model 53 is an applicable drawing target.
  • the application determination unit 202 transmits the drawing information and the tag information of the polygon model 53 tagged as the application drawing target to the calculation unit 204, and proceeds to step S25.
  • in step S24, the non-applicable drawing target drawing unit 203 receives the drawing information and the tag information of the drawing target that is the non-applicable drawing target from the application determination unit 202, and draws the non-applicable drawing target in the drawing destination buffer.
  • the non-applicable drawing target drawing unit 203 receives from the application determination unit 202 the drawing information and the tag information of the polygon model 54 tagged as the non-applicable drawing target, and draws the polygon model 54 in the drawing destination buffer 55 of the drawing destination buffer storage unit 103.
  • a diagram in which a drawing result 57 obtained by drawing the polygon model 54 in the drawing destination buffer 55 and the polygon model 54 are superimposed is the same as FIG. 7 of the first embodiment.
  • a drawing result 57 obtained by drawing the polygon model 54 in the drawing destination buffer 55 is the same as FIG. 8 of the first embodiment.
  • when the non-applicable drawing target drawing unit 203 finishes drawing one non-applicable drawing target in the drawing destination buffer, it transmits a notification of the end of drawing to the drawing destination buffer to the end determination unit 206, and proceeds to step S29.
  • in step S25, the calculation unit 204 receives the drawing information and the tag information of the drawing target that is the applicable drawing target from the application determination unit 202, and calculates the area of the applicable drawing target.
  • the method of calculating the area is the same as in step S14 of the first embodiment.
  • the calculation unit 204 receives the drawing information from the application determination unit 202 and the tag information of the polygon model 51 tagged as the application drawing target. Alternatively, the calculation unit 204 receives the drawing information from the application determination unit 202 and the tag information of the polygon model 52 tagged as the application drawing target. Alternatively, the calculation unit 204 receives the drawing information from the application determination unit 202 and the tag information of the polygon model 53 tagged as the application drawing target.
  • the calculation unit 204 calculates the area of the applicable drawing target described in the tag information received from the application determination unit 202 .
  • the calculation unit 204 calculates the area of the polygon model 51 when the applicable drawing target described in the tag information is the polygon model 51 . For example, assume that the calculation unit 204 calculates the area of the polygon model 51 to be 0.4 pixels.
  • the calculation unit 204 calculates the area of the polygon model 52 when the applicable drawing target described in the tag information is the polygon model 52 . For example, assume that the calculation unit 204 calculates the area of the polygon model 52 to be 0.7 pixels.
  • the calculation unit 204 calculates the area of the polygon model 53 when the applicable drawing target described in the tag information is the polygon model 53 .
  • the calculation unit 204 calculates the area of the polygon model 53 to be 1.2 pixels.
  • information indicating the area of the applicable drawing target calculated by the calculation unit 204 will be referred to as area information, as in the first embodiment.
  • the calculating unit 204 transmits the drawing information, the tag information of the drawing target that is the applicable drawing target, and the area information of the applicable drawing target to the determining unit 205, and proceeds to step S26.
  • in step S26, the determination unit 205 receives from the calculation unit 204 the drawing information, the tag information of the drawing target that is the applicable drawing target, and the area information of the applicable drawing target.
  • the determining unit 205 determines an alpha value representing the transparency of the applied rendering target using the area information.
  • the method of determining the alpha value is the same as in step S15 of the first embodiment.
  • the determination unit 205 receives from the calculation unit 204 the drawing information, the tag information of the polygon model 51 tagged as the applicable drawing target, and area information indicating that the area of the polygon model 51 is 0.4 pixels.
  • the method of determining the alpha value of polygon model 51 is the same as in step S15 of Embodiment 1, and determination unit 205 determines the alpha value of polygon model 51 to be 0.4.
  • the determination unit 205 receives from the calculation unit 204 the drawing information, the tag information of the polygon model 52 tagged as the applicable drawing target, and area information indicating that the area of the polygon model 52 is 0.7 pixels.
  • the method of determining the alpha value of the polygon model 52 is the same as in step S15 of Embodiment 1, and the determination unit 205 determines the alpha value of the polygon model 52 to be 0.7.
  • the determination unit 205 receives from the calculation unit 204 the drawing information, the tag information of the polygon model 53 tagged as the applicable drawing target, and area information indicating that the area of the polygon model 53 is 1.2 pixels.
  • the method of determining the alpha value of polygon model 53 is the same as in step S15 of Embodiment 1, and determination unit 205 determines the alpha value of polygon model 53 to be 1.0.
  • information indicating the alpha value of the applicable drawing target determined by the determination unit 205 will be referred to as alpha value information, as in the first embodiment.
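Consistent with the examples above (area 0.4 pixels gives alpha 0.4, area 0.7 gives 0.7, area 1.2 gives 1.0), the alpha value determination appears to clamp the sub-pixel area to 1.0. A minimal sketch, assuming this clamping rule; the function name is an assumption:

```python
def determine_alpha(area_px):
    """Alpha value representing the transparency of the applicable drawing
    target: the computed area in pixels, clamped to 1.0 so a target covering
    a full pixel or more is fully opaque."""
    return min(area_px, 1.0)
```

A target whose area is well below one pixel is thereby rendered mostly transparent, approximating the fraction of the pixel it would actually cover.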
  • the determining unit 205 transmits the drawing information, the tag information of the drawing target that is the applicable drawing target, and the alpha value information of the applicable drawing target to the applicable drawing target drawing unit 209, and proceeds to step S27.
  • in step S27, the applicable drawing target drawing unit 209 receives the drawing information, the tag information of the drawing target that is the applicable drawing target, and the alpha value information of the applicable drawing target from the determination unit 205, and draws the applicable drawing target in the detail buffer.
  • the alpha value information is also stored in the detail buffer.
  • the applicable drawing target drawing unit 209 acquires the number of pixels drawn in the detail buffer.
  • the method of drawing the drawing target, the method of storing alpha value information in the detail buffer, and the method of acquiring the number of pixels drawn in the detail buffer are the same as in step S16 of the first embodiment.
  • the drawing processing system 2 draws drawing targets one by one, so the polygon model 51, the polygon model 52, and the polygon model 53 are not drawn at the same time.
  • the applicable drawing target drawing unit 209 receives from the determination unit 205 the drawing information, the tag information of the polygon model 51 tagged as the applicable drawing target, and alpha value information indicating that the alpha value of the polygon model 51 is 0.4.
  • the applied drawing target drawing unit 209 draws the polygon model 51 in the detail buffer 56 of the detail buffer storage unit 109 .
  • a diagram in which a drawing result 58 obtained by drawing the polygon model 51 to be applied drawing in the detail buffer 56 and a plurality of polygon models 51 to 53 are superimposed is the same as FIG. 10 of the first embodiment.
  • a drawing result 58 obtained by drawing the polygon model 51 to be applied drawing in the detail buffer 56 is the same as FIG. 11 of the first embodiment.
  • the applicable drawing target drawing unit 209 stores the alpha value 0.4 of the polygon model 51 in the pixel of the drawing result 58. Further, the applicable drawing target drawing unit 209 acquires information that the number of pixels actually filled in the detail buffer 56 by the polygon model 51 is 1 pixel.
  • the applicable drawing target drawing unit 209 receives from the determination unit 205 the drawing information, the tag information of the polygon model 52 tagged as the applicable drawing target, and alpha value information indicating that the alpha value of the polygon model 52 is 0.7.
  • the applied drawing target drawing unit 209 draws the polygon model 52 in the detail buffer 56 of the detail buffer storage unit 109 .
  • since the polygon model 52 does not include the center of any pixel in the detail buffer 56, it disappears without being drawn, as in the first embodiment. Therefore, even if the applicable drawing target drawing unit 209 performs the process of drawing the polygon model 52 in the detail buffer 56, the polygon model 52 is not drawn, and the detail buffer 56 in FIG. 6 ends up remaining blank.
  • the applied drawing target drawing unit 209 acquires information indicating that the number of pixels for which the detail buffer 56 is actually filled by the polygon model 52 is 0 pixel.
  • the applicable drawing target drawing unit 209 receives from the determination unit 205 the drawing information, the tag information of the polygon model 53 tagged as the applicable drawing target, and alpha value information indicating that the alpha value of the polygon model 53 is 1.0.
  • the applied drawing target drawing unit 209 draws the polygon model 53 in the detail buffer 56 of the detail buffer storage unit 109 .
  • a diagram in which the drawing result 59 obtained by drawing the polygon model 53 to be the applicable drawing target in the detail buffer 56 and the plurality of polygon models 51 to 53 are superimposed is the same as the corresponding diagram of the first embodiment with the drawing result 58 drawn in the detail buffer 56 deleted.
  • a diagram of the drawing result 59 obtained by drawing the polygon model 53 to be the applicable drawing target in the detail buffer 56 is the same as FIG. 13 of the first embodiment without the drawing result 58 obtained by drawing the polygon model 51 to be the applicable drawing target in the detail buffer 56.
  • the applicable drawing target drawing unit 209 stores the alpha value 1.0 of the polygon model 53 in the pixel of the drawing result 59. Also, the applicable drawing target drawing unit 209 acquires information that the number of pixels actually filled in the detail buffer 56 by the polygon model 53 is 1 pixel.
  • information indicating the number of pixels drawn in the detail buffer by the applicable drawing target acquired by the applicable drawing target drawing unit 209 will be referred to as pixel number information, as in the first embodiment.
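The reason a tiny polygon can disappear in step S27 is that a rasterizer only fills pixels whose centers fall inside the polygon. The following is a simplified sketch of that standard pixel-center coverage convention, not code from the patent; the function name and buffer dimensions are assumptions:

```python
def covered_pixel_centers(tri, width, height):
    """Count pixels whose centers (x+0.5, y+0.5) lie inside the triangle.
    A polygon smaller than a pixel that straddles no pixel center yields 0,
    which models a 'lost' applicable drawing target (0 pixels drawn)."""
    (ax, ay), (bx, by), (cx, cy) = tri

    def edge(px, py, x0, y0, x1, y1):
        # Signed area test: which side of edge (x0,y0)->(x1,y1) is (px,py) on?
        return (x1 - x0) * (py - y0) - (y1 - y0) * (px - x0)

    count = 0
    for y in range(height):
        for x in range(width):
            px, py = x + 0.5, y + 0.5
            d1 = edge(px, py, ax, ay, bx, by)
            d2 = edge(px, py, bx, by, cx, cy)
            d3 = edge(px, py, cx, cy, ax, ay)
            # Inside if all edge tests agree in sign (either winding order)
            if (d1 >= 0 and d2 >= 0 and d3 >= 0) or \
               (d1 <= 0 and d2 <= 0 and d3 <= 0):
                count += 1
    return count
```

A sub-pixel triangle surrounding a pixel center is drawn into one pixel, while an equally small triangle lying between pixel centers covers none and would disappear without the dotting of step S28.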
  • the applicable drawing target drawing unit 209 transmits the drawing information, the tag information of the drawing target that is the applicable drawing target, the alpha value information, and the pixel number information to the dot printing unit 210, and proceeds to step S28.
  • in step S28, the dot printing unit 210 receives from the applicable drawing target drawing unit 209 the drawing information, the tag information of the drawing target that is the applicable drawing target, the alpha value information, and the pixel number information.
  • the dot printing unit 210 identifies, from the pixel number information, whether the applicable drawing target is an erasure-applied drawing target that has disappeared without being drawn by the applicable drawing target drawing unit 209.
  • if so, the dot printing unit 210 dots the pixel in the detail buffer corresponding to the position of the erasure-applied drawing target, and draws the erasure-applied drawing target in the detail buffer.
  • the alpha value information is also stored in the detail buffer.
  • the method of determining whether a target is an erasure-applied drawing target, the method of dotting the pixel in the detail buffer corresponding to the position of the erasure-applied drawing target, the method of drawing the erasure-applied drawing target in the detail buffer, and the method of storing the alpha value information in the detail buffer are the same as in step S17 of the first embodiment. If the applicable drawing target is not an erasure-applied drawing target, the dot printing unit 210 does nothing, as in step S17 of the first embodiment, because the applicable drawing target has already been drawn in the detail buffer.
  • the dot printing unit 210 receives from the applicable drawing target drawing unit 209 the drawing information, the tag information of the polygon model 51 tagged as the applicable drawing target, alpha value information indicating that the alpha value of the polygon model 51 is 0.4, and pixel number information indicating that the number of pixels drawn in the detail buffer 56 by the polygon model 51 is 1 pixel. Based on the pixel number information, the dot printing unit 210 identifies that the polygon model 51 is not an erasure-applied drawing target because its pixel number information is 1 pixel. Since the polygon model 51 has already been drawn in the detail buffer, the dot printing unit 210 does nothing.
  • the dot printing unit 210 receives from the applicable drawing target drawing unit 209 the drawing information, the tag information of the polygon model 52 tagged as the applicable drawing target, alpha value information indicating that the alpha value of the polygon model 52 is 0.7, and pixel number information indicating that the number of pixels drawn in the detail buffer 56 by the polygon model 52 is 0 pixels. Based on the pixel number information, the dot printing unit 210 identifies that the polygon model 52 is an erasure-applied drawing target because its pixel number information is 0 pixels.
  • the dot printing unit 210 draws the erasure-applied drawing target in the detail buffer in the same manner as in step S17 of the first embodiment.
  • a diagram in which the drawing result 60 obtained by drawing the polygon model 52 to be the applicable drawing target in the detail buffer 56 and the plurality of polygon models 51 to 53 are superimposed is the same as the corresponding diagram of the first embodiment with the drawing result 58 obtained by drawing the polygon model 51 in the detail buffer 56 and the drawing result 59 obtained by drawing the other polygon model 53 in the detail buffer 56 deleted.
  • a diagram of the drawing result 60 obtained by drawing the polygon model 52 to be the applicable drawing target in the detail buffer 56 is the same as FIG. 15 of the first embodiment without the drawing result 58 obtained by drawing the polygon model 51 to be the applicable drawing target in the detail buffer 56.
  • the dot printing unit 210 stores the alpha value 0.7 of the polygon model 52 in each pixel forming the drawing result 60.
  • the dot printing unit 210 receives from the applicable drawing target drawing unit 209 the drawing information, the tag information of the polygon model 53 tagged as the applicable drawing target, alpha value information indicating that the alpha value of the polygon model 53 is 1.0, and pixel number information indicating that the number of pixels drawn in the detail buffer 56 by the polygon model 53 is 1 pixel. Based on the pixel number information, the dot printing unit 210 identifies that the polygon model 53 is not an erasure-applied drawing target because its pixel number information is 1 pixel. Since the polygon model 53 has already been drawn in the detail buffer, the dot printing unit 210 does nothing.
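The dotting behavior of step S28 can be sketched as follows. The buffer layout (nested lists, one alpha buffer alongside the detail buffer) and the function signature are assumptions introduced for illustration, not the patented implementation:

```python
def dot_if_lost(detail_buffer, alpha_buffer, target_pos, alpha, pixels_drawn):
    """If rasterization produced 0 pixels (an erasure-applied drawing target),
    dot the single pixel containing the target's position in the detail
    buffer and record the target's alpha value there; otherwise do nothing,
    since the target was already drawn in the detail buffer."""
    if pixels_drawn > 0:
        return  # not an erasure-applied drawing target; nothing to do
    # Pixel containing the target position (truncate to pixel indices)
    x, y = int(target_pos[0]), int(target_pos[1])
    detail_buffer[y][x] = 1        # mark the pixel as drawn
    alpha_buffer[y][x] = alpha     # store the alpha value information
```

In the running example this would leave the polygon models 51 and 53 (1 pixel drawn each) untouched while dotting one pixel for the polygon model 52 (0 pixels drawn) with alpha 0.7.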
  • the dot printing unit 210 sends a notification of the end of drawing to the detail buffer to the end determination unit 206, and proceeds to step S29.
  • in step S29, when the end determination unit 206 receives a notification of the end of drawing to the drawing destination buffer from the non-applicable drawing target drawing unit 203 or a notification of the end of drawing to the detail buffer from the dot printing unit 210, it determines whether or not there is another drawing target to be drawn in the drawing destination buffer.
  • if there remains a drawing target for which the non-applicable drawing target drawing unit 203, the applicable drawing target drawing unit 209, or the dot printing unit 210 has not yet performed drawing processing, step S29: Yes.
  • if the end determination unit 206 receives the notification of the end of drawing to the drawing destination buffer from the non-applicable drawing target drawing unit 203 and there are still drawing targets to be drawn in the drawing destination buffer, the process returns to step S23 to perform drawing processing on a drawing target that has not been subjected to drawing processing.
  • if the end determination unit 206 receives the notification of the end of drawing to the detail buffer from the dot printing unit 210 and there are still drawing targets to be drawn, the end determination unit 206 stores, in the accumulation storage unit 208, the drawing-processed detail buffer held in the detail buffer storage unit 109 so that a new detail buffer can be created in the detail buffer storage unit 109; the drawing processing device 200 then returns to step S23 and performs drawing processing on a drawing target that has not been subjected to drawing processing.
  • if there is no drawing target remaining to be drawn, step S29: No.
  • in that case, the end determination unit 206 collectively transmits all the detail buffers stored in the accumulation storage unit 208 to the blend processing unit 207, and proceeds to step S30.
  • in the example, while any of the polygon models 51 to 54 has not yet been subjected to drawing processing, the end determination unit 206 determines step S29: Yes.
  • if the end determination unit 206 receives, after the drawing processing of the polygon model 54 is completed, the notification of the end of drawing to the drawing destination buffer from the non-applicable drawing target drawing unit 203, and there are still drawing targets to be drawn in the drawing destination buffer, the process returns to step S23 to perform drawing processing on a drawing target that has not been subjected to drawing processing.
  • if the end determination unit 206 receives the notification of the end of drawing to the detail buffer from the dot printing unit 210 after drawing processing for any of the polygon models 51 to 53 has been completed, and there is still a drawing target to be drawn in the drawing destination buffer, it stores the detail buffer in which that polygon model is drawn, held in the detail buffer storage unit 109, in the accumulation storage unit 208 so that a new detail buffer can be created in the detail buffer storage unit 109.
  • the process then returns to step S23, and a drawing target that has not been subjected to drawing processing is processed.
  • when drawing processing for all of the polygon models 51 to 54 has been completed, step S29: No.
  • the end determination unit 206 collectively transmits all of the detail buffers stored in the accumulation storage unit 208, in which the polygon models 51 to 53 are drawn, to the blend processing unit 207, and proceeds to step S30.
  • in step S30, the blend processing unit 207 collectively receives from the end determination unit 206 all of the detail buffers in which the applicable drawing targets are drawn.
  • in the first embodiment, the blend processing unit 110 alpha-blends the single detail buffer and the drawing destination buffer.
  • in the second embodiment, the blend processing unit 207 alpha-blends one detail buffer and the drawing destination buffer at a time, in order, repeating for the number of detail buffers.
  • the method of alpha blending is the same as step S18 of the first embodiment.
  • a drawing showing the result of drawing all the polygon models in the drawing destination buffer 55 is the same as FIG. 16 of the first embodiment.
  • step S30 is the same as step S18 of the first embodiment.
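The per-buffer alpha blend of step S30 can be sketched with the standard "over" formula, dst = src·α + dst·(1 − α), applied only to pixels actually drawn in the detail buffer. The buffer representation below (grayscale grids, `None` meaning "not drawn") is an assumption for illustration:

```python
def alpha_blend(dest, detail, alpha_buf):
    """Blend one detail buffer into the same-resolution drawing destination
    buffer using each pixel's stored alpha value; called once per
    accumulated detail buffer in step S30. Pixels the detail buffer never
    drew (None) leave the destination untouched."""
    for y, row in enumerate(detail):
        for x, src in enumerate(row):
            if src is None:
                continue  # nothing drawn here in this detail buffer
            a = alpha_buf[y][x]
            dest[y][x] = src * a + dest[y][x] * (1.0 - a)
    return dest
```

Blending the buffers one at a time in order means each dotted target contributes its own transparency, so a target with alpha 0.4 shows as a faint dot over whatever was drawn before it.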
  • in step S31, the output control unit 111 receives an alpha blend end notification from the blend processing unit 207. Otherwise, step S31 is the same as step S19 of the first embodiment.
  • even after executing step S31, the process returns to step S21, and the above processing is repeated until there is a trigger to end the processing, such as turning off the power or performing an end operation. Alternatively, the above processing may be performed only once instead of being repeated.
  • as described above, in the drawing processing system 2 according to Embodiment 2, the non-applicable drawing target drawing unit 203 draws the drawing target determined as the non-applicable drawing target by the application determination unit 202 in the drawing destination buffer of the drawing destination buffer storage unit 103.
  • the calculation unit 204 calculates the area of the drawing target determined as the applicable drawing target by the application determination unit 202, and the determining unit 205 determines the alpha value of the applicable drawing target from the calculated area.
  • the applicable drawing target drawing unit 209 draws the applicable drawing target in a detail buffer which is a buffer different from the drawing destination buffer, and the dot printing unit 210 dots the pixel in the detail buffer corresponding to the position of the applicable drawing target.
  • the blend processing unit 207 alpha-blends the detail buffer into the drawing destination buffer with the alpha value of the applicable drawing target, and the output control unit 111 controls the output device 5 to output the scene drawn in the drawing destination buffer. Therefore, even if the drawing target is not positioned at the center of a pixel, the drawing target is drawn and can be displayed on the display screen.
  • the drawing processing system 2 receives only the drawing information input to the input device 6 and determines from that information whether each drawing target is an applicable drawing target or a non-applicable drawing target. With this determination, there is no need for the user or the like to input tag information to the input device 6, and the drawing processing system 2 automatically determines whether a drawing target is an applicable drawing target or a non-applicable drawing target, so the user's load can be reduced.
  • the application determination unit 202 calculates the area s of the object to be drawn and determines whether or not the area s is less than the threshold value X that is determined and stored in advance.
  • the threshold may be changed for each drawing target according to its luminance value, importance, or the like. For example, if the luminance value of the drawing target is lower than a threshold Y, the target is regarded as dark and the threshold X is decreased, and if the luminance value of the drawing target is higher than the threshold Y, the target is regarded as bright and the threshold X is increased.
  • in this way, a drawing target with low luminance does not become an applicable drawing target unless its area is even smaller, so the probability of it being determined as a non-applicable drawing target increases, and the drawing processing load can be reduced.
  • conversely, a drawing target with high luminance does not become a non-applicable drawing target unless its area is even larger, so the probability of it being determined as an applicable drawing target increases, and it becomes harder for the target to disappear.
  • similarly, if the importance of the drawing target is lower than a threshold Z, the threshold X is decreased because the importance is low, and if it is higher than the threshold Z, the threshold X is increased because the importance is high.
  • in this way, a drawing target with low importance does not become an applicable drawing target unless its area is even smaller, so the probability of it being determined as a non-applicable drawing target increases, and the drawing processing load can be reduced.
  • conversely, a drawing target with high importance does not become a non-applicable drawing target unless its area is even larger, so the probability of it being determined as an applicable drawing target increases, and it becomes harder for the target to disappear.
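The per-target threshold adjustment described above can be sketched as follows. The scale factors are illustrative assumptions, since the text only states that the threshold X is decreased or increased; the function name is likewise assumed:

```python
def adaptive_threshold(base_x, attribute, attribute_threshold,
                       low_factor=0.5, high_factor=2.0):
    """Per-target variant of the threshold X: a target whose luminance (or
    importance) is below the comparison threshold (Y or Z) gets a smaller X,
    making it more likely to be judged non-applicable (less work); a target
    above it gets a larger X, making it more likely to be judged applicable
    (harder to lose). The 0.5x / 2.0x factors are illustrative only."""
    if attribute < attribute_threshold:
        return base_x * low_factor
    return base_x * high_factor
```

For example, with a base X of 3.0 pixels and a luminance threshold Y of 0.5, a dark target (luminance 0.2) would be compared against 1.5 pixels and a bright target (luminance 0.8) against 6.0 pixels.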
  • the drawing processing system 2 includes an accumulation storage unit 208; when drawing targets that should be drawn in the drawing destination buffer still remain, the end determination unit 206 saves the drawing-processed detail buffer held in the detail buffer storage unit 109 into the accumulation storage unit 208 so that a new detail buffer can be created in the detail buffer storage unit 109, and the drawing processing device 200 then performs drawing processing on the drawing targets that have not yet been processed.
  • However, the drawing processing system 2 may omit the accumulation storage unit 208, and the drawing processing device 200 may perform drawing processing on the remaining drawing targets by overwriting and saving over the drawing-processed detail buffer stored in the detail buffer storage unit 109.
  • Embodiment 3. In the first embodiment, in the drawing processing system 1, the creation unit 102 creates one detail buffer having the same resolution as the drawing destination buffer in the detail buffer storage unit 109, and the applicable drawing target drawing unit 108 or the dot printing unit 112 draws all applicable drawing targets in that detail buffer.
  • in the third embodiment, by contrast, the allocation creation unit 302 creates a detail buffer corresponding to the range of each applicable drawing target in the detail buffer storage unit 303, and the applicable drawing target drawing unit 304 or the dot printing unit 305 draws each applicable drawing target in the detail buffer corresponding to its range.
  • the present disclosure is applied to the case where the drawing information, which is the information for drawing the entire scene, includes a drawing target that is not positioned at the center of the pixel.
  • FIG. 20 is a block diagram of the drawing processing system 3 including the drawing processing device 300 according to the third embodiment.
  • the drawing processing system 3 includes an input device 4 for inputting drawing information and tag information indicating whether or not to apply detailed drawing processing to each drawing target included in the drawing information, a drawing processing device 300 for drawing the drawing information using the drawing information and the tag information, and an output device 5 for outputting a scene processed by the drawing processing device 300.
  • in place of the creation unit 102, the detail buffer storage unit 109, the applicable drawing target drawing unit 108, the dot printing unit 112, and the blend processing unit 110 of the first embodiment, a creation unit 301, a detail buffer storage unit 303, an applicable drawing target drawing unit 304, a dot printing unit 305, and a blend processing unit 306 are provided, and an allocation creation unit 302 is added as a component of the functional block diagram.
  • the drawing processing device 300 includes an acquisition unit 101 that acquires drawing information including drawing targets and tag information from the input device 4, a creation unit 301 that creates a drawing destination buffer, and an application determination unit 106 that uses the tag information to determine whether each drawing target is an unapplied drawing target to be drawn in the drawing destination buffer or an applicable drawing target to be drawn in a detail buffer, which is a buffer different from the drawing destination buffer.
  • the drawing processing device 300 further includes a calculation unit 104 that calculates the area of each applicable drawing target tagged for detailed drawing processing, a determination unit 105 that determines the alpha value of the applicable drawing target from the calculated area, an applicable drawing target drawing unit 304 that draws the applicable drawing target in the detail buffer and acquires the number of drawn pixels, and a dot printing unit 305 that identifies any applicable drawing target that was not drawn in the preceding step and has disappeared, dots the pixel in the detail buffer corresponding to the position of the lost drawing target, and thereby draws the lost drawing target into the detail buffer.
  • the unapplied drawing target drawing unit 107 corresponds to the first drawing target drawing unit
  • the applicable drawing target drawing unit 304 corresponds to the second drawing target drawing unit.
  • a hardware configuration diagram of the drawing processing system 3 is the same as that of FIG. 2 of the first embodiment.
  • the hardware configuration of the creation unit 301 is the same as that of the creation unit 102.
  • the hardware configuration of the detail buffer storage unit 303 is the same as that of the detail buffer storage unit 109 .
  • the hardware configuration of the applicable rendering target rendering unit 304 is the same as that of the applicable rendering target rendering unit 108 .
  • the hardware configuration of the dot printing unit 305 is the same as that of the dot printing unit 112.
  • the hardware configuration of the blend processing unit 306 is the same as that of the blend processing unit 110 .
  • the allocation creation unit 302 is implemented by a program stored in the graphic memory 45 and is realized by the GPU 44 reading and executing the program loaded into the graphic memory 45. Note that the graphic memory 45 may implement the functions of the allocation creation unit 302 and the functions of the other units with separate memories.
  • a flowchart showing the operation of the drawing processing system 3 according to the third embodiment of the present disclosure is the same as that of FIG. 3 of the first embodiment.
  • FIG. 21 is a flowchart showing the operation of the rendering processing device 300 according to Embodiment 3 of the present disclosure. The operation of the rendering processing device 300 will be described below with reference to FIG.
  • In step S41, the acquisition unit 101 transmits the drawing information and the tag information to the creation unit 301, and the process proceeds to step S42. Otherwise, step S41 is the same as step S11 of the first embodiment.
  • In step S42, the creation unit 301 creates only the drawing destination buffer and does not create a detail buffer.
  • the creation unit 301 then transmits the drawing information and the tag information to the application determination unit 106, and the process proceeds to step S43. Otherwise, step S42 is the same as step S12 of the first embodiment.
  • In step S43, the application determination unit 106 receives the drawing information and the tag information from the creation unit 301.
  • the unapplied drawing target drawing unit 107 transmits a notification of completion of drawing to the drawing destination buffer to the application determination unit 106 and the blend processing unit 306, and the process proceeds to step S44.
  • step S43 is the same as step S13 of the first embodiment.
  • In step S44, when the application determination unit 106 receives the notification from the unapplied drawing target drawing unit 107 that drawing to the drawing destination buffer has ended, it transmits the drawing information and the tag information of the drawing targets that are applicable drawing targets to the allocation creation unit 302.
  • Specifically, for the drawing information of FIG., the application determination unit 106 transmits the drawing information, the tag information of the polygon model 51 tagged as an applicable drawing target, the tag information of the polygon model 52 tagged as an applicable drawing target, and the tag information of the polygon model 53 tagged as an applicable drawing target to the allocation creation unit 302.
  • the allocation creation unit 302 receives the drawing information and the tag information of the drawing targets that are applicable drawing targets from the application determination unit 106, and creates a detail buffer corresponding to each applicable drawing target in the detail buffer storage unit 303 so that each applicable drawing target is drawn in its corresponding detail buffer. Note that the detail buffers have the same resolution as the drawing destination buffer. Further, the allocation creation unit 302 creates position information indicating which position in the drawing destination buffer each detail buffer corresponds to, and stores the position information in each detail buffer.
  • Specifically, the allocation creation unit 302 receives from the application determination unit 106 the drawing information, the tag information of the polygon model 51 tagged as an applicable drawing target, the tag information of the polygon model 52 tagged as an applicable drawing target, and the tag information of the polygon model 53 tagged as an applicable drawing target.
  • the allocation creating unit 302 creates a detailed buffer 64 corresponding to the polygon model 51 in the detailed buffer storage unit 303 .
  • the allocation creating unit 302 creates a detailed buffer 65 corresponding to the polygon model 52 in the detailed buffer storage unit 303 .
  • the assignment creating unit 302 creates a detailed buffer 66 corresponding to the polygon model 53 in the detailed buffer storage unit 303 .
  • the polygon model 51 is drawn in the detail buffer 64, the polygon model 52 in the detail buffer 65, and the polygon model 53 in the detail buffer 66, respectively. Note that the detail buffers 64 to 66 have the same resolution as the drawing destination buffer 55 .
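The allocation step above can be sketched as follows. The `DetailBuffer` type, the `create_detail_buffers` helper, and the pixel ranges are illustrative assumptions, not the patent's implementation; each buffer covers only its target's range and carries position information giving its origin in the drawing destination buffer:

```python
from dataclasses import dataclass

@dataclass
class DetailBuffer:
    # Position information: origin of this buffer inside the drawing
    # destination buffer (same resolution, so offsets are in pixels).
    x: int
    y: int
    width: int
    height: int

    def __post_init__(self):
        # Per-pixel RGBA storage, initially empty (nothing drawn yet)
        self.pixels = [[None] * self.width for _ in range(self.height)]

def create_detail_buffers(ranges):
    """ranges: one (min_x, min_y, max_x, max_y) pixel range per applicable
    drawing target, e.g. derived from each target's bounding range."""
    return [DetailBuffer(x0, y0, x1 - x0, y1 - y0) for x0, y0, x1, y1 in ranges]

# Three targets inside an 8x8 drawing destination buffer (illustrative ranges)
buffers = create_detail_buffers([(0, 0, 3, 3), (3, 2, 6, 4), (2, 4, 8, 8)])
total_pixels = sum(b.width * b.height for b in buffers)
```

With these assumed ranges the three buffers total 39 pixels, less than the 64 pixels a single full-resolution detail buffer would take, which is the memory saving the embodiment aims at.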
  • FIG. 22 is a diagram showing the comparative relationship between the detail buffers 64 to 66 corresponding to the polygon models 51 to 53 to be rendered in the third embodiment and the detail buffer 56 in the first embodiment.
  • the detail buffers 64 to 66 have the same resolution as the drawing destination buffer 55 as in the first embodiment.
  • the detail buffer 56 of the first embodiment and the detail buffers 64 to 66 of the third embodiment have the same resolution.
  • the detail buffer 56 of Embodiment 1 is 8 pixels long by 8 pixels wide, for a total of 64 pixels.
  • the detail buffer 64 of the third embodiment has a total of 9 pixels, 3 pixels long and 3 pixels wide.
  • the detail buffer 65 is 2 pixels high by 3 pixels wide, for a total of 6 pixels.
  • the detail buffer 66 is 6 pixels high by 6 pixels wide, for a total of 36 pixels.
  • the total size of the detail buffers to be created (9 + 6 + 36 = 51 pixels) is thus smaller than the 64 pixels of the first embodiment, and the amount of memory used can be reduced.
  • FIG. 23 shows the detail buffer 64 corresponding to the applicable drawing target polygon model 51, another detail buffer 65 corresponding to the applicable drawing target polygon model 52, and yet another detail buffer 66 corresponding to the applicable drawing target polygon model 53.
  • the allocation creation unit 302 creates position information 67, which is information indicating to which position in the drawing destination buffer 55 the detail buffer 64 corresponds, position information 68, which is information indicating to which position in the drawing destination buffer 55 the detail buffer 65 corresponds, and position information 69, which is information indicating to which position in the drawing destination buffer 55 the detail buffer 66 corresponds.
  • the allocation creating unit 302 stores the position information 67 in the detail buffer 64 , the position information 68 in the detail buffer 65 , and the position information 69 in the detail buffer 66 .
  • FIG. 24 is a diagram schematically showing position information 67, which is information indicating to which position in the drawing destination buffer 55 the detail buffer 64 corresponds.
  • position information 67 is information indicating to which position in the drawing destination buffer 55 the detail buffer 64 corresponds.
  • a dotted line range 64 on the drawing destination buffer 55 corresponds to the position of the detail buffer 64 .
  • FIG. 25 is a diagram schematically showing position information 68, which is information indicating to which position in the drawing destination buffer 55 another detail buffer 65 corresponds.
  • position information 68 is information indicating to which position in the drawing destination buffer 55 another detail buffer 65 corresponds.
  • a dotted line range 65 on the drawing destination buffer 55 corresponds to the position of the detail buffer 65 .
  • FIG. 26 is a diagram schematically showing position information 69, which is information indicating to which position in the rendering destination buffer 55 another detail buffer 66 corresponds.
  • position information 69 is information indicating to which position in the rendering destination buffer 55 another detail buffer 66 corresponds.
  • a dotted line range 66 on the drawing destination buffer 55 corresponds to the position of the detail buffer 66 .
  • the position information is divided for each polygon model to be applied drawing.
  • the detailed buffer storage unit 303 may store them collectively.
  • FIG. 27 is a diagram schematically showing position information that summarizes the information as to which position in the drawing destination buffer 55 the detail buffer 64 corresponds, which position another detail buffer 65 corresponds to, and which position yet another detail buffer 66 corresponds to.
  • a dotted line range 64 on the drawing destination buffer 55 corresponds to the position of the detail buffer 64
  • a dotted line range 65 on the drawing destination buffer 55 corresponds to the position of the detail buffer 65
  • a dotted line range 66 on the drawing destination buffer 55 corresponds to the position of the detail buffer 66.
  • the range of the detail buffer 64 corresponding to the polygon model 51 is the range in which the entire polygon model 51 can be drawn.
  • the range of the detail buffer 65 corresponding to the polygon model 52 is the range in which the entire polygon model 52 can be drawn.
  • the range of the detail buffer 66 corresponding to the polygon model 53 is the range in which the entire polygon model 53 can be drawn.
  • the range of the detail buffer corresponding to an applicable drawing target may be determined in advance, may be determined upon receipt of input from the user, or may be determined according to the area of the drawing target; the range is not limited.
  • when the range is determined according to the area of the drawing target, the allocation creation unit 302 may calculate the area of each drawing target as the calculation unit 104 of the first embodiment does, or may roughly estimate the area of each drawing target by approximation using a bounding box or the like. In the case of a three-dimensional drawing target, the area is calculated after the drawing target is projected and transformed into two dimensions. The method of estimating the area of each drawing target using a bounding box or the like is the same as in FIG. 9 of the first embodiment. Note that the allocation creation unit 302 may create the detail buffer with a range wider than the applicable drawing target.
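A minimal sketch of the bounding-box estimate mentioned above (the function name is an assumption; vertices are taken to be already projected to two dimensions):

```python
def bounding_box_area(vertices_2d):
    """Rough area estimate for a projected drawing target: the area of the
    axis-aligned bounding box enclosing its 2-D vertices."""
    xs = [v[0] for v in vertices_2d]
    ys = [v[1] for v in vertices_2d]
    return (max(xs) - min(xs)) * (max(ys) - min(ys))
```

The estimate never undershoots the true polygon area, which is acceptable here because the allocation creation unit may create the detail buffer with a range wider than the target.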
  • the assignment creation unit 302 transmits the drawing information and the tag information of the drawing target, which is the applicable drawing target, to the calculation unit 104, and proceeds to step S45.
  • Specifically, the allocation creation unit 302 transmits the drawing information, the tag information of the polygon model 51 tagged as an applicable drawing target, the tag information of the polygon model 52 tagged as an applicable drawing target, and the tag information of the polygon model 53 tagged as an applicable drawing target to the calculation unit 104.
  • step S45 the calculation unit 104 receives the drawing information and the tag information of the drawing target, which is the applicable drawing target, from the allocation creation unit 302.
  • Specifically, the calculation unit 104 receives from the allocation creation unit 302 the drawing information, the tag information of the polygon model 51 tagged as an applicable drawing target, the tag information of the polygon model 52 tagged as an applicable drawing target, and the tag information of the polygon model 53 tagged as an applicable drawing target. Otherwise, step S45 is the same as step S14 of the first embodiment.
  • In step S46, the determination unit 105 transmits the drawing information, the tag information of the drawing targets that are applicable drawing targets, and the alpha value information of the applicable drawing targets to the applicable drawing target drawing unit 304, and the process proceeds to step S47. Otherwise, step S46 is the same as step S15 of the first embodiment.
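The exact area-to-alpha mapping is defined in step S15 of the first embodiment. A minimal sketch that is consistent with the alpha values appearing in this example (0.4 for the polygon model 51, 1.0 for the polygon model 53) would clamp the sub-pixel area to [0, 1]; this is an assumption for illustration, not the patent's stated formula:

```python
def alpha_from_area(area_px):
    """Assumed mapping: the alpha value equals the target's projected area
    in pixels, clamped to [0, 1], so a target covering 0.4 of a pixel gets
    alpha 0.4 and a target of one pixel or more is fully opaque."""
    return min(max(area_px, 0.0), 1.0)
```

Under this mapping, the smaller the target, the fainter its blended contribution, which matches the intent that tiny targets remain visible without overpowering the background.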
  • In step S47, the applicable drawing target drawing unit 304 receives the drawing information, the tag information of the drawing targets that are applicable drawing targets, and the alpha value information of the applicable drawing targets from the determination unit 105, and draws each applicable drawing target into its corresponding detail buffer. The alpha value information is also stored in each detail buffer at this time. The applicable drawing target drawing unit 304 also acquires the number of pixels drawn in the detail buffer each time an applicable drawing target is drawn in a detail buffer.
  • Specifically, the applicable drawing target drawing unit 304 receives from the determination unit 105 the drawing information, the tag information of the polygon models 51 to 53 tagged as applicable drawing targets, and the alpha value information of the applicable drawing targets.
  • the applicable drawing target drawing unit 304 draws the polygon model 51 in the detail buffer 64 of the detail buffer storage unit 303 . Also, the applicable drawing target drawing unit 304 draws the polygon model 52 in the detail buffer 65 of the detail buffer storage unit 303 . Further, the applicable drawing target drawing unit 304 draws the polygon model 53 in the detail buffer 66 of the detail buffer storage unit 303 .
  • FIG. 28 shows a drawing result 70 obtained by drawing a polygon model 51 to be applied drawing in the detail buffer 64, a drawing result 71 obtained by drawing another polygon model 53 to be applied drawing in another detail buffer 66, and a plurality of drawing targets. It is a diagram in which polygon models 51 to 53 are superimposed.
  • since the polygon model 51 includes the center of one pixel in the detail buffer 64, as shown in FIG. 28, that pixel is filled with the RGB values associated with the polygon model 51 included in the drawing information, and the polygon model 51 is drawn.
  • the drawing result of the polygon model 51 is the drawing result 70 of one pixel in FIG. 28. Since the polygon model 52 does not include the center of any pixel in the detail buffer 65, it is not drawn and disappears; therefore, nothing is drawn in the detail buffer 65. Since the polygon model 53 includes the center of one pixel in the detail buffer 66, that pixel is filled with the RGB values associated with the polygon model 53 included in the drawing information, and the polygon model 53 is drawn.
  • the drawing result of the polygon model 53 is the drawing result 71 of one pixel in FIG. 28.
  • the drawing result 70 of the third embodiment is the same as the drawing result 58 of FIG. 12 of the first embodiment.
  • the drawing result 71 of the third embodiment is the same as the drawing result 59 of FIG. 12 of the first embodiment.
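The reason small targets vanish can be seen from a pixel-center coverage test. The sketch below (assumed names; standard edge-function rasterization, not taken from the patent) reports which pixel centers a projected triangle covers — a triangle that covers no center, like the polygon model 52, produces no pixels at all:

```python
def edge(ax, ay, bx, by, px, py):
    # Signed area test: which side of edge (a -> b) the point p lies on
    return (bx - ax) * (py - ay) - (by - ay) * (px - ax)

def covered_pixels(tri, width, height):
    """Pixels (ix, iy) whose center (ix+0.5, iy+0.5) lies inside triangle tri."""
    (ax, ay), (bx, by), (cx, cy) = tri
    hits = []
    for iy in range(height):
        for ix in range(width):
            px, py = ix + 0.5, iy + 0.5
            w0 = edge(ax, ay, bx, by, px, py)
            w1 = edge(bx, by, cx, cy, px, py)
            w2 = edge(cx, cy, ax, ay, px, py)
            # Inside if all edge functions agree in sign (either winding)
            if (w0 >= 0 and w1 >= 0 and w2 >= 0) or (w0 <= 0 and w1 <= 0 and w2 <= 0):
                hits.append((ix, iy))
    return hits
```

A sub-pixel triangle that misses every center yields an empty list, which is exactly the "disappeared" case the dot printing unit later repairs.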
  • FIG. 29 is a diagram showing a drawing result 70 obtained by drawing the polygon model 51 to be applied drawing in the detail buffer 64 and a drawing result 71 obtained by drawing another polygon model 53 to be applied drawing in another detail buffer 66 . .
  • FIG. 29 eliminates the superimposed display of the multiple polygon models 51-53 of FIG. 28 and shows only the actual drawing results 70-71.
  • the drawing result 70 of the third embodiment is the same as the drawing result 58 of FIG. 13 of the first embodiment.
  • the drawing result 71 of the third embodiment is the same as the drawing result 59 of FIG. 13 of the first embodiment.
  • from the alpha value information of the polygon model 51, the applicable drawing target drawing unit 304 stores the alpha value 0.4 of the polygon model 51 in the pixel of the drawing result 70. Further, from the alpha value information of the polygon model 53, the applicable drawing target drawing unit 304 stores the alpha value 1.0 of the polygon model 53 in the pixel of the drawing result 71.
  • the values in the pixels of FIG. 29 represent the stored alpha values.
  • the applicable drawing target drawing unit 304 acquires the number of pixels in which the detailed buffer 64 is actually filled with the polygon model 51 after performing the process of drawing the polygon model 51 in the detail buffer 64 .
  • the applied drawing target drawing unit 304 acquires information that the number of pixels actually drawn in the detail buffer 64 is 1 pixel.
  • the applied drawing target drawing unit 304 acquires the number of pixels in which the detailed buffer 65 is actually filled with the polygon model 52 .
  • as can be seen from FIG. 28, the polygon model 52 does not fill even one pixel of the detail buffer 65, so the applicable drawing target drawing unit 304 acquires the information that the number of pixels is 0.
  • the applied drawing target drawing unit 304 acquires the number of pixels in which the detailed buffer 66 is actually filled with the polygon model 53 .
  • the applied drawing target drawing unit 304 acquires information that the number of pixels actually drawn in the detail buffer 66 is 1 pixel.
  • information indicating the number of pixels drawn in the detail buffer by the applicable drawing target acquired by the applicable drawing target drawing unit 304 will be referred to as pixel number information, as in the first embodiment.
  • the applicable drawing target drawing unit 304 transmits the drawing information, the tag information of the drawing targets that are applicable drawing targets, the alpha value information, and the pixel number information to the dot printing unit 305, and the process proceeds to step S48.
  • In step S48, the dot printing unit 305 receives the drawing information, the tag information of the drawing targets that are applicable drawing targets, the alpha value information, and the pixel number information from the applicable drawing target drawing unit 304.
  • the dot printing unit 305 identifies, from the pixel number information, the erasure-applied drawing target, which is an applicable drawing target that disappeared without being drawn by the applicable drawing target drawing unit 304, dots the pixel in the detail buffer corresponding to the position of the erasure-applied drawing target, and draws the erasure-applied drawing target in the detail buffer. At this time, the alpha value of the alpha value information is also stored in the detail buffer.
  • Specifically, the dot printing unit 305 receives from the applicable drawing target drawing unit 304 the drawing information, the tag information of the polygon models 51 to 53 tagged as applicable drawing targets, the alpha value information that the alpha value of the polygon model 51 is 0.4, that the alpha value of the polygon model 52 is 0.7, and that the alpha value of the polygon model 53 is 1.0, and the pixel number information of each polygon model.
  • the dot printing unit 305 identifies, from the pixel number information, an applicable drawing target whose number of drawn pixels is 0 as an erasure-applied drawing target that disappeared without being drawn by the applicable drawing target drawing unit 304. Specifically, since the pixel number information of the polygon model 52 is 0 pixels, the dot printing unit 305 identifies the polygon model 52 as the erasure-applied drawing target.
  • the dot printing unit 305 dots the pixels in the detail buffer 65 corresponding to the position of the erasure-applied polygon model 52, and draws the erasure-applied drawing target in the detail buffer.
  • Specifically, the dot printing unit 305 dots the pixels in the detail buffer 65 corresponding to the positions of the vertices of the polygon model 52, and draws the erasure-applied polygon model 52 in the detail buffer 65.
  • when an erasure-applied drawing target has no vertices, such as a circle, the dot printing unit 305 approximates it with a polygon and dots the pixels at the positions of the polygon's vertices.
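The dot-printing step can be sketched as follows (assumed names; a dict-based buffer for brevity, not the patent's implementation): one pixel is filled under each vertex of the lost target, so a vanished triangle reappears as up to three dots.

```python
def dot_print(vertices_2d, detail_buffer, rgb, alpha):
    """detail_buffer: dict mapping (ix, iy) -> (r, g, b, alpha).
    Dots the pixel under each vertex of an erasure-applied drawing target,
    storing its RGB values and alpha value."""
    for x, y in vertices_2d:
        detail_buffer[(int(x), int(y))] = (*rgb, alpha)
    return detail_buffer

# A lost triangle (a polygon model with 3 vertices): 3 pixels are dotted.
buf = dot_print([(0.2, 0.1), (2.7, 0.3), (1.4, 1.8)], {}, (255, 0, 0), 0.7)
```

Compared with rasterizing the whole target again at higher resolution, this touches only as many pixels as the target has vertices, which is the load reduction the embodiment claims.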
  • FIG. 30 is a diagram in which the drawing result 70 of the applicable drawing target polygon model 51 drawn in the detail buffer 64, the drawing result 71 of another applicable drawing target polygon model 53 drawn in the detail buffer 66, the drawing result 72 of another applicable drawing target polygon model 52 drawn in the detail buffer 65, and the polygon models 51 to 53 are superimposed.
  • the polygon model 52 has three vertices, so the 3 pixels of the detail buffer 65 corresponding to the positions of the 3 vertices of the polygon model 52 are filled with the RGB values associated with the polygon model 52 included in the drawing information, and the polygon model 52 is drawn.
  • the drawing result of the polygon model 52 is the drawing result 72 of 3 pixels in FIG. 30.
  • FIG. 31 is a diagram showing the drawing result 70 of the applicable drawing target polygon model 51 drawn in the detail buffer 64, the drawing result 71 of another applicable drawing target polygon model 53 drawn in the detail buffer 66, and the drawing result 72 of another applicable drawing target polygon model 52 drawn in the detail buffer 65.
  • FIG. 31 eliminates the superimposed display of the multiple polygon models 51-53 of FIG. 30 and shows only the actual drawing results 70-72.
  • the dot printing unit 305 stores the alpha value 0.7 of the polygon model 52 in each pixel forming the drawing result 72.
  • the value in each pixel in FIG. 31 represents the stored alpha value.
  • the dot printing unit 305 transmits a notification of completion of drawing to the detail buffer to the blend processing unit 306, and the process proceeds to step S49. If there is no erasure-applied drawing target, the dot printing unit 305 does nothing, because the applicable drawing targets have already been drawn in the detail buffers; it transmits the notification of the end of drawing to the detail buffer to the blend processing unit 306, and the process proceeds to step S49.
  • In step S49, the blend processing unit 306 receives the notification of completion of drawing to the drawing destination buffer from the unapplied drawing target drawing unit 107 and the notification of completion of drawing to the detail buffer from the dot printing unit 305.
  • the blend processing unit 306 alpha-blends, for each pixel of each detail buffer stored in the detail buffer storage unit 303, the RGB values with the alpha value into the drawing destination buffer stored in the drawing destination buffer storage unit 103, and draws the applicable drawing targets in the drawing destination buffer.
  • Specifically, the blend processing unit 306 uses the position information 67 stored in the detail buffer 64 to alpha-blend, for each pixel of the detail buffer 64 of FIG. 31, the RGB values with the alpha value into the drawing destination buffer 55 of FIG. 8, and draws the applicable drawing target polygon model 51 in the drawing destination buffer 55.
  • More specifically, the blend processing unit 306 draws the polygon model 51 of FIG. 31 at the position corresponding to the position information 67 in the drawing destination buffer 55 of FIG. 8 by alpha-blending, for the one pixel of the drawing result 70, the RGB values of the polygon model 51 with the alpha value of 0.4.
  • Similarly, the blend processing unit 306 uses the position information 68 stored in the detail buffer 65 to alpha-blend, for each pixel of the detail buffer 65 of FIG. 31, the RGB values with the alpha value into the drawing destination buffer 55 of FIG. 8, and draws the applicable drawing target polygon model 52 in the drawing destination buffer 55.
  • Specifically, the blend processing unit 306 draws the polygon model 52 of FIG. 31 at the position corresponding to the position information 68 in the drawing destination buffer 55 of FIG. 8 by alpha-blending, for each of the 3 pixels of the drawing result 72, the RGB values of the polygon model 52 with the alpha value of 0.7.
  • Further, the blend processing unit 306 uses the position information 69 stored in the detail buffer 66 to alpha-blend, for each pixel of the detail buffer 66 of FIG. 31, the RGB values with the alpha value into the drawing destination buffer 55 of FIG. 8, and draws the applicable drawing target polygon model 53 in the drawing destination buffer 55. Specifically, the blend processing unit 306 draws the polygon model 53 of FIG. 31 at the position corresponding to the position information 69 in the drawing destination buffer 55 of FIG. 8 by alpha-blending, for the one pixel of the drawing result 71, the RGB values of the polygon model 53 with the alpha value of 1.0.
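The blend step above can be sketched as follows (assumed names and data layout, not the patent's implementation): each drawn detail-buffer pixel is composited into the destination buffer at the offset given by its position information, using the standard alpha-blend equation src*a + dst*(1-a).

```python
def alpha_blend(dest, detail, offset):
    """dest: 2-D list of (r, g, b) rows; detail: 2-D list of (r, g, b, a)
    per pixel, or None where nothing was drawn; offset: (x, y) position
    information of the detail buffer inside the destination buffer."""
    ox, oy = offset
    for iy, row in enumerate(detail):
        for ix, src in enumerate(row):
            if src is None:
                continue  # this detail-buffer pixel was never drawn
            r, g, b, a = src
            dr, dg, db = dest[oy + iy][ox + ix]
            dest[oy + iy][ox + ix] = (r * a + dr * (1 - a),
                                      g * a + dg * (1 - a),
                                      b * a + db * (1 - a))
    return dest

# One red detail-buffer pixel with alpha 0.4 blended onto a white background
dest = [[(1.0, 1.0, 1.0)]]
alpha_blend(dest, [[(1.0, 0.0, 0.0, 0.4)]], (0, 0))
```

Because the blend weights the existing destination color by (1 - a), the background color shows through partially transparent targets rather than being erased, as the embodiment notes.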
  • a drawing showing the result of drawing all the polygon models in the drawing destination buffer 55 is similar to FIG.
  • the blend processing unit 306 sends a notification of the end of alpha blending to the output control unit 111, and proceeds to step S50.
  • In step S50, the output control unit 111 receives the alpha blend end notification from the blend processing unit 306. Otherwise, step S50 is the same as step S19 of the first embodiment.
  • Even after executing step S50, the process returns to step S41, and the above processing is repeated until there is a trigger to end the processing, such as turning off the power or performing an end operation. Although the above processing is repeated here, it may be performed only once.
  • As described above, in the drawing processing system 3 according to the third embodiment, the drawing target determined as an unapplied drawing target by the application determination unit 106 is drawn by the unapplied drawing target drawing unit 107 in the drawing destination buffer stored in the drawing destination buffer storage unit 103.
  • the calculation unit 104 calculates the area of the drawing target determined by the application determining unit 106 as the applicable drawing target, and the determining unit 105 determines the alpha value of the applicable drawing target from the calculated area.
  • the applicable drawing target drawing unit 304 draws the applicable drawing target in a detail buffer which is a buffer different from the drawing destination buffer, and the dot printing unit 305 dots the pixels in the detail buffer corresponding to the position of the applicable drawing target.
  • the blend processing unit 306 alpha-blends the detail buffer into the drawing destination buffer with the alpha value of the applicable drawing target, and the output control unit 111 controls the output device 5 to output the scene drawn in the drawing destination buffer. Therefore, even if a drawing target is not positioned at the center of a pixel, the drawing target is drawn and can be displayed on the display screen.
  • the drawing processing system 3 has the allocation creating unit 302 create a detailed buffer corresponding to the range of each applicable drawing target in the detailed buffer storage unit 303, and the applicable drawing target drawing unit 304 and the dot printing unit 305 draws the applicable drawing target in the detail buffer corresponding to the range of the applicable drawing target.
  • when the ratio of the range not used for drawing applicable drawing targets is high, the drawing processing system 3 only needs to create detail buffers corresponding to the ranges of the applicable drawing targets, so no buffer is created wastefully and memory usage can be reduced.
  • In a conventional drawing processing device, when drawing targets with a large difference in drawing size coexist in the same scene, if the entire scene is drawn at the magnification or division count matched to the smallest drawing target, the other drawing targets are also drawn at high resolution or with a large division count, so the drawing processing load increases. The drawing processing systems 1 to 3, however, do not need to draw at high resolution and can reduce the drawing processing load while drawing so that drawing targets are not erased from the display screen.
• the calculation unit calculates the area of the applicable drawing target, and the determination unit determines the alpha value from the calculated area, so the alpha value can be determined according to the size of the drawing target. Moreover, since the drawing processing systems 1 to 3 perform alpha blending, the background color does not disappear.
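As a non-authoritative illustration of how an alpha value could follow from area, the sketch below (Python) uses a simple linear rule; the rule and the function name are assumptions for the example, since the patent does not specify a formula:

```python
def alpha_from_area(area_pixels):
    # Hypothetical rule: a target whose projected area is below one pixel
    # gets an alpha equal to its pixel coverage, so small targets are
    # blended in proportion to their size; larger targets are fully opaque.
    return max(0.0, min(1.0, area_pixels))
```

Under this rule a quarter-pixel target blends at 25% opacity instead of vanishing, while any target of one pixel or more is drawn opaque.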
• in the drawing processing systems 1 to 3, the applicable drawing target drawing unit draws each applicable drawing target in the detail buffer and acquires the number of drawn pixels, and the dotting unit identifies, from the number of drawn pixels acquired for each applicable drawing target, the erasure-applied drawing targets that were not drawn in the detail buffer, dots the pixel in the detail buffer corresponding to the position of each erasure-applied drawing target, and thereby draws the erasure-applied drawing targets in the detail buffer.
• alternatively, the drawing processing systems 1 to 3 may omit the applicable drawing target drawing unit and have the dotting unit dot the pixels in the detail buffer corresponding to the positions of all applicable drawing targets, drawing every applicable drawing target into the detail buffer.
• in that case, however, the number of dots to be placed increases; the drawing processing load can be reduced by providing the applicable drawing target drawing unit and having the dotting unit dot only the pixels in the detail buffer corresponding to the positions of the erasure-applied drawing targets, drawing only those targets into the detail buffer by dotting.
• furthermore, the drawing processing systems 1 to 3 can draw more accurately when an applicable drawing target having a certain area is drawn by the applicable drawing target drawing unit.
• in the drawing processing systems 1 to 3, the dotting unit dots the pixels in the detail buffer corresponding to the positions of the vertices of the erasure-applied drawing target, but it may instead dot the pixel corresponding to the position of the center of gravity of the erasure-applied drawing target. However, dotting the pixel corresponding to the center of gravity requires the center of gravity to be calculated, so the drawing processing load is lower when the dotting unit dots the pixels corresponding to the vertex positions than when it dots the pixel corresponding to the center of gravity.
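The trade-off above can be sketched as follows (Python; the buffer layout and function names are illustrative assumptions, not the patent's implementation). Dotting at vertex positions uses the coordinates directly, whereas dotting at the center of gravity requires an extra averaging step over all vertices:

```python
def dot_at_vertices(buffer, vertices, color):
    # Dot the pixel containing each vertex -- no extra computation needed.
    for x, y in vertices:
        buffer[int(y)][int(x)] = color

def dot_at_centroid(buffer, vertices, color):
    # Dotting at the center of gravity first requires computing it.
    cx = sum(x for x, _ in vertices) / len(vertices)
    cy = sum(y for _, y in vertices) / len(vertices)
    buffer[int(cy)][int(cx)] = color
```

The vertex variant also places one dot per vertex, so a target spanning a pixel boundary leaves a trace in every pixel that contains one of its vertices.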
• the drawing processing systems 1 to 3 draw the non-applicable drawing targets first and then draw the applicable drawing targets.
• the order of drawing may be reversed, or the drawing of non-applicable drawing targets and the drawing of applicable drawing targets may be processed in an arbitrary order.
• the drawing processing systems 1 to 3 may draw a plurality of non-applicable drawing targets in order, or may draw them in parallel.
• likewise, the drawing processing systems 1 to 3 may draw a plurality of applicable drawing targets in order, or may draw them in parallel.
• the drawing processing systems 1 to 3 are each provided with a detail buffer storage unit, but the storage unit may be divided into a plurality of units for processing.
• the drawing processing devices 100 to 300 include the drawing destination buffer storage unit and the detail buffer storage unit, but these storage units may instead be provided outside the drawing processing devices.
• in that case, the drawing processing devices 100 to 300 may exchange information with the storage units through communication.
• the creation unit 102 creates the drawing destination buffer and the detail buffer in step S12, but both buffers may instead be created in advance. Alternatively, after creating the drawing destination buffer in step S12, the creation unit 102 may create the detail buffer at any point between steps S13 and S16.
• the drawing destination buffer can be created at any time before drawing is performed in the drawing destination buffer.
• likewise, the detail buffer can be created at any time before drawing is performed in the detail buffer.
• the drawing processing systems 1 to 3 use the bounding box 80 when calculating an approximate value of the area of each drawing target, but instead of the bounding box, a sphere, a rugby-ball shape, a triangular prism, or a shape such as a triangular pyramid that has fewer vertices than the drawing target and whose area is easier to calculate than that of the drawing target may be used; a combination of these may also be used.
• the bounding box or the like can be used not only to approximate the area of a three-dimensional drawing target but also to approximate the area of a two-dimensional drawing target. In that case, the sphere is replaced with a circle, the rugby-ball shape with an ellipse, and the triangular prism and triangular pyramid with easily approximated shapes such as triangles.
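As a minimal sketch of the approximation idea (Python; the function name is an assumption), the axis-aligned bounding box of the projected vertices gives a cheaply computed estimate of the target's area:

```python
def bbox_area_2d(points):
    # Area of the axis-aligned bounding box of the projected vertices:
    # far fewer operations than computing the exact area of the target.
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    return (max(xs) - min(xs)) * (max(ys) - min(ys))
```

For a triangle with vertices (0, 0), (2, 0), (1, 3) this returns the 2 × 3 box area of 6 rather than the exact area 3; the estimate only needs min/max scans, which is the point of the substitution.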
• the drawing processing systems 1 to 3 are applied, for example, when the drawing target is a polygon model, which is one kind of surface model in three-dimensional graphics, but they can also be applied to other surface models, solid models, point models, and the like. The drawing processing systems 1 to 3 can also be applied when the drawing target is two-dimensional graphics rather than three-dimensional graphics.
• the drawing processing systems 1 to 3 are applied, for example, when the drawing target is represented by RGB values and displayed in RGB colors, but they can also be applied when the drawing is displayed in other color representations.
• the drawing processing device, drawing processing system, drawing processing method, and drawing processing program shown in the above embodiments are merely examples; they can be combined with other devices as appropriate and are not limited to the configurations of the embodiments alone.
• 1 to 3 drawing processing system, 4, 6 input device, 5 output device, 41 bus, 42 CPU, 43 main memory, 44 GPU, 45 graphic memory, 46 input interface, 47 output interface, 51 to 54 polygon model, 55 drawing destination buffer, 56, 64 to 66, 70 to 72 detail buffer, 57 to 63 drawing result, 67 to 69 position information, 80 bounding box, 81 drawing target, 82 area, 100 to 300 drawing processing device, 101, 201 acquisition unit, 102, 301 creation unit, 103 drawing destination buffer storage unit, 104, 204 calculation unit, 105, 205 determination unit, 106, 202 application determination unit, 107, 203 non-applicable drawing target drawing unit, 108, 209, 304 applicable drawing target drawing unit, 109, 303 detail buffer storage unit, 110, 207, 306 blend processing unit, 111 output control unit, 112, 210, 305 dotting unit, 206 end determination unit, 208 accumulation storage unit, 302 allocation creation unit.

Abstract

Existing drawing processing devices have the problem that if a small drawing target, such as one whose drawing size is less than one pixel, is not positioned at the center of a pixel, the two-or-more-dimensional drawing target disappears from the display screen without being drawn. A drawing processing device (100) is provided with: a dotting unit (112) that dots a pixel in a buffer corresponding to the position of a two-or-more-dimensional drawing target, thereby drawing the drawing target in the buffer; and an output control unit (111) that performs control such that the scene drawn in the buffer is output to an output device (5).

Description

Drawing processing device, drawing processing system, drawing processing method, and drawing processing program
 The present disclosure relates to a drawing processing device that performs processing for drawing drawing information, a drawing processing system using the drawing processing device, a drawing processing method, and a drawing processing program.
 A drawing target modeled in two dimensions, three dimensions, or the like may be displayed on a display device such as a display. A two-dimensional drawing target is drawn as-is into the drawing destination buffer, which is the buffer being drawn into, whereas a three-dimensional drawing target is first projected into two dimensions and then drawn into the drawing destination buffer. When the center of a pixel in the drawing destination buffer falls inside the drawing target, that pixel is filled in and the drawing target is drawn. Drawing targets come in various sizes, from small targets of less than one pixel to targets that span several pixels.
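The pixel-center fill rule described above can be sketched as follows (Python; the names are illustrative, not from the patent). A pixel is filled only when its center falls inside the projected target, which is why a sub-pixel target that misses every pixel center is not drawn at all:

```python
def covered_pixels(width, height, inside):
    # A pixel (px, py) is filled only when its center (px+0.5, py+0.5)
    # lies inside the projected drawing target.
    return [(px, py)
            for py in range(height)
            for px in range(width)
            if inside(px + 0.5, py + 0.5)]

# A tiny target (circle of radius 0.3 near a pixel corner) covers no
# pixel center, so it produces no filled pixels and would disappear.
tiny = lambda x, y: (x - 0.1) ** 2 + (y - 0.1) ** 2 <= 0.3 ** 2
```

Running `covered_pixels(2, 2, tiny)` yields an empty list: the circle occupies part of the top-left pixel but misses its center at (0.5, 0.5), reproducing the disappearance problem the disclosure addresses.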
 As a technique for drawing such a drawing target, the technique of Patent Document 1 is known. Patent Document 1 discloses a technique in which a drawing processing device renders only specific drawing targets, such as small characters and thin lines, at high resolution, and renders the other drawing targets at low resolution.
JP 2009-274366 A
 However, in the conventional drawing processing device described above, high-resolution rendering means performing per-pixel processing for all pixel positions, while low-resolution rendering means performing processing in units of multiple pixels only for pixel positions whose X and Y coordinates take even values. As a result, a small drawing target whose drawing size is less than one pixel, for example, disappears from the display screen without being drawn if it is not positioned at the center of a pixel.
 The present disclosure has been made to solve the above problem, and its object is to provide a drawing processing device, a drawing processing system, a drawing processing method, and a drawing processing program capable of drawing a two-or-more-dimensional drawing target and displaying it on the display screen even when the drawing target is not positioned at the center of a pixel.
 A drawing processing device according to the present disclosure includes: a dotting unit that dots a pixel in a buffer corresponding to the position of a two-or-more-dimensional drawing target, thereby drawing the drawing target in the buffer; and an output control unit that performs control such that the scene drawn in the buffer is output to an output device.
 A drawing processing system according to the present disclosure includes: an input device that receives input of a two-or-more-dimensional drawing target; a drawing processing device that performs drawing processing for drawing the drawing target as a scene in a buffer; and an output device that outputs the scene. The drawing processing device includes a dotting unit that dots a pixel in the buffer corresponding to the position of the drawing target, thereby drawing the drawing target in the buffer, and an output control unit that performs control such that the scene drawn in the buffer is output to the output device.
 A drawing processing method according to the present disclosure includes: a step of dotting a pixel in a buffer corresponding to the position of a two-or-more-dimensional drawing target, thereby drawing the drawing target in the buffer; and a step of performing control such that the scene drawn in the buffer is output to an output device.
 A drawing processing program according to the present disclosure causes a computer to execute: a process of dotting a pixel in a buffer corresponding to the position of a two-or-more-dimensional drawing target, thereby drawing the drawing target in the buffer; and a process of performing control such that the scene drawn in the buffer is output to an output device.
 According to the present disclosure, the dotting unit dots a pixel in the buffer corresponding to the position of a two-or-more-dimensional drawing target, thereby drawing the drawing target in the buffer, and the output control unit performs control such that the scene drawn in the buffer is output to the output device. Therefore, even when a two-or-more-dimensional drawing target is not positioned at the center of a pixel, the drawing target is drawn and can be displayed on the display screen.
FIG. 1 is a block diagram of a drawing processing system including a drawing processing device according to Embodiment 1.
FIG. 2 is a hardware configuration diagram of the drawing processing system according to Embodiment 1.
FIG. 3 is a flowchart showing the operation of the drawing processing system according to Embodiment 1.
FIG. 4 is an example of drawing information according to Embodiment 1.
FIG. 5 is a flowchart showing the operation of the drawing processing device according to Embodiment 1.
FIG. 6 is a diagram of the drawing destination buffer and the detail buffer of Embodiment 1.
FIG. 7 is a diagram in which the result of drawing a non-applicable polygon model in the drawing destination buffer is superimposed on that polygon model.
FIG. 8 is a diagram showing the result of drawing a non-applicable polygon model in the drawing destination buffer.
FIG. 9 is a conceptual diagram showing how a three-dimensional drawing target is projected into two dimensions using a bounding box to approximate its area.
FIG. 10 is a diagram in which the result of drawing an applicable polygon model in the detail buffer is superimposed on a plurality of polygon models.
FIG. 11 is a diagram showing the result of drawing an applicable polygon model in the detail buffer.
FIG. 12 is a diagram in which the results of drawing an applicable polygon model and another applicable polygon model in the detail buffer are superimposed on a plurality of polygon models.
FIG. 13 is a diagram showing the results of drawing an applicable polygon model and another applicable polygon model in the detail buffer.
FIG. 14 is a diagram in which the results of drawing three applicable polygon models in the detail buffer are superimposed on a plurality of polygon models.
FIG. 15 is a diagram showing the results of drawing three applicable polygon models in the detail buffer.
FIG. 16 is a diagram showing the result of drawing all polygon models in the drawing destination buffer.
FIG. 17 is a block diagram of a drawing processing system including a drawing processing device according to Embodiment 2.
FIG. 18 is a flowchart showing the operation of the drawing processing system according to Embodiment 2 of the present disclosure.
FIG. 19 is a flowchart showing the operation of the drawing processing device according to Embodiment 2 of the present disclosure.
FIG. 20 is a block diagram of a drawing processing system including a drawing processing device according to Embodiment 3.
FIG. 21 is a flowchart showing the operation of the drawing processing device according to Embodiment 3 of the present disclosure.
FIG. 22 is a diagram showing the correspondence between a detail buffer for an applicable polygon model in Embodiment 3 and the detail buffer of Embodiment 1.
FIG. 23 is a diagram showing a detail buffer corresponding to an applicable polygon model of Embodiment 3, another detail buffer corresponding to another applicable polygon model, and yet another detail buffer corresponding to yet another applicable polygon model.
FIG. 24 is a diagram schematically showing position information indicating which position in the drawing destination buffer a detail buffer corresponds to.
FIG. 25 is a diagram schematically showing position information indicating which position in the drawing destination buffer another detail buffer corresponds to.
FIG. 26 is a diagram schematically showing position information indicating which position in the drawing destination buffer yet another detail buffer corresponds to.
FIG. 27 is a diagram schematically showing position information combining the information on which positions in the drawing destination buffer the detail buffer, another detail buffer, and yet another detail buffer correspond to.
FIG. 28 is a diagram in which the result of drawing an applicable polygon model in a detail buffer and the result of drawing another applicable polygon model in another detail buffer are superimposed on a plurality of applicable polygon models.
FIG. 29 is a diagram showing the result of drawing an applicable polygon model in a detail buffer and the result of drawing another applicable polygon model in another detail buffer.
FIG. 30 is a diagram in which the results of drawing three applicable polygon models in detail buffers are superimposed on a plurality of polygon models.
FIG. 31 is a diagram showing the results of drawing three applicable polygon models in detail buffers.
Embodiment 1.
 In Embodiment 1, as an example, the application of the present disclosure to a case where the drawing information, which is the information for drawing an entire scene, contains a drawing target that is not positioned at the center of a pixel is described below.
 In a drawing processing system to which the present disclosure is applied, the drawing targets are first divided into applicable drawing targets, to which detailed drawing processing is applied before they are drawn to the drawing destination buffer, and non-applicable drawing targets, which are drawn directly to the drawing destination buffer without detailed drawing processing. The drawing processing system draws the non-applicable drawing targets directly in the drawing destination buffer, calculates the area of each applicable drawing target, and determines from the calculated area an alpha value representing the transparency of the applicable drawing target. The drawing processing system draws the applicable drawing targets in a detail buffer, which is a buffer different from the drawing destination buffer, and acquires the number of drawn pixels. From the number of drawn pixels acquired for each applicable drawing target, the drawing processing system identifies the erasure-applied drawing targets, that is, applicable drawing targets that were not drawn in the detail buffer and have therefore disappeared, dots the pixels in the detail buffer corresponding to their positions, and thereby draws them in the detail buffer. At the pixels corresponding to the erasure-applied drawing targets drawn in the detail buffer, the drawing processing system alpha-blends the RGB values of the erasure-applied drawing targets using their alpha values, thereby drawing them in the drawing destination buffer. In this way, a drawing target is drawn and can be displayed on the display screen even when it is not positioned at the center of a pixel. Details are described below.
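A minimal sketch of this blend step follows (Python; the buffer representation, rows of RGB tuples with None marking undrawn detail pixels, and the function names are assumptions for illustration):

```python
def alpha_blend(dst, src, alpha):
    # Per-channel blend: out = alpha * src + (1 - alpha) * dst.
    return tuple(alpha * s + (1.0 - alpha) * d for s, d in zip(src, dst))

def blend_detail_into_dest(dest, detail, alpha):
    # Every pixel drawn in the detail buffer (non-None) is blended into
    # the corresponding pixel of the drawing destination buffer.
    for y, row in enumerate(detail):
        for x, src in enumerate(row):
            if src is not None:
                dest[y][x] = alpha_blend(dest[y][x], src, alpha)
    return dest
```

Because the blend keeps a (1 − alpha) share of the destination pixel, the background color shows through rather than being overwritten, which is the effect the disclosure attributes to alpha blending.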
 FIG. 1 is a block diagram of the drawing processing system 1 including the drawing processing device 100 according to Embodiment 1. The drawing processing system 1 includes: an input device 4 to which the drawing information and tag information indicating whether detailed drawing processing is to be applied to each drawing target included in the drawing information are input; the drawing processing device 100, which draws the drawing information using the drawing information and the tag information; and an output device 5 that outputs the scene processed by the drawing processing device 100.
 The input device 4 receives, for example, drawing information for an entire scene including information on drawing targets, such as polygon models, input by a user. The user tags each drawing target as either an applicable drawing target or a non-applicable drawing target based on its drawing size, luminance, importance, and the like. The input device 4 receives the user's tagging of each drawing target as tag information.
 The drawing processing device 100 performs drawing processing for drawing the drawing information as a scene in the drawing destination buffer. Specifically, the drawing processing device 100 draws in the drawing destination buffer the non-applicable drawing targets tagged as not subject to detailed drawing processing. The drawing processing device 100 calculates the area of each applicable drawing target tagged as subject to detailed drawing processing, and determines from the calculated area an alpha value representing the transparency of the applicable drawing target. The drawing processing device 100 draws the applicable drawing targets in the detail buffer, which is a buffer different from the drawing destination buffer, and acquires the number of drawn pixels. From the number of drawn pixels acquired for each applicable drawing target, the drawing processing device 100 identifies the erasure-applied drawing targets, that is, applicable drawing targets that were not drawn in the detail buffer and would otherwise disappear, dots the pixels in the detail buffer corresponding to the positions of the erasure-applied drawing targets, and thereby draws them in the detail buffer. At the pixels corresponding to the erasure-applied drawing targets drawn in the detail buffer, the drawing processing device 100 alpha-blends the RGB values of the erasure-applied drawing targets using their alpha values, thereby drawing the erasure-applied drawing targets in the drawing destination buffer. By dotting the pixels in the detail buffer corresponding to the positions of the erasure-applied drawing targets and drawing them in the detail buffer in this way, a drawing target is drawn and can be displayed on the display screen even when it is not positioned at the center of a pixel.
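The identify-and-dot step can be sketched as follows (Python; `rasterize`, the target fields, and the buffer layout are assumptions for the example: `rasterize` stands in for the applicable drawing target drawing unit and returns the number of pixels it filled):

```python
def draw_applicable_targets(detail, targets, rasterize):
    # Targets whose rasterization filled zero pixels have disappeared;
    # they are the erasure-applied drawing targets.
    lost = [t for t in targets if rasterize(detail, t) == 0]
    for t in lost:
        # Dot the pixel of the detail buffer containing the target position.
        x, y = int(t["x"]), int(t["y"])
        if 0 <= y < len(detail) and 0 <= x < len(detail[0]):
            detail[y][x] = t["color"]
    return lost
```

The returned list identifies which targets needed the fallback dot, so every applicable target leaves at least one pixel in the detail buffer for the subsequent blend step.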
 The output device 5 outputs the scene drawn in the drawing destination buffer by the drawing processing device 100. The output device 5 is, for example, a display device such as a display, and displays the scene drawn in the drawing destination buffer by the drawing processing device 100.
 Next, the drawing processing device 100 is described below, again with reference to FIG. 1.
 The drawing processing device 100 includes: an acquisition unit 101 that acquires the drawing information including the drawing targets and the tag information from the input device 4; a creation unit 102 that creates the drawing destination buffer and the detail buffer; an application determination unit 106 that uses the tag information to determine, according to whether detailed drawing processing is to be applied, whether each drawing target is a non-applicable drawing target to be drawn in the drawing destination buffer or an applicable drawing target to be drawn in the detail buffer, which is a buffer different from the drawing destination buffer; a non-applicable drawing target drawing unit 107 that draws in the drawing destination buffer the non-applicable drawing targets tagged as not subject to detailed drawing processing; a calculation unit 104 that calculates the area of each applicable drawing target tagged as subject to detailed drawing processing; a determination unit 105 that determines the alpha value of the applicable drawing target from the calculated area; an applicable drawing target drawing unit 108 that draws the applicable drawing targets in the detail buffer and acquires the number of drawn pixels; a dotting unit 112 that identifies, from the number of drawn pixels acquired for each applicable drawing target, the erasure-applied drawing targets that were not drawn in the detail buffer and would otherwise disappear, and dots the pixels in the detail buffer corresponding to the positions of the erasure-applied drawing targets, thereby drawing them in the detail buffer; a blend processing unit 110 that, for the pixels filled in the detail buffer, alpha-blends the RGB values of the applicable drawing targets, including the erasure-applied drawing targets, with their alpha values, thereby drawing them in the drawing destination buffer; an output control unit 111 that controls the output device 5 to output the scene drawn in the drawing destination buffer; a drawing destination buffer storage unit 103 that stores the drawing destination buffer; and a detail buffer storage unit 109 that stores the detail buffer. The drawing destination buffer corresponds to a first buffer, and the detail buffer corresponds to a second buffer. The non-applicable drawing target corresponds to a first drawing target, and the applicable drawing target corresponds to a second drawing target. The non-applicable drawing target drawing unit 107 corresponds to a first drawing target drawing unit, and the applicable drawing target drawing unit 108 corresponds to a second drawing target drawing unit.
 次に、実施の形態1に係る描画処理システム1のハードウェア構成について説明する。 Next, the hardware configuration of the drawing processing system 1 according to Embodiment 1 will be described.
 図2は、実施の形態1に係る描画処理システム1のハードウェア構成図である。図2を用いて、実施の形態1に係る描画処理装置100を含む描画処理システム1の構成について説明する。 FIG. 2 is a hardware configuration diagram of the drawing processing system 1 according to the first embodiment. The configuration of the drawing processing system 1 including the drawing processing device 100 according to the first embodiment will be described with reference to FIG.
 描画処理システム1は、一例として、バス41、CPU(Central Processing Unit)42、メインメモリ43、GPU(Graphics Processing Unit)44、グラフィックメモリ45、入力インターフェース46、出力インターフェース47とで構成されている。 As an example, the drawing processing system 1 includes a bus 41 , a CPU (Central Processing Unit) 42 , a main memory 43 , a GPU (Graphics Processing Unit) 44 , a graphic memory 45 , an input interface 46 and an output interface 47 .
 バス41は、各装置間を電気的に接続し、データのやり取りを行う信号経路である。описания描画処理システム1の装置間及び機器間は通信を介して接続されている。用いる通信については、有線通信でも無線通信でもよい。 The bus 41 is a signal path that electrically connects the devices and over which data is exchanged. The devices and apparatuses of the drawing processing system 1 are connected to one another via communication, which may be either wired or wireless.
 CPU42は、メインメモリ43に格納されるプログラムを読み込み、実行する。具体的には、CPU42は、メインメモリ43に記憶したOS(Operating System)の少なくとも一部をロードし、OSを実行しながら、プログラムを実行する。CPU42は、バス41を介して他の装置と接続し、当該装置を制御する。CPU42は、プロセッシングを行うIC(Integrated Circuit)であればいいので、マイクロプロセッサ、マイクロコンピュータ、DSP(Digital Signal Processor)であってもよい。 The CPU 42 reads and executes programs stored in the main memory 43. Specifically, the CPU 42 loads at least part of the OS (Operating System) stored in the main memory 43 and executes programs while running the OS. The CPU 42 is connected to other devices via the bus 41 and controls those devices. The CPU 42 only needs to be an IC (Integrated Circuit) that performs processing, and may therefore be a microprocessor, a microcomputer, or a DSP (Digital Signal Processor).
 メインメモリ43は、ソフトウェア、ファームウェア、またはソフトウェアとファームウェアの組み合わせが記述されたプログラム、OSを記憶する。また、メインメモリ43は、各種情報等を記憶する。メインメモリ43は、例えば、RAM(Random Access Memory)等の揮発性の半導体メモリ、ROM(Read Only Memory)、フラッシュメモリ、EPROM(Erasable Programmable Read Only Memory)、EEPROM(Electrically Erasable Programmable Read Only Memory)、HDD(Hard Disk Drive)等の不揮発性の半導体メモリ、磁気ディスク、フレキシブルディスク、光ディスク、コンパクトディスク、ミニディスク、DVD(Digital Versatile Disc)といった可搬記録媒体等である。 The main memory 43 stores programs in which software, firmware, or a combination of software and firmware is described, as well as the OS. The main memory 43 also stores various kinds of information. The main memory 43 is, for example, a volatile semiconductor memory such as a RAM (Random Access Memory); a nonvolatile memory such as a ROM (Read Only Memory), a flash memory, an EPROM (Erasable Programmable Read Only Memory), an EEPROM (Electrically Erasable Programmable Read Only Memory), or an HDD (Hard Disk Drive); or a portable recording medium such as a magnetic disk, a flexible disk, an optical disc, a compact disc, a mini disc, or a DVD (Digital Versatile Disc).
 GPU44は、グラフィックメモリ45に格納されるプログラムを読み込み、描画処理を実行する。具体的には、GPU44は、グラフィックメモリ45に記憶したOSの少なくとも一部をロードし、OSを実行しながら、描画処理のプログラムを実行する。描画処理装置100の取得部101と、作成部102と、適用判定部106と、不適用描画対象描画部107と、算出部104と、決定部105と、適用描画対象描画部108と、打点部112と、ブレンド処理部110と、出力制御部111とは、GPU44がグラフィックメモリ45にロードしたプログラムを読み込み、実行することにより実現する。 The GPU 44 reads programs stored in the graphic memory 45 and executes drawing processing. Specifically, the GPU 44 loads at least part of the OS stored in the graphic memory 45 and executes the drawing processing programs while running the OS. The acquisition unit 101, the creation unit 102, the application determination unit 106, the non-applicable drawing target drawing unit 107, the calculation unit 104, the determination unit 105, the applicable drawing target drawing unit 108, the dot printing unit 112, the blend processing unit 110, and the output control unit 111 of the drawing processing device 100 are implemented by the GPU 44 reading and executing the programs loaded in the graphic memory 45.
 グラフィックメモリ45は、ソフトウェア、ファームウェア、またはソフトウェアとファームウェアの組み合わせが記述されたプログラム、OSを記憶する。また、グラフィックメモリ45は、各種情報等を記憶する。グラフィックメモリ45は、例えば、RAM等の揮発性の半導体メモリ、ROM、フラッシュメモリ、EPROM、EEPROM、HDD等の不揮発性の半導体メモリ、磁気ディスク、フレキシブルディスク、光ディスク、コンパクトディスク、ミニディスク、DVDといった可搬記録媒体等である。描画処理装置100の取得部101と、作成部102と、適用判定部106と、不適用描画対象描画部107と、算出部104と、決定部105と、適用描画対象描画部108と、打点部112と、ブレンド処理部110と、出力制御部111とは、グラフィックメモリ45に記憶するプログラムによって実現する。また、描画先バッファを記憶する描画先バッファ記憶部103と、詳細バッファを記憶する詳細バッファ記憶部109とは、グラフィックメモリ45によって実現される。 The graphic memory 45 stores programs in which software, firmware, or a combination of software and firmware is described, as well as the OS. The graphic memory 45 also stores various kinds of information. The graphic memory 45 is, for example, a volatile semiconductor memory such as a RAM; a nonvolatile memory such as a ROM, a flash memory, an EPROM, an EEPROM, or an HDD; or a portable recording medium such as a magnetic disk, a flexible disk, an optical disc, a compact disc, a mini disc, or a DVD. The acquisition unit 101, the creation unit 102, the application determination unit 106, the non-applicable drawing target drawing unit 107, the calculation unit 104, the determination unit 105, the applicable drawing target drawing unit 108, the dot printing unit 112, the blend processing unit 110, and the output control unit 111 of the drawing processing device 100 are implemented by programs stored in the graphic memory 45. The drawing destination buffer storage unit 103, which stores the drawing destination buffer, and the detail buffer storage unit 109, which stores the detail buffer, are implemented by the graphic memory 45.
 なお、グラフィックメモリ45は、描画処理装置100の取得部101と、作成部102と、適用判定部106と、不適用描画対象描画部107と、算出部104と、決定部105と、適用描画対象描画部108と、打点部112と、ブレンド処理部110と、出力制御部111との機能それぞれを別々のメモリで実現してもよい。また、描画先バッファ記憶部103と、詳細バッファ記憶部109それぞれを別々のメモリで実現してもよい。 Note that the functions of the acquisition unit 101, the creation unit 102, the application determination unit 106, the non-applicable drawing target drawing unit 107, the calculation unit 104, the determination unit 105, the applicable drawing target drawing unit 108, the dot printing unit 112, the blend processing unit 110, and the output control unit 111 of the drawing processing device 100 may each be implemented by a separate memory instead of the single graphic memory 45. Likewise, the drawing destination buffer storage unit 103 and the detail buffer storage unit 109 may each be implemented by a separate memory.
 実施の形態1では、描画処理装置100が行う描画処理を含む処理は、GPU44及びグラフィックメモリ45で行い、描画処理装置100が行う描画処理を含む処理以外の処理は、CPU42及びメインメモリ43で行う。なお、描画処理システム1は、CPU42とGPU44のどちらか一方だけ備えていて、備えている方がすべての処理を実現してもよい。また、描画処理システム1は、メインメモリ43とグラフィックメモリ45のどちらか一方だけ備えていて、備えているメモリが備えていない方のメモリと同様の機能を実現してもよい。各部の情報等は、メインメモリ43、グラフィックメモリ45のどちらに記憶されてもよい。また、各部の情報等は、メインメモリ43、グラフィックメモリ45を組み合わせて記憶されてもよい。 In the first embodiment, the processing performed by the drawing processing device 100, including the drawing processing, is performed by the GPU 44 and the graphic memory 45, while all other processing is performed by the CPU 42 and the main memory 43. The drawing processing system 1 may instead include only one of the CPU 42 and the GPU 44, with the included one performing all the processing. Likewise, the drawing processing system 1 may include only one of the main memory 43 and the graphic memory 45, with the included memory also providing the functions of the other. The information of each unit may be stored in either the main memory 43 or the graphic memory 45, or in a combination of the two.
 入力インターフェース46は、描画情報及びタグ情報を入力可能とする装置である。入力インターフェース46は、例えば、キーボード、マウス、音声入力などでの入力、通信を介した情報の受信などである。入力装置4は、入力インターフェース46にて実現する。 The input interface 46 is a device that enables input of the drawing information and the tag information. The input interface 46 accepts, for example, input from a keyboard, a mouse, or voice input, or receives information via communication. The input device 4 is implemented by the input interface 46.
 出力インターフェース47は、描画されたシーンを出力する装置である。出力インターフェース47は、例えば、ディスプレイ、プロジェクターなどの表示装置である。出力装置5は、出力インターフェース47にて実現する。 The output interface 47 is a device that outputs the drawn scene. The output interface 47 is, for example, a display device such as a display or a projector. The output device 5 is implemented by the output interface 47.
 なお、描画処理システム1の各機能は、ハードウェア、ソフトウェア、ファームウェア、またはこれらの組み合わせにより実現できる。描画処理システム1の各機能について、一部を専用のハードウェアで実現し、一部をソフトウェアまたはファームウェアで実現するようにしてもよい。例えば、描画処理システム1の一部は専用のハードウェアとしての処理回路でその機能を実現し、残りの部分はCPUがメモリに格納されたプログラムを読み出して実行することによってその機能を実現してもよい。ソフトウェア、ファームウェア、またはソフトウェアとファームウェアの組み合わせはプログラムとして記述されていればよい。 Each function of the drawing processing system 1 can be realized by hardware, software, firmware, or a combination thereof. Each function of the drawing processing system 1 may be partly realized by dedicated hardware and partly realized by software or firmware. For example, part of the drawing processing system 1 may realize its functions with a processing circuit serving as dedicated hardware, while the remaining part realizes its functions by a CPU reading and executing a program stored in a memory. Software, firmware, or a combination of software and firmware may be described as a program.
 次に、描画処理システム1の動作について説明する。 Next, the operation of the drawing processing system 1 will be explained.
 図3は、実施の形態1に係る描画処理システム1の動作を示すフローチャートである。図3を用いて、描画処理システム1の動作を以下に説明する。 FIG. 3 is a flow chart showing the operation of the drawing processing system 1 according to the first embodiment. The operation of the drawing processing system 1 will be described below with reference to FIG.
 ステップS1において、入力装置4は、描画対象を含む描画情報の入力を受け付ける。例えば、ユーザーがマウス、キーボードなどを用いて、描画情報を指定して入力装置4に入力し、入力装置4は、当該入力を受け付ける。 In step S1, the input device 4 accepts input of drawing information including a drawing target. For example, a user uses a mouse, a keyboard, or the like to specify drawing information and input it to the input device 4, and the input device 4 receives the input.
 なお、入力装置4が受け付ける描画情報は、ユーザーが指定しなくても、通信を介して受信した描画情報、予め定められた基準を満たしたものを自動的に抽出した描画情報、機械学習で抽出した描画情報など、どのようなものでもよい。実施の形態1では、例えば、描画対象を3次元のポリゴンモデルとし、描画情報はポリゴンモデルである描画対象の情報を含んでいるとする。ポリゴンモデルである描画対象の情報とは、RGB値、位置情報などである。 Note that the drawing information accepted by the input device 4 need not be specified by the user; it may be any drawing information, such as drawing information received via communication, drawing information automatically extracted as satisfying a predetermined criterion, or drawing information extracted by machine learning. In the first embodiment, for example, the drawing targets are three-dimensional polygon models, and the drawing information includes information on the drawing targets, which are polygon models. The information on a drawing target that is a polygon model includes RGB values, position information, and the like.
 図4は、実施の形態1の描画情報の一例である。描画情報は、描画対象であるポリゴンモデル51と、ポリゴンモデル52と、ポリゴンモデル53と、ポリゴンモデル54とを含むとする。なお、図4では、描画の流れがわかりやすいように描画先バッファと同じ大きさのバッファの上にポリゴンモデル51~54が描画されているように表示している。 FIG. 4 is an example of drawing information according to the first embodiment. The drawing information includes a polygon model 51, a polygon model 52, a polygon model 53, and a polygon model 54 to be drawn. In FIG. 4, the polygon models 51 to 54 are displayed as if drawn on a buffer of the same size as the drawing destination buffer so that the drawing flow can be easily understood.
 図3に戻って、入力装置4は、描画情報と共に、受け付けた描画情報に含まれる描画対象において、適用描画対象と不適用描画対象とのいずれであるかの情報であるタグ情報の入力を受け付ける。なお、タグ情報は、各描画対象に対応している。 Returning to FIG. 3, the input device 4 accepts, together with the drawing information, input of tag information, which is information indicating whether each drawing target included in the accepted drawing information is an applicable drawing target or a non-applicable drawing target. Note that the tag information corresponds to each drawing target.
 例えば、ユーザーは、描画対象ごとに、描画サイズの大きさを基に、描画対象であるポリゴンモデルが詳細描画処理を行わなくてもいいほど大きいサイズか、それとも描画対象であるポリゴンモデルが詳細描画処理を行うほど小さいサイズかを判定する。ユーザーは、マウス、キーボード、音声入力などを用いて、大きいサイズであると判定した描画対象を不適用描画対象、小さいサイズと判定した描画対象を適用描画対象として、描画対象ごとにタグ付けをしたタグ情報を入力装置4に入力する。入力装置4は、当該入力を受け付ける。 For example, for each drawing target, based on its drawing size, the user judges whether the polygon model to be drawn is large enough that detailed drawing processing is unnecessary, or small enough that detailed drawing processing should be performed. Using a mouse, keyboard, voice input, or the like, the user inputs to the input device 4 tag information in which each drawing target judged to be of large size is tagged as a non-applicable drawing target and each drawing target judged to be of small size is tagged as an applicable drawing target. The input device 4 accepts the input.
 実施の形態1では、図4の描画情報に対して、ポリゴンモデル51~53は、詳細描画処理を行うほど小さいサイズであるとユーザーに判定されて、ポリゴンモデル54は、詳細描画処理を行わなくてもいいほど大きいサイズであるとユーザーに判定されたとする。入力装置4は、ユーザーによって入力された、ポリゴンモデル51が適用描画対象であるとタグ付けされたタグ情報と、ポリゴンモデル52が適用描画対象であるとタグ付けされたタグ情報と、ポリゴンモデル53が適用描画対象であるとタグ付けされたタグ情報と、ポリゴンモデル54が不適用描画対象であるとタグ付けされたタグ情報とを受け付けたとする。 In the first embodiment, assume that, for the drawing information of FIG. 4, the user judges the polygon models 51 to 53 to be small enough that detailed drawing processing should be performed, and judges the polygon model 54 to be large enough that detailed drawing processing is unnecessary. Assume that the input device 4 accepts, as input by the user, tag information tagging the polygon model 51 as an applicable drawing target, tag information tagging the polygon model 52 as an applicable drawing target, tag information tagging the polygon model 53 as an applicable drawing target, and tag information tagging the polygon model 54 as a non-applicable drawing target.
 なお、適用描画対象と不適用描画対象とのいずれであるかを判定する基準は、描画対象のサイズの大きさに限らない。例えば、ユーザーは、輝度、重要度などを基準に適用描画対象と不適用描画対象とのいずれであるかを判定してもよい。また、ユーザーは、描画サイズの大きさ、輝度、重要度などを組み合わせたものを基準に適用描画対象と不適用描画対象とのいずれであるかを判定してもよい。具体的には、ユーザーは、重要度が高く描画対象が消えてほしくないと判定した描画対象を適用描画対象、重要度が低い描画対象を不適用描画対象としてもよい。また、ユーザーは、描画対象のサイズが予め定められたサイズ以下で、描画対象の輝度が予め定められた輝度以上であった場合、適用描画対象とし、それ以外の描画対象を不適用描画対象としてもよい。描画対象が適用描画対象と不適用描画対象とのいずれであるかの判断は、ユーザーの好みで判定してもよい。 Note that the criterion for determining whether a drawing target is an applicable drawing target or a non-applicable drawing target is not limited to the size of the drawing target. For example, the user may make the determination based on luminance, importance, or the like, or based on a combination of drawing size, luminance, importance, and so on. Specifically, the user may treat a drawing target judged to be important enough that it should not disappear as an applicable drawing target, and a drawing target of low importance as a non-applicable drawing target. The user may also treat a drawing target as an applicable drawing target when its size is no larger than a predetermined size and its luminance is no lower than a predetermined luminance, and treat all other drawing targets as non-applicable drawing targets. The determination may also simply follow the user's preference.
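As a concrete illustration of the tagging criteria described above, the decision can be sketched as a small function. This is only a sketch: the field names and the size/luminance thresholds below are assumptions for illustration, since the embodiment deliberately leaves the criteria to the user.

```python
from dataclasses import dataclass

@dataclass
class DrawTarget:
    name: str
    size_px: float      # approximate on-screen size in pixels (assumed field)
    luminance: float    # 0.0 .. 1.0 (assumed field)
    important: bool     # user-assigned importance (assumed field)

def tag(target: DrawTarget,
        size_threshold: float = 2.0,    # illustrative threshold
        lum_threshold: float = 0.5) -> str:
    # Important targets, or small-and-bright targets, get detailed drawing.
    if target.important:
        return "applicable"
    if target.size_px <= size_threshold and target.luminance >= lum_threshold:
        return "applicable"
    return "not_applicable"

print(tag(DrawTarget("polygon model 51", 0.4, 0.8, False)))   # applicable
print(tag(DrawTarget("polygon model 54", 40.0, 0.8, False)))  # not_applicable
```

In the embodiment the tag is supplied by the user through the input device 4; this function merely mirrors the size/luminance/importance combinations the text mentions as possible criteria.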
 図3に戻って、入力装置4は、描画情報とタグ情報とを描画処理装置100に送信し、ステップS2に進む。 Returning to FIG. 3, the input device 4 transmits the drawing information and the tag information to the drawing processing device 100, and proceeds to step S2.
 ステップS2において、描画処理装置100は、入力装置4から描画情報とタグ情報とを受信する。描画処理装置100は、受信した描画情報とタグ情報とを用いて、描画処理を行う。詳細については、後述する。描画処理装置100は、描画先バッファに描画されたシーンを出力するように出力装置5を制御し、ステップS3に進む。 In step S2, the drawing processing device 100 receives drawing information and tag information from the input device 4. The drawing processing device 100 performs drawing processing using the received drawing information and tag information. Details will be described later. The drawing processing device 100 controls the output device 5 to output the scene drawn in the drawing destination buffer, and proceeds to step S3.
 ステップS3において、出力装置5は、描画処理装置100の制御によって、描画先バッファに描画されたシーンをディスプレイなどの表示装置に出力する。 In step S3, the output device 5 outputs the scene drawn in the drawing destination buffer to a display device such as a display under the control of the drawing processing device 100. FIG.
 ステップS3を実行した後もステップS1に戻り、電源をOFFにすること、あるいは終了操作がなされる等の処理の終了のトリガーがあるまで上記のような処理を繰り返す。なお、上記のような処理を繰り返すとしたが、繰り返さず一回行うだけでもよい。 Even after executing step S3, the process returns to step S1, and the above process is repeated until there is a trigger to end the process, such as turning off the power or performing an end operation. Although the above processing is repeated, it may be performed only once.
 次に、描画処理装置100の動作について説明する。 Next, the operation of the rendering processing device 100 will be described.
 図5は、実施の形態1に係る描画処理装置100の動作を示すフローチャートである。図5を用いて、描画処理装置100の動作を以下に説明する。 FIG. 5 is a flow chart showing the operation of the rendering processing device 100 according to the first embodiment. The operation of the rendering processing apparatus 100 will be described below with reference to FIG.
 ステップS11において、取得部101は、入力装置4から描画情報とタグ情報とを取得する。取得部101は、描画情報とタグ情報とを作成部102に送信し、ステップS12に進む。 In step S11, the acquisition unit 101 acquires the drawing information and the tag information from the input device 4. The acquisition unit 101 transmits the drawing information and the tag information to the creation unit 102, and the process proceeds to step S12.
 ステップS12において、作成部102は、取得部101から描画情報とタグ情報とを受信し、描画先バッファ記憶部103に描画先バッファを作成する。また、作成部102は、詳細バッファ記憶部109に詳細バッファを作成する。当該詳細バッファには、適用描画対象すべてが描画されることになる。なお、詳細バッファは、描画先バッファと同じ解像度である。 In step S12, the creation unit 102 receives the drawing information and the tag information from the acquisition unit 101 and creates a drawing destination buffer in the drawing destination buffer storage unit 103. The creation unit 102 also creates a detail buffer in the detail buffer storage unit 109. All applicable drawing targets will be drawn in this detail buffer. Note that the detail buffer has the same resolution as the drawing destination buffer.
 図6は、実施の形態1の描画先バッファ55及び詳細バッファ56の図である。実施の形態1では、描画先バッファ55は、縦が8ピクセルで、横が8ピクセルの合計64ピクセルとする。詳細バッファ56も、縦が8ピクセルで、横が8ピクセルの合計64ピクセルとする。つまり、詳細バッファ56は、描画先バッファ55と同じ解像度となる。 FIG. 6 is a diagram of the drawing destination buffer 55 and the detail buffer 56 of the first embodiment. In the first embodiment, the drawing destination buffer 55 is 8 pixels high by 8 pixels wide, for a total of 64 pixels. The detail buffer 56 is likewise 8 pixels high by 8 pixels wide, for a total of 64 pixels. That is, the detail buffer 56 has the same resolution as the drawing destination buffer 55.
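The two buffers of step S12 can be modelled, for illustration, as same-resolution arrays of RGBA pixels. This is an assumed in-memory layout, not the embodiment's actual buffer format:

```python
def make_buffer(width: int, height: int):
    # One RGBA tuple per pixel, initialised to transparent black.
    return [[(0.0, 0.0, 0.0, 0.0) for _ in range(width)] for _ in range(height)]

dest_buffer = make_buffer(8, 8)    # drawing destination buffer 55 (first buffer)
detail_buffer = make_buffer(8, 8)  # detail buffer 56 (second buffer)

# Both buffers have the same resolution, 8 x 8 = 64 pixels, as in FIG. 6.
assert len(dest_buffer) == len(detail_buffer) == 8
assert all(len(row) == 8 for row in dest_buffer)
```

Storing an alpha channel alongside RGB in the detail buffer matters later, because step S16 records each applicable drawing target's alpha value in the detail buffer for the blend step.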
 図5に戻って、作成部102は、描画情報とタグ情報とを適用判定部106に送信し、ステップS13に進む。 Returning to FIG. 5, the creation unit 102 transmits the drawing information and the tag information to the application determination unit 106, and proceeds to step S13.
 ステップS13において、適用判定部106は、作成部102から描画情報とタグ情報とを受信し、タグ情報を用いて、描画対象が適用描画対象と不適用描画対象とのいずれであるかを判定する。適用判定部106は、描画情報と、不適用描画対象である描画対象のタグ情報のすべてとを不適用描画対象描画部107に送信する。不適用描画対象描画部107は、適用判定部106から描画情報と、不適用描画対象である描画対象のタグ情報とを受信し、描画先バッファ記憶部103の描画先バッファに不適用描画対象を描画する。 In step S13, the application determination unit 106 receives the drawing information and the tag information from the creation unit 102 and uses the tag information to determine whether each drawing target is an applicable drawing target or a non-applicable drawing target. The application determination unit 106 transmits the drawing information and all of the tag information of the drawing targets that are non-applicable drawing targets to the non-applicable drawing target drawing unit 107. The non-applicable drawing target drawing unit 107 receives the drawing information and the tag information of the non-applicable drawing targets from the application determination unit 106, and draws the non-applicable drawing targets in the drawing destination buffer of the drawing destination buffer storage unit 103.
 実施の形態1では、適用判定部106は、図4の描画情報に対して、描画情報と、不適用描画対象であるとタグ付けされたポリゴンモデル54のタグ情報とを不適用描画対象描画部107に送信する。不適用描画対象描画部107は、適用判定部106から描画情報と、不適用描画対象であるとタグ付けされたポリゴンモデル54のタグ情報とを受信し、描画先バッファ記憶部103の描画先バッファ55にポリゴンモデル54を描画する。 In the first embodiment, for the drawing information of FIG. 4, the application determination unit 106 transmits the drawing information and the tag information of the polygon model 54, which is tagged as a non-applicable drawing target, to the non-applicable drawing target drawing unit 107. The non-applicable drawing target drawing unit 107 receives the drawing information and the tag information of the polygon model 54 tagged as a non-applicable drawing target from the application determination unit 106, and draws the polygon model 54 in the drawing destination buffer 55 of the drawing destination buffer storage unit 103.
 図7は、不適用描画対象のポリゴンモデル54を描画先バッファ55に描画した描画結果57と不適用描画対象のポリゴンモデル54とを重畳した図である。図7の描画結果57のように、不適用描画対象描画部107は、ポリゴンモデル54の形に類似するように描画先バッファ55のピクセルを描画情報に含まれる描画対象に付随したRGB値で塗りつぶす。 FIG. 7 is a diagram in which the drawing result 57, obtained by drawing the non-applicable drawing target polygon model 54 in the drawing destination buffer 55, is superimposed on the polygon model 54. As in the drawing result 57 of FIG. 7, the non-applicable drawing target drawing unit 107 fills the pixels of the drawing destination buffer 55 with the RGB values associated with the drawing target included in the drawing information, so as to approximate the shape of the polygon model 54.
 図8は、不適用描画対象のポリゴンモデル54を描画先バッファ55に描画した描画結果57を示した図である。図8は図7のポリゴンモデル54の重畳表示をなくし、実際の描画結果57のみを示している。 FIG. 8 is a diagram showing a drawing result 57 obtained by drawing the non-applied drawing target polygon model 54 in the drawing destination buffer 55 . FIG. 8 eliminates the superimposed display of the polygon model 54 of FIG. 7 and shows only the actual rendering result 57 .
 図5に戻って、不適用描画対象描画部107が描画先バッファに不適用描画対象を描画し終えたら、不適用描画対象描画部107は、描画先バッファへの描画の終了の通知を適用判定部106とブレンド処理部110に送信し、ステップS14に進む。 Returning to FIG. 5, when the non-applicable drawing target drawing unit 107 finishes drawing the non-applicable drawing targets in the drawing destination buffer, it transmits a notification of the completion of drawing to the drawing destination buffer to the application determination unit 106 and the blend processing unit 110, and the process proceeds to step S14.
 ステップS14において、適用判定部106は、不適用描画対象描画部107から描画先バッファへの描画の終了の通知を受信すると、描画情報と、適用描画対象である描画対象のタグ情報とを算出部104に送信する。 In step S14, upon receiving the notification of the completion of drawing to the drawing destination buffer from the non-applicable drawing target drawing unit 107, the application determination unit 106 transmits the drawing information and the tag information of the drawing targets that are applicable drawing targets to the calculation unit 104.
 実施の形態1では、適用判定部106は、図4の描画情報に対して、描画情報と、適用描画対象であるとタグ付けされたポリゴンモデル51のタグ情報と、適用描画対象であるとタグ付けされたポリゴンモデル52のタグ情報と、適用描画対象であるとタグ付けされたポリゴンモデル53のタグ情報とを算出部104に送信する。 In the first embodiment, for the drawing information of FIG. 4, the application determination unit 106 transmits to the calculation unit 104 the drawing information, the tag information of the polygon model 51 tagged as an applicable drawing target, the tag information of the polygon model 52 tagged as an applicable drawing target, and the tag information of the polygon model 53 tagged as an applicable drawing target.
 算出部104は、適用判定部106から描画情報と、適用描画対象である描画対象のタグ情報とを受信し、適用描画対象の面積を算出する。面積の算出方法は、適用描画対象ごとの面積を正確に計算してもよいし、バウンディングボックスなどを用いて近似して、適用描画対象ごとの面積の概算をしてもよい。なお、3次元の適用描画対象の場合は、適用描画対象を2次元に投影変換してから、適用描画対象の面積を計算する。 The calculation unit 104 receives the drawing information and the tag information of the drawing targets that are applicable drawing targets from the application determination unit 106, and calculates the area of each applicable drawing target. As for the calculation method, the area of each applicable drawing target may be computed exactly, or it may be approximated, for example using a bounding box, to obtain a rough estimate. In the case of a three-dimensional applicable drawing target, the applicable drawing target is first projected into two dimensions, and its area is then calculated.
 図9は、バウンディングボックス80を用いて3次元の描画対象81を2次元に投影変換して描画対象81の面積を概算することを示した概念図である。例えば、描画対象81の面積を概算する方法としては、家の形をした3次元の描画対象81をバウンディングボックス80で囲み、2次元に投影変換したい面に対応するバウンディングボックス80の面積82を3次元の描画対象81を2次元に投影変換したときの描画対象81の面積の概算値とする。 FIG. 9 is a conceptual diagram showing that the area of a three-dimensional drawing target 81 is estimated by projecting it into two dimensions using a bounding box 80. For example, as a method of estimating the area of the drawing target 81, the house-shaped three-dimensional drawing target 81 is enclosed in the bounding box 80, and the area 82 of the face of the bounding box 80 corresponding to the plane of the two-dimensional projection is taken as an approximate value of the area of the drawing target 81 when the three-dimensional drawing target 81 is projected into two dimensions.
 なお、図9は2次元に投影変換したい面にちょうどバウンディングボックス80の1面が重なっていたが、バウンディングボックス80の1面が重ならなくても、2次元に投影変換したい面にバウンディングボックス80を投影変換すれば、3次元の描画対象81を2次元に投影変換したときの描画対象81の面積の概算値は求まる。また、図9は平行投影について説明したが、透視投影などでも適用可能である。 Note that, in FIG. 9, one face of the bounding box 80 happens to coincide exactly with the plane of the two-dimensional projection; even when no face of the bounding box 80 coincides with that plane, projecting the bounding box 80 onto it still yields an approximate value of the area of the drawing target 81 when the three-dimensional drawing target 81 is projected into two dimensions. Although FIG. 9 illustrates parallel projection, the method is also applicable to perspective projection and the like.
 このように、バウンディングボックス80を用いて3次元の描画対象81を2次元に投影変換して描画対象81の面積を概算することで、描画対象81をそのまま2次元に投影変換して描画対象81の面積を求めるよりも描画処理負荷を低減することができる。なお、描画対象81が複雑な形ほど描画処理負荷が高くなるため、描画対象81が複雑な形ほどバウンディングボックス80を用いて3次元の描画対象81を2次元に投影変換して描画対象81の面積を概算すると描画処理負荷の低減の効果が高くなる。 In this way, by using the bounding box 80 to project the three-dimensional drawing target 81 into two dimensions and estimating its area, the drawing processing load can be reduced compared with projecting the drawing target 81 itself into two dimensions and computing its exact area. Since the drawing processing load grows as the shape of the drawing target 81 becomes more complex, the more complex the shape, the greater the load reduction obtained by estimating the area with the bounding box 80.
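A minimal sketch of the bounding-box approximation described above, assuming parallel (orthographic) projection onto the XY plane and an axis-aligned bounding box derived directly from the model's vertices; the vertex data is illustrative, not taken from FIG. 9:

```python
def bbox_projected_area(vertices):
    """Approximate the projected area (in pixels) of a 3-D model from its
    axis-aligned bounding box, dropping the Z coordinate (parallel projection
    onto the XY plane)."""
    xs = [v[0] for v in vertices]
    ys = [v[1] for v in vertices]
    return (max(xs) - min(xs)) * (max(ys) - min(ys))

# A house-shaped model reduced to a few representative vertices (illustrative).
house = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.5), (1.0, 0.8, 0.0),
         (0.0, 0.8, 0.5), (0.5, 1.2, 0.25)]
print(bbox_projected_area(house))  # 1.2 (= 1.0 wide x 1.2 high)
```

As the text notes, this replaces a full projection and rasterisation of a possibly complex mesh with a min/max over the vertices, so the saving grows with model complexity; for perspective projection the eight bounding-box corners would first be projected individually.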
 図5に戻って、実施の形態1では、算出部104は、適用判定部106から描画情報と、適用描画対象であるとタグ付けされたポリゴンモデル51のタグ情報と、適用描画対象であるとタグ付けされたポリゴンモデル52のタグ情報と、適用描画対象であるとタグ付けされたポリゴンモデル53のタグ情報とを受信する。 Returning to FIG. 5, in the first embodiment, the calculation unit 104 receives from the application determination unit 106 the drawing information, the tag information of the polygon model 51 tagged as an applicable drawing target, the tag information of the polygon model 52 tagged as an applicable drawing target, and the tag information of the polygon model 53 tagged as an applicable drawing target.
 算出部104は、ポリゴンモデル51~53の面積を算出する。算出部104は、例えば、ポリゴンモデル51の面積を0.4ピクセル、ポリゴンモデル52の面積を0.7ピクセル、ポリゴンモデル53の面積を1.2ピクセルと算出したとする。以降は、算出部104が算出した適用描画対象の面積を示した情報を面積情報と呼ぶ。 The calculation unit 104 calculates the areas of the polygon models 51-53. For example, assume that the calculation unit 104 calculates the area of the polygon model 51 to be 0.4 pixels, the area of the polygon model 52 to be 0.7 pixels, and the area of the polygon model 53 to be 1.2 pixels. Hereinafter, information indicating the area of the applicable drawing target calculated by the calculation unit 104 will be referred to as area information.
 算出部104は、描画情報と、適用描画対象である描画対象のタグ情報と、適用描画対象の面積情報とを決定部105に送信し、ステップS15に進む。 The calculating unit 104 transmits the drawing information, the tag information of the drawing target that is the applicable drawing target, and the area information of the applicable drawing target to the determining unit 105, and proceeds to step S15.
 ステップS15において、決定部105は、算出部104から描画情報と、適用描画対象である描画対象のタグ情報と、適用描画対象の面積情報とを受信する。決定部105は、面積情報を用いて、適用描画対象の透過度を表すアルファ値を決定する。具体的には、決定部105は、面積情報から、適用描画対象の面積が1.0ピクセル以下の面積の場合は、当該面積を適用描画対象のアルファ値に決定する。一方、決定部105は、面積情報から、適用描画対象の面積が1.0ピクセルよりも大きい面積の場合は、1.0を適用描画対象のアルファ値に決定する。 In step S15, the determination unit 105 receives the drawing information, the tag information of the drawing targets that are applicable drawing targets, and the area information of the applicable drawing targets from the calculation unit 104. The determination unit 105 uses the area information to determine an alpha value representing the transparency of each applicable drawing target. Specifically, when the area information shows that the area of an applicable drawing target is 1.0 pixel or less, the determination unit 105 sets that area as the alpha value of the applicable drawing target. On the other hand, when the area information shows that the area of an applicable drawing target is larger than 1.0 pixel, the determination unit 105 sets 1.0 as its alpha value.
 実施の形態1では、決定部105は、算出部104から描画情報と、適用描画対象であるとタグ付けされたポリゴンモデル51のタグ情報と、適用描画対象であるとタグ付けされたポリゴンモデル52のタグ情報と、適用描画対象であるとタグ付けされたポリゴンモデル53のタグ情報と、ポリゴンモデル51の面積が0.4ピクセルであるという面積情報と、ポリゴンモデル52の面積が0.7ピクセルであるという面積情報と、ポリゴンモデル53の面積が1.2ピクセルであるという面積情報とを受信する。 In the first embodiment, the determination unit 105 receives from the calculation unit 104 the drawing information, the tag information of the polygon model 51 tagged as an applicable drawing target, the tag information of the polygon model 52 tagged as an applicable drawing target, the tag information of the polygon model 53 tagged as an applicable drawing target, area information indicating that the area of the polygon model 51 is 0.4 pixels, area information indicating that the area of the polygon model 52 is 0.7 pixels, and area information indicating that the area of the polygon model 53 is 1.2 pixels.
 決定部105は、ポリゴンモデル51の面積情報を用いて、ポリゴンモデル51のアルファ値を決定する。具体的には、決定部105は、面積情報から、ポリゴンモデル51の面積が1.0ピクセル以下である0.4ピクセルであることが分かるため、ポリゴンモデル51のアルファ値を0.4に決定する。 The determination unit 105 determines the alpha value of the polygon model 51 using the area information of the polygon model 51. Specifically, since the area information shows that the area of the polygon model 51 is 0.4 pixels, which is no more than 1.0 pixel, the determination unit 105 sets the alpha value of the polygon model 51 to 0.4.
 決定部105は、ポリゴンモデル52の面積情報を用いて、ポリゴンモデル52のアルファ値を決定する。具体的には、決定部105は、面積情報から、ポリゴンモデル52の面積が1.0ピクセル以下である0.7ピクセルであることが分かるため、ポリゴンモデル52のアルファ値を0.7に決定する。 The determination unit 105 determines the alpha value of the polygon model 52 using the area information of the polygon model 52. Specifically, since the area information shows that the area of the polygon model 52 is 0.7 pixels, which is no more than 1.0 pixel, the determination unit 105 sets the alpha value of the polygon model 52 to 0.7.
 決定部105は、ポリゴンモデル53の面積情報を用いて、ポリゴンモデル53のアルファ値を決定する。具体的には、決定部105は、面積情報から、ポリゴンモデル53の面積が1.0ピクセルよりも大きい1.2ピクセルであることが分かるため、ポリゴンモデル53のアルファ値を1.0に決定する。以降は、決定部105が決定した適用描画対象のアルファ値を示した情報をアルファ値情報と呼ぶ。 The determination unit 105 determines the alpha value of the polygon model 53 using the area information of the polygon model 53. Specifically, since the area information shows that the area of the polygon model 53 is 1.2 pixels, which is larger than 1.0 pixel, the determination unit 105 sets the alpha value of the polygon model 53 to 1.0. Hereinafter, information indicating the alpha value of an applicable drawing target determined by the determination unit 105 will be referred to as alpha value information.
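The rule of step S15 can be written as a one-line function; this is an illustrative sketch, not part of the claimed embodiment:

```python
def alpha_from_area(area_px: float) -> float:
    # Alpha equals the projected area (in pixels) when the area is at most
    # 1.0 pixel, and saturates at 1.0 otherwise.
    return min(area_px, 1.0)

# Reproduces the worked values above.
print(alpha_from_area(0.4))  # 0.4 (polygon model 51)
print(alpha_from_area(0.7))  # 0.7 (polygon model 52)
print(alpha_from_area(1.2))  # 1.0 (polygon model 53)
```

The saturation at 1.0 reflects that a target covering a full pixel or more is drawn fully opaque; only sub-pixel targets receive partial transparency proportional to their coverage.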
 決定部105は、描画情報と、適用描画対象である描画対象のタグ情報と、適用描画対象のアルファ値情報とを適用描画対象描画部108に送信し、ステップS16に進む。 The determining unit 105 transmits the drawing information, the tag information of the drawing target that is the applicable drawing target, and the alpha value information of the applicable drawing target to the applicable drawing target drawing unit 108, and proceeds to step S16.
 In step S16, the applicable drawing target drawing unit 108 receives the drawing information, the tag information of the drawing targets that are applicable drawing targets, and the alpha value information of the applicable drawing targets from the determination unit 105, and draws the applicable drawing targets in the detail buffer of the detail buffer storage unit 109. At this time, the alpha value information is also stored in the detail buffer. In addition, each time it draws an applicable drawing target in the detail buffer, the applicable drawing target drawing unit 108 acquires the number of pixels drawn in the detail buffer.
 In the first embodiment, the applicable drawing target drawing unit 108 receives from the determination unit 105 the drawing information, the tag information of each of the polygon models 51, 52, and 53 tagged as applicable drawing targets, and the alpha value information indicating that the alpha values of the polygon models 51, 52, and 53 are 0.4, 0.7, and 1.0, respectively.
 The applicable drawing target drawing unit 108 draws the polygon model 51 in the detail buffer 56 of the detail buffer storage unit 109.
 FIG. 10 is a diagram in which the drawing result 58, obtained by drawing the applicable drawing target polygon model 51 in the detail buffer 56, is superimposed on the polygon models 51 to 53. Since the polygon model 51 contains the center of one pixel of the detail buffer 56, that pixel is filled with the RGB values associated with the polygon model 51 included in the drawing information, and the polygon model 51 is thereby drawn. The result of drawing the polygon model 51 is the one-pixel drawing result 58 in FIG. 10.
 FIG. 11 is a diagram showing the drawing result 58 obtained by drawing the applicable drawing target polygon model 51 in the detail buffer 56. FIG. 11 omits the superimposed display of the polygon models 51 to 53 in FIG. 10 and shows only the actual drawing result 58.
 Based on the alpha value information of the polygon model 51, the applicable drawing target drawing unit 108 stores the alpha value of 0.4 of the polygon model 51 in the pixel of the drawing result 58. The value inside the pixel in FIG. 11 represents the stored alpha value.
 After performing the process of drawing the polygon model 51 in the detail buffer 56, the applicable drawing target drawing unit 108 acquires the number of pixels of the detail buffer 56 actually filled by the polygon model 51. In the case of the polygon model 51, as can be seen from FIG. 11, the applicable drawing target drawing unit 108 acquires the information that the number of pixels actually drawn in the detail buffer 56 is one pixel.
 Similarly, the applicable drawing target drawing unit 108 draws the polygon model 52 in the detail buffer 56 of the detail buffer storage unit 109.
 However, since the polygon model 52 does not contain the center of any pixel of the detail buffer 56, it is not drawn and disappears. Therefore, even after the applicable drawing target drawing unit 108 performs the process of drawing the polygon model 52 in the detail buffer 56, the result remains as shown in FIGS. 10 and 11, in which only the polygon model 51 is drawn.
 After performing the process of drawing the polygon model 52 in the detail buffer 56, the applicable drawing target drawing unit 108 acquires the number of pixels of the detail buffer 56 actually filled by the polygon model 52. In the case of the polygon model 52, as can be seen from FIG. 11, not a single pixel of the detail buffer 56 is filled by the polygon model 52, so the applicable drawing target drawing unit 108 acquires the information that the number of pixels is zero.
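 The behavior above follows from the pixel-center rule: a pixel is filled only if its center lies inside the drawing target, so a sub-pixel polygon that covers no pixel center rasterizes to zero pixels. The following is an illustrative sketch of this coverage test for a triangle (hypothetical names; pixel centers assumed at (x + 0.5, y + 0.5)):

```python
def rasterize_triangle(verts, width, height):
    """Return the pixels whose centers lie inside the triangle `verts`."""
    def edge(a, b, p):
        # Signed-area test: which side of edge a->b the point p lies on.
        return (b[0] - a[0]) * (p[1] - a[1]) - (b[1] - a[1]) * (p[0] - a[0])

    filled = []
    for y in range(height):
        for x in range(width):
            c = (x + 0.5, y + 0.5)  # pixel center
            d = [edge(verts[i], verts[(i + 1) % 3], c) for i in range(3)]
            # Inside if the center is on the same side of all three edges.
            if all(v >= 0 for v in d) or all(v <= 0 for v in d):
                filled.append((x, y))
    return filled
```

A sub-pixel triangle that contains one pixel center is drawn as one pixel, like the polygon model 51; one that contains no pixel center yields zero pixels and disappears, like the polygon model 52.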
 Next, the applicable drawing target drawing unit 108 draws the polygon model 53 in the detail buffer 56 of the detail buffer storage unit 109.
 FIG. 12 is a diagram in which the drawing result 58 of drawing the applicable drawing target polygon model 51 in the detail buffer 56 and the drawing result 59 of drawing another applicable drawing target, the polygon model 53, in the detail buffer 56 are superimposed on the polygon models 51 to 53. Since the polygon model 53 contains the center of one pixel of the detail buffer 56, that pixel is filled with the RGB values associated with the polygon model 53 included in the drawing information, and the polygon model 53 is thereby drawn. The result of drawing the polygon model 53 is the one-pixel drawing result 59 in FIG. 12.
 FIG. 13 is a diagram showing the drawing result 58 of drawing the applicable drawing target polygon model 51 in the detail buffer 56 and the drawing result 59 of drawing another applicable drawing target, the polygon model 53, in the detail buffer 56. FIG. 13 omits the superimposed display of the polygon models 51 to 53 in FIG. 12 and shows only the actual drawing results 58 and 59.
 Based on the alpha value information of the polygon model 53, the applicable drawing target drawing unit 108 stores the alpha value of 1.0 of the polygon model 53 in the pixel of the drawing result 59. The value inside each pixel in FIG. 13 represents the stored alpha value.
 After performing the process of drawing the polygon model 53 in the detail buffer 56, the applicable drawing target drawing unit 108 also acquires the number of pixels of the detail buffer 56 actually filled by the polygon model 53. In the case of the polygon model 53, as can be seen from FIG. 13, the applicable drawing target drawing unit 108 acquires the information that the number of pixels actually drawn in the detail buffer 56 is one pixel. Hereinafter, the information acquired by the applicable drawing target drawing unit 108 that indicates the number of pixels drawn in the detail buffer by an applicable drawing target is referred to as pixel count information.
 Returning to FIG. 5, the applicable drawing target drawing unit 108 transmits the drawing information, the tag information of the drawing targets that are applicable drawing targets, the alpha value information, and the pixel count information to the dot printing unit 112, and the process proceeds to step S17.
 In step S17, the dot printing unit 112 receives the drawing information, the tag information of the drawing targets that are applicable drawing targets, the alpha value information, and the pixel count information from the applicable drawing target drawing unit 108. From the pixel count information, the dot printing unit 112 identifies any lost applicable drawing target, that is, an applicable drawing target that has disappeared without being drawn by the applicable drawing target drawing unit 108, places dots on the pixels in the detail buffer corresponding to the position of the lost applicable drawing target, and thereby draws the lost applicable drawing target in the detail buffer. At this time, the alpha value of the alpha value information is also stored in the detail buffer.
 In the first embodiment, the dot printing unit 112 receives from the applicable drawing target drawing unit 108 the drawing information, the tag information of each of the polygon models 51, 52, and 53 tagged as applicable drawing targets, the alpha value information indicating that the alpha values of the polygon models 51, 52, and 53 are 0.4, 0.7, and 1.0, respectively, and the pixel count information indicating that the numbers of pixels drawn in the detail buffer 56 by the polygon models 51, 52, and 53 are one pixel, zero pixels, and one pixel, respectively.
 From the pixel count information, the dot printing unit 112 identifies an applicable drawing target whose pixel count is zero as a lost applicable drawing target, that is, an applicable drawing target that has disappeared without being drawn by the applicable drawing target drawing unit 108. Specifically, since the pixel count information of the polygon model 52 indicates zero pixels, the dot printing unit 112 identifies the polygon model 52 as a lost applicable drawing target.
 The dot printing unit 112 places dots on the pixels in the detail buffer 56 corresponding to the position of the polygon model 52, which is the lost applicable drawing target, and thereby draws the lost applicable drawing target in the detail buffer. Specifically, the dot printing unit 112 places dots on the pixels in the detail buffer 56 corresponding to the positions of the vertices of the polygon model 52. Here, when a two-dimensional or higher-dimensional drawing target is drawn, a pixel of the buffer is filled, and the drawing target is drawn, only if the center of that pixel is contained in the drawing target; when a one-dimensional drawing target, that is, a point, is drawn, the pixel on which the point is placed is filled, and the drawing target is drawn. Therefore, by placing dots on the pixels in the detail buffer 56 corresponding to the positions of the vertices of the polygon model 52, the dot printing unit 112 can draw the polygon model 52, the lost applicable drawing target, in the detail buffer 56. For a drawing target without vertices, such as a circle or an ellipse, the dot printing unit 112 approximates it with a polygon and places dots at the positions of the polygon's vertices.
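 The dotting step described above can be sketched as follows (an illustrative sketch only; the names are hypothetical, and the detail buffer is modeled as a dictionary keyed by integer pixel coordinates):

```python
def dot_lost_target(vertices, detail_buffer, rgb, alpha, drawn_pixel_count):
    # Only a lost applicable drawing target (pixel count of zero) is dotted;
    # otherwise the target has already been drawn in the detail buffer.
    if drawn_pixel_count > 0:
        return
    for (x, y) in vertices:
        # Drawing a point fills the pixel the point lands in,
        # regardless of the pixel-center rule.
        detail_buffer[(int(x), int(y))] = (rgb, alpha)
```

Applied to a triangle like the polygon model 52, this fills up to three pixels, one per vertex, each storing the target's RGB values and alpha value.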
 FIG. 14 is a diagram in which the drawing result 58 of drawing the applicable drawing target polygon model 51 in the detail buffer 56, the drawing result 59 of drawing another applicable drawing target, the polygon model 53, in the detail buffer 56, and the drawing result 60 of drawing yet another applicable drawing target, the polygon model 52, in the detail buffer 56 are superimposed on the polygon models 51 to 53. Since the polygon model 52 has three vertices, the dot printing unit 112 places dots on the pixels in the detail buffer 56 corresponding to the positions of the three vertices of the polygon model 52. As a result, the three pixels of the detail buffer 56 corresponding to the three vertices of the polygon model 52 are filled with the RGB values associated with the polygon model 52 included in the drawing information, and the polygon model 52 is drawn. The result of drawing the polygon model 52 is the three-pixel drawing result 60 in FIG. 14.
 FIG. 15 is a diagram showing the drawing result 58 of drawing the applicable drawing target polygon model 51 in the detail buffer 56, the drawing result 59 of drawing another applicable drawing target, the polygon model 53, in the detail buffer 56, and the drawing result 60 of drawing yet another applicable drawing target, the polygon model 52, in the detail buffer 56. FIG. 15 omits the superimposed display of the polygon models 51 to 53 in FIG. 14 and shows only the actual drawing results 58 to 60.
 Based on the alpha value information of the polygon model 52, the dot printing unit 112 stores the alpha value of 0.7 of the polygon model 52 in each pixel constituting the drawing result 60. The value inside each pixel in FIG. 15 represents the stored alpha value.
 Returning to FIG. 5, the dot printing unit 112 transmits a notification of the completion of drawing to the detail buffer to the blend processing unit 110, and the process proceeds to step S18. If there is no lost applicable drawing target, the applicable drawing targets have already been drawn in the detail buffer, so the dot printing unit 112 does nothing further, transmits the notification of the completion of drawing to the detail buffer to the blend processing unit 110, and the process proceeds to step S18.
 In step S18, upon receiving the notification of the completion of drawing to the drawing destination buffer from the non-applicable drawing target drawing unit 107 and the notification of the completion of drawing to the detail buffer from the dot printing unit 112, the blend processing unit 110 draws the applicable drawing targets in the drawing destination buffer stored in the drawing destination buffer storage unit 103 by alpha blending, pixel by pixel, the RGB values of the detail buffer stored in the detail buffer storage unit 109 with the alpha value stored for each pixel.
 In the first embodiment, the blend processing unit 110 draws the applicable drawing targets, the polygon models 51 to 53, in the drawing destination buffer 55 of FIG. 8 by alpha blending the RGB values with the alpha value for each pixel of the detail buffer 56 of FIG. 15. Specifically, for the one pixel of the drawing result 58 of the polygon model 51 in FIG. 15, the blend processing unit 110 alpha blends the RGB values of the polygon model 51 into the drawing destination buffer 55 of FIG. 8 stored in the drawing destination buffer storage unit 103 with the alpha value of 0.4. Similarly, for each pixel of the drawing result 60 of the polygon model 52 in FIG. 15, the blend processing unit 110 alpha blends the RGB values of the polygon model 52 into the drawing destination buffer 55 with the alpha value of 0.7 stored in each pixel. For the one pixel of the drawing result 59 of the polygon model 53 in FIG. 15, the blend processing unit 110 alpha blends the RGB values of the polygon model 53 into the drawing destination buffer 55 with the alpha value of 1.0.
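 Per pixel, the blend in step S18 is ordinary alpha blending of the detail-buffer color over the destination-buffer color. An illustrative sketch of the standard per-channel formula (the function name is hypothetical):

```python
def alpha_blend(dst_rgb, src_rgb, alpha):
    # out = src * alpha + dst * (1 - alpha), applied channel by channel.
    return tuple(s * alpha + d * (1.0 - alpha) for s, d in zip(src_rgb, dst_rgb))
```

With an alpha value of 1.0, as for the polygon model 53, the source color replaces the destination color; with an alpha value of 0.4, as for the polygon model 51, the result is 40% source color and 60% background, which is what makes a sub-pixel target appear appropriately faint.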
 FIG. 16 is a diagram showing the result of drawing all the polygon models in the drawing destination buffer 55. The drawing result 61 is obtained by drawing the polygon model 51 in the drawing destination buffer 55. The drawing result 61 is the result of alpha blending the RGB values of the polygon model 51 into the drawing destination buffer 55 of FIG. 8 with the alpha value of 0.4 for the one pixel of the drawing result 58 of the polygon model 51 in FIG. 15; in FIG. 16, it is represented by vertical stripes.
 The drawing result 62 is obtained by drawing the polygon model 52 in the drawing destination buffer 55. The drawing result 62 is the result of alpha blending the RGB values of the polygon model 52 into the drawing destination buffer 55 of FIG. 8 with the alpha value of 0.7 for each of the three pixels of the drawing result 60 of the polygon model 52 in FIG. 15; in FIG. 16, it is represented by diagonal stripes.
 The drawing result 63 is obtained by drawing the polygon model 53 in the drawing destination buffer 55. The drawing result 63 is the result of alpha blending the RGB values of the polygon model 53 into the drawing destination buffer 55 of FIG. 8 with the alpha value of 1.0 for the one pixel of the drawing result 59 of the polygon model 53 in FIG. 15; in FIG. 16, it is represented by horizontal stripes.
 The drawing result 57 is obtained by drawing the polygon model 54 in the drawing destination buffer 55. The drawing result 57 in FIG. 16 is the same as the drawing result 57 in FIG. 8.
 Returning to FIG. 5, the blend processing unit 110 transmits a notification of the completion of alpha blending to the output control unit 111, and the process proceeds to step S19.
 In step S19, the output control unit 111 receives the notification of the completion of alpha blending from the blend processing unit 110 and controls the output device 5 to output the drawing destination buffer stored in the drawing destination buffer storage unit 103.
 In the first embodiment, the output control unit 111 controls the display device, which is the output device 5, to output the scene drawn in the drawing destination buffer 55 stored in the drawing destination buffer storage unit 103.
 After step S19 is executed, the process returns to step S11, and the above processing is repeated until a trigger to end the processing occurs, such as the power being turned off or an end operation being performed. Although the above processing is described as being repeated, it may instead be performed only once.
 As described above, in the drawing processing system 1 of the first embodiment, the non-applicable drawing target drawing unit 107 draws a drawing target determined by the application determination unit 106 to be a non-applicable drawing target in the drawing destination buffer stored in the drawing destination buffer storage unit 103; the calculation unit 104 calculates the area of a drawing target determined by the application determination unit 106 to be an applicable drawing target; the determination unit 105 determines the alpha value of the applicable drawing target from the calculated area; the applicable drawing target drawing unit 108 draws the applicable drawing target in the detail buffer, which is a buffer different from the drawing destination buffer; the dot printing unit 112 draws the applicable drawing target in the detail buffer by placing dots on the pixels in the detail buffer corresponding to the position of the applicable drawing target; the blend processing unit 110 alpha blends the detail buffer into the drawing destination buffer with the alpha value of the applicable drawing target; and the output control unit 111 controls the output device 5 to output the scene drawn in the drawing destination buffer. Therefore, even when a drawing target is not positioned at the center of a pixel, the drawing target is drawn and can be displayed on the display screen.
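 The overall flow summarized above can be condensed into the following illustrative sketch. It is not the claimed implementation: the names, the dictionary-based buffers, and the collapsing of steps S14 to S18 into one function are all assumptions made for illustration.

```python
def render(targets, width, height):
    dest = {}    # drawing destination buffer: pixel -> RGB
    detail = {}  # detail buffer: pixel -> (RGB, alpha)

    def rasterize(verts):
        # Pixel-center coverage test for a triangle.
        def edge(a, b, p):
            return (b[0] - a[0]) * (p[1] - a[1]) - (b[1] - a[1]) * (p[0] - a[0])
        hits = []
        for y in range(height):
            for x in range(width):
                c = (x + 0.5, y + 0.5)
                d = [edge(verts[i], verts[(i + 1) % 3], c) for i in range(3)]
                if all(v >= 0 for v in d) or all(v <= 0 for v in d):
                    hits.append((x, y))
        return hits

    for t in targets:
        if not t["applicable"]:
            for p in rasterize(t["vertices"]):      # step S14: direct draw
                dest[p] = t["rgb"]
            continue
        alpha = min(t["area"], 1.0)                 # step S15: alpha from area
        hits = rasterize(t["vertices"])             # step S16: detail buffer
        if not hits:                                # step S17: lost target
            hits = [(int(x), int(y)) for x, y in t["vertices"]]
        for p in hits:
            detail[p] = (t["rgb"], alpha)

    for p, (rgb, a) in detail.items():              # step S18: alpha blend
        base = dest.get(p, (0, 0, 0))
        dest[p] = tuple(s * a + d * (1.0 - a) for s, d in zip(rgb, base))
    return dest
```

Under these assumptions, a sub-pixel target that covers no pixel center still contributes faint pixels at its vertex positions instead of vanishing.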
Embodiment 2.
 In the first embodiment, in the drawing processing system 1, the input device 4 receives the input of the drawing information and the tag information, the acquisition unit 101 acquires the drawing information and the tag information from the input device 4, and the application determination unit 106 uses the tag information to determine whether a drawing target is an applicable drawing target or a non-applicable drawing target. In the second embodiment, as shown in FIGS. 17 to 19, in the drawing processing system 2, the input device 6 receives only the input of the drawing information, the acquisition unit 201 acquires the drawing information from the input device 6, and the application determination unit 202 determines from the drawing information whether a drawing target is an applicable drawing target or a non-applicable drawing target.
 With this determination, the user does not need to input the tag information into the input device 6, and the drawing processing system 2 automatically determines whether a drawing target is an applicable drawing target or a non-applicable drawing target, which reduces the burden on the user.
 In the first embodiment, the drawing processing system 1 draws the non-applicable drawing targets and the applicable drawing targets collectively, whereas in the second embodiment, the drawing processing system 2 draws the drawing targets one by one. Otherwise, the second embodiment is the same as the first embodiment. In the following description, the configurations and operations described in the first embodiment are denoted by the same reference numerals, and duplicate descriptions are omitted.
 In the second embodiment, as in the first embodiment, a case where the present disclosure is applied when the drawing information, which is the information for drawing the entire scene, includes a drawing target that is not positioned at the center of a pixel is described below as an example.
 FIG. 17 is a block diagram of the drawing processing system 2 including the drawing processing device 200 according to the second embodiment. The drawing processing system 2 includes the input device 6 to which the drawing information is input, the drawing processing device 200 that performs processing for drawing the drawing information using the drawing information, and the output device 5 that outputs the scene processed by the drawing processing device 200.
 In the second embodiment, the input device 6, the acquisition unit 201, the application determination unit 202, the non-applicable drawing target drawing unit 203, the calculation unit 204, the determination unit 205, the applicable drawing target drawing unit 209, the dot printing unit 210, the blend processing unit 207, the end determination unit 206, and the accumulation storage unit 208 are included in the functional block diagram in place of the input device 4, the acquisition unit 101, the application determination unit 106, the non-applicable drawing target drawing unit 107, the calculation unit 104, the determination unit 105, the applicable drawing target drawing unit 108, the dot printing unit 112, and the blend processing unit 110 of FIG. 1 of the first embodiment.
 The input device 6 receives, for example, only the drawing information of the entire scene, which includes information on drawing targets such as polygon models input by the user.
 The drawing processing device 200 includes: the acquisition unit 201 that acquires the drawing information from the input device 6; the creation unit 102 that creates the drawing destination buffer and the detail buffer; the application determination unit 202 that determines from the drawing information, depending on whether detailed drawing processing is to be applied to a drawing target, whether the drawing target is a non-applicable drawing target to be drawn in the drawing destination buffer or an applicable drawing target to be drawn in the detail buffer, which is a buffer different from the drawing destination buffer; the non-applicable drawing target drawing unit 203 that draws, in the drawing destination buffer, a non-applicable drawing target tagged as not subject to the detailed drawing processing; the calculation unit 204 that calculates the area of an applicable drawing target tagged as subject to the detailed drawing processing; the determination unit 205 that determines the alpha value of the applicable drawing target from the calculated area; the applicable drawing target drawing unit 209 that draws the applicable drawing target in the detail buffer and acquires the number of drawn pixels; the dot printing unit 210 that identifies, from the acquired number of drawn pixels, whether the applicable drawing target is a lost applicable drawing target, that is, an applicable drawing target that has disappeared without being drawn in the detail buffer, and, in the case of a lost applicable drawing target, places dots on the pixels in the detail buffer corresponding to the position of the lost applicable drawing target to draw it in the detail buffer; the end determination unit 206 that determines the end of the processing based on whether the drawing information includes a drawing target that has not yet been drawn in the drawing destination buffer or the detail buffer; the blend processing unit 207 that draws the applicable drawing target in the drawing destination buffer by alpha blending the RGB values of the applicable drawing target with the alpha value of the applicable drawing target for the pixels filled in the detail buffer; the output control unit 111 that controls the output device 5 to output the scene drawn in the drawing destination buffer; the drawing destination buffer storage unit 103 that stores the drawing destination buffer; the detail buffer storage unit 109 that stores the detail buffer; and the accumulation storage unit 208 that accumulates and stores the information drawn in the detail buffer. The non-applicable drawing target drawing unit 203 corresponds to a first drawing target drawing unit, and the applicable drawing target drawing unit 209 corresponds to a second drawing target drawing unit.
 Next, the hardware configuration of the drawing processing system 2 according to the second embodiment is described. The hardware configuration diagram of the drawing processing system 2 is the same as FIG. 2 of the first embodiment.
 However, the hardware configuration of the input device 6 is the same as that of the input device 4. The hardware configuration of the acquisition unit 201 is the same as that of the acquisition unit 101. The hardware configuration of the application determination unit 202 is the same as that of the application determination unit 106. The hardware configuration of the non-applicable drawing target drawing unit 203 is the same as that of the non-applicable drawing target drawing unit 107. The hardware configuration of the calculation unit 204 is the same as that of the calculation unit 104. The hardware configuration of the determination unit 205 is the same as that of the determination unit 105. The hardware configuration of the applicable drawing target drawing unit 209 is the same as that of the applicable drawing target drawing unit 108. The hardware configuration of the dot printing unit 210 is the same as that of the dot printing unit 112. The hardware configuration of the blend processing unit 207 is the same as that of the blend processing unit 110. The end determination unit 206 is implemented by a program stored in the graphic memory 45 and is realized by the GPU 44 reading and executing the program loaded into the graphic memory 45. The accumulation storage unit 208 is implemented by the graphic memory 45.
 なお、グラフィックメモリ45は、終了判定部206の機能と他の部の機能とのそれぞれを別々のメモリで実現してもよい。また、蓄積記憶部208と、他の記憶部とのそれぞれを別々のメモリで実現してもよい。それ以外は、実施の形態1のハードウェア構成と同様である。 Note that the graphic memory 45 may implement the function of the end determination unit 206 and the functions of the other units in separate memories. Also, the accumulation storage unit 208 and the other storage units may be implemented in separate memories. Other than that, the hardware configuration is the same as that of the first embodiment.
 次に、描画処理システム2の動作について説明する。 Next, the operation of the drawing processing system 2 will be explained.
 図18は、本開示の実施の形態2に係る描画処理システム2の動作を示すフローチャートである。図18を用いて、描画処理システム2の動作を以下に説明する。 FIG. 18 is a flowchart showing the operation of the drawing processing system 2 according to Embodiment 2 of the present disclosure. The operation of the drawing processing system 2 will be described below with reference to FIG.
 ステップS4において、入力装置6は、描画情報の入力を受け付ける。例えば、ユーザーがマウス、キーボードなどを用いて、描画情報を指定して入力装置6に入力し、入力装置6は、当該入力を受け付ける。 In step S4, the input device 6 accepts input of drawing information. For example, a user uses a mouse, a keyboard, or the like to specify drawing information and input it to the input device 6, and the input device 6 receives the input.
 なお、入力装置6が受け付ける描画情報は、実施の形態1と同様に、ユーザーが指定しなくても、通信を介して受信した描画情報、予め定められた基準を満たしたものを自動的に抽出した描画情報、機械学習で抽出した描画情報など、どのようなものでもよい。実施の形態2では、実施の形態1と同様に、例えば、描画対象を3次元のポリゴンモデルとし、描画情報はポリゴンモデルである描画対象の情報を含んでいるとする。 As in the first embodiment, the drawing information received by the input device 6 need not be specified by the user; it may be any drawing information, such as drawing information received via communication, drawing information automatically extracted as satisfying a predetermined criterion, or drawing information extracted by machine learning. In the second embodiment, as in the first embodiment, it is assumed, for example, that the drawing target is a three-dimensional polygon model and the drawing information includes information on the drawing target, which is the polygon model.
 実施の形態2では、実施の形態1と同様に、図4に本開示を適用した場合を以下に説明する。入力装置6は、描画情報を描画処理装置200に送信し、ステップS5に進む。 In Embodiment 2, as in Embodiment 1, a case where the present disclosure is applied to FIG. 4 will be described below. The input device 6 transmits the drawing information to the drawing processing device 200, and proceeds to step S5.
 ステップS5において、描画処理装置200は、入力装置6から描画情報を受信する。描画処理装置200は、受信した描画情報を用いて、描画処理を行う。詳細については、後述する。描画処理装置200は、描画先バッファに描画されたシーンを出力するように出力装置5を制御し、ステップS6に進む。 In step S5, the drawing processing device 200 receives drawing information from the input device 6. The drawing processing device 200 performs drawing processing using the received drawing information. Details will be described later. The drawing processing device 200 controls the output device 5 to output the scene drawn in the drawing destination buffer, and proceeds to step S6.
 ステップS6において、出力装置5は、実施の形態1のステップS3と同様に、描画処理装置200の制御によって、描画先バッファに描画されたシーンをディスプレイなどの表示装置に出力する。 In step S6, the output device 5 outputs the scene drawn in the drawing destination buffer to a display device such as a display under the control of the drawing processing device 200, as in step S3 of the first embodiment.
 ステップS6を実行した後もステップS4に戻り、電源をOFFにすること、あるいは終了操作がなされる等の処理の終了のトリガーがあるまで上記のような処理を繰り返す。なお、上記のような処理を繰り返すとしたが、繰り返さず一回行うだけでもよい。 Even after step S6 is executed, the process returns to step S4, and the above processing is repeated until there is a trigger to end the processing, such as the power being turned off or an end operation being performed. Although the above processing is described as being repeated, it may instead be performed only once without repetition.
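 なお、以下は本開示の一部ではなく、説明のための仮定に基づくスケッチである。図18のS4~S6の繰り返しは、例えば次のように表せる(関数名はすべて説明用の仮定である)。 Note that the following is not part of the present disclosure but an illustrative sketch under stated assumptions; all function names are hypothetical. The S4-to-S6 loop of FIG. 18, repeated until an exit trigger is observed, might be sketched as:

```python
def run(receive_input, draw, output, should_exit, max_iters=None):
    """Repeat S4 (input), S5 (draw), S6 (output) until an exit trigger.

    receive_input/draw/output/should_exit are hypothetical callables standing
    in for the input device 6, the drawing processing device 200, and the
    output device 5. max_iters is only a safety bound for demonstration.
    """
    i = 0
    while not should_exit():          # exit trigger: power-off or end operation
        info = receive_input()        # S4: accept drawing information
        scene = draw(info)            # S5: drawing processing
        output(scene)                 # S6: output the drawn scene
        i += 1
        if max_iters is not None and i >= max_iters:
            break
    return i
```

Setting max_iters=1 corresponds to performing the processing only once without repetition, as noted above.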
 次に、描画処理装置200の動作について説明する。 Next, the operation of the rendering processing device 200 will be described.
 図19は、本開示の実施の形態2に係る描画処理装置200の動作を示すフローチャートである。図19を用いて、描画処理装置200の動作を以下に説明する。 FIG. 19 is a flowchart showing the operation of the drawing processing device 200 according to Embodiment 2 of the present disclosure. The operation of the rendering processing device 200 will be described below with reference to FIG.
 ステップS21において、取得部201は、入力装置6から描画情報を取得する。取得部201は、描画情報を作成部102に送信し、ステップS22に進む。 In step S21, the acquisition unit 201 acquires drawing information from the input device 6. The acquisition unit 201 transmits the drawing information to the creation unit 102, and proceeds to step S22.
 ステップS22において、作成部102は、取得部201から描画情報を受信する。作成部102は、描画情報を適用判定部202に送信し、ステップS23に進む。それ以外は、ステップS22は、実施の形態1のステップS12と同様である。ただし、実施の形態1では、当該詳細バッファには、適用描画対象すべてが描画されるが、実施の形態2では、適用描画対象が1つ描画されることになる。 In step S22, the creation unit 102 receives the drawing information from the acquisition unit 201. The creation unit 102 transmits the drawing information to the application determination unit 202, and proceeds to step S23. Otherwise, step S22 is the same as step S12 of the first embodiment. However, in the first embodiment, all applicable drawing targets are drawn in the detail buffer, but in the second embodiment, one applicable drawing target is drawn.
 ステップS23において、適用判定部202は、作成部102から描画情報を受信する。適用判定部202は、描画情報を用いて、描画情報に含まれる描画対象の1つについて、描画対象が適用描画対象と不適用描画対象とのいずれであるかを判定する。実施の形態2では、例えば、適用判定部202は、描画対象の面積sを算出して、面積sが予め定めて記憶しておいた閾値X未満か否かを判定する。閾値Xは、1.0ピクセルや5.0ピクセルなど、ユーザーが描画対象に対して詳細描画処理を行わなくてもいいほど大きいサイズか、それとも詳細描画処理を行うほど小さいサイズかを好みで決定すればよい。 In step S23, the application determination unit 202 receives the drawing information from the creation unit 102. Using the drawing information, the application determination unit 202 determines, for one of the drawing targets included in the drawing information, whether the drawing target is an applicable drawing target or a non-applicable drawing target. In the second embodiment, for example, the application determination unit 202 calculates the area s of the drawing target and determines whether or not the area s is less than a threshold X determined and stored in advance. The threshold X, such as 1.0 pixels or 5.0 pixels, may be determined according to the user's preference: a size large enough that detailed drawing processing need not be performed on the drawing target, or a size small enough that detailed drawing processing should be performed.
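 なお、以下は本開示の一部ではなく、説明のための仮定に基づくスケッチである。面積sと閾値Xによる上記の適用判定は、例えば次のように表せる(関数名は仮定である)。 The following is not part of the present disclosure but an illustrative sketch; the function name is an assumption. The applicability test described above, using the area s and the threshold X, might be sketched as:

```python
THRESHOLD_X = 3.0  # pixels; user-chosen (the embodiment later assumes 3.0)

def is_applicable(area_s: float, threshold_x: float = THRESHOLD_X) -> bool:
    """Return True if the drawing target is an applicable drawing target,
    i.e. small enough (area s < threshold X) to need detailed drawing."""
    return area_s < threshold_x
```

With the example areas given below, the polygon models 51 to 53 are applicable drawing targets and the polygon model 54 is not.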
 描画対象の面積sを算出する方法は、描画対象そのものの面積を計算してもよいし、実施の形態1の図9のように、バウンディングボックスなどを用いて近似して、描画対象の面積の概算をしてもよい。実施の形態2では、例えば、図4のポリゴンモデル51の面積の概算値が0.8ピクセル、ポリゴンモデル52の面積の概算値が1.4ピクセル、ポリゴンモデル53の面積の概算値が2.4ピクセル、ポリゴンモデル54の面積の概算値が16.8ピクセルであったとする。また、予め定めておいた閾値Xが3.0ピクセルであったとする。 The area s of the drawing target may be calculated from the drawing target itself, or may be approximated using a bounding box or the like, as in FIG. 9 of the first embodiment. In the second embodiment, for example, assume that the approximate area of the polygon model 51 in FIG. 4 is 0.8 pixels, the approximate area of the polygon model 52 is 1.4 pixels, the approximate area of the polygon model 53 is 2.4 pixels, and the approximate area of the polygon model 54 is 16.8 pixels. Also assume that the predetermined threshold X is 3.0 pixels.
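 なお、以下は本開示の一部ではなく、説明のための仮定に基づくスケッチである。バウンディングボックスによる面積の概算は、例えば次のように表せる(頂点の形式と関数名は仮定である)。 The following is not part of the present disclosure but an illustrative sketch; the vertex format and function name are assumptions. One possible bounding-box approximation of a projected drawing target's screen-space area is:

```python
def approx_area(vertices_2d):
    """Approximate the screen-space area (in pixels) of a projected drawing
    target by the area of its 2-D axis-aligned bounding box.

    vertices_2d: iterable of (x, y) screen-space coordinates in pixels.
    """
    xs = [v[0] for v in vertices_2d]
    ys = [v[1] for v in vertices_2d]
    return (max(xs) - min(xs)) * (max(ys) - min(ys))
```

This deliberately overestimates the true polygon area, which is acceptable for a coarse applicability test such as step S23.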
 適用判定部202は、描画対象の面積sが予め定めておいた閾値X以上の場合は、ステップS23:Noとなる。適用判定部202は、描画対象の描画サイズが大きいため、詳細描画処理を行う必要がないとして、当該描画対象は不適用描画対象であるというタグ情報を生成する。適用判定部202は、描画情報と、不適用描画対象である描画対象のタグ情報とを不適用描画対象描画部203に送信し、ステップS24に進む。 If the drawing target area s is equal to or greater than the predetermined threshold value X, the application determination unit 202 determines No in step S23. The application determining unit 202 determines that detailed drawing processing is not necessary because the drawing size of the drawing target is large, and generates tag information indicating that the drawing target is an unapplied drawing target. The application determining unit 202 transmits the drawing information and the tag information of the drawing target that is the non-applicable drawing target to the non-applicable drawing target drawing unit 203, and proceeds to step S24.
 実施の形態2では、ポリゴンモデル54の場合は、ポリゴンモデル54の面積の概算値が16.8ピクセルであり、予め定めておいた閾値である3.0ピクセル以上となるため、ポリゴンモデル54は、ステップS23:Noとなる。適用判定部202は、ポリゴンモデル54の描画サイズが大きいため、詳細描画処理を行う必要がないとして、ポリゴンモデル54は不適用描画対象であるというタグ情報を生成する。適用判定部202は、描画情報と、不適用描画対象であるとタグ付けされたポリゴンモデル54のタグ情報とを描画部203に送信し、ステップS24に進む。 In the second embodiment, in the case of the polygon model 54, the approximate value of the area of the polygon model 54 is 16.8 pixels, which is equal to or greater than the predetermined threshold value of 3.0 pixels, so the polygon model 54 results in step S23: No. The application determination unit 202 determines that detailed drawing processing is not necessary because the drawing size of the polygon model 54 is large, and generates tag information indicating that the polygon model 54 is a non-applicable drawing target. The application determination unit 202 transmits the drawing information and the tag information of the polygon model 54 tagged as the non-applicable drawing target to the drawing unit 203, and proceeds to step S24.
 一方、適用判定部202は、描画対象の面積sが予め定めておいた閾値X未満の場合は、ステップS23:Yesとなる。適用判定部202は、描画対象の描画サイズが小さいため、詳細描画処理を行う必要があるとして、当該描画対象は適用描画対象であるというタグ情報を生成する。適用判定部202は、描画情報と、適用描画対象である描画対象のタグ情報とを算出部204に送信し、ステップS25に進む。 On the other hand, if the drawing target area s is less than the predetermined threshold value X, the application determination unit 202 determines Yes in step S23. The application determining unit 202 generates tag information indicating that the drawing target is an applicable drawing target, assuming that the drawing size of the drawing target is small and therefore detailed drawing processing needs to be performed. The application determination unit 202 transmits the drawing information and the tag information of the drawing target, which is the applicable drawing target, to the calculation unit 204, and proceeds to step S25.
 実施の形態2では、ポリゴンモデル51の場合は、ポリゴンモデル51の面積の概算値が0.8ピクセルであり、予め定めておいた閾値である3.0ピクセル未満となるため、ポリゴンモデル51は、ステップS23:Yesとなる。適用判定部202は、ポリゴンモデル51の描画サイズが小さいため、詳細描画処理を行う必要があるとして、ポリゴンモデル51は適用描画対象であるというタグ情報を生成する。適用判定部202は、描画情報と、適用描画対象であるとタグ付けされたポリゴンモデル51のタグ情報とを算出部204に送信し、ステップS25に進む。 In the second embodiment, in the case of the polygon model 51, the approximate value of the area of the polygon model 51 is 0.8 pixels, which is less than the predetermined threshold value of 3.0 pixels, so the polygon model 51 results in step S23: Yes. The application determination unit 202 determines that detailed drawing processing needs to be performed because the drawing size of the polygon model 51 is small, and generates tag information indicating that the polygon model 51 is an applicable drawing target. The application determination unit 202 transmits the drawing information and the tag information of the polygon model 51 tagged as the applicable drawing target to the calculation unit 204, and proceeds to step S25.
 同様に、ポリゴンモデル52の場合は、ポリゴンモデル52の面積の概算値が1.4ピクセルであり、予め定めておいた閾値である3.0ピクセル未満となるため、ポリゴンモデル52は、ステップS23:Yesとなる。適用判定部202は、ポリゴンモデル52の描画サイズが小さいため、詳細描画処理を行う必要があるとして、ポリゴンモデル52は適用描画対象であるというタグ情報を生成する。適用判定部202は、描画情報と、適用描画対象であるとタグ付けされたポリゴンモデル52のタグ情報とを算出部204に送信し、ステップS25に進む。 Similarly, in the case of the polygon model 52, the approximate value of the area of the polygon model 52 is 1.4 pixels, which is less than the predetermined threshold value of 3.0 pixels, so the polygon model 52 results in step S23: Yes. The application determination unit 202 determines that detailed drawing processing needs to be performed because the drawing size of the polygon model 52 is small, and generates tag information indicating that the polygon model 52 is an applicable drawing target. The application determination unit 202 transmits the drawing information and the tag information of the polygon model 52 tagged as the applicable drawing target to the calculation unit 204, and proceeds to step S25.
 ポリゴンモデル53の場合は、ポリゴンモデル53の面積の概算値が2.4ピクセルであり、予め定めておいた閾値である3.0ピクセル未満となるため、ポリゴンモデル53は、ステップS23:Yesとなる。適用判定部202は、ポリゴンモデル53の描画サイズが小さいため、詳細描画処理を行う必要があるとして、ポリゴンモデル53は適用描画対象であるというタグ情報を生成する。適用判定部202は、描画情報と、適用描画対象であるとタグ付けされたポリゴンモデル53のタグ情報とを算出部204に送信し、ステップS25に進む。 In the case of the polygon model 53, the approximate value of the area of the polygon model 53 is 2.4 pixels, which is less than the predetermined threshold value of 3.0 pixels, so the polygon model 53 results in step S23: Yes. The application determination unit 202 determines that detailed drawing processing needs to be performed because the drawing size of the polygon model 53 is small, and generates tag information indicating that the polygon model 53 is an applicable drawing target. The application determination unit 202 transmits the drawing information and the tag information of the polygon model 53 tagged as the applicable drawing target to the calculation unit 204, and proceeds to step S25.
 ステップS24において、不適用描画対象描画部203は、適用判定部202から描画情報と、不適用描画対象である描画対象のタグ情報とを受信し、描画先バッファ記憶部103の描画先バッファに不適用描画対象を描画する。 In step S24, the non-applicable drawing target drawing unit 203 receives the drawing information and the tag information of the drawing target that is the non-applicable drawing target from the application determination unit 202, and draws the non-applicable drawing target in the drawing destination buffer of the drawing destination buffer storage unit 103.
 実施の形態2では、不適用描画対象描画部203は、適用判定部202から描画情報と、不適用描画対象であるとタグ付けされたポリゴンモデル54のタグ情報とを受信し、描画先バッファ記憶部103の描画先バッファ55にポリゴンモデル54を描画する。 In the second embodiment, the non-applicable drawing target drawing unit 203 receives from the application determination unit 202 the drawing information and the tag information of the polygon model 54 tagged as the non-applicable drawing target, and draws the polygon model 54 in the drawing destination buffer 55 of the drawing destination buffer storage unit 103.
 ポリゴンモデル54を描画先バッファ55に描画した描画結果57とポリゴンモデル54とを重畳した図は、実施の形態1の図7と同様である。また、ポリゴンモデル54を描画先バッファ55に描画した描画結果57を示した図は、実施の形態1の図8と同様である。 A diagram in which a drawing result 57 obtained by drawing the polygon model 54 in the drawing destination buffer 55 and the polygon model 54 are superimposed is the same as FIG. 7 of the first embodiment. A drawing result 57 obtained by drawing the polygon model 54 in the drawing destination buffer 55 is the same as FIG. 8 of the first embodiment.
 不適用描画対象描画部203が描画先バッファに1つの不適用描画対象を描画し終えたら、描画部203は、描画先バッファへの描画の終了の通知を終了判定部206に送信し、ステップS29に進む。 When the non-applicable drawing target drawing unit 203 finishes drawing one non-applicable drawing target in the drawing destination buffer, the drawing unit 203 transmits a notification of the end of drawing to the drawing destination buffer to the end determination unit 206, and proceeds to step S29.
 一方、ステップS25において、算出部204は、適用判定部202から描画情報と、適用描画対象である描画対象のタグ情報とを受信し、適用描画対象の面積を算出する。面積の算出方法は、実施の形態1のステップS14と同様である。 On the other hand, in step S25, the calculation unit 204 receives the drawing information and the tag information of the drawing target that is the applicable drawing target from the application determination unit 202, and calculates the area of the applicable drawing target. The method of calculating the area is the same as in step S14 of the first embodiment.
 実施の形態2では、算出部204は、適用判定部202から描画情報と、適用描画対象であるとタグ付けされたポリゴンモデル51のタグ情報とを受信する。あるいは、算出部204は、適用判定部202から描画情報と、適用描画対象であるとタグ付けされたポリゴンモデル52のタグ情報とを受信する。あるいは、算出部204は、適用判定部202から描画情報と、適用描画対象であるとタグ付けされたポリゴンモデル53のタグ情報とを受信する。 In the second embodiment, the calculation unit 204 receives the drawing information from the application determination unit 202 and the tag information of the polygon model 51 tagged as the application drawing target. Alternatively, the calculation unit 204 receives the drawing information from the application determination unit 202 and the tag information of the polygon model 52 tagged as the application drawing target. Alternatively, the calculation unit 204 receives the drawing information from the application determination unit 202 and the tag information of the polygon model 53 tagged as the application drawing target.
 算出部204は、適用判定部202から受信したタグ情報に記載された適用描画対象の面積を算出する。算出部204は、タグ情報に記載された適用描画対象がポリゴンモデル51であった場合は、ポリゴンモデル51の面積を算出する。算出部204は、例えば、ポリゴンモデル51の面積を0.4ピクセルと算出したとする。 The calculation unit 204 calculates the area of the applicable drawing target described in the tag information received from the application determination unit 202 . The calculation unit 204 calculates the area of the polygon model 51 when the applicable drawing target described in the tag information is the polygon model 51 . For example, assume that the calculation unit 204 calculates the area of the polygon model 51 to be 0.4 pixels.
 同様に、算出部204は、タグ情報に記載された適用描画対象がポリゴンモデル52であった場合は、ポリゴンモデル52の面積を算出する。算出部204は、例えば、ポリゴンモデル52の面積を0.7ピクセルと算出したとする。 Similarly, the calculation unit 204 calculates the area of the polygon model 52 when the applicable drawing target described in the tag information is the polygon model 52 . For example, assume that the calculation unit 204 calculates the area of the polygon model 52 to be 0.7 pixels.
 算出部204は、タグ情報に記載された適用描画対象がポリゴンモデル53であった場合は、ポリゴンモデル53の面積を算出する。算出部204は、例えば、ポリゴンモデル53の面積を1.2ピクセルと算出したとする。以降は、算出部204が算出した適用描画対象の面積を示した情報を、実施の形態1と同様に、面積情報と呼ぶ。 The calculation unit 204 calculates the area of the polygon model 53 when the applicable drawing target described in the tag information is the polygon model 53 . For example, assume that the calculation unit 204 calculates the area of the polygon model 53 to be 1.2 pixels. Hereinafter, information indicating the area of the applicable drawing target calculated by the calculation unit 204 will be referred to as area information, as in the first embodiment.
 算出部204は、描画情報と、適用描画対象である描画対象のタグ情報と、適用描画対象の面積情報とを決定部205に送信し、ステップS26に進む。 The calculating unit 204 transmits the drawing information, the tag information of the drawing target that is the applicable drawing target, and the area information of the applicable drawing target to the determining unit 205, and proceeds to step S26.
 ステップS26において、決定部205は、算出部204から描画情報と、適用描画対象である描画対象のタグ情報と、適用描画対象の面積情報とを受信する。決定部205は、面積情報を用いて、適用描画対象の透過度を表すアルファ値を決定する。アルファ値の決定方法は、実施の形態1のステップS15と同様である。 In step S26, the determining unit 205 receives from the calculating unit 204 the drawing information, the tag information of the drawing target that is the applied drawing target, and the area information of the applied drawing target. The determining unit 205 determines an alpha value representing the transparency of the applied rendering target using the area information. The method of determining the alpha value is the same as in step S15 of the first embodiment.
 実施の形態2では、例えば、決定部205は、算出部204から描画情報と、適用描画対象であるとタグ付けされたポリゴンモデル51のタグ情報と、ポリゴンモデル51の面積が0.4ピクセルであるという面積情報とを受信する。ポリゴンモデル51のアルファ値の決定方法は、実施の形態1のステップS15と同様であり、決定部205は、ポリゴンモデル51のアルファ値を0.4に決定する。 In the second embodiment, for example, the determination unit 205 receives from the calculation unit 204 the drawing information, the tag information of the polygon model 51 tagged as the applicable drawing target, and the area information indicating that the area of the polygon model 51 is 0.4 pixels. The method of determining the alpha value of the polygon model 51 is the same as in step S15 of the first embodiment, and the determination unit 205 determines the alpha value of the polygon model 51 to be 0.4.
 あるいは、決定部205は、算出部204から描画情報と、適用描画対象であるとタグ付けされたポリゴンモデル52のタグ情報と、ポリゴンモデル52の面積が0.7ピクセルであるという面積情報とを受信する。ポリゴンモデル52のアルファ値の決定方法は、実施の形態1のステップS15と同様であり、決定部205は、ポリゴンモデル52のアルファ値を0.7に決定する。 Alternatively, the determination unit 205 receives from the calculation unit 204 the drawing information, the tag information of the polygon model 52 tagged as the applicable drawing target, and the area information indicating that the area of the polygon model 52 is 0.7 pixels. The method of determining the alpha value of the polygon model 52 is the same as in step S15 of the first embodiment, and the determination unit 205 determines the alpha value of the polygon model 52 to be 0.7.
 あるいは、決定部205は、算出部204から描画情報と、適用描画対象であるとタグ付けされたポリゴンモデル53のタグ情報と、ポリゴンモデル53の面積が1.2ピクセルであるという面積情報とを受信する。ポリゴンモデル53のアルファ値の決定方法は、実施の形態1のステップS15と同様であり、決定部205は、ポリゴンモデル53のアルファ値を1.0に決定する。以降は、決定部205が決定した適用描画対象のアルファ値を示した情報を、実施の形態1と同様に、アルファ値情報と呼ぶ。 Alternatively, the determination unit 205 receives from the calculation unit 204 the drawing information, the tag information of the polygon model 53 tagged as the applicable drawing target, and the area information indicating that the area of the polygon model 53 is 1.2 pixels. The method of determining the alpha value of the polygon model 53 is the same as in step S15 of the first embodiment, and the determination unit 205 determines the alpha value of the polygon model 53 to be 1.0. Hereinafter, information indicating the alpha value of the applicable drawing target determined by the determination unit 205 will be referred to as alpha value information, as in the first embodiment.
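 なお、以下は本開示の一部ではなく、上記の数値例(0.4→0.4、0.7→0.7、1.2→1.0)と整合する一つの規則を仮定したスケッチである。 The following is not part of the present disclosure; the exact rule of step S15 is not restated in this section, so this sketch assumes one rule consistent with the numeric examples above (area 0.4 → alpha 0.4, 0.7 → 0.7, 1.2 → 1.0): the alpha value equals the sub-pixel area, clamped to 1.0.

```python
def alpha_from_area(area_s: float) -> float:
    """Assumed rule: alpha equals the screen-space area in pixels,
    clamped to the range [0.0, 1.0]."""
    return min(max(area_s, 0.0), 1.0)
```

Under this assumption, a target covering a full pixel or more is drawn fully opaque, while a sub-pixel target is drawn proportionally transparent.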
 決定部205は、描画情報と、適用描画対象である描画対象のタグ情報と、適用描画対象のアルファ値情報とを適用描画対象描画部209に送信し、ステップS27に進む。 The determining unit 205 transmits the drawing information, the tag information of the drawing target that is the applicable drawing target, and the alpha value information of the applicable drawing target to the applicable drawing target drawing unit 209, and proceeds to step S27.
 ステップS27において、適用描画対象描画部209は、決定部205から描画情報と、適用描画対象である描画対象のタグ情報と、適用描画対象のアルファ値情報とを受信し、詳細バッファ記憶部109の詳細バッファに適用描画対象を描画する。このとき、詳細バッファにアルファ値情報も記憶される。また、適用描画対象描画部209は、詳細バッファに描画されたピクセル数を取得する。描画対象を描画する方法、詳細バッファにアルファ値情報が記憶される方法、詳細バッファに描画されたピクセル数を取得する方法は、実施の形態1のステップS16と同様である。ただし、実施の形態2では、描画処理システム2は、描画対象を1つずつ描画していくため、ポリゴンモデル51とポリゴンモデル52とポリゴンモデル53とは同時には描画されない。 In step S27, the applicable drawing target drawing unit 209 receives from the determination unit 205 the drawing information, the tag information of the drawing target that is the applicable drawing target, and the alpha value information of the applicable drawing target, and draws the applicable drawing target in the detail buffer of the detail buffer storage unit 109. At this time, the alpha value information is also stored in the detail buffer. Also, the applicable drawing target drawing unit 209 acquires the number of pixels drawn in the detail buffer. The method of drawing the drawing target, the method of storing the alpha value information in the detail buffer, and the method of acquiring the number of pixels drawn in the detail buffer are the same as in step S16 of the first embodiment. However, in the second embodiment, the drawing processing system 2 draws drawing targets one by one, so the polygon model 51, the polygon model 52, and the polygon model 53 are not drawn at the same time.
 実施の形態2では、例えば、適用描画対象描画部209は、決定部205から描画情報と、適用描画対象であるとタグ付けされたポリゴンモデル51のタグ情報と、ポリゴンモデル51のアルファ値が0.4であるというアルファ値情報とを受信する。適用描画対象描画部209は、詳細バッファ記憶部109の詳細バッファ56にポリゴンモデル51を描画する。適用描画対象のポリゴンモデル51を詳細バッファ56に描画した描画結果58と複数のポリゴンモデル51~53とを重畳した図は、実施の形態1の図10と同様である。また、適用描画対象のポリゴンモデル51を詳細バッファ56に描画した描画結果58を示した図は、実施の形態1の図11と同様である。適用描画対象描画部209は、描画結果58のピクセルに、ポリゴンモデル51のアルファ値が0.4であるというアルファ値を記憶させる。また、適用描画対象描画部209は、ポリゴンモデル51によって実際に詳細バッファ56が塗りつぶされたピクセル数として1ピクセルという情報を取得する。 In the second embodiment, for example, the applicable drawing target drawing unit 209 receives from the determination unit 205 the drawing information, the tag information of the polygon model 51 tagged as the applicable drawing target, and the alpha value information indicating that the alpha value of the polygon model 51 is 0.4. The applicable drawing target drawing unit 209 draws the polygon model 51 in the detail buffer 56 of the detail buffer storage unit 109. A diagram in which the drawing result 58 obtained by drawing the polygon model 51, which is the applicable drawing target, in the detail buffer 56 and the plurality of polygon models 51 to 53 are superimposed is the same as FIG. 10 of the first embodiment. Also, a diagram showing the drawing result 58 obtained by drawing the polygon model 51, which is the applicable drawing target, in the detail buffer 56 is the same as FIG. 11 of the first embodiment. The applicable drawing target drawing unit 209 stores, in the pixel of the drawing result 58, the alpha value indicating that the alpha value of the polygon model 51 is 0.4. Also, the applicable drawing target drawing unit 209 acquires the information that the number of pixels of the detail buffer 56 actually filled by the polygon model 51 is 1 pixel.
 あるいは、適用描画対象描画部209は、決定部205から描画情報と、適用描画対象であるとタグ付けされたポリゴンモデル52のタグ情報と、ポリゴンモデル52のアルファ値が0.7であるというアルファ値情報とを受信する。適用描画対象描画部209は、詳細バッファ記憶部109の詳細バッファ56にポリゴンモデル52を描画する。しかし、ポリゴンモデル52は、詳細バッファ56のどのピクセルの中心も含まないため、実施の形態1と同様に、描画されず消えてしまう。したがって、適用描画対象描画部209は、詳細バッファ56にポリゴンモデル52を描画する処理をしても、ポリゴンモデル52は描画されないため、何も描画されていない図6の詳細バッファ56のままになってしまう。適用描画対象描画部209は、ポリゴンモデル52によって実際に詳細バッファ56が塗りつぶされたピクセル数として0ピクセルという情報を取得する。 Alternatively, the applicable drawing target drawing unit 209 receives from the determination unit 205 the drawing information, the tag information of the polygon model 52 tagged as the applicable drawing target, and the alpha value information indicating that the alpha value of the polygon model 52 is 0.7. The applicable drawing target drawing unit 209 draws the polygon model 52 in the detail buffer 56 of the detail buffer storage unit 109. However, since the polygon model 52 does not contain the center of any pixel of the detail buffer 56, it is not drawn and disappears, as in the first embodiment. Therefore, even if the applicable drawing target drawing unit 209 performs the process of drawing the polygon model 52 in the detail buffer 56, the polygon model 52 is not drawn, and the detail buffer 56 remains as in FIG. 6, in which nothing is drawn. The applicable drawing target drawing unit 209 acquires the information that the number of pixels of the detail buffer 56 actually filled by the polygon model 52 is 0 pixels.
 あるいは、適用描画対象描画部209は、決定部205から描画情報と、適用描画対象であるとタグ付けされたポリゴンモデル53のタグ情報と、ポリゴンモデル53のアルファ値が1.0であるというアルファ値情報とを受信する。適用描画対象描画部209は、詳細バッファ記憶部109の詳細バッファ56にポリゴンモデル53を描画する。適用描画対象のポリゴンモデル53を詳細バッファ56に描画した描画結果59と複数のポリゴンモデル51~53とを重畳した図は、実施の形態1の図12から適用描画対象のポリゴンモデル51を詳細バッファ56に描画した描画結果58を削除した図と同様となる。また、適用描画対象のポリゴンモデル53を詳細バッファ56に描画した描画結果59を示した図は、実施の形態1の図13から適用描画対象のポリゴンモデル51を詳細バッファ56に描画した描画結果58を削除した図と同様となる。適用描画対象描画部209は、描画結果59のピクセルに、ポリゴンモデル53のアルファ値が1.0であるというアルファ値を記憶させる。また、適用描画対象描画部209は、ポリゴンモデル53によって実際に詳細バッファ56が塗りつぶされたピクセル数として1ピクセルという情報を取得する。以降は、適用描画対象描画部209が取得した適用描画対象によって詳細バッファに描画されたピクセル数を示す情報を、実施の形態1と同様に、ピクセル数情報と呼ぶ。 Alternatively, the applicable drawing target drawing unit 209 receives from the determination unit 205 the drawing information, the tag information of the polygon model 53 tagged as the applicable drawing target, and the alpha value information indicating that the alpha value of the polygon model 53 is 1.0. The applicable drawing target drawing unit 209 draws the polygon model 53 in the detail buffer 56 of the detail buffer storage unit 109. A diagram in which the drawing result 59 obtained by drawing the polygon model 53, which is the applicable drawing target, in the detail buffer 56 and the plurality of polygon models 51 to 53 are superimposed is the same as FIG. 12 of the first embodiment with the drawing result 58 of the polygon model 51 deleted. Also, a diagram showing the drawing result 59 obtained by drawing the polygon model 53 in the detail buffer 56 is the same as FIG. 13 of the first embodiment with the drawing result 58 of the polygon model 51 deleted. The applicable drawing target drawing unit 209 stores, in the pixel of the drawing result 59, the alpha value indicating that the alpha value of the polygon model 53 is 1.0. Also, the applicable drawing target drawing unit 209 acquires the information that the number of pixels of the detail buffer 56 actually filled by the polygon model 53 is 1 pixel. Hereinafter, information indicating the number of pixels drawn in the detail buffer by the applicable drawing target acquired by the applicable drawing target drawing unit 209 will be referred to as pixel number information, as in the first embodiment.
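 なお、以下は本開示の一部ではなく、説明のための仮定に基づくスケッチである。上記のように、ピクセル中心を含まない小さな描画対象が0ピクセルとなる様子は、ピクセル中心サンプリングとして次のように表せる(関数名と判定述語の形式は仮定である)。 The following is not part of the present disclosure but an illustrative sketch of pixel-center coverage testing, which explains why a small target such as the polygon model 52 can fill zero pixels; the function name and predicate form are assumptions.

```python
def covered_pixel_count(inside, width, height):
    """Count pixels of a width x height buffer whose center (x+0.5, y+0.5)
    lies inside the drawing target, per the given predicate."""
    count = 0
    for y in range(height):
        for x in range(width):
            if inside(x + 0.5, y + 0.5):
                count += 1
    return count
```

A target smaller than a pixel that happens to straddle no pixel center yields a count of 0, which is exactly the "vanished" case that the dot printing step below repairs.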
 適用描画対象描画部209は、描画情報と、適用描画対象である描画対象のタグ情報と、適用描画対象のアルファ値情報と、ピクセル数情報とを打点部210に送信し、ステップS28に進む。 The applicable drawing target drawing unit 209 transmits the drawing information, the tag information of the drawing target that is the applicable drawing target, the alpha value information of the applicable drawing target, and the pixel number information to the dot printing unit 210, and proceeds to step S28.
 ステップS28において、打点部210は、適用描画対象描画部209から描画情報と、適用描画対象である描画対象のタグ情報と、アルファ値情報と、ピクセル数情報とを受信する。打点部210は、ピクセル数情報から、適用描画対象が適用描画対象描画部209で描画されず消えてしまった適用描画対象である消失適用描画対象であるかを特定する。打点部210は、適用描画対象が消失適用描画対象であった場合は、消失適用描画対象の位置に対応した詳細バッファ内のピクセルに点を打ち、消失適用描画対象を詳細バッファに描画する。このとき、詳細バッファにアルファ値情報も記憶される。消失適用描画対象であるかを特定する方法、消失適用描画対象の位置に対応した詳細バッファ内のピクセルに点を打ち、消失適用描画対象を詳細バッファに描画する方法、詳細バッファにアルファ値情報を記憶する方法は、実施の形態1のステップS17と同様である。また、打点部210は、適用描画対象が消失適用描画対象でなかった場合は、適用描画対象はすでに詳細バッファに描画されているため、実施の形態1のステップS17と同様に、何もしない。 In step S28, the dot printing unit 210 receives from the applicable drawing target drawing unit 209 the drawing information, the tag information of the drawing target that is the applicable drawing target, the alpha value information, and the pixel number information. From the pixel number information, the dot printing unit 210 identifies whether the applicable drawing target is a vanished applicable drawing target, that is, an applicable drawing target that disappeared without being drawn by the applicable drawing target drawing unit 209. If the applicable drawing target is a vanished applicable drawing target, the dot printing unit 210 prints a dot on the pixel in the detail buffer corresponding to the position of the vanished applicable drawing target, thereby drawing the vanished applicable drawing target in the detail buffer. At this time, the alpha value information is also stored in the detail buffer. The method of identifying whether a target is a vanished applicable drawing target, the method of printing a dot on the pixel in the detail buffer corresponding to the position of the vanished applicable drawing target to draw it in the detail buffer, and the method of storing the alpha value information in the detail buffer are the same as in step S17 of the first embodiment. Also, if the applicable drawing target is not a vanished applicable drawing target, the applicable drawing target has already been drawn in the detail buffer, so the dot printing unit 210 does nothing, as in step S17 of the first embodiment.
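 なお、以下は本開示の一部ではなく、説明のための仮定に基づくスケッチである。ステップS28の打点処理は、例えば次のように表せる(バッファの表現、関数名、引数はすべて仮定である)。 The following is not part of the present disclosure but an illustrative sketch of the dot-printing decision in step S28; the buffer representation, function name, and arguments are all assumptions.

```python
def stamp_if_vanished(buffer, alpha_buf, drawn_pixels, pos, color, alpha):
    """If rasterization filled zero pixels (a vanished applicable drawing
    target), stamp one dot at the pixel containing the target's position.

    buffer, alpha_buf: dicts keyed by integer pixel coordinates (ix, iy),
    standing in for the detail buffer and its stored alpha values.
    pos: (x, y) position of the target in pixel coordinates.
    """
    if drawn_pixels == 0:               # vanished: nothing was rasterized
        ix, iy = int(pos[0]), int(pos[1])
        buffer[(ix, iy)] = color        # print a dot for the target
        alpha_buf[(ix, iy)] = alpha     # store its alpha value alongside
    # otherwise the target is already in the detail buffer: do nothing
```

In the examples above, the polygon model 52 (0 drawn pixels) would be stamped, while the polygon models 51 and 53 (1 drawn pixel each) would be left untouched.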
 実施の形態2では、例えば、打点部210は、適用描画対象描画部209から描画情報と、適用描画対象であるとタグ付けされたポリゴンモデル51のタグ情報と、ポリゴンモデル51のアルファ値が0.4であるというアルファ値情報と、ポリゴンモデル51によって詳細バッファ56に描画されたピクセル数が1ピクセルであるというピクセル数情報とを受信する。打点部210は、ピクセル数情報から、ポリゴンモデル51のピクセル数情報が1ピクセルであるため、ポリゴンモデル51は消失適用描画対象ではないと特定する。ポリゴンモデル51はすでに詳細バッファに描画されているため、打点部210は、何もしない。 In the second embodiment, for example, the dot printing unit 210 receives the drawing information from the applicable drawing target drawing unit 209, the tag information of the polygon model 51 tagged as the applicable drawing target, and the alpha value of the polygon model 51 being 0. .4 and pixel number information that the number of pixels drawn in the detail buffer 56 by the polygon model 51 is 1 pixel. Based on the pixel number information, the dot printing unit 210 specifies that the polygon model 51 is not a target for erasure-applied drawing because the pixel number information of the polygon model 51 is 1 pixel. Since the polygon model 51 has already been drawn in the detail buffer, the dotting section 210 does nothing.
 あるいは、打点部210は、適用描画対象描画部209から描画情報と、適用描画対象であるとタグ付けされたポリゴンモデル52のタグ情報と、ポリゴンモデル52のアルファ値が0.7であるというアルファ値情報と、ポリゴンモデル52によって詳細バッファ56に描画されたピクセル数が0ピクセルであるというピクセル数情報とを受信する。打点部210は、ピクセル数情報から、ポリゴンモデル52のピクセル数情報が0ピクセルであるため、ポリゴンモデル52は消失適用描画対象であると特定する。 Alternatively, the dot printing unit 210 receives the drawing information from the applicable drawing target drawing unit 209, the tag information of the polygon model 52 tagged as being the applicable drawing target, and the alpha value of the polygon model 52 whose alpha value is 0.7. Receive value information and pixel count information that the number of pixels drawn by the polygon model 52 into the detail buffer 56 is 0 pixels. Based on the pixel number information, the dot printing unit 210 specifies that the polygon model 52 is to be erased and drawn because the pixel number information of the polygon model 52 is 0 pixel.
 打点部210は、実施の形態1のステップS17と同様に、消失適用描画対象を詳細バッファに描画する。適用描画対象のポリゴンモデル52を詳細バッファ56に描画した描画結果60と複数のポリゴンモデル51~53とを重畳した図は、実施の形態1の図14から適用描画対象のポリゴンモデル51を詳細バッファ56に描画した描画結果58と別の適用描画対象のポリゴンモデル53を詳細バッファ56に描画した描画結果59とを削除した図と同様となる。適用描画対象のポリゴンモデル52を詳細バッファ56に描画した描画結果60を示した図は、実施の形態1の図15から適用描画対象のポリゴンモデル51を詳細バッファ56に描画した描画結果58と別の適用描画対象のポリゴンモデル53を詳細バッファ56に描画した描画結果59とを削除した図と同様となる。打点部210は、ポリゴンモデル52のアルファ値情報から、描画結果60を構成する各ピクセルに、ポリゴンモデル52のアルファ値が0.7であるというアルファ値を記憶させる。 The dot printing unit 210 draws the erasure-applied drawing target in the detail buffer in the same manner as in step S17 of the first embodiment. A drawing result 60 obtained by drawing a polygon model 52 to be applied drawing in a detail buffer 56 and a plurality of polygon models 51 to 53 are superimposed on each other. 56 and a drawing result 59 obtained by drawing another polygon model 53 to be applied drawing in the detail buffer 56 are deleted. A drawing result 60 obtained by drawing the polygon model 52 to be applied drawing in the detail buffer 56 is different from the drawing result 58 obtained by drawing the polygon model 51 to be applied drawing in the detail buffer 56 from FIG. 15 of the first embodiment. 2 is the same as the drawing in which the drawing result 59 obtained by drawing the polygon model 53 to be drawn in the detail buffer 56 is deleted. Based on the alpha value information of the polygon model 52, the dot printing unit 210 causes each pixel forming the drawing result 60 to store an alpha value that the alpha value of the polygon model 52 is 0.7.
Alternatively, the dot printing unit 210 receives, from the applicable drawing target drawing unit 209, the drawing information, the tag information indicating that the polygon model 53 is tagged as an applicable drawing target, the alpha value information indicating that the alpha value of the polygon model 53 is 1.0, and the pixel count information indicating that the number of pixels drawn in the detail buffer 56 by the polygon model 53 is 1. Because the pixel count information shows that the polygon model 53 drew 1 pixel, the dot printing unit 210 identifies that the polygon model 53 is not a vanished applicable drawing target. Since the polygon model 53 has already been drawn in the detail buffer, the dot printing unit 210 does nothing.
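The pixel-count check performed by the dot printing unit 210 in the passages above can be sketched in Python. This is a minimal illustration only: the dictionary-based buffer representation, the function name, and the targets' center coordinates are assumptions made for the example, not part of the embodiment.

```python
# Sketch of the vanished-target test of the dot printing unit 210: a target
# whose rasterized pixel count is 0 is treated as a vanished applicable
# drawing target, and one dot carrying the target's RGB and alpha values is
# forced into the detail buffer. All names and coordinates are illustrative.

def dot_if_vanished(detail_buffer, target):
    """detail_buffer: dict mapping (x, y) -> (r, g, b, alpha).
    target: dict with 'pixel_count', 'center', 'rgb', 'alpha' keys."""
    if target["pixel_count"] > 0:
        return False  # already rasterized; nothing to do (polygon model 53)
    x, y = target["center"]  # pixel corresponding to the target's position
    detail_buffer[(x, y)] = (*target["rgb"], target["alpha"])
    return True  # the target had vanished and has now been dotted

buf = {}
# Polygon model 52: 0 pixels drawn, alpha 0.7 -> one dot is forced.
model_52 = {"pixel_count": 0, "center": (4, 3), "rgb": (255, 0, 0), "alpha": 0.7}
# Polygon model 53: 1 pixel drawn, alpha 1.0 -> left untouched.
model_53 = {"pixel_count": 1, "center": (6, 6), "rgb": (0, 0, 255), "alpha": 1.0}
dot_if_vanished(buf, model_52)
dot_if_vanished(buf, model_53)
```

After running the sketch, the buffer holds exactly one dotted pixel for the vanished polygon model 52, with its alpha value stored alongside its color, while polygon model 53 contributes nothing further.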
The dot printing unit 210 sends a notification of the end of drawing to the detail buffer to the end determination unit 206, and the process proceeds to step S29.
In step S29, when the end determination unit 206 receives a notification of the end of drawing to the drawing destination buffer from the non-applicable drawing target drawing unit 203, or a notification of the end of drawing to the detail buffer from the dot printing unit 210, it determines whether any other drawing target still to be drawn in the drawing destination buffer remains.
If there is a drawing target to be drawn in the drawing destination buffer that has not yet been drawing-processed by the non-applicable drawing target drawing unit 203, the applicable drawing target drawing unit 209, or the dot printing unit 210, the determination in step S29 is Yes. When the end determination unit 206 has received the notification of the end of drawing to the drawing destination buffer from the non-applicable drawing target drawing unit 203 and drawing targets to be drawn in the drawing destination buffer still remain, the process returns to step S23, and a drawing target that has not yet been drawing-processed is processed.
When the end determination unit 206 receives the notification of the end of drawing to the detail buffer from the dot printing unit 210 and drawing targets to be drawn in the drawing destination buffer still remain, it stores the drawing-processed detail buffer held in the detail buffer storage unit 109 in the accumulation storage unit 208 so that a new detail buffer can be created in the detail buffer storage unit 109; the process then returns to step S23, and the drawing processing device 200 processes a drawing target that has not yet been drawing-processed.
On the other hand, if no drawing target remains that has not yet been drawing-processed and should be drawn in the drawing destination buffer, the determination in step S29 is No. The end determination unit 206 then sends all of the detail buffers stored in the accumulation storage unit 208 to the blend processing unit 207 at once, and the process proceeds to step S30.
In Embodiment 2, if even one of the polygon models 51, 52, 53, and 54 has not yet been drawing-processed, the determination of the end determination unit 206 in step S29 is Yes. When the polygon model 54 has been drawing-processed, the end determination unit 206 has received the notification of the end of drawing to the drawing destination buffer from the non-applicable drawing target drawing unit 203, and drawing targets to be drawn in the drawing destination buffer still remain, the process returns to step S23 and a drawing target that has not yet been drawing-processed is processed.
When one of the polygon models 51 to 53 has been drawing-processed, the end determination unit 206 has received the notification of the end of drawing to the detail buffer from the dot printing unit 210, and drawing targets to be drawn in the drawing destination buffer still remain, the end determination unit 206 stores the detail buffer in which that polygon model was drawn, held in the detail buffer storage unit 109, in the accumulation storage unit 208 so that a new detail buffer can be created in the detail buffer storage unit 109; the process then returns to step S23, and a drawing target that has not yet been drawing-processed is processed.
On the other hand, when all of the polygon models 51 to 54 have been drawing-processed, the determination in step S29 is No. The end determination unit 206 sends all of the detail buffers stored in the accumulation storage unit 208, in each of which one of the polygon models 51 to 53 was drawn, to the blend processing unit 207 at once, and the process proceeds to step S30.
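The loop of steps S23 to S29 for the applicable drawing targets can be sketched as follows. Non-applicable drawing targets, which are drawn directly in the drawing destination buffer, are omitted, and every name in the sketch is an illustrative assumption rather than the embodiment's actual interface.

```python
# Sketch of the end determination of step S29: after each applicable drawing
# target is processed, its finished detail buffer is moved into the
# accumulation storage (unit 208) and a fresh detail buffer is created; once
# no target remains, all accumulated buffers are handed to blending at once.

def process_all_targets(targets, draw_target):
    """targets: iterable of target ids; draw_target: callable performing
    steps S23..S28 for one target and returning its finished detail buffer.
    Returns the buffers that would go to the blend processing unit in one batch."""
    accumulation = []                 # accumulation storage unit 208
    remaining = list(targets)
    while remaining:                  # step S29: Yes -> back to step S23
        target = remaining.pop(0)
        detail_buffer = draw_target(target)  # draw into a fresh detail buffer
        accumulation.append(detail_buffer)   # store it; a new buffer can be made
    return accumulation               # step S29: No -> send all at once

buffers = process_all_targets(["model_51", "model_52", "model_53"],
                              draw_target=lambda t: {"target": t})
```

The design point the sketch mirrors is that blending is deferred: no detail buffer is blended until every applicable drawing target has been processed, so the batch handed over at the end contains one buffer per target.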
In step S30, the blend processing unit 207 receives from the end determination unit 206, all at once, the detail buffers in each of which an applicable drawing target was drawn. In Embodiment 1, the blend processing unit 110 alpha-blended one detail buffer with the drawing destination buffer; in Embodiment 2, the blend processing unit 207 alpha-blends a plurality of detail buffers with the drawing destination buffer. Specifically, the blend processing unit 207 performs the process of alpha-blending one detail buffer with the drawing destination buffer once for each detail buffer, in order. The alpha blending method is the same as in step S18 of Embodiment 1. A figure showing the result of drawing all the polygon models in the drawing destination buffer 55 would be the same as FIG. 16 of Embodiment 1.
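The sequential blend of step S30 can be sketched as follows. The dictionary-based buffer representation and the sample pixel values are illustrative assumptions; the blend itself is the standard "over" operation, applied once per detail buffer, matching the statement that the one-buffer blend of Embodiment 1 is simply repeated for each buffer.

```python
# Sketch of the blend processing of step S30: each detail buffer is
# alpha-blended into the drawing destination buffer in turn, one buffer at a
# time. dest maps (x, y) -> (r, g, b); each detail buffer maps
# (x, y) -> (r, g, b, alpha). Only pixels painted in a detail buffer touch dest.

def alpha_blend_over(dest, detail_buffers):
    for detail in detail_buffers:          # one blend pass per detail buffer
        for (x, y), (r, g, b, a) in detail.items():
            dr, dg, db = dest.get((x, y), (0.0, 0.0, 0.0))
            dest[(x, y)] = (a * r + (1 - a) * dr,   # standard "over" operator
                            a * g + (1 - a) * dg,
                            a * b + (1 - a) * db)
    return dest

dest = {(0, 0): (100.0, 100.0, 100.0)}               # drawing destination buffer
detail_1 = {(0, 0): (255.0, 0.0, 0.0, 0.5)}          # e.g. one target's buffer
detail_2 = {(0, 0): (0.0, 0.0, 255.0, 0.5)}          # e.g. another target's buffer
alpha_blend_over(dest, [detail_1, detail_2])
```

Because the buffers are blended in order, a later detail buffer is composited over the result of the earlier ones, just as step S18's single blend would be if repeated.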
The blend processing unit 207 sends a notification of the end of alpha blending to the output control unit 111, and the process proceeds to step S31. In other respects, step S30 is the same as step S18 of Embodiment 1.
In step S31, the output control unit 111 receives the notification of the end of alpha blending from the blend processing unit 207. In other respects, step S31 is the same as step S19 of Embodiment 1.
After step S31 is executed, the process again returns to step S21, and the above processing is repeated until there is a trigger to end the processing, such as the power being turned off or an end operation being performed. Although the above processing is described as being repeated, it may instead be performed only once.
As described above, in the drawing processing system 2 of Embodiment 2, the non-applicable drawing target drawing unit 203 draws a drawing target determined to be a non-applicable drawing target by the application determination unit 202 in the drawing destination buffer stored in the drawing destination buffer storage unit 103; the calculation unit 204 calculates the area of a drawing target determined to be an applicable drawing target by the application determination unit 202; the determination unit 205 determines the alpha value of the applicable drawing target from the calculated area; the applicable drawing target drawing unit 209 draws the applicable drawing target in a detail buffer, which is a buffer different from the drawing destination buffer; the dot printing unit 210 draws the applicable drawing target in the detail buffer by putting a dot on the pixel in the detail buffer corresponding to the position of the applicable drawing target; the blend processing unit 207 alpha-blends the detail buffer into the drawing destination buffer with the alpha value of the applicable drawing target; and the output control unit 111 controls the output device 5 to output the scene drawn in the drawing destination buffer. As a result, even when a drawing target is not positioned at the center of a pixel, the drawing target is drawn and can be displayed on the display screen.
Further, in Embodiment 2, the input device 6 of the drawing processing system 2 accepts only the input of drawing information; the acquisition unit 201 acquires the drawing information from the input device 6, and the application determination unit 202 determines from the drawing information whether each drawing target is an applicable drawing target or a non-applicable drawing target. With this determination, a user or the like does not need to input tag information to the input device 6, and the drawing processing system 2 automatically determines whether each drawing target is an applicable drawing target or a non-applicable drawing target, which reduces the burden on the user.
In step S23 of Embodiment 2, the application determination unit 202 calculates the area s of a drawing target and determines whether the area s is less than a predetermined, stored threshold X; however, the threshold may be varied per drawing target according to, for example, the luminance value or the importance of each drawing target. For example, if the luminance value of a drawing target is lower than a threshold Y, the luminance is regarded as low and the threshold X is made smaller; if the luminance value is higher than the threshold Y, the luminance is regarded as high and the threshold X is made larger. By changing the threshold X in this way, a drawing target with low luminance does not become an applicable drawing target unless its area is even smaller, so the probability that it is determined to be a non-applicable drawing target increases and the drawing processing load can be reduced. Conversely, a drawing target with high luminance does not become a non-applicable drawing target unless its area is even larger, so the probability that it is determined to be an applicable drawing target increases and it becomes harder for it to vanish.
Similarly, when an importance has been assigned to each drawing target in advance, if the importance of a drawing target is lower than a threshold Z, the importance is regarded as low and the threshold X is made smaller; if the importance is higher than the threshold Z, the importance is regarded as high and the threshold X is made larger. By changing the threshold X in this way, a drawing target with low importance does not become an applicable drawing target unless its area is even smaller, so the probability that it is determined to be a non-applicable drawing target increases and the drawing processing load can be reduced. Conversely, a drawing target with high importance does not become a non-applicable drawing target unless its area is even larger, so the probability that it is determined to be an applicable drawing target increases and it becomes harder for it to vanish.
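The per-target threshold adjustment described above can be sketched as follows. The scaling factors (halving and doubling) and the normalized luminance and importance values are illustrative assumptions, since the text only specifies the direction in which the threshold X moves.

```python
# Sketch of the per-target threshold adjustment: the base area threshold X
# is shrunk for dim or unimportant targets (more likely non-applicable,
# lower load) and grown for bright or important ones (more likely
# applicable, harder to vanish). Factors 0.5 / 2.0 are illustrative.

def adjusted_threshold(base_x, luminance, importance,
                       threshold_y=0.5, threshold_z=0.5):
    x = base_x
    x = x * 0.5 if luminance < threshold_y else x * 2.0   # dim -> smaller X
    x = x * 0.5 if importance < threshold_z else x * 2.0  # minor -> smaller X
    return x

def is_applicable(area_s, base_x, luminance, importance):
    """True if the target counts as an applicable drawing target (area s < X)."""
    return area_s < adjusted_threshold(base_x, luminance, importance)
```

With a base threshold of 4.0 pixels, a dim and unimportant target of area 3.0 fails the shrunken threshold and stays non-applicable, while a bright and important target of the same area passes the enlarged threshold and receives detail drawing processing.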
In Embodiment 2, the drawing processing system 2 includes the accumulation storage unit 208, and when drawing targets to be drawn in the drawing destination buffer still remain, the end determination unit 206 stores the drawing-processed detail buffer held in the detail buffer storage unit 109 in the accumulation storage unit 208 so that a new detail buffer can be created in the detail buffer storage unit 109, after which the drawing processing device 200 processes a drawing target that has not yet been drawing-processed. However, the drawing processing system 2 may instead omit the accumulation storage unit 208, and the drawing processing device 200 may process drawing targets that have not yet been drawing-processed while overwriting the drawing-processed detail buffer stored in the detail buffer storage unit 109.
Embodiment 3.
In Embodiment 1, in the drawing processing system 1, the creation unit 102 creates one detail buffer with the same resolution as the drawing destination buffer in the detail buffer storage unit 109, and the applicable drawing target drawing unit 108 or the dot printing unit 112 draws all the applicable drawing targets in that detail buffer. In Embodiment 3, as shown in FIGS. 20 to 31, the allocation creation unit 302 creates, in the detail buffer storage unit 303, a detail buffer corresponding to the range of each applicable drawing target, and the applicable drawing target drawing unit 304 or the dot printing unit 305 draws each applicable drawing target in the detail buffer corresponding to its range.
With this drawing scheme, when the proportion of the area not used for drawing the applicable drawing targets is high, it suffices to create detail buffers corresponding only to the ranges of the applicable drawing targets, so no buffer is created needlessly and the amount of memory used can be reduced. In other respects, Embodiment 3 is the same as Embodiment 1. In the following description, configurations and operations described in Embodiment 1 are given the same reference numerals, and duplicate descriptions are omitted.
In Embodiment 3, as in Embodiment 1, a case where the present disclosure is applied when the drawing information, which is the information for drawing the entire scene, includes a drawing target that is not positioned at the center of a pixel is described below as an example.
FIG. 20 is a block diagram of the drawing processing system 3 including the drawing processing device 300 according to Embodiment 3. The drawing processing system 3 includes an input device 4 to which drawing information and tag information, indicating whether detail drawing processing is to be applied to each drawing target included in the drawing information, are input; the drawing processing device 300, which performs processing to draw the drawing information using the drawing information and the tag information; and an output device 5, which outputs the scene processed by the drawing processing device 300.
In Embodiment 3, in place of the creation unit 102, the detail buffer storage unit 109, the applicable drawing target drawing unit 108, the dot printing unit 112, and the blend processing unit 110 of FIG. 1 of Embodiment 1, a creation unit 301, a detail buffer storage unit 303, an applicable drawing target drawing unit 304, a dot printing unit 305, and a blend processing unit 306 are provided, and an allocation creation unit 302 is added to the configuration of the functional block diagram.
The drawing processing device 300 includes: an acquisition unit 101 that acquires drawing information including drawing targets and tag information from the input device 4; a creation unit 301 that creates a drawing destination buffer; an application determination unit 106 that determines, using the tag information indicating whether detail drawing processing is to be applied to each drawing target, whether the drawing target is a non-applicable drawing target to be drawn in the drawing destination buffer or an applicable drawing target to be drawn in a detail buffer, which is a buffer different from the drawing destination buffer; a non-applicable drawing target drawing unit 107 that draws, in the drawing destination buffer, a non-applicable drawing target tagged as not having detail drawing processing applied; an allocation creation unit 302 that creates a detail buffer corresponding to the range of each applicable drawing target tagged as having detail drawing processing applied; a calculation unit 104 that calculates the area of an applicable drawing target tagged as having detail drawing processing applied; a determination unit 105 that determines the alpha value of the applicable drawing target from the calculated area; an applicable drawing target drawing unit 304 that draws the applicable drawing target in the detail buffer and acquires the number of drawn pixels; a dot printing unit 305 that identifies, from the number of drawn pixels acquired for each applicable drawing target, a vanished applicable drawing target, that is, an applicable drawing target that was not drawn in the detail buffer and has vanished, and that draws the vanished applicable drawing target in the detail buffer by putting a dot on the pixel in the detail buffer corresponding to the position of the vanished applicable drawing target; a blend processing unit 306 that draws the applicable drawing targets, including any vanished applicable drawing target, in the drawing destination buffer by alpha-blending the RGB values of the applicable drawing targets with their alpha values in correspondence with the pixels painted in the detail buffers; an output control unit 111 that controls the output device 5 to output the scene drawn in the drawing destination buffer; a drawing destination buffer storage unit 103 that stores the drawing destination buffer; and a detail buffer storage unit 303 that stores the detail buffers. The non-applicable drawing target drawing unit 107 corresponds to a first drawing target drawing unit, and the applicable drawing target drawing unit 304 corresponds to a second drawing target drawing unit.
Next, the hardware configuration of the drawing processing system 3 according to Embodiment 3 will be described. The hardware configuration diagram of the drawing processing system 3 is the same as FIG. 2 of Embodiment 1.
However, the hardware configuration of the creation unit 301 is the same as that of the creation unit 102; that of the detail buffer storage unit 303 is the same as that of the detail buffer storage unit 109; that of the applicable drawing target drawing unit 304 is the same as that of the applicable drawing target drawing unit 108; that of the dot printing unit 305 is the same as that of the dot printing unit 112; and that of the blend processing unit 306 is the same as that of the blend processing unit 110. The allocation creation unit 302 is realized by a program stored in the graphic memory 45, and is implemented when the GPU 44 reads and executes the program loaded into the graphic memory 45. Note that the graphic memory 45 may realize the function of the allocation creation unit 302 and the functions of the other units in separate memories.
Next, the operations of the drawing processing system 3 and the drawing processing device 300 will be described.
The flowchart showing the operation of the drawing processing system 3 according to Embodiment 3 of the present disclosure is the same as FIG. 3 of Embodiment 1.
FIG. 21 is a flowchart showing the operation of the drawing processing device 300 according to Embodiment 3 of the present disclosure. The operation of the drawing processing device 300 will be described below with reference to FIG. 21.
In step S41, the acquisition unit 101 transmits the drawing information and the tag information to the creation unit 301, and the process proceeds to step S42. In other respects, step S41 is the same as step S11 of Embodiment 1.
In step S42, the creation unit 301 creates only the drawing destination buffer and does not create a detail buffer. The creation unit 301 transmits the drawing information and the tag information to the application determination unit 106, and the process proceeds to step S43. In other respects, step S42 is the same as step S12 of Embodiment 1.
In step S43, the application determination unit 106 receives the drawing information and the tag information from the creation unit 301. The non-applicable drawing target drawing unit 107 transmits a notification of the end of drawing to the drawing destination buffer to the application determination unit 106 and the blend processing unit 306, and the process proceeds to step S44. In other respects, step S43 is the same as step S13 of Embodiment 1.
In step S44, when the application determination unit 106 receives the notification of the end of drawing to the drawing destination buffer from the non-applicable drawing target drawing unit 107, it transmits the drawing information and the tag information of the drawing targets that are applicable drawing targets to the allocation creation unit 302.
In Embodiment 3, for the drawing information of FIG. 4, the application determination unit 106 transmits to the allocation creation unit 302 the drawing information, the tag information of the polygon model 51 tagged as an applicable drawing target, the tag information of the polygon model 52 tagged as an applicable drawing target, and the tag information of the polygon model 53 tagged as an applicable drawing target.
The allocation creation unit 302 receives the drawing information and the tag information of the drawing targets that are applicable drawing targets from the application determination unit 106, and creates, in the detail buffer storage unit 303, a detail buffer corresponding to each applicable drawing target. Each applicable drawing target will be drawn in its corresponding detail buffer. Note that each detail buffer has the same resolution as the drawing destination buffer. The allocation creation unit 302 also creates position information, which is information indicating the position in the drawing destination buffer to which each detail buffer corresponds, and stores it in each detail buffer.
In Embodiment 3, the allocation creation unit 302 receives from the application determination unit 106 the drawing information, the tag information of the polygon model 51 tagged as an applicable drawing target, the tag information of the polygon model 52 tagged as an applicable drawing target, and the tag information of the polygon model 53 tagged as an applicable drawing target. The allocation creation unit 302 creates, in the detail buffer storage unit 303, a detail buffer 64 corresponding to the polygon model 51, a detail buffer 65 corresponding to the polygon model 52, and a detail buffer 66 corresponding to the polygon model 53. The polygon model 51 will be drawn in the detail buffer 64, the polygon model 52 in the detail buffer 65, and the polygon model 53 in the detail buffer 66. Note that the detail buffers 64 to 66 have the same resolution as the drawing destination buffer 55.
FIG. 22 is a diagram showing the correspondence between the detail buffers 64 to 66, which correspond to the applicable drawing target polygon models 51 to 53 of Embodiment 3, and the detail buffer 56 of Embodiment 1. In Embodiment 3, the detail buffers 64 to 66 are assumed to have the same resolution as the drawing destination buffer 55, as in Embodiment 1; that is, the detail buffer 56 of Embodiment 1 and the detail buffers 64 to 66 of Embodiment 3 have the same resolution.
Here, the detail buffer 56 of Embodiment 1 is 8 pixels high by 8 pixels wide, for a total of 64 pixels. In contrast, the detail buffer 64 of Embodiment 3 is 3 pixels high by 3 pixels wide, for a total of 9 pixels; the detail buffer 65 is 2 pixels high by 3 pixels wide, for a total of 6 pixels; and the detail buffer 66 is 3 pixels high by 3 pixels wide, for a total of 9 pixels. In Embodiment 3, no detail buffer is created outside the ranges of the detail buffers 64 to 66, so the total size of the detail buffers created is smaller than in Embodiment 1 and the amount of memory used can be reduced.
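The memory saving can be sketched as follows. The buffer heights and widths follow the dimensions stated in the text, while the buffer positions and the data layout are illustrative assumptions.

```python
# Sketch of the per-target detail buffers of Embodiment 3: instead of one
# detail buffer covering the whole 8x8 drawing destination buffer (64 pixels),
# one small buffer is allocated per applicable drawing target, sized to that
# target's range and stamped with its position in the destination buffer.

def make_detail_buffer(x, y, width, height):
    """Return a per-target detail buffer together with its position
    information, i.e. where it sits inside the drawing destination buffer."""
    return {"position": (x, y), "size": (width, height),
            "pixels": [[None] * width for _ in range(height)]}

# Detail buffers 64, 65, 66 for polygon models 51, 52, 53 (positions assumed).
buffers = [make_detail_buffer(1, 1, 3, 3),   # detail buffer 64: 3x3 pixels
           make_detail_buffer(5, 1, 3, 2),   # detail buffer 65: 3x2 pixels
           make_detail_buffer(4, 5, 3, 3)]   # detail buffer 66: 3x3 pixels

per_target_pixels = sum(w * h for w, h in (b["size"] for b in buffers))
full_buffer_pixels = 8 * 8                   # Embodiment 1: one 8x8 buffer
```

Totaling the stated widths and heights, the per-target buffers cover far fewer pixels than the single full-resolution buffer of Embodiment 1, which is the memory reduction the paragraph above claims.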
FIG. 23 is a diagram showing the detail buffer 64 corresponding to the applicable drawing target polygon model 51 of Embodiment 3, the detail buffer 65 corresponding to the applicable drawing target polygon model 52, and the detail buffer 66 corresponding to the applicable drawing target polygon model 53.
The allocation creation unit 302 also creates position information 67, which is information indicating the position in the drawing destination buffer 55 to which the detail buffer 64 corresponds; position information 68, which indicates the position in the drawing destination buffer 55 to which the detail buffer 65 corresponds; and position information 69, which indicates the position in the drawing destination buffer 55 to which the detail buffer 66 corresponds. The allocation creation unit 302 stores the position information 67 in the detail buffer 64, the position information 68 in the detail buffer 65, and the position information 69 in the detail buffer 66.
FIG. 24 is a diagram schematically showing the position information 67, which indicates the position in the drawing destination buffer 55 to which the detail buffer 64 corresponds. The dotted-line range 64 on the drawing destination buffer 55 corresponds to the position of the detail buffer 64.
FIG. 25 is a diagram schematically showing the position information 68, which indicates which position in the drawing destination buffer 55 the other detail buffer 65 corresponds to. The dotted-line range 65 on the drawing destination buffer 55 corresponds to the position of the detail buffer 65.
FIG. 26 is a diagram schematically showing the position information 69, which indicates which position in the drawing destination buffer 55 yet another detail buffer 66 corresponds to. The dotted-line range 66 on the drawing destination buffer 55 corresponds to the position of the detail buffer 66.
Note that in the third embodiment, the position information is separated for each polygon model that is an applicable drawing target, but the position information may instead be combined into a single piece of information indicating which position in the drawing destination buffer 55 each of the applicable drawing target polygon models corresponds to, and the detail buffer storage unit 303 may store that combined information.
FIG. 27 is a diagram schematically showing position information that combines the information indicating which position in the drawing destination buffer 55 the detail buffer 64 corresponds to, the information indicating which position in the drawing destination buffer 55 another detail buffer 65 corresponds to, and the information indicating which position in the drawing destination buffer 55 yet another detail buffer 66 corresponds to. The dotted-line range 64 on the drawing destination buffer 55 corresponds to the position of the detail buffer 64, the dotted-line range 65 on the drawing destination buffer 55 corresponds to the position of the detail buffer 65, and the dotted-line range 66 on the drawing destination buffer 55 corresponds to the position of the detail buffer 66.
Note that the range of the detail buffer 64 corresponding to the polygon model 51 is a range in which the entire polygon model 51 can be drawn. The range of the detail buffer 65 corresponding to the polygon model 52 is a range in which the entire polygon model 52 can be drawn. The range of the detail buffer 66 corresponding to the polygon model 53 is a range in which the entire polygon model 53 can be drawn. The range of the detail buffer corresponding to an applicable drawing target may be determined in advance, may be determined by the user through input received from the user, or may be determined according to the area of the drawing target; the method of determining the range is not limited.
For example, when the range is determined according to the area of the drawing target, the allocation creation unit 302 may calculate the area of each drawing target as the calculation unit 104 of the first embodiment does, or may approximate it using a bounding box or the like to obtain a rough estimate of the area of each drawing target. In the case of a three-dimensional drawing target, the drawing target is projected and transformed into two dimensions before its area is calculated. The method of roughly estimating the area of each drawing target using a bounding box or the like is the same as in FIG. 9 of the first embodiment. Note that the allocation creation unit 302 may create the detail buffer with a range that has some margin beyond the applicable drawing target.
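The bounding-box estimate mentioned above can be sketched as follows for an already projected two-dimensional target; the function name and sample coordinates are illustrative assumptions, not taken from the patent.

```python
# Illustrative sketch: approximate the screen-space area of a drawing
# target by the area of its axis-aligned bounding box, instead of
# computing the exact polygon area.

def bounding_box_area(vertices):
    """vertices: list of (x, y) points of a 2D (already projected) target."""
    xs = [x for x, _ in vertices]
    ys = [y for _, y in vertices]
    return (max(xs) - min(xs)) * (max(ys) - min(ys))

# A small triangle occupying less than one pixel: 0.6 wide x 0.5 tall.
triangle = [(0.2, 0.2), (0.8, 0.2), (0.5, 0.7)]
print(round(bounding_box_area(triangle), 6))  # 0.3
```

The bounding box overestimates the true area (here the triangle itself covers only 0.15 square pixels), which errs on the side of creating a detail buffer that is large enough.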
Returning to FIG. 21, the allocation creation unit 302 transmits the drawing information and the tag information of the drawing targets that are applicable drawing targets to the calculation unit 104, and the process proceeds to step S45. In the third embodiment, the allocation creation unit 302 transmits to the calculation unit 104 the drawing information, the tag information of the polygon model 51 tagged as an applicable drawing target, the tag information of the polygon model 52 tagged as an applicable drawing target, and the tag information of the polygon model 53 tagged as an applicable drawing target.
In step S45, the calculation unit 104 receives the drawing information and the tag information of the drawing targets that are applicable drawing targets from the allocation creation unit 302. In the third embodiment, the calculation unit 104 receives from the allocation creation unit 302 the drawing information, the tag information of the polygon model 51 tagged as an applicable drawing target, the tag information of the polygon model 52 tagged as an applicable drawing target, and the tag information of the polygon model 53 tagged as an applicable drawing target. Otherwise, step S45 is the same as step S14 of the first embodiment.
In step S46, the determination unit 105 transmits the drawing information, the tag information of the drawing targets that are applicable drawing targets, and the alpha value information of the applicable drawing targets to the applicable drawing target drawing unit 304, and the process proceeds to step S47. Otherwise, step S46 is the same as step S15 of the first embodiment.
In step S47, the applicable drawing target drawing unit 304 receives from the determination unit 105 the drawing information, the tag information of the drawing targets that are applicable drawing targets, and the alpha value information of the applicable drawing targets, and draws each applicable drawing target in the corresponding detail buffer of the detail buffer storage unit 303. At this time, the alpha value information is also stored in each detail buffer. In addition, each time the applicable drawing target drawing unit 304 draws an applicable drawing target in a detail buffer, it acquires the number of pixels drawn in that detail buffer.
In the third embodiment, as in the first embodiment, the applicable drawing target drawing unit 304 receives from the determination unit 105 the drawing information, the tag information of the polygon model 51 tagged as an applicable drawing target, the tag information of the polygon model 52 tagged as an applicable drawing target, the tag information of the polygon model 53 tagged as an applicable drawing target, alpha value information indicating that the alpha value of the polygon model 51 is 0.4, alpha value information indicating that the alpha value of the polygon model 52 is 0.7, and alpha value information indicating that the alpha value of the polygon model 53 is 1.0.
The applicable drawing target drawing unit 304 draws the polygon model 51 in the detail buffer 64 of the detail buffer storage unit 303. The applicable drawing target drawing unit 304 also draws the polygon model 52 in the detail buffer 65 of the detail buffer storage unit 303. Furthermore, the applicable drawing target drawing unit 304 draws the polygon model 53 in the detail buffer 66 of the detail buffer storage unit 303.
FIG. 28 is a diagram in which a drawing result 70 obtained by drawing the applicable drawing target polygon model 51 in the detail buffer 64, a drawing result 71 obtained by drawing another applicable drawing target polygon model 53 in another detail buffer 66, and the plural applicable drawing target polygon models 51 to 53 are superimposed.
As in the first embodiment, as shown in FIG. 28, the polygon model 51 covers the center of one pixel of the detail buffer 64, so that pixel is filled with the RGB values attached to the polygon model 51 included in the drawing information, and the polygon model 51 is drawn. The result of drawing the polygon model 51 is the one-pixel drawing result 70 in FIG. 28. The polygon model 52 does not cover the center of any pixel of the detail buffer 65, so it is not drawn and disappears. Therefore, nothing is drawn in the detail buffer 65. The polygon model 53 covers the center of one pixel of the detail buffer 66, so that pixel is filled with the RGB values attached to the polygon model 53 included in the drawing information, and the polygon model 53 is drawn. The result of drawing the polygon model 53 is the one-pixel drawing result 71 in FIG. 28. The drawing result 70 of the third embodiment is the same as the drawing result 58 in FIG. 12 of the first embodiment. The drawing result 71 of the third embodiment is the same as the drawing result 59 in FIG. 12 of the first embodiment.
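The pixel-center rule described above can be sketched as follows. This is an illustrative rasterization test, not the patent's implementation; all names, the point-in-triangle test, and the sample coordinates are assumptions.

```python
# Illustrative sketch: a pixel of a detail buffer is filled only if its
# center lies inside the polygon; a sliver that misses every pixel
# center is rasterized to zero pixels and "disappears".

def edge(ax, ay, bx, by, px, py):
    """Signed area test: positive if point p lies to the left of edge a->b."""
    return (bx - ax) * (py - ay) - (by - ay) * (px - ax)

def covered_pixels(tri, width, height):
    """Return coordinates of pixels whose centers lie inside triangle tri."""
    (ax, ay), (bx, by), (cx, cy) = tri
    hits = []
    for j in range(height):
        for i in range(width):
            px, py = i + 0.5, j + 0.5  # pixel center
            d0 = edge(ax, ay, bx, by, px, py)
            d1 = edge(bx, by, cx, cy, px, py)
            d2 = edge(cx, cy, ax, ay, px, py)
            # Inside if all edge tests agree in sign (either winding order).
            if (d0 >= 0 and d1 >= 0 and d2 >= 0) or (d0 <= 0 and d1 <= 0 and d2 <= 0):
                hits.append((i, j))
    return hits

# A triangle covering one pixel center is drawn as a single pixel...
print(len(covered_pixels([(0.1, 0.1), (0.9, 0.1), (0.5, 0.9)], 3, 3)))  # 1
# ...while a sliver missing every pixel center yields zero drawn pixels.
print(len(covered_pixels([(0.1, 0.1), (0.4, 0.1), (0.1, 0.4)], 3, 3)))  # 0
```

The length of the returned list plays the role of the pixel number information acquired in step S47.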
FIG. 29 is a diagram showing the drawing result 70 obtained by drawing the applicable drawing target polygon model 51 in the detail buffer 64 and the drawing result 71 obtained by drawing another applicable drawing target polygon model 53 in another detail buffer 66. FIG. 29 omits the superimposed display of the plural polygon models 51 to 53 in FIG. 28 and shows only the actual drawing results 70 and 71. As in FIG. 28, the drawing result 70 of the third embodiment is the same as the drawing result 58 in FIG. 13 of the first embodiment, and the drawing result 71 of the third embodiment is the same as the drawing result 59 in FIG. 13 of the first embodiment.
From the alpha value information of the polygon model 51, the applicable drawing target drawing unit 304 stores in the pixel of the drawing result 70 the alpha value indicating that the alpha value of the polygon model 51 is 0.4. Likewise, from the alpha value information of the polygon model 53, the applicable drawing target drawing unit 304 stores in the pixel of the drawing result 71 the alpha value indicating that the alpha value of the polygon model 53 is 1.0. The value inside each pixel in FIG. 29 represents the stored alpha value.
After performing the process of drawing the polygon model 51 in the detail buffer 64, the applicable drawing target drawing unit 304 acquires the number of pixels of the detail buffer 64 actually filled by the polygon model 51. In the case of the polygon model 51, as can be seen from FIG. 29, the applicable drawing target drawing unit 304 acquires the information that the number of pixels actually drawn in the detail buffer 64 is 1 pixel. After performing the process of drawing the polygon model 52 in the detail buffer 65, the applicable drawing target drawing unit 304 acquires the number of pixels of the detail buffer 65 actually filled by the polygon model 52. In the case of the polygon model 52, as can be seen from FIG. 29, not even one pixel of the detail buffer 65 is filled by the polygon model 52, so the applicable drawing target drawing unit 304 acquires the information that the number of pixels is 0 pixels. After performing the process of drawing the polygon model 53 in the detail buffer 66, the applicable drawing target drawing unit 304 acquires the number of pixels of the detail buffer 66 actually filled by the polygon model 53. In the case of the polygon model 53, as can be seen from FIG. 29, the applicable drawing target drawing unit 304 acquires the information that the number of pixels actually drawn in the detail buffer 66 is 1 pixel. Hereinafter, as in the first embodiment, the information indicating the number of pixels drawn in a detail buffer by an applicable drawing target, acquired by the applicable drawing target drawing unit 304, is referred to as pixel number information.
Returning to FIG. 21, the applicable drawing target drawing unit 304 transmits the drawing information, the tag information of the drawing targets that are applicable drawing targets, the alpha value information, and the pixel number information to the dotting unit 305, and the process proceeds to step S48.
In step S48, the dotting unit 305 receives from the applicable drawing target drawing unit 304 the drawing information, the tag information of the drawing targets that are applicable drawing targets, the alpha value information, and the pixel number information. From the pixel number information, the dotting unit 305 identifies any vanished applicable drawing target, that is, an applicable drawing target that disappeared without being drawn by the applicable drawing target drawing unit 304, dots the pixels in the detail buffer corresponding to the position of the vanished applicable drawing target, and thereby draws the vanished applicable drawing target in the detail buffer. At this time, the alpha value of the alpha value information is also stored in the detail buffer.
In the third embodiment, as in the first embodiment, the dotting unit 305 receives from the applicable drawing target drawing unit 304 the drawing information, the tag information of the polygon model 51 tagged as an applicable drawing target, the tag information of the polygon model 52 tagged as an applicable drawing target, the tag information of the polygon model 53 tagged as an applicable drawing target, alpha value information indicating that the alpha value of the polygon model 51 is 0.4, alpha value information indicating that the alpha value of the polygon model 52 is 0.7, alpha value information indicating that the alpha value of the polygon model 53 is 1.0, pixel number information indicating that the number of pixels drawn in the detail buffer 64 by the polygon model 51 is 1 pixel, pixel number information indicating that the number of pixels drawn in the detail buffer 65 by the polygon model 52 is 0 pixels, and pixel number information indicating that the number of pixels drawn in the detail buffer 66 by the polygon model 53 is 1 pixel.
As in the first embodiment, from the pixel number information, the dotting unit 305 identifies an applicable drawing target whose number of pixels is 0 pixels as a vanished applicable drawing target, that is, an applicable drawing target that disappeared without being drawn by the applicable drawing target drawing unit 304. Specifically, since the pixel number information of the polygon model 52 indicates 0 pixels, the dotting unit 305 identifies the polygon model 52 as a vanished applicable drawing target.
The dotting unit 305 dots the pixels in the detail buffer 65 corresponding to the position of the polygon model 52, which is the vanished applicable drawing target, and thereby draws the vanished applicable drawing target in the detail buffer. Specifically, the dotting unit 305 dots the pixels in the detail buffer 65 corresponding to the positions of the vertices of the polygon model 52, and thereby draws the polygon model 52, which is the vanished applicable drawing target, in the detail buffer 65. Note that in the case of a drawing target that has no vertices, such as a circle or an ellipse, the dotting unit 305 approximates it with a polygon and dots the positions of the polygon's vertices.
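The dotting step can be sketched as follows. The buffer layout (a dictionary keyed by pixel coordinates), all names, and the sample coordinates and values are illustrative assumptions, not the patent's implementation.

```python
# Illustrative sketch: when an applicable drawing target was rasterized
# to 0 pixels, force one dot per vertex into the detail buffer so the
# target does not vanish from the display.

def dot_vertices(buffer, vertices, rgb, alpha):
    """buffer: dict mapping (i, j) -> (rgb, alpha). Dot the pixel under each vertex."""
    for x, y in vertices:
        i, j = int(x), int(y)  # pixel containing the vertex
        buffer[(i, j)] = (rgb, alpha)
    return buffer

# A triangle (three vertices) with a hypothetical alpha value of 0.7,
# dotted into an initially empty detail buffer.
detail_buffer = {}
dot_vertices(detail_buffer, [(0.2, 0.2), (2.6, 0.3), (1.4, 1.7)],
             rgb=(0, 128, 255), alpha=0.7)
print(sorted(detail_buffer))  # [(0, 0), (1, 1), (2, 0)]
```

Each dotted pixel stores both the target's RGB values and its alpha value, matching the description that the alpha value information is stored in the detail buffer together with the drawn pixels.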
FIG. 30 is a diagram in which a drawing result 70 obtained by drawing the applicable drawing target polygon model 51 in the detail buffer 64, a drawing result 71 obtained by drawing another applicable drawing target polygon model 53 in the detail buffer 66, a drawing result 72 obtained by drawing yet another applicable drawing target polygon model 52 in the detail buffer 65, and the plural polygon models 51 to 53 are superimposed. As in the first embodiment, the polygon model 52 has three vertices, so the dotting unit 305 dots the pixels in the detail buffer 65 corresponding to the positions of the three vertices of the polygon model 52. As a result, the 3 pixels of the detail buffer 65 corresponding to the positions of the three vertices of the polygon model 52 are filled with the RGB values attached to the polygon model 52 included in the drawing information, and the polygon model 52 is drawn. The result of drawing the polygon model 52 is the three-pixel drawing result 72 in FIG. 30.
FIG. 31 is a diagram showing the drawing result 70 obtained by drawing the applicable drawing target polygon model 51 in the detail buffer 64, the drawing result 71 obtained by drawing another applicable drawing target polygon model 53 in the detail buffer 66, and the drawing result 72 obtained by drawing yet another applicable drawing target polygon model 52 in the detail buffer 65. FIG. 31 omits the superimposed display of the plural polygon models 51 to 53 in FIG. 30 and shows only the actual drawing results 70 to 72.
From the alpha value information of the polygon model 52, the dotting unit 305 stores in each pixel constituting the drawing result 72 the alpha value indicating that the alpha value of the polygon model 52 is 0.7. The value inside each pixel in FIG. 31 represents the stored alpha value.
Returning to FIG. 21, the dotting unit 305 transmits a notification of completion of drawing to the detail buffers to the blend processing unit 306, and the process proceeds to step S49. If there is no vanished applicable drawing target, the applicable drawing targets have already been drawn in the detail buffers, so the dotting unit 305 does nothing further, transmits the notification of completion of drawing to the detail buffers to the blend processing unit 306, and the process proceeds to step S49.
In step S49, when the blend processing unit 306 receives the notification of completion of drawing to the drawing destination buffer from the non-applicable drawing target drawing unit 107 and the notification of completion of drawing to the detail buffers from the dotting unit 305, it uses the position information of each detail buffer stored in the detail buffer storage unit 303 to alpha-blend, pixel by pixel, the RGB values of each detail buffer stored in the detail buffer storage unit 303 with that pixel's alpha value into the drawing destination buffer stored in the drawing destination buffer storage unit 103, thereby drawing the applicable drawing targets in the drawing destination buffer.
In the third embodiment, the blend processing unit 306 uses the position information 67 stored in the detail buffer 64 to alpha-blend, pixel by pixel, the RGB values of the detail buffer 64 in FIG. 31 with their alpha values into the drawing destination buffer 55 in FIG. 8, thereby drawing the applicable drawing target polygon model 51 in the drawing destination buffer 55. Specifically, for the one pixel of the drawing result 70 of the polygon model 51 in FIG. 31, the blend processing unit 306 alpha-blends the RGB values of the polygon model 51 with the alpha value of 0.4 into the drawing destination buffer 55 in FIG. 8, stored in the drawing destination buffer storage unit 103, at the position corresponding to the position information 67.
Similarly, the blend processing unit 306 uses the position information 68 stored in the detail buffer 65 to alpha-blend, pixel by pixel, the RGB values of the detail buffer 65 in FIG. 31 with their alpha values into the drawing destination buffer 55 in FIG. 8, thereby drawing the applicable drawing target polygon model 52 in the drawing destination buffer 55. Specifically, for each pixel of the drawing result 72 of the polygon model 52 in FIG. 31, the blend processing unit 306 alpha-blends the RGB values of the polygon model 52 with each pixel's alpha value of 0.7 into the drawing destination buffer 55 in FIG. 8, stored in the drawing destination buffer storage unit 103, at the position corresponding to the position information 68.
The blend processing unit 306 uses the position information 69 stored in the detail buffer 66 to alpha-blend, pixel by pixel, the RGB values of the detail buffer 66 in FIG. 31 with their alpha values into the drawing destination buffer 55 in FIG. 8, thereby drawing the applicable drawing target polygon model 53 in the drawing destination buffer 55. Specifically, for the one pixel of the drawing result 71 of the polygon model 53 in FIG. 31, the blend processing unit 306 alpha-blends the RGB values of the polygon model 53 with the alpha value of 1.0 into the drawing destination buffer 55 in FIG. 8, stored in the drawing destination buffer storage unit 103, at the position corresponding to the position information 69.
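The per-pixel blend in step S49 can be sketched as follows, assuming the standard source-over alpha blend formula; the function name and sample values are illustrative, not taken from the patent.

```python
# Illustrative sketch: blend a detail-buffer pixel (source) over a
# drawing-destination-buffer pixel (destination) with the stored alpha.

def alpha_blend(dst_rgb, src_rgb, alpha):
    """Source-over blend: out = alpha * src + (1 - alpha) * dst, per channel."""
    return tuple(alpha * s + (1 - alpha) * d for s, d in zip(src_rgb, dst_rgb))

# A pixel with alpha 0.4 (like polygon model 51) blended over a white
# background keeps 60% of the background color in each channel.
print(alpha_blend((255, 255, 255), (0, 0, 255), 0.4))

# A pixel with alpha 1.0 (like polygon model 53) fully replaces the
# destination pixel.
print(alpha_blend((255, 255, 255), (255, 0, 0), 1.0))  # (255.0, 0.0, 0.0)
```

Because the destination color is weighted by (1 - alpha), the background is attenuated rather than erased, which is why a small target can be drawn without making the background color disappear.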
A diagram showing the result of drawing all the polygon models in the drawing destination buffer 55 would be the same as FIG. 16.
The blend processing unit 306 transmits a notification of completion of alpha blending to the output control unit 111, and the process proceeds to step S50.
In step S50, the output control unit 111 receives the notification of completion of alpha blending from the blend processing unit 306. Otherwise, step S50 is the same as step S19 of the first embodiment.
After step S50 is executed, the process returns to step S41, and the above processing is repeated until there is a trigger to end the processing, such as the power being turned off or an end operation being performed. Although the above processing is described as being repeated, it may instead be performed only once.
As described above, in the drawing processing system 3 of the third embodiment, the non-applicable drawing target drawing unit 107 draws a drawing target determined by the application determination unit 106 to be a non-applicable drawing target in the drawing destination buffer stored in the drawing destination buffer storage unit 103; the calculation unit 104 calculates the area of a drawing target determined by the application determination unit 106 to be an applicable drawing target; the determination unit 105 determines the alpha value of the applicable drawing target from the calculated area; the applicable drawing target drawing unit 304 draws the applicable drawing target in a detail buffer, which is a buffer different from the drawing destination buffer; the dotting unit 305 draws the applicable drawing target in the detail buffer by dotting the pixels in the detail buffer corresponding to the position of the applicable drawing target; the blend processing unit 306 alpha-blends the detail buffer into the drawing destination buffer with the alpha value of the applicable drawing target; and the output control unit 111 controls the output device 5 to output the scene drawn in the drawing destination buffer. Therefore, even when a drawing target does not cover the center of a pixel, the drawing target is drawn and can be displayed on the display screen.
Furthermore, in the third embodiment, in the drawing processing system 3, the allocation creation unit 302 creates, in the detail buffer storage unit 303, a detail buffer corresponding to the range of each applicable drawing target, and the applicable drawing target drawing unit 304 and the dotting unit 305 draw each applicable drawing target in the detail buffer corresponding to that applicable drawing target's range. With this drawing, when the proportion of the area not used for drawing the applicable drawing targets is high, the drawing processing system 3 only needs to create detail buffers corresponding to the ranges of the applicable drawing targets, so there is no need to create buffers wastefully, and memory usage can be reduced.
In the first to third embodiments, consider the case where drawing targets with large differences in drawing size coexist in the same scene. A conventional drawing processing device needs to draw the entire scene; for example, to draw a small drawing target of less than one pixel so that it does not vanish from the display screen, it must draw the drawing target at a correspondingly higher magnification or with a larger number of pixel divisions. In other words, in a conventional drawing processing device, when drawing targets with large differences in drawing size coexist in a scene, if the entire scene is drawn at the magnification or number of divisions corresponding to the smallest drawing target, the other drawing targets are also drawn at high resolution or with a large number of divisions, so the drawing processing load becomes large. The drawing processing systems 1 to 3, however, do not need to draw at high resolution, so they can reduce the drawing processing load incurred by drawing so that drawing targets do not vanish from the display screen.
In the first to third embodiments, in the drawing processing systems 1 to 3, the calculation unit calculates the area of an applicable drawing target and the determination unit calculates the alpha value from that area, so the luminance value of a drawing target can be determined according to its size and set to an appropriate luminance. In addition, since the drawing processing systems 1 to 3 perform alpha blending, the background color does not disappear.
 In Embodiments 1 to 3, the applicable drawing target drawing unit of the drawing processing systems 1 to 3 draws each applicable drawing target into the detail buffer and acquires the number of pixels drawn; from the pixel count acquired for each applicable drawing target, the dot plotting unit identifies the vanished applicable drawing targets that were not drawn in the detail buffer, plots a point at the pixel in the detail buffer corresponding to each vanished target's position, and thereby draws the vanished applicable drawing targets into the detail buffer. Alternatively, the drawing processing systems 1 to 3 may omit the applicable drawing target drawing unit and have the dot plotting unit plot points for all applicable drawing targets at the pixels in the detail buffer corresponding to their positions, thereby drawing every applicable drawing target into the detail buffer. However, if the applicable drawing target drawing unit is omitted and the dot plotting unit draws all applicable drawing targets into the detail buffer, the number of points to plot increases when an applicable drawing target has a complex shape. The drawing-processing load is therefore lower when the applicable drawing target drawing unit is provided and the dot plotting unit plots points only for the vanished applicable drawing targets. In addition, the drawing processing systems 1 to 3 can draw more accurately when applicable drawing targets having a certain amount of area are drawn by the applicable drawing target drawing unit.
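The mechanism just described — rasterize each applicable drawing target into the detail buffer, count the pixels it actually filled, and plot a fallback point only for targets whose count is zero — can be sketched as follows. This is a minimal illustration under assumptions of my own (axis-aligned rectangles as targets, pixel-center sampling, hypothetical helper names), not the publication's implementation.

```python
def draw_target(buffer, rect, value=1):
    """Rasterize an axis-aligned rect (x0, y0, x1, y1) into a 2D list
    buffer by pixel-center sampling; return the number of pixels filled.
    A sub-pixel target that covers no pixel center fills zero pixels
    and therefore "vanishes" from the buffer."""
    filled = 0
    for y, row in enumerate(buffer):
        for x in range(len(row)):
            cx, cy = x + 0.5, y + 0.5  # pixel-center sample point
            if rect[0] <= cx < rect[2] and rect[1] <= cy < rect[3]:
                row[x] = value
                filled += 1
    return filled


def draw_with_dot_fallback(buffer, targets):
    """First pass: rasterize every applicable target and record its pixel
    count.  Second pass: for each vanished target (count == 0), plot one
    point at the pixel containing its first corner, so no target is lost
    from the detail buffer."""
    counts = [draw_target(buffer, t) for t in targets]
    for t, n in zip(targets, counts):
        if n == 0:
            x, y = int(t[0]), int(t[1])  # pixel of the rect's corner
            if 0 <= y < len(buffer) and 0 <= x < len(buffer[0]):
                buffer[y][x] = 1
    return counts
```

In this sketch a 0.2-pixel-wide rectangle straddling no pixel center reports a count of zero and still receives one plotted point, matching the fallback behavior described above.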
 In Embodiments 1 to 3, the dot plotting unit of the drawing processing systems 1 to 3 plots a point at the pixel in the detail buffer corresponding to the position of a vertex of the vanished applicable drawing target, but it may instead plot a point at the pixel in the detail buffer corresponding to the position of the centroid of the vanished applicable drawing target. Plotting at the centroid, however, requires the centroid to be calculated. The drawing-processing load is therefore lower when the dot plotting unit plots at the pixel corresponding to a vertex of the vanished applicable drawing target than when it plots at the pixel corresponding to the centroid.
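The cost trade-off above can be made concrete: plotting at a vertex simply reuses a coordinate the model already stores, while plotting at the centroid requires an extra averaging pass over every vertex. A sketch with hypothetical helper names (2D vertices assumed for brevity):

```python
def vertex_dot_position(vertices):
    """Cheapest choice: reuse the first stored vertex directly,
    with no additional arithmetic."""
    return vertices[0]


def centroid_dot_position(vertices):
    """Requires one extra pass over all vertices to average the
    coordinates before a point can be plotted."""
    n = len(vertices)
    return (sum(v[0] for v in vertices) / n,
            sum(v[1] for v in vertices) / n)
```

For a triangle with vertices (0, 0), (3, 0), (0, 3), the vertex variant returns (0, 0) immediately, while the centroid variant must compute (1, 1).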
 In Embodiments 1 to 3, the drawing processing systems 1 to 3 drew the non-applicable drawing targets first and the applicable drawing targets afterward, but the two may instead be drawn simultaneously in parallel, in the reverse order, or in random order. When there are multiple non-applicable drawing targets, the drawing processing systems 1 to 3 may draw them sequentially or simultaneously in parallel. Similarly, when there are multiple applicable drawing targets, the drawing processing systems 1 to 3 may draw them sequentially or simultaneously in parallel.
 In Embodiments 1 to 3, the drawing processing systems 1 to 3 include a detail buffer storage unit, but the storage may be divided into a plurality of storage units. Furthermore, although the drawing destination buffer storage unit and the detail buffer storage unit were provided in the drawing processing devices 100 to 300, these storage units may instead be provided in a device different from the drawing processing devices 100 to 300, such as in the cloud; in that case, the drawing processing devices 100 to 300 exchange information with the storage units by communication.
 In Embodiment 1, the creation unit 102 of the drawing processing system 1 created the drawing destination buffer and the detail buffer in step S12, but both buffers may instead be created in advance. Alternatively, after the creation unit 102 creates the drawing destination buffer in step S12, it may create the detail buffer at any point between steps S13 and S16. The drawing destination buffer may be created at any time before anything is drawn into it, and the detail buffer may likewise be created at any time before anything is drawn into it. The same applies in Embodiments 2 and 3: the drawing destination buffer may be created at any time before drawing into it, and the detail buffer at any time before drawing into it.
 In Embodiments 1 to 3, the drawing processing systems 1 to 3 used the bounding box 80 when calculating the approximate area of each drawing target, but instead of a box, any shape may be used that has fewer vertices than the drawing target and whose area is easier to calculate than that of the drawing target itself, such as a sphere, a rugby-ball (ellipsoid) shape, a triangular prism, or a triangular pyramid; combinations of these shapes may also be used. A bounding box or similar shape can be used to approximate the area not only of a three-dimensional drawing target but also of a two-dimensional one. In the two-dimensional case, the sphere is replaced by a circle, the rugby-ball shape by an ellipse, and the triangular prism and triangular pyramid by triangles or other easily approximated shapes.
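The approximation described above — estimating a drawing target's area from an enclosing primitive with fewer vertices rather than from the target's own geometry — can be sketched for the two-dimensional case as follows. The formulas and function names are illustrative assumptions; the publication does not prescribe them.

```python
import math


def bounding_box_area(points):
    """Area of the axis-aligned bounding box of a 2D point set.
    Only a min/max per axis is needed, regardless of how many
    vertices the drawing target itself has."""
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    return (max(xs) - min(xs)) * (max(ys) - min(ys))


def bounding_circle_area(points):
    """Coarser stand-in for the 'sphere -> circle' case: area of a
    circle centered at the bounding-box center that encloses all
    points (not the minimal enclosing circle)."""
    cx = (max(p[0] for p in points) + min(p[0] for p in points)) / 2
    cy = (max(p[1] for p in points) + min(p[1] for p in points)) / 2
    r = max(math.hypot(p[0] - cx, p[1] - cy) for p in points)
    return math.pi * r * r
```

Either estimate deliberately over-approximates the true area; the point is that it is cheap to compute for targets with many vertices.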
 In Embodiments 1 to 3, the drawing processing systems 1 to 3 were applied, for example, to the case where the drawing target is a polygon model, which is one kind of surface model in three-dimensional graphics, but they are equally applicable to surface models other than polygon models, solid models, point models, and the like. The drawing processing systems 1 to 3 can also be applied when the drawing target is two-dimensional graphics rather than three-dimensional graphics.
 In Embodiments 1 to 3, the drawing processing systems 1 to 3 were applied, for example, to displaying drawings in RGB color, where the drawing target is represented by RGB values, but they can also be applied to displaying drawings represented in grayscale or HSB color.
 The drawing processing device, drawing processing system, drawing processing method, and drawing processing program described in the above embodiments are merely examples; they may be combined with other devices as appropriate and are not limited to the configurations of the embodiments alone.
1 to 3 drawing processing system, 4, 6 input device,
5 output device, 41 bus, 42 CPU,
43 main memory, 44 GPU,
45 graphics memory, 46 input interface,
47 output interface, 51 to 54 polygon model,
55 drawing destination buffer,
56, 64 to 66, 70 to 72 detail buffer,
57 to 63 drawing result, 67 to 69 position information,
80 bounding box,
81 drawing target, 82 area,
100 to 300 drawing processing device, 101, 201 acquisition unit,
102, 301 creation unit, 103 drawing destination buffer storage unit,
104, 204 calculation unit, 105, 205 determination unit,
106, 202 application determination unit,
107, 203 non-applicable drawing target drawing unit,
108, 209, 304 applicable drawing target drawing unit,
109, 303 detail buffer storage unit,
110, 207, 306 blend processing unit,
111 output control unit, 112, 210, 305 dot plotting unit,
206 end determination unit, 208 accumulation storage unit,
302 allocation creation unit.

Claims (14)

  1.  A drawing processing device comprising:
     a dot plotting unit that plots points at pixels in a buffer corresponding to the position of a drawing target of two or more dimensions, thereby drawing the drawing target into the buffer; and
     an output control unit that performs control so that a scene drawn in the buffer is output to an output device.
  2.  The drawing processing device according to claim 1, further comprising:
     an application determination unit that determines whether the drawing target is a first drawing target to be drawn in a first buffer serving as a drawing destination or a second drawing target to be drawn in a second buffer that is different from the first buffer;
     a first drawing target drawing unit that draws, into the first buffer, the drawing target determined by the application determination unit to be the first drawing target; and
     a blend processing unit that alpha-blends the second buffer into the first buffer using an alpha value representing the transparency of the drawing target, wherein
     the buffer is the second buffer,
     the dot plotting unit plots points at pixels in the second buffer corresponding to the position of the second drawing target, thereby drawing the second drawing target into the second buffer, and
     the output control unit performs control so that the alpha-blended scene drawn in the first buffer is output to the output device.
  3.  The drawing processing device according to claim 2, further comprising:
     a calculation unit that calculates the area of the drawing target determined by the application determination unit to be the second drawing target; and
     a determination unit that determines, from the area calculated by the calculation unit, an alpha value representing the transparency of the second drawing target.
  4.  The drawing processing device according to claim 2 or 3, further comprising a second drawing target drawing unit that draws the second drawing target into the second buffer and acquires the number of pixels filled when the second drawing target is drawn into the second buffer, wherein
     the dot plotting unit plots a point at the position in the second buffer corresponding to a second drawing target whose number of filled pixels is zero, thereby drawing that second drawing target into the second buffer.
  5.  The drawing processing device according to any one of claims 2 to 4, wherein the dot plotting unit plots a point at the pixel in the second buffer corresponding to the position of a vertex of the second drawing target or the position of the centroid of the second drawing target.
  6.  The drawing processing device according to claim 3, wherein the calculation unit calculates, as the area of the second drawing target, an area approximated using a shape that approximates the second drawing target and has fewer vertices than the second drawing target.
  7.  The drawing processing device according to any one of claims 2 to 6, further comprising an allocation creation unit that creates the second buffer for each second drawing target according to the area of each second drawing target.
  8.  The drawing processing device according to claim 7, wherein the allocation creation unit calculates the size of the second buffer for each second drawing target relative to the first buffer, taking as the area of the second drawing target an area approximated using a shape that approximates the second drawing target and has fewer vertices than the second drawing target.
  9.  The drawing processing device according to any one of claims 2 to 8, further comprising an acquisition unit that acquires tag information indicating whether the drawing target is the first drawing target or the second drawing target, wherein
     the application determination unit determines, using the tag information, whether the drawing target is the first drawing target or the second drawing target.
  10.  The drawing processing device according to any one of claims 2 to 8, wherein the application determination unit determines, using the area of the drawing target, whether the drawing target is the first drawing target or the second drawing target.
  11.  The drawing processing device according to claim 10, wherein the application determination unit varies the threshold used to determine whether the drawing target is the first drawing target or the second drawing target, using at least one of the luminance of the drawing target and an importance assigned to the drawing target in advance.
  12.  A drawing processing system comprising:
     an input device that receives input of a drawing target of two or more dimensions;
     a drawing processing device that performs drawing processing to draw the drawing target into a buffer as a scene; and
     an output device that outputs the scene, wherein
     the drawing processing device comprises:
     a dot plotting unit that plots points at pixels in the buffer corresponding to the position of the drawing target, thereby drawing the drawing target into the buffer; and
     an output control unit that performs control so that the scene drawn in the buffer is output to the output device.
  13.  A drawing processing method comprising:
     plotting points at pixels in a buffer corresponding to the position of a drawing target of two or more dimensions, thereby drawing the drawing target into the buffer; and
     performing control so that a scene drawn in the buffer is output to an output device.
  14.  A drawing processing program that causes a computer to execute:
     a process of plotting points at pixels in a buffer corresponding to the position of a drawing target of two or more dimensions, thereby drawing the drawing target into the buffer; and
     a process of performing control so that a scene drawn in the buffer is output to an output device.
PCT/JP2021/012491 2021-03-25 2021-03-25 Drawing processing device, drawing processing system, drawing processing method, and drawing processing program WO2022201413A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/JP2021/012491 WO2022201413A1 (en) 2021-03-25 2021-03-25 Drawing processing device, drawing processing system, drawing processing method, and drawing processing program
JP2023508301A JP7471512B2 (en) 2021-03-25 2021-03-25 Drawing processing device, drawing processing system, drawing processing method, and drawing processing program

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2021/012491 WO2022201413A1 (en) 2021-03-25 2021-03-25 Drawing processing device, drawing processing system, drawing processing method, and drawing processing program

Publications (1)

Publication Number Publication Date
WO2022201413A1 true WO2022201413A1 (en) 2022-09-29

Family

ID=83395432

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2021/012491 WO2022201413A1 (en) 2021-03-25 2021-03-25 Drawing processing device, drawing processing system, drawing processing method, and drawing processing program

Country Status (2)

Country Link
JP (1) JP7471512B2 (en)
WO (1) WO2022201413A1 (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2005346605A (en) * 2004-06-07 2005-12-15 Matsushita Electric Ind Co Ltd Antialias drawing method and drawing apparatus using the same
JP2010146255A (en) * 2008-12-18 2010-07-01 Mitsubishi Electric Corp Vector graphics drawing device
WO2014087541A1 (en) * 2012-12-07 2014-06-12 三菱電機株式会社 Graphics rendering device

Also Published As

Publication number Publication date
JP7471512B2 (en) 2024-04-19
JPWO2022201413A1 (en) 2022-09-29

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21933023

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2023508301

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21933023

Country of ref document: EP

Kind code of ref document: A1