WO2015186402A1 - Image processing device, method, and program - Google Patents

Image processing device, method, and program

Info

Publication number
WO2015186402A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
display
instruction
user
unit
Prior art date
Application number
PCT/JP2015/057572
Other languages
English (en)
Japanese (ja)
Inventor
雄生 宮前
雅文 野田
匠 吉田
啓一郎 石原
玄人 森田
裕 横川
Nicolas Doucet (ニコラ ドゥセ)
Original Assignee
Sony Computer Entertainment Inc. (株式会社ソニー・コンピュータエンタテインメント)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from JP2014117815A
Priority claimed from JP2014117816A
Application filed by Sony Computer Entertainment Inc.
Publication of WO2015186402A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00 Manipulating 3D models or images for computer graphics
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/60 Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor
    • A63F13/65 Generating or modifying game content automatically by game devices or servers from real world data, e.g. measurement in live racing competition
    • A63F13/655 Generating or modifying game content automatically by importing photos, e.g. of the player
    • A63F13/85 Providing additional services to players
    • A63F13/86 Watching games played by other players
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00 Details of television systems
    • H04N5/222 Studio circuitry; Studio devices; Studio equipment
    • H04N5/262 Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects; Cameras specially adapted for the electronic generation of special effects

Definitions

  • the present invention relates to an image processing technique, and more particularly to an image processing apparatus, an image processing method, and an image processing program for processing a captured image.
  • a game using a live-action image captured by an imaging device is provided.
  • the user can enjoy the game play while viewing the image of the surrounding area of the user.
  • the present invention has been made in view of such circumstances, and an object thereof is to provide a game with higher entertainment.
  • An image processing apparatus according to one aspect of the present invention includes: an acquisition unit that acquires a captured image captured by an imaging apparatus; a generation unit that generates a display image by superimposing a mirror image of the acquired captured image and an image of an object or a character, and displays the generated display image on a display device; a transmission unit that distributes the display image to another device so that another user can view it; and a mirror image control unit that, upon receiving from the user an instruction to horizontally flip the captured image in the display image, causes the generation unit to generate a display image in which the captured image itself, instead of its mirror image, is superimposed with the image of the object or character.
  • According to the present invention, a game with higher entertainment value can be provided.
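  • As an illustration only (not part of the disclosure), the core of this pipeline can be sketched in Python, assuming OpenCV and NumPy for image handling; the function and variable names are hypothetical:

      import cv2
      import numpy as np

      mirrored = True  # by default the user sees a mirror image of the camera

      def compose_display_image(frame, character, pos):
          """Superimpose a character sprite on the (possibly mirrored) frame."""
          view = cv2.flip(frame, 1) if mirrored else frame.copy()  # 1 = horizontal flip
          x, y = pos
          h, w = character.shape[:2]
          view[y:y + h, x:x + w] = character  # naive paste; a real game would alpha-blend
          return view

      def on_flip_instruction():
          """Invoked when the user asks to undo the mirroring, e.g. so that
          written text held up to the camera is readable to viewers."""
          global mirrored
          mirrored = not mirrored

  The same composed image would then be output to the display device and uploaded to the distribution server.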
  • FIG. 2A is a diagram showing an external configuration of the upper surface of the input device
  • FIG. 2B is a diagram showing an external configuration of the back side surface of the input device.
  • FIG. 1 shows a configuration of a game system 1 according to an embodiment of the present invention.
  • The game system 1 includes a game device 10, input devices 6a and 6b operated by users of the game device 10, and portable terminal devices 9a and 9b operated by users of the game device 10.
  • the game apparatus 10 is communicably connected to the distribution server 5 connected to the external network 3 and the mobile terminal apparatuses 9c and 9d of other users via an access point (AP) 8.
  • a plurality of users can log in to the game apparatus 10 simultaneously.
  • The auxiliary storage device 2 is a large-capacity storage device such as an HDD (hard disk drive) or a flash memory, and may be an external storage device connected to the game device 10 via USB (Universal Serial Bus) or the like, or a built-in storage device.
  • the output device 4 may be a television having a display for outputting an image and a speaker for outputting sound.
  • the game device 10 is connected to the input device 6 operated by the user wirelessly or by wire, and the input device 6 outputs operation information indicating the operation result of the user to the game device 10.
  • The game apparatus 10 is further connected to the mobile terminal device 9 operated by the user, wirelessly or by wire, and the mobile terminal device 9 transmits to the game apparatus 10 operation information indicating the result of the user's operation, data used by a game program executed on the game apparatus 10, and the like.
  • When the game apparatus 10 receives operation information from the input device 6 or the mobile terminal device 9, it reflects the information in the processing of the OS (system software) or a game program, and causes the output device 4 to output the processing result.
  • the camera 7 as an imaging device is provided in the vicinity of the output device 4 and images the space around the output device 4.
  • Although FIG. 1 shows an example in which the camera 7 is attached to the top of the output device 4, the camera 7 may instead be arranged at the side of the output device 4; in either case, it is arranged at a position where it can image the user playing the game in front of the output device 4.
  • the camera 7 may be a stereo camera or a distance measuring camera.
  • the input device 6 may be a game controller that provides user operation information to the game device 10.
  • the game controller includes a plurality of input units such as a plurality of push-type operation buttons, an analog stick that can input an analog amount, and a rotary button.
  • the portable terminal device 9 may be a portable information terminal such as a smartphone or a tablet, or may be a portable game device.
  • the mobile terminal device 9 executes an application for transmitting user operation information, game data, and the like to a game program executed on the game device 10.
  • the game apparatus 10 provides a function that uses a live-action image around the user captured by the camera 7 and distributes the screen of the game that the user is playing to other users.
  • When the game device 10 receives an instruction to distribute the game screen from user A or B while users A and B are executing a game, it sends the game screen data to the distribution server 5 via the network 3.
  • Users C and D can access the distribution server 5 via the network 3 using the mobile terminal device 9, the game device 10, a personal computer, and the like, and view the moving image of the distributed game.
  • The moving image of the game may be streamed from the distribution server, or a moving image stored in the distribution server may be downloaded and viewed.
  • FIG. 2A shows an external configuration of the upper surface of the input device.
  • the user operates the input device 6 by holding the left grip 78b with the left hand and holding the right grip 78a with the right hand.
  • the four types of buttons 72 to 75 are marked with different figures in different colors in order to distinguish them from each other.
  • The ○ button 72 is marked with a red circle, the × button 73 with a blue cross, the □ button 74 with a purple square, and the △ button 75 with a green triangle.
  • a touch pad 79 is provided in a flat area between the direction key 71 and the operation button 76.
  • the touch pad 79 also functions as a push-down button that sinks downward when pressed by the user and returns to the original position when the user releases the hand.
  • a function button 80 is provided between the two analog sticks 77a and 77b.
  • the function button 80 is used to turn on the power of the input device 6 and simultaneously activate a communication function for connecting the input device 6 and the game device 10. After the input device 6 is connected to the game device 10, the function button 80 is also used for causing the game device 10 to display a menu screen.
  • the SHARE button 81 is provided between the touch pad 79 and the direction key 71.
  • the SHARE button 81 is used to input an instruction from the user to the OS or system software in the game apparatus 10.
  • the OPTIONS button 82 is provided between the touch pad 79 and the operation button 76.
  • the OPTIONS button 82 is used to input an instruction from the user to an application (game) executed on the game apparatus 10. Both the SHARE button 81 and the OPTIONS button 82 may be formed as push buttons.
  • FIG. 2B shows an external configuration of the back side surface of the input device.
  • The touch pad 79 extends from the upper surface of the casing at the top of the rear side of the input device 6, and a horizontally long light emitting unit 85 is provided at the bottom of the rear side.
  • the light emitting unit 85 has red (R), green (G), and blue (B) LEDs, and lights up according to the light emission color information transmitted from the game apparatus 10.
  • the upper button 83a, the lower button 84a, the upper button 83b, and the lower button 84b are provided at symmetrical positions in the longitudinal direction.
  • the upper button 83a and the lower button 84a are operated by the index finger and middle finger of the user's right hand, respectively, and the upper button 83b and the lower button 84b are operated by the index finger and the middle finger of the user's left hand, respectively.
  • The light emitting unit 85 is provided between the right-side arrangement of the upper button 83a and lower button 84a and the left-side arrangement of the upper button 83b and lower button 84b, so that it is not hidden by the index fingers operating those buttons.
  • the camera 7 can suitably image the light-emitting unit 85 that is lit.
  • The upper button 83 may be configured as a push-type button, and the lower button 84 as a trigger-type button that is rotatably supported.
  • FIG. 3 shows a functional block diagram of the game apparatus 10.
  • The game apparatus 10 includes a main power button 20, a power ON LED 21, a standby LED 22, a system controller 24, a clock 26, a device controller 30, a media drive 32, a USB module 34, a flash memory 36, a wireless communication module 38, a wired communication module 40, a subsystem 50, and a main system 60.
  • the main system 60 includes a main CPU (Central Processing Unit), a memory and a memory controller that are main storage devices, a GPU (Graphics Processing Unit), and the like.
  • the GPU is mainly used for arithmetic processing of game programs. These functions may be configured as a system on chip and formed on a single chip.
  • the main CPU has a function of executing a game program recorded in the auxiliary storage device 2.
  • the subsystem 50 includes a sub CPU, a memory that is a main storage device, a memory controller, and the like, does not include a GPU, and does not have a function of executing a game program.
  • the number of circuit gates of the sub CPU is smaller than the number of circuit gates of the main CPU, and the operation power consumption of the sub CPU is smaller than the operation power consumption of the main CPU.
  • the sub CPU operates even while the main CPU is in the standby state, and its processing function is limited in order to keep power consumption low.
  • the main power button 20 is an input unit through which an operation input from the user is performed.
  • The main power button 20 is provided on the front surface of the housing of the game apparatus 10 and is operated to turn on or off the power supply to the main system 60 of the game apparatus 10.
  • the power ON LED 21 is lit when the main power button 20 is turned on, and the standby LED 22 is lit when the main power button 20 is turned off.
  • The system controller 24 detects pressing of the main power button 20 by the user. When the main power button 20 is pressed while the main power is off, the system controller 24 acquires the pressing operation as an “on instruction”; when it is pressed while the main power is on, the system controller 24 acquires the pressing operation as an “off instruction”.
  • the clock 26 is a real-time clock, generates current date and time information, and supplies it to the system controller 24, the subsystem 50, and the main system 60.
  • the device controller 30 is configured as an LSI (Large-Scale Integrated Circuit) that exchanges information between devices like a South Bridge. As illustrated, devices such as a system controller 24, a media drive 32, a USB module 34, a flash memory 36, a wireless communication module 38, a wired communication module 40, a subsystem 50, and a main system 60 are connected to the device controller 30.
  • the device controller 30 absorbs the difference in electrical characteristics of each device and the difference in data transfer speed, and controls the timing of data transfer.
  • The media drive 32 is a drive device that loads and drives the ROM medium 44, which records application software such as games and license information, and reads programs and data from the ROM medium 44.
  • the ROM medium 44 is a read-only recording medium such as an optical disc, a magneto-optical disc, or a Blu-ray disc.
  • the USB module 34 is a module that is connected to an external device with a USB cable.
  • the USB module 34 may be connected to the auxiliary storage device 2 and the camera 7 with a USB cable.
  • the flash memory 36 is an auxiliary storage device that constitutes an internal storage.
  • the wireless communication module 38 wirelessly communicates with, for example, the input device 6 using a communication protocol such as a Bluetooth (registered trademark) protocol or an IEEE802.11 protocol.
  • The wireless communication module 38 may also support the third-generation digital mobile phone system conforming to the IMT-2000 (International Mobile Telecommunication 2000) standard defined by the ITU (International Telecommunication Union), and may further support digital mobile phone systems of other generations.
  • the wired communication module 40 performs wired communication with an external device, and connects to the network 3 via, for example, the AP 8.
  • FIG. 4 shows the configuration of the game apparatus 10.
  • the game apparatus 10 includes a communication unit 102, a control unit 110, and a data holding unit 160.
  • In terms of hardware, each element described herein as a functional block that performs various processes can be implemented by circuit blocks, memories, and other LSIs; in terms of software, it is realized by programs loaded into memory. Therefore, those skilled in the art will understand that these functional blocks can be realized in various forms by hardware only, software only, or a combination thereof, and the invention is not limited to any one of them.
  • the communication unit 102 has the functions of the wireless communication module 38 and the wired communication module 40 illustrated in FIG. 3 and controls communication via the network 3.
  • the data holding unit 160 is provided in the auxiliary storage device 2 and stores data used by the game program.
  • The control unit 110 includes an image generation unit 111, a sound generation unit 112, a distribution control unit 113, a mask control unit 114, a face detection unit 115, a balloon control unit 116, a controller detection unit 117, a difference detection unit 118, a billboard control unit 119, a mirror image control unit 120, and a voting control unit 121.
  • The image generation unit 111 generates the game image controlled by the other components of the control unit 110.
  • The sound generation unit 112 generates the game sound controlled by the other components of the control unit 110.
  • the image generation unit 111 and the sound generation unit 112 output the generated image and sound data to the output device 4.
  • When the distribution control unit 113 receives an instruction to distribute the game image and sound data generated by the image generation unit 111 and the sound generation unit 112 to other users, it uploads the image data generated by the image generation unit 111 and the sound data generated by the sound generation unit 112 to the distribution server 5 via the communication unit 102. Details of the other components of the control unit 110 will be described later.
  • FIG. 5 shows the configuration of the portable information terminal.
  • the mobile terminal device 9 includes a communication unit 202, a control unit 210, a data holding unit 260, an input device 261, a touch pad 262, a display device 263, and a speaker 264. These functional blocks can be realized in various forms by hardware only, software only, or a combination thereof.
  • the communication unit 202 controls communication via the network 3.
  • the data holding unit 260 is provided in the storage device, and stores data used by an application executed in the control unit 210 of the mobile terminal device 9 in order to control a game program executed in the game device 10.
  • the display device 263 displays the image generated by the control unit 210.
  • the speaker 264 outputs the sound generated by the control unit 210.
  • the input device 261 inputs an instruction from the user to the control unit 210.
  • the touch pad 262 is provided in the display device 263 and inputs an instruction from the user to the control unit 210.
  • The control unit 210 includes an image generation unit 211, a sound generation unit 212, a menu display unit 213, a mask instruction receiving unit 214, a balloon instruction receiving unit 215, a billboard instruction receiving unit 216, a mirror image instruction receiving unit 217, and a voting instruction receiving unit 218.
  • the image generation unit 211 generates an image of a game controlled by another configuration of the control unit 210 and outputs it to the display device 263.
  • the sound generation unit 212 generates a sound of a game controlled by another configuration of the control unit 210 and outputs it to the speaker 264. Details of other configurations of the control unit 210 will be described later.
  • The game device 10 provides a function of generating an image in which the user appears to be wearing a mask, by tracking the position of the user's face and displaying a mask image superimposed on the face.
  • the mask instruction receiving unit 214 of the portable terminal device 9 transmits an instruction to the game apparatus 10 when receiving an instruction to display a mask from the user via the menu screen displayed by the menu display unit 213.
  • the mask control unit 114 of the game apparatus 10 causes the face detection unit 115 to detect a human face included in the image captured by the camera 7.
  • the face detection unit 115 detects a human face included in an image captured by the camera 7.
  • the face detection unit 115 acquires data of an image captured by the camera 7 at a predetermined timing, and detects a human face using any known face recognition technique. While the mask is displayed, the face detection unit 115 may execute face recognition for all frames captured by the camera 7 or may perform face recognition for every predetermined number of frames.
  • the face detection unit 115 can track the face of the person being imaged by detecting the face included in the image imaged by the camera 7 continuously at predetermined time intervals.
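  • For illustration only, detecting faces at a fixed frame interval, as the passage above allows, might look like the following Python sketch; OpenCV's bundled Haar cascade stands in for the unspecified face recognition technique, and the interval is an assumption:

      import cv2

      DETECT_EVERY = 5  # assumed interval; the text leaves the number open
      face_cascade = cv2.CascadeClassifier(
          cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
      last_faces = []

      def track_faces(frame_bgr, frame_index):
          """Re-detect every Nth frame and reuse the last result in between."""
          global last_faces
          if frame_index % DETECT_EVERY == 0:
              gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
              last_faces = list(face_cascade.detectMultiScale(gray, 1.1, 5))
          return last_faces  # list of (x, y, w, h) face rectangles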
  • the mask control unit 114 transmits the face image data detected by the face detection unit 115 to the mobile terminal device 9 in order to receive an instruction as to which user's face the mask is displayed on.
  • the mask instruction receiving unit 214 displays a list of face images received from the game apparatus 10 on the display device 263, and receives a face selection instruction for displaying a mask from the user.
  • the mask instruction reception unit 214 transmits the received selection instruction to the game apparatus 10.
  • the mask instruction receiving unit 214 further receives from the user an instruction to select the type of image to be displayed as a mask.
  • The image to be displayed as the mask may be selected from two-dimensional image data or three-dimensional shape data stored in the data holding unit 260 of the mobile terminal device 9 or the data holding unit 160 of the game device 10, or may be an image drawn by the user. In the latter case, the mask instruction receiving unit 214 displays a drawing screen on the display device 263 and accepts the drawing of an image from the user.
  • The mask instruction receiving unit 214 transmits the received selection instruction to the game device 10. At this time, if an image drawn by the user, or an image whose data is not stored in the data holding unit 160 of the game apparatus 10, is selected, the image data is also transmitted to the game apparatus 10.
  • The mask control unit 114 superimposes and displays an image of the designated mask type at the position of the face of the person specified by the user via the mobile terminal device 9, from among the human faces detected by the face detection unit 115.
  • the mask control unit 114 enlarges or reduces the mask image to be displayed in accordance with the size of the face detected by the face detection unit 115, and displays the image at the detected face position.
  • the face detection unit 115 detects the orientation of the face by detecting the shape of the face outline, the positions of eyes, nose and mouth.
  • the mask control unit 114 rotates and displays the mask image in accordance with the face orientation detected by the face detection unit 115.
  • When the image displayed as a mask is three-dimensional shape data, the mask control unit 114 renders it by rotating the three-dimensional shape data in accordance with the detected face orientation.
  • An angle obtained by rotating further, by a predetermined angle, beyond the detected rotation angle of the face, with the straight line connecting the position of the face and the position of the camera 7 as the reference direction, may be set as the rotation angle of the mask image.
  • That is, when the user on whose face the mask is displayed tilts or rotates the face, the mask image may be tilted or rotated by more than the actual tilt or rotation angle. This emphasizes the rotation of the mask image and makes it easy to show that the mask image rotates in accordance with the rotation of the face. The larger the rotation angle of the face from the reference direction, the more the displayed rotation angle may be increased beyond the actual rotation angle.
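  • A minimal sketch of this exaggerated rotation, with an assumed gain and cap (the patent specifies no values):

      GAIN = 1.5        # assumed: display 1.5x the detected rotation
      MAX_EXTRA = 30.0  # assumed cap on the added rotation, in degrees

      def mask_angle(face_angle_deg):
          """Rotate the mask beyond the detected face angle; the extra
          rotation grows with the angle, as described above."""
          extra = min(abs(face_angle_deg) * (GAIN - 1.0), MAX_EXTRA)
          return face_angle_deg + extra if face_angle_deg >= 0 else face_angle_deg - extra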
  • When the image displayed as a mask is a two-dimensional image, the mask is rotated in accordance with the rotation of the face about the straight line connecting the position of the face and the position of the camera 7, while rotation of the face about a straight line perpendicular to that axis may be ignored, or the image displayed as a mask may be displayed reduced in the direction perpendicular to that axis.
  • Alternatively, the two-dimensional image displayed as a mask may be pasted as a texture onto the surface of a solid modeled as a head, or a solid such as a sphere or cylinder, and the solid rotated in accordance with the orientation of the face, thereby rotating the image displayed as a mask. In this case as well, the image displayed as a mask may be rotated by an angle larger than the actual rotation angle of the face.
  • When the face detection unit 115 can no longer detect the face, the mask control unit 114 erases the mask. At this time, the mask may be erased after displaying noise entering the area where the mask was displayed, or after displaying the character shown on the screen interfering with the mask.
  • FIG. 6 shows an example of a screen 400 displayed on the display of the output device 4.
  • An image captured by the camera 7 is displayed on the display of the output device 4, and a character 300 is superimposed on the captured image.
  • FIG. 7 shows an example of a screen displayed on the display device 263 of the mobile terminal device 9.
  • the mask instruction receiving unit 214 of the mobile terminal device 9 displays a drawing screen 500 for the user to draw a mask on the display device 263.
  • the user can draw an image to be displayed as a mask on the drawing screen 500.
  • the drawing screen 500 is displayed only on the display device 263 of the mobile terminal device 9 and is not displayed on the screen distributed from the distribution control unit 113.
  • FIG. 8 shows an example of a screen displayed on the display device 263 of the mobile terminal device 9.
  • The mask instruction reception unit 214 of the mobile terminal device 9 acquires from the game device 10 the image data of all the faces detected in the captured image by the face detection unit 115, in order to receive the user's instruction selecting the face on which the mask is to be displayed, and displays the acquired user face list 502 on the display device 263. The user can thereby select which person's face is to be masked from among all the faces captured by the camera 7 and detected by the face detection unit 115.
  • the face list 502 screen is also displayed only on the display device 263 of the mobile terminal device 9, and is not displayed on the screen distributed from the distribution control unit 113.
  • the received selection instruction is transmitted to the game apparatus 10.
  • FIG. 9 shows an example of a screen 400 displayed on the display of the output device 4.
  • When the mask control unit 114 of the game apparatus 10 obtains from the mobile terminal device 9 the instruction selecting the face on which the mask is to be displayed, it displays a state in which the character 300 sprays the mask material onto the face.
  • FIG. 10 shows an example of a screen 400 displayed on the display of the output device 4.
  • the mask control unit 114 displays the mask 310 at the face position detected by the face detection unit 115.
  • FIG. 11 shows an example of a screen 400 displayed on the display of the output device 4.
  • The mask control unit 114 displays the mask 312 tilted by more than the face tilt detected by the face detection unit 115.
  • FIG. 12 shows an example of a screen 400 displayed on the display of the output device 4.
  • When the user hides the face so that the face detection unit 115 cannot detect it, the mask control unit 114 displays an animation of the displayed mask 312 disappearing, and erases the mask 312 when the animation ends.
  • The game device 10 provides a function of tracking the position of the input device 6 and displaying a string extending from the position of the input device 6 and an object connected to the string, thereby generating an image in which a balloon appears to be tied to the input device 6.
  • the balloon instruction accepting unit 215 of the mobile terminal device 9 transmits an instruction to the game device 10 when an instruction to display a balloon is received from the user via the menu screen displayed by the menu display unit 213.
  • When the balloon control unit 116 of the game apparatus 10 receives an instruction to display a balloon from the mobile terminal device 9, it transmits the identification information of the currently active input devices 6 recognized by the game apparatus 10 to the mobile terminal device 9.
  • the balloon instruction receiving unit 215 receives from the user an instruction to select the input device 6 that connects the balloons from the currently active input devices 6.
  • the balloon instruction receiving unit 215 further receives from the user an instruction to select the type of image to be displayed as a balloon.
  • The image to be displayed as the balloon may be selected from two-dimensional image data or three-dimensional shape data stored in the data holding unit 260 of the mobile terminal device 9 or the data holding unit 160 of the game device 10, or may be an image drawn by the user. In the latter case, the balloon instruction receiving unit 215 displays a drawing screen on the display device 263 and accepts the drawing of an image from the user.
  • Upon receiving the selection instruction of the input device 6 and the selection instruction of the image, the balloon instruction reception unit 215 transmits the received selection instructions to the game apparatus 10. At this time, if an image drawn by the user, or an image whose data is not stored in the data holding unit 160 of the game apparatus 10, is selected, the image data is also transmitted to the game apparatus 10.
  • the balloon control unit 116 maintains a correspondence relationship between the type of balloon image and the input device 6 to which the balloon is connected.
  • When the touch pad 79 of the input device 6 is pressed and a balloon is associated with that input device 6, the balloon control unit 116 displays the image associated with the input device 6 at the position of the input device 6, enlarging the image while the touch pad 79 is pressed. When the finger is released from the touch pad 79, the balloon control unit 116 displays the image, enlarged according to the duration of the input on the touch pad 79, at the tip of a string connected to the input device 6. The balloon control unit 116 may display the image at its original size, may display it resized to a predetermined size, or may accept an instruction from the user to enlarge or reduce the image and display it enlarged or reduced.
  • the balloon control unit 116 may paste and display a two-dimensional image displayed as a balloon as a texture on the surface of a three-dimensional modeled balloon or a three-dimensional surface such as a sphere or a spheroid. The same image may be pasted on a plurality of three-dimensional surfaces.
  • The balloon control unit 116 may determine the length of the string according to the duration of the input on the touch pad 79, or the string may have a predetermined length.
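  • For illustration (not from the patent text), inflating the balloon with the duration of the touch pad press could be sketched as follows; the growth rate and size cap are assumptions:

      import time

      BASE_SCALE, GROWTH_PER_SEC, MAX_SCALE = 0.2, 0.5, 2.0  # assumed values

      class BalloonInflation:
          def __init__(self):
              self.press_start = None

          def on_touchpad_down(self):
              self.press_start = time.monotonic()

          def current_scale(self):
              """Scale factor while the pad is held; grows with hold time."""
              if self.press_start is None:
                  return BASE_SCALE
              held = time.monotonic() - self.press_start
              return min(BASE_SCALE + GROWTH_PER_SEC * held, MAX_SCALE)

          def on_touchpad_up(self):
              """Release: the final size reflects how long the pad was held."""
              scale = self.current_scale()
              self.press_start = None
              return scale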
  • the controller detection unit 117 detects the input device 6 included in the image captured by the camera 7.
  • the controller detection unit 117 acquires data of an image captured by the camera 7 at a predetermined timing, and detects light emitted by the light emitting unit 85 using a known image analysis technique.
  • the controller detection unit 117 may detect the position of the input device 6 for all frames captured by the camera 7, or may detect the position of the input device 6 for every predetermined number of frames.
  • The controller detection unit 117 can track the position of the input device 6 being imaged by continuously detecting, at predetermined time intervals, the position of the input device 6 included in the image captured by the camera 7.
  • the balloon control unit 116 may set a predetermined buoyancy for the balloon and display a simulation of the movement of the balloon in accordance with the movement of the input device 6 by physical calculation.
  • the movement of the entire string may be simulated by providing a control point at a predetermined position of the string connecting the balloon and the input device 6 and simulating the movement of the control point.
  • the balloon control unit 116 may move the balloon by a simple simulation in which a part of a physical phenomenon is simplified.
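  • As an illustration only, a simplified string-and-buoyancy simulation in the spirit of the above might look like the following Python sketch; all constants are assumed, not taken from the patent:

      import numpy as np

      BUOYANCY = np.array([0.0, -30.0])   # assumed, pixels/s^2; up in image coords
      STIFFNESS, DAMPING, SEGMENT = 40.0, 0.9, 20.0  # assumed constants

      class StringSim:
          def __init__(self, root, n_points=5):
              # control points from the controller (root) to the balloon (tip)
              self.p = np.tile(np.asarray(root, float), (n_points, 1))
              self.v = np.zeros_like(self.p)

          def step(self, root, dt):
              self.p[0] = root  # the root follows the tracked input device
              for i in range(1, len(self.p)):
                  d = self.p[i] - self.p[i - 1]
                  dist = np.linalg.norm(d) or 1e-6
                  spring = -STIFFNESS * (dist - SEGMENT) * d / dist
                  accel = spring + (BUOYANCY if i == len(self.p) - 1 else 0.0)
                  self.v[i] = (self.v[i] + accel * dt) * DAMPING
                  self.p[i] += self.v[i] * dt
              return self.p[-1]  # balloon position at the tip of the string

  Each call to step advances the simulation by dt seconds, pulling the string after the controller while the buoyant tip drifts upward.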
  • the difference detection unit 118 detects a difference between frames in the moving image.
  • the difference detection unit 118 detects a difference between frames at least around the balloon while the balloon is displayed.
  • When the difference detection unit 118 detects a difference around the balloon, the balloon control unit 116 applies to the balloon a force directed from the position of the region where the difference occurred toward the center of gravity of the balloon, with a magnitude corresponding to the amount of the difference, the size of the region where the difference occurred, or the period over which the difference occurred, and moves the balloon by physical calculation.
  • The balloon control unit 116 may move the balloon by simple simulation instead of physical calculation; for example, the balloon may be moved by a predetermined amount in the direction opposite to the region where the difference occurred.
  • Instead of or in addition to detecting the difference between frames of the moving image, the balloon control unit 116 may track the movement of the user's body by analyzing the image captured by the camera 7 and move the balloon by applying a force to it. For example, the shape of the user's hand may be recognized, and the balloon moved when the user's hand hits it.
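  • For illustration, the inter-frame difference reaction described above might be sketched as follows with OpenCV; the threshold, noise floor, and force scaling are assumed values:

      import cv2
      import numpy as np

      DIFF_THRESHOLD, MIN_PIXELS, FORCE_PER_PIXEL = 25, 200, 0.01  # assumed

      def difference_force(prev_gray, cur_gray, balloon_center):
          """Push the balloon away from the centroid of the changed region."""
          diff = cv2.absdiff(prev_gray, cur_gray)
          _, motion = cv2.threshold(diff, DIFF_THRESHOLD, 255, cv2.THRESH_BINARY)
          ys, xs = np.nonzero(motion)
          if len(xs) < MIN_PIXELS:          # ignore sensor noise
              return np.zeros(2)
          centroid = np.array([xs.mean(), ys.mean()])
          direction = np.asarray(balloon_center, float) - centroid
          norm = np.linalg.norm(direction) or 1e-6
          # magnitude scales with the size of the changed region, as in the text
          return direction / norm * (len(xs) * FORCE_PER_PIXEL)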
  • When the input device 6 goes out of the field of view of the camera 7, the user points the input device 6 away from the camera 7, or the light emitting unit 85 is covered with a hand, so that the controller detection unit 117 can no longer detect the input device 6, the balloon control unit 116 erases the balloon. At this time, the balloon may be erased after displaying the string connecting the balloon to the input device 6 breaking and the balloon flying upward.
  • FIG. 13 shows an example of a screen displayed on the display device 263 of the mobile terminal device 9.
  • the balloon instruction receiving unit 215 of the mobile terminal device 9 displays a drawing screen 510 on which the user draws a balloon on the display device 263. The user can draw an image to be displayed as a balloon on the drawing screen 510.
  • the drawing screen 510 is displayed only on the display device 263 of the mobile terminal device 9 and is not displayed on the screen distributed from the distribution control unit 113.
  • FIG. 14 shows an example of a screen displayed on the display device 263 of the mobile terminal device 9.
  • The balloon instruction receiving unit 215 acquires the identification information of the currently active input devices 6 from the game device 10, in order to receive from the user an instruction selecting the input device 6 on which the balloon is to be displayed, and displays the acquired list 512 of active input devices 6 on the display device 263.
  • When the balloon instruction receiving unit 215 receives the user's selection from the displayed list 512 of input devices 6, it transmits the received selection instruction to the game apparatus 10.
  • the screen of the list 512 is also displayed only on the display device 263 of the mobile terminal device 9 and is not displayed on the screen distributed from the distribution control unit 113.
  • FIG. 15 shows an example of a screen 400 displayed on the display of the output device 4.
  • When the balloon control unit 116 obtains from the mobile terminal device 9 the instruction selecting the input device 6 on which the balloon is to be displayed, it displays an effect in which the character 300 irradiates the input device 6 with light, giving the input device 6 the function of displaying a balloon.
  • FIG. 16 shows an example of a screen 400 displayed on the display of the output device 4.
  • When the balloon control unit 116 acquires from the input device 6 information indicating that the touch pad 79 has been pressed, it displays a state in which the image 320 associated with the input device 6 swells from the position of the input device 6.
  • FIG. 17 shows an example of a screen 400 displayed on the display of the output device 4.
  • When the balloon control unit 116 acquires from the input device 6 information indicating that the finger has been released from the touch pad 79, it displays a state in which the image 320 associated with the input device 6 is connected to the input device 6 by the string 322.
  • The balloon consists of the string 322, a first portion displayed at a position calculated by the balloon control unit 116 according to the detected position of the input device 6 so as to follow the movement of the input device 6 that is the tracking target, and the image 320, a second portion moved by the balloon control unit 116 depending on the movement of the root portion of the string 322.
  • The balloon control unit 116 moves the root portion of the string 322 to follow the position of the input device 6 detected by the controller detection unit 117, and moves the balloon image 320 so that it is pulled by the string 322. When the difference detection unit 118 detects a difference around the balloon, the balloon image 320 is moved according to the detected difference.
  • FIG. 18 shows an example of a screen 400 displayed on the display of the output device 4.
  • When the balloon control unit 116 acquires from the controller detection unit 117 information indicating that the light emitting unit 85 of the input device 6 can no longer be detected, it cuts the balloon string 322 connected to the input device 6, separating the root of the string 322 from the balloon image 320. Since the user can see on the screen displayed on the output device 4 that the root portion of the string 322 moves following the input device 6, the user can easily understand the relationship between hiding the light emitting unit 85 of the tracked input device 6 and the balloon image 320, which moves depending on the string 322, coming loose from the input device 6.
  • The balloon control unit 116 may move the balloon separated from the string 322 upward according to its buoyancy, or may let it drift softly through the air.
  • When the billboard instruction receiving unit 216 of the mobile terminal device 9 receives an instruction to display a billboard from the user via the menu screen displayed by the menu display unit 213, it displays an image drawing screen on the display device 263 and accepts the drawing of an image to be displayed as the billboard.
  • The billboard instruction receiving unit 216 then transmits an instruction to display the billboard to the game apparatus 10, together with the image data drawn by the user. It also receives from the user an instruction specifying the position on the screen at which the billboard is to be displayed, and transmits the received display position to the game apparatus 10.
  • the billboard control unit 119 of the game apparatus 10 displays the image acquired from the mobile terminal device 9 as a billboard at the designated position on the game screen.
  • Since the billboard displayed on the screen also appears on the distributed screen, the user can post a message to viewers on the billboard.
  • The contents of the distributed game, comments about the game, the theme of the moving image being distributed, notices to viewers, and the like can be posted on the billboard.
  • When the difference detection unit 118 detects a difference between frames in the vicinity of the displayed billboard, the billboard control unit 119 affects the displayed billboard according to the detected difference. For example, when the difference detection unit 118 detects a difference of a predetermined amount or more, over a predetermined period, in a region wider than a predetermined region near the billboard, the billboard control unit 119 applies to the billboard a force directed from the position of the region where the difference occurred toward the center of gravity of the billboard, with a magnitude corresponding to the amount of the difference, the size of the region, or the period over which the difference occurred, and moves the billboard.
  • the user can move the billboard by taking an action of hitting the billboard with his / her hand, so that it is possible to provide an environment in which a highly entertaining interaction can be experienced.
  • the billboard control unit 119 may move the billboard not by physical calculation but by simple simulation. For example, the billboard may be moved by a predetermined amount in the direction opposite to the area where the difference has occurred.
  • When a difference is detected, the billboard control unit 119 may display noise occurring in the billboard image in the region where the difference occurred, or in the background region displayed transparently behind the billboard.
  • Instead of or in addition to detecting the difference between frames of the moving image, the billboard control unit 119 may track the movement of the user's body by analyzing the image captured by the camera 7 and move the billboard by applying a force to it. For example, the shape of the user's hand may be recognized, and the billboard moved when the user's hand hits it.
  • FIG. 19 shows an example of a screen displayed on the display device 263 of the mobile terminal device 9.
  • the billboard instruction receiving unit 216 of the mobile terminal device 9 displays a screen 520 on the display device 263 for instructing a position where the user displays the billboard.
  • the user instructs a position on the screen 520 where the billboard should be displayed.
  • The billboard instruction receiving unit 216 acquires from the game apparatus 10 the image captured by the camera 7 or the display image generated by the image generation unit 111, displays it on the screen 520, and receives an instruction of the position and size at which the billboard is to be displayed.
  • The user can adjust the position and size of the billboard display while checking the image displayed on the output device 4 or on the terminals of the viewers at the distribution destination. If another billboard is already displayed, the image set on that billboard may be displayed at its set position.
  • This screen 520 is displayed only on the display device 263 of the mobile terminal device 9 and is not displayed on the screen distributed from the distribution control unit 113.
  • FIG. 20 shows an example of a screen displayed on the display device 263 of the mobile terminal device 9.
  • the billboard instruction receiving unit 216 displays a drawing screen 522 on which the user draws the billboard on the display device 263. The user can draw an image to be displayed as a billboard on the drawing screen 522.
  • the drawing screen 522 is also displayed only on the display device 263 of the mobile terminal device 9 and is not displayed on the screen distributed from the distribution control unit 113.
  • FIG. 21 shows an example of a screen 400 displayed on the display of the output device 4.
  • When the billboard control unit 119 of the game apparatus 10 acquires the image data to be displayed as a billboard from the mobile terminal device 9, it displays a state in which the character 300 blows the billboard material onto the billboard display position.
  • FIG. 22 shows an example of a screen 400 displayed on the display of the output device 4.
  • the billboard control unit 119 displays the billboard 330 at the position instructed by the user.
  • the billboard 330 is supported at both ends by two objects 332 and 334.
  • FIG. 23 shows an example of a screen 400 displayed on the display of the output device 4.
  • the two objects 332 and 334 that support both ends of the billboard 330 are rotated and displayed so as to face the position of the user's face detected by the face detection unit 115.
  • FIG. 24 shows an example of a screen 400 displayed on the display of the output device 4.
  • the difference detection unit 118 detects a difference between frames.
  • the billboard control unit 119 shakes the object 332 when a difference is detected in the vicinity of the object 332, and drops the object 332 when the amount of the difference is large.
  • the billboard control unit 119 displays a state in which the billboard 330 that has lost support at the left end hangs down.
  • the billboard control unit 119 displays a state in which the object 332 returns to the original position and supports the left end of the billboard 330 again when a predetermined time elapses after the object 332 falls.
  • the game apparatus 10 provides a function of horizontally flipping the screen.
  • The image generation unit 111 of the game apparatus 10 generates the screen using a mirror image obtained by horizontally flipping the image captured by the camera 7, in order to make it easy for the user to interact with the balloon, billboard, character, and the like displayed on the screen.
  • However, when the user is distributing a moving image and wants the camera 7 to capture written characters or the like to show to viewers, the characters appear mirror-reversed and are difficult to read. Therefore, when, for example, a message to viewers is imaged by the camera 7 and delivered to them, the screen is horizontally flipped again according to the user's instruction as necessary, restoring the original image captured by the camera 7.
  • When the mirror image control unit 120 receives, from the input device 6 or from the mirror image instruction receiving unit 217 of the mobile terminal device 9, an instruction to flip the display screen horizontally, that is, to display the original image captured by the camera 7 instead of its mirror image, it instructs the image generation unit 111 to flip the display screen horizontally.
  • When the image generation unit 111 receives the instruction to flip the display screen horizontally from the mirror image control unit 120, it generates the screen using the original image captured by the camera 7, thereby flipping the screen horizontally.
  • When receiving an instruction to flip the display screen horizontally, the mirror image control unit 120 may flip both the screen displayed on the display of the output device 4 viewed by the user distributing the moving image and the screen of the moving image distributed to the distribution server 5, or may flip only one of them.
  • The images used by the face detection unit 115, the controller detection unit 117, and the difference detection unit 118 for detecting the face, the input device 6, and the inter-frame difference, respectively, may be the horizontally flipped images captured by the camera 7 or the unflipped images.
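  • A small illustrative sketch of the coordinate bookkeeping this implies: if detection runs on the unflipped camera image while the display is mirrored, detected rectangles must be remapped before overlays are drawn (names are hypothetical):

      def to_display_coords(x, y, w, h, frame_width, display_mirrored):
          """Map a rectangle detected in the unflipped camera image into the
          coordinates of the (possibly mirrored) display image."""
          if display_mirrored:
              return frame_width - x - w, y, w, h  # mirror horizontally
          return x, y, w, h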
  • The game apparatus 10 provides a function of receiving, from the user distributing a moving image, an instruction to conduct a questionnaire for viewers, and of receiving votes for the questionnaire from other users viewing the distributed moving image.
  • the voting control unit 121 of the game apparatus 10 receives a questionnaire response regarding the evaluation of the distributed moving image from the viewer while the distribution control unit 113 distributes the display image.
  • In addition to a default questionnaire for which standard answers can be accepted, the voting control unit 121 can accept from viewers answers to a questionnaire set up by the user.
  • When the voting instruction receiving unit 218 of the mobile terminal device 9 receives, via the menu screen displayed by the menu display unit 213, an instruction to conduct a questionnaire for viewers while the user is distributing a moving image, it displays on the display device 263 a screen for accepting the questionnaire question and the answer options from the user, and accepts the question and answer options.
  • The number of answer options for the questionnaire may be a predetermined number, for example four, matching the number of operation buttons 76 of the input device 6.
  • the voting instruction reception unit 218 transmits the received questionnaire question and answer options to the game apparatus 10.
  • The voting instruction accepting unit 218 may store the accepted questionnaire question and answer options in the data holding unit 260 and, when the user instructs it to conduct a questionnaire stored in the data holding unit 260, read the question and answer options from the data holding unit 260 and transmit them to the game device 10.
  • When the voting control unit 121 of the game apparatus 10 receives the questionnaire items from the mobile terminal device 9, it displays a billboard showing them on the screen distributed to other users. Thereby, viewers can know that the user's questionnaire is being conducted.
  • the voting control unit 121 may include a user interface for receiving a vote for a questionnaire from a viewer on a screen distributed to other users.
  • This user interface may be realized by a script or the like that can be executed in a browser mounted on the mobile terminal device 9, the game device 10, or a personal computer used by the viewer to view a moving image.
  • This user interface may be provided as a standard function in a service for distributing moving images from the game apparatus 10.
  • the user interface transmits the received vote result directly to the game apparatus 10 or to the game apparatus 10 via the distribution server 5.
  • When the voting control unit 121 acquires a viewer's voting result from the viewer's terminal or from the distribution server 5, it stores the voting result in the data holding unit 160. When requested, the voting control unit 121 reads the voting results from the data holding unit 160 and transmits them to the portable terminal device 9.
  • The voting instruction receiving unit 218 displays the progress of voting acquired from the game apparatus 10 on the display device 263. Since the progress of voting is not displayed on the screen generated by the image generation unit 111, neither a user viewing the display of the output device 4 nor a user viewing the moving image distributed from the distribution server 5 can see the progress; only the user of the mobile terminal device 9 who created the questionnaire can see it.
  • the voting instruction receiving unit 218 transmits the received instruction to the game apparatus 10 when the user instructs the end of the questionnaire.
  • the voting control unit 121 deletes the displayed questionnaire item from the screen.
  • the voting instruction receiving unit 218 transmits the received instruction to the game apparatus 10 when the user instructs distribution of the questionnaire result.
  • the voting control unit 121 reads the questionnaire result from the data holding unit 160 and displays the result on the screen generated by the image generation unit 111.
  • Information received by the game apparatus 10 from a specific display target, for example answers to a questionnaire received from other users viewing the moving image distributed from the distribution control unit 113, can be displayed at an arbitrary timing on the mobile terminal device 9 of the user who requested the questionnaire, but is not displayed on the distribution image unless a publication instruction is given from the user's mobile terminal device 9. Therefore, the user can collect information privately from viewers while distributing the moving image, and can display the information on the distribution image and publish it if necessary.
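  • The private-until-published questionnaire flow can be illustrated with the following sketch; the class and method names are hypothetical, and the patent does not prescribe any particular data structure:

      from collections import Counter
      from dataclasses import dataclass, field

      @dataclass
      class Questionnaire:
          question: str
          options: list[str]            # e.g. four options, one per operation button
          votes: Counter = field(default_factory=Counter)
          published: bool = False

          def vote(self, option):
              """Accept one viewer vote for a valid option."""
              if option in self.options:
                  self.votes[option] += 1

          def progress(self):
              """Tally visible only on the requesting user's mobile terminal."""
              return dict(self.votes)

          def publish(self):
              """Only now may the tally appear on the distributed screen."""
              self.published = True
              return self.progress()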
  • FIG. 25 shows an example of a screen displayed on the display device 263 of the mobile terminal device 9.
  • the voting instruction reception unit 218 of the mobile terminal device 9 displays a screen 530 for receiving questionnaire items from the user on the display device 263.
  • the user sets a questionnaire item on the screen 530.
  • the voting instruction receiving unit 218 transmits the received questionnaire items to the game apparatus 10.
  • FIG. 26 shows an example of a screen 400 displayed on the display of the output device 4.
  • When the voting control unit 121 of the game apparatus 10 acquires the questionnaire items from the mobile terminal device 9, it displays a state in which the character 300 projects the questionnaire 340 as a hologram.
  • FIG. 27 shows an example of a screen 600 displayed on the display device of the user terminal that views the moving image distributed by the distribution server 5.
  • the screen 600 is provided with a user interface for answering a questionnaire. Users can vote for questionnaires via the interface.
  • FIG. 28 shows an example of a screen 400 displayed on the display of the output device 4.
  • When the voting control unit 121 acquires a vote for the questionnaire from a user viewing the distributed moving image, it displays on the screen 400 an icon 342 indicating that the vote has been acquired.
  • FIG. 29 shows an example of a screen displayed on the display device 263 of the mobile terminal device 9.
  • the voting instruction reception unit 218 transmits the received instruction to the game apparatus 10 when receiving an instruction to acquire the progress of the questionnaire from the user.
  • When the voting control unit 121 receives, from the portable terminal device 9 that requested the questionnaire, an instruction to acquire the progress of the questionnaire, it reads the questionnaire results from the data holding unit 160 and transmits them to the portable terminal device 9.
  • The voting instruction reception unit 218 displays a screen 532 indicating the progress of the questionnaire on the display device 263. This screen 532 is displayed only on the display device 263 of the mobile terminal device 9 that requested the questionnaire, and is not displayed on the screen distributed from the distribution control unit 113. That is, the user who requested the questionnaire can check its progress without disclosing the questionnaire results to viewers.
  • FIG. 30 shows an example of a screen 400 displayed on the display of the output device 4.
  • the voting instruction reception unit 218 transmits the received instruction to the game device 10 when receiving an instruction to distribute the result of the questionnaire from the user.
  • When the voting control unit 121 receives an instruction to distribute the questionnaire results from the mobile terminal device 9 that requested the questionnaire, it reads the results from the data holding unit 160 and displays the aggregated questionnaire result 344 on the screen 400.
  • the screen 400 displaying the questionnaire result 344 is also distributed to the user who is viewing the moving image.
  • FIG. 31 is a flowchart showing a procedure of the game control method according to the embodiment.
  • When the mask instruction receiving unit 214 of the mobile terminal device 9 receives an instruction from the user to display a mask, it transmits the received instruction to the game apparatus 10 (S102).
  • The mask control unit 114 of the game apparatus 10 causes the face detection unit 115 to detect the faces included in the image captured by the camera 7 (S104), and transmits the detected face images to the mobile terminal device 9 (S106).
  • The mask instruction receiving unit 214 displays the list of face images transmitted from the game apparatus 10 on the display device 263 so that the user can select the face on which to display a mask (S108).
  • It receives from the user an instruction selecting the face on which to display the mask (S110) and an instruction selecting the mask image to be displayed (S112), and transmits the received face selection instruction (S114) and mask image selection instruction (S116) to the game apparatus 10.
  • When the mask control unit 114 obtains the face and mask image selection instructions from the mobile terminal device 9, it displays the character spraying the mask material onto the face (S118), and then displays the mask image at the face position detected by the face detection unit 115 (S120). While the face detection unit 115 can detect the user's face (N in S122), the mask control unit 114 keeps the mask image superimposed on the detected face position (S120). When the face detection unit 115 can no longer detect the user's face, for example because the user covers and hides it with a hand (Y in S122), the mask control unit 114 displays an animation in which the displayed mask 312 disappears (S124), and deletes the mask image when the animation display ends (S126) (see the sketch below).
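  • The following is a minimal Python sketch of the mask-display loop above. It is illustrative only: Face, detect_face, and the per-frame dictionaries are assumed stand-ins for the face detection unit 115 and the camera pipeline, not interfaces defined in this specification.

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class Face:
    position: Tuple[int, int]

def detect_face(frame: dict) -> Optional[Face]:
    # Stand-in for face detection unit 115; returns None when the face is hidden.
    return frame.get("face")

def run_mask_loop(frames, mask_image: str) -> None:
    for frame in frames:
        face = detect_face(frame)
        if face is not None:
            # S120: keep the mask superimposed on the detected face position.
            print(f"draw {mask_image} at {face.position}")
        else:
            # Y in S122 -> S124/S126: the face is hidden (e.g. covered by a
            # hand), so the mask disappears with an animation and is deleted.
            print(f"{mask_image} disappears with an animation and is deleted")
            break

# The face is tracked for two frames, then hidden.
run_mask_loop([{"face": Face((100, 80))}, {"face": Face((104, 82))}, {}], "mask 312")
```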
  • FIG. 32 is a flowchart showing the procedure of the game control method according to the embodiment.
  • When the balloon instruction receiving unit 215 of the mobile terminal device 9 receives an instruction from the user to display a balloon, it transmits the received instruction to the game device 10 (S202).
  • the balloon control unit 116 of the game device 10 transmits identification information of the active input device 6 to the mobile terminal device 9 (S206).
  • The balloon instruction receiving unit 215 displays the list of active input devices 6 transmitted from the game device 10 on the display device 263 (S208), receives from the user an instruction selecting the input device 6 to which the balloon display is assigned (S210) and an instruction selecting the balloon image to be displayed (S212), and transmits the received input device selection instruction (S214) and balloon image selection instruction (S216) to the game apparatus 10.
  • When the balloon control unit 116 acquires the input device and balloon image selection instructions from the mobile terminal device 9, it displays an animation of the character shining light on the input device 6 (S217).
  • The balloon control unit 116 waits until the touch pad 79 of the input device 6 to which the balloon display is assigned is pressed (N in S218). When the touch pad 79 is pressed (Y in S218), it displays the balloon image assigned to the input device 6 inflating from the position of the input device 6 (S220) until the press ends (N in S222).
  • While the controller detection unit 117 can detect the position of the input device 6 (N in S226), the balloon control unit 116 displays the balloon tethered to the input device 6 (S224). When the controller detection unit 117 can no longer detect the position of the input device 6, for example because the user covers and hides the light emitting unit 85 of the input device 6 (Y in S226), the balloon control unit 116 displays an animation in which the balloon attached to the input device disappears (S228), and erases the balloon when the animation display ends (S230) (see the sketch below).
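  • Below is a minimal Python sketch of this balloon lifecycle. The per-frame dictionaries, touchpad flag, and position values are assumptions standing in for the touch pad 79 and the controller detection unit 117, not interfaces defined in this specification.

```python
from typing import Iterable, Optional, Tuple

def run_balloon(frames: Iterable[dict]) -> None:
    size = 0
    for frame in frames:
        if frame.get("touchpad_pressed"):
            # S220: the balloon inflates from the input device while the
            # touch pad is pressed.
            size += 1
            print(f"balloon inflating, size={size}")
            continue
        pos: Optional[Tuple[int, int]] = frame.get("controller_position")
        if pos is None:
            # Y in S226 -> S228/S230: the light emitting unit is hidden, so
            # the balloon disappears with an animation and is erased.
            print("balloon disappears and is erased")
            break
        # S224: the balloon stays tethered to the tracked input device 6.
        print(f"balloon tethered at {pos}")

run_balloon([
    {"touchpad_pressed": True},             # inflating
    {"controller_position": (320, 240)},    # tracked and tethered
    {},                                     # light emitting unit covered
])
```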
  • FIG. 33 is a flowchart showing a procedure of the game control method according to the embodiment.
  • When the billboard instruction receiving unit 216 of the mobile terminal device 9 receives an instruction from the user to display a billboard (S300), it transmits the received instruction to the game device 10 (S302).
  • When the billboard control unit 119 of the game apparatus 10 receives the instruction to display the billboard from the mobile terminal device 9, it transmits the display image generated by the image generation unit 111 and displayed on the output device 4 to the mobile terminal device 9 (S306).
  • The billboard instruction receiving unit 216 displays the display image transmitted from the game apparatus 10 on the display device 263 (S308), receives from the user an instruction selecting the position at which to display the billboard (S310) and an instruction selecting the image to be displayed as the billboard (S312), and transmits the received display position selection instruction (S314) and billboard image selection instruction (S316) to the game apparatus 10.
  • When the billboard control unit 119 obtains the billboard display position and billboard image selection instructions from the mobile terminal device 9, it displays an animation in which the material of the billboard image is sprayed onto the display position (S318), and displays the billboard image at that position when the animation display ends (S320).
  • When the difference detection unit 118 detects a difference in the vicinity of the area where the billboard is displayed (Y in S322), the billboard control unit 119 changes the display mode of the billboard according to the detected difference (S324). When no difference is detected (N in S322), the billboard image is displayed as it is (S320) (see the sketch below).
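  • A minimal Python sketch of this reaction step follows. The region test and the threshold are assumptions for illustration; the specification only states that a difference detected near the billboard changes its display mode.

```python
from typing import Dict, Tuple

def update_billboard(region: Tuple[int, int, int, int],
                     diff_map: Dict[Tuple[int, int], float],
                     threshold: float = 0.1) -> str:
    """Return the billboard display mode for this frame (S320/S324)."""
    x0, y0, x1, y1 = region
    # Difference detection unit 118: inter-frame differences near the billboard.
    near = [v for (x, y), v in diff_map.items()
            if x0 <= x <= x1 and y0 <= y <= y1]
    if near and max(near) > threshold:
        return "reacting"   # S324: change the display mode
    return "static"         # S320: display the billboard image as it is

# A strong difference inside the region reacts; one outside does not.
print(update_billboard((0, 0, 100, 100), {(50, 40): 0.6}))   # -> reacting
print(update_billboard((0, 0, 100, 100), {(300, 40): 0.6}))  # -> static
```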
  • FIG. 34 is a flowchart showing a procedure of the game control method according to the embodiment.
  • The game apparatus 10 acquires a captured image from the camera 7 (S400). If the mirror image control unit 120 has not received an instruction to flip the display image left and right (N in S402), it generates a mirror image by horizontally inverting the captured image and generates the display image based on the mirror image (S404). If an instruction to horizontally flip the display image has been received (Y in S402), S404 is skipped and the captured image is not horizontally inverted, so that the generated display image is horizontally reversed relative to the normal display image.
  • When the mirror image control unit 120 has not received an instruction to horizontally flip the detection image used by the difference detection unit 118 to detect inter-frame differences (N in S406), it horizontally inverts the captured image to generate the detection image, and the difference is detected based on the generated detection image (S408). If an instruction to flip the detection image has been received (Y in S406), S408 is skipped and the captured image is not horizontally inverted, so that the difference is detected based on a detection image that is horizontally reversed relative to the normal detection image.
  • The difference detection unit 118 detects differences between frames based on the detection image, whether horizontally inverted or not, and thereby detects interactions between the captured real-world user and displayed virtual-world objects such as the character, balloons, and billboards (S410).
  • the balloon control unit 116 and the billboard control unit 119 control the balloon and the billboard according to the detected interaction (S412).
  • The image generation unit 111 generates a distribution image including the balloons and billboards that were interacted with (S414) (see the sketch below).
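  • The sketch below illustrates, in Python, how the two flip instructions act independently on the display image and the detection image; simple nested lists stand in for the actual image pipeline, which the specification does not detail at this level.

```python
from typing import List, Tuple

Image = List[List[int]]

def build_images(captured: Image,
                 display_flip_instructed: bool,
                 detection_flip_instructed: bool) -> Tuple[Image, Image]:
    def mirror(img: Image) -> Image:
        return [row[::-1] for row in img]
    # S402/S404: by default the display image is the mirror image; a user
    # instruction suppresses the inversion.
    display = captured if display_flip_instructed else mirror(captured)
    # S406/S408: the detection image has its own, independent instruction.
    detection = captured if detection_flip_instructed else mirror(captured)
    return display, detection

captured = [[1, 2, 3]]
print(build_images(captured, False, False))  # both mirrored (the normal case)
print(build_images(captured, True, False))   # display unmirrored, detection mirrored
```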
  • FIG. 35 is a flowchart showing a procedure of the game control method according to the embodiment.
  • The voting instruction receiving unit 218 of the mobile terminal device 9 receives from the user an instruction to conduct a questionnaire (S500), the questionnaire question (S502), and the answer options for the question (S504), and transmits the questionnaire execution instruction together with the received question and answer options to the game apparatus 10 (S506).
  • When the voting control unit 121 obtains the questionnaire execution instruction from the mobile terminal device 9, it displays the questionnaire question and answer options on the distribution image generated by the image generation unit 111 (S508).
  • The distribution image on which the questionnaire is displayed is transmitted from the distribution control unit 113 to the viewers' terminals via the distribution server 5 (S510), and the viewers answer the questionnaire (S512).
  • When the voting control unit 121 acquires a questionnaire response from a viewer's terminal, it stores the acquired response in the data holding unit 160 (S514). At this time, the voting control unit 121 displays an icon or the like on the distribution image indicating that the questionnaire response has been acquired.
  • When the voting instruction receiving unit 218 receives an instruction from the user to acquire the progress of the questionnaire (S516), it transmits the received instruction to the game device 10 (S518).
  • When the voting control unit 121 receives the instruction to acquire the progress of the questionnaire from the mobile terminal device 9 that requested the questionnaire, it reads the questionnaire result from the data holding unit 160 and transmits the result to the mobile terminal device 9 (S520).
  • The voting instruction receiving unit 218 displays a screen indicating the progress of the questionnaire on the display device 263 (S522).
  • When the voting instruction receiving unit 218 receives an instruction from the user to publish the questionnaire result (S524), it transmits the received instruction to the game device 10 (S526).
  • When the voting control unit 121 receives the instruction to distribute the questionnaire result from the mobile terminal device 9 that requested the questionnaire, it reads the result from the data holding unit 160 and displays the aggregated questionnaire result on the distribution image (S528). The distribution image displaying the questionnaire result is distributed to the viewers' terminals (S530) (see the sketch below).
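  • A compact Python sketch of this questionnaire flow follows, modeling only its visibility rules: votes accumulate privately, the running tally goes only to the requesting terminal, and the aggregate is drawn on the distribution image only after an explicit publication instruction. The class and method names are assumptions for illustration, not elements of this specification.

```python
class QuestionnaireSession:
    def __init__(self, question: str, options: list):
        self.question = question
        self.votes = {o: 0 for o in options}   # data holding unit 160 stand-in

    def distribution_overlay(self) -> str:
        # S508/S510: the question and options appear on the distribution image.
        return f"Q: {self.question} / options: {list(self.votes)}"

    def record_vote(self, option: str) -> None:
        # S514: store the answer; an icon signals its arrival without
        # revealing the tally.
        self.votes[option] += 1

    def progress_for_requester(self) -> dict:
        # S520/S522: the running tally goes only to the requesting terminal.
        return dict(self.votes)

    def publish(self) -> dict:
        # S528/S530: on an explicit instruction, the aggregated result is
        # displayed on the distribution image for every viewer.
        return dict(self.votes)

s = QuestionnaireSession("Which stage next?", ["A", "B"])
print(s.distribution_overlay())
s.record_vote("A"); s.record_vote("B"); s.record_vote("A")
print(s.progress_for_requester())  # private progress check
print(s.publish())                 # now public on the distribution image
```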
  • the distribution image distributed from the distribution control unit 113 to another device and the display image displayed on the display of the output device 4 may be the same or different.
  • The masks, balloons, billboards, questionnaire results, and the like described above may be displayed only on the distribution image, only on the display image, or on both (see the sketch below).
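  • The following small Python sketch illustrates this point: the same captured frame can be composed with different overlay sets for the local display and for distribution. The overlay names and target sets are placeholders, not elements of this specification.

```python
def compose(base: str, overlays: list, targets: dict, channel: str) -> list:
    # Keep only the overlays whose target set includes this output channel.
    return [base] + [o for o in overlays if channel in targets[o]]

overlays = ["mask", "balloon", "questionnaire_result"]
targets = {
    "mask": {"display", "distribution"},        # shown on both
    "balloon": {"display"},                     # local display only
    "questionnaire_result": {"distribution"},   # distribution image only
}
print(compose("camera_frame", overlays, targets, "display"))
print(compose("camera_frame", overlays, targets, "distribution"))
```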
  • the present invention can be used in an image processing apparatus that processes a captured image.

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Graphics (AREA)
  • Computer Hardware Design (AREA)
  • General Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present invention relates to a game device (10) provided with: an image generation unit (111) that obtains a photographic image captured by an imaging device, generates a display image in which a mirror image of the obtained photographic image and the image of a character or an object are superimposed, and displays the generated display image on a display device; a distribution control unit (113) that distributes the display image to another device so that another user can view it; and a mirror image control unit (120) that, when an instruction to horizontally flip the photographic image in the display image is received from the user, causes the image generation unit to generate a display image in which the photographic image itself, instead of its mirror image, and the image of the character or object are superimposed.
PCT/JP2015/057572 2014-06-06 2015-03-13 Image processing device, method, and program WO2015186402A1 (fr)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
JP2014-117815 2014-06-06
JP2014117815A JP6313666B2 (ja) Image processing device, image processing method, and image processing program
JP2014-117816 2014-06-06
JP2014117816A JP2015230685A (ja) Image processing device, image processing system, terminal device, image processing method, and image processing program

Publications (1)

Publication Number Publication Date
WO2015186402A1 true WO2015186402A1 (fr) 2015-12-10

Family

ID=54766483

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2015/057572 WO2015186402A1 (fr) Image processing device, method, and program

Country Status (1)

Country Link
WO (1) WO2015186402A1 (fr)

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2005191964A * 2003-12-25 2005-07-14 Toshiba Corp Image processing method
JP2006014875A * 2004-06-30 2006-01-19 Sony Computer Entertainment Inc Information processing device, program, and object control method in information processing device
JP2014044655A * 2012-08-28 2014-03-13 Premium Agency Inc Augmented reality system, video compositing device, video compositing method, and program

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10974145B2 (en) 2017-11-30 2021-04-13 Dwango Co., Ltd. Intervention server and intervention program
US11623144B2 (en) 2017-11-30 2023-04-11 Dwango Co., Ltd. Intervention server and intervention program
CN113542850A (zh) Display device and data processing method

Similar Documents

Publication Publication Date Title
WO2015186401A1 (fr) Image processing device, image processing method, and image processing program
CN109475774B (zh) Spectator management at view locations in a virtual reality environment
JP6929380B2 (ja) Second screen virtual window into a VR environment
JP6663505B2 (ja) Spectator view perspectives in a VR environment
CN103357177B (zh) Using a portable gaming device to record or modify a game or application running in real time on a main gaming system
CN104238738B (zh) System and method for generating an augmented virtual reality scene within a head-mounted system
JP6581666B2 (ja) Pinch-and-hold gesture navigation on a head-mounted display
US9984505B2 (en) Display of text information on a head-mounted display
TWI594174B (zh) Tracking system, method and device for a head-mounted display
JP6266101B2 (ja) Transitioning gameplay on a head-mounted display
CN105378596B (zh) System and method for transitioning between transparent and non-transparent modes in a head-mounted display
CN106664401A (zh) System and method for providing feedback to a user while interacting with content
CN105359063A (zh) Head-mounted display with tracking
CN104043245B (zh) Game controller
CN107656615A (zh) Massive simultaneous remote digital presence world
JP7503122B2 (ja) Method and system for directing a user's attention to a location-based gameplay companion application
CN110270088A (zh) Asynchronous virtual reality interaction
JP2021100575A (ja) Game program, game method, and information processing device
JP2019524181A (ja) In-game location-based gameplay companion application
JP6313666B2 (ja) Image processing device, image processing method, and image processing program
WO2015186402A1 (fr) Image processing device, method, and program
JP2015229065A (ja) Image processing device, image processing system, terminal device, image processing method, and image processing program
JP2015230685A (ja) Image processing device, image processing system, terminal device, image processing method, and image processing program
JP2015230683A (ja) Image processing device, image processing method, and image processing program
WO2022137343A1 (fr) Information processing method, computer-readable medium, and information processing device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 15803743

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 15803743

Country of ref document: EP

Kind code of ref document: A1