WO2016072132A1 - Information processing device, information processing system, real object system, and information processing method - Google Patents


Info

Publication number
WO2016072132A1
WO2016072132A1 PCT/JP2015/074169 JP2015074169W
Authority
WO
WIPO (PCT)
Prior art keywords
real object
information processing
real
processing apparatus
user
Prior art date
Application number
PCT/JP2015/074169
Other languages
English (en)
Japanese (ja)
Inventor
平田 真一
アレクシー アンドレ
圭司 外川
直紀 沼口
洋 大澤
山岸 建
永塚 仁夫
Original Assignee
ソニー株式会社
株式会社ソニー・インタラクティブエンタテインメント
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by ソニー株式会社, 株式会社ソニー・インタラクティブエンタテインメント filed Critical ソニー株式会社
Publication of WO2016072132A1

Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F9/00 Games not otherwise provided for
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63H TOYS, e.g. TOPS, DOLLS, HOOPS OR BUILDING BLOCKS
    • A63H30/00 Remote-control arrangements specially adapted for toys, e.g. for toy vehicles
    • A63H30/02 Electrical arrangements
    • A63H30/04 Electrical arrangements using wireless transmission
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02 Control of position or course in two dimensions

Definitions

  • The present invention relates to information processing technology using objects in real space.
  • The above-mentioned conventional technology replaces simple objects in the real world with varied and attractive objects in a virtual world, or enables communication with the equipment, mainly by using image display on a display device as the output means.
  • If the dependence on image representation is reduced, the use of real-world objects and the room for the computer to exhibit its capabilities become limited, making diversification difficult.
  • Information processing technology using objects in the real world is highly effective in that it is intuitive for the user to understand and readily conveys a sense of reality. Therefore, there is a need for a technique that can realize similarly diverse modes with real-world objects while depending less on image representation.
  • The present invention has been made in view of such problems, and an object thereof is to provide information processing technology that can realize diverse modes using real-world objects.
  • An aspect of the present invention relates to an information processing apparatus.
  • The information processing apparatus includes a state specifying unit that sequentially acquires image frames of a moving image captured by the imaging apparatus and, by detecting images within the image frames, detects state information of the real objects present in the subject space at predetermined time intervals,
  • an information processing unit that determines, at predetermined time intervals and according to a predetermined rule based on the state information detected by the state specifying unit, the operation content of the real object to be controlled among the real objects,
  • and a real object control unit that controls the control-target real object so that it operates with the content determined by the information processing unit.
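The three units above form a sense-decide-act loop. The following is a minimal sketch of that loop; the function names and the rule used (steer the controlled object toward the user's object) are illustrative assumptions, not part of the specification.

```python
import math

def detect_states(frame):
    """State specifying unit: detect the position of each real object in a
    frame. Here a 'frame' is simply a dict mapping object id to (x, y)."""
    return dict(frame)

def decide_action(states, target_id):
    """Information processing unit: decide the motion of the controlled
    object by a predetermined rule -- here, head toward the user's object."""
    tx, ty = states[target_id]
    cx, cy = states["controlled"]
    heading = math.atan2(ty - cy, tx - cx)
    return {"speed": 1.0, "heading": heading}

def send_control(command):
    """Real object control unit: turn the decision into a control signal
    addressed to the controlled real object."""
    return {"id": "controlled", **command}

frame = {"controlled": (0.0, 0.0), "user": (3.0, 4.0)}
states = detect_states(frame)
signal = send_control(decide_action(states, "user"))
```

In practice `detect_states` would run image analysis on camera frames; the rest of the loop structure is unchanged.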
  • Another aspect of the present invention relates to an information processing system. This information processing system includes an information processing device and a real object that moves under the control of the information processing device.
  • The information processing device includes a state specifying unit that sequentially acquires image frames of a moving image captured by the imaging device and detects, at predetermined time intervals, state information of the real objects present in the subject space by detecting images within the image frames,
  • an information processing unit that determines, at predetermined time intervals and according to a predetermined rule based on the state information detected by the state specifying unit, the operation content of the real object to be controlled among the real objects, and a real object control unit that causes the control-target real object to operate with the content determined by the information processing unit.
  • Still another aspect of the present invention relates to a real object system.
  • This real object system includes a plurality of real objects, each including a communication unit that receives a control signal from an information processing device and a drive unit that operates an actuator according to the received control signal, so that each operates based on the control signal.
  • One of the real objects operates based on a control signal reflecting a user operation performed via an input device connected to the information processing device, and another real object operates based on a control signal reflecting motion determined by the information processing device.
  • Still another aspect of the present invention relates to an information processing method.
  • This information processing method sequentially acquires image frames of a moving image captured by the imaging device, detects state information of the real objects present in the subject space at predetermined time intervals by detecting images within the image frames, determines at predetermined time intervals, according to a predetermined rule based on the detected state information, the operation content of the real object to be controlled, and controls the control-target real object so that it operates with the determined content.
  • As another example of the marker provided on the real object of the present embodiment, a top view of the real object is shown for a case where the light emission of the marker is used to distinguish the front and back of the real object.
  • FIG. 1 shows a configuration example of an information processing system to which this embodiment can be applied.
  • The information processing system 1 includes real objects 120a and 120b placed on a play field 20, an imaging device 12 that captures the space on the play field 20, and an information processing device 10 that moves at least one of the real objects 120a and 120b by performing predetermined information processing.
  • The information processing system 1 further includes an input device 14 that accepts user operations, a microphone 16 that acquires ambient sound, a display device 18 that displays images, and a speaker 19 that outputs sound. Note that, depending on the mode of implementation, the input device 14, the microphone 16, the display device 18, and the speaker 19 need not be included in the information processing system 1.
  • the play field 20 is a plane that defines an area serving as a reference for the information processing apparatus 10 to recognize the real objects 120a and 120b and specify the position coordinates thereof.
  • The play field 20 is not limited in material or shape as long as such a planar area can be defined, and may be paper, a board, cloth, a desktop surface, a game board, or the like. Alternatively, it may be an image or the like projected onto a desk or floor by a projector included in the display device 18.
  • The shape of the real objects 120a and 120b is not limited as long as they are objects existing in real space. That is, each may have a simple shape as shown in the figure, or a more complex shape such as a doll, a miniature of a real-world object such as a minicar, a part of such an object, or a game piece. Further, the size, material, color, and number of real objects 120a and 120b used are not limited. Furthermore, each may be a structure that the user can assemble or disassemble, or a finished product. At least one of the real objects 120a and 120b establishes communication with the information processing apparatus 10 and includes an actuator driven by the transmitted control signal.
  • Each of the real objects 120a and 120b includes wheels 122a and 122b, respectively, and is configured to travel on its own by a motor rotating the axle according to a control signal from the information processing apparatus 10.
  • the information processing apparatus 10 moves the real object 120b based on a user operation via the input device 14, while the real object 120a is moved according to the position or movement of the real object 120b. Move.
  • the information processing apparatus 10 basically moves any real object (for example, the real object 120a) according to a predetermined rule corresponding to the state on the play field 20.
  • the real object for which the information processing apparatus 10 determines the movement is particularly referred to as “real object controlled by the information processing apparatus 10”, and is distinguished from the real object operated by the user via the input device 14.
  • the real object (for example, the real object 120b) other than the real object controlled by the information processing apparatus 10 may be any object that can be freely placed and moved by the user, and may not include an actuator inside. Even if an actuator is provided, the operation means is not limited to the input device 14 and may be directly operated using a dedicated controller or the like.
  • the object driven by the actuator is not limited to the wheel.
  • A gripper or an arm may be attached to a real object and its movable parts moved, or any mechanism controlled in general robots or toys may be employed.
  • a light emitting element, a display, a speaker, a vibrator, and the like may be incorporated to operate them.
  • a plurality of mechanisms may be operated simultaneously. In any case, these mechanisms are controlled by the information processing apparatus 10.
  • The imaging device 12 is a general digital video camera having an image sensor such as a CCD (Charge-Coupled Device) or CMOS (Complementary Metal-Oxide-Semiconductor), and captures video of the space on the play field 20 where the real objects 120a and 120b are placed. Alternatively, a camera that detects invisible light such as near-infrared light may be used, or such a camera may be combined with a general camera that detects visible light.
  • the frame data of the moving image is sequentially transmitted to the information processing apparatus 10 together with the photographing, and is used to acquire the position coordinates of the real objects 120a and 120b on the plane formed by the play field 20. Therefore, the imaging device 12 is preferably arranged so as to overlook the play field 20.
  • the position and angle of the imaging device 12 are not particularly limited.
  • For example, when a stereo camera is used as the imaging device 12, the distance of each real object from the imaging surface can be obtained, and the position coordinates acquired in the camera coordinate system using that distance may be converted into position coordinates in a world coordinate system with the play field 20 as the horizontal plane.
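As a hedged illustration of this conversion, the sketch below transforms a point from camera coordinates into a world coordinate system in which the play field is the z = 0 horizontal plane. The camera pose (R, t) is assumed known from a separate calibration; the pose values used here are illustrative only.

```python
import numpy as np

def camera_to_world(p_cam, R, t):
    """Convert a point from camera coordinates to world coordinates,
    given a camera pose such that p_world = R @ p_cam + t."""
    return R @ np.asarray(p_cam, dtype=float) + t

# Illustrative pose: camera looking straight down from 1 m above the
# play-field origin. R flips the y and z axes so that the field plane
# becomes z = 0 in world coordinates.
R = np.array([[1.0,  0.0,  0.0],
              [0.0, -1.0,  0.0],
              [0.0,  0.0, -1.0]])
t = np.array([0.0, 0.0, 1.0])

# A marker detected 1 m in front of the camera lands on the field plane.
p_world = camera_to_world([0.2, 0.1, 1.0], R, t)
```

A real system would obtain the depth from stereo matching or TOF and R, t from camera calibration; only the final transform is shown.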
  • the technology for acquiring the position of the subject in the depth direction using a stereo camera is widely known.
  • a viewpoint moving camera may be used instead of the stereo camera.
  • a device that irradiates reference light such as near infrared rays and detects the reflected light may be provided, and the positions of the real objects 120a and 120b may be specified by an existing method such as TOF (Time of Flight). Further, the positions of the real objects 120a and 120b may be specified by detecting the contact position using the upper surface of the play field 20 as a touch pad.
  • When TOF or a touch pad is used, the real objects 120a and 120b are distinguished by integrating that information with the color information of each image in the captured image. Further, once detected, the real objects 120a and 120b can be tracked using existing visual tracking technology, making subsequent position coordinate acquisition more efficient.
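As a stand-in for the existing visual tracking technology mentioned above, the following sketch shows the simplest form of frame-to-frame tracking: each known real object is associated with the nearest detected position in the next frame. This is an illustrative assumption, not a method prescribed by the specification.

```python
import math

def associate(tracks, detections, max_dist=1.0):
    """Nearest-neighbour data association.
    tracks: {object_id: (x, y)} last known positions.
    detections: list of (x, y) candidate positions in the new frame.
    Returns the updated {object_id: (x, y)} mapping."""
    updated = {}
    remaining = list(detections)
    for oid, (px, py) in tracks.items():
        if not remaining:
            break
        best = min(remaining, key=lambda d: math.hypot(d[0] - px, d[1] - py))
        if math.hypot(best[0] - px, best[1] - py) <= max_dist:
            updated[oid] = best          # object moved a small distance
            remaining.remove(best)       # each detection is used once
    return updated

tracks = {"a": (0.0, 0.0), "b": (5.0, 5.0)}
detections = [(5.2, 4.9), (0.1, -0.1)]
new_tracks = associate(tracks, detections)
```

Practical trackers add motion prediction and appearance cues, but the association step has this shape.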
  • The information processing apparatus 10 acquires the positions and movements of the real objects on the play field 20, mainly those other than the real object 120a that it controls, determines the movement of the real object 120a based on them, and operates the real object 120a by transmitting a control signal according to the determination result.
  • the information processing apparatus 10 may be a game device or a personal computer, and may implement an information processing function by loading a necessary application program. Further, as will be described later, the information processing apparatus 10 may establish communication with another information processing apparatus or server via a network and send and receive necessary information.
  • the movements of the real objects 120a and 120b are obtained by tracking the time change of their position coordinates using the moving image data captured by the imaging device 12 as described above.
  • For the real object 120b moved by user operation, the time change of its position can also be specified from the content of the user's operations on the input device 14.
  • the movements of the real objects 120a and 120b may be acquired by means other than the captured image, or accuracy may be improved by integrating information acquired by a plurality of means.
  • the input device 14 receives user operations such as start / end of processing and driving of the real object 120b, and inputs a signal representing the operation content to the information processing device 10.
  • the input device 14 may be any common input device such as a game controller, a keyboard, a mouse, a joystick, or a touch pad, or any combination thereof.
  • the microphone 16 acquires ambient sound, converts it into an electrical signal, and inputs it to the information processing apparatus 10.
  • the information processing apparatus 10 mainly recognizes the user's voice out of the voice signal acquired from the microphone 16 and reflects it in the movement of the real object 120a controlled by the information processing apparatus 10 itself.
  • the display device 18 displays an image generated by the information processing device 10.
  • In this example, the display device 18 is a projector, and an image is displayed on the play field 20.
  • the display device 18 may be a display such as a general television monitor, or both the projector and the display may be included in the configuration.
  • the play field 20 may be a display.
  • the speaker 19 may be a general speaker, a sounding device such as a buzzer, or a combination thereof, and outputs a predetermined sound or voice as a sound according to a request from the information processing device 10. Connection between the information processing apparatus 10 and other apparatuses may be wired or wireless, and may be via various networks. Alternatively, any two or more of the information processing device 10, the imaging device 12, the input device 14, the microphone 16, the display device 18, and the speaker 19 may be combined or integrally provided. Further, an external device such as a speaker may be connected to the information processing apparatus 10.
  • FIG. 2 shows an internal circuit configuration of the information processing apparatus 10.
  • the information processing apparatus 10 includes a CPU (Central Processing Unit) 22, a GPU (Graphics Processing Unit) 24, and a main memory 26. These are connected to each other via a bus 30.
  • An input / output interface 28 is further connected to the bus 30.
  • Connected to the input/output interface 28 are: a communication unit 32 including a peripheral device interface such as USB or IEEE 1394 and a wired or wireless LAN network interface; a storage unit 34 such as a hard disk drive or nonvolatile memory; an output unit 36 that outputs data to output devices such as the display device 18 and the speaker 19; an input unit 38 that inputs data from the imaging device 12, the input device 14, and the microphone 16; and a recording medium drive unit 40 that drives a removable recording medium such as a magnetic disk, optical disk, or semiconductor memory.
  • the CPU 22 controls processing and signal transmission in the components inside the information processing apparatus 10 by executing the operating system stored in the storage unit 34.
  • the CPU 22 also executes various programs read from the removable recording medium and loaded into the main memory 26 or downloaded via the communication unit 32.
  • the GPU 24 has a function of a geometry engine and a function of a rendering processor.
  • the GPU 24 performs a drawing process according to a drawing command from the CPU 22 and outputs it to the display device 18 as appropriate.
  • the main memory 26 is composed of RAM (Random Access Memory) and stores programs and data necessary for processing.
  • the communication unit 32 establishes communication with the real objects 120a and 120b including the actuator and transmits a control signal. In a mode in which sound is output or an image is displayed on the real object 120a or the like, those data are also transmitted. As will be described later, in a mode in which a sensor is provided on the real object 120a or the like, the communication unit 32 may receive a measurement value from the sensor from the real object 120a or the like. Further, the communication unit 32 may establish communication with the network as necessary, and send and receive necessary files and data to and from an external server or information processing apparatus.
  • FIG. 3 shows the configuration of the real objects 120a and 120b and the information processing apparatus 10 in detail.
  • Each element described as a functional block performing various processes can be configured, in hardware, by a CPU (Central Processing Unit), memory, a microprocessor, other LSIs, actuators, sensors, and the like, and, in software, is realized by a program loaded into memory. Therefore, it is understood by those skilled in the art that these functional blocks can be realized in various forms by hardware only, software only, or a combination thereof, and are not limited to any one of them.
  • In the following description, the real object controlled by the information processing apparatus 10 is the real object 120a, and the real object operated by the user via the input device 14 is the real object 120b. However, the real object 120b need not be provided.
  • Alternatively, a real object that the user simply moves or places by hand may be used, such as a flat card, a doll with no internal structure, a game piece, a block, or an ornament. Even in this case, each object can be identified with high accuracy from the captured image by adding exterior variations such as different colors, patterns, or shapes, or by printing a two-dimensional barcode on it.
  • the real objects 120a and 120b include drive units 106a and 106b that operate according to a control signal from the information processing apparatus 10, and communication units 108a and 108b that receive necessary control signals and data from the information processing apparatus 10, respectively.
  • the control signal received by the real object 120a is determined by the information processing apparatus 10, and the control signal received by the real object 120b reflects a user operation via the input device 14.
  • The drive units 106a and 106b include actuators driven by control signals from the information processing apparatus 10. As shown in FIG. 1, when the real objects 120a and 120b self-propel by means of the wheels 122a and 122b, the actuators rotate the axles or change the steering angle.
  • The drive units 106a and 106b may also include actuators that generate motion other than via the wheels, as well as light-emitting elements, displays, speakers, vibrators, and the like, together with mechanisms that operate them. These mechanisms are likewise operated by control signals from the information processing apparatus 10 using existing technology.
  • the communication units 108a and 108b receive the control signal transmitted from the information processing apparatus 10, and notify the respective driving units 106a and 106b.
  • The communication units 108a and 108b hold the individual identification information of their own real objects 120a and 120b in internal memory. Whether a control signal transmitted from the information processing apparatus 10 is addressed to their own real object is then determined based on the individual identification information transmitted together with the control signal.
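A minimal sketch of this individual-identification check, with hypothetical packet field names:

```python
MY_ID = "object-120a"   # individual identification held in internal memory

def handle(packet, my_id=MY_ID):
    """Pass the control signal to the drive unit only if the id attached
    to the packet matches this real object's own identification."""
    if packet.get("id") != my_id:
        return None                     # addressed to another real object
    return packet["control"]            # forwarded to the drive unit

accepted = handle({"id": "object-120a",
                   "control": {"left": 1.0, "right": 1.0}})
ignored = handle({"id": "object-120b",
                  "control": {"left": 0.0, "right": 1.0}})
```

This lets the information processing apparatus broadcast signals while each real object acts only on its own.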
  • the real objects 120a and 120b may further include a sensor (not shown) that measures its own state.
  • By having each real object measure its own state, the positional relationships, changes in the shape of the real objects, and the like may be determined more accurately.
  • the communication units 108a and 108b transmit the measurement value obtained by the sensor to the information processing apparatus 10 together with its individual identification information.
  • For example, rotary encoders and steering angle sensors may be provided on the wheels 122a and 122b to specify the actual movement amount and movement direction.
  • a position sensor that acquires the absolute position of the real objects 120a and 120b, and a motion sensor such as an acceleration sensor, a gyro sensor, or a geomagnetic sensor may be provided.
  • Since the real objects 120a and 120b are basically moved by control signals from the information processing apparatus 10, acquiring actual measurement values in this way enables feedback control that corrects any error.
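As an illustration of such feedback control, the sketch below applies a simple proportional correction; the gain and the use of an encoder-reported distance are assumptions made for the example only.

```python
def corrected_command(commanded, measured, gain=0.5):
    """Return the next movement command, nudged against the observed
    error between what was commanded and what the sensors measured."""
    error = commanded - measured
    return commanded + gain * error

# Commanded 10 cm of travel, but the wheel encoders report only 8 cm
# (e.g. due to wheel slip), so the next command is increased.
next_cmd = corrected_command(10.0, 8.0)
```

A real controller would typically use a full PID loop; the proportional term alone conveys the idea.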
  • Alternatively, joints that the user can bend and stretch may be provided on the real objects 120a and 120b, with potentiometers introduced to specify their angles.
  • The information processing apparatus 10 includes a communication unit 50 that transmits control signals to the real objects 120a and 120b, a state specifying unit 52 that specifies the state on the play field 20 including the positional relationship of the real objects 120a and 120b, a real object information storage unit 54 that stores information on the real objects,
  • an information processing unit 62 that performs information processing based on the specified state, and a real object control unit 60 that generates control signals resulting from user operations or from processing in the information processing unit 62.
  • the state specifying unit 52 sequentially acquires image frames of moving images from the imaging device 12 and analyzes them to specify the positions of the real objects 120a and 120b at predetermined time intervals.
  • Various techniques are widely known as techniques for detecting and tracking an image of an object by image analysis, and any of them may be adopted in the present embodiment.
  • A general method can also be used as the technique for identifying a position in three-dimensional space by integrating image information in the captured image with position information in the depth direction obtained using stereo images, TOF, or the like.
  • In a mode in which sensors are provided on the real objects, the state specifying unit 52 may also use the measurement values transmitted from the real objects 120a and 120b to specify movement, position, shape, posture, and the like in detail.
  • In this case, the communication unit 50 receives the measurement values transmitted from the real objects 120a and 120b and supplies them to the state specifying unit 52.
  • To distinguish the real objects 120a and 120b from each other, they may be provided with markers that emit light in different specific colors, or with two-dimensional barcodes.
  • the real objects 120a and 120b may be distinguished by other appearance features such as color, shape, pattern, size, and the like.
  • the real object information storage unit 54 stores information associating individual identification information of the real objects 120a and 120b with features on their appearance.
  • the state specifying unit 52 refers to the information and specifies individual identification information corresponding to the appearance feature of the image detected from the captured image, thereby distinguishing the real objects 120a and 120b at each position.
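The lookup performed here can be sketched as a simple table from appearance features to individual identification information; the feature encoding (marker color) and identifiers below are hypothetical.

```python
# Hypothetical contents of the real object information storage unit:
# appearance feature -> individual identification information.
REAL_OBJECT_INFO = {
    "red":  "object-120a",
    "blue": "object-120b",
}

def identify(detected_feature, info=REAL_OBJECT_INFO):
    """Return the individual id for a detected appearance feature,
    or None if the feature matches no registered real object."""
    return info.get(detected_feature)

oid = identify("red")
```

In practice the key would be a richer descriptor (color plus shape, or a decoded two-dimensional barcode), but the lookup has this structure.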
  • the individual identification information and the position coordinates of the real objects 120a and 120b are sequentially supplied to the information processing unit 62.
  • The information processing unit 62 performs the information processing to be executed, based on the positional relationship of the real objects 120a and 120b, user operations on the input device 14, audio signals acquired by the microphone 16, and the like. For example, when realizing a game in which the real object 120b operated by the user competes with the real object 120a controlled by the information processing apparatus 10, the information processing unit 62 determines the movement of the real object 120a under its control, and the game progresses while it calculates scores, determines wins and losses, generates display images, and determines output sounds. The game program, rules for determining movement, data necessary for image generation, and the like are stored in the scenario storage unit 56.
  • When using the audio signal acquired by the microphone 16, the information processing unit 62 is provided with a voice recognition function using general techniques in order to detect predetermined keywords from the audio signal.
  • The information processing unit 62 outputs generated display image data to the display device 18, and decodes audio data to be output and outputs it to the speaker 19.
  • When the display device 18 is a projector that projects an image onto the play field 20, the image on the play field 20 can be changed according to the progress of the game, the positions of the real objects 120a and 120b, and the like. Information such as a score may be included in the image.
  • the real object control unit 60 generates a control signal so that the real object 120a moves with the movement determined by the information processing unit 62. Further, it interprets the contents of the user operation on the input device 14 and generates a control signal for the real object 120b that is the operation target of the user. These control signals are associated with the individual identification information of each of the real objects 120a and 120b supplied from the information processing unit 62.
  • the communication unit 50 establishes communication with the real objects 120a and 120b, and transmits the control signal generated by the real object control unit 60. Specifically, the signal to be transmitted varies depending on the control method, and a technique generally used in the field of robot engineering or the like may be appropriately employed.
  • FIG. 4 shows an external configuration example of the real object 120a.
  • As shown, the real object 120a of this example includes wheels 122a at the lower part of a rectangular parallelepiped main body, and a marker 126 that emits light of a predetermined color is provided at the top.
  • the state specifying unit 52 of the information processing apparatus 10 acquires the position coordinates of the real object 120a by detecting the image of the marker 126 in the captured image. That is, the center 129 of the marker 126 is set as the tracking point of the real object 120a.
  • the marker 126 is disposed at a position close to one side of the rectangle that forms the upper surface 124a of the real object 120a.
  • the side surface close to the marker 126 can be defined as the front of the real object, and recognition of forward and backward movement can be shared with the user.
  • the marker 126 is mounted with, for example, an infrared light emitting diode or a white light emitting diode.
  • the color may be switched by attaching a color filter cap to the surface of the white light emitting diode.
  • the marker 126 may incorporate a wavelength conversion element such as an up-conversion phosphor.
  • the wheels 122a are each provided with a motor 128 that rotates the axle.
  • the motor 128 corresponds to the drive unit 106a shown in FIG. 3 and is driven by a control signal transmitted from the information processing apparatus 10 to rotate the wheel 122a at the requested rotation speed and direction.
  • The wheels 122a may be provided with a mechanism for changing the steering angle by an actuator. This also produces movements such as changing the moving direction of the real object 120a or turning on the spot.
  • The real object 120a may be provided with a switch for disengaging the wheels 122a from the motor 128 so that the user can roll the real object 120a by hand.
  • A single real object may also be assembled using the real object 120a shown in FIG. 4 as a minimum unit.
  • FIG. 5 shows a connection example of a plurality of real objects in a bottom view.
  • the real object 130a shown on the left side of the figure has a configuration in which two real objects 120a of the smallest unit are connected to the front and rear. In this case, the real object 130a travels in the longitudinal direction as indicated by the arrow.
  • the coupling is realized by, for example, a plate capable of fitting two real objects 120a, a cover covering the whole, a binding band, and the like.
  • Each real object 120a may be provided with a concavo-convex connection portion, a hook-and-loop fastener, a magnet, and the like to be connected. The same applies to other examples.
  • the real object 130b shown in the upper right of FIG. 5 has a configuration in which two minimum real objects 120a are connected to the left and right. In this case, as indicated by the arrow, the real object 130b travels in a direction perpendicular to the longitudinal direction.
  • The real object 130c shown in the lower right of the drawing has a configuration in which minimum-unit real objects 120a are connected to each other so as to form a triangle inside. In this case, the real object 130c turns in place as indicated by the arrow.
  • FIG. 6 is a top view showing a modification of the number and positions of markers provided on a real object.
  • the real object 120c shown on the left side of the figure includes two markers 126a and 126b.
  • The state specifying unit 52 of the information processing apparatus 10 acquires, as the tracking point of the real object 120c, the position of the midpoint 132 of the line segment connecting the markers 126a and 126b in the captured image. As in the example shown in FIG. 4, the front and back can be distinguished by shifting the midpoint 132 from the center of the upper surface of the real object 120c.
  • A boom 134 may be provided on the real object 120c with the markers 126a and 126b arranged at both ends, so that their positions can be adjusted to stay within the field of view of the imaging device 12 regardless of the surrounding conditions.
  • the real object 120d shown on the right side of the figure includes three markers 126c, 126d, and 126e.
  • the state specifying unit 52 detects the images of the markers 126c, 126d, and 126e in the photographed image, and acquires the position of the center of gravity 136 of the triangle having the vertexes as the tracking points of the real object 120d.
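All three tracking-point rules described so far (the marker center, the midpoint of two markers, and the center of gravity of three markers) reduce to the mean of the detected marker positions, as this small sketch shows:

```python
def tracking_point(markers):
    """Mean of the marker positions: the marker centre for one marker,
    the midpoint for two, and the triangle's centre of gravity for
    three. 'markers' is a list of (x, y) positions."""
    n = len(markers)
    xs = sum(m[0] for m in markers) / n
    ys = sum(m[1] for m in markers) / n
    return (xs, ys)

mid = tracking_point([(0.0, 0.0), (2.0, 0.0)])               # midpoint
cog = tracking_point([(0.0, 0.0), (3.0, 0.0), (0.0, 3.0)])   # centre of gravity
```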
  • In this example, the front and rear can be distinguished by varying the number of markers between the front and rear of the real object 120d.
  • FIG. 7 shows, as another example of the marker provided on the real object, a top view of the real object when the light emission of the marker is used to distinguish the front and back of the real object.
  • the real object 120e includes markers 126f and 126g at both ends of the boom 134 as in the real object 120c of FIG.
  • the boom 134 may be disposed at an arbitrary position such as the center of the upper surface of the real object 120e as illustrated.
  • the positions of the markers 126f and 126g may be freely arranged by the user.
  • the booms 134 may not be provided, and the markers 126f and 126g may be directly installed on the real object 120e main body.
  • a plurality of markers 126f and 126g are arranged on a straight line that can define the front-rear direction, such as a straight line connecting the left and right of the upper surface of the real object 120e.
  • because the left and right of the markers are determined naturally once the forward direction is known, the information processing apparatus 10 lights the left marker 126f according to that information.
  • for example, the wheels are rotated in both the forward and reverse directions with one of the markers 126f and 126g lit, and from the movement of the markers in the captured image, the rotation direction in which the lit marker is on the left is identified and determined to be the forward direction.
  • the state specifying unit 52 of the information processing apparatus 10 records the information thus determined in the real object information storage unit 54 in association with the individual identification information of the real object 120e, and uses it for subsequent real object control. As long as the user knows the above rule, no discrepancy arises with the control by the information processing apparatus 10.
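The calibration described above amounts to testing, with one marker lit, on which side of the observed direction of travel the lit marker lies. A minimal sketch, assuming image coordinates with the y-axis pointing up (with y pointing down, the sign test flips):

```python
def lit_marker_on_left(heading, lit, unlit):
    """Return True if the lit marker lies to the left of the travel
    direction. `heading` is the observed motion (dx, dy) of the
    tracking point; `lit` and `unlit` are the marker positions,
    whose midpoint is the tracking point."""
    cx, cy = (lit[0] + unlit[0]) / 2.0, (lit[1] + unlit[1]) / 2.0
    rx, ry = lit[0] - cx, lit[1] - cy
    # z-component of cross(heading, r): positive means r points left
    return heading[0] * ry - heading[1] * rx > 0
```

The wheel rotation direction for which this test holds would then be recorded as the forward direction.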
  • the tracking point is, as with the real object 120c in FIG. 6, the midpoint 132 of the line segment connecting the markers 126f and 126g.
  • the markers 126f and 126g may be distinguished by being lit or flashing, by emitting visible or invisible light, or by different emission colors. Note that the arrangements and numbers of markers shown in FIGS. 6 and 7 are examples and may be determined appropriately according to the size and shape of the real object. In addition, as described above, when a real object can be recognized by appearance features other than a marker, the marker need not be provided.
  • even if real objects have the same appearance, an intended world can be constructed by regarding them as a variety of objects, so that pretend play and games can be realized. That is, it is not necessary to change the appearance of a real object according to the world to be constructed. Even in such a case, by lighting one real object's marker in red and another's in white, for example, the user can intuitively grasp the former as an ambulance and the latter as an ordinary car. At this time, an effect such as blinking red light emission gives the user a further sense of reality.
  • the user himself / herself can distinguish between the real object controlled by the information processing apparatus 10 and the real object operated by the user.
  • the information processing apparatus 10 may clearly indicate in advance which color of the real object is the target operated by the user.
  • for example, the information processing apparatus 10 lights the markers in the respective colors and moves one of the real objects slightly. If a rule is established that the moved real object is the one controlled by the information processing apparatus 10 (or the one operated by the user), the user can know which color of real object he or she operates and can start playing smoothly.
  • FIG. 8 is a flowchart illustrating a processing procedure in which the information processing apparatus 10 controls a real object according to the state on the play field 20. This flowchart is started when a user inputs a processing start request via the input device 14 or the like in a state where the real objects 120a and 120b are placed on the play field 20. Note that when the user operates the real object 120b via the input device 14, the processing for that is performed at an independent timing and is not included in the illustrated flowchart.
  • the state specifying unit 52 of the information processing device 10 requests the imaging device 12 to start shooting (S10).
  • when the imaging device 12 starts capturing a moving image accordingly, image frames are acquired at a predetermined frame rate and transmitted to the information processing device 10.
  • the period of the time step t may be the same as the imaging period of the image frame in the imaging device 12, or may be longer than that. That is, all image frames of a moving image may be processed, or processing may be performed while thinning out image frames every predetermined number.
  • the state specifying unit 52 analyzes the image frame and acquires state information such as the position coordinates of the real objects 120a and 120b on the plane of the play field 20 (S16).
  • the position coordinates on the play field 20 are obtained by detecting an image of each real object marker from the image frame and appropriately performing coordinate conversion based on the detected image.
  • the information acquired here may vary depending on the embodiment. As described above, the value measured by the sensor provided on the real object, the content of the user operation via the input device 14, the audio information acquired by the microphone 16 and the like may be further used. These pieces of information are reflected in subsequent information processing as needed.
  • the movement determined is not limited to the moving direction; if a movement that improves the situation is determined for each time step on the play field 20, a real object that can flexibly cope with changes in its surroundings can be realized. Further, even in a complicated system with many real objects, the motion can be determined relatively easily by selecting the best direction at each step through statistical processing.
  • the information processing unit 62 may determine not only the moving direction of the real object but also the amount and direction of change of a movable part such as a joint, the start of lighting of a light emitting element, the image to be displayed on a display built into the real object, and the sound to be generated from a speaker built into the real object. Further, an image to be newly displayed by the display device 18 such as a projector, or a sound effect to be generated from the speaker 19, may be determined. The real object control unit 60 then generates a control signal based on the determination result, adds image data and audio data as necessary, and transmits the control signal from the communication unit 50, thereby controlling the real object 120a (S20). At this time, the information processing unit 62 may output display image and audio data to the display device 18 and the speaker 19 as appropriate.
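The S10-S20 procedure reduces to a per-time-step loop over image frames. The following sketch uses hypothetical stand-in callables for marker detection, motion decision, and signal transmission:

```python
def control_loop(frames, detect_state, decide_motion, send_control, skip=1):
    """Process every `skip`-th image frame (thinning out frames when
    the time step is longer than the imaging period), acquire state
    information (S16), decide each real object's motion, and transmit
    control signals (S20). All callables are illustrative stand-ins."""
    for t, frame in enumerate(frames):
        if t % skip:
            continue
        state = detect_state(frame)            # positions on the play field
        for obj_id, command in decide_motion(state).items():
            send_control(obj_id, command)      # control signal per object
```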
  • an image or light may be projected onto a real object by a projector connected to the information processing apparatus 10, or a sound matching the movement may be generated by the speaker 19. Both may be performed simultaneously, or only one of them. In either case, a real object can be made to appear as if it is emitting light or speaking, realizing as much expression as possible while suppressing the above disadvantages.
  • FIG. 9 shows a realization example of “competition / competition” among the aspects that can be realized in the present embodiment.
  • a real object 120a controlled by the information processing apparatus 10 and a real object 120b operated by the user are placed.
  • the play field 20 includes miniatures such as trees 140 and houses 142a and 142b. These are real objects that do not have a drive mechanism and can be placed freely by the user.
  • a river 144, a bridge 146, and sand 148 are displayed. These may be projected by a projector or drawn on paper or the like.
  • the information processing apparatus 10 acquires information on the position and range of those areas on the play field 20.
  • the information may be stored in the scenario storage unit 56. Even when the user draws on the spot, the information can be acquired by detecting from the captured image.
  • the user moves the real object 120b to play against or compete with the real object 120a controlled by the information processing apparatus 10.
  • Various rules can be considered in this case.
  • the speed to a predetermined goal such as the tree 140 may be competed or directly hit.
  • the score may be lowered when hitting a placed real object such as the tree 140 or the houses 142a and 142b.
  • laser irradiation means (not shown) regarded as a gun may be provided on the real objects 120a and 120b, and a score may be given when the laser, fired while aiming, hits the opponent's real object.
  • the real objects 120a and 120b may have a shape such as a tank, for example. Further, when a traveling sound of a tank or the like is generated from the speaker 19 or the like in accordance with the movement of the real objects 120a and 120b, a more realistic feeling is given.
  • the information processing apparatus 10 controls the river 144 and the sand 148 displayed on the play field 20 so that the movement of the real objects 120a and 120b is restricted.
  • when the user moves his or her real object 120b toward a part of the river 144 not spanned by the bridge 146, the real object is forced to stop at the edge of the river 144. That is, even if the input device 14 commands movement in the direction of the river 144, the information processing unit 62 controls the real object 120b so that its speed becomes zero at the edge of the river 144. As a result, the user resumes movement by backing up or changing the direction of the real object 120b.
  • in the area of the sand 148, the moving speed is reduced or the direction change is slowed.
  • the information processing unit 62 controls the real object 120b by reducing the moving speed requested by the user to the input device 14 by a predetermined rate or reducing the response of the change in the steering angle.
  • the speed and direction are calculated so that the same restriction is imposed on the real object 120a controlled by the information processing apparatus 10.
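The movement restriction described above amounts to scaling the requested speed by a per-region limit, applied identically to the user-operated and apparatus-controlled real objects. A minimal sketch; the region names and limit factors are illustrative assumptions:

```python
# Per-region speed factors mapped onto the play field, as in data 112:
# the river stops travel entirely, sand halves it, the bridge is free.
REGION_LIMITS = {"river": 0.0, "sand": 0.5, "bridge": 1.0}

def limited_speed(requested, region):
    """Scale the speed requested via the input device 14 (or decided
    by the information processing unit 62) by the region's factor."""
    return requested * REGION_LIMITS.get(region, 1.0)
```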
  • the real objects 120a and 120b may be prevented from going out of the play field 20.
  • a region where speed and direction change can be accelerated may be provided.
  • a team battle may be made by placing a plurality of real objects 120a and 120b. For example, a plurality of users may each bring their own real objects 120b to form a team and play against the same number of real objects 120a controlled by the information processing apparatus 10.
  • FIG. 10 shows an example of data prepared in the scenario storage unit 56 in order to realize the form shown in FIG.
  • the scenario storage unit 56 prepares data of the image 110 to be projected.
  • a river 144, a bridge 146, and a sand 148 are drawn on the image 110.
  • the information processing unit 62 reads the data of the image 110 from the scenario storage unit 56 at the start of the battle, and outputs the data to the projector that is the display device 18 for projection.
  • the deformation rule of the image 110 may also be stored in the scenario storage unit 56, and the projected image may be changed over time. For example, by displaying an image in which the water of the river 144 overflows over time or the bridge 146 collapses at the timing when the real object 120b crosses, the real object and the image are fused, and a more thrilling battle realizable.
  • the area of the river 144 and the sand 148 is acquired as described above, and the data 112 mapped to the plane corresponding to the play field 20 is prepared.
  • a motion limit amount in each region is set.
  • in the data 112, dotted-line areas indicating the positions of the real objects acquired at each time step are also shown.
  • This data is obtained by converting and mapping the position of the image detected from the captured image by the state specifying unit 52 into the position coordinates on the plane of the play field 20 (xy plane in the drawing).
  • in FIG. 10, in the vicinity of the area of each real object, the identification information "#T0" and "#T1" of the trees 140, "#H0" and "#H1" of the houses 142a and 142b, and "#C0" and "#C1" of the real objects 120a and 120b are also shown.
  • the state specifying unit 52 creates individual identification information of each real object and information related to the position for each time step, and supplies the information to the information processing unit 62.
  • the positions of the real objects 120a and 120b are changed by a user operation or control by the information processing apparatus 10.
  • the information processing unit 62 monitors whether or not they enter the area where the speed limit is imposed, and limits the speed by a set amount.
  • for example, the position of the real object 120a controlled by the information processing apparatus 10 (identification information "#C0") is compared with the position of the real object 120b operated by the user (identification information "#C1"), and the optimum moving direction of the real object 120a at that time, such as a direction in which it can move easily, is determined and controlled accordingly.
  • the handling of the data related to the position information is the same in the forms described later.
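The per-time-step decision described above, comparing the two positions and choosing an easy direction to move in, can be sketched as picking, among candidate headings, the one that most reduces the distance to the opponent without entering a restricted area. All names here are illustrative:

```python
import math

def best_direction(pos, target, candidates, is_blocked):
    """Choose the candidate step that most shortens the distance from
    `pos` to `target` while avoiding blocked cells (e.g. the river)."""
    best, best_d = None, math.inf
    for dx, dy in candidates:
        nxt = (pos[0] + dx, pos[1] + dy)
        if is_blocked(nxt):
            continue
        d = math.hypot(target[0] - nxt[0], target[1] - nxt[1])
        if d < best_d:
            best, best_d = (dx, dy), d
    return best
```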
  • FIG. 11 shows an implementation example of “cooperation / assistance” among the modes that can be realized in the present embodiment.
  • An annular road 150 and a pedestrian crossing 156 are displayed in the play field 20.
  • a real object 120a controlled by the information processing apparatus 10 and a real object 120b operated by the user are placed.
  • miniatures such as trees 152a and 152b and a signal 158 and dolls 154a and 154b are placed in the play field 20.
  • the trees 152a and 152b and the dolls 154a and 154b are real objects that do not have a driving mechanism, and can be freely placed by the user or moved by hand.
  • the signal 158 includes blue, yellow, and red light emitting diodes, and is configured so that the user can switch the light emission color via the input device 14 or the information processing device 10. In any case, since the actual control is performed by the information processing apparatus 10, the information processing apparatus 10 can naturally recognize the switching of the emission color.
  • the real objects 120a and 120b travel on the road 150 counterclockwise.
  • the user causes the real object 120b that he or she operates to travel ahead of the real object 120a controlled by the information processing apparatus 10, and removes from the road 150 objects that obstruct the traveling of the real object 120a.
  • “cooperation” is performed so that the real object 120a can travel quickly.
  • in the illustrated example, the tree 152b has fallen and blocks the road 150, so the user pushes the tree 152b out of the road 150 with the real object 120b. When the real object 120a controlled by the information processing apparatus 10 reaches the real object 120b, it stops there until the tree 152b is removed.
  • the real object 120a controlled by the information processing apparatus 10 can be represented as an ambulance, and the real object 120b operated by the user can be regarded as a private car, thereby producing a situation such as a citizen cooperating to run the ambulance quickly.
  • each real object may be given the shape of the corresponding vehicle. Further, a driving sound or a brake sound of a car may be generated from the speaker 19 or the like in accordance with the movement of each real object.
  • the above-described “disturbance” mode may be realized by interfering with the real object 120b operated by the player so that the real object 120a of the opponent team cannot pass.
  • the information processing apparatus 10 can realize movement that advances while avoiding interference, by moving the real object 120a away from the obstructing real object 120b and determining a direction with a small amount of retreat.
  • alternatively, a third real object controlled by the information processing apparatus 10 or operated by another user may be prepared, and the game may be such that the real object 120b operated by the user obstructs the real object 120a controlled by the information processing apparatus 10 from rushing at the third real object.
  • for example, if the third real object is an ambulance, the real object 120a controlled by the information processing apparatus 10 is a tank, and the real object 120b operated by the user is a private car, a situation can be produced in which the private car protects the ambulance from being attacked by the tank.
  • the child moves the dolls 154a and 154b by hand.
  • while the signal 158 is red, the real object 120a controlled by the information processing apparatus 10 continues to travel on the road 150.
  • when the signal 158 turns blue, the real object 120a stops before the pedestrian crossing 156 so that the child can move the doll 154b or the like across the pedestrian crossing 156.
  • if a doll tries to cross while the signal 158 is red, or to cross the road where there is no pedestrian crossing, the real object 120a may stop suddenly just before it, or the speaker 19 may generate a brake sound, so that the child recognizes that this is a dangerous act.
  • the target to which the real object 120a controlled by the information processing apparatus 10 reacts may be an object that can be held and moved by hand instead of the real object 120b having the same configuration.
  • Such a mode is particularly effective for a user who is difficult to operate with the input device 14 such as an infant.
  • the tree 152b may be excluded using a hand instead of the real object 120b.
  • in the “education” mode, the user operates the real object 120b instead of moving the dolls 154a and 154b, and may be made to learn automobile traffic regulations by generating interactions with the real object 120a controlled by the information processing apparatus 10 while traveling on a more complicated road.
  • the real object 120a controlled by the information processing apparatus 10 may be stopped in front of the dolls 154a and 154b on the roadside.
  • the user places a doll on the stopped real object 120a and lets it depart; the real object 120a then stops again in front of a separately placed miniature of a house. This makes taxi play possible. If the rule is to stop the real object 120a in front of a miniature bus stop, bus play is possible by having the dolls get on and off.
  • FIG. 12 shows an implementation example of “act” among the modes that can be realized in the present embodiment.
  • a plurality of real objects 120a controlled by the information processing apparatus 10 are placed in the play field 20, and the user appreciates playing a mass game with a unified movement.
  • since the position of each real object 120a is acquired from the captured image, even if the user places the real objects 120a at arbitrary positions, they can return to the set formation and the mass game can start.
  • likewise, even if the user picks up and moves a real object 120a during the performance, it can return to its specified position and the mass game can continue.
  • data in which the time change of the position coordinates of each real object 120a is set is stored in the scenario storage unit 56.
  • since the initial formation and intermediate positions are defined, a real object 120a that deviates from them can be returned to the defined position.
  • the information processing unit 62 determines an optimal moving direction at each time step while referring to the setting of the position coordinates so as not to hinder the movement of the other real object 120a in the returning process.
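A grid-based sketch of one such time step, in which a displaced real object moves toward its defined position but waits if the next cell is occupied by another real object (the grid abstraction is an assumption for illustration):

```python
def step_toward(pos, goal, occupied, step=1):
    """One time step of returning to the defined formation position:
    advance up to `step` cells along each axis toward `goal`, pausing
    if the next cell is occupied by another real object 120a."""
    dx = max(-step, min(step, goal[0] - pos[0]))
    dy = max(-step, min(step, goal[1] - pos[1]))
    nxt = (pos[0] + dx, pos[1] + dy)
    return pos if nxt in occupied else nxt
```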
  • the information processing apparatus 10 may add effects by the movement of light, such as applying a spotlight using a projector or a lighting device (not shown), or may play music from the real object 120a itself or the speaker 19.
  • the contents of the mass game and the music may be switched depending on the combination of the shape of the placed real object 120a and the color of the marker.
  • some change may occur depending on the user's behavior. For example, when the user applauds or cheers at the end of the mass game, the mass game may be started again as an encore. In this case, the information processing unit 62 determines the start of the encore when the magnitude of the audio signal acquired by the microphone 16 exceeds a threshold value.
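The decision to restart the mass game in response to applause reduces to a threshold test on the magnitude of the microphone signal; a minimal sketch with illustrative normalized samples and an assumed threshold:

```python
def restart_requested(samples, threshold=0.6):
    """Return True when the magnitude of the audio signal acquired by
    the microphone 16 exceeds the threshold (values are assumptions)."""
    return max(abs(s) for s in samples) > threshold
```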
  • a mass game may proceed in accordance with the user's command, or the rhythm, tempo, and tune may be switched.
  • for example, when the user presses a predetermined button of the input device 14 at a desired time signature, the real object 120a moves in accordance with that time signature.
  • alternatively, the user's counting voice may be acquired by the microphone 16.
  • the imaging device 12 may photograph the user actually conducting by hand or by waving a marker imitating a baton, and the timing may be adjusted accordingly.
  • one of the plurality of real objects 120a in this example may be a real object operated by the user or a real object moved by the user's hand.
  • the information processing apparatus 10 controls the other real object 120a so as to move in accordance with the real object.
  • the other real objects 120a may move with the same movement as the real object moved by the user, or may move in a line with it. These correspond to the “imitation” forms described above.
  • alternatively, the other real objects may surround the real object moved by the user, or align themselves so that it is always at the center, following it as it moves.
  • in this case, the distance and positional relationship with the real object moved by the user, the distances and positional relationships among the other real objects 120a, and their temporal changes are set in advance and stored in the scenario storage unit 56. If the next moving direction is calculated at every time step for all the real objects 120a to be controlled, movement that matches the real object moved by the user can be realized.
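With offsets stored in the scenario storage unit 56, the target positions of the other real objects at each time step follow directly from the position of the real object the user moves. A minimal sketch (the offset values are illustrative):

```python
def formation_targets(leader, offsets):
    """Target positions for the other real objects 120a: keep preset
    offsets relative to the real object moved by the user, e.g. to
    surround it or line up behind it."""
    lx, ly = leader
    return [(lx + ox, ly + oy) for ox, oy in offsets]
```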
  • a game may also be realized that challenges whether a mass game can be performed according to a model, by mixing a real object operated by the user among a plurality of real objects 120a controlled by the information processing apparatus 10.
  • the information processing device 10 plays music from the speaker 19 and causes the display device 18 to display a moving image representing a model movement.
  • the user places an actual object that he / she operates in one of the initial formations, and operates it so that he / she follows the example.
  • the information processing unit 62 controls other real objects and detects an event in which the real object operated by the user deviates from the model based on the captured image. Then, a final score is calculated by deducting points according to the number of deviations and displayed on the display device 18 at the end of the mass game.
  • the information processing apparatus 10 may assist the operation. Specifically, if the actual movement of the real object 120b differs from the movement requested via the input device 14 for some reason, the control signal reflecting the operation content of the input device 14 is finely adjusted to bring the actual movement closer to the request. For example, when an excessive load is applied to the real object 120b, it may meander even if the user's operation commands straight travel. In such a case, the information processing unit 62 adjusts the rudder angle in a direction that suppresses the meandering so that the real object 120b can actually go straight.
  • specifically, the actual movement, obtained from the change in the position of the real object in the photographed image or from measurement values of a sensor, is compared with the user's operation content input from the input device 14, thereby detecting the difference between the request and the actual movement. The control signal is then adjusted so that this difference becomes small.
  • This form corresponds to the “auxiliary” described above.
  • the assist target is not limited to the traveling of the real object; the same applies to the movement of a gripper or an arm. Such adjustment can be performed at any time in any of the other modes.
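The adjustment described above is essentially a proportional correction applied to the control signal; a minimal sketch, with the gain as an assumed tuning parameter:

```python
def assist_rudder(requested_angle, desired_heading, observed_heading, gain=0.5):
    """Nudge the rudder angle so the heading observed in the captured
    image (or from sensor values) approaches the heading the user
    requested via the input device 14."""
    error = desired_heading - observed_heading
    return requested_angle + gain * error
```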
  • FIG. 13 shows an example of a mode realized when a function for grasping an object is added.
  • a real object 120a controlled by the information processing apparatus 10, a real object 120b operated by a user, and a plurality of blocks 162 are placed.
  • the block 162 may be a lump such as a synthetic resin having no mechanism inside.
  • each of the real objects 120a and 120b includes a gripper 160a or 160b for gripping the block 162.
  • the user operates the real object 120b via the input device 14 to bring it close to one block 162 and opens/closes the gripper 160b to grasp it. The grasped block 162 is then carried to the user's own position 164b displayed on the play field 20. Similarly, the information processing apparatus 10 controls the real object 120a to carry blocks 162 to the other position 164a. The winner is the one who brings more blocks 162 to his or her own position. Thereby, the above-described “match” and “disturbance” modes can be realized.
  • in the illustrated example, the grippers 160a and 160b sandwich the block 162 from the left and right by opening and closing; the base portion of the gripper may be movable in the vertical direction so that the block 162 can be lifted.
  • if a gripper is provided at the tip of an arm whose joint angles can be controlled, more complicated operations, such as lifting the block 162 to a high position or turning it upside down, may be performed.
  • a forklift mechanism for inserting and lifting a claw between the block 162 and the floor may be provided.
  • the shape of the block 162 is also appropriately optimized to facilitate carrying.
  • FIG. 14 shows another example of a mode realized when a function of grasping an object is added. This example realizes the above-mentioned form of “cooperation / assistance”.
  • a real object 120a having a gripper, controlled by the information processing apparatus 10, and a plurality of blocks 170 of various colors are placed.
  • the user 8 assembles the block 170 in the work area 172 provided in the play field 20. As shown in the figure, they may be assembled three-dimensionally or arranged in a plane. In the former case, it may be simply placed like a building block, or the blocks 170 may be assembled as a mutually connectable structure.
  • the user 8 specifies the color of the next necessary block by voice during assembly.
  • for example, “Red!” is designated.
  • the real object 120a searches for the block of the designated color among the blocks 170 in the area other than the work area 172 on the play field 20, and carries it close to the user. If the work area 172 is fixed, it can be estimated that the user 8 is in the vicinity thereof.
  • the information processing unit 62 recognizes the designated color based on the audio signal acquired by the microphone 16 and controls the block 170 of that color to be grasped by the real object 120a and carried to the estimated position.
  • alternatively, the position of the user 8 may be detected directly, under a rule that the user 8 stays within the field of view of the imaging device 12. In this case, even if the work area 172 is not clearly defined, carrying blocks 170 placed farther from the detected position of the user 8 avoids carrying off blocks that are being assembled.
  • the designation of the block by voice is not limited to color; any attribute such as size or shape, or a combination thereof, may be used. In this form, the user can assemble the blocks as he or she likes, and the real object 120a improves work efficiency by carrying the blocks selected in the process. Since both hands are used for assembling, designating blocks by voice keeps the designation itself from reducing work efficiency.
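The selection step can be sketched as matching the spoken attribute against blocks detected outside the work area, so that blocks already in use are not taken. Names and data shapes here are hypothetical:

```python
def pick_block(spoken, blocks, work_area):
    """Return the position of the first block whose color matches the
    voice designation, skipping any block inside the work area 172.
    `blocks` is a list of (position, color) pairs from image analysis."""
    color = spoken.strip().rstrip("!").lower()
    for pos, block_color in blocks:
        if block_color == color and pos not in work_area:
            return pos
    return None
```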
  • a model selected from a completed model prepared in advance may be assembled.
  • the user selects an object to be assembled from the models displayed on the paper or the display device, and designates it using the input device 14.
  • the scenario storage unit 56 of the information processing apparatus 10 stores an assembly order and block identification information used for each model. Accordingly, it is possible to realize a mode in which the real object 120a selects and carries the next necessary block by determination by the information processing apparatus 10 without designating the block by the user. At this time, information relating to how the carried blocks are connected may be displayed on the display device 18 or projected onto the play field 20.
  • the real object 120a may be configured to sort the plurality of blocks 170 according to a predetermined standard such as color, shape, and size.
  • in this case, the information processing unit 62 first specifies, based on the captured image, the number of types of blocks 170 on the play field 20, for example the number of colors, and sets the corresponding number of areas on the play field 20. By repeating the control of having the real object 120a grasp a block 170 and carry it to the area corresponding to its attribute, such as color, a group of blocks 170 is formed for each type. This makes it much easier to find a desired block 170, especially when there are a large number of blocks 170.
  • the sorting operation may be performed in parallel while the user 8 assembles the blocks 170, and not only during assembly but also, for example, during the pretend play described later. Further, the real object 120a may tidy up by collecting scattered blocks 170 in one place regardless of their attributes.
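The sorting behavior described above can be sketched as: count the types present, assign one area per type, and route each block to its area. A minimal sketch using color as the attribute:

```python
def sorting_plan(blocks):
    """Assign one area index per color present on the play field and
    return (position, area index) pairs telling the real object 120a
    where to carry each block 170."""
    colors = sorted({color for _, color in blocks})
    area_of = {c: i for i, c in enumerate(colors)}
    return [(pos, area_of[color]) for pos, color in blocks]
```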
  • a work area for the real object 120a may be provided, and the block 170 may be assembled in parallel with the real object 120a.
  • for example, the user and the real object 120a separately assemble the same model selected by the user. The user 8 may also assemble while referring to the real object 120a's assembly as an example of the procedure.
  • the completed form may be displayed on the display device 18 and both may assemble simultaneously, competing for speed to completion. Alternatively, one object may be assembled cooperatively in a single work area.
  • the blocks 170 need only be placed in position when arranged in a plane, but for three-dimensional assembly, providing the real object 120a with the above-described arm or a crane allows the block 170 to be lifted to a high position.
  • FIG. 15 is a diagram for explaining an example of a block connection method in a mode in which a real object assembles a block. This figure is a side view of the process of connecting the blocks by fitting the cylindrical concave portions 176 provided in the block 170b into the columnar convex portions 174 provided in the block 170a.
  • the diameter r2 of the concave portion 176 of the block 170b is set to be larger than the diameter r1 of the convex portion 174 of the block 170a.
  • when the fitting is complete as in state (c), the information processing unit 62 causes the internal mechanism of the concave portion 176 of the block 170b to tighten around the convex portion 174.
  • the force required of the real object 120a in the fitting operation can thus be equivalent to lifting the block 170b and placing it on the block 170a, and since the internal mechanism strengthens the connection, the assembled object can be prevented from collapsing even when tilted.
  • the concave portion 176 has a clamp structure in which at least a part of the inner wall is narrowed by, for example, rotation of a screw.
  • The information processing unit 62 transmits to the block 170b a control signal for operating the actuator that rotates the screw, tightening the screw so that the convex portion 174 of the block 170a is clamped.
  • To release the connection, the state may be shifted from state (c) back to state (b) by loosening the screw.
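The fit-then-tighten sequence described above behaves as a small state machine over states (a), (b), and (c). The following sketch models it; the class, method, and state names and the example dimensions are illustrative assumptions, not part of the disclosure.

```python
class BlockClamp:
    """State machine for the connection of blocks 170a/170b.

    'apart'   - state (a): blocks separated
    'fitted'  - state (b): recess 176 rests loosely on stud 174 (r2 > r1)
    'clamped' - state (c): internal screw tightened around the stud
    """

    def __init__(self, stud_r1_mm, recess_r2_mm):
        # the loose fit requires the recess to be wider than the stud
        assert recess_r2_mm > stud_r1_mm
        self.state = "apart"

    def fit(self):
        """Real object 120a lowers block 170b onto block 170a."""
        if self.state == "apart":
            self.state = "fitted"

    def tighten(self):
        """Control signal drives the screw actuator inside block 170b."""
        if self.state == "fitted":
            self.state = "clamped"

    def loosen(self):
        """Reverse signal releases the connection back to the loose fit."""
        if self.state == "clamped":
            self.state = "fitted"
```

Because the loose fit does the positioning and the internal screw does the holding, the robot never needs to press the blocks together.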
  • The connection method is not limited to this: an air suction mechanism may be provided at the connection portion, and the blocks may be joined by forming a vacuum at the connection surface under the control of the information processing apparatus 10.
  • Alternatively, connection may be made using an adhesive surface such as a hook-and-loop fastener; for removal, an eject pin may be made to protrude under the control of the information processing apparatus 10 to peel the blocks apart.
  • Two gripper arms may be provided on the real object 120a so that it does not lose stability from the reaction force at the time of connection: if both blocks 170a and 170b to be connected are held while being joined, little reaction force arises. Alternatively, if the play field 20 itself can connect to blocks, fixing the destination block 170a to the play field 20 while the block 170b is being connected likewise keeps the real object 120a stable.
  • the mode of using the voice command in the environment as shown in FIG. 14 can be used not only for assembling the block but also for playing.
  • For example, the blocks 170 are likened to fruits and vegetables such as bananas, apples, and melons depending on their color.
  • When the user 8 calls out “banana”, for example, the real object 120a carries a yellow block 170 to the vicinity of the user 8, and at the same time a sound such as “Yes, please” is generated from the real object 120a or the speaker 19, making it possible to stage a situation such as shopping at a fruit and vegetable store.
  • the scenario storage unit 56 stores the color of the block 170 in association with keywords such as “banana”, “apple”, and “melon”.
  • When the information processing unit 62 detects one of these keywords in the audio signal acquired by the microphone 16, it controls the real object 120a to carry a block 170 of the corresponding color.
  • the block 170 may be shaped like an actual fruit or vegetable.
  • the blocks may be associated with rough classifications such as fruits and vegetables, meat, fish, etc. and produced as if shopping at different stores such as fruit and vegetable stores, butchers, and fish stores.
  • Areas for the different stores may be formed on the play field 20 by having the real object 120a sort the blocks by color as described above. Further, by placing a block that the user 8 treats as money on the play field 20, the real object 120a may take it away, making the exchange even more like shopping.
  • Alternatively, a block provided by the user 8 may be regarded as laundry, and the real object 120a may take the block away in response to a voice command such as “Please take this to the cleaners”.
  • In this case, a keyword such as “Take this with you” is associated with a movement rule of grabbing blocks placed near the user 8 and gathering them in a predetermined area of the play field 20, and the association is stored in the scenario storage unit 56.
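The keyword-to-action associations held in the scenario storage unit 56 can be sketched as a lookup table consulted against recognized speech; the table contents and function names below are illustrative assumptions, not part of the disclosure.

```python
# Hypothetical scenario table: keyword -> (action, argument).
SCENARIO_RULES = {
    "banana": ("fetch", "yellow"),
    "apple": ("fetch", "red"),
    "melon": ("fetch", "green"),
    "take this": ("collect", None),  # gather nearby blocks in a set area
}

def dispatch(utterance):
    """Return the command for the real object 120a, or None if no
    stored keyword occurs in the recognized utterance."""
    text = utterance.lower()
    for keyword, command in SCENARIO_RULES.items():
        if keyword in text:
            return command
    return None
```

A real system would sit this behind a speech recognizer; the table itself is what the scenario storage would hold.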
  • a block may be regarded as an animal or a character, or a figure having such a shape may be used instead of the block 170.
  • FIG. 16 shows still another example of a mode realized when a function of grasping an object is added.
  • This example realizes the above-described "match" form
  • the play field 20 is a board game board.
  • On it, a real object 120a having a gripper, which is controlled by the information processing apparatus 10, and pieces 180 and 182 used for the board game are placed.
  • the pieces 180 and 182 may be a lump of synthetic resin or the like having no mechanism inside.
  • the shapes of the pieces 180 and 182 and the grids of the board are not limited to those shown in the drawings, and may be determined as appropriate depending on the game.
  • the game played here may be a general game such as chess, shogi, and reversi.
  • The real object 120a moves its pieces 180 using the gripper, while the user 8 moves his or her pieces 182 by hand.
  • The information processing unit 62 determines which piece 180 the real object 120a should move and its destination, according to the battle situation including the positions of the pieces 182 moved by the user.
  • Such a strategy itself may be the same processing as a program in a conventional computer game.
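As a sketch of what such conventional game-program processing looks like, a textbook minimax search over an abstract game is shown below; the function names and the toy game used in the test are illustrative, and a real engine would add pruning and a stronger evaluation function.

```python
def minimax(state, depth, maximizing, moves_fn, apply_fn, eval_fn):
    """Textbook minimax: return (best score, best move) for the side
    to move, searching `depth` plies ahead.

    moves_fn(state, maximizing) -> list of legal moves
    apply_fn(state, move)       -> successor state
    eval_fn(state)              -> score from the maximizer's viewpoint
    """
    moves = moves_fn(state, maximizing)
    if depth == 0 or not moves:
        return eval_fn(state), None
    best_move = None
    best_score = float("-inf") if maximizing else float("inf")
    for move in moves:
        score, _ = minimax(apply_fn(state, move), depth - 1,
                           not maximizing, moves_fn, apply_fn, eval_fn)
        if (maximizing and score > best_score) or \
           (not maximizing and score < best_score):
            best_score, best_move = score, move
    return best_score, best_move
```

The chosen `(piece, destination)` move would then be handed to the real object control unit for physical execution by the gripper.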
  • The real object 120a may instead play an auxiliary role while two users compete. For example, in chess or shogi, when one user captures an opponent's piece, the real object carries the piece toward that user; in reversi, it flips pieces over.
  • In this case, the scenario storage unit 56 stores the rules of each game, in particular the rules by which each move changes other pieces. This aspect is not limited to board games, and can be similarly realized for a card game or sugoroku.
  • a battle game may be realized in which the pieces 180 and 182 are character figures and attacked by hitting the opponent figure.
  • The real object 120a grasps or pushes its own figure so as to dodge blows or to strike the figure of the user 8.
  • In this case, a real object having a gripper may also be placed on the user side, and the user operates it to fight back.
  • Further, a third real object controlled by the information processing apparatus 10 may be introduced to bring a weapon for a figure from the place where weapons are kept. At this time, the user may be able to designate the weapon by voice, in the same manner as the voice-command mode described above.
  • FIG. 17 shows an example of a mode realized when a function for holding a pen is added to a real object.
  • a real object 120a controlled by the information processing apparatus 10 and a real object 120b operated by the user are placed.
  • Each real object 120a, 120b includes a holding mechanism 190a, 190b that can hold the pen downward.
  • the holding mechanisms 190 a and 190 b are configured to be able to change the pen height so that the pen tip can be brought into contact with or separated from the play field 20 by a control signal from the information processing apparatus 10. Then, when the real objects 120a and 120b move while the pen tip is in contact, a line drawing corresponding to the movement is created on the play field 20.
  • When the user draws a line drawing by operating the real object 120b, the real object 120a controlled by the information processing apparatus 10 draws lines that complement it, so that the two complete the picture together.
  • Here, the information processing apparatus 10 may decorate the user's original line drawing according to a predetermined rule, or the completed form of a sample picture may be displayed on the display device 18 and both may draw toward that goal.
  • a rule regarding how to decorate the user's line drawing is stored in the scenario storage unit 56.
  • In the latter case, the completed form of the picture is stored in the scenario storage unit 56, and by comparing it with the progress of the drawing at each time step, the information processing apparatus 10 controls the real object 120a to draw the deficient portions.
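The comparison between the stored completed picture and the drawing progress at each time step can be sketched as a set difference over ink cells. Modelling each picture as a set of (x, y) grid cells is an illustrative assumption; the actual system would derive these from the captured image.

```python
def deficient_strokes(completed, progress):
    """Return, in sorted order, the ink cells present in the stored
    completed picture but not yet present in the current drawing."""
    return sorted(set(completed) - set(progress))
```

The returned cells would then be converted into pen-down paths for the real object 120a to trace.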
  • The play field 20 may be replaceable paper, or a surface-treated plate from which line drawings can be erased so that it can be redrawn.
  • the above is a form of “cooperation / assistance” using a pen.
  • A form of “act” may also be realized by having only the real object 120a controlled by the information processing apparatus 10 draw.
  • the user designates an object to be drawn by voice or an operation via the input device 14.
  • The information processing unit 62 reads out the picture associated with the designated object from the scenario storage unit 56 and has the real object 120a draw it, thereby realizing an interaction with the user.
  • the user may give points to the picture drawn by the real object 120a or correct the picture. At this time, the user may directly write the score and correction to the play field 20 with a pen held in his hand.
  • The information processing apparatus 10 acquires and learns the score and the corrected parts from the captured image. For example, a picture whose score is below a threshold is judged not to be to the user's taste and is not drawn again, while corrections made by the user are reproduced the next time the picture is drawn. To this end, the information processing unit 62 associates the user's score and the corrected picture data with the completed picture data stored in the scenario storage unit 56.
  • The same modes can also be realized without pens by using the projector to project an image in which lines are drawn along the paths the real objects 120a and 120b have passed. This eliminates the need to replace paper or erase lines, and the resulting picture can easily be stored electronically.
  • FIG. 18 shows an example of a mode in which an interaction is generated between the real object and the user's body.
  • On the play field 20, a real object 120a controlled by the information processing apparatus 10 is placed. The real object 120a propels itself so as to bounce off three of the four sides of the edge of the play field 20, while the user defends the remaining side with his or her hand 196. That is, the real object 120a moves like a pinball, and a “match/competition” form is realized by hitting it back with the hand 196.
  • The information processing unit 62 physically calculates the movement and rebound of a ball under the assumption that the play field 20 is inclined at a predetermined angle, and moves the real object 120a accordingly.
  • the state specifying unit 52 detects the image of the hand 196 from the captured image and tracks it with an existing method.
  • The information processing unit 62 can also have the real object 120a reproduce the way a ball would rebound off the moving hand. Even if the user does not actually strike the real object 120a, if the hand 196 reaches it before it crosses the defended side, the hit is judged successful and the real object 120a is controlled to return toward the opposite side. If the hand does not reach it in time, the real object 120a leaves the play field 20 and the user loses.
  • the inside of the play field 20 may be surrounded by a block, and the real object 120a may be moved so as to bounce off the block.
  • A block may also be placed inside the play field as a rebound point. Even if the user places blocks at arbitrary positions or in arbitrary shapes, the position coordinates of each block can be determined from the captured image, so the speed and moving direction of the real object 120a can be determined at each time step by simple physical calculation. If a material is assumed according to the color of each block, the resulting change in the coefficient of restitution can be reflected in the speed calculation. A rebound sound may be generated from the speaker 19 or the like on impact.
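The per-time-step physics described above, a virtual ball accelerating on a tilted field and rebounding off walls with a colour-dependent coefficient of restitution, can be sketched as follows. This is a semi-implicit Euler step; all names and the tilt model are illustrative assumptions.

```python
import math

def step(pos, vel, dt, tilt_deg, g=9.8):
    """Advance the virtual ball by one time step on a field tilted by
    tilt_deg about the x-axis; gravity pulls the ball toward -y."""
    ay = -g * math.sin(math.radians(tilt_deg))
    vx = vel[0]
    vy = vel[1] + ay * dt
    return (pos[0] + vx * dt, pos[1] + vy * dt), (vx, vy)

def bounce(vel, normal, restitution):
    """Reflect the velocity about a unit wall normal and scale it by the
    coefficient of restitution (which may depend on the block's colour)."""
    vx, vy = vel
    nx, ny = normal
    dot = vx * nx + vy * ny
    return (restitution * (vx - 2.0 * dot * nx),
            restitution * (vy - 2.0 * dot * ny))
```

The resulting position at each time step would be sent to the real object 120a as its movement target.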
  • Conversely, a “competition” mode in which the hand 196 chases the real object 120a may be realized.
  • A plurality of real objects 120a may be placed so that they all scatter and flee in response to the movement of the hand 196.
  • The information processing unit 62 determines the moving direction of each real object 120a so as to move away from the hand, based on the position information of the hand at each time step. If the moving directions are determined using random numbers or the like so that they are dispersed as much as possible, the objects become harder to catch and the game becomes even more engaging.
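The away-from-the-hand direction with random dispersion could be computed as in this sketch; the function name and the jitter range are illustrative assumptions.

```python
import math
import random

def flee_direction(obj_pos, hand_pos, jitter_deg=45.0, rng=random):
    """Unit vector pointing away from the hand, perturbed by a random
    angle so that several fleeing objects disperse instead of bunching."""
    dx = obj_pos[0] - hand_pos[0]
    dy = obj_pos[1] - hand_pos[1]
    base = math.atan2(dy, dx)
    angle = base + math.radians(rng.uniform(-jitter_deg, jitter_deg))
    return (math.cos(angle), math.sin(angle))
```

Calling this once per time step per object, with the hand position from the tracker, yields the scattering behaviour described.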
  • Catching may be defined not as the user actually grabbing a real object 120a but as having cornered it into a predetermined area of the play field 20.
  • By making the real object 120a into the shape of a living creature such as a sheep, a mouse, or a fish, or by generating a cry from the speaker 19, a more realistic feeling can be given.
  • FIG. 19 shows an example of a “cooperation” or “act” mode in which ball rolling is included in the movement of a real object.
  • Here, a contraption like a Rube Goldberg machine is assumed, in which other objects and devices move in response to the rolling of a ball 200.
  • On the play field 20, several real objects 120a controlled by the information processing apparatus 10 and some real objects 120b operated by the user are placed.
  • An approximate course along which the ball 200 rolls is formed by the user with blocks 202.
  • Real objects 120m, 120n, and 120p are further placed as real objects controlled by the information processing apparatus 10.
  • the real object 120m is composed of a quadrangular prism main body and a slide.
  • The quadrangular prism is provided with holes 204 and 206 at its lower and upper parts, and with a mechanism that raises and lowers a platform in its internal cavity like an elevator.
  • the actuator that moves the table up and down is driven by a control signal from the information processing apparatus 10.
  • the real objects 120n and 120p are provided with a mechanism that causes the whole to emit light or generate a sound effect according to a control signal from the information processing apparatus 10.
  • the real objects 120n and 120p may appear to emit light by projecting light from the projector, or sound effects may be generated from the speaker 19.
  • the real object 120a controlled by the information processing apparatus 10 puts the ball 200 carried by the gripper into the hole 204 below the real object 120m.
  • the platform on which the ball 200 is placed rises inside the quadrangular column of the real object 120m, and the ball 200 goes out from the upper hole 206 by tilting the platform.
  • the ball 200 rolls on the slide and collides with the real object 120n.
  • the information processing apparatus 10 reacts by causing the real object 120n to emit light or generate sound.
  • the ball 200 that bounces and rolls off from the real object 120n is guided to a desired direction by the user operating the real object 120b to bounce or push it.
  • Finally, the real object 120a controlled by the information processing apparatus 10 rebounds or pushes the ball 200 so that it rolls into the opening of the real object 120p, which serves as the goal.
  • the information processing apparatus 10 tracks the position of the ball 200 based on the captured image, moves the real object 120a to a position where it rebounds in the direction of the real object 120p, or pushes the ball 200 with the real object 120a. Further, it is detected from the photographed image that the ball 200 has entered the opening, and the real object 120p is caused to emit light or a sound is generated to produce the goal moment. As described above, in the present embodiment, since the change in the field is detected based on the captured image, various functions of the real object can be activated in accordance with the movement of the ball that is not controlled.
  • the scenario storage unit 56 stores the rules related to the movement of the ball 200 as described above and the movement of the real object according to the movement.
  • FIG. 20 is a diagram for explaining a mode in which a plurality of users participate in one game using a network.
  • the information processing systems 1a, 1b, and 1c are similarly constructed in three places.
  • the information processing apparatuses 10a, 10b, and 10c of each system are connected to the server 210 via the network 212.
  • On the play field of the information processing system 1a, for example, real objects moved via the network 212 by the users of the information processing systems 1b and 1c are placed in addition to the real object operated by the local user.
  • the server 210 associates the information processing system of the participating user with the real object, and notifies the information processing apparatuses 10a, 10b, and 10c.
  • When any user operates his or her real object, the server 210 notifies the other information processing systems of the operation information.
  • Each of the information processing apparatuses 10a, 10b, and 10c moves a corresponding real object in accordance with an operation of the user via the input device 14, and responds to each of them according to operation information of other users notified from the server 210. Move a real object.
  • the real object corresponding to each user moves in the same manner.
  • Further, the number of real objects may be increased, and at least one of the server 210 and the information processing apparatuses 10a, 10b, and 10c may determine their movements. In this way, for example, a battle game can be executed in each play field as described above.
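The relay role of the server 210, forwarding each user's operation information to the other information processing apparatuses so that every play field mirrors the same moves, can be sketched as follows; the class and method names are illustrative assumptions.

```python
class RelayServer:
    """Sketch of server 210: each submitted operation is forwarded to
    every registered system except the sender's own."""

    def __init__(self):
        self.inboxes = {}  # system id -> operations received from others

    def register(self, system_id):
        self.inboxes[system_id] = []

    def submit(self, sender_id, operation):
        for system_id, inbox in self.inboxes.items():
            if system_id != sender_id:
                inbox.append((sender_id, operation))
```

Each information processing apparatus would drain its inbox every time step and replay the received operations on the corresponding local real objects.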
  • the game can be made more interesting by performing various adjustments. For example, the number of times the user has played in the past and the battle record are recorded in the server 210, and the information processing apparatuses 10a, 10b, and 10c are notified at the start of the game. The information processing apparatuses 10a, 10b, and 10c adjust the movement of the real object according to the information.
  • For example, a handicap is imposed on a skilled user by moving the real objects of the other users at higher speed, or by awarding the other users a higher score per hit.
  • Alternatively, the number of initial pieces may be made different, and the proficiency level may be divided into multiple grades. In this way, players can compete on equal terms regardless of age or skill and can enjoy the game without having to choose opponents. Depending on the case, it is also possible for users of the same skill level to choose to play without handicaps.
  • In the embodiment described above, the position and movement of real objects on the play field are detected using images obtained by photographing the play field, and the information processing apparatus controls and moves some of the real objects according to preset rules. This makes it possible to create, in the real world, various spaces in which objects react in diverse ways. By determining the movement of a real object at each time step of a minute interval, the object can be made to appear to respond flexibly to the movement of other real objects with relatively simple calculations.
  • a real object that is operated by the user via the input device or a real object that is placed or moved by a hand is introduced to influence the movement of the real object that is controlled by the information processing apparatus.
  • play such as a computer game and collaborative work with a device can be realized using a real object, and even an infant who is not familiar with an input device can easily play and use it.
  • the same effect can be obtained by acquiring a voice and reacting a real object to it.
  • Since the real object operated by the user is also moved through the information processing apparatus, the moving speed can be limited depending on the location on the play field, and adjustments such as giving a handicap to a particular user become possible.
  • various adjustments that were easy in a computer game are possible in the real world, and the rules can be made more complex and multifunctional.
  • the difficulty level can be changed easily.
  • Since the movement of the real object is determined based on captured images, interaction becomes possible between the real object controlled by the information processing apparatus and images displayed on the play field, output sounds, and the movements of objects or human bodies that have no drive system. If the images are projected by the projector, the real objects and the images can be linked, so the world view being constructed can easily be changed.
  • In play involving multiple users, the information processing apparatus itself can be included as a participant. For example, a relatively large-scale mode can be realized in which an allied force of multiple users battles an army controlled by the information processing apparatus using a large number of real objects.
  • 1 Information processing system, 10 Information processing device, 12 Imaging device, 14 Input device, 16 Microphone, 18 Display device, 19 Speaker, 20 Play field, 22 CPU, 24 GPU, 26 Main memory, 50 Communication unit, 52 State specifying unit, 54 Real object information storage unit, 56 Scenario storage unit, 60 Real object control unit, 62 Information processing unit, 106a Drive unit, 108a Communication unit, 120a Real object, 122 Wheels, 126 Markers, 128 Motors.
  • the present invention can be used for toys, learning devices, computers, game devices, information processing devices, and systems including them.

Landscapes

  • Engineering & Computer Science (AREA)
  • Automation & Control Theory (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Multimedia (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Image Processing (AREA)
  • User Interface Of Digital Computer (AREA)
  • Processing Or Creating Images (AREA)

Abstract

According to the present invention, in an information processing system 1, an imaging device 12 captures images of real objects 120a, 120b placed on a play field 20. An information processing apparatus 10 acquires the state of the play field 20 from the captured images or the like, and determines and controls the movement of the real object 120a according to the positional relationship between the real objects 120a, 120b and user requests made via a microphone 16 and an input device 14. The information processing apparatus further controls the real object 120b in accordance with a user operation performed via the input device 14. The play field 20 is formed by an image projected from a projector included in a display device 18. The projected image and the audio output of a speaker 19 are changed according to the movement of the real objects 120a, 120b.
PCT/JP2015/074169 2014-11-07 2015-08-27 Dispositif de traitement d'informations, système de traitement d'informations, système d'objet réel, et procédé de traitement d'informations WO2016072132A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2014227275A JP6352151B2 (ja) 2014-11-07 2014-11-07 情報処理装置、情報処理システム、および情報処理方法
JP2014-227275 2014-11-07

Publications (1)

Publication Number Publication Date
WO2016072132A1 true WO2016072132A1 (fr) 2016-05-12

Family

ID=55908862

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2015/074169 WO2016072132A1 (fr) 2014-11-07 2015-08-27 Dispositif de traitement d'informations, système de traitement d'informations, système d'objet réel, et procédé de traitement d'informations

Country Status (2)

Country Link
JP (1) JP6352151B2 (fr)
WO (1) WO2016072132A1 (fr)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3304522B8 (fr) 2015-06-08 2023-10-04 Battlekart Europe Système de création d'un environnement
JP7091160B2 (ja) * 2018-06-14 2022-06-27 鹿島建設株式会社 構造物構築方法
JP2023102972A (ja) * 2022-01-13 2023-07-26 凸版印刷株式会社 空中表示装置
WO2023233583A1 (fr) * 2022-06-01 2023-12-07 株式会社ソニー・インタラクティブエンタテインメント Dispositif électronique, et système de traitement d'informations

Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0866564A (ja) * 1994-06-22 1996-03-12 Konami Co Ltd 移動体の遠隔制御装置
JPH0944249A (ja) * 1995-07-31 1997-02-14 Nippon Steel Corp 移動体制御方法
JP2001306145A (ja) * 2000-04-25 2001-11-02 Casio Comput Co Ltd 移動ロボット装置およびそのプログラム記録媒体
JP2005063184A (ja) * 2003-08-13 2005-03-10 Toshiba Corp 自走式移動装置及び位置補正方法
JP2005313303A (ja) * 2004-04-30 2005-11-10 Japan Science & Technology Agency ロボット遠隔制御システム
WO2006134778A1 (fr) * 2005-06-14 2006-12-21 The University Of Electro-Communications Dispositif, procédé et programme de détection de position, et système fournissant une réalité composite
JP2008030136A (ja) * 2006-07-27 2008-02-14 Sony Corp ロボットの動作編集装置及び動作編集方法、並びにコンピュータ・プログラム
WO2008065458A2 (fr) * 2006-11-28 2008-06-05 Dalnoki Adam Système et procédé pour déplacer des objets réels par des opérations mises en œuvre dans un environnement virtuel
WO2009037679A1 (fr) * 2007-09-21 2009-03-26 Robonica (Proprietary) Limited Affichage d'informations dans un système de jeu comportant un jouet mobile
WO2010026710A1 (fr) * 2008-09-03 2010-03-11 村田機械株式会社 Procédé de planification d'itinéraire, unité de planification d'itinéraire et dispositif mobile autonome
JP2014136141A (ja) * 2013-01-18 2014-07-28 Iai Corp ロボットゲームシステム
WO2014178272A1 (fr) * 2013-05-01 2014-11-06 村田機械株式会社 Corps mobile autonome

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4787963B2 (ja) * 2006-08-10 2011-10-05 国立大学法人北海道大学 経路推定装置およびその制御方法、経路推定装置制御プログラム、ならびに該プログラムを記録した記録媒体

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11944887B2 (en) 2018-03-08 2024-04-02 Sony Corporation Information processing device and information processing method
CN112106004A (zh) * 2018-05-09 2020-12-18 索尼公司 信息处理装置、信息处理方法和程序
JP2022509986A (ja) * 2019-08-30 2022-01-25 上▲海▼商▲湯▼智能科技有限公司 車両位置決めシステム及び方法、車両制御方法及び装置
JP7335538B1 (ja) 2022-12-22 2023-08-30 株式会社カプコン 情報処理方法、情報処理システムおよびプログラム
JP2024090279A (ja) * 2022-12-22 2024-07-04 株式会社カプコン 情報処理方法、情報処理システムおよびプログラム

Also Published As

Publication number Publication date
JP6352151B2 (ja) 2018-07-04
JP2016091423A (ja) 2016-05-23

Similar Documents

Publication Publication Date Title
JP6352151B2 (ja) 情報処理装置、情報処理システム、および情報処理方法
JP7322122B2 (ja) 情報処理装置、情報処理方法、および、情報媒体
US11559751B2 (en) Toy systems and position systems
US10328339B2 (en) Input controller and corresponding game mechanics for virtual reality systems
JP6957218B2 (ja) シミュレーションシステム及びプログラム
US20080076498A1 (en) Storage medium storing a game program, game apparatus and game controlling method
JP2010233671A (ja) プログラム、情報記憶媒体及びゲーム装置
US9526987B2 (en) Storage medium, game apparatus, game system and game controlling method
US8075400B2 (en) Game apparatus
KR20170134675A (ko) 포털 디바이스 및 협력하는 비디오 게임 머신
JP3532898B2 (ja) ビデオゲーム玩具
JP2019170544A (ja) 動作装置、動作玩具及びシステム
JP2022003549A (ja) エンターテインメントシステム
JP3751626B2 (ja) ゲーム装置
WO2019142227A1 (fr) Corps mobile et procédé de commande de corps mobile
JP5499001B2 (ja) ゲーム装置、ならびに、プログラム
JP3751627B2 (ja) ゲーム装置
JP5616010B2 (ja) ゲームプログラムおよびゲームシステム

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 15857909

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 15857909

Country of ref document: EP

Kind code of ref document: A1