WO2015025442A1 - Information processing device and method - Google Patents

Information processing device and method

Info

Publication number
WO2015025442A1
Authority
WO
WIPO (PCT)
Prior art keywords
information processing
marker
image
camera
processing apparatus
Prior art date
Application number
PCT/JP2014/002529
Other languages
English (en)
Japanese (ja)
Inventor
義勝 金丸
佐藤 文昭
雄一 西澤
Original Assignee
Sony Computer Entertainment Inc. (株式会社ソニー・コンピュータエンタテインメント)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sony Computer Entertainment Inc.
Publication of WO2015025442A1

Classifications

    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F 13/20: Input arrangements for video game devices
    • A63F 13/21: Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F 13/213: Input arrangements characterised by their sensors, purposes or types comprising photodetecting means, e.g. cameras, photodiodes or infrared cells
    • A63F 13/214: Input arrangements characterised by their sensors, purposes or types for locating contacts on a surface, e.g. floor mats or touch pads
    • A63F 13/2145: Input arrangements for locating contacts on a surface, the surface being also a display device, e.g. touch screens
    • A63F 13/40: Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
    • A63F 13/42: Processing input control signals by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
    • A63F 13/426: Processing input control signals by mapping into game commands, involving on-screen location information, e.g. screen coordinates of an area at which the player is aiming with a light gun
    • A63F 13/428: Processing input control signals by mapping into game commands, involving motion or position input signals, e.g. signals representing the rotation of an input controller or a player's arm motions sensed by accelerometers or gyroscopes
    • A63F 13/50: Controlling the output signals based on the game progress
    • A63F 13/52: Controlling the output signals based on the game progress involving aspects of the displayed game scene
    • A63F 13/525: Changing parameters of virtual cameras
    • A63F 13/5258: Changing parameters of virtual cameras by dynamically adapting the position of the virtual camera to keep a game object or game character in its viewing frustum, e.g. for tracking a character or a ball
    • A63F 13/60: Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor
    • A63F 13/65: Generating or modifying game content automatically by game devices or servers from real world data, e.g. measurement in live racing competition
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 19/00: Manipulating 3D models or images for computer graphics
    • G06T 19/006: Mixed reality

Definitions

  • the present invention relates to an information processing apparatus and an information processing method for performing information processing in response to a user operation.
  • Information processing devices such as portable game machines and PDAs (Personal Digital Assistants) are widely used.
  • Many information processing apparatuses are equipped with a communication function, and multifunctional apparatuses such as smartphones, which integrate the functions of a mobile phone, a PDA, and the like into a single device, have appeared.
  • Such an information processing apparatus includes a large-capacity memory and a high-speed processor, and a user can enjoy various applications by installing an application program in the information processing apparatus.
  • A technique that performs processing using AR (augmented reality) technology has also been proposed (see, for example, Patent Document 1).
  • AR technology fuses the real world and the virtual world by an approach different from the construction of a virtual world based on user motion recognition.
  • the present invention has been made in view of such problems, and an object thereof is to provide a technique that can effectively use the AR technique in information processing.
  • An aspect of the present invention relates to an information processing apparatus.
  • The information processing apparatus includes: a captured image acquisition unit that acquires data of a captured image from a camera photographing a real space; an image analysis unit that analyzes the captured image and detects a marker present in the photographed space; a space definition unit that specifies the relative positional relationship between the camera and the photographed space and defines a three-dimensional coordinate system corresponding to the photographed space and a screen corresponding to the field of view of the camera; an information processing unit that executes information processing corresponding to the detected marker and arranges a virtual object corresponding to the marker in the three-dimensional coordinate system; and an output image generation unit that generates an output image by superimposing, on the captured image, an object image formed by projecting the virtual object onto the screen, and outputs it to a display device. The information processing unit is characterized in that it arranges, as a virtual object, an operation object that serves as a user operation means for the information processing.
  • Another aspect of the present invention relates to an information processing method. This information processing method is performed by an information processing apparatus using a captured image and includes the steps of: acquiring data of the captured image from a camera photographing a real space and storing it in a memory; analyzing the captured image to detect a marker present in the photographed space; specifying the relative positional relationship between the camera and the photographed space and defining a three-dimensional coordinate system corresponding to the photographed space and a screen corresponding to the field of view of the camera; executing information processing corresponding to the detected marker and placing a virtual object corresponding to the marker in the three-dimensional coordinate system; and superimposing an object image, formed by projecting the virtual object onto the screen, on the captured image read from the memory to generate an output image and output it to a display device. The step of placing the virtual object includes placing an operation object that serves as a user operation means for the information processing.
  • AR technology can be effectively used for information processing such as games.
  • FIG. 1(a) shows the front surface of the electronic device, and FIG. 1(b) shows the back surface of the electronic device.
  • FIG. 2(a) shows the upper surface of the electronic device, FIG. 2(b) shows the lower surface, and FIG. 2(c) shows the left side surface.
  • FIG. 3 shows the circuit configuration of the electronic device.
  • FIG. 4 shows the functional blocks of the information processing apparatus in the present embodiment.
  • FIG. 12 shows an example of how the screen changes when the user brings the camera close to the marker in the real space from the state in which the screen example of FIG. 11 is displayed, and FIG. 13 shows an example of a screen displayed in the battle mode.
  • A further flowchart illustrates the processing procedure by which the information processing unit and the output image generation unit update the display image in response to changes in the position and orientation of the camera in the present embodiment, and a further figure shows an example of the display screen in an aspect that implements the embodiment in another form.
  • FIG. 1A shows the front surface of the information processing apparatus 10.
  • the information processing apparatus 10 is formed by a horizontally long casing, and the left and right areas gripped by the user have an arcuate outline.
  • a rectangular touch panel 50 is provided on the front surface of the information processing apparatus 10.
  • the touch panel 50 includes a display device 20 and a transparent front touch pad 21 that covers the surface of the display device 20.
  • The display device 20 is an organic EL (Electro-Luminescence) panel and displays images.
  • the display device 20 may be a display unit such as a liquid crystal panel.
  • the front touch pad 21 is a multi-touch pad having a function of detecting a plurality of points touched at the same time, and the touch panel 50 is configured as a multi-touch screen.
  • On the right side of the touch panel 50, a triangle button 22a, a circle button 22b, a cross button 22c, and a square button 22d (hereinafter collectively referred to as "operation buttons 22") are provided in a rhombus arrangement.
  • On the left side of the touch panel 50, an up key 23a, a left key 23b, a down key 23c, and a right key 23d (hereinafter collectively referred to as "direction key 23") are provided.
  • the user can input the eight directions of up / down / left / right and diagonal by operating the direction key 23.
  • a left stick 24 a is provided below the direction key 23, and a right stick 24 b is provided below the operation button 22.
  • the user tilts the left stick 24a or the right stick 24b (hereinafter, collectively referred to as “analog stick 24”), and inputs a direction and a tilt amount.
  • An L button 26a and an R button 26b are provided on the left and right tops of the housing.
  • the operation button 22, the direction key 23, the analog stick 24, the L button 26a, and the R button 26b constitute operation means operated by the user.
  • a front camera 30 is provided in the vicinity of the operation button 22.
  • a left speaker 25a and a right speaker 25b (hereinafter, collectively referred to as “speaker 25”) that output sound are provided, respectively.
  • a HOME button 27 is provided below the left stick 24a, and a START button 28 and a SELECT button 29 are provided below the right stick 24b.
  • FIG. 1B shows the back surface of the information processing apparatus 10.
  • a rear camera 31 and a rear touch pad 32 are provided on the rear surface of the information processing apparatus 10.
  • the rear touch pad 32 is configured as a multi-touch pad, like the front touch pad 21.
  • the information processing apparatus 10 is equipped with two cameras and a touch pad on the front surface and the back surface.
  • FIG. 2A shows the upper surface of the information processing apparatus 10.
  • the L button 26a and the R button 26b are provided on the left and right ends of the upper surface of the information processing apparatus 10, respectively.
  • a power button 33 is provided on the right side of the L button 26 a, and the user turns the power on or off by pressing the power button 33.
  • the information processing apparatus 10 has a power control function of transitioning to a suspended state when a time during which the operating means is not operated (no operation time) continues for a predetermined time. When the information processing apparatus 10 enters the suspended state, the user can return the information processing apparatus 10 from the suspended state to the awake state by pressing the power button 33.
  • the game card slot 34 is an insertion slot for inserting a game card, and this figure shows a state where the game card slot 34 is covered with a slot cover.
  • An LED lamp that blinks when the game card is being accessed may be provided in the vicinity of the game card slot 34.
  • The accessory terminal 35 is a terminal for connecting a peripheral device (accessory), and this figure shows a state where the accessory terminal 35 is covered with a terminal cover. Between the accessory terminal 35 and the R button 26b, a "−" button 36a and a "+" button 36b for adjusting the volume are provided.
  • FIG. 2B shows the lower surface of the information processing apparatus 10.
  • the memory card slot 37 is an insertion slot for inserting a memory card, and this figure shows a state in which the memory card slot 37 is covered with a slot cover.
  • an audio input / output terminal 38, a microphone 39, and a multi-use terminal 40 are provided on the lower surface of the information processing apparatus 10.
  • the multi-use terminal 40 corresponds to USB (Universal Serial Bus) and can be connected to other devices via a USB cable.
  • FIG. 2C shows the left side surface of the information processing apparatus 10.
  • a SIM card slot 41 which is a SIM card insertion slot, is provided.
  • FIG. 3 shows a circuit configuration of the information processing apparatus 10.
  • the wireless communication module 71 is configured by a wireless LAN module compliant with a communication standard such as IEEE 802.11b / g, and is connected to an external network such as the Internet via a wireless access point.
  • the wireless communication module 71 may have a Bluetooth (registered trademark) protocol communication function.
  • The mobile phone module 72 is compatible with the third-generation (3rd Generation) digital mobile phone system conforming to the IMT-2000 (International Mobile Telecommunication 2000) standard defined by the ITU (International Telecommunication Union), and connects to the mobile phone network 4.
  • Into the SIM card slot 41, a SIM card 74 in which a unique ID number for identifying the telephone number of the mobile phone is recorded is inserted. By inserting the SIM card 74 into the SIM card slot 41, the mobile phone module 72 can communicate with the mobile phone network 4.
  • the CPU (Central Processing Unit) 60 executes a program loaded in the main memory 64.
  • a GPU (Graphics Processing Unit) 62 performs calculations necessary for image processing.
  • the main memory 64 is composed of a RAM (Random Access Memory) or the like, and stores programs, data, and the like used by the CPU 60.
  • the storage 66 is configured by a NAND flash memory (NAND-type flash memory) or the like, and is used as a built-in auxiliary storage device.
  • the motion sensor 67 detects the movement of the information processing apparatus 10, and the geomagnetic sensor 68 detects the geomagnetism in the triaxial direction.
  • the GPS control unit 69 receives a signal from a GPS satellite and calculates a current position.
  • the front camera 30 and the rear camera 31 capture an image and input image data.
  • the front camera 30 and the rear camera 31 are constituted by CMOS image sensors (Complementary Metal Oxide Semiconductor Image Sensor).
  • the display device 20 is an organic EL display device and has a light emitting element that emits light by applying a voltage to the cathode and the anode. In the power saving mode, the voltage applied between the electrodes is made lower than usual, so that the display device 20 can be dimmed and power consumption can be suppressed.
  • the display device 20 may be a liquid crystal panel display device provided with a backlight. In the power saving mode, by reducing the amount of light from the backlight, the liquid crystal panel display device can be in a dimmed state and power consumption can be suppressed.
  • The operation unit 70 includes the various operation means of the information processing apparatus 10; specifically, it includes the operation buttons 22, the direction key 23, the analog stick 24, the L button 26a, the R button 26b, the HOME button 27, the START button 28, the SELECT button 29, the power button 33, the "−" button 36a, and the "+" button 36b.
  • the front touchpad 21 and the rear touchpad 32 are multi-touchpads, and the front touchpad 21 is disposed on the surface of the display device 20.
  • the speaker 25 outputs sound generated by each function of the information processing apparatus 10, and the microphone 39 inputs sound around the information processing apparatus 10.
  • the audio input / output terminal 38 inputs stereo sound from an external microphone and outputs stereo sound to external headphones or the like.
  • Into the game card slot 34, a game card 76 in which a game file is recorded is inserted.
  • The game card 76 has a recording area in which data can be written.
  • When the game card 76 is inserted into the game card slot 34, data is written and read by a media drive.
  • a memory card 78 is inserted into the memory card slot 37.
  • the multi-use terminal 40 can be used as a USB terminal, and is connected to a USB cable 80 to transmit / receive data to / from another USB device.
  • a peripheral device is connected to the accessory terminal 35.
  • The information processing apparatus 10 executes, in accordance with user operations, information processing such as games and creation of electronic data, output of various contents such as electronic books, web pages, videos, and music, and communication.
  • A necessary program may be loaded from the various internal storage devices into the main memory 64 so that all processing is performed within the information processing apparatus 10 under the control of the CPU 60, or part of the processing may be requested from a server connected via a network and carried out while receiving the results.
  • Various types of processing executed by the information processing apparatus 10 are conceivable and are not particularly limited. Hereinafter, a case where a game is executed will be described as an example.
  • In the present embodiment, a card with a picture associated with a game is placed in an arbitrary place, such as on a table in a room, and is photographed with the rear camera 31 of the information processing apparatus 10.
  • When the information processing apparatus 10 detects the presence of the card in the photographed image, it starts the game associated with the card.
  • The user continues to photograph the real space, and the information processing apparatus 10 superimposes images of tools and characters used in the game on the captured image of the real space and displays the result on the display device 20.
  • the user progresses the game by operating the information processing apparatus 10 while photographing the real space. Thereby, for example, it is possible to enjoy a battle game with a virtual character on the stage of one's own room.
  • The card used as the trigger for starting a game need only be something that can be detected in the captured image and associated with a game in advance.
  • For example, it may be a card or a three-dimensional object on which at least one of a predetermined picture, character, and figure is drawn.
  • A card or solid object having a predetermined shape may also be used.
  • Alternatively, a display on which at least one of a predetermined shape, picture, character, and figure is displayed, or an electronic device having such a display, may be used.
  • Hereinafter, these are collectively referred to as "markers".
  • the marker may be used alone, or one game may be selected by combining a plurality of markers.
  • The target associated with a marker does not have to be the unit of a game.
  • It may be a larger unit, such as an application including a game, or a smaller unit, such as a command to be input in the game, various parameters that determine the game environment, or a character or object to be drawn.
  • FIG. 4 shows functional blocks of the information processing apparatus 10.
  • In terms of hardware, each functional block included in the control unit 100 can be configured by the CPU 60, the GPU 62, the main memory 64, and the like described above; in terms of software, it is realized by a program loaded into the main memory 64 from the various storage devices in the information processing apparatus 10 or from a recording medium loaded into it. Therefore, those skilled in the art will understand that these functional blocks can be realized in various forms by hardware only, by software only, or by a combination thereof, and are not limited to any one of them.
  • The control unit 100 includes: an input information acquisition unit 102 that acquires information related to user operations; a captured image acquisition unit 104 that acquires captured image data; a captured image storage unit 106 that stores the captured image data; an image analysis unit 108 that analyzes the captured image to perform marker detection and space definition; a marker correspondence information storage unit 109 that stores information relating to the markers to be detected; an information processing unit 114 that performs information processing corresponding to the markers; an object data storage unit 116 that stores information relating to the objects to be drawn; and an output image generation unit 118 that generates an output image by superimposing object images on the captured image.
  • The input information acquisition unit 102 receives information related to user operations performed on the information processing apparatus 10 as input signals from the operation unit 70, the front touchpad 21, and the motion sensor 67, converts each input signal into operation content information in accordance with predetermined rules, and supplies it to the captured image acquisition unit 104, the image analysis unit 108, and the information processing unit 114. As will be described later, the present embodiment is basically operated through interaction with the image world displayed on the display device 20, so operations via the operation unit 70 of the information processing apparatus 10 are preferably kept to a minimum of situations such as initial operations.
  • the captured image acquisition unit 104 acquires captured image data by causing the rear camera 31 to start capturing in accordance with a process start operation by the user. By using the rear camera 31, the user can naturally perform operations for moving the information processing apparatus 10 while viewing the display device 20 to change the field of view or progress the game.
  • the captured image acquisition unit 104 acquires captured image data at a predetermined rate in real time, appropriately assigns an identification number, supplies it to the image analysis unit 108 and stores it in the captured image storage unit 106.
  • the identification number is transmitted between the functional blocks in order to identify the captured image to be processed in a series of subsequent processes.
  • The image analysis unit 108 includes a marker detection unit 110 that detects markers present in the field of view of the camera, and a space definition unit 112 that tracks the objects existing in the real space and the position and orientation of the camera and defines the coordinate systems used for graphics drawing.
  • the marker detection unit 110 detects a marker in the captured image based on the marker information registered in the marker correspondence information storage unit 109.
  • As the marker detection technique, any of the various object recognition methods proposed in the field of computer vision or the like may be employed.
  • For example, the Random Forest method or the Random Ferns method can be used as a feature-based recognition method. Specifically, a group of decision trees is trained using local patches randomly sampled from the prepared marker image; local patches containing feature points detected from the captured image are input to the decision trees, and the presence probability distribution of the marker is obtained from the outputs at the leaf nodes (a toy sketch of this kind of patch classification is given below).
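  • As a toy illustration of the Random Ferns idea just described (an illustrative sketch only: the patch size, fern counts, and synthetic marker pattern below are assumptions, and a real detector would also need keypoint detection and pose handling), groups of random pixel-pair tests ("ferns") can be trained on patches sampled from a marker pattern, and a patch from the captured image classified by combining the class posteriors stored at each fern's leaf:
    # Toy sketch of Random-Ferns-style patch classification (illustrative only;
    # all sizes, counts, and the synthetic marker pattern are assumptions).
    import numpy as np

    PATCH = 16        # patch side length in pixels (assumed)
    N_FERNS = 10      # number of ferns
    N_TESTS = 8       # binary pixel comparisons per fern -> 2**N_TESTS leaves
    CLASSES = 2       # 0 = background, 1 = marker

    rng = np.random.default_rng(0)
    # Each test compares the intensities of two random pixel positions in a patch.
    tests = rng.integers(0, PATCH, size=(N_FERNS, N_TESTS, 2, 2))

    def fern_index(patch, f):
        """Map a patch to a leaf index of fern f via N_TESTS pixel comparisons."""
        idx = 0
        for t in range(N_TESTS):
            (y1, x1), (y2, x2) = tests[f, t]
            idx = (idx << 1) | int(patch[y1, x1] > patch[y2, x2])
        return idx

    def train(patches, labels):
        """Accumulate Laplace-smoothed per-leaf class counts for every fern."""
        counts = np.ones((N_FERNS, 2 ** N_TESTS, CLASSES))
        for patch, label in zip(patches, labels):
            for f in range(N_FERNS):
                counts[f, fern_index(patch, f), label] += 1
        return counts / counts.sum(axis=2, keepdims=True)   # leaf posteriors

    def marker_probability(patch, posteriors):
        """Combine the leaf posteriors of all ferns (semi-naive Bayes)."""
        log_p = np.zeros(CLASSES)
        for f in range(N_FERNS):
            log_p += np.log(posteriors[f, fern_index(patch, f)])
        p = np.exp(log_p - log_p.max())
        return (p / p.sum())[1]          # probability that the patch shows the marker

    def make_marker_patch():
        """Synthetic stand-in for a patch sampled from the marker image."""
        p = rng.integers(0, 40, (PATCH, PATCH))
        p[:, : PATCH // 2] += 180        # simple structure: bright left half
        return p

    marker_patches = [make_marker_patch() for _ in range(200)]
    background = [rng.integers(0, 256, (PATCH, PATCH)) for _ in range(200)]
    post = train(marker_patches + background, [1] * 200 + [0] * 200)
    print(marker_probability(make_marker_patch(), post))   # typically close to 1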
  • the marker correspondence information storage unit 109 stores data representing marker characteristics in accordance with the technique used for marker detection.
  • The above technique has the notable advantage that there is no need to embed a dedicated detection code in the marker, so that an arbitrary image or three-dimensional object can be used as a marker; however, the embodiment is not intended to be limited to that form.
  • For example, a marker in which a dedicated code is written may be used, and the marker detection unit 110 may detect the marker by reading that code.
  • The space definition unit 112 analyzes the captured image to track the environmental shape formed by the objects existing in the real space being photographed (the photographed space) and the position and orientation of the camera photographing it. It then defines a three-dimensional world coordinate system in the photographed space and sequentially defines the camera coordinate system in accordance with the movement of the camera. For example, the SLAM (Simultaneous Localization And Mapping) method is used as a technique for tracking the environmental shape and the position and orientation of the camera.
  • The SLAM method tracks, for each local patch containing a feature point detected from the captured image, the movement of that feature point, and updates a predetermined set of state variables at each time step based on the tracking result.
  • By using as the state variables the position and orientation (rotation angle) of the camera, its moving speed and angular velocity, the positions of one or more feature points of objects existing in the photographed space, and so on, the positional relationship (distance and angle) between the photographed space and the sensor surface of the camera, and hence the positional relationship between the world coordinate system and the camera coordinate system, can be acquired for each captured image (a simplified sketch of feature-point-based camera tracking is given below).
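  • SLAM proper maintains a full state-estimation filter, but the flavor of camera tracking from feature points can be sketched as follows. This is an assumption-laden stand-in using OpenCV, not the method of the embodiment; without an anchor such as the known marker size, the recovered translation is only defined up to scale.
    # Illustrative frame-to-frame camera tracking from tracked feature points.
    # Assumes OpenCV; the intrinsics K and all thresholds are made-up values.
    import cv2
    import numpy as np

    K = np.array([[700.0, 0, 320], [0, 700.0, 240], [0, 0, 1]])   # assumed intrinsics

    def track_camera(frames):
        """Yield (R, t): the accumulated first-frame-to-current-camera transform."""
        R_acc, t_acc = np.eye(3), np.zeros((3, 1))
        prev = cv2.cvtColor(frames[0], cv2.COLOR_BGR2GRAY)
        pts = cv2.goodFeaturesToTrack(prev, maxCorners=500, qualityLevel=0.01, minDistance=7)
        for frame in frames[1:]:
            gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
            nxt, status, _ = cv2.calcOpticalFlowPyrLK(prev, gray, pts, None)
            good_old = pts[status.ravel() == 1]
            good_new = nxt[status.ravel() == 1]
            if len(good_new) >= 8:
                E, _ = cv2.findEssentialMat(good_old, good_new, K, method=cv2.RANSAC)
                _, R, t, _ = cv2.recoverPose(E, good_old, good_new, K)
                R_acc = R @ R_acc            # compose the incremental rotation
                t_acc = R @ t_acc + t        # translation is recovered up to scale only
            yield R_acc, t_acc.ravel()
            prev, pts = gray, good_new.reshape(-1, 1, 2)
            if len(pts) < 100:               # re-detect features when they thin out
                pts = cv2.goodFeaturesToTrack(prev, maxCorners=500, qualityLevel=0.01, minDistance=7)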
  • the camera may be a stereo camera, or an infrared irradiation unit and an infrared sensor may be separately provided, and the distance from the camera to the subject may be acquired by a known method to acquire the environmental shape and the like.
  • the position and orientation of the camera may be calculated based on output signals from the motion sensor 67 and the geomagnetic sensor 68 provided in the information processing apparatus 10.
  • Various means for image analysis performed by the marker detection unit 110 and the space definition unit 112 are conceivable as described above, and are described in, for example, Patent Document 1. Therefore, detailed description thereof is omitted here.
  • the information processing unit 114 executes information processing such as a game associated with the marker detected by the marker detection unit 110. Therefore, the marker correspondence information storage unit 109 stores a marker and a game that starts when the marker is detected in association with each other. The marker correspondence information storage unit 109 further stores an object to be drawn in each game in association with each other. The information processing unit 114 determines the placement of each object in the world coordinate system defined by the space definition unit 112 and requests the output image generation unit 118 to generate an output image including drawing of the object.
  • the output image generation unit 118 reads the captured image data from the captured image storage unit 106, and draws an object on the captured image to generate an output image.
  • the object model data is stored in the object data storage unit 116 together with the identification information.
  • The drawing process performed here converts objects placed in the world coordinate system into the camera coordinate system defined from the position and orientation of the camera and projects them onto the screen; the basic processing procedure may be the same as in general computer graphics (a minimal projection sketch follows).
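  • As a concrete illustration of that projection step (pinhole-camera arithmetic with assumed intrinsics and pose, not values from the embodiment), world-space points of a virtual object can be transformed into the camera coordinate system, projected onto the screen, and overlaid on the captured image as follows:
    # Minimal world -> camera -> screen projection and compositing sketch.
    # K, R, t, and the cube are illustrative values, not values from the embodiment.
    import numpy as np

    K = np.array([[700.0, 0, 320],     # focal lengths and principal point (assumed)
                  [0, 700.0, 240],
                  [0, 0, 1]])

    def project_points(world_pts, R, t):
        """world_pts: (N, 3) points in the world coordinate system.
        R (3x3), t (3,): world-to-camera rotation and translation from the
        space definition unit. Returns (N, 2) screen pixel coordinates."""
        cam = (R @ world_pts.T).T + t          # world -> camera coordinate system
        uvw = (K @ cam.T).T                    # camera -> homogeneous image coords
        return uvw[:, :2] / uvw[:, 2:3]        # perspective divide -> screen pixels

    def overlay(captured_bgr, pixels, color=(0, 255, 0)):
        """Naively splat projected object points over the captured image."""
        out = captured_bgr.copy()
        h, w = out.shape[:2]
        for u, v in pixels.astype(int):
            if 0 <= u < w and 0 <= v < h:
                out[v, u] = color
        return out

    # Example: corners of a 40 mm virtual cube sitting on the reference plane.
    cube = np.array([[x, y, z] for x in (0, 40) for y in (0, 40) for z in (0, 40)], float)
    R, t = np.eye(3), np.array([0.0, 0.0, 300.0])    # camera 300 mm away (assumed)
    frame = np.zeros((480, 640, 3), np.uint8)        # stand-in for the captured image
    composited = overlay(frame, project_points(cube, R, t))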
  • the generated output image data is output to the display device 20 via the frame memory and displayed immediately. Note that during a period in which no marker is detected in the captured image, the output image generation unit 118 may use the captured image as it is as an output image.
  • The object data storage unit 116 further stores information that defines the actions of characters included as objects. Although a specific example will be described later, in the present embodiment the distance and relative angle between the camera and a character can be specified sequentially, so when those parameters satisfy a predetermined condition the character is made to perform a predetermined motion (hereinafter referred to as a "specific motion"). Thereby, interaction between the camera moved by the user and the virtual character is realized (a small sketch of such a trigger is given below).
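  • A minimal sketch of such a trigger (the thresholds and action names are assumptions for illustration): the distance and viewing angle between the camera and the character are computed from the tracked pose each frame, and a specific motion is selected when they satisfy a condition.
    # Hedged sketch of triggering a "specific motion" from the camera-character
    # distance and relative angle; thresholds and action ids are assumptions.
    import numpy as np

    def camera_character_relation(cam_pos, cam_forward, char_pos):
        """Return (distance, angle in degrees) between the camera's viewing
        direction and the direction from the camera to the character."""
        to_char = char_pos - cam_pos
        dist = np.linalg.norm(to_char)
        cosang = np.dot(cam_forward, to_char) / (np.linalg.norm(cam_forward) * dist)
        return dist, np.degrees(np.arccos(np.clip(cosang, -1.0, 1.0)))

    def select_motion(dist_mm, angle_deg):
        """Pick an action when the camera is close and roughly faces the character."""
        if dist_mm < 200 and angle_deg < 15:
            return "look_into_camera"     # hypothetical action id
        if dist_mm < 400:
            return "wave"                 # hypothetical action id
        return "idle"

    dist, ang = camera_character_relation(
        cam_pos=np.array([0.0, 150.0, 300.0]),
        cam_forward=np.array([0.0, -0.3, -1.0]),
        char_pos=np.array([0.0, 0.0, 0.0]))
    print(select_motion(dist, ang))       # "wave" for this camera placement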
  • FIG. 5 is a diagram for explaining an example of the usage environment of the information processing apparatus 10 in the present embodiment.
  • a real space 120 is provided with a table 122, on which a clock 124 and a pencil stand 126 are placed.
  • the user places the marker 128 a on the table 122 and images it with the rear camera 31 of the information processing apparatus 10.
  • the marker 128a is a card on which a picture of a cat is drawn.
  • the field of view of the rear camera 31 is an area indicated by a dotted rectangle 130, for example.
  • FIG. 6 illustrates the relationship between the real space and the coordinate system for drawing the object, and shows a state where the real space 120 in FIG. 5 is viewed from the upper right.
  • a clock 124, a pencil stand 126, and a marker 128 a are placed on the table 122, and the camera sensor surface coincides with the screen 140.
  • Specifically, the space definition unit 112 reads the actual size of the marker from the marker correspondence information storage unit 109 and compares it with the marker image in the captured image. Thereby, the initial values of the distance and rotation angle from the marker 128a to the screen 140, and hence the position and orientation of the camera, can be derived (a sketch of this initial pose estimation is given below).
  • As a result, the relationship between the world coordinate system and the camera coordinate system is determined, and projection of objects placed in the world coordinate system onto the screen 140 becomes possible. Further, the space definition unit 112 tracks feature points in the captured image, such as the marker 128a, the table 122, the clock 124, and the pencil stand 126, by the SLAM method, so that the relationship between the two coordinate systems can be obtained sequentially even while the camera moves and changes its posture. Thereby, changes in the object image corresponding to the movement of the camera can be expressed accurately.
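  • The following sketch shows this kind of initial pose estimation from the known marker size, using OpenCV's solvePnP purely for illustration; the marker dimensions, camera intrinsics, and detected corner coordinates are assumptions, not values from the embodiment.
    # Initial camera pose from a marker of known physical size (illustrative).
    import cv2
    import numpy as np

    K = np.array([[700.0, 0, 320], [0, 700.0, 240], [0, 0, 1]])   # assumed intrinsics
    MARKER_W, MARKER_H = 50.0, 80.0     # assumed card size in mm (the "size" column)

    # Marker corners in the world coordinate system: the card lies on the table,
    # whose surface is taken as the reference plane z = 0.
    object_pts = np.array([[0, 0, 0],
                           [MARKER_W, 0, 0],
                           [MARKER_W, MARKER_H, 0],
                           [0, MARKER_H, 0]], dtype=np.float64)

    def camera_pose_from_marker(corner_pixels):
        """corner_pixels: (4, 2) detected marker corners in the captured image,
        in the same order as object_pts. Returns world-to-camera R (3x3), t (3,)."""
        ok, rvec, tvec = cv2.solvePnP(object_pts,
                                      np.asarray(corner_pixels, np.float64), K, None)
        if not ok:
            raise RuntimeError("pose estimation failed")
        R, _ = cv2.Rodrigues(rvec)
        return R, tvec.ravel()

    # Hypothetical corners reported by the marker detection unit:
    R, t = camera_pose_from_marker([[300, 220], [360, 225], [355, 320], [295, 315]])
    print(np.linalg.norm(t))    # distance from the marker origin to the camera (mm)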
  • In many cases, the plane on which the marker 128a lies is the top surface of a table or the floor; by defining that surface as a reference surface, objects can be expressed as being placed on that surface.
  • FIG. 7 is a flowchart illustrating a processing procedure in which the information processing apparatus 10 executes a game with object drawing triggered by marker detection.
  • In some cases, the information processing apparatus does not start the game itself in direct response to the marker, but starts processing of the game and other functions after accepting various mode selections related to the game. Therefore, as a concept encompassing these, what the information processing apparatus 10 starts is called an "application".
  • the captured image acquisition unit 104 starts acquisition of captured image data by causing the rear camera 31 to start capturing (S10). Thereafter, the captured image acquisition unit 104 sequentially acquires captured images (image frames) at a predetermined rate.
  • the captured image data is supplied to the image analysis unit 108 and stored in the captured image storage unit 106.
  • the output image generation unit 118 reads the photographed image data from the photographed image storage unit 106 and outputs it to the display device 20 to display the photographed image as it is like the electronic viewfinder (S12). Thereby, the user can move the information processing apparatus 10 while confirming the image photographed by the rear camera 31 with the display device 20, and can obtain a desired visual field.
  • the marker detection unit 110 of the image analysis unit 108 performs marker detection processing by analyzing the captured image (S14). Therefore, the marker detection unit 110 refers to data representing the feature of each marker registered in the marker correspondence information storage unit 109.
  • While no marker is detected, the output image generation unit 118 receives a notification to that effect from the image analysis unit 108 and continues to output the captured image as it is to the display device 20 (N in S16, S12).
  • When a marker is detected, the space definition unit 112 determines the world coordinate system as shown in FIG. 6 and sets the screen 140 with respect to the world coordinate system according to the position and orientation of the camera (S18). Since the position and orientation of the camera may change constantly, their movement is tracked by the SLAM method described above.
  • the marker detection unit 110 identifies an application associated with the detected marker by referring to the marker correspondence information storage unit 109, and notifies the information processing unit 114 of identification information of the application (S20). In response to this, the information processing unit 114 executes the notified application in cooperation with the output image generation unit 118 (S22).
  • That is, objects such as characters and game tools associated with the application are placed in the world coordinate system and drawn by projecting them onto the screen 140, and the result is superimposed on the captured image to generate the output image.
  • the character is caused to perform a specific action according to the movement of the camera, or the game is advanced according to a user operation.
  • the generated output image is sequentially displayed on the display device 20, so that an image that changes according to the progress of processing such as camera movement or game can be displayed.
  • Until the user performs an operation that stops the processing as a whole, such as ending the shooting, the shooting is continued, markers are detected, and the corresponding application is executed (N in S26, S12 to S22).
  • When an operation for ending the shooting is performed, the series of processing ends (Y in S26). A schematic rendering of this loop is sketched below.
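  • Rendered schematically, the flowchart corresponds to a loop of the following shape. The classes and method names are inventions that mirror the functional blocks of FIG. 4, with trivial stand-ins so that the control flow can actually run; they are not an API defined by the embodiment.
    # Schematic rendering of S10-S26 with toy stand-in components.
    class FakeCamera:
        def __init__(self, frames): self.frames = list(frames)
        def get_frame(self): return self.frames.pop(0) if self.frames else None

    class FakeDetector:
        def detect(self, frame): return ["001"] if "cat card" in frame else []
        def lookup_application(self, markers): return "air hockey game"

    def run(camera, detector):
        # S10: shooting has started; loop until the user ends it (S26).
        while True:
            frame = camera.get_frame()
            if frame is None:
                break                                         # Y in S26
            markers = detector.detect(frame)                  # S14
            if not markers:
                print("display captured image as-is")         # N in S16 -> S12
                continue
            print("define world coordinate system / screen")  # S18
            app = detector.lookup_application(markers)        # S20
            print(f"execute '{app}' and superimpose objects") # S22

    run(FakeCamera(["empty table", "cat card on table", "cat card on table"]),
        FakeDetector())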
  • FIG. 8 shows an example of a screen displayed on the display device 20 at the application execution stage in S22 of FIG.
  • This figure shows, for example, one screen of an application that has started execution as a result of detecting the marker 128a after photographing the real space 120 shown in FIG.
  • A cat character 150a, which is an object associated with the application or with the marker 128a, and icons 152a, 152b, and 152c are drawn so as to surround the marker 128b appearing in the captured image.
  • Here, the term "icon" is used for anything for which designating the corresponding area starts an associated process or selects the corresponding item, regardless of the purpose of use, the process to be started, the type of selection target, or the shape. Therefore, button graphics such as those shown in the figure are also called "icons".
  • the icons 152a, 152b, and 152c are for selecting a function to be executed by the application.
  • The character 150a and the icons 152a, 152b, and 152c are first placed in the world coordinate system and then projected and drawn on the screen that matches the sensor surface of the camera, so they can be expressed as if they were actually placed on the table.
  • The information processing unit 114 associates the areas of the front touchpad 21 corresponding to the positions of the icons 152a, 152b, and 152c on the display screen with the functions represented by the icons. Thereby, when the user touches the front touchpad 21 at the position of a desired icon, the information processing unit 114 starts processing of the corresponding function. Since the icons 152a, 152b, and 152c are drawn as if placed on a table in the real world, their positions on the screen change when the field of view of the camera changes. Therefore, the information processing unit 114 constantly acquires information on the drawing area of each icon from the output image generation unit 118 and updates the detection area of each icon on the front touchpad 21 in accordance with the movement of the camera (a small sketch of such a detection-area update is given below).
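  • The following sketch illustrates this kind of detection-area update (class and method names are illustrative): each frame, the icon's projected drawing area is handed over and the rectangle used to test front-touchpad contacts is refreshed.
    # Keeping an icon's touch detection area in sync with its drawn position.
    # All names are illustrative, not from the embodiment.
    class IconHitArea:
        def __init__(self, name, action):
            self.name = name
            self.action = action          # function to start when the icon is touched
            self.box = None               # (umin, vmin, umax, vmax) in screen pixels

        def update_from_drawing_area(self, projected_corners):
            """projected_corners: iterable of (u, v) screen positions of the icon's
            corners after projection for the current camera pose."""
            us = [u for u, _ in projected_corners]
            vs = [v for _, v in projected_corners]
            self.box = (min(us), min(vs), max(us), max(vs))

        def handle_touch(self, u, v):
            """Called with a front touchpad contact; starts the function if it hits."""
            umin, vmin, umax, vmax = self.box
            if umin <= u <= umax and vmin <= v <= vmax:
                self.action()
                return True
            return False

    # Usage: the "how to play" icon re-projected after the camera moved, then touched.
    icon = IconHitArea("how to play", action=lambda: print("show explanation"))
    icon.update_from_drawing_area([(100, 300), (180, 300), (180, 340), (100, 340)])
    icon.handle_touch(150, 320)    # inside the box -> prints "show explanation"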
  • the icons 152a, 152b, and 152c may also have a three-dimensional shape.
  • FIG. 9 shows an example of the structure of data stored in the marker correspondence information storage unit 109, which is referred to when the marker detection unit 110 performs marker detection and corresponding application identification in S16 and S20 of FIG.
  • the marker correspondence information 300 includes an identification information column 302, a feature amount information column 304, a size column 306, and a corresponding application column 308.
  • In the identification information column 302, an identification number assigned to each marker to be registered is stored.
  • In the feature amount information column 304, identification information of a template image of the marker or of data representing its feature amount is stored.
  • In the figure, the names of images such as "cat image" and "fox image" are shown.
  • The data bodies, such as the images and feature amounts, are stored separately in association with the image names.
  • The feature amount information column 304 may store identification numbers of image data or feature amount data, the data itself, or a pointer indicating the storage address of the data.
  • In the size column 306, the size of each marker is stored. In the case of the figure, since a rectangular card is assumed as the marker, it is described in the format "vertical length × horizontal length (mm)", but the format may vary depending on the shape of the marker, and a combination of a plurality of parameters such as size and shape may also be used.
  • the corresponding application column 308 stores the identification information of the application associated with each marker. Although the names of games such as “air hockey game” and “card game” are shown in the same figure, an application identification number, a software main body, or a pointer indicating a storage address of the software may be stored.
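  • Expressed as a small lookup table, the marker correspondence information of FIG. 9 might look like the sketch below. The entries echo the examples in the text ("cat image", "fox image", "air hockey game"); the remaining values and the helper function are assumptions.
    # Sketch of the marker correspondence information as a lookup table.
    from dataclasses import dataclass

    @dataclass
    class MarkerRecord:
        marker_id: str
        feature_data: str            # name/pointer of the template image or feature set
        size_mm: tuple               # (vertical, horizontal) in mm for a rectangular card
        application: str             # identification of the application to start

    MARKER_TABLE = {
        "001": MarkerRecord("001", "cat image", (50, 80), "air hockey game"),
        "002": MarkerRecord("002", "fox image", (70, 110), "air hockey game"),
        "003": MarkerRecord("003", "rabbit image", (50, 80), "card game"),   # hypothetical
    }

    def application_for(detected_ids):
        """Return the application associated with the detected marker(s); markers and
        applications need not correspond one-to-one (several markers may map to the
        same application, or a combination of markers may be required)."""
        apps = {MARKER_TABLE[m].application for m in detected_ids if m in MARKER_TABLE}
        return apps.pop() if len(apps) == 1 else None

    print(application_for(["001"]))          # -> "air hockey game"
    print(application_for(["001", "002"]))   # both markers map to the same application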
  • the marker detection unit 110 reads out each registered marker image or its feature amount based on the identification information in the feature amount information column 304, and uses it to detect a marker in the captured image. By using the above method for detection, it is possible to detect a marker with high robustness against a change in magnification.
  • the marker detection unit 110 identifies an application associated with the detected marker with reference to the corresponding application column 308.
  • The space definition unit 112 defines the world coordinate system in accordance with the unit length of the real space based on the marker size described in the size column 306, and acquires the positional relationship between the subject, including the marker, and the camera.
  • the marker and application do not have to correspond one-to-one.
  • the marker with the identification information “001” and the marker with the identification information “002” both correspond to the “air hockey game”.
  • a combination of multiple markers may correspond to one application.
  • a predetermined application or game can be executed only when a plurality of markers are collected, and enjoyment of marker collection can also be provided.
  • Like a stamp rally, a combination of a plurality of seals collected on a single card may also be used.
  • Depending on the size of the marker, such as the marker with identification information "001" and the marker with identification information "002" in FIG. 9, the size of the object representing the game environment, such as the air hockey table in the air hockey game, may be changed.
  • Since the unit length of the world coordinate system can be made to correspond to the unit length of the real space based on the marker size as described above, each object can be drawn assuming an arbitrary size in the real space.
  • When a plurality of markers are used, the size of the object may be changed depending on the interval between the placed markers (a simple sizing sketch is given below).
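  • A simple sizing sketch under these assumptions: because the world unit length is tied to real millimetres through the marker size, the drawn size of an environment object such as the air hockey table can be scaled with the detected marker. The reference numbers below are illustrative, not from the embodiment.
    # Scale a drawn environment object with the detected marker size (illustrative).
    REFERENCE_MARKER_MM = 50.0          # marker size the default table is designed for
    DEFAULT_TABLE_MM = (400.0, 700.0)   # assumed width, length of the air hockey table

    def table_size_for_marker(marker_mm):
        """Scale the air hockey table proportionally to the detected marker's size."""
        s = marker_mm / REFERENCE_MARKER_MM
        return (DEFAULT_TABLE_MM[0] * s, DEFAULT_TABLE_MM[1] * s)

    print(table_size_for_marker(50.0))    # reference-sized marker -> default table
    print(table_size_for_marker(100.0))   # a larger marker -> a table twice as large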
  • FIG. 10 is a flowchart illustrating an example of the processing procedure by which the information processing unit 114 executes the application corresponding to the marker in S22 of FIG. 7. Note that those skilled in the art will understand that the processing procedure and its contents can be changed in various ways depending on the content of the application, and the present embodiment is not limited to this. Further, this flowchart mainly shows the processing procedure for the application itself; processing for changes in the position and orientation of the camera, performed in parallel, will be described later.
  • First, the information processing unit 114 identifies the characters, icons, and the like associated with the application, and the output image generation unit 118 draws those objects on the photographed image (S30). The modeling data of each object is read from the object data storage unit 116 in accordance with a request from the information processing unit 114.
  • the initial screen displayed thereby is, for example, the screen example 148 shown in FIG.
  • When display of the explanatory note is selected, the output image generation unit 118 reads the explanatory note image from the object data storage unit 116 and superimposes it on the captured image (S34). At this time, it is desirable to display the explanatory image without destroying the real-space world in the captured image, rather than representing it flatly. For example, the explanatory image is texture-mapped onto the rectangular area of the captured image in which the marker appears, so that it looks as if an explanatory card were placed at the marker position.
  • the display is continued until the user performs an operation to end the display of the description (N in S36, S34).
  • When an operation to end the display of the explanation is performed and the battle mode is selected, the process shifts to the battle mode of S42 to S48 (Y in S36, Y in S40).
  • When the watching mode is selected instead, the process proceeds to the watching mode of S52 to S58 (Y in S36, N in S40, Y in S50). If neither is selected, the display returns to the initial screen displayed in S30 (Y in S36, N in S40, N in S50, N in S38, S30), unless an operation for terminating the application itself is performed. An icon for returning the display to the initial screen may be displayed together with the explanation.
  • In the battle mode, the initial positions and sizes of the objects to be drawn during the battle are first determined in the world coordinate system and projected onto the screen to be superimposed on the captured image (S42). Then, the game is advanced while the game tools and the objects of the opponent character are moved appropriately in accordance with user operations on the information processing apparatus 10 (S44).
  • the information processing unit 114 calculates the movement of the object, and the output image generation unit 118 updates the display image by drawing the object on the captured image at a predetermined frame rate (S46).
  • In the present embodiment, shooting and display of the real space with the camera provided in the information processing apparatus 10 constitute part of the user interface. Therefore, by using the movement of the information processing apparatus 10 itself, detected by the motion sensor 67, as a means of user operation in the game, a sense of unity is created in the series of operations including camera shooting, and the operation is easy for the user to understand.
  • other operation means may be used as appropriate.
  • the processes of S44 and S46 are continued until the battle is ended or the user performs an end operation (N in S48, S44, S46).
  • the display is returned to the initial screen displayed in S30 (Y in S48, N in S38, S30), unless an operation for ending the application itself is performed.
  • In the watching mode, the initial positions and sizes of the objects to be drawn during watching are determined in the world coordinate system and projected onto the screen to be superimposed on the captured image (S52).
  • a situation in which a plurality of characters start a battle is displayed.
  • the game is automatically advanced so that the characters battle each other according to the set level (S54).
  • the information processing unit 114 calculates the movement of the object, and the output image generation unit 118 updates the display image by drawing the object on the captured image at a predetermined frame rate (S56).
  • the processes of S54 and S56 are continued until the battle between the characters ends or the user performs an operation to end the watching (N of S58, S54, S56).
  • the display is returned to the initial screen displayed in S30 (Y in S58, N in S38, S30) unless an operation for ending the application itself is performed. If an operation for terminating the application is performed at any stage, the processing of the application is terminated (Y in S38).
  • the application termination operation may be accepted at any time during the execution of the application.
  • FIG. 11 shows an example of the explanatory note display screen displayed in S34 of FIG. 10. This screen is displayed, for example, after the "how to play" icon 152a is touched on the initial screen shown in FIG. 8. Further, in this example it is assumed that, based on the photographed image of the real space 120 shown in FIG. 5, the marker having the identification information "001" in the marker correspondence information 300 of FIG. 9 is detected and the "air hockey game" application is executed.
  • a description image 164a, a cat character 150b, an icon 162a for returning the description to the previous page, and an icon 162b for proceeding to the next page are drawn on the photographed image.
  • image data such as the image 164 b is stored in the object data storage unit 116.
  • the description image may be composed of a plurality of pages, in which case the icons 162a and 162b are drawn.
  • the description image can be expressed as if it is placed on a table in the real space as shown in the figure by texture mapping the region where the marker is originally reflected.
  • the image to be prepared as the description image may be a still image as shown in the figure or a moving image.
  • an application including a game since an application including a game is assumed, an image explaining how to play is displayed.
  • the content to be displayed is not limited to this. That is, the contents displayed may vary depending on the contents of the application such as an electronic book, a web page, and a movie.
  • the icon 162a, the icon 162b, and the character 150b are drawn by the same procedure as described in FIG.
  • When the page is switched, the explanatory image 164a is naturally replaced by crossfading it with the image of the previous or next page.
  • a page turning animation may be inserted.
  • the character 150b stands up and moves from the state on the initial screen shown in FIG. 8 so that the explanatory note can be seen. Furthermore, it may be expressed as if the character 150b is changing explanations or turning pages.
  • FIG. 12 shows a screen change example when the user brings the camera closer to the marker in the real space from the state where the screen example 160 shown in FIG. 11 is displayed.
  • the explanatory note image 164c, the icons 162c and 162d, and the character 150c are the same objects as the explanatory note image 164a, the icons 162a and 162b, and the character 150b in the screen example 160 of FIG.
  • Since the positions of the objects are fixed in the world coordinate system, when the camera is brought close to them they are shown enlarged, or parts of them protrude from the field of view, just as in real space. This gives the impression that the explanatory note and icons are actually placed on the table.
  • If the explanatory note is small and difficult to read, it can be viewed close up by bringing the camera closer.
  • FIG. 13 shows a screen example in the battle mode displayed in S46 of FIG. 10. This screen is displayed, for example, immediately after the "battle" icon 152b is touched on the initial screen shown in FIG. 8, or during the battle. In this example as well, it is assumed that the "air hockey game" application is executed, as in FIG. 11.
  • An air hockey table 182, a pack 186, mallets 184a and 184b, and a cat character 150c are drawn on the photographed image.
  • When the user moves the information processing apparatus 10, the mallet 184b in front of the user moves in the same direction on the air hockey table 182, thereby returning the pack 186.
  • the movement of the information processing apparatus 10 is detected by the motion sensor 67, and the movement amount and speed thereof are converted into the movement amount and speed of the mallet 184b drawn on the screen.
  • The conversion rule is also stored in the object data storage unit 116, and the information processing unit 114 refers to it to determine the movement of the mallet and the movement of the rebounding pack at each time step. This makes it possible to change the movement of the pack in the same way as in actual air hockey, for example by moving the mallet sideways at the moment of impact to change the angle at which the pack is hit, or by moving the mallet forward to smash (a small sketch of such a conversion is given below).
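  • The sketch below converts a sideways displacement of the apparatus, as reported by the motion sensor, into mallet movement on the table; the gain, the clamping range, and the class names are assumptions, since the text only states that a stored conversion rule is referred to at each time step.
    # Hedged sketch of converting device movement into mallet movement.
    import numpy as np

    TABLE_HALF_WIDTH = 200.0     # mm, assumed playable sideways range of the mallet
    GAIN = 1.5                   # assumed device-movement-to-mallet-movement gain

    class MalletController:
        def __init__(self):
            self.x = 0.0         # mallet position across the table, table coordinates

        def step(self, device_dx_mm, dt):
            """device_dx_mm: sideways displacement of the apparatus over the last time
            step, derived from the motion sensor. Returns position and velocity."""
            dx = GAIN * device_dx_mm
            self.x = float(np.clip(self.x + dx, -TABLE_HALF_WIDTH, TABLE_HALF_WIDTH))
            return self.x, dx / dt if dt > 0 else 0.0

    ctl = MalletController()
    print(ctl.step(device_dx_mm=12.0, dt=1 / 60))   # shift the device to the right
    print(ctl.step(device_dx_mm=-30.0, dt=1 / 60))  # quick move left before the hit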
  • The opponent cat character 150c is the same as the character drawn on the initial screen of FIG. 8. In the battle mode, the character 150c is drawn so as to face the camera, that is, the user, across the air hockey table.
  • The cat character 150c also holds a virtual information processing apparatus and moves its own mallet 184a by moving it. As a result, a sense of reality, as if the user were actually playing against a virtual character, can be produced.
  • The air hockey table 182 is drawn so as to be in a predetermined direction and position with respect to the marker 128c in the photographed image.
  • The air hockey table 182 in the figure is shown with only its top plate floating in the air, but a portion below the top plate may be added so that it is expressed as a three-dimensional structure, or a scoreboard or the like may further be expressed.
  • By making the top plate portion of the air hockey table 182 transparent or translucent, the portion of the photographed image that is concealed is reduced, and a stronger impression that the game is taking place in the real space can be given.
  • The air hockey table 182 may be disposed directly above the marker 128c, but when the top plate portion is transparent or translucent, it is desirable to shift it so that the pack 186 and the like do not become difficult to see because of the design of the marker.
  • In the present embodiment, the relative relationship between the camera and the world coordinate system can be specified based on objects around the marker, such as the clock and the pencil stand. Therefore, once the marker has been detected and the air hockey table 182 has been arranged to correspond to it, the air hockey table 182 can continue to be drawn at the same position in the world coordinate system from its position relative to the surrounding objects, even if the marker goes out of the field of view through a game operation or the like, or the user removes the marker. The same applies to other objects such as characters and icons.
  • FIG. 14 shows a screen example in the watching mode displayed in S56 of FIG. 10. This screen is displayed, for example, immediately after the "watching" icon 152c is touched on the initial screen shown in FIG. 8, or during watching. This example also assumes that the "air hockey game" application is executed, as in FIGS. 11 and 13.
  • an air hockey table 194, a pack 198, mallets 196a and 196b, a cat character 150d, and a fox character 192 are drawn on the photographed image.
  • drawing is performed so that the cat character 150d and the fox character 192 face each other with the air hockey table 194 interposed therebetween.
  • Each of them moves their mallet 196a, 196b by moving a virtual information processing apparatus.
  • the user can watch this pattern from various distances and angles by changing the position and posture of the camera.
  • Since the air hockey table is arranged immediately above the marker 128d in the real space, the user actually moves around the marker 128d to change the distance and angle of the camera.
  • The cat character 150d is the same as the character drawn on the initial screen of FIG. 8 and in the battle mode of FIG. 13.
  • The level set for the cat character 150d is raised; that is, a situation is created in which the user trains the cat character 150d as a player. Then, by having it fight other characters in the watching mode, it is possible to provide enjoyment such as cheering for the cat character 150d one has raised as it takes part in an external game, or strengthening it further in the battle mode.
  • FIG. 15 shows the structure of data stored in the marker correspondence information storage unit 109, which is referred to when the information processing unit 114 specifies an object to be drawn corresponding to the marker in S30, S34, S42, and S52 of FIG.
  • An example is shown.
  • the marker correspondence information 300 in FIG. 9 a mode is assumed in which the character drawn by the marker is changed even in the same application.
  • elements other than the character may be changed by the marker, or all elements may be uniquely determined by the application without being changed.
  • In the latter case, the branching of processing by the marker is completed once the application is identified with reference to the marker correspondence information 300 of FIG. 9, and the information processing unit 114 does not need to refer to the marker correspondence information storage unit 109 further.
  • the object information 400 includes an application field 402, a marker field 404, a first character field 406, a second character field 408, a display icon field 410, an explanation image field 412, and a tool field 414.
  • the application column 402 stores application identification information.
  • The identification information corresponds to the identification information stored in the corresponding application column 308 in the marker correspondence information 300 of FIG. 9. Although the names of games such as “air hockey game” and “card game” are shown in the figure, as with the marker correspondence information 300 of FIG. 9, an application identification number, the software itself, a pointer indicating the storage address of the software, or the like may be stored.
  • the marker column 404 stores marker identification numbers.
  • The identification number corresponds to the identification number stored in the identification information column 302 in the marker correspondence information 300 of FIG. 9.
  • The information processing unit 114 identifies the objects to be drawn from the object information 400 based on these pieces of identification information.
  • The first character column 406 stores identification information of the first character, which appears in all modes in the example of FIG. 10.
  • The second character column 408 stores identification information of the second character, which is the opponent in the watching mode in the example of FIG. 10.
  • the first character column 406 and the second character column 408 show the names of character models such as “cat” and “fox”, but they may be identification numbers.
  • the modeling data for each character is stored in the object data storage unit 116 in association with the character identification information.
  • The display icon column 410, the explanation image column 412, and the tool column 414 use the same notation and have the same relationship with the data body.
  • The display icon column 410 stores identification information of the icons drawn in S30 of FIG. 10. In the case of the figure, the names of icons such as “how to play”, “match”, and “watch” are shown. These correspond to the “how to play” icon 152a, the “match” icon 152b, and the “watch” icon 152c in the screen example 148 of FIG. 8.
  • The explanation image column 412 stores identification information of the explanatory text images drawn in S34 of FIG. 10.
  • In the case of the figure, the names of a group of explanatory images covering a plurality of pages, such as “hockey operation (1), (2), ...”, are shown.
  • One of the pages corresponds to the explanatory images 164a and 164b in FIG.
  • the tool column 414 stores identification information of game tool objects drawn in the battle mode or the watching mode.
  • In the case of the figure, the names of tool object models such as “air hockey table”, “mallet”, and “puck” are shown. These objects correspond to the air hockey tables 182 and 194, the pucks 186 and 198, and the mallets 184a, 184b, 196a, and 196b in the screen examples of FIGS. 13 and 14.
  • the information processing unit 114 identifies an object to be drawn in each mode of the application with reference to the object information 400, and requests the output image generation unit 118 to draw.
  • the output image generation unit 118 reads the modeling data stored in the object data storage unit 116 based on the identification information, and draws each object.
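  • To make the role of the object information 400 concrete, the following is a minimal Python sketch of how such a table might be consulted when choosing what to draw; the key names, values, and mode labels are illustrative assumptions and not the patent's actual data format.

      # A hypothetical table mirroring the columns of the object information 400
      # (FIG. 15); keys and values are illustrative only.
      OBJECT_INFO = {
          ("air hockey game", 1): {
              "first_character": "cat",
              "second_character": "fox",
              "display_icons": ["how to play", "match", "watch"],
              "explanation_images": ["hockey operation (1)", "hockey operation (2)"],
              "tools": ["air hockey table", "mallet", "puck"],
          },
      }

      def objects_to_draw(application, marker_id, mode):
          entry = OBJECT_INFO[(application, marker_id)]
          if mode == "icons":            # initial icon display (S30)
              return entry["display_icons"]
          if mode == "explanation":      # how-to-play pages (S34)
              return entry["explanation_images"]
          drawn = [entry["first_character"]] + entry["tools"]
          if mode == "watching":         # the opponent character appears only when watching
              drawn.append(entry["second_character"])
          return drawn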
  • the information processing unit 114 arranges an object to be drawn based on the position and size of the marker in the world coordinate system defined by the space definition unit 112. For example, as described above, by preparing markers of a plurality of sizes even in the same application, the size of an object such as a character or an air hockey table may be changed according to the size of the marker.
  • The object size may also be changed according to the progress of the game or a request from the user, regardless of the marker size. For example, when the character the user has raised defeats the opponent's character in the watching mode, a special battle mode may be provided in which the user can play with the raised character made as large as a human in real space and with the tool objects also rendered at actual size.
  • FIG. 16 shows an example of a screen displayed in the special battle mode.
  • The screen example 200 basically has the same configuration as the screen example 180 of the normal battle mode shown in FIG. 13. The difference is that the air hockey table 202 is actual size and the cat character 150e is the same size as a human being. Therefore, compared with the case of FIG. 13, the user photographs the real space from a position set back from the table on which the marker 128d is placed. The arrangement of the air hockey table 202 is also adjusted as appropriate according to its change in size; in the figure, the air hockey table 202 is disposed so as to overlap the marker 128d.
  • FIG. 17 is a flowchart showing a processing procedure for drawing an object in the battle mode of FIG. 10 when it is allowed to change the size of the object in this way.
  • the information processing unit 114 first determines the size and arrangement of objects in the world coordinate system (S70). As described above, the size of the object is uniquely derived from the marker size according to a predetermined rule, or is determined by referring to a setting value for a special battle mode.
  • Next, the information processing unit 114 adjusts the sensitivity of the movement of the tool object with respect to the movement of the information processing apparatus 10 (S72).
  • The movable width of the mallet differs greatly between a desktop-sized air hockey table as shown in FIG. 13 and a full-size air hockey table as shown in FIG. 16. For this reason, if the conversion rule from the movement amount of the information processing apparatus 10 to the movement amount of the mallet were fixed regardless of the size, it is conceivable that the information processing apparatus 10 would have to be moved unnaturally, or that the mallet would move too much, depending on the size.
  • the “sensitivity” may be a parameter indicating the magnitude of the response of the tool object to the movement of the information processing apparatus, and variables to be used are not particularly limited, such as a movement amount ratio, a speed ratio, and an acceleration ratio.
  • the ratio Vm / Va of the mallet speed Vm obtained with respect to the speed Va of the information processing apparatus 10 is set to be inversely proportional to the magnification of the object size. That is, the mallet speed Vm is obtained from the object size magnification S and the speed Va of the information processing apparatus 10 as follows.
  • Vm = kVa / S
  • the above equation is only an example, and sensitivity adjustment may be performed by various methods.
  • the ratio of the mallet movement amount to the air hockey table width may be calculated from the movement amount of the information processing apparatus 10.
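  • As a worked illustration of the sensitivity adjustment of S72, the sketch below simply evaluates the example formula Vm = kVa / S; the function name and the treatment of k as a tunable constant are assumptions for illustration, not a prescribed implementation.

      def mallet_speed(device_speed_va, size_magnification_s, k=1.0):
          # Example rule from the description: the mallet speed Vm is made
          # inversely proportional to the object size magnification S.
          return k * device_speed_va / size_magnification_s

      # With S = 4, the same device speed produces a quarter of the mallet speed:
      # mallet_speed(0.8, 4.0) == 0.2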
  • The processing of S70 and S72 determines the object size and position, and the movement of the objects relative to the movement of the information processing apparatus 10. Therefore, at the start of the game, the objects are arranged and drawn accordingly, and thereafter the drawing process is repeated while moving the objects accordingly (S74).
  • FIG. 18 shows an example of the structure of data stored in the object data storage unit 116 and referred to by the information processing unit 114 for causing the character to perform a specific action when the application is executed.
  • the specific action setting information 500 includes a character field 502, a target part field 504, a relative angle threshold value field 506, a distance threshold value field 508, and an action field 510.
  • the character column 502 stores identification information of the character that is the subject of action. In the case of the figure, the name of the character model such as “cat” is shown.
  • The identification information corresponds to the identification information stored in the first character column 406 in the object information 400 of FIG. 15.
  • In the target part column 504, the relative angle threshold column 506, and the distance threshold column 508, combinations of conditions for causing each character to perform a specific action are stored. Since such conditions can be set freely with respect to the distance and relative angle between the character and the camera, various other contents and expression formats of the conditions are possible. In the case of the figure, for each part of the character, a condition for determining a situation in which the camera approaches that part from within a predetermined angle is shown as an example.
  • FIG. 19 is a diagram for explaining how the conditions set in the specific action setting information 500 of FIG. 18 are expressed.
  • a part 210 represents a part where conditions are set, such as a character's head or hand.
  • Thresholds are set for the angle θ (0° ≤ θ ≤ 180°) between the normal vector n1 at the reference point 212 of the part (the vertex in the figure) and the normal vector n2 of the screen 214 corresponding to the sensor surface of the camera, and for the distance A between the reference point 212 and the center of the screen 214.
  • The larger the angle θ and the smaller the distance A, the more the camera is in a state of peering into the reference point 212. When the angle θ is equal to or greater than its threshold and the distance A is equal to or less than its threshold, the specific action is generated.
  • In the specific action setting information 500, the target part column 504 stores the target part for which the condition is set and its reference point, the relative angle threshold column 506 stores the threshold of the angle θ, and the distance threshold column 508 stores the threshold of the distance A.
  • the action column 510 stores identification information of a specific action performed by the character when a condition set for the target part is satisfied. In the setting example shown in the figure, when the camera moves so as to look into the top of the cat character, the cat character jumps in the direction of the camera (second line of the specific action setting information 500).
  • The character performs an action of pushing out the item it is holding toward the camera, whereby the user acquires the item (third line of the specific action setting information 500).
  • the character speaks a hint or the like related to the game strategy (the fourth line of the specific action setting information 500).
  • A program for controlling the actual movement of the character is stored separately in the object data storage unit 116 in correspondence with the identification information of each specific action described in the action column 510.
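  • A minimal sketch of evaluating one such condition, assuming the angle θ and distance A of FIG. 19 are computed from the part's normal and reference point and the screen's normal and center; the function and argument names are illustrative assumptions.

      import numpy as np

      def peek_condition_met(part_normal, part_point, screen_normal, screen_center,
                             angle_threshold_deg, distance_threshold):
          # Angle between the part's normal n1 and the screen's normal n2.
          n1 = np.asarray(part_normal, dtype=float)
          n2 = np.asarray(screen_normal, dtype=float)
          n1 /= np.linalg.norm(n1)
          n2 /= np.linalg.norm(n2)
          angle = np.degrees(np.arccos(np.clip(np.dot(n1, n2), -1.0, 1.0)))
          # Distance A between the reference point and the screen center.
          distance = np.linalg.norm(np.asarray(part_point, dtype=float)
                                    - np.asarray(screen_center, dtype=float))
          # The specific action fires when the angle is at or above its threshold
          # and the distance is at or below its threshold.
          return angle >= angle_threshold_deg and distance <= distance_threshold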
  • In this way, conditions can be set for the distance and relative angle between individual parts of an object and the camera, so that a rich variety of object reactions can be prepared. Because the position and orientation of the information processing apparatus can be changed with a high degree of freedom relative to the position and orientation of objects that change from moment to moment, using those combinations as input values allows the user to enjoy unexpected output results.
  • FIG. 20 is a flowchart illustrating a processing procedure in which the information processing unit 114 and the output image generation unit 118 update the display image in response to changes in the position and orientation of the camera. This process is performed in parallel with the application execution process shown in FIG. 10. First, the information processing unit 114 constantly acquires information on the position and orientation of the camera from the space definition unit 112 and monitors whether they have changed (S80). If there is no change, monitoring simply continues (N in S80).
  • When a change occurs, the output image generation unit 118 acquires a new camera coordinate system from the space definition unit 112 (S82). This means that the screen 140 in FIG. 6 has been moved in response to the change in the position and orientation of the camera.
  • the specific action setting information 500 in FIG. 18 is referred to, and it is determined whether or not the distance and relative angle between the object drawn at that time and the camera satisfy the set condition (S84). If not satisfied (N in S84), the output image generation unit 118 projects and redraws each object on the screen after movement, and updates the display image by superimposing it on the captured image (S88). In this case, naturally, the movement of the object corresponding to the application process is expressed.
  • If the condition is satisfied (Y in S84), the information processing unit 114 acquires the identification information of the specific action associated with that condition from the specific action setting information 500. Then, based on the action generation program corresponding to the identification information, the character is made to perform the specific action (S86). The output image generation unit 118 then projects and redraws each object, including the character performing the specific action, on the screen after movement, and updates the display image by superimposing it on the captured image (S88).
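  • The update procedure of FIG. 20 can be summarized schematically as below; every interface shown (the component methods and the pose representation) is an assumption made only to illustrate the order of S80 to S88, not the actual implementation.

      def update_display_once(space_definition, info_processing, renderer, prev_pose):
          # S80: monitor the camera position and orientation.
          pose = space_definition.camera_pose()
          if pose == prev_pose:          # N in S80: no change, keep monitoring
              return prev_pose
          # S82: acquire the new camera coordinate system (the moved screen 140).
          screen = space_definition.camera_coordinate_system()
          # S84/S86: check the specific action conditions for each drawn object
          # and run the corresponding action program when one is satisfied.
          for obj in info_processing.drawn_objects():
              action = info_processing.matching_specific_action(obj, pose)
              if action is not None:
                  obj.perform(action)
          # S88: project and redraw every object on the moved screen and
          # superimpose the result on the captured image.
          renderer.redraw(screen)
          return pose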
  • As described above, the processing corresponding to an icon is started by touching the front touch pad 21 at the position of the icon on the display screen.
  • On the other hand, the icons are expressed as if they were also placed on a plane, such as a table, on which the marker is placed in real space. Therefore, as another operation method, a touch on an icon may be recognized when the user actually reaches out to the position in real space where the icon appears on the screen to be placed and touches the plane such as the table.
  • FIG. 21 shows an example of a display screen in a mode in which icon operation is realized by finger movement in real space.
  • the user has put out the right hand 222 within the field of view of the camera while the screen example 148 shown in FIG. 8 is displayed.
  • The user's left hand is holding the information processing apparatus 10 and photographing the real space from a position closer to the table than in the case of FIG. 8.
  • On the screen, the captured image including the marker 128a and the user's right hand 222 is mixed with the cat character 150a, the icon 152c, and the like drawn with computer graphics.
  • the user's right hand 222 in the captured image is recognized by the existing hand recognition technology.
  • When the finger stops within the area of the icon 152c in the captured image for a time longer than a predetermined threshold, it is determined that a touch has been made.
  • Alternatively, whether the finger has touched the area corresponding to the icon 152c on the table in real space may be determined by tracking the feature points of the finger using the SLAM method or the like. If it is determined that a touch has been made, the information processing unit 114 starts the processing corresponding to the icon.
  • Hand recognition and tracking of feature points are performed at any time by the image analysis unit 108. Therefore, when the touch determination is performed based on the finger movement in the real space, the information processing unit 114 sequentially notifies the image analysis unit 108 of the position of the currently displayed icon in the world coordinate system. Then, the image analysis unit 108 performs touch determination in the three-dimensional space and notifies the information processing unit 114 of the result. When touch determination is performed in the captured image, the information processing unit 114 acquires the position of the finger in the captured image from the image analysis unit 108 as needed, and performs the touch determination by itself.
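  • A minimal sketch of the first determination method (a touch is recognized when the fingertip dwells within the icon area of the captured image for longer than a threshold); the class name, rectangle representation, and threshold value are illustrative assumptions.

      import time

      class DwellTouchDetector:
          # Reports a "touch" when the fingertip stays inside the icon's
          # rectangle in the captured image for longer than a dwell threshold.
          def __init__(self, dwell_threshold_s=0.5):
              self.dwell_threshold_s = dwell_threshold_s
              self._entered_at = None

          def update(self, fingertip_xy, icon_rect):
              x, y = fingertip_xy
              left, top, right, bottom = icon_rect
              inside = left <= x <= right and top <= y <= bottom
              now = time.monotonic()
              if not inside:
                  self._entered_at = None   # left the area: reset the dwell timer
                  return False
              if self._entered_at is None:
                  self._entered_at = now    # just entered the area
              return (now - self._entered_at) >= self.dwell_threshold_s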
  • In this case, the virtually drawn icon 152c is placed on the table in the photographed real space, and the user's actual right hand 222 is further above the icon 152c, so that the drawn object is sandwiched within the photographed image. Therefore, the output image generation unit 118 erases the portion of the icon 152c hidden by the right hand 222 using general hidden surface removal processing.
  • the contour line of the right hand 222 can also be specified by the above hand recognition technique or the existing contour tracking technique.
  • Further, the output image generation unit 118 creates a sense of reality by rendering a shadow 224 of the finger as the finger approaches the icon 152c. This process can also be realized by a general shadowing technique. At this time, a global illumination model may be used, or a shadow may be cast by a ray tracing method or the like after estimating the light source from the captured image.
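  • The occlusion described above can be approximated, for instance, by excluding icon pixels wherever a mask of the recognized hand is set when compositing the drawn layer over the captured frame; the sketch below assumes float images and a binary hand mask, and is a simplification of general hidden surface removal rather than the patent's method.

      import numpy as np

      def composite_with_hand_occlusion(captured, icon_layer, icon_alpha, hand_mask):
          # captured, icon_layer: HxWx3 float images; icon_alpha, hand_mask: HxW.
          # Wherever the hand mask is set, the icon's alpha is suppressed so the
          # real hand stays in front of the virtually drawn icon.
          alpha = icon_alpha * (1.0 - hand_mask)
          alpha = alpha[..., None]                     # broadcast over RGB
          return captured * (1.0 - alpha) + icon_layer * alpha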
  • According to the embodiment described above, by photographing a marker placed in real space with an information processing apparatus provided with a camera, the information processing apparatus starts information processing corresponding to the marker.
  • The object image is superimposed on the photographed image so that the object drawn by computer graphics appears to exist in the real space being photographed, and objects such as tools used in the game are moved according to the movement of the information processing apparatus.
  • The movement of the character is also changed depending on the distance and relative angle between the camera and the character.
  • Icons, explanatory text, and other elements are likewise drawn as if they were placed in real space.
  • the icon can be operated by touching a touch pad on a display screen provided in the information processing apparatus or a surface in a real space where the icon is virtually placed. In this way, all operations and information display can be performed in the augmented reality space existing in the display screen, and a user interface that maintains the world view can be realized. In addition, more natural and intuitive operation is possible compared to a general input device or a cursor.
  • Since the objects to be drawn are generated in correspondence with a marker whose size is known, drawing can be performed with the size the objects would have in real space in mind. It is therefore also possible to make the character the same size as an actual human, or to make the tools full size, producing a screen world in which the user appears to actually be playing with the character in his or her own room.
  • When the object size is variable, operability can be maintained even if the size changes, by adjusting the sensitivity of the movement of objects such as tools with respect to the movement of the information processing apparatus according to that size.
  • 10 information processing apparatus, 20 display device, 21 front touch pad, 31 rear camera, 60 CPU, 62 GPU, 64 main memory, 66 storage, 67 motion sensor, 70 operation unit, 100 control unit, 102 input information acquisition unit, 104 captured image acquisition unit, 106 captured image storage unit, 108 image analysis unit, 109 marker correspondence information storage unit, 110 marker detection unit, 112 space definition unit, 114 information processing unit, 116 object data storage unit, 118 output image generation unit
  • the present invention can be used for information processing apparatuses such as computers, game machines, and information terminals.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • Computer Graphics (AREA)
  • Computer Hardware Design (AREA)
  • General Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • User Interface Of Digital Computer (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The invention relates to a method characterized in that, when a marker (128b) included in a photographed image is detected, information processing corresponding to the marker (128b) is started, and corresponding objects such as a character (150a) and icons (152a, 152b, 152c) are arranged in a three-dimensional coordinate system corresponding to the subject space, rendered on the photographed image, and displayed on the spot. The icons (152a, 152b, 152c) appear as if they were placed on the plane on which the marker (128b) is placed, and are arranged so as to be operable by a touch on a touch pad on a display screen or by pointing a finger at the corresponding position in the subject space.
PCT/JP2014/002529 2013-08-20 2014-05-13 Dispositif et procédé de traitement d'informations WO2015025442A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2013170282A JP2015041126A (ja) 2013-08-20 2013-08-20 情報処理装置および情報処理方法
JP2013-170282 2013-08-20

Publications (1)

Publication Number Publication Date
WO2015025442A1 true WO2015025442A1 (fr) 2015-02-26

Family

ID=52483242

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2014/002529 WO2015025442A1 (fr) 2013-08-20 2014-05-13 Dispositif et procédé de traitement d'informations

Country Status (2)

Country Link
JP (1) JP2015041126A (fr)
WO (1) WO2015025442A1 (fr)


Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6293020B2 (ja) * 2014-08-27 2018-03-14 株式会社テクノクラフト キャラクター連携アプリケーション装置
JP6527182B2 (ja) * 2017-02-03 2019-06-05 Kddi株式会社 端末装置、端末装置の制御方法、コンピュータプログラム
JP7111416B2 (ja) * 2017-03-24 2022-08-02 日本電気株式会社 携帯端末、情報処理システム、制御方法、及びプログラム
KR102470919B1 (ko) * 2017-09-11 2022-11-25 나이키 이노베이트 씨.브이. 표적 탐색 및 지오캐싱 이용을 위한 장치, 시스템, 및 방법
WO2019055475A1 (fr) 2017-09-12 2019-03-21 Nike Innovate C.V. Système de traitement d'authentification et de post-authentification à facteurs multiples
US11509653B2 (en) 2017-09-12 2022-11-22 Nike, Inc. Multi-factor authentication and post-authentication processing system
JP2019200811A (ja) * 2019-07-30 2019-11-21 富士通株式会社 表示制御方法、情報処理装置及び表示制御プログラム
JP2023091953A (ja) * 2021-12-21 2023-07-03 株式会社セガ プログラム及び情報処理装置


Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2001092995A (ja) * 1999-08-31 2001-04-06 Xerox Corp 拡張現実感表示システム及びプロセッサ生成画像の選択的表示方法
JP2012145981A (ja) * 2011-01-06 2012-08-02 Nintendo Co Ltd 画像処理プログラム、画像処理装置、画像処理システム、および画像処理方法
WO2012098872A1 (fr) * 2011-01-18 2012-07-26 京セラ株式会社 Terminal mobile et procédé destiné à commander un terminal mobile
JP2013050881A (ja) * 2011-08-31 2013-03-14 Nintendo Co Ltd 情報処理プログラム、情報処理システム、情報処理装置および情報処理方法

Cited By (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107045711A (zh) * 2016-02-05 2017-08-15 株式会社万代南梦宫娱乐 图像生成系统以及图像处理方法
CN107045711B (zh) * 2016-02-05 2023-08-11 株式会社万代南梦宫娱乐 图像生成系统以及图像处理方法
CN109661686A (zh) * 2016-08-31 2019-04-19 卡西欧计算机株式会社 对象显示系统、用户终端装置、对象显示方法及程序
CN109661686B (zh) * 2016-08-31 2023-05-05 卡西欧计算机株式会社 对象显示系统、用户终端装置、对象显示方法及程序
CN108303062A (zh) * 2016-12-27 2018-07-20 株式会社和冠 图像信息处理装置及图像信息处理方法
CN107320955A (zh) * 2017-06-23 2017-11-07 武汉秀宝软件有限公司 一种基于多客户端的ar场馆界面交互方法及系统
CN107320955B (zh) * 2017-06-23 2021-01-29 武汉秀宝软件有限公司 一种基于多客户端的ar场馆界面交互方法及系统
JPWO2020202747A1 (fr) * 2019-03-29 2020-10-08
WO2020202747A1 (fr) * 2019-03-29 2020-10-08 ソニー株式会社 Appareil de traitement d'informations, procédé de traitement d'informations et support d'enregistrement
JP7400810B2 (ja) 2019-03-29 2023-12-19 ソニーグループ株式会社 情報処理装置、情報処理方法、及び記録媒体
US11380011B2 (en) * 2019-04-23 2022-07-05 Kreatar, Llc Marker-based positioning of simulated reality

Also Published As

Publication number Publication date
JP2015041126A (ja) 2015-03-02

Similar Documents

Publication Publication Date Title
WO2015025442A1 (fr) Dispositif et procédé de traitement d'informations
CN110147231B (zh) 组合特效生成方法、装置及存储介质
JP6158406B2 (ja) 携帯デバイスによるインタラクティブアプリケーションのビデオキャプチャを可能とするためのシステム
CN110276840B (zh) 多虚拟角色的控制方法、装置、设备及存储介质
WO2018077206A1 (fr) Procédé, dispositif, système et équipement de génération de scène de réalité augmentée
WO2019153824A1 (fr) Procédé de commande d'objet virtuel, dispositif, appareil informatique et support de stockage
WO2019153750A1 (fr) Procédé, appareil et dispositif de commutation de vue d'environnement virtuel, et support d'informations
JP5739671B2 (ja) 情報処理プログラム、情報処理装置、情報処理システム、及び情報処理方法
JP5654430B2 (ja) 家庭用ゲームシステムにおいて実行中のゲーム又はアプリケーションをリアルタイムに記録又は変更するための携帯型ゲーム装置の使用
JP5436912B2 (ja) プログラム、情報記憶媒体及びゲーム装置
JP5602618B2 (ja) 画像処理プログラム、画像処理装置、画像処理システム、および画像処理方法
CN110427110B (zh) 一种直播方法、装置以及直播服务器
JP5627973B2 (ja) ゲーム処理をするためのプログラム、装置、システムおよび方法
CN108694073B (zh) 虚拟场景的控制方法、装置、设备及存储介质
CN111158469A (zh) 视角切换方法、装置、终端设备及存储介质
JP5256269B2 (ja) データ生成装置、データ生成装置の制御方法、及びプログラム
US8947365B2 (en) Storage medium storing image processing program for implementing image processing according to input coordinate, and information processing device
CN112156464B (zh) 虚拟对象的二维形象展示方法、装置、设备及存储介质
US9310894B2 (en) Processing operation signals from a pointing device and/or an input device
US10166477B2 (en) Image processing device, image processing method, and image processing program
EP2394710A2 (fr) Système de génération d'images, procédé de génération d'images et support de stockage d'informations
JP2006314633A (ja) ゲームプログラムおよびゲーム装置
JP2012064010A (ja) 情報処理装置、情報処理プログラム、情報処理システムおよび情報処理方法
JP2015210379A (ja) 画像融合システム、情報処理装置、情報端末、および情報処理方法
JP2009251858A (ja) 画像変換プログラムおよび画像変換装置

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 14837765

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 14837765

Country of ref document: EP

Kind code of ref document: A1