WO2013038814A1 - Image processing apparatus, processing method, program, and non-transitory recording medium - Google Patents

Image processing apparatus, processing method, program, and non-transitory recording medium

Info

Publication number
WO2013038814A1
Authority
WO
WIPO (PCT)
Prior art keywords
unit
screen
line
image
sight direction
Prior art date
Application number
PCT/JP2012/068813
Other languages
English (en)
Japanese (ja)
Inventor
匡信 菅野
Original Assignee
株式会社コナミデジタルエンタテインメント
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社コナミデジタルエンタテインメント
Publication of WO2013038814A1

Classifications

    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G5/00 Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/50 Controlling the output signals based on the game progress
    • A63F13/52 Controlling the output signals based on the game progress involving aspects of the displayed game scene
    • A63F13/525 Changing parameters of virtual cameras
    • A63F13/5252 Changing parameters of virtual cameras using two or more virtual cameras concurrently or sequentially, e.g. automatically switching between fixed virtual cameras when a character changes room or displaying a rear-mirror view in a car-driving game
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20 Input arrangements for video game devices
    • A63F13/21 Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F13/211 Input arrangements for video game devices characterised by their sensors, purposes or types using inertial sensors, e.g. accelerometers or gyroscopes
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/50 Controlling the output signals based on the game progress
    • A63F13/52 Controlling the output signals based on the game progress involving aspects of the displayed game scene
    • A63F13/525 Changing parameters of virtual cameras
    • A63F13/5255 Changing parameters of virtual cameras according to dedicated instructions from a player, e.g. using a secondary joystick to rotate the camera around a player's character
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/80 Special adaptations for executing a specific game genre or game mode
    • A63F13/837 Shooting of targets
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04815 Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20 Input arrangements for video game devices
    • A63F13/21 Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F13/214 Input arrangements for video game devices characterised by their sensors, purposes or types for locating contacts on a surface, e.g. floor mats or touch pads
    • A63F13/2145 Input arrangements for video game devices characterised by their sensors, purposes or types for locating contacts on a surface, e.g. floor mats or touch pads the surface being also a display device, e.g. touch screens
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20 Input arrangements for video game devices
    • A63F13/21 Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F13/218 Input arrangements for video game devices characterised by their sensors, purposes or types using pressure sensors, e.g. generating a signal proportional to the pressure applied by the player
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/10 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals
    • A63F2300/105 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals using inertial sensors, e.g. accelerometers, gyroscopes
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/20 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterised by details of the game platform
    • A63F2300/204 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterised by details of the game platform the platform being a handheld device
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/60 Methods for processing data by generating or executing the game program
    • A63F2300/66 Methods for processing data by generating or executing the game program for rendering three dimensional images
    • A63F2300/6661 Methods for processing data by generating or executing the game program for rendering three dimensional images for changing the position of the virtual camera
    • A63F2300/6676 Methods for processing data by generating or executing the game program for rendering three dimensional images for changing the position of the virtual camera by dedicated player input
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2354/00 Aspects of interface with display user

Definitions

  • The present invention relates to an image processing apparatus, a processing method, a program, and a non-transitory recording medium suitable for causing a user to operate in the correct orientation with respect to a screen.
  • Patent Document 1 discloses a technique for generating a stereoscopic image to be displayed on the screen of a portable game device.
  • Because a portable information processing apparatus can easily change the orientation of its screen at hand, the user may operate it without facing the screen.
  • When the orientation is changed without facing the screen, the displayed image becomes difficult to see, or accurate stereoscopic viewing becomes impossible.
  • The present invention solves the problems described above, and its purpose is to provide an image processing apparatus, a processing method, a program, and a non-transitory recording medium suitable for causing a user to operate in the correct orientation with respect to the screen.
  • The image processing apparatus according to one aspect of the present invention includes a storage unit, a display unit, a reception unit, a detection unit, and an update unit, and is configured as follows.
  • As an example, consider an image processing apparatus in which the user can arbitrarily specify the shooting position of a camera and can change the orientation of the screen on which the captured image is displayed.
  • The storage unit stores a viewpoint position and a line-of-sight direction.
  • The viewpoint position is, for example, the shooting position of a camera that shoots a real space.
  • The line-of-sight direction is, for example, the shooting direction of a camera that shoots a real space.
  • That is, information on the shooting position and shooting direction of the camera is stored in the storage unit.
  • The display unit displays on the screen an image shot from the viewpoint position in the line-of-sight direction.
  • For example, the display unit displays on the screen an image of the real space shot from the camera's shooting position toward its shooting direction.
  • The accepting unit accepts an instruction operation.
  • The instruction operation is, for example, an operation designating the shooting position of the camera.
  • That is, the reception unit receives from the user an operation designating the shooting position of the camera.
  • The detection unit detects changes in the orientation of the screen.
  • The update unit updates the viewpoint position based on the instruction operation, and updates the line-of-sight direction based on the change in screen orientation.
  • For example, the update unit updates the camera's shooting position to the position designated by the user, and updates the camera's shooting direction based on the screen orientation changed by the user.
  • Further, when the instruction operation is accepted, the update unit updates the line-of-sight direction so as to approach the initial direction.
  • The initial direction is, for example, the camera's shooting direction that is perpendicular to the screen when the screen and the user face each other.
  • That is, when an operation designating the camera's shooting position is received while the user and the screen do not face each other, i.e., while the shooting direction differs from the initial direction, the update unit updates the shooting direction so as to approach the initial direction.
  • In other words, once the instruction operation is accepted, the line-of-sight direction is pulled toward the initial direction regardless of the orientation of the screen.
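  • As a concrete illustration (a minimal sketch under assumptions, not the patented implementation; all names are hypothetical), the update rule can be pictured as follows, with the viewpoint as a 1-D position on the camera's line and the line-of-sight direction as a signed angle in degrees, where 0 is the initial direction:

```python
# Hypothetical sketch of the update unit's two behaviors.
# viewpoint: 1-D camera position; gaze_deg: signed angle from the initial
# direction (0 deg = the direction perpendicular to the screen when the
# user and the screen face each other).

def update_on_instruction(viewpoint, gaze_deg, move, step_deg=10.0):
    """On an instruction operation, move the viewpoint as designated and
    pull the line-of-sight direction toward the initial direction (0 deg),
    regardless of how the screen is currently tilted."""
    viewpoint += move
    if gaze_deg > 0.0:
        gaze_deg = max(0.0, gaze_deg - step_deg)
    else:
        gaze_deg = min(0.0, gaze_deg + step_deg)
    return viewpoint, gaze_deg

def update_on_orientation_change(screen_yaw_deg):
    """Without an instruction operation, the line-of-sight direction
    simply follows the detected screen orientation."""
    return screen_yaw_deg
```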
  • The shot image may be a stereoscopic image.
  • That is, the image shot by the camera is an image for stereoscopic viewing, which the user can view stereoscopically when facing the screen.
  • In this case, since the image cannot be viewed stereoscopically unless the user faces the screen, the user can be prompted to operate while facing the screen.
  • The shot image may also be an image representing the state of a virtual space.
  • In this case, the viewpoint position is the shooting position of a virtual camera arranged in the virtual space, and the line-of-sight direction is the shooting direction of the virtual camera.
  • That is, the display unit displays on the screen an image representing the state of the virtual space as shot by the virtual camera.
  • The updating unit may update the line-of-sight direction so as to approach the initial direction by a predetermined angle per instruction operation.
  • For example, suppose the instruction operation is pressing a predetermined button and the predetermined angle is 10 degrees.
  • Each time the button is pressed, the update unit updates the line-of-sight direction so as to approach the initial direction by 10 degrees from the current direction.
  • In this way, the line-of-sight direction can be brought gradually closer to the initial direction.
  • The predetermined angle may also be determined according to the angle formed by the line-of-sight direction and the initial direction.
  • For example, the predetermined angle is set larger as the angle formed by the current line-of-sight direction and the initial direction is larger.
  • For instance, when the angle formed by the current gaze direction and the initial direction is large, the update unit approaches the initial direction by 10 degrees per operation, and when that angle is 30 degrees, it approaches the initial direction by 5 degrees per operation.
  • In this way, the line-of-sight direction can be brought close to the initial direction smoothly.
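  • A sketch of this adaptive variant is shown below; the gain and minimum step are illustrative assumptions (the gain of 1/6 merely reproduces the 30-degrees-to-5-degrees example above), not values from the patent:

```python
def step_angle(current_angle_deg, gain=1.0 / 6.0, min_step_deg=1.0):
    """Per-operation correction angle that grows with the deviation.
    With gain = 1/6, a current angle of 30 deg yields a 5 deg step;
    min_step_deg guarantees the direction eventually reaches 0 deg."""
    return max(min_step_deg, abs(current_angle_deg) * gain)
```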
  • A processing method according to another aspect of the present invention is executed by an image processing apparatus that includes a storage unit storing a viewpoint position and a line-of-sight direction, a display unit, a reception unit, a detection unit, and an update unit.
  • This processing method includes a display step, a reception step, a detection step, and an update step, and is configured as follows.
  • In the display step, the display unit displays on the screen an image shot from the viewpoint position in the line-of-sight direction.
  • In the reception step, the reception unit receives an instruction operation.
  • In the detection step, the detection unit detects a change in the orientation of the screen.
  • In the update step, the update unit updates the viewpoint position based on the instruction operation, and updates the line-of-sight direction based on the change in screen orientation.
  • Further, when the instruction operation is accepted, the updating unit updates the line-of-sight direction so as to approach the initial direction.
  • A program according to another aspect of the present invention causes a computer to function as the image processing apparatus and causes the computer to execute the processing method.
  • The program of the present invention can be recorded on a computer-readable non-transitory recording medium such as a compact disc, flexible disk, hard disk, magneto-optical disc, digital video disc, magnetic tape, or semiconductor memory.
  • The above program can be distributed and sold via a computer communication network independently of the computer on which the program is executed.
  • The non-transitory recording medium can likewise be distributed and sold independently of the computer.
  • Here, a non-transitory recording medium refers to a tangible recording medium.
  • Non-transitory recording media are, for example, compact discs, flexible disks, hard disks, magneto-optical discs, digital video discs, magnetic tapes, and semiconductor memories.
  • A transitory recording medium refers to the transmission medium (propagated signal) itself.
  • Transitory recording media are, for example, electric signals, optical signals, and electromagnetic waves.
  • A temporary storage area is an area for temporarily storing data and programs, for example a volatile memory such as a RAM (Random Access Memory).
  • According to the present invention, it is possible to provide an image processing apparatus, a processing method, a program, and a non-transitory recording medium suitable for causing a user to operate in the correct orientation with respect to the screen.
  • FIG. 1 is a diagram illustrating a schematic configuration of a typical portable information processing apparatus in which an image processing apparatus according to an embodiment of the present invention is realized. FIG. 2 shows the external appearance of the portable information processing apparatus. Further figures illustrate the relationship between the image displayed on the screen and the viewpoint position, the functional configuration of the image processing apparatus according to the embodiment, and the shooting position and shooting direction of the virtual camera.
  • FIG. 1 is a diagram showing a schematic configuration of a typical portable information processing apparatus 1 in which an image processing apparatus 100 according to an embodiment of the present invention is realized.
  • FIG. 2 shows an external view of the portable information processing apparatus 1.
  • The portable information processing apparatus 1 includes a processing control unit 10, a connector 11, a cartridge 12, a wireless communication unit 13, a communication controller 14, a sound amplifier 15, a speaker 16, a microphone 17, an operation key 18, a first display unit 19, a second display unit 20, a touch panel 21, a camera 22, and an angular velocity sensor 23. Further, as shown in FIG. 2, the portable information processing apparatus 1 includes an upper housing 31 and a lower housing 32, which are connected to each other so that they can be opened and closed.
  • The processing control unit 10 includes a CPU (Central Processing Unit) core 10a, an image processing unit 10b, a VRAM (Video Random Access Memory) 10c, a WRAM (Work RAM) 10d, an LCD (Liquid Crystal Display) controller 10e, and a touch panel controller 10f.
  • The CPU core 10a controls the operation of the entire portable information processing apparatus 1 and is connected to each component to exchange control signals and data. Specifically, with the cartridge 12 mounted in the connector 11, it reads programs and data stored in the ROM (Read Only Memory) 12a in the cartridge 12 and executes predetermined processing.
  • The image processing unit 10b processes data read from the ROM 12a in the cartridge 12, data of images shot by the camera 22, or data processed by the CPU core 10a, and stores the result in the VRAM 10c.
  • The VRAM 10c is a frame memory that stores information for display, and stores image information processed by the image processing unit 10b and the like.
  • The WRAM 10d stores work data and the like necessary when the CPU core 10a executes various processes according to the program.
  • The LCD controller 10e controls the first display unit 19 and the second display unit 20 to display predetermined display images.
  • For example, the LCD controller 10e converts the image information stored in the VRAM 10c into a display signal at a predetermined synchronization timing and outputs the display signal to the first display unit 19.
  • The LCD controller 10e also displays predetermined instruction icons and the like on the second display unit 20.
  • The touch panel controller 10f detects contact with the touch panel 21 by a touch pen or the user's finger. For example, while predetermined instruction icons and the like are displayed on the second display unit 20, it detects contact on the touch panel 21 and its position.
  • The connector 11 is a terminal that can be detachably connected to the cartridge 12, and transmits and receives predetermined data to and from the cartridge 12 when the cartridge 12 is connected.
  • The cartridge 12 includes a ROM 12a and a RAM (Random Access Memory) 12b.
  • The ROM 12a stores a program for realizing the game, and image data, sound data, and the like associated with the game.
  • The RAM 12b stores various data indicating the progress of the game.
  • The wireless communication unit 13 performs wireless communication with the wireless communication unit 13 of another portable information processing apparatus 1, transmitting and receiving predetermined data via an antenna (not shown).
  • The wireless communication unit 13 can also perform wireless LAN communication with a predetermined access point.
  • The wireless communication unit 13 is assigned a unique MAC (Media Access Control) address.
  • The communication controller 14 controls the wireless communication unit 13 and, according to a predetermined protocol, mediates communication between the processing control unit 10 and the processing control unit 10 of another portable information processing apparatus 1.
  • The sound amplifier 15 amplifies the audio signal generated by the processing control unit 10 and supplies it to the speaker 16.
  • The speaker 16 is composed of, for example, stereo speakers, and outputs predetermined music, sound effects, and the like according to the audio signal amplified by the sound amplifier 15.
  • The microphone 17 receives an analog signal such as the user's voice, and the received signal is processed by the processing control unit 10, for example by mixing.
  • The operation key 18 is composed of a direction key 18a, a button 18b, and the like appropriately arranged on the portable information processing apparatus 1, and receives predetermined instruction inputs according to user operations.
  • The operation key 18 also includes a button, a knob, and the like for adjusting the volume.
  • The first display unit 19 and the second display unit 20 are composed of LCDs or the like, and are controlled by the LCD controller 10e to appropriately display game images and the like.
  • The first display unit 19 is provided in the upper housing 31, and the second display unit 20 is provided in the lower housing 32.
  • The first display unit 19 displays an image that the player can view stereoscopically (hereinafter referred to as a "stereoscopic image").
  • The stereoscopic image is, for example, an image that permits autostereoscopic viewing by a parallax barrier method, or an image that can be viewed stereoscopically by wearing predetermined glasses while viewing the screen.
  • The second display unit 20 displays normal images rather than stereoscopic images.
  • The touch panel 21 is superimposed on the front surface of the second display unit 20, and receives input by contact of a touch pen or the user's finger.
  • The touch panel 21 includes, for example, a pressure-sensitive touch sensor panel; it detects the pressure of the user's finger and the like, and detects the contact state and the transition from the contact state to the non-contact state.
  • The touch panel 21 may instead detect contact of the user's finger and the like from a change in capacitance or the like.
  • The camera 22 shoots the surrounding space according to the user's instruction and converts the shot image into an electrical signal.
  • The camera 22 is composed of, for example, a CMOS (Complementary MOS) sensor or the like.
  • The camera 22 is disposed on the back side of the upper housing 31 (FIG. 2B).
  • The angular velocity sensor 23 detects the angular velocities generated around the three axes of the portable information processing apparatus 1 (the x, y, and z axes in FIG. 2A) and outputs the detected angular velocity data to the processing control unit 10.
  • The processing control unit 10 obtains the posture and movement of the portable information processing apparatus 1 based on the angular velocity data.
  • In the present embodiment, the image processing apparatus 100 is realized on the typical portable information processing apparatus 1 described above, but it can also be realized on a general computer or a game apparatus.
  • A general computer or game apparatus includes a CPU core, a VRAM, and a WRAM, like the portable information processing apparatus 1.
  • As a communication unit, a NIC (Network Interface Controller) conforming to a standard such as 10BASE-T/100BASE-T used for configuring a LAN (Local Area Network) can be used; a hard disk can serve as the storage device, and DVD-ROMs, magneto-optical discs, and the like can also be used.
  • A keyboard, a mouse, or the like is used instead of the touch panel. After a program is installed and then executed, such a machine can function as the image processing apparatus.
  • Hereinafter, the cartridge 12 storing the game program and data is attached to the connector 11, and the portable information processing apparatus 1 is turned on to execute the program, thereby realizing the image processing apparatus 100 according to the embodiment.
  • FIG. 3 shows an example of the game's virtual space.
  • The virtual space of this game is a horizontally long field 300; the player base 201, which is the player's stronghold, and the enemy base 202, which is the enemy's stronghold, are arranged at the two ends of the field.
  • The player is required to bring down the enemy base 202 while protecting the player base 201.
  • The player brings down the enemy base 202 by having the character the player operates (player character 203) attack the enemy base 202, or by firing the player shell 205 toward the enemy base 202.
  • The player also protects the player base 201 from attacks by the enemy character 204 and the enemy shell 206.
  • FIG. 3 shows that an image 301, viewed in the shooting direction 403 from the virtual camera 400 on the straight line 401, is displayed on the screen.
  • The methods for moving the virtual camera 400 include a manual mode, in which the player can move it arbitrarily, and an auto mode, in which the most important shooting position is selected automatically according to the program.
  • In the manual mode, for example, the player can move the virtual camera 400 along the straight line 401 by pressing the right or left direction of the direction key 18a.
  • In the auto mode, for example, a priority is set for each event that occurs during game play, and the virtual camera 400 automatically moves based on the priority so that the event is displayed on the screen.
  • The events include, for example, the fall of the player base 201 (priority: 1), the fall of the enemy base 202 (priority: 2), the player character 203 being downed (priority: 3), and the enemy character 204 being downed (priority: 4).
  • A priority is set in advance for each event. When several of these events occur simultaneously, the position of the virtual camera 400 is set so that the image of the event with the highest priority is displayed on the screen (a minimal sketch of this selection follows below).
  • The player can set as appropriate whether to move the virtual camera 400 in the manual mode or the auto mode. For example, suppose the game proceeds by alternating the player's attack and the enemy's attack, and the auto mode is set at the beginning of each player or enemy attack. In this case, when a press of the direction key 18a is detected during the auto mode, the mode may shift to the manual mode.
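  • The auto mode's selection rule can be sketched as follows; the event records, coordinates, and output are illustrative assumptions, with only the priority numbering taken from the examples above:

```python
# Each active event carries its preset priority (1 = most important) and a
# hypothetical camera position on the straight line 401 that frames it.
active_events = [
    {"event": "enemy base falls",      "priority": 2, "camera_pos": 800},
    {"event": "player character down", "priority": 3, "camera_pos": -120},
]

def auto_mode_target(events):
    """Return the camera position for the highest-priority event, or None."""
    if not events:
        return None
    return min(events, key=lambda e: e["priority"])["camera_pos"]

print(auto_mode_target(active_events))  # -> 800 (priority 2 beats priority 3)
```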
  • The image processing apparatus 100 includes a storage unit 101, a display unit 102, a reception unit 103, a detection unit 104, and an update unit 105.
  • The storage unit 101 stores the viewpoint position and the line-of-sight direction.
  • In the present embodiment, the viewpoint position is the position of the virtual camera 400 on the straight line 401 (hereinafter referred to as the "shooting position"), and the line-of-sight direction is the shooting direction of the virtual camera 400.
  • In the shooting information table 101a, an initial position 101a1, an initial direction 101a2, the current shooting position 101a3, and the current shooting direction 101a4 of the virtual camera 400 are registered in association with each other.
  • The center position of the straight line 401 (the position of the virtual camera 400 in FIG. 3) is set to 0 [dot], and this position is taken as the initial position.
  • The shooting position is represented as a positive value when the camera moves toward the right of the field 300 as seen facing the screen, and as a negative value when it moves toward the left of the field 300.
  • The direction 402, perpendicular to the straight line 401 and facing the field 300, is set to 0 [deg], and this direction is taken as the initial direction.
  • The shooting direction is represented by a positive value when facing toward the right of the field 300 as seen facing the screen, and by a negative value when facing toward the left of the field.
  • The shooting information table 101a of FIG. 6A is stored in the storage unit 101.
  • The WRAM 10d functions as the storage unit 101.
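  • For concreteness, the shooting information table 101a can be pictured as the following record (a sketch; the field names are hypothetical, while the units and sign conventions are those defined above):

```python
from dataclasses import dataclass

@dataclass
class ShootingInfoTable:            # mirrors shooting information table 101a
    initial_position: int = 0       # 101a1, [dot] on the straight line 401
    initial_direction: float = 0.0  # 101a2, [deg], perpendicular to line 401
    position: int = 0               # 101a3, current shooting position
    direction: float = 0.0          # 101a4, current shooting direction
```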
  • The display unit 102 displays on the screen an image shot from the viewpoint position in the line-of-sight direction.
  • In the present embodiment, the shot image is an image of the virtual space, and is a stereoscopic image displayed on the first display unit 19.
  • The display unit 102 refers to the shooting information table 101a and displays on the screen the image 301 (FIG. 3) shot from the initial shooting position in the shooting direction 403.
  • The processing control unit 10 and the first display unit 19 cooperate to function as the display unit 102.
  • The accepting unit 103 accepts an instruction operation.
  • In the present embodiment, the instruction operation is an operation for moving the shooting position of the virtual camera 400, namely a press of the left or right direction of the direction key 18a.
  • For example, when the left direction of the direction key 18a is pressed, the receiving unit 103 receives the press as an instruction to move the virtual camera 400 to the left. Alternatively, an operation designating the left or right direction may be received via the touch panel 21.
  • For example, when the player touches the touch panel 21 with a finger and moves it a predetermined distance to the right, the accepting unit 103 accepts the operation as an instruction to move the virtual camera 400 to the right.
  • The processing control unit 10 and the operation key 18 cooperate to function as the reception unit 103.
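  • How such a touch drag could be mapped to a move instruction is sketched below; the pixel threshold and move unit are illustrative assumptions, not values given in the patent:

```python
def drag_to_move(start_x, end_x, threshold_px=20, unit_dots=50):
    """Interpret a horizontal drag on the touch panel 21 as a camera move:
    a rightward drag moves the virtual camera right, a leftward drag moves
    it left, and a drag shorter than the threshold is ignored."""
    dx = end_x - start_x
    if abs(dx) < threshold_px:
        return 0
    return unit_dots if dx > 0 else -unit_dots
```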
  • The detection unit 104 detects changes in the orientation of the screen.
  • FIG. 7 shows the positional relationship between the player 500, who holds the portable information processing apparatus 1 while playing, and the upper housing 31 (first display unit 19) of the portable information processing apparatus 1.
  • FIGS. 7A and 7B are views seen from directly above the player 500.
  • The screen orientation is set to 0 [deg] when the screen faces the direction 502 perpendicular to the direction 501 parallel to the player 500.
  • FIG. 7A shows a state in which the player 500 holds the portable information processing apparatus 1 so that the player 500 directly faces the first display unit 19 of the upper housing 31.
  • Here, "directly facing" means that the player's upper body faces straight toward the front of the screen.
  • In this state, the detection unit 104 detects the screen direction 503. That is, the screen direction 503 is 0 [deg], and the detection unit 104 detects 0 [deg] as the screen orientation.
  • FIG. 7B shows a state in which the player 500 holds the portable information processing apparatus 1 with the upper housing 31 tilted 30 [deg] to the left from the direction 502.
  • In this state, the detection unit 104 detects the screen direction 504. That is, the detection unit 104 detects -30 [deg] as the screen orientation.
  • The processing control unit 10 and the angular velocity sensor 23 cooperate to function as the detection unit 104.
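  • A minimal sketch of how the screen orientation could be derived from the angular velocity sensor 23 is shown below, using simple Euler integration of the yaw rate; the sampling rate and helper names are assumptions, not details from the patent:

```python
def track_screen_yaw(yaw_deg, yaw_rate_deg_per_s, dt_s):
    """Accumulate the gyro's yaw rate into a screen orientation angle.
    0 deg means the player directly faces the screen (direction 502);
    a tilt to the left, as in FIG. 7B, drives the angle negative."""
    return yaw_deg + yaw_rate_deg_per_s * dt_s

# Example: sixty 1/60 s samples at -30 deg/s take the screen from 0 to -30 deg.
yaw = 0.0
for _ in range(60):
    yaw = track_screen_yaw(yaw, -30.0, 1.0 / 60.0)
print(round(yaw))  # -> -30
```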
  • The update unit 105 updates the viewpoint position based on the instruction operation, and updates the line-of-sight direction based on the change in screen orientation.
  • Initially, the virtual camera 400 is arranged at the initial position and faces the initial direction (shooting direction 403), as shown in FIG. 3. Since the shooting position and shooting direction have not changed from the initial position and initial direction, the update unit 105 does not update them.
  • The display unit 102 displays on the screen an image 303 (FIG. 8A) shot from the initial position in the initial direction.
  • The image 303 includes the player character 203 and an object 207.
  • When the direction key 18a is pressed, the receiving unit 103 receives the press, and the update unit 105 updates the shooting position based on it. For example, if the accepting unit 103 accepts a press that moves the camera 150 [dots] to the left, the updating unit 105 updates the shooting position 101a3 of the shooting information table 101a to "-150" [dot], as shown in FIG. 6B.
  • The display unit 102 obtains the shooting position and shooting direction by referring to the updated shooting information table 101a (FIG. 6B), and displays on the screen the image 304 (FIG. 9A) shot in the shooting direction 403 from the position 150 [dot] to the left of the initial position (FIG. 9B).
  • Next, suppose the player 500 tilts the portable information processing apparatus 1 so that the first display unit 19 turns 30 [deg] to the left from the direction 502.
  • The detection unit 104 detects the tilted screen direction 504, and the update unit 105 updates the shooting direction based on the detected direction 504. For example, if the detection unit 104 detects a tilt of 30 [deg] to the left, the update unit 105 updates the shooting direction 101a4 of the shooting information table 101a to "-30" [deg], as shown in FIG. 6C.
  • The display unit 102 obtains the shooting position and shooting direction by referring to the updated shooting information table 101a (FIG. 6C), and displays on the screen the image 305 (FIG. 10A) shot from the shooting position (the initial position) in the shooting direction 404 (FIG. 10B).
  • Further, when the instruction operation is accepted, the update unit 105 updates the line-of-sight direction so as to approach the initial direction.
  • That is, when the direction key 18a is pressed while the upper housing 31 is tilted, the receiving unit 103 receives the press, and the detecting unit 104 detects the orientation of the upper housing 31.
  • The update unit 105 updates the shooting position based on the received press, and updates the shooting direction based on the detected orientation.
  • One might expect the instruction operation to leave the shooting direction at the direction 404 and move only the shooting position to the designated position.
  • However, while the virtual camera 400 moves to the designated shooting position, the update unit 105 updates the shooting direction so that it gradually approaches the initial direction 403 from the direction 404.
  • In the present embodiment, the updating unit 105 updates the line-of-sight direction so as to approach the initial direction by a predetermined angle per instruction operation.
  • One instruction operation is, for example, one press of the left or right direction of the direction key 18a. In the present embodiment, the shooting position moves by 50 [dot] for each press of the direction key 18a.
  • FIG. 13A shows an example of the relationship between the predetermined angle and the angle formed by the current gaze direction (shooting direction) and the initial direction (hereinafter referred to as the "current angle").
  • The updating unit 105 updates the shooting direction according to this graph. For example, suppose the virtual camera 400 is arranged at the initial position and tilted 30 [deg] to the left, as illustrated in FIG. 12B. When the left direction of the direction key 18a is pressed once, the update unit 105 updates the shooting position to a position moved 50 [dot] to the left along the straight line 401 and, at the same time, updates the shooting direction so as to approach the initial direction by 10 [deg].
  • That is, starting from the shooting information table 101a of FIG. 6C, the update unit 105 sets the shooting position 101a3 to "-50" [dot] and the shooting direction 101a4 to "-20" [deg] (direction 405). The display unit 102 then obtains the shooting position and shooting direction by referring to the shooting information table 101a, which is updated each time the direction key 18a is pressed, and displays the image shot from them. When the left direction has been pressed three times and the shooting direction has been updated to the direction 403, the display unit 102 displays on the screen the image 307 (FIG. 12A) shot in the shooting direction 403 from the position 150 [dot] to the left of the initial position.
  • The predetermined angle need not be a fixed value; it may be determined according to the angle (current angle) formed by the line-of-sight direction and the initial direction. For example, as shown in FIG. 13B, the predetermined angle may be set larger as the current angle is larger.
  • In this way, because the update unit 105 updates the shooting direction regardless of the orientation of the upper housing 31 (first display unit 19), the image 307 (FIG. 12A), which is in a direction not intended by the player 500, is displayed. That is, although the screen is tilted, the screen shows the image 307 (FIG. 12A) that is displayed when the camera directly faces the field in the virtual space. Updating the shooting direction in this way can prompt the player 500 to directly face the upper housing 31 (first display unit 19), as shown in FIG. 14C.
  • The processing control unit 10 functions as the update unit 105.
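  • Tracing the embodiment's numbers (each press moves the shooting position 50 [dot] and pulls the shooting direction 10 [deg] toward the initial direction) reproduces the sequence above; this is an illustrative sketch, not code from the patent:

```python
# Start from the state of FIG. 6C: initial position, tilted -30 deg (dir. 404).
position, direction = 0, -30.0

for _ in range(3):                           # three left presses of key 18a
    position -= 50                           # shooting position 101a3
    direction = min(0.0, direction + 10.0)   # shooting direction 101a4
    print(position, direction)
# -50 -20.0   (direction 405)
# -100 -10.0
# -150 0.0    (shooting direction 403; image 307 of FIG. 12A is displayed)
```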
  • First, the display unit 102 generates an image shot from the viewpoint position in the line-of-sight direction, based on the viewpoint position and line-of-sight direction stored in the storage unit 101 (step S101).
  • That is, the display unit 102 refers to the shooting information table 101a and generates an image representing the state of the virtual space shot from the shooting position in the shooting direction.
  • The image generated by the display unit 102 is stored in the VRAM 10c.
  • The display unit 102 then waits, clearing the queue or performing other processing, until a vertical synchronization interrupt occurs (step S102).
  • When the interrupt occurs, the display unit 102 converts the image information stored in the VRAM 10c into a display signal and displays it on the first display unit 19 (step S103). For example, the image 303 of FIG. 8 is displayed on the first display unit 19.
  • Next, the accepting unit 103 determines whether an instruction operation has been accepted (step S104). If the reception unit 103 determines that an instruction operation has been received (step S104; Yes), the detection unit 104 detects the orientation of the screen (step S105). On the other hand, if the receiving unit 103 determines that no instruction operation has been received (step S104; No), the detecting unit 104 detects the orientation of the screen (step S109).
  • In the present embodiment, the reception unit 103 determines whether the left or right direction of the direction key 18a has been pressed. When the receiving unit 103 determines that it has received the press, the detecting unit 104 detects the current orientation of the upper housing 31 (first display unit 19). Note that the reception unit 103's determination of whether an instruction operation has been received and the detection unit 104's detection of the screen orientation may be performed in the reverse order or simultaneously.
  • When the screen orientation has been detected in step S105, the update unit 105 next determines whether the screen orientation is the initial direction (step S106).
  • If the updating unit 105 determines that the screen orientation is the initial direction (step S106; Yes), it updates the viewpoint position stored in the storage unit 101 based on the instruction operation received by the receiving unit 103 (step S107), and the process returns to step S101.
  • On the other hand, if the updating unit 105 determines that the screen orientation is not the initial direction (step S106; No), it updates the viewpoint position stored in the storage unit 101 based on the instruction operation received by the receiving unit 103, and updates the line-of-sight direction stored in the storage unit 101 so as to approach the initial direction (step S108). Then the process returns to step S101.
  • In the present embodiment, in step S107 the updating unit 105 updates the shooting position 101a3 of the shooting information table 101a based on the press of the direction key 18a.
  • In step S108, the update unit 105 updates the shooting position 101a3 of the shooting information table 101a based on the press, and updates the shooting direction 101a4 so that it gradually approaches the initial direction regardless of the orientation of the upper housing 31.
  • The display unit 102 then displays on the first display unit 19 the image shot from the shooting position and shooting direction in the updated shooting information table 101a.
  • When the screen orientation has been detected by the detection unit 104 in step S109, the update unit 105 next determines whether the screen orientation is the initial direction (step S110).
  • If the update unit 105 determines that the screen orientation is the initial direction (step S110; Yes), it does not update the shooting position 101a3 or the shooting direction 101a4, and the process returns to step S101.
  • On the other hand, if the screen orientation is not the initial direction (step S110; No), the updating unit 105 updates the line-of-sight direction stored in the storage unit 101 based on the orientation of the screen (step S111). Then the process returns to step S101.
  • In the present embodiment, in step S111 the update unit 105 updates the shooting direction 101a4 of the shooting information table 101a based on the orientation of the upper housing 31.
  • The display unit 102 then displays on the first display unit 19 the image shot from the shooting position and shooting direction of the shooting information table 101a.
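  • The flow of steps S101 to S111 can be summarized as one frame of a loop, sketched below; all collaborators (render, wait_vsync, present, the device wrapper dev, and the approach function) are injected hypothetical stubs, and the table is the record sketched earlier:

```python
# Hypothetical one-frame sketch of steps S101-S111 (not the patented code).

def frame(dev, table, approach, render, wait_vsync, present):
    image = render(table.position, table.direction)   # S101: generate image
    wait_vsync()                                      # S102: wait for vsync
    present(image)                                    # S103: show on display 19
    if dev.instruction_received():                    # S104: key press?
        yaw = dev.screen_yaw()                        # S105: detect orientation
        table.position += dev.move_dots()             # S107 / S108: move camera
        if yaw != table.initial_direction:            # S106: screen tilted?
            table.direction = approach(
                table.direction, table.initial_direction)  # S108: pull back
    else:
        yaw = dev.screen_yaw()                        # S109: detect orientation
        if yaw != table.initial_direction:            # S110: screen tilted?
            table.direction = yaw                     # S111: follow the screen
```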
  • In a normal game, as shown in FIG. 14C, it is desirable for the player to operate while directly facing the screen.
  • However, with the portable information processing apparatus 1, whose orientation can easily be changed at hand, the player may operate without facing the screen.
  • In particular, when a stereoscopic image is displayed, the image may not be viewable stereoscopically unless the player directly faces the screen.
  • Therefore, when the instruction operation is accepted, the updating unit 105 updates the shooting direction so that it faces the field in the virtual space; by displaying on the screen the image that appears when the virtual camera faces the field, the player can be prompted to face the screen.
  • In the present embodiment, the operation of the image processing apparatus 100 has been described using the shooting position and shooting direction of a virtual camera as an example, but the present invention is not limited to this.
  • For example, the shooting position and shooting direction may be those of a camera shooting a real space.
  • In that case, the camera need not be provided in the same housing as the screen.
  • For example, the present invention can also be applied to an apparatus in which the user can remotely designate the shooting position of the camera and the shooting direction is determined based on the orientation of the screen the user holds.
  • In the present embodiment, the screen orientation is detected by the angular velocity sensor, but the present invention is not limited to this. For example, the orientation of the screen may be obtained from changes in the image shot by the camera.
  • As described above, according to the present invention, it is possible to provide an image processing apparatus, a processing method, a program, and a non-transitory recording medium suitable for causing a user to operate in the correct orientation with respect to the screen.
  • DESCRIPTION OF SYMBOLS: 1 portable information processing apparatus; 10 processing control unit; 10a CPU core; 10b image processing unit; 10c VRAM; 10d WRAM; 10e LCD controller; 10f touch panel controller; 11 connector; 12 cartridge; 12a ROM; 12b RAM; 13 wireless communication unit; 14 communication controller; 15 sound amplifier; 16 speaker; 17 microphone; 18 operation key; 18a direction key; 18b button; 19 first display unit; 20 second display unit; 21 touch panel; 22 camera; 23 angular velocity sensor; 31 upper housing; 32 lower housing; 100 image processing apparatus; 101 storage unit; 102 display unit; 103 reception unit; 104 detection unit; 105 update unit

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Hardware Design (AREA)
  • Processing Or Creating Images (AREA)
  • Controls And Circuits For Display Device (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

According to the invention, a viewpoint position and a line-of-sight direction are stored in a storage unit (101). A display unit (102) displays on a screen an image shot from the viewpoint position in the line-of-sight direction. A reception unit (103) receives instruction operations. A detection unit (104) detects a change in the orientation of the screen. An update unit (105) updates the viewpoint position based on the instruction operations, and updates the line-of-sight direction based on the change in the orientation of the screen. Then, when the instruction operations are received, the update unit (105) updates the line-of-sight direction so that it approaches the initial direction.
PCT/JP2012/068813 2011-09-15 2012-07-25 Image processing apparatus, processing method, program, and non-transitory recording medium WO2013038814A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2011201393A JP5113933B1 (ja) 2011-09-15 2011-09-15 Image processing device, processing method, and program
JP2011-201393 2011-09-15

Publications (1)

Publication Number Publication Date
WO2013038814A1 true WO2013038814A1 (fr) 2013-03-21

Family

ID=47676453

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2012/068813 WO2013038814A1 (fr) 2011-09-15 2012-07-25 Image processing apparatus, processing method, program, and non-transitory recording medium

Country Status (2)

Country Link
JP (1) JP5113933B1 (fr)
WO (1) WO2013038814A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2020162193A1 (fr) * 2019-02-06 2020-08-13 ソニー株式会社 Information processing device and method, and program

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6219037B2 (ja) * 2013-02-06 2017-10-25 任天堂株式会社 Information processing program, information processing device, information processing system, and information processing method
JP6734236B2 (ja) * 2017-08-14 2020-08-05 株式会社 ディー・エヌ・エー Program, system, and method for providing a game
CN108970114A (zh) * 2018-08-21 2018-12-11 苏州蜗牛数字科技股份有限公司 Method for adjusting the field of view through custom-mapped keys

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002298160A (ja) * 2001-03-29 2002-10-11 Namco Ltd Portable image generation device, program, and information storage medium
JP2004166995A (ja) * 2002-11-20 2004-06-17 Nintendo Co Ltd Game device and information processing device
JP2011108256A (ja) * 2011-01-07 2011-06-02 Nintendo Co Ltd Information processing program, information processing method, information processing device, and information processing system

Also Published As

Publication number Publication date
JP2013059586A (ja) 2013-04-04
JP5113933B1 (ja) 2013-01-09

Similar Documents

Publication Publication Date Title
JP7098001B2 (ja) Method for displaying distance information in a virtual scene, and terminal computer device and computer program therefor
US9098130B2 (en) Computer-readable storage medium having stored thereon input processing program, input processing apparatus, input processing method, and input processing system
JP5710934B2 (ja) Content display device and content display method
JP5541974B2 (ja) Image display program, device, system, and method
US9602809B2 (en) Storage medium storing display control program for providing stereoscopic display desired by user with simpler operation, display control device, display control method, and display control system
CN106899801B (zh) Mobile terminal and control method therefor
US9975042B2 (en) Information processing terminal and game device
KR102523919B1 (ko) Audio playback and collection method, apparatus, device, and readable storage medium
WO2011148544A1 (fr) Portable electronic device
CN111918090B (zh) Live-streaming picture display method and apparatus, terminal, and storage medium
WO2020151594A1 (fr) Viewing angle rotation method, device, apparatus, and storage medium
CN109324739B (zh) Virtual object control method and apparatus, terminal, and storage medium
JP6755224B2 (ja) Game system, game program, game device, and game processing method
CN106067833B (zh) Mobile terminal and control method therefor
US20160059128A1 (en) Information processing terminal, non-transitory storage medium encoded with computer readable information processing program, information processing terminal system, and information processing method
JP2007017596A (ja) Portable terminal device
JP5113933B1 (ja) Image processing device, processing method, and program
JP2016067024A (ja) Portable electronic device
CN113509720A (zh) Virtual battle replay method and apparatus, terminal, server, and storage medium
US20120058825A1 (en) Game apparatus, game control method, and information recording medium
JP5764390B2 (ja) Image generation program, image generation method, image generation device, and image generation system
JP5941620B2 (ja) Information processing program, information processing device, information processing method, and information processing system
JP5785732B2 (ja) Information processing program, imaging device, imaging method, and imaging system
JP5926773B2 (ja) Peripheral device, information processing system, and method for connecting peripheral device
JP2011204182A (ja) Image generation device, image processing method, and program

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 12831977

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 12831977

Country of ref document: EP

Kind code of ref document: A1