WO2013038814A1 - Image processing apparatus, processing method, program, and non-transitory recording medium - Google Patents

Image processing apparatus, processing method, program, and non-transitory recording medium

Info

Publication number
WO2013038814A1
Authority
WO
WIPO (PCT)
Prior art keywords
unit
screen
line-of-sight direction
image
Prior art date
Application number
PCT/JP2012/068813
Other languages
French (fr)
Japanese (ja)
Inventor
匡信 菅野
Original Assignee
Konami Digital Entertainment Co., Ltd. (株式会社コナミデジタルエンタテインメント)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Konami Digital Entertainment Co., Ltd.
Publication of WO2013038814A1

Classifications

    • G09G 5/00: Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • A63F 13/5252: Changing parameters of virtual cameras using two or more virtual cameras concurrently or sequentially, e.g. automatically switching between fixed virtual cameras when a character changes room or displaying a rear-mirror view in a car-driving game
    • A63F 13/211: Input arrangements for video game devices characterised by their sensors, purposes or types, using inertial sensors, e.g. accelerometers or gyroscopes
    • A63F 13/5255: Changing parameters of virtual cameras according to dedicated instructions from a player, e.g. using a secondary joystick to rotate the camera around a player's character
    • A63F 13/837: Shooting of targets
    • G06F 3/04815: Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
    • A63F 13/2145: Input arrangements for video game devices for locating contacts on a surface, the surface being also a display device, e.g. touch screens
    • A63F 13/218: Input arrangements for video game devices using pressure sensors, e.g. generating a signal proportional to the pressure applied by the player
    • A63F 2300/105: Input arrangements for converting player-generated signals into game device control signals, using inertial sensors, e.g. accelerometers, gyroscopes
    • A63F 2300/204: Details of the game platform, the platform being a handheld device
    • A63F 2300/6661: Methods for processing data by generating or executing the game program for rendering three dimensional images, for changing the position of the virtual camera
    • A63F 2300/6676: Methods for processing data by generating or executing the game program for rendering three dimensional images, for changing the position of the virtual camera by dedicated player input
    • G09G 2354/00: Aspects of interface with display user

Definitions

  • the present invention relates to an image processing apparatus, a processing method, a program, and a non-transitory recording medium suitable for causing a user to operate in a correct orientation with respect to a screen.
  • Patent Document 1 discloses a technique for generating a stereoscopic image to be displayed on the screen of a portable game device.
  • since the portable information processing apparatus allows the orientation of the screen to be changed easily at hand, the user may operate it without facing the screen.
  • if the orientation is changed so that the user no longer faces the screen, there is the problem that the displayed image is difficult to see, or that stereoscopic viewing cannot be performed accurately.
  • the present invention solves the problems described above, and its purpose is to provide an image processing apparatus, a processing method, a program, and a non-transitory recording medium suitable for causing a user to operate in the correct orientation with respect to the screen.
  • the image processing apparatus includes a storage unit, a display unit, a reception unit, a detection unit, and an update unit, and is configured as follows.
  • an image processing apparatus will be described as an example in which the user can arbitrarily specify the shooting position of the camera and can change the orientation of the screen on which the shot image is displayed.
  • the storage unit stores the viewpoint position and the line-of-sight direction.
  • the viewpoint position is, for example, a photographing position of a camera that photographs a real space.
  • the line-of-sight direction is, for example, the shooting direction of a camera that shoots a real space.
  • information on the shooting position and shooting direction of the camera is stored in the storage unit.
  • the display unit displays an image captured from the viewpoint position and the line-of-sight direction on the screen.
  • the display unit displays on the screen an image of the real space photographed from the photographing position of the camera toward the photographing direction.
  • the accepting unit accepts an instruction operation.
  • the instruction operation is, for example, an operation for instructing the shooting position of the camera.
  • the reception unit receives an operation for designating the shooting position of the camera from the user.
  • the detection unit detects changes in the orientation of the screen.
  • for example, when the user changes the orientation of the screen, the detection unit detects the change.
  • the update unit updates the viewpoint position based on the instruction operation, and updates the line-of-sight direction based on the change in the screen orientation.
  • the update unit updates the shooting position of the camera to a position specified by the user, and updates the shooting direction of the camera based on the orientation of the screen changed by the user.
  • then, when an instruction operation is accepted, the update unit updates the line-of-sight direction so as to approach the initial direction.
  • the initial direction is, for example, a shooting direction of the camera, and is a direction perpendicular to the screen when the screen and the user are facing each other.
  • the update unit updates the shooting direction so as to approach the initial direction when an operation for specifying the shooting position of the camera is received when the user and the screen do not face each other, that is, when the shooting direction and the initial direction are different.
  • if the viewpoint position is moved while the user is not facing the screen, the line-of-sight direction is updated toward the initial direction regardless of the orientation of the screen, so the line of sight changes to a direction the user did not intend. As a result, the image is not displayed as the user intended unless the user faces the screen, which prompts the user to operate while facing the screen.
  • the captured image may be a stereoscopic image.
  • the image captured by the camera is an image for stereoscopic viewing, and the user can stereoscopically view the image when facing the screen.
  • the user when an image cannot be stereoscopically viewed unless the user is facing the screen, the user can be prompted to perform the operation facing the screen.
  • the photographed image may be an image representing the state of the virtual space.
  • the viewpoint position is the shooting position of the virtual camera arranged in the virtual space
  • the line-of-sight direction is the shooting direction of the virtual camera.
  • the display unit displays an image representing a state of the virtual space captured by the virtual camera on the screen.
  • the updating unit may update the line-of-sight direction so as to approach the initial direction by a predetermined angle with respect to one instruction operation.
  • for example, suppose the instruction operation is the press of a predetermined button and the predetermined angle is 10 degrees.
  • the update unit updates the line-of-sight direction so as to approach the initial direction by 10 degrees from the current direction.
  • the line-of-sight direction can be gradually brought closer to the initial direction.
  • the predetermined angle may be determined according to an angle formed by the line-of-sight direction and the initial direction.
  • the predetermined angle is set larger as the angle formed by the current line-of-sight direction and the initial direction is larger.
  • for example, if the angle between the current line-of-sight direction and the initial direction is 60 degrees, the update unit brings the direction 10 degrees closer to the initial direction per operation; if that angle is 30 degrees, it brings the direction 5 degrees closer per operation.
  • the line-of-sight direction can be brought close to the initial direction smoothly.
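    As a rough illustration of these two update policies, here is a minimal Python sketch (not code from the patent; the function names, the fixed 10-degree step, and the 1/6 ratio are assumptions chosen to match the numeric examples above):

      def fixed_step(current_deg, step_deg=10.0):
          """Approach the initial direction (0 deg) by a fixed angle per operation."""
          if abs(current_deg) <= step_deg:
              return 0.0  # already within one step: snap to the initial direction
          return current_deg - step_deg if current_deg > 0 else current_deg + step_deg

      def proportional_step(current_deg, ratio=1.0 / 6.0):
          """Step size grows with the current angle: 60 deg gives a 10-deg step,
          30 deg gives a 5-deg step, matching the example ratios above."""
          return current_deg - current_deg * ratio

      print(fixed_step(-30.0))        # -20.0: one operation moves 10 deg toward 0
      print(proportional_step(60.0))  # 50.0 (a 10-degree step)
      print(proportional_step(30.0))  # 25.0 (a 5-degree step)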
  • A processing method according to a second aspect of the present invention is executed by an image processing apparatus including a storage unit that stores a viewpoint position and a line-of-sight direction, a display unit, a reception unit, a detection unit, and an update unit.
  • This processing method includes a display process, a reception process, a detection process, and an update process, and is configured as follows.
  • the display unit displays an image taken from the viewpoint position and the line-of-sight direction on the screen.
  • the reception unit receives an instruction operation.
  • the detection unit detects a change in the orientation of the screen.
  • the update unit updates the viewpoint position based on the instruction operation, and updates the line-of-sight direction based on the change in the screen orientation.
  • the updating unit updates the line-of-sight direction so as to approach the initial direction.
  • a program according to another aspect of the present invention is configured to cause a computer to function as the image processing apparatus and to cause the computer to execute the processing method.
  • the program of the present invention can be recorded on a computer-readable non-transitory recording medium such as a compact disk, flexible disk, hard disk, magneto-optical disk, digital video disk, magnetic tape, and semiconductor memory.
  • the above program can be distributed and sold via a computer communication network independently of the computer on which the program is executed.
  • the non-transitory recording medium can be distributed and sold independently of the computer.
  • a non-transitory recording medium refers to a tangible recording medium.
  • non-transitory recording media are, for example, compact disks, flexible disks, hard disks, magneto-optical disks, digital video disks, magnetic tapes, semiconductor memories, and the like.
  • a transitory recording medium refers to the transmission medium (propagation signal) itself.
  • a transitory recording medium is, for example, an electric signal, an optical signal, an electromagnetic wave, or the like.
  • the temporary storage area is an area for temporarily storing data and programs, and is, for example, a volatile memory such as a RAM (Random Access Memory).
  • according to the present invention, it is possible to provide an image processing apparatus, a processing method, a program, and a non-transitory recording medium suitable for causing a user to operate in the correct orientation with respect to the screen.
  • Brief description of the drawings: FIG. 1 is a diagram illustrating the schematic configuration of a typical portable information processing apparatus in which an image processing apparatus according to an embodiment of the present invention is realized. Further figures show the external appearance of the portable information processing apparatus, the relationship between the image displayed on the screen and the viewpoint position, and the functional configuration of the image processing apparatus according to the embodiment.
  • FIG. 1 is a diagram showing a schematic configuration of a typical portable information processing apparatus 1 in which an image processing apparatus 100 according to an embodiment of the present invention is realized.
  • FIG. 2 shows an external view of the portable information processing apparatus 1.
  • the portable information processing apparatus 1 includes a processing control unit 10, a connector 11, a cartridge 12, a wireless communication unit 13, a communication controller 14, a sound amplifier 15, a speaker 16, a microphone 17, an operation key 18, a first display unit 19, a second display unit 20, a touch panel 21, a camera 22, and an angular velocity sensor 23. Further, as shown in FIG. 2, the portable information processing apparatus 1 includes an upper housing 31 and a lower housing 32, which are connected to each other so that they can be opened and closed.
  • the processing control unit 10 includes a CPU (Central Processing Unit) core 10a, an image processing unit 10b, a VRAM (Video Random Access Memory) 10c, a WRAM (Work RAM) 10d, an LCD (Liquid Crystal Display) controller 10e, and a touch panel controller 10f.
  • the CPU core 10a controls the operation of the entire portable information processing apparatus 1 and is connected to each component to exchange control signals and data. Specifically, with the cartridge 12 mounted in the connector 11, a program or data stored in a ROM (Read Only Memory) 12a in the cartridge 12 is read and predetermined processing is executed.
  • the image processing unit 10b processes the data read from the ROM 12a in the cartridge 12, the data of the image captured by the camera 22, or the data processed by the CPU core 10a, and stores the processed data in the VRAM 10c.
  • the VRAM 10c is a frame memory that stores information for display, and stores image information processed by the image processing unit 10b and the like.
  • the WRAM 10d stores work data and the like necessary when the CPU core 10a executes various processes according to the program.
  • the LCD controller 10e controls the first display unit 19 and the second display unit 20 to display a predetermined display image.
  • the LCD controller 10e converts the image information stored in the VRAM 10c into a display signal at a predetermined synchronization timing, and outputs the display signal to the first display unit 19.
  • the LCD controller 10e displays a predetermined instruction icon or the like on the second display unit 20.
  • the touch panel controller 10f detects contact (a touch) on the touch panel 21 by a touch pen or the user's finger. For example, while a predetermined instruction icon or the like is displayed on the second display unit 20, it detects contact on the touch panel 21 and the position of that contact.
  • the connector 11 is a terminal that can be detachably connected to the cartridge 12 and transmits / receives predetermined data to / from the cartridge 12 when the cartridge 12 is connected.
  • the cartridge 12 includes a ROM 12a and a RAM (Random Access Memory) 12b.
  • the ROM 12a stores a program for realizing the game, image data and sound data associated with the game, and the like.
  • the RAM 12b stores various data indicating the progress of the game.
  • the wireless communication unit 13 performs wireless communication with the wireless communication unit 13 of another portable information processing apparatus 1, and transmits and receives predetermined data via an antenna (not shown).
  • the wireless communication unit 13 can also perform wireless LAN communication with a predetermined access point.
  • the wireless communication unit 13 is assigned a unique MAC (Media Access Control) address.
  • the communication controller 14 controls the wireless communication unit 13 and mediates communication performed between the processing control unit 10 and the processing control unit 10 of another portable information processing device 1 according to a predetermined protocol.
  • the sound amplifier 15 amplifies the audio signal generated by the processing control unit 10 and supplies it to the speaker 16.
  • the speaker 16 is composed of, for example, a stereo speaker, and outputs predetermined music sound, sound effect, and the like according to the audio signal amplified by the sound amplifier 15.
  • the microphone 17 receives an analog signal such as the user's voice, and the processing control unit 10 performs processing such as mixing on the received signal.
  • the operation key 18 is composed of a direction key 18a, a button 18b, and the like that are appropriately arranged in the portable information processing apparatus 1, and receives a predetermined instruction input according to a user operation.
  • the operation key 18 includes a button for adjusting the volume, a knob, and the like.
  • the first display unit 19 and the second display unit 20 are composed of an LCD or the like, and are controlled by the LCD controller 10e to appropriately display a game image or the like.
  • the first display unit 19 is provided in the upper housing 31, and the second display unit 20 is provided in the lower housing 32.
  • the first display unit 19 displays an image that the player can view stereoscopically (hereinafter referred to as a "stereoscopic image").
  • the stereoscopic image is, for example, an image capable of autostereoscopic viewing by a parallax barrier method, or an image capable of stereoscopic viewing by wearing predetermined glasses and viewing the screen.
  • the second display unit 20 displays a normal image instead of a stereoscopic image.
  • the touch panel 21 is disposed so as to be superimposed on the front surface of the second display unit 20, and receives input by touching a touch pen or a user's finger.
  • the touch panel 21 includes, for example, a pressure-sensitive touch sensor panel and the like, detects the pressure of the user's finger and the like, and detects the contact state, the transition from the contact state to the non-contact state, and the like.
  • the touch panel 21 may detect contact with a user's finger or the like from a change in capacitance or the like.
  • the camera 22 captures the surrounding space according to the user's instruction and converts the captured image into an electrical signal.
  • the camera 22 is composed of, for example, a CMOS (Complementary MOS) sensor or the like.
  • the camera 22 is disposed on the back side of the upper housing 31 (FIG. 2B).
  • the angular velocity sensor 23 detects angular velocities generated around the three axes (FIG. 2A, xyz axes) of the portable information processing device 1 and outputs detected angular velocity data to the processing control unit 10.
  • the processing control unit 10 obtains the attitude and movement of the portable information processing apparatus 1 based on the angular velocity data.
  • the image processing apparatus 100 is realized on the typical portable information processing apparatus 1 described above, but can also be realized on a general computer or a game apparatus.
  • like the portable information processing apparatus 1, a general computer or game device includes a CPU core, a VRAM, and a WRAM.
  • it also includes a communication unit, for example a NIC (Network Interface Controller) conforming to a standard such as 10BASE-T/100BASE-T used for configuring a LAN (Local Area Network); a hard disk can be used as a storage device, and DVD-ROMs, magneto-optical disks, and the like can also be used.
  • a keyboard or a mouse is used instead of the touch panel. After a program is installed and then executed, such a device can function as the image processing apparatus.
  • the cartridge 12 storing the game program and data is attached to the connector 11 and the portable information processing apparatus 1 is turned on to execute the program, thereby realizing the image processing apparatus 100 according to the embodiment.
  • FIG. 3 shows an example of a game virtual space.
  • the virtual space of this game is a horizontally long field 300, and a player base 201 and an enemy base 202 are arranged at opposite ends of the field.
  • the player is required to bring down the enemy base 202 while protecting the player base 201.
  • the player brings down the enemy base 202 by causing the character operated by the player (player character 203) to attack the enemy base 202, or by firing player shells 205 toward the enemy base 202.
  • the player also protects the player base 201 from attacks by the enemy character 204 and the enemy shell 206.
  • FIG. 3 shows that an image 301 viewed in the shooting direction 403 from the virtual camera 400 on the straight line 401 is displayed on the screen.
  • the method for moving the virtual camera 400 includes a manual mode in which the player can arbitrarily move, and an auto mode in which the most important shooting position is automatically selected according to the program.
  • in the manual mode, for example, the player can move the virtual camera 400 along the straight line 401 by pressing the right or left side of the direction key 18a.
  • in the auto mode, for example, priorities are set for events that occur during game play, and the virtual camera 400 automatically moves based on those priorities so that the event is displayed on the screen.
  • the events include, for example, the fall of the player base 201 (priority: 1), the fall of the enemy base 202 (priority: 2), the downing of the player character 203 (priority: 3), and the downing of the enemy character 204 (priority: 4).
  • a priority is set in advance for each event. When two or more of these events occur simultaneously, the position of the virtual camera 400 is set so that the image of the event with the highest priority is displayed on the screen.
  • the player can appropriately set whether to move the virtual camera 400 in the manual mode or in the auto mode. For example, it is assumed that the game proceeds by alternately performing the player's attack and the enemy's attack, and is set to the auto mode at the beginning of the player's or enemy's attack. In this case, when the pressing of the direction key 18a is detected in the auto mode, the mode may be shifted to the manual mode.
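    As an illustration of the auto mode, here is a minimal sketch in Python (the event names, dictionary layout, and function are assumptions, not from the patent; a smaller number means a higher priority):

      EVENT_PRIORITY = {
          "player_base_falls": 1,
          "enemy_base_falls": 2,
          "player_character_down": 3,
          "enemy_character_down": 4,
      }

      def auto_camera_target(active_events):
          """Pick the event the virtual camera should frame, or None if none occurred."""
          if not active_events:
              return None
          return min(active_events, key=lambda e: EVENT_PRIORITY[e["kind"]])

      events = [{"kind": "enemy_character_down", "x": 320},
                {"kind": "player_base_falls", "x": -600}]
      print(auto_camera_target(events))  # the player-base event wins (priority 1)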
  • the image processing apparatus 100 includes a storage unit 101, a display unit 102, a reception unit 103, a detection unit 104, and an update unit 105.
  • the storage unit 101 stores the viewpoint position and the line-of-sight direction.
  • the viewpoint position is a position on the straight line 401 of the virtual camera 400 (hereinafter referred to as “shooting position”), and the line-of-sight direction is the shooting direction of the virtual camera 400.
  • an initial position 101a1, an initial direction 101a2, a current photographing position 101a3, and a photographing direction 101a4 of the virtual camera 400 are registered in association with each other.
  • the center position of the straight line 401 (the position of the virtual camera 400 in FIG. 3) is set to 0 [dot], and this position is set as the initial position.
  • the shooting position is represented as a positive value when moving in the right direction of the field 300 toward the screen, and as a negative value when moving in the left direction of the field 300.
  • a direction 402 perpendicular to the straight line 401 toward the field 300 is set to 0 [deg], and this direction is set as an initial direction.
  • the shooting direction is represented by a positive value when facing the screen in the right direction of the field 300 and a negative value when facing the left direction of the field.
  • the shooting information table 101a of FIG. 6A is stored in the storage unit 101.
  • the WRAM 10d functions as the storage unit 101.
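    One possible in-memory layout for the shooting information table 101a is sketched below (a hedged illustration; the class and field names are assumptions). Positions are in dots along the straight line 401 (0 = center, rightward positive) and directions are in degrees (0 = the initial direction perpendicular to the line):

      from dataclasses import dataclass

      @dataclass
      class ShootingInfo:
          initial_position: float = 0.0   # 101a1 [dot]
          initial_direction: float = 0.0  # 101a2 [deg]
          position: float = 0.0           # 101a3, current shooting position [dot]
          direction: float = 0.0          # 101a4, current shooting direction [deg]

      table = ShootingInfo()
      table.position = -150.0   # camera moved 150 dots to the left (FIG. 6B)
      table.direction = -30.0   # camera turned 30 degrees to the left (FIG. 6C)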
  • the display unit 102 displays an image taken from the viewpoint position and the line-of-sight direction on the screen.
  • the photographed image is a virtual space image and is a stereoscopic image displayed on the first display unit 19.
  • the display unit 102 refers to the shooting information table 101a and displays on the screen the image 301 (FIG. 3) shot in the shooting direction 403 from the initial shooting position.
  • the processing control unit 10 and the first display unit 19 cooperate to function as the display unit 102.
  • the accepting unit 103 accepts an instruction operation.
  • the instruction operation is an operation for moving the shooting position of the virtual camera 400, and is a left or right press of the direction key 18a.
  • when the left side of the direction key 18a is pressed, the receiving unit 103 receives the press as an instruction to move the virtual camera 400 in the left direction. Alternatively, an operation designating the left or right direction may be received via the touch panel 21.
  • for example, when the player touches the touch panel 21 with a finger and moves it a predetermined distance to the right, the accepting unit 103 accepts the operation as an instruction to move the virtual camera 400 in the right direction.
  • the processing control unit 10 and the operation key 18 cooperate to function as the reception unit 103.
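    A small sketch of how a touch drag could be turned into a camera-move instruction (hypothetical; the threshold value and names are assumptions, not from the patent):

      DRAG_THRESHOLD_DOTS = 20.0  # assumed minimum drag distance

      def drag_to_instruction(x_start, x_end):
          """Map a horizontal drag to a left/right move instruction, or None."""
          dx = x_end - x_start
          if abs(dx) < DRAG_THRESHOLD_DOTS:
              return None  # too short to count as an instruction operation
          return "move_right" if dx > 0 else "move_left"

      print(drag_to_instruction(100.0, 160.0))  # 'move_right'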
  • the detection unit 104 detects a change in the screen orientation.
  • FIG. 7 shows the positional relationship between the player 500 who holds the portable information processing apparatus 1 and plays a game, and the upper casing 31 (first display unit 19) of the portable information processing apparatus 1.
  • FIG. 7A and FIG. 7B are views seen from directly above the player 500.
  • the orientation of the screen is defined as 0 [deg] when the screen faces the direction 502 perpendicular to the direction 501 parallel to the player 500.
  • FIG. 7A shows a state where the player 500 is holding the portable information processing apparatus 1 so that the player 500 faces the first display unit 19 of the upper housing 31.
  • “facing directly” means that the upper body of the player faces directly in front of the screen.
  • the detection unit 104 detects the screen direction 503. That is, the screen direction 503 is 0 [deg], and the detection unit 104 detects 0 [deg] as the screen orientation.
  • FIG. 7B shows a state where the player 500 is holding the portable information processing apparatus 1 so that the upper housing 31 is tilted by 30 [deg] to the left from the direction 502.
  • the detection unit 104 detects the screen direction 504. That is, the detection unit 104 detects ⁇ 30 [deg] as the screen orientation.
  • the processing control unit 10 and the angular velocity sensor 23 cooperate to function as the detection unit 104.
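    As a hedged sketch of the detection: the screen's yaw can be tracked by integrating, over time, the angular velocity around the vertical axis reported by the angular velocity sensor 23 (the class, sampling scheme, and rates below are assumptions for illustration):

      class OrientationTracker:
          def __init__(self):
              self.yaw_deg = 0.0  # 0 deg = screen directly facing the player

          def on_sample(self, yaw_rate_deg_per_s, dt_s):
              """Accumulate one gyro sample; return the current screen orientation."""
              self.yaw_deg += yaw_rate_deg_per_s * dt_s
              return self.yaw_deg

      tracker = OrientationTracker()
      for _ in range(50):                 # tilting left at -60 deg/s for 0.5 s
          tracker.on_sample(-60.0, 0.01)
      print(round(tracker.yaw_deg))       # -30, as in FIG. 7(b)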
  • the update unit 105 updates the viewpoint position based on the instruction operation, and updates the line-of-sight direction based on the change in the screen orientation.
  • the virtual camera 400 is arranged at the initial position and faces the initial direction (shooting direction 403), as shown in FIG. 3. Since the shooting position and the shooting direction have not changed from the initial position and the initial direction, the update unit 105 does not update them.
  • the display unit 102 displays an image 303 (FIG. 8A) taken in the initial direction from the initial position on the screen.
  • the image 303 includes a player character 203 and an object 207.
  • when the player presses the direction key 18a, the receiving unit 103 receives the press, and the update unit 105 updates the shooting position based on the received press. For example, if the accepting unit 103 accepts presses that move the camera 150 [dots] to the left, the updating unit 105 updates the shooting position 101a3 of the shooting information table 101a to "-150" [dot], as shown in FIG. 6B.
  • the display unit 102 obtains the shooting position and shooting direction by referring to the updated shooting information table 101a (FIG. 6B), and displays on the screen the image 304 (FIG. 9A) shot in the shooting direction 403 from the position 150 [dots] to the left of the initial position (FIG. 9B).
  • suppose the player 500 holds the portable information processing apparatus 1 so that the first display unit 19 is tilted 30 [deg] to the left of the direction 502.
  • the detection unit 104 detects the tilted screen direction 504, and the update unit 105 updates the shooting direction based on the detected direction 504. For example, if the detection unit 104 detects an inclination of 30 [deg] to the left, the update unit 105 updates the shooting direction 101a4 of the shooting information table 101a to "-30" [deg], as illustrated in FIG. 6C.
  • the display unit 102 obtains the shooting position and shooting direction by referring to the updated shooting information table 101a (FIG. 6C), and displays on the screen the image 305 (FIG. 10A) shot from the shooting position (the initial position) in the shooting direction 404 (FIG. 10B).
  • the update unit 105 updates the line-of-sight direction so as to approach the initial direction.
  • when the instruction operation is performed, the receiving unit 103 receives the press, and the detecting unit 104 detects the orientation of the upper housing 31.
  • the update unit 105 updates the shooting position based on the received press, and updates the shooting direction based on the detected orientation.
  • normally, the player performs the instruction operation on the assumption that the shooting direction will remain the direction 404 and only the shooting position will move to the designated position.
  • however, as shown in FIG. 11, the update unit 105 updates the shooting direction so that it gradually approaches the initial direction 403 while the virtual camera 400 moves to the designated shooting position.
  • the updating unit 105 updates the line-of-sight direction so as to approach the initial direction by a predetermined angle with respect to one instruction operation.
  • one instruction operation is, for example, one press of the left or right side of the direction key 18a. In the present embodiment, it is assumed that the shooting position moves by 50 [dot] each time the direction key 18a is pressed.
  • FIG. 13A shows an example of the relationship between the angle formed by the current line-of-sight direction (shooting direction) and the initial direction (hereinafter referred to as the "current angle") and the predetermined angle.
  • the updating unit 105 updates the shooting direction according to this graph. For example, as illustrated in FIG. 12B, when the virtual camera 400 is arranged at the initial position and tilted 30 [deg] to the left, one press of the left side of the direction key 18a causes the update unit 105 to update the shooting position to a position 50 [dots] to the left along the straight line 401 and, at the same time, to update the shooting direction so that it approaches the initial direction by 10 [deg].
  • that is, starting from the shooting information table 101a of FIG. 6C, the update unit 105 sets the shooting position 101a3 to "-50" [dot] and the shooting direction 101a4 to "-20" [deg] (direction 405). The display unit 102 then obtains the shooting position and shooting direction by referring to the shooting information table 101a updated at each press of the direction key 18a, and displays the image shot from them. When the direction key 18a has been pressed three times to the left and the direction has been updated to the shooting direction 403, the display unit 102 displays on the screen the image 307 (FIG. 12A) shot in the shooting direction 403 from the position 150 [dots] to the left of the initial position.
  • the predetermined angle is not a fixed value, but may be determined according to an angle (current angle) formed by the line-of-sight direction and the initial direction. For example, as shown in FIG. 13B, the predetermined angle may be set larger as the current angle is larger.
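    Combining the position step and the angle step, one press of the direction key could be handled as sketched below (using the ShootingInfo sketch above; the 50-dot step is from the embodiment, while the function names and the stand-in for the FIG. 13 graph are assumptions):

      def step_for(current_deg):
          # assumed stand-in for the graph of FIG. 13(a): 10 deg per press, capped
          return min(10.0, abs(current_deg))

      def on_direction_key(info, sign):
          """sign = -1 for a left press, +1 for a right press."""
          info.position += sign * 50.0      # move the camera 50 [dot]
          step = step_for(info.direction)
          if info.direction > 0:
              info.direction -= step        # approach the initial direction (0 deg)
          elif info.direction < 0:
              info.direction += step

      # starting from FIG. 6(c) (position 0, direction -30 deg), three left
      # presses yield position -150 [dot] and direction 0 [deg] (direction 403)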
  • in this way, when the instruction operation is received, the update unit 105 updates the shooting direction regardless of the orientation of the upper housing 31 (first display unit 19), so an image 307 (FIG. 12A) in a direction not intended by the player 500 is displayed. That is, although the screen is tilted, the image 307 (FIG. 12A) that would be displayed when facing the field in the virtual space is shown on the screen. By updating the shooting direction in this way, the player 500 can be prompted to directly face the upper housing 31 (the first display unit 19), as shown in FIG. 14C.
  • the processing control unit 10 functions as the update unit 105.
  • the display unit 102 generates an image shot from the viewpoint position and the line-of-sight direction based on the viewpoint position and the line-of-sight direction stored in the storage unit 101 (step S101).
  • the display unit 102 refers to the shooting information table 101a and generates an image representing the state of the virtual space shot from the shooting position and shooting direction.
  • the image generated by the display unit 102 is stored in the VRAM 10c.
  • the display unit 102 then waits until a vertical synchronization interrupt occurs; in the meantime, other processing may be performed (step S102).
  • the display unit 102 converts the image information stored in the VRAM 10c into a display signal and displays it on the first display unit 19 (step S103). For example, the image 303 in FIG. 8 is displayed on the first display unit 19.
  • the accepting unit 103 determines whether or not an instruction operation has been accepted (step S104). If the reception unit 103 determines that the instruction operation has been received (step S104; Yes), the detection unit 104 detects the orientation of the screen (step S105). On the other hand, when the receiving unit 103 determines that the instruction operation is not received (step S104; No), the detecting unit 104 detects the orientation of the screen (step S109).
  • for example, the reception unit 103 determines whether the left or right side of the direction key 18a has been pressed. When the reception unit 103 determines that a press has been received, the detection unit 104 detects the current orientation of the upper housing 31 (first display unit 19). Note that the reception unit 103's determination of whether an instruction operation has been received and the detection unit 104's detection of the screen orientation may be performed in the reverse order or simultaneously.
  • the update unit 105 next determines whether or not the screen orientation is the initial direction (step S106).
  • if the updating unit 105 determines that the screen orientation is the initial direction (step S106; Yes), it updates the viewpoint position stored in the storage unit 101 based on the instruction operation received by the receiving unit 103 (step S107).
  • otherwise (step S106; No), the updating unit 105 updates the viewpoint position stored in the storage unit 101 based on the received instruction operation, and also updates the line-of-sight direction stored in the storage unit 101 so as to approach the initial direction (step S108). The flow then returns to step S101.
  • in the present embodiment, the updating unit 105 updates the shooting position 101a3 of the shooting information table 101a based on the press of the direction key 18a.
  • if the screen is not facing the player, the update unit 105 updates the shooting position 101a3 of the shooting information table 101a based on the press, and updates the shooting direction 101a4 so that it gradually approaches the initial direction regardless of the orientation of the upper housing 31.
  • the display unit 102 then displays on the first display unit 19 the image shot from the shooting position and shooting direction in the updated shooting information table 101a.
  • when the screen orientation is detected by the detection unit 104 in step S109, the update unit 105 next determines whether the screen orientation is the initial direction (step S110).
  • if the update unit 105 determines that the screen orientation is the initial direction (step S110; Yes), the flow returns to step S101.
  • otherwise (step S110; No), the updating unit 105 updates the line-of-sight direction stored in the storage unit 101 based on the orientation of the screen (step S111). The flow then returns to step S101.
  • for example, if the upper housing 31 is tilted, the update unit 105 updates the shooting direction 101a4 of the shooting information table 101a based on the orientation of the upper housing 31.
  • if the orientation of the upper housing 31 remains the initial direction, the update unit 105 updates neither the shooting position 101a3 nor the shooting direction 101a4. The display unit 102 then displays on the first display unit 19 the image shot from the shooting position and shooting direction in the shooting information table 101a.
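    The flow of steps S101 through S111 could be rendered roughly as follows (a sketch under assumed interfaces; the object and method names are illustrative, and the real device renders into the VRAM 10c and waits for a vertical synchronization interrupt):

      def frame(storage, display, accept, detect, update):
          image = display.render(storage.position, storage.direction)  # S101
          display.wait_vsync()                                         # S102
          display.present(image)                                       # S103
          if accept.has_instruction():                                 # S104: Yes
              orientation = detect.screen_orientation()                # S105
              if orientation == storage.initial_direction:             # S106: Yes
                  update.move_position(accept.instruction())           # S107
              else:                                                    # S106: No
                  update.move_position(accept.instruction())           # S108: update both,
                  update.step_direction_toward_initial()               # ignoring orientation
          else:                                                        # S104: No
              orientation = detect.screen_orientation()                # S109
              if orientation != storage.initial_direction:             # S110: No
                  update.set_direction(orientation)                    # S111
              # S110: Yes -> neither position nor direction is updated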
  • in a normal game, as shown in FIG. 14C, it is desirable that the player operates while facing the screen.
  • however, since the portable information processing apparatus 1 allows the orientation of the screen to be changed at hand, the player may operate without facing the screen.
  • in particular, when a stereoscopic image is displayed, the image may not be viewable stereoscopically unless the player directly faces the screen.
  • therefore, when the instruction operation is received, the updating unit 105 updates the shooting direction so that it faces the field in the virtual space, and by displaying on the screen the image that would be displayed if the virtual camera were facing the field, the player can be prompted to face the screen.
  • the operation of the image processing apparatus 100 has been described by taking the shooting position and shooting direction of the virtual camera as an example, but the present invention is not limited to this.
  • it may be a shooting position and a shooting direction of a camera shooting a real space.
  • the camera may not be provided in the same housing as the screen.
  • for example, the present invention can also be applied to an apparatus in which the user remotely specifies the shooting position of the camera and the shooting direction is determined based on the orientation of the screen held by the user.
  • the screen orientation is detected by the angular velocity sensor.
  • the present invention is not limited to this.
  • the orientation of the screen may be obtained from a change in a captured image captured by the camera.
  • according to the present invention, it is possible to provide an image processing apparatus, a processing method, a program, and a non-transitory recording medium suitable for causing a user to operate in the correct orientation with respect to the screen.
  • Explanation of symbols: 1 portable information processing apparatus; 10 processing control unit; 10a CPU core; 10b image processing unit; 10c VRAM; 10d WRAM; 10e LCD controller; 10f touch panel controller; 11 connector; 12 cartridge; 12a ROM; 12b RAM; 13 wireless communication unit; 14 communication controller; 15 sound amplifier; 16 speaker; 17 microphone; 18 operation key; 18a direction key; 18b button; 19 first display unit; 20 second display unit; 21 touch panel; 22 camera; 23 angular velocity sensor; 31 upper housing; 32 lower housing; 100 image processing apparatus; 101 storage unit

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Hardware Design (AREA)
  • Processing Or Creating Images (AREA)
  • Controls And Circuits For Display Device (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

A viewpoint position and a line-of-sight direction are stored in a storage unit (101). A display unit (102) displays, on a screen, an image shot from the viewpoint position and the line-of-sight direction. A receiving unit (103) receives instruction operations. A detecting unit (104) detects a change in the orientation of the screen. An updating unit (105) updates the viewpoint position on the basis of the instruction operations, and updates the line-of-sight direction on the basis of the change in the orientation of the screen. Then, when an instruction operation is received, the updating unit (105) updates the line-of-sight direction so as to approach the initial direction.

Description

画像処理装置、処理方法、プログラム、ならびに、非一時的な記録媒体Image processing apparatus, processing method, program, and non-transitory recording medium
 本発明は、ユーザに、画面に対して正しい向きで操作させるのに好適な画像処理装置、処理方法、プログラム、ならびに、非一時的な(non-transitory)記録媒体に関する。 The present invention relates to an image processing apparatus, a processing method, a program, and a non-transitory recording medium suitable for causing a user to operate in a correct orientation with respect to a screen.
 従来から、携帯型ゲーム装置や携帯電話等、手元で画面の向きを変更できる携帯型情報処理装置が知られている。近年では、これらの携帯型情報処理装置において、立体視を行うことが可能である。例えば、特許文献1には、携帯型ゲーム装置の画面に表示する立体画像を生成する技術が開示されている。 2. Description of the Related Art Conventionally, portable information processing devices that can change the orientation of a screen at hand, such as a portable game device and a cellular phone, are known. In recent years, these portable information processing devices can perform stereoscopic viewing. For example, Patent Document 1 discloses a technique for generating a stereoscopic image to be displayed on the screen of a portable game device.
特開2011-86188号公報JP 2011-86188 A
 しかしながら、携帯型情報処理装置は、手元で容易に画面の向きを変更することができるので、ユーザは画面に対して正対せずに操作をおこなうことがある。携帯型情報処理装置においては、画面に正対せず、向きを変えてしまうと、表示されている画像が見難い、あるいは、立体視が正確に行えないという問題があった。 However, since the portable information processing apparatus can easily change the orientation of the screen at hand, the user may operate without facing the screen. In the portable information processing apparatus, if the orientation is changed without facing the screen, there is a problem that it is difficult to see the displayed image or that the stereoscopic view cannot be accurately performed.
 本発明は、上記のような課題を解決するもので、ユーザに、画面に対して正しい向きで操作させるのに好適な画像処理装置、処理方法、プログラム、ならびに、非一時的な記録媒体を提供することを目的とする。 The present invention solves the problems as described above, and provides an image processing apparatus, a processing method, a program, and a non-temporary recording medium suitable for causing a user to operate in a correct orientation with respect to a screen. The purpose is to do.
 本発明の第1の観点に係る画像処理装置は、記憶部と、表示部と、受付部と、検知部と、更新部と、を備え、以下のように構成する。なお、以下では、カメラの撮影位置をユーザが任意に指定することができ、撮影された画像が表示される画面の向きを、手元で変更できる画像処理装置を例に説明する。 The image processing apparatus according to the first aspect of the present invention includes a storage unit, a display unit, a reception unit, a detection unit, and an update unit, and is configured as follows. In the following, an image processing apparatus will be described as an example in which the user can arbitrarily specify the shooting position of the camera and can change the orientation of the screen on which the shot image is displayed.
 記憶部には、視点位置及び視線方向が記憶される。 The storage unit stores the viewpoint position and the line-of-sight direction.
 視点位置とは、例えば、現実空間を撮影するカメラの撮影位置である。また、視線方向とは、例えば、現実空間を撮影するカメラの撮影方向である。例えば、記憶部には、カメラの撮影位置及び撮影方向の情報が記憶される。 The viewpoint position is, for example, a photographing position of a camera that photographs a real space. The line-of-sight direction is, for example, the shooting direction of a camera that shoots a real space. For example, information on the shooting position and shooting direction of the camera is stored in the storage unit.
 表示部は、視点位置及び視線方向から撮影された画像を画面に表示する。 The display unit displays an image captured from the viewpoint position and the line-of-sight direction on the screen.
 例えば、表示部は、カメラの撮影位置から撮影方向に向かって撮影された現実空間の画像を画面に表示する。 For example, the display unit displays on the screen an image of the real space photographed from the photographing position of the camera toward the photographing direction.
 受付部は、指示操作を受け付ける。 The accepting unit accepts an instruction operation.
 指示操作とは、例えば、カメラの撮影位置を指示する操作である。例えば、受付部は、ユーザより、カメラの撮影位置を指定する操作を受け付ける。 The instruction operation is, for example, an operation for instructing the shooting position of the camera. For example, the reception unit receives an operation for designating the shooting position of the camera from the user.
 検知部は、画面の向きの変化を検知する。 Detecting unit detects changes in screen orientation.
 例えば、検知部は、ユーザにより画面の向きが変更されると、画面の向きの変化を検知する。 For example, when the screen orientation is changed by the user, the detection unit detects a change in the screen orientation.
 更新部は、指示操作に基づいて、視点位置を更新し、画面の向きの変化に基づいて、視線方向を更新する。 The update unit updates the viewpoint position based on the instruction operation, and updates the line-of-sight direction based on the change in the screen orientation.
 例えば、更新部は、カメラの撮影位置を、ユーザにより指定された位置に更新し、ユーザにより変更された画面の向きに基づいて、カメラの撮影方向を更新する。 For example, the update unit updates the shooting position of the camera to a position specified by the user, and updates the shooting direction of the camera based on the orientation of the screen changed by the user.
 そして、更新部は、指示操作が受け付けられると、視線方向を初期方向に近づくように更新する。 Then, when the instruction operation is accepted, the update unit updates the line-of-sight direction so as to approach the initial direction.
 初期方向とは、例えば、カメラの撮影方向であって、画面とユーザとが正対している状態で画面と直角をなす方向であるとする。更新部は、ユーザと画面とが正対しない場合、すなわち、撮影方向と初期方向とが異なる場合にカメラの撮影位置を指定する操作を受け付けると、撮影方向を初期方向に近づくように更新する。 The initial direction is, for example, a shooting direction of the camera, and is a direction perpendicular to the screen when the screen and the user are facing each other. The update unit updates the shooting direction so as to approach the initial direction when an operation for specifying the shooting position of the camera is received when the user and the screen do not face each other, that is, when the shooting direction and the initial direction are different.
 本発明によれば、ユーザが画面に正対していない状態で、視点位置を移動させると、画面の向きにかかわらず視線方向が初期方向に更新されるので、ユーザが意図していない方向に視線方向が変更される。これにより、ユーザが画面に正対していないとユーザが意図したように画面に画像が表示されないので、ユーザに画面に正対して操作を行うように促すことができる。 According to the present invention, if the viewpoint position is moved while the user is not facing the screen, the line-of-sight direction is updated to the initial direction regardless of the direction of the screen. The direction is changed. As a result, an image is not displayed on the screen as intended by the user if the user is not facing the screen, and the user can be prompted to perform the operation facing the screen.
 また、撮影された画像は、立体視用の画像であってもよい。 Further, the captured image may be a stereoscopic image.
 例えば、カメラが撮影した画像は立体視用の画像であり、ユーザは、画面に正対している場合に、画像を立体視することができる。 For example, the image captured by the camera is an image for stereoscopic viewing, and the user can stereoscopically view the image when facing the screen.
 本発明によれば、ユーザが画面に正対していないと画像を立体視することができない場合に、ユーザに画面に正対して操作を行うように促すことができる。 According to the present invention, when an image cannot be stereoscopically viewed unless the user is facing the screen, the user can be prompted to perform the operation facing the screen.
 また、撮影された画像は、仮想空間の様子を表す画像であってもよい。 Further, the photographed image may be an image representing the state of the virtual space.
 例えば、視点位置とは仮想空間内に配置される仮想カメラの撮影位置であり、視線方向とは仮想カメラの撮影方向である。表示部は、仮想カメラが撮影した仮想空間の様子を表す画像を画面に表示する。 For example, the viewpoint position is the shooting position of the virtual camera arranged in the virtual space, and the line-of-sight direction is the shooting direction of the virtual camera. The display unit displays an image representing a state of the virtual space captured by the virtual camera on the screen.
 本発明によれば、仮想空間の画像を表示する画面に対して、ユーザを正対するように促すことができる。 According to the present invention, it is possible to prompt the user to face the screen displaying the image of the virtual space.
 また、更新部は、1回の指示操作に対して所定の角度だけ初期方向に近づくように、視線方向を更新するようにしてもよい。 Further, the updating unit may update the line-of-sight direction so as to approach the initial direction by a predetermined angle with respect to one instruction operation.
 例えば、指示操作が所定のボタンの押圧であるとし、所定の角度を10度とする。例えば、視線方向が初期方向とは異なる場合に、ユーザが当該所定のボタンを1回押圧すると、更新部は、視線方向を現在の方向から10度だけ初期方向に近づくように更新する。 For example, it is assumed that the instruction operation is pressing a predetermined button, and the predetermined angle is 10 degrees. For example, when the line-of-sight direction is different from the initial direction, when the user presses the predetermined button once, the update unit updates the line-of-sight direction so as to approach the initial direction by 10 degrees from the current direction.
 本発明によれば、視線方向を徐々に初期方向に近づけることができる。 According to the present invention, the line-of-sight direction can be gradually brought closer to the initial direction.
 あるいは、所定の角度は、視線方向と初期方向とがなす角度に応じて定められるようにしてもよい。 Alternatively, the predetermined angle may be determined according to an angle formed by the line-of-sight direction and the initial direction.
 例えば、所定の角度は、現在の視線方向と初期方向とのなす角度が大きいほど大きく定められる。例えば、更新部は、現在の視線方向と初期方向のなす角度が60度である場合は、1回の操作で10度だけ初期方向に近づき、現在の視線方向と初期方向の成す角度が30度の場合は、1回の操作で5度だけ初期方向に近づくように更新する。 For example, the predetermined angle is set larger as the angle formed by the current line-of-sight direction and the initial direction is larger. For example, when the angle formed by the current gaze direction and the initial direction is 60 degrees, the update unit approaches the initial direction by 10 degrees in one operation, and the angle formed by the current gaze direction and the initial direction is 30 degrees. In the case of, it is updated so as to approach the initial direction by 5 degrees in one operation.
 本発明によれば、視線方向を滑らかに初期方向に近づけることができる。 According to the present invention, the line-of-sight direction can be brought close to the initial direction smoothly.
 A processing method according to a second aspect of the present invention is executed by an image processing apparatus comprising a storage unit that stores a viewpoint position and a line-of-sight direction, a display unit, a reception unit, a detection unit, and an update unit. The processing method comprises a display step, a reception step, a detection step, and an update step, and is configured as follows.
 In the display step, the display unit displays, on a screen, an image captured from the viewpoint position in the line-of-sight direction.
 In the reception step, the reception unit receives an instruction operation.
 In the detection step, the detection unit detects a change in the orientation of the screen.
 In the update step, the update unit updates the viewpoint position based on the instruction operation and updates the line-of-sight direction based on the change in the orientation of the screen.
 Also, in the update step, when the instruction operation is received, the update unit updates the line-of-sight direction so that it approaches the initial direction.
 A program according to another aspect of the present invention is configured to cause a computer to function as the above image processing apparatus and to cause the computer to execute the above processing method.
 The program of the present invention can be recorded on a computer-readable non-transitory recording medium such as a compact disc, flexible disk, hard disk, magneto-optical disc, digital video disc, magnetic tape, or semiconductor memory.
 The program can be distributed and sold via a computer communication network independently of the computer on which it is executed. The non-transitory recording medium can likewise be distributed and sold independently of the computer.
 Here, a non-transitory recording medium refers to a tangible recording medium, for example, a compact disc, flexible disk, hard disk, magneto-optical disc, digital video disc, magnetic tape, or semiconductor memory. A transitory recording medium refers to a transmission medium (a propagating signal) itself, for example, an electric signal, an optical signal, or an electromagnetic wave. A temporary storage area, by contrast, is an area for temporarily storing data and programs, for example, a volatile memory such as a RAM (Random Access Memory).
 According to the present invention, it is possible to provide an image processing apparatus, a processing method, a program, and a non-transitory recording medium suitable for having a user operate the apparatus while facing the screen in the correct orientation.
(Brief description of drawings)
 FIG. 1 is a diagram showing the schematic configuration of a typical portable information processing apparatus in which an image processing apparatus according to an embodiment of the present invention is realized.
 FIG. 2 is a diagram showing the external appearance of the portable information processing apparatus.
 FIGS. 3 and 4 are diagrams for explaining the relationship between the image displayed on the screen and the viewpoint position.
 FIG. 5 is a diagram showing the functional configuration of the image processing apparatus according to the embodiment.
 FIG. 6 is a diagram for explaining the shooting information table.
 FIG. 7 is a diagram for explaining the orientation of the screen.
 FIGS. 8 to 12 are diagrams for explaining the relationship between the image displayed on the screen, the shooting position and shooting direction of the virtual camera, and the orientation of the user.
 FIG. 13 is a diagram for explaining the relationship between the current angle and the predetermined angle.
 FIG. 14 is a diagram for explaining the relationship between the image displayed on the screen, the shooting position and shooting direction of the virtual camera, and the orientation of the user.
 FIG. 15 is a flowchart for explaining the image processing performed by each unit of the image processing apparatus according to the embodiment.
(Outline configuration of the portable information processing apparatus)
 A typical portable information processing apparatus in which the image processing apparatus according to the embodiment is realized will be described. In the following, for ease of understanding, an embodiment in which the present invention is realized using a portable information processing apparatus for games is described. The embodiment described below is, however, for explanation and does not limit the scope of the present invention. Those skilled in the art can therefore adopt embodiments in which some or all of these elements are replaced with equivalents, and such embodiments are also included in the scope of the present invention.
 FIG. 1 is a diagram showing the schematic configuration of a typical portable information processing apparatus 1 in which an image processing apparatus 100 according to an embodiment of the present invention is realized. FIG. 2 shows an external view of the portable information processing apparatus 1. The following description refers to these drawings.
 As shown in FIG. 1, the portable information processing apparatus 1 comprises a processing control unit 10, a connector 11, a cartridge 12, a wireless communication unit 13, a communication controller 14, a sound amplifier 15, a speaker 16, a microphone 17, operation keys 18, a first display unit 19, a second display unit 20, a touch panel 21, a camera 22, and an angular velocity sensor 23. As shown in FIG. 2, the portable information processing apparatus 1 consists of an upper housing 31 and a lower housing 32, which are connected so that they can be opened and closed.
 The processing control unit 10 comprises a CPU (Central Processing Unit) core 10a, an image processing unit 10b, a VRAM (Video Random Access Memory) 10c, a WRAM (Work RAM) 10d, an LCD (Liquid Crystal Display) controller 10e, and a touch panel controller 10f.
 The CPU core 10a controls the operation of the portable information processing apparatus 1 as a whole and is connected to each component to exchange control signals and data. Specifically, with the cartridge 12 mounted in the connector 11, the CPU core 10a reads programs and data stored in a ROM (Read Only Memory) 12a in the cartridge 12 and executes predetermined processing.
 The image processing unit 10b processes data read from the ROM 12a in the cartridge 12, image data captured by the camera 22, or data processed by the CPU core 10a, and then stores the result in the VRAM 10c.
 The VRAM 10c is a frame memory that stores information for display, and stores image information processed by the image processing unit 10b and other components.
 The WRAM 10d stores work data and the like required when the CPU core 10a executes various processes according to a program.
 The LCD controller 10e controls the first display unit 19 and the second display unit 20 to display predetermined display images. For example, the LCD controller 10e converts the image information stored in the VRAM 10c into a display signal at a predetermined synchronization timing and outputs the signal to the first display unit 19. The LCD controller 10e also displays predetermined instruction icons and the like on the second display unit 20.
 The touch panel controller 10f detects contact (touches) on the touch panel 21 by a touch pen or the user's finger. For example, while a predetermined instruction icon or the like is displayed on the second display unit 20, it detects contact on the touch panel 21 and the position of that contact.
 The connector 11 is a terminal that can be detachably connected to the cartridge 12 and transmits and receives predetermined data to and from the cartridge 12 when the cartridge 12 is connected.
 The cartridge 12 comprises the ROM 12a and a RAM (Random Access Memory) 12b.
 The ROM 12a stores a program for realizing the game, together with image data, sound data, and the like associated with the game.
 The RAM 12b stores various data indicating the progress of the game.
 The wireless communication unit 13 is a unit that performs wireless communication with the wireless communication unit 13 of another portable information processing apparatus 1, and transmits and receives predetermined data via an antenna (not shown), such as a built-in antenna.
 The wireless communication unit 13 can also perform wireless LAN communication with a predetermined access point. The wireless communication unit 13 is assigned a unique MAC (Media Access Control) address.
 The communication controller 14 controls the wireless communication unit 13 and mediates the communication performed, according to a predetermined protocol, between the processing control unit 10 and the processing control unit 10 of another portable information processing apparatus 1.
 The sound amplifier 15 amplifies the audio signal generated by the processing control unit 10 and supplies it to the speaker 16.
 The speaker 16 consists of, for example, stereo speakers, and outputs predetermined music, sound effects, and the like according to the audio signal amplified by the sound amplifier 15.
 The microphone 17 receives an analog signal such as the user's voice, and the received signal is subjected to processing such as mixing by the processing control unit 10.
 The operation keys 18 consist of a direction key 18a, buttons 18b, and the like arranged appropriately on the portable information processing apparatus 1, and receive predetermined instruction inputs according to the user's operations. The operation keys 18 also include buttons, knobs, and the like for adjusting the volume.
 The first display unit 19 and the second display unit 20 consist of LCDs or the like, are controlled by the LCD controller 10e, and display game images and the like as appropriate. The first display unit 19 is provided in the upper housing 31, and the second display unit 20 is provided in the lower housing 32. The first display unit 19 displays a stereoscopic image that allows the player to perceive depth (hereinafter, a "stereoscopic image"). The stereoscopic image is, for example, an image that allows autostereoscopic viewing by a parallax barrier method, or an image that can be viewed stereoscopically by wearing predetermined glasses while looking at the screen. The second display unit 20, by contrast, displays ordinary images rather than stereoscopic images.
 The touch panel 21 is arranged so as to overlap the front surface of the second display unit 20 and receives input by contact from a touch pen or the user's finger. The touch panel 21 consists of, for example, a pressure-sensitive touch sensor panel, detects the pressure of the user's finger or the like, and detects the contact state and the transition from a contact state to a non-contact state. Alternatively, the touch panel 21 may detect contact by the user's finger or the like from a change in capacitance or the like.
 The camera 22 captures the surrounding space and the like according to the user's instructions and converts the captured video into an electric signal. The camera 22 consists of, for example, a CMOS (Complementary MOS) sensor. The camera 22 is arranged on the back side of the upper housing 31 (FIG. 2(b)).
 The angular velocity sensor 23 detects the angular velocities generated around the three axes of the portable information processing apparatus 1 (the x, y, and z axes in FIG. 2(a)) and outputs the detected angular velocity data to the processing control unit 10. Based on the angular velocity data, the processing control unit 10 obtains the attitude and motion of the portable information processing apparatus 1.
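 One way the processing control unit could derive an orientation angle from the sensor output is to integrate the angular velocity over time; the following sketch assumes a single-axis reading in degrees per second and a fixed sampling interval, neither of which is specified in this document:

```python
class YawEstimator:
    """Integrate gyro readings around one axis into a screen-orientation
    angle (0 deg = facing the user, negative = tilted left)."""

    def __init__(self, dt: float = 1.0 / 60.0):
        self.dt = dt        # sampling interval in seconds (60 Hz assumed)
        self.yaw_deg = 0.0  # accumulated orientation

    def update(self, angular_velocity_dps: float) -> float:
        # simple rectangular integration; a real device would also
        # correct for gyro bias and drift
        self.yaw_deg += angular_velocity_dps * self.dt
        return self.yaw_deg
```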
 The image processing apparatus 100 according to each embodiment is realized on the typical portable information processing apparatus 1 described above, but it can also be realized on a general computer or game device, and such embodiments are also included in the scope of the present invention. Like the portable information processing apparatus 1, a general computer or game device includes a CPU core, a VRAM, and a WRAM. As a communication unit, it has, for example, an NIC (Network Interface Controller) conforming to standards such as 10BASE-T/100BASE-T used when configuring a LAN (Local Area Network); as storage it has a hard disk, and DVD-ROMs, magneto-optical discs, and the like can also be used. A keyboard, a mouse, or the like is used instead of the touch panel. When a program is installed and then executed, the device can be made to function as the information processing apparatus.
 The functions of the image processing apparatus 100 realized by the portable information processing apparatus 1 are described below with reference to FIGS. 1 to 15. When the cartridge 12 storing the game program and data is attached to the connector 11 and the power of the portable information processing apparatus 1 is turned on, the program is executed, realizing the image processing apparatus 100 according to the embodiment.
(Outline of the game)
 First, an example of the game executed by the image processing apparatus 100 according to the embodiment is described. FIG. 3 shows an example of the game's virtual space.
 The virtual space of this game is a horizontally long field 300; a player base 201, the player's base, and an enemy base 202, the enemy's base, are arranged at the two ends of the field. In this game, the player is required to bring down the enemy base 202 while protecting the player base 201.
 The player brings down the enemy base 202 by sending the character the player operates (the player character 203) toward the enemy base 202 to attack it, or by firing player shells 205 at the enemy base 202. The player also protects the player base 201 from attacks by enemy characters 204 and enemy shells 206.
 Because the field 300 of the virtual space is wide, not all of it fits on the screen, and the image displayed on the screen is a part of the field 300. The screen displays an image captured by a virtual camera 400 arranged in the virtual space. The virtual camera 400 translates along a straight line 401 and can be moved in the manual mode or auto mode described later. For example, FIG. 3 shows that an image 301, viewed from the virtual camera 400 on the straight line 401 in a shooting direction 403, is displayed on the screen.
 There are two ways to move the virtual camera 400: a manual mode, in which the player can move it at will, and an auto mode, in which the shooting position considered most important at each moment is selected automatically according to the program. In the manual mode, the player can move the virtual camera 400 along the straight line 401 by, for example, pressing the right or left side of the direction key 18a. In the auto mode, for example, priorities are assigned to events that occur during gameplay, and the virtual camera 400 automatically moves based on these priorities so that the relevant event is displayed on the screen.
 The events are, for example: the fall of the player base 201 (priority 1); the fall of the enemy base 202 (priority 2); the player character 203 being shot down (priority 3); an enemy character 204 being shot down (priority 4); the tip of a player shell 205 fired from the player base (priority 5); the tip of an enemy shell 206 fired from the enemy base (priority 6); the player character 203 setting out (priority 7); an enemy character 204 setting out (priority 8); the front line of the player characters 203 (priority 9); the front line of the enemy characters 204 (priority 10); other events at the player base 201 (priority 11); and other events at the enemy base 202 (priority 12). A priority is assigned to each event in advance. When several of these events occur at the same time, the position of the virtual camera 400 is set so that the image of the event with the highest priority is displayed on the screen.
 For example, as shown in FIG. 4, suppose an enemy shell 206 hits the player base 201 and the player base 201 falls, and at the same time the player character 203 and an enemy character 204 confront each other. In this case, the fall of the player base 201 has priority 1 and the front line of the player character 203 has priority 9, so the virtual camera 400 moves so that the fall of the player base 201 is displayed on the screen, and an image 302 is displayed.
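 A minimal sketch of this auto-mode selection might keep the active events in a list and pick the one with the smallest priority number; the event names, data layout, and target positions below are illustrative assumptions, not the document's implementation:

```python
from dataclasses import dataclass

@dataclass
class Event:
    name: str
    priority: int    # 1 is most important
    position: float  # camera target position on the straight line, in dots

def choose_camera_target(active_events: list[Event]) -> float | None:
    """Return the camera position for the highest-priority active event."""
    if not active_events:
        return None
    best = min(active_events, key=lambda e: e.priority)
    return best.position

# e.g. the fall of the player base (priority 1) wins over the
# player characters' front line (priority 9):
events = [Event("player base falls", 1, -300.0),
          Event("front line", 9, 120.0)]
assert choose_camera_target(events) == -300.0
```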
 Whether the virtual camera 400 is moved in manual mode or in auto mode can be set by the player as appropriate. For example, suppose the game proceeds with the player's attacks and the enemy's attacks alternating, and the auto mode is set at the beginning of each player or enemy attack. In this case, when a press of the direction key 18a is detected during auto mode, the apparatus may switch to manual mode.
 The functions of the image processing apparatus 100 in the manual mode are described below.
(Embodiment)
 As shown in FIG. 5, the image processing apparatus 100 of the embodiment comprises a storage unit 101, a display unit 102, a reception unit 103, a detection unit 104, and an update unit 105.
 The storage unit 101 stores a viewpoint position and a line-of-sight direction.
 For example, the viewpoint position is the position of the virtual camera 400 on the straight line 401 (hereinafter, the "shooting position"), and the line-of-sight direction is the shooting direction of the virtual camera 400. These values are registered, for example, in the shooting information table 101a of FIG. 6.
 In the shooting information table 101a, the initial position 101a1 of the virtual camera 400, the initial direction 101a2, the current shooting position 101a3, and the shooting direction 101a4 are registered in association with one another. The center position of the straight line 401 (the position of the virtual camera 400 in FIG. 3) is defined as 0 [dot], and this position is the initial position. The shooting position is expressed as a positive value when the camera has moved toward the right of the field 300 as seen facing the screen, and as a negative value when it has moved toward the left. The direction 402 perpendicular to the straight line 401 and pointing toward the field 300 is defined as 0 [deg], and this direction is the initial direction. The shooting direction is expressed as a positive value when the camera faces toward the right of the field 300 as seen facing the screen, and as a negative value when it faces toward the left.
 For example, when the virtual camera 400 is arranged at the shooting position shown in FIG. 3 and faces the shooting direction 403, the shooting information table 101a of FIG. 6(a) is stored in the storage unit 101.
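 For illustration only, the shooting information table could be represented as a small record like the following; the field names mirror the reference numerals in FIG. 6 but are otherwise assumptions:

```python
from dataclasses import dataclass

@dataclass
class ShootingInfoTable:
    initial_position_dot: float = 0.0   # 101a1: center of straight line 401
    initial_direction_deg: float = 0.0  # 101a2: perpendicular direction 402
    position_dot: float = 0.0           # 101a3: + = right, - = left of center
    direction_deg: float = 0.0          # 101a4: + = facing right, - = facing left

# FIG. 6(a): camera at the initial position, facing direction 403
table = ShootingInfoTable(position_dot=0.0, direction_deg=0.0)
```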
 In this embodiment, the WRAM 10d functions as the storage unit 101.
 The display unit 102 displays, on the screen, an image captured from the viewpoint position in the line-of-sight direction.
 In this embodiment, the captured image is an image of the virtual space and is assumed to be a stereoscopic image displayed on the first display unit 19.
 For example, when the shooting information table 101a of FIG. 6(a) is stored in the storage unit 101, the display unit 102 refers to the shooting information table 101a and displays on the screen the image 301 (FIG. 3) captured from the initial shooting position in the shooting direction 403.
 In this embodiment, the processing control unit 10 and the first display unit 19 cooperate to function as the display unit 102.
 The reception unit 103 receives instruction operations.
 For example, an instruction operation is an operation for moving the shooting position of the virtual camera 400, namely a press of the left or right side of the direction key 18a. When the player presses the left side of the direction key 18a, the reception unit 103 receives the press as an instruction to move the virtual camera 400 to the left. Alternatively, an operation specifying the left or right direction may be received via the touch panel 21. For example, when the player touches the touch panel 21 with a finger and moves it a predetermined distance to the right, the reception unit 103 receives the operation as an instruction to move the virtual camera 400 to the right.
 In this embodiment, the processing control unit 10 and the operation keys 18 (or the touch panel 21) cooperate to function as the reception unit 103.
 The detection unit 104 detects changes in the orientation of the screen.
 Here, the orientation of the screen is taken to be the orientation of the upper housing 31, which has the first display unit 19 that displays the stereoscopic image. FIG. 7 shows the positional relationship between a player 500 who holds the portable information processing apparatus 1 and plays the game and the upper housing 31 (first display unit 19) of the portable information processing apparatus 1. FIGS. 7(a) and 7(b) are views from directly above the player 500. The direction of the screen is defined so that the direction 502 perpendicular to the direction 501 parallel to the user 500 is 0 [deg].
 FIG. 7(a) shows the player 500 holding the portable information processing apparatus 1 so that the player 500 directly faces the first display unit 19 of the upper housing 31. Here, "directly facing" means that the player's upper body squarely faces the screen; when a stereoscopic image is displayed on the screen, this is the arrangement in which stereoscopic viewing is most effective. In this case, the detection unit 104 detects the screen direction 503. That is, the screen direction 503 is 0 [deg], and the detection unit 104 detects 0 [deg] as the orientation of the screen.
 FIG. 7(b) shows the player 500 holding the portable information processing apparatus 1 with the upper housing 31 tilted 30 [deg] to the left of the direction 502. In this case, the detection unit 104 detects the screen direction 504. That is, the detection unit 104 detects -30 [deg] as the orientation of the screen.
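 As a toy illustration of this sign convention (the tolerance parameter is an assumption; the document treats exactly 0 [deg] as facing), a check of whether a detected orientation counts as "directly facing" might look like:

```python
def is_facing(screen_direction_deg: float, tolerance_deg: float = 0.0) -> bool:
    """True when the screen orientation matches the facing direction 502
    (0 deg) within the given tolerance; -30 deg would be a left tilt."""
    return abs(screen_direction_deg) <= tolerance_deg

assert is_facing(0.0)        # FIG. 7(a): player directly faces the screen
assert not is_facing(-30.0)  # FIG. 7(b): upper housing tilted 30 deg left
```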
 In this embodiment, the processing control unit 10 and the angular velocity sensor 23 cooperate to function as the detection unit 104.
 The update unit 105 updates the viewpoint position based on the instruction operation, and updates the line-of-sight direction based on the change in the orientation of the screen.
 Here, FIGS. 8 to 12 and FIG. 14 show the relationship between the image displayed on the screen (FIGS. 8(a) to 12(a) and 14(a)), the shooting position and shooting direction of the virtual camera 400 (FIGS. 8(b) to 12(b) and 14(b)), and the orientation of the screen (FIGS. 8(c) to 12(c) and 14(c)). The functions of the update unit 105 are described below with reference to these drawings.
 For example, as shown in FIG. 8(c), when the player 500 directly faces the upper housing 31 and no instruction operation by the player has yet been received, the virtual camera 400 is arranged at the initial position and faces the initial direction (shooting direction 403), as shown in FIG. 8(b). Since the shooting position and shooting direction have not been changed from the initial position and initial direction, the update unit 105 does not update them. The display unit 102 displays on the screen an image 303 (FIG. 8(a)) captured from the initial position in the initial direction. The image 303 includes the player character 203 and an object 207.
 As shown in FIG. 9(c), suppose the player 500, directly facing the upper housing 31, presses the left side of the direction key 18a. The reception unit 103 receives the press, and the update unit 105 updates the shooting position based on the received press. For example, if the reception unit 103 receives a press that moves the camera 150 [dot] to the left, the update unit 105 updates the shooting position 101a3 of the shooting information table 101a to "-150" [dot], as shown in FIG. 6(b). The display unit 102 then obtains the shooting position and shooting direction by referring to the updated shooting information table 101a (FIG. 6(b)) and displays on the screen an image 304 (FIG. 9(a)) captured from the shooting position moved 150 [dot] to the left of the initial position (FIG. 9(b)) in the shooting direction 403.
 Also, suppose that from the state shown in FIG. 8, the player 500 tilts the portable information processing apparatus 1 so that the orientation of the first display unit 19 leans 30 [deg] to the left of the direction 502, as shown in FIG. 10(c). The detection unit 104 detects the tilted screen direction 504, and the update unit 105 updates the shooting direction based on the detected direction 504. For example, if the detection unit 104 detects a tilt of 30 [deg] to the left, the update unit 105 updates the shooting direction 101a4 of the shooting information table 101a to "-30" [deg], as shown in FIG. 6(c). The display unit 102 then obtains the shooting position and shooting direction by referring to the updated shooting information table 101a (FIG. 6(c)) and displays on the screen an image 305 (FIG. 10(a)) captured from the shooting position (initial position) in the shooting direction 404 (FIG. 10(b)).
 Also, when an instruction operation is received, the update unit 105 updates the line-of-sight direction so that it approaches the initial direction.
 As shown in FIG. 11(c), suppose the player 500 presses the left side of the direction key 18a with the upper housing 31 tilted to the left. The reception unit 103 receives the press, and the detection unit 104 detects the orientation of the upper housing 31. The update unit 105 updates the shooting position based on the received press and updates the shooting direction based on the detected orientation. Here, when the player 500 tries to move the shooting position with the direction key 18a from the state of FIG. 10(b) (for example, to the left), the player generally performs the instruction operation on the assumption that the shooting direction will remain the direction 404 and only the shooting position will move to the specified position, as shown in FIG. 11(b). A player 500 who performs such an instruction operation can be assumed to expect that the display unit 102 will display on the screen an image 306 captured from that shooting position in the direction 404. In this embodiment, by contrast, the update unit 105 updates the shooting direction so that, while the virtual camera 400 moves to the specified shooting position, the shooting direction gradually approaches the initial direction 403 from the direction 404, as shown in FIG. 12(b).
 Here, the update unit 105 updates the line-of-sight direction so that, for example, it approaches the initial direction by a predetermined angle per instruction operation. One instruction operation is, for example, one press of the left or right side of the direction key 18a. In this embodiment, one press of the direction key 18a moves the shooting position by 50 [dot].
 FIG. 13(a) shows an example of the relationship between the angle formed by the current line-of-sight direction (shooting direction) and the initial direction (hereinafter, the "current angle") and the predetermined angle. The update unit 105 updates the shooting direction according to this graph. For example, when the virtual camera 400 is arranged at the initial position and tilted 30 [deg] to the left as shown in FIG. 12(b), and the left side of the direction key 18a is pressed once, the update unit 105 updates the shooting position to a position moved 50 [dot] to the left along the straight line 401 and, at the same time, updates the shooting direction so that it approaches the initial direction by 10 [deg]. That is, starting from the shooting information table 101a of FIG. 6(c), the update unit 105 updates the shooting position 101a3 to "-50" [dot] and the shooting direction 101a4 to "-20" [deg] (direction 405), as shown in FIG. 6(d). The display unit 102 then obtains the shooting position and shooting direction by referring to the shooting information table 101a, which is updated at each press of the direction key 18a, and displays an image captured from that shooting position and shooting direction. When the direction key 18a has been pressed three times to the left and the shooting direction has been updated to the direction 403, the display unit 102 displays on the screen an image 307 (FIG. 12(a)) captured from the shooting position moved 150 [dot] to the left of the initial position in the shooting direction 403.
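 A compact sketch of this per-press update, reusing the ShootingInfoTable record sketched above and the document's figures of 50 [dot] and 10 [deg] per press (the function name and parameterization are assumptions), could read:

```python
def on_direction_key(table: "ShootingInfoTable", move_left: bool,
                     move_step_dot: float = 50.0,
                     turn_step_deg: float = 10.0) -> None:
    """Handle one press of the direction key: shift the shooting position
    and pull the shooting direction one step toward the initial direction."""
    sign = -1.0 if move_left else 1.0
    table.position_dot += sign * move_step_dot
    # step the direction toward the initial direction without overshooting
    delta = table.initial_direction_deg - table.direction_deg
    if abs(delta) <= turn_step_deg:
        table.direction_deg = table.initial_direction_deg
    else:
        table.direction_deg += turn_step_deg * (1 if delta > 0 else -1)
```

 Starting from the state of FIG. 6(c) (position 0 [dot], direction -30 [deg]), one left press yields -50 [dot] and -20 [deg] as in FIG. 6(d), and three presses yield -150 [dot] and 0 [deg], matching FIG. 12.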
 Note that the predetermined angle need not be a fixed value; it may be determined according to the angle formed by the line-of-sight direction and the initial direction (the current angle). For example, as shown in FIG. 13(b), the predetermined angle may be set larger the larger the current angle is.
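 One simple realization of such an angle-dependent step, consistent with the 60-degrees-to-10-degrees and 30-degrees-to-5-degrees example given earlier (the linear proportionality itself is an assumption about the shape of the curve in FIG. 13(b)), is a step proportional to the current angle:

```python
def variable_turn_step(current_angle_deg: float) -> float:
    """Predetermined angle that grows with the current angle between the
    line-of-sight direction and the initial direction (1/6 ratio assumed)."""
    return abs(current_angle_deg) / 6.0

assert variable_turn_step(60.0) == 10.0  # 60 deg apart -> 10 deg per press
assert variable_turn_step(30.0) == 5.0   # 30 deg apart -> 5 deg per press
```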
 When an operation to change the shooting position is received while the upper housing 31 is tilted away from the direction directly facing the player (direction 502), the update unit 105 updates the shooting direction regardless of the orientation of the upper housing 31 (first display unit 19), so an image 307 (FIG. 12(a)) in a direction the player 500 did not intend is displayed. That is, even though the screen is tilted, the screen shows the image 307 (FIG. 12(a)) that would be displayed when squarely facing the field of the virtual space. Updating the shooting direction in this way can prompt the player 500 to directly face the upper housing 31 (first display unit 19), as shown in FIG. 14(c).
 In this embodiment, the processing control unit 10 functions as the update unit 105.
(Operation of the image processing apparatus according to the embodiment)
 The operations performed by each unit of the image processing apparatus 100 of this embodiment are described below. When the image processing apparatus 100 is powered on, the CPU core 10a starts the image processing shown in the flowchart of FIG. 15.
 The display unit 102 generates an image captured from the viewpoint position in the line-of-sight direction, based on the viewpoint position and line-of-sight direction stored in the storage unit 101 (step S101). For example, the display unit 102 refers to the shooting information table 101a and generates an image representing the state of the virtual space as captured from the shooting position in the shooting direction.
 The image generated by the display unit 102 is stored in the VRAM 10c. The display unit 102 waits until a vertical synchronization interrupt occurs, clearing queues, processing other tasks, and so on in the meantime (step S102). When a vertical synchronization interrupt occurs, the display unit 102 converts the image information stored in the VRAM 10c into a display signal and displays it on the first display unit 19 (step S103). For example, the image 303 of FIG. 8 is displayed on the first display unit 19.
 The reception unit 103 determines whether an instruction operation has been received (step S104). If the reception unit 103 determines that an instruction operation has been received (step S104; Yes), the detection unit 104 then detects the orientation of the screen (step S105). If the reception unit 103 determines that no instruction operation has been received (step S104; No), the detection unit 104 likewise detects the orientation of the screen (step S109).
 For example, the reception unit 103 determines whether the left or right side of the direction key 18a has been pressed. Once the reception unit 103 has determined whether the press was received, the detection unit 104 detects the current orientation of the upper housing 31 (first display unit 19). The determination by the reception unit 103 of whether an instruction operation was received and the detection of the screen orientation by the detection unit 104 may be performed in the reverse order or simultaneously.
 When the orientation of the screen is detected in step S105, the update unit 105 then determines whether the orientation of the screen is the initial direction (step S106). If the update unit 105 determines that the orientation of the screen is the initial direction (step S106; Yes), it updates the viewpoint position stored in the storage unit 101 based on the instruction operation received by the reception unit 103 (step S107). If the update unit 105 determines that the orientation of the screen is not the initial direction (step S106; No), it updates the viewpoint position stored in the storage unit 101 based on the instruction operation received by the reception unit 103 and updates the line-of-sight direction stored in the storage unit 101 so that it approaches the initial direction (step S108). Processing then returns to step S101.
 For example, when the left side of the direction key 18a is pressed while the upper housing 31 faces the initial direction (FIG. 9), the update unit 105 updates the shooting position 101a3 of the shooting information table 101a based on the press of the direction key 18a. When the left side of the direction key 18a is pressed while the orientation of the upper housing 31 is tilted away from the initial direction (FIG. 12), the update unit 105 updates the shooting position 101a3 of the shooting information table 101a based on the press and, regardless of the orientation of the upper housing 31, updates the shooting direction 101a4 so that it gradually approaches the initial direction. When the shooting position or shooting direction is updated by the update unit 105, the display unit 102 displays on the first display unit 19 an image captured from the shooting position and shooting direction in the updated shooting information table 101a.
 When the orientation of the screen is detected in step S109, the update unit 105 then determines whether the orientation of the screen is the initial direction (step S110). If the update unit 105 determines that the orientation of the screen is the initial direction (step S110; Yes), it does not update the viewpoint position or the line-of-sight direction, and processing returns to step S101. If the update unit 105 determines that the orientation of the screen is not the initial direction (step S110; No), it updates the line-of-sight direction stored in the storage unit 101 based on the orientation of the screen (step S111). Processing then returns to step S101.
 For example, when the player 500 does not press the direction key 18a and the orientation of the upper housing 31 is tilted away from the initial direction (FIG. 10), the update unit 105 updates the shooting direction 101a4 of the shooting information table 101a based on the orientation of the upper housing 31. When the player 500 does not press the direction key 18a and the orientation of the upper housing 31 is the initial direction (FIG. 8), the update unit 105 does not update the shooting position 101a3 or the shooting direction 101a4. The display unit 102 then displays on the first display unit 19 an image captured from the shooting position and shooting direction in the shooting information table 101a.
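 Read end to end, steps S104 to S111 amount to the following per-frame logic; this is only a schematic restatement of the flowchart in FIG. 15 under the same assumptions as the sketches above (on_direction_key and the ShootingInfoTable fields are the hypothetical helpers defined earlier):

```python
def frame_step(table: "ShootingInfoTable", key_pressed: bool, move_left: bool,
               screen_direction_deg: float) -> None:
    """One pass of the FIG. 15 loop after the image has been displayed
    (steps S104-S111)."""
    facing = (screen_direction_deg == table.initial_direction_deg)
    if key_pressed:                                # S104: operation received
        if facing:                                 # S106 Yes -> S107
            table.position_dot += -50.0 if move_left else 50.0
        else:                                      # S106 No -> S108
            on_direction_key(table, move_left)     # move and turn toward initial
    else:
        if not facing:                             # S110 No -> S111
            table.direction_deg = screen_direction_deg
        # S110 Yes: nothing to update
```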
 In a normal game, as shown in FIG. 14(c), it is desirable for the player to operate the apparatus while directly facing the screen. With the portable information processing apparatus 1, however, whose orientation can be changed in the hands, the player may end up operating it without directly facing the screen, as shown in FIG. 10(c). In particular, when a stereoscopic image is displayed, stereoscopic viewing may not be possible unless the player directly faces the screen. According to the present invention, when the player tries to move the virtual camera while not directly facing the screen, the update unit 105 updates the shooting direction so that it squarely faces the field of the virtual space and displays the image that would be seen when facing squarely, thereby prompting the player to directly face the screen.
 In the embodiment, the operation of the image processing apparatus 100 was described using the shooting position and shooting direction of a virtual camera as an example, but the invention is not limited to this. For example, they may be the shooting position and shooting direction of a camera that captures real space. In the case of a camera capturing real space, the camera need not be provided in the same housing as the screen. For example, the present invention can also be applied to an apparatus in which the user can specify the camera's shooting position remotely, while the shooting direction is determined based on the orientation of the screen the user holds.
 In the embodiment, the orientation of the screen is detected by an angular velocity sensor, but the invention is not limited to this. For example, the orientation of the screen may be obtained from changes in images captured by the camera.
 The present invention is based on Japanese Patent Application No. 2011-201393 filed on September 15, 2011. The specification, claims, and entire drawings of Japanese Patent Application No. 2011-201393 are incorporated herein by reference.
 According to the present invention, it is possible to provide an image processing apparatus, a processing method, a program, and a non-transitory recording medium suitable for having a user operate the apparatus while facing the screen in the correct orientation.
(Description of reference numerals)
1 Portable information processing apparatus
10 Processing control unit
10a CPU core
10b Image processing unit
10c VRAM
10d WRAM
10e LCD controller
10f Touch panel controller
11 Connector
12 Cartridge
12a ROM
12b RAM
13 Wireless communication unit
14 Communication controller
15 Sound amplifier
16 Speaker
17 Microphone
18 Operation keys
18a Direction key
18b Button
19 First display unit
20 Second display unit
21 Touch panel
22 Camera
23 Angular velocity sensor
31 Upper housing
32 Lower housing
100 Image processing apparatus
101 Storage unit
102 Display unit
103 Reception unit
104 Detection unit
105 Update unit
201 Player base
202 Enemy base
203 Player character
204 Enemy character
205 Player shell
206 Enemy shell
207 Object
300 Field
301, 302, 303, 304, 305, 306, 307 Images
400 Virtual camera
401 Straight line
402 Direction
403, 404, 405 Shooting directions
500 Player
501, 502 Directions
503, 504 Screen directions

Claims (8)

  1.  An image processing apparatus (100) comprising:
     a storage unit (101) that stores a viewpoint position and a line-of-sight direction;
     a display unit (102) that displays, on a screen, an image captured from the viewpoint position in the line-of-sight direction;
     a reception unit (103) that receives an instruction operation;
     a detection unit (104) that detects a change in an orientation of the screen; and
     an update unit (105) that updates the viewpoint position based on the instruction operation and updates the line-of-sight direction based on the change in the orientation of the screen,
     wherein, when the instruction operation is received, the update unit (105) updates the line-of-sight direction so as to approach an initial direction.
  2.  The image processing apparatus (100) according to claim 1,
     wherein the captured image is an image for stereoscopic viewing.
  3.  The image processing apparatus (100) according to claim 1 or 2,
     wherein the captured image is an image representing a state of a virtual space.
  4.  The image processing apparatus (100) according to any one of claims 1 to 3,
     wherein the update unit (105) updates the line-of-sight direction so as to approach the initial direction by a predetermined angle per instruction operation.
  5.  The image processing apparatus (100) according to claim 4,
     wherein the predetermined angle is determined according to an angle formed by the line-of-sight direction and the initial direction.
  6.  A processing method executed by an image processing apparatus (100) comprising a storage unit (101) that stores a viewpoint position and a line-of-sight direction, a display unit (102), a reception unit (103), a detection unit (104), and an update unit (105), the method comprising:
     a display step in which the display unit (102) displays on a screen an image captured from the viewpoint position in the line-of-sight direction;
     a reception step in which the reception unit (103) receives an instruction operation;
     a detection step in which the detection unit (104) detects a change in the orientation of the screen; and
     an update step in which the update unit (105) updates the viewpoint position based on the instruction operation and updates the line-of-sight direction based on the change in the orientation of the screen,
     wherein, in the update step, when the instruction operation is received, the update unit (105) updates the line-of-sight direction so that it approaches an initial direction.
  7.  A program for causing a computer to function as:
     a storage unit (101) that stores a viewpoint position and a line-of-sight direction;
     a display unit (102) that displays on a screen an image captured from the viewpoint position in the line-of-sight direction;
     a reception unit (103) that receives an instruction operation;
     a detection unit (104) that detects a change in the orientation of the screen; and
     an update unit (105) that updates the viewpoint position based on the instruction operation and updates the line-of-sight direction based on the change in the orientation of the screen,
     wherein the program causes the update unit (105), when the instruction operation is received, to update the line-of-sight direction so that it approaches an initial direction.
  8.  A non-temporary recording medium on which is recorded a program for causing a computer to function as:
     a storage unit (101) that stores a viewpoint position and a line-of-sight direction;
     a display unit (102) that displays on a screen an image captured from the viewpoint position in the line-of-sight direction;
     a reception unit (103) that receives an instruction operation;
     a detection unit (104) that detects a change in the orientation of the screen; and
     an update unit (105) that updates the viewpoint position based on the instruction operation and updates the line-of-sight direction based on the change in the orientation of the screen,
     wherein the program causes the update unit (105), when the instruction operation is received, to update the line-of-sight direction so that it approaches an initial direction.
PCT/JP2012/068813 2011-09-15 2012-07-25 Image processing apparatus, processing method, program, and non-temporary recording medium WO2013038814A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2011201393A JP5113933B1 (en) 2011-09-15 2011-09-15 Image processing apparatus, processing method, and program
JP2011-201393 2011-09-15

Publications (1)

Publication Number Publication Date
WO2013038814A1 (en)

Family

ID=47676453

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2012/068813 WO2013038814A1 (en) 2011-09-15 2012-07-25 Image processing apparatus, processing method, program, and non-temporary recording medium

Country Status (2)

Country Link
JP (1) JP5113933B1 (en)
WO (1) WO2013038814A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2020162193A1 * 2019-02-06 2020-08-13 Sony Corporation Information processing device and method, and program

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6219037B2 * 2013-02-06 2017-10-25 Nintendo Co., Ltd. Information processing program, information processing apparatus, information processing system, and information processing method
JP6734236B2 * 2017-08-14 2020-08-05 DeNA Co., Ltd. Program, system, and method for providing a game
CN108970114A * 2018-08-21 2018-12-11 Suzhou Snail Digital Technology Co., Ltd. A method for realizing visual-field adjustment through custom key mapping

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002298160A (en) * 2001-03-29 2002-10-11 Namco Ltd Portable image generating device and program, and information storage medium
JP2004166995A (en) * 2002-11-20 2004-06-17 Nintendo Co Ltd Game device and information processor
JP2011108256A (en) * 2011-01-07 2011-06-02 Nintendo Co Ltd Information processing program, information processing method, information processing apparatus, and information processing system


Also Published As

Publication number Publication date
JP2013059586A (en) 2013-04-04
JP5113933B1 (en) 2013-01-09

Similar Documents

Publication Publication Date Title
JP7098001B2 (en) Distance information display method in a virtual scene and its terminal computer device and computer program
CN108710525B (en) Map display method, device, equipment and storage medium in virtual scene
US9098130B2 (en) Computer-readable storage medium having stored thereon input processing program, input processing apparatus, input processing method, and input processing system
JP5710934B2 (en) Content display device and content display method
US9602809B2 (en) Storage medium storing display control program for providing stereoscopic display desired by user with simpler operation, display control device, display control method, and display control system
US9975042B2 (en) Information processing terminal and game device
CN106899801B (en) Mobile terminal and control method thereof
CN107707817B (en) video shooting method and mobile terminal
KR102523919B1 (en) Audio playback and collection method, device and device and readable storage medium
WO2011148544A1 (en) Portable electronic device
CN111918090B (en) Live broadcast picture display method and device, terminal and storage medium
JP2012003350A (en) Image display program, device, system, and method
CN109324739B (en) Virtual object control method, device, terminal and storage medium
CN106067833B (en) Mobile terminal and control method thereof
CN113509720B (en) Virtual fight playback method, device, terminal, server and storage medium
US20240066404A1 (en) Perspective rotation method and apparatus, device, and storage medium
JP5113933B1 (en) Image processing apparatus, processing method, and program
JP2016067024A (en) Portable electronic device
JP5764390B2 (en) Image generation program, image generation method, image generation apparatus, and image generation system
US20120058825A1 (en) Game apparatus, game control method, and information recording medium
JP5941620B2 (en) Information processing program, information processing apparatus, information processing method, and information processing system
JP2012175685A (en) Information processing program, imaging apparatus, imaging method, and imaging system
JP5926773B2 (en) Peripheral device, information processing system, and connection method of peripheral device
JP2011204182A (en) Image generating device, image processing method, and program
CN112121409A (en) Game interaction method, flexible screen terminal and computer readable storage medium

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 12831977

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 12831977

Country of ref document: EP

Kind code of ref document: A1