WO2014062001A1 - Method, system and computer-readable recording medium for controlling a virtual camera within a three-dimensional virtual space - Google Patents
Method, system and computer-readable recording medium for controlling a virtual camera within a three-dimensional virtual space
- Publication number
- WO2014062001A1 (PCT/KR2013/009260)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- virtual camera
- virtual
- motion
- path
- plane
- Prior art date
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/0346—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/04817—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance using icons
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/04845—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range for image manipulation, e.g. dragging, rotation, expansion or change of colour
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04886—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
Definitions
- The present invention relates to a method, a system, and a computer-readable recording medium for controlling a virtual camera in a three-dimensional virtual space.
- In recent years, demand has been increasing for content that can be expressed three-dimensionally on the two-dimensional screen of a digital device such as a personal computer or a mobile terminal device (hereinafter, such content is referred to as "three-dimensional content" for convenience).
- A good example of such three-dimensional content is a three-dimensional animation produced with Anipen's three-dimensional animation authoring program.
- To enable users to author three-dimensional content using only relatively simple tools or interfaces, the inventor disclosed a novel three-dimensional animation authoring method and the like in WO2011/149160 (the specification of which is incorporated herein by reference in its entirety).
- Controlling the virtual camera in the three-dimensional virtual space is of great importance both in that invention and in any other invention that assumes visual expression in a three-dimensional virtual space.
- Accordingly, the inventor(s) propose herein a novel technique for controlling a virtual camera in a three-dimensional virtual space.
- The present invention aims to solve all of the above-mentioned problems of the prior art.
- Another object of the present invention is to enable a user to easily control a virtual camera in a three-dimensional virtual space.
- Another object of the present invention is to enable a user to sufficiently control a virtual camera in a three-dimensional virtual space even with a two-dimensional user interface.
- According to one aspect of the present invention, there is provided a system for controlling a virtual camera in a three-dimensional virtual space, the system comprising: a user interface module for providing a user interface for receiving control data for the virtual camera; and a camera control module for setting a motion surface of the virtual camera and controlling properties of the virtual camera according to the control data, the properties including at least some of a position, a gaze, a field of view, and a motion trajectory of the virtual camera, wherein the position is limited onto or around the motion surface.
- There are also provided other methods and systems for implementing the present invention, and computer-readable recording media for recording computer programs for executing such methods.
- According to the present invention, a user can easily control a virtual camera in a three-dimensional virtual space.
- According to the present invention, a user can sufficiently control a virtual camera in a three-dimensional virtual space even with a two-dimensional user interface.
- FIG. 1 is a view showing a schematic configuration of an entire system for controlling a virtual camera in a three-dimensional virtual space according to an embodiment of the present invention.
- FIGS. 2 and 3 exemplarily illustrate a user interface according to an embodiment of the present invention.
- FIG. 4 is a conceptual diagram of a general motion surface of a virtual camera according to an embodiment of the present invention.
- FIG. 5 is a conceptual diagram of the motion surface of a virtual camera when there is a specific path related to a three-dimensional content object according to an embodiment of the present invention.
- FIG. 6 is a conceptual diagram of the motion surface of a virtual camera when the three-dimensional content object is not on the ground plane according to an embodiment of the present invention.
- Referring first to FIG. 1, the schematic configuration of the entire system for controlling a virtual camera in a three-dimensional virtual space according to an embodiment of the present invention is described.
- As shown in FIG. 1, the system for controlling a virtual camera in a three-dimensional virtual space may include a user interface module 100 and a camera control module 200.
- The virtual camera control system may be implemented collectively in one computing device or distributed across two or more computing devices.
- All modules of the virtual camera control system may be included in a user terminal device (e.g., a desktop computer, a laptop computer, a workstation, a PDA, a web pad (especially a smart pad), or a mobile phone (especially a smart phone)) that the user would typically use to control the virtual camera. Alternatively, some modules may be included in another digital device capable of communicating with the user terminal device, such as a three-dimensional content providing server (not shown).
- The user interface module 100 and the camera control module 200 may communicate with each other via a predetermined communication network (not shown) or a predetermined processor (not shown).
- Such a communication network may be configured regardless of communication mode, wired or wireless, and may include a local area network (LAN), a metropolitan area network (MAN), or a wide area network (WAN).
- The communication network referred to herein may be the known Internet or World Wide Web (WWW).
- The communication network may also include, at least in part, a known wired/wireless data communication network, a known telephone network, or a known wired/wireless television communication network, without necessarily being limited thereto.
- The processor may control the flow of data provided to or exchanged between the modules in the device.
- The user interface module 100 and the camera control module 200 may be included in the virtual camera control system in the form of an operating system, an application module, or another program module, and may be physically stored in various known storage devices.
- Such program modules include, but are not limited to, routines, subroutines, programs, objects, components, and data structures that perform particular tasks or execute particular abstract data types as described below in accordance with the present invention.
- The user interface module 100 may provide a user interface that allows a user to control, via the user's terminal device, a virtual camera for capturing three-dimensional content, such as a three-dimensional animation, in a three-dimensional virtual space.
- The user interface module 100 may allow the user to control the position, gaze, field of view, and motion trajectory of the virtual camera using only the most convenient forms of manipulation (for example, inputting a path by touch on the screen of a smart pad, specifying an object by touch, or inputting a sketch gesture).
- The detailed configuration and functions of the user interface module 100 are described in more detail below.
- The camera control module 200 may control the position, gaze, field of view, motion trajectory, etc. of the virtual camera according to the data input by the user through the user interface.
- The controllable properties of the virtual camera may be as follows:
- Position: the location of the virtual camera, which can be represented as P(x, y, z)
- Look-at vector: a vector that represents the line of sight of the virtual camera, i.e., what the virtual camera is viewing
- Direction vector: a vector representing the degree to which the imaging part of the virtual camera is tilted in the pitch, yaw, and/or roll directions
- Field-of-view angle: an angle representing the field of view within which the virtual camera can capture an image; there can be vertical and horizontal viewing angles
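As an illustrative sketch only (the class, field, and method names below are hypothetical and not part of the patent), the attributes listed above map naturally onto a small data structure:

```python
import math
from dataclasses import dataclass


@dataclass
class VirtualCamera:
    """Hypothetical container for the camera attributes listed above."""
    position: tuple = (0.0, 0.0, 0.0)     # P(x, y, z)
    look_at: tuple = (0.0, 0.0, 1.0)      # point the camera's line of sight is aimed at
    orientation: tuple = (0.0, 0.0, 0.0)  # pitch, yaw, roll of the imaging part
    fov_v: float = 45.0                   # vertical field-of-view angle, degrees
    fov_h: float = 60.0                   # horizontal field-of-view angle, degrees

    def look_vector(self):
        """Unit look-at vector from the camera position toward the look-at point."""
        d = [l - p for l, p in zip(self.look_at, self.position)]
        n = math.sqrt(sum(c * c for c in d))
        return tuple(c / n for c in d)


cam = VirtualCamera(position=(0.0, 0.0, 0.0), look_at=(0.0, 0.0, 5.0))
print(cam.look_vector())  # (0.0, 0.0, 1.0)
```

A motion sequence, as discussed later, would then be a time-indexed series of such attribute sets.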
- The user interface may preferably take the form of a widget.
- FIGS. 2 and 3 exemplarily illustrate a user interface according to an embodiment of the present invention.
- The user interface may be provided to the user graphically or in hardware, and may be configured to allow the user to issue control commands, i.e., to enter control data, such as the following:
- Zoom in/out: zooms the virtual camera in or out
- Tilt up/down: tilts the virtual camera up or down
- Pan left/right: pans the virtual camera left or right
- Frustum view on/off: turns the frustum view on or off
- Focus lock/unlock: locks the focus of the virtual camera (e.g., focusing the virtual camera on an animation object such as a character, or on a specific part of the ground plane of the animation) or unlocks such focus
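These commands amount to simple updates of the camera attribute values. A minimal sketch, assuming degree-valued attributes (the function names, clamp limits, and parameters are illustrative, not from the patent):

```python
def zoom(fov_deg, factor):
    """Zoom in (factor > 1) or out (factor < 1) by narrowing/widening the field of view."""
    return max(1.0, min(170.0, fov_deg / factor))


def tilt(pitch_deg, delta_deg):
    """Tilt up/down, clamped so the camera cannot flip past straight up or down."""
    return max(-90.0, min(90.0, pitch_deg + delta_deg))


def pan(yaw_deg, delta_deg):
    """Pan left/right; yaw wraps around a full circle."""
    return (yaw_deg + delta_deg) % 360.0


print(zoom(60.0, 2.0))   # 30.0 — zooming in by 2x halves the field of view
print(tilt(80.0, 30.0))  # 90.0 — clamped at straight up
print(pan(350.0, 20.0))  # 10.0 — wraps past 360 degrees
```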
- The user may select the portion to be focused, in particular from among the portions captured by the virtual camera, by a focusing window that can be included in the user interface; the virtual camera may then perform focusing accordingly.
- The user may check in real time the scene captured by the virtual camera by a preview window, which may also be included in the user interface.
- The user may also input a predetermined gesture in a sketch manner by a gesture window, which may likewise be included in the user interface (the means of sketching may be a finger, an electronic pen, a touch pen, etc.). Thereby, control data that the user interface module 100 has already matched with such a gesture may be input for the virtual camera, or the weights of various control data may be adjusted by a graph drawing method (see FIG. 3).
- Values of attributes of the virtual camera, such as the position, gaze, and field of view, may be determined continuously as necessary.
- The attribute values of the virtual camera thus determined may be used to construct a motion sequence of the virtual camera.
- Additional attribute values of the virtual camera may be further determined by interpolation between the sets of attribute values constituting the motion sequence. That is, the user may determine the attribute values of the virtual camera continuously over time, check in real time the scene captured by the virtual camera in the preview window of the user interface, and then further control or adjust the attribute values of the virtual camera by inputting a predetermined gesture in a sketch manner in the gesture window.
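The interpolation step described above can be sketched as follows, using piecewise-linear blending between successive attribute-value sets (a real implementation might use splines instead; all names here are illustrative, not from the patent):

```python
def lerp(a, b, t):
    """Linearly interpolate scalars or equal-length tuples."""
    if isinstance(a, (int, float)):
        return a + (b - a) * t
    return tuple(x + (y - x) * t for x, y in zip(a, b))


def interpolate_keyframes(keyframes, time):
    """keyframes: time-sorted list of (time, attrs) pairs, where attrs is a dict of
    camera attribute values. Returns the blended attribute set at `time`."""
    if time <= keyframes[0][0]:
        return keyframes[0][1]
    for (t0, a0), (t1, a1) in zip(keyframes, keyframes[1:]):
        if t0 <= time <= t1:
            u = (time - t0) / (t1 - t0)
            return {k: lerp(a0[k], a1[k], u) for k in a0}
    return keyframes[-1][1]


keys = [(0.0, {"position": (0.0, 0.0, 0.0), "fov": 60.0}),
        (2.0, {"position": (2.0, 0.0, 4.0), "fov": 30.0})]
print(interpolate_keyframes(keys, 1.0))  # {'position': (1.0, 0.0, 2.0), 'fov': 45.0}
```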
- The user interface described above may be disposed and displayed in correspondence with the camera motion surface or the ground plane of the three-dimensional content in the three-dimensional virtual space, or elsewhere, as described below.
- The camera control module 200 may control the virtual camera by receiving the attribute values of the virtual camera determined as above.
- If no assistance is provided, the position, motion trajectory, and so on of the virtual camera must be determined by the user every time. This is a significant burden, especially for a user who must perform the control on a two-dimensional screen. The present invention therefore proposes setting and using a motion surface of the virtual camera, which can appropriately limit and determine the position, motion trajectory, etc. of the virtual camera without complicated manipulation by the user.
- FIG. 4 is a conceptual diagram of a general motion surface of a virtual camera according to an embodiment of the present invention.
- As shown, the camera control module 200 may, most generally, limit the position of the virtual camera onto or around the surface of a predetermined virtual sphere centered on the position of the three-dimensional content object being captured. (Here, "around" means that the spherical surface serves as the reference surface for the position of the virtual camera, so that positions which deviate slightly from the sphere, for example due to shaking, an operation the virtual camera may perform in some cases, are also included.) The size of the virtual sphere may be changed according to the control data input by the user through the user interface. Hereinafter, for convenience, the spherical or other surface to which the motion of the virtual camera is limited is referred to as the motion surface of the virtual camera.
- FIG. 5 is a conceptual diagram of the motion surface of a virtual camera when there is a specific path related to a three-dimensional content object according to an embodiment of the present invention.
- In this case, the camera control module 200 may confine the position of the virtual camera onto or around a virtual motion surface extending from the particular path, raised at a predetermined angle (e.g., vertically).
- The above path may be a path on the ground plane spaced a predetermined distance apart from the motion path of the moving three-dimensional content object on the ground plane (such a path may be determined automatically when the motion of the three-dimensional content object is determined, and may be changed according to the control data input through the user interface), or an arbitrary path input by the user in a sketch manner on the gesture window of the user interface or on the screen of the user terminal device. In the latter case, the input sketch path may become a uniform cubic B-spline path projected onto the ground plane of the three-dimensional content.
- The user may modify the path of the virtual camera on the ground plane by a sketching method or the like while watching, via the user interface, the scene captured by the virtual camera on the above-described motion surface (in this case, other attribute values of the virtual camera may of course be changed separately).
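The conversion of a sketched path into a uniform cubic B-spline on the ground plane can be sketched as follows: the projection step here simply drops the height coordinate (an assumption for illustration), and each curve segment blends four consecutive control points with the standard uniform cubic B-spline basis. All function names are illustrative, not from the patent:

```python
def project_to_ground(p3d):
    """Project a sketched 3D point onto the ground plane by dropping the height (y)."""
    x, y, z = p3d
    return (x, z)


def bspline_point(p0, p1, p2, p3, t):
    """Point at parameter t in [0, 1] on the uniform cubic B-spline segment
    defined by four consecutive (projected) control points."""
    b0 = (1 - t) ** 3 / 6.0
    b1 = (3 * t ** 3 - 6 * t ** 2 + 4) / 6.0
    b2 = (-3 * t ** 3 + 3 * t ** 2 + 3 * t + 1) / 6.0
    b3 = t ** 3 / 6.0
    return tuple(b0 * a + b1 * b + b2 * c + b3 * d
                 for a, b, c, d in zip(p0, p1, p2, p3))


pts = [project_to_ground(p) for p in [(0.0, 1.0, 0.0), (1.0, 1.0, 0.0),
                                      (2.0, 1.0, 0.0), (3.0, 1.0, 0.0)]]
print(bspline_point(*pts, 0.0))  # segment start, near the second control point
```

Sampling `t` over each window of four consecutive projected points yields a smooth camera path on the ground plane, from which the motion surface is raised.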
- FIG. 6 is a conceptual diagram of the motion surface of a virtual camera when the three-dimensional content object is not on the ground plane according to an embodiment of the present invention.
- In this case, the camera control module 200 may limit the position of the virtual camera onto or around the surface of a predetermined virtual sphere having the raised position as its center point.
- The above limitation may be performed as the user specifies, via the user interface, the three-dimensional content object that the virtual camera is to capture, but may also be performed as the user selects an arbitrary portion on the ground plane in the gesture window of the user interface.
- The selection may be a sketch selection based on the position of the three-dimensional content object.
- The user may modify the center point of the sphere to which the motion of the virtual camera is restricted while viewing, via the user interface, the scene captured by the virtual camera on the above-described motion surface (in this case, other attribute values of the virtual camera may of course be changed separately).
- The motion surface of the virtual camera, the path on the ground plane used to determine it, the center point of a spherical motion surface, and the like may be stored in the form of a file and reused.
- The embodiments according to the present invention described above can be implemented in the form of program instructions that can be executed by various computer components and recorded in a computer-readable recording medium.
- The computer-readable recording medium may include program instructions, data files, data structures, etc., alone or in combination.
- The program instructions recorded on the computer-readable recording medium may be specially designed and configured for the present invention, or may be known and available to those skilled in the computer software arts.
- Examples of computer-readable recording media include magnetic media such as hard disks, floppy disks, and magnetic tape; optical recording media such as CD-ROMs and DVDs; magneto-optical media such as floptical disks; and hardware devices specially configured to store and execute program instructions, such as ROM, RAM, and flash memory.
- Examples of program instructions include not only machine code, such as that produced by a compiler, but also high-level language code that can be executed by a computer using an interpreter or the like.
- A hardware device may be changed to one or more software modules to perform the processing according to the present invention, and vice versa.
Claims (15)
- 1. A system for controlling a virtual camera in a three-dimensional virtual space, comprising: a user interface module for providing a user interface for receiving control data for the virtual camera; and a camera control module for setting a motion surface of the virtual camera and controlling properties of the virtual camera according to the control data, wherein the properties include at least some of a position, a gaze, a field of view, and a motion trajectory of the virtual camera, and the position is limited onto or around the motion surface.
- 2. The system of claim 1, wherein the user interface includes a gesture window for receiving a predetermined gesture.
- 3. The system of claim 2, wherein the user interface module is capable of matching a gesture input through the gesture window with predetermined control data for the virtual camera.
- 4. The system of claim 1, wherein the motion surface is set based on a predetermined path related to a three-dimensional content object observed by the virtual camera.
- 5. The system of claim 4, wherein the path related to the three-dimensional content object is a motion path of the three-dimensional content object.
- 6. The system of claim 4, wherein the path related to the three-dimensional content object is a path input by the user and projected onto the ground plane of the three-dimensional content.
- 7. The system of claim 1, wherein the motion surface is set based on the position of a specific three-dimensional content object observed by the virtual camera.
- 8. The system of claim 1, wherein the motion surface is set based on the position of an arbitrary specified portion on the ground plane of the three-dimensional content observed by the virtual camera.
- 9. A method for controlling a virtual camera in a three-dimensional virtual space, comprising the steps of: receiving control data for the virtual camera; and setting a motion surface of the virtual camera and controlling properties of the virtual camera according to the control data, wherein the properties include at least some of a position, a gaze, a field of view, and a motion trajectory of the virtual camera, and the position is limited onto or around the motion surface.
- 10. The method of claim 9, wherein the motion surface is set based on a predetermined path related to a three-dimensional content object observed by the virtual camera.
- 11. The method of claim 10, wherein the path related to the three-dimensional content object is a motion path of the three-dimensional content object.
- 12. The method of claim 10, wherein the path related to the three-dimensional content object is a path input by the user and projected onto the ground plane of the three-dimensional content.
- 13. The method of claim 9, wherein the motion surface is set based on the position of a specific three-dimensional content object observed by the virtual camera.
- 14. The method of claim 9, wherein the motion surface is set based on the position of an arbitrary specified portion on the ground plane of the three-dimensional content observed by the virtual camera.
- 15. A computer-readable recording medium recording a computer program for executing the method according to any one of claims 1 to 8.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/436,085 US10007348B2 (en) | 2012-10-16 | 2013-10-16 | Method and system for controlling virtual camera in virtual 3D space and computer-readable recording medium |
JP2015538018A JP2016502171A (ja) | 2012-10-16 | 2013-10-16 | 3次元の仮想空間内で仮想カメラを制御するための方法、システムおよびコンピュータ読取可能な記録媒体 |
CN201380053949.1A CN104769543B (zh) | 2012-10-16 | 2013-10-16 | 用于在虚拟三维空间中控制虚拟相机的方法和系统以及计算机可读记录介质 |
EP13847729.4A EP2911393B1 (en) | 2012-10-16 | 2013-10-16 | Method and system for controlling virtual camera in virtual 3d space and computer-readable recording medium |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR20120114780 | 2012-10-16 | ||
KR10-2012-0114780 | 2012-10-16 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2014062001A1 true WO2014062001A1 (ko) | 2014-04-24 |
Family
ID=50488495
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/KR2013/009260 WO2014062001A1 (ko) | 2012-10-16 | 2013-10-16 | 3차원의 가상 공간 내에서 가상 카메라를 제어하기 위한 방법, 시스템 및 컴퓨터 판독 가능한 기록 매체 |
Country Status (6)
Country | Link |
---|---|
US (1) | US10007348B2 (ko) |
EP (1) | EP2911393B1 (ko) |
JP (2) | JP2016502171A (ko) |
KR (1) | KR101565854B1 (ko) |
CN (1) | CN104769543B (ko) |
WO (1) | WO2014062001A1 (ko) |
Families Citing this family (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104349020B (zh) * | 2014-12-02 | 2017-11-03 | 北京中科大洋科技发展股份有限公司 | Method for switching between a virtual camera and a real camera |
JP6849430B2 (ja) * | 2016-12-27 | 2021-03-24 | キヤノン株式会社 | Image processing apparatus, image processing method, and program |
US10970915B2 (en) * | 2017-01-06 | 2021-04-06 | Canon Kabushiki Kaisha | Virtual viewpoint setting apparatus that sets a virtual viewpoint according to a determined common image capturing area of a plurality of image capturing apparatuses, and related setting method and storage medium |
US10569172B2 (en) | 2017-09-19 | 2020-02-25 | Canon Kabushiki Kaisha | System and method of configuring a virtual camera |
CN108833863A (zh) * | 2018-07-24 | 2018-11-16 | 河北德冠隆电子科技有限公司 | Virtual camera monitoring and preview method for four-dimensional real-scene traffic simulation |
CN109391773B (zh) * | 2018-09-21 | 2021-08-10 | 百度在线网络技术(北京)有限公司 | Movement control method and device for a shooting point when switching panorama pages |
JP7307568B2 (ja) * | 2019-03-20 | 2023-07-12 | 任天堂株式会社 | Image display system, image display program, display control device, and image display method |
CN114627215B (zh) * | 2022-05-16 | 2022-09-16 | 山东捷瑞数字科技股份有限公司 | Method and device for producing camera shake animation based on three-dimensional software |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20030060492A (ko) * | 2002-01-09 | 2003-07-16 | 주식회사 인피니트테크놀로지 | Three-dimensional virtual endoscopy screen display device and method |
KR20100110658A (ko) * | 2009-04-03 | 2010-10-13 | 주식회사 다림비젼 | Multi-view system showing a video wall in three-dimensional space, with matrix and switcher/mixer functions, in which a plurality of input videos (digital, analog, IP-compressed video) are rendered by video texturing onto the virtual monitors of a virtual graphic three-dimensional video wall and shown on a monitor in real time through real-time three-dimensional rendering by a virtual camera arranged in the virtual three-dimensional graphic space |
KR20110096623A (ko) * | 2010-02-23 | 2011-08-31 | 주식회사 빅아이 | Method for providing camera spacing in stereoscopic image production, program recording medium, and stereoscopic image generator |
KR20110116275A (ko) * | 2010-04-19 | 2011-10-26 | 하나로드림게임즈 주식회사 | Method for creating a work, method for providing a work, and work creation device providing the same |
WO2011149160A1 (ko) | 2010-05-25 | 2011-12-01 | 연세대학교 산학협력단 | Animation authoring system and animation authoring method |
KR20110129171A (ko) * | 2010-05-25 | 2011-12-01 | 연세대학교 산학협력단 | Animation authoring system and animation authoring method |
Family Cites Families (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH11137842A (ja) * | 1997-09-04 | 1999-05-25 | Sega Enterp Ltd | Image processing apparatus |
JP3762750B2 (ja) * | 2003-01-07 | 2006-04-05 | コナミ株式会社 | Image display control program |
JP3665058B2 (ja) * | 2003-02-26 | 2005-06-29 | 三菱重工業株式会社 | Simulator scenario production support program and simulator device |
US20080168402A1 (en) * | 2007-01-07 | 2008-07-10 | Christopher Blumenberg | Application Programming Interfaces for Gesture Operations |
EP2279497B1 (en) * | 2008-04-14 | 2016-02-10 | Google, Inc. | Swoop navigation |
WO2010022386A2 (en) * | 2008-08-22 | 2010-02-25 | Google Inc. | Navigation in a three dimensional environment on a mobile device |
WO2010060211A1 (en) * | 2008-11-28 | 2010-06-03 | Nortel Networks Limited | Method and apparatus for controling a camera view into a three dimensional computer-generated virtual environment |
IT1396752B1 (it) * | 2009-01-30 | 2012-12-14 | Galileo Avionica S P A Ora Selex Galileo Spa | Display of a three-dimensional virtual space generated by an electronic simulation system |
US9256282B2 (en) * | 2009-03-20 | 2016-02-09 | Microsoft Technology Licensing, Llc | Virtual object manipulation |
US9299184B2 (en) * | 2009-04-07 | 2016-03-29 | Sony Computer Entertainment America Llc | Simulating performance of virtual camera |
JP5717270B2 (ja) * | 2009-12-28 | 2015-05-13 | 任天堂株式会社 | Information processing program, information processing apparatus, and information processing method |
JP5800473B2 (ja) * | 2010-06-11 | 2015-10-28 | 任天堂株式会社 | Information processing program, information processing apparatus, information processing system, and information processing method |
JP5846767B2 (ja) * | 2011-01-17 | 2016-01-20 | 株式会社ソニー・コンピュータエンタテインメント | Game device, game control method, and game control program |
US8547414B2 (en) * | 2011-01-31 | 2013-10-01 | New Vad, Llc | Touch screen video switching system |
2013
- 2013-10-16 CN CN201380053949.1A patent/CN104769543B/zh active Active
- 2013-10-16 KR KR1020130123418A patent/KR101565854B1/ko active IP Right Grant
- 2013-10-16 EP EP13847729.4A patent/EP2911393B1/en active Active
- 2013-10-16 US US14/436,085 patent/US10007348B2/en active Active
- 2013-10-16 JP JP2015538018A patent/JP2016502171A/ja active Pending
- 2013-10-16 WO PCT/KR2013/009260 patent/WO2014062001A1/ko active Application Filing
2018
- 2018-05-30 JP JP2018103400A patent/JP6728272B2/ja active Active
Non-Patent Citations (1)
Title |
---|
See also references of EP2911393A4 |
Also Published As
Publication number | Publication date |
---|---|
KR101565854B1 (ko) | 2015-11-05 |
CN104769543B (zh) | 2018-10-26 |
EP2911393B1 (en) | 2018-06-06 |
JP6728272B2 (ja) | 2020-07-22 |
JP2018160260A (ja) | 2018-10-11 |
JP2016502171A (ja) | 2016-01-21 |
EP2911393A1 (en) | 2015-08-26 |
US20150241980A1 (en) | 2015-08-27 |
US10007348B2 (en) | 2018-06-26 |
EP2911393A4 (en) | 2016-06-08 |
CN104769543A (zh) | 2015-07-08 |
KR20140048827A (ko) | 2014-04-24 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2014062001A1 (ko) | Method, system and computer-readable recording medium for controlling a virtual camera within a three-dimensional virtual space | |
KR102595150B1 (ko) | Method, device, apparatus and storage medium for controlling multiple virtual characters | |
CN110533780B (zh) | Image processing method and apparatus, device, and storage medium | |
WO2015174729A1 (ko) | Augmented reality providing method and system for providing spatial information, and recording medium and file distribution system | |
KR20220119494A (ko) | Virtual object control method and apparatus, device, and computer-readable storage medium | |
WO2011031026A2 (ko) | System and method for providing a three-dimensional avatar service using a background image | |
WO2021097600A1 (zh) | Air-gesture interaction method, apparatus, and device | |
CN116134405A (zh) | Private control interfaces for extended reality | |
WO2014116056A1 (ko) | Method, system and computer-readable recording medium for generating a motion sequence of an animation | |
US11107184B2 (en) | Virtual object translation | |
CN107003804B (zh) | Method and system for providing a prototyping tool, and non-transitory computer-readable recording medium | |
CN104536562B (zh) | File transmission method based on somatosensory technology and cloud computing | |
CN105631901A (zh) | Method and apparatus for determining motion information of an object to be measured | |
KR20120010374A (ko) | Terminal providing a 3D interface by recognizing finger movement, and method therefor | |
CN109840946A (zh) | Virtual object display method and apparatus | |
Chen et al. | A case study of security and privacy threats from augmented reality (ar) | |
WO2013025011A1 (ko) | Body tracking method and system for spatial gesture recognition | |
WO2014062003A1 (ko) | Method, system and computer-readable recording medium for generating crowd animation | |
WO2011083929A2 (ko) | Method, system and computer-readable recording medium for providing information about an object using a viewing frustum | |
Jiang et al. | A SLAM-based 6DoF controller with smooth auto-calibration for virtual reality | |
Lee et al. | Tunnelslice: Freehand subspace acquisition using an egocentric tunnel for wearable augmented reality | |
Kot et al. | Application of augmented reality in mobile robot teleoperation | |
CN114053693B (zh) | Object control method and apparatus in a virtual scene, and terminal device | |
CN110489026A (zh) | Handheld input device and blanking control method and apparatus for its indicator icon | |
KR20200069009A (ko) | Virtual reality implementation device and automation system for plant electrical wiring design using the same | |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 13847729; Country of ref document: EP; Kind code of ref document: A1 |
ENP | Entry into the national phase | Ref document number: 2015538018; Country of ref document: JP; Kind code of ref document: A |
NENP | Non-entry into the national phase | Ref country code: DE |
WWE | Wipo information: entry into national phase | Ref document number: 14436085; Country of ref document: US |
WWE | Wipo information: entry into national phase | Ref document number: 2013847729; Country of ref document: EP |