US20040141014A1 - Display controlling apparatus, information terminal unit provided with display controlling apparatus, and viewpoint location controlling apparatus - Google Patents


Info

Publication number
US20040141014A1
US20040141014A1
Authority
US
United States
Prior art keywords
information
view
information object
viewpoint
distance
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US10/639,517
Other languages
English (en)
Inventor
Toru Kamiwada
Takushi Fujita
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Fujitsu Ltd
Original Assignee
Fujitsu Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Fujitsu Ltd filed Critical Fujitsu Ltd
Assigned to FUJITSU LIMITED (assignment of assignors interest; see document for details). Assignors: FUJITA, TAKUSHI; KAMIWADA, TORU
Publication of US20040141014A1
Priority claimed by US11/320,345 (US7812841B2)

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/00 3D [Three Dimensional] image rendering
    • G06T15/10 Geometric effects
    • G06T15/20 Perspective computation
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04815 Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object

Definitions

  • The present invention provides a display controlling apparatus that enables smooth viewpoint movement and intuitive object searching in a space in which a plurality of information objects have different dimensions and shapes, and that realizes a proper movement direction and a proper movement speed based on geometric information of the object.
  • The present invention also relates to an information terminal unit provided with the display controlling apparatus, and to a display controlling program that causes a computer to conduct the processes of the display controlling apparatus.
  • Japanese Laid-Open Patent Application No. 2000-172248, by the inventors of the present invention, is known as a three-dimensional display controlling apparatus that provides a comfortable environment for browsing an electronic document with a hypertext structure on a display unit.
  • In this apparatus, an electronic document group with a hypertext structure is arranged in a single virtual space based on its link structure, and a display image for browsing the electronic documents is generated based on a view defined in the virtual space and displayed on the display unit.
  • The view is changed consecutively, and the display image is consecutively generated and displayed on the display unit in real time based on the view at each point in time.
  • Thereby, the user can browse a document by following hypertext links so as to consecutively enlarge the document, while consecutively changing the view in the virtual space.
  • A variation example of a display screen is shown in FIG. 1.
  • An electronic document 602 is linked from an electronic document 601 by the hypertext structure, and an electronic document 603 is further linked from the electronic document 602.
  • A screen example 62 is displayed by zooming in from the state of a screen example 61, and a screen example 63 is displayed by zooming in further.
  • A method has already been devised in which a three-dimensional product catalog is distributed through the Internet, and the three-dimensional shape is displayed at a terminal side while being rotated or the like on a screen by the operation of the user.
  • Web3D is known as a typical method in which a three-dimensional shape, such as a product, is distributed from a server on the Internet to a user terminal in a VRML (Virtual Reality Modeling Language) form or a special form, and the three-dimensional shape is displayed on the screen at the user terminal by a program for displaying such a three-dimensional shape.
  • In this method, a display object is selected by choosing an item with a mouse on an HTML document displayed in a general WWW browser. That is, the program displays a single three-dimensional shape.
  • the program is realized as a plug-in of a browser, a Java program, or an ActiveX object.
  • The single three-dimensional shape can be displayed, rotated, and partially enlarged in a window of the browser. Therefore, before the shape is displayed, the user needs to click an option with the mouse, select a menu item, or enter a search condition in order to select the display object.
  • The first object of the present invention is achieved by a display controlling apparatus for displaying a plurality of information objects in a three-dimensional virtual space, the plurality of information objects being successively linked to each other and including an information object shown by a three-dimensional shape,
  • the display controlling apparatus including: a view determining part determining a view so as to display as if tracing a surface of a shape, based on the shape of the information object to observe, in response to a view movement instruction input by a user; and a display image generating part generating display images of the plurality of information objects linked to each other based on the view determined by the view determining part, wherein the display images are displayed on a display unit so as to display the information objects corresponding to the view movement instruction.
  • That is, display images representing the plurality of information objects linked to each other are generated based on the view, which is determined based on the shape of the information object observed by the user, and are displayed on the display unit.
  • the above-described information object may be an information object shown in the three-dimensional virtual space on the display unit by an electronic document having a hypertext structure provided through the Internet.
  • Moreover, the present invention can be arranged to include a link information managing part managing a relative location relationship and a relative scale ratio of each of the information objects in the three-dimensional virtual space as link information, wherein, while the view determining part switches the information object to observe to the information object arranged in a movement direction indicated by the view movement instruction based on the link information, the view determining part determines the view based on the shape of the switched information object, and the display image generating part selects the information objects to display from the plurality of information objects linked to each other based on the location relationship and the scale ratio, and generates the display images.
  • the information object to observe is switched to another information object arranged in a movement direction based on the link information, and also the information objects to display are selected based on the location relationship and the scale ratio of the link information.
  • Furthermore, the present invention can be arranged so that the view determining part includes: an observation point movement path calculating part calculating a movement path of an observation point based on a three-dimensional shape of the information object; and a view movement calculating part calculating the view movement based on the shape of the information object, wherein, when the information object to observe has a three-dimensional shape, the view movement calculating part calculates the view movement based on a calculation result by the observation point movement path calculating part.
  • That is, the view movement is calculated based on a calculation result of the movement of the observation point. Therefore, it is possible to conduct the view movement based on each shape of information objects having different dimensions, and to calculate the movement path of the observation point as if a curved surface were being traced.
  • the first object of the present invention can be achieved by a display controlling program for causing a computer to display the plurality of information objects in the three-dimensional virtual space, and also by a computer-readable recording medium with program code for causing a computer to display the plurality of information objects in the three-dimensional virtual space.
  • The second object of the present invention is achieved by an information terminal unit provided with the display controlling apparatus as claimed in any one of claims 1 through 4, including an instruction receiving part receiving a view movement instruction in the three-dimensional virtual space by an operation of a user in a view movement direction.
  • The third object of the present invention is achieved by a viewpoint location controlling apparatus for controlling a viewpoint location with respect to a plurality of information objects having shapes displayed in a three-dimensional virtual space,
  • the viewpoint location controlling apparatus including: a reference determining part determining the information object displayed at the nearest location from the viewpoint location as a reference information object, in accordance with an input by a user; and a speed changing part changing a movement speed of the view in accordance with the distance from the determined reference information object to the viewpoint location, wherein, based on the reference information object, the viewpoint location is controlled so as to move the viewpoint at the movement speed corresponding to that distance, and the plurality of information objects are displayed.
  • In the viewpoint location controlling apparatus, since the information object at the closest location to the viewpoint location is determined as the reference information object, it is possible to determine the information object that the viewpoint is to observe. Also, the viewpoint location is controlled by the speed changing part so that the viewpoint is moved at the movement speed corresponding to the distance from the reference information object. Accordingly, when the viewpoint location approaches the information object, the viewpoint is moved slowly; on the other hand, when the viewpoint is far from the information object, the viewpoint is moved quickly while the plurality of information objects are displayed.
  • the third object of the present invention can be achieved by a viewpoint location controlling program for causing a computer to control a viewpoint location with respect to the plurality of the information objects in the three-dimensional virtual space, and also by a computer-readable recording medium with program code for causing a computer to control a viewpoint location with respect to the plurality of the information objects in the three-dimensional virtual space.
  • FIG. 1 is a diagram showing change examples of a conventional display screen.
  • FIG. 2 is a diagram showing an example of a functional configuration of a display controlling apparatus.
  • FIG. 3A is a diagram for explaining a viewpoint movement in a case of moving in parallel with respect to a plane information object.
  • FIG. 3B is a diagram for explaining the viewpoint movement in a case of conducting a tilt operation with respect to the plane information object.
  • FIG. 4A is a diagram for explaining a viewpoint movement in a case of moving in parallel with respect to a three-dimensional information object.
  • FIG. 4B is a diagram for explaining the viewpoint movement in a case of conducting a tilt operation with respect to the solid information object.
  • FIG. 5 is a diagram for explaining a view movement to a target surface being plane.
  • FIG. 6 is a diagram for explaining the view movement to a target surface being spherical.
  • FIG. 7 is a diagram for explaining the view movement to a target surface being a free form.
  • FIG. 8 is a flowchart diagram for explaining a process of the view movement with respect to the target surface being plane.
  • FIG. 9 is a flowchart diagram for explaining a process of the view movement with respect to the target surface being the spherical surface or the free form surface.
  • FIG. 10 is a diagram showing an example of a successive link of the information objects.
  • FIG. 11 is a diagram showing an example of screen changes by the view movement.
  • FIG. 12A is a diagram showing an example of an information terminal unit provided with the display controlling apparatus.
  • FIG. 12B and FIG. 12C are diagrams showing examples of a remote controller for conducting a view movement operation.
  • FIG. 13 is a diagram showing another example of the information terminal unit provided with the display controlling apparatus.
  • FIG. 14 is a diagram showing a configuration of a three-dimensional data browsing apparatus.
  • FIG. 15 is a diagram showing a display example of the three-dimensional data browsing screen.
  • FIG. 16A through FIG. 16D are diagrams showing examples of changes of a three-dimensional data browsing screen in a case of approaching the viewpoint to the information object.
  • FIG. 17 is a diagram showing a link structure of each of the information objects.
  • FIG. 18 is a flowchart diagram for explaining a displaying process in the three-dimensional data browsing apparatus.
  • FIG. 19 is a diagram showing an example of a movement speed of the viewpoint corresponding to a viewpoint location.
  • FIG. 20 is a flowchart diagram for explaining a reference information object determining process.
  • FIG. 21 is a diagram showing a direction example of the viewpoint movement in a case in which the information object being plane is the reference information object.
  • FIG. 22 is a diagram showing a direction example of the viewpoint movement in a case in which the information object being spherical is the reference information object.
  • FIG. 23 is a diagram showing a state in which another, smaller information object having a different geometric model is positioned in front of the information object.
  • FIG. 24 is a diagram showing an example of a distance used as reference to conduct a distance process.
  • FIG. 25 is a flowchart diagram for explaining a viewpoint distance process.
  • FIG. 26 is a graph diagram showing a correspondence between the distances before and after the viewpoint distance process.
  • FIG. 27 is a diagram for explaining an example of the viewpoint distance process in a case in which two information objects having different geometric models are positioned in a viewpoint direction.
  • FIG. 28 is a diagram for explaining an example of the viewpoint distance process in a case in which one information object is viewed from another information object having a different geometric model.
  • FIG. 2 is a diagram showing an example of a functional configuration of a display controlling apparatus.
  • a display controlling apparatus 100 includes an information object data obtaining part 101 , an information object data storing part 102 , a user instruction receiving part 103 , a view determining part 104 , a display image generating part 105 , a display controlling part 106 , a plurality of view movement calculating parts 107 , a plurality of information object displaying parts 108 , an observation point movement path calculating part 109 , a communication controlling part 110 , and an installer 111 .
  • the information object data obtaining part 101 obtains information object data from Web information obtained through a network 118 , such as the Internet, by the communication controlling part 110 , and stores the obtained information object data in the information object data storing part 102 . Moreover, as link information that defines a correlation between information objects, a relative position relationship and a scale ratio within a virtual space are stored.
  • the user instruction receiving part 103 receives data showing a view movement direction indicated by a user.
  • the view determining part 104 determines a view by using a calculation result of the view movement calculating part 107 and the observation point movement path calculating part 109 .
  • Based on the view data determined by the view determining part 104, the display image generating part 105 generates display images of all the information objects displayed in the range of the view, in accordance with the shape of each information object.
  • The display controlling part 106 controls a display unit in order to display the display image generated by the display image generating part 105. That is, the display controlling part 106 generates the display image data on which an information object is displayed, based on the view data, by using the information object displaying part 108 corresponding to the shape of each information object.
  • the view movement calculating part 107 calculates a view movement amount including a movement distance, an angle change, and movement direction information, corresponding to the shape of an information object.
  • When the shape of the information object is a spherical surface or a free form surface, the view movement amount is calculated by also using the observation point movement path calculating part 109.
  • Thereby, the view movement and display suitable for each information object of various aspects, such as a planar information object, a solid information object, and the like, can be realized.
  • The communication controlling part 110 controls connection to and disconnection from the network 118, and controls sending and receiving of data.
  • the installer 111 installs a program for executing processes conducted by each processing part described above that control the display controlling apparatus 100 , from a CD-ROM 119 that is a computer-readable storage medium.
  • the installed program is executed by a CPU (Central Processing Unit) of the display controlling apparatus 100 and realizes each processing part described above.
  • a medium storing the program is not limited to the CD-ROM 119 but any computer-readable medium can be used.
  • The view movement calculating part 107 and the information object displaying part 108 may be obtained from the network 118, the CD-ROM 119, or the like, and be used by the view determining part 104 and the display image generating part 105.
  • FIG. 3A and FIG. 3B are diagrams for explaining the viewpoint movement with respect to a planar information object.
  • FIG. 4A and FIG. 4B are diagrams for explaining the viewpoint movement with respect to the solid information object.
  • a target surface 401 on the information object shows a target surface that is subject for the viewpoint to zoom in.
  • a viewpoint location 402 shows a current viewpoint location of a user.
  • An observation point 403 shows a location to be observed on the information object within a current view of the user. It should be noted that the observation point 403 is an intersection of a center line of the view and the target surface.
  • the viewpoint movement path 411 shows movement paths of the viewpoint accompanying a zoom-in operation and a zoom-out operation.
  • the viewpoint movement path 412 shows movement paths of the viewpoint accompanying a right movement operation and a left movement operation.
  • the viewpoint movement path 413 shows movement paths of the viewpoint accompanying an upward movement operation and a downward movement operation.
  • the viewpoint movement path 414 shows movement paths of the viewpoint accompanying tilt operations.
  • the viewpoint movement path 415 shows movement paths of the viewpoint accompanying rotation operations.
  • Dashed line arrows in FIG. 4A and FIG. 4B show examples of visual lines when the viewpoint in the view moves along each viewpoint movement path. Tips of the dashed line arrows show the observation points at those times.
  • In the zoom-in operation, the viewpoint infinitely approaches the observation point 403 on the target surface 401 along the movement path 411.
  • In the zoom-out operation, the viewpoint moves away from the target surface along the movement path 411.
  • In both operations, the movement speed of the viewpoint changes in proportion to the distance between the viewpoint and the target surface. Accordingly, the viewpoint appears to approach the observation point on the target surface infinitely, so as to realize a zoomed display (a rough sketch of such a step is given below).
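The patent describes this proportional-speed zoom only in prose; the following is a loose, non-authoritative sketch of a single zoom step, with the function name, its parameters, and the constants invented for illustration:

```python
import numpy as np

def zoom_step(viewpoint, observation_point, zoom_in, dt, rate=1.5):
    """One zoom step along the visual line: the speed is proportional to
    the distance between the viewpoint and the observation point on the
    target surface, so zooming in approaches the surface asymptotically
    and never crosses it (hypothetical sketch, not the patent's code)."""
    to_surface = observation_point - viewpoint
    distance = np.linalg.norm(to_surface)
    direction = to_surface / distance
    sign = 1.0 if zoom_in else -1.0
    step = sign * rate * distance * dt   # movement speed proportional to distance
    return viewpoint + direction * step
```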
  • The display controlling apparatus 100 moves the viewpoint location 402 and the observation point 403, based on the information showing the direction of the view movement from the user instruction receiving part 103 in FIG. 2, so that the visual line moves in parallel with respect to the target surface 401.
  • The display controlling apparatus 100 moves the viewpoint location 402 and the observation point 403, based on the information showing the direction of the view movement from the user instruction receiving part 103 in FIG. 2, so that the visual line traces the target surface 401. That is, the viewpoint location 402 and the observation point 403 are moved while the tilt angle between the direction of the visual line and the target surface is maintained constant.
  • In the tilt operation, the display controlling apparatus 100 moves the viewpoint, based on the information showing the direction of the view movement from the user instruction receiving part 103 in FIG. 2, along the viewpoint movement path 414, in which the distance between the viewpoint location 402 and the observation point 403 remains constant. That is, at a constant distance between the viewpoint location 402 and the observation point 403, the display controlling apparatus 100 moves the viewpoint location 402 while changing the angle between the direction of the visual line, which connects the viewpoint location 402 and the observation point 403, and the target surface 401.
  • In the rotation operation, the viewpoint location 402 is moved along the viewpoint movement path 415. That is, the display controlling apparatus 100 keeps the location of the observation point 403, and the distance and the tilt angle from the observation point 403 to the viewpoint location 402, constant, and moves the view so as to rotate about the perpendicular to the target surface at the observation point 403 (see the sketch below).
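Purely as an illustration (names and conventions are assumptions, not the patent's), both the tilt and the rotation operations can be viewed as rotating the viewpoint about an axis through the observation point:

```python
import numpy as np

def orbit_view(viewpoint, observation_point, axis, angle):
    """Rotate the viewpoint about `axis` passing through the observation
    point, keeping the viewpoint-to-observation-point distance constant.
    With `axis` lying in the target surface this sketches the tilt
    operation (path 414); with `axis` equal to the surface normal at the
    observation point it sketches the rotation operation (path 415)."""
    axis = axis / np.linalg.norm(axis)
    v = viewpoint - observation_point
    c, s = np.cos(angle), np.sin(angle)
    # Rodrigues' rotation formula
    v_rot = v * c + np.cross(axis, v) * s + axis * np.dot(axis, v) * (1.0 - c)
    return observation_point + v_rot
```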
  • When the display object is an information object having a shape other than a flat surface, such as a spherical surface, it is necessary to allocate a proper shape of the target surface beforehand. The target surface shape need not always correspond to the information object shape; for example, a spherical target surface can be allocated to an information object that is nearly globular but irregular.
  • FIG. 5 is a diagram for explaining the view movement to the target surface being plane.
  • FIG. 6 is a diagram for explaining the view movement to the target surface being spherical.
  • FIG. 7 is a diagram for explaining the view movement to the target surface being a free form.
  • In FIG. 5, FIG. 6, and FIG. 7, lx, ly, and lz denote vectors defining the x axis, y axis, and z axis, respectively, of the local coordinate system of the information object.
  • Vx, Vy, and Vz denote vectors representing x axis, y axis, and z axis, respectively, of the viewpoint coordinate system.
  • V 0 is a location of the viewpoint, that is, V 0 denotes an origin of the viewpoint coordinate system.
  • T denotes the observation point on the target surface.
  • a straight line connecting the viewpoint V 0 and the observation point T is called the visual line, and the view is defined so that the visual line becomes a centerline of the view. That is, the view is defined so that the observation point comes to a center of the screen.
  • The direction of the view is defined so that the vector Vx corresponds to the screen horizontal rightward direction and the vector Vy corresponds to the screen vertical downward direction.
  • the movement directions of the observation point corresponding to the view movement instruction by the user, which indicates right and left and up and down, respectively, are shown by lines Cu and Cv with arrows.
  • A direction of Cu is defined, based on the target surface shape, as the direction of the line of intersection of the xz plane of the viewpoint coordinate system and the target surface.
  • A direction of Cv is defined, based on the target surface shape, as the direction of the line of intersection of the yz plane of the viewpoint coordinate system and the target surface.
  • the observation point is moved along the target surface.
  • The above-mentioned operation can be realized by a method similar to that disclosed in the Japanese Laid-Open Patent Application No. 2000-172248 mentioned above. Accordingly, the view determining part 104 shown in FIG. 2 simply determines the view so as to move along the movement path Cu or Cv in parallel.
  • In the case of a curved target surface, the movement path Cu or Cv of the observation point T is determined so as to move along the curved surface while changing the direction of the viewpoint coordinate system. Moreover, at points other than the observation point T, the movement paths Cu and Cv do not necessarily correspond to the lines of intersection of the xz plane or the yz plane of the viewpoint coordinate system and the target surface. Consequently, the observation point movement path Cu or Cv is calculated for each kind of curved surface.
  • FIG. 8 is a flowchart diagram for explaining the process of the view movement with respect to the target surface being plane.
  • the user instruction receiving part 103 in FIG. 2 receives an input instruction of a user (step S 101 ), and determines whether or not the input instruction of the user is a movement instruction to move the view upward, downward, leftward, or rightward (step S 102 ).
  • When the input instruction is not such a movement instruction, the user instruction receiving part 103 conducts a process corresponding to the input instruction of the user (step S 103). Then, the process goes back to the step S 101 and waits for a next input instruction of the user.
  • When the user instruction receiving part 103 decides that the input instruction of the user is the movement instruction to move the view upward, downward, leftward, or rightward, the user instruction receiving part 103 activates the view determining part 104, and a step S 104 is executed.
  • In the step S 104, the movement direction vector t of the observation point T is obtained.
  • That is, the view determining part 104 activated by the user instruction receiving part 103 obtains a direction vector of the line of intersection of the xy plane of the viewpoint coordinate system and the target surface at the observation point T, and determines the direction vector as the movement direction vector t. It should be noted that a positive direction is chosen for the movement direction vector t so that the angle between the movement direction vector t and the vector Vx does not become obtuse.
  • In a step S 105, a movement distance d of the observation point T is calculated. That is, the view determining part 104 calculates the movement distance d by giving the obtained movement direction vector t to the view movement calculating part 107.
  • the view movement calculating part 107 calculates the movement distance d of the observation point T from the movement direction vector t given from the view determining part 104 based on data of the information object of the plane, which is obtained from the information object data storing part 102 and currently displayed on the display unit.
  • In a step S 106, the view determining part 104 moves the view in parallel in the direction of the vector t by the distance d calculated by the view movement calculating part 107.
  • In a step S 107, the display image generating part 105 generates a drawing based on the view data that are obtained from the view determining part 104 and have been moved in parallel.
  • the display controlling part 106 displays the drawing corresponding to a new view on the display unit based on drawing data generated by the display image generating part 105 .
  • In a step S 108, it is determined whether or not the process is at an end. When it is determined that the process is at an end, the process is terminated. On the other hand, when it is determined that the process is not at an end, the process goes back to the step S 101 and receives a next input instruction by the user. (A rough sketch of the parallel movement of the steps S 104 through S 106 is given below.)
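For illustration only, the left/right branch of the flow of FIG. 8 might be sketched as follows; the function and its arguments are hypothetical, and only the geometric steps come from the text above:

```python
import numpy as np

def move_view_on_plane(V0, T, Vx, Vz, plane_normal, d):
    """Sketch of the steps S 104 through S 106 for a plane target surface:
    t is the direction of the line of intersection of the xy plane of the
    viewpoint coordinate system (normal Vz) and the target plane, chosen
    so as not to be obtuse to Vx; the whole view then translates by d.
    Assumes the visual line is not exactly perpendicular to the plane."""
    n = plane_normal / np.linalg.norm(plane_normal)
    t = np.cross(Vz, n)              # intersection direction of the two planes
    t /= np.linalg.norm(t)
    if np.dot(t, Vx) < 0:            # keep the angle with Vx non-obtuse (S 104)
        t = -t
    return V0 + t * d, T + t * d     # parallel movement of the view (S 106)
```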
  • FIG. 9 is a flowchart diagram for explaining the process of the view movement with respect to target surface being the spherical surface or the free form surface as the curved surface.
  • the user instruction receiving part 103 in FIG. 2 receives the input instruction of the user (step S 121 ), and determines whether or not the input instruction is the movement instruction to go upward, downward, rightward, or leftward (step S 122 ).
  • When the input instruction is not the movement instruction, the user instruction receiving part 103 conducts a process corresponding to the input instruction of the user (step S 123), and goes back to the step S 121 to wait for the next input instruction of the user.
  • When the input instruction is the movement instruction, the user instruction receiving part 103 activates the view determining part 104, and a step S 124 is executed.
  • In the step S 124, the tangent vector t of the movement direction at the location of a current observation point T is obtained.
  • That is, the view determining part 104 activated by the user instruction receiving part 103 obtains a tangent direction vector of the line of intersection of the xy plane of the viewpoint coordinate system and the target surface at the observation point T, and sets the tangent direction vector as the movement direction vector t.
  • The movement direction vector t is selected in a positive direction so that it does not form an obtuse angle with respect to the vector Vx.
  • In a step S 125, the movement distance d of the observation point T is calculated. That is, the view determining part 104 calculates the movement distance d by giving the movement direction vector t obtained in the step S 124 to the view movement calculating part 107.
  • The view movement calculating part 107 calculates the movement distance of the observation point according to the distance between the viewpoint V 0 and the observation point T, based on the data of the information object having the curved surface, which are obtained from the information object data storing part 102 and currently displayed on the display unit.
  • In a step S 126, the view determining part 104 searches for a point, as a new observation point T, that is moved by the distance d in the positive direction t along the target surface from the current observation point T, by using the observation point movement path calculating part 109, which is prepared corresponding to the type of the curved surface, based on the data of the information object having the curved surface, which are obtained from the information object data storing part 102 and currently displayed on the display unit.
  • In a step S 127, the view determining part 104 searches for a view corresponding to the new observation point by using the view movement calculating part 107, so that the tilt angle and the rotation angle of the current view are preserved.
  • That is, a location of the viewpoint V 0 and a direction of the vector Vz are determined so that the visual line crosses the newly obtained observation point while the distance from the view to the observation point, the tilt angle, and the rotation angle are maintained.
  • The directions of the vectors Vx and Vy are determined so that the xy plane determined by the vectors Vx and Vy includes a tangent of the vector Cu at the new observation point T.
  • In a step S 128, the display image generating part 105 generates drawing data based on the view data corresponding to the curved surface shape obtained from the view determining part 104.
  • the display controlling part 106 displays a drawing corresponding to the new view on the display unit based on the drawing data generated by the display image generating part 105 .
  • In a step S 129, it is determined whether or not the process is at an end. When it is determined that the process is at an end, the process is terminated; otherwise, the process goes back to the step S 121 to receive a next input instruction of the user.
  • An observation point movement path calculating part 109 that calculates a movement path along the target surface may be provided for each shape type of target surface.
  • In that case, the view determining part 104 may select the movement path calculating part corresponding to the type of the corresponding target surface, based on the information object that is obtained from the information object data storing part 102 and serves as the reference of the view movement displayed on the display unit. (A rough sketch for a spherical target surface is given below.)
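As one concrete example of such a calculating part, a spherical target surface allows the new observation point of the step S 126 to be computed along a great circle; this sketch and its names are assumptions, not the patent's code:

```python
import numpy as np

def move_along_sphere(center, radius, T, t, d):
    """Slide the observation point T along a spherical target surface by
    the arc length d in the tangent direction t (step S 126, sketched)."""
    n = (T - center) / radius                  # outward normal at T
    t_tan = t - np.dot(t, n) * n               # project t onto the tangent plane
    t_tan /= np.linalg.norm(t_tan)
    angle = d / radius                         # arc length to central angle
    new_n = n * np.cos(angle) + t_tan * np.sin(angle)
    return center + radius * new_n             # new observation point T
```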
  • FIG. 10 is a diagram showing an example of a successive link of the information objects.
  • In FIG. 10, an information object S 201 having a spherical surface links to an information object A 202, an information object B 203, and an information object C 204, each having a plane.
  • the information object A 202 is linked to information objects X 205 and Y 206 being cubic.
  • a starting point side of each arrow indicates the information object being a link source and an ending point side of each arrow indicates the information object being a link destination.
  • the information object of the link destination in each link is arranged on a smaller scale than that of the information object of the link source near the surface of the information object of the link source.
  • the information objects A 202 , B 203 , and C 204 being plane as the information object of the link destination are arranged on a small scale near the surface of the information object S 201 having the spherical surface as the information object of the link source. Furthermore, the information object X 205 being cubic and the information object Y 206 being cubic as a link destination information object are arranged on a small scale near the surface of the information object A 202 being plane as the information object of the link source.
  • To realize this arrangement, a local coordinate system is defined for each of the information objects. The links can then be defined by defining the transformation matrices between those coordinate systems.
  • For the information object S 201 having the spherical surface, a local coordinate system whose origin is at its center is defined.
  • For the information object A 202 having the plane shape, which is linked from the information object S 201, a local coordinate system whose origin is at its center and whose xy plane corresponds to the plane shape is defined.
  • The origin comes near the sphere surface, and a direction of the z axis is arranged so as to point toward the origin of the latter local coordinate system.
  • This arrangement is defined by the transformation matrix between both coordinate systems. (A rough sketch of such link information is given below.)
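The patent defines a link only as a relative location relationship, a scale ratio, and a transformation matrix; a minimal sketch of such link data, with invented names and values, could look like this:

```python
import numpy as np

def make_link(destination, scale, offset):
    """Link a child information object near the parent's surface at a
    smaller relative scale (4x4 homogeneous matrix; names illustrative)."""
    m = np.eye(4)
    m[:3, :3] *= scale                # relative scale ratio of the child
    m[:3, 3] = offset                 # relative location near the parent's surface
    return {"destination": destination, "transform": m}

sphere_S = {"shape": "sphere", "links": []}
plane_A = {"shape": "plane", "links": []}
# A 202 is placed near the surface of S 201 at 1/20 of its scale (values invented)
sphere_S["links"].append(make_link(plane_A, scale=0.05, offset=np.array([0.0, 0.0, 1.0])))
```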
  • FIG. 11 is a diagram showing an example of the screen changes by the view movement.
  • First, the information object S 201 having the spherical surface is determined as an observation subject on a screen 21. In a state in which the view is defined at a location facing the front of the information object and viewing it from a relatively far distance, the information objects in the virtual space are projected to form images. Since the information object S 201 is spherical, the viewpoint movement process shown in FIG. 9 is selected for the viewpoint movement operation by the user.
  • the information objects A 202 , B 203 , and C 204 are clearly displayed as shown in a screen 23 .
  • the information objects to be displayed at a screen are automatically selected and displayed based on a location relationship between the view and each of the information objects.
  • the information object that has not been visible becomes opaque gradually, and appears on the screen.
  • the information objects A 202 , B 203 , and C 204 move rightward so as to turn to a backside of the information object S 201 having the spherical surface.
  • the information object C 204 turns and hides itself behind the information object S 201 having the spherical surface.
  • The information object B 203 turns behind the information object S 201 having the spherical surface, and then the information object A 202 is displayed at the front of the information object S 201 as shown in a screen 25.
  • By the view movement operation to move the view leftward, the view is moved so as to turn around the side of the information object S 201 along a path such as the viewpoint movement path 412 shown in FIG. 4A.
  • the information object A 202 becomes visible at the front.
  • When the zoom-in operation is conducted from this state, as shown in the screen 26, the information object A 202 is displayed larger, and the information object X 205 and the information object Y 206 linked from the information object A 202 appear.
  • the information object to be the observation subject is switched from the information object S 201 to the information object A 202 . Thereby, the view movement method shown in FIG. 5 is selected.
  • On a screen 28, when the view movement operation is conducted to move the view leftward, the viewpoint moves so as to turn to the left side of the information object X 205 along the viewpoint movement path 412. Then, the information object X 205 is displayed as shown in a screen 29.
  • An information terminal unit that is provided with the display controlling apparatus 100 realizing the above-described processes and with a view movement operating device enabling a user to conduct the view movement operation can be configured as shown in FIG. 12 and FIG. 13.
  • FIG. 12A, FIG. 12B, and FIG. 12C are diagrams showing examples of the information terminal unit provided with the display controlling apparatus. An example is shown in which the user instruction receiving part 103 in FIG. 2 is applied to a remote controller.
  • an information terminal unit 1000 includes a display unit 501 that displays display data on a screen 502 based on the display data sent from the display controlling part 106 in FIG. 2, a remote controller 503 that conducts the process by the user instruction receiving part 103 in FIG. 2, an information terminal main unit 505 that controls each of processing parts shown in FIG. 2 and controls the entire display unit 501 , and a CD-ROM driver 506 that installs recorded data read from a CD-ROM 119 to a storage unit of the information terminal main unit 505 .
  • the screen 502 of the display unit 501 consecutively changes with the view movement by the user operating the remote controller 503 .
  • the remote controller 503 that conducts the view movement operation is configured as shown in FIG. 12B and FIG. 12C.
  • In FIG. 12B, the remote controller 503 includes a power button 521 that turns on or off a power source, information input buttons 522 such as a ten key, and view movement buttons 523 that indicate the view movement.
  • In FIG. 12C, the remote controller 503 includes the power button 521 that turns on or off the power source, the information input buttons 522 such as the ten key, a view movement joystick 524, and a view movement button 5231 that indicates the view movement.
  • The view movement joystick 524 can realize the operations to go upward, downward, rightward, and leftward that are realized by the view movement buttons 523 shown in FIG. 12B.
  • the view movement button 5231 can realize the zoom-in, the zoom-out, the tilt, and the rotation operations.
  • FIG. 13 is a diagram showing another example of the information terminal unit provided with the display controlling apparatus.
  • In FIG. 13, the display controlling apparatus is applied to an information terminal unit of a portable type.
  • an information terminal unit 1001 includes the screen 502 displaying the display data controlled by the display controlling part 106 in FIG. 2, and the information input buttons 522 and the view movement buttons 523 that have functions equivalent to those of the information terminal unit 1000 .
  • As described above, the view movement method is automatically selected, from the plurality of view movement processes described above, corresponding to the form and contents of the information object that is the observation subject of the current view, so that the view movement process can be conducted based on the user instruction. Accordingly, even if information objects being plane and information objects being solid are mixed in a single space, the user is not required to be aware of the differences, and can browse the information objects by smoothly moving the view with a common operation.
  • Relative location relationships and scale ratios in the virtual space are stored and managed in the information object data storing part 102 , as link information that defines correlation between the information objects.
  • the information object to observe is automatically selected by the view determining part 104 .
  • In the display image generating part 105, the information objects to display in the determined view are automatically selected, and then the display images are generated. Therefore, it becomes possible to realize changes of the view in response to the view movement instruction by the user.
  • the zoom-in and the zoom-out can be infinitely repeated while sequentially tracing the information objects successively linked together.
  • A program that displays data showing a three-dimensional shape on a two-dimensional screen generally determines a viewpoint location, a visual line direction, a view angle, and the like in a virtual three-dimensional space where three-dimensional shape data are arranged as the information object, and projects and displays the information object on the two-dimensional plane based on the viewpoint location, the visual line direction, the view angle, and the like. Since the viewpoint location, the visual line direction, and the like are changed by input from the user, a user interface can be realized so that the user can browse and operate the three-dimensional information objects from various locations or directions.
  • A zooming method that displays a display object by enlarging or reducing it in the two-dimensional plane provides a uniform user interface with respect to the entire data image and the fine structure of each part by enlarging and reducing the two-dimensional data subject to display. This is an especially effective display interface when the data subject to display have a hierarchical structure.
  • In the zooming method, there are two ways to define a movement speed when a display object is moved upward, downward, rightward, or leftward.
  • One way is to define a change amount of a screen per unit time. This way determines how much time is needed, for example, when the display object displayed at a center of the screen is moved toward a side of the screen.
  • the change amount of the screen is generally constant, regardless of a ratio of enlargement and reduction by the zoom.
  • The other movement speed in the zooming method is a relative velocity with respect to the display object, and shows a movement amount per unit time with respect to the display object displayed on the screen.
  • the relative velocity becomes greater while viewing the entire image, and the relative velocity becomes smaller while viewing details.
  • The enlargement and reduction by the zoom with respect to the two-dimensional plane can be considered as movements of approaching toward and departing from the two-dimensional plane placed in the three-dimensional space.
  • In that case, the relative velocity with respect to the display object by the zoom becomes an absolute velocity in the three-dimensional space, and can be regarded as being changed corresponding to the distance between the two-dimensional plane in the three-dimensional space and the viewpoint location.
  • Conventionally, the movement speed at the viewpoint location is constant in many cases. If the movement speed is constant, then in a case in which the viewpoint location is far from the three-dimensional information object and the view movement operation is conducted broadly, the movement speed feels slower than expected, and in a case in which the viewpoint location is close to the three-dimensional information object and the view movement operation is conducted for a detailed portion, the movement speed feels faster than expected. Thus, a problem arises in that the user cannot conduct fine operations for the detailed portion.
  • This control of the movement speed corresponding to the distance can be enough for a case in which there is only one three-dimensional information object subject to display.
  • Therefore, the nearest information object to the viewpoint location is selected from the plurality of three-dimensional information objects, and the movement speed of the viewpoint is determined by the distance between the selected three-dimensional information object and the viewpoint location.
  • the nearest information object is set as a reference information object.
  • The movement speed is decelerated when the viewpoint location is getting closer to the reference information object, and is accelerated when the viewpoint location is getting farther from it. When another three-dimensional information object becomes closer to the viewpoint location than the current reference information object during the view movement, that three-dimensional information object is set as the new reference information object, and the movement speed of the view is then changed based on the distance from the viewpoint location to that three-dimensional information object.
  • Information that each of the three-dimensional information objects holds in order to calculate this distance is called the "geometric model" of the three-dimensional information object. For example, if a three-dimensional information object has information showing a plane as the geometric model, the distance with respect to the plane is calculated. If it has information showing a spherical surface as the geometric model, the distance from the viewpoint location to the center of the sphere is calculated first, and then a value obtained by deducting the radius of the sphere from that result is set as the distance (a rough sketch is given below).
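A minimal sketch of this geometric-model distance, assuming a simple dictionary layout that is not the patent's data format:

```python
import numpy as np

def distance_to_object(viewpoint, geometric_model):
    """Distance from the viewpoint location to an information object,
    using its geometric model (plane or sphere), as described above."""
    if geometric_model["type"] == "plane":
        n = geometric_model["normal"] / np.linalg.norm(geometric_model["normal"])
        # distance from the viewpoint to the plane through `point`
        return abs(np.dot(viewpoint - geometric_model["point"], n))
    if geometric_model["type"] == "sphere":
        # distance to the center of the sphere, minus its radius
        return np.linalg.norm(viewpoint - geometric_model["center"]) - geometric_model["radius"]
    raise ValueError("unknown geometric model type")
```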
  • As an example, a case of the view movement approaching toward the plane and a case of the view movement approaching toward the sphere are compared.
  • In the case of the plane, the view movement becomes a linear movement in the three-dimensional space.
  • In the case of the sphere, by contrast, the distance from the spherical surface changes as such a linear movement proceeds.
  • To move along the spherical surface at a constant distance, a curvilinear movement is required, and it is very difficult for the user to move along the spherical surface by repeating linear movements.
  • If various movement methods are provided to the user simultaneously, it becomes inconvenient for the user because the operation system becomes complicated, and still further movement methods could be required with respect to other three-dimensional information objects. It is therefore desirable for the user to be able to move the view by the same operation, in a method corresponding to the shape of each three-dimensional information object.
  • FIG. 14 is a diagram showing a configuration of the three-dimensional data browsing apparatus.
  • A three-dimensional data browsing apparatus 2000 includes a controlling part 2011 that controls the entire three-dimensional data browsing apparatus 2000, an input processing part 2114 that controls data input from an input unit 2014, a display processing part 2115 that displays data on a display unit 2015, an information object data managing part 2116 that manages information object data by an information object data DB 2016, a communicating part 2118 that controls data communication through an external network 2025, an installer 2019 that installs a three-dimensional data browsing program for realizing browsing of the three-dimensional data from a CD-ROM 2019 that is a storage medium storing the three-dimensional data browsing program, a reference information object determination processing part 2021, and a viewpoint distance processing part 2022.
  • the controlling part 2011 is a CPU (central processing unit) of the three-dimensional data browsing apparatus 2000 , and controls the entire apparatus 2000 .
  • the input unit 2014 includes the remote controller 503 shown in FIG. 12B or FIG. 12C, and controls to input data according to the operations of the user.
  • The reference information object determination processing part 2021 determines the information object used as the reference information object by comparing the distances between the viewpoint location and the plurality of information objects. Moreover, the reference information object determination processing part 2021 can determine the movement direction and the movement speed of the viewpoint upward, downward, rightward, or leftward, based on the geometric model information of the determined reference information object that is managed in the information object data managing part 2116 by the information object data DB 2016.
  • the viewpoint distance processing part 2022 processes the distance from the viewpoint location based on the local coordinate system shown using the geometric model information for each information object that is managed in the information object data managing part 2116 by the information object data DB 2016 . Thereby, the reference information object determination processing part 2021 can compare the distances from the viewpoint location to the information objects having different geometric model information, and can properly determine the reference information object.
  • FIG. 15 is a diagram showing a display example of the three-dimensional data browsing screen.
  • In FIG. 15, the three-dimensional data browsing screen, which is displayed on the display unit 2015 by the three-dimensional data browsing program installed by the installer 2019, is illustrated.
  • The three-dimensional data browsing screen 2030 is a display example in which the information objects 2031 through 2035 arranged in the virtual three-dimensional space are displayed on the display unit 2015 by being projected onto the two-dimensional plane based on information concerning the viewpoint set in the three-dimensional space.
  • The viewpoint can be moved in various directions by an instruction input by the user.
  • Aspects of changes of the three-dimensional data browsing screen 2030 are shown in FIG. 16A through FIG. 16D.
  • FIG. 16A through FIG. 16D are diagrams showing examples of changes of the three-dimensional data browsing screen in the case of approaching the viewpoint to the information object.
  • From the state of FIG. 16A, which shows the entire information objects 2031 through 2035 on the three-dimensional data browsing screen 2030, the viewpoint approaches along the information object 2034 being the sphere.
  • Then, the information object 2034 being the sphere is enlarged so that information 2036 becomes visible, and the information objects 2031 through 2033 change so as to disappear out of the three-dimensional data browsing screen 2030.
  • When the viewpoint focuses on and approaches the information 2036 of the information object 2034 being the sphere, the information 2036 is displayed at the center of the three-dimensional data browsing screen 2030 and is enlarged together with the information object 2034 being the sphere; then, the information 2036 and the information object 2034 being the sphere are displayed over the entire screen as shown in FIG. 16C.
  • When the viewpoint approaches the information 2036 still further, the information 2036 is enlarged as shown in FIG. 16D, so that the user can see what the information 2036 looks like.
  • Data of each information object subject to display are stored in the information object data DB 2016 by the information object data managing part 2116 beforehand, or are stored in the information object data DB 2016 by being obtained through the external network 2025 connected to the three-dimensional data browsing apparatus 2000 when necessary.
  • the information belonging to each of the information objects 2031 through 2035 includes a link structure as shown in FIG. 17.
  • FIG. 17 is a diagram showing the link structure of each of the information objects.
  • The link structure 2026 is managed by the information object data DB 2016 in FIG. 14, and includes three-dimensional shape data 2201 that specify a three-dimensional shape, geometric model information 2202 that shows the geometric information required to calculate the distance from the viewpoint location to the information object, and link information 2210 that shows information concerning the information object to be linked.
  • The link information 2210 includes a link destination information object name 2211 that shows an information object name used as a link destination, and a coordinate transformation matrix 2212 that transforms from the local coordinate system of the link source information object to the local coordinate system of the link destination information object.
  • the local coordinate system can be determined by a size of the information object and the geometric model information 2202 showing a geometric shape.
  • In the geometric model information 2202, a direction and endpoints of a plane are recorded if the information object is the plane, and a location of the center and a radius length of a sphere are recorded if the information object is the sphere.
  • the geometric model information 2202 is referred to when the viewpoint is moved by operating the input unit 2014 by the user.
  • each of the information objects subject to display in the three-dimensional data browsing apparatus 2000 includes the local coordinate system.
  • In the local coordinate system, vertexes, lines, and surfaces are defined so as to define a geometric three-dimensional shape.
  • Each of the information objects can include other information objects by the link structure 2026, by which all the information objects form a tree structure.
  • The link destination information object is called a child information object of the link source information object, and the link source information object is called a parent information object with respect to the link destination information object. All the information objects except for the information object positioned at the origin of this tree have exactly one parent information object.
  • The parent information object holds geometric relationships with the child, such as a location and a size in the three-dimensional space, in the form of the coordinate transformation matrix for transforming from the local coordinates of the parent information object to the local coordinates of the child information object.
  • Data of the child information object define a three-dimensional model in the local coordinate system of the child information object itself, and include link information to further link other child information objects. (A rough sketch of composing these matrices is given below.)
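Since the coordinate transformation matrix 2212 maps the parent's local coordinates to the child's, placing every object in the root's coordinates amounts to composing inverses down the tree; the helper below is a hypothetical sketch with assumed field names:

```python
import numpy as np

def place_in_root_coordinates(obj, to_root=np.eye(4), placed=None):
    """Walk the tree of FIG. 17 and pair every information object with the
    matrix that expresses its local coordinates in the root's frame."""
    if placed is None:
        placed = []
    placed.append((obj, to_root))
    for link in obj["links"]:                             # link information 2210
        # matrix 2212 maps parent-local to child-local, so invert it
        child_to_root = to_root @ np.linalg.inv(link["transform"])
        place_in_root_coordinates(link["destination"], child_to_root, placed)
    return placed
```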
  • a displaying process in the three-dimensional data browsing apparatus 2000 will be described with reference to FIG. 16A through FIG. 16D.
  • FIG. 18 is a flowchart diagram for explaining the displaying process in the three-dimensional data browsing apparatus.
  • the controlling part 2011 conducts a process P 1000 for moving the viewpoint location.
  • This process P 1000 corresponds to the processes conducted by the view movement calculating part 107 and the observation point movement path calculating part 109, and calculates the movement distance corresponding to the shape of the information object.
  • the controlling part 2011 conducts a process P 2000 for drawing the display data on a screen.
  • This process P 2000 determines the information object as the reference information object based on the movement distance of the viewpoint calculated in the process P 1000 , and draws on the display unit 2015 .
  • in step S2303, it is determined whether or not the viewpoint movement is continuing.
  • when the viewpoint movement is continuing, the displaying process goes back to the process P1000 and the same process is conducted.
  • when the viewpoint movement is not continuing, it is checked whether or not a user input is received (step S2305).
  • when no user input is received, the displaying process goes back to the step S2304, and conducts the same process again.
  • in step S2306, it is determined whether or not the user input is an end command.
  • when the user input is the end command, the displaying process in the three-dimensional data browsing apparatus 2000 is terminated.
  • otherwise, the displaying process goes back to the process P2000, and the same process is conducted; a sketch of this control flow follows.
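A minimal Python sketch of this control flow, assuming a hypothetical apparatus object app whose method names are illustrative stand-ins for the processes and steps named above:

    def displaying_process(app):
        while True:
            app.process_p1000()                          # P1000: move the viewpoint location
            while True:
                app.process_p2000()                      # P2000: pick reference object, draw
                if app.viewpoint_movement_continuing():  # step S2303
                    break                                # back to the process P1000
                while not app.user_input_received():     # steps S2304/S2305: wait for input
                    app.idle()
                if app.input_is_end_command():           # step S2306
                    return                               # displaying process terminated
                # any other input: back to the process P2000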
  • the process P2000 determines the information object 2034 being the sphere as the reference information object. Moreover, in this case, four sets of link information 2210 are obtained by referring to the link structure 2026 of the information object 2034, and based on the coordinate transformation matrix 2212, four information objects 2036 are displayed on the information object 2034 on the three-dimensional data browsing screen 2030 (FIG. 16B). After that, with the information object 2034 set as the reference information object, the four child information objects linked from it can be visually recognized, as shown on the three-dimensional data browsing screen 2030 in FIG. 16C and FIG. 16D.
  • the display data is changed corresponding to the distance from the viewpoint to the nearest information object in the three-dimensional space.
  • the movement speed of the viewpoint is proportional to the distance from the nearest information object.
  • FIG. 19 is a diagram showing an example of the movement speed of the viewpoint corresponding to the viewpoint location.
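As a sketch of this speed rule; the proportionality constant k is an assumption for illustration, not a value from the specification:

    def viewpoint_speed(distance_to_nearest, k=0.5):
        # Far from every information object: move fast; close to one: move slowly.
        return k * distance_to_nearest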
  • a process for obtaining the nearest information object is conducted in accordance with a flowchart shown in FIG. 20.
  • the reference information object determination processing part 2021 in FIG. 14 obtains the distances from the viewpoint location to all information objects, and executes a reference information object determining process that determines the information object having the smallest distance as the reference information object.
  • FIG. 20 is a flowchart diagram for explaining the reference information object determining process.
  • the reference information object determining process first sets the value d to infinity (step S2311). It is then determined whether or not there are any information objects that have not been checked (step S2312). When all the information objects have been checked, the reference information object determining process is terminated.
  • otherwise, information of an unchecked information object n is retrieved from the information object data DB 2016 (step S2313).
  • the distance dn from the viewpoint location to the information object n is set based on the information of the information object n retrieved from the information object data DB 2016 (step S2314), and it is determined whether or not the distance dn is less than the value d (step S2315).
  • when the distance dn is greater than or equal to the value d, the reference information object determining process goes back to the step S2312 and the same process is conducted.
  • when the distance dn is less than the value d, the value d is updated to the distance dn, and the information object n is set as the reference information object (step S2317).
  • the reference information object determining process then goes back to the step S2312 and the same process is conducted. A sketch of the whole process follows.
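In outline this is a minimum search over the objects. A Python sketch, assuming each object can report its (processed) distance to the viewpoint through a hypothetical distance_to method:

    import math

    def determine_reference_object(objects, viewpoint):
        d = math.inf                         # step S2311
        reference = None
        for n in objects:                    # steps S2312-S2313: iterate over the objects
            dn = n.distance_to(viewpoint)    # step S2314
            if dn < d:                       # step S2315
                d = dn                       # keep the smallest distance found so far
                reference = n                # step S2317: n becomes the reference object
        return reference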
  • the movement speed is changed based on the distance obtained by the process conducted by the reference information object determination processing part 2021 .
  • if the movement speed used between the child information objects on the screen in FIG. 16D were applied to all movements, the movement between the information objects on the screen in FIG. 16A would take too much time.
  • therefore, the movement speed is changed corresponding to the distance from the nearest information object (the sphere) while the screen state in FIG. 16A changes to the screen state in FIG. 16D. Accordingly, it is possible to realize the proper movement speed corresponding to each screen state.
  • This nearest information object is the “reference information object”.
  • the direction of the viewpoint movement follows the shape of the reference information object. When the viewpoint location becomes closer to another information object than to the current reference information object, that information object is newly used as the reference information object.
  • the information objects subject to browsing in the three-dimensional data browsing apparatus 2000 take various shapes in the three-dimensional space.
  • FIG. 21 is a diagram showing a direction example of the viewpoint movement in a case in which the information object being plane is the reference information object.
  • FIG. 22 is a diagram showing a direction example of the viewpoint movement in a case in which the spherical information object is the reference information object.
  • for these movements, the geometric model information 2202 that each information object holds in its link structure 2026 in FIG. 17 is used.
  • the direction and the endpoints of the plane are recorded if the information object is a plane, and the location of the center and the radius of the sphere are recorded if the information object is a sphere.
  • in addition, the movement method for the viewpoint corresponding to the shape recorded in the geometric model information 2202 is defined.
  • the information recorded in the geometric model information 2202 does not need to match the shape of the three-dimensional information object displayed on the screen; it is used only for the movement of the viewpoint.
  • when the geometric model information 2202 properly defines each of the information objects, it is possible to realize a comfortable browsing screen for various shapes by the same input operation. For example, whether the viewpoint is moved among the child information objects aligned on the surface of an information object having a planar geometric model, or among the child information objects aligned on the surface of a spherical geometric model, the child information objects can be sequentially displayed by the same operation; a sketch of both movement rules follows.
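A hedged sketch of the two movement rules named above. The vector math assumes numpy arrays; the function names and the move-then-reproject strategy for the sphere are illustrative assumptions, not the specification's own method:

    import numpy as np

    def move_along_plane(viewpoint, move, plane_normal):
        # Keep only the movement component parallel to the plane.
        n = plane_normal / np.linalg.norm(plane_normal)
        return viewpoint + (move - np.dot(move, n) * n)

    def move_along_sphere(viewpoint, move, center):
        # Move, then re-project so the distance to the center is preserved.
        radius = np.linalg.norm(viewpoint - center)
        moved = viewpoint + move
        direction = (moved - center) / np.linalg.norm(moved - center)
        return center + radius * direction

With either rule, the same drag input slides the viewpoint over the recorded geometric model, which is what allows the child information objects to be browsed by the same operation.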
  • a problem arises, however, when the information object A is the object to be displayed large while the viewpoint is positioned at the viewpoint location 2,
  • yet the viewpoint ends up moving along the spherical surface according to the geometric model of the information object B.
  • in that case, the information object subject to view does not correspond to the reference information object managing the viewpoint movement. As a result, the operation becomes difficult for the user.
  • to address this, the three-dimensional data browsing apparatus 2000 processes the value obtained as the distance in the three-dimensional space and then uses the processed value.
  • specifically, a reference distance for each of three stages is used, as shown in FIG. 24.
  • FIG. 24 is a diagram showing an example of the distance being the reference to conduct a viewpoint distance process.
  • oblique lines show the reference information object.
  • the distance from the reference information object to the viewpoint location is divided into three ranges: a range up to a distance dA, a range from the distance dA to a distance dB, and a range from the distance dB to a distance dC.
  • the viewpoint distance processing part 2022 in FIG. 14 is activated, and the viewpoint distance process for processing the distance to the viewpoint location is executed. That is, three reference distances dA, dB, and dC are provided in order of proximity from the surface of the geometric model, and the process method is changed depending on the range in which the viewpoint is positioned.
  • FIG. 25 is a flowchart diagram for explaining the viewpoint distance process.
  • a value d denotes a distance before the viewpoint distance process and a value d′ denotes a distance after the viewpoint distance process.
  • the value d′ after the viewpoint distance process is set as a value dn in the step S 2314 in FIG. 20.
  • the viewpoint distance processing part 2022 obtains the distance to the viewpoint location and sets it as the value d, called the distance d hereinafter (step S2321). It is determined whether or not the distance d to the viewpoint location is shorter than the distance dA (step S2322). When the distance d is shorter than the distance dA, the distance dA is set as the distance d′ (step S2322), and the viewpoint distance process is terminated.
  • that is, the distance d′ is held constant at the distance dA as the process result and is not changed by the distance d. This is a process for guaranteeing a minimum speed: it makes it possible to avoid the movement speed becoming extremely slow depending on the positional relationship between the viewpoint and each of the information objects other than the reference information object.
  • when the distance d to the viewpoint location is greater than or equal to the distance dA,
  • it is determined whether or not the distance d to the viewpoint location is shorter than the distance dB (step S2323).
  • when it is, the distance d to the viewpoint location is set as the distance d′ without change (step S2324), and the viewpoint distance process is terminated. That is, within the range where the distance d is at least the distance dA and shorter than the distance dB, the distance d is passed through unprocessed as the distance d′. Within this range, the above-described process for changing the movement speed is conducted.
  • when the distance d is greater than or equal to the distance dB, it is further determined in step S2325 whether or not the distance d to the viewpoint is shorter than the distance dC.
  • when the distance d is greater than or equal to the distance dC, infinity is set as the distance d′ (step S2327), and the viewpoint distance process is terminated.
  • when the distance d is shorter than the distance dC, the distance d′ is calculated by the following expression (step S2326).
  • d′ = −(dB − dC)² / (d − dC) + 2·dB − dC
  • a graph showing the relationship between the distance d and the distance d′ is shown in FIG. 26. As seen from the graph, the distance d′ after the viewpoint distance process is a continuous function of the distance d before the process. Accordingly, the movement speed does not change abruptly; a sketch of the whole process follows.
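Putting the three ranges together, a Python sketch of the viewpoint distance process, assuming reference distances dA < dB < dC measured from the surface of the geometric model:

    import math

    def process_viewpoint_distance(d, dA, dB, dC):
        if d < dA:
            return dA          # steps S2321-S2322: clamp to guarantee a minimum speed
        if d < dB:
            return d           # steps S2323-S2324: pass the distance through unchanged
        if d < dC:
            # steps S2325-S2326: grow toward infinity as d approaches dC
            return -(dB - dC) ** 2 / (d - dC) + 2 * dB - dC
        return math.inf        # step S2327: too far; excluded as a reference candidate

At d = dB the third branch evaluates to dB, so d′ is continuous across the range boundaries, and it increases without bound as d approaches dC, matching the graph of FIG. 26.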
  • the viewpoint distance process is conducted in the local coordinate system of each of the information objects.
  • the distance d′ obtained in the local coordinate system of each of the information objects is converted into a value in the local coordinate system of the current reference information object.
  • the values converted from the distances d′ are compared with each other, and
  • the information object having the smallest value is set as the reference information object.
  • for this conversion, the value is inversely transformed based on the coordinate transformation matrix 2212 stored in the link information 2210 of the link structure 2026; a sketch follows.
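A sketch of this comparison, under the assumption that each candidate's transform into the current reference frame applies a uniform scale, estimated here from the determinant of its 3x3 linear block; all names are illustrative:

    import numpy as np

    def distance_in_reference_frame(d_prime, transform_to_reference):
        # Rescale a locally processed distance d' into the reference object's frame.
        linear = np.asarray(transform_to_reference)[:3, :3]
        scale = abs(np.linalg.det(linear)) ** (1.0 / 3.0)
        return d_prime * scale

    def pick_reference(candidates):
        # candidates: list of (information_object, d_prime, transform_to_reference)
        return min(candidates,
                   key=lambda c: distance_in_reference_frame(c[1], c[2]))[0]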
  • FIG. 27 is a diagram for explaining an example of the viewpoint distance process in a case in which two information objects having different geometric models position in a viewpoint direction.
  • the range where the distance d is shorter than the distance dC in the local coordinate system of the information object A in FIG. 23 is shown as a range A in FIG. 27, and the range where the distance d is shorter than the distance dC in the local coordinate system of the information object B in FIG. 23 is shown as a range B in FIG. 27.
  • when the viewpoint is positioned at a viewpoint location 1, the information object B is located at the closest location. Accordingly, in this case, the information object B is set as the reference information object.
  • however, when the viewpoint is positioned at a viewpoint location 2,
  • the information object B is closer to the viewpoint location 2 in terms of the simply measured distance.
  • but because that distance is more than the distance dC in the local coordinate system of the information object B,
  • the distance to the viewpoint becomes infinity after the viewpoint distance process. Since the distance to the viewpoint is shorter than the distance dC in the local coordinate system of the information object A, the distance with respect to the information object A becomes shorter than that with respect to the information object B after the viewpoint distance process. As a result, the information object A is set as the reference information object. Therefore, the above-described problem can be eliminated.
  • next, suppose a transparent portion is in the information object A,
  • and consider a state in which the information object B, located farther than the information object A, is viewed through that transparent portion.
  • FIG. 28 is a diagram for explaining an example of the viewpoint distance process in a case in which an information object is viewed from the information object having a different geometric model.
  • in this state, since the information object A is the reference information object, the view movement method for moving the view along the plane is applied.
  • here, the range A is the range in which the distance is less than the distance dA in the local coordinate system of the information object A, and the range B is the range in which the distance is less than the distance dB in the local coordinate system of the information object B. The viewpoint distance process described above and shown in FIG. 25 is conducted so that any distance to the information object B closer than the distance dA is held constant. As a result, the distance with respect to the information object B after the viewpoint distance process becomes smaller than that with respect to the information object A, and the information object B becomes the reference information object at the viewpoint location 2.
  • in this manner, a certain range is provided both for the case in which the viewpoint is positioned closer than a predetermined distance and for the case in which it is positioned farther than a predetermined distance. Consequently, based on the viewpoint location, it is possible to properly select the reference information object.
  • as described above, in the three-dimensional data browsing apparatus 2000, it is possible to obtain the distance from the surface of the information object to the viewpoint location along the shape of the information object, based on the geometric model information.
  • the reference information object determination processing part 2021 determines, as the reference information object, the information object whose distance to the viewpoint location is the smallest among the distances determined in the local coordinate systems of the plurality of information objects, respectively, taking account of the predetermined ranges from the information objects to the viewpoint. Therefore, whether the viewpoint location is positioned farther from or closer to the information objects, the reference information object can be properly determined, so that the viewpoint movement can be conducted at a smooth speed.
  • a process by the view determining part 104 shown in FIG. 2 corresponds to a view determining part, and
  • a process by the display image generating part 105 shown in FIG. 2 corresponds to a display image generating part.
  • a process by the reference information object determination processing part 2021 shown in FIG. 14 corresponds to a reference determining part and a speed changing part, and
  • a process by the viewpoint distance processing part 2022 shown in FIG. 14 corresponds to a distance processing part.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Computing Systems (AREA)
  • Geometry (AREA)
  • Computer Graphics (AREA)
  • Human Computer Interaction (AREA)
  • Processing Or Creating Images (AREA)
  • User Interface Of Digital Computer (AREA)
US10/639,517 2001-02-23 2003-08-13 Display controlling apparatus, information terminal unit provided with display controlling apparatus, and viewpoint location controlling apparatus Abandoned US20040141014A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US11/320,345 US7812841B2 (en) 2001-02-23 2005-12-29 Display controlling apparatus, information terminal unit provided with display controlling apparatus, and viewpoint location controlling apparatus

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
JP2001-048770 2001-02-23
JP2001048770 2001-02-23
PCT/JP2001/004784 WO2002069276A1 (fr) 2001-02-23 2001-06-06 Dispositif de commande d'affichage, dispositif terminal d'information equipe de ce dispositif de commande d'affichage, et dispositif de commande de position de point de vue

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2001/004784 Continuation WO2002069276A1 (fr) 2001-02-23 2001-06-06 Dispositif de commande d'affichage, dispositif terminal d'information equipe de ce dispositif de commande d'affichage, et dispositif de commande de position de point de vue

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US11/320,345 Division US7812841B2 (en) 2001-02-23 2005-12-29 Display controlling apparatus, information terminal unit provided with display controlling apparatus, and viewpoint location controlling apparatus

Publications (1)

Publication Number Publication Date
US20040141014A1 true US20040141014A1 (en) 2004-07-22

Family

ID=18909980

Family Applications (2)

Application Number Title Priority Date Filing Date
US10/639,517 Abandoned US20040141014A1 (en) 2001-02-23 2003-08-13 Display controlling apparatus, information terminal unit provided with display controlling apparatus, and viewpoint location controlling apparatus
US11/320,345 Expired - Fee Related US7812841B2 (en) 2001-02-23 2005-12-29 Display controlling apparatus, information terminal unit provided with display controlling apparatus, and viewpoint location controlling apparatus

Family Applications After (1)

Application Number Title Priority Date Filing Date
US11/320,345 Expired - Fee Related US7812841B2 (en) 2001-02-23 2005-12-29 Display controlling apparatus, information terminal unit provided with display controlling apparatus, and viewpoint location controlling apparatus

Country Status (4)

Country Link
US (2) US20040141014A1 (ja)
EP (1) EP1363246A4 (ja)
JP (1) JP4077321B2 (ja)
WO (1) WO2002069276A1 (ja)

Cited By (39)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050110789A1 (en) * 2003-11-20 2005-05-26 Microsoft Corporation Dynamic 2D imposters of 3D graphic objects
US7194469B1 (en) * 2002-09-24 2007-03-20 Adobe Systems Incorporated Managing links in a collection of documents
US20070257915A1 (en) * 2006-05-08 2007-11-08 Ken Kutaragi User Interface Device, User Interface Method and Information Storage Medium
US20070257914A1 (en) * 2004-03-31 2007-11-08 Hidenori Komatsumoto Image Processing Device, Image Processing Method, And Information Storage Medium
US7576756B1 (en) * 2002-02-21 2009-08-18 Xerox Corporation System and method for interaction of graphical objects on a computer controlled system
US20110074918A1 (en) * 2009-09-30 2011-03-31 Rovi Technologies Corporation Systems and methods for generating a three-dimensional media guidance application
US20120110501A1 (en) * 2010-11-03 2012-05-03 Samsung Electronics Co. Ltd. Mobile terminal and screen change control method based on input signals for the same
US20120174038A1 (en) * 2011-01-05 2012-07-05 Disney Enterprises, Inc. System and method enabling content navigation and selection using an interactive virtual sphere
US20120311056A1 (en) * 2010-03-31 2012-12-06 Rakuten, Inc. Information processing device, information processing method, information processing program, and storage medium
US20130027393A1 (en) * 2011-07-28 2013-01-31 Tatsuo Fujiwara Information processing apparatus, information processing method, and program
US20130054319A1 (en) * 2011-08-29 2013-02-28 United Video Properties, Inc. Methods and systems for presenting a three-dimensional media guidance application
US8411092B2 (en) 2010-06-14 2013-04-02 Nintendo Co., Ltd. 2D imposters for simplifying processing of plural animation objects in computer graphics generation
US8797360B2 (en) 2008-04-18 2014-08-05 Sony Corporation Image display device, method for controlling image display device, and information storage medium
US20160132991A1 (en) * 2013-07-08 2016-05-12 Seiichiro FUKUSHI Display control apparatus and computer-readable recording medium
US20160247313A1 (en) * 2013-11-27 2016-08-25 Google Inc. Methods and Systems for Viewing a Three-Dimensional (3D) Virtual Object
US20170148222A1 (en) * 2014-10-31 2017-05-25 Fyusion, Inc. Real-time mobile device capture and generation of art-styled ar/vr content
US20170148223A1 (en) * 2014-10-31 2017-05-25 Fyusion, Inc. Real-time mobile device capture and generation of ar/vr content
US10430995B2 (en) 2014-10-31 2019-10-01 Fyusion, Inc. System and method for infinite synthetic image generation from multi-directional structured image array
US10490062B2 (en) * 2015-11-24 2019-11-26 HELLA GmbH & Co. KGaA Remote control for automotive applications
US10540773B2 (en) 2014-10-31 2020-01-21 Fyusion, Inc. System and method for infinite smoothing of image sequences
US10719732B2 (en) 2015-07-15 2020-07-21 Fyusion, Inc. Artificially rendering images using interpolation of tracked control points
US10726593B2 (en) 2015-09-22 2020-07-28 Fyusion, Inc. Artificially rendering images using viewpoint interpolation and extrapolation
US10754524B2 (en) * 2017-11-27 2020-08-25 International Business Machines Corporation Resizing of images with respect to a single point of convergence or divergence during zooming operations in a user interface
US20200304380A1 (en) * 2017-09-07 2020-09-24 Spherica Systems Limited System and Methods Utilizing Dataset Management User Interface
US10818029B2 (en) 2014-10-31 2020-10-27 Fyusion, Inc. Multi-directional structured image array capture on a 2D graph
US10852902B2 (en) 2015-07-15 2020-12-01 Fyusion, Inc. Automatic tagging of objects on a multi-view interactive digital media representation of a dynamic entity
US10891780B2 (en) 2013-11-27 2021-01-12 Google Llc Methods and systems for viewing a three-dimensional (3D) virtual object
US11189057B2 (en) * 2017-11-20 2021-11-30 Nokia Technologies Oy Provision of virtual reality content
US11195314B2 (en) 2015-07-15 2021-12-07 Fyusion, Inc. Artificially rendering images using viewpoint interpolation and extrapolation
US11202017B2 (en) 2016-10-06 2021-12-14 Fyusion, Inc. Live style transfer on a mobile device
US11435869B2 (en) 2015-07-15 2022-09-06 Fyusion, Inc. Virtual reality environment based manipulation of multi-layered multi-view interactive digital media representations
US11488380B2 (en) 2018-04-26 2022-11-01 Fyusion, Inc. Method and apparatus for 3-D auto tagging
US11632533B2 (en) 2015-07-15 2023-04-18 Fyusion, Inc. System and method for generating combined embedded multi-view interactive digital media representations
US11636637B2 (en) 2015-07-15 2023-04-25 Fyusion, Inc. Artificially rendering images using viewpoint interpolation and extrapolation
US11776229B2 (en) 2017-06-26 2023-10-03 Fyusion, Inc. Modification of multi-view interactive digital media representation
US11783864B2 (en) 2015-09-22 2023-10-10 Fyusion, Inc. Integration of audio into a multi-view interactive digital media representation
US11876948B2 (en) 2017-05-22 2024-01-16 Fyusion, Inc. Snapshots at predefined intervals or angles
US11956412B2 (en) 2015-07-15 2024-04-09 Fyusion, Inc. Drone based capture of multi-view interactive digital media
US11960533B2 (en) 2017-01-18 2024-04-16 Fyusion, Inc. Visual search using multi-view interactive digital media representations

Families Citing this family (24)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8416266B2 (en) * 2001-05-03 2013-04-09 Noregin Assetts N.V., L.L.C. Interacting with detail-in-context presentations
GB0207373D0 (en) * 2002-03-28 2002-05-08 Superscape Ltd Item display
EP1904952A2 (en) * 2005-05-23 2008-04-02 Nextcode Corporation Efficient finder patterns and methods for application to 2d machine vision problems
GB0605587D0 (en) 2006-03-20 2006-04-26 British Broadcasting Corp Graphical user interface methods and systems
US20100007636A1 (en) * 2006-10-02 2010-01-14 Pioneer Corporation Image display device
US7996787B2 (en) * 2007-02-06 2011-08-09 Cptn Holdings Llc Plug-in architecture for window management and desktop compositing effects
US8384718B2 (en) * 2008-01-10 2013-02-26 Sony Corporation System and method for navigating a 3D graphical user interface
EP2238521B1 (de) * 2008-01-31 2012-08-01 Siemens Aktiengesellschaft Verfahren und vorrichtung zur visualisierung einer automatisierungstechnischen anlage mit einem werkstück
JP4384697B2 (ja) * 2008-03-26 2009-12-16 株式会社コナミデジタルエンタテインメント ゲーム装置、ゲーム処理方法、ならびに、プログラム
US20090259976A1 (en) * 2008-04-14 2009-10-15 Google Inc. Swoop Navigation
JP5174522B2 (ja) * 2008-04-18 2013-04-03 株式会社ソニー・コンピュータエンタテインメント 画像表示装置、画像表示装置の制御方法及びプログラム
JP4958835B2 (ja) * 2008-04-18 2012-06-20 株式会社ソニー・コンピュータエンタテインメント 画像表示装置、画像表示装置の制御方法及びプログラム
CA2734987A1 (en) * 2008-08-22 2010-02-25 Google Inc. Navigation in a three dimensional environment on a mobile device
JP5407294B2 (ja) * 2008-11-20 2014-02-05 ソニー株式会社 無線通信装置および無線通信方法
WO2010064236A1 (en) * 2008-12-01 2010-06-10 Visual Domains Ltd. Method and system for browsing visual content displayed in a virtual three-dimensional space
JP5175794B2 (ja) * 2009-04-17 2013-04-03 株式会社プロフィールド 情報処理装置、情報処理方法、およびプログラム
TWI493500B (zh) * 2009-06-18 2015-07-21 Mstar Semiconductor Inc 使二維影像呈現出三維效果之影像處理方法及相關影像處理裝置
US9128612B2 (en) 2009-08-19 2015-09-08 Siemens Aktiengesellschaft Continuous determination of a perspective
JP5134652B2 (ja) * 2010-07-07 2013-01-30 株式会社ソニー・コンピュータエンタテインメント 画像表示装置および画像表示方法
US8626434B1 (en) 2012-03-01 2014-01-07 Google Inc. Automatic adjustment of a camera view for a three-dimensional navigation system
US20140028674A1 (en) * 2012-07-24 2014-01-30 Ahmed Medo Eldin System and methods for three-dimensional representation, viewing, and sharing of digital content
KR102332752B1 (ko) * 2014-11-24 2021-11-30 삼성전자주식회사 지도 서비스를 제공하는 전자 장치 및 방법
JPWO2021124920A1 (ja) * 2019-12-19 2021-06-24
US11688126B2 (en) * 2021-02-09 2023-06-27 Canon Medical Systems Corporation Image rendering apparatus and method

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5678015A (en) * 1995-09-01 1997-10-14 Silicon Graphics, Inc. Four-dimensional graphical user interface
US5745109A (en) * 1996-04-30 1998-04-28 Sony Corporation Menu display interface with miniature windows corresponding to each page
US5745126A (en) * 1995-03-31 1998-04-28 The Regents Of The University Of California Machine synthesis of a virtual video camera/image of a scene from multiple video cameras/images of the scene in accordance with a particular perspective on the scene, an object in the scene, or an event in the scene
US5898435A (en) * 1995-10-02 1999-04-27 Sony Corporation Image controlling device and image controlling method
US6167142A (en) * 1997-12-18 2000-12-26 Fujitsu Limited Object movement simulation apparatus
US6281877B1 (en) * 1996-03-29 2001-08-28 British Telecommunications Plc Control interface
US6597358B2 (en) * 1998-08-26 2003-07-22 Intel Corporation Method and apparatus for presenting two and three-dimensional computer applications within a 3D meta-visualization
US6710788B1 (en) * 1996-12-03 2004-03-23 Texas Instruments Incorporated Graphical user interface
US6774914B1 (en) * 1999-01-15 2004-08-10 Z.A. Production Navigation method in 3D computer-generated pictures by hyper 3D navigator 3D image manipulation
US7137075B2 (en) * 1998-08-24 2006-11-14 Hitachi, Ltd. Method of displaying, a method of processing, an apparatus for processing, and a system for processing multimedia information

Family Cites Families (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5276785A (en) * 1990-08-02 1994-01-04 Xerox Corporation Moving viewpoint with respect to a target in a three-dimensional workspace
US5359703A (en) * 1990-08-02 1994-10-25 Xerox Corporation Moving an object in a three-dimensional workspace
US5333254A (en) * 1991-10-02 1994-07-26 Xerox Corporation Methods of centering nodes in a hierarchical display
JPH09153146A (ja) * 1995-09-28 1997-06-10 Toshiba Corp 仮想空間表示方法
JPH0991109A (ja) * 1995-09-28 1997-04-04 Oki Electric Ind Co Ltd 仮想3次元空間表示装置
US5926183A (en) * 1996-11-19 1999-07-20 International Business Machines Corporation Efficient rendering utilizing user defined rooms and windows
JPH10232757A (ja) * 1997-02-19 1998-09-02 Sharp Corp メディア選択装置
JPH1139132A (ja) 1997-07-15 1999-02-12 Sharp Corp インターフェースシステム
JPH11154244A (ja) 1997-11-21 1999-06-08 Canon Inc 画像処理装置と画像情報の処理方法
JP2000020754A (ja) * 1998-07-07 2000-01-21 Nec Corp モデル表示装置
JP3939444B2 (ja) 1998-07-23 2007-07-04 凸版印刷株式会社 映像表示装置
JP3646582B2 (ja) 1998-09-28 2005-05-11 富士通株式会社 電子情報表示方法、電子情報閲覧装置および電子情報閲覧プログラム記憶媒体
JP4019240B2 (ja) * 1999-05-07 2007-12-12 株式会社セガ 画像処理方法、記録媒体及び画像処理装置
US6573896B1 (en) * 1999-07-08 2003-06-03 Dassault Systemes Three-dimensional arrow
JP3836396B2 (ja) * 2001-05-18 2006-10-25 株式会社ソニー・コンピュータエンタテインメント 表示装置及び画像処理方法
US7190365B2 (en) * 2001-09-06 2007-03-13 Schlumberger Technology Corporation Method for navigating in a multi-scale three-dimensional scene
JP2004213641A (ja) * 2002-12-20 2004-07-29 Sony Computer Entertainment Inc 画像処理装置、画像処理方法、情報処理装置、情報処理システム、半導体デバイス、コンピュータプログラム

Patent Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5745126A (en) * 1995-03-31 1998-04-28 The Regents Of The University Of California Machine synthesis of a virtual video camera/image of a scene from multiple video cameras/images of the scene in accordance with a particular perspective on the scene, an object in the scene, or an event in the scene
US5678015A (en) * 1995-09-01 1997-10-14 Silicon Graphics, Inc. Four-dimensional graphical user interface
US5898435A (en) * 1995-10-02 1999-04-27 Sony Corporation Image controlling device and image controlling method
US6184884B1 (en) * 1995-10-02 2001-02-06 Sony Corporation Image controlling device and image controlling method for displaying a plurality of menu items
US6281877B1 (en) * 1996-03-29 2001-08-28 British Telecommunications Plc Control interface
US5745109A (en) * 1996-04-30 1998-04-28 Sony Corporation Menu display interface with miniature windows corresponding to each page
US6710788B1 (en) * 1996-12-03 2004-03-23 Texas Instruments Incorporated Graphical user interface
US6167142A (en) * 1997-12-18 2000-12-26 Fujitsu Limited Object movement simulation apparatus
US7137075B2 (en) * 1998-08-24 2006-11-14 Hitachi, Ltd. Method of displaying, a method of processing, an apparatus for processing, and a system for processing multimedia information
US6597358B2 (en) * 1998-08-26 2003-07-22 Intel Corporation Method and apparatus for presenting two and three-dimensional computer applications within a 3D meta-visualization
US6774914B1 (en) * 1999-01-15 2004-08-10 Z.A. Production Navigation method in 3D computer-generated pictures by hyper 3D navigator 3D image manipulation

Cited By (56)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7576756B1 (en) * 2002-02-21 2009-08-18 Xerox Corporation System and method for interaction of graphical objects on a computer controlled system
US20090295826A1 (en) * 2002-02-21 2009-12-03 Xerox Corporation System and method for interaction of graphical objects on a computer controlled system
US7194469B1 (en) * 2002-09-24 2007-03-20 Adobe Systems Incorporated Managing links in a collection of documents
US7783965B1 (en) 2002-09-24 2010-08-24 Adobe Systems Incorporated Managing links in a collection of documents
US7019742B2 (en) * 2003-11-20 2006-03-28 Microsoft Corporation Dynamic 2D imposters of 3D graphic objects
US20050110789A1 (en) * 2003-11-20 2005-05-26 Microsoft Corporation Dynamic 2D imposters of 3D graphic objects
US20070257914A1 (en) * 2004-03-31 2007-11-08 Hidenori Komatsumoto Image Processing Device, Image Processing Method, And Information Storage Medium
US20070257915A1 (en) * 2006-05-08 2007-11-08 Ken Kutaragi User Interface Device, User Interface Method and Information Storage Medium
US8890895B2 (en) * 2006-05-08 2014-11-18 Sony Corporation User interface device, user interface method and information storage medium
US8797360B2 (en) 2008-04-18 2014-08-05 Sony Corporation Image display device, method for controlling image display device, and information storage medium
US20110074918A1 (en) * 2009-09-30 2011-03-31 Rovi Technologies Corporation Systems and methods for generating a three-dimensional media guidance application
US8970669B2 (en) 2009-09-30 2015-03-03 Rovi Guides, Inc. Systems and methods for generating a three-dimensional media guidance application
US20120311056A1 (en) * 2010-03-31 2012-12-06 Rakuten, Inc. Information processing device, information processing method, information processing program, and storage medium
US8411092B2 (en) 2010-06-14 2013-04-02 Nintendo Co., Ltd. 2D imposters for simplifying processing of plural animation objects in computer graphics generation
US20120110501A1 (en) * 2010-11-03 2012-05-03 Samsung Electronics Co. Ltd. Mobile terminal and screen change control method based on input signals for the same
US9110582B2 (en) * 2010-11-03 2015-08-18 Samsung Electronics Co., Ltd. Mobile terminal and screen change control method based on input signals for the same
US20120174038A1 (en) * 2011-01-05 2012-07-05 Disney Enterprises, Inc. System and method enabling content navigation and selection using an interactive virtual sphere
US20130027393A1 (en) * 2011-07-28 2013-01-31 Tatsuo Fujiwara Information processing apparatus, information processing method, and program
US9342925B2 (en) * 2011-07-28 2016-05-17 Sony Corporation Information processing apparatus, information processing method, and program
US20130054319A1 (en) * 2011-08-29 2013-02-28 United Video Properties, Inc. Methods and systems for presenting a three-dimensional media guidance application
US10360658B2 (en) * 2013-07-08 2019-07-23 Ricoh Company, Ltd. Display control apparatus and computer-readable recording medium
US20160132991A1 (en) * 2013-07-08 2016-05-12 Seiichiro FUKUSHI Display control apparatus and computer-readable recording medium
US20160247313A1 (en) * 2013-11-27 2016-08-25 Google Inc. Methods and Systems for Viewing a Three-Dimensional (3D) Virtual Object
US10460510B2 (en) * 2013-11-27 2019-10-29 Google Llc Methods and systems for viewing a three-dimensional (3D) virtual object
US10891780B2 (en) 2013-11-27 2021-01-12 Google Llc Methods and systems for viewing a three-dimensional (3D) virtual object
US20170148223A1 (en) * 2014-10-31 2017-05-25 Fyusion, Inc. Real-time mobile device capture and generation of ar/vr content
US20170148222A1 (en) * 2014-10-31 2017-05-25 Fyusion, Inc. Real-time mobile device capture and generation of art-styled ar/vr content
US10430995B2 (en) 2014-10-31 2019-10-01 Fyusion, Inc. System and method for infinite synthetic image generation from multi-directional structured image array
US10540773B2 (en) 2014-10-31 2020-01-21 Fyusion, Inc. System and method for infinite smoothing of image sequences
US10726560B2 (en) * 2014-10-31 2020-07-28 Fyusion, Inc. Real-time mobile device capture and generation of art-styled AR/VR content
US10846913B2 (en) 2014-10-31 2020-11-24 Fyusion, Inc. System and method for infinite synthetic image generation from multi-directional structured image array
US10719939B2 (en) * 2014-10-31 2020-07-21 Fyusion, Inc. Real-time mobile device capture and generation of AR/VR content
US10818029B2 (en) 2014-10-31 2020-10-27 Fyusion, Inc. Multi-directional structured image array capture on a 2D graph
US10719732B2 (en) 2015-07-15 2020-07-21 Fyusion, Inc. Artificially rendering images using interpolation of tracked control points
US11632533B2 (en) 2015-07-15 2023-04-18 Fyusion, Inc. System and method for generating combined embedded multi-view interactive digital media representations
US11956412B2 (en) 2015-07-15 2024-04-09 Fyusion, Inc. Drone based capture of multi-view interactive digital media
US11776199B2 (en) 2015-07-15 2023-10-03 Fyusion, Inc. Virtual reality environment based manipulation of multi-layered multi-view interactive digital media representations
US10719733B2 (en) 2015-07-15 2020-07-21 Fyusion, Inc. Artificially rendering images using interpolation of tracked control points
US10852902B2 (en) 2015-07-15 2020-12-01 Fyusion, Inc. Automatic tagging of objects on a multi-view interactive digital media representation of a dynamic entity
US11636637B2 (en) 2015-07-15 2023-04-25 Fyusion, Inc. Artificially rendering images using viewpoint interpolation and extrapolation
US11195314B2 (en) 2015-07-15 2021-12-07 Fyusion, Inc. Artificially rendering images using viewpoint interpolation and extrapolation
US10733475B2 (en) 2015-07-15 2020-08-04 Fyusion, Inc. Artificially rendering images using interpolation of tracked control points
US11435869B2 (en) 2015-07-15 2022-09-06 Fyusion, Inc. Virtual reality environment based manipulation of multi-layered multi-view interactive digital media representations
US11783864B2 (en) 2015-09-22 2023-10-10 Fyusion, Inc. Integration of audio into a multi-view interactive digital media representation
US10726593B2 (en) 2015-09-22 2020-07-28 Fyusion, Inc. Artificially rendering images using viewpoint interpolation and extrapolation
US10490062B2 (en) * 2015-11-24 2019-11-26 HELLA GmbH & Co. KGaA Remote control for automotive applications
US11202017B2 (en) 2016-10-06 2021-12-14 Fyusion, Inc. Live style transfer on a mobile device
US11960533B2 (en) 2017-01-18 2024-04-16 Fyusion, Inc. Visual search using multi-view interactive digital media representations
US11876948B2 (en) 2017-05-22 2024-01-16 Fyusion, Inc. Snapshots at predefined intervals or angles
US11776229B2 (en) 2017-06-26 2023-10-03 Fyusion, Inc. Modification of multi-view interactive digital media representation
US20200304380A1 (en) * 2017-09-07 2020-09-24 Spherica Systems Limited System and Methods Utilizing Dataset Management User Interface
US11189057B2 (en) * 2017-11-20 2021-11-30 Nokia Technologies Oy Provision of virtual reality content
US10754523B2 (en) * 2017-11-27 2020-08-25 International Business Machines Corporation Resizing of images with respect to a single point of convergence or divergence during zooming operations in a user interface
US10754524B2 (en) * 2017-11-27 2020-08-25 International Business Machines Corporation Resizing of images with respect to a single point of convergence or divergence during zooming operations in a user interface
US11488380B2 (en) 2018-04-26 2022-11-01 Fyusion, Inc. Method and apparatus for 3-D auto tagging
US11967162B2 (en) 2018-04-26 2024-04-23 Fyusion, Inc. Method and apparatus for 3-D auto tagging

Also Published As

Publication number Publication date
US20060103650A1 (en) 2006-05-18
WO2002069276A1 (fr) 2002-09-06
US7812841B2 (en) 2010-10-12
EP1363246A1 (en) 2003-11-19
EP1363246A4 (en) 2006-11-08
JP4077321B2 (ja) 2008-04-16
JPWO2002069276A1 (ja) 2004-07-02

Similar Documents

Publication Publication Date Title
US7812841B2 (en) Display controlling apparatus, information terminal unit provided with display controlling apparatus, and viewpoint location controlling apparatus
US5841440A (en) System and method for using a pointing device to indicate movement through three-dimensional space
US5689628A (en) Coupling a display object to a viewpoint in a navigable workspace
US5583977A (en) Object-oriented curve manipulation system
US5371845A (en) Technique for providing improved user feedback in an interactive drawing system
US6842175B1 (en) Tools for interacting with virtual environments
US6091410A (en) Avatar pointing mode
US6542168B2 (en) Three-dimensional window displaying apparatus and method thereof
US5850206A (en) System for retrieving and displaying attribute information of an object based on importance degree of the object
US6084589A (en) Information retrieval apparatus
EP0685790A2 (en) Transporting a display object coupled to a viewpoint within or between navigable workspaces
Steinicke et al. Object selection in virtual environments using an improved virtual pointer metaphor
JP2000172248A (ja) 電子情報表示方法、電子情報閲覧装置および電子情報閲覧プログラム記憶媒体
US20060244745A1 (en) Computerized method and computer system for positioning a pointer
WO2007035988A1 (en) An interface for computer controllers
US6714198B2 (en) Program and apparatus for displaying graphical objects
JPH04308895A (ja) 処理の実行方法及び情報処理装置
JP3470771B2 (ja) 投影面連動表示装置
CN112632181A (zh) 地图显示方法、装置、设备、存储介质和终端设备
WO1995011482A1 (en) Object-oriented surface manipulation system
JPH08249500A (ja) 3次元図形の表示方法
EP0825558A2 (en) Method and apparatus for displaying free-form graphic objects
KR102392675B1 (ko) 3차원 스케치를 위한 인터페이싱 방법 및 장치
JPH04180180A (ja) 3次元位置入力方法及び装置
WO1995011480A1 (en) Object-oriented graphic manipulation system

Legal Events

Date Code Title Description
AS Assignment

Owner name: FUJITSU LIMITED, JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:KAMIWADA, TORU;FUJITA, TAKUSHI;REEL/FRAME:014399/0407

Effective date: 20030523

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION