JP2003346183A - 3d content service system - Google Patents

3d content service system

Info

Publication number
JP2003346183A
JP2003346183A (application JP2002154034A)
Authority
JP
Japan
Prior art keywords
terminal
data
3d content
service system
server
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
JP2002154034A
Other languages
Japanese (ja)
Inventor
Hideo Noro
英生 野呂
Original Assignee
Canon Inc
キヤノン株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Canon Inc, キヤノン株式会社 filed Critical Canon Inc
Priority to JP2002154034A priority Critical patent/JP2003346183A/en
Publication of JP2003346183A publication Critical patent/JP2003346183A/en
Application status is Withdrawn legal-status Critical

Links

Abstract

PROBLEM TO BE SOLVED: To provide a 3D content service system that fully exploits the capability of each terminal by dynamically switching processing according to that capability.

SOLUTION: In this system, which serves 3D content over a network, at least one of the display content, the user interface, the delivery method, and the user-operation processing method is switched according to the capability of the terminal. For example, if a terminal, such as a low-function cellular phone, can connect to a server over the network, acquire two-dimensional image data from the server, and display it on its screen, the 3D content is rendered into 2D on the server side and delivered to the terminal as 2D image data.

COPYRIGHT: (C)2004,JPO

Description

DETAILED DESCRIPTION OF THE INVENTION

[0001]

BACKGROUND OF THE INVENTION 1. Field of the Invention The present invention relates to a 3D content service system, and more particularly to a 3D content service system suitable for delivering 3D content over a network to terminals whose capabilities differ greatly.

[0002]

2. Description of the Related Art A conventional 3D content service system either targets only high-performance terminals and distributes the 3D content data directly, or renders everything into 2D images on the server side and distributes those.

[0003]

SUMMARY OF THE INVENTION Terminals with widely varying capabilities, such as portable telephones, small portable devices combined with portable telephones, set-top boxes, and PCs, have come into everyday use. When such terminals receive a 3D content service, a service that fully exploits the capability of each terminal is expected.

[0004] In the above conventional examples, either no processing according to terminal capability is performed, or only the screen size is considered; as a result, a 3D content service that fully exploits the capability of each terminal cannot be provided.
Alternatively, the function set is limited and the service is offered only to terminals that have a certain function. In that case, a terminal lacking the capability cannot receive the service at all, while a terminal with higher capability does not benefit from it, so only a low-quality service is provided.

SUMMARY OF THE INVENTION The present invention has been made in view of the above problems, and its purpose is to provide a 3D content service system that dynamically switches processing according to the capability of a terminal so that the capability of each terminal can be fully exploited.

[0006]

In order to achieve the above object, the present invention is characterized in that, in a system for providing 3D content via a network, at least one of the display content, the user interface, the distribution method, and the user-operation processing method is switched according to the capability of the terminal.
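As an illustrative sketch of this switching (all function names and capability flags below are hypothetical, not taken from the embodiments), selecting a delivery method according to terminal capability might look like:

```python
# Sketch of capability-based switching. The capability keys and the
# returned delivery-method labels are invented for this example.

def choose_delivery(terminal):
    """Pick a delivery method from a terminal-capability dict."""
    if terminal.get("renders_3d"):
        return "3d_data"            # send 3D content data directly
    if terminal.get("plays_streaming"):
        return "video_stream"
    if terminal.get("plays_video"):
        return "video_file"
    if terminal.get("plays_animation"):
        return "animation"
    return "still_image"            # minimal capability: one 2D image

print(choose_delivery({"renders_3d": True}))       # 3d_data
print(choose_delivery({"plays_animation": True}))  # animation
print(choose_delivery({}))                         # still_image
```

The ordering encodes a preference for richer delivery when the terminal supports it, falling back to a single still image for the least capable terminals.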

[0007]

Embodiments of the present invention will be described below with reference to the accompanying drawings.

<Embodiment 1> FIG. 1 shows a configuration example of a device to which the 3D content service system according to the present invention is applied. FIG. 4 shows an example of displaying 3D content on a mobile phone terminal, and FIG. 5 shows an example of displaying 3D content on a PC terminal.

Here, a page description language such as HTML, HDML, or WML is assumed: the UI is generated on the server side, and 2D content data is described as images or objects embedded in it. For the 2D content data, only the URI giving the location of the information is indicated, and when the data is actually needed, a further acquisition request is made.

When 3D content data is transmitted directly, it is assumed here that the UI is prepared on the terminal side and no further acquisition request is made. The system to which the present invention is applied, however, is not limited to these assumptions.

FIG. 6 shows the operation flow of the communication unit. When a connection or communication arrives from a terminal at the server, the communication unit first determines whether it is a new connection. If the terminal is already connected, the request is handled by another unit, so the communication unit ends without doing anything. For a new connection, it creates a terminal information object, a data object that stores information about the connected terminal. As shown in FIG. 2, the terminal information object holds a communication channel object, the screen size, the formats the terminal can render in 3D, render in 2D, animate in 2D, and stream in 2D, a UI (user interface) template, the request content, the viewpoint position, and so on. Not all of this information is filled in at once; each unit fills in the information it discovers as processing proceeds.
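A minimal sketch of such a terminal information object, with illustrative field names standing in for the attributes listed above (the names are not the patent's own):

```python
from dataclasses import dataclass, field

# Hypothetical terminal-information object mirroring the fields listed
# above. Field names and defaults are invented for this sketch.
@dataclass
class TerminalInfo:
    channel: object = None              # communication channel object
    screen_size: tuple = (0, 0)
    formats_3d: list = field(default_factory=list)
    formats_2d: list = field(default_factory=list)
    formats_2d_anim: list = field(default_factory=list)
    formats_2d_stream: list = field(default_factory=list)
    ui_template: str = ""               # empty => terminal supplies its own UI
    request: str = ""                   # requested content (URI etc.)
    viewpoint: tuple = (0.0, 0.0, 0.0)  # session information

info = TerminalInfo(screen_size=(120, 130), formats_2d=["jpeg", "png"])
print(info.ui_template == "")  # True: no server-side UI needed yet
```

Each unit would fill in only the fields it discovers, matching the "filled in as found" behavior described above.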

The communication unit wraps the connection in an object managed inside the server, called here a communication channel object. Thereafter, communication for this connection is performed via the communication channel object held in the terminal information object. The communication unit then passes the terminal information object to the determination unit and ends its processing.

FIG. 7 shows the operation flow of the determination unit when it receives the terminal information object from the communication unit.

First, the determination unit queries the database using the request information sent by the terminal and sets the attributes it finds in the terminal information object. The request content (URI, etc.) is also set. At this point, most of the terminal-specific information has been set.

At this time, if there is information specific to a series of exchanges between a terminal and the server, called a session (session information), the session information temporarily registered in the database is also set in the terminal information object.
One example of session information is the viewpoint position. This is unnecessary when the request itself contains the current or requested viewpoint position; otherwise it is managed as session information. Session information can be updated as appropriate.
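How viewpoint session information might be managed can be sketched with a simple in-memory store keyed by session (the store and function names are illustrative, not from the patent):

```python
# Minimal per-session viewpoint store (illustrative). A request that
# carries a viewpoint overrides the session; otherwise the stored
# (or default) viewpoint is used, as described above.
sessions = {}

def get_viewpoint(session_id, requested=None):
    if requested is not None:      # request contains a viewpoint: use it
        sessions[session_id] = requested
        return requested
    # otherwise fall back to session information, with a default origin
    return sessions.setdefault(session_id, (0.0, 0.0, 0.0))

get_viewpoint("s1", (1.0, 2.0, 3.0))   # request carried a viewpoint
print(get_viewpoint("s1"))             # recovered from session info
```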

After the necessary information is set in the terminal information object, the determination unit checks whether the request is a UI request. Whether a UI is required when 3D content is requested can be recorded, for example, in the UI template attribute of the terminal information object. If the UI template attribute is empty, the UI is prepared on the terminal, so the server does not need to prepare one. Conversely, if the UI template is not empty, the server must return UI information generated from the UI template.

When UI information is to be returned, the request is processed as a request for an HTML, HDML, or WML document; this is called a UI request here. When UI information is unnecessary, or when the request is for an image or object embedded in the UI information, the request is called a display content request and distinguished from a UI request. If the request is a UI request, the terminal information object is passed to the user interface selection unit; otherwise, that is, if it is a display content request, it is passed to the display content rendering unit. The determination unit's processing then ends.
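The branching between UI requests and display content requests can be sketched as follows, using the UI-template test described above (function and key names are hypothetical):

```python
def route_request(terminal_info):
    """Route a request: UI requests go to the user interface selection
    unit, display content requests to the rendering unit (sketch)."""
    if terminal_info.get("ui_template"):       # non-empty template: UI request
        return "user_interface_selection_unit"
    return "display_content_rendering_unit"    # terminal has its own UI

print(route_request({"ui_template": "phone-wml"}))
print(route_request({"ui_template": ""}))
```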

Based on the UI template information in the terminal information object, the user interface selection unit embeds the URIs of the 2D content data and delivers the result to the terminal as UI information. Delivery to the terminal is performed through the communication channel object of the terminal information object.

FIG. 3 is a diagram showing a configuration example of the display content rendering unit. According to the request content of the terminal information object passed from the determination unit, the module (part) that renders the display content is switched, and distribution is performed in a data format according to the terminal.

<Embodiment 2> In FIG. 3, the still image module is selected as the module that renders the display content. First, a 2D screen image is generated using the 3D-to-2D single-screen renderer; then the data to be delivered to the terminal is created with an encoder for a still image format such as JPEG or PNG, and delivered.
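The still-image path is a render-then-encode pipeline, which can be sketched as below; the renderer and encoder are stand-ins (a real system would invoke an actual 3D renderer and a JPEG or PNG codec):

```python
import zlib

# Sketch of the still-image path: render one 3D scene to a 2D pixel
# buffer, then encode it for delivery. Both stages are placeholders.

def render_3d_to_2d(scene, size):
    w, h = size
    return bytes(w * h)            # placeholder: blank grayscale buffer

def encode_still(pixels):
    return zlib.compress(pixels)   # stand-in for a JPEG/PNG encoder

data = encode_still(render_3d_to_2d("teapot", (120, 130)))
print(len(data) > 0)               # encoded data ready for delivery
```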

<Embodiment 3> An animation is either a method of displaying a plurality of 2D still images by switching them at a time interval specified on the terminal, or a method in which the plurality of 2D still images and the specified time interval are combined into one file. The latter file format is sometimes called an animation format. In either case, depending on the type of animation, the time interval may not be specifiable; in that case, the time interval is treated as constant. Repetition can also be designated, in which case, after the last still image is displayed, display resumes from the first still image. This can be used to create periodic animations.

When the plurality of 2D still images are not combined into one file, the terminal issues a request for a single still image multiple times, so the processing is the same as in the second embodiment. Note that the instruction to issue multiple requests is included in the UI information returned for the UI request.

When animation format data is distributed to a terminal, the animation module is selected in FIG. 3 as the module that renders the display content. First, a 2D image is generated using the 3D-to-2D single-screen renderer, and the data for one screen is created using a still image encoder, such as JPEG or PNG, compatible with the animation format. After this is repeated the required number of times, the frames are linked together, necessary information such as the time designation is added, and the data to be delivered to the terminal is created and then delivered.
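Combining per-frame data, a time designation, and a repetition flag into one deliverable file can be illustrated with a toy container; the layout below is invented for the sketch and is not any real animation format:

```python
import struct

# Toy animation container (purely illustrative): a header with frame
# count, interval in ms, and repeat flag, followed by length-prefixed
# encoded frames.
def pack_animation(frames, interval_ms, repeat=False):
    out = struct.pack(">HH?", len(frames), interval_ms, repeat)
    for f in frames:
        out += struct.pack(">I", len(f)) + f
    return out

frames = [b"frame0", b"frame1", b"frame2"]
data = pack_animation(frames, interval_ms=100, repeat=True)
print(len(data))
```

A terminal-side decoder would read the header, then display each frame at the stated interval, looping when the repeat flag is set.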

<Embodiment 4> The UI template corresponding to a terminal describes that the terminal should issue a UI request again after a certain time, or at a time when a certain amount of movement or more is expected to be detected.

When the designation is "after a certain time" and the terminal itself can interpret that fixed time, the time can be described in the UI template in advance.

When the designation is "after a certain time" but the terminal expects the absolute time at which to issue the next UI request rather than the interval itself, the user interface selection unit calculates the time obtained by adding the interval to the current time, embeds it in the UI information, and distributes it to the terminal.

When embedding the time at which a certain amount of motion or more is expected to be detected, the user interface selection unit obtains that time using a 3D motion prediction module, embeds it in the UI information, and delivers it to the terminal.
If the terminal needs the interval until that time rather than the absolute time, the interval obtained by subtracting the current time from the predicted time is calculated, embedded in the UI information, and delivered to the terminal.
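Embedding the re-request time into UI information, covering both the relative-interval and absolute-time cases above, can be sketched as follows (the template placeholder and function name are made up):

```python
import time

def embed_refresh(ui_template, delay_s, terminal_wants_absolute):
    """Fill the re-request time into a UI template placeholder.
    If the terminal expects an absolute time, add the delay to 'now';
    otherwise embed the relative delay itself. (Illustrative.)"""
    value = time.time() + delay_s if terminal_wants_absolute else delay_s
    return ui_template.replace("{REFRESH}", str(value))

ui = embed_refresh('<meta refresh="{REFRESH}">', 30, False)
print(ui)  # <meta refresh="30">
```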

<Embodiment 5> When all output results of the 3D-to-2D single-screen renderer called from the animation module are identical, distribution in the animation format is not performed; instead, a single still image is output.

Likewise, when, instead of an animation format, the terminal issues multiple requests to acquire 2D images, any run of 3D-to-2D single-screen renderer outputs with no differences is represented by a single 2D image, and only the URI of that one 2D image is embedded in the UI information. In this way, if all output results are identical, a single URI is indicated.
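The collapse described in this embodiment can be sketched directly: if every rendered frame is identical, fall back to a single still image (names are illustrative):

```python
def collapse_frames(frames):
    """If every rendered 2D frame is identical, fall back to a single
    still image; otherwise keep the animation (illustrative sketch)."""
    if len(set(frames)) == 1:
        return ("still", frames[0])
    return ("animation", frames)

print(collapse_frames([b"a", b"a", b"a"]))  # ('still', b'a')
print(collapse_frames([b"a", b"b"])[0])     # animation
```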

<Embodiment 6> In FIG. 3, the moving image module is selected as the module that renders the display content. For each screen, a 2D image is generated using the 3D-to-2D single-screen renderer, and the result is fed sequentially into a moving picture encoder such as MPEG. After this is repeated the required number of times, a termination instruction is issued to the encoder and the moving picture data is complete. It is then distributed to the terminal via the communication channel object of the terminal information object.

<Embodiment 7> In FIG. 3, the moving image streaming module is selected as the module that renders the display content. For each screen, a 2D image is generated using the 3D-to-2D single-screen renderer, and the result is fed sequentially into a streaming encoder. Streaming data is output from the encoder as it becomes available and is delivered to the terminal via the communication channel object of the terminal information object. This is repeated until the terminal or the server determines that processing should end.
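The streaming loop above can be sketched as a generator that renders, encodes, and yields data until a stop condition holds; the renderer, encoder, and stop test below are placeholders:

```python
def stream_frames(render, encode, should_stop):
    """Streaming path sketch: render a frame, encode it, and yield data
    as soon as the encoder produces any, until told to stop."""
    n = 0
    while not should_stop(n):
        chunk = encode(render(n))
        if chunk:                  # encoder may buffer and emit nothing
            yield chunk
        n += 1

chunks = list(stream_frames(lambda n: bytes([n]),   # stub renderer
                            lambda px: px,          # stub encoder
                            lambda n: n >= 3))      # stop after 3 frames
print(chunks)
```

Each yielded chunk would be written to the (possibly dedicated) streaming channel as soon as it is produced, rather than after the whole clip is encoded.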

Incidentally, a dedicated communication mechanism is often used for streaming data distribution. In that case, the communication channel object of the terminal information object cannot be used for the streaming data, so the moving image streaming module also establishes a new streaming communication channel. Information about the established channel is stored in the terminal information object.

<Embodiment 8> Consider, for example, the case where an object is to be observed from any desired position around it. In this case, UI buttons meaning, for example, "30 degrees right" and "30 degrees left" are embedded in the UI information delivered to the terminal. This can be realized with <a> or <form> tags in HTML, or with <a> or <do> tags in WML. By pressing a UI button, or by performing a predetermined operation such as an utterance, the operator's intention can be transmitted from the terminal to the server.
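Generating such rotation controls as HTML <a> tags can be sketched as follows; the URI scheme and the `rotate` parameter name are invented for the example:

```python
# Sketch of embedding viewpoint-rotation controls into HTML UI
# information. URI layout and parameter names are hypothetical.
def rotation_ui(base_uri, angles=(-30, 30)):
    links = []
    for a in angles:
        label = f"{abs(a)} degrees {'left' if a < 0 else 'right'}"
        links.append(f'<a href="{base_uri}?rotate={a}">{label}</a>')
    return "\n".join(links)

print(rotation_ui("/content/teapot"))
```

Following a link reports the chosen rotation to the server, which updates the viewpoint stored in the terminal information object before re-rendering.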

The information transmitted to the server is interpreted by the determination unit, and the new viewpoint position and the like are stored in the terminal information object. For an instruction such as starting an animation, the start time is stored in the terminal information object. The terminal information object is then passed to the user interface selection unit, and new UI information is generated and distributed to the terminal.

Next, when a display content request arrives from the terminal, the display content is created by the display content rendering unit. The 3D-to-2D single-screen renderer performs its processing taking into account the viewpoint position, the positions of objects moved by animation, and the other information stored in the terminal information object.

<Embodiment 9> In response to a display content request from a terminal, the display content rendering unit selects the 3D module. If the 3D data format inside the server is the same as a 3D data format the terminal can understand, the data is distributed to the terminal unchanged. If it differs, the 3D format conversion unit converts the data to a format the terminal can understand before it is delivered. Delivery to the terminal is performed through the communication channel object of the terminal information object.
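This pass-through-or-convert decision can be sketched as below; the format names and the converter are illustrative stubs, not a real 3D format pipeline:

```python
def deliver_3d(server_format, terminal_formats, data):
    """Pass 3D data through unchanged when the terminal understands the
    server's format; otherwise convert first. (Sketch; the converter
    below is a stub, not a real 3D format conversion unit.)"""
    if server_format in terminal_formats:
        return data                       # formats match: no change
    target = terminal_formats[0]          # pick a format the terminal knows
    return convert_3d(data, server_format, target)

def convert_3d(data, src, dst):           # stand-in converter: tags the data
    return b"[%s->%s]" % (src.encode(), dst.encode()) + data

print(deliver_3d("vrml", ["vrml", "x3d"], b"scene"))  # b'scene'
print(deliver_3d("vrml", ["x3d"], b"scene"))
```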

A terminal that receives the 3D data displays it with its own UI, so the user can operate it freely.

[0038]

As is apparent from the above description, according to the present invention, in a system that provides 3D content via a network, at least one of the display content, the user interface, the distribution method, and the user-operation processing method is switched according to the capability of the terminal. By dynamically switching processing according to terminal capability in this way, the capability of each terminal is fully exploited, and a high-quality service can be provided to many kinds of terminals.

Further, according to the present invention, when a terminal, such as a low-function mobile phone, can connect to the server via the network, acquire two-dimensional image data from the server, and display it on its screen, the 3D content is rendered into 2D on the server side and then delivered to the terminal as 2D image data. Even for a terminal whose only capability is to connect to the server over the network, acquire two-dimensional image data, and display it on a screen, a high-quality service that fully exploits the terminal's capability is therefore provided.

[Brief description of the drawings]

FIG. 1 is a diagram illustrating a configuration example of an apparatus to which a 3D content service system according to a first embodiment of the present invention is applied.

FIG. 2 is a diagram illustrating a terminal information object used inside a device to which the 3D content service system according to Embodiment 1 of the present invention is applied.

FIG. 3 is a diagram showing a detailed configuration example of a display content rendering unit of the device to which the 3D content service system according to Embodiment 1 of the present invention is applied.

FIG. 4 is a diagram illustrating a case where a mobile phone terminal is used as a terminal of an apparatus to which the 3D content service system according to Embodiment 1 of the present invention is applied;

FIG. 5 is a diagram illustrating a case where a PC terminal is used as a terminal of an apparatus to which the 3D content service system according to Embodiment 1 of the present invention is applied;

FIG. 6 is a flowchart showing an operation of a communication unit of the device to which the 3D content service system according to Embodiment 1 of the present invention is applied.

FIG. 7 is a flowchart illustrating the operation of a determination unit of the device to which the 3D content service system according to Embodiment 1 of the present invention is applied.

Claims (9)

[Claims]
1. A 3D content service system for providing 3D content via a network, wherein at least one of the display content, the user interface, the distribution method, and the user-operation processing method is switched according to the capability of the terminal.
2. The 3D content service system according to claim 1, wherein, when the terminal, such as a low-function mobile phone terminal, can connect to the server over the network, acquire two-dimensional image data from the server, and display it on a screen, the 3D content is rendered into 2D on the server side and then distributed to the terminal as 2D image data.
3. The 3D content service system according to claim 1 or 2, wherein, when the terminal has a function of displaying a plurality of 2D image data in order, or of displaying animation format data combining them into one, one cycle of a 3D scene or its moving part is rendered as a plurality of 2D images, and either the plurality of 2D image data or animation data combining them into one is delivered to the terminal.
4. The 3D content service system according to claim 2, wherein, when the terminal has a function of decoding a designated time embedded in the content and issuing a data acquisition request to the server again when that time is reached, the server embeds, in the content delivered to the terminal, a time a certain period later or a time at which a predetermined amount of movement or more is predicted.
5. The 3D content service system according to claim 3, wherein, if there is no difference between the rendered 2D images, a single 2D image data is delivered to the terminal.
6. The 3D content service system according to claim 1, wherein, when the terminal has a function of displaying moving image data, one cycle of a 3D scene or its moving part is rendered as a moving image, and the moving image data is delivered to the terminal.
7. The 3D content service system according to claim 2, wherein, when the terminal has a function of displaying moving image streaming data, the 3D scene is rendered as a moving image stream, and the streaming data is distributed to the terminal.
8. The 3D content service system according to claim 2, wherein, when the terminal has a function of notifying the server which of several options embedded in the content has been selected, a request such as changing the viewpoint or starting an animation is notified to the server using this function, and the server renders data reflecting the request from the terminal and distributes it to the terminal.
9. The 3D content service system according to claim 1, wherein, when the terminal is a high-function terminal such as a PC, the 3D content data is distributed directly, a request instructed via the user interface is processed in the terminal, and a 2D image reflecting the result is rendered on the terminal.
JP2002154034A 2002-05-28 2002-05-28 3d content service system Withdrawn JP2003346183A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2002154034A JP2003346183A (en) 2002-05-28 2002-05-28 3d content service system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP2002154034A JP2003346183A (en) 2002-05-28 2002-05-28 3d content service system

Publications (1)

Publication Number Publication Date
JP2003346183A true JP2003346183A (en) 2003-12-05

Family

ID=29770923

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2002154034A Withdrawn JP2003346183A (en) 2002-05-28 2002-05-28 3d content service system

Country Status (1)

Country Link
JP (1) JP2003346183A (en)

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101034966B1 (en) * 2004-09-15 2011-05-17 휴렛-팩커드 디벨롭먼트 컴퍼니, 엘.피. A method and device for three-dimensional graphics to two-dimensional video encoding
KR101206892B1 (en) 2008-12-02 2012-11-30 한국전자통신연구원 Apparatus and method for 3d streaming based remote shading
JP2012523781A (en) * 2009-04-14 2012-10-04 ファーウェイ テクノロジーズ カンパニー リミテッド System and method for processing video files
US8948247B2 (en) 2009-04-14 2015-02-03 Futurewei Technologies, Inc. System and method for processing video files
JP2014512199A (en) * 2011-02-08 2014-05-22 ムスタファ.アワイス I Method and system for providing video game content
US8892883B2 (en) 2011-05-02 2014-11-18 Crytek Ip Holding Llc Render service for remote access to applications
JP2012252695A (en) * 2011-06-02 2012-12-20 Disney Enterprises Inc Providing single instance of virtual space depicted in either two dimensions or three dimensions via separate client computing devices
KR101231821B1 (en) 2011-07-05 2013-02-08 주식회사 엘지유플러스 Method and System for providing contents continuous play service

Similar Documents

Publication Publication Date Title
US7924177B2 (en) Distributed on-demand media transcoding system and method
JP5852595B2 (en) How to share images using digital media frames
CN100477672C (en) Electronic equipment
US8217987B2 (en) Method for creating a videoconferencing displayed image
US8115800B2 (en) Server apparatus and video delivery method
JP2010178381A (en) System and method for rendering content on multiple devices
TWI229559B (en) An object oriented video system
KR20100127240A (en) Using triggers with video for interactive content identification
JP2008520131A (en) Method and system for streaming documents, email attachments, and maps to a wireless device
KR101333200B1 (en) System and method for providing video content associated with a source image to a television in a communication network
EP2477414A2 (en) Method for assembling a video stream, system and computer software
US7178159B1 (en) Information providing apparatus
US20020015042A1 (en) Visual content browsing using rasterized representations
CN101669141B (en) Method for processing digital image
US20100118190A1 (en) Converting images to moving picture format
US7499075B2 (en) Video conference choreographer
KR20080083353A (en) Extensions to rich media container format for use by mobile broadcast/multicast streaming servers
US7600022B2 (en) Communication apparatus, information sharing system and information sharing method
CN100461847C (en) A method and system for visually sharing an application program
CN1921495B (en) Distributed image processing for medical images
JP2009530706A (en) Efficient encoding of alternative graphic sets
US9270926B2 (en) System and method for distributed media personalization
US20030222883A1 (en) Optimized mixed media rendering
KR101329619B1 (en) Computer network-based 3D rendering system
JP2014531142A (en) Script-based video rendering

Legal Events

Date Code Title Description
RD01 Notification of change of attorney

Free format text: JAPANESE INTERMEDIATE CODE: A7421

Effective date: 20050518

A621 Written request for application examination

Effective date: 20050530

Free format text: JAPANESE INTERMEDIATE CODE: A621

A761 Written withdrawal of application

Effective date: 20061226

Free format text: JAPANESE INTERMEDIATE CODE: A761