WO2011093031A1 - Portable terminal, action history depiction method, and action history depiction system - Google Patents
Portable terminal, action history depiction method, and action history depiction system
- Publication number
- WO2011093031A1 (application PCT/JP2011/000236, JP2011000236W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- action history
- information
- image
- camera
- terminal
- Prior art date
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q10/00—Administration; Management
- G06Q10/02—Reservations, e.g. for tickets, services or events
- G06Q10/06—Resources, workflows, human or project management; Enterprise or organisation planning; Enterprise or organisation modelling
- G06Q50/00—Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
Definitions
- the present invention relates to a portable terminal, an action history description method, and an action history description system.
- the action history is displayed on a two-dimensional map projected on a display by using position information acquired by GPS.
- Patent Document 1 discloses a technique for searching for information in a virtual space in which the user can clearly confirm the past browsing history, because portions previously displayed in the field of view are clearly indicated.
- Specifically, the image processing apparatus determines whether or not a polygon visually recognized in the field of view of the apparatus is the same as a previously displayed polygon, and changes the display state of the polygon if it has already been viewed.
- Patent Document 2 discloses a technology related to a navigation system that combines a real image captured by a camera and a CG (Computer Graphics) image in a camera-equipped information device terminal.
- Patent Document 1 and Patent Document 2 do not describe a technique for grasping a user's action history. Moreover, in general action history description systems using information terminals, the user's action history is merely traced on a planar map. For this reason, it is difficult to grasp a realistic action history with existing systems.
- The present invention has been made to solve such problems, and its object is to provide a mobile terminal, a behavior history rendering method, and a behavior history rendering system that can depict a realistic behavior history.
- One aspect of the portable terminal includes terminal information acquisition means for acquiring information including position information of the terminal, orientation information of the terminal, and terminal posture information of the terminal; camera means for generating a camera image obtained by imaging the surroundings; action history description means for calculating action history description information to be displayed based on an action history acquired in advance and the imaging range of the camera means; image composition means for generating a composite image in which the action history description information is depicted in the camera image; and display means for displaying the composite image.
- One aspect of the action history description method is to acquire information including the position information of one's own terminal, the orientation information the terminal is facing, and the terminal attitude information of the terminal; generate a camera image capturing the surroundings; calculate action history description information to be displayed based on an action history acquired in advance and the imaging range of the camera image; generate a composite image in which the action history description information is depicted in the camera image; and display the composite image.
- One aspect of the behavior history description system includes behavior history transmission means for generating behavior history information from the information acquired by the terminal information acquisition means and transmitting it to a behavior history server, action history acquisition means for acquiring the action history information of an arbitrary user from the action history server, and the action history server that stores the action history information.
- According to the present invention, it is possible to provide a mobile terminal, a behavior history rendering method, and a behavior history rendering system that can depict a realistic behavior history.
- FIG. 1 is a system configuration diagram of an action history description system according to a first exemplary embodiment; FIG. 2 is a diagram of the action history table according to the first embodiment; a further figure illustrates the method of calculating the size of the avatar.
- FIG. 3 is a diagram illustrating an operation concept of the mobile terminal according to the first exemplary embodiment.
- the action history description system 1 includes a mobile terminal 10 and an action history server 20.
- Examples of the mobile terminal 10 include a mobile phone, a PDA (Personal Data Assistant), and a smart phone.
- the terminal information acquisition unit 11 is a processing unit for acquiring information related to the mobile terminal 10.
- the terminal information acquisition unit 11 includes a position acquisition unit 111, an orientation acquisition unit 112, a terminal posture acquisition unit 113, and a voice acquisition unit 114.
- the position acquisition unit 111 is a processing unit that acquires a current position where the mobile terminal 10 exists.
- the position acquisition unit 111 is a processing unit that implements, for example, GPS (Global Positioning System).
- the direction acquisition unit 112 is a processing unit for acquiring the direction in which the mobile terminal 10 is facing. For example, while the camera unit 12 (described later) is generating a camera image, the orientation acquisition unit 112 calculates the terminal orientation using a geomagnetic sensor provided in the mobile terminal 10.
- the terminal attitude acquisition unit 113 detects the terminal attitude of the mobile terminal 10.
- the terminal posture acquisition unit 113 calculates the terminal posture of the mobile terminal 10 during image generation using an acceleration sensor provided in the mobile terminal 10.
- the audio acquisition unit 114 is a processing unit that acquires audio around the mobile terminal 10.
- the camera unit 12 is a processing unit that generates a camera image obtained by imaging an arbitrary three-dimensional space.
- the camera unit 12 is a processing unit including a camera function for capturing a moving image and a still image attached to the mobile terminal 10.
- the user points the camera attached to the portable terminal 10 toward the target area for which the action history is to be displayed.
- the camera unit 12 sequentially acquires camera images obtained by imaging the scenery in the direction in which the mobile terminal 10 is directed in response to the user's activation operation of the camera application.
- the camera unit 12 inputs the acquired camera image to the image composition unit 15.
- the action history acquisition unit 13 is a processing unit for acquiring an action history from the action history server 20.
- the action history acquisition unit 13 accesses the action history server 20 and acquires a desired action history according to the user's selection. For example, the user selects the action history to be acquired from the display screen of the application.
- the action history acquisition unit 13 outputs the acquired action history to the depiction history calculation unit 14.
- the action history server 20 is a server that can be accessed by any user, and includes an action history table 21 for holding the action history of each user. Details of the action history table 21 are shown in FIG.
- the action history table 21 includes a registrant 211, a time 212, coordinates 213, a direction 214, a voice 215, and a height 216 as columns.
- In the registrant 211 column, the name of the user who registered the action history is entered.
- In the time 212 column, the time at which the user registered the action history is entered.
- In the coordinates 213 column, the position where the user was located while recording the action history is entered.
- For example, position information acquired by GPS is entered in the coordinates 213 column.
- In the direction 214 column, the terminal direction of the registrant who recorded the action history is entered.
- The information entered in the direction 214 column is expressed, for example, as "36 degrees from the north direction".
- In the voice 215 column, the sound information acquired by the voice acquisition unit 114 is entered.
- The height 216 column holds the registrant's altitude information.
- When the mobile terminal 10 is provided with an altimeter, it is conceivable that the altitude information is acquired by the altimeter. Alternatively, height information may be acquired using information from RFID (Radio Frequency IDentification) tags attached to the ground, a wall, or an object.
- the action history server 20 may hold information on each user who registers the action history.
- the action history server 20 holds the height, sex, etc. of each user. You may customize the display of the avatar mentioned later using this information. For example, when displaying the action history of a female user, it is possible to display the avatar's clothes for women.
- For example, it is registered that Mr. A moved south from 11:03 to 11:06 in 2010 with the terminal facing south (180 degrees from north).
- The voice recorded by the voice acquisition unit 114 of Mr. A's mobile terminal 10 during that period (11:03 to 11:06) is also stored, and it is registered that Mr. A was at an altitude of 0 m.
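As a concrete sketch, one row of the action history table 21 could be modeled as follows. The field names mirror the columns 211 to 216 described above but are otherwise hypothetical, not taken from any actual implementation of the patent:

```python
from dataclasses import dataclass
from datetime import datetime
from typing import Optional, Tuple

@dataclass
class ActionHistoryRecord:
    """One row of the action history table 21 (columns 211-216)."""
    registrant: str                  # 211: name of the registering user
    time: datetime                   # 212: time of registration
    coordinates: Tuple[float, float] # 213: (latitude, longitude) from GPS
    direction_deg: float             # 214: terminal azimuth, degrees from north
    voice: Optional[bytes]           # 215: audio from the voice acquisition unit 114
    height_m: float                  # 216: altitude in meters

# Mr. A's record from the example above: facing south at altitude 0 m
record = ActionHistoryRecord(
    registrant="A",
    time=datetime(2010, 1, 1, 11, 3),
    coordinates=(35.005, 135.005),
    direction_deg=180.0,
    voice=None,
    height_m=0.0,
)
```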
- the depiction history calculation unit 14 determines the size of action history description information (in this example, an avatar) displayed on the display unit 16 from the imaging range captured by the camera unit, the action history downloaded from the action history server 20, The display position and the display direction of the avatar are calculated.
- the depiction history calculation unit 14 refers to the imaging performance of the camera unit 12, the degree of zooming, and the like, and calculates the imaging range (for example, latitude 35.00 degrees to 35.01 degrees, longitude 135.00 degrees to 135.01 degrees, altitude 0 m to 10 m). Then, the depiction history calculation unit 14 determines whether or not action history position information exists within the imaging range. When it does not, the depiction history calculation unit 14 notifies the image composition unit 15 that the avatar is outside the display range.
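The containment test performed by the depiction history calculation unit 14 amounts to a bounds check. A minimal sketch, assuming the imaging range is supplied as latitude, longitude, and altitude intervals (the function name and argument layout are illustrative):

```python
def in_imaging_range(pos, lat_range, lon_range, alt_range):
    """Return True if pos = (lat, lon, alt) lies inside the imaging range."""
    lat, lon, alt = pos
    return (lat_range[0] <= lat <= lat_range[1]
            and lon_range[0] <= lon <= lon_range[1]
            and alt_range[0] <= alt <= alt_range[1])

# Imaging range from the example: lat 35.00-35.01, lon 135.00-135.01, alt 0-10 m
inside = in_imaging_range((35.005, 135.005, 0.0),
                          (35.00, 35.01), (135.00, 135.01), (0.0, 10.0))
outside = in_imaging_range((35.02, 135.005, 0.0),
                           (35.00, 35.01), (135.00, 135.01), (0.0, 10.0))
```

When the check fails, the unit would notify the image composition unit 15 instead of computing avatar geometry.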
- the depiction history calculation unit 14 calculates the display coordinates of the avatar on the image displayed on the display unit 16, as well as the size of the avatar arranged at those coordinates. For example, the depiction history calculation unit 14 can calculate the size of the avatar by the general pinhole principle from the position where the camera of the mobile terminal 10 is operated, the position indicated by the action history, and the height information of the user represented by the avatar. As a result, when the position where the camera is activated and the position indicated by the action history are close, the avatar is displayed large; when they are far apart, the avatar is displayed small. The depiction history calculation unit 14 renders the movement of the avatar by continuously recalculating its position and size.
- the size of the avatar may be determined according to the distance from the imaging point without considering the height of the user.
- the depiction history calculation unit 14 also calculates the orientation of the avatar to be displayed.
- the direction of the avatar is calculated from the azimuth 214 included in the action history and the azimuth of the mobile terminal 10 when operating the camera of the mobile terminal 10. More specifically, the depiction history calculation unit 14 calculates a difference between the direction 214 (for example, “45 degrees from north”) included in the action history and the direction of the mobile terminal 10 (for example, “90 degrees from north”). And the avatar direction is calculated using the difference.
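The orientation computation reduces to the signed difference between two azimuths. A sketch follows; the wrap-around normalization to the range (-180, 180] is an assumption about how the difference would be handled, not something the patent specifies:

```python
def avatar_relative_bearing(history_azimuth_deg, terminal_azimuth_deg):
    """Angle of the avatar relative to the camera's viewing direction.

    Both inputs are azimuths in degrees from north: the direction 214
    stored in the action history and the value from the orientation
    acquisition unit 112 at camera time.
    """
    diff = (history_azimuth_deg - terminal_azimuth_deg) % 360.0
    if diff > 180.0:
        diff -= 360.0  # normalize to (-180, 180]
    return diff

# Example from the text: history says 45 degrees from north,
# terminal faces 90 degrees from north
angle = avatar_relative_bearing(45.0, 90.0)
```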
- the image composition unit 15 synthesizes the camera image input from the camera unit 12 and the avatar information (the coordinates for displaying the avatar, the size of the avatar, the avatar direction, etc.) input from the depiction history calculation unit 14 and displays them. It is a processing unit that generates a composite image to be displayed on the unit 16.
- When the avatar is outside the display range, the image composition unit 15 generates a composite image notifying this on the camera image generated by the camera unit 12.
- For example, the image composition unit 15 generates a composite image in which a message such as "Mr. A is walking outside the display range" is superimposed on the camera image generated by the camera unit 12.
- When the avatar is within the display range, the image composition unit 15 generates a composite image in which the avatar is superimposed on the camera image generated by the camera unit 12.
- the image composition unit 15 outputs the generated composite image to the display unit 16.
- FIG. 3 is a diagram for explaining the avatar display position and the size of the avatar by the depiction history calculation unit 14 and the concept of the composite image generation by the image composition unit 15.
- the drawing history calculation unit 14 calculates the focal length p from the degree of zooming of the camera, the imaging performance of the camera, and the like during shooting by the camera unit 12.
- the depiction history calculation unit 14 calculates the position where the action history to be drawn exists from the action history information.
- the user whose action history is to be drawn is at a point (1) that is a distance f1 from the imaging point.
- the height of the user who is the subject of action history depiction is height g1.
- the size of the avatar displayed in the composite image is obtained based on p × g1 / f1.
- the depiction history calculation unit 14 calculates the display position of the avatar in the composite image.
- the depiction history calculation unit 14 projects the action history depiction target user onto a corresponding point in the composite image.
- the drawing history calculation unit 14 uses the coordinates of the synthesized image corresponding to the projected position as the display position of the avatar.
- When the user whose action history is to be depicted is at a position elevated above the ground, as at point (2) of FIG. 3, the depiction history calculation unit 14 calculates the display position of the avatar according to the altitude.
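The size formula p × g1 / f1 and an altitude-dependent vertical offset can be sketched together. The focal length and height units below are arbitrary placeholders, and the offset function is an assumption about how the elevated case (point (2)) might be projected:

```python
def avatar_display_size(focal_length_p, user_height_g1, distance_f1):
    """Pinhole projection: on-screen avatar height = p * g1 / f1."""
    return focal_length_p * user_height_g1 / distance_f1

def avatar_vertical_offset(focal_length_p, altitude_m, distance_f1):
    """Vertical screen offset for a user located above the ground."""
    return focal_length_p * altitude_m / distance_f1

# A 1.7 m user 10 m away, with focal length 0.05 (arbitrary units);
# the same user 4x farther away appears 4x smaller.
size_near = avatar_display_size(0.05, 1.7, 10.0)
size_far = avatar_display_size(0.05, 1.7, 40.0)
```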
- the display unit 16 is an information display unit such as a display screen provided in the mobile terminal 10.
- The display unit 16 displays the composite image synthesized by the image composition unit 15, that is, an image in which the avatar is arranged on the camera image generated by the camera unit 12.
- the action history transmission unit 17 is a processing unit that transmits its own action history information to the action history server 20.
- the action history information is generated based on the peripheral information acquired by the terminal information acquisition unit 11.
- a user having the mobile terminal 10 instructs the start of recording of an action history.
- the instruction is performed by selecting the start of saving the action history from the display menu displayed on the display unit 16.
- the action history is generated by acquiring information by each processing unit of the terminal information acquiring unit 11 and synthesizing the acquired information.
- the user instructs to register the generated action history in the action history server 20.
- the behavior history transmission unit 17 transmits the behavior history to the behavior history server 20 (S101).
- the user activates an application for displaying an action history (S102).
- the user instructs to acquire a desired user's action history from the action history server 20. For example, a list of behavior histories is displayed, and an instruction to acquire a behavior history of a desired user is given by selecting a behavior history to be displayed.
- the action history acquisition unit 13 acquires desired action history information from the action history server 20 (S103).
- the user can acquire the action history of any other user.
- the present invention is not limited to this, and a certain restriction may be imposed on acquisition of the action history by password authentication or the like.
- the current position of the mobile terminal 10 is calculated by the position acquisition unit 111 (S104), and the camera function is activated (S105).
- When the user instructs reproduction of the action history (S106: Yes), drawing of the action history is started.
- the depiction history calculation unit 14 calculates the region captured by the camera (for example, latitude 35.00 degrees to 35.01 degrees, longitude 135.00 degrees to 135.01 degrees, altitude 0 m to 10 m) based on the orientation of the mobile terminal 10, the terminal posture, the current position information acquired from GPS, the imaging performance of the camera, and the like.
- the depiction history calculating unit 14 determines whether or not the depiction start point of the action history acquired from the action history server 20 is included in the imaging target region (S107).
- the drawing history calculation unit 14 calculates the display coordinates on the display unit 16 (S108). In addition, the depiction history calculation unit 14 calculates the size of the avatar to be placed at the coordinates and the orientation of the avatar (S108).
- the image composition unit 15 generates a composite image obtained by superimposing the avatar information on the camera image generated by the camera unit 12, and displays the composite image on the display unit 16 (S109).
- the avatar position, the size of the avatar, and the avatar direction are sequentially calculated by continuously performing the processes from S107 to S109 described above. As a result, an image in which the avatar is moving is generated.
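The repeated S107 to S109 processing can be sketched as a loop over history samples. The sample format, the `imaging_range` predicate, and the `render` callback are all placeholders standing in for the units described above:

```python
def replay_action_history(samples, imaging_range, render):
    """Repeat S107-S109: for each history sample, test the imaging range
    (S107), compute avatar geometry (S108), and draw the composite (S109).

    samples: iterable of (position, azimuth_deg) tuples.
    imaging_range: predicate position -> bool (the S107 containment test).
    render: callback receiving the avatar state, or None when the avatar
            is outside the display range.
    """
    frames = []
    for position, azimuth_deg in samples:
        if imaging_range(position):                            # S107
            state = {"pos": position, "azimuth": azimuth_deg}  # S108 (geometry elided)
            frames.append(render(state))                       # S109: superimpose avatar
        else:
            frames.append(render(None))                        # "outside display range"
    return frames

# Two samples: one inside the latitude band, one outside it
frames = replay_action_history(
    [((35.005, 135.005), 180.0), ((35.02, 135.005), 180.0)],
    lambda pos: 35.00 <= pos[0] <= 35.01,
    lambda state: "avatar" if state else "out-of-range",
)
```

Running the loop continuously over consecutive samples is what produces the impression of the avatar moving.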
- A case where Mr. X's action history is displayed on the display unit 16 of the mobile terminal 10 from the point C will be described.
- the point where Mr. X was located is included in the drawing range. Therefore, an avatar corresponding to Mr. X is displayed in the composite image.
- the avatar is arranged so that it faces the direction Mr. X was facing at the start of action history recording.
- the avatar is displayed based on the distance between the points C and A and the size calculated from the height information of Mr. X.
- the composite image continues to be displayed on the display unit 16 until the avatar moves to the point D.
- the position of the avatar, the size of the avatar, and the direction of the avatar are always calculated and reflected in the composite image displayed on the display unit 16.
- As described above, the mobile terminal according to the present embodiment calculates the action history description information (an avatar in the present embodiment) to be depicted with respect to the target space image captured by an imaging unit such as a camera, and generates and displays a composite image in which it is arranged. Since the avatar is arranged in the camera image captured by the imaging means, the user can grasp the action history from the composite image while feeling a sense of realism.
- For example, a friend's action history can be displayed by the mobile terminal 10 according to the present embodiment, so the user can follow the route the friend took while feeling a sense of realism.
- a map information server having information such as the position and height of a building is separately provided.
- the mobile terminal 10 acquires information from this map information server, and determines whether or not the avatar is located in the shadow of the building.
- the avatar may be hidden in the composite image.
- Alternatively, it may be indicated in the composite image that the avatar is located behind a building.
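One possible form of the occlusion test described above, assuming the map information server supplies the building's distance and height along the line of sight (both parameter names are hypothetical and the similar-triangles criterion is a simplification):

```python
def avatar_occluded(avatar_distance_m, avatar_height_m,
                    building_distance_m, building_height_m):
    """True if a building between the camera and the avatar hides the avatar.

    A building occludes the avatar when it is closer to the camera and,
    by similar triangles, its angular height exceeds the avatar's.
    """
    if building_distance_m >= avatar_distance_m:
        return False  # building is behind the avatar
    return (building_height_m / building_distance_m
            >= avatar_height_m / avatar_distance_m)

# A 10 m building at 20 m hides a 1.7 m avatar at 50 m
hidden = avatar_occluded(50.0, 1.7, 20.0, 10.0)
```

Depending on the result, the composition step would either omit the avatar or annotate that it is behind the building.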
- the mobile terminal 10 may be configured to include a sensor that detects peripheral obstacles, such as a laser range sensor or an ultrasonic sensor.
- an avatar arranged at a position behind the obstacle may be hidden.
- the avatar image to be displayed may be arbitrarily set by the user.
- the action history may be displayed by changing the display direction and display size of an arbitrary symbol (for example, a triangle or a square).
- a composite image may be generated by a server. That is, the mobile terminal 10 notifies its own position, terminal orientation, and terminal attitude to the server, and generates a composite image in the server.
- the mobile terminal 10 may display a composite image generated in the server.
- the mobile terminal 10 includes a terminal information acquisition unit 11, a camera unit 12, a depiction history calculation unit 14, an image composition unit 15, and a display unit 16.
- the terminal information acquisition unit 11 acquires information including position information of the mobile terminal 10, orientation information that the mobile terminal 10 faces, and terminal attitude information of the mobile terminal 10.
- the camera unit 12 is a processing unit that generates a camera image obtained by imaging the surroundings.
- the depiction history calculation unit 14 calculates the action history description information to be displayed based on the action history acquired in advance and the imaging range captured by the camera unit 12.
- the image composition unit 15 generates a composite image by synthesizing the action history description information calculated by the depiction history calculation unit 14 with the camera image.
- the display unit 16 displays the composite image generated by the image composition unit 15.
- (Appendix 1) An action history description method comprising: acquiring information including the position information of one's own terminal, the orientation information the terminal is facing, and the terminal attitude information of the terminal; generating a camera image capturing the surroundings; calculating action history description information to be displayed based on an action history acquired in advance and the imaging range of the camera image; generating a composite image in which the action history description information is depicted in the camera image; and displaying the composite image.
- (Appendix 5) The action history description method according to Appendix 1, comprising: generating the action history information from the information acquired in the terminal information acquisition process and transmitting it to an action history server; and acquiring the action history information of an arbitrary user from the action history server.
- the present invention can be used in a portable terminal such as a mobile phone or a PDA having an imaging function.
Abstract
Description
Embodiments of the present invention will be described below with reference to the drawings. First, the configuration of the action history depiction system 1 according to the first embodiment of the present invention will be described using the system configuration diagram of FIG. 1. This action history depiction system 1 includes a mobile terminal 10 and an action history server 20. Examples of the mobile terminal 10 include a mobile phone, a PDA (Personal Data Assistant), and a smartphone.
An action history depiction method comprising: acquiring information including position information of one's own terminal, azimuth information the terminal is facing, and terminal attitude information of the terminal;
generating a camera image capturing the surroundings;
calculating action history depiction information to be displayed based on an action history acquired in advance and the imaging range of the camera image;
generating a composite image in which the action history depiction information is depicted in the camera image;
and displaying the composite image.
The action history depiction method according to Appendix 1, wherein the process of calculating the action history depiction information calculates a depiction size at which the action history depiction information is depicted in the camera image, based on the imaging range captured in the camera image imaging process and position information included in the action history,
and the image composition process depicts the action history depiction information set to the depiction size in the camera image.
The action history depiction method according to Appendix 1 or Appendix 2, wherein the process of calculating the action history depiction information calculates an orientation in which the action history depiction information is depicted in the camera image, based on the imaging range captured in the camera image imaging process and azimuth information included in the action history,
and the image composition process depicts the action history depiction information arranged in that orientation in the camera image.
The action history depiction method according to any one of Appendices 1 to 3, wherein the action history depiction information is an avatar.
The action history depiction method according to Appendix 1, wherein the action history information is generated from the information acquired in the terminal information acquisition process and transmitted to an action history server,
and the action history information of an arbitrary user is acquired from the action history server.
10 mobile terminal
11 terminal information acquisition unit
111 position acquisition unit
112 orientation acquisition unit
113 terminal attitude acquisition unit
114 voice acquisition unit
12 camera unit
13 action history acquisition unit
14 depiction history calculation unit
15 image composition unit
16 display unit
17 action history transmission unit
20 action history server
21 action history table
Claims (10)
- A mobile terminal comprising: terminal information acquisition means for acquiring information including position information of its own terminal, azimuth information the terminal is facing, and terminal attitude information of the terminal; camera means for generating a camera image capturing the surroundings; depiction history calculation means for calculating action history depiction information to be displayed based on an action history acquired in advance and an imaging range of the camera means; image composition means for generating a composite image in which the action history depiction information is depicted in the camera image; and display means for displaying the composite image.
- The mobile terminal according to claim 1, wherein the depiction history calculation means calculates a depiction size at which the action history depiction information is depicted in the camera image, based on the imaging range of the camera means and position information included in the action history, and the image composition means depicts the action history depiction information set to the depiction size in the camera image.
- The mobile terminal according to claim 1 or claim 2, wherein the depiction history calculation means calculates an orientation in which the action history depiction information is depicted in the camera image, based on the imaging range of the camera means and azimuth information included in the action history, and the image composition means depicts the action history depiction information arranged in that orientation in the camera image.
- The mobile terminal according to any one of claims 1 to 3, wherein the action history depiction information is an avatar.
- The mobile terminal according to any one of claims 1 to 4, comprising: action history transmission means for generating the action history information from the information acquired by the terminal information acquisition means and transmitting it to an action history server; and action history acquisition means for acquiring the action history information of an arbitrary user from the action history server.
- An action history depiction system comprising the mobile terminal according to claim 5 and the action history server according to claim 5.
- An action history depiction method comprising: acquiring information including position information of one's own terminal, azimuth information the terminal is facing, and terminal attitude information of the terminal; generating a camera image capturing the surroundings; calculating action history depiction information to be displayed based on an action history acquired in advance and the imaging range of the camera image; generating a composite image in which the action history depiction information is depicted in the camera image; and displaying the composite image.
- The action history depiction method according to claim 7, wherein the process of calculating the action history depiction information calculates a depiction size at which the action history depiction information is depicted in the camera image, based on the imaging range captured in the camera image imaging process and position information included in the action history, and the image composition process depicts the action history depiction information set to the depiction size in the camera image.
- The action history depiction method according to claim 7 or claim 8, wherein the process of calculating the action history depiction information calculates an orientation in which the action history depiction information is depicted in the camera image, based on the imaging range captured in the camera image imaging process and azimuth information included in the action history, and the image composition process depicts the action history depiction information arranged in that orientation in the camera image.
- The action history depiction method according to any one of claims 7 to 9, wherein the action history depiction information is an avatar.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN2011800069586A CN102713948A (zh) | 2010-02-01 | 2011-01-19 | Portable terminal, activity history depiction method, and activity history depiction system |
US13/521,358 US20120281102A1 (en) | 2010-02-01 | 2011-01-19 | Portable terminal, activity history depiction method, and activity history depiction system |
JP2011551745A JPWO2011093031A1 (ja) | 2010-02-01 | 2011-01-19 | Portable terminal, action history depiction method, and action history depiction system |
EP11736748A EP2533188A1 (en) | 2010-02-01 | 2011-01-19 | Portable terminal, action history depiction method, and action history depiction system |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2010-020655 | 2010-02-01 | ||
JP2010020655 | 2010-02-01 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2011093031A1 true WO2011093031A1 (ja) | 2011-08-04 |
Family
ID=44319030
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2011/000236 WO2011093031A1 (ja) | Portable terminal, action history depiction method, and action history depiction system | 2010-02-01 | 2011-01-19 |
Country Status (5)
Country | Link |
---|---|
US (1) | US20120281102A1 (ja) |
EP (1) | EP2533188A1 (ja) |
JP (1) | JPWO2011093031A1 (ja) |
CN (1) | CN102713948A (ja) |
WO (1) | WO2011093031A1 (ja) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2013113521A1 (de) * | 2012-02-03 | 2013-08-08 | Robert Bosch Gmbh | Evaluation device for a monitoring system, and monitoring system having the evaluation device |
Families Citing this family (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
TWI435261B (zh) * | 2010-08-17 | 2014-04-21 | Wistron Corp | Electronic device and method for creating its panelized operation interface |
KR102107810B1 (ko) * | 2013-03-19 | 2020-05-28 | Samsung Electronics Co., Ltd. | Display apparatus and method for displaying information regarding its activities |
JP7520535B2 (ja) * | 2020-03-11 | 2024-07-23 | Canon Inc. | Image processing apparatus |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH11195138A (ja) | 1998-01-06 | 1999-07-21 | Image processing apparatus |
JPH11271060A (ja) * | 1998-03-26 | 1999-10-05 | Chubu Electric Power Co Inc | Video camera position and angle measurement apparatus |
JP2006105640A (ja) | 2004-10-01 | 2006-04-20 | Hitachi Ltd | Navigation apparatus |
WO2008100358A1 (en) * | 2007-02-16 | 2008-08-21 | Panasonic Corporation | Method and apparatus for efficient and flexible surveillance visualization with context sensitive privacy preserving and power lens data mining |
JP2009271750A (ja) * | 2008-05-08 | 2009-11-19 | Ntt Docomo Inc | Virtual space providing apparatus, program, and virtual space providing system |
JP2010020655A (ja) | 2008-07-14 | 2010-01-28 | Sharp Corp | Image forming apparatus, image forming method, printer driver program, and image forming system |
Family Cites Families (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6552744B2 (en) * | 1997-09-26 | 2003-04-22 | Roxio, Inc. | Virtual reality camera |
US6119147A (en) * | 1998-07-28 | 2000-09-12 | Fuji Xerox Co., Ltd. | Method and system for computer-mediated, multi-modal, asynchronous meetings in a virtual space |
JP4526639B2 (ja) * | 2000-03-02 | 2010-08-18 | 本田技研工業株式会社 | Face recognition apparatus and method
US7375755B2 (en) * | 2001-08-30 | 2008-05-20 | Canon Kabushiki Kaisha | Image processing apparatus and method for displaying an image and posture information |
JP3768135B2 (ja) * | 2001-09-28 | 2006-04-19 | 三洋電機株式会社 | Digital camera
TW593978B (en) * | 2002-02-25 | 2004-06-21 | Mitsubishi Electric Corp | Video picture processing method |
JP2004088729A (ja) * | 2002-06-25 | 2004-03-18 | Fuji Photo Film Co Ltd | Digital camera system
JP2004227332A (ja) * | 2003-01-23 | 2004-08-12 | Hitachi Ltd | Information display method
US7574070B2 (en) * | 2003-09-30 | 2009-08-11 | Canon Kabushiki Kaisha | Correction of subject area detection information, and image combining apparatus and method using the correction |
JP2005128437A (ja) * | 2003-10-27 | 2005-05-19 | Fuji Photo Film Co Ltd | Imaging apparatus
JP3971783B2 (ja) * | 2004-07-28 | 2007-09-05 | 松下電器産業株式会社 | Panoramic image synthesis method, object detection method, panoramic image synthesis device, imaging device, object detection device, and panoramic image synthesis program
WO2006035755A1 (ja) * | 2004-09-28 | 2006-04-06 | National University Corporation Kumamoto University | Mobile navigation information display method and mobile navigation information display device
JP2007025483A (ja) * | 2005-07-20 | 2007-02-01 | Ricoh Co Ltd | Image storage processing apparatus
JP4965475B2 (ja) * | 2008-02-05 | 2012-07-04 | オリンパス株式会社 | Virtual movement display device
JP4758499B2 (ja) * | 2009-07-13 | 2011-08-31 | 株式会社バンダイナムコゲームス | Image generation system and information storage medium
2011
- 2011-01-19 JP JP2011551745A patent/JPWO2011093031A1/ja not_active Ceased
- 2011-01-19 US US13/521,358 patent/US20120281102A1/en not_active Abandoned
- 2011-01-19 EP EP11736748A patent/EP2533188A1/en not_active Withdrawn
- 2011-01-19 CN CN2011800069586A patent/CN102713948A/zh active Pending
- 2011-01-19 WO PCT/JP2011/000236 patent/WO2011093031A1/ja active Application Filing
Non-Patent Citations (1)
Title |
---|
HIDEKI MATSUMOTO: "Keitai Muke Kakucho Genjitsu 'AR' ga Jitsuyo-ka", NIKKEI COMMUNICATIONS, 15 October 2009 (2009-10-15), pages 14 - 15, XP008153715 * |
Also Published As
Publication number | Publication date |
---|---|
JPWO2011093031A1 (ja) | 2013-05-30 |
US20120281102A1 (en) | 2012-11-08 |
EP2533188A1 (en) | 2012-12-12 |
CN102713948A (zh) | 2012-10-03 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP7268692B2 (ja) | Information processing apparatus, control method, and program | |
JP6635037B2 (ja) | Information processing apparatus, information processing method, and program | |
JP6329343B2 (ja) | Image processing system, image processing apparatus, image processing program, and image processing method | |
KR20210046592A (ko) | Augmented reality data presentation method, apparatus, device, and storage medium | |
JP6683127B2 (ja) | Information processing apparatus, information processing method, and program | |
JP6077068B1 (ja) | Augmented reality system and augmented reality method | |
CN108710525B (zh) | Map display method, apparatus, device, and storage medium in virtual scene | |
JP7026819B2 (ja) | Camera positioning method and apparatus, terminal, and computer program | |
WO2017219195A1 (zh) | Augmented reality display method and head-mounted display device | |
JP6348741B2 (ja) | Information processing system, information processing apparatus, information processing program, and information processing method | |
KR101665399B1 (ko) | Apparatus and method for generating augmented-reality objects based on actual measurement | |
CN107771310B (zh) | Head-mounted display device and processing method thereof | |
JP6624646B2 (ja) | Information presentation system, information presentation method, and information presentation program | |
JP2006105640A (ja) | Navigation device | |
JP2007133489A5 (ja) | ||
JP2012048597A (ja) | Mixed reality display system, image providing server, display device, and display program | |
JPWO2018034053A1 (ja) | Information processing apparatus, information processing system, and information processing method | |
JP6145563B2 (ja) | Information display device | |
WO2011093031A1 (ja) | Mobile terminal, action history depiction method, and action history depiction system | |
JP5350427B2 (ja) | Image processing apparatus, control method therefor, and program | |
US20150371449A1 (en) | Method for the representation of geographically located virtual environments and mobile device | |
CN112212865B (zh) | Guidance method and apparatus in AR scene, computer device, and storage medium | |
JP6393000B2 (ja) | Hypothetical line mapping and verification for 3D maps | |
WO2018008096A1 (ja) | Information display device and program | |
JP2011022662A (ja) | Mobile phone terminal and information processing system | |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase |
Ref document number: 201180006958.6 Country of ref document: CN |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 11736748 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2011736748 Country of ref document: EP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 13521358 Country of ref document: US |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2011551745 Country of ref document: JP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 6692/CHENP/2012 Country of ref document: IN |
|
NENP | Non-entry into the national phase |
Ref country code: DE |