WO2013145900A1 - Information processing apparatus, information processing method, and program - Google Patents
Information processing apparatus, information processing method, and program
- Publication number: WO2013145900A1 (application PCT/JP2013/053204)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- user
- captured image
- unit
- information processing
- expression
- Prior art date
Classifications
- G06V40/20—Movements or behaviour, e.g. gesture recognition (under G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data)
- G06V40/172—Classification, e.g. identification (under G06V40/16—Human faces, e.g. facial parts, sketches or expressions)
- G06T11/00—2D [Two Dimensional] image generation
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/61—Control of cameras or camera modules based on recognised objects
- H04N23/633—Control of cameras or camera modules by using electronic viewfinders for displaying additional information relating to control or operation of the camera
- H04N23/661—Transmitting camera control signals through networks, e.g. control via the Internet
- G08B13/19602—Image analysis to detect motion of the intruder, e.g. by frame subtraction
- G08B13/19613—Recognition of a predetermined image pattern or behaviour pattern indicating theft or intrusion
Definitions
- This disclosure relates to an information processing apparatus, an information processing method, and a program.
- A technique for identifying a person shown in a captured image has been developed. For example, a technique is disclosed in which a face area appearing in a captured image is collated with a face image prepared in advance, and the person in the captured image is identified based on the collation result (see, for example, Patent Document 1). According to this technique, a door is opened and closed only when the person is successfully identified, so security can be enhanced.
- According to an embodiment of the present disclosure, there is provided an information processing apparatus including: an image acquisition unit that acquires a captured image; a change detection unit that detects a change in the state, in a network service, of a subject recognized from the captured image; and an expression changing unit that changes the expression of the subject shown in the captured image when the change in the state is detected by the change detection unit.
- There is also provided an information processing method including: acquiring a captured image; detecting a change in the state, in a network service, of a subject recognized from the captured image; and changing the expression of the subject shown in the captured image when the change in the state is detected.
- There is further provided a program for causing a computer to function as an information processing apparatus including: an image acquisition unit that acquires a captured image; a change detection unit that detects a change in the state, in a network service, of a subject recognized from the captured image; and an expression changing unit that changes the expression of the subject shown in the captured image when the change in the state is detected by the change detection unit.
- In this specification and the drawings, a plurality of constituent elements having substantially the same functional configuration may be distinguished by appending different letters to the same reference numeral.
- When a plurality of constituent elements having substantially the same functional configuration need not be particularly distinguished, only the same reference numeral is given.
- The functions of the information processing apparatus according to the present embodiment are roughly classified into a user identification function, a composition determination function, and an expression change function. Before describing each function, the preconditions common to them will be described.
- FIG. 1 is a diagram for explaining an outline of the present embodiment.
- a user U having a mobile terminal 20 exists in real space.
- the user U may exist anywhere in the real space.
- the mobile terminal 20 is provided with a function of displaying an image, and the user U can view a screen displayed by the mobile terminal 20.
- the user U is imaged by an imaging device 30 provided separately from the mobile terminal 20.
- the position where the imaging device 30 is provided is not particularly limited.
- the imaging device 30 may be attached in a building like a surveillance camera, or may be attached to a moving body such as a vehicle.
- the type of the imaging device 30 is not particularly limited, but may be an infrared camera, for example.
- FIG. 2 is a diagram showing a configuration of the information processing system according to the present embodiment.
- the information processing system 1 includes an information processing device 10, a mobile terminal 20, and an imaging device 30.
- Although FIG. 2 shows an example in which the information processing apparatus 10 and one mobile terminal 20 are connected via the network 40, the number of mobile terminals 20 connected to the network 40 is not limited to one.
- the information processing apparatus 10 and the imaging apparatus 30 are also connected.
- In FIG. 2, one imaging device 30 is connected to the information processing device 10, but the number of imaging devices 30 is not limited to one.
- a plurality of imaging devices 30 may exist, and each of the plurality of imaging devices 30 may be connected to the information processing device 10.
- In FIG. 2, the information processing device 10 is configured separately from the mobile terminal 20 and the imaging device 30, but it may instead be incorporated in the mobile terminal 20 or in the imaging device 30.
- the captured image captured by the imaging device 30 is provided to the mobile terminal 20 via the information processing device 10, and the mobile terminal 20 can display the captured image.
- the information processing apparatus 10 can exhibit a user specifying function, a composition determination function, and an expression change function using the captured image provided from the imaging apparatus 30.
- FIG. 3 is a block diagram illustrating a functional configuration example of the information processing apparatus 10 according to the present embodiment.
- the information processing apparatus 10 includes a processing control unit 100, a display control unit 160, and a storage unit 50.
- the processing control unit 100 includes an image acquisition unit 111, a parameter acquisition unit 112, a candidate extraction unit 113, a specification unit 114, and an authentication unit 115.
- the image acquisition unit 111, parameter acquisition unit 112, candidate extraction unit 113, identification unit 114, and authentication unit 115 are blocks mainly related to the user identification function of the information processing apparatus 10.
- The processing control unit 100 also includes a recognition unit 121, a condition determination unit 122, an information addition unit 123, a mode setting unit 124, a composition determination unit 125, and an image determination unit 126. These are blocks mainly related to the composition determination function of the information processing apparatus 10.
- the processing control unit 100 includes a change detection unit 131 and an expression change unit 132.
- the change detection unit 131 and the expression change unit 132 are mainly blocks related to the expression change function of the information processing apparatus 10.
- the processing control unit 100 and the display control unit 160 correspond to a processor such as a CPU (Central Processing Unit) or a DSP (Digital Signal Processor).
- the processing control unit 100 and the display control unit 160 exhibit various functions of the processing control unit 100 and the display control unit 160 by executing a program stored in the storage unit 50 or another storage medium.
- the storage unit 50 stores a program and data for processing by the processing control unit 100 and the display control unit 160 using a storage medium such as a semiconductor memory or a hard disk.
- the storage unit 50 stores a feature dictionary used for object recognition.
- the storage unit 50 can store a recognition result generated as a result of object recognition.
- the storage unit 50 is incorporated in the information processing apparatus 10, but the storage unit 50 and the information processing apparatus 10 may be configured separately.
- FIG. 4 is a diagram illustrating an overview of the user specifying function of the information processing apparatus 10.
- the image acquisition unit 111 acquires a captured image captured by the imaging device 30.
- the captured image Img obtained by capturing the real space illustrated in FIG. 1 is acquired by the image acquisition unit 111 as an example of the captured image.
- the captured image acquired by the image acquisition unit 111 is controlled by the display control unit 160 so as to be displayed by the mobile terminal 20.
- the user is specified from the subject shown in the captured image acquired by the image acquisition unit 111.
- The purpose of identifying the user is not particularly limited; here, it is assumed that some network service is provided by the information processing apparatus 10 and that the user is identified so that the user U can log in to the network service via the mobile terminal 20.
- The type of network service is also not particularly limited; here, a service is assumed in which, using a mobile terminal that has logged in to the network service, the user can write messages and browse messages written by the user or by other users.
- The timing at which the user is identified is not particularly limited; for example, it may be when an operation requesting user identification is performed on the mobile terminal 20 by the user.
- the operation for requesting the user identification may be an operation for requesting login to the network service.
- the parameter acquisition unit 112 acquires parameters related to the user to be specified.
- the timing at which parameters are acquired by the parameter acquisition unit 112 is not particularly limited.
- The parameters may be acquired in advance from the user's mobile terminal 20, or may be acquired from the mobile terminal 20 at the time the user is identified.
- the type of parameter acquired by the parameter acquisition unit 112 is not particularly limited.
- the parameter acquisition unit 112 may acquire the user's face image as the parameter P1, or may acquire sensor data indicating the user's movement as the parameter P2.
- An apparatus for capturing a user's face image is not particularly limited.
- the face image of the user may be, for example, a face image captured by the mobile terminal 20 or a face image captured by another device.
- The sensor that detects the user's movement is not particularly limited; it may be an acceleration sensor, a GPS signal reception sensor, or a sensor that receives radio waves transmitted from a Wi-Fi base station.
- The sensor that detects the user's movement may be built into the mobile terminal 20 or attached to the user's body.
- Before the user is identified, the candidate extraction unit 113 may narrow down the user candidates (step S11). For example, when coordinates on the captured image displayed on the mobile terminal 20 are specified by a user operation input to the mobile terminal 20 (for example, a touch operation on the touch panel), the parameter acquisition unit 112 acquires the coordinates, and the candidate extraction unit 113 may treat the subject selected by the coordinates (for example, the subject present at the coordinates) as a user candidate. Narrowing down the user candidates reduces the load required for user identification.
- The identifying unit 114 attempts to identify the user from the captured image based on the parameter acquired by the parameter acquisition unit 112 (step S12). For example, when the user's face image is acquired as the parameter P1, the identifying unit 114 may identify as the user the subject extracted by collating the face area of each subject in the captured image acquired by the image acquisition unit 111 with the user's face image; for example, a subject whose face area has a similarity to the user's face image exceeding a threshold may be identified as the user.
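The patent does not prescribe a concrete matching algorithm; the following is a minimal sketch of the thresholded face collation described above, assuming precomputed face feature vectors. The `identify_user` helper, the cosine metric, and the 0.8 threshold are illustrative assumptions, not part of the disclosure.

```python
import numpy as np

def cosine_similarity(a, b):
    """Similarity between two face feature vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def identify_user(face_features, user_feature, threshold=0.8):
    """Return the subject whose face area is most similar to the user's
    face image, provided the similarity exceeds the threshold.
    None means identification failed (cf. step S13)."""
    best_id, best_sim = None, threshold
    for subject_id, feature in face_features:
        sim = cosine_similarity(feature, user_feature)
        if sim > best_sim:
            best_id, best_sim = subject_id, sim
    return best_id

# Toy run with random vectors standing in for real face features:
rng = np.random.default_rng(0)
user_vec = rng.normal(size=128)
subjects = [("subject_a", rng.normal(size=128)),
            ("subject_b", user_vec + rng.normal(0, 0.01, size=128))]
print(identify_user(subjects, user_vec))  # -> subject_b
```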
- Alternatively, when sensor data indicating the user's movement is acquired as the parameter P2, the identifying unit 114 may identify as the user the subject extracted by comparing the movement of each subject detected from the captured image with the sensor data; for example, a subject whose detected movement has a similarity to the sensor data exceeding a threshold may be identified as the user.
- When user candidates have been extracted, the identifying unit 114 may identify the user from among the candidates based on the parameter. A sketch of the motion comparison follows.
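The comparison between image-derived motion and sensor data can be sketched in the same spirit; the normalized-correlation score and the 0.7 threshold below are assumptions for illustration, since the patent leaves the metric open.

```python
import numpy as np

def motion_similarity(image_motion, sensor_motion):
    """Normalized correlation between a subject's motion magnitude observed
    in the captured image and the magnitude derived from the user's sensor
    data (both assumed resampled to a common rate)."""
    a = (image_motion - image_motion.mean()) / (image_motion.std() + 1e-9)
    b = (sensor_motion - sensor_motion.mean()) / (sensor_motion.std() + 1e-9)
    return float(np.mean(a * b))

def identify_by_motion(subject_motions, sensor_motion, threshold=0.7):
    """Pick the subject whose detected movement best matches the sensor data."""
    scores = {sid: motion_similarity(m, sensor_motion)
              for sid, m in subject_motions.items()}
    best = max(scores, key=scores.get)
    return best if scores[best] > threshold else None

t = np.linspace(0, 4 * np.pi, 200)
sensor = np.abs(np.sin(t))                       # stand-in accelerometer magnitude
subjects = {"a": sensor + 0.05 * np.random.default_rng(1).normal(size=200),
            "b": np.abs(np.cos(3 * t))}
print(identify_by_motion(subjects, sensor))      # -> a
```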
- When the user is not identified, the display control unit 160 controls the mobile terminal 20 so that information indicating the user action necessary for identification is displayed on the mobile terminal 20 (step S13).
- FIG. 5 is a diagram illustrating a display example of the mobile terminal 20 when the user identification fails.
- For example, the display control unit 160 may control the mobile terminal 20 so that information prompting the user to turn his or her face toward the imaging device is displayed on the mobile terminal 20.
- FIG. 5 shows an example in which the imaging device 30 is a surveillance camera and the display control unit 160 controls the mobile terminal 20 so that the message M1 "Please turn your face toward the surveillance camera" is displayed on the mobile terminal 20.
- The authentication unit 115 may perform an authentication process on the mobile terminal 20 of the user identified by the identifying unit 114 before the process proceeds to step S15 (step S14). Specifically, as illustrated in FIG. 4, when a password is input by a user operation on the mobile terminal 20, the authentication unit 115 may determine the success or failure of authentication depending on whether the input password matches a previously registered password. If the authentication succeeds, the process proceeds to step S15; if it fails, the authentication process may be performed again without proceeding to step S15.
- When authentication succeeds, login to the network service is completed (step S15), and the display control unit 160 controls the mobile terminal 20 so that the captured image and additional information are displayed on the mobile terminal 20 (step S16).
- the additional information is not particularly limited, but may be a message written by the user or another user, for example.
- FIG. 6 is a diagram illustrating a display example of the mobile terminal 20 when the user identification is successful.
- the display control unit 160 controls the mobile terminal 20 so that the captured image and the additional information are displayed.
- FIG. 6 shows an example in which the display control unit 160 controls the mobile terminal 20 so that a message written by another user is displayed as additional information.
- FIG. 7 is a diagram illustrating a display example of the mobile terminal 20 when the user identification fails.
- As shown in FIG. 7, based on the position of the user candidate extracted by the candidate extraction unit 113 and the preset position of the imaging device 30, the display control unit 160 may control the mobile terminal 20 so that information indicating the user action necessary for identification is displayed on the mobile terminal 20.
- For example, the display control unit 160 may control the mobile terminal 20 so that, with the user candidate's position as the reference, a straight line in the direction the candidate is facing and a straight line in the direction of the imaging device 30 are displayed. As illustrated in FIG. 7, it may also display an arrow indicating the rotation from the former line to the latter, and the message M2 "Turn Left" indicating the rotation direction as seen from the user.
- FIG. 8 is a diagram illustrating a display example of the mobile terminal 20 when the user identification fails.
- the display control unit 160 may control the mobile terminal 20 to display information indicating that the user is specified by another specifying method.
- For example, the display control unit 160 may control the mobile terminal 20 so that the message M3 "Switching the user identification method" is displayed on the mobile terminal 20 as information indicating that the user will be identified by another method.
- the identification unit 114 may identify a subject extracted by another identification method as a user.
- For example, suppose the identifying unit 114 first attempts to identify as the user a subject extracted by collating the face area of each subject in the captured image with the user's face image. If this fails, the identifying unit 114 may then attempt to identify as the user a subject extracted by comparing the movement detected from the captured image with the sensor data; identification may also be attempted in the reverse order.
- In addition, when a subject moving outside a specified normal range is detected from the captured image, the identifying unit 114 can identify that subject as the user.
- FIG. 9 is a diagram illustrating a display example of the mobile terminal 20 when the user identification fails.
- As shown in FIG. 9, the display control unit 160 may control the mobile terminal 20 so that information prompting the user to move the mobile terminal 20 is displayed as the information indicating the user action necessary for identification; for example, the message M4 "Please move the mobile terminal" may be displayed on the mobile terminal 20.
- As described above, according to the user identification function, an attempt is made to identify the user from the captured image based on a parameter related to the user of the mobile terminal, and when the user is not identified, the display control unit 160 performs control so that information indicating the user action necessary for identification is displayed on the mobile terminal 20. Such control is expected to make it easier to identify the user from the captured image.
- FIG. 10 is a diagram illustrating an outline of the composition determination function of the information processing apparatus 10.
- the captured image is displayed on the user's mobile terminal 20.
- the user can browse the captured image displayed on the mobile terminal 20.
- Here, assuming a scene in which the user heads for a destination while viewing the captured image, determination of a composition preferable for the captured image viewed by such a user will be described.
- information indicating the user's destination is set.
- Information indicating the user's destination may be input by a user operation input to the mobile terminal 20 or may be set by an application executed by the mobile terminal 20.
- Information indicating the user's destination is represented by a position in the real space, for example.
- Information indicating the user's destination is acquired by the information processing apparatus 10.
- the composition determination unit 125 determines the composition based on the user's recognition result based on the captured image acquired by the image acquisition unit 111 and information indicating the user's destination.
- Here, the composition means the ideal arrangement of an object (or the ideal direction of the object) in an image.
- the user recognition result includes at least the user's traveling direction and the user's position in the real space.
- the recognition result of the user is acquired by the recognition unit 121. That is, the recognition unit 121 obtains a recognition result by recognizing the user based on the captured image.
- In FIG. 10, the user's position in real space is denoted Pu, the user's traveling direction is denoted V1, and the direction of the user's destination relative to the position Pu is denoted V2.
- The composition determination unit 125 may determine the composition based on the direction of the user's destination relative to the user's position. For example, in FIG. 10, when the composition is determined based on V2, an image in which the region on the V2 side of the user's position Pu is larger than the region on the opposite side may be determined as the composition.
- FIG. 10 shows an example of the composition determined in this way as K2.
- a point C2 indicates the center point of the composition K2.
- The composition determination unit 125 may instead determine the composition based on the user's traveling direction. For example, in FIG. 10, when the composition is determined based on V1, an image in which the region on the V1 side of the user's position Pu is larger than the region on the opposite side may be determined as the composition.
- FIG. 10 shows an example of the composition determined in this way as K1. A point C1 indicates the center point of the composition K1.
- The composition determination unit 125 need not uniformly use either the user's traveling direction V1 or the destination direction V2 relative to the position Pu; the two determination methods may be used selectively. That is, with the mode for determining the composition based on V1 as the first mode and the mode for determining it based on V2 as the second mode, the mode setting unit 124 may set one of the two modes, and the composition determination unit 125 may determine the composition according to the set mode. The mode may be set based on a user operation input to the mobile terminal 20 or by an application executed on the mobile terminal 20.
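A minimal sketch of this mode-dependent composition determination, assuming a 2-D image-plane coordinate system, a fixed frame size, and an illustrative 120-pixel offset (none of which the patent specifies):

```python
from dataclasses import dataclass
import numpy as np

FIRST_MODE, SECOND_MODE = 1, 2   # traveling direction V1 vs. destination direction V2

@dataclass
class Composition:
    center: np.ndarray   # ideal center point of the framed region (cf. C1 / C2)
    size: tuple          # desired (width, height) of the framed region

def determine_composition(user_pos, v1, v2, mode,
                          frame_size=(640, 360), offset=120.0):
    """Place the composition center ahead of the user so that the region in
    the chosen direction is larger than the region behind (cf. K1 / K2)."""
    direction = v1 if mode == FIRST_MODE else v2
    unit = direction / (np.linalg.norm(direction) + 1e-9)
    center = np.asarray(user_pos, dtype=float) + offset * unit
    return Composition(center=center, size=frame_size)

comp = determine_composition(user_pos=(400, 300),
                             v1=np.array([1.0, 0.0]),
                             v2=np.array([0.0, 1.0]),
                             mode=SECOND_MODE)
print(comp.center)   # [400. 420.], shifted toward the destination direction
```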
- the image determination unit 126 determines a display image based on the composition determined by the composition determination unit 125.
- The image determination unit 126 may determine as the display image an image that matches the composition determined by the composition determination unit 125, or the image closest to that composition within the selectable range.
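For the fixed-camera case, "the image closest to the composition within the selectable range" can be read as a crop window clamped to the frame; the following sketch makes that assumption.

```python
import numpy as np

def crop_to_composition(captured, center, size):
    """Cut out the region of the captured image nearest the determined
    composition; the window is clamped so it stays inside the frame."""
    h, w = captured.shape[:2]
    cw, ch = size
    x = int(np.clip(center[0] - cw / 2, 0, max(w - cw, 0)))
    y = int(np.clip(center[1] - ch / 2, 0, max(h - ch, 0)))
    return captured[y:y + ch, x:x + cw]

frame = np.zeros((720, 1280, 3), dtype=np.uint8)   # stand-in captured image
display = crop_to_composition(frame, center=(520, 300), size=(640, 360))
print(display.shape)   # (360, 640, 3)
```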
- the display control unit 160 controls the mobile terminal 20 so that the display image determined by the image determination unit 126 is displayed on the mobile terminal 20.
- FIG. 11 is a diagram for explaining a display image when the first mode is set.
- the composition K1 is determined based on the traveling direction V1 of the user.
- the image determination unit 126 may determine an image that matches the composition K1 as the display image D1.
- The composition determination unit 125 may determine a composition in which the user is at the center, but as shown in FIG. 11, it may also determine a composition in which the user is at a position a predetermined distance d1 away from the center.
- Here, the current imaging conditions may also be judged. That is, the condition determination unit 122 determines whether the conditions for imaging the user are satisfied, and when it determines that they are not, the information addition unit 123 may add information indicating that fact to the display image.
- For example, if the condition determination unit 122 determines an imageable area, the information addition unit 123 may add an object indicating the limit of the imageable area to the display image; in FIG. 11, such an object is added to the display image D1 as A1.
- The information addition unit 123 may also add an object indicating deterioration of the imaging conditions to the display image. Such processing can prompt the user to change the traveling direction, but the information provided to the user is not limited to added objects: the information addition unit 123 may control the mobile terminal 20 to vibrate, or to emit an alarm sound.
- FIG. 12 is a diagram for explaining a display image when the second mode is set.
- the image determination unit 126 may determine an image that matches the composition K2 as the display image D2.
- The composition determination unit 125 may determine a composition in which the user is at the center, but as shown in FIG. 12, it may also determine a composition in which the user is at a position a predetermined distance d2 away from the center.
- In the above, the image determination unit 126 determines the display image by cutting out a partial region of the captured image captured by the imaging device 30 based on the composition determined by the composition determination unit 125. When the imaging device 30 is a single fixed camera, only such a method can be used; however, when the imaging device 30 is a direction-changing camera or an angle-of-view-changing camera, or when there are a plurality of imaging devices 30, other methods can be employed.
- Here, a fixed camera means a camera whose imaging direction and angle of view cannot be changed. A direction-changing camera means a movable camera, such as one having a pan function or a tilt function. An angle-of-view-changing camera means a camera whose angle of view can be changed.
- FIG. 13 is a diagram for explaining variations of the display image determination method. As illustrated in FIG. 13, when the imaging device 30 is a fixed camera, the image determination unit 126 adopts the method of determining a region selected from the captured image as the display image.
- When the imaging device 30 is a direction-changing camera, the image determination unit 126 determines the imaging direction of the imaging device 30 based on the composition determined by the composition determination unit 125, and determines the image captured in that direction as the display image. More specifically, the image determination unit 126 may determine, as the imaging direction, the direction in which the image closest to the composition can be captured.
- When the imaging device 30 is an angle-of-view-changing camera, the image determination unit 126 determines the angle of view of the imaging device 30 based on the composition determined by the composition determination unit 125, and determines the image captured at that angle of view as the display image. More specifically, the image determination unit 126 may determine, as the angle of view, the angle at which the image closest to the composition can be captured.
- When there are a plurality of imaging devices 30, the image determination unit 126 determines, as the display image, a captured image selected from the images provided by the respective imaging devices 30, based on the composition determined by the composition determination unit 125. More specifically, the image determination unit 126 may select, from the provided captured images, the one closest to the composition and determine it as the display image.
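The patent leaves the "closest to the composition" metric open; one hypothetical scoring is the distance between the user's position in each camera's image and the ideal composition center, as sketched below.

```python
import numpy as np

def pick_best_camera(user_positions, ideal_center):
    """Among the captured images provided by multiple imaging devices,
    select the one whose framing comes closest to the determined
    composition (scored here, for illustration, by how near the user
    appears to the ideal composition center in each image).

    user_positions: {camera_id: (x, y) position of the user in that image}
    """
    def distance(cam):
        return np.linalg.norm(np.asarray(user_positions[cam]) -
                              np.asarray(ideal_center))
    return min(user_positions, key=distance)

print(pick_best_camera({"cam1": (100, 80), "cam2": (320, 180)},
                       ideal_center=(320, 180)))   # -> cam2
```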
- In the above, the image closest to the composition determined by the composition determination unit 125 is used as the display image, but there are also situations in which no display image can be determined based on the composition. In such a case, the image determination unit 126 may determine the display image based on detection data other than the composition.
- The detection data is not particularly limited.
- the image determination unit 126 can also determine, as a display image, an image obtained by capturing a position closest to the user's position specified by the specifying unit 114 based on the sensor data as described above.
- the image determination unit 126 may determine an image with the smallest number of reflected people as a display image, or may determine an image with the lowest density of reflected people as a display image.
- The image determination unit 126 may determine the image capturing the widest range as the display image, or may determine the most frequently used captured image as the display image.
- the image determination unit 126 may determine, as a display image, an image corresponding to the user position detected based on data other than the captured image and information indicating the user destination.
- For example, when position information indicating the user's position is detected, the image determination unit 126 may search for images from a Web page or the like using the position information and the information indicating the user's destination.
- The image may be a drawn map or a photograph.
- When a plurality of images are obtained, the image determination unit 126 may determine a new image generated by combining the plurality of images as the display image.
- FIG. 14 is a diagram showing an example of the determined display image.
- As shown in FIG. 14, the display control unit 160 may add, to the display image D2, an object Obj1 indicating the distance from the user's position Pu to the destination and an object Obj2 indicating the arrival time at the destination.
- As shown in FIG. 14, the display control unit 160 may also add to the display image D2 an object Obj3 indicating the orientation and an object Obj4 indicating the direction of the user's destination relative to the position Pu. Further, it may add, at the user's position Pu, an object Obj5 indicating the direction of the destination and an object Obj6 indicating the user's position Pu.
- FIG. 15 is a flowchart illustrating the flow of the composition determination operation performed by the information processing apparatus 10.
- When no captured image is acquired by the image acquisition unit 111 ("No" in step S21), the recognition unit 121 recognizes the user from data other than the captured image (step S22), and the operation proceeds to step S26.
- When a captured image is acquired by the image acquisition unit 111 ("Yes" in step S21), the recognition unit 121 recognizes the user from the captured image (step S23), and the condition determination unit 122 determines whether the conditions for imaging are satisfied (step S24). If the condition determination unit 122 determines that the conditions are satisfied ("Yes" in step S24), the operation proceeds to step S26. If it determines that they are not ("No" in step S24), the information addition unit 123 adds information indicating that the conditions are not satisfied (step S25), and the operation proceeds to step S26.
- Subsequently, the mode setting unit 124 sets either the first mode or the second mode (step S26), and the composition determination unit 125 determines the composition according to the set mode, based on the recognition result from the recognition unit 121 (step S27). The image determination unit 126 then attempts to determine a display image based on the composition determined by the composition determination unit 125 (step S28).
- When the display image is determined by the image determination unit 126 ("Yes" in step S29), the operation proceeds to step S31. When it is not ("No" in step S29), the image determination unit 126 determines a display image based on detection data other than the composition (step S30), and the operation proceeds to step S31.
- the display control unit 160 controls the mobile terminal 20 so that the display image determined by the image determination unit 126 is displayed on the mobile terminal 20 (step S31), and the operation ends.
- As described above, according to the composition determination function, a composition is determined based on the user's recognition result from the captured image and information indicating the user's destination, and a display image is determined based on that composition. Such control is expected to yield a display image that is more convenient for the user.
- Login to the network service is not limited to the user U; other users can also log in. That is, a person shown in the captured image (hereinafter also referred to as a "subject") may be using the network service. Therefore, a function for making it easy to grasp, in the captured image, a change in the state of a subject in the network service will be described below. Note that a subject shown in the captured image can be recognized by the recognition unit 121 by a method similar to that for identifying the user.
- The change detection unit 131 detects a change in the state, in the network service, of a subject recognized from the captured image acquired by the image acquisition unit 111.
- The state in the network service is not particularly limited; for example, it may be information indicating whether the subject's mobile terminal is logged in to the network service. That is, the expression changing unit 132 may change the expression of the subject shown in the captured image when the change detection unit 131 detects that the subject's mobile terminal has logged in to the network service.
- The state in the network service may also be information indicating whether a specific operation has been performed while logged in.
- The expression changing unit 132 may change the expression of the subject shown in the captured image when the change detection unit 131 detects that the operation of the subject's mobile terminal in the network service satisfies a predetermined condition.
- The predetermined condition is not particularly limited; it may be, for example, that a message has been written, or that a message has been written within a predetermined time.
- the expression changing unit 132 changes the expression of the subject shown in the captured image when the change detecting unit 131 detects a change in state.
- The region of the subject in the captured image may be grasped by the expression changing unit 132 in any way; for example, when a difference arises between the captured images before and after the subject appears, the difference region may be treated as the subject region. There is also no particular limitation on how the expression of the subject is changed.
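A minimal sketch of the interplay between the change detection unit 131 and the expression change unit 132, with an illustrative `logged_in` flag standing in for the network-service state (the patent does not fix a state representation):

```python
class ChangeDetectionUnit:
    """Remembers each subject's last known network-service state and
    reports transitions (a stand-in for change detection unit 131)."""
    def __init__(self):
        self._last = {}

    def detect(self, subject_id, current_state):
        previous = self._last.get(subject_id)
        self._last[subject_id] = current_state
        # The first observation establishes a baseline and is not a change.
        return previous is not None and previous != current_state

detector = ChangeDetectionUnit()
states = [{"logged_in": False}, {"logged_in": True}, {"logged_in": True}]
for state in states:
    if detector.detect("subject_a", state):
        # Here the expression change unit 132 would alter the subject's
        # depiction in the captured image.
        print("state changed -> change the subject's expression")
```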
- the expression changing unit 132 may change the user's expression by emphasizing the user shown in the captured image when the specifying unit 114 specifies the user. There is no particular limitation on how to emphasize the user.
- the display control unit 160 controls the mobile terminal 20 so that a display image obtained by changing the expression by the expression changing unit 132 is displayed on the mobile terminal 20.
- FIG. 16 is a diagram illustrating an example of an expression change function by the information processing apparatus 10. As illustrated in FIG. 16, the expression changing unit 132 may emphasize the user U by reducing the definition of a subject other than the user U in the captured image.
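A sketch of this definition-reducing emphasis, using pure-numpy block averaging as a stand-in for whatever smoothing an implementation would actually apply; the boolean user mask is assumed to come from the subject-region detection described above.

```python
import numpy as np

def reduce_definition(image, user_mask, block=8):
    """Lower the definition of everything except the user: pixels outside
    the mask are replaced by coarse block averages, leaving the user
    region untouched and therefore visually emphasized."""
    h, w = image.shape[:2]
    coarse = image.copy()
    for y in range(0, h, block):
        for x in range(0, w, block):
            tile = image[y:y + block, x:x + block]
            coarse[y:y + block, x:x + block] = tile.mean(axis=(0, 1))
    out = image.copy()
    out[~user_mask] = coarse[~user_mask]
    return out

img = np.random.randint(0, 255, (64, 64, 3), dtype=np.uint8)
mask = np.zeros((64, 64), dtype=bool)
mask[16:48, 16:48] = True   # assumed user region (e.g. from recognition)
emphasized = reduce_definition(img, mask)
print(emphasized.shape)     # (64, 64, 3)
```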
- FIG. 17 is a diagram illustrating an example of an expression change function by the information processing apparatus 10.
- the expression changing unit 132 may emphasize the user U by enhancing the contour L1 of the user U in the captured image.
- the user U may be emphasized by thickening the contour L1 of the user U in the captured image.
- FIG. 18 is a diagram illustrating an example of an expression change function by the information processing apparatus 10.
- the expression changing unit 132 may emphasize the user U by changing the pixel value of a subject other than the user U in the captured image.
- the user U may be emphasized by reducing the luminance of a subject other than the user U in the captured image.
- FIG. 19 is a diagram illustrating an example of an expression change function by the information processing apparatus 10.
- the expression changing unit 132 may emphasize the user U by zooming in on the user U in the captured image.
- the expression changing unit 132 may perform zoom-out display for the user U after performing zoom-in display.
- FIG. 20 is a diagram illustrating an example of an expression change function by the information processing apparatus 10.
- the expression changing unit 132 may emphasize the user by displaying the object at the position of the user in the captured image or a position around the user. The range around the user may be defined in advance. In the example shown in FIG. 20, the expression changing unit 132 emphasizes the user U by adding objects Obj8 and Obj9 to positions around the user in the captured image.
- the emphasis degree of the user U may be constant or may be changed according to the situation.
- the expression changing unit 132 may change the degree of enhancement of the user according to the size of the user in the captured image.
- FIG. 21 is a diagram illustrating an example of an expression change function by the information processing apparatus 10.
- the expression changing unit 132 may increase the degree of enhancement of the user as the size of the user in the captured image is smaller.
- the size of the user U in the captured image is smaller as the position of the user U is farther from the imaging device 30.
- the expression changing unit 132 increases the degree of emphasis of the small user U by making the contour L0 of the small user U thicker than the contour L1 of the large user U.
- In this way, the smaller the user's size in the captured image, the stronger the user's degree of emphasis, so the user can easily be found in the captured image.
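One way to realize this size-dependent degree of emphasis is to map the user's apparent size inversely to the outline thickness; the linear mapping below is an illustrative choice, not taken from the patent.

```python
def outline_thickness(bbox_height, frame_height, min_px=2, max_px=12):
    """Strengthen the emphasis as the user appears smaller: the outline
    gets thicker the smaller the user's bounding box is relative to the
    frame height."""
    ratio = min(max(bbox_height / frame_height, 1e-6), 1.0)
    return int(round(min_px + (max_px - min_px) * (1.0 - ratio)))

print(outline_thickness(bbox_height=60,  frame_height=720))   # distant user -> 11 px
print(outline_thickness(bbox_height=600, frame_height=720))   # nearby user -> 4 px
```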
- FIG. 22 is a diagram illustrating an example of the expression change function of the information processing apparatus 10. As shown in FIG. 22, part of the user U is hidden behind another subject in the captured image, so the expression changing unit 132 changes the method of emphasis, adding the object Obj10 at a position around the user U in the captured image.
- the range around the user U may be defined in advance.
- FIG. 23 is a flowchart showing the flow of the expression changing operation by the information processing apparatus 10.
- When no captured image is acquired by the image acquisition unit 111, the operation ends; when one is acquired, the recognition unit 121 recognizes the subject from the captured image (step S42).
- the change detection unit 131 determines whether or not a change in the state of the subject in the network service has been detected (step S43).
- If the change detection unit 131 does not detect a change in the state of the subject in the network service ("No" in step S43), the operation ends. If it does detect a change ("Yes" in step S43), the expression changing unit 132 changes the expression of the subject (step S44).
- the display control unit 160 controls the display of the display image obtained by changing the expression by the expression changing unit 132 (step S45), and the operation ends.
- a change in state in the network service of the subject recognized from the captured image is detected. Further, when a change in state is detected, the representation of the subject shown in the captured image is changed. According to such control, it is expected that the change of the state of the subject in the captured image in the network service can be easily grasped.
- As described above, according to the embodiment of the present disclosure, there is provided an information processing apparatus 10 including: the image acquisition unit 111 that acquires a captured image; the parameter acquisition unit 112 that acquires parameters related to the user of the mobile terminal 20; the identifying unit 114 that identifies the user from the captured image based on the parameters; and the display control unit 160 that, when the user is not identified by the identifying unit 114, performs control so that information indicating the user action necessary for identification is displayed on the mobile terminal 20.
- There is also provided an information processing apparatus 10 including: the image acquisition unit 111 that acquires a captured image; the composition determination unit 125 that determines a composition based on the recognition result of the user from the captured image and information indicating the user's destination; and the image determination unit 126 that determines a display image based on the composition determined by the composition determination unit 125.
- In this configuration, the composition is determined based on the recognition result of the user from the captured image and the information indicating the user's destination, and the display image is determined based on that composition. Since the composition determination unit 125 uses information indicating both the user's position and the destination, a display image that is more convenient for the user is expected to be determined.
- There is further provided an information processing apparatus 10 including: the image acquisition unit 111 that acquires a captured image; the change detection unit 131 that detects a change in the state, in the network service, of a subject recognized from the captured image; and the expression changing unit 132 that changes the expression of the subject shown in the captured image when the change in state is detected by the change detection unit 131.
- With this configuration, when a change in state is detected, the expression of the subject shown in the captured image is changed, so a user viewing the captured image can easily grasp from it a change in the state of a subject in the network service.
- In the above, the example in which the information processing apparatus 10 includes all of the processing control unit 100 and the display control unit 160 has been mainly described, but some or all of these blocks may be provided in an apparatus other than the information processing apparatus 10. For example, a server may have part of the processing control unit 100 and the display control unit 160, and the mobile terminal 20 may have the rest. In this way, the technology of the present disclosure can also be applied to cloud computing.
- each step in the operation of the information processing apparatus 10 of the present specification does not necessarily have to be processed in time series in the order described as a flowchart.
- each step in the operation of the information processing apparatus 10 may be processed in an order different from the order described as the flowchart, or may be processed in parallel.
- (1) An information processing apparatus including: an image acquisition unit that acquires a captured image; a change detection unit that detects a change in the state, in a network service, of a subject recognized from the captured image; and an expression changing unit that changes the expression of the subject shown in the captured image when the change of the state is detected by the change detection unit.
- (2) The information processing apparatus according to (1), further including an identifying unit that identifies a user of a mobile terminal from the captured image, wherein the expression changing unit changes the expression of the user by emphasizing the user shown in the captured image.
- (3) The information processing apparatus according to (2), further including a display control unit that performs control so that a display image obtained by changing the expression by the expression changing unit is displayed on the mobile terminal.
- (4) The information processing apparatus according to any one of (1) to (3), wherein the expression changing unit changes the expression of the subject shown in the captured image when the change detection unit detects that the subject's mobile terminal has logged in to the network service.
- (5) The information processing apparatus according to any one of (1) to (3), wherein the expression changing unit changes the expression of the subject shown in the captured image when the change detection unit detects that the operation of the subject's mobile terminal in the network service satisfies a predetermined condition.
- (6) The information processing apparatus according to (2) or (3), wherein the expression changing unit changes the degree of emphasis of the user according to the size of the user in the captured image.
- (7) The information processing apparatus according to (6), wherein the expression changing unit strengthens the emphasis of the user as the size of the user in the captured image becomes smaller.
- (8) The information processing apparatus according to (2) or (3), wherein the expression changing unit changes the method of emphasizing the user when part or all of the user is hidden by another object in the captured image.
- (9) The information processing apparatus according to (2) or (3), wherein the expression changing unit emphasizes the user by reducing the sharpness of subjects other than the user in the captured image.
- (10) The information processing apparatus according to (2) or (3), wherein the expression changing unit emphasizes the user by enhancing the contour of the user in the captured image.
- (11) The information processing apparatus according to (2) or (3), wherein the expression changing unit emphasizes the user by changing the pixel values of subjects other than the user in the captured image.
- (12) The information processing apparatus according to (2) or (3), wherein the expression changing unit emphasizes the user by zoom-in display of the user in the captured image.
- (13) The information processing apparatus according to (2) or (3), wherein the expression changing unit emphasizes the user by adding an object at the position of the user or at a position around the user in the captured image.
- (14) An information processing method including: acquiring a captured image; detecting a change in the state, in a network service, of a subject recognized from the captured image; and changing the expression of the subject shown in the captured image when the change of the state is detected.
- (15) A program for causing a computer to function as an information processing apparatus including: an image acquisition unit that acquires a captured image; a change detection unit that detects a change in the state, in a network service, of a subject recognized from the captured image; and an expression changing unit that changes the expression of the subject shown in the captured image when the change of the state is detected by the change detection unit.
Description
1. User identification function
2. Composition determination function
3. Expression change function
4. Conclusion
First, the user identification function of the information processing apparatus 10 will be described. FIG. 4 is a diagram showing an overview of the user identification function of the information processing apparatus 10. First, the image acquisition unit 111 acquires a captured image captured by the imaging device 30. In the example shown in FIG. 4, the captured image Img obtained by imaging the real space shown in FIG. 1 is acquired by the image acquisition unit 111 as an example of the captured image. The captured image acquired by the image acquisition unit 111 is controlled by the display control unit 160 so as to be displayed on the mobile terminal 20.
Next, the composition determination function of the information processing apparatus 10 will be described. FIG. 10 is a diagram showing an overview of the composition determination function of the information processing apparatus 10. As described above, when a captured image is captured by the imaging device 30, the captured image is displayed on the user's mobile terminal 20, and the user can view it. Here, assuming a scene in which the user heads for a destination while viewing the captured image, determination of a composition preferable for the captured image viewed by such a user is described.
Next, the expression change function of the information processing apparatus 10 will be described. As described above, it is assumed that some network service is provided by the information processing apparatus 10 and that, when the user U logs in to the network service via the mobile terminal 20, the network service is provided to the mobile terminal 20.
As described above, according to the user identification function of the information processing apparatus 10 of the present embodiment, there is provided an information processing apparatus 10 including: the image acquisition unit 111 that acquires a captured image; the parameter acquisition unit 112 that acquires parameters related to the user of the mobile terminal 20; the identifying unit 114 that identifies the user from the captured image based on the parameters; and the display control unit 160 that, when the user is not identified by the identifying unit 114, performs control so that information indicating the user action necessary for identification is displayed on the mobile terminal 20.
10 Information processing apparatus
20 Mobile terminal
30 Imaging device
40 Network
50 Storage unit
100 Processing control unit
111 Image acquisition unit
112 Parameter acquisition unit
113 Candidate extraction unit
114 Identifying unit
115 Authentication unit
121 Recognition unit
122 Condition determination unit
123 Information addition unit
124 Mode setting unit
125 Composition determination unit
126 Image determination unit
131 Change detection unit
132 Expression change unit
160 Display control unit
Claims (15)
- 1. An information processing apparatus comprising: an image acquisition unit that acquires a captured image; a change detection unit that detects a change in the state, in a network service, of a subject recognized from the captured image; and an expression changing unit that changes the expression of the subject shown in the captured image when the change of the state is detected by the change detection unit.
- 2. The information processing apparatus according to claim 1, further comprising an identifying unit that identifies a user of a mobile terminal from the captured image, wherein the expression changing unit changes the expression of the user by emphasizing the user shown in the captured image.
- 3. The information processing apparatus according to claim 2, further comprising a display control unit that performs control so that a display image obtained by changing the expression by the expression changing unit is displayed on the mobile terminal.
- 4. The information processing apparatus according to claim 1, wherein the expression changing unit changes the expression of the subject shown in the captured image when the change detection unit detects that the subject's mobile terminal has logged in to the network service.
- 5. The information processing apparatus according to claim 1, wherein the expression changing unit changes the expression of the subject shown in the captured image when the change detection unit detects that the operation of the subject's mobile terminal in the network service satisfies a predetermined condition.
- 6. The information processing apparatus according to claim 2, wherein the expression changing unit changes the degree of emphasis of the user according to the size of the user in the captured image.
- 7. The information processing apparatus according to claim 6, wherein the expression changing unit strengthens the emphasis of the user as the size of the user in the captured image becomes smaller.
- 8. The information processing apparatus according to claim 2, wherein the expression changing unit changes the method of emphasizing the user when part or all of the user is hidden by another object in the captured image.
- 9. The information processing apparatus according to claim 2, wherein the expression changing unit emphasizes the user by reducing the sharpness of subjects other than the user in the captured image.
- 10. The information processing apparatus according to claim 2, wherein the expression changing unit emphasizes the user by enhancing the contour of the user in the captured image.
- 11. The information processing apparatus according to claim 2, wherein the expression changing unit emphasizes the user by changing the pixel values of subjects other than the user in the captured image.
- 12. The information processing apparatus according to claim 2, wherein the expression changing unit emphasizes the user by zoom-in display of the user in the captured image.
- 13. The information processing apparatus according to claim 2, wherein the expression changing unit emphasizes the user by adding an object at the position of the user or at a position around the user in the captured image.
- 14. An information processing method comprising: acquiring a captured image; detecting a change in the state, in a network service, of a subject recognized from the captured image; and changing the expression of the subject shown in the captured image when the change of the state is detected.
- 15. A program for causing a computer to function as an information processing apparatus comprising: an image acquisition unit that acquires a captured image; a change detection unit that detects a change in the state, in a network service, of a subject recognized from the captured image; and an expression changing unit that changes the expression of the subject shown in the captured image when the change of the state is detected by the change detection unit.
Priority Applications (6)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2014507500A JP6064995B2 (ja) | 2012-03-27 | 2013-02-12 | Information processing device, information processing method, and program |
BR112014023249A BR112014023249A8 (pt) | 2012-03-27 | 2013-02-12 | Information processing device and method, and program |
IN7843DEN2014 IN2014DN07843A (ja) | 2012-03-27 | 2013-02-12 | |
US14/386,434 US9836644B2 (en) | 2012-03-27 | 2013-02-12 | Changing a depiction in a captured image based on a state of a subject present in the captured image |
CN201380015462.4A CN104205799A (zh) | 2012-03-27 | 2013-02-12 | 信息处理装置、信息处理方法及程序 |
EP13770270.0A EP2833627A4 (en) | 2012-03-27 | 2013-02-12 | INFORMATION PROCESSING DEVICE, INFORMATION PROCESSING METHOD, AND PROGRAM |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2012-071333 | 2012-03-27 | ||
JP2012071333 | 2012-03-27 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2013145900A1 true WO2013145900A1 (ja) | 2013-10-03 |
Family
ID=49259183
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2013/053204 WO2013145900A1 (ja) | 2013-02-12 | Information processing device, information processing method, and program |
Country Status (7)
Country | Link |
---|---|
US (1) | US9836644B2 (ja) |
EP (1) | EP2833627A4 (ja) |
JP (1) | JP6064995B2 (ja) |
CN (1) | CN104205799A (ja) |
BR (1) | BR112014023249A8 (ja) |
IN (1) | IN2014DN07843A (ja) |
WO (1) | WO2013145900A1 (ja) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP2950570A1 (en) * | 2014-05-27 | 2015-12-02 | Panasonic Intellectual Property Management Co., Ltd. | Remote imaging method and remote imaging control device |
JP2018097883A (ja) * | 2018-01-09 | 2018-06-21 | Casio Computer Co., Ltd. | Information display device and guidance display method |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2006174195A (ja) * | 2004-12-17 | 2006-06-29 | Hitachi Ltd | Video service system |
JP2009003659A (ja) | 2007-06-21 | 2009-01-08 | Sony Corp | Authentication device, entrance management device, entrance/exit management device, entrance management system, entrance/exit management system, and processing methods and programs therefor |
Family Cites Families (19)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7688306B2 (en) * | 2000-10-02 | 2010-03-30 | Apple Inc. | Methods and apparatuses for operating a portable device based on an accelerometer |
JP2002175538A (ja) * | 2000-12-08 | 2002-06-21 | Mitsubishi Electric Corp | Portrait generation device, portrait generation method, recording medium storing a portrait generation program, communication terminal, and communication method using the communication terminal |
US7487214B2 (en) * | 2004-11-10 | 2009-02-03 | Microsoft Corporation | Integrated electronic mail and instant messaging application |
US7633076B2 (en) * | 2005-09-30 | 2009-12-15 | Apple Inc. | Automated response to and sensing of user activity in portable devices |
JP4915420B2 (ja) * | 2006-12-11 | 2012-04-11 | Nikon Corporation | Electronic camera |
DE102007033391A1 (de) * | 2007-07-18 | 2009-01-22 | Robert Bosch Gmbh | Information device, method for informing and/or navigating a person, and computer program |
US8086071B2 (en) * | 2007-10-30 | 2011-12-27 | Navteq North America, Llc | System and method for revealing occluded objects in an image dataset |
KR101446772B1 (ko) * | 2008-02-04 | 2014-10-01 | 삼성전자주식회사 | 디지털 영상 처리 장치 및 그 제어 방법 |
US7952596B2 (en) * | 2008-02-11 | 2011-05-31 | Sony Ericsson Mobile Communications Ab | Electronic devices that pan/zoom displayed sub-area within video frames in response to movement therein |
KR101016556B1 (ko) * | 2010-05-06 | 2011-02-24 | 전성일 | Method, server, and computer-readable recording medium for accessing a person's information using augmented reality |
KR101687613B1 (ko) * | 2010-06-21 | 2016-12-20 | LG Electronics Inc. | Mobile terminal and group generation method thereof |
US8326001B2 (en) * | 2010-06-29 | 2012-12-04 | Apple Inc. | Low threshold face recognition |
KR101347518B1 (ko) * | 2010-08-12 | 2014-01-07 | Pantech Co., Ltd. | Augmented reality user device and method allowing filter selection, and augmented reality server |
JP2012058838A (ja) * | 2010-09-06 | 2012-03-22 | Sony Corp | Image processing device, program, and image processing method |
US8315674B2 (en) * | 2010-10-08 | 2012-11-20 | Research In Motion Limited | System and method for displaying object location in augmented reality |
US8994499B2 (en) * | 2011-03-16 | 2015-03-31 | Apple Inc. | Locking and unlocking a mobile device using facial recognition |
US9111130B2 (en) * | 2011-07-08 | 2015-08-18 | Microsoft Technology Licensing, Llc | Facilitating face detection with user input |
US8560004B1 (en) * | 2012-08-31 | 2013-10-15 | Google Inc. | Sensor-based activation of an input device |
US9408076B2 (en) * | 2014-05-14 | 2016-08-02 | The Regents Of The University Of California | Sensor-assisted biometric authentication for smartphones |
- 2013
- 2013-02-12 US US14/386,434 patent/US9836644B2/en active Active
- 2013-02-12 IN IN7843DEN2014 patent/IN2014DN07843A/en unknown
- 2013-02-12 BR BR112014023249A patent/BR112014023249A8/pt not_active IP Right Cessation
- 2013-02-12 EP EP13770270.0A patent/EP2833627A4/en not_active Withdrawn
- 2013-02-12 JP JP2014507500A patent/JP6064995B2/ja active Active
- 2013-02-12 WO PCT/JP2013/053204 patent/WO2013145900A1/ja active Application Filing
- 2013-02-12 CN CN201380015462.4A patent/CN104205799A/zh active Pending
Patent Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2006174195A (ja) * | 2004-12-17 | 2006-06-29 | Hitachi Ltd | Video service system |
JP2009003659A (ja) | 2007-06-21 | 2009-01-08 | Sony Corp | Authentication device, entrance management device, entrance/exit management device, entrance management system, entrance/exit management system, and processing methods and programs therefor |
Non-Patent Citations (2)
Title |
---|
See also references of EP2833627A4 |
YUSUKE YAMADA: "2009 Nen wa 'Mobile AR Gannen' datta" [2009 was the first year of mobile AR], 30 December 2009 (2009-12-30), XP008174432, Retrieved from the Internet <URL:http://www.itmedia.co.jp/mobile/articles/0912/30/news001.html> [retrieved on 20130423] *
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP2950570A1 (en) * | 2014-05-27 | 2015-12-02 | Panasonic Intellectual Property Management Co., Ltd. | Remote imaging method and remote imaging control device |
US9565351B2 (en) | 2014-05-27 | 2017-02-07 | Panasonic Intellectual Property Management Co., Ltd. | Remote imaging method and remote imaging control device |
JP2018097883A (ja) * | 2018-01-09 | 2018-06-21 | Casio Computer Co., Ltd. | Information display device and guidance display method |
Also Published As
Publication number | Publication date |
---|---|
JP6064995B2 (ja) | 2017-01-25 |
JPWO2013145900A1 (ja) | 2015-12-10 |
US20150049909A1 (en) | 2015-02-19 |
IN2014DN07843A (ja) | 2015-04-24 |
US9836644B2 (en) | 2017-12-05 |
EP2833627A1 (en) | 2015-02-04 |
CN104205799A (zh) | 2014-12-10 |
BR112014023249A2 (ja) | 2017-06-20 |
EP2833627A4 (en) | 2015-11-11 |
BR112014023249A8 (pt) | 2017-07-25 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
KR102465532B1 (ko) | Object recognition method and apparatus | |
KR20220144890A (ko) | Method and system for controlling a device using hand gestures in a multi-user environment | |
US10007841B2 (en) | Human face recognition method, apparatus and terminal | |
US9898090B2 (en) | Apparatus, method and recording medium for controlling user interface using input image | |
US11222231B2 (en) | Target matching method and apparatus, electronic device, and storage medium | |
CN109743504B (zh) | Auxiliary photographing method, mobile terminal, and storage medium | |
CN105956518A (zh) | Face recognition method, apparatus, and system | |
JP5662670B2 (ja) | Image processing device, image processing method, and program | |
KR102454515B1 (ko) | Network optimization method and apparatus, image processing method and apparatus, and storage medium | |
CN107766403B (zh) | Photo album processing method, mobile terminal, and computer-readable storage medium | |
WO2019011098A1 (zh) | Unlocking control method and related product | |
JP6044633B2 (ja) | Information processing device, information processing method, and program | |
US10713480B2 (en) | Information processing device and information processing method | |
KR20190128536A (ko) | Electronic device and control method therefor | |
CN110519503A (zh) | Scanned image acquisition method and mobile terminal | |
CN107911563B (zh) | Image processing method and mobile terminal | |
JP6064995B2 (ja) | Information processing device, information processing method, and program | |
CN115761867A (zh) | Identity detection method, apparatus, medium, and device based on face images | |
CN111145083B (zh) | Image processing method, electronic device, and computer-readable storage medium | |
KR20080034248A (ko) | Method for searching photos by face recognition on a mobile terminal | |
JP2016021716A (ja) | Tracking device and control method therefor | |
KR20110099845A (ko) | Apparatus and method for omnidirectional speaker recognition in a video call system | |
JP2001331804A (ja) | Image region detection device and method | |
CN118212598A (zh) | Method and apparatus for generating vehicle model recognition results for vehicles | |
CN115631496A (zh) | Label prediction method, text prediction model training method, apparatus, and device | |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 13770270 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2014507500 Country of ref document: JP Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2013770270 Country of ref document: EP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 14386434 Country of ref document: US |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
REG | Reference to national code |
Ref country code: BR Ref legal event code: B01A Ref document number: 112014023249 Country of ref document: BR |
|
ENP | Entry into the national phase |
Ref document number: 112014023249 Country of ref document: BR Kind code of ref document: A2 Effective date: 20140919 |