WO2014207844A1 - Information processing method, information processing device, information processing program, and storage medium storing the information processing program - Google Patents


Info

Publication number
WO2014207844A1
WO2014207844A1 (PCT/JP2013/067548)
Authority
WO
WIPO (PCT)
Prior art keywords
event
region
information processing
edge
target portion
Prior art date
Application number
PCT/JP2013/067548
Other languages
English (en)
Japanese (ja)
Inventor
裕 陳野
赤鹿 秀樹
Original Assignee
楽天株式会社 (Rakuten, Inc.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 楽天株式会社 (Rakuten, Inc.)
Priority to JP2015523718A (JP6033431B2)
Priority to PCT/JP2013/067548 (WO2014207844A1)
Publication of WO2014207844A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q20/00 Payment architectures, schemes or protocols
    • G06Q20/30 Payment architectures, schemes or protocols characterised by the use of specific devices or networks
    • G06Q20/32 Payment architectures, schemes or protocols characterised by the use of specific devices or networks using wireless devices
    • G06Q20/327 Short range or proximity payments by means of M-devices
    • G06Q20/3278 RFID or NFC payments by means of M-devices

Definitions

  • The present invention relates to the technical field of methods for detecting a specific portion of an object from a moving image captured by an imaging means.
  • A method is known in which the position, distance, and moving direction of a person approaching a camera are detected, and the information presented on a display unit is changed based on them (Patent Document 1). In this method, the distance is estimated from, for example, the skin-colored portion of a face or the width of an eye in the image captured by the camera, and the position and size of an image displayed in an overlapping manner are changed based on the estimated value.
  • Meanwhile, the plane of a card can be imaged with the camera of a mobile terminal, and the relative positional relationship between them (for example, the distance) can change. However, the method of Patent Document 1 estimates the distance using feature amounts that are commonly present on the surface of the subject; when such feature amounts cannot be acquired from the surface of the subject, the method described in Patent Document 1 cannot be used.
  • The present invention therefore has an object to provide an information processing apparatus capable of detecting a change in the relative positional relationship between an information processing apparatus such as a portable terminal and an object such as a card without depending on information appearing on the surface of the object, as well as an information processing method, an information processing program, and a recording medium on which the information processing program is recorded.
  • The invention according to claim 1 is an information processing method that an information processing program causes an information processing apparatus to perform, comprising: a determination step of determining, in a reference image constituting a series of images continuously captured by an imaging means, a target portion that can be specified with reference to an edge of an object appearing in the reference image; a detection step of detecting an event indicating that the determined target portion has changed in any of a plurality of comparison images following the reference image; and an output step of outputting a signal corresponding to the detected event or to another event related to the detected event.
  • According to this aspect, a change in the relative positional relationship between the information processing apparatus and the object in space can be specified with reference to an edge of the object, without depending on information appearing on the surface of the object, and a signal corresponding to the change can be output.
  • The invention according to claim 2 is the information processing method according to claim 1, wherein the detection step detects the event indicating that the target portion has been displaced from a first region including at least a part of the target portion into a second region located in the direction perpendicular to a straight line passing through any two points on the target portion. According to this aspect, a change in the relative positional relationship between the information processing apparatus and the object in space can be detected from the displacement of the target portion in one direction.
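To make the geometry of claim 2 concrete, here is a minimal sketch (illustrative only, not part of the claimed disclosure; the function names and the threshold value are assumptions): the signed perpendicular offset of a tracked edge point from the straight line through any two points on the target portion gives both how far the edge has moved in the perpendicular direction and which of the two flanking regions it has moved into.

```python
from typing import Optional, Tuple

Point = Tuple[float, float]

def signed_offset(p1: Point, p2: Point, q: Point) -> float:
    """Signed perpendicular offset of point q from the line through p1 and p2.

    The magnitude is the displacement along the line's normal; the sign tells
    on which side of the line (i.e., in which flanking region) q lies."""
    (x1, y1), (x2, y2) = p1, p2
    # 2D cross product of (p2 - p1) with (q - p1), normalised by |p2 - p1|.
    cross = (x2 - x1) * (q[1] - y1) - (y2 - y1) * (q[0] - x1)
    length = ((x2 - x1) ** 2 + (y2 - y1) ** 2) ** 0.5
    return cross / length

def displacement_event(p1: Point, p2: Point, q: Point,
                       threshold: float) -> Optional[str]:
    """Report which flanking region ('B' or 'C') the edge point q has moved
    into, or None while it stays within `threshold` of the original line."""
    d = signed_offset(p1, p2, q)
    if d > threshold:
        return "B"
    if d < -threshold:
        return "C"
    return None
```

For a vertical target portion through (0, 0) and (0, 10), a point displaced to (3, 5) yields an offset of -3 (one flanking region), while (-3, 5) yields +3 (the opposite region).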
  • The invention according to claim 3 is the information processing method according to claim 2, wherein at least a part of a linear edge of the object fits within the first region and the boundary between the first region and the second region is parallel to the linear edge, and the determination step determines the linear edge as the target portion in a reference image in which at least a part of the linear edge falls within the first region. According to this aspect, the user can adjust the relative positional relationship between the information processing apparatus and the object so that the displacement of the target portion in one direction can be reliably detected.
  • The invention according to claim 4 is the information processing method according to claim 2, further comprising an arrangement step of arranging the first region and the second region so that at least a part of a linear edge of the object fits within the first region and the boundary between the first region and the second region is parallel to the linear edge, wherein the determination step determines the linear edge as the target portion in a reference image in which at least a part of the linear edge falls within the first region. According to this aspect, the first region and the second region can be arranged so that the displacement of the target portion in one direction can be reliably detected without the user adjusting the relative positional relationship between the information processing apparatus and the object.
  • The invention according to claim 5 is the information processing method according to any one of claims 2 to 4, wherein the second region includes two regions located in opposite directions with respect to the first region, and the output step outputs a signal that differs depending on which of the two regions included in the second region the target portion is displaced into. According to this aspect, different signals can be output depending on the direction in which the target portion is displaced from the first region.
  • The invention according to claim 6 is the information processing method according to claim 1, wherein the determination step determines a linear first edge of the object as a first target portion and a linear second edge perpendicular to the first edge as a second target portion, and the detection step detects a first event indicating that the first target portion has been displaced from a first region including at least a part of the determined first target portion into a second region located in the direction perpendicular to the first target portion, and a second event indicating that the second target portion has been displaced from a third region including at least a part of the determined second target portion into a fourth region located in the direction perpendicular to the second target portion. According to this aspect, the change in the relative positional relationship between the information processing apparatus and the object in space can be detected from the displacement of the target portions in two orthogonal directions.
  • The invention according to claim 7 is the information processing method according to claim 6, wherein at least a part of the first edge fits within the first region, the boundary between the first region and the second region is parallel to the first edge, at least a part of the second edge fits within the third region, and the boundary between the third region and the fourth region is parallel to the second edge. According to this aspect, the user can adjust the relative positional relationship between the information processing apparatus and the object so that the displacement of the target portions in two orthogonal directions can be reliably detected.
  • The invention according to claim 8 is the information processing method according to claim 6, further comprising an arrangement step of arranging the first to fourth regions so that at least a part of the first edge fits within the first region, the boundary between the first region and the second region is parallel to the first edge, at least a part of the second edge fits within the third region, and the boundary between the third region and the fourth region is parallel to the second edge, wherein the determination step determines the first edge as the first target portion and the second edge as the second target portion in a reference image in which at least a part of the first edge falls within the first region and at least a part of the second edge falls within the third region. According to this aspect, the first to fourth regions can be arranged so that the displacement of the target portions in two orthogonal directions can be reliably detected without the user adjusting the relative positional relationship between the information processing apparatus and the object.
  • The invention according to claim 9 is the information processing method according to any one of claims 6 to 8, wherein, when the detection time difference between the first event and the second event is within a predetermined threshold, the output step outputs the signal corresponding to the combined event of the first event and the second event.
  • In the invention according to claim 10, the output step selectively outputs, according to the detection time difference between the first event and the second event, any one of a first signal corresponding only to the first event, a second signal corresponding only to the second event, and a third signal corresponding to the combined event of the first event and the second event. According to this aspect, the three signals can be used properly according to the detection time difference between the first event and the second event.
  • In the invention according to claim 11, the output step outputs the third signal that differs according to the detection order of the first event and the second event. According to this aspect, two signals can be used properly according to the detection order of the first event and the second event.
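The signal selection described in claims 9 to 11 can be sketched as follows (illustrative only; the signal labels and the 0.5-second window are assumptions, not values from the disclosure): events whose detection times fall within the window are merged into a combined signal whose variant depends on the detection order, while events far apart are reported individually.

```python
from typing import Optional

def select_signal(first_t: Optional[float], second_t: Optional[float],
                  window: float = 0.5) -> Optional[str]:
    """Choose the output signal from the detection times (in seconds) of the
    first and second displacement events.

    Events closer together than `window` form one combined gesture, with the
    detection order distinguishing two variants of the combined signal."""
    if first_t is None and second_t is None:
        return None
    if second_t is None:
        return "first-only"
    if first_t is None:
        return "second-only"
    if abs(first_t - second_t) <= window:
        return "combined-1-then-2" if first_t <= second_t else "combined-2-then-1"
    # Detected too far apart: treat the earlier one as a standalone event.
    return "first-only" if first_t < second_t else "second-only"
```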
  • The invention according to claim 12 is the information processing method according to any one of claims 2 to 11, wherein the distance in the perpendicular direction between the region including at least a part of the target portion and the region located in the perpendicular direction of the target portion is set to a threshold value or more. According to this aspect, it is possible to prevent displacement caused by the hand shake of the user holding at least one of the information processing apparatus and the object from being detected, and to improve the detection accuracy.
  • The invention according to claim 13 is the information processing method according to any one of claims 2 to 12, wherein the imaging means continuously captures images at a predetermined imaging interval, and the detection step detects the event indicating that the target portion has been displaced when the acceleration, calculated from the predetermined imaging interval and the displacement amount of the target portion from its position in the reference image, changes in a predetermined pattern. According to this aspect, it is possible to prevent displacement caused by the hand shake of the user holding at least one of the information processing apparatus and the object from being detected, and to improve the detection accuracy.
  • The invention according to claim 14 is the information processing method according to any one of claims 1 to 13, wherein the object is a card comprising an IC module having a storage unit that stores balance data indicating a balance of electronic value, and a processing unit that executes balance change processing for changing the stored balance data using balance change information input from a device within the distance range capable of short-range wireless communication and responds with the processing result, the method further comprising an input step of acquiring, from an external balance changing device, balance change information corresponding to the signal output in the output step and inputting it to the IC module. According to this aspect, the balance data stored in the storage unit of the IC module provided in the card can be changed by changing the relative positional relationship between the card and the information processing apparatus.
  • The invention according to claim 15 is the information processing method according to any one of claims 1 to 13, wherein the object is a card comprising an IC module having a storage unit that stores data, and a processing unit that executes predetermined processing on the data in response to a request from a device within the distance range capable of short-range wireless communication and responds with the processing result, the method further comprising a setting step of setting a parameter used in the predetermined processing according to the signal output in the output step.
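As one way to picture the setting step of claim 15, the stream of direction signals could be folded into a parameter value, for example a charge amount (a sketch under assumptions: the signal labels 'B'/'C', the step of 1000, and the cap of 10000 are all hypothetical):

```python
from typing import Iterable

def set_charge_amount(signals: Iterable[str], step: int = 1000,
                      maximum: int = 10000) -> int:
    """Fold a stream of direction signals into a parameter (a charge amount).

    A 'B' signal (displacement in one direction) raises the amount by `step`,
    a 'C' signal (the opposite direction) lowers it; the result is clamped
    to the range [0, maximum]."""
    amount = 0
    for s in signals:
        if s == "B":
            amount = min(amount + step, maximum)
        elif s == "C":
            amount = max(amount - step, 0)
    return amount
```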
  • The invention according to claim 16 is an information processing apparatus comprising: a determination means that determines, in a reference image constituting a series of images continuously captured by an imaging means, a target portion that can be specified with reference to an edge of an object appearing in the reference image; a detection means that detects an event indicating that the determined target portion has changed in any of a plurality of comparison images following the reference image; and an output means that outputs a signal corresponding to the detected event or to another event related to the detected event.
  • The invention of the information processing program according to claim 17 causes a computer to execute: a determination step of determining, in a reference image constituting a series of images continuously captured by an imaging means, a target portion that can be specified with reference to an edge of an object appearing in the reference image; a detection step of detecting an event indicating that the determined target portion has changed in any of a plurality of comparison images following the reference image; and an output step of outputting a signal corresponding to the detected event or to another event related to the detected event. There is also provided a recording medium on which an information processing program causing a computer to execute these steps is recorded.
  • According to the present invention, a change in the relative positional relationship between the information processing apparatus and the object in space can be specified with reference to an edge of the object, without depending on information appearing on the surface of the object, and a signal corresponding to the change can be output.
  • FIG. 1 is a diagram showing the board card 1 and the portable terminal 2 used in the embodiment of the present invention.
  • FIG. 2(A) is a diagram showing a schematic configuration example of the portable terminal 2, and FIG. 2(B) is a diagram showing an example of the functional blocks in the control unit 28.
  • FIGS. 3 to 9 are diagrams each showing an example of a displacement event of the target portion.
  • A further figure, part (A), shows the target portion when the board card 1 is imaged in a state where the sideways edge of the board card 1 and the vertical edge of the portable terminal 2 are parallel; part (B) shows an example in which the distance in the perpendicular direction between the region including the target portion and the region located in the perpendicular direction of the target portion is set to a threshold value or more.
  • Another figure shows an example of the schematic configuration of the electronic money system S to which the present invention is applied.
  • A sequence diagram shows an example of the charge operation of electronic value, and accompanying figures show how the charge amount is set by changing the relative positional relationship between the electronic money card 1 and the portable terminal 2.
  • A further sequence diagram shows an example of the information acquisition operation by the portable terminal, and a figure shows how the information type is set by changing the relative positional relationship between the electronic money card 1 and the portable terminal 2.
  • FIG. 1 is a diagram showing a board card 1 and a portable terminal 2 used in the embodiment of the present invention.
  • FIG. 2A is a diagram illustrating a schematic configuration example of the mobile terminal 2.
  • the board card 1 is an example of an object in the present invention
  • the mobile terminal 2 is an example of an information processing apparatus in the present invention.
  • Although the present invention is preferably applied to the board card 1, it can also be applied to, for example, a key holder, a wallet, a commuter-pass case, a wristwatch, and the like.
  • the present invention is preferably applied to the mobile terminal 2, but can also be applied to a stationary personal computer or the like.
  • the board card 1 is a flat card whose dimensions are determined by, for example, international standards.
  • As the board card 1, for example, an electronic money card, a credit card, a membership card, or the like can be used.
  • the present invention is applicable to both the board card 1 on which a non-contact type IC chip is mounted and the board card 1 on which a non-contact type IC chip is not mounted.
  • the non-contact type IC chip mounted on the board card 1 is an IC module that adopts a near field communication (NFC) technology using, for example, a 13.56 MHz band frequency.
  • NFC near field communication
  • the non-contact type IC chip mounted on the board card 1 responds to a request from an antenna, a storage unit (for example, a nonvolatile memory) that stores data, and a portable terminal 2 that is within a distance range capable of short-range wireless communication. Accordingly, a processing unit (for example, a CPU) that performs a predetermined process on the data stored in the storage unit and responds with a processing result is provided.
  • the shape of the board card 1 shown in FIG. 1 is a rectangle having rounded corners, but may be a square, a rhombus, a circle, or the like.
  • the portable terminal 2 can be carried by a user, and for example, a smartphone, a mobile phone, a game machine, a tablet computer, or the like is applicable.
  • the mobile terminal 2 includes a camera 21 (an example of an imaging unit), a display unit 22, a speaker 23, an operation unit 24, a storage unit 25, a short-range wireless communication unit 26, a mobile wireless communication unit 27, And a control unit 28 and the like.
  • the camera 21, display unit 22, speaker 23, operation unit 24, storage unit 25, short-range wireless communication unit 26, and mobile wireless communication unit 27 are controlled via an input / output interface unit (not shown) and the bus 29. Connected to the unit 28.
  • the camera 21 is installed on the back side of the mobile terminal 2, for example, and images (photographs) a subject according to a user instruction.
  • an imaging mode by the camera 21 a moving image imaging mode and a still image imaging mode can be selected.
  • the image captured by the camera 21 is stored in an image storage area allocated to the storage unit 25 and displayed on the display D in the display unit 22.
  • the display unit 22 includes a touch panel display D, for example.
  • The display D is installed, for example, on the front side of the mobile terminal 2, and displays, according to control signals from the control unit 28, a screen on which icons that the user can designate are arranged, the image of the board card 1 captured by the camera 21, and the like.
  • the speaker 23 outputs the audio signal from the control unit 28 to the outside as audio.
  • the operation unit 24 includes an input button for inputting an instruction from the user.
  • the operation unit 24 can input an instruction via an icon displayed on the touch panel display D.
  • the storage unit 25 is configured by, for example, a non-volatile memory such as a flash memory or an EEPROM.
  • the storage unit 25 stores an OS (Operating System), various application programs, and the like.
  • the application program includes the information processing program of the present invention.
  • the application program can be downloaded from a predetermined server connected to the Internet, for example.
  • the application program may be recorded on a recording medium such as a CD (Compact Disc) or a DVD (Digital Versatile Disc) and read from the recording medium into the storage unit 25 via a drive.
  • the near field communication unit 26 includes, for example, an IC module that employs a near field communication (NFC) technology using a 13.56 MHz band frequency, an antenna, and the like.
  • The short-range wireless communication unit 26 communicates with the non-contact type IC chip when the board card 1 containing the non-contact type IC chip comes close to within the distance range where short-range wireless communication is possible.
  • The mobile wireless communication unit 27 performs wireless communication with a base station in the mobile communication network. Thereby, the control unit 28 can access a predetermined server via the Internet, for example, and communicate with that server.
  • the control unit 28 includes, for example, a CPU (Central Processing Unit), a ROM (Read Only Memory), a RAM (Random Access Memory), and the like.
  • FIG. 2B is a diagram illustrating an example of functional blocks in the control unit 28.
  • By executing the processing described later according to the information processing program stored in the storage unit 25, the control unit 28 functions as a target portion determination unit 28a, a change event detection unit 28b, an information notification unit 28c, a region arrangement unit 28d, a signal output unit 28e, and a processing unit 28f.
  • The target portion determination unit 28a is an example of the determination means in the present invention, the change event detection unit 28b is an example of the detection means, and the signal output unit 28e is an example of the output means.
  • The target portion determination unit 28a acquires a series of images continuously captured at a predetermined imaging interval by the camera 21, and takes a reference image from among the images constituting the series. The series of images corresponds to a moving image captured in the moving image capturing mode, or to a series of still images captured continuously (continuous shooting) in the still image capturing mode.
  • The target portion determination unit 28a detects a straight edge of the board card 1 appearing in the reference image and determines the detected straight edge as the target portion. One or more target portions may be determined. For example, when there are two target portions, the target portion determination unit 28a determines a first linear edge of the board card 1 appearing in the reference image as the first target portion, and a second linear edge perpendicular to the first edge as the second target portion.
  • The change event detection unit 28b detects an event indicating that the target portion determined by the target portion determination unit 28a has changed in any of a plurality of comparison images following the reference image. A comparison image needs to be close in time to the reference image (for example, within about 1 to 2 seconds of its imaging time), but does not have to be the image immediately following the reference image. Specifically, the change event detection unit 28b detects an event indicating that the target portion has been displaced (that is, that its position has changed) from the first region including at least a part of the target portion into a second region located in the direction perpendicular to a straight line passing through any two points on the target portion.
  • The target portion in a comparison image is specified by detecting the straight edge of the board card 1 appearing in the comparison image, in the same way as in the reference image. In other words, the edge of the board card 1 is detected in each of the reference image and the comparison images without depending on information appearing on the surface of the board card 1.
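Because only the silhouette boundary of the card matters, a surface-feature-free edge detector suffices. A toy sketch follows (illustrative only; a real implementation would use a gradient operator such as Sobel plus a line fit): it locates the image column with the strongest mean horizontal brightness step.

```python
from typing import List, Optional

def detect_vertical_edge(image: List[List[int]]) -> Optional[int]:
    """Locate a straight vertical edge (e.g., the card/background boundary)
    in a row-major grayscale image, using only the brightness step between
    card and background; no surface features of the card are needed.

    Returns the column index with the strongest mean horizontal gradient,
    or None if the image is uniform."""
    height, width = len(image), len(image[0])
    best_col: Optional[int] = None
    best_grad = 0.0
    for x in range(1, width):
        # Mean absolute brightness difference between columns x-1 and x.
        grad = sum(abs(image[y][x] - image[y][x - 1])
                   for y in range(height)) / height
        if grad > best_grad:
            best_grad, best_col = grad, x
    return best_col
```

Running the same detector on the reference image and on each comparison image yields the per-frame edge position whose displacement across regions constitutes the event.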
  • When two target portions are determined, the change event detection unit 28b detects the first event starting from the first region including at least a part of the determined first target portion, and the second event starting from the third region including at least a part of the determined second target portion.
  • the “second region” includes two regions located in opposite directions with respect to a straight line passing through two arbitrary points on the first target portion.
  • the “fourth region” includes two regions located in opposite directions with respect to a straight line passing through two arbitrary points on the second target portion.
  • In the following, an event indicating that the target portion has been displaced is referred to as a "displacement event of the target portion"; in particular, the first event indicating that the first target portion has been displaced and the second event indicating that the second target portion has been displaced are referred to as the "first displacement event of the target portion" and the "second displacement event of the target portion", respectively.
  • FIGS. 3 to 9 are diagrams showing examples of the displacement event of the target portion.
  • In these examples, the board card 1 having a substantially rectangular edge appears in the reference image and the comparison images.
  • the area A1a shown in the examples of FIGS. 3 and 4 is an example of the “first area”, and the area A1b and the area A1c are examples of the “second area”.
  • the area A2a shown in FIGS. 5 to 9 is an example of the “third area”, and the area A2b and the area A2c are examples of the “fourth area”.
  • the positions of these regions are set in advance, for example. Note that the positions and numbers of the areas shown in FIGS. 3 to 9 are merely examples, and a larger number of areas than the areas shown in FIGS. 3 to 9 may be set.
  • In the reference image shown in FIG. 3(A), a straight edge included in the area A1a is determined as the target portion P1. In the comparison image captured after the user translates the board card 1 or the portable terminal 2 horizontally from this state (for example, the board card 1 is translated to the right in the drawing), the target portion P1 is displaced from the region A1a to the region A1b, as shown in FIG. 3(B). When the user translates the board card 1 or the portable terminal 2 in the direction opposite to that of FIG. 3(B) (for example, the board card 1 is translated to the left in the drawing), the target portion P1 is displaced from the region A1a to the region A1c, as shown in FIG. 3(C). The displacement event of the target portion P1 shown in FIG. 3(B) and that shown in FIG. 3(C) are detected and distinguished by the change event detection unit 28b. Thereby, a change in the relative positional relationship between the portable terminal 2 and the board card 1 in space can be detected from a displacement event of the target portion in one direction.
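The one-direction detection walked through in FIG. 3 can be sketched as a small per-frame state machine (illustrative only; the region bounds and guard gap are made-up pixel values): the edge must start inside A1a in the reference frame, and the event fires when a later frame places it in A1b or A1c.

```python
from typing import List, Optional, Tuple

def one_direction_event(positions: List[float],
                        a1a: Tuple[float, float] = (40.0, 60.0),
                        gap: float = 10.0) -> Optional[str]:
    """Scan per-frame x-coordinates of the tracked edge for a displacement
    event. positions[0] comes from the reference image and must lie inside
    region A1a; the event fires when a comparison frame puts the edge into
    A1b (right of A1a plus a guard gap) or A1c (left of A1a minus the gap)."""
    lo, hi = a1a
    if not positions or not (lo <= positions[0] <= hi):
        return None  # no valid reference image
    for x in positions[1:]:
        if x > hi + gap:
            return "A1a->A1b"
        if x < lo - gap:
            return "A1a->A1c"
    return None
```

The guard gap between A1a and the flanking regions plays the role of the threshold distance of claim 12, keeping small hand-shake jitter from firing an event.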
  • In the reference image shown in FIG. 4(A), the linear edge included in the region A1a is determined as the target portion P1, as in FIG. 3(A). In the comparison image shown in FIG. 4(B), the target portion P1 has been displaced from the region A1a to the region A1b. In the comparison image captured after the user brings the board card 1 and the portable terminal 2 closer together (in the vertical direction of the drawing) from the state shown in FIG. 4(A), the target portion P1 is displaced from the region A1a to the region A1c.
  • In the reference image shown in FIG. 5(A), the linear edge included in the area A1a is determined as the first target portion P1, and a straight edge partly included in the area A2a is determined as the second target portion P2. In the comparison image captured after the user translates the board card 1 or the portable terminal 2 diagonally from the state shown in FIG. 5(A) (for example, the board card 1 is translated diagonally down and to the right in the drawing), the target portion P1 is displaced from the region A1a to the region A1b and the target portion P2 is displaced from the region A2a to the region A2b, as shown in FIG. 5(B). When the user translates the board card 1 or the portable terminal 2 in the opposite direction (for example, the board card 1 is translated diagonally up and to the left in the drawing), the target portion P1 is displaced from the region A1a to the region A1c and the target portion P2 is displaced from the region A2a to the region A2c, as shown in FIG. 5(C). These displacement events are detected and distinguished by the change event detection unit 28b. As described above, in the example shown in FIG. 5, displacement events in two orthogonal directions can be detected, and thereby a change in the relative positional relationship between the mobile terminal 2 and the board card 1 in space can be detected from the displacement of the target portions in two orthogonal directions.
  • In the reference image shown in FIG. 6(A), among the edges of the board card 1, the straight edge included in the area A1a is determined as the first target portion P1, and a straight edge partly included in the area A2a is determined as the second target portion P2. In FIG. 6(B), the target portion P1 is displaced from the area A1a to the area A1b, and the target portion P2 is displaced from the region A2a to the region A2b. In FIG. 6(C), the target portion P1 is displaced from the area A1a to the area A1c, and the target portion P2 is displaced from the region A2a to the region A2c. The first displacement event of the target portion P1 and the second displacement event of the target portion P2 are detected and distinguished by the change event detection unit 28b, which again makes it possible to detect displacement events in two orthogonal directions.
  • the board card 1 is imaged in a state where the sideways-oriented edge of the board card 1 and the sideways-oriented edge of the portable terminal 2 are parallel.
  • the information notification unit 28c may notify the user of information prompting the user to adjust the relative positional relationship between the mobile terminal 2 and the board card 1 so that at least a part of the straight edge of the board card 1 is within the region A1a and the boundary between the region A1a and the region A1b (or the region A1c) is parallel to the straight edge.
  • Thereby, the change event detection unit 28b can reliably detect the displacement event of the target portion in one direction.
  • Then, after the user adjusts the relative positional relationship between the portable terminal 2 and the board card 1 (for example, to the state shown in FIG. ), the straight edge is determined as the target portion P1 in the reference image in which at least a part of the edge is within the region A1a.
  • the information notification unit 28c notifies the user by displaying information for prompting the adjustment on the display D or outputting the information from the speaker 23 as sound.
  • Alternatively, the information notification unit 28c may notify the user of information prompting adjustment of the relative positional relationship between the mobile terminal 2 and the board card 1 so that at least a part of the linear first edge of the board card 1 is within the region A1a, the boundary between the region A1a and the region A1b (or the region A1c) is parallel to the first edge, at least a part of the linear second edge perpendicular to the first edge is within the region A2a, and the boundary between the region A2a and the region A2b (or the region A2c) is parallel to the second edge.
  • Thereby, the change event detection unit 28b can reliably detect the displacement events of the target portions in two orthogonal directions. Then, after the user adjusts the relative positional relationship between the mobile terminal 2 and the board card 1 (for example, to the state shown in FIG. ), in the reference image in which at least a part of the first edge falls within the region A1a and at least a part of the second edge falls within the region A2a, the first edge is determined as the first target portion P1 and the second edge is determined as the second target portion P2.
  • The region placement unit 28d may arrange the first region and the second region so that at least a part of the straight edge of the object, that is, the board card 1, fits within the first region and the boundary between the first region and the second region is parallel to the straight edge (that is, the positions of the regions may be determined accordingly).
  • Thereby, the region placement unit 28d can arrange the first and second regions so that the displacement event of the target portion in one direction can be reliably detected, without the user having to adjust the relative positional relationship between the portable terminal 2 and the board card 1.
  • Alternatively, the region placement unit 28d may arrange the regions so that at least a part of the linear first edge of the board card 1 is within the first region, the boundary between the first region and the second region is parallel to the first edge, at least a part of the linear second edge perpendicular to the first edge is within the third region, and the boundary between the third region and the fourth region is parallel to the second edge.
  • Thereby, the region placement unit 28d can arrange the first to fourth regions so that the displacement events of the target portions in the two orthogonal directions can be reliably detected, without the user having to adjust the relative positional relationship between the mobile terminal 2 and the board card 1.
  • FIG. 7 is a diagram illustrating an example of the displacement event of the target portion when the areas A1a, A1b, A1c, A2a, A2b, and A2c arranged by the area arranging unit 28d are used.
  • the target portion determination unit 28a, in the reference image in which at least a part of the first edge is within the region A1a and at least a part of the second edge is within the region A2a, determines the first edge as the first target portion P1 and the second edge as the second target portion P2.
  • the first displacement event of the target portion P1 shown in FIG. 7B, the second displacement event of the target portion P2 shown in FIG. 7B, and the displacement event of the target portion P1 shown in FIG. 7C are distinguished and detected by the change event detection unit 28b.
  • the first displacement event of the target portion P1 and the second displacement event of the target portion P2 shown in FIG. 5 may occur with a certain time difference.
  • The example of FIG. 8 shows a case where the user translates the board card 1 or the portable terminal 2 in the horizontal direction (FIG. 8B) and then in the vertical direction (FIG. 8C). In this case, the first displacement event of the target portion P1 is detected earlier than the second displacement event of the target portion P2.
  • The example of FIG. 9 shows a case where the board card 1 or the portable terminal 2 is translated in the vertical direction (FIG. 9B) and then in the horizontal direction (FIG. 9C). In this case, the second displacement event of the target portion P2 is detected earlier.
  • The detection times of the first displacement event and the second displacement event are temporarily associated with the respective displacement events in the RAM or the like, whereby the signal output unit 28e can specify the detection time difference and the detection order of the first displacement event and the second displacement event.
  • FIG. 10A shows the target portion P1 and the target portion P2 when the board card 1 is imaged in a state in which the edge of the sideways-oriented board card 1 is parallel to the edge of the vertically-oriented portable terminal 2. Even in this case, displacement events in two orthogonal directions can be detected by the change event detection unit 28b.
  • As shown in FIG. 10B, the perpendicular-direction distance d1 between the region A1a including the target portion P1 and the regions A1b and A1c located in the vertical direction of the target portion P1, and the perpendicular-direction distance d2 between the region A2a including a part of the target portion P2 and the regions A2b and A2c located in the vertical direction of the target portion P2, are set to be equal to or greater than a threshold value. Note that only one of the distance d1 and the distance d2 may be set to be equal to or greater than the threshold value.
  • Here, the region located between the region A1a including the target portion P1 and the region A1b, whose distance from the region A1a is equal to or greater than the threshold, is referred to as a buffer region BA.
  • While the target portion is located within the buffer region BA, the change event detecting unit 28b does not detect a displacement event of the target portion. Accordingly, even if the target portion is displaced into the buffer region BA due to hand shake of the user holding at least one of the board card 1 and the portable terminal 2, erroneous detection of a displacement event can be prevented and the detection accuracy can be improved.
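The buffer-region behavior can be sketched as follows, assuming a one-dimensional model in which A1a is a band around the reference edge position and the buffer BA extends a fixed distance beyond it; all names, distances, and the up/down region assignment are illustrative.

```python
# Hedged sketch of the buffer region BA: small displacements caused by hand shake
# stay inside BA and fire no event; only crossing beyond BA is reported.

def detect_with_buffer(y_ref, y_now, half_width_a1a, buffer_d):
    """Return the name of the region the edge crossed into, or None.

    half_width_a1a: half the height of the reference band A1a around y_ref.
    buffer_d: the buffer distance (>= threshold d1) that must also be crossed.
    """
    offset = y_now - y_ref
    if abs(offset) <= half_width_a1a:
        return None                        # still inside A1a
    if abs(offset) <= half_width_a1a + buffer_d:
        return None                        # inside buffer region BA: treat as shake
    return "A1b" if offset < 0 else "A1c"  # crossed into an adjacent region

print(detect_with_buffer(50, 53, 5, 10))   # small shake -> no event
print(detect_with_buffer(50, 75, 5, 10))   # real displacement -> event
```

The same check would be applied independently along the second axis for the target portion P2.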
  • The change event detection unit 28b may be configured to detect a displacement event indicating that the target portion has been displaced when the acceleration, which is based on the predetermined imaging interval of the camera 21 and the amount of displacement of the target portion from its position in the reference image, changes in a predetermined pattern.
  • The acceleration is calculated by the control unit 28 in time series based on the predetermined imaging interval of the camera 21 and the amount of displacement of the target portion from its position in the reference image.
  • For example, when the initial acceleration (> 0) at which the target portion starts to be displaced is equal to or greater than a first threshold value and the final acceleration (< 0) at which the target portion stops being displaced is equal to or less than a second threshold value, the change event detection unit 28b determines that the acceleration has changed in the predetermined pattern and detects the displacement event of the target portion.
  • the acceleration is an acceleration in the same direction as the moving direction of the target portion.
  • On the other hand, when the initial acceleration (> 0) is smaller than the first threshold value, or when the final acceleration (< 0) is larger than the second threshold value, the movement is regarded as camera shake and the displacement event of the target portion is not detected.
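The acceleration-pattern test can be sketched as follows; the thresholds, frame rate, and sample values are hypothetical, and the second-difference formulation is one possible reading of the description above.

```python
# Hedged sketch: accept a displacement only when the acceleration profile matches
# a deliberate move, i.e. initial acceleration >= TH1 and final acceleration <= TH2
# (TH2 < 0). Accelerations are second differences of the displacement samples
# taken at the camera's fixed imaging interval dt.

def is_deliberate_move(displacements, dt, th1, th2):
    v = [(b - a) / dt for a, b in zip(displacements, displacements[1:])]
    a = [(b - c) / dt for c, b in zip(v, v[1:])]
    if len(a) < 2:
        return False
    return a[0] >= th1 and a[-1] <= th2   # sharp start and sharp stop

dt = 1.0 / 30                             # e.g. 30 fps imaging interval
move = [0, 0, 5, 15, 25, 30, 30]          # quick start, quick stop
shake = [0, 1, 2, 3, 4, 5, 6]             # constant drift: near-zero acceleration
print(is_deliberate_move(move, dt, 100, -100))
print(is_deliberate_move(shake, dt, 100, -100))
```

A slow drift (the `shake` series) never reaches the initial-acceleration threshold and is rejected, matching the camera-shake case above.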
  • the signal output unit 28e outputs, for example, a signal corresponding to the displacement event detected by the change event detection unit 28b or another event related to the detected displacement event (hereinafter referred to as “related event”).
  • An example of the related event is a composite event of the first displacement event and the second displacement event.
  • The correspondence relationship between each displacement event or related event described above and an output signal may be defined in the information processing program of the present invention, or may be defined in a table stored in the storage unit 25. For example, in the examples of FIGS. 3 to 7, when the displacement event (also the first displacement event) of the target portion P1 is detected, the signal SG1 is output.
  • The signal output unit 28e may be configured to output different signals depending on which of the two regions included in the second or fourth region the target portion is displaced to. Thereby, a different signal can be output according to the direction in which the target portion is displaced from the first or third region.
  • For example, the signal SG11 is output when the region to which the target portion P1 is displaced is the region A1b, and a signal SG12 different from the signal SG11 is output when the region is the region A1c. Further, in the examples of FIGS. 5 to 7, when the signal output unit 28e detects the first displacement event of the target portion P1 and the second displacement event of the target portion P2 with a time difference within a predetermined threshold (that is, substantially simultaneously), it may output a signal SG3 (different from the signal SG1 and the signal SG2) corresponding to a composite event (an example of a related event) of the first displacement event and the second displacement event. As a result, a signal corresponding to the target portions P1 and P2 being displaced substantially simultaneously in two orthogonal directions can be output.
  • In particular, the signal output unit 28e can selectively output the first signal (SG1, SG11, or SG12) corresponding only to the first displacement event, the second signal corresponding only to the second displacement event, or the third signal corresponding to the composite event, according to the detection time difference between the first displacement event and the second displacement event. That is, the three signals can be used properly according to this detection time difference.
  • The signal output unit 28e may output a different third signal according to the detection order of the first displacement event and the second displacement event. For example, in the detection order shown in the example of FIG. 8, the signal SG31 is output. On the other hand, in the detection order shown in FIG. 9, a signal SG32 different from the signal SG31 is output.
  • Thereby, two signals can be used properly according to the detection order of the first displacement event and the second displacement event.
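The signal selection described above can be sketched as follows; the rule applied when the two events fall outside the time-difference threshold is an assumption based on the description of the first and second signals, and the signal names simply follow the text.

```python
# Hedged sketch of signal selection: SG1/SG2 for a lone displacement event, and a
# composite third signal whose variant (SG31 / SG32) depends on the detection
# order when both events arrive within the threshold.

def select_signal(t_first, t_second, threshold):
    """t_first / t_second: detection times of the two events (None if absent)."""
    if t_first is not None and t_second is None:
        return "SG1"
    if t_second is not None and t_first is None:
        return "SG2"
    if abs(t_first - t_second) > threshold:
        # outside the threshold: treat as two independent events (assumption)
        return "SG1" if t_first < t_second else "SG2"
    return "SG31" if t_first <= t_second else "SG32"

print(select_signal(0.10, None, 0.5))    # only the first event -> SG1
print(select_signal(0.10, 0.20, 0.5))    # near-simultaneous, P1 first -> SG31
print(select_signal(0.30, 0.10, 0.5))    # near-simultaneous, P2 first -> SG32
```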
  • Alternatively, the first displacement event of the target portion P1 and the second displacement event of the target portion P2 may be set so as not to be detected with a time difference exceeding a predetermined threshold (that is, so as to be detected substantially simultaneously). In that case, the output is limited to the third signal SG3 corresponding to the composite event of the first displacement event and the second displacement event.
  • the processing unit 28f performs predetermined processing according to the signal output from the signal output unit 28e.
  • Examples of the predetermined processing include processing for inputting predetermined information (including parameters) from the outside, processing for outputting predetermined information (including parameters) to the outside, processing for setting predetermined parameters, and processing for displaying predetermined information on the display D.
  • the process to be executed is determined from these processes according to the signal output from the signal output unit 28e.
  • Examples of the processing for inputting predetermined information from the outside include processing for inputting (acquiring) information by a "read command" from the non-contact type IC chip built in the board card 1, and processing for receiving information from a predetermined device (for example, a server) via the mobile wireless communication unit 27. Examples of the processing for outputting predetermined information to the outside include processing for outputting predetermined information by a "write command" to the non-contact type IC chip built in the board card 1, and processing for outputting (transmitting) predetermined information to a predetermined device (for example, a server) via the mobile wireless communication unit 27. The data stored in the storage unit of the non-contact type IC chip is read and written in response to requests from the portable terminal 2 within a distance range in which the non-contact type IC chip built in the board card 1 is capable of short-range wireless communication. There is also processing for setting parameters used for a predetermined process to be executed on the data. Thereby, the parameters used for the process can be set by changing the relative positional relationship between the board card 1 and the portable terminal 2.
  • FIG. 11 is a flowchart illustrating an example of the displacement event detection and signal output processing of the target portion.
  • the process illustrated in FIG. 11 is started when a start instruction is input from the user via the operation unit 24, for example.
  • the control unit 28 starts capturing a series of images continuously captured by the camera 21 (step S1).
  • the control unit 28 performs a target part determination process (step S2).
  • The target portion determination process is a process in which the target portion determination unit 28a determines, in the reference image constituting the series of images, a target portion that can be specified with reference to the edge of the board card 1 appearing in the reference image.
  • the control unit 28 determines whether or not the target portion has been determined in the reference image (step S3).
  • In step S4, the information notification unit 28c notifies the user of information urging adjustment of the relative positional relationship between the board card 1 and the portable terminal 2 so that the target portion can be determined.
  • At this time, the region placement unit 28d may arrange the first region and the second region so that at least a part of the linear edge of the object, that is, the board card 1, is within the first region and the boundary between the first region and the second region is parallel to the linear edge.
  • On the other hand, when it is determined in step S3 that the target portion can be determined in the reference image (step S3: YES), the process proceeds to step S5.
  • In step S5, the control unit 28 executes a detection process for the displacement event of the target portion.
  • The target portion displacement event detection process is a process in which the change event detection unit 28b detects an event indicating that the target portion is displaced in any of a plurality of comparison images subsequent to the reference image.
  • Next, the control unit 28 determines whether or not the displacement event of the target portion has been detected (step S6). When it is determined that the displacement event of the target portion cannot be detected (step S6: NO), the process proceeds to step S4.
  • In step S4, the information notification unit 28c notifies the user of information urging adjustment of the relative positional relationship between the board card 1 and the portable terminal 2 so that the displacement event of the target portion can be detected.
  • On the other hand, when it is determined in step S6 that the displacement event of the target portion has been detected (step S6: YES), the process proceeds to step S7.
  • In step S7, the signal output unit 28e outputs a signal corresponding to the detected displacement event or related event to the processing unit 28f.
  • the processing unit 28f performs a predetermined process according to the output signal.
  • Next, the control unit 28 determines whether or not there is an end instruction (step S8). For example, when an end instruction is input from the user via the operation unit 24, it is determined that there is an end instruction (step S8: YES), and the process illustrated in FIG. 11 ends. On the other hand, when it is determined that there is no end instruction (step S8: NO), the process returns to step S2 and the above process is repeated.
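The flow of steps S1 to S8 can be sketched as a loop; the callables stand in for the camera, the target portion determination unit 28a, the change event detection unit 28b, and the signal output unit 28e, and are purely illustrative stubs rather than the actual units.

```python
# Hedged sketch of the flow of FIG. 11 (steps S1-S8), with the hardware-dependent
# parts replaced by injected callables.

def run_loop(frames, determine_target, detect_event, handle_signal, notify):
    for reference, comparison in frames:                 # S1: captured image pairs
        target = determine_target(reference)             # S2
        if target is None:                               # S3: NO
            notify("adjust card/terminal so a target portion can be determined")  # S4
            continue
        event = detect_event(target, comparison)         # S5
        if event is None:                                # S6: NO
            notify("adjust card/terminal so the displacement can be detected")    # S4
            continue
        handle_signal(event)                             # S7
    # S8: the loop ends when frames run out (stands in for the end instruction)

log = []
run_loop(
    frames=[(("ref", 1), ("cmp", 1)), (("ref", 2), ("cmp", 2))],
    determine_target=lambda ref: "P1" if ref[1] == 2 else None,
    detect_event=lambda tgt, cmp: "first_displacement",
    handle_signal=lambda ev: log.append(("signal", ev)),
    notify=lambda msg: log.append(("notify", msg)),
)
print(log)
```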
  • As described above, a change in the relative positional relationship between the board card 1 and the mobile terminal 2 in space can be detected, without depending on the information appearing on the surface of the board card 1, as a change in the target portion that can be specified with the edge of the board card 1 as a reference, and a signal corresponding to the change can be output.
  • In the above embodiment, the target portion determination unit 28a is configured to determine the straight edge of the object as the target portion; however, a surface surrounded by the edge, a tangent line that touches the edge, or a point on the tangent line may instead be determined as the target portion. Further, in the above embodiment, the change event detection unit 28b is configured to detect an event indicating that the target portion has been displaced (that is, that the position of the target portion has changed); however, an event indicating that the area of the surface surrounded by the edge has changed may be detected instead. A change in the area of a surface means a change in the occupation ratio of the surface in an image frame or in a specific region in the image frame. [2.
  • FIG. 12 is a diagram showing a schematic configuration example of an electronic money system S to which the present invention is applied.
  • the electronic money system S is configured using an electronic money card 1, a portable terminal 2, an electronic money server 3, a database DB, and the like.
  • the electronic money card 1 is an example of the board card 1 described above.
  • the electronic money server 3 is an example of a balance change device.
  • the electronic money card 1 includes a non-contact type IC chip 1a (IC module) having a storage unit and a processing unit, and an antenna.
  • In the storage unit (for example, a non-volatile memory), balance data indicating a balance of electronic value, log data, an electronic money number, and the like are stored.
  • Electronic value is electronic information corresponding to monetary value.
  • the electronic money number is, for example, identification information for identifying the balance of electronic value from the balance of electronic value of another electronic money card.
  • Log data is stored in the non-contact IC chip 1a in real time at the time of transaction or charging.
  • the log data corresponds to, for example, transaction history data and charge history data.
  • the transaction history data is information generated with respect to a transaction in which an electronic money card is used for settlement (payment) at a store, for example.
  • the transaction history data includes, for example, data such as an electronic money number, transaction date and time, transaction amount, and store ID of the store where the transaction was made.
  • the store ID is identification information uniquely assigned to each store.
  • the charge history data is information that is generated with respect to a charge in which the electronic money server 3 is used, for example.
  • charging means increasing the balance of the electronic value stored in the storage unit of the non-contact IC chip 1a.
  • the charge history data includes data such as an electronic money number, a charge date and time, and a charge amount.
  • the electronic money card can be charged from a payment terminal installed in the store.
  • the charge history data includes, for example, data such as an electronic money number, a charge date and time, a charge amount, and a store ID of the charged store.
  • The processing unit (for example, a CPU) of the electronic money card 1 executes a balance change process for changing the balance data stored in the storage unit using balance change information input from a device (for example, the portable terminal 2) within a distance range in which near field communication is possible, and returns the result of the process.
  • the electronic money server 3 is a server that manages the movement of money value by electronic value, and includes a communication unit, a storage unit, a control unit, and the like.
  • the monetary value is transferred by changing (increasing or decreasing) the balance of electronic value.
  • the communication unit of the electronic money server 3 is connected to a network NW composed of a mobile communication network and the Internet, and controls the communication state.
  • the storage unit of the electronic money server 3 is configured by, for example, a hard disk drive, and stores an operating system and an electronic money management program for performing settlement processing using electronic value and electronic value charging processing.
  • the control unit of the electronic money server 3 includes a CPU, a ROM, a RAM, and the like, and performs a payment process using an electronic value and a charge process using an electronic value according to an electronic money management program.
  • the database DB is provided in a database server installed in the electronic money server 3 or outside the electronic money server 3.
  • the database DB is provided with a user information database, a store information database, a log database, and an electronic value database.
  • User information of users (members) of electronic money cards is registered for each user in the user information database.
  • the user information includes, for example, a user ID, a password, a user name, an address, a telephone number, an e-mail address, and an electronic money number.
  • the user ID is identification information uniquely assigned to each user.
  • In the store information database, store information of stores (member stores) that can handle electronic money cards is registered for each store.
  • the store information includes, for example, a store ID, a password, a store name, an address, a telephone number, and an e-mail address.
  • In the log database, log data transmitted from the payment terminal of the store to the electronic money server 3 via the network NW, for example every predetermined time by batch processing, is registered.
  • In the electronic value database, the management value of the electronic value balance is registered in association with the electronic money number or the like.
  • the management value of the electronic value balance indicates the balance of the electronic value specified by the electronic money number.
  • The electronic money server 3 issues privilege information (for example, a coupon) to a user whose transactions or charges satisfy a privilege information issue condition, based on the log data registered in the log database.
  • the privilege information is, for example, information indicating that a special benefit or treatment such as a discount can be received from a store when a payment is made for a product or service.
  • As the condition for issuing privilege information, for example, a condition such as "within the last x months (period), at store xx (or at x or more stores) (location), there were x or more transactions (or x or more charges) (number of times)" or "within the last x months (period), at store xx (or at x or more stores) (location), there were transactions for x yen or more (or x yen or more was charged) (amount)" is set.
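A condition of this form can be sketched as follows; the log fields, store IDs, dates, and the 30-day month approximation are illustrative assumptions, not the server's actual schema.

```python
# Hedged sketch of a privilege-issue condition such as
# "x or more transactions at store xx within the last x months".

from datetime import datetime, timedelta

def qualifies(logs, store_id, months, min_count, now):
    cutoff = now - timedelta(days=30 * months)   # rough month length (assumption)
    hits = [e for e in logs
            if e["store_id"] == store_id and e["when"] >= cutoff]
    return len(hits) >= min_count

now = datetime(2013, 6, 1)
logs = [
    {"store_id": "S001", "when": datetime(2013, 5, 20), "amount": 1200},
    {"store_id": "S001", "when": datetime(2013, 4, 2),  "amount": 800},
    {"store_id": "S002", "when": datetime(2013, 5, 1),  "amount": 500},
]
print(qualifies(logs, "S001", 3, 2, now))   # two recent transactions -> True
```

An amount-based condition would sum `e["amount"]` over the same filtered entries instead of counting them.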
  • FIG. 13 is a sequence diagram showing an example of an electronic value charging operation.
  • 14 and 15 are diagrams showing how the charge amount is set by the change in the relative positional relationship between the electronic money card 1 and the portable terminal 2.
  • When the user brings the electronic money card 1 and the portable terminal 2 close to each other within a distance range in which short-range wireless communication is possible and inputs a predetermined instruction (for example, an instruction to display the balance of the electronic money card 1), the mobile terminal 2 activates the camera 21 and requests balance data from the electronic money card 1 via the short-range wireless communication unit 26 (hereinafter referred to as a "balance request") (step S11).
  • In response to the balance request in step S11, the non-contact IC chip 1a of the electronic money card 1 reads the electronic money number together with the balance data stored in the storage unit and transmits them to the portable terminal 2 (step S12).
  • the portable terminal 2 acquires the electronic money number together with the balance data from the electronic money card 1, and starts the process shown in FIG. 11 (step S13).
  • the portable terminal 2 displays information prompting the user to determine a desired charge amount on the display D (step S14).
  • The user adjusts the relative positional relationship between the electronic money card 1 and the portable terminal 2 so that the edge of the electronic money card 1 fits, for example, in the region between the frame a and the frame b displayed on the display D. Note that a partial region between the frame a and the frame b corresponds to the region A1a and the region A2a illustrated in FIG.
  • The user then changes the relative positional relationship between the electronic money card 1 and the portable terminal 2 so as to correspond to the desired charge amount. For example, when the user wants to charge only ¥1,000, the electronic money card 1 and the portable terminal 2 are moved apart in the vertical direction so that, from the basic position, the edge of the electronic money card 1 falls within the region inside the frame a (FIG. 14A).
  • When the user wants to charge ¥5,000, the electronic money card 1 and the portable terminal 2 are brought close to each other in the vertical direction so that, from the basic position, the edge of the electronic money card 1 falls within the region between the frame b and the frame c (FIG. 14C). Note that a partial region between the frame b and the frame c corresponds to the region A1c and the region A2c shown in FIG.
  • Further, the relative positional relationship between the electronic money card 1 and the portable terminal 2 may be adjusted so that the edge of the electronic money card 1, from the state of being within the frame a, falls within the region between the frame a and the frame b. As described above, when the relative positional relationship between the electronic money card 1 and the portable terminal 2 is changed, a signal corresponding to the displacement event detected by the change event detection unit 28b in the displacement event detection and signal output processing of the target portion shown in FIG. is output from the signal output unit 28e to the processing unit 28f.
  • The processing unit 28f of the portable terminal 2 displays the charge amount corresponding to the output signal on the display D (step S15). For example, as shown in FIGS. 14A to 14C, "¥1,000 charge", "¥3,000 charge", and "¥5,000 charge" are switched and displayed. Further, the processing unit 28f of the portable terminal 2 sets the charge amount (the amount to be added to the balance of the electronic value) corresponding to the output signal when, for example, the screen of the display D is tapped by the user (step S16).
  • Alternatively, the user changes the relative positional relationship between the electronic money card 1 and the portable terminal 2 so as to correspond to the desired charge amount by moving, for example, the electronic money card 1 in the right direction in the figure (or the mobile terminal 2 in the left direction in the figure).
  • Note that a partial region between the ¥3,000 amount line and the ¥5,000 amount line corresponds to the region A1c shown in FIG. 3, the partial region between the ¥5,000 amount line and the ¥7,000 amount line corresponds to the region A1a shown in FIG. 3, and the partial region between the ¥7,000 amount line and the ¥10,000 amount line corresponds to the region A1b shown in FIG. 3.
  • the user adjusts the relative positional relationship between the electronic money card 1 and the portable terminal 2 to finally align the edge of the electronic money card 1 between the amount lines corresponding to the desired charge amount.
  • a signal corresponding to the displacement event detected by the change event detection unit 28b is output from the signal output unit 28e to the processing unit 28f.
  • Similarly, the processing unit 28f of the portable terminal 2 displays the charge amount corresponding to the output signal on the display D (step S15). For example, as shown in FIGS. 15A to 15C, "¥3,000 charge", "¥5,000 charge", and "¥7,000 charge" are switched and displayed.
  • Further, the processing unit 28f of the portable terminal 2 sets the charge amount (the amount to be added to the balance of the electronic value) corresponding to the output signal when, for example, the screen of the display D is tapped by the user (step S16).
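The correspondence between output signals and charge amounts can be sketched as a simple lookup; the particular signal names and amounts below merely mirror the FIG. 14 example and are an illustrative assumption, not a definitive mapping.

```python
# Hedged sketch: each output signal selects a charge amount to display and set.
# The mapping is per-embodiment; values follow the FIG. 14 example.

SIGNAL_TO_CHARGE = {
    "SG11": 1000,   # edge moved into the inner region  -> "¥1,000 charge"
    "SG1":  3000,   # edge in the base region           -> "¥3,000 charge"
    "SG12": 5000,   # edge moved into the outer region  -> "¥5,000 charge"
}

def charge_amount_for(signal):
    """Return the charge amount in yen for a signal, or None if unmapped."""
    return SIGNAL_TO_CHARGE.get(signal)

print(charge_amount_for("SG11"))
print(charge_amount_for("SG12"))
```

Tapping the display would then commit `charge_amount_for(signal)` as the amount sent to the server in step S17.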
  • Next, the processing unit 28f of the mobile terminal 2 transmits, for example in accordance with a transmission instruction from the user, the set charge amount and the balance data and electronic money number acquired from the non-contact IC chip 1a of the electronic money card 1 to the electronic money server 3 via the mobile wireless communication unit 27 (step S17).
  • Note that the electronic money number may instead be acquired from an image of, for example, the printing on the surface of the electronic money card 1.
  • the electronic money server 3 searches the database DB using the electronic money number as a key (step S18).
  • the electronic money server 3 acquires necessary information by searching the database DB, and then performs user authentication processing (step S19).
  • Next, the electronic money server 3 refers to the management value of the balance of the electronic value associated with the electronic money number in the electronic value database, generates balance change information (a write command in this embodiment) for changing the balance data by the charge amount, and transmits the balance change information to the portable terminal 2 (step S20).
  • The write command is, for example, a command "overwrite the balance to ¥3,000". In this case, the parameter added to this command is "¥3,000".
  • Alternatively, an addition command may be applied. The addition command is, for example, a command "add ¥2,000 to the balance". In this case, the parameter added to this command is "¥2,000".
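The two balance-change commands can be sketched as follows; the command names and the plain-integer balance representation are illustrative, not the IC chip's actual command set.

```python
# Hedged sketch of the two balance-change commands: an "overwrite" write command
# that sets the balance to the parameter, and an "addition" command that adds
# the parameter to the current balance.

def apply_balance_change(balance, command, parameter):
    """Return the new balance after applying one balance-change command."""
    if command == "overwrite":
        return parameter            # e.g. "overwrite the balance to ¥3,000"
    if command == "add":
        return balance + parameter  # e.g. "add ¥2,000 to the balance"
    raise ValueError("unknown balance change command: " + command)

print(apply_balance_change(1000, "overwrite", 3000))  # -> 3000
print(apply_balance_change(1000, "add", 2000))        # -> 3000
```

The overwrite form requires the server to know the current balance beforehand (hence the balance data sent in step S17), whereas the addition form does not.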
  • the processing unit 28f of the portable terminal 2 acquires the balance change information transmitted from the electronic money server 3, and inputs (transfers) the balance change information to the non-contact type IC chip 1a of the electronic money card 1 (step S21).
  • Next, the non-contact IC chip 1a of the electronic money card 1 executes the balance change process for changing the balance data stored in the storage unit using the balance change information input from the mobile terminal 2 (step S22), and returns the processing result (step S23).
  • the portable terminal 2 transfers the processing result acquired from the non-contact type IC chip 1a to the electronic money server 3 (step S24).
  • the electronic money server 3 receives the processing result transmitted from the mobile terminal 2, the electronic money server 3 transmits a completion notification to the mobile terminal 2 (step S25).
  • alternatively, the processing unit 28f of the mobile terminal 2 may set the charge amount according to the output signal and transmit only the charge amount and the electronic money number acquired from the non-contact type IC chip 1a to the electronic money server 3 (that is, balance data is not transmitted at this time).
  • the electronic money server 3 acquires balance data from the non-contact type IC chip 1a via the portable terminal 2 (by a read command).
  • the electronic money server 3 performs the processes of steps S18 and S19, and inputs balance change information to the non-contact type IC chip 1a via the portable terminal 2 (by a write command).
  • the contactless IC chip 1a of the electronic money card 1 executes balance change processing for changing the balance data stored in the storage unit using the balance change information input from the mobile terminal 2, and returns the processing result via the portable terminal 2.
  • when the electronic money server 3 receives the processing result transmitted from the mobile terminal 2, it transmits a completion notification to the mobile terminal 2.
  • in this way as well, the balance data stored in the storage unit of the contactless IC chip 1a of the electronic money card 1 can be changed.
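The alternative flow above, in which the server itself reads the balance with a read command and then issues the write command while the terminal merely relays, can be sketched as follows. The class and method names are illustrative assumptions, not from the patent:

```python
# Illustrative sketch of the server-driven charge flow: the server reads
# the chip state (read command), then writes the updated balance (write
# command), both relayed through the mobile terminal.

class ContactlessChip:
    """Stands in for the non-contact IC chip 1a storing the balance."""
    def __init__(self, money_number: str, balance: int):
        self.money_number = money_number
        self.balance = balance

    def read(self) -> dict:
        # read command: return electronic money number and balance data
        return {"number": self.money_number, "balance": self.balance}

    def write(self, new_balance: int) -> str:
        # write command: balance change processing in the chip's storage
        self.balance = new_balance
        return "OK"  # processing result returned via the terminal

def server_charge(chip: ContactlessChip, charge_amount: int) -> str:
    state = chip.read()  # server acquires balance data via the terminal
    # (steps S18/S19: database lookup and user authentication omitted)
    return chip.write(state["balance"] + charge_amount)

chip = ContactlessChip("1234-5678", 1000)
print(server_charge(chip, 2000), chip.balance)  # OK 3000
```

The point of this variant is that the terminal never interprets the balance; it only forwards the read and write commands between server and chip.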
  • FIG. 16 is a sequence diagram illustrating an example of an information acquisition operation performed by the mobile terminal 2.
  • FIG. 17 is a diagram illustrating a state in which the information type is set by a change in the relative positional relationship between the electronic money card 1 and the mobile terminal 2.
  • This information type is the type of information requested to the electronic money server 3.
  • This information type indicates, for example, privilege information such as a coupon or a log.
  • the user brings the electronic money card 1 and the portable terminal 2 close to each other within a distance range in which short-range wireless communication is possible, and performs a predetermined instruction operation (for example, an operation instructing display of the balance of the electronic money card 1).
  • in response, the portable terminal 2 activates the camera 21 and requests log data from the electronic money card 1 via the short-range wireless communication unit 26 (in this embodiment, by a read command) (step S31).
  • the non-contact IC chip 1a of the electronic money card 1 reads the electronic money number together with the log data stored in the storage unit and transmits it to the mobile terminal 2 (step S32).
  • the portable terminal 2 acquires the electronic money number together with the log data from the electronic money card 1, and starts the process shown in FIG. 11 (step S33). Next, the portable terminal 2 displays the balance included in the acquired log data and information prompting the user to determine a desired information type on the display D (step S34).
  • the mobile terminal 2 may request balance data from the electronic money card 1 separately from the log data request, and acquire the balance data from the electronic money card 1. In this case, the balance indicated by the balance data may be displayed.
  • the user fits the edge of the electronic money card 1 in an area between the frame a and the frame b displayed on the display D, for example.
  • the relative positional relationship between the electronic money card 1 and the portable terminal 2 is adjusted.
  • the user changes the relative positional relationship between the electronic money card 1 and the portable terminal 2 so as to correspond to a desired information type.
  • for example, to fit the edge of the electronic money card 1 in the area within the frame a displayed on the display D (FIG. 17A), the electronic money card 1 and the portable terminal 2 are moved away from each other in the vertical direction.
  • conversely, to fit the edge of the electronic money card 1 in the area between the frame b and the frame c displayed on the display D (FIG. 17C), the electronic money card 1 and the portable terminal 2 are brought close to each other in the vertical direction.
  • then, in the displacement event detection and signal output processing for the target portion (started in step S33), a displacement event is detected by the change event detection unit 28b, and a signal corresponding to the event is output from the signal output unit 28e to the processing unit 28f.
  • the processing unit 28f of the mobile terminal 2 displays the information type corresponding to the signal on the display D in accordance with the output signal (step S35). For example, as shown in FIGS. 17A to 17C, "coupon", "balance ¥3,000", and "log" are switched and displayed.
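The frame-to-information-type mapping of FIGS. 17A to 17C can be sketched as follows. The frame coordinates and the function name are illustrative assumptions; the patent only specifies that the card edge falling within frame a, between a and b, or between b and c selects "coupon", "balance", or "log" respectively:

```python
# Minimal sketch: classify the detected card-edge position on the display
# into the region defined by frames a, b, c, and map it to an information
# type. The pixel values for the frame positions are made up for the sketch.
from typing import Optional

FRAME_A_Y = 100  # hypothetical vertical position of frame a on the display
FRAME_B_Y = 200  # hypothetical vertical position of frame b
FRAME_C_Y = 300  # hypothetical vertical position of frame c

def info_type_for_edge(edge_y: int) -> Optional[str]:
    if edge_y < FRAME_A_Y:
        return "coupon"   # edge within frame a (card moved away, FIG. 17A)
    if edge_y < FRAME_B_Y:
        return "balance"  # edge between frames a and b (FIG. 17B)
    if edge_y < FRAME_C_Y:
        return "log"      # edge between frames b and c (card close, FIG. 17C)
    return None           # edge outside the selectable regions

print(info_type_for_edge(80))   # coupon
print(info_type_for_edge(150))  # balance
print(info_type_for_edge(250))  # log
```

In the embodiment this classification is driven by the displacement events from the change event detection unit 28b rather than by raw coordinates, but the resulting selection among the three types is the same.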
  • the processing unit 28f of the mobile terminal 2 sets an information type corresponding to the output signal (step S36).
  • in response to a transmission instruction from the user, the processing unit 28f of the portable terminal 2 transmits information indicating the set information type and the electronic money number acquired from the non-contact type IC chip 1a of the electronic money card 1 to the electronic money server 3 via the mobile radio communication unit (step S37).
  • the electronic money server 3 searches the database DB using the electronic money number as a key (step S38).
  • the electronic money server 3 obtains necessary information by searching the database DB, and then performs user authentication processing (step S39). If the result of the authentication process is good, the electronic money server 3 acquires information corresponding to the received information type and transmits it to the mobile terminal 2 (step S40).
  • when the information type indicates privilege information and the user satisfies the privilege information issuance condition, privilege information for the user is issued, and the privilege information is transmitted to the mobile terminal 2 as information corresponding to the information type.
  • when the information type indicates a log, log data including the acquired electronic money number is acquired from the log database, and the log data is transmitted to the mobile terminal 2 as information corresponding to the information type.
  • the processing unit 28f of the mobile terminal 2 acquires the information (privilege information or log data) transmitted from the electronic money server 3 and displays the information corresponding to the set information type on the display D of the display unit 22 (step S41).
  • when the log data is acquired from the electronic money server 3, that log data and the log data acquired from the electronic money card 1 are integrated and displayed on the display D.
  • in this way, the privilege information display and the log data display can be switched by changing the relative positional relationship between the electronic money card 1 and the mobile terminal 2.
  • the present invention can be applied to other information processing systems.
  • the present invention can be applied to a payment processing system using a credit card or a payment processing system such as an online transaction.

Abstract

In a state in which a plate-like card (1) and a mobile terminal (2) are brought close to each other, this information processing device determines, in a reference image that is one of a series of images captured continuously at a prescribed imaging interval by a camera (21), a target region specifiable on the basis of the edge of the plate-like card (1) appearing in the reference image. The information processing device further detects an event indicating a change in the determined target region in any of a plurality of comparison images following the reference image, and outputs a signal corresponding to the detected displacement event or a related event.
PCT/JP2013/067548 2013-06-26 2013-06-26 Information processing method, information processing device, information processing program, and storage medium storing information processing program WO2014207844A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
JP2015523718A JP6033431B2 (ja) 2013-06-26 2013-06-26 Information processing method, information processing device, and information processing program
PCT/JP2013/067548 WO2014207844A1 (fr) 2013-06-26 2013-06-26 Information processing method, information processing device, information processing program, and storage medium storing information processing program

Publications (1)

Publication Number Publication Date
WO2014207844A1 true WO2014207844A1 (fr) 2014-12-31

Family

ID=52141250

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2020021504A (ja) * 2019-10-09 2020-02-06 NEC Platforms, Ltd. Information processing device, information processing method, and program

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2010079771A (ja) * 2008-09-29 2010-04-08 Hitachi Ltd Input device
JP2011215914A (ja) * 2010-03-31 2011-10-27 Ntt Docomo Inc Evaluation information generation device and evaluation information generation method

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7496212B2 (en) * 2003-05-16 2009-02-24 Hitachi Kokusai Electric Inc. Change detecting method and apparatus
JP2008244670A (ja) * 2007-03-26 2008-10-09 Funai Electric Co Ltd Imaging device
JP5101429B2 (ja) * 2008-08-11 2012-12-19 Secom Co Ltd Image monitoring device

Also Published As

Publication number Publication date
JPWO2014207844A1 (ja) 2017-02-23
JP6033431B2 (ja) 2016-11-30

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 13887676; Country of ref document: EP; Kind code of ref document: A1)
ENP Entry into the national phase (Ref document number: 2015523718; Country of ref document: JP; Kind code of ref document: A)
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: pct application non-entry in european phase (Ref document number: 13887676; Country of ref document: EP; Kind code of ref document: A1)