EP3364293A1 - Device and control method
- Publication number
- EP3364293A1 (application EP17187831.7A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- behavior
- user
- guidance
- skill level
- threshold
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Withdrawn
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F9/00—Arrangements for program control, e.g. control units
- G06F9/06—Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
- G06F9/44—Arrangements for executing specific programs
- G06F9/451—Execution arrangements for user interfaces
- G06F9/453—Help systems
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/0482—Interaction with lists of selectable items, e.g. menus
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/10—Processing, recording or transmission of stereoscopic or multi-view image signals
- H04N13/106—Processing image signals
- H04N13/128—Adjusting depth or disparity
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/10—Processing, recording or transmission of stereoscopic or multi-view image signals
- H04N13/106—Processing image signals
- H04N13/172—Processing image signals comprising non-image signal components, e.g. headers or format information
- H04N13/183—On-screen display [OSD] information, e.g. subtitles or menus
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/30—Image reproducers
- H04N13/366—Image reproducers using viewer tracking
- H04N13/373—Image reproducers using viewer tracking for tracking forward-backward translational head movements, i.e. longitudinal movements
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/30—Image reproducers
- H04N13/398—Synchronisation thereof; Control thereof
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/041—Indexing scheme relating to G06F3/041 - G06F3/045
- G06F2203/04101—2.5D-digitiser, i.e. digitiser detecting the X/Y position of the input means, finger or stylus, also when it does not touch, but is proximate to the digitiser's interaction surface and also measures the distance of the input means within a short range in the Z direction, possibly with a separate measurement setup
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/10—Processing, recording or transmission of stereoscopic or multi-view image signals
Definitions
- the present disclosure relates to a device and a control method.
- a device such as an automatic ticket machine is used by a user according to a prescribed procedure.
- some such devices output voice and the like to guide the user through the procedure.
- a behavior recognition automatic ticket machine recognizes the user's behavior using image processing and guides the user according to the user's behavior.
- a device includes an output control unit.
- when the time taken by a user to carry out first behavior is equal to or more than a first threshold, or when the number of times the first behavior is repeated is equal to or more than a second threshold, the output control unit outputs first guidance to lead the user to first expected behavior, that is, behavior the user is expected to carry out subsequent to the first behavior.
- when the time taken to carry out the first behavior is less than the first threshold, or when the number of times the first behavior is repeated is less than the second threshold, the output control unit omits the first guidance to lead the user to the first expected behavior, or outputs second guidance that is simpler than the first guidance.
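The threshold logic above can be sketched as follows; the function name and the concrete threshold values are illustrative assumptions, not taken from the disclosure.

```python
# Hypothetical sketch of the claimed threshold logic. FIRST_THRESHOLD_S
# and SECOND_THRESHOLD are assumed example values, not from the patent.
FIRST_THRESHOLD_S = 10.0   # assumed time threshold (first threshold), seconds
SECOND_THRESHOLD = 3       # assumed repetition-count threshold (second threshold)

def select_guidance(elapsed_s: float, repetitions: int) -> str:
    """Decide which guidance to output after observing the first behavior."""
    if elapsed_s >= FIRST_THRESHOLD_S or repetitions >= SECOND_THRESHOLD:
        return "first"   # detailed first guidance toward the expected behavior
    # the user appears proficient: omit guidance or use the simpler variant
    return "second"

print(select_guidance(12.0, 1))  # slow user -> "first"
print(select_guidance(3.0, 1))   # quick user -> "second"
```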
- FIG. 1 is a diagram illustrating an example of a hardware configuration of the device 100 of the first arrangement.
- the device 100 of the first arrangement includes a processor 101, an auxiliary storage device 102, a main storage device 103, a camera 104, a display device 105, an input device 106, and a speaker 107.
- the device 100 may be any device.
- the device 100 may be an automatic ticket machine.
- the processor 101 reads out a computer program from a storage medium such as the auxiliary storage device 102 and executes the computer program.
- the auxiliary storage device 102 stores therein information such as a computer program.
- the auxiliary storage device 102 may be any device.
- the auxiliary storage device 102 may be a hard disk drive (HDD).
- the main storage device 103 is a storage area used as a work area by the processor 101.
- the camera 104 acquires a plurality of images in time series, by taking images of a user of the device 100.
- the camera 104 may be any device.
- the camera 104 may be a visible light camera and a depth image camera.
- the camera 104 is installed at a location where the camera 104 can take images of the action taken by the user who is using the device 100.
- the camera 104 continuously takes images of the action taken by the user, from when the user starts using the device 100 until the user finishes using the device 100.
- the device 100 may include a plurality of the cameras 104.
- the cameras 104 take images of the user from different locations and at different angles, to thereby take images of the front, back, hands, and the like of the user, for example.
- the action taken by the user of the device 100 can be taken as an image including depth information.
- the display device 105 displays information that is offered to the user of the device 100 and the like.
- the display device 105 may be any device.
- the display device 105 may be a liquid crystal display.
- the input device 106 receives an operational input from the user of the device 100.
- the input device 106 may be a hardware key.
- the display device 105 and the input device 106 may also be a liquid crystal touch panel or the like that has both display and input functions.
- the speaker 107 outputs voice guidance and the like to the user of the device 100.
- FIG. 2 is a diagram illustrating an example of a functional configuration of the device 100 of the first arrangement.
- the device 100 of the first arrangement includes an imaging unit 1, an acquiring unit 2, a recognizing unit 3, a first determining unit 4a, a second determining unit 4b, an output control unit 5, and a storage unit 6.
- the imaging unit 1 acquires images in time series, by taking images of the user of the device 100.
- the imaging unit 1 may be implemented by the camera 104.
- upon acquiring the images taken by the imaging unit 1, the acquiring unit 2 supplies the images to the recognizing unit 3, the first determining unit 4a, and the second determining unit 4b.
- the acquiring unit 2 may be implemented using a computer program executed by the processor 101.
- the acquiring unit 2 may be implemented by hardware such as an integrated circuit (IC).
- upon receiving the images from the acquiring unit 2, the recognizing unit 3 recognizes the user's behavior from the images.
- the recognizing unit 3 supplies behavior information indicating the recognized behavior to the first determining unit 4a and the second determining unit 4b.
- the recognizing unit 3 may be implemented using a computer program executed by the processor 101.
- the recognizing unit 3 may be implemented by hardware such as the IC.
- the first determining unit 4a determines the skill level of the user, from at least one of the images and the behavior information.
- the data format for the skill level may be any format.
- the skill level may be a numerical value indicating the degree of skill level.
- the skill level may be indicated by numerical values from 1 to 10.
- the initial value of the skill level may be set to 5.
- the first determining unit 4a may determine the skill level of the user, by adding or subtracting the skill level according to the user's behavior.
- the skill level may be expressed by binary values (0: low and 1: high).
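The numerical 1-to-10 format described above, with an initial value of 5 adjusted up or down according to the user's behavior, can be sketched as follows. The class and method names, and the clamping to the 1-10 range, are illustrative assumptions.

```python
# Minimal sketch of the skill-level bookkeeping described in the text.
# The 1-10 range and initial value 5 are from the disclosure; the class
# shape and the clamping behavior are assumptions for illustration.
class SkillLevel:
    MIN, MAX, INITIAL = 1, 10, 5

    def __init__(self):
        self.value = self.INITIAL

    def adjust(self, delta: int) -> int:
        """Add or subtract according to the user's behavior, kept in range."""
        self.value = max(self.MIN, min(self.MAX, self.value + delta))
        return self.value

s = SkillLevel()
print(s.adjust(-1))  # hesitant behavior -> 4
print(s.adjust(+3))  # smooth behavior  -> 7
```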
- the first determining unit 4a supplies skill level information indicating the determined skill level, to the second determining unit 4b.
- upon receiving the behavior information from the recognizing unit 3 and the skill level information from the first determining unit 4a, the second determining unit 4b determines guidance to be output from the behavior information and the skill level information. The second determining unit 4b supplies guidance information indicating the determined guidance to the output control unit 5.
- the first determining unit 4a and the second determining unit 4b may be implemented using a computer program executed by the processor 101.
- the first determining unit 4a and the second determining unit 4b may be implemented by hardware such as the IC.
- the first determining unit 4a and the second determining unit 4b may be implemented using a single functional block.
- upon receiving the guidance information from the second determining unit 4b, the output control unit 5 outputs the guidance information.
- the output control unit 5 may be implemented using a computer program executed by the processor 101.
- the output control unit 5 may be implemented by hardware such as the IC.
- the storage unit 6 stores therein information.
- the storage unit 6 may be implemented by the auxiliary storage device 102 and the main storage device 103.
- the information to be stored in the storage unit 6 is guidance information in which a set of user's behavior and skill level is associated with the guidance for the device 100.
- the guidance information is formed so that the guidance for the device 100 is retrieved using the set of behavior and skill level as a search key. Hence, even when the recognizing unit 3 recognizes the behaviors to be the same, different guidance will be retrieved if the skill levels are different.
- the guidance for the device 100 that is stored in the guidance information is not limited to guidance such as sound to be output and a text to be output.
- the guidance for the device 100 that is to be stored in the guidance information may also be information on operation for helping the user with a skill level of less than a threshold (fourth threshold). For example, the operation for helping the user may be "call a person in charge" and the like.
- the output control unit 5 calls a person in charge by notifying another device, such as a terminal used by the person in charge, through a network.
- the guidance for the device 100 that is stored in the guidance information may be information on operation that does not obstruct the operation performed by the user with a skill level of equal to or more than a threshold.
- the operation that does not obstruct the operation performed by the user may be "not outputting guidance".
- the data format of the guidance information may be any format.
- the guidance information may be divided into a database that stores behaviors and types of guidance, and a database that stores sets of skill levels and types of guidance, as well as the guidance for the device 100.
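The keying scheme described above, in which the set of behavior and skill level is the search key, can be sketched as a simple lookup table. The table contents below are invented examples; only the (behavior, skill level) keying comes from the text.

```python
# Sketch of the guidance-information lookup described in the text.
# Keys are (behavior, skill level) pairs; values are the guidance for
# the device 100. Entries here are illustrative, not from the patent.
LOW, HIGH = 0, 1  # binary skill levels, as mentioned in the text

GUIDANCE = {
    ("purchase_complete", LOW):  "Please take your receipt from the output "
                                 "port located at the right bottom of the screen.",
    ("purchase_complete", HIGH): "Please take your receipt.",
    ("idle_at_menu", LOW):       "call a person in charge",
    ("idle_at_menu", HIGH):      None,  # do not output guidance
}

def lookup(behavior: str, skill: int):
    """Retrieve guidance using the set of behavior and skill level as key."""
    return GUIDANCE.get((behavior, skill))

print(lookup("purchase_complete", HIGH))  # prints "Please take your receipt."
```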
- FIG. 3 is a flowchart illustrating an operational example of the device 100 of the first arrangement.
- the acquiring unit 2 acquires images taken by the imaging unit 1 (step S1).
- the recognizing unit 3 recognizes the user's behavior from the images obtained through the process at step S1 (step S2). More specifically, the recognizing unit 3 extracts a feature amount of each frame of an image, and a feature amount specified by the preceding and subsequent frames.
- the feature vector of each frame includes coordinate information in an image of the characteristic parts of the user's body.
- the coordinate information indicates points on a plane or in space.
- the characteristic part of the user's body includes a part of the body that is detected as an edge in an image, such as an eye of the user.
- the feature vector of each frame includes information on appearance in an image that is specified by gradient information in the image.
- the feature vector specified by the preceding and subsequent frames is a feature indicating the movement of the user included in the preceding and subsequent frames.
- the recognizing unit 3 recognizes the user's behavior using dynamics, i.e., the trajectories of the feature vector over time. More specifically, for example, the recognizing unit 3 recognizes the most plausible behavior by comparing the dynamics in the extracted feature vector with a model that is prepared in advance to recognize the behavior. Moreover, for example, the recognizing unit 3 compares the dynamics in the extracted feature vector with change patterns of predetermined feature vectors. The recognizing unit 3 then specifies the change pattern of the feature vector that is closest to the dynamics in the extracted feature vector, and recognizes the behavior pattern that is associated in advance with the specified change pattern as the user's behavior. For example, the behavior pattern may be "insert a coin" or "touching the liquid crystal touch panel".
- the recognizing unit 3 recognizes the series of behaviors, using a plurality of behavior patterns that are recognized in time series.
- information indicating the series of behaviors includes time between the behavior of a user and the subsequent behavior of the user.
- the information indicating the series of behaviors includes a set of behavior of a user and the subsequent behavior of the user.
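The closest-change-pattern matching described above can be sketched as a nearest-neighbor search over stored trajectories. The Euclidean distance, the 2-D trajectories, and the example patterns are assumptions for illustration; the disclosure does not fix a distance measure.

```python
# Hedged sketch of the pattern matching described in the text: compare
# the dynamics (trajectory) of the extracted feature vector against
# stored change patterns and return the closest one's behavior label.
import math

def trajectory_distance(traj_a, traj_b):
    """Sum of pointwise Euclidean distances between equal-length trajectories."""
    return sum(math.dist(p, q) for p, q in zip(traj_a, traj_b))

def recognize(trajectory, patterns):
    """Return the behavior label whose stored change pattern is closest."""
    return min(patterns, key=lambda label: trajectory_distance(trajectory, patterns[label]))

# Invented example change patterns keyed by behavior pattern.
PATTERNS = {
    "insert a coin": [(0, 0), (1, 1), (2, 0)],
    "touching the liquid crystal touch panel": [(0, 0), (0, 2), (0, 4)],
}
print(recognize([(0, 0), (1, 1), (2, 1)], PATTERNS))  # prints "insert a coin"
```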
- the first determining unit 4a determines the skill level of the user, from at least one of the images acquired through the process at step S1, and the behavior and the series of behaviors that are recognized through the process at step S2 (step S3).
- a method of determining the skill level may be any method. Hereinafter, the method of determining the skill level will be described.
- the first determining unit 4a determines the skill level, by performing regression and classification on the skill level of the user and the like, by extracting the feature vector described above from the images and using the dynamics in the feature vector.
- the first determining unit 4a determines the skill level, by performing regression and classification on the skill level of the user and the like, using the behavior and the series of behaviors that are recognized by the recognizing unit 3. More specifically, the first determining unit 4a performs regression and classification on the skill level of the user, using the relation in time length between the time taken to carry out each behavior and a predetermined time set in advance for each behavior. Moreover, for example, the first determining unit 4a performs regression and classification on the skill level of the user, using the relation in time length between an interval between the series of behaviors and a predetermined time set in advance for each of the series. Moreover, for example, the first determining unit 4a performs regression and classification on the skill level of the user, using the number of times each behavior is repeated.
- the predetermined time described above may not be determined according to the behavior and the series of behaviors. For example, the predetermined time described above may be uniformly determined. Moreover, for example, the predetermined time described above may be adjusted according to the skill level of the user that is decided up to the present time.
- the first determining unit 4a subtracts a certain value from the skill level of the user, or adds a certain value to the skill level of the user, according to the user's behavior.
- the certain value may be one.
- the first determining unit 4a determines the skill level to have a value less than a threshold (third threshold).
- the first determining unit 4a sets the skill level to a smaller value, with an increase in time between the first behavior and second behavior that is behavior carried out by the user subsequent to the first behavior. For example, the first determining unit 4a increases a subtraction value of the skill level of the user, with an increase in time between the first behavior and the second behavior.
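The time-dependent adjustment described above, in which the subtraction value grows with the gap between the first and second behavior, can be sketched as follows. The linear rule, the function name, and the assumed per-behavior expected time are illustrative; the disclosure only states that the subtraction increases with the gap.

```python
# Sketch of the time-gap adjustment described in the text: the longer
# the gap between the first and second behavior, the larger the value
# subtracted from the skill level. Constants are assumed for illustration.
EXPECTED_GAP_S = 5.0  # assumed predetermined time for the transition, seconds

def skill_delta(gap_s: float) -> int:
    """Positive for a quick transition, increasingly negative for a slow one."""
    if gap_s < EXPECTED_GAP_S:
        return +1                          # quicker than expected: add
    return -int(gap_s // EXPECTED_GAP_S)   # subtraction grows with the gap

print(skill_delta(3.0))   # quick transition -> +1
print(skill_delta(12.0))  # slow transition  -> -2
```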
- the first determining unit 4a may determine one or both of the threshold for time (first threshold) and the threshold for the number of times (second threshold).
- the second determining unit 4b determines the guidance for the device 100 that is to be output from the output control unit 5, from the behavior recognized through the process at step S2 and the skill level determined through the process at step S3 (step S4). More specifically, the second determining unit 4b determines the guidance for the device 100 that is retrieved from the guidance information described above, by using the set of behavior and skill level as a search key, as the guidance for the device 100 to be output from the output control unit 5.
- the output control unit 5 controls the output of the guidance that is determined by the second determining unit 4b (step S5).
- the output control unit 5 outputs the guidance to the display device 105 and the speaker 107.
- when the skill level after the first behavior is carried out is less than the threshold (third threshold), the output control unit 5 outputs first guidance to lead the user to first expected behavior, that is, behavior the user is expected to carry out subsequent to the first behavior.
- when the skill level is equal to or more than the threshold (third threshold), the output control unit 5 omits the guidance to lead the user to the first expected behavior, or outputs second guidance that is simpler than the first guidance.
- the first expected behavior is to "take the receipt".
- the first guidance is "please take your receipt from the output port located at the right bottom of the screen".
- the second guidance is "please take your receipt", "please don't forget to take your receipt", or the like.
- when the skill level after the second behavior is carried out is less than the threshold (third threshold), the output control unit 5 outputs third guidance to lead the user to second expected behavior, that is, behavior the user is expected to carry out subsequent to the second behavior.
- when the skill level is equal to or more than the threshold (third threshold), the output control unit 5 omits the guidance to lead the user to the second expected behavior, or outputs fourth guidance that is simpler than the third guidance.
- the second determining unit 4b determines the guidance for the device 100 based on the behavior recognized by the recognizing unit 3 through the process at step S2 and the skill level determined by the first determining unit 4a through the process at step S3.
- the second determining unit 4b may determine the guidance for the device 100 only based on the behavior recognized by the recognizing unit 3.
- the recognizing unit 3 may recognize the user's behavior, not only by the images taken by the camera 104, but also by using an operational input acquired through the input device 106 and the like. For example, the recognizing unit 3 may also recognize that the user's behavior is "pressing the receipt issuing button", when the receipt issuing button is pressed.
- when the time taken to carry out the first behavior is equal to or more than the first threshold, or when the number of times the first behavior is repeated is equal to or more than the second threshold, the output control unit 5 outputs the first guidance to lead the user to the first expected behavior, that is, behavior the user is expected to carry out subsequent to the first behavior.
- when the time taken to carry out the first behavior is less than the first threshold, or when the number of times the first behavior is repeated is less than the second threshold, the output control unit 5 omits the guidance to lead the user to the first expected behavior, or outputs the second guidance that is simpler than the first guidance.
- with the device 100 of the first arrangement, it is possible to control the guidance for the device 100 according to the skill level of the user. More specifically, it is possible to carry out the most appropriate guidance according to the usage status of the device 100 and the skill level of the user. Consequently, it is possible to prevent guidance that may place a psychological burden on the user.
- the skill level is further determined based on a deviation indicating whether the response of the user at the second time is desirable as a response to the guidance that has been offered to the user at the first time.
- the first difference of the second arrangement from the first arrangement is that the deviation is used to determine the skill level at the second time. It is to be noted that the second time is later than the first time.
- the interval between the first time and the second time may be any interval.
- the second difference of the second arrangement from the first arrangement is that the expected behavior, that is, behavior the user is expected to carry out according to the guidance for the device 100, is also stored in the guidance information described above in an associated manner.
- the behavior, the skill level, the guidance, and the expected behavior are stored in an associated manner.
- FIG. 4 is a diagram illustrating an example of a functional configuration of the device 100 of a second arrangement.
- the device 100 of the second arrangement includes the imaging unit 1, the acquiring unit 2, the recognizing unit 3, the first determining unit 4a, the second determining unit 4b, a third determining unit 4c, the output control unit 5, and the storage unit 6.
- the third determining unit 4c is further added to the functional configuration of the device 100 of the first arrangement.
- the third determining unit 4c may be implemented using a computer program executed by the processor 101. Moreover, for example, the third determining unit 4c may be implemented by hardware such as the IC. Moreover, the first determining unit 4a, the second determining unit 4b, and the third determining unit 4c may be implemented using a single functional block.
- the third determining unit 4c determines a deviation between the first expected behavior that is behavior the user is expected to carry out subsequent to the first behavior and the second behavior.
- the first expected behavior is expected behavior associated with the first behavior that is included in the guidance information described above.
- the method of determining the deviation may be any method.
- the third determining unit 4c determines the deviation from a difference between the feature amount that indicates the first behavior described above, and the feature amount indicating the first expected behavior described above.
- the first determining unit 4a further determines the skill level of the user based on the deviation determined by the third determining unit 4c. For example, the first determining unit 4a sets the skill level to a smaller value with an increase in the deviation.
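The deviation described above, a difference between the feature amount of the observed second behavior and that of the first expected behavior, can be sketched as follows. Representing the feature amounts as vectors and using a Euclidean norm, along with the tolerance-based adjustment rule, are assumptions; the disclosure allows any method of determining the deviation.

```python
# Sketch of the deviation described in the text: a distance between the
# feature amount of the observed behavior and that of the expected
# behavior, with a larger deviation lowering the skill level.
import math

def deviation(observed, expected):
    """Euclidean distance between two feature vectors (assumed representation)."""
    return math.dist(observed, expected)

def adjust_skill(skill: int, dev: float, tolerance: float = 1.0) -> int:
    """Lower the skill level as the deviation grows past an assumed tolerance."""
    return skill - int(dev // tolerance)

dev = deviation([0.0, 2.0], [0.0, 0.0])
print(dev)                   # 2.0
print(adjust_skill(5, dev))  # 5 - 2 = 3
```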
- FIG. 5 is a flowchart illustrating an operational example of the device 100 of the second arrangement.
- the first behavior that is carried out at the first time is behavior that is first recognized by the recognizing unit 3.
- for the first behavior at the start of use, the third determining unit 4c does not perform the process at step S2-2. Because the operation on the first behavior at the start of use is the same as the operation of the device 100 in the first arrangement (see FIG. 3), the explanation thereof will be omitted.
- the acquiring unit 2 acquires images taken by the imaging unit 1 (step S1).
- the recognizing unit 3 recognizes the second behavior of the user, from the images acquired through the process at step S1 (step S2-1).
- the third determining unit 4c determines the deviation between the first expected behavior that is behavior the user is expected to carry out subsequent to the first behavior, and the second behavior that is recognized through the process at step S2-1 (step S2-2). It is to be noted that the third determining unit 4c specifies the first expected behavior, by acquiring the expected behavior associated with the first guidance that is determined through the process on the first behavior performed at the first time, from the guidance information described above.
- the first determining unit 4a determines the skill level of the user, from at least one of the images acquired through the process at step S1, the behavior and the series of behaviors that are recognized through the process at step S2-1, and the deviation determined through the process at step S2-2 (step S3). For example, the first determining unit 4a sets the skill level to a smaller value, with an increase in the deviation between the first expected behavior and the second behavior.
- because step S4 and step S5 are the same as those of the operational method of the device 100 in the first arrangement (see FIG. 3), the explanation thereof will be omitted.
- the first determining unit 4a determines the skill level to have a smaller value, with an increase in the deviation between the first expected behavior and the second behavior.
- the behavior of the user at the second time in response to the guidance at the first time can be taken into consideration to determine the skill level at the second time. Consequently, it is possible to control the guidance in which the user's behavior in response to the guidance is taken into consideration.
- a problem illustrated in FIG. 6A may be considered as a problem to be solved by the second arrangement.
- the user with a high skill level who has completed the purchasing procedure using the device 100 recognizes in advance that the next behavior to be taken is an action of "taking the receipt".
- the user starts the action of "taking the receipt", immediately after the end of use announcement takes place.
- the device 100 outputs guidance such as an announcement for "prompting the user to take the receipt".
- in the second arrangement, when the action of "taking the receipt" is confirmed as the response of the user to the "end of use" announcement, it is possible to control the guidance so as to cancel the announcement for "prompting the user to take the receipt". Consequently, with the device 100 of the second arrangement, it is possible to prevent guidance that may place a psychological burden on the user.
- a problem illustrated in FIG. 6B may be considered as a problem to be solved by the second arrangement. It is assumed that the user who has been guided by the announcement for "prompting the user to take the receipt" or the like by the device 100, starts the action of "taking the receipt". However, it is useless to repeat the announcement to the user with a low skill level who cannot start the above action in response to the announcement. With the device 100 of the second arrangement, it is possible to control the guidance so as to change the announcement to a more specific instruction or to change the way of handling such as to send a person in charge, to the user who is not carrying out the behavior the user is expected to take as a response to the announcement. Consequently, with the device 100 of the second arrangement, it is possible to prevent the guidance that may place a psychological burden on the user.
- the third arrangement will be described.
- the same description as that in the first arrangement will be omitted, and different points from the first arrangement will be described.
- the skill level is further determined based on the usage history of the user.
- the first difference of the third arrangement from the first arrangement is that the device 100 reads out the user information of the user.
- the second difference of the third arrangement from the first arrangement is that the device 100 stores the usage history of the user.
- FIG. 7 is a diagram illustrating an example of a hardware configuration of the device 100 of the third arrangement.
- the device 100 of the third arrangement includes the processor 101, the auxiliary storage device 102, the main storage device 103, the camera 104, the display device 105, the input device 106, the speaker 107, and a reading device 108.
- the reading device 108 is further added to the hardware configuration of the device 100 of the first arrangement.
- the reading device 108 reads out the user information of the user.
- the reading device 108 may be an IC card reader.
- the user information is information relating to the user.
- the user information at least includes user identification information for identifying the user.
- the auxiliary storage device 102 further stores therein the usage history of the device 100 by the user.
- the usage history is recorded when the user uses the device 100.
- the usage history may include the user identification information described above, time and date of use, skill level, and the like.
- the time and date of use is the time and date when the user has used the device 100.
- the skill level is the skill level determined when the user has used the device 100.
- FIG. 8 is a diagram illustrating an example of a functional configuration of the device 100 of the third arrangement.
- the device 100 of the third arrangement includes the imaging unit 1, the acquiring unit 2, the recognizing unit 3, the first determining unit 4a, the second determining unit 4b, the output control unit 5, the storage unit 6, and a reading unit 7.
- the reading unit 7 is further added to the functional configuration of the device 100 of the first arrangement.
- the reading unit 7 reads out the user information described above.
- the reading unit 7 may be implemented by the reading device 108.
- the storage unit 6 further stores therein the usage history described above.
- FIG. 9 is a flowchart illustrating an operational example of the device 100 of the third arrangement.
- the acquiring unit 2 acquires the images taken by the imaging unit 1 (step S1-1).
- the reading unit 7 reads out the user information described above (step S1-2).
- the recognizing unit 3 recognizes the user's behavior from the images acquired through the process at step S1-1 (step S2).
- the first determining unit 4a determines the skill level of the user, from at least one of the images acquired through the process at step S1-1, the user information read out through the process at step S1-2, and the behavior and the series of behaviors that are recognized through the process at step S2 (step S3).
- the method of determining the skill level may be any method.
- an example of the method of determining the skill level will be described.
- at the start of use, the first determining unit 4a determines the skill level of the user from the skill level read out from the usage history, using the user identification information included in the user information.
- the first determining unit 4a determines the skill level of the user to be the skill level read out from the usage history.
- the first determining unit 4a determines the skill level of the user, from at least one of the images acquired through the process at step S1-1, and the behavior and the series of behaviors that are recognized through the process at step S2.
- the first determining unit 4a determines the skill level based on the skill level read out from the usage history, and the skill level that is determined from at least one of the images acquired through the process at step S1-1 and the behavior and the series of behaviors that are recognized through the process at step S2.
- the first determining unit 4a may reflect the skill level read out from the usage history in determining the skill level with a certain influence degree, or may attenuate the influence degree of the skill level read out from the usage history over time.
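as one way to realize the attenuation mentioned above, the historical skill level could be blended with the newly observed one using a weight that decays with the time since the last use. The function below is a hypothetical sketch rather than the arrangement's actual implementation; the half-life parameter is an assumed tuning knob.

```python
def blend_skill(history_skill, observed_skill, days_since_last_use,
                half_life_days=30.0):
    """Blend the skill level stored in the usage history with the newly
    observed skill level, attenuating the historical influence over time."""
    # The influence of the stored skill level halves every `half_life_days`.
    weight = 0.5 ** (days_since_last_use / half_life_days)
    return weight * history_skill + (1.0 - weight) * observed_skill
```

immediately after the last use the stored skill level dominates; after many half-lives the newly observed behavior dominates, matching the "attenuate with time" idea.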
- the skill level determined through the process at step S3 is stored in the usage history of the storage unit 6, at the point when the user has completed a series of operations.
- since step S4 and step S5 are the same as those in the operational method of the device 100 in the first arrangement (see FIG. 3 ), the explanation thereof will be omitted.
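the steps above can be sketched as a single pass over the functional units of FIG. 8. The method names used here (acquire, read_user_info, recognize, determine, decide, output, lookup) are hypothetical stand-ins for the unit interfaces, not an API defined by the arrangements.

```python
def run_once(imaging, reader, recognizer, determiner, controller, storage):
    """One pass of the operational flow of FIG. 9 (steps S1-1 to S5)."""
    images = imaging.acquire()                    # step S1-1: take images
    user_info = reader.read_user_info()           # step S1-2: e.g. IC card read
    behavior = recognizer.recognize(images)       # step S2: recognize behavior
    # look up the usage history only when user information was read out
    history = storage.lookup(user_info) if user_info else None
    skill = determiner.determine(images, user_info, behavior, history)  # step S3
    guidance = controller.decide(skill, behavior)  # step S4: choose guidance
    controller.output(guidance)                    # step S5: output guidance
    return skill   # stored into the usage history when the operations complete
```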
- the storage unit 6 stores therein the usage history of the device 100 by the user.
- the first determining unit 4a then further determines the skill level of the user based on the usage history.
- with the device 100 of the third arrangement, it is possible to use the skill level of the user that was determined when the device 100 was last used by the user. Consequently, it is possible to appropriately control the guidance even when the user is not yet operating the device 100, such as at the start of use. In other words, with the device 100 of the third arrangement, it is possible to prevent guidance that may place a psychological burden on the user.
- the user information described above may also include the identification information for identifying the user and the skill level of the user.
- the first determining unit 4a determines the skill level of the user from the skill level included in the user information.
- the device 100 of the first to third arrangements described above may be implemented by a computer including a general-purpose processor 101.
- all or a part of the functions of the device 100 described above (see FIG. 2 , FIG. 4 , and FIG. 8 ) that can be implemented by a computer program may be implemented by causing the general-purpose processor 101 to execute the computer program.
- the computer program executed by the device 100 of the first to third arrangements is provided as a computer program product by being recorded in a computer-readable recording medium such as a compact disc-read only memory (CD-ROM), a memory card, a compact disc-recordable (CD-R), and a digital versatile disc (DVD) in an installable or executable file format.
- the computer program executed by the device 100 of the first to third arrangements may also be stored in a computer connected to a network such as the Internet, and provided by being downloaded via the network.
- the computer program executed by the device 100 of the first to third arrangements may also be provided via a network such as the Internet without being downloaded.
- the computer program executed by the device 100 of the first to third arrangements may also be provided by incorporating the computer program into a read-only memory (ROM) and the like in advance.
- a part of the functions of the device 100 of the first to third arrangements may be implemented by hardware such as the IC.
- the IC may be a dedicated processor 101 that executes a predetermined process.
- the device 100 may also include a plurality of the processors 101.
- each of the processors 101 may implement one of the functions or may implement two or more of the functions.
- the operational mode of the device 100 of the first to third arrangements may be any mode.
- the functions of the device 100 of the first to third arrangements may be operated as a cloud system on the network.
- Example 1 A device includes an output control unit. When time taken by a user to carry out first behavior is equal to or more than a first threshold, or when number of times the first behavior is repeated is equal to or more than a second threshold, the output control unit outputs first guidance to lead the user to first expected behavior that is behavior the user is expected to carry out subsequent to the first behavior.
- Example 2 A device includes a determining unit and an output control unit.
- the determining unit is configured to determine a skill level of a user from first behavior of the user.
- when the skill level is less than a third threshold, the output control unit outputs first guidance to lead the user to first expected behavior that is behavior the user is expected to carry out subsequent to the first behavior.
- when the skill level is equal to or more than the third threshold, the output control unit omits the first guidance to lead the user to the first expected behavior or outputs second guidance that is simpler than the first guidance.
- Example 3 In the device according to Example 2, when time taken to carry out the first behavior is equal to or more than a first threshold, or when number of times the first behavior is repeated is equal to or more than a second threshold, the determining unit determines the skill level to have a value less than the third threshold.
- Example 4 The device according to Example 2 further includes a recognizing unit configured to recognize the first behavior of the user.
- Example 5 In the device according to Example 4, the recognizing unit further recognizes second behavior carried out by the user subsequent to the first behavior.
- the determining unit determines the skill level to have a smaller value with an increase in time between the first behavior and the second behavior.
- the output control unit outputs third guidance to lead the user to second expected behavior that is behavior the user is expected to carry out subsequent to the second behavior, when the skill level is less than the third threshold; and omits the third guidance to lead the user to the second expected behavior, or changes to fourth guidance that is simpler than the third guidance, when the skill level is equal to or more than the third threshold.
- Example 6 In the device according to Example 5, the determining unit determines the skill level to have a smaller value, with an increase in a deviation between the first expected behavior and the second behavior.
- Example 7 In the device according to Example 2, when the skill level is equal to or less than a fourth threshold, the output control unit notifies another device that the device is used by a user with a low skill level.
- Example 8 In the device according to Example 4, the recognizing unit recognizes the first behavior based on at least one of an image including the user and an operational input by the user.
- Example 9 The device according to Example 2 further includes a reading unit configured to read out user information including identification information for identifying the user and the skill level of the user. The determining unit determines the skill level of the user based on the skill level included in the user information, when the reading unit reads out the user information.
- Example 10 The device according to Example 2 further includes a storage unit configured to store therein usage history of the device by the user.
- the determining unit further determines the skill level of the user based on the usage history.
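examples 5 and 6 above describe the skill level as decreasing monotonically both with the time between the first and second behavior and with the deviation from the first expected behavior. One hypothetical scoring function with this property is shown below; the scale parameters and the (0, 1] range are assumptions for illustration only.

```python
def skill_from_observation(gap_seconds, deviation,
                           time_scale=10.0, deviation_scale=1.0):
    """Map the time gap between the first and second behavior and the
    deviation from the first expected behavior to a skill level in (0, 1].
    A larger gap or a larger deviation yields a smaller skill level."""
    penalty = gap_seconds / time_scale + deviation / deviation_scale
    return 1.0 / (1.0 + penalty)
```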
- Example 11 A control method includes, when time taken by a user to carry out first behavior is equal to or more than a first threshold, or when number of times the first behavior is repeated is equal to or more than a second threshold, outputting first guidance to lead the user to first expected behavior that is behavior the user is expected to carry out subsequent to the first behavior; and when the time taken to carry out the first behavior is less than the first threshold, or when the number of times the first behavior is repeated is less than the second threshold, omitting the first guidance to lead the user to the first expected behavior or outputting second guidance that is simpler than the first guidance.
- Example 12 A control method includes determining a skill level of a user from first behavior of the user; when the skill level is less than a third threshold, outputting first guidance to lead the user to first expected behavior that is behavior the user is expected to carry out subsequent to the first behavior; and when the skill level is equal to or more than the third threshold, omitting the first guidance to lead the user to the first expected behavior or outputting second guidance that is simpler than the first guidance.
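the two branches of this control method, together with the threshold condition of Example 13 below, can be sketched as a pair of checks. The concrete threshold values and guidance strings here are illustrative placeholders, not values defined by the arrangements.

```python
def determine_skill(elapsed_time, repeat_count,
                    first_threshold, second_threshold, third_threshold):
    """Example 13: a first behavior that takes too long or is repeated too
    often yields a skill level below the third threshold."""
    if elapsed_time >= first_threshold or repeat_count >= second_threshold:
        return third_threshold - 1  # any value less than the third threshold
    return third_threshold          # any value not less than the third threshold

def select_guidance(skill_level, third_threshold):
    """Example 12: full first guidance for a low skill level; otherwise the
    guidance is simplified (or could be omitted entirely)."""
    if skill_level < third_threshold:
        return "first guidance (full explanation)"
    return "second guidance (simpler)"
```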
- Example 13 In the method according to Example 12, when time taken to carry out the first behavior is equal to or more than a first threshold, or when number of times the first behavior is repeated is equal to or more than a second threshold, the determining includes determining the skill level to have a value less than the third threshold.
- Example 14 The method according to Example 12 further includes recognizing the first behavior of the user.
- Example 15 In the method according to Example 14, the recognizing further includes recognizing second behavior carried out by the user subsequent to the first behavior.
- the determining includes determining the skill level to have a smaller value with an increase in time between the first behavior and the second behavior.
- the outputting includes outputting third guidance to lead the user to second expected behavior that is behavior the user is expected to carry out subsequent to the second behavior, when the skill level is less than the third threshold; and omitting the third guidance to lead the user to the second expected behavior, or changing to fourth guidance that is simpler than the third guidance, when the skill level is equal to or more than the third threshold.
- Example 16 In the method according to Example 15, the determining includes determining the skill level to have a smaller value, with an increase in a deviation between the first expected behavior and the second behavior.
- Example 17 The method according to Example 12 further includes notifying another device that a device is used by a user with a low skill level when the skill level is equal to or less than a fourth threshold.
- Example 18 In the method according to Example 14, the recognizing includes recognizing the first behavior based on at least one of an image including the user and an operational input by the user.
- Example 19 The method according to Example 12 further includes reading out user information including identification information for identifying the user and the skill level of the user.
- the determining includes determining the skill level of the user based on the skill level included in the user information, when the reading out reads out the user information.
- Example 20 The method according to Example 12 further includes storing usage history of a device by the user in a memory.
- the determining includes determining the skill level of the user based on the usage history.
- Example 21 A computer program causes a computer to function as an output control unit. When time taken by a user to carry out first behavior is equal to or more than a first threshold, or when number of times the first behavior is repeated is equal to or more than a second threshold, the output control unit outputs first guidance to lead the user to first expected behavior that is behavior the user is expected to carry out subsequent to the first behavior.
- Example 22 A computer program causes a computer to function as a determining unit and an output control unit.
- the determining unit determines a skill level of a user from first behavior of the user.
- when the skill level is less than a third threshold, the output control unit outputs first guidance to lead the user to first expected behavior that is behavior the user is expected to carry out subsequent to the first behavior.
- when the skill level is equal to or more than the third threshold, the output control unit omits the first guidance to lead the user to the first expected behavior or outputs second guidance that is simpler than the first guidance.
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Human Computer Interaction (AREA)
- Software Systems (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- User Interface Of Digital Computer (AREA)
- Image Analysis (AREA)
- Control Of Vending Devices And Auxiliary Devices For Vending Devices (AREA)
- Navigation (AREA)
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2017028314A JP2018133050A (ja) | 2017-02-17 | 2017-02-17 | 機器、制御方法及びプログラム |
Publications (1)
Publication Number | Publication Date |
---|---|
EP3364293A1 true EP3364293A1 (fr) | 2018-08-22 |
Family
ID=59799211
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP17187831.7A Withdrawn EP3364293A1 (fr) | 2017-02-17 | 2017-08-24 | Dispositif et procédé de commande |
Country Status (3)
Country | Link |
---|---|
US (1) | US20180239458A1 (fr) |
EP (1) | EP3364293A1 (fr) |
JP (1) | JP2018133050A (fr) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110362352A (zh) * | 2019-06-14 | 2019-10-22 | 深圳市富途网络科技有限公司 | 一种引导信息的展示方法及装置 |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP0292934A2 (fr) * | 1987-05-25 | 1988-11-30 | Fujitsu Limited | Système de commande de l'affichage de messages destinés à guider l'opérateur |
EP0913798A2 (fr) * | 1997-10-31 | 1999-05-06 | Ncr International Inc. | Méthode et système pour surveiller et améliorer l'efficacité assistée par ordinateur |
US20040088273A1 (en) * | 2002-10-21 | 2004-05-06 | Canon Kabushiki Kaisha | Information processing device and method |
US20070157092A1 (en) * | 2005-12-29 | 2007-07-05 | Sap Ag | System and method for providing user help according to user category |
-
2017
- 2017-02-17 JP JP2017028314A patent/JP2018133050A/ja not_active Abandoned
- 2017-08-07 US US15/670,412 patent/US20180239458A1/en not_active Abandoned
- 2017-08-24 EP EP17187831.7A patent/EP3364293A1/fr not_active Withdrawn
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP0292934A2 (fr) * | 1987-05-25 | 1988-11-30 | Fujitsu Limited | Système de commande de l'affichage de messages destinés à guider l'opérateur |
EP0913798A2 (fr) * | 1997-10-31 | 1999-05-06 | Ncr International Inc. | Méthode et système pour surveiller et améliorer l'efficacité assistée par ordinateur |
US20040088273A1 (en) * | 2002-10-21 | 2004-05-06 | Canon Kabushiki Kaisha | Information processing device and method |
US20070157092A1 (en) * | 2005-12-29 | 2007-07-05 | Sap Ag | System and method for providing user help according to user category |
Also Published As
Publication number | Publication date |
---|---|
JP2018133050A (ja) | 2018-08-23 |
US20180239458A1 (en) | 2018-08-23 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN106228168B (zh) | 卡片图像反光检测方法和装置 | |
CN106682068A (zh) | 用于适应性更新用于用户认证的注册数据库的方法和设备 | |
US10095949B2 (en) | Method, apparatus, and computer-readable storage medium for area identification | |
EP3133527A1 (fr) | Procédé de reconnaissance de visage humain, appareil et terminal | |
EP3910535A1 (fr) | Procédé de d'entrée d'empreinte digitale et dispositif associé | |
CN111343496A (zh) | 一种视频处理方法及装置 | |
WO2016132731A1 (fr) | Dispositif d'aide au travail, système d'aide au travail, procédé d'aide au travail et support d'enregistrement stockant un programme d'aide au travail | |
AU2015324346A1 (en) | Credit card with built-in sensor for fraud detection | |
CN111240482B (zh) | 一种特效展示方法及装置 | |
CN110807368B (zh) | 一种注入攻击的识别方法、装置及设备 | |
EP3407256A1 (fr) | Reconnaissance de caractéristiques biologiques | |
EP3098765A1 (fr) | Procédé et appareil permettant de recommander une carte cloud | |
CN105353938A (zh) | 悬浮短信显示方法及装置 | |
CN104899588B (zh) | 识别图像中的字符的方法及装置 | |
US20150199571A1 (en) | Pos terminal apparatus and customer information acquisition method | |
CN106650513A (zh) | 密码输入方式的推荐方法和装置 | |
EP3364293A1 (fr) | Dispositif et procédé de commande | |
US20220401825A1 (en) | Alarm method and apparatus for tabletop game, electronic device and storage medium | |
CN111428806B (zh) | 图像标签确定方法、装置、电子设备及存储介质 | |
US11205258B2 (en) | Image processing apparatus, image processing method, and storage medium | |
US20170262870A1 (en) | Information processing apparatus, method of controlling same, and non-transitory computer-readable storage medium | |
US10216988B2 (en) | Information processing device, information processing method, and computer program product | |
CN110647841B (zh) | 图像识别结果过滤方法、装置、计算机设备及存储介质 | |
CN109245788A (zh) | 卡托取出方法、系统、存储介质、移动终端及卡座模组 | |
JP2013069187A (ja) | 画像処理システム、画像処理方法、サーバおよびプログラム |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase |
Free format text: ORIGINAL CODE: 0009012 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE |
|
17P | Request for examination filed |
Effective date: 20170824 |
|
AK | Designated contracting states |
Kind code of ref document: A1 Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
|
AX | Request for extension of the european patent |
Extension state: BA ME |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN |
|
18D | Application deemed to be withdrawn |
Effective date: 20190223 |