US20210133450A1 - Information processing method, information processing apparatus, and information processing system - Google Patents

Information processing method, information processing apparatus, and information processing system

Info

Publication number
US20210133450A1
Authority
US
United States
Prior art keywords
product
obtaining
packaging material
image data
information
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
US17/079,215
Other versions
US11743415B2 (en)
Inventor
Takahiro Nakayama
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Canon Inc
Original Assignee
Canon Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Canon Inc filed Critical Canon Inc
Assigned to CANON KABUSHIKI KAISHA reassignment CANON KABUSHIKI KAISHA ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: NAKAYAMA, TAKAHIRO
Publication of US20210133450A1 publication Critical patent/US20210133450A1/en
Application granted granted Critical
Publication of US11743415B2 publication Critical patent/US11743415B2/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • G06K9/00671
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/222Studio circuitry; Studio devices; Studio equipment
    • H04N5/262Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects ; Cameras specially adapted for the electronic generation of special effects
    • H04N5/265Mixing
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06KGRAPHICAL DATA READING; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
    • G06K7/00Methods or arrangements for sensing record carriers, e.g. for reading patterns
    • G06K7/10Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation
    • G06K7/14Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation using light without selection of wavelength, e.g. sensing reflected white light
    • G06K7/1404Methods for optical code recognition
    • G06K7/1408Methods for optical code recognition the method being specifically adapted for the type of code
    • G06K7/14172D bar codes
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00Administration; Management
    • G06Q10/08Logistics, e.g. warehousing, loading or distribution; Inventory or stock management
    • G06Q10/083Shipping
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T11/002D [Two Dimensional] image generation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/20Image preprocessing
    • G06V10/22Image preprocessing by selection of a specific region containing or referencing a pattern; Locating or processing of specific regions to guide the detection or recognition
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/20Scenes; Scene-specific elements in augmented reality scenes
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06KGRAPHICAL DATA READING; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
    • G06K7/00Methods or arrangements for sensing record carriers, e.g. for reading patterns
    • G06K7/10Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation
    • G06K2007/10504Data fields affixed to objects or articles
    • G06K2209/25
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V2201/00Indexing scheme relating to image or video recognition or understanding
    • G06V2201/09Recognition of logos
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09BEDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B19/00Teaching not covered by other main groups of this subclass
    • G09B19/003Repetitive work cycles; Sequence of movements
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09BEDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B5/00Electrically-operated educational appliances
    • G09B5/02Electrically-operated educational appliances with visual presentation of the material to be studied, e.g. using film strip

Definitions

  • FIG. 6 is a flowchart illustrating the process of S3 in FIG. 5 in detail.
  • First, the controller 203 obtains the captured image data of the product 10 being the target from the camera 202 of the smartphone 200.
  • The product 10 referred to herein includes not only the printer main body 100 itself but also the printer main body 100 in a state of being covered with the various packaging materials 103.
  • The controller 203 then analyzes the captured image data by image recognition and classifies the installation state of the product 10 into one of three states: a pre-unpacking state, a state of being unpacked, and a post-unpacking state.
  • As methods for this analysis, image recognition using machine learning such as deep learning and image recognition using pattern matching are conceivable.
  • In the case of machine learning, a neural network, that is, a learning model obtained by learning a large number of images capturing the product 10 in the aforementioned three states, is prepared in the smartphone 200.
  • This learning model receives an image as input data and outputs percentages of the respective three states described above as output data (recognition result). The state corresponding to the output value with the highest percentage is thus the analysis result of the received image.
  • For training, the following pieces of data are prepared as learning data: the images of the product 10 in the aforementioned three states are prepared as input data, and information indicating the states of the product 10 in the respective images is prepared as training data (correct answer data). The training data and the output data (recognition result) obtained by inputting the images being the input data into the learning model are provided to a loss function, and a deviation amount of the recognition result from the correct answer is thus obtained. Weighting coefficients of connections between nodes in the neural network of the learning model and the like are updated such that the deviation amounts for many pieces of training data become smaller. Backpropagation is a method of adjusting the weighting coefficients of connections between nodes in each neural network such that the aforementioned errors become smaller.
  • A nearest neighbor algorithm, a naive Bayes algorithm, a decision tree, a support vector machine, and the like can be given as specific algorithms of machine learning.
  • Deep learning, which itself generates the characteristic amounts (features) used for learning and the weighting coefficients of connections by using a neural network, can also be given as a specific algorithm. Any of the aforementioned algorithms that are usable can be used and applied to the embodiment.
  • The controller 203 determines the state of the product 10 by inputting the image of the product 10 captured by the user with the smartphone 200 into the learning model generated as described above.
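  • The following is a minimal sketch, in Python, of how such a three-state classifier might be invoked on the terminal side. It is not the patented implementation; the model file name, input size, and class order are assumptions for illustration.

```python
# Minimal sketch of three-state classification with a pre-trained model.
# Assumptions: a Keras model "install_state.h5" trained offline on images of
# the product in the three states; 224x224 RGB input; class order as listed.

import numpy as np
import tensorflow as tf
from PIL import Image

STATES = ["pre_unpacking", "unpacking", "post_unpacking"]  # assumed class order

model = tf.keras.models.load_model("install_state.h5")  # hypothetical model file


def classify_installation_state(image_path: str) -> tuple[str, float]:
    """Return the most likely installation state and its score for one frame."""
    img = Image.open(image_path).convert("RGB").resize((224, 224))
    x = np.asarray(img, dtype=np.float32)[None, ...] / 255.0   # shape 1x224x224x3
    probs = model.predict(x, verbose=0)[0]                     # percentage per state
    idx = int(np.argmax(probs))
    return STATES[idx], float(probs[idx])

# Example: state, score = classify_installation_state("frame.jpg")
```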
  • In the case of pattern matching, the controller 203 determines presence or absence of each identifier 105 by pattern matching. For example, in the case where the controller 203 recognizes an identifier that is present only before the unpacking, the controller 203 determines that the product 10 is in the pre-unpacking state. In the case where the controller 203 recognizes an identifier that is recognizable only after the unpacking, the controller 203 determines that the product 10 is being unpacked.
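  • A corresponding sketch of the pattern-matching approach is shown below, using OpenCV template matching to decide whether an identifier is visible in a captured frame. The score threshold and file names are assumptions, not values taken from the disclosure.

```python
# Minimal sketch of identifier presence/absence detection by pattern matching.
# Assumptions: normalized cross-correlation template matching; the 0.8
# threshold and file names are illustrative only.

import cv2


def identifier_present(frame_path: str, template_path: str, thresh: float = 0.8) -> bool:
    """Return True if the identifier template is found in the captured frame."""
    frame = cv2.imread(frame_path, cv2.IMREAD_GRAYSCALE)
    template = cv2.imread(template_path, cv2.IMREAD_GRAYSCALE)
    result = cv2.matchTemplate(frame, template, cv2.TM_CCOEFF_NORMED)
    _, max_val, _, _ = cv2.minMaxLoc(result)
    return max_val >= thresh

# Example: the product is judged to be pre-unpacking if a logo visible only on
# the packaging box is still detected in the frame.
# still_packed = identifier_present("frame.png", "box_logo_105a.png")
```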
  • The controller 203 then selectively performs one of the initial installation processes, specifically a pre-unpacking process, an unpacking process, or a post-unpacking process, depending on the result of determining the state of the product. These processes are described in detail later.
  • Finally, the controller 203 determines whether all initial installation processes are completed. If the controller 203 confirms that all initial installation processes are completed, the controller 203 terminates the processing. If not, the controller 203 performs the aforementioned processes again from S101 one by one.
  • FIG. 7 is a flowchart illustrating the process of S103 in FIG. 6 in detail and illustrates the process before unpacking of the packaging box 101.
  • First, the controller 203 recognizes the orientation of the packaging box 101 based on the identifier 105 captured by the camera 202 of the smartphone 200. Specifically, the controller 203 recognizes the orientation of the packaging box 101 based on the type and angle of the identifier 105 recognized by image recognition. For example, in the case of FIG. 2, the controller 203 recognizes the identifier 105 a of the packaging box 101 for the printer by image recognition and, as a result of this recognition, recognizes that the identifier 105 a is oriented at 270 degrees with respect to the horizontal. The controller 203 then combines this identifier information and the "packaging box orientation identification data" obtained from the cloud server 300 and recognizes that the packaging box 101 is oriented at 270 degrees with respect to the horizontal.
  • In step S202, the controller 203 specifies the position of the unpacking opening OP by analyzing the orientation of the packaging box 101 and the "packaging box orientation identification data" obtained from the cloud server 300.
  • In step S203, the controller 203 instructs the user to take out the printer from the unpacking opening by using the AR technology.
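  • As an illustration of this pre-unpacking flow, the sketch below derives the box orientation from the detected identifier angle and looks up the face holding the unpacking opening OP. The structure of the "packaging box orientation identification data" is an assumption; the disclosure does not specify its format.

```python
# Minimal sketch of S201-S203: snap the detected identifier angle to the nearest
# 90 degrees, then look up the box orientation and the face with the opening OP.
# The mapping below is hypothetical data assumed to come from the cloud server.

ORIENTATION_DATA = {
    # detected logo angle (degrees) -> (box orientation, face holding opening OP)
    0: ("upright", "top"),
    90: ("lying on right side", "right"),
    180: ("upside down", "bottom"),
    270: ("lying on left side", "left"),
}


def locate_unpacking_opening(logo_angle_deg: float) -> tuple[str, str]:
    """Snap the detected identifier angle to 0/90/180/270 and look up the opening."""
    snapped = int(round(logo_angle_deg / 90.0) * 90) % 360
    return ORIENTATION_DATA[snapped]

# Example: a logo detected at roughly 270 degrees with respect to the horizontal
# orientation, opening_face = locate_unpacking_opening(268.5)
# The AR guidance would then point the user to the face holding the opening OP.
```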
  • FIGS. 14A to 14C are views illustrating examples of images (AR images) displayed by using the AR technology.
  • FIGS. 14A and 14B illustrate AR images before opening of the unpacking opening OP, and FIG. 14C illustrates an AR image notifying the user of completion of unpacking.
  • FIG. 8 is a flowchart illustrating the process of S104 in FIG. 6 in detail and illustrates the process in unpacking.
  • The controller 203 checks whether the packaging materials 103 are removed or not in the descending order of priority of the packaging materials 103 in the packaging material information table T. For example, in S401, the controller 203 checks whether the packaging materials with priority of 1 in the table T are present or absent one by one. In the case where at least one packaging material 103 with priority of 1 is present, the controller 203 performs AR guidance by using the display unit 204 of the smartphone 200 until confirming removal of the packaging material 103 with the priority of 1.
  • Similarly, the controller 203 confirms removal of the packaging materials with priority of 2 and the packaging materials with priority of 3 one by one and, in the case where the packaging materials with priority of n are removed from the product 10, the processing proceeds to S404.
  • In S404, the controller 203 displays a notification to the user indicating that all packaging materials 103 are correctly removed, on the display unit 204 of the smartphone 200 and terminates the process of S104.
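  • The priority-ordered check loop of FIG. 8 could be sketched as follows. The helper callables stand in for the identifier-based image check and the AR display, and are assumptions rather than the patented code.

```python
# Minimal sketch of the FIG. 8 loop: packaging materials are checked in order of
# priority, and AR guidance is repeated until each one is confirmed removed.

from typing import Callable


def notify_all_removed() -> None:
    """Placeholder for the AR notification that everything is correctly removed."""
    print("All packaging materials have been correctly removed.")


def check_removal_in_priority_order(table_t: list[dict],
                                    is_removed: Callable[[dict], bool],
                                    guide_removal: Callable[[dict], None]) -> None:
    """Walk table T from priority 1 upward, guiding the user material by material."""
    for entry in sorted(table_t, key=lambda e: e["priority"]):
        while not is_removed(entry):      # re-check on each newly captured frame
            guide_removal(entry)          # AR guidance on the display unit
    notify_all_removed()
```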
  • FIG. 9 is a flowchart more specifically illustrating the process illustrated in FIG. 8 .
  • A specific check process performed by the controller 203 based on the table T of FIG. 4 is as illustrated in S501 to S505.
  • In S501, the controller 203 checks whether the fifth inner packaging material (left) 102 e and the fourth inner packaging material (right) 102 d, the cushioning materials with priority of 1, are present and performs AR guidance until determining that the fifth inner packaging material 102 e and the fourth inner packaging material 102 d are absent.
  • In S502, the controller 203 checks whether the protective vinyl product 102 c that is the third inner packaging material with priority of 2 is present and performs AR guidance until determining that the third inner packaging material 102 c is absent.
  • In S503, the controller 203 checks whether the orange tape 102 a that is the first inner packaging material with priority of 3 is present. The orange tape 102 a is provided in the location [1] extending on both a side surface and a front surface of the printer main body 100, and the controller 203 performs AR guidance until determining that the orange tape 102 a is removed from the location [1] and is absent.
  • In S504, the controller 203 checks whether the orange tape 102 b that is the second inner packaging material with priority of 4 is present. The second inner packaging material 102 b is provided in the location [2] extending on both the front surface and an upper surface of the printer main body 100, and the controller 203 performs AR guidance until determining that the second inner packaging material 102 b is removed from the location [2] and is absent.
  • In S505, the controller 203 notifies the user that all packaging materials are correctly removed, by using the display unit 204 of the smartphone 200.
  • FIG. 10 is a flowchart illustrating the process of S403 in FIG. 8 in detail.
  • FIG. 10 illustrates steps of checking the packaging materials 103 with priority of n one by one to determine whether each packaging material 103 is correctly removed.
  • In S601, the controller 203 obtains information on the packaging material 103 being the target from the table T illustrated in FIG. 4.
  • In S602, the controller 203 determines whether the orientation of the product 10 needs to be identified to specify the position of the packaging material 103 being the target, based on the information obtained in S601. This determination is performed by obtaining the information of "necessary" or "unnecessary" defined in the item of "orientation identification" corresponding to the packaging material being the target.
  • In step S603, the controller 203 performs image recognition and analysis based on the captured image data of the product captured by the camera 202 of the smartphone 200 and determines the orientation of the product. A method using deep learning, pattern matching using the identifiers provided in the product, or the like is employed as the method of this image recognition.
  • Next, the controller 203 specifies the position of the packaging material 103 based on the position information in the table T and the orientation information of the product 10 determined in S603. The position information includes information indicating the portion of the product 10 where each of the packaging materials 103 is present. The controller 203 can therefore specify the position of the packaging material 103 in the captured image obtained by the camera 202 of the smartphone 200 by combining the position information and the orientation information.
  • In S605, the controller 203 determines whether the captured image obtained by the camera 202 of the smartphone 200 includes the packaging material 103 of the product 10. If the controller 203 determines that the captured image does not include the packaging material 103, the processing proceeds to S606 and the controller 203 performs AR guidance on the display unit 204 of the smartphone 200 instructing the user to capture, with the camera 202, the position where the packaging material 103 is provided.
  • In S607, the controller 203 determines whether the packaging material 103 being the target of removal is present or absent based on the identifier 105. Specifically, in the case where the identifier 105 is configured to become exposed by the removal of the packaging material 103, the controller 203 determines that the packaging material 103 is removed if the image captured by the camera 202 includes the identifier 105. Meanwhile, in the case where the identifier 105 is provided on the packaging material 103 itself, the controller 203 determines that the packaging material 103 is removed if the image captured by the camera 202 does not include the identifier.
  • In S608, the controller 203 determines whether the packaging material 103 is removed. If the controller 203 determines that the packaging material 103 is not removed, the processing proceeds to S609 and the controller 203 performs AR guidance on the display unit 204 prompting the user to remove the packaging material 103. Then, the processing returns to S607. If the controller 203 determines in S608 that the packaging material 103 is removed, the processing proceeds to S610 and the controller 203 displays on the display unit 204 an AR image indicating that the packaging material 103 is correctly removed. This notification allows the user to recognize that the work is proceeding in an orderly manner, so that erroneous work such as returning the removed packaging material 103 to its original position is avoided.
  • Finally, the controller 203 determines whether all packaging materials 103 are removed. If so, the series of processes is terminated. If a packaging material that has not yet been removed remains, the controller 203 performs the processes again from step S601.
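  • The determination of S607 based on the "confirmation identifier" and "determination method" items could be sketched as below; identifier_present() is the pattern-matching helper sketched earlier, and the field names of the table entry are assumptions.

```python
# Minimal sketch of the S607 decision: a packaging material is judged removed
# either because its confirmation identifier has become exposed ("present") or
# because the material itself, acting as the identifier, is no longer visible
# ("absent").


def packaging_material_removed(frame_path: str, entry: dict) -> bool:
    """Apply the 'determination method' of one table T entry to a captured frame."""
    found = identifier_present(frame_path, entry["confirmation_identifier"])
    if entry["determination_method"] == "present":
        return found        # e.g. a mark revealed only after the tape is peeled off
    return not found        # e.g. the orange tape itself serves as the identifier
```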
  • FIG. 11 is a flowchart more specifically illustrating the process of "checking the orange tape" described in S503 of FIG. 9.
  • First, the controller 203 refers to the packaging material information table T and obtains information on the packaging material being the target.
  • In this case, the packaging material 103 being the target is the orange tape, and the controller 203 needs to recognize the orientation of the product 10 to specify the position of the orange tape. Accordingly, in the subsequent S703, the controller 203 recognizes the orientation of the product 10.
  • Next, the controller 203 specifies the position where the orange tape 102 a is provided based on the position information obtained from the table T and the orientation information of the product recognized in S703.
  • The controller 203 then determines whether the position (target position) where the orange tape 102 a is provided is included in the image captured by the camera 202 of the smartphone 200. If the controller 203 determines that the position where the orange tape 102 a is provided is not captured, the processing proceeds to S706 and the controller 203 performs AR guidance on the display unit 204 instructing the user to capture, with the smartphone 200, the target position where the orange tape 102 a is provided.
  • In step S707, the controller 203 determines whether the orange tape 102 a is present or absent at the target position based on the identifier 105. Specifically, in the case where the identifier 105 is configured to be exposed by the removal of the orange tape 102 a, the controller 203 determines that the orange tape is removed if the identifier 105 is present. Meanwhile, in the case where the orange tape itself has the function of the identifier, the controller 203 determines that the packaging material 103 is removed if the orange tape is absent.
  • In S708, the controller 203 determines whether the orange tape 102 a is removed. If the controller 203 determines that the orange tape 102 a is not removed (NO), in S709, the controller 203 displays on the display unit 204 an AR image instructing the user to remove the orange tape 102 a. Meanwhile, if the controller 203 determines in S708 that the orange tape 102 a is removed, the controller 203 notifies the user of correct removal of the orange tape 102 a via the display unit 204 in S710.
  • FIGS. 15A to 15C illustrate examples of AR images giving instructions to the user by using the AR technology.
  • FIG. 15A is a view illustrating an AR image instructing the user to take an image of the position where the orange tape 102 a is provided with the smartphone 200 in the process of S706 of FIG. 11.
  • FIG. 15B is a view illustrating an AR image instructing the user to remove the orange tape in S709 of FIG. 11.
  • FIG. 15C is a view illustrating an AR image notifying the user of correct removal of the packaging material 103 in S710 of FIG. 11.
  • FIG. 12 is a flowchart illustrating an interruption process performed in the middle of the process of S403 in FIG. 8 to terminate the process by a user operation.
  • In the case where the user inputs that the packaging material 103 being the target has been removed, the controller 203 executes the interruption process in S403 a and terminates the process of S403.
  • The controller 203 can thereby terminate the process of S403 at the moment when the user finishes removing the packaging material 103, without waiting for completion of all the check processes illustrated in FIG. 8. Accordingly, it is possible to swiftly proceed to the subsequent post-unpacking process.
  • A method of the user inputting the removal of the packaging material 103 being the target includes, for example, an input using a voice user interface (UI) or a touch input UI such as a complete button or a check box displayed on the display unit 204 of the smartphone 200.
  • FIG. 13 is a flowchart specifically illustrating the post-unpacking process in S105 of FIG. 6.
  • First, the controller 203 instructs the user by AR guidance to connect an AC cable (attachment member) to the printer main body 100.
  • Specifically, the controller 203 displays, on the display unit 204 of the smartphone 200, an AR guidance image that explicitly indicates the socket for the AC cable provided in the printer main body 100 and that instructs the user to perform the operation of connecting the AC cable to this socket.
  • Next, the controller 203 instructs the user by an AR image to turn on the power by operating a soft key provided in the printer main body 100.
  • Specifically, the controller 203 explicitly indicates the power button provided in the printer main body 100 and gives an instruction on the method of operating the power button by using the AR image. The user can thereby perform the work without confusion even if the power-on operation includes an operation unique to the electronic device, such as a long-press of the power ON button.
  • Subsequently, the controller 203 instructs the user by AR images to perform initial set-up work (initial setting work) on the printer main body 100. Specifically, in S803, the controller 203 instructs the user by an AR image to attach an ink tank (attachment member) to the printer main body 100. In this case, the controller 203 recognizes the ink tank by image recognition and performs guidance for correctly removing the packaging of the ink tank and then attaching the ink tank to the correct position in the printer main body 100. This can prevent occurrence of errors and the like caused by erroneous attachment of the ink tank or failure to remove the packaging material.
  • Further, the controller 203 instructs the user by an AR image to perform registration adjustment in the printer main body 100. Specifically, the controller 203 instructs the user by an AR image to set print sheets and to perform operations on a UI provided in the printer main body 100.
  • Note that the server 302 provided in the cloud server 300 may perform some of the processes described in S3 of FIG. 5.
  • Since the cloud server 300 includes a processor such as a CPU like the smartphone 200, the cloud server 300 can perform the process of generating an AR image by superimposing information for performing AR guidance on an image captured by the camera of the smartphone 200. The cloud server 300 can then send the generated AR image to the smartphone 200 to display the AR image on the display unit 204.
  • However, the present disclosure is not limited to this configuration.
  • For example, the AR images can be generated based only on information provided in a terminal such as the smartphone, without using an external server such as the cloud server.
  • Embodiment(s) of the present disclosure can also be realized by a computer of a system or apparatus that reads out and executes computer executable instructions (e.g., one or more programs) recorded on a storage medium (which may also be referred to more fully as a ‘non-transitory computer-readable storage medium’) to perform the functions of one or more of the above-described embodiment(s) and/or that includes one or more circuits (e.g., application specific integrated circuit (ASIC)) for performing the functions of one or more of the above-described embodiment(s), and by a method performed by the computer of the system or apparatus by, for example, reading out and executing the computer executable instructions from the storage medium to perform the functions of one or more of the above-described embodiment(s) and/or controlling the one or more circuits to perform the functions of one or more of the above-described embodiment(s).
  • The computer may comprise one or more processors (e.g., central processing unit (CPU), micro processing unit (MPU)) and may include a network of separate computers or separate processors to read out and execute the computer executable instructions.
  • The computer executable instructions may be provided to the computer, for example, from a network or the storage medium.
  • The storage medium may include, for example, one or more of a hard disk, a random-access memory (RAM), a read only memory (ROM), a storage of distributed computing systems, an optical disk (such as a compact disc (CD), digital versatile disc (DVD), or Blu-ray Disc (BD)™), a flash memory device, a memory card, and the like.

Abstract

A display control unit causes a display unit to display a captured image of a product captured by an image capturing unit. A first obtaining unit obtains product information relating to initial setting of the product based on the captured image data. A second obtaining unit obtains a determination result of an installation state of the product based on the captured image data and the product information of the product. A third obtaining unit obtains combined image data expressing a combined image obtained by combining the captured image captured by the image capturing unit and an instruction image of installation work to be performed on the product by a user, depending on the installation state obtained by the second obtaining unit. The display control unit causes the display unit to display the combined image based on the combined image data.

Description

    BACKGROUND OF THE INVENTION
  • Field of the Invention
  • The present disclosure relates to a technique in which an image obtained by capturing a product is used to display an image for assisting initial installation of a product.
  • Description of the Related Art
  • Precision devices such as electronic devices are covered with and held in place by multiple packaging materials to avoid breakage and the like in a distribution process and are shipped as products. Accordingly, a user who purchases such a product needs to perform installation work such as unpacking, initial installation, and initial setting of the arrived product. Such installation work of a product has become complex with progress of electronic devices and is cumbersome and difficult work for an ordinary user.
  • In recent years, a technique of visually assisting work of a user by using an augmented reality (AR) technology in which digital information is superimposed on an actual image has been proposed and performed.
  • Japanese Patent Laid-Open No. 2013-140563 discloses a technique of presenting an error recovery operation for an image forming apparatus in an error state by means of AR guidance. Specifically, in the case where the image forming apparatus connected to an external server via a network falls into an error state, the external server analyzes an error log and sends AR information to a mobile device of a user, the AR information including the AR guidance that presents the error recovery operation. The user performs work for error recovery by referring to the AR information displayed on the mobile device. However, in the technique disclosed in Japanese Patent Laid-Open No. 2013-140563, the image forming apparatus needs to be connected to the external server via the network. Accordingly, this technique cannot be applied to initial installation work of a product at arrival that is in a packaged state.
  • SUMMARY OF THE INVENTION
  • An object of the present disclosure is to enable assistance of an initial installation work of a product by displaying an image.
  • The present disclosure is an information processing method comprising: an image capturing step of capturing an image of a product and obtaining captured image data; a first obtaining step of obtaining product information relating to initial installation of the product based on the captured image data; a second obtaining step of obtaining a determination result of an installation state of the product based on the captured image data and the product information; a third obtaining step of obtaining combined image data expressing a combined image obtained by combining the captured image and an instruction image of installation work to be performed on the product by a user, depending on the determined installation state; and a display control step of causing a display unit to display the combined image based on the combined image data.
  • The present disclosure can assist an initial installation work of a product by displaying an image.
  • Further features of the present disclosure will become apparent from the following description of exemplary embodiments with reference to the attached drawings.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 is a diagram illustrating a configuration of an information processing system in an embodiment and a product;
  • FIG. 2 is a perspective view illustrating an example of a packaging box;
  • FIGS. 3A to 3C are diagrams illustrating an example of a printer main body and inner packaging materials;
  • FIG. 4 is a diagram illustrating an example of a table storing packaging material information;
  • FIG. 5 is a flowchart illustrating a process of performing AR guidance;
  • FIG. 6 is a flowchart illustrating a process of S3 in FIG. 5 in detail;
  • FIG. 7 is a flowchart illustrating a process of S103 in FIG. 6 in detail;
  • FIG. 8 is a flowchart illustrating a process of S104 in FIG. 6 in detail;
  • FIG. 9 is a flowchart illustrating a specific example of a process illustrated in FIG. 8;
  • FIG. 10 is a flowchart illustrating a process of S403 in FIG. 8 in detail;
  • FIG. 11 is a flowchart more specifically illustrating a process of S503 in FIG. 9;
  • FIG. 12 is a flowchart illustrating an interruption process performed in the middle of the process of S403 in FIG. 8;
  • FIG. 13 is a flowchart specifically illustrating a process of S105 in FIG. 6;
  • FIGS. 14A to 14C are views illustrating examples of images displayed by an AR technology; and
  • FIGS. 15A to 15C are views illustrating examples of AR images for giving instructions to a user by the AR technology.
  • DESCRIPTION OF THE EMBODIMENTS
  • A first embodiment of the present disclosure is described below with reference to the drawings.
  • <Configuration of Information Processing System>
  • An information processing system according to the embodiment performs processing for assisting initial installation work of a product by means of augmented reality (AR) guidance, the product including a predetermined device packaged in packaging materials.
  • FIG. 1 is a diagram illustrating a configuration of the information processing system in the embodiment and a product 10 that is a target of the initial installation work. The information processing system in the embodiment includes a terminal 200 that is operable by a user and a cloud server 300 that is an external server on a network communicable with the terminal 200. Note that, in the embodiment, description is given by using a printer main body as an example of a device main body packaged in packaging materials.
  • In the information processing system of the embodiment, a multifunction mobile phone (hereinafter, referred to as smartphone) 200 is used as the terminal. The smartphone 200 is provided with a wireless communication unit 201 that can communicate with a printer main body 100 and the cloud server 300, and a camera (image capturing unit) 202 that obtains captured image data by capturing part of or the entire product 10, a two-dimensional code (QR code) 104, or an identifier 105. The smartphone 200 is further provided with a controller 203, a display unit 204 that can display images captured by the camera 202, AR images to be described later, and the like, and an operation unit 205 on which input operations are performed.
  • The controller 203 is formed of a processor such as a central processing unit (CPU) and has a function as a control unit that controls the aforementioned units, a function as a determination unit that performs determination processing to be described later, and a function as a generation unit that generates image data. Specifically, the controller 203 generates instruction image data depicting an instruction image of an instruction such as what the user is to do next, based on later-described product information and the like obtained from the later-described cloud server 300. Moreover, the controller 203 performs control of generating combined image data (Augmented Reality image data (AR image data)) by combining the generated instruction image data and the captured image data obtained by the camera 202 and displaying a combined image (AR image) on the display unit 204 based on the generated combined image data. As described above, the controller 203 in the embodiment has functions of performing control operations of the units and obtaining various types of information, and functions as a display control unit, a first obtaining unit, a second obtaining unit, and a third obtaining unit of the present disclosure.
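  • As a rough illustration of this combining step, the sketch below blends an instruction overlay onto the captured frame to produce a combined (AR) image. A plain alpha blend with OpenCV is used here, which is only one assumed way the combination might be realized.

```python
# Minimal sketch of combining captured image data with an instruction image.
# The instruction image is assumed to carry an alpha channel (BGRA); the blend
# position would be derived from the recognized identifier in practice.

import cv2
import numpy as np


def make_ar_image(captured_bgr: np.ndarray,
                  instruction_bgra: np.ndarray,
                  top_left: tuple[int, int]) -> np.ndarray:
    """Overlay an instruction image (with alpha channel) onto the captured frame."""
    out = captured_bgr.copy()
    x, y = top_left
    h, w = instruction_bgra.shape[:2]
    roi = out[y:y + h, x:x + w].astype(np.float32)
    overlay = instruction_bgra[..., :3].astype(np.float32)
    alpha = instruction_bgra[..., 3:4].astype(np.float32) / 255.0
    out[y:y + h, x:x + w] = (alpha * overlay + (1.0 - alpha) * roi).astype(np.uint8)
    return out
```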
  • Although the smartphone 200 is used as the terminal in the embodiment, a tablet personal computer (PC), a laptop PC, or the like may be used as the terminal instead of the smartphone 200. In other words, the terminal used in the information processing system of the embodiment may be any device that has functions equivalent to the camera 202, the display unit 204, the operation unit 205, the wireless communication unit 201, and the controller 203.
  • The cloud server 300 includes a wireless communication unit 301 that performs wireless communication with the network and a server 302 that supplies product information for performing AR guidance relating to the initial installation of the product 10 to the smartphone 200. The product information supplied from the cloud server 300 includes, for example, the following information:
  • (a) product state identification data
  • (b) packaging box orientation identification data
  • (c) printer main body orientation identification data
  • (d) packaging material information table
  • (e) unpacking steps.
  • The product information varies depending on the type of the printer being the target and the cloud server 300 supplies information optimal for the printer being the target.
  • A product used in the embodiment is described. As illustrated in FIG. 1, the product 10 in the embodiment includes the printer main body 100, packaging materials 103 in which the printer main body 100 is packaged, and the identifiers 105 provided respectively to the packaging materials 103 and the printer main body 100. The packaging materials 103 in the embodiment include an inner packaging material 102 in which the printer main body 100 is packaged and a packaging box 101 that is an outer packaging material housing the printer main body 100 packaged in the inner packaging material 102.
  • An outer side of the packaging box 101 is provided with the identifier 105 and the QR code 104 for embedding the product information. The identifier 105 is a collective term for an identification symbol whose image (identifier image) expressing the identifier is read to obtain various pieces of product information associated with the read image data and includes, for example, the following objects:
  • 1. brand logo
  • 2. printer packaging material itself
  • 3. part of a design embedded in a printer case
  • 4. pictogram symbols
  • 5. QR code
  • 6. digital watermarks.
  • FIG. 2 illustrates an example of the packaging box 101 forming an outer shell of the product 10. Note that FIG. 1 illustrates a state where the product 10 has just arrived, that is a state before unpacking of the product. Outer surfaces of the packaging box 101 are provided with the identifier 105 and the two-dimensional code (QR code (product name)) 104 including specification information for specifying the product. In this example, various types of identifiers 105 a, 105 b, and 105 c (see FIG. 2) are printed on different outer surfaces of the packaging box 101 as the identifier 105.
  • The QR code 104 includes information for specifying the printer main body. Reading the QR code 104 with the smartphone 200 and sending the read information by accessing the cloud server 300 on the network allows the controller 203 to obtain information (product information) relating to the product 10 corresponding to the QR code 104 from the cloud server 300.
  • The identifier 105 is formed of symbols in which various pieces of information are embedded. For example, the identifier 105 a is a logo of a manufacturer used as the identifier. The smartphone 200 reads the identifier 105 a with the camera 202 and identifies the orientation of the packaging box 101 and the location of an unpacking opening OP based on the read information. In the embodiment, although the QR code 104 and the identifier 105 are provided at different positions, a symbol collectively including the information expressed by the QR code 104 and the information expressed by the identifier 105 may be printed on the outer side of the packaging box 101 as the identifier.
  • FIGS. 3A to 3C are diagrams illustrating an example of the printer main body 100 and the inner packaging material 102 in which the printer main body 100 is packaged. In this example, the inner packaging material 102 includes a first inner packaging material 102 a, a second inner packaging material 102 b, a third inner packaging material 102 c, a fourth inner packaging material 102 d, and a fifth inner packaging material 102 e. The first inner packaging material 102 a and the second inner packaging material 102 b are tape-shaped members (orange tapes) used to hold movable portions and the like provided in the printer main body 100 in place and protect them, and are attached to parts of an outer surface of the printer main body 100 to be capable of being peeled off. The third inner packaging material 102 c is a sheet-shaped or bag-shaped member (protective vinyl product) covering the entire printer main body 100. The fourth inner packaging material 102 d and the fifth inner packaging material 102 e are cushioning materials covering left and right portions of the printer main body 100 covered with the third inner packaging material 102 c and have a function of holding the printer main body 100 in place in the packaging box 101 and protecting it.
  • The first inner packaging material 102 a, the second inner packaging material 102 b, the fourth inner packaging material 102 d, and the fifth inner packaging material 102 e illustrated in FIGS. 3A and 3B have a function of themselves being the identifier 105 in addition to the function of holding the printer main body 100 in place and protecting it. Specifically, presence or absence of each of the first inner packaging material 102 a, the second inner packaging material 102 b, the fourth inner packaging material 102 d, and the fifth inner packaging material 102 e is information indicating whether the inner packaging material is removed or not. Moreover, a mode in which the packaging material 102 a and the identifier 105 are independently provided as illustrated in FIG. 3C may be employed.
  • FIG. 4 is a diagram illustrating an example of a packaging material information table provided in the cloud server 300. In the packaging material information table (hereinafter, simply referred to as table T), information (product information) for correctly removing all of the packaging materials 103 in the product 10 including the packaged printer main body 100 in the correct order is stored. Items of the table T include “category (name)”, “position”, “priority”, “confirmation identifier”, “determination method”, “orientation identification”, and the like. Information as described below is stored in these items.
      • The item of “category (name)” stores information indicating the categories or names of all packaging materials 103 in the product 10 being the target.
      • The item of “position” stores position information of each packaging material 103 being the target. In this example, the position information includes information such as “entire product”, “location (1)”, and “location (2)”. The “entire product” means that the packaging material spreads over the entire product, the “location (1)” indicates the portion where the orange tape 102 a is provided, and the “location (2)” indicates the portion where the orange tape 102 b is provided.
      • The item “priority” stores information indicating the order of removal of the packaging materials 103. In this example, the packaging materials 103 are removed in the ascending order of the numbers in the “priority”.
      • The item “confirmation identifier” stores information on the identifiers used to recognize removal of the packaging materials 103. Specifically, each packaging material 103 is recognized as removed when image recognition performed on the position of that packaging material 103 either confirms or, conversely, fails to confirm the corresponding identifier stored in this item, depending on the determination method described next.
      • The item “determination method” stores information on a state of each identifier in which it is determined that the corresponding packaging material is removed. Specifically, in each of the determination methods in which the item of the determination method is “absent”, it is determined that the packaging material is removed in the case where the identifier is absent at a target position. Meanwhile, in each of the determination methods in which the item of the determination method is “present”, it is determined that the packaging material is removed in the case where the presence of the identifier is confirmed.
      • The item “orientation identification” stores information on whether identification of the orientation of the printer main body 100 is necessary for identifying the position of each packaging material 103. The packaging materials 103 include packaging materials whose positions are identifiable only with the orientation of the printer main body 100 and packaging materials whose positions are identifiable irrespective of that orientation, and the information of the item “orientation identification” is thus necessary. For example, in the case where the packaging material 103 is a packaging material that covers the entire printer, the position of the packaging material 103 is obvious and the identification of the orientation of the printer main body 100 is unnecessary. Meanwhile, in the case where the packaging material 103 is provided in one portion of the printer main body 100, the position of the packaging material 103 can be correctly recognized only after the orientation of the printer main body 100 is recognized.
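  • For illustration, a minimal sketch of such a table as structured data is given below. This is only an assumed representation (the field names, position labels, and JSON-like layout are illustrative, not the actual format of FIG. 4), using the packaging materials described with FIGS. 3A to 3C as example rows.

```python
# Minimal sketch of table T as structured data (assumption: the cloud server
# returns the table in a JSON-like form; field names and values are illustrative).
PACKAGING_TABLE = [
    {"category": "cushioning material (left/right)", "position": "left/right portions",
     "priority": 1, "confirmation_identifier": "material itself",
     "determination": "absent", "orientation_identification": False},
    {"category": "protective vinyl", "position": "entire product",
     "priority": 2, "confirmation_identifier": "material itself",
     "determination": "absent", "orientation_identification": False},
    {"category": "orange tape (1)", "position": "location (1)",
     "priority": 3, "confirmation_identifier": "material itself",
     "determination": "absent", "orientation_identification": True},
    {"category": "orange tape (2)", "position": "location (2)",
     "priority": 4, "confirmation_identifier": "material itself",
     "determination": "absent", "orientation_identification": True},
]
```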
    <Process for AR Guidance>
  • FIG. 5 is a flowchart illustrating a process performed in the case where the initial installation work of the product 10 is assisted by means of AR guidance. Note that S attached to the step numbers in the flowcharts illustrated in FIGS. 5, 6, and 12 means step.
  • In the case where the user inputs a smartphone image capturing start instruction, the controller 203 activates a dedicated application on the smartphone and starts image capturing with the camera 202. In the case where the user directs the camera 202 toward the QR code printed on the packaging box 101 in this state, the controller 203 extracts and reads the QR code from image data captured by the camera 202 and sends the read QR code from the wireless communication unit 201 to the cloud server 300 on the network (S1). The cloud server 300 sends information (product information) on the product corresponding to the received QR code to the smartphone 200 via the network. The smartphone 200 receives the product information sent from the cloud server 300 via the wireless communication unit 201 and the controller 203 obtains the product information (S2). Thereafter, the controller 203 of the smartphone 200 performs AR guidance of initial setting for the user on the display unit 204 based on the obtained product information (S3).
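  • The following is a minimal sketch of the S1 to S2 exchange, assuming QR decoding with OpenCV and a hypothetical HTTP endpoint on the cloud server; the URL and the request/response shape are assumptions, not the actual protocol.

```python
# Minimal sketch of S1-S2 (assumptions: the server URL and JSON payload shape
# are hypothetical; QR decoding uses OpenCV's built-in detector).
import cv2
import requests

SERVER_URL = "https://example.com/product-info"   # hypothetical endpoint

def fetch_product_info(frame):
    """Extract the QR code from a captured frame and ask the server for product info."""
    detector = cv2.QRCodeDetector()
    qr_text, points, _ = detector.detectAndDecode(frame)
    if not qr_text:
        return None                               # no QR code found in this frame yet
    response = requests.post(SERVER_URL, json={"qr": qr_text}, timeout=10)
    response.raise_for_status()
    return response.json()                        # product information (S2)
```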
  • FIG. 6 is a flowchart illustrating the process of S3 in FIG. 5 in detail. In S100, the controller 203 obtains the captured image data of the product 10 being the target from the camera 202 of the smartphone 200. The product 10 referred to herein includes not only the printer main body 100 but also objects covered with the various packaging materials 103 covering the printer main body 100. Next, in S101, the controller 203 analyzes the captured image data by image recognition and classifies the installation state of the product 10 into one of the following three states:
      • pre-unpacking state: a state in which the printer main body 100 is housed in the packaging box 101 and is not yet unpacked;
      • post-unpacking state: a state in which all packaging materials 103 are removed from the printer main body 100;
      • unpacking state: a state other than the aforementioned two states, such as a state in which some of the packaging materials 103 are attached to the printer main body 100.
  • As a method of analyzing the product image data, image recognition using machine learning such as deep learning and image recognition using pattern matching are conceivable. For example, in the case where the image recognition is performed by using deep learning, a neural network that is a learning model obtained by learning a large number of images capturing the product 10 in the aforementioned three states is prepared in the smartphone 200. Specifically, this learning model receives an image as input data and outputs percentages of the respective three states described above as output data (recognition result). The state corresponding to an output value with the highest percentage is thus the analysis result of the received image.
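  • As one possible realization of this classification, the following is a minimal sketch assuming a PyTorch model trained to output scores for the three states; the class order, preprocessing, and model are assumptions for illustration, not the embodiment's actual implementation.

```python
# Minimal inference sketch (assumptions: `model` is a trained PyTorch classifier
# with three outputs; the class order below is assumed).
import torch
import torchvision.transforms as T
from PIL import Image

STATES = ["pre-unpacking", "unpacking", "post-unpacking"]   # assumed class order

preprocess = T.Compose([T.Resize((224, 224)), T.ToTensor()])

def classify_installation_state(model: torch.nn.Module, image_path: str) -> str:
    """Return the state whose output percentage is highest (S101)."""
    x = preprocess(Image.open(image_path).convert("RGB")).unsqueeze(0)
    with torch.no_grad():
        percentages = torch.softmax(model(x), dim=1)[0]     # percentages of the 3 states
    return STATES[int(percentages.argmax())]
```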
  • In learning of the learning model, the following pieces of data are prepared as learning data: the images of the product 10 in the aforementioned three states are prepared as input data, and information indicating the state of the product 10 in each image is prepared as training data (correct answer data). Then, the training data and the output data (recognition result) obtained by inputting the images being the input data into the learning model are provided to a loss function, and a deviation amount of the recognition result from the correct answer is thus obtained. Weighting coefficients of connections between nodes in a neural network in the learning model and the like are updated such that the deviation amounts L for many pieces of training data become smaller. Backpropagation is a method of adjusting the weighting coefficients of connections between nodes in each neural network such that the aforementioned errors become smaller. A nearest neighbor algorithm, a naive Bayes algorithm, a decision tree, a support vector machine, and the like can be given as specific algorithms of machine learning. Moreover, deep learning that generates feature quantities for learning and the weighting coefficients of connections by itself by using a neural network can also be given as a specific algorithm. Any of the aforementioned algorithms that is usable can be applied to the embodiment.
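  • A minimal training sketch following this description is given below; the choice of network, loss function, and optimizer is illustrative (any of the algorithms listed above could be substituted), and `dataloader` is assumed to yield (image tensor, state index) pairs.

```python
# Minimal training sketch (assumptions: `dataloader` yields (images, state_labels)
# batches; resnet18, cross-entropy loss, and SGD are illustrative choices).
import torch
import torch.nn as nn
from torchvision.models import resnet18

model = resnet18(num_classes=3)        # 3 outputs: pre-unpacking / unpacking / post-unpacking
criterion = nn.CrossEntropyLoss()      # loss function giving the deviation from the correct answer
optimizer = torch.optim.SGD(model.parameters(), lr=1e-3)

def train_one_epoch(dataloader):
    for images, state_labels in dataloader:
        optimizer.zero_grad()
        outputs = model(images)        # recognition result
        loss = criterion(outputs, state_labels)
        loss.backward()                # backpropagation of the deviation amount
        optimizer.step()               # update the connection weighting coefficients
```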
  • The controller 203 determines the state of the product 10 by inputting the image of the product 10 captured by the user with the smartphone 200 into the learning model generated as described above.
  • In the case where the controller 203 performs image recognition by using pattern matching, the controller 203 determines presence or absence of each identifier 105 by pattern matching. For example, in the case where the controller 203 recognizes an identifier that is present only before the unpacking, the controller 203 determines that the product 10 is in the pre-unpacking state. In the case where the controller 203 recognizes an identifier that is recognizable only after the unpacking, the controller 203 determines that the product 10 is being unpacked. Next, in S103, S104, and S105, the controller 203 selectively performs one of initial installation processes, specifically, a pre-unpacking process, an unpacking process, and a post-unpacking process, depending on the result of determining the state of the product. These processes are described in detail later.
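  • The identifier presence check used in this pattern-matching determination could, for example, be implemented with template matching; the following is a minimal sketch under that assumption, with an illustrative match threshold.

```python
# Minimal sketch of identifier detection by template matching (assumptions: the
# identifier template image and the 0.8 threshold are illustrative).
import cv2

def identifier_present(captured_bgr, template_bgr, threshold=0.8):
    """Return True if the identifier template is found in the captured image."""
    gray = cv2.cvtColor(captured_bgr, cv2.COLOR_BGR2GRAY)
    tmpl = cv2.cvtColor(template_bgr, cv2.COLOR_BGR2GRAY)
    scores = cv2.matchTemplate(gray, tmpl, cv2.TM_CCOEFF_NORMED)
    _, max_score, _, _ = cv2.minMaxLoc(scores)
    return max_score >= threshold
```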
  • Finally, in S106, the controller 203 determines whether all initial installation processes are completed. If the controller 203 confirms that all initial installation processes are completed, the controller 203 terminates the processing. If not, the controller 203 performs the aforementioned processes again from S101 one by one.
  • FIG. 7 is a flowchart illustrating the process of S103 in FIG. 6 in detail and illustrates the process before unpacking of the packaging box 101. First, in S201, the controller 203 recognizes the orientation of the packaging box 101 based on the identifier 105 obtained by the camera 202 of the smartphone 200. Specifically, the controller 203 recognizes the orientation of the packaging box 101 based on the type and angle of the identifier 105 recognized by image recognition. For example, assuming the case of FIG. 2, the controller 203 recognizes the identifier 105 a of the packaging box 101 for the printer by image recognition. Assume that the controller 203 recognizes that the identifier 105 a is oriented at 270 degrees with respect to the horizontal as a result of this recognition. The controller 203 combines this identifier information and “packaging box orientation identification data” obtained from the cloud server 300 and recognizes that the packaging box 101 is oriented at 270 degrees with respect to the horizontal.
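  • As a simplified illustration of how the identifier angle and the “packaging box orientation identification data” can be combined, the following sketch assumes the orientation data reduces to an angular offset between the identifier and the packaging box; this is an assumption for illustration only.

```python
# Minimal sketch of S201 (assumption: the orientation identification data is
# modeled as a fixed angular offset between identifier and box).
def box_orientation(identifier_angle_deg: float, identifier_to_box_offset_deg: float) -> float:
    """Combine the recognized identifier angle with the orientation data from the server."""
    return (identifier_angle_deg + identifier_to_box_offset_deg) % 360.0
```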
  • Next, in step S202, the controller 203 specifies the position of the unpacking opening OP by analyzing the orientation of the packaging box 101 and the “packaging box orientation identification data” obtained from the cloud server 300. Finally, in step S203, the controller 203 instructs the user to take out the printer from the unpacking opening by using AR technology. FIGS. 14A to 14C are views illustrating examples of images (AR images) displayed by using the AR technology. FIGS. 14A and 14B illustrate AR images before opening of the unpacking opening OP and FIG. 14C illustrates an AR image notifying the user of completion of unpacking.
  • FIG. 8 is a flowchart illustrating the process of S104 in FIG. 6 in detail and illustrates the process in unpacking. In steps S401 to S403, the controller 203 checks whether the packaging materials 103 are removed or not in order of priority, starting from the smallest priority number in the packaging material information table T. For example, in S401, the controller 203 checks whether the packaging materials with priority of 1 in the table T are present or absent one by one. In the case where at least one packaging material 103 with priority of 1 is present, the controller 203 performs AR guidance by using the display unit 204 of the smartphone 200 until confirming removal of the packaging material 103 with the priority of 1. Then, the controller 203 confirms removal of the packaging materials with priority of 2 and the packaging materials with priority of 3 one by one and, in the case where the packaging materials with priority of n are removed from the product 10, the processing proceeds to S404. In S404, the controller 203 displays a notification to the user indicating that all packaging materials 103 are correctly removed, on the display unit 204 of the smartphone 200 and terminates the process of S104.
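  • A minimal sketch of this loop is given below, assuming the table is structured like the PACKAGING_TABLE sketch above and that the helper callables wrap the image-recognition and AR display steps described elsewhere; the function names are hypothetical.

```python
# Minimal sketch of the S401-S403 loop (assumptions: `table` rows carry a
# "priority" field; the helper callables are hypothetical wrappers).
def check_all_materials(table, is_removed, show_ar_guidance, notify_all_removed):
    for row in sorted(table, key=lambda r: r["priority"]):
        # S401/S402: check the packaging materials one by one in priority order
        while not is_removed(row):
            show_ar_guidance(row)      # keep guiding until removal is confirmed
    notify_all_removed()               # S404: all packaging materials correctly removed
```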
  • FIG. 9 is a flowchart more specifically illustrating the process illustrated in FIG. 8. A specific check process performed by the controller 203 based on the table T of FIG. 4 is as illustrated in S501 to S505.
  • First, in S501, the controller 203 checks whether the fifth inner packaging material (left) 102 e as a cushioning material and the fourth inner packaging material (right) 102 d as a cushioning material that are the packaging materials with priority of 1 are present and performs AR guidance until determining that the fifth inner packaging material 102 e and the fourth inner packaging material 102 d are absent.
  • In S502, the controller 203 checks whether the protective vinyl product 102 c that is the third inner packaging material with priority of 2 is present and performs AR guidance until determining that the third inner packaging material 102 c is absent. In S503, the controller 203 checks whether the orange tape 102 a that is the first inner packaging material with priority of 3 is present. The orange tape 102 a is provided in the location (1) extending on both of a side surface and a front surface of the printer main body 100 and the controller 203 performs AR guidance until determining that the orange tape 102 a is removed from the location (1) and is absent.
  • In S504, the controller 203 checks whether the orange tape 102 b that is the second inner packaging material with priority of 4 is present. The second inner packaging material 102 b is provided in the location (2) extending on both of the front surface and an upper surface of the printer main body 100 and the controller 203 performs AR guidance until determining that the second inner packaging material 102 b is removed from the location (2) and is absent. Then, in S505, the controller 203 notifies the user that all packaging materials are correctly removed, by using the display unit 204 of the smartphone 200.
  • FIG. 10 is a flowchart illustrating the process of S403 in FIG. 8 in detail. FIG. 10 illustrates steps of checking the packaging materials 103 with priority of n one by one to check whether each packaging material 103 is correctly removed.
  • First, in S601, the controller 203 obtains information on the packaging material 103 being the target from the table T illustrated in FIG. 4. Next, in S602, the controller 203 determines whether the orientation of the product 10 needs to be identified to specify the position of the packaging material 103 being the target, based on the information obtained in S601. This determination is performed by obtaining information of “necessary” or “unnecessary” defined in the item of “orientation identification” corresponding to the packaging material being the target. In the case where the controller 203 determines that the orientation identification is necessary, in step S603, the controller 203 performs image recognition and analysis based on the captured image data of the product captured by the camera 202 of the smartphone 200 and determines the orientation of the product. A method using deep learning, pattern matching using the identifiers provided in the product, or the like is employed as a method of the image recognition.
  • Next, in S604, the controller 203 specifies the position of the packaging material 103 based on the position information in the table T and the orientation information of the product 10 determined in S603. The position information includes information indicating a portion of the product 10 where each of the packaging materials 103 is present. The controller 203 can specify the position of the packaging material 103 with respect to the captured image obtained by the camera 202 of the smartphone 200 by combining the position information and the orientation information.
  • Next, in S605, the controller 203 determines whether the captured image obtained by the camera 202 of the smartphone 200 includes the packaging material 103 of the product 10. If the controller 203 determines that the captured image does not include the packaging material 103, the processing proceeds to S606 and the controller 203 performs AR guidance instructing the user to take an image of the position where the packaging material 103 is provided with the camera 202 of the smartphone 200, on the display unit 204 of the smartphone 200.
  • Next, in S607, the controller 203 determines whether the packaging material 103 being the target of removal is present or absent based on the identifier 105. Specifically, in the case where the identifier 105 is configured to become exposed by the removal of the packaging material 103, the controller 203 determines that the packaging material 103 is removed if the image captured by the camera 202 includes the identifier 105. Meanwhile, in the case where the identifier 105 is provided in the packaging material 103, the controller 203 determines that the packaging material 103 is removed if the image captured by the camera 202 does not include the identifier.
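  • This decision can be summarized as follows; the sketch assumes the “determination method” item of table T takes the values “present” or “absent” as described with FIG. 4, and that `identifier_in_image` is the result of an image-recognition check such as the template matching shown earlier.

```python
# Minimal sketch of the S607 decision based on the "determination method" item.
def packaging_material_removed(identifier_in_image: bool, determination_method: str) -> bool:
    if determination_method == "present":
        # The identifier becomes exposed by removal: finding it means the material is gone.
        return identifier_in_image
    # The packaging material itself acts as the identifier: not finding it means it is gone.
    return not identifier_in_image
```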
  • Next, in S608, the controller 203 determines whether the packaging material 103 is removed. If the controller 203 determines that the packaging material 103 is not removed, the processing proceeds to S609 and the controller 203 performs AR guidance of prompting the user to remove the packaging material 103 on the display unit 204. Then, the processing returns to S607. If the controller 203 determines that the packaging material 103 is removed in S608, the processing proceeds to S610 and the controller 203 displays an AR image indicating that the packaging material 103 is correctly removed on the display unit 204. This notification allows the user to recognize that work is performed in an orderly manner. Accordingly, erroneous work such as returning the removed packaging material 103 to its original position is not performed.
  • Thereafter, in S611, the controller 203 determines whether all packaging materials 103 are removed. If the controller 203 determines that all packaging materials 103 are removed, the series of processes is terminated. If there is still a not-removed packaging material, the controller 203 continues performing the processes again from step S601.
  • FIG. 11 is a flowchart more specifically illustrating the process of “checking the orange tape” described in S503 of FIG. 9. First, in S701, the controller 203 refers to the packaging material information table T and obtains information on the packaging material being the target. In this case, the packaging material 103 being the target is the orange tape and the controller 203 needs to recognize the orientation of the product 10 to specify the position of the orange tape. Accordingly, in the subsequent S703, the controller 203 recognizes the orientation of the product 10.
  • Next, in S704, the controller 203 specifies the position where the orange tape 102 a is provided based on the position information obtained from the table T and the orientation information of the product recognized in S703. In S705, the controller 203 determines whether the position (target position) where the orange tape 102 a is provided is included in the image captured by the camera 202 of the smartphone 200. If the controller 203 determines that the position where the orange tape 102 a is provided is not captured, the processing proceeds to S706 and the controller 203 performs AR guidance instructing the user to take an image of the target position where the orange tape 102 a is provided with the smartphone 200, on the display unit 204.
  • In step S707, the controller 203 determines whether the orange tape 102 a is present or absent at the target position based on the identifier 105. Specifically, in the case where the identifier 105 is configured to be exposed by the removal of the orange tape 102 a, the controller 203 determines that the orange tape is removed if the identifier 105 is present. Meanwhile, in the case where the orange tape has the function of the identifier, the controller 203 determines that the packaging material 103 is removed if the orange tape is absent.
  • Next, in S708, the controller 203 determines whether the orange tape 102 a is removed. If the controller 203 determines that the orange tape 102 a is not removed (NO), in S709, the controller 203 displays an AR image instructing the user to remove the orange tape 102 a on the display unit 204. Meanwhile, if the controller 203 determines that the orange tape 102 a is removed in S708, the controller 203 notifies the user of correct removal of the orange tape 102 a via the display unit 204 in S710.
  • FIGS. 15A to 15C illustrate examples of AR images giving instructions to the user by using the AR technology. FIG. 15A is a view illustrating an AR image instructing the user to take an image of the position where the orange tape 102 a is provided with the smartphone 200 in the process of S706 of FIG. 11. FIG. 15B is a view illustrating an AR image instructing the user to remove the orange tape in S709 of FIG. 11. FIG. 15C is a view illustrating an AR image notifying the user of correct removal of the packaging material 103 in S710 of FIG. 11.
  • FIG. 12 is a flowchart illustrating an interruption process performed in the middle of the process of S403 in FIG. 8 to terminate the process by a user operation. In the case where the user inputs removal of the packaging material 103 being the target in the middle of the check process of the packaging material 103 with priority of n described in S403, the controller 203 executes the interruption process in S403 a and terminates the process of S403. The controller 203 can thereby terminate the process of S403 at the moment when the user finishes removing the packaging material 103, without waiting for completion of all check processes illustrated in FIG. 8. Accordingly, it is possible to swiftly proceed to the subsequent post-unpacking process. Methods by which the user inputs the removal of the packaging material 103 being the target include, for example, an input using a voice user interface (UI) or a touch input on a UI element such as a complete button or a check box displayed on the display unit 204 of the smartphone 200.
  • FIG. 13 is a flowchart specifically illustrating the post-unpacking process in S105 of FIG. 6. First, in S801, the controller 203 instructs the user to connect an AC cable (attachment member) to the printer main body 100 by AR guidance. Specifically, the controller 203 displays an image of AR guidance that explicitly indicates a socket for the AC cable provided in the printer main body 100 and that instructs the user to perform an operation of connecting the AC cable to this socket on the display unit 204 of the smartphone 200.
  • Next, in S802, the controller 203 instructs the user to turn on the power by operating a soft key provided in the printer main body 100 by using an AR image. Specifically, the controller 203 explicitly indicates a power button provided in the printer main body 100 and gives an instruction on a method of operating the power button by using the AR image. The user can thereby perform the work without confusion even if the power-on operation includes an operation unique to the electronic device, such as a long press of the power ON button.
  • Next, in S803 and S804, the controller 203 instructs the user to perform initial set-up work (initial setting work) on the printer main body 100 by using AR images. Specifically, in S803, the controller 203 instructs the user to attach an ink tank (attachment member) to the printer main body 100 by using an AR image. In this case, the controller 203 recognizes the ink tank by image recognition and performs guidance of correctly removing a packaging of the ink tank and then attaching the ink tank to a correct position in the printer main body 100. This can prevent occurrence of errors and the like that are caused by erroneous attachment of the ink tank or failure to remove the packaging material.
  • In S804, the controller 203 instructs the user to perform registration adjustment in the printer main body 100 by using an AR image. Specifically, the controller 203 instructs the user to perform setting of print sheets and operations on a UI provided in the printer main body 100 by using an AR image.
  • OTHER EMBODIMENTS
  • In the aforementioned embodiment, description is given of an example in which the guidance using the AR images is performed to assist the user in the unpacking work of the product 10 in which the printer main body 100 is packaged in the packaging materials 103. However, the present disclosure can also be applied to various products other than printers that require unpacking work. For example, installation work of personal computers and measurement devices that require complex unpacking work and setting work, large devices that require support by service staff, and similar devices can be assisted by using the AR technology as in the aforementioned first embodiment by embedding identifier information in packaging materials and a device main body.
  • In the aforementioned first embodiment, description is given of the example in which the processes described in S3 of FIG. 5 are performed in the smartphone 200. However, the server 302 provided in the cloud server 300 may perform some of the processes described in S3 of FIG. 5. For example, since the cloud server 300 includes a processor such as a CPU like the smartphone 200, the cloud server 300 can perform the process of generating an AR image by superimposing information for performing AR guidance on an image captured by the camera of the smartphone 200. Then, the cloud server 300 can send the generated AR image from the cloud server 300 to the smartphone 200 to display the AR image on the display unit 204.
  • Moreover, although the example in which the guidance using the AR images is performed by using the information obtained from the cloud server is described in the aforementioned embodiment, the present disclosure is not limited to this configuration. Specifically, the AR images can be generated based only on information provided in a terminal such as the smartphone, without using an external server such as the cloud server. For example, it is possible to store an AR guidance application including table information as illustrated in FIG. 4 in the smartphone and perform the AR guidance without communicating with an external server.
  • Embodiment(s) of the present disclosure can also be realized by a computer of a system or apparatus that reads out and executes computer executable instructions (e.g., one or more programs) recorded on a storage medium (which may also be referred to more fully as a ‘non-transitory computer-readable storage medium’) to perform the functions of one or more of the above-described embodiment(s) and/or that includes one or more circuits (e.g., application specific integrated circuit (ASIC)) for performing the functions of one or more of the above-described embodiment(s), and by a method performed by the computer of the system or apparatus by, for example, reading out and executing the computer executable instructions from the storage medium to perform the functions of one or more of the above-described embodiment(s) and/or controlling the one or more circuits to perform the functions of one or more of the above-described embodiment(s). The computer may comprise one or more processors (e.g., central processing unit (CPU), micro processing unit (MPU)) and may include a network of separate computers or separate processors to read out and execute the computer executable instructions. The computer executable instructions may be provided to the computer, for example, from a network or the storage medium. The storage medium may include, for example, one or more of a hard disk, a random-access memory (RAM), a read only memory (ROM), a storage of distributed computing systems, an optical disk (such as a compact disc (CD), digital versatile disc (DVD), or Blu-ray Disc (BD)™), a flash memory device, a memory card, and the like.
  • While the present disclosure has been described with reference to exemplary embodiments, it is to be understood that the invention is not limited to the disclosed exemplary embodiments. The scope of the following claims is to be accorded the broadest interpretation so as to encompass all such modifications and equivalent structures and functions.
  • This application claims the benefit of Japanese Patent Application No. 2019-199390 filed Oct. 31, 2019, which is hereby incorporated by reference herein in its entirety.

Claims (13)

What is claimed is:
1. An information processing method comprising:
an image capturing step of capturing an image of a product and obtaining captured image data;
a first obtaining step of obtaining product information relating to initial installation of the product based on the captured image data;
a second obtaining step of obtaining a determination result of an installation state of the product based on the captured image data and the product information;
a third obtaining step of obtaining combined image data expressing a combined image obtained by combining the captured image and an instruction image of installation work to be performed on the product by a user, depending on the determined installation state; and
a display control step of causing a display unit to display the combined image based on the combined image data.
2. The information processing method according to claim 1, wherein the product includes at least a device main body and a packaging material covering the device main body.
3. The information processing method according to claim 2, wherein the product information includes information on the device main body and information on the packaging material.
4. The information processing method according to claim 2, wherein the second obtaining step includes obtaining a determination result of a state of the packaging material based on the captured image data of the product and the information on the packaging material.
5. The information processing method according to claim 4, wherein the second obtaining step includes obtaining a determination result indicating one of states of pre-unpacking, unpacking, and post-unpacking which the packaging material of the product is in.
6. The information processing method according to claim 2, wherein
the packaging material includes an outer packaging material forming an outer surface of the product and an inner packaging material housed in the outer packaging material and covering the device main body, and
the outer packaging material and the inner packaging material are provided respectively with identifiers.
7. The information processing method according to claim 6, wherein the second obtaining step includes obtaining a determination result indicating a state of the packaging material based on presence or absence of the identifiers in the captured image data of the product and orientations of the identifiers in the captured image.
8. The information processing method according to claim 2, wherein
the third obtaining step includes obtaining AR image data obtained by combining the captured image data captured in the image capturing step and instruction image data giving an instruction of a method of removing the packaging material, and
the display control step includes causing the display unit to display an AR image according to the AR image data.
9. The information processing method according to claim 2, wherein the display control step includes causing the display unit to display an AR image obtained by combining the captured image captured in the image capturing step and an instruction image giving instructions of a method of attaching an attachment member to be attached to the device main body and a method of adjusting the device main body.
10. The information processing method according to claim 1, wherein
at least the image capturing step, the first obtaining step, the second obtaining step, the third obtaining step, and the display control step are performed in a terminal operable by a user,
the product information is stored in a server communicable with the terminal, and
the first obtaining step includes obtaining the product information from the server by sending specification information specifying the product to the server.
11. The information processing method according to claim 10, wherein the specification information is information obtained by reading a two-dimensional code illustrated on an outer packaging material forming an outer surface of the product.
12. An information processing apparatus comprising:
a display control unit that causes a display unit capable of displaying a captured image of a product captured by an image capturing unit to display the captured image based on captured image data expressing the captured image;
a storage unit storing product information relating to initial installation of the product;
a first obtaining unit that obtains the product information from the storage unit;
a second obtaining unit that obtains a determination result of an installation state of the product based on the captured image data and the product information obtained by the first obtaining unit from the storage unit; and
a third obtaining unit that obtains combined image data expressing a combined image obtained by combining the captured image captured by the image capturing unit and an instruction image of installation work to be performed on the product by a user, depending on the installation state obtained by the second obtaining unit, wherein
the display control unit causes the display unit to display the combined image based on the combined image data.
13. An information processing system comprising:
a display control unit that causes a display unit to display a captured image of a product captured by an image capturing unit based on captured image data expressing the captured image;
a first obtaining unit that obtains product information relating to initial setting of the product based on the captured image data;
a second obtaining unit that obtains a determination result of an installation state of the product based on the captured image data and the product information of the product; and
a third obtaining unit that obtains combined image data expressing a combined image obtained by combining the captured image captured by the image capturing unit and an instruction image of installation work to be performed on the product by a user, depending on the installation state obtained by the second obtaining unit, wherein
the display control unit causes the display unit to display the combined image based on the combined image data.
US17/079,215 2019-10-31 2020-10-23 Information processing method, information processing apparatus, and information processing system Active US11743415B2 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2019-199390 2019-10-31
JP2019199390A JP2021072001A (en) 2019-10-31 2019-10-31 Program, information processing device, information processing method, and information processing system

Publications (2)

Publication Number Publication Date
US20210133450A1 (en) 2021-05-06
US11743415B2 (en) 2023-08-29

Family

ID=75688710

Family Applications (1)

Application Number Title Priority Date Filing Date
US17/079,215 Active US11743415B2 (en) 2019-10-31 2020-10-23 Information processing method, information processing apparatus, and information processing system

Country Status (2)

Country Link
US (1) US11743415B2 (en)
JP (1) JP2021072001A (en)

Family Cites Families (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120249588A1 (en) * 2011-03-22 2012-10-04 Panduit Corp. Augmented Reality Data Center Visualization
KR101898754B1 (en) * 2011-05-23 2018-09-13 레고 에이/에스 A toy construction system for augmented reality
US9015245B1 (en) * 2011-07-20 2015-04-21 Google Inc. Experience sharing with commenting
US9235819B2 (en) * 2011-11-04 2016-01-12 Canon Kabushiki Kaisha Printing system, image forming apparatus, and method
US9338622B2 (en) * 2012-10-04 2016-05-10 Bernt Erik Bjontegard Contextually intelligent communication systems and processes
US9401849B2 (en) * 2013-07-02 2016-07-26 Hitachi, Ltd. Network construction support system and method
US20150130834A1 (en) * 2013-11-11 2015-05-14 International Business Machines Corporation Interactive augmented reality for memory dimm installation
US11257143B2 (en) * 2014-12-30 2022-02-22 Hand Held Products, Inc. Method and device for simulating a virtual out-of-box experience of a packaged product
JP6572600B2 (en) * 2015-04-09 2019-09-11 セイコーエプソン株式会社 Information processing apparatus, information processing apparatus control method, and computer program
WO2016171894A1 (en) * 2015-04-20 2016-10-27 NSF International Computer-implemented methods for remotely interacting with performance of food quality and workplace safety tasks using a head mounted display
US9972133B2 (en) * 2015-04-24 2018-05-15 Jpw Industries Inc. Wearable display for use with tool
WO2018102355A1 (en) * 2016-11-29 2018-06-07 Wal-Mart Stores, Inc. Augmented reality-assisted modular set-up and product stocking systems and methods
US11348475B2 (en) * 2016-12-09 2022-05-31 The Boeing Company System and method for interactive cognitive task assistance
CA3005051A1 (en) * 2017-05-16 2018-11-16 Michael J. Schuster Augmented reality task identification and assistance in construction, remodeling, and manufacturing
US10650609B2 (en) * 2018-02-23 2020-05-12 Sap Se Virtual prototyping and assembly validation
KR102605342B1 (en) * 2019-08-06 2023-11-22 엘지전자 주식회사 Method and apparatus for providing information based on object recognition, and mapping apparatus therefor

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090055476A1 (en) * 2005-07-27 2009-02-26 Markus Michael J Collections of linked databases and systems and methods for communicating about updates thereto

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114079705A (en) * 2020-08-21 2022-02-22 精工爱普生株式会社 Method for setting image processing apparatus, recording medium, and image processing system
US20220057970A1 (en) * 2020-08-21 2022-02-24 Seiko Epson Corporation Method for setting image processing apparatus, storage medium, and image processing system
US11816367B2 (en) * 2020-08-21 2023-11-14 Seiko Epson Corporation Method for setting image processing apparatus, storage medium, and image processing system
US20220083776A1 (en) * 2020-09-11 2022-03-17 Schlage Lock Company Llc Technologies for leveraging machine learning for customized installation of access control hardware
US11922076B2 (en) * 2022-01-18 2024-03-05 Seiko Epson Corporation Print control device, print control method, three-dimensional object printing apparatus, and non-transitory computer-readable storage medium storing program

Also Published As

Publication number Publication date
US11743415B2 (en) 2023-08-29
JP2021072001A (en) 2021-05-06

Similar Documents

Publication Publication Date Title
US11743415B2 (en) Information processing method, information processing apparatus, and information processing system
US11704085B2 (en) Augmented reality quick-start and user guide
US10529335B2 (en) Auto-complete methods for spoken complete value entries
US10217089B2 (en) System and method for guided printer servicing
US11443363B2 (en) Confirming product location using a subset of a product identifier
US11185977B2 (en) Information processing apparatus, grasping system, and information processing method
JP6439566B2 (en) Multilingual display device, multilingual display system, multilingual display method, and multilingual display program
EP3217260B1 (en) Maintenance support method, maintenance support system, and maintenance support program
CN109191730A (en) Information processing unit
US10769489B2 (en) Reading test cards using a mobile device
US9471830B2 (en) Collation apparatus, collation method, and computer program product
US10257372B2 (en) Color measurement system, image generating apparatus, and non-transitory computer readable medium for performing color measurement on an image displayed in an augmented reality space
US10484575B2 (en) Image forming apparatus and control method for the same
US20220343308A1 (en) Overlap detection for an item recognition system
CN114080590A (en) Robotic bin picking system and method using advanced scanning techniques
US20200389564A1 (en) System and method for augmented reality device maintenance
US20170337402A1 (en) Tool verification systems and methods for a workflow process
JP2009289046A (en) Operation support device and method using three-dimensional data
KR20210138477A (en) Electronic device for providing online examination and method thereof
JP2015191473A (en) Image forming device management system and image forming device
WO2021033310A1 (en) Processing device, processing method, and program
US9911070B2 (en) Improving product packing operation efficiency
US11695880B2 (en) Information processing method and information processing apparatus to display an image that is a captured image on which electronic information is superimposed
US20230085797A1 (en) Analysis device, analysis system, analysis method, and storage medium
US20230267292A1 (en) Subregion transformation for label decoding by an automated checkout system

Legal Events

Date Code Title Description
FEPP Fee payment procedure

Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

STPP Information on status: patent application and granting procedure in general

Free format text: APPLICATION DISPATCHED FROM PREEXAM, NOT YET DOCKETED

AS Assignment

Owner name: CANON KABUSHIKI KAISHA, JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:NAKAYAMA, TAKAHIRO;REEL/FRAME:055188/0905

Effective date: 20210105

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: FINAL REJECTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE AFTER FINAL ACTION FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: ADVISORY ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: PUBLICATIONS -- ISSUE FEE PAYMENT RECEIVED

STCF Information on status: patent grant

Free format text: PATENTED CASE