WO2024013777A1 - Shopping assistance system, shopping assistance method, and recording medium - Google Patents


Info

Publication number
WO2024013777A1
Authority
WO
WIPO (PCT)
Prior art keywords
display
product
customer
support system
shopping support
Prior art date
Application number
PCT/JP2022/027181
Other languages
French (fr)
Japanese (ja)
Inventor
佑樹 鶴岡
明彦 大仁田
祐史 丹羽
峰 三宅
Original Assignee
日本電気株式会社 (NEC Corporation)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 日本電気株式会社 (NEC Corporation)
Priority to PCT/JP2022/027181
Publication of WO2024013777A1

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06Q - INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q30/00 - Commerce
    • G06Q30/06 - Buying, selling or leasing transactions

Definitions

  • the present disclosure relates to a shopping support system and the like.
  • Dome-shaped displays are known as means for displaying virtual spaces and images (for example, Patent Documents 1 and 2).
  • a wearable information processing terminal is known as a means for displaying virtual spaces and images (for example, Patent Document 3).
  • the user may perform operations using a controller.
  • When operations are performed using a controller, there is a problem in that the operations are difficult for some users. For example, when shopping at a store in a virtual space or at a remote store, it may be troublesome for some users to use a controller to perform operations such as viewing products displayed at the store.
  • An example of the purpose of the present disclosure is to provide a shopping support system that can improve convenience when using a large display.
  • A shopping support system according to the present disclosure includes first output control means for causing a first display to display information regarding a sales floor of a store where products are displayed, product detection means for detecting, based on a customer's behavior, a product that the customer wants to acquire, and second output control means for causing a second display to display information regarding the detected product.
  • A shopping support method according to the present disclosure includes causing a first display to display information regarding a sales floor of a store where products are displayed, detecting, based on a customer's behavior, a product that the customer wants to acquire, and causing a second display to display information regarding the detected product.
  • A program according to the present disclosure causes a computer to execute processing including causing a first display to display information regarding a sales floor of a store where products are displayed, detecting, based on a customer's behavior, a product that the customer wants to acquire, and causing a second display to display information regarding the detected product.
  • the program may be stored in a computer-readable non-transitory recording medium.
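  • As a rough structural illustration of the system, method, and program described above, the following Python sketch shows one possible decomposition into the three means; all class and method names are hypothetical and are not taken from this publication.

```python
# Minimal structural sketch of the claimed configuration; names are illustrative only.
from dataclasses import dataclass
from typing import Optional


@dataclass
class Product:
    product_id: str
    name: str
    price: int


class FirstOutputControl:
    """Displays information regarding the sales floor on the first (large) display."""
    def show_sales_floor(self, sales_floor_video) -> None:
        ...  # render the video/image of the sales floor on the dome-shaped display


class ProductDetection:
    """Detects the product the customer wants to acquire from the customer's behavior."""
    def detect(self, customer_behavior) -> Optional[Product]:
        ...  # hand movement, line of sight, or conversation -> product (or None)


class SecondOutputControl:
    """Displays information regarding the detected product on the second display."""
    def show_product(self, product: Product) -> None:
        ...  # render the product image/video and related information on the aerial display


class ShoppingSupportSystem:
    def __init__(self) -> None:
        self.first_output = FirstOutputControl()
        self.detector = ProductDetection()
        self.second_output = SecondOutputControl()

    def step(self, sales_floor_video, customer_behavior) -> None:
        self.first_output.show_sales_floor(sales_floor_video)
        product = self.detector.detect(customer_behavior)
        if product is not None:
            self.second_output.show_product(product)
```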
  • FIG. 1 is a block diagram showing an example of the configuration of a shopping support system according to a first embodiment.
  • FIG. 2 is a flowchart illustrating an example of the operation of the shopping support system according to the first embodiment.
  • FIG. 3 is an explanatory diagram showing an example of a dome-shaped display.
  • FIG. 4 is an explanatory diagram showing an installation example of an aerial display and an installation example of devices around a customer.
  • FIG. 5 is an explanatory diagram showing an example of a sales floor of a store.
  • FIG. 6 is an explanatory diagram simply showing an aerial display.
  • FIG. 7 is a block diagram showing an example of the configuration of a shopping support system according to a second embodiment.
  • FIG. 8 is an explanatory diagram showing an example in which a store's sales floor is displayed on a dome-shaped display.
  • FIG. 9 is an explanatory diagram showing an example of detecting a product from a customer's hand movement and an example of displaying information regarding the product on an aerial display.
  • FIG. 10 is an explanatory diagram showing an example of detecting a product from the movement of a customer's line of sight and an example of displaying information regarding the product on an aerial display.
  • FIG. 11 is an explanatory diagram showing an example of detecting a product from a conversation and an example of displaying information regarding the product on an aerial display.
  • FIG. 12 is an explanatory diagram showing an example of changing the orientation of a displayed product by changing the position and orientation of an imaging device that images the product.
  • FIG. 13 is an explanatory diagram showing an example of changing the orientation of a displayed product by switching the imaging device.
  • FIG. 14 is an explanatory diagram showing an example of changing the orientation of a displayed product by rotating the stand.
  • FIG. 15 is a flowchart illustrating an example of the operation of the shopping support system according to the second embodiment.
  • FIG. 16 is an explanatory diagram showing an example of implementation of a shopping support system.
  • FIG. 17 is an explanatory diagram showing an example of the hardware configuration of a computer.
  • Embodiments of a shopping support system, a shopping support method, a program, and a non-transitory recording medium for recording the program according to the present disclosure will be described in detail below with reference to the drawings.
  • This embodiment does not limit the disclosed technology.
  • each embodiment may be explained using an aerial display as an example of a non-contact display.
  • Although shopping is taken as an example, the user of the first display is called a customer, and the system is described as a shopping support system, use cases in which the first display is combined with a second display such as an aerial display are not limited to shopping.
  • FIG. 1 is a block diagram showing an example of a configuration of a shopping support system according to a first embodiment.
  • the shopping support system 10 includes a first output control section 101, a product detection section 102, and a second output control section 103.
  • the first output control unit 101 controls the display on the first display.
  • the first display is, for example, a large display.
  • large size means at least larger than the second display.
  • the first display may be a display device structured to cover the customer's field of view. This allows customers to shop in an immersive manner.
  • a dome-shaped display will be described as an example of the first display.
  • the first output control unit 101 causes a dome-shaped display to display information regarding the sales floor of a store where products are displayed.
  • the information regarding the sales floor is a video of the sales floor, an image of the sales floor, or the like. Note that the video here refers to a moving image, and the image refers to a still image.
  • the first output control unit 101 may cause the dome-shaped display to display information regarding other products, such as the name of the sales floor, as additional information, along with the video of the sales floor and the image of the sales floor.
  • stores include various retail stores such as supermarkets, super centers, convenience stores, mass retailers, home centers, drug stores, stores that handle apparel products, and department stores, as well as individual stores such as bakeries and delicatessen stores, and are not particularly limited.
  • the product detection unit 102 detects products that the customer wants to acquire based on the customer's behavior.
  • the product that the customer wants to acquire is the product that the customer wants to check the details of.
  • a product that a customer wants to acquire is a product that attracts attention of the customer.
  • the customer's behavior is, for example, the customer's movement.
  • the customer's behavior may be a specific movement of the customer's body, such as a movement of the customer's hand or a movement of the customer's line of sight.
  • the product detection unit 102 detects a customer's movement from an image captured by an imaging device.
  • the product detection unit 102 specifies the position toward which the customer extends his/her hand based on the movement of the customer's hand, and detects the product according to the specified position.
  • the product corresponding to the specified position is, for example, a product displayed on the dome-shaped display that is on an extension of the specified position, a product that is within a predetermined range from the specified position, etc.
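  • As a simple illustration of this position-based detection, the following sketch picks the displayed product closest to the point the customer reaches toward, within a fixed range; the product record layout and the 100 mm threshold are assumptions for illustration only.

```python
import math

REACH_RANGE_MM = 100.0  # assumed "predetermined range" around the reached position


def detect_product_by_hand(reach_point, displayed_products):
    """reach_point: (x, y) on the first display that the extended hand points to.
    displayed_products: list of dicts with 'product_id' and 'position' (x, y)."""
    best, best_dist = None, float("inf")
    for product in displayed_products:
        px, py = product["position"]
        dist = math.hypot(px - reach_point[0], py - reach_point[1])
        if dist < best_dist:
            best, best_dist = product, dist
    return best if best_dist <= REACH_RANGE_MM else None
```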
  • the customer's behavior may be a conversation between the customer and a store clerk. Therefore, the product detection unit 102 may detect the customer's behavior by performing voice recognition on the conversation.
  • the second output control unit 103 controls the display on the second display.
  • the second display is a display placed close at hand to the customer.
  • the second display is, for example, a display that is closer to the customer than the first display.
  • the second display is between the customer and the first display.
  • the second display is a display installed at a position where the customer can easily view the display.
  • the display area of the second display is smaller than the display area of the first display.
  • The first display displays information about the sales floor, such as images and videos of the sales floor, while the second display displays information about individual products that have attracted attention, such as images and videos of a product selected by the customer. For this reason, for example, it is desirable that the second display be installed closer to the customer than the dome-shaped display and display information about the product in a way that is easy to see.
  • the second display may be a fixed display or a portable display.
  • a non-contact type display will be described as an example, but a contact type display may be used.
  • the display target is not particularly limited, such as images, videos, characters and numbers, and special codes that can be read by any device.
  • the second output control unit 103 displays information regarding the detected product on a non-contact display.
  • the information regarding the product includes an image of the product, a video of the product, information related to the product, and the like.
  • the information related to the product includes characters, numbers, codes, etc. related to the product, such as the price of the product, the name of the product, the place of production, and the manufacturer of the product, and is not particularly limited.
  • the product code may be a code representing characters and numbers related to the product.
  • The video of the product may be a video captured in real time or a video captured in advance.
  • Similarly, the image of the product may be a new image captured in real time, or may be an image captured in advance. Products also appear in videos and images of the sales floor, but a video or image of a product here is one that focuses on that product.
  • FIG. 2 is a flowchart showing an example of the operation of the shopping support system 10 according to the first embodiment.
  • the first output control unit 101 causes the dome-shaped display to display information regarding the sales floor of the store (step S101).
  • the product detection unit 102 determines whether the product that the customer wants to acquire is detected (step S102). In step S102, the product detection unit 102 detects a product that the customer wants to acquire based on the customer's behavior. If the product that the customer wants to acquire is not detected (step S102: No), the product detection unit 102 returns to step S102.
  • If the product that the customer wants to acquire is detected (step S102: Yes), the second output control unit 103 displays information regarding the detected product on the non-contact display (step S103). The shopping support system 10 then ends the process.
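  • Reusing the hypothetical classes sketched earlier, the flow of FIG. 2 could be expressed as follows; the sales-floor video and the behavior-observation callback are placeholders for the camera and microphone input.

```python
# Sketch of steps S101 to S103 of FIG. 2 (first embodiment).
def run_embodiment1(system, sales_floor_video, observe_customer_behavior):
    system.first_output.show_sales_floor(sales_floor_video)            # step S101
    product = None
    while product is None:                                              # step S102 (No -> repeat)
        product = system.detector.detect(observe_customer_behavior())
    system.second_output.show_product(product)                          # step S103
```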
  • the shopping support system 10 displays the store on the first display, which is a large display, and displays the products detected based on the customer's behavior on the second display.
  • This makes it possible to improve convenience when using a large display such as a dome-shaped display. Therefore, a more realistic shopping experience can be provided.
  • For example, the shopping support system 10 displays information about the store's sales floor on a dome-shaped display, and when a displayed product is selected by a customer, displays a large image or video of the selected product on a display at hand. Additional information about the selected product can also be displayed.
  • the second display is installed between the customer and the first display.
  • an aerial display is used as the second display
  • Because the display of the aerial display is difficult for surrounding people to see, information that the customer does not want other people to see, such as the customer's personal information, can be presented to the customer via the aerial display.
  • the first display is large, there is a risk that other people may see the information displayed. In this way, information that the user does not want others to see is content that is difficult to display on the first display. Furthermore, when an aerial display is used as the second display, the customer may be able to view the product three-dimensionally.
  • When a wearable information processing terminal such as an HMD (Head Mounted Display) is used, the user may not be able to see his/her hand while the virtual space is displayed.
  • In contrast, when a dome-shaped display is used as the display means, the user has the advantage of being able to see his or her hand even while the virtual space is displayed. In addition, a second display such as a spatial display can be installed.
  • When a dome-shaped display is used, it can be used for general purposes. For example, it is assumed that a dome-shaped display is installed in a place where everyone can access it, and that various users use the dome-shaped display. In such cases, some users may be unfamiliar with the controller. According to the first embodiment, it is possible to improve the operability during shopping. By improving convenience when using a dome-shaped display as in the first embodiment, it is possible to further enhance versatility. In addition, since videos and images of the sales floor are displayed on the dome-shaped display 21 that covers the customer's field of view, the customer can shop with an immersive feeling. Therefore, it is possible to provide a shopping experience closer to actual shopping.
  • the operation by the controller and the shopping support system 10 may be used in combination as appropriate.
  • Next, Embodiment 2 will be described in detail with reference to the drawings.
  • In Embodiment 2, an example of a dome-shaped display as the first display, an installation example of an aerial display as the second display, and the like will be described.
  • In Embodiment 2, an example will also be described in which the display of the dome-shaped display is controlled in order to make the displayed product easier to see while information about the product is displayed on the aerial display.
  • a description of contents that overlap with the above description will be omitted to the extent that the description of the second embodiment is not unclear.
  • FIG. 3 is an explanatory diagram showing an example of a dome-shaped display.
  • FIG. 4 is an explanatory diagram showing an example of installing an aerial display and an example of installing devices around a customer.
  • the dome-shaped display 21 includes, for example, a screen 2101, a projection device 2102, and a table 2103.
  • When the dome-shaped display 21 displays a video or image, the projection device 2102 projects the video or image onto the screen 2101.
  • the aerial display 22 is installed on the table 2103. Further, in FIG. 4, the screen 2101 of the dome-shaped display 21 and the projection device 2102 are omitted. In FIG. 4, a customer is sitting on a chair and looking at apples displayed on the dome-shaped display and the aerial display 22.
  • Although FIG. 4 shows an example in which the table 2103 is viewed from the side, in reality the aerial image of the aerial display 22, which will be described later, may have no thickness.
  • When the aerial image is viewed from the side in the thickness direction, the aerial image cannot be seen; however, in the following explanation, for ease of understanding, an example in which an apple or the like is displayed is used.
  • an imaging device 23 and a recording device 24 may be installed at the location where the dome-shaped display 21 is installed.
  • the imaging device 23 and the recording device 24 are used to detect customer behavior.
  • an audio output device such as a speaker for outputting the voice of the store clerk may be installed at the location where the dome-shaped display 21 is installed.
  • the number of imaging devices 23, recording devices 24, and aerial displays 22 is not particularly limited.
  • the imaging device 23 may transmit the captured video or image to a shopping support system or the like.
  • the recording device 24 may transmit the audio to a shopping support system or the like.
  • the dome-shaped display 21 is, for example, a device in which at least a portion of the screen 2101 is curved to cover the customer's field of view.
  • the dome-shaped display 21 may not have a complete dome shape such as 360 degrees, but may have a partially broken shape such as 180 degrees.
  • a portion of the screen 2101 does not need to be curved.
  • the dome-shaped display 21 is sized to create a gap between it and the user that is large enough to allow the installation of the table 2103 on which the aerial display 22 is installed. In this way, the size and type of the dome-shaped display 21 are not particularly limited.
  • the dome-shaped display 21 may be a 180-degree dome-shaped display 21 or a 360-degree dome-shaped display 21.
  • the size of the dome-shaped display 21 may be, for example, approximately 1 meter to 2 meters in length, 1 meter to 2 meters in width, and 1 meter to 2 meters in height.
  • the installation location of the dome-shaped display 21 is not particularly limited.
  • the dome-shaped display 21 may be installed in a customer's home, office, etc., or may be installed in a place where anyone can use it.
  • FIG. 5 is an explanatory diagram showing an example of a store's sales floor.
  • the store may be a store in a virtual space or an actual store.
  • the store in the virtual space may be a virtual store that does not actually exist, or a virtual store that imitates an actual store. If there is an actual store and an image of the actual store is displayed on the dome-shaped display 21, for example, an imaging device 25 and a recording device 26 are installed in the sales floor of the actual store. Further, although not shown, an audio output device such as a speaker for outputting customer voices may be installed in the sales floor of the actual store.
  • imaging devices 25-1 and 25-2 are installed, but the number of imaging devices 25 is not particularly limited.
  • the imaging device 25 images the sales floor of the store. Further, the imaging device 25 may take an image of a store clerk. Further, the imaging device 25 may take images of each product in the store.
  • the imaging device 25-1 may take an image of the sales floor of the store including the store staff, and the imaging device 25-2 may take an image of a specified product.
  • the imaging device 25 may transmit the captured video to a shopping support system or the like.
  • the recording device 26 records or collects the voices of the sales floor. Then, for example, the recording device 26 may transmit the audio to a shopping support system or the like.
  • FIG. 5 shows an example of an actual store
  • a store in a virtual space may also be used.
  • a store in virtual space may be used when there is no actual store.
  • FIG. 6 is an explanatory diagram that simply shows an aerial display.
  • the aerial display 22 includes, for example, an optical element, a display, and a sensor.
  • The optical element passes the light rays emitted by the display and forms the same image on the side opposite to the display.
  • this image is an aerial image.
  • The sensor is used for operations on the aerial image.
  • the sensor is, for example, a motion sensor.
  • an operation that can be detected by a sensor will also be referred to as an operation on the aerial display.
  • the type of aerial display is not particularly limited.
  • FIG. 7 is a block diagram showing an example of the configuration of a shopping support system according to the second embodiment.
  • the shopping support system 20 includes a first output control section 201, a product detection section 202, a second output control section 203, an acquisition section 204, a video analysis section 205, an audio analysis section 206, an imaging device control section 207, a stand control section 208, a superimposition position specifying section 209, a visual position specifying section 210, and an operation receiving section 211.
  • In other words, compared to the shopping support system 10 according to the first embodiment, the shopping support system 20 additionally includes the acquisition unit 204, the video analysis unit 205, the audio analysis unit 206, the imaging device control unit 207, the stand control unit 208, the superimposition position specifying section 209, the visible position specifying section 210, and the operation receiving section 211.
  • the first output control section 201 has the functions of the first output control section 101 according to the first embodiment as a basic function.
  • the product detection unit 202 has the function of the product detection unit 102 according to the first embodiment as a basic function.
  • the second output control section 203 has the functions of the second output control section 103 according to the first embodiment as a basic function.
  • the shopping support system 20 includes a customer DB (DataBase) 2001, a store DB 2002, and a product DB 2003. Note that each functional unit of the shopping support system 20 can refer to and update various databases as appropriate.
  • the customer DB 2001 stores customer video and audio, customer behavior history, and the like.
  • the behavior history may be a history of the customer's movements detected from video, a history of conversation behavior detected from audio, or a history of the customer's operations on the aerial display 22.
  • the store DB 2002 stores store videos, store images, store sounds, and the like.
  • the product DB 2003 stores product identification information and product information in association with each other for each product.
  • the product information includes, for example, a video of the product, an image of the product, a price of the product, a name of the product, a display position of the product on the sales floor, characteristics of the product, and the like.
  • the characteristics of the product may be the color of the product, the shape of the product, the production area of the product, the raw materials of the product, the manufacturing company of the product, and the like.
  • the characteristics of a product may be used to identify the product based on a conversation between a customer and a store employee, so there is no particular limitation on the characteristics as long as the information allows the product to be identified.
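  • As an illustration of the kind of record the product DB 2003 could hold, the following sketch associates product identification information with the product information listed above; the field names are assumptions, not terms from the publication.

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class ProductRecord:
    product_id: str                        # product identification information
    name: str                              # name of the product
    price: int                             # price of the product
    shelf_position: str                    # display position on the sales floor, e.g. "far right"
    color: Optional[str] = None            # characteristics usable to identify the product
    shape: Optional[str] = None
    production_area: Optional[str] = None
    manufacturer: Optional[str] = None
    image_path: Optional[str] = None       # pre-captured image of the product
    video_path: Optional[str] = None       # pre-captured video of the product


# A simple in-memory product DB keyed by product identification information.
product_db: dict[str, ProductRecord] = {}
```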
  • the acquisition unit 204 acquires a captured image of the customer from the imaging device 23.
  • the acquired video data is stored in the customer DB 2001, for example.
  • the acquisition unit 204 acquires the customer's voice from the recording device 24.
  • the acquired voice data is stored in the customer DB 2001, for example.
  • the acquisition unit 204 acquires a captured image of the store from the imaging device 25.
  • the acquired video data is stored in the store DB 2002, for example.
  • the acquisition unit 204 may acquire an image of a store created in advance. For example, when there is no actual store, an image of the store in virtual space may be created.
  • the acquired video data is stored in the store DB 2002, for example.
  • the acquisition unit 204 acquires the store's audio from the recording device 26.
  • the acquired voice data is stored in the store DB 2002, for example.
  • the acquisition unit 204 may acquire audio in the virtual space.
  • the first output control unit 201 causes the dome-shaped display 21 to display information regarding the sales floor of the store where the products are displayed.
  • information regarding the sales floor of a store includes a video of the sales floor, an image of the sales floor, and the like.
  • the first output control unit 201 causes the dome-shaped display 21 to display an image of the sales floor acquired from the imaging device 25. This allows customers to view video of the sales floor captured in real time, giving them the feeling of shopping in an actual store.
  • FIG. 8 is an explanatory diagram showing an example in which the sales floor of a store is displayed on the dome-shaped display 21.
  • the dome-shaped display 21 displays a sales floor as shown in FIG.
  • the customer can sit on a chair and view the sales floor.
  • the first output control unit 201 further causes the dome-shaped display 21 to display information regarding the store clerk.
  • Information regarding the store clerk includes a video of the store clerk, an image of the store clerk, the name of the store clerk, the employee's affiliation, characteristics of the store employee, and the like.
  • the employee's affiliation is the employee's affiliation at the store, such as "in charge of vegetable section," for example.
  • the characteristics of the store clerk may be, for example, qualifications held.
  • the imaging device 25 captures an image of the sales floor including the store employee. Then, the first output control unit 201 causes the dome-shaped display 21 to display the captured video.
  • the first output control section 201 may display an avatar of the store clerk. For example, when displaying a store clerk's avatar, the imaging device 25 images the sales floor. The first output control unit 201 may then combine the captured video with the store clerk's avatar and display the combined video. Alternatively, for example, the first output control unit 201 may combine the store clerk's avatar with the video of the sales floor in the virtual space, and display the combined video.
  • the acquisition unit 204 may acquire the voice of the customer and the store clerk from the recording device 24.
  • the product detection unit 202 detects products that the customer wants to acquire based on the customer's behavior. First, in order for the product detection unit 202 to detect the product, for example, the video analysis unit 205 analyzes the customer's behavior from the video. Then, the product detection unit 202 detects the product based on the analyzed customer behavior. Alternatively, in order for the product detection unit 202 to detect the product, the voice analysis unit 206 analyzes the customer's behavior from the voice. Then, the product detection unit 202 detects the product based on the analyzed customer behavior. Note that examples of product detection will be described later for each customer's behavior.
  • the second output control unit 203 causes the aerial display 22 to display information regarding the detected product.
  • the information regarding the product may be a video of the product, an image of the product, or information related to the product such as the name of the product.
  • the image of the product may be, for example, a captured image or an image that reproduces the product in a virtual space.
  • the captured video may be a video of the product captured in real time by the imaging device 25, or may be a video of the product captured in advance.
  • the image of the product may be a captured image of the product, an image of the product such as an illustration, or an image of the product reproduced in virtual space.
  • the captured image may be an image of the product captured in real time by the imaging device 25, or may be an image of the product captured in advance.
  • As an example, a case where the customer's behavior is the customer's hand movement will be described.
  • the video analysis unit 205 may detect the movement of the customer's hand from the video captured by the imaging device 23.
  • the product detection unit 202 detects a product that the customer wants to acquire based on the customer's hand movements.
  • the product detection unit 202 detects the product based on the position where the customer extends his/her hand.
  • the product detection unit 202 detects a product located at a position where a customer extends his or her hand or a product located near the position.
  • A product near the position where the customer extends his/her hand may be a product located on an extension of the direction in which the customer extends his or her hand, or may be the product closest to the position where the customer extends his or her hand.
  • An example of displaying information regarding a product on the aerial display 22 based on detection of a customer's hand movement will be described using FIG. 9.
  • FIG. 9 is an explanatory diagram showing an example of detecting a product from a customer's hand movement and an example of displaying information regarding the product on the aerial display 22.
  • the screen 2101 of the dome-shaped display 21 and the projection device 2102 are not shown.
  • the imaging device 23 images a customer.
  • the acquisition unit 204 then acquires the captured image of the customer in real time.
  • the product detection unit 202 then detects the customer's behavior from the video.
  • the video analysis unit 205 detects that the customer stands up from a sitting position and extends his hand toward the dome-shaped display 21 based on the video. Then, in FIG. 9, the product detection unit 202 detects the product at the end of the customer's outstretched hand from among the products on the sales floor displayed on the dome-shaped display 21. Then, the second output control unit 203 causes the aerial display 22 to output information regarding the detected product. Accordingly, in FIG. 9, the aerial display 22 displays information regarding the product, such as an image of the product or a video of the product. In the example of FIG. 9, an apple is displayed.
  • Hand movements are not limited to those detected by a sensor such as the imaging device 23, but may also be detected by a dedicated glove or the like. Note that the sensor is not limited to the imaging device 23, and can be modified in various ways.
  • the customer's behavior is the movement of the customer's line of sight.
  • the video analysis unit 205 may detect the movement of the customer's line of sight from the image captured by the imaging device 23.
  • the product that the customer is watching may be the product that the customer wants to check. Therefore, the product detection unit 202 detects the product that the customer is watching as a product that the customer wants to acquire.
  • Products that the customer is looking at include a product toward which the customer's gaze is directed, a product toward which the customer's gaze is directed for more than a predetermined amount of time, and a product toward which the customer's gaze is directed more than a predetermined number of times.
  • the method of identifying the product that the customer is watching is not particularly limited.
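  • One simple way to realize the "predetermined amount of time" condition is a dwell-time check such as the sketch below; the 2-second threshold is an assumption for illustration.

```python
import time


class GazeDwellDetector:
    """Reports a product once the customer's gaze has stayed on it long enough."""
    DWELL_SECONDS = 2.0  # assumed threshold

    def __init__(self) -> None:
        self._current_id = None
        self._since = None

    def update(self, gazed_product_id, now=None):
        """Call once per analyzed frame with the product the gaze falls on (or None).
        Returns the product id when the dwell threshold is exceeded, otherwise None."""
        now = time.monotonic() if now is None else now
        if gazed_product_id != self._current_id:
            self._current_id, self._since = gazed_product_id, now
            return None
        if self._current_id is not None and now - self._since >= self.DWELL_SECONDS:
            return self._current_id
        return None
```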
  • An example of displaying information regarding a product on the aerial display 22 based on detection of the customer's line of sight will be described using FIG. 10.
  • FIG. 10 is an explanatory diagram showing an example of detecting a product from the movement of the customer's line of sight and an example of displaying information regarding the product on the aerial display 22.
  • the screen 2101 of the dome-shaped display 21 and the projection device 2102 are not shown.
  • the imaging device 23 images a customer.
  • the acquisition unit 204 acquires the captured image of the customer in real time.
  • the video analysis unit 205 detects the movement of the customer's line of sight from the video.
  • the product detection unit 202 detects the product on which the customer has been looking for a predetermined period of time or more from among the products on the sales floor displayed on the dome-shaped display 21 based on the video. Then, the second output control unit 203 causes the aerial display 22 to display information regarding the detected product. Accordingly, in FIG. 10, the aerial display 22 displays information regarding the product, such as an image of the product or a video of the product. In the example of FIG. 10, an apple is displayed.
  • the movement of the line of sight is not limited to the example detected by the imaging device 23, but may be detected by another sensor. Furthermore, the direction of the face may be used regardless of the movement of the line of sight.
  • the customer's action is a conversation with a store clerk.
  • the voice analysis unit 206 recognizes the conversation between the customer and the store clerk. Then, the voice analysis unit 206 detects keywords that can identify the product, such as the name of the product, the display position of the product on the sales floor, the price of the product, and the characteristics of the product, from the conversation.
  • the characteristics of the product may be the color of the product, the shape of the product, the production area of the product, the raw materials of the product, the manufacturing company of the product, and the like.
  • the position of the product on the sales floor may be the right end, the left end, the upper right, the position on the shelf, etc., and is not particularly limited.
  • a keyword that includes the location of a product is, for example, "apple on the far right.”
  • the product detection unit 202 detects a product matching the detected keyword from the product DB 2003.
  • the keyword may be a keyword uttered by the customer or a keyword uttered by the store clerk.
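  • As an illustration of this keyword-based lookup, the sketch below scores each record of the product DB by how many recognized keywords appear in its attributes; it reuses the hypothetical ProductRecord layout sketched earlier.

```python
def detect_product_from_keywords(keywords, product_db):
    """keywords: strings extracted by speech recognition (e.g. ["apple", "far right"]).
    Returns the record matching the most keywords, or None if nothing matches."""
    def score(record):
        fields = [record.name, record.color, record.production_area,
                  record.manufacturer, record.shelf_position]
        return sum(1 for kw in keywords
                   if any(f is not None and kw in f for f in fields))

    best = max(product_db.values(), key=score, default=None)
    return best if best is not None and score(best) > 0 else None
```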
  • FIG. 11 is an explanatory diagram showing an example of detecting a product from a conversation and an example of displaying information regarding the product on the aerial display 22.
  • the screen 2101 of the dome-shaped display 21 and the projection device 2102 are not shown.
  • the recording device 24 records audio or collects audio.
  • the acquisition unit 204 acquires audio from the recording device 24 in real time.
  • the voice analysis unit 206 detects information that can identify the product from the conversation by performing voice recognition on the conversation.
  • In FIG. 11, the keyword "apple on the far right" is detected from the conversation, and the product detection unit 202 detects the corresponding product. Then, the second output control unit 203 causes the aerial display 22 to display information regarding the detected product. Accordingly, in FIG. 11, the aerial display 22 displays information regarding the product, such as an image of the product or a video of the product. In the example of FIG. 11, an apple is displayed.
  • the customer's behavior obtained from the voice is not limited to the conversation between the customer and the store clerk, and the customer's behavior may be, for example, the customer's utterance.
  • the product detection unit 202 may detect a product that the customer wants to acquire based on the analyzed conversation and the analyzed eye movement.
  • the video analysis unit 205 detects the customer's hand movement with respect to the aerial display 22 by analyzing the customer's video. Specifically, for example, the video analysis unit 205 may detect a hand movement such as turning a product. For example, the video analysis unit 205 may detect whether the hand movement is turning the product to the left or the hand movement is turning the product to the right.
  • the shopping support system 20 performs control such that the orientation of the product in the image or video displayed on the aerial display 22 is changed based on the hand movement.
  • There are various examples of control methods.
  • For example, based on the customer's hand movement, the imaging device control unit 207 controls at least one of the position of the imaging device 25 and the orientation of the imaging device 25. This allows the imaging device 25 to image the product in different orientations. More specifically, for example, when a hand movement such as turning the product to the left is detected, the imaging device control unit 207 controls at least one of the imaging position and orientation of the imaging device 25 so that the left side of the product currently being imaged can be imaged.
  • the imaging position and direction may be specified according to the movement of the hand. Note that in a store, products may be placed on a stand, and the imaging device 25 may take an image of the product placed on the stand.
  • the method of controlling the imaging position and orientation of the imaging device 25 is not particularly limited.
  • For example, if the imaging device 25 is attached to a robot arm, the imaging device control unit 207 can control the imaging position of the imaging device 25 by controlling the robot arm.
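  • A minimal sketch of such control, assuming the imaging device 25 orbits the stand on a robot arm: a "turn left" or "turn right" hand movement shifts the camera by a fixed angular step. The 30-degree step and the arm driver API are assumptions.

```python
import math


class CameraOrbitController:
    ORBIT_RADIUS_MM = 400.0  # assumed distance from the stand
    STEP_DEG = 30.0          # assumed angular step per detected gesture

    def __init__(self, arm, stand_center):
        self.arm = arm                    # hypothetical robot-arm driver
        self.stand_center = stand_center  # (x, y) of the stand in arm coordinates
        self.angle_deg = 0.0

    def on_gesture(self, gesture):
        if gesture == "turn_left":
            self.angle_deg += self.STEP_DEG   # expose the left side of the product
        elif gesture == "turn_right":
            self.angle_deg -= self.STEP_DEG
        else:
            return
        rad = math.radians(self.angle_deg)
        x = self.stand_center[0] + self.ORBIT_RADIUS_MM * math.cos(rad)
        y = self.stand_center[1] + self.ORBIT_RADIUS_MM * math.sin(rad)
        # Move the camera to the new position while keeping it aimed at the product.
        self.arm.move_camera_to(x, y, look_at=self.stand_center)
```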
  • FIG. 12 is an explanatory diagram showing an example of changing the orientation of a displayed product by changing the position and orientation of the imaging device 25 that images the product.
  • the imaging device 25 captures an image of a product placed on a stand 27.
  • an apple is placed on a stand 27 as a product.
  • the second output control unit 203 displays the image of the product on the aerial display 22.
  • an apple is displayed as a product on the aerial display 22.
  • the video analysis unit 205 analyzes the customer's hand movements by analyzing the customer's video captured by the imaging device 23.
  • the imaging device control unit 207 controls the position and orientation of the imaging device 25 based on the movement of the customer's hand.
  • the position and orientation of the imaging device 25 have changed.
  • an image of the product whose orientation has been changed is displayed on the aerial display 22.
  • other parts of the apple are displayed on the aerial display 22.
  • When the second output control unit 203 causes the aerial display 22 to display a video of a product captured in real time by an imaging device 25 selected from a plurality of imaging devices 25 installed in a store, the second output control unit 203 selects a new imaging device 25 from the plurality of imaging devices 25 based on the hand movement.
  • the second output control unit 203 causes the aerial display 22 to display the video imaged by the newly selected imaging device 25. In this manner, by switching the imaging device 25, the orientation of the product displayed on the aerial display 22 can be changed.
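  • The camera-switching variant could be as simple as registering each imaging device 25 with the viewing angle it covers and picking the one closest to the angle requested by the hand movement; the camera identifiers and angles below are illustrative.

```python
cameras = {0.0: "camera-front", 90.0: "camera-left",
           180.0: "camera-back", 270.0: "camera-right"}


def select_camera(requested_angle_deg):
    """Return the camera whose registered angle is closest to the requested angle."""
    requested = requested_angle_deg % 360.0

    def angular_distance(angle):
        diff = abs(angle - requested)
        return min(diff, 360.0 - diff)

    return cameras[min(cameras, key=angular_distance)]
```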
  • FIG. 13 is an explanatory diagram showing an example of changing the orientation of the displayed product by switching the imaging device 25.
  • an imaging device 25-1 and an imaging device 25-2 are installed in a store.
  • the imaging device 25-1 and the imaging device 25-2 capture images of the products placed on the stand 27 in the store.
  • an apple is placed on a stand 27 as a product.
  • the second output control unit 203 causes the aerial display 22 to display an image of the product captured by the imaging device 25-1.
  • an apple is displayed as a product on the aerial display 22.
  • the video analysis unit 205 analyzes the customer's hand movements by analyzing the customer's video captured by the imaging device 23.
  • In FIG. 13, the second output control unit 203 selects the imaging device 25-2 from the imaging device 25-1 and the imaging device 25-2 based on the customer's hand movement, and causes the aerial display 22 to display the video captured by the imaging device 25-2.
  • the aerial display 22 displays an image of the product whose orientation has been changed. For example, in FIG. 13, other parts of the apple are displayed on the aerial display 22.
  • When the product video is a video captured in real time by an imaging device 25 installed in a store, there are cases where the product to be displayed on the aerial display 22 is placed on the stand 27 by a store employee and the imaging device 25 images the product placed on the stand 27.
  • the stand 27 may be a rotatable stand 27.
  • the stand control unit 208 controls the rotation of the stand 27. This allows the orientation of the product to be imaged to be changed.
  • For example, the stand control unit 208 controls the rotation of the stand 27 based on the customer's hand movements. This allows the imaging device 25 to image the product in different orientations.
  • More specifically, for example, when a hand movement such as turning the product to the left is detected, the stand control unit 208 rotates the stand 27 so that the left side of the product currently being imaged can be imaged.
  • the amount of rotation may be specified according to the movement of the hand.
  • The stand control unit 208 is not limited to the example of rotating the stand 27, and may control the height of the stand 27. If the position of the stand 27 can be changed, the stand control unit 208 may change the position of the stand 27. Thus, by controlling the height, position, and rotation of the stand 27, the stand control unit 208 can change the orientation of the product imaged by the imaging device 25.
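  • A minimal sketch of the stand control, assuming a hypothetical stand driver with a relative-rotation command: the detected hand movement is translated into a fixed rotation step of the stand 27.

```python
STEP_DEG = 30.0  # assumed rotation per detected gesture


def rotate_stand_for_gesture(stand_driver, gesture):
    """stand_driver is a hypothetical interface exposing rotate_by(degrees)."""
    if gesture == "turn_left":
        stand_driver.rotate_by(+STEP_DEG)   # show the left side of the product to the camera
    elif gesture == "turn_right":
        stand_driver.rotate_by(-STEP_DEG)
```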
  • FIG. 14 is an explanatory diagram showing an example of changing the orientation of the displayed product by rotating the stand 27.
  • the imaging device 25 captures an image of a product placed on a stand 27 in a store.
  • an apple is placed on a stand 27 as a product.
  • the second output control unit 203 displays the image of the product on the aerial display 22.
  • an apple is displayed as a product on the aerial display 22.
  • the video analysis unit 205 analyzes the customer's hand movements by analyzing the customer's video captured by the imaging device 23.
  • the stand control unit 208 rotates the stand 27 based on the customer's hand movement.
  • the aerial display 22 displays an image of the product whose orientation has been changed. For example, in FIG. 14, other parts of the apple are displayed on the aerial display 22.
  • For example, the second output control unit 203 displays, from among images of the product captured in advance, an image of the product oriented in accordance with the movement of the hand.
  • In this way, control may be performed such that the orientation of the product in the image displayed on the aerial display 22 is changed based on the customer's hand movement.
  • the second output control unit 203 displays one of a plurality of images of the product taken from different directions. Then, when the movement of the customer's hand is detected, the second output control unit 203 displays an image taken in an orientation corresponding to the movement of the customer's hand from among the plurality of images.
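  • For the pre-captured image variant, one possible sketch stores each image with the angle it was taken from and returns the image closest to the angle requested by the hand movement; the file names and angle set are illustrative.

```python
precaptured = {0: "apple_000.jpg", 45: "apple_045.jpg", 90: "apple_090.jpg",
               135: "apple_135.jpg", 180: "apple_180.jpg", 225: "apple_225.jpg",
               270: "apple_270.jpg", 315: "apple_315.jpg"}


def image_for_angle(requested_deg):
    """Return the pre-captured image whose shooting angle is nearest the requested angle."""
    requested = requested_deg % 360
    best = min(precaptured,
               key=lambda a: min(abs(a - requested), 360 - abs(a - requested)))
    return precaptured[best]
```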
  • At least one of the imaging position and orientation of the imaging device 25 may be controlled by the imaging device control unit 207, similarly to the video example.
  • the imaging device control unit 207 controls at least one of the position and orientation of the imaging device 25, and after the position and orientation are controlled, the imaging device 25 newly captures an image of the product. Then, the second output control unit 203 displays the newly captured image of the product.
  • the imaging device 25 may be switched by the second output control unit 203, similarly to the video example.
  • When the second output control unit 203 causes the aerial display 22 to display an image of the product captured by an imaging device 25 selected from the plurality of imaging devices 25 installed in the store, the second output control unit 203 selects a new imaging device 25 from the plurality of imaging devices 25 based on the hand movement.
  • the newly selected imaging device 25 then captures an image of the product.
  • the second output control unit 203 causes the aerial display 22 to display the image captured by the newly selected imaging device 25.
  • the stand 27 may be controlled by the stand control unit 208, similarly to the video example.
  • the stand control unit 208 controls the rotation of the stand 27 based on the customer's hand movements.
  • the imaging device 25 then takes a new image of the product in a different orientation.
  • the second output control unit 203 causes the aerial display 22 to display the image newly captured by the imaging device 25.
  • In the examples described above, control is performed such that the orientation of the product in the video or image displayed on the aerial display 22 is changed based on the customer's hand movement; however, the control is not limited to the hand movement. For example, control may be performed such that the orientation of the product in the video or image displayed on the aerial display 22 is changed based on an operation on the aerial display 22.
  • the aerial display 22 can accept user operations. Therefore, the operation reception unit 211 may accept an operation on the aerial display 22.
  • the imaging device control unit 207 may control at least one of the imaging position and orientation of the imaging device 25 based on the received operation.
  • For example, the second output control unit 203 newly selects an imaging device 25 from the plurality of imaging devices 25 based on the received operation. Then, the second output control unit 203 causes the aerial display 22 to display the video imaged by the newly selected imaging device 25.
  • the stand control unit 208 controls the rotation of the stand 27 based on the received operation.
  • Alternatively, the second output control unit 203 may display, from among the images captured in advance, an image of the product oriented in accordance with the received operation.
  • the background of the aerial display 22 may be visible through it.
  • For example, the dome-shaped display 21 is in front of the customer's line of sight, so the customer sees the display of the dome-shaped display 21 in the background of the product and through the transparent product. In this way, while information regarding the product is being displayed on the aerial display 22, the product may be difficult for the customer to see.
  • the first output control unit 201 changes the display on the dome-shaped display 21 so that the product displayed on the aerial display 22 does not become difficult to see. Thereby, when the aerial display 22 and the dome-shaped display 21 are combined, it is possible to prevent the product from becoming difficult to see.
  • the first output control unit 201 may change the entire display on the dome-shaped display 21, or may change the display on a part of the dome-shaped display 21.
  • the first output control unit 201 may turn off the display of the store video, or display a video corresponding to the product instead of the store video. For example, if the product is food, the image corresponding to the product may be a plate or the like.
  • For example, the first output control unit 201 may erase part of the display of the dome-shaped display 21 that overlaps the product when the customer views the product in the video or image displayed on the aerial display 22, or may change the color of that part. Thereby, when the aerial display 22 and the dome-shaped display 21 are combined, it is possible to prevent the product from becoming difficult to see.
  • the first output control unit 201 changes the display at the fixed position of the dome-shaped display 21.
  • the fixed position is a position corresponding to a display area where the aerial display 22 displays information regarding the product.
  • the first output control unit 201 may erase the display of the fixed position.
  • the first output control unit 201 sets the display color of the fixed position to a predetermined color.
  • the predetermined color may be a color that makes the product easier to see, such as black or white, or a color that corresponds to the product.
  • For example, the superimposition position specifying unit 209 specifies the position on the dome-shaped display 21 that overlaps with the display area as seen from the customer, based on the positional relationship between the customer's position and the position of the aerial display 22 and the positional relationship between the display area where the aerial display 22 displays information regarding the product and the position of the dome-shaped display 21.
  • the display area is the aerial imaging area described with reference to FIG.
  • the customer's location may be specified by a three-dimensional (3D) sensor or the like.
  • the 3D sensor may be installed on the dome-shaped display 21, for example.
  • While the aerial display 22 is displaying information regarding the product, the first output control unit 201 changes the display at the specified position on the dome-shaped display 21. The method of changing the display is as described above.
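  • As a geometric illustration of the superimposition position, the sketch below casts rays from the customer's eye through the corners of the aerial image area and intersects them with the screen, approximated here as a plane for simplicity (the actual dome surface is curved); all coordinates and the use of NumPy are assumptions.

```python
import numpy as np


def project_to_screen(eye, corner, plane_point, plane_normal):
    """Return the point where the ray from the eye through an aerial-image corner
    hits the screen plane (assumes the ray is not parallel to the plane)."""
    eye, corner = np.asarray(eye, float), np.asarray(corner, float)
    p0, n = np.asarray(plane_point, float), np.asarray(plane_normal, float)
    direction = corner - eye
    t = np.dot(p0 - eye, n) / np.dot(direction, n)
    return eye + t * direction


def overlap_region(eye, aerial_corners, plane_point, plane_normal):
    """Corners of the region of the first display hidden behind the aerial image."""
    return [project_to_screen(eye, c, plane_point, plane_normal) for c in aerial_corners]
```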
  • the first output control unit 201 causes the dome-shaped display 21 to display an image different from the image of the sales floor.
  • the images may vary depending on the product, such as plates, other shelves, and the like.
  • the explanation will be given using a video as an example, but an image may also be used.
  • the first output control unit 201 may control the display on the dome-shaped display 21 for each of the multiple customers.
  • the first output control unit 201 may control the display on the dome-shaped display 21 for each of the plurality of aerial displays 22.
  • the second output control unit 203 may display a measuring tape on the aerial display 22 that allows the dimensions to be confirmed, along with the image of the product. This allows the customer to check the actual size of the product by comparing the tape measure with the size of the product.
  • the customer's viewing position may be specified so that a full-size image or video of the product can be displayed to the customer.
  • a 3D sensor may be installed on the dome-shaped display 21. A 3D sensor then measures the position of the customer's head and hands.
  • the acquisition unit 204 then acquires the measurement results of the 3D sensor.
  • the visual position specifying unit 210 specifies the customer's visual position based on the positional relationship between the head position and the hand position based on the measurement results.
  • the second output control unit 203 displays either an image or a video of the product at a size based on the actual size of the product and the visible position. As a result, images and videos of the product are displayed so that the product can be seen in its actual size.
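  • A simple sketch of the size calculation: convert the product's actual dimensions into pixels using the display's pixel pitch, optionally scaled by the ratio between the measured viewing distance and a reference distance; the pixel pitch and the scaling rule are assumptions.

```python
PIXEL_PITCH_MM = 0.25  # assumed physical size of one pixel of the second display


def full_size_pixels(actual_width_mm, actual_height_mm,
                     viewing_distance_mm=None, reference_distance_mm=None):
    scale = 1.0
    if viewing_distance_mm and reference_distance_mm:
        # keep the apparent size consistent when the viewing position differs
        scale = viewing_distance_mm / reference_distance_mm
    return (round(actual_width_mm * scale / PIXEL_PITCH_MM),
            round(actual_height_mm * scale / PIXEL_PITCH_MM))
```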
  • the second output control unit 203 may cause the aerial display 22 to display the product in an enlarged manner.
  • the second output control unit 203 may display the product on the aerial display 22 so as to reduce the size of the product.
  • the operation reception unit 211 may receive an instruction to enlarge or an instruction to reduce by accepting an operation on the aerial display 22. Based on the received operation, the second output control unit 203 may display the product in the image or video in an enlarged manner, or may display the product in the image or video in a reduced size.
  • the color and shape will differ depending on the product.
  • When an image such as an illustration of a product or an image of a product reproduced in virtual space is displayed on the aerial display 22, if the same image is displayed every time, the customer will feel that it is different from realistic shopping.
  • Here, the explanation will be given using an image of a product as an example, but the same applies to a video of a product. It is desirable that the display be closer to realistic shopping. Therefore, for example, the second output control unit 203 causes the aerial display 22 to display an image of the product based on at least one of information regarding the color of the product and information regarding the shape of the product.
  • the color of the product and the shape of the product may be obtained from a conversation between a store clerk and a customer.
  • the voice analysis unit 206 detects at least one of information regarding the color of the product and information regarding the shape of the product by performing voice recognition on a conversation between a store clerk and a customer. Then, the second output control unit 203 displays an image of the product in at least one of a color and a shape depending on the conversation. Specifically, for example, when a conversation between a store clerk and a customer saying "Today's apples are redder than usual" is detected, the second output control unit 203 displays an image of an apple in a color redder than a predetermined standard. Display.
  • Similarly, for example, when a conversation indicating that the apples are larger than usual is detected, the second output control unit 203 displays an image of an apple that is a predetermined amount larger than a predetermined standard size. Thereby, the image of the product can be displayed on the aerial display 22 in a color and shape close to those of the actual product.
  • the second output control unit 203 may display either the image or the video according to the individual product.
  • the second output control unit 203 may display the name of the product and the price of the product on the aerial display 22 along with the video and image of the product.
  • the display of information regarding the product on the aerial display 22 may end.
  • the video analysis unit 205 may detect the customer's line of sight from the customer's video. For example, the video analysis unit 205 detects that the customer's line of sight is upward and not looking at the aerial display 22. Then, when it is detected that the person is not looking at the aerial display 22, the second output control unit 203 ends displaying the information regarding the product on the aerial display 22. Further, when the product detection unit 202 detects a new product, such as when the customer extends his hand toward another product, the second output control unit 203 sends information about the product to the aerial display 22. End the display. However, in this case, the second output control unit 203 may cause the aerial display 22 to display information regarding the newly detected product.
  • the display of information regarding the product on the aerial display 22 may be terminated by the customer's operation.
  • the customer's operation may be an operation indicating the end of display, or an operation to add the item to a shopping cart.
  • the customer's operation can be detected by the customer's voice, customer's gesture, etc.
  • the video analysis unit 205 detects a predetermined gesture of the customer from the customer's video. Then, when a predetermined gesture is detected, the second output control unit 203 ends displaying information regarding the product on the aerial display 22.
  • the voice analysis unit 206 detects a predetermined keyword from the customer's voice. Then, when a predetermined keyword is detected, the second output control unit 203 ends displaying information regarding the product on the aerial display 22.
  • The display of information regarding the product on the aerial display 22 may also end when the customer is no longer in front of the dome-shaped display 21.
  • Here, the customer disappearing from the dome-shaped display 21 means that the customer has left the place from which the dome-shaped display 21 can be viewed or operated.
  • For example, the video analysis unit 205 may detect that the customer is gone by detecting, from the video captured by the imaging device 23, whether or not the customer appears in the video. When it is detected that the customer is gone, the second output control unit 203 ends the display of information regarding the product on the aerial display 22.
  • FIG. 15 is a flowchart showing an example of the operation of the shopping support system 20 according to the second embodiment.
  • the acquisition unit 204 acquires the video imaged by the imaging device 23 and the audio acquired from the recording device 24.
  • the video analysis unit 205 analyzes the customer's behavior from the video captured by the imaging device 23.
  • the voice analysis unit 206 analyzes the customer's behavior from the voice acquired from the recording device 24.
  • the first output control unit 201 causes the dome-shaped display 21 to display information regarding the sales floor of the store (step S201).
  • the product detection unit 202 determines whether a product that the customer wants to acquire is detected based on the customer's behavior (step S202). If no product is detected (step S202: No), the product detection unit 202 returns to step S202.
  • If the product is detected (step S202: Yes), the second output control unit 203 causes the aerial display 22 to display information regarding the product (step S203).
  • the first output control unit 201 also changes the display on the dome-shaped display 21 (step S204).
  • In step S204, the first output control unit 201 may change the display of a fixed position on the dome-shaped display 21, or may change the display of a specified position on the dome-shaped display 21. Note that the order of processing in step S203 and step S204 is not particularly limited. For example, step S203 and step S204 may be performed at the same timing.
  • the second output control unit 203 changes the display of information regarding the product based on the customer's behavior (step S205).
  • the second output control unit 203 may change the display orientation of the product in the video or image based on the customer's behavior.
  • Note that such control is not limited to the second output control unit 203; control of the table 27 by the table control unit 208 or control of the imaging device 25 by the imaging device control unit 207 may also be performed.
  • For example, the imaging device control unit 207 may change the orientation of the product displayed on the aerial display 22 by changing the imaging position and orientation of the imaging device 25.
  • Further, the table control unit 208 may control the table 27 to change the orientation of the product displayed on the aerial display 22.
  • The display control by the second output control unit 203, the control of the table 27 by the table control unit 208, and the control of the imaging device 25 by the imaging device control unit 207 may be combined as appropriate.
  • the second output control unit 203 determines whether to end the display of information regarding the product (step S206). If the display of information regarding the product is not finished (step S206: No), the second output control unit 203 returns to step S205. If the display of information regarding the product is to be ended (step S206: Yes), the first output control unit 201 returns to step S201.
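The overall flow of FIG. 15 (steps S201 to S206) could be sketched, for example, as the following loop; the `system` object and its method names are placeholders assumed purely for illustration:

```python
def run_shopping_support_loop(system) -> None:
    """Schematic main loop corresponding to steps S201 to S206 of FIG. 15."""
    while True:
        # Step S201: display information regarding the sales floor on the dome-shaped display.
        system.first_output.show_sales_floor()

        # Step S202: wait until a product the customer wants to acquire is detected.
        product = None
        while product is None:
            behavior = system.observe_customer()            # video and audio analysis
            product = system.product_detector.detect(behavior)

        # Step S203: display information regarding the product on the aerial display.
        system.second_output.show_product(product)
        # Step S204: change the display on the dome-shaped display (fixed or specified position).
        system.first_output.change_display_behind(product)

        # Steps S205 and S206: update the product display until the display should end.
        while not system.should_end_product_display():
            behavior = system.observe_customer()
            system.second_output.update_product_view(product, behavior)  # step S205
```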
  • The shopping support system 20 detects the customer's hand movement from the customer's video or image, and detects the product that the customer wants to acquire based on the detected hand movement. More specifically, for example, the shopping support system 20 detects the product that the customer wants to acquire based on the position where the customer extends his or her hand. For example, the customer can display the desired product on the aerial display 22 simply by making an intuitive movement such as taking a product from a shelf. Furthermore, when a customer visits an actual store to shop, it is assumed that the customer grabs a product on a shelf and brings it close to his or her face to check it. Therefore, according to the shopping support system 20, in which information regarding products detected from hand movements is displayed on the aerial display 22, it is possible to provide a shopping experience closer to actual shopping.
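For instance, mapping the position reached by the customer's hand to a displayed product could be sketched as follows; the product layout and distance threshold are assumptions for illustration only:

```python
import math

# Assumed layout: product name -> (x, y) display position of the product, in metres.
PRODUCT_POSITIONS = {
    "apple": (0.4, 1.2),
    "orange": (0.8, 1.2),
    "grapes": (1.2, 1.2),
}

def detect_reached_product(hand_xy: tuple[float, float],
                           max_distance: float = 0.25) -> str | None:
    """Return the product closest to where the customer extended a hand, if close enough."""
    best_name, best_dist = None, float("inf")
    for name, position in PRODUCT_POSITIONS.items():
        dist = math.dist(hand_xy, position)
        if dist < best_dist:
            best_name, best_dist = name, dist
    return best_name if best_dist <= max_distance else None

# Example: a hand detected near (0.45, 1.15) would be matched to "apple".
```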
  • In addition, the shopping support system 20 analyzes the conversation between the customer and the store clerk from the audio, and detects the product the customer wants to acquire based on the conversation. As a result, information regarding the product the customer wants to check is displayed on the aerial display 22 simply by having a conversation with the store clerk. Furthermore, for example, when a customer visits an actual store to shop, it is assumed that the customer has a conversation with the store staff, and that the store staff recommend and show products to the customer or hand the product to the customer. Therefore, according to the shopping support system 20, in which information regarding products detected from conversations is displayed on the aerial display 22, it is possible to provide a shopping experience closer to actual shopping.
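Likewise, a simple keyword match between the recognized conversation and product features is one possible sketch of conversation-based detection; the feature sets below are assumed examples rather than contents of the actual product DB:

```python
# Assumed excerpt of product features: product name -> words that may appear in conversation.
PRODUCT_FEATURES = {
    "apple": {"apple", "apples", "red"},
    "melon": {"melon", "green", "sweet"},
}

def detect_product_from_conversation(utterance: str) -> str | None:
    """Return the product whose feature words best match the customer-clerk conversation."""
    words = set(utterance.lower().replace(",", " ").split())
    best_product, best_score = None, 0
    for product, features in PRODUCT_FEATURES.items():
        score = len(words & features)
        if score > best_score:
            best_product, best_score = product, score
    return best_product

# Example: detect_product_from_conversation("Today's apples are redder than usual") -> "apple"
```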
  • the shopping support system 20 causes the aerial display 22 to display an image or video of the product in at least one of the color and shape depending on the conversation. Thereby, the image or video of the product can be displayed on the aerial display 22 in the color and shape of the product that are close to the actual product.
  • the shopping support system 20 causes the dome-shaped display 21 to display information regarding the store staff. This allows customers to shop in a manner that is more similar to that of an actual store.
  • the shopping support system 20 changes the display of the product in the video based on the customer's hand movements.
  • the shopping support system 20 controls an imaging device 25 installed in a store based on the customer's hand movements.
  • When the imaging device 25 captures an image of the product placed on the table 27, the shopping support system 20 controls the rotation of the table 27 based on the customer's hand movement. This allows the customer to change the displayed orientation of the product by moving his or her hand. In this way, the orientation of the product can be changed intuitively.
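One simple way to realize this is sketched below under the assumption of a fixed sensitivity between hand motion and rotation; the constant and the turntable interface are illustrative only:

```python
DEGREES_PER_METRE = 180.0  # assumed sensitivity: horizontal hand motion to turntable rotation

def hand_motion_to_rotation(prev_hand_x: float, curr_hand_x: float) -> float:
    """Map a horizontal hand movement (in metres) to a turntable rotation angle (in degrees)."""
    return (curr_hand_x - prev_hand_x) * DEGREES_PER_METRE

# Example: a hand moving from x = 0.10 m to x = 0.25 m would yield a +27 degree rotation,
# which a table controller could then apply, e.g. turntable.rotate_by(27.0).
```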
  • the shopping support system 20 changes the display on the dome-shaped display 21 while the aerial display 22 is displaying information regarding the product. Specifically, for example, the shopping support system 20 may change the display of the fixed position of the dome-shaped display 21. Alternatively, the shopping support system 20 may specify the position of the dome-shaped display 21 that is on the line of sight of the customer when viewing the product, and change the display at the specified position. Thereby, it is possible to prevent the customer from having difficulty viewing the product.
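The specified position can be obtained, for example, by projecting the customer's line of sight through the display area onto the first display. The following sketch assumes, purely for illustration, that the first display is approximated by a plane at a known depth:

```python
def occluded_position_on_first_display(eye_xyz: tuple[float, float, float],
                                       area_center_xyz: tuple[float, float, float],
                                       screen_depth_z: float) -> tuple[float, float, float]:
    """Project the eye-to-display-area line of sight onto the plane z = screen_depth_z.

    All positions share one coordinate system; approximating the curved screen by a plane
    is an assumption made only to keep this sketch short (it fails if az == ez).
    """
    ex, ey, ez = eye_xyz
    ax, ay, az = area_center_xyz
    t = (screen_depth_z - ez) / (az - ez)  # parameter along the ray from the eye to the area
    return (ex + t * (ax - ex), ey + t * (ay - ey), screen_depth_z)

# The display at the returned position could then be changed (for example dimmed) while
# the aerial display is showing the product.
```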
  • the store may be a store in a virtual space or may be an actual store. If there is an actual store, the shopping support system 20 may display an image of the store's sales floor captured in real time by the imaging device 25 on the dome-shaped display 21. As a result, customers can be supported to have an experience as if they were in an actual store, even if they are not in the actual store.
  • Each embodiment may be modified and used, and the embodiments may be combined as appropriate. Furthermore, in each embodiment, the shopping support system 20 may have a configuration that includes only some of the functional units and only part of the information.
  • each embodiment is not limited to the above-mentioned example, and can be modified in various ways.
  • the configuration of the shopping support system 20 in each embodiment is not particularly limited.
  • the shopping support system 20 may be realized by one device such as one server.
  • one device may be called a shopping support device, an information processing device, etc., and is not particularly limited.
  • the shopping support system 20 in each embodiment may be realized by different devices depending on functions or data.
  • For example, the functional units of the shopping support system 20 may be distributed over a plurality of servers, and the shopping support system 20 may be realized by the plurality of servers.
  • the shopping support system 20 may be realized by a database server including each DB and a server having each functional unit.
  • FIG. 16 is an explanatory diagram showing an example of implementation of the shopping support system 20. This will be explained using an example where there is an actual store.
  • the shopping support system 20 includes, for example, an edge terminal device 31 and a server 32.
  • the edge terminal device 31, the aerial display 22, the dome-shaped display 21, the imaging device 23, and the recording device 24 are installed in a house, a shared space, or the like.
  • the edge terminal device 31, the aerial display 22, the dome-shaped display 21, the imaging device 23, and the recording device 24 are connected via a communication network.
  • the imaging device 25 is installed in a store.
  • the imaging device 25, the edge terminal device 31, and the server 32 are connected via a communication network.
  • the shopping support system 20 may be configured as an entire system including an edge terminal device 31, a server 32, an aerial display 22, a dome-shaped display 21, an imaging device 23, a recording device 24, and an imaging device 25.
  • each functional unit in each embodiment is realized by an edge terminal device 31 and a server 32.
  • the edge terminal device 31 includes an acquisition section 204, a video analysis section 205, an audio analysis section 206, and an operation reception section 211.
  • The server 32 includes a first output control section 201, a product detection section 202, a second output control section 203, an acquisition section 204, an imaging device control section 207, a table control section 208, a superimposition position specifying section 209, and a visual position specifying section 210.
  • the server 32 may be a plurality of servers. In this way, each functional unit of the shopping support systems 10 and 20 may be realized by a plurality of devices, and the plurality of devices may be installed at different locations.
  • each piece of information and each DB may include part of the above-mentioned information. Moreover, each piece of information and each DB may include information other than the above-mentioned information. Each piece of information and each DB may be divided into a plurality of DBs and a plurality of pieces of information in more detail. In this way, the method of implementing each piece of information and each DB is not particularly limited.
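As one illustrative sketch of how a single entry of the product DB described in this disclosure might be laid out (all field names and types are assumptions, not part of the embodiments):

```python
from dataclasses import dataclass, field

@dataclass
class ProductRecord:
    """Illustrative layout of one product DB entry; field names are assumed for this sketch."""
    product_id: str
    name: str
    price: int
    display_position: tuple[float, float]            # position of the product on the sales floor
    features: set[str] = field(default_factory=set)  # e.g. colour, shape, origin, manufacturer
    image_path: str | None = None
    video_path: str | None = None
```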
  • each screen is an example and is not particularly limited.
  • buttons, lists, check boxes, information display fields, input fields, etc. may be added.
  • the background color of the screen, etc. may be changed.
  • the process of generating information etc. to be displayed on the aerial display 22 may be performed by the second output control unit 203 of the shopping support system 20. Further, this processing may be performed by the aerial display 22.
  • the process of generating information and the like to be displayed on the dome-shaped display 21 may be performed by the first output control unit 201 of the shopping support system 20. Further, this processing may be performed by the dome-shaped display 21.
  • a display in which at least a portion of the screen is curved is used as an example of the first display, and specifically, the dome-shaped display 21 is used as an example of the first display.
  • the first display is not limited to a display in which at least a portion of the screen is curved, as long as it is a display for the purpose of displaying a virtual space, video, images, etc.
  • the first display may be a device configured to cover the user's field of vision.
  • a non-contact display was used as an example of the second display.
  • the second display may be a contact type display and is not particularly limited.
  • FIG. 17 is an explanatory diagram showing an example of the hardware configuration of a computer.
  • part or all of each device can be realized using any combination of a computer 80 and a program as shown in FIG. 17, for example.
  • the computer 80 includes, for example, a processor 801, a ROM (Read Only Memory) 802, a RAM (Random Access Memory) 803, and a storage device 804. Further, the computer 80 has a communication interface 805 and an input/output interface 806. Each component is connected to each other via a bus 807, for example. Note that the number of each component is not particularly limited, and each component is one or more.
  • a processor 801 controls the entire computer 80.
  • Examples of the processor 801 include a CPU (Central Processing Unit), a DSP (Digital Signal Processor), and a GPU (Graphics Processing Unit).
  • the computer 80 includes a ROM 802, a RAM 803, a storage device 804, and the like as storage units.
  • Examples of the storage device 804 include semiconductor memory such as flash memory, HDD (Hard Disk Drive), SSD (Solid State Drive), and the like.
  • the storage device 804 stores OS (Operating System) programs, application programs, programs according to each embodiment, and the like.
  • the ROM 802 stores application programs, programs according to each embodiment, and the like.
  • the RAM 803 is used as a work area for the processor 801.
  • The processor 801 loads programs stored in the storage device 804, the ROM 802, and the like, and executes each process coded in the programs. Furthermore, the processor 801 may download various programs via the communication network NT. The processor 801 thereby functions as part or all of each device described above. The processor 801 may then execute the processes or instructions in the illustrated flowcharts based on the programs.
  • the communication interface 805 is connected to a communication network NT such as a LAN (Local Area Network) or a WAN (Wide Area Network) through a wireless or wired communication line.
  • the communication network NT may be composed of a plurality of communication networks NT.
  • the computer 80 is connected to an external device or an external computer 80 via the communication network NT.
  • The communication interface 805 serves as an interface between the communication network NT and the inside of the computer 80.
  • the communication interface 805 controls input and output of data from external devices and the external computer 80.
  • the input/output interface 806 is connected to at least one of an input device, an output device, and an input/output device.
  • the connection method may be wireless or wired.
  • Examples of the input device include a keyboard, a mouse, and a microphone.
  • Examples of the output device include a display device, a lighting device, and an audio output device that outputs audio.
  • examples of the input/output device include a touch panel display. Note that the input device, output device, input/output device, etc. may be built into the computer 80 or may be externally attached.
  • The computer 80 may include only some of the components shown in FIG. 17, and may include components other than those shown in FIG. 17.
  • the computer 80 may include a drive device or the like.
  • the processor 801 may then read programs and data stored in a recording medium attached to a drive device or the like to the RAM 803. Examples of non-temporary tangible recording media include optical disks, flexible disks, magneto-optical disks, USB (Universal Serial Bus) memories, and the like.
  • the computer 80 may include an input device such as a keyboard and a mouse. Computer 80 may have an output device such as a display. Further, the computer 80 may each have an input device, an output device, and an input/output device.
  • the computer 80 may include various sensors (not shown). The type of sensor is not particularly limited. Further, the computer 80 may include an imaging device capable of capturing images and videos.
  • each device may be realized by an arbitrary combination of a computer and a program, each of which is different for each component.
  • the plurality of components included in each device may be realized by an arbitrary combination of one computer and a program.
  • each device may be realized by application-specific circuits. Moreover, a part or all of each component of each device may be realized by a general-purpose circuit including a processor such as an FPGA (Field Programmable Gate Array). Furthermore, some or all of the components of each device may be realized by a combination of application-specific circuits, general-purpose circuits, and the like. Also, these circuits may be a single integrated circuit. Alternatively, these circuits may be divided into multiple integrated circuits. Further, the plurality of integrated circuits may be configured by being connected via a bus or the like.
  • Furthermore, each component of each device may be realized by a plurality of computers, circuits, or the like.
  • the plurality of computers, circuits, etc. may be arranged centrally or in a distributed arrangement.
  • the shopping support method described in each embodiment is realized by being executed by the shopping support system. Further, for example, the shopping support method is realized by a computer such as a server or a terminal device executing a program prepared in advance.
  • The programs described in each embodiment are recorded on a computer-readable recording medium such as an HDD, an SSD, a flexible disk, an optical disk, a magneto-optical disk, or a USB memory. The programs are then executed by being read from the recording medium by the computer.
  • the program may also be distributed via the communications network NT.
  • each component of the shopping support system in each embodiment described above may be realized by dedicated hardware, such as a computer.
  • each component may be realized by software.
  • each component may be realized by a combination of hardware and software.
  • (Appendix 1) A shopping support system comprising: first output control means for causing a first display to display information regarding a sales floor of a store where products are displayed; product detection means for detecting, based on a customer's behavior, a product that the customer wants to acquire; and second output control means for causing a second display to display information regarding the detected product.
  • (Appendix 2) The customer's behavior is a hand movement of the customer. The shopping support system according to Appendix 1.
  • (Appendix 3) The product detection means detects the product based on the position where the customer extends his or her hand.
  • (Appendix 4) The customer's behavior is a conversation between the customer and a store clerk. The shopping support system according to any one of Supplementary Notes 1 to 3.
  • (Appendix 5) The first output control means further causes the first display to display information regarding the store clerk.
  • (Appendix 6) The information regarding the product is a video of the product or an image of the product, and the second output control means displays the information regarding the product in at least one of a color and a shape depending on the conversation between the customer and the store clerk.
  • (Appendix 7) The information regarding the product is an image of the product.
  • (Appendix 8) The second output control means changes the display of the product in the image of the product based on the customer's hand movement. The shopping support system according to Appendix 7.
  • (Appendix 9) The image of the product is an image of the product captured by an imaging device installed in the store, and the shopping support system further comprises imaging device control means for controlling at least one of an imaging position and an orientation of the imaging device based on the customer's hand movement. The shopping support system according to Appendix 7.
  • (Appendix 10) The image of the product is an image of the product captured by an imaging device installed in the store, and the shopping support system further comprises table control means for controlling, based on the customer's hand movement, rotation of a rotary table on which the product can be placed in the store. The shopping support system according to Appendix 7.
  • (Appendix 11) The first output control means changes the display of a fixed position of the first display when the second display displays the information regarding the product. The shopping support system according to any one of Supplementary Notes 1 to 10.
  • (Appendix 12) The shopping support system comprises specifying means for specifying, based on the positional relationship between the customer's position and the position of the second display and the positional relationship between the display area of the information regarding the product on the second display and the position of the first display, a position of the first display that overlaps the display area when the customer views the information regarding the product displayed in the display area, and the first output control means changes the display of the specified position on the first display when the second display displays the information regarding the product. The shopping support system according to any one of Supplementary Notes 1 to 10.
  • (Appendix 13) The store is a store in a virtual space.
  • (Appendix 14) The information regarding the sales floor is an image of the sales floor of the store captured by an imaging device installed in the store. The shopping support system according to any one of Supplementary Notes 1 to 12.
  • (Appendix 15) The first display is a device structured to cover the customer's field of view.
  • (Appendix 16) The first display is a dome-shaped display.
  • (Appendix 17) The second display is a non-contact display.
  • (Appendix 18) The second display is an aerial display. The shopping support system according to Appendix 17.
  • (Appendix 19) The second display is installed between the customer and the first display. The shopping support system according to any one of Supplementary Notes 1 to 18.
  • (Appendix 20) The second display is installed closer to the customer than the first display. The shopping support system according to any one of Supplementary Notes 1 to 18.
  • (Appendix 21) The first display is larger than the second display. The shopping support system according to any one of Supplementary Notes 1 to 20.


Abstract

This shopping assistance system comprises a first output control unit, a commodity detection unit, and a second output control unit. The first output control unit causes a first display to show information relating to sales floors of a store where commodities are displayed. The commodity detection unit detects, on the basis of the action of a customer, a commodity that the customer wants to acquire. The second output control unit causes a second display to show information relating to the detected commodity.

Description

買物支援システム、買物支援方法、および記録媒体Shopping support system, shopping support method, and recording medium
 本開示は、買物支援システムなどに関する。 The present disclosure relates to a shopping support system and the like.
 仮想空間や映像を表示する手段として、ドーム型ディスプレイが知られている(例えば、特許文献1,2)。また、仮想空間や映像を表示する手段として、装着型の情報処理端末が知られている(例えば、特許文献3)。 Dome-shaped displays are known as means for displaying virtual spaces and images (for example, Patent Documents 1 and 2). Additionally, a wearable information processing terminal is known as a means for displaying virtual spaces and images (for example, Patent Document 3).
International Publication No. 2018/101279; International Publication No. 2017/187821; Japanese Patent Application Publication No. 2020-129356
 例えば、表示手段としてドーム型ディスプレイなどの大型のディスプレイを使う場合、ユーザは、コントローラを使って操作を行う場合がある。コントローラを使って操作を行うと、操作がし難いという問題点がある。例えば、仮想空間上の店舗または遠隔の店舗において買物を行う場合、コントローラを使って店舗に陳列された商品を見るなどの操作を行うのは、ユーザによっては煩わしい。 For example, when using a large display such as a dome-shaped display as a display means, the user may perform operations using a controller. When operations are performed using a controller, there is a problem in that it is difficult to operate. For example, when shopping at a store in a virtual space or at a remote store, it may be troublesome for some users to use a controller to perform operations such as viewing products displayed at the store.
 本開示の目的の一例は、大型のディスプレイを使う場合における利便性の向上を図ることができる買物支援システムなどを提供することにある。 An example of the purpose of the present disclosure is to provide a shopping support system that can improve convenience when using a large display.
 本開示の一態様における買物支援システムは、第1ディスプレイに、商品が陳列された店舗の売り場に関する情報を表示させる第1出力制御手段と、顧客の行動に基づいて、前記顧客が取得したい商品を検出する商品検出手段と、第2ディスプレイに、検出された前記商品に関する情報を表示させる第2出力制御手段と、を備える。 A shopping support system according to an aspect of the present disclosure includes a first output control means for displaying information regarding a sales floor of a store where products are displayed on a first display; The apparatus includes a product detection means for detecting a product, and a second output control means for displaying information regarding the detected product on a second display.
 本開示の一態様における買物支援方法は、第1ディスプレイに、商品が陳列された店舗の売り場に関する情報を表示させ、顧客の行動に基づいて、前記顧客が取得したい商品を検出し、第2ディスプレイに、検出した前記商品に関する情報を表示させる。 A shopping support method according to an aspect of the present disclosure includes displaying information about the sales floor of a store where products are displayed on a first display, detecting a product that the customer wants to acquire based on the customer's behavior, and displaying information on a second display. information regarding the detected product is displayed.
 本開示の一態様におけるプログラムは、コンピュータに、第1ディスプレイに、商品が陳列された店舗の売り場に関する情報を表示させ、顧客の行動に基づいて、前記顧客が取得したい商品を検出し、第2ディスプレイに、検出した前記商品に関する情報を表示させる、処理を実行させる。 A program according to an aspect of the present disclosure causes a computer to display information regarding a sales floor of a store where products are displayed on a first display, detects a product that the customer wants to acquire based on the customer's behavior, and displays a second display. A display is caused to display information regarding the detected product and a process is executed.
 プログラムは、コンピュータが読み取り可能な非一時的な記録媒体に記憶されていてもよい。 The program may be stored in a computer-readable non-transitory recording medium.
 本開示によれば、大型のディスプレイを使う場合における利便性の向上を図ることができる。 According to the present disclosure, it is possible to improve convenience when using a large display.
FIG. 1 is a block diagram showing a configuration example of the shopping support system according to the first embodiment.
FIG. 2 is a flowchart showing an operation example of the shopping support system according to the first embodiment.
FIG. 3 is an explanatory diagram showing an example of a dome-shaped display.
FIG. 4 is an explanatory diagram showing an installation example of an aerial display and of devices around the customer.
FIG. 5 is an explanatory diagram showing an example of a store sales floor.
FIG. 6 is an explanatory diagram simply showing an aerial display.
FIG. 7 is a block diagram showing a configuration example of the shopping support system according to the second embodiment.
FIG. 8 is an explanatory diagram showing an example in which a store sales floor is displayed on the dome-shaped display.
FIG. 9 is an explanatory diagram showing an example of detecting a product from the customer's hand movement and displaying information regarding the product on the aerial display.
FIG. 10 is an explanatory diagram showing an example of detecting a product from the movement of the customer's line of sight and displaying information regarding the product on the aerial display.
FIG. 11 is an explanatory diagram showing an example of detecting a product from a conversation and displaying information regarding the product on the aerial display.
FIG. 12 is an explanatory diagram showing an example of changing the orientation of the displayed product by changing the position and orientation of the imaging device that images the product.
FIG. 13 is an explanatory diagram showing an example of changing the orientation of the displayed product by switching imaging devices.
FIG. 14 is an explanatory diagram showing an example of changing the orientation of the displayed product by rotating the table.
FIG. 15 is a flowchart showing an operation example of the shopping support system according to the second embodiment.
FIG. 16 is an explanatory diagram showing an implementation example of the shopping support system.
FIG. 17 is an explanatory diagram showing an example of the hardware configuration of a computer.
 以下に図面を参照して、本開示にかかる買物支援システム、買物支援方法、プログラム、およびプログラムを記録する非一時的な記録媒体の実施の形態を詳細に説明する。本実施の形態は、開示の技術を限定するものではない。ここで、各実施の形態について、非接触型のディスプレイとして空中ディスプレイを一例に挙げて説明する場合がある Embodiments of a shopping support system, a shopping support method, a program, and a non-temporary recording medium for recording the program according to the present disclosure will be described in detail below with reference to the drawings. This embodiment does not limit the disclosed technology. Here, each embodiment may be explained using an aerial display as an example of a non-contact display.
 また、各実施の形態では、買物を例に挙げて、第1ディスプレイのユーザを顧客と呼び、買物支援システムとして説明するが、第1ディスプレイと、空中ディスプレイなどの第2ディスプレイとを組み合わせるユースケースは、買物に限定されない。 In addition, in each embodiment, shopping is taken as an example, the user of the first display is called a customer, and the description is given as a shopping support system, but there are use cases in which the first display is combined with a second display such as an aerial display. is not limited to shopping.
 (実施の形態1)
 まず、実施の形態1では、買物支援システムの基本機能について説明する。図1は、実施の形態1にかかる買物支援システムの一構成例を示すブロック図である。買物支援システム10は、第1出力制御部101と、商品検出部102と、第2出力制御部103と、を備える。
(Embodiment 1)
First, in Embodiment 1, basic functions of a shopping support system will be explained. FIG. 1 is a block diagram showing an example of a configuration of a shopping support system according to a first embodiment. The shopping support system 10 includes a first output control section 101, a product detection section 102, and a second output control section 103.
 第1出力制御部101は、第1ディスプレイの表示を制御する。第1ディスプレイは、例えば、大型のディスプレイである。ここでの大型とは、少なくとも第2ディスプレイよりも大きいことである。例えば、第1ディスプレイは、顧客の視界を覆うような構造になっている表示装置であってもよい。これにより、顧客は没入感がある状態で、買物を行うことができる。ここでは、第1ディスプレイとしてドーム型ディスプレイを例に挙げて説明する。例えば、第1出力制御部101は、ドーム型ディスプレイに、商品が陳列された店舗の売り場に関する情報を表示させる。ここで、売り場に関する情報とは、売り場の映像または売り場の画像などである。なお、ここでの映像は、動画像を示し、画像は、静止画を示す。なお、第1出力制御部101は、ドーム型ディスプレイに、売り場の映像や売り場の画像などとともに、追加情報として売り場の名称などの他の商品に関する情報を表示させてもよい。ここで、店舗は、例えば、スーパーマーケット、スーパーセンタ、コンビニエンスストア、量販店、ホームセンター、ドラッグストア、アパレル商品を扱う店舗、百貨店小売店、パン屋や惣菜店などの個人商店のような各種の小売店であってもよく、特に限定されない。 The first output control unit 101 controls the display on the first display. The first display is, for example, a large display. Here, large size means at least larger than the second display. For example, the first display may be a display device structured to cover the customer's field of view. This allows customers to shop in an immersive manner. Here, a dome-shaped display will be described as an example of the first display. For example, the first output control unit 101 causes a dome-shaped display to display information regarding the department of a store where products are displayed. Here, the information regarding the sales floor is a video of the sales floor, an image of the sales floor, or the like. Note that the video here refers to a moving image, and the image refers to a still image. Note that the first output control unit 101 may cause the dome-shaped display to display information regarding other products, such as the name of the sales floor, as additional information, along with the video of the sales floor and the image of the sales floor. Here, stores include various retail stores such as supermarkets, super centers, convenience stores, mass retailers, home centers, drug stores, stores that handle apparel products, department stores, and individual stores such as bakeries and delicatessen stores. may be used, and is not particularly limited.
 商品検出部102は、顧客の行動に基づいて、顧客が取得したい商品を検出する。例えば、顧客が取得したい商品とは、顧客が詳細を確認したい商品である。例えば、顧客が取得したい商品とは、顧客が注目する商品である。ここで、顧客の行動は、例えば、顧客の動きである。具体的に、例えば、顧客の行動は、より詳細に顧客の動きのうち顧客の手の動き、顧客の視線の動きなどのように身体の特定の動きであってもよい。例えば、商品検出部102は、撮像装置によって撮像された画像から顧客の動きを検出する。例えば、顧客の行動が顧客の手の動きである場合、商品検出部102は、顧客の手の動きに基づいて、顧客が手を伸ばした位置を特定し、特定した位置に応じた商品を検出する。特定した位置に応じた商品とは、例えば、ドーム型ディスプレイに表示された商品のうち、特定した位置の延長線上にある商品、特定した位置から所定の範囲内にある商品などである。 The product detection unit 102 detects products that the customer wants to acquire based on the customer's behavior. For example, the product that the customer wants to acquire is the product that the customer wants to check the details of. For example, a product that a customer wants to acquire is a product that attracts attention of the customer. Here, the customer's behavior is, for example, the customer's movement. Specifically, for example, the customer's behavior may be a specific movement of the customer's body, such as a movement of the customer's hand or a movement of the customer's line of sight. For example, the product detection unit 102 detects a customer's movement from an image captured by an imaging device. For example, if the customer's behavior is a movement of the customer's hand, the product detection unit 102 specifies the position where the customer extends his/her hand based on the movement of the customer's hand, and detects the product according to the specified position. do. The product corresponding to the specified position is, for example, a product displayed on the dome-shaped display that is on an extension of the specified position, a product that is within a predetermined range from the specified position, etc.
 また、例えば、顧客の行動は、顧客と店員との会話であってもよい。そこで、商品検出部102は、会話を音声認識することにより顧客の行動を検出してもよい。 Also, for example, the customer's behavior may be a conversation between the customer and a store clerk. Therefore, the product detection unit 102 may detect the customer's behavior by performing voice recognition on the conversation.
 第2出力制御部103は、第2ディスプレイの表示を制御する。第2ディスプレイは、顧客の手元用のディスプレイである。第2ディスプレイは、例えば、第1ディスプレイよりも顧客の近傍にあるディスプレイである。例えば、第2ディスプレイは、顧客と第1ディスプレイとの間にある。また、例えば、第2ディスプレイは、顧客が表示を見やすくなる位置に設置されたディスプレイである。例えば、第2ディスプレイの表示領域は、第1ディスプレイの表示領域よりも小さい。例えば、第1ディスプレイは、売り場の画像や映像などの売り場に関する情報を表示するのに対して、第2ディスプレイは、顧客によって選択された商品などのように注目された商品の画像や映像などの商品に関する情報を表示する。このため、例えば、第2ディスプレイは、顧客がドーム型ディスプレイよりも顧客に近くに設置され、見易いように商品に関する情報を表示することが望ましい。 The second output control unit 103 controls the display on the second display. The second display is a display for the customer's hand. The second display is, for example, a display that is closer to the customer than the first display. For example, the second display is between the customer and the first display. Further, for example, the second display is a display installed at a position where the customer can easily view the display. For example, the display area of the second display is smaller than the display area of the first display. For example, the first display displays information about the sales floor, such as images and videos of the sales floor, while the second display displays images and videos of products that have been noticed, such as products selected by customers. Display information about products. For this reason, for example, it is desirable that the second display be installed closer to the customer than the dome-shaped display and display information about the product in a way that is easy to see.
 また、第2ディスプレイは、固定設置されたディスプレイであってもよいし、可搬型のディスプレイであってもよい。ここで、第2ディスプレイとして、非接触型のディスプレイを例に挙げて説明するが、接触型のディスプレイであってもよい。 Furthermore, the second display may be a fixed display or a portable display. Here, as the second display, a non-contact type display will be described as an example, but a contact type display may be used.
 また、例えば、表示の対象は、画像、映像、文字や数字、任意の装置で読取可能な特殊なコードなど特に限定されない。 Further, for example, the display target is not particularly limited, such as images, videos, characters and numbers, and special codes that can be read by any device.
 例えば、第2出力制御部103は、検出された商品に関する情報を非接触型のディスプレイに表示させる。例えば、商品に関する情報は、商品の画像、商品の映像、商品に関連する情報などである。例えば、商品に関連する情報は、商品に関連する文字や数字、コードなどであり、具体的に商品の価格、商品の名称、商品の産地や製造メーカなど特に限定されない。商品のコードは、商品に関連する文字や数字を表すコードであってもよい。商品の映像は、リアルタイムの映像であってもよいし、事前に撮像された映像であってもよい。商品の画像は、リアルタイムに新たに撮像される画像であってもよいし、事前に撮像された画像であってもよい。売り場の映像や画像にも商品は映っているが、商品の映像や画像は、商品に注目した映像や画像である。 For example, the second output control unit 103 displays information regarding the detected product on a non-contact display. For example, the information regarding the product includes an image of the product, a video of the product, information related to the product, and the like. For example, the information related to the product includes characters, numbers, codes, etc. related to the product, and is not particularly limited to the price of the product, the name of the product, the place of production, and the manufacturer of the product. The product code may be a code representing characters and numbers related to the product. The image of the product may be a real-time image or may be an image captured in advance. The image of the product may be a new image taken in real time, or may be an image taken in advance. Products are also shown in videos and images of the sales floor, but videos and images of products are videos and images that focus on the product.
 (フローチャート)
 図2は、実施の形態1にかかる買物支援システム10の一動作例を示すフローチャートである。第1出力制御部101は、ドーム型ディスプレイに、店舗の売り場に関する情報を表示させる(ステップS101)。
(flowchart)
FIG. 2 is a flowchart showing an example of the operation of the shopping support system 10 according to the first embodiment. The first output control unit 101 causes the dome-shaped display to display information regarding the sales floor of the store (step S101).
 商品検出部102は、顧客が取得したい商品を検出したかを判定する(ステップS102)。ステップS102において、商品検出部102は、顧客の行動に基づいて、顧客が取得したい商品を検出する。顧客が取得したい商品が検出されていない場合(ステップS102:No)、商品検出部102は、ステップS102へ戻る。 The product detection unit 102 determines whether the product that the customer wants to acquire is detected (step S102). In step S102, the product detection unit 102 detects a product that the customer wants to acquire based on the customer's behavior. If the product that the customer wants to acquire is not detected (step S102: No), the product detection unit 102 returns to step S102.
 顧客が取得したい商品が検出された場合(ステップS102:No)、第2出力制御部103は、非接触型のディスプレイに、検出された商品に関する情報を表示させる(ステップS103)。そして、買物支援システム10は、処理を終了する。 If the product that the customer wants to acquire is detected (step S102: No), the second output control unit 103 displays information regarding the detected product on the non-contact display (step S103). The shopping support system 10 then ends the process.
 以上、実施の形態1において、買物支援システム10は、例えば、大型のディスプレイである第1ディスプレイに店舗を表示させ、顧客の行動によって検出された商品を第2ディスプレイに表示させる。これにより、ドーム型ディスプレイなどの大型のディスプレイを使う場合における利便性の向上を図ることができる。したがって、より実感的な買物体験を提供することができる。例えば、買物支援システム10は、ドーム型ディスプレイに店舗の売り場に関する情報を表示させ、陳列された商品が顧客によって選択されると、手元用のディスプレイに、選択された商品の画像または映像を大きく表示させたり、選択された商品の追加情報を提示するなどを行うことができる。そして、顧客は、各商品や商品に関する情報をドーム型ディスプレイで確認する場合と比較して、手元用のディスプレイなどを用いて、より近くで見えたほうが確認しやすい。 As described above, in the first embodiment, the shopping support system 10 displays the store on the first display, which is a large display, and displays the products detected based on the customer's behavior on the second display. This makes it possible to improve convenience when using a large display such as a dome-shaped display. Therefore, a more realistic shopping experience can be provided. For example, the shopping support system 10 displays information about the store's sales floor on a dome-shaped display, and when a displayed product is selected by a customer, a large image or video of the selected product is displayed on a display at hand. You can also display additional information about the selected product. In addition, it is easier for customers to check information about each product or product if they can see it closer using a handheld display or the like, compared to checking information about each product or product on a dome-shaped display.
 また、例えば、第2ディスプレイは、顧客と第1ディスプレイとの間に設置される。第2ディスプレイとして空中ディスプレイが使用される場合、空中ディスプレイに商品に関する情報を表示させていないときには、顧客と第1ディスプレイとの間に物理的なものがない。このため、第2ディスプレイとして空中ディスプレイが使用される場合、顧客の手の動きの邪魔にならず、顧客が第1ディスプレイを見るときの視界の邪魔にならない。このため、商品に手を伸ばしたり、遮るものがなく店員と会話するような実際の買物により近い買物体験を提供することができる。さらに、空中ディスプレイ部分を周囲の人物から見えないようにすることにより、顧客の個人情報などの他の人に見られたくない情報を、空中ディスプレイを介して顧客に提示することができる。なお、第1ディスプレイは、大きいため、各情報を表示させると、他の人に見られてしまう恐れがある。このように、他の人に見られたくない情報は、第1ディスプレイに表示させにくい内容である。また、第2ディスプレイとして空中ディスプレイが使用される場合、顧客は、商品を立体的に見ることができる場合がある。 Also, for example, the second display is installed between the customer and the first display. When an aerial display is used as the second display, there is no physical space between the customer and the first display when the aerial display is not displaying information about the product. Therefore, when an aerial display is used as the second display, it does not interfere with the customer's hand movements and does not interfere with the customer's view when viewing the first display. Therefore, it is possible to provide a shopping experience that is closer to actual shopping, such as reaching for products and having an unobstructed conversation with store staff. Furthermore, by making the aerial display portion invisible to surrounding people, information such as the customer's personal information that the customer does not want other people to see can be presented to the customer via the aerial display. Note that since the first display is large, there is a risk that other people may see the information displayed. In this way, information that the user does not want others to see is content that is difficult to display on the first display. Furthermore, when an aerial display is used as the second display, the customer may be able to view the product three-dimensionally.
 また、例えば、HMD(Head Mounted Display)のように装着型の情報処理端末が表示手段として利用される場合、仮想空間が表示された状態では、ユーザは手元が見えない場合がある。一方、例えばドーム型ディスプレイが表示手段として利用される場合、仮想空間が表示された状態であっても、ユーザは、手元が見えるという利点がある。また、ドーム型ディスプレイが表示手段として利用される場合、ドーム型ディスプレイとユーザとの間に距離があるため、空間ディスプレイなどの第2ディスプレイが設置可能となる。 Further, for example, when a wearable information processing terminal such as an HMD (Head Mounted Display) is used as a display means, the user may not be able to see his/her hand while the virtual space is displayed. On the other hand, for example, when a dome-shaped display is used as the display means, the user has the advantage of being able to see his or her hand even when the virtual space is displayed. Further, when a dome-shaped display is used as a display means, since there is a distance between the dome-shaped display and the user, a second display such as a spatial display can be installed.
 また、ドーム型ディスプレイが使用される場合、汎用的な使用が考えられる。例えば、誰もがアクセスできるような場所にドーム型ディスプレイが設置され、様々なユーザが、ドーム型ディスプレイを使用することが想定される。このような場合、コントローラに不慣れなユーザもいる。実施の形態1によれば、買物時の操作性の向上を図ることができる。実施の形態1のようにドーム型ディスプレイを使う場合における利便性の向上を図ることにより、より汎用性を高めることを可能とする。また、視界を覆うようなドーム型ディスプレイ21に売り場の映像や画像が表示されることにより、顧客は没入感がある状態で、買物を行うことができる。したがって、実際の買物により近い買物体験を提供することができる。 Additionally, when a dome-shaped display is used, it can be used for general purposes. For example, it is assumed that a dome-shaped display is installed in a place where everyone can access it, and that various users use the dome-shaped display. In such cases, some users may be unfamiliar with the controller. According to the first embodiment, it is possible to improve the operability during shopping. By improving convenience when using a dome-shaped display as in the first embodiment, it is possible to further enhance versatility. In addition, since videos and images of the sales floor are displayed on the dome-shaped display 21 that covers the customer's field of view, the customer can shop with an immersive feeling. Therefore, it is possible to provide a shopping experience closer to actual shopping.
 ただし、コントローラによる操作と買物支援システム10とは、適宜組み合わせて利用されてもよい。 However, the operation by the controller and the shopping support system 10 may be used in combination as appropriate.
 (実施の形態2)
 つぎに、実施の形態2について図面を参照して詳細に説明する。実施の形態2では、第1ディスプレイとしてのドーム型ディスプレイの例、第2ディスプレイとしての空中ディスプレイの設置例などについて説明する。さらに、実施の形態2では、空中ディスプレイにおいて商品に関する情報の表示中に、表示された商品をより見やすくするために、ドーム型ディスプレイの表示を制御する例について説明する。以下、本実施の形態2の説明が不明確にならない範囲で、前述の説明と重複する内容については説明を省略する。
(Embodiment 2)
Next, Embodiment 2 will be described in detail with reference to the drawings. In Embodiment 2, an example of a dome-shaped display as a first display, an installation example of an aerial display as a second display, etc. will be described. Furthermore, in Embodiment 2, an example will be described in which the display of the dome-shaped display is controlled in order to make the displayed product easier to see while displaying information about the product on the aerial display. Hereinafter, a description of contents that overlap with the above description will be omitted to the extent that the description of the second embodiment is not unclear.
 図3は、ドーム型ディスプレイの一例を示す説明図である。図4は、空中ディスプレイの設置例および顧客周辺の装置の設置例を示す説明図である。図3において、ドーム型ディスプレイ21は、例えば、スクリーン2101と、投影装置2102と、テーブル2103と、を備える。 FIG. 3 is an explanatory diagram showing an example of a dome-shaped display. FIG. 4 is an explanatory diagram showing an example of installing an aerial display and an example of installing devices around a customer. In FIG. 3, the dome-shaped display 21 includes, for example, a screen 2101, a projection device 2102, and a table 2103.
 ドーム型ディスプレイ21が映像や画像を表示するとは、投影装置2102がスクリーン2101に映像や画像を投影することである。 When the dome-shaped display 21 displays a video or image, it means that the projection device 2102 projects the video or image onto the screen 2101.
 例えば、テーブル2103には、空中ディスプレイ22が設置される。また、図4では、ドーム型ディスプレイ21のスクリーン2101や投影装置2102を省略している。図4では、顧客が、椅子に座って、ドーム型ディスプレイと空中ディスプレイ22に表示されたリンゴを見ている。図4では、テーブル2103を横から見ている例を示すが、実際には、空中ディスプレイ22には、後述する空中結像に厚さがない場合がある。また、このような場合には、空中結像を厚さ方向に横から見た場合、空中結像を見ることができないが、以降の説明において、理解の容易化のために、横から見た場合にリンゴ等が表示されている例を用いて説明する。また、図4において、ドーム型ディスプレイ21が設置される場所には、撮像装置23と、録音装置24と、が設置されてもよい。例えば、撮像装置23と録音装置24とは、顧客の行動を検出するために用いられる。なお、図示しないが、ドーム型ディスプレイ21が設置される場所には、店員の声を出力するスピーカなどの音声出力装置が設置されていてもよい。 For example, the aerial display 22 is installed on the table 2103. Further, in FIG. 4, the screen 2101 of the dome-shaped display 21 and the projection device 2102 are omitted. In FIG. 4, a customer is sitting on a chair and looking at apples displayed on the dome-shaped display and the aerial display 22. Although FIG. 4 shows an example in which the table 2103 is viewed from the side, in reality, the aerial display 22 may not have a thickness for aerial imaging, which will be described later. In addition, in such a case, if the aerial image is viewed from the side in the thickness direction, the aerial image cannot be seen, but in the following explanation, for ease of understanding, This will be explained using an example in which an apple or the like is displayed. Furthermore, in FIG. 4, an imaging device 23 and a recording device 24 may be installed at the location where the dome-shaped display 21 is installed. For example, the imaging device 23 and the recording device 24 are used to detect customer behavior. Although not shown, an audio output device such as a speaker for outputting the voice of the store clerk may be installed at the location where the dome-shaped display 21 is installed.
 撮像装置23と録音装置24と空中ディスプレイ22とのそれぞれの数は、特に限定されない。例えば、撮像装置23は、撮像した映像または画像を買物支援システムなどに送信すればよい。また、録音装置24は、音声を買物支援システムなどに送信すればよい。 The number of imaging devices 23, recording devices 24, and aerial displays 22 is not particularly limited. For example, the imaging device 23 may transmit the captured video or image to a shopping support system or the like. Furthermore, the recording device 24 may transmit the audio to a shopping support system or the like.
 ドーム型ディスプレイ21は、例えば、スクリーン2101の少なくとも一部が曲面になっていて、顧客の視界を覆うような構造になっている装置である。なお、ドーム型ディスプレイ21は、例えば、360度のような完全なドーム状ではなく、180度のように一部分が欠けていてもよい。また、ドーム型ディスプレイ21は、スクリーン2101の一部分が湾曲してなくてもよい。また、ドーム型ディスプレイ21は、ユーザとの間に、空中ディスプレイ22が設置されたテーブル2103を設置可能な程度の隙間ができるようなサイズである。このように、ドーム型ディスプレイ21のサイズおよび種類は、特に限定されない。例えば、ドーム型ディスプレイ21は、180度のドーム型ディスプレイ21であってもよいし、360度のドーム型ディスプレイ21であってもよい。また、例えば、ドーム型ディスプレイ21のサイズは、例えば、縦1メートルから2メートル、横1メートルから2メートル、高さ1メートルから2メートル程度であってもよい。 The dome-shaped display 21 is, for example, a device in which at least a portion of the screen 2101 is curved to cover the customer's field of view. Note that the dome-shaped display 21 may not have a complete dome shape such as 360 degrees, but may have a partially broken shape such as 180 degrees. Furthermore, in the dome-shaped display 21, a portion of the screen 2101 does not need to be curved. Furthermore, the dome-shaped display 21 is sized to create a gap between it and the user that is large enough to allow the installation of the table 2103 on which the aerial display 22 is installed. In this way, the size and type of the dome-shaped display 21 are not particularly limited. For example, the dome-shaped display 21 may be a 180-degree dome-shaped display 21 or a 360-degree dome-shaped display 21. Further, for example, the size of the dome-shaped display 21 may be, for example, approximately 1 meter to 2 meters in length, 1 meter to 2 meters in width, and 1 meter to 2 meters in height.
 また、ドーム型ディスプレイ21の設置場所は、特に限定されない。例えば、ドーム型ディスプレイ21は、顧客の家、会社などに設置されていてもよいし、誰でも使えるような場所に設置されていてもよい。 Further, the installation location of the dome-shaped display 21 is not particularly limited. For example, the dome-shaped display 21 may be installed in a customer's home, office, etc., or may be installed in a place where anyone can use it.
 図5は、店舗の売り場の一例を示す説明図である。例えば、店舗の売り場には、例えば、商品棚などが設置される。設置棚には、商品が陳列される。店舗は、仮想空間における店舗であってもよいし、実際の店舗であってもよい。仮想空間における店舗は、実際にはない仮想の店舗であってもよいし、実際の店舗を模した仮想の店舗であってもよい。実際の店舗があり、実際の店舗の映像がドーム型ディスプレイ21に表示される場合、実際の店舗の売り場には、例えば、撮像装置25、録音装置26が設置される。また、図示しないが、実際の店舗の売り場には、例えば、顧客の声を出力するスピーカなどの音声出力装置が設置されていてもよい。 FIG. 5 is an explanatory diagram showing an example of a store's sales floor. For example, in the sales floor of a store, for example, product shelves are installed. Products are displayed on the installed shelves. The store may be a store in a virtual space or an actual store. The store in the virtual space may be a virtual store that does not actually exist, or a virtual store that imitates an actual store. If there is an actual store and an image of the actual store is displayed on the dome-shaped display 21, for example, an imaging device 25 and a recording device 26 are installed in the sales floor of the actual store. Further, although not shown, an audio output device such as a speaker for outputting customer voices may be installed in the sales floor of the actual store.
 図5において、撮像装置25-1と撮像装置25-2との2台が設置されているが、撮像装置25の数は、特に限定されない。なお、撮像装置25を特に限定しない場合、撮像装置25と表す。撮像装置25は、店舗の売り場を撮像する。また、撮像装置25は、店員を撮像してもよい。また、撮像装置25は、店舗の各商品を撮像してもよい。ここで、撮像装置25-1は、店員を含む店舗の売り場を撮像し、撮像装置25-2は、指定された商品を撮像してもよい。例えば、撮像装置25は、撮像した映像を買物支援システムなどに送信すればよい。 In FIG. 5, two imaging devices 25-1 and 25-2 are installed, but the number of imaging devices 25 is not particularly limited. Note that if the imaging device 25 is not particularly limited, it will be referred to as an imaging device 25. The imaging device 25 images the sales floor of the store. Further, the imaging device 25 may take an image of a store clerk. Further, the imaging device 25 may take images of each product in the store. Here, the imaging device 25-1 may take an image of the sales floor of the store including the store staff, and the imaging device 25-2 may take an image of a specified product. For example, the imaging device 25 may transmit the captured video to a shopping support system or the like.
 また、録音装置26は、売り場の音声を録音または集音する。そして、例えば、録音装置26は、音声を買物支援システムなどに送信すればよい。 Additionally, the recording device 26 records or collects the voices of the sales floor. Then, for example, the recording device 26 may transmit the audio to a shopping support system or the like.
 なお、図5には、実際の店舗がある例を挙げているが、仮想空間上の店舗が用いられてもよい。例えば、実際の店舗がない場合などに仮想空間上の店舗が用いられてもよい。 Although FIG. 5 shows an example of an actual store, a store in a virtual space may also be used. For example, a store in virtual space may be used when there is no actual store.
 図6は、空中ディスプレイを簡易的に示す説明図である。空中ディスプレイ22は、例えば、光学素子と、ディスプレイと、センサと、を備える。 FIG. 6 is an explanatory diagram that simply shows an aerial display. The aerial display 22 includes, for example, an optical element, a display, and a sensor.
 空中ディスプレイ22では、光学素子が、ディスプレイの表示によって放たれた光線を通過させ、ディスプレイと反対側に同じ像を形成する。図6において、この像が、空中結像である。センサは、空中の映像を操作するために用いられる。センサは、例えば、モーションセンサである。以降の説明では、例えばセンサによって検出可能な操作を空中ディスプレイに対する操作とも呼ぶ。空中ディスプレイの種類は、特に限定されない。 In the aerial display 22, optical elements pass the light rays emitted by the display's display and form the same image on the side opposite the display. In FIG. 6, this image is an aerial image. Sensors are used to manipulate images in the air. The sensor is, for example, a motion sensor. In the following description, for example, an operation that can be detected by a sensor will also be referred to as an operation on the aerial display. The type of aerial display is not particularly limited.
 図7は、実施の形態2にかかる買物支援システムの一構成例を示すブロック図である。買物支援システム20は、第1出力制御部201と、商品検出部202と、第2出力制御部203と、取得部204と、映像解析部205と、音声解析部206と、撮像装置制御部207と、台制御部208と、重畳位置特定部209と、視認位置特定部210と、操作受付部211と、を備える。 FIG. 7 is a block diagram showing an example of the configuration of a shopping support system according to the second embodiment. The shopping support system 20 includes a first output control section 201, a product detection section 202, a second output control section 203, an acquisition section 204, a video analysis section 205, an audio analysis section 206, and an imaging device control section 207. , a platform control section 208 , a superimposition position specifying section 209 , a visual position specifying section 210 , and an operation receiving section 211 .
 買物支援システム20は、実施の形態1にかかる買物支援システム10に対して、取得部204と、映像解析部205と、音声解析部206と、撮像装置制御部207と、台制御部208と、重畳位置特定部209と、視認位置特定部210と、操作受付部211と、が追加される。第1出力制御部201は、実施の形態1にかかる第1出力制御部101の機能を基本機能として備える。商品検出部202は、実施の形態1にかかる商品検出部102の機能を基本機能として備える。第2出力制御部203は、実施の形態1にかかる第2出力制御部103の機能を基本機能として備える。 The shopping support system 20 includes an acquisition unit 204, a video analysis unit 205, an audio analysis unit 206, an imaging device control unit 207, a stand control unit 208, in addition to the shopping support system 10 according to the first embodiment. A superimposition position specifying section 209, a visible position specifying section 210, and an operation receiving section 211 are added. The first output control section 201 has the functions of the first output control section 101 according to the first embodiment as a basic function. The product detection unit 202 has the function of the product detection unit 102 according to the first embodiment as a basic function. The second output control section 203 has the functions of the second output control section 103 according to the first embodiment as a basic function.
 また、例えば、買物支援システム20は、顧客DB(DataBase)2001と、店舗DB2002と、商品DB2003と、を有する。なお、買物支援システム20の各機能部は、各種データベースを適宜参照したり、更新することができる。 For example, the shopping support system 20 includes a customer DB (DataBase) 2001, a store DB 2002, and a product DB 2003. Note that each functional unit of the shopping support system 20 can refer to and update various databases as appropriate.
For example, the customer DB 2001 stores customer video and audio, the customer's behavior history, and the like. The behavior history may be a history of the customer's movements detected from video, a history of conversations detected from audio, or a history of the customer's operations on the aerial display 22.
 また、例えば、店舗DB2002は、店舗の映像、店舗の画像、店舗の音声などを記憶する。 Also, for example, the store DB 2002 stores store videos, store images, store sounds, and the like.
 商品DB2003は、商品別に、商品の商品識別情報と、商品の情報と、を対応付けて記憶する。商品の情報は、例えば、商品の映像、商品の画像、商品の価格、商品の名称、売り場における商品の陳列位置、商品の特徴などの情報である。商品の特徴は、商品の色、商品の形状、商品の産地、商品の原材料、商品の製造会社などであってもよい。例えば、商品の特徴は、顧客と店員との会話などから商品の特定に用いられる場合があるため、商品を特定可能なような情報であれば、特に限定されない。 The product DB 2003 stores product identification information and product information in association with each other for each product. The product information includes, for example, a video of the product, an image of the product, a price of the product, a name of the product, a display position of the product on the sales floor, characteristics of the product, and the like. The characteristics of the product may be the color of the product, the shape of the product, the production area of the product, the raw materials of the product, the manufacturing company of the product, and the like. For example, the characteristics of a product may be used to identify the product based on a conversation between a customer and a store employee, so there is no particular limitation on the characteristics as long as the information allows the product to be identified.
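As a non-limiting illustration of how such records could be organized in software, a minimal sketch in Python is shown below. The field names (product_id, shelf_position, features, and so on), the dictionary-based lookup, and the helper find_by_feature are assumptions introduced here only for explanation; they are not the actual schema of the product DB 2003.

from dataclasses import dataclass, field
from typing import Dict, List, Tuple

@dataclass
class ProductRecord:
    """One entry of a product DB keyed by product identification information."""
    product_id: str
    name: str                                       # product name, e.g. "apple"
    price: int                                      # price in the store's currency unit
    shelf_position: Tuple[float, float, float]      # display position on the sales floor
    image_paths: List[str] = field(default_factory=list)   # pre-captured images
    features: Dict[str, str] = field(default_factory=dict)  # color, shape, origin, maker, ...

# The DB itself can then be a mapping from product ID to record.
product_db: Dict[str, ProductRecord] = {
    "P001": ProductRecord(
        product_id="P001",
        name="apple",
        price=150,
        shelf_position=(2.0, 0.5, 1.2),
        image_paths=["images/P001_front.jpg"],
        features={"color": "red", "origin": "Aomori"},
    ),
}

def find_by_feature(db: Dict[str, ProductRecord], key: str, value: str) -> List[ProductRecord]:
    """Return all products whose stored feature matches, e.g. find_by_feature(db, 'color', 'red')."""
    return [rec for rec in db.values() if rec.features.get(key) == value]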
 つぎに、各機能部について説明する。 Next, each functional section will be explained.
For example, the acquisition unit 204 acquires a captured video of the customer from the imaging device 23. The acquired video data is stored, for example, in the customer DB 2001. The acquisition unit 204 also acquires the customer's voice from the recording device 24. The acquired audio data is stored, for example, in the customer DB 2001.
 If there is an actual store, the acquisition unit 204 acquires a captured video of the store from the imaging device 25. The acquired video data is stored, for example, in the store DB 2002. Note that the acquisition unit 204 may acquire a video of the store created in advance. For example, a video of the store in virtual space may be created when there is no actual store. That video data is likewise stored, for example, in the store DB 2002.
 実際の店舗がある場合、取得部204は、録音装置26から、店舗の音声を取得する。取得された音声のデータは、例えば、店舗DB2002に記憶される。 If there is an actual store, the acquisition unit 204 acquires the store's audio from the recording device 26. The acquired voice data is stored in the store DB 2002, for example.
 また、取得部204は、仮想空間上における音声を取得してもよい。 Additionally, the acquisition unit 204 may acquire audio in the virtual space.
The first output control unit 201 causes the dome-shaped display 21 to display information regarding the sales floor of the store where the products are displayed. For example, the information regarding the sales floor of the store includes a video of the sales floor, an image of the sales floor, and the like. Specifically, if there is an actual store, for example, the first output control unit 201 causes the dome-shaped display 21 to display the video of the sales floor acquired from the imaging device 25. This allows the customer to view video of the sales floor captured in real time, giving the feeling of shopping in an actual store.
 図8は、ドーム型ディスプレイ21に店舗の売り場が表示される例を示す説明図である。図8において、ドーム型ディスプレイ21には、図5に示すような売り場が表示されている。そして、例えば、顧客は、椅子に座って、売り場を見ることができる。 FIG. 8 is an explanatory diagram showing an example in which the sales floor of a store is displayed on the dome-shaped display 21. In FIG. 8, the dome-shaped display 21 displays a sales floor as shown in FIG. For example, the customer can sit on a chair and view the sales floor.
 図7の説明に戻る。第1出力制御部201は、ドーム型ディスプレイ21に、さらに、店員に関する情報を表示させる。店員に関する情報とは、店員の映像、店員の画像、店員の名前、店員の所属、店員の特徴などである。店員の所属とは、例えば、「野菜売り場担当」などのように店舗における店員の所属である。店員の特徴は、例えば、保有資格などであってもよい。例えば、実際の店舗に店員がいる場合、撮像装置25は、店員を含むように売り場を撮像する。そして、第1出力制御部201は、ドーム型ディスプレイ21に、撮像された映像を表示させる。図8の例では、図5の売り場にいる店員を含めて撮像された映像が表示されている。また、店員に関する情報を表示させる場合に、第1出力制御部201は、店員のアバターを表示させてもよい。例えば、店員のアバターを表示させる場合に、撮像装置25が、売り場を撮像する。そして、第1出力制御部201は、撮像された映像に店員のアバターを合成し、合成された映像を表示させてもよい。または、例えば、第1出力制御部201は、仮想空間における売り場の映像に、店員のアバターを合成し、合成された映像を表示させてもよい。 Returning to the explanation of FIG. 7. The first output control unit 201 further causes the dome-shaped display 21 to display information regarding the store clerk. Information regarding the store clerk includes a video of the store clerk, an image of the store clerk, the name of the store clerk, the employee's affiliation, characteristics of the store employee, and the like. The employee's affiliation is the employee's affiliation at the store, such as "in charge of vegetable section," for example. The characteristics of the store clerk may be, for example, qualifications held. For example, if a store employee is present in an actual store, the imaging device 25 captures an image of the sales floor including the store employee. Then, the first output control unit 201 causes the dome-shaped display 21 to display the captured video. In the example of FIG. 8, an image captured including the store clerk at the sales floor of FIG. 5 is displayed. Further, when displaying information regarding a store clerk, the first output control section 201 may display an avatar of the store clerk. For example, when displaying a store clerk's avatar, the imaging device 25 images the sales floor. The first output control unit 201 may then combine the captured video with the store clerk's avatar and display the combined video. Alternatively, for example, the first output control unit 201 may combine the store clerk's avatar with the video of the sales floor in the virtual space, and display the combined video.
 なお、録音装置24から店員の音声が取得可能であれば、取得部204は、録音装置24から顧客および店員の音声を取得してもよい。 Note that if the voice of the store clerk can be acquired from the recording device 24, the acquisition unit 204 may acquire the voice of the customer and the store clerk from the recording device 24.
 商品検出部202は、顧客の行動に基づいて、顧客が取得したい商品を検出する。まず、商品検出部202が商品を検出するために、例えば、映像解析部205は、映像から顧客の行動を解析する。そして、商品検出部202が、解析された顧客の行動に基づいて、商品を検出する。または、商品検出部202が商品を検出するために、音声解析部206は、音声から顧客の行動を解析する。そして、商品検出部202が、解析された顧客の行動に基づいて、商品を検出する。なお、商品の検出例については、顧客の行動別に後述する。 The product detection unit 202 detects products that the customer wants to acquire based on the customer's behavior. First, in order for the product detection unit 202 to detect the product, for example, the video analysis unit 205 analyzes the customer's behavior from the video. Then, the product detection unit 202 detects the product based on the analyzed customer behavior. Alternatively, in order for the product detection unit 202 to detect the product, the voice analysis unit 206 analyzes the customer's behavior from the voice. Then, the product detection unit 202 detects the product based on the analyzed customer behavior. Note that examples of product detection will be described later for each customer's behavior.
 そして、第2出力制御部203は、検出された商品に関する情報を空中ディスプレイ22に表示させる。商品に関する情報は、商品の映像であってもよいし、商品の画像であってもよいし、商品の名称などの商品に関連する情報であってもよい。商品の映像は、例えば、撮像された映像であってもよいし、商品を仮想空間上に再現した映像であってもよい。撮像された映像は、撮像装置25によってリアルタイムに商品が撮像された映像であってもよいし、事前に商品が撮像された映像であってもよい。また、商品の画像は、商品が撮像された画像であってもよいし、イラストなどの商品の画像であってもよいし、商品を仮想空間上に再現した画像であってもよい。撮像された画像は、撮像装置25によってリアルタイムに商品が撮像された画像であってもよいし、事前に商品が撮像された画像であってもよい。 Then, the second output control unit 203 causes the aerial display 22 to display information regarding the detected product. The information regarding the product may be a video of the product, an image of the product, or information related to the product such as the name of the product. The image of the product may be, for example, a captured image or an image that reproduces the product in a virtual space. The captured video may be a video of the product captured in real time by the imaging device 25, or may be a video of the product captured in advance. Further, the image of the product may be a captured image of the product, an image of the product such as an illustration, or an image of the product reproduced in virtual space. The captured image may be an image of the product captured in real time by the imaging device 25, or may be an image of the product captured in advance.
 <商品の検出例>
 つぎに、顧客の行動別に、商品の検出例について説明する。
<Example of product detection>
Next, examples of product detection will be explained based on customer behavior.
 <手の動きによる商品の検出例>
 まず、顧客の行動が顧客の手の動きである場合を例に挙げて説明する。例えば、顧客の行動が顧客の手の動きである場合、映像解析部205は、撮像装置23によって撮像された映像から、顧客の手の動きを検出してもよい。例えば、商品検出部202は、顧客の手の動きに基づいて、顧客が取得したい商品を検出する。より具体的に商品を検出する方法として、例えば、商品検出部202は、顧客が手を伸ばした位置に基づいて、商品を検出する。例えば、商品検出部202は、顧客が手を伸ばした位置にある商品またはその位置の近傍にある商品を検出する。顧客が手を伸ばした位置の近傍にある商品とは、顧客が手を伸ばした位置の延長線上にある商品であってもよいし、顧客が手を伸ばした位置から最も近くにある商品であってもよい。
<Example of product detection based on hand movements>
First, the case where the customer's behavior is the movement of the customer's hand will be explained as an example. For example, when the customer's behavior is a hand movement, the video analysis unit 205 may detect the movement of the customer's hand from the video captured by the imaging device 23. The product detection unit 202 then detects the product that the customer wants to acquire based on the customer's hand movement. As a more specific detection method, for example, the product detection unit 202 detects the product based on the position to which the customer extends his or her hand. For example, the product detection unit 202 detects the product located at the position to which the customer extends his or her hand, or a product near that position. A product near the position to which the customer extends his or her hand may be a product located on an extension of that position, or the product closest to that position.
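As a non-limiting illustration of this detection, the sketch below returns the product nearest to the line extended from the customer's hand. The function name, the representation of product positions (assumed to come from the displayed sales-floor layout), and the distance threshold are assumptions introduced for explanation; the actual detection method is not limited to this.

import math
from typing import Dict, Optional, Tuple

Point3D = Tuple[float, float, float]

def detect_reached_product(
    hand_position: Point3D,
    reach_direction: Point3D,
    displayed_products: Dict[str, Point3D],
    max_distance: float = 0.5,
) -> Optional[str]:
    """Return the ID of the product nearest to the extension of the customer's reach.

    hand_position and reach_direction come from video analysis of the customer;
    displayed_products maps product IDs to their displayed 3D positions.
    """
    # Normalize the reach direction.
    norm = math.sqrt(sum(c * c for c in reach_direction)) or 1.0
    d = tuple(c / norm for c in reach_direction)

    best_id, best_dist = None, float("inf")
    for product_id, p in displayed_products.items():
        # Vector from the hand to the product.
        v = tuple(p[i] - hand_position[i] for i in range(3))
        # Projection length along the reach direction.
        t = sum(v[i] * d[i] for i in range(3))
        closest = tuple(hand_position[i] + t * d[i] for i in range(3))
        dist = math.dist(p, closest)      # distance from the product to the reach line
        if t > 0 and dist < best_dist:    # only consider products in front of the hand
            best_id, best_dist = product_id, dist
    return best_id if best_dist <= max_distance else None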
 ここで、図9を用いて、顧客の手の動きの検出から空中ディスプレイ22への商品に関する情報の表示の一例を説明する。 Here, an example of displaying information regarding a product on the aerial display 22 based on detection of a customer's hand movement will be described using FIG. 9.
 図9は、顧客の手の動きから商品を検出する例および空中ディスプレイ22へ商品に関する情報を表示させる例を示す説明図である。図9において、ドーム型ディスプレイ21のスクリーン2101や投影装置2102については、図示省略している。例えば、撮像装置23が、顧客を撮像する。そして、取得部204は、撮像された顧客の映像をリアルタイムで取得する。そして、商品検出部202は、映像から顧客の行動を検出する。 FIG. 9 is an explanatory diagram showing an example of detecting a product from a customer's hand movement and an example of displaying information regarding the product on the aerial display 22. In FIG. 9, the screen 2101 of the dome-shaped display 21 and the projection device 2102 are not shown. For example, the imaging device 23 images a customer. The acquisition unit 204 then acquires the captured image of the customer in real time. The product detection unit 202 then detects the customer's behavior from the video.
 図9において、映像解析部205は、映像に基づいて、顧客が座っている状態から立ち上がりドーム型ディスプレイ21に対して手を伸ばしていることを検出する。そして、図9において、商品検出部202は、ドーム型ディスプレイ21に表示されている売り場の商品から、顧客の手を伸ばした先にある商品を検出する。そして、第2出力制御部203は、検出された商品に関する情報を空中ディスプレイ22に出力させる。これにより、図9において、空中ディスプレイ22は、商品の画像または商品の映像などの商品に関する情報を表示する。図9の例では、リンゴが表示されている。 In FIG. 9, the video analysis unit 205 detects that the customer stands up from a sitting position and extends his hand toward the dome-shaped display 21 based on the video. Then, in FIG. 9, the product detection unit 202 detects the product at the end of the customer's outstretched hand from among the products on the sales floor displayed on the dome-shaped display 21. Then, the second output control unit 203 causes the aerial display 22 to output information regarding the detected product. Accordingly, in FIG. 9, the aerial display 22 displays information regarding the product, such as an image of the product or a video of the product. In the example of FIG. 9, an apple is displayed.
 図7の説明に戻る。手の動きは、撮像装置23などのセンサによって検出される例に限られず、専用のグローブなどによって検出されてもよい。なお、センサは、撮像装置23に限らず、種々変更可能である。 Returning to the explanation of FIG. 7. Hand movements are not limited to those detected by a sensor such as the imaging device 23, but may also be detected by a dedicated glove or the like. Note that the sensor is not limited to the imaging device 23, and can be modified in various ways.
 <視線の動きによる商品の検出例>
 つぎに、顧客の行動が顧客の視線の動きである場合を例に挙げて説明する。例えば、顧客の行動が顧客の視線の動きである場合、映像解析部205は、撮像装置23によって撮像された画像から、顧客の視線の動きを検出してもよい。
<Example of product detection based on eye movement>
Next, an example will be explained in which the customer's behavior is the movement of the customer's line of sight. For example, if the customer's behavior is the movement of the customer's line of sight, the video analysis unit 205 may detect the movement of the customer's line of sight from the image captured by the imaging device 23.
For example, the product that the customer is gazing at may be the product that the customer wants to check. Therefore, the product detection unit 202 detects the product that the customer is gazing at as the product that the customer wants to acquire. Note that the product that the customer is gazing at may be, for example, a product toward which the customer's line of sight is directed, a product toward which the customer's line of sight is directed for a predetermined time or longer, or a product toward which the customer's line of sight is directed a predetermined number of times or more. The method of identifying the product that the customer is gazing at is not particularly limited.
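As a non-limiting illustration of one such identification method, the sketch below accumulates dwell time per product and reports a product once a predetermined time is exceeded. The class name, the threshold, and the reset behavior are assumptions introduced for explanation.

import time
from collections import defaultdict
from typing import Dict, Optional

class GazeDwellDetector:
    """Detects the product a customer keeps looking at for a predetermined time."""

    def __init__(self, dwell_threshold_s: float = 2.0):
        self.dwell_threshold_s = dwell_threshold_s
        self._dwell: Dict[str, float] = defaultdict(float)
        self._last_sample: Optional[float] = None

    def update(self, gazed_product_id: Optional[str]) -> Optional[str]:
        """Feed the product currently hit by the gaze (or None). Returns a product ID
        once its accumulated dwell time exceeds the threshold, otherwise None."""
        now = time.monotonic()
        if self._last_sample is not None and gazed_product_id is not None:
            self._dwell[gazed_product_id] += now - self._last_sample
        self._last_sample = now

        if gazed_product_id and self._dwell[gazed_product_id] >= self.dwell_threshold_s:
            self._dwell.clear()          # reset after a detection
            return gazed_product_id
        return None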
 ここで、図10を用いて、顧客の視線の検出から空中ディスプレイ22への商品に関する情報の表示の一例を説明する。 Here, an example of displaying information regarding products on the aerial display 22 from detection of a customer's line of sight will be described using FIG. 10.
 図10は、顧客の視線の動きから商品を検出する例および空中ディスプレイ22へ商品に関する情報を表示させる例を示す説明図である。図10において、ドーム型ディスプレイ21のスクリーン2101や投影装置2102については、図示省略している。例えば、撮像装置23が、顧客を撮像する。そして、取得部204は、撮像された顧客の映像をリアルタイムで取得する。そして、映像解析部205は、映像から顧客の視線の動きを検出する。 FIG. 10 is an explanatory diagram showing an example of detecting a product from the movement of the customer's line of sight and an example of displaying information regarding the product on the aerial display 22. In FIG. 10, the screen 2101 of the dome-shaped display 21 and the projection device 2102 are not shown. For example, the imaging device 23 images a customer. The acquisition unit 204 then acquires the captured image of the customer in real time. Then, the video analysis unit 205 detects the movement of the customer's line of sight from the video.
 図10において、商品検出部202は、映像に基づいて、ドーム型ディスプレイ21に表示されている売り場の商品から、顧客が所定時間以上視線を向けている商品を検出する。そして、第2出力制御部203は、検出された商品に関する情報を空中ディスプレイ22に表示させる。これにより、図10において、空中ディスプレイ22は、商品の画像または商品の映像などの商品に関する情報を表示する。図10の例では、リンゴが表示されている。 In FIG. 10, the product detection unit 202 detects the product on which the customer has been looking for a predetermined period of time or more from among the products on the sales floor displayed on the dome-shaped display 21 based on the video. Then, the second output control unit 203 causes the aerial display 22 to display information regarding the detected product. Accordingly, in FIG. 10, the aerial display 22 displays information regarding the product, such as an image of the product or a video of the product. In the example of FIG. 10, an apple is displayed.
 図7の説明に戻る。視線の動きは、撮像装置23によって検出される例に限らず、他のセンサによって検出されてもよい。また、視線の動きに関わらず、顔の向きが用いられてもよい。 Returning to the explanation of FIG. 7. The movement of the line of sight is not limited to the example detected by the imaging device 23, but may be detected by another sensor. Furthermore, the direction of the face may be used regardless of the movement of the line of sight.
 <会話による商品の検出例>
 つぎに、顧客の行動が店員との会話である場合を例に挙げて説明する。例えば、顧客の行動が店員と顧客との会話である場合、音声解析部206は、顧客と店員との会話を音声認識する。そして、音声解析部206は、会話から、商品の名称、売り場における商品の陳列位置、商品の価格、商品の特徴など商品を特定可能なキーワードを検出する。商品の特徴は、商品の色、商品の形状、商品の産地、商品の原材料、商品の製造会社などであってもよい。売り場における商品の位置とは、右端、左端、右上、棚における位置などであってもよいし、特に限定されない。例えば、商品の位置を含むキーワードは、例えば、「右端のリンゴ」などである。例えば、商品検出部202は、商品DB2003から、検出されたキーワードに合致する商品を検出する。なお、キーワードは、顧客が発したキーワードであってもよいし、店員が発したキーワードであってもよい。
<Example of product detection through conversation>
Next, an example will be explained in which the customer's action is a conversation with a store clerk. For example, if the customer's behavior is a conversation between a store clerk and the customer, the voice analysis unit 206 recognizes the conversation between the customer and the store clerk. Then, the voice analysis unit 206 detects keywords that can identify the product, such as the name of the product, the display position of the product on the sales floor, the price of the product, and the characteristics of the product, from the conversation. The characteristics of the product may be the color of the product, the shape of the product, the production area of the product, the raw materials of the product, the manufacturing company of the product, and the like. The position of the product on the sales floor may be the right end, the left end, the upper right, the position on the shelf, etc., and is not particularly limited. For example, a keyword that includes the location of a product is, for example, "apple on the far right." For example, the product detection unit 202 detects a product matching the detected keyword from the product DB 2003. Note that the keyword may be a keyword issued by a customer or a keyword issued by a store clerk.
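As a non-limiting illustration of matching a recognized conversation against the product DB, the sketch below scores each product by how many of its stored attributes (name, color, position on the sales floor, and so on) appear in the recognized text. The attribute table and the scoring rule are assumptions introduced for explanation and merely stand in for the processing of the audio analysis unit 206 and the product detection unit 202.

from typing import Dict, Optional

# Hypothetical product DB rows: product ID -> searchable attributes.
PRODUCTS: Dict[str, Dict[str, str]] = {
    "P001": {"name": "apple", "color": "red", "shelf": "right end"},
    "P002": {"name": "apple", "color": "green", "shelf": "left end"},
    "P003": {"name": "melon", "color": "green", "shelf": "center"},
}

def detect_product_from_utterance(recognized_text: str,
                                  products: Dict[str, Dict[str, str]]) -> Optional[str]:
    """Score each product by how many of its attributes appear in the recognized
    conversation text (e.g. "the apple on the right end") and return the best match."""
    text = recognized_text.lower()
    best_id, best_score = None, 0
    for product_id, attrs in products.items():
        score = sum(1 for value in attrs.values() if value.lower() in text)
        if score > best_score:
            best_id, best_score = product_id, score
    return best_id  # None if no attribute matched

# Example: a clerk or customer mentioning "the apple on the right end" matches P001.
assert detect_product_from_utterance("Could you show me the apple on the right end?", PRODUCTS) == "P001"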
 ここで、図11を用いて、会話から商品を検出し、空中ディスプレイ22への商品に関する情報の表示の一例を説明する。 Here, an example of detecting a product from a conversation and displaying information regarding the product on the aerial display 22 will be described using FIG. 11.
 図11は、会話から商品を検出する例および空中ディスプレイ22へ商品に関する情報を表示させる例を示す説明図である。図11において、ドーム型ディスプレイ21のスクリーン2101や投影装置2102については、図示省略している。例えば、録音装置24が、音声を録音、または音声を集音する。そして、取得部204は、録音装置24から、音声をリアルタイムに取得する。そして、音声解析部206は、会話を音声認識することにより、会話から、商品を特定可能な情報を検出する。 FIG. 11 is an explanatory diagram showing an example of detecting a product from a conversation and an example of displaying information regarding the product on the aerial display 22. In FIG. 11, the screen 2101 of the dome-shaped display 21 and the projection device 2102 are not shown. For example, the recording device 24 records audio or collects audio. The acquisition unit 204 then acquires audio from the recording device 24 in real time. Then, the voice analysis unit 206 detects information that can identify the product from the conversation by performing voice recognition on the conversation.
 図11において、例えば、商品検出部202は、会話から、キーワード「右端のリンゴ」を検出する。そして、第2出力制御部203は、検出された商品に関する情報を空中ディスプレイ22に表示させる。これにより、図11において、空中ディスプレイ22は、商品の画像または商品の映像などの商品に関する情報を表示する。図11の例では、リンゴが表示されている。 In FIG. 11, for example, the product detection unit 202 detects the keyword "apple on the far right" from the conversation. Then, the second output control unit 203 causes the aerial display 22 to display information regarding the detected product. Accordingly, in FIG. 11, the aerial display 22 displays information regarding the product, such as an image of the product or a video of the product. In the example of FIG. 11, an apple is displayed.
 図7の説明に戻る。音声から得られる顧客の行動は、顧客と店員との会話に限らず、顧客の行動は、例えば、顧客の発話であってもよい。 Returning to the explanation of FIG. 7. The customer's behavior obtained from the voice is not limited to the conversation between the customer and the store clerk, and the customer's behavior may be, for example, the customer's utterance.
 以上、商品の検出例についての説明を終了する。以上のように、商品の検出例は、様々な方法があり、適宜組み合わせられてもよい。例えば、商品検出部202は、解析された会話と、解析された視線の動きとに基づいて、顧客が取得したい商品を検出してもよい。 This concludes the explanation of the product detection example. As described above, there are various methods for detecting products, and these methods may be combined as appropriate. For example, the product detection unit 202 may detect a product that the customer wants to acquire based on the analyzed conversation and the analyzed eye movement.
 <画像や映像における商品の向きを変更する例>
 ここで、空中ディスプレイ22に商品の画像や映像を表示中に、画像や映像における商品の表示の向きを変更する例について説明する。
<Example of changing the orientation of a product in an image or video>
Here, an example will be described in which, while an image or video of a product is being displayed on the aerial display 22, the display orientation of the product in the image or video is changed.
 例えば、映像解析部205は、顧客の映像を解析することにより、空中ディスプレイ22に対する顧客の手の動きを検出する。具体的に、例えば、映像解析部205は、商品を回すような手の動きを検出してもよい。例えば、映像解析部205は、商品を左に回すような手の動きか、右に回すような手の動きかを検出してもよい。 For example, the video analysis unit 205 detects the customer's hand movement with respect to the aerial display 22 by analyzing the customer's video. Specifically, for example, the video analysis unit 205 may detect a hand movement such as turning a product. For example, the video analysis unit 205 may detect whether the hand movement is turning the product to the left or the hand movement is turning the product to the right.
 つぎに、買物支援システム20は、手の動きに基づいて、空中ディスプレイ22に表示される画像や映像における商品の向きが変更されるような制御を行う。制御方法は、様々な例がある。ここでは、商品の映像を例に挙げて説明する。 Next, the shopping support system 20 performs control such that the orientation of the product in the image or video displayed on the aerial display 22 is changed based on the hand movement. There are various examples of control methods. Here, we will explain using a product video as an example.
First, an example will be described in which at least one of the imaging position and the orientation of the imaging device 25 installed in the store is controlled based on the hand movement. For example, when the video of the product is a video captured in real time by the imaging device 25 installed in the store, the imaging device control unit 207 controls at least one of the position and the orientation of the imaging device 25 based on the movement of the customer's hand. This allows the imaging device 25 to capture the product from a different direction. More specifically, for example, when a hand movement such as turning the product to the left is detected, the imaging device control unit 207 controls at least one of the imaging position and the orientation of the imaging device 25 so that the left side of the product currently being imaged can be captured. The imaging position and orientation may be determined according to the hand movement. Note that, in the store, the product may be placed on a stand, and the imaging device 25 may capture the product placed on the stand.
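As a non-limiting illustration of computing a new imaging position and orientation from a detected hand movement, the sketch below orbits the camera horizontally around the product by an angle derived from the gesture. The function name, the mapping from gesture to angle, and the fixed camera height are assumptions introduced for explanation; the real imaging device control unit 207 would drive an actual camera or robot arm.

import math
from typing import Tuple

Point3D = Tuple[float, float, float]

def orbit_camera_pose(product_position: Point3D,
                      current_camera_position: Point3D,
                      rotate_left_deg: float) -> Tuple[Point3D, Point3D]:
    """Return a new (camera_position, look_at) that orbits the camera horizontally
    around the product by rotate_left_deg degrees (negative values rotate right)."""
    px, py, pz = product_position
    cx, cy, cz = current_camera_position
    # Current offset from the product in the horizontal (x, z) plane.
    dx, dz = cx - px, cz - pz
    radius = math.hypot(dx, dz)
    angle = math.atan2(dz, dx) + math.radians(rotate_left_deg)
    new_position = (px + radius * math.cos(angle), cy, pz + radius * math.sin(angle))
    return new_position, product_position  # keep the camera aimed at the product

# Example: a "turn the product to the left" gesture might map to a 30-degree orbit.
new_pos, look_at = orbit_camera_pose((0.0, 1.0, 0.0), (1.0, 1.2, 0.0), rotate_left_deg=30.0)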
 ここで、撮像装置25の撮像位置および向きの制御方法は、特に限定されない。例えば、撮像装置25がロボットアームに取り付けられている場合、撮像装置制御部207は、ロボットアームを制御することにより、撮像装置25の撮像位置を制御することができる。 Here, the method of controlling the imaging position and orientation of the imaging device 25 is not particularly limited. For example, when the imaging device 25 is attached to a robot arm, the imaging device control unit 207 can control the imaging position of the imaging device 25 by controlling the robot arm.
 図12は、商品を撮像する撮像装置25の位置および向きの変更による表示される商品の向きを変更する例を示す説明図である。まず、店舗において、撮像装置25は、台27に載せられた商品の映像を撮像する。図12において、台27には、商品としてリンゴが載せられている。 FIG. 12 is an explanatory diagram showing an example of changing the orientation of a displayed product by changing the position and orientation of the imaging device 25 that images the product. First, at the store, the imaging device 25 captures an image of a product placed on a stand 27. In FIG. 12, an apple is placed on a stand 27 as a product.
 そして、第2出力制御部203は、商品の映像を空中ディスプレイ22に表示させる。図12において、空中ディスプレイ22には、商品としてリンゴが表示されている。 Then, the second output control unit 203 displays the image of the product on the aerial display 22. In FIG. 12, an apple is displayed as a product on the aerial display 22.
 そして、図12において、映像解析部205は、撮像装置23によって撮像された顧客の映像を解析することにより、顧客の手の動きを解析する。 In FIG. 12, the video analysis unit 205 analyzes the customer's hand movements by analyzing the customer's video captured by the imaging device 23.
 そして、図12において、撮像装置制御部207は、顧客の手の動きに基づいて、撮像装置25の位置および向きを制御する。図12において、撮像装置25の位置および向きが、変わっている。これにより、空中ディスプレイ22には、向きが変更された商品の映像が表示される。例えば、図12において、空中ディスプレイ22には、リンゴの他の部分が表示される。 In FIG. 12, the imaging device control unit 207 controls the position and orientation of the imaging device 25 based on the movement of the customer's hand. In FIG. 12, the position and orientation of the imaging device 25 have changed. As a result, an image of the product whose orientation has been changed is displayed on the aerial display 22. For example, in FIG. 12, other parts of the apple are displayed on the aerial display 22.
Returning to the explanation of FIG. 7. For example, when the second output control unit 203 causes the aerial display 22 to display a video of the product captured in real time by an imaging device 25 selected from a plurality of imaging devices 25 installed in the store, it newly selects an imaging device 25 from the plurality of imaging devices 25 based on the hand movement. The second output control unit 203 then causes the aerial display 22 to display the video captured by the newly selected imaging device 25. By switching the imaging device 25 in this way, the orientation of the product displayed on the aerial display 22 can be changed.
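As a non-limiting illustration of switching among a plurality of imaging devices 25, the sketch below selects the camera whose azimuth around the product is closest to the viewing direction implied by the hand movement. The camera table and the angle comparison are assumptions introduced for explanation.

from typing import Dict

def select_camera(requested_azimuth_deg: float,
                  camera_azimuths_deg: Dict[str, float]) -> str:
    """Pick the imaging device whose azimuth around the product is closest to the
    azimuth implied by the customer's hand movement."""
    def angular_distance(a: float, b: float) -> float:
        d = abs(a - b) % 360.0
        return min(d, 360.0 - d)
    return min(camera_azimuths_deg,
               key=lambda cam_id: angular_distance(camera_azimuths_deg[cam_id],
                                                   requested_azimuth_deg))

# Example with two cameras as in FIG. 13: a request to view the left side (90 degrees)
# switches from camera "25-1" (0 degrees) to camera "25-2" (90 degrees).
cameras = {"25-1": 0.0, "25-2": 90.0}
assert select_camera(90.0, cameras) == "25-2"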
 図13は、撮像装置25を切り替えることにより表示される商品の向きを変更する例を示す説明図である。まず、店舗において、撮像装置25-1と撮像装置25-2とが設置されている。撮像装置25-1および撮像装置25-2は、店舗において、台27に載せられた商品の映像を撮像する。図13において、台27には、商品としてリンゴが載せられている。 FIG. 13 is an explanatory diagram showing an example of changing the orientation of the displayed product by switching the imaging device 25. First, an imaging device 25-1 and an imaging device 25-2 are installed in a store. The imaging device 25-1 and the imaging device 25-2 capture images of the products placed on the stand 27 in the store. In FIG. 13, an apple is placed on a stand 27 as a product.
 例えば、第2出力制御部203は、撮像装置25-1によって撮像された商品の映像を空中ディスプレイ22に表示させる。図13において、空中ディスプレイ22には、リンゴが商品として表示されている。 For example, the second output control unit 203 causes the aerial display 22 to display an image of the product captured by the imaging device 25-1. In FIG. 13, an apple is displayed as a product on the aerial display 22.
 図13において、映像解析部205は、撮像装置23によって撮像された顧客の映像を解析することにより、顧客の手の動きを解析する。 In FIG. 13, the video analysis unit 205 analyzes the customer's hand movements by analyzing the customer's video captured by the imaging device 23.
Then, in FIG. 13, the second output control unit 203 selects the imaging device 25-2 from the imaging device 25-1 and the imaging device 25-2 based on the customer's hand movement, and causes the video captured by the selected imaging device 25-2 to be displayed. As a result, the aerial display 22 displays a video of the product whose orientation has been changed. For example, in FIG. 13, another part of the apple is displayed on the aerial display 22.
Returning to the explanation of FIG. 7. For example, when the video of the product is a video captured in real time by the imaging device 25 installed in the store, the product to be displayed on the aerial display 22 may be placed on a stand 27 by a store clerk, and the imaging device 25 may capture the product placed on the stand 27. For example, the stand 27 may be a rotatable stand. In such a case, the stand control unit 208 controls the rotation of the stand 27. This makes it possible to change the orientation of the product being imaged. The stand control unit 208 controls the rotation of the stand 27 based on the movement of the customer's hand. This allows the imaging device 25 to capture the product from a different direction. More specifically, for example, when a hand movement such as turning the product to the left is detected, the imaging device control unit 207 rotates the stand 27 to the left so that the left side of the product currently being imaged can be captured. The amount of rotation may be determined according to the hand movement. The stand control unit 208 is not limited to rotating the stand 27 and may also control the height of the stand 27. If the position of the stand 27 can be changed, the stand control unit 208 changes the position of the stand 27. In this way, by controlling the height, position, and rotation of the stand 27, the stand control unit 208 can change the orientation or the like of the product imaged by the imaging device 25.
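As a non-limiting illustration of converting the analyzed hand movement into a command for the stand 27, the sketch below produces a rotation amount and a height change clamped to assumed safe limits. The command structure and the limits are assumptions introduced for explanation.

from dataclasses import dataclass

@dataclass
class StandCommand:
    """A hypothetical command sent to the rotatable stand 27."""
    rotate_deg: float = 0.0      # positive = rotate left (counter-clockwise from above)
    height_delta_m: float = 0.0

def gesture_to_stand_command(hand_rotation_deg: float,
                             hand_lift_m: float = 0.0) -> StandCommand:
    """Map the analyzed hand movement to a stand command, clamping to safe limits."""
    rotate = max(-180.0, min(180.0, hand_rotation_deg))
    lift = max(-0.2, min(0.2, hand_lift_m))
    return StandCommand(rotate_deg=rotate, height_delta_m=lift)

# Example: a gesture interpreted as "turn the product 45 degrees to the left".
cmd = gesture_to_stand_command(45.0)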
 図14は、台27の回転により表示される商品の向きを変更する例を示す説明図である。まず、撮像装置25は、店舗において、台27に載せられた商品の映像を撮像する。図14において、台27には、商品としてリンゴが載せられている。 FIG. 14 is an explanatory diagram showing an example of changing the orientation of the displayed product by rotating the table 27. First, the imaging device 25 captures an image of a product placed on a stand 27 in a store. In FIG. 14, an apple is placed on a stand 27 as a product.
 そして、第2出力制御部203は、商品の映像を空中ディスプレイ22に表示させる。図14において、空中ディスプレイ22には、リンゴが商品として表示されている。 Then, the second output control unit 203 displays the image of the product on the aerial display 22. In FIG. 14, an apple is displayed as a product on the aerial display 22.
 図14において、映像解析部205は、撮像装置23によって撮像された顧客の映像を解析することにより、顧客の手の動きを解析する。 In FIG. 14, the video analysis unit 205 analyzes the customer's hand movements by analyzing the customer's video captured by the imaging device 23.
 そして、図14において、台制御部208は、顧客の手の動きに基づいて、台27を回転させる。これにより、空中ディスプレイ22には、向きが変更された商品の映像が表示されている。例えば、図14において、空中ディスプレイ22には、リンゴの他の部分が表示される。 Then, in FIG. 14, the table control unit 208 rotates the table 27 based on the customer's hand movement. As a result, the aerial display 22 displays an image of the product whose orientation has been changed. For example, in FIG. 14, other parts of the apple are displayed on the aerial display 22.
Returning to the explanation of FIG. 7. For example, when the video of the product is a video captured in advance, the second output control unit 203 causes a video to be displayed, from among the videos of the product captured in advance, in which the product is oriented according to the hand movement.
In the above example, control is performed such that the orientation of the product in the video displayed on the aerial display 22 is changed based on the customer's hand movement; however, control may also be performed such that the orientation of the product in a still image displayed on the aerial display 22 is changed. For example, the second output control unit 203 first displays one of a plurality of images of the product captured from different directions. Then, when the customer's hand movement is detected, the second output control unit 203 displays, from among the plurality of images, an image captured from the direction corresponding to the customer's hand movement.
 また、表示させる商品の画像を新たに撮像させる場合、映像の例と同様に、撮像装置制御部207による撮像装置25の撮像位置および向きの少なくともいずれかが制御されてもよい。例えば、撮像装置制御部207は、撮像装置25の位置および向きの少なくともいずれかを制御し、位置および向きが制御された後に、撮像装置25が、新たに商品の画像を撮像する。そして、第2出力制御部203は、新たに撮像された商品の画像を表示させる。 Furthermore, when newly capturing an image of the product to be displayed, at least one of the imaging position and orientation of the imaging device 25 may be controlled by the imaging device control unit 207, similarly to the video example. For example, the imaging device control unit 207 controls at least one of the position and orientation of the imaging device 25, and after the position and orientation are controlled, the imaging device 25 newly captures an image of the product. Then, the second output control unit 203 displays the newly captured image of the product.
Furthermore, when a new image of the product to be displayed is to be captured, the imaging device 25 may be switched by the second output control unit 203, as in the video example. When the second output control unit 203 causes the aerial display 22 to display an image of the product captured by an imaging device 25 selected from a plurality of imaging devices 25 installed in the store, it newly selects an imaging device 25 from the plurality of imaging devices 25 based on the hand movement. The newly selected imaging device 25 then captures an image of the product, and the second output control unit 203 causes the aerial display 22 to display the image captured by the newly selected imaging device 25.
 また、表示させる商品の画像を新たに撮像させる場合、映像の例と同様に、台制御部208によって台27が制御されてもよい。台制御部208は、顧客の手の動きに基づいて、台27の回転を制御する。そして、撮像装置25は、商品の異なる向きを新たに撮像する。そして、第2出力制御部203は、撮像装置25によって新たに撮像された画像を空中ディスプレイ22に表示させる。 Furthermore, when a new image of the product to be displayed is captured, the stand 27 may be controlled by the stand control unit 208, similarly to the video example. The table control unit 208 controls the rotation of the table 27 based on the customer's hand movements. The imaging device 25 then takes a new image of the product in a different orientation. Then, the second output control unit 203 causes the aerial display 22 to display the image newly captured by the imaging device 25.
 また、以上の例では、顧客の手の動きに基づいて、空中ディスプレイ22に表示される映像に商品の向きが変更されるような制御が行われる例を説明したが、手の動きに限られず、空中ディスプレイ22に対する操作に応じて、空中ディスプレイ22に表示される映像や画像の商品の向きが変更されるような制御が行われてもよい。 Further, in the above example, control is performed such that the direction of the product is changed in the image displayed on the aerial display 22 based on the customer's hand movement, but the control is not limited to the hand movement. In response to an operation on the aerial display 22, control may be performed such that the orientation of the product in the video or image displayed on the aerial display 22 is changed.
 例えば、空中ディスプレイ22は、ユーザの操作を受け付けることができる。そこで、操作受付部211は、空中ディスプレイ22に対する操作を受け付けてもよい。 For example, the aerial display 22 can accept user operations. Therefore, the operation reception unit 211 may accept an operation on the aerial display 22.
Then, as in the hand movement example, the imaging device control unit 207 may control at least one of the imaging position and the orientation of the imaging device 25 based on the received operation. Alternatively, as in the hand movement example, the second output control unit 203 newly selects an imaging device 25 from the plurality of imaging devices 25 based on the received operation, and causes the aerial display 22 to display the video captured by the newly selected imaging device 25. Alternatively, as in the hand movement example, the stand control unit 208 controls the rotation of the stand 27 based on the received operation. Alternatively, as in the hand movement example, when the video of the product is a video captured in advance, the first output control unit 201 causes a video to be displayed, from among the videos captured in advance, in which the product is oriented according to the received operation.
 なお、受け付けた操作に基づく各機能部の具体的な制御例については、手の動きの例と同様であるため、詳細な説明を省略する。 Note that a detailed explanation of the specific control example of each functional unit based on the received operation is omitted because it is the same as the example of hand movement.
 <空中ディスプレイ22が商品に関する情報を表示中におけるドーム型ディスプレイ21の表示を制御>
 例えば、空中ディスプレイ22は、背景が透けて見える場合がある。顧客が、空中ディスプレイ22によって表示された商品を見た場合に、顧客の視線の先には、ドーム型ディスプレイ21がある。顧客には、商品の背景および透過した商品の先に、ドーム型ディスプレイ21の表示が目に映る。このように、空中ディスプレイ22への商品に関する情報の表示中に、ドーム型ディスプレイ21の表示が顧客の目に映ると見難い場合がある。
<Controlling the display of the dome-shaped display 21 while the aerial display 22 is displaying information about the product>
For example, the background may be visible through the aerial display 22. When the customer views the product displayed on the aerial display 22, the dome-shaped display 21 lies beyond the customer's line of sight. The display of the dome-shaped display 21 is therefore visible to the customer in the background of the product and through the translucent product. In this way, while information regarding the product is being displayed on the aerial display 22, the product may be difficult to see if the display of the dome-shaped display 21 is visible to the customer behind it.
 そこで、空中ディスプレイ22に表示された商品が見難くならないように、第1出力制御部201は、ドーム型ディスプレイ21の表示を変更する。これにより、空中ディスプレイ22とドーム型ディスプレイ21とを組み合わせた場合に商品が見難くなるのを抑制することができる。ドーム型ディスプレイ21の表示の変更として、第1出力制御部201は、ドーム型ディスプレイ21の全体の表示を変更してもよいし、ドーム型ディスプレイ21の一部の表示を変更してもよい。全体の表示を変更する例として、第1出力制御部201は、売り場の映像の表示を消す、売り場の映像の代わりに、商品に応じた映像を表示させるなどがある。例えば、商品が食品であれば、商品に応じた映像とは、皿などであってもよい。 Therefore, the first output control unit 201 changes the display on the dome-shaped display 21 so that the product displayed on the aerial display 22 does not become difficult to see. Thereby, when the aerial display 22 and the dome-shaped display 21 are combined, it is possible to prevent the product from becoming difficult to see. To change the display on the dome-shaped display 21, the first output control unit 201 may change the entire display on the dome-shaped display 21, or may change the display on a part of the dome-shaped display 21. As an example of changing the overall display, the first output control unit 201 may turn off the display of the store video, or display a video corresponding to the product instead of the store video. For example, if the product is food, the image corresponding to the product may be a plate or the like.
 また、ドーム型ディスプレイ21の一部の表示を変更する場合、第1出力制御部201は、顧客が空中ディスプレイ22に表示された映像や画像における商品を見たときに商品に重なるドーム型ディスプレイ21の一部の表示を消すか、色を変えてもよい。これにより、空中ディスプレイ22とドーム型ディスプレイ21とを組み合わせた場合に商品が見難くなるのを抑制することができる。 In addition, when changing the display of a part of the dome-shaped display 21, the first output control unit 201 controls the dome-shaped display 21 that overlaps the product when the customer views the product in the video or image displayed on the aerial display 22. You may erase some of the display or change the color. Thereby, when the aerial display 22 and the dome-shaped display 21 are combined, it is possible to prevent the product from becoming difficult to see.
 一部の表示を変更する例として、例えば、空中ディスプレイ22が商品に関する情報を表示中に、第1出力制御部201は、ドーム型ディスプレイ21の固定位置の表示を変更する。固定位置は、空中ディスプレイ22が商品に関する情報を表示する表示領域に対応する位置である。固定位置の表示の変更方法として、第1出力制御部201は、固定位置の表示を消してもよい。または、第1出力制御部201は、固定位置の表示の色を所定の色にする。所定の色は、黒や白のように商品が見えやすい色であってもよいし、商品に応じた色であってもよい。 As an example of changing a part of the display, for example, while the aerial display 22 is displaying information regarding the product, the first output control unit 201 changes the display at the fixed position of the dome-shaped display 21. The fixed position is a position corresponding to a display area where the aerial display 22 displays information regarding the product. As a method of changing the display of the fixed position, the first output control unit 201 may erase the display of the fixed position. Alternatively, the first output control unit 201 sets the display color of the fixed position to a predetermined color. The predetermined color may be a color that makes the product easier to see, such as black or white, or a color that corresponds to the product.
Although a fixed position has been described as an example, the position may be specified in more detail. The superimposition position specifying unit 209 specifies, based on the positional relationship between the customer's position and the position of the aerial display 22 and the positional relationship between the display area in which the aerial display 22 displays the information regarding the product and the position of the dome-shaped display 21, the relative position on the dome-shaped display 21 that overlaps the display area when the customer views the information regarding the product displayed in the display area. For example, the display area is the aerial imaging area described with reference to FIG. 6. Note that the customer's position may be specified by a 3D (three-dimensional) sensor or the like. The 3D sensor may be installed, for example, on the dome-shaped display 21. While the aerial display 22 is displaying the information regarding the product, the first output control unit 201 changes the display at the specified position on the dome-shaped display 21. The method of changing the display is as described above.
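As a non-limiting illustration of specifying the overlapping position, the sketch below casts a ray from the customer's eye position through a point of the aerial image and intersects it with the dome screen approximated as a sphere. The spherical approximation and the function name are assumptions introduced for explanation.

import math
from typing import Optional, Tuple

Point3D = Tuple[float, float, float]

def dome_overlap_point(eye: Point3D,
                       aerial_image_point: Point3D,
                       dome_center: Point3D,
                       dome_radius: float) -> Optional[Point3D]:
    """Return the point on the (spherical) dome screen that lies behind a point of the
    aerial image as seen from the customer's eye position, or None if the ray misses."""
    # Ray direction from the eye through the aerial image point.
    d = tuple(aerial_image_point[i] - eye[i] for i in range(3))
    norm = math.sqrt(sum(c * c for c in d)) or 1.0
    d = tuple(c / norm for c in d)
    # Solve |eye + t*d - center|^2 = radius^2 for the farther intersection (t > 0).
    oc = tuple(eye[i] - dome_center[i] for i in range(3))
    b = 2.0 * sum(oc[i] * d[i] for i in range(3))
    c = sum(v * v for v in oc) - dome_radius ** 2
    disc = b * b - 4.0 * c
    if disc < 0:
        return None
    t = (-b + math.sqrt(disc)) / 2.0
    if t <= 0:
        return None
    return tuple(eye[i] + t * d[i] for i in range(3))

# The first output control unit could then blank or recolor the screen region around
# the returned point while the aerial display is showing the product.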
 または、空中ディスプレイ22が商品に関する情報を表示中に、第1出力制御部201は、ドーム型ディスプレイ21に、売り場の映像と異なる映像を表示させる。例えば、映像は、お皿、他の棚などのように商品によって変えてもよい。ここでは、映像を例に挙げて説明するが、画像であってもよい。 Alternatively, while the aerial display 22 is displaying information regarding the product, the first output control unit 201 causes the dome-shaped display 21 to display an image different from the image of the sales floor. For example, the images may vary depending on the product, such as plates, other shelves, and the like. Here, the explanation will be given using a video as an example, but an image may also be used.
Here, an example has been described in which there is one customer at one dome-shaped display 21, but a plurality of customers may shop together. When a plurality of customers are at the dome-shaped display 21, the first output control unit 201 may control the display on the dome-shaped display 21 for each of the plurality of customers.
 また、複数の空中ディスプレイ22が設置されている場合、第1出力制御部201は、複数の空中ディスプレイ22のそれぞれについて、ドーム型ディスプレイ21の表示を制御してもよい。 Furthermore, when a plurality of aerial displays 22 are installed, the first output control unit 201 may control the display on the dome-shaped display 21 for each of the plurality of aerial displays 22.
 以上、空中ディスプレイ22への商品に関する情報の表示中にドーム型ディスプレイ21の表示を制御する例についての説明を終了する。 This concludes the explanation of the example of controlling the display of the dome-shaped display 21 while displaying information about products on the aerial display 22.
 <画像や映像における商品の表示サイズや色の制御>
 空中ディスプレイ22に商品の画像や映像が表示される場合、顧客が表示された商品を見る位置によって商品のサイズが異なって見える場合がある。
<Control of product display size and color in images and videos>
When an image or video of a product is displayed on the aerial display 22, the size of the product may appear to be different depending on the position from which the customer views the displayed product.
 そこで、第2出力制御部203は、商品の映像とともに、寸法を確認可能なメジャーを空中ディスプレイ22に表示させてもよい。これにより、顧客は、メジャーと商品のサイズとの比較により商品の実寸大を確認することができる。 Therefore, the second output control unit 203 may display a measuring tape on the aerial display 22 that allows the dimensions to be confirmed, along with the image of the product. This allows the customer to check the actual size of the product by comparing the tape measure with the size of the product.
 また、顧客に対して、実寸大の商品の画像や映像が表示可能となるように、顧客の視認位置が特定されてもよい。例えば、3Dセンサが、ドーム型ディスプレイ21に設置されてもよい。そして、3Dセンサが、顧客の頭の位置と手の位置を計測する。そして、取得部204は、3Dセンサの計測結果を取得する。視認位置特定部210は、計測結果に基づいて、頭の位置と手の位置との位置関係により、顧客の視認位置を特定する。第2出力制御部203は、商品の実寸大と視認位置とに基づくサイズで商品の画像および映像のいずれかを表示させる。これにより、商品が実寸大で見えるようにとなるように商品の画像や映像が表示される。 Additionally, the customer's viewing position may be specified so that a full-size image or video of the product can be displayed to the customer. For example, a 3D sensor may be installed on the dome-shaped display 21. A 3D sensor then measures the position of the customer's head and hands. The acquisition unit 204 then acquires the measurement results of the 3D sensor. The visual position specifying unit 210 specifies the customer's visual position based on the positional relationship between the head position and the hand position based on the measurement results. The second output control unit 203 displays either an image or a video of the product at a size based on the actual size of the product and the visible position. As a result, images and videos of the product are displayed so that the product can be seen in its actual size.
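As a non-limiting illustration of determining a display size from the actual size of the product and the identified viewing position, the sketch below makes the displayed product subtend the same visual angle as a real item of the same size held at the customer's hand position. The choice of the hand position as the reference distance and the parameter names are assumptions introduced for explanation.

import math
from typing import Tuple

Point3D = Tuple[float, float, float]

def displayed_height_px(real_height_m: float,
                        eye: Point3D,
                        hand: Point3D,
                        aerial_plane_distance_m: float,
                        pixels_per_meter: float) -> int:
    """Height in pixels at which to draw the product on the aerial display so that it
    subtends the same visual angle as a real item of real_height_m held at the hand."""
    d_hand = math.dist(eye, hand) or 1e-6                     # distance eye -> hand
    height_at_plane_m = real_height_m * aerial_plane_distance_m / d_hand
    return round(height_at_plane_m * pixels_per_meter)

# Example: a 10 cm apple, hand 0.5 m from the eyes, aerial image 0.6 m away,
# display resolution of 2000 px per meter.
px = displayed_height_px(0.10, (0, 1.2, 0), (0, 1.1, 0.5), 0.6, 2000)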
 また、第2出力制御部203は、空中ディスプレイ22に商品を拡大させるように表示させてもよい。または、第2出力制御部203は、空中ディスプレイ22に商品を縮小させるように表示させてもよい。ここで、操作受付部211は、空中ディスプレイ22に対する操作を受け付けることにより、拡大の指示や縮小の指示を受け付けてもよい。そして、第2出力制御部203は、受け付けた操作に基づいて、画像や映像における商品を拡大表示させてもよいし、画像や映像における商品を縮小表示させてもよい。 Additionally, the second output control unit 203 may cause the aerial display 22 to display the product in an enlarged manner. Alternatively, the second output control unit 203 may display the product on the aerial display 22 so as to reduce the size of the product. Here, the operation reception unit 211 may receive an instruction to enlarge or an instruction to reduce by accepting an operation on the aerial display 22. Based on the received operation, the second output control unit 203 may display the product in the image or video in an enlarged manner, or may display the product in the image or video in a reduced size.
Furthermore, for example, when the product is a fresh food such as a vegetable or fruit, the color and shape can be expected to differ from one individual item to another. For example, when an image such as an illustration of the product or an image reproducing the product in virtual space is displayed on the aerial display 22, the customer may feel that the experience differs from realistic shopping if the same image is displayed every time. Here, an image of the product is used as an example, but the same applies to a video of the product. It is desirable to bring the experience closer to realistic shopping. Therefore, for example, the second output control unit 203 causes the aerial display 22 to display the image of the product based on at least one of information regarding the color of the product and information regarding the shape of the product. The color and shape of the product may be obtained from a conversation between the store clerk and the customer. For example, the audio analysis unit 206 detects at least one of information regarding the color of the product and information regarding the shape of the product by performing voice recognition on the conversation between the store clerk and the customer. Then, the second output control unit 203 displays the image of the product with at least one of a color and a shape corresponding to the conversation. Specifically, for example, when a conversation between the store clerk and the customer such as "today's apples are redder than usual" is detected, the second output control unit 203 displays the image of the apple in a color redder than a predetermined standard. For example, when a conversation such as "today's apples are bigger than usual" is detected, the second output control unit 203 displays the image of the apple so that the apple appears a predetermined size larger than the predetermined standard. This allows the image of the product to be displayed on the aerial display 22 with a color and shape close to those of the actual product.
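As a non-limiting illustration of reflecting such conversation cues in the displayed image, the sketch below nudges a color parameter and a scale parameter relative to the predetermined baseline when the corresponding keywords are recognized. The keyword list and the adjustment amounts are assumptions introduced for explanation.

from dataclasses import dataclass

@dataclass
class RenderParams:
    redness: float = 1.0   # 1.0 = the predetermined baseline color
    scale: float = 1.0     # 1.0 = the predetermined baseline size

def adjust_from_conversation(recognized_text: str,
                             params: RenderParams) -> RenderParams:
    """Nudge the rendering parameters when the conversation mentions deviations
    from the usual appearance, e.g. 'redder than usual' or 'bigger than usual'."""
    text = recognized_text.lower()
    if "redder than usual" in text:
        params.redness *= 1.2
    if "bigger than usual" in text:
        params.scale *= 1.1
    if "smaller than usual" in text:
        params.scale *= 0.9
    return params

params = adjust_from_conversation("Today's apples are redder than usual.", RenderParams())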
 ただし、すべての商品の個体の画像および映像のいずれかが入手可能であれば、第2出力制御部203は、商品の個体に応じた画像および映像のいずれかを表示させてもよい。 However, if either the images or videos of the individual products are available, the second output control unit 203 may display either the image or the video according to the individual product.
Furthermore, for example, when video of the actual store is displayed in real time, the customer can check the price and name of a product if the shelf label of the product is captured together with the product. However, if there is no shelf label or the shelf label is not captured, the second output control unit 203 may display the name of the product and the price of the product on the aerial display 22 together with the video or image of the product.
 <空中ディスプレイ22への商品に関する情報の表示の終了例>
 空中ディスプレイ22への商品に関する情報の表示の終了のタイミングは、アプリケーションプログラムの使い方によって種々変更可能である。
<Example of ending the display of product information on the aerial display 22>
The timing at which the display of product information on the aerial display 22 ends can be changed in various ways depending on how the application program is used.
For example, when the customer's attention has turned to another product or to the store clerk, the display of information regarding the product on the aerial display 22 may be ended. For example, the video analysis unit 205 may detect the customer's line of sight from the video of the customer. For example, the video analysis unit 205 detects that the customer's line of sight is directed upward and the customer is not looking at the aerial display 22. Then, when it is detected that the customer is not looking at the aerial display 22, the second output control unit 203 ends the display of information regarding the product on the aerial display 22. Furthermore, when the product detection unit 202 detects a new product, for example when the customer reaches toward another product, the second output control unit 203 ends the display of information regarding the current product on the aerial display 22. In this case, however, the second output control unit 203 may cause the aerial display 22 to display information regarding the newly detected product.
 例えば、顧客の操作によって空中ディスプレイ22への商品に関する情報の表示が終了してもよい。例えば、顧客の操作は、表示の終了を表す操作であってもよいし、ショッピングカートに入れるための操作であってもよい。顧客の操作は、顧客の音声や顧客のジェスチャなどによって検出可能である。具体的に、例えば、映像解析部205は、顧客の映像から、顧客の所定のジェスチャを検出する。そして、所定のジェスチャが検出されると、第2出力制御部203は、空中ディスプレイ22への商品に関する情報の表示を終了する。または、例えば、音声解析部206は、顧客の音声から、所定のキーワードを検出する。そして、所定のキーワードが検出されると、第2出力制御部203は、空中ディスプレイ22への商品に関する情報の表示を終了する。 For example, the display of information regarding the product on the aerial display 22 may be terminated by the customer's operation. For example, the customer's operation may be an operation indicating the end of display, or an operation to add the item to a shopping cart. The customer's operation can be detected by the customer's voice, customer's gesture, etc. Specifically, for example, the video analysis unit 205 detects a predetermined gesture of the customer from the customer's video. Then, when a predetermined gesture is detected, the second output control unit 203 ends displaying information regarding the product on the aerial display 22. Alternatively, for example, the voice analysis unit 206 detects a predetermined keyword from the customer's voice. Then, when a predetermined keyword is detected, the second output control unit 203 ends displaying information regarding the product on the aerial display 22.
 また、顧客がドーム型ディスプレイ21からいなくなった場合に、空中ディスプレイ22への商品に関する情報の表示が終了してもよい。顧客がドーム型ディスプレイ21からいなくなるとは、例えば、顧客が、ドーム型ディスプレイ21を見たり、操作するなどが可能な場所からいなくなることである。具体的に、例えば、顧客がドーム型ディスプレイ21からいなくなるとは、ドーム型ディスプレイ21の前に設置された椅子から顧客が離席した場合などがある。例えば、映像解析部205は、撮像装置23によって撮像された映像から、顧客が撮像されていないかを検出することにより、顧客がいなくなったことを検出してもよい。そして、顧客がいなくなったことが検出された場合に、第2出力制御部203は、空中ディスプレイ22への商品に関する情報の表示を終了する。 Furthermore, when the customer disappears from the dome-shaped display 21, the display of information regarding the product on the aerial display 22 may end. For example, the customer disappearing from the dome-shaped display 21 means that the customer disappears from a place where the dome-shaped display 21 can be viewed or operated. Specifically, for example, when a customer disappears from the dome-shaped display 21, there is a case where the customer leaves the chair installed in front of the dome-shaped display 21. For example, the video analysis unit 205 may detect that the customer is gone by detecting from the video imaged by the imaging device 23 whether or not the customer is captured. Then, when it is detected that the customer is gone, the second output control unit 203 ends the display of information regarding the product on the aerial display 22.
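As a non-limiting illustration, the end-of-display conditions described above can be combined into a single check evaluated for each analyzed frame. The observation fields below are assumed outputs of the video analysis unit 205, the audio analysis unit 206, and the product detection unit 202, introduced here only for explanation.

from dataclasses import dataclass
from typing import Optional

@dataclass
class FrameObservation:
    customer_present: bool            # customer still detected in the video
    looking_at_aerial_display: bool   # gaze directed at the aerial display
    new_product_id: Optional[str]     # a newly detected product, if any
    end_gesture: bool                 # predetermined "close" gesture detected
    end_keyword: bool                 # predetermined keyword detected in speech

def should_end_display(obs: FrameObservation) -> bool:
    """True when the information currently shown on the aerial display should be closed."""
    return (not obs.customer_present
            or not obs.looking_at_aerial_display
            or obs.new_product_id is not None
            or obs.end_gesture
            or obs.end_keyword)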
 (フローチャート)
 図15は、実施の形態2にかかる買物支援システム20の一動作例を示すフローチャートである。ここで、特にフローチャートに明示しないが、取得部204は、撮像装置23によって撮像された映像、録音装置24から取得された音声を取得する。また、映像解析部205は、撮像装置23によって撮像された映像から、顧客の行動を解析する。また、音声解析部206は、録音装置24から取得された音声から、顧客の行動を解析する。
(flowchart)
FIG. 15 is a flowchart showing an example of the operation of the shopping support system 20 according to the second embodiment. Here, although not specifically shown in the flowchart, the acquisition unit 204 acquires the video imaged by the imaging device 23 and the audio acquired from the recording device 24. Further, the video analysis unit 205 analyzes the customer's behavior from the video captured by the imaging device 23. Furthermore, the voice analysis unit 206 analyzes the customer's behavior from the voice acquired from the recording device 24.
 第1出力制御部201は、ドーム型ディスプレイ21に、店舗の売り場に関する情報を表示させる(ステップS201)。 The first output control unit 201 causes the dome-shaped display 21 to display information regarding the sales floor of the store (step S201).
 つぎに、商品検出部202は、顧客の行動に基づいて、顧客が取得したい商品を検出したかを判定する(ステップS202)。商品が検出されていない場合(ステップS202:No)、商品検出部202は、ステップS202へ戻る。 Next, the product detection unit 202 determines whether a product that the customer wants to acquire is detected based on the customer's behavior (step S202). If no product is detected (step S202: No), the product detection unit 202 returns to step S202.
 一方、商品が検出された場合(ステップS202:Yes)、第2出力制御部203は、空中ディスプレイ22に、商品に関する情報を表示させる(ステップS203)。また、第1出力制御部201は、ドーム型ディスプレイ21の表示を変更する(ステップS204)。ステップS204において、第1出力制御部201は、ドーム型ディスプレイ21について固定位置の表示を変更してもよいし、ドーム型ディスプレイ21について特定された位置の表示を変更してもよい。なお、ステップS203とステップS204との処理の順番は、特に限定されない。例えば、ステップS203とステップS204とは、同じタイミングであってもよい。 On the other hand, if the product is detected (step S202: Yes), the second output control unit 203 causes the aerial display 22 to display information regarding the product (step S203). The first output control unit 201 also changes the display on the dome-shaped display 21 (step S204). In step S204, the first output control unit 201 may change the display of the fixed position on the dome-shaped display 21, or may change the display of the specified position on the dome-shaped display 21. Note that the order of processing in step S203 and step S204 is not particularly limited. For example, step S203 and step S204 may be performed at the same timing.
 つぎに、第2出力制御部203は、顧客の行動に基づいて、商品に関する情報の表示を変更する(ステップS205)。ステップS205において、第2出力制御部203は、顧客の行動に基づいて、映像や画像における商品の表示の向きを変更してもよい。また、ステップS205について、第2出力制御部203による制御に限らず、台制御部208による台27の制御、または撮像装置制御部207による撮像装置25の制御が行われてもよい。例えば、撮像装置制御部207は、撮像装置25の撮像位置や向きを変更することにより、空中ディスプレイ22に表示される商品の向きを変更してもよい。または、例えば、台制御部208が、台27を制御することにより、空中ディスプレイ22に表示される商品の向きを変更してもよい。なお、第2出力制御部203による表示の制御、台制御部208による台27の制御、撮像装置制御部207による撮像装置25の制御は、適宜組み合わせられてもよい。 Next, the second output control unit 203 changes the display of information regarding the product based on the customer's behavior (step S205). In step S205, the second output control unit 203 may change the display orientation of the product in the video or image based on the customer's behavior. Furthermore, regarding step S205, the control is not limited to the second output control section 203, and the control of the platform 27 by the platform control section 208 or the control of the imaging device 25 by the imaging device control section 207 may be performed. For example, the imaging device control unit 207 may change the orientation of the product displayed on the aerial display 22 by changing the imaging position and orientation of the imaging device 25. Alternatively, for example, the stand control unit 208 may control the stand 27 to change the orientation of the product displayed on the aerial display 22. Note that the display control by the second output control section 203, the control of the platform 27 by the platform control section 208, and the control of the imaging device 25 by the imaging device control section 207 may be combined as appropriate.
 第2出力制御部203は、商品に関する情報の表示を終了するかを判定する(ステップS206)。商品に関する情報の表示を終了しない場合(ステップS206:No)、第2出力制御部203は、ステップS205へ戻る。商品に関する情報の表示を終了する場合(ステップS206:Yes)、第1出力制御部201は、ステップS201へ戻る。 The second output control unit 203 determines whether to end the display of information regarding the product (step S206). If the display of information regarding the product is not finished (step S206: No), the second output control unit 203 returns to step S205. If the display of information regarding the product is to be ended (step S206: Yes), the first output control unit 201 returns to step S201.
 なお、フローチャートについては、適宜終了すればよい。 Note that the flowchart can be ended as appropriate.
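 As a reading aid, the control flow of steps S201 to S206 in FIG. 15 can be expressed in the following Python sketch. The helper function and the scripted behaviors are assumptions made purely for illustration; they are not the disclosed implementation.

# Illustrative sketch (hypothetical) of the flow of FIG. 15.
def detect_desired_product(behavior):
    # Step S202: e.g. a reach toward a shelf position or a conversation keyword.
    return behavior.get("desired_product")

def run_shopping_support(behaviors):
    showing = None
    print("S201: display sales-floor information on the first display")
    for behavior in behaviors:
        if showing is None:
            product = detect_desired_product(behavior)
            if product is None:
                continue                                   # S202: No
            showing = product
            print(f"S203: display information on '{product}' on the second display")
            print("S204: change the display of the first display")
        elif behavior.get("end_display"):
            showing = None                                 # S206: Yes
            print("S201: display sales-floor information on the first display")
        else:
            print("S205: change the product display based on the behavior")

run_shopping_support([
    {},                                   # nothing detected yet
    {"desired_product": "apple"},         # S202: Yes
    {"hand_movement": "rotate"},          # S205
    {"end_display": True},                # S206: Yes -> back to S201
])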
 以上、実施の形態2において、買物支援システム20は、顧客の映像または画像から、顧客の手の動きを検出し、検出された顧客の手の動きに基づいて、顧客が取得したい商品を検出する。より具体的に、例えば、買物支援システム20は、顧客が手を伸ばした位置によって、顧客が取得したい商品を検出する。例えば、顧客は、商品を棚からとるような動作をするなど、直感的な動きをするだけで、確認したい商品が空中ディスプレイ22に表示される。また、顧客が実際の店舗に来店して買物する場合、顧客は、棚にある商品を掴んで顔に近づけて確認することが想定される。よって、手の動きから検出された商品に関する情報が空中ディスプレイ22に表示される買物支援システム20によれば、実際の買物により近い買物体験を提供することができる。 As described above, in the second embodiment, the shopping support system 20 detects the customer's hand movement from the customer's video or image, and detects the product that the customer wants to acquire based on the detected hand movement. More specifically, for example, the shopping support system 20 detects the product that the customer wants to acquire from the position to which the customer extends his or her hand. For example, the product the customer wants to check is displayed on the aerial display 22 simply through an intuitive movement, such as reaching as if to take a product from a shelf. Furthermore, when a customer visits an actual store to shop, it is assumed that the customer grabs a product on a shelf and brings it close to his or her face to check it. Therefore, according to the shopping support system 20, in which information regarding the product detected from the hand movement is displayed on the aerial display 22, it is possible to provide a shopping experience closer to actual shopping.
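 One way to picture the reach-based detection above is to map the position to which the customer extends a hand onto the shelf layout shown on the first display. The following Python sketch assumes a simple planar layout of rectangular product regions; the coordinates and product names are hypothetical and are not part of the disclosure.

# Hypothetical shelf layout: each product occupies a rectangle (x0, y0, x1, y1)
# in the normalized coordinate system of the displayed sales-floor image.
SHELF_LAYOUT = {
    "apple":  (0.00, 0.00, 0.25, 0.50),
    "bread":  (0.25, 0.00, 0.50, 0.50),
    "coffee": (0.50, 0.00, 0.75, 0.50),
}

def detect_product_from_reach(reach_x, reach_y):
    # Return the product whose displayed region contains the reach position.
    for product, (x0, y0, x1, y1) in SHELF_LAYOUT.items():
        if x0 <= reach_x < x1 and y0 <= reach_y < y1:
            return product
    return None

# e.g. the video analysis yields a normalized reach position of (0.3, 0.2)
print(detect_product_from_reach(0.3, 0.2))  # -> "bread"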
 また、買物支援システム20は、音声から、顧客と店員との会話を解析し、会話に基づいて、顧客が取得したい商品を検出する。これにより、顧客は、店員と会話するだけで、確認したい商品に関する情報が空中ディスプレイ22に表示される。また、例えば、顧客が実際の店舗に来店して買物する場合、顧客は、店員と会話を行い、店員が、商品を勧めて見せてくれたり、商品を顧客に渡してくれるようなことが想定される。よって、会話から検出された商品に関する情報が空中ディスプレイ22に表示される買物支援システム20によれば、実際の買物により近い買物体験を提供することができる。 Additionally, the shopping support system 20 analyzes the conversation between the customer and the store clerk from the audio, and detects the product the customer wants to acquire based on the conversation. As a result, information regarding the product the customer wants to check is displayed on the aerial display 22 simply by having a conversation with the store clerk. Furthermore, for example, when a customer visits an actual store to shop, it is assumed that the customer converses with a store clerk and that the clerk recommends and shows products or hands a product to the customer. Therefore, according to the shopping support system 20, in which information regarding the product detected from the conversation is displayed on the aerial display 22, it is possible to provide a shopping experience closer to actual shopping.
 また、買物支援システム20は、空中ディスプレイ22に、会話に応じた色および形状の少なくともいずれかで商品の画像または映像を表示させる。これにより、実際の商品に近い商品の色や形状で商品の画像または映像を空中ディスプレイ22に表示させることができる。 Additionally, the shopping support system 20 causes the aerial display 22 to display an image or video of the product in at least one of the color and shape depending on the conversation. Thereby, the image or video of the product can be displayed on the aerial display 22 in the color and shape of the product that are close to the actual product.
 また、買物支援システム20は、ドーム型ディスプレイ21に、店員に関する情報を表示させる。これにより、顧客は、より実際の店舗に近い形態で、買物することができる。 Additionally, the shopping support system 20 causes the dome-shaped display 21 to display information regarding the store staff. This allows customers to shop in a manner that is more similar to that of an actual store.
 また、買物支援システム20は、顧客の手の動きに基づいて、映像における商品の表示を変える。例えば、買物支援システム20は、顧客の手の動きに基づいて、店舗に設置された撮像装置25を制御する。回転式の台27に商品が載せられ、撮像装置25が台27に載せられた商品を撮像する場合、買物支援システム20は、顧客の手の動きに基づいて、台27の回転を制御する。これにより、顧客は、手を動かすことにより、商品の表示向き等を変更することができる。このように、直感的に商品の向きを変更することができる。 Additionally, the shopping support system 20 changes the display of the product in the video based on the customer's hand movements. For example, the shopping support system 20 controls an imaging device 25 installed in a store based on the customer's hand movements. When a product is placed on the rotating table 27 and the imaging device 25 takes an image of the product placed on the table 27, the shopping support system 20 controls the rotation of the table 27 based on the customer's hand movement. This allows the customer to change the display orientation of the product by moving his or her hand. In this way, the orientation of the product can be changed intuitively.
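 The hand-movement-driven rotation of the stand 27 described above can be pictured as mapping horizontal hand displacement to a rotation command. The sketch below uses an assumed proportional gain and a hypothetical turntable interface; it is an illustration only, not the control method defined in the disclosure.

# Illustrative sketch (hypothetical): hand displacement -> stand rotation.
DEGREES_PER_METER = 180.0  # assumed gain: 1 m of hand travel = 180 degrees

def hand_move_to_rotation(prev_hand_x, curr_hand_x):
    # Convert horizontal hand displacement (meters) into a rotation angle.
    return (curr_hand_x - prev_hand_x) * DEGREES_PER_METER

class Turntable:
    # Stand 27 modeled as a simple angle accumulator (hypothetical interface).
    def __init__(self):
        self.angle = 0.0
    def rotate(self, degrees):
        self.angle = (self.angle + degrees) % 360.0

stand = Turntable()
stand.rotate(hand_move_to_rotation(0.10, 0.30))  # hand moved 0.2 m to the right
print(stand.angle)  # -> 36.0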
 また、前述の通り、空中ディスプレイ22に表示された商品に関する情報は透ける。このため、顧客は、商品が見難い場合がある。そこで、買物支援システム20は、空中ディスプレイ22が商品に関する情報を表示中に、ドーム型ディスプレイ21の表示を変更する。具体的に、例えば、買物支援システム20は、ドーム型ディスプレイ21の固定位置の表示を変更してもよい。または、買物支援システム20は、顧客から商品を見た場合に顧客の視線の延長線上にあるドーム型ディスプレイ21の位置を特定し、特定された位置の表示を変更してもよい。これにより、顧客が商品を見難くなるのを抑制することができる。 Additionally, as described above, the information regarding the product displayed on the aerial display 22 is transparent. Therefore, it may be difficult for customers to see the products. Therefore, the shopping support system 20 changes the display on the dome-shaped display 21 while the aerial display 22 is displaying information regarding the product. Specifically, for example, the shopping support system 20 may change the display of the fixed position of the dome-shaped display 21. Alternatively, the shopping support system 20 may specify the position of the dome-shaped display 21 that is on the line of sight of the customer when viewing the product, and change the display at the specified position. Thereby, it is possible to prevent the customer from having difficulty viewing the product.
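 The position on the dome-shaped display 21 that lies on the extension of the customer's line of sight through the aerial display 22 can be estimated with simple ray geometry. The two-dimensional Python sketch below intersects the eye-to-display ray with a circular dome cross-section; the coordinates, the planar simplification, and the function name are assumptions for illustration and are not the specific method defined in the disclosure.

import math

def dome_overlap_point(eye, aerial_point, dome_radius):
    # Intersect the ray eye -> aerial_point with a circular dome (2D sketch).
    ex, ey = eye
    dx, dy = aerial_point[0] - ex, aerial_point[1] - ey
    # Solve |E + t*D|^2 = R^2 and take the forward intersection (t > 0).
    a = dx * dx + dy * dy
    b = 2 * (ex * dx + ey * dy)
    c = ex * ex + ey * ey - dome_radius ** 2
    disc = b * b - 4 * a * c
    if disc < 0:
        return None
    t = (-b + math.sqrt(disc)) / (2 * a)
    return (ex + t * dx, ey + t * dy)

# Eye 0.5 m behind the dome center, aerial-display point 0.3 m in front of the
# eye and slightly above it, dome radius 1.0 m.
print(dome_overlap_point((-0.5, 0.0), (-0.2, 0.1), 1.0))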
 また、店舗は、仮想空間における店舗であってもよいし、実際の店舗であってもよい。実際の店舗がある場合、買物支援システム20は、撮像装置25がリアルタイムで撮像した店舗の売り場の映像をドーム型ディスプレイ21に表示させてもよい。これにより、顧客は、実際の店舗にいなくとも実際の店舗にいるかのような体験ができるような支援をすることができる。 Further, the store may be a store in a virtual space or may be an actual store. If there is an actual store, the shopping support system 20 may display an image of the store's sales floor captured in real time by the imaging device 25 on the dome-shaped display 21. As a result, customers can be supported to have an experience as if they were in an actual store, even if they are not in the actual store.
 以上、各実施の形態の説明を終了する。各実施の形態は、変形して用いられてもよい。各実施の形態は、適宜組み合わせて用いられてもよい。また、各実施の形態において、買物支援システム20は、各機能部および情報の一部が含まれる構成であってもよい。 This concludes the description of each embodiment. Each embodiment may be modified and used. The embodiments may be used in combination as appropriate. Furthermore, in each embodiment, the shopping support system 20 may be configured to include only some of the functional units and information described above.
 また、各実施の形については、上述した例に限られず、種々変更可能である。また、各実施の形態における買物支援システム20の構成は特に限定されない。例えば、買物支援システム20は、一台のサーバなどの一台の装置によって実現されてもよい。買物支援システム20の各機能部を一台の装置によって実現される場合、例えば、一台の装置は、例えば買物支援装置、情報処理装置などと呼ばれてもよいし、特に限定されない。または、各実施の形態における買物支援システム20は、機能またはデータ別に異なる装置によって実現されてもよい。例えば、買物支援システム20の各機能部は、複数のサーバで構成され、複数のサーバを含む買物支援システム20が実現されてもよい。例えば、買物支援システム20は、各DBを含むデータベースサーバと、各機能部を有するサーバと、によって実現されてもよい。 Further, each embodiment is not limited to the examples described above and can be modified in various ways. The configuration of the shopping support system 20 in each embodiment is not particularly limited. For example, the shopping support system 20 may be realized by a single device, such as a single server. When the functional units of the shopping support system 20 are realized by a single device, that device may be called, for example, a shopping support device or an information processing device; the name is not particularly limited. Alternatively, the shopping support system 20 in each embodiment may be realized by different devices depending on functions or data. For example, the functional units of the shopping support system 20 may be distributed over a plurality of servers, so that the shopping support system 20 is realized as a system including the plurality of servers. For example, the shopping support system 20 may be realized by a database server that includes the DBs and a server that has the functional units.
 図16は、買物支援システム20の一実現例を示す説明図である。実際の店舗がある場合を例に挙げて説明する。 FIG. 16 is an explanatory diagram showing an example of implementation of the shopping support system 20. This will be explained using an example where there is an actual store.
 図16において、買物支援システム20は、例えば、エッジ端末装置31と、サーバ32とを備える。 In FIG. 16, the shopping support system 20 includes, for example, an edge terminal device 31 and a server 32.
 例えば、エッジ端末装置31、空中ディスプレイ22、ドーム型ディスプレイ21、撮像装置23、および録音装置24は、家や共有スペースなどに設置される。エッジ端末装置31、空中ディスプレイ22、ドーム型ディスプレイ21、撮像装置23、録音装置24は、通信ネットワークを介して接続される。 For example, the edge terminal device 31, the aerial display 22, the dome-shaped display 21, the imaging device 23, and the recording device 24 are installed in a house, a shared space, or the like. The edge terminal device 31, the aerial display 22, the dome-shaped display 21, the imaging device 23, and the recording device 24 are connected via a communication network.
 例えば、撮像装置25は、店舗に設置される。撮像装置25と、エッジ端末装置31と、サーバ32とは、通信ネットワークを介して接続される。 For example, the imaging device 25 is installed in a store. The imaging device 25, the edge terminal device 31, and the server 32 are connected via a communication network.
 また、買物支援システム20は、エッジ端末装置31、サーバ32、空中ディスプレイ22、ドーム型ディスプレイ21、撮像装置23、録音装置24、撮像装置25を備える全体のシステムとして構成されてもよい。 Additionally, the shopping support system 20 may be configured as an entire system including an edge terminal device 31, a server 32, an aerial display 22, a dome-shaped display 21, an imaging device 23, a recording device 24, and an imaging device 25.
 図16において、例えば、各実施の形態における各機能部は、エッジ端末装置31およびサーバ32によって実現される。例えば、エッジ端末装置31では、取得部204、映像解析部205、音声解析部206、操作受付部211を備える。例えば、映像や音声の解析は、負荷が大きいため、エッジ端末装置が、これらの解析を行うように構成されてもよい。一方、例えば、サーバ32は、第1出力制御部201、商品検出部202、第2出力制御部203、取得部204、撮像装置制御部207、台制御部208、重畳位置特定部209、および視認位置特定部210を備える。また、サーバ32は、複数のサーバであってもよい。このように、買物支援システム10,20の各機能部は、複数の装置によって実現されてもよく、複数の装置は、異なる場所に設置されていてもよい。 In FIG. 16, for example, each functional unit in each embodiment is realized by the edge terminal device 31 and the server 32. For example, the edge terminal device 31 includes the acquisition unit 204, the video analysis unit 205, the audio analysis unit 206, and the operation reception unit 211. Since analysis of video and audio imposes a heavy load, the edge terminal device may be configured to perform these analyses. On the other hand, for example, the server 32 includes the first output control unit 201, the product detection unit 202, the second output control unit 203, the acquisition unit 204, the imaging device control unit 207, the stand control unit 208, the superimposed position specifying unit 209, and the viewing position specifying unit 210. Further, the server 32 may be a plurality of servers. In this way, the functional units of the shopping support systems 10 and 20 may be realized by a plurality of devices, and the plurality of devices may be installed at different locations.
 また、各実施の形態において、各情報や各DBは、前述の情報の一部を含んでもよい。また、各情報や各DBは、前述の情報以外の情報を含んでもよい。各情報や各DBが、より詳細に、複数のDBや複数の情報に分けられてもよい。このように、各情報や各DBの実現方法は、特に限定されない。 Furthermore, in each embodiment, each piece of information and each DB may include part of the above-mentioned information. Moreover, each piece of information and each DB may include information other than the above-mentioned information. Each piece of information and each DB may be divided into a plurality of DBs and a plurality of pieces of information in more detail. In this way, the method of implementing each piece of information and each DB is not particularly limited.
 また、各画面は、一例であり、特に限定されない。各画面において、図示しないボタン、リスト、チェックボックス、情報表示欄、入力欄などが追加されてもよい。また、画面の背景色などが、変更されてもよい。 Furthermore, each screen is an example and is not particularly limited. In each screen, buttons, lists, check boxes, information display fields, input fields, etc. (not shown) may be added. Also, the background color of the screen, etc. may be changed.
 また、例えば、各実施の形態において、空中ディスプレイ22に表示させる情報等を生成する処理は、買物支援システム20の第2出力制御部203によって行われてもよい。また、この処理は、空中ディスプレイ22によって行われてもよい。 Further, for example, in each embodiment, the process of generating information etc. to be displayed on the aerial display 22 may be performed by the second output control unit 203 of the shopping support system 20. Further, this processing may be performed by the aerial display 22.
 また、例えば、各実施の形態において、ドーム型ディスプレイ21に表示させる情報等を生成する処理は、買物支援システム20の第1出力制御部201によって行われてもよい。また、この処理は、ドーム型ディスプレイ21によって行われてもよい。 Furthermore, for example, in each embodiment, the process of generating information and the like to be displayed on the dome-shaped display 21 may be performed by the first output control unit 201 of the shopping support system 20. Further, this processing may be performed by the dome-shaped display 21.
 また、各実施の形態において、第1ディスプレイとしてスクリーンの少なくとも一部分が曲がったようなディスプレイを例に挙げ、具体的に第1ディスプレイとしてドーム型ディスプレイ21を例に挙げて説明した。第1ディスプレイは、仮想空間の表示、映像や画像などの表示を目的としたディスプレイであれば、スクリーンの少なくとも一部分が曲がったようなディスプレイなどに限定されない。または、例えば、前述のように、第1ディスプレイは、ユーザの視界を覆うような構造になっている装置であってもよい。 Furthermore, in each of the embodiments, a display in which at least a portion of the screen is curved is used as an example of the first display, and specifically, the dome-shaped display 21 is used as an example of the first display. The first display is not limited to a display in which at least a portion of the screen is curved, as long as it is a display for the purpose of displaying a virtual space, video, images, etc. Alternatively, for example, as described above, the first display may be a device configured to cover the user's field of vision.
 また、各実施の形態において、第2ディスプレイとして非接触型のディスプレイを例に挙げて説明した。前述の通り、第2ディスプレイは、接触型のディスプレイであってもよいし、特に限定されない。 Furthermore, in each of the embodiments, a non-contact display was used as an example of the second display. As described above, the second display may be a contact type display and is not particularly limited.
 (コンピュータのハードウェア構成例)
 つぎに、各実施の形態において説明した買物支援システム10,20などの各装置をコンピュータで実現した場合のハードウェア構成例について説明する。図17は、コンピュータのハードウェア構成例を示す説明図である。例えば、各装置の一部又は全部は、例えば図17に示すようなコンピュータ80とプログラムとの任意の組み合わせを用いて実現することも可能である。
(Example of computer hardware configuration)
Next, an example of the hardware configuration when each device such as the shopping support systems 10 and 20 described in each embodiment is realized by a computer will be described. FIG. 17 is an explanatory diagram showing an example of the hardware configuration of a computer. For example, part or all of each device can be realized using any combination of a computer 80 and a program as shown in FIG. 17, for example.
 コンピュータ80は、例えば、プロセッサ801と、ROM(Read Only Memory)802と、RAM(Random Access Memory)803と、記憶装置804と、を有する。また、コンピュータ80は、通信インタフェース805と、入出力インタフェース806と、を有する。各構成部は、例えば、バス807を介してそれぞれ接続される。なお、各構成部の数は、特に限定されず、各構成部は1または複数である。 The computer 80 includes, for example, a processor 801, a ROM (Read Only Memory) 802, a RAM (Random Access Memory) 803, and a storage device 804. Further, the computer 80 has a communication interface 805 and an input/output interface 806. Each component is connected to each other via a bus 807, for example. Note that the number of each component is not particularly limited, and each component is one or more.
 プロセッサ801は、コンピュータ80の全体を制御する。プロセッサ801は、例えば、CPU(Central Processing Unit)、DSP(Digital Signal Processor)、GPU(Graphics Processing Unit)などが挙げられる。コンピュータ80は、記憶部として、ROM802、RAM803および記憶装置804などを有する。記憶装置804は、例えば、フラッシュメモリなどの半導体メモリ、HDD(Hard Disk Drive)、SSD(Solid State Drive)などが挙げられる。例えば、記憶装置804は、OS(Operating System)のプログラム、アプリケーションプログラム、各実施の形態にかかるプログラムなどを記憶する。または、ROM802は、アプリケーションプログラム、各実施の形態にかかるプログラムなどを記憶する。そして、RAM803は、プロセッサ801のワークエリアとして使用される。 A processor 801 controls the entire computer 80. Examples of the processor 801 include a CPU (Central Processing Unit), a DSP (Digital Signal Processor), and a GPU (Graphics Processing Unit). The computer 80 includes a ROM 802, a RAM 803, a storage device 804, and the like as storage units. Examples of the storage device 804 include semiconductor memory such as flash memory, HDD (Hard Disk Drive), SSD (Solid State Drive), and the like. For example, the storage device 804 stores OS (Operating System) programs, application programs, programs according to each embodiment, and the like. Alternatively, the ROM 802 stores application programs, programs according to each embodiment, and the like. The RAM 803 is used as a work area for the processor 801.
 また、プロセッサ801は、記憶装置804、ROM802などに記憶されたプログラムをロードする。そして、プロセッサ801は、プログラムにコーディングされている各処理を実行する。また、プロセッサ801は、通信ネットワークNTを介して各種プログラムをダウンロードしてもよい。また、プロセッサ801は、コンピュータ80の一部または全部として機能する。そして、プロセッサ801は、プログラムに基づいて図示したフローチャートにおける処理または命令を実行してもよい。 Additionally, the processor 801 loads programs stored in the storage device 804, ROM 802, etc. The processor 801 then executes each process coded in the program. Furthermore, the processor 801 may download various programs via the communication network NT. Further, the processor 801 functions as part or all of the computer 80. The processor 801 may then execute the processes or instructions in the illustrated flowchart based on the program.
 通信インタフェース805は、無線または有線の通信回線を通じて、LAN(Local Area Network)、WAN(Wide Area Network)などの通信ネットワークNTに接続される。なお、通信ネットワークNTは複数の通信ネットワークNTによって構成されてもよい。これにより、コンピュータ80は、通信ネットワークNTを介して外部の装置や外部のコンピュータ80に接続される。通信インタフェース805は、通信ネットワークNTとコンピュータ80の内部とのインタフェースを司る。そして、通信インタフェース805は、外部の装置や外部のコンピュータ80からのデータの入出力を制御する。 The communication interface 805 is connected to a communication network NT such as a LAN (Local Area Network) or a WAN (Wide Area Network) through a wireless or wired communication line. Note that the communication network NT may be composed of a plurality of communication networks NT. Thereby, the computer 80 is connected to an external device or an external computer 80 via the communication network NT. Communication interface 805 serves as an interface between communication network NT and the inside of computer 80 . The communication interface 805 controls input and output of data from external devices and the external computer 80.
 また、入出力インタフェース806は、入力装置、出力装置、および入出力装置の少なくともいずれかに接続される。接続方法は、無線であってもよいし、有線であってもよい。入力装置は、例えば、キーボード、マウス、マイクなどが挙げられる。出力装置は、例えば、表示装置、点灯装置、音声を出力する音声出力装置などが挙げられる。また、入出力装置は、タッチパネルディスプレイなどが挙げられる。なお、入力装置、出力装置、および入出力装置などは、コンピュータ80に内蔵されていてもよいし、外付けであってもよい。 Further, the input/output interface 806 is connected to at least one of an input device, an output device, and an input/output device. The connection method may be wireless or wired. Examples of the input device include a keyboard, a mouse, and a microphone. Examples of the output device include a display device, a lighting device, and an audio output device that outputs audio. Further, examples of the input/output device include a touch panel display. Note that the input device, output device, input/output device, etc. may be built into the computer 80 or may be externally attached.
 コンピュータ80のハードウェア構成は一例である。コンピュータ80は、図17に示す一部の構成要素を有していてもよい。コンピュータ80は、図17に示す以外の構成要素を有していてもよい。例えば、コンピュータ80は、ドライブ装置などを有してもよい。そして、プロセッサ801は、ドライブ装置などに装着された記録媒体に記憶されたプログラムやデータをRAM803に読み出してもよい。非一時的な有形な記録媒体としては、光ディスク、フレキシブルディスク、磁気光ディスク、USB(Universal Serial Bus)メモリなどが挙げられる。また、前述の通り、例えば、コンピュータ80は、キーボードやマウスなどの入力装置を有してもよい。コンピュータ80は、ディスプレイなどの出力装置を有していてもよい。また、コンピュータ80は、入力装置および出力装置と、入出力装置とをそれぞれ有してもよい。 The hardware configuration of the computer 80 is an example. The computer 80 may include only some of the components shown in FIG. 17, or may include components other than those shown in FIG. 17. For example, the computer 80 may include a drive device or the like. The processor 801 may then read programs and data stored in a recording medium attached to the drive device or the like into the RAM 803. Examples of non-transitory tangible recording media include optical disks, flexible disks, magneto-optical disks, USB (Universal Serial Bus) memories, and the like. Further, as described above, the computer 80 may include input devices such as a keyboard and a mouse, and may include an output device such as a display. The computer 80 may also include an input device and an output device as well as an input/output device.
 また、コンピュータ80は、図示しない各種センサを有してもよい。センサの種類は特に限定されない。また、コンピュータ80は、画像や映像を撮像可能な撮像装置を備えていてもよい。 Additionally, the computer 80 may include various sensors (not shown). The type of sensor is not particularly limited. Further, the computer 80 may include an imaging device capable of capturing images and videos.
 以上で、各装置のハードウェア構成の説明を終了する。また、各装置の実現方法には、様々な変形例がある。例えば、各装置は、構成要素ごとにそれぞれ異なるコンピュータとプログラムとの任意の組み合わせにより実現されてもよい。また、各装置が備える複数の構成要素が、一つのコンピュータとプログラムとの任意の組み合わせにより実現されてもよい。 This concludes the explanation of the hardware configuration of each device. Furthermore, there are various variations in the method of implementing each device. For example, each device may be realized by an arbitrary combination of a computer and a program, each of which is different for each component. Further, the plurality of components included in each device may be realized by an arbitrary combination of one computer and a program.
 また、各装置の各構成要素の一部または全部は、特定用途向けの回路で実現されてもよい。また、各装置の各構成要素の一部または全部は、FPGA(Field Programmable Gate Array)のようなプロセッサなどを含む汎用の回路によって実現されてもよい。また、各装置の各構成要素の一部または全部は、特定用途向けの回路や汎用の回路などの組み合わせによって実現されてもよい。また、これらの回路は、単一の集積回路であってもよい。または、これらの回路は、複数の集積回路に分割されてもよい。そして、複数の集積回路は、バスなどを介して接続されることにより構成されてもよい。 Additionally, some or all of the components of each device may be realized by application-specific circuits. Moreover, a part or all of each component of each device may be realized by a general-purpose circuit including a processor such as an FPGA (Field Programmable Gate Array). Furthermore, some or all of the components of each device may be realized by a combination of application-specific circuits, general-purpose circuits, and the like. Also, these circuits may be a single integrated circuit. Alternatively, these circuits may be divided into multiple integrated circuits. Further, the plurality of integrated circuits may be configured by being connected via a bus or the like.
 また、各装置の各構成要素の一部または全部が複数のコンピュータや回路などにより実現される場合、複数のコンピュータや回路などは、集中配置されてもよいし、分散配置されてもよい。 Further, in the case where a part or all of each component of each device is realized by a plurality of computers, circuits, etc., the plurality of computers, circuits, etc. may be arranged centrally or in a distributed arrangement.
 各実施の形態で説明した買物支援方法は、買物支援システムが実行することにより実現される。また、例えば、買物支援方法は、予め用意されたプログラムをサーバや端末装置などのコンピュータが実行することにより実現される。各実施の形態で説明したプログラムは、HDD、SSD、フレキシブルディスク、光ディスク、フレキシブルディスク、磁気光ディスク、USBメモリなどのコンピュータで読み取り可能な記録媒体に記録される。そして、プログラムは、コンピュータによって記録媒体から読み出されることによって実行される。また、プログラムは、通信ネットワークNTを介して配布されてもよい。 The shopping support method described in each embodiment is realized by being executed by the shopping support system. Further, for example, the shopping support method is realized by a computer such as a server or a terminal device executing a program prepared in advance. The program described in each embodiment is recorded on a computer-readable recording medium such as an HDD, an SSD, a flexible disk, an optical disk, a magneto-optical disk, or a USB memory. The program is then executed by being read from the recording medium by the computer. The program may also be distributed via the communication network NT.
 以上説明した、各実施の形態における買物支援システムの各構成要素は、コンピュータのように、その機能を専用のハードウェアで実現されてもよい。または、各構成要素は、ソフトウェアによって実現されてもよい。または、各構成要素は、ハードウェアおよびソフトウェアの組み合わせによって実現されてもよい。 The functions of each component of the shopping support system in each embodiment described above may be realized by dedicated hardware, such as a computer. Alternatively, each component may be realized by software. Alternatively, each component may be realized by a combination of hardware and software.
 以上、各実施の形態を参照して本開示を説明したが、本開示は上記実施の形態に限定されるものではない。各本開示の構成や詳細には、本開示のスコープ内で当業者が把握し得る様々な変更を適用した実施の形態を含み得る。本開示は、本明細書に記載された事項を必要に応じて適宜に組み合わせ、または置換した実施の形態を含み得る。例えば、特定の実施の形態を用いて説明された事項は、矛盾を生じない範囲において、他の実施の形態に対しても適用され得る。例えば、複数の動作をフローチャートの形式で順番に記載してあるが、その記載の順番は複数の動作を実行する順番を限定するものではない。このため、各実施の形態を実施するときには、その複数の動作の順番を内容的に支障しない範囲で変更することができる。 Although the present disclosure has been described above with reference to each embodiment, the present disclosure is not limited to the above embodiments. The configuration and details of the present disclosure may include embodiments to which various changes that those skilled in the art can conceive within the scope of the present disclosure are applied. The present disclosure may include embodiments in which the matters described in this specification are combined or replaced as appropriate and as necessary. For example, matters described using a particular embodiment may also be applied to other embodiments to the extent that no contradiction arises. For example, although a plurality of operations are described in order in the form of a flowchart, the order of description does not limit the order in which the operations are executed. Therefore, when implementing each embodiment, the order of the operations can be changed to the extent that the content is not affected.
 上記の実施の形態の一部または全部は、以下の付記のようにも記載されることができる。ただし、上記の実施の形態の一部または全部は、以下に限られない。 Part or all of the above embodiments can also be described as in the following additional notes. However, some or all of the above embodiments are not limited to the following.
 (付記1)
 第1ディスプレイに、商品が陳列された店舗の売り場に関する情報を表示させる第1出力制御手段と、
 顧客の行動に基づいて、前記顧客が取得したい商品を検出する商品検出手段と、
 第2ディスプレイに、検出された前記商品に関する情報を表示させる第2出力制御手段と、
 を備える買物支援システム。
(付記2)
 前記顧客の行動は、前記顧客の手の動きである、
 付記1に記載の買物支援システム。
(付記3)
 前記商品検出手段は、前記顧客が手を伸ばした位置によって、前記商品を検出する、
 付記2に記載の買物支援システム。
(付記4)
 前記顧客の行動は、前記顧客と店員との会話である、
 付記1から3のいずれかに記載の買物支援システム。
(付記5)
 前記第1出力制御手段は、前記第1ディスプレイに、さらに、前記店員に関する情報を表示させる、
 付記4に記載の買物支援システム。
(付記6)
 前記商品に関する情報は、前記商品の映像または前記商品の画像であり、
 前記第2出力制御手段は、前記顧客と店員との会話に応じた色および形状の少なくともいずれかで前記商品に関する情報を表示させる、
 付記1から5のいずれかに記載の買物支援システム。
(付記7)
 前記商品に関する情報は、前記商品の映像である、
 付記1から5のいずれかに記載の買物支援システム。
(付記8)
 前記第2出力制御手段は、前記顧客の手の動きに基づいて、前記商品の前記映像における前記商品の表示を変える、
 付記7に記載の買物支援システム。
(付記9)
 前記商品の前記映像は、前記店舗に設置された撮像装置によって撮像された前記商品の映像であり、
 前記顧客の手の動きに基づいて、前記撮像装置の撮像位置および向きの少なくともいずれかを制御する撮像装置制御手段、
 を備える
 付記7に記載の買物支援システム。
(付記10)
 前記商品の前記映像は、前記店舗に設置された撮像装置によって撮像された前記商品の映像であり、
 前記顧客の手の動きに基づいて、前記店舗において前記商品を載置可能な回転式の台の回転を制御する台制御手段、
 を備える
 付記7に記載の買物支援システム。
(付記11)
 前記第1出力制御手段は、前記第2ディスプレイが前記商品に関する情報を表示する際に、前記第1ディスプレイの固定位置の表示を変更する、
 付記1から10のいずれかに記載の買物支援システム。
(付記12)
 前記顧客の位置と前記第2ディスプレイの位置との位置関係、および前記第2ディスプレイによる前記商品に関する情報の表示領域と前記第1ディスプレイの位置との位置関係に基づいて、前記顧客が前記表示領域に表示された前記商品に関する情報を見た場合に前記表示領域と重なる前記第1ディスプレイの位置を特定する特定手段、
 を備え、
 前記第1出力制御手段は、前記第2ディスプレイが前記商品に関する情報を表示する際に、前記第1ディスプレイにおける特定された前記位置の表示を変更する、
 付記1から10のいずれかに記載の買物支援システム。
(付記13)
 前記店舗は、仮想空間における店舗である、
 付記1から12のいずれかに記載の買物支援システム。
(付記14)
 前記売り場に関する情報は、前記店舗に設置された撮像装置によって撮像された前記店舗の前記売り場の映像である、
 付記1から12のいずれかに記載の買物支援システム。
(付記15)
 前記第1ディスプレイは、前記顧客の視界を覆うような構造になっている装置である、
 付記1から14のいずれかに記載の買物支援システム。
(付記16)
 前記第1ディスプレイは、ドーム型ディスプレイである、
 付記1から15のいずれかに記載の買物支援システム。
(付記17)
 前記第2ディスプレイは、非接触型のディスプレイである、
 付記1から16のいずれかに記載の買物支援システム。
(付記18)
 前記第2ディスプレイは、空中ディスプレイである、
 付記17に記載の買物支援システム。
(付記19)
 前記第2ディスプレイは、前記顧客と前記第1ディスプレイとの間に設置される、
 付記1から18のいずれかに記載の買物支援システム。
(付記20)
 前記第2ディスプレイは、前記第1ディスプレイよりも前記顧客の近傍に設置される、
 付記1から18のいずれかに記載の買物支援システム。
(付記21)
 前記第1ディスプレイは、前記第2ディスプレイよりも大きい、
 付記1から20のいずれかに記載の買物支援システム。
(付記22)
 第1ディスプレイに、商品が陳列された店舗の売り場に関する情報を表示させ、
 顧客の行動に基づいて、前記顧客が取得したい商品を検出し、
 第2ディスプレイに、検出した前記商品に関する情報を表示させる、
 買物支援方法。
(付記23)
 コンピュータに、
 第1ディスプレイに、商品が陳列された店舗の売り場に関する情報を表示させ、
 顧客の行動に基づいて、前記顧客が取得したい商品を検出し、
 第2ディスプレイに、検出した前記商品に関する情報を表示させる、
 処理を実行させるプログラムを記録する、前記コンピュータが読み取り可能な非一時的な記録媒体。
(付記24)
 コンピュータに、
 第1ディスプレイに、商品が陳列された店舗の売り場に関する情報を表示させ、
 顧客の行動に基づいて、前記顧客が取得したい商品を検出し、
 第2ディスプレイに、検出した前記商品に関する情報を表示させる、
 処理を実行させるプログラム。
(Additional note 1)
a first output control means that causes the first display to display information regarding the department of the store where the products are displayed;
product detection means for detecting a product that the customer wants to acquire based on the customer's behavior;
a second output control means for displaying information regarding the detected product on a second display;
A shopping support system equipped with
(Additional note 2)
the customer's action is a hand movement of the customer;
The shopping support system described in Appendix 1.
(Additional note 3)
The product detection means detects the product based on the position where the customer extends his/her hand.
The shopping support system described in Appendix 2.
(Additional note 4)
The customer's behavior is a conversation between the customer and a store clerk.
The shopping support system according to any one of Supplementary Notes 1 to 3.
(Appendix 5)
The first output control means further causes the first display to display information regarding the store clerk.
The shopping support system described in Appendix 4.
(Appendix 6)
The information regarding the product is a video of the product or an image of the product,
The second output control means displays information regarding the product in at least one of a color and a shape depending on the conversation between the customer and the store clerk.
The shopping support system according to any one of Supplementary Notes 1 to 5.
(Appendix 7)
The information regarding the product is an image of the product;
The shopping support system according to any one of Supplementary Notes 1 to 5.
(Appendix 8)
The second output control means changes the display of the product in the image of the product based on the customer's hand movement.
The shopping support system described in Appendix 7.
(Appendix 9)
The image of the product is an image of the product captured by an imaging device installed in the store,
Imaging device control means for controlling at least one of the imaging position and orientation of the imaging device based on the customer's hand movement;
The shopping support system described in Appendix 7.
(Appendix 10)
The image of the product is an image of the product captured by an imaging device installed in the store,
a table control means for controlling the rotation of a rotary table on which the product can be placed in the store based on the customer's hand movement;
The shopping support system described in Appendix 7.
(Appendix 11)
The first output control means changes the display of the fixed position of the first display when the second display displays information regarding the product.
The shopping support system according to any one of Supplementary Notes 1 to 10.
(Appendix 12)
a specifying means for specifying, based on a positional relationship between the position of the customer and the position of the second display and a positional relationship between a display area in which the second display displays the information regarding the product and the position of the first display, a position of the first display that overlaps the display area when the customer views the information regarding the product displayed in the display area;
Equipped with
The first output control means changes the display of the specified position on the first display when the second display displays information regarding the product.
The shopping support system according to any one of Supplementary Notes 1 to 10.
(Appendix 13)
The store is a store in a virtual space,
The shopping support system according to any one of Supplementary Notes 1 to 12.
(Appendix 14)
The information regarding the sales floor is an image of the sales floor of the store captured by an imaging device installed in the store.
The shopping support system according to any one of Supplementary Notes 1 to 12.
(Appendix 15)
The first display is a device structured to cover the customer's field of view.
The shopping support system according to any one of Supplementary Notes 1 to 14.
(Appendix 16)
the first display is a dome-shaped display;
The shopping support system according to any one of Supplementary Notes 1 to 15.
(Appendix 17)
The second display is a non-contact display.
The shopping support system according to any one of Supplementary Notes 1 to 16.
(Appendix 18)
the second display is an aerial display;
The shopping support system described in Appendix 17.
(Appendix 19)
the second display is installed between the customer and the first display;
The shopping support system according to any one of Supplementary Notes 1 to 18.
(Additional note 20)
the second display is installed closer to the customer than the first display;
The shopping support system according to any one of Supplementary Notes 1 to 18.
(Additional note 21)
the first display is larger than the second display;
The shopping support system according to any one of Supplementary Notes 1 to 20.
(Additional note 22)
displaying information regarding the department of the store where the product is displayed on the first display;
detecting products that the customer wants to acquire based on the customer's behavior;
displaying information regarding the detected product on a second display;
Shopping support method.
(Additional note 23)
to the computer,
displaying information regarding the department of the store where the product is displayed on the first display;
detecting products that the customer wants to acquire based on the customer's behavior;
displaying information regarding the detected product on a second display;
The computer-readable non-transitory recording medium that records a program for executing processing.
(Additional note 24)
to the computer,
displaying information regarding the department of the store where the product is displayed on the first display;
detecting products that the customer wants to acquire based on the customer's behavior;
displaying information regarding the detected product on a second display;
A program that executes processing.
10,20 買物支援システム
21 ドーム型ディスプレイ
22 空中ディスプレイ
23 撮像装置
24 録音装置
25,25-1,25-2 撮像装置
26 録音装置
27 台
31 エッジ端末装置
32 サーバ
80 コンピュータ
101,201 第1出力制御部
102,202 商品検出部
103,203 第2出力制御部
204 取得部
205 映像解析部
206 音声解析部
207 撮像装置制御部
208 台制御部
209 重畳位置特定部
210 視認位置特定部
211 操作受付部
801 プロセッサ
802 ROM
803 RAM
804 記憶装置
805 通信インタフェース
806 入出力インタフェース
807 バス
2101 スクリーン
2102 投影装置
2103 テーブル
NT 通信ネットワーク
2001 顧客DB
2002 店舗DB
2003 商品DB
 10, 20 Shopping support system
21 Dome-shaped display
22 Aerial display
23 Imaging device
24 Recording device
25, 25-1, 25-2 Imaging device
26 Recording device
27 Stand
31 Edge terminal device
32 Server
80 Computer
101, 201 First output control unit
102, 202 Product detection unit
103, 203 Second output control unit
204 Acquisition unit
205 Video analysis unit
206 Audio analysis unit
207 Imaging device control unit
208 Stand control unit
209 Superimposed position specifying unit
210 Viewing position specifying unit
211 Operation reception unit
801 Processor
802 ROM
803 RAM
804 Storage device
805 Communication interface
806 Input/output interface
807 Bus
2101 Screen
2102 Projection device
2103 Table
NT Communication network
2001 Customer DB
2002 Store DB
2003 Product DB

Claims (23)

  1.  第1ディスプレイに、商品が陳列された店舗の売り場に関する情報を表示させる第1出力制御手段と、
     顧客の行動に基づいて、前記顧客が取得したい商品を検出する商品検出手段と、
     第2ディスプレイに、検出された前記商品に関する情報を表示させる第2出力制御手段と、
     を備える買物支援システム。
    a first output control means that causes the first display to display information regarding the department of the store where the products are displayed;
    product detection means for detecting a product that the customer wants to acquire based on the customer's behavior;
    a second output control means for displaying information regarding the detected product on a second display;
    A shopping support system equipped with
  2.  前記顧客の行動は、前記顧客の手の動きである、
     請求項1に記載の買物支援システム。
    the customer's action is a hand movement of the customer;
    The shopping support system according to claim 1.
  3.  前記商品検出手段は、前記顧客が手を伸ばした位置によって、前記商品を検出する、
     請求項2に記載の買物支援システム。
    The product detection means detects the product based on the position where the customer extends his/her hand.
    The shopping support system according to claim 2.
  4.  前記顧客の行動は、前記顧客と店員との会話である、
     請求項1から3のいずれかに記載の買物支援システム。
    The customer's behavior is a conversation between the customer and a store clerk.
    A shopping support system according to any one of claims 1 to 3.
  5.  前記第1出力制御手段は、前記第1ディスプレイに、さらに、前記店員に関する情報を表示させる、
     請求項4に記載の買物支援システム。
    The first output control means further causes the first display to display information regarding the store clerk.
    The shopping support system according to claim 4.
  6.  前記商品に関する情報は、前記商品の映像または前記商品の画像であり、
     前記第2出力制御手段は、前記顧客と店員との会話に応じた色および形状の少なくともいずれかで前記商品に関する情報を表示させる、
     請求項1から5のいずれかに記載の買物支援システム。
    The information regarding the product is a video of the product or an image of the product,
    The second output control means displays information regarding the product in at least one of a color and a shape depending on the conversation between the customer and the store clerk.
    A shopping support system according to any one of claims 1 to 5.
  7.  前記商品に関する情報は、前記商品の映像である、
     請求項1から5のいずれかに記載の買物支援システム。
    The information regarding the product is an image of the product;
    A shopping support system according to any one of claims 1 to 5.
  8.  前記第2出力制御手段は、前記顧客の手の動きに基づいて、前記商品の前記映像における前記商品の表示を変える、
     請求項7に記載の買物支援システム。
    The second output control means changes the display of the product in the image of the product based on the customer's hand movement.
    The shopping support system according to claim 7.
  9.  前記商品の前記映像は、前記店舗に設置された撮像装置によって撮像された前記商品の映像であり、
     前記顧客の手の動きに基づいて、前記撮像装置の撮像位置および向きの少なくともいずれかを制御する撮像装置制御手段、
     を備える
     請求項7に記載の買物支援システム。
    The image of the product is an image of the product captured by an imaging device installed in the store,
    Imaging device control means for controlling at least one of the imaging position and orientation of the imaging device based on the customer's hand movement;
    The shopping support system according to claim 7, comprising:
  10.  前記商品の前記映像は、前記店舗に設置された撮像装置によって撮像された前記商品の映像であり、
     前記顧客の手の動きに基づいて、前記店舗において前記商品を載置可能な回転式の台の回転を制御する台制御手段、
     を備える
     請求項7に記載の買物支援システム。
    The image of the product is an image of the product captured by an imaging device installed in the store,
    a table control means for controlling the rotation of a rotary table on which the product can be placed in the store based on the customer's hand movement;
    The shopping support system according to claim 7, comprising:
  11.  前記第1出力制御手段は、前記第2ディスプレイが前記商品に関する情報を表示する際に、前記第1ディスプレイの固定位置の表示を変更する、
     請求項1から10のいずれかに記載の買物支援システム。
    The first output control means changes the display of the fixed position of the first display when the second display displays information regarding the product.
    A shopping support system according to any one of claims 1 to 10.
  12.  前記顧客の位置と前記第2ディスプレイの位置との位置関係、および前記第2ディスプレイによる前記商品に関する情報の表示領域と前記第1ディスプレイの位置との位置関係に基づいて、前記顧客が前記表示領域に表示された前記商品に関する情報を見た場合に前記表示領域と重なる前記第1ディスプレイの位置を特定する特定手段、
     を備え、
     前記第1出力制御手段は、前記第2ディスプレイが前記商品に関する情報を表示する際に、前記第1ディスプレイにおける特定された前記位置の表示を変更する、
     請求項1から10のいずれかに記載の買物支援システム。
    a specifying means for specifying, based on a positional relationship between the position of the customer and the position of the second display and a positional relationship between a display area in which the second display displays the information regarding the product and the position of the first display, a position of the first display that overlaps the display area when the customer views the information regarding the product displayed in the display area;
    Equipped with
    The first output control means changes the display of the specified position on the first display when the second display displays information regarding the product.
    A shopping support system according to any one of claims 1 to 10.
  13.  前記店舗は、仮想空間における店舗である、
     請求項1から12のいずれかに記載の買物支援システム。
    The store is a store in a virtual space,
    A shopping support system according to any one of claims 1 to 12.
  14.  前記売り場に関する情報は、前記店舗に設置された撮像装置によって撮像された前記店舗の前記売り場の映像である、
     請求項1から12のいずれかに記載の買物支援システム。
    The information regarding the sales floor is an image of the sales floor of the store captured by an imaging device installed in the store.
    A shopping support system according to any one of claims 1 to 12.
  15.  前記第1ディスプレイは、前記顧客の視界を覆うような構造になっている装置である、
     請求項1から14のいずれかに記載の買物支援システム。
    The first display is a device structured to cover the customer's field of view.
    A shopping support system according to any one of claims 1 to 14.
  16.  前記第1ディスプレイは、ドーム型ディスプレイである、
     請求項1から15のいずれかに記載の買物支援システム。
    the first display is a dome-shaped display;
    A shopping support system according to any one of claims 1 to 15.
  17.  前記第2ディスプレイは、非接触型のディスプレイである、
     請求項1から16のいずれかに記載の買物支援システム。
    The second display is a non-contact display.
    A shopping support system according to any one of claims 1 to 16.
  18.  前記第2ディスプレイは、空中ディスプレイである、
     請求項17に記載の買物支援システム。
    the second display is an aerial display;
    The shopping support system according to claim 17.
  19.  前記第2ディスプレイは、前記顧客と前記第1ディスプレイとの間に設置される、
     請求項1から18のいずれかに記載の買物支援システム。
    the second display is installed between the customer and the first display;
    A shopping support system according to any one of claims 1 to 18.
  20.  前記第2ディスプレイは、前記第1ディスプレイよりも前記顧客の近傍に設置される、
     請求項1から18のいずれかに記載の買物支援システム。
    the second display is installed closer to the customer than the first display;
    A shopping support system according to any one of claims 1 to 18.
  21.  前記第1ディスプレイは、前記第2ディスプレイよりも大きい、
     請求項1から20のいずれかに記載の買物支援システム。
    the first display is larger than the second display;
    A shopping support system according to any one of claims 1 to 20.
  22.  第1ディスプレイに、商品が陳列された店舗の売り場に関する情報を表示させ、
     顧客の行動に基づいて、前記顧客が取得したい商品を検出し、
     第2ディスプレイに、検出した前記商品に関する情報を表示させる、
     買物支援方法。
    displaying information regarding the department of the store where the product is displayed on the first display;
    detecting products that the customer wants to acquire based on the customer's behavior;
    displaying information regarding the detected product on a second display;
    Shopping support method.
  23.  コンピュータに、
     第1ディスプレイに、商品が陳列された店舗の売り場に関する情報を表示させ、
     顧客の行動に基づいて、前記顧客が取得したい商品を検出し、
     第2ディスプレイに、検出した前記商品に関する情報を表示させる、
     処理を実行させるプログラムを記録する、前記コンピュータが読み取り可能な非一時的な記録媒体。
    to the computer,
    displaying information regarding the department of the store where the product is displayed on the first display;
    detecting products that the customer wants to acquire based on the customer's behavior;
    displaying information regarding the detected product on a second display;
    The computer-readable non-transitory recording medium that records a program for executing processing.
PCT/JP2022/027181 2022-07-11 2022-07-11 Shopping assistance system, shopping assistance method, and recording medium WO2024013777A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2022/027181 WO2024013777A1 (en) 2022-07-11 2022-07-11 Shopping assistance system, shopping assistance method, and recording medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2022/027181 WO2024013777A1 (en) 2022-07-11 2022-07-11 Shopping assistance system, shopping assistance method, and recording medium

Publications (1)

Publication Number Publication Date
WO2024013777A1 true WO2024013777A1 (en) 2024-01-18

Family

ID=89536256

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2022/027181 WO2024013777A1 (en) 2022-07-11 2022-07-11 Shopping assistance system, shopping assistance method, and recording medium

Country Status (1)

Country Link
WO (1) WO2024013777A1 (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004318791A (en) * 2003-04-16 2004-11-11 Ryuichi Yokota Electronic commerce support system and method
JP2012216116A (en) * 2011-04-01 2012-11-08 Seiko Epson Corp Sales supporting device, sales supporting device system, control method of sales supporting device and program
JP2013174642A (en) * 2012-02-23 2013-09-05 Toshiba Corp Image display device
WO2019087996A1 (en) * 2017-10-30 2019-05-09 ピクシーダストテクノロジーズ株式会社 Retinal projection device and retinal projection system
WO2019155916A1 (en) * 2018-02-09 2019-08-15 国立大学法人 福井大学 Image display device using retinal scan display unit and method therefor
WO2020160165A1 (en) * 2019-02-01 2020-08-06 Sony Corporation Multi-factor authentication for virtual reality

Similar Documents

Publication Publication Date Title
US20210166300A1 (en) Virtual reality platform for retail environment simulation
US11226688B1 (en) System and method for human gesture processing from video input
JP5015926B2 (en) Apparatus and method for monitoring individuals interested in property
US20130145272A1 (en) System and method for providing an interactive data-bearing mirror interface
US20150215674A1 (en) Interactive streaming video
KR20190005082A (en) Method and appratus for providing information on offline merchandise to sales on online through augmented reality
US20120209715A1 (en) Interaction with networked screen content via motion sensing device in retail setting
KR20180123217A (en) Method and apparatus for providing a user interface with a computerized system and interacting with the virtual environment
US10282904B1 (en) Providing augmented reality view of objects
CN202142050U (en) Interactive customer reception system
US20150100464A1 (en) Information displaying apparatus and method of object
CN109683711B (en) Product display method and device
JP2020095581A (en) Method for processing information, information processor, information processing system, and store
WO2024013777A1 (en) Shopping assistance system, shopping assistance method, and recording medium
JP6466363B2 (en) Fixtures
JP6583043B2 (en) Image display device, display control method, and display control program
TW201710982A (en) Interactive augmented reality house viewing system enabling users to interactively simulate and control augmented reality object data in the virtual house viewing system
CN112150230A (en) Entity store information interaction system and information pushing method
JP6586904B2 (en) Image display device, display control method, and display control program
JP2012048656A (en) Image processing apparatus, and image processing method
KR20160041224A (en) Merchandise sales service device based on dynamic scene change, Merchandise sales system based on dynamic scene change, method for selling merchandise based on dynamic scene change and computer readable medium having computer program recorded therefor
CN109685568B (en) Product display method and device
JP2021121939A (en) Online sales system, online purchase system, and computer program
TWM514071U (en) Interactive augmented reality audio/video house browsing system
TWI492176B (en) Image Playing System with Customer Preference Product Analysis Capability and Its Method

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22950998

Country of ref document: EP

Kind code of ref document: A1