WO2023111902A1 - Interface device for trade show events and/or virtual exhibitions - Google Patents

Interface device for trade show events and/or virtual exhibitions Download PDF

Info

Publication number
WO2023111902A1
Authority
WO
WIPO (PCT)
Prior art keywords
audio
interaction means
entity
acquisition
dynamic
Prior art date
Application number
PCT/IB2022/062226
Other languages
French (fr)
Inventor
Renzo BRAGLIA
Original Assignee
Braglia Renzo
Priority date
Filing date
Publication date
Application filed by Braglia Renzo filed Critical Braglia Renzo
Publication of WO2023111902A1 publication Critical patent/WO2023111902A1/en

Classifications

    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09FDISPLAYING; ADVERTISING; SIGNS; LABELS OR NAME-PLATES; SEALS
    • G09F27/00Combined visual and audible advertising or displaying, e.g. for public address

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • User Interface Of Digital Computer (AREA)
  • Toys (AREA)

Abstract

The interface device (1) for trade show events and/or virtual exhibitions comprises: - a supporting structure (4), which can be placed in an exhibition area which can be occupied by at least one actor entity (2) under a promoter entity; - first interaction means (5), mounted on the supporting structure (4) and configured to share audio and/or video contents bi-directionally and in real time with at least second interaction means (6), used by at least one spectator entity (3), the first interaction means (5) comprising a plurality of acquisition groups (7), each configured to acquire the image and/or audio of the at least one actor entity (2) and/or of at least one piece of multimedia data shared during the communication with the at least second interaction means (6); - a direction and control device (8), mounted on the supporting structure (4) and configured to selectively activate the sharing of the image and/or audio acquired by each of the acquisition groups (7) by the first interaction means (5).

Description

INTERFACE DEVICE FOR TRADE SHOW EVENTS AND/OR VIRTUAL EXHIBITIONS
Technical Field
The present invention relates to an interface device for trade show events and/or virtual exhibitions.
Background Art
Several types of trade show events or exhibitions are known that allow companies and/or individuals to show their products and services to the public in order to sell and advertise them.
Typically, holding a trade show event involves setting up special spaces, made accessible to the public, within which companies and/or individuals can stage shows or advertisements of their products and services.
Generally, trade shows must be able to contain a large number of spaces of various sizes for exhibitors to set up their booths.
In addition, trade shows must be able to provide certain basic services, such as e.g. catering space, toilets, etc.
For these and other reasons, holding a trade show event is always extremely costly.
In addition, unpredictable conditions may occur that do not allow the trade show to be organized and held, e.g. due to health emergencies, natural disasters, etc., resulting in substantial economic losses by those involved.
Description of the Invention
The main aim of the present invention is, therefore, to devise an interface device which allows trade show events and/or virtual exhibitions to be staged conveniently and quickly, thus effectively reducing the costs involved in staging them.
Another object of the present invention is to devise an interface device for trade show events and/or exhibitions which allows images of different entities to be selectively shown clearly to an interlocutor, different views of the same entity and/or images provided by external multimedia sources.
Another object of the present invention is to devise an interface device for trade show events and/or virtual exhibitions which allows the aforementioned drawbacks of the prior art to be overcome within the framework of a simple, rational, easy and effective to use as well as affordable solution.
The aforementioned objects are achieved by this interface device for trade show events and/or virtual exhibitions having the characteristics of claim 1.
The aforementioned objects are achieved by this interface system for trade show events and/or virtual exhibitions having the characteristics of claim 13.
Brief Description of the Drawings
Other characteristics and advantages of the present invention will become more apparent from the description of a preferred, but not exclusive, embodiment of an interface device for trade show events and/or virtual exhibitions, illustrated by way of an indicative, yet non-limiting example in the accompanying tables of drawings in which:
- Figure 1 is a perspective and front view of the interface device in accordance with the present invention;
- Figure 2 is a perspective and rear view of the interface device in Figure 1;
- Figure 3 is a perspective view of an embodiment of the interface device in accordance with the present invention;
- Figure 4 is a view of a block diagram of the interface system in accordance with the present invention.
Embodiments of the Invention
With particular reference to these figures, reference numeral 1 globally indicates an interface device for trade show events and/or virtual exhibitions.
Specifically, the interface device 1 allows for audio and/or video communication between at least one actor entity 2, under a promoter entity, and at least one spectator entity 3, placed in environments separate from each other, in order to stage a trade show event and/or a virtual exhibition.
In the remainder of this disclosure, the term actor entity 2 is used to refer to both an inanimate actor entity, such as an item or a product that the promoter entity intends to advertise and/or sell at a trade show, and an animate actor entity, such as e.g. a person charged by the promoter entity to show the product to be advertised.
Conveniently, the interface device 1 comprises:
- a supporting structure 4, which can be placed in an exhibition area which can be occupied by at least one actor entity 2 under a promoter entity;
- first interaction means 5, mounted on the supporting structure 4 and configured to share audio and/or video contents bi-directionally and in real time with at least second interaction means 6, used by at least one spectator entity 3, the first interaction means 5 comprising a plurality of acquisition groups 7, each configured to acquire the image and/or audio of at least one actor entity 2 and/or of at least one piece of multimedia data shared during the communication with at least the second interaction means 6;
- a direction and control device 8, mounted on the supporting structure 4 and configured to selectively activate the sharing of the image and/or audio acquired by each of the acquisition groups 7 by the first interaction means 5.
Specifically, the direction and control device 8 is configured to select images and/or audio, preferably acquired by the first interaction means 5, to be shared with the second interaction means 6. The second interaction means 6 are thus configured to play back to the spectator entity the images and/or audio that the direction and control device 8 has selected and shared.
This configuration allows the animate actor entity 2 to make a real-time presentation aimed at one or more spectator entities 3, in order to advertise and/or to sell an inanimate actor entity 2.
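Purely by way of illustration, the selection behaviour just described, in which the direction and control device 8 decides which acquisition groups' feeds the first interaction means 5 forward to the second interaction means 6, can be sketched as follows; all class, method and field names are assumptions of this sketch and do not appear in the patent.

```python
# Illustrative sketch only: not taken from the patent. Models the direction and
# control device (8) selecting which acquisition groups' (7) feeds the first
# interaction means (5) share with the second interaction means (6).

from dataclasses import dataclass, field
from typing import Dict, List


@dataclass
class AcquisitionGroup:
    group_id: str          # e.g. "dynamic-9a", "dynamic-9b", "static-17a" (assumed labels)
    active: bool = False   # whether its feed is currently being shared

    def capture(self) -> str:
        # Placeholder for real audio/video capture.
        return f"feed-from-{self.group_id}"


@dataclass
class DirectionControlDevice:
    groups: Dict[str, AcquisitionGroup] = field(default_factory=dict)

    def register(self, group: AcquisitionGroup) -> None:
        self.groups[group.group_id] = group

    def select(self, *group_ids: str) -> None:
        # Selectively activate sharing: only the chosen groups' feeds are forwarded.
        for group in self.groups.values():
            group.active = group.group_id in group_ids

    def shared_feeds(self) -> List[str]:
        # What the first interaction means would transmit to the spectator side.
        return [group.capture() for group in self.groups.values() if group.active]


director = DirectionControlDevice()
for gid in ("dynamic-9a", "dynamic-9b", "static-17a"):
    director.register(AcquisitionGroup(gid))
director.select("dynamic-9a", "static-17a")   # the animate actor entity acts as "director"
print(director.shared_feeds())                # ['feed-from-dynamic-9a', 'feed-from-static-17a']
```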
Usefully, the first interaction means 5 comprise at least one dynamic audio and/or video acquisition group 9a, 9b provided with at least one of:
- a dynamic video acquisition appliance 10a, 10b, configured to acquire the image of the at least one actor entity 2,
- a dynamic audio acquisition appliance 10a, 10b, configured to acquire the audio of the at least one actor entity 2, and
- movement means 11a, 11b of the acquisition appliances 10a, 10b, configured to move at least one of the acquisition appliances 10a, 10b at least in rotation and/or along a direction of movement a1, a2.
In this way, it is possible to acquire contents from multiple actor entities 2 arranged in different locations, as well as being able to vary the image acquisition framing.
Preferably, the dynamic video acquisition appliance 10a, 10b is of the type of an electronic video acquisition device, such as e.g. a camera, a webcam or the like. The dynamic audio acquisition appliance 10a, 10b is preferably of the type of an electronic audio acquisition device, such as e.g. a microphone or the like.
Alternative embodiments of the interface device 1 cannot, however, be ruled out, wherein the video acquisition appliance 10a, 10b and the audio acquisition appliance 10a, 10b are combined in a single electronic device configured for audio and video acquisition.
As shown in Figure 2, advantageously, the interface device 1 comprises a plurality of dynamic audio and/or video acquisition groups 9a, 9b.
Appropriately, the movement means 11a, 11b of at least one of the acquisition groups are configured to move the relevant acquisition appliance 10a, 10b independently of the movement means 11a, 11b of at least another of the acquisition groups 9a, 9b. In particular, at least one of the dynamic acquisition appliances 10a of at least one of the dynamic acquisition groups 9a is moved in a direction of movement a1 which is transverse to the direction of movement a2 of at least one of the dynamic acquisition appliances 10b of at least another of the dynamic acquisition groups 9b.
In the present case, the movement means 11a of one of the dynamic acquisition groups 9a are configured to move the relevant dynamic acquisition appliance 10a along a first direction of movement a1. The movement means 11b of another of the dynamic acquisition groups 9b are configured to move the relevant dynamic acquisition appliance 10b along a second direction of movement a2. The first direction of movement a1 is transverse to the second direction of movement a2. Preferably, the first direction of movement a1 is substantially perpendicular to the second direction of movement a2.
In this way, the actor entity 2 can be shot from different angles and the movement of the actor entity 2 can be followed within the exhibition area.
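As a minimal sketch, assuming simple linear actuators, the independent movement means 11a, 11b driving the two dynamic acquisition appliances along perpendicular directions a1 (horizontal) and a2 (vertical) could be modelled as below; the names and units are illustrative only.

```python
# Illustrative sketch only: two independent movement means, one per dynamic
# acquisition group, each translating its appliance along its own direction
# of movement (a1 horizontal, a2 vertical) and optionally rotating it.

from dataclasses import dataclass


@dataclass
class MovementMeans:
    axis: str                 # "a1" or "a2" (assumed labels)
    position_mm: float = 0.0  # travel along the direction of movement
    pan_deg: float = 0.0      # rotation of the acquisition appliance

    def translate(self, delta_mm: float) -> None:
        # Move only along this group's own direction of movement.
        self.position_mm += delta_mm

    def rotate(self, delta_deg: float) -> None:
        # Keep the actor entity in frame while it moves in the exhibition area.
        self.pan_deg = (self.pan_deg + delta_deg) % 360.0


mount_9a = MovementMeans(axis="a1")   # on the upper surface 12, horizontal travel
mount_9b = MovementMeans(axis="a2")   # on the back surface 14, vertical travel

mount_9a.translate(150.0)   # follow the actor entity sideways
mount_9b.translate(-80.0)   # lower the shooting height, independently of mount_9a
mount_9a.rotate(10.0)
print(mount_9a)
print(mount_9b)
```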
In one or more versions, the supporting structure 4 has a substantially parallelepiped shape, wherein at least one upper surface 12 is identified from which at least one front surface 13 and one back surface 14 project, which are substantially parallel to each other and perpendicular to the upper surface 12.
In detail, in use, the upper surface 12 is arranged substantially horizontally, while the front and back surfaces 13, 14 are arranged substantially vertically.
In the context of this disclosure, the terms “upper” and “lower”, as well as the adjectives “front”, “rear”, “vertical” and “horizontal”, employed with reference to the interface device 1 are meant to refer to the conditions under which the interface device 1 is normally used, that is, those in which it is employed by an animate actor entity 2 and is placed resting on the ground.
Preferably, a dynamic acquisition group 9a is arranged on the upper surface 12 of the supporting structure 4 and is movable along the direction of movement a1, preferably substantially horizontal.
Another dynamic acquisition group 9b is arranged on the back surface 14 of the supporting structure 4 and is movable along the direction of movement a2, preferably substantially vertical.
Conveniently, the interface device 1 comprises at least one playback group 15a, 15b provided with at least one audio and/or video playback device 16a, 16b, configured to play back the audio and/or video shared by the first interaction means 5 and/or by the second interaction means 6 during communication.
Advantageously, the audio and/or video playback device 16a, 16b is configured to play substantially in real time the audio and/or video shared by the first interaction means 5 and/or by the second interaction means 6 during communication.
Such a characteristic allows the animate actor entity 2 to make a live exhibition that is customizable according to, for example, the requests of the spectator entity 3.
Preferably, the audio and/or video playback device 16a, 16b is of the type of a display provided with audio speakers, such as e.g. a television set, personal computer, smart phone, tablet and the like.
Further embodiments of the interface device 1 cannot, however, be ruled out, wherein the audio and/or video playback device 16a, 16b may comprise respective video playback elements and audio playback elements possibly operatively connected to each other.
Advantageously, the playback group 15a, 15b is provided with at least one static audio and/or video acquisition appliance 17a, 17b, built into the audio and/or video playback device 16a, 16b.
It cannot, however, be ruled out that the interface device 1 may comprise one or more static acquisition appliances 17a, 17b separate from the playback group 15a, 15b.
Specifically, the static acquisition appliance 17a, 17b is configured to acquire the images and/or audio of the animate actor entity 2, e.g. while using the direction and control device 8.
In this way, the animate actor entity 2 and the spectator entity 3 can communicate with each other at a distance from their respective environments. Usefully, the static audio and/or video acquisition appliances 17a, 17b are comprised in the first interaction means 5 and are selectively activated by the direction and control device 8, similarly to what is described with reference to the acquisition groups 7.
In other words, the direction and control device 8 is configured to selectively activate the sharing of the image and/or audio acquired by each of the acquisition groups 7 and by one or more static acquisition appliances 17a, 17b.
Specifically, the movement means 11a, 11b are configured to move at least one of the dynamic acquisition appliances 10a, 10b independently of the playback group 15a, 15b and/or of the static acquisition appliance 17a, 17b. In this way, while the static acquisition appliance 17a, 17b is configured to acquire the images and/or audio of the animate actor entity 2, e.g. while using the direction and control device 8, the dynamic acquisition appliances 10a, 10b are configured to acquire the images and/or audio of an exhibition area and/or of another animate actor entity 2. At the same time, the animate actor entity 2 uses the direction and control device 8 to selectively activate or deactivate each of the acquisition groups 7 and one or more static acquisition appliances 17a, 17b.
In one or more versions, shown in Figure 1, the interface device 1 comprises a pair of playback groups 15a, 15b arranged one on top of the other.
Preferably, the playback groups 15a, 15b are mounted on the front surface 13 of the supporting structure 4.
Usefully, the direction and control device 8 is configured to simultaneously activate the playback of the acquired image and/or audio from two or more acquisition groups 7. Specifically, the direction and control device 8 is configured to share with the second interaction means 6 the images and/or audio acquired by two or more acquisition groups 7 at the same time.
This expedient allows the actor entity 2 to decide the number and the type of images and/or sounds to which the spectator entity 3 is subjected.
In other words, the interface device 1 allows the animate actor entity 2 to play back any image and/or audio acquired by the acquisition groups 7 in combination with each other.
In this way, the animate actor entity 2 takes on the role of “director” who decides, as needed, the number, type and mode of playback of images and/or sounds to which the spectator entity 3 is subjected.
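The simultaneous sharing of two or more sources described above could, for instance, be reduced to a small compositing step on the booth side; the side-by-side layout below is only an assumption, since the text does not prescribe how combined feeds are presented.

```python
# Illustrative sketch only: combining the feeds of two or more acquisition groups
# into the single output shared with the second interaction means. The layout is
# an assumption; the patent does not specify how simultaneous feeds are presented.

from typing import List


def compose_shared_output(selected_feeds: List[str]) -> str:
    """Describe what the second interaction means would play back."""
    if not selected_feeds:
        return "no source selected"
    if len(selected_feeds) == 1:
        return f"full screen: {selected_feeds[0]}"
    return "split view: " + " | ".join(selected_feeds)


print(compose_shared_output(["feed-from-dynamic-9a", "feed-from-static-17a"]))
# -> split view: feed-from-dynamic-9a | feed-from-static-17a
```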
Appropriately, the direction and control device 8 is operatively connected to each of the acquisition groups 7 and to at least one of the audio/video playback devices. Preferably, the direction and control device 8 is operatively connected to each of the acquisition groups 7 and to each of the audio/video playback devices. Additionally, the direction and control device 8 is configured to share the images and/or audio acquired by at least one of the acquisition groups 7 with the second interaction means 6.
The direction and control device 8 is then provided with a control appliance adapted to be used by the animate actor entity 2 to select one or more of the acquisition groups 7 by which to acquire images and/or video. The control appliance is also configured to allow the animate actor entity 2 to select the images and/or audio to be shared with the second interaction means 6. The second interaction means 6 are thus configured to play back the images and/or audio selected by the animate actor entity 2 by means of the control appliance.
In this way, the direction and control device 8 is employed by the actor entity 2 to show selectively to the spectator entity 3 the image and/or audio acquired by each of the acquisition groups 7.
Preferably, the direction and control device 8 is of the type of an electronic control device, such as e.g. a personal computer, PLC, microcontroller, smart phone, tablet or the like.
Preferably, the direction and control device 8 comprises at least one hand-held control appliance, e.g. a remote control or the like, employed by the animate actor entity 2 to remotely control the direction and control device 8. In this way, the animate actor entity 2 is free to control the interface device 1 while moving easily within the exhibition area.
The hand-held control appliance is adapted to command the direction and control device 8 so that the animate actor entity 2 is enabled to use the direction and control device 8 remotely.
In other words, the hand-held control appliance is configured to perform one or more of the functions carried out by the direction and control device 8.
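A hand-held control appliance of this kind could, for example, simply serialize commands that the direction and control device 8 then executes; the message format and command names below are assumptions of this sketch.

```python
# Illustrative sketch only: a hand-held control appliance serializing commands that
# the direction and control device (8) interprets. Command names and the JSON
# message format are assumptions of this sketch.

import json
from typing import Optional

COMMANDS = {"select", "deselect", "move", "rotate"}


def encode_remote_command(command: str, target: str, value: Optional[float] = None) -> bytes:
    # The remote only serializes a request; it does not perform the function itself.
    if command not in COMMANDS:
        raise ValueError(f"unknown command: {command}")
    return json.dumps({"command": command, "target": target, "value": value}).encode()


def handle_remote_command(payload: bytes) -> str:
    # On the direction and control device: decode and carry out the requested function.
    message = json.loads(payload)
    return f"executing {message['command']} on {message['target']}"


print(handle_remote_command(encode_remote_command("select", "dynamic-9b")))
# -> executing select on dynamic-9b
```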
Usefully, the interface device 1 comprises connection means for connecting the first and the second interaction means 5, 6, configured to establish a full-duplex communication between the latter, so as to transmit the image and audio acquired by the first interaction means 5 to the second interaction means 6 and vice versa bi-directionally, simultaneously and substantially in real time.
Preferably, the connection means are of the type of wireless connection means configured to establish communication between the first and the second interaction means 5, 6 by means of a telecommunication network, such as e.g. the Internet or the like.
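The full-duplex behaviour of the connection means, with both interaction means transmitting and receiving simultaneously and substantially in real time, can be illustrated with the following sketch, in which two in-process queues stand in for the wireless network; the transport is an assumption, not the patent's design.

```python
# Illustrative sketch only: full-duplex exchange between the first and the second
# interaction means, simulated with two in-process queues in place of the wireless
# network. Sending and receiving run concurrently at both endpoints.

import asyncio


async def endpoint(name: str, outgoing: asyncio.Queue, incoming: asyncio.Queue) -> None:
    async def send() -> None:
        for i in range(3):
            await outgoing.put(f"{name}: audio/video packet {i}")
            await asyncio.sleep(0.01)

    async def receive() -> None:
        for _ in range(3):
            packet = await incoming.get()
            print(f"{name} plays back -> {packet}")

    # Bi-directional and simultaneous: transmit and play back at the same time.
    await asyncio.gather(send(), receive())


async def main() -> None:
    booth_to_spectator: asyncio.Queue = asyncio.Queue()
    spectator_to_booth: asyncio.Queue = asyncio.Queue()
    await asyncio.gather(
        endpoint("first interaction means", booth_to_spectator, spectator_to_booth),
        endpoint("second interaction means", spectator_to_booth, booth_to_spectator),
    )


asyncio.run(main())
```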
Further embodiments of the interface device 1 cannot, however, be ruled out, in which the connection means are of the wired or at least partly wired type. Usefully, the interface device 1 may comprise displacement means 18 which are configured to move the supporting structure 4.
In this way, the interface device 1 can be moved within the exhibition area allowing the animate actor entity 2 to acquire images and/or audio of other actor entities 2 to share with the spectator entity 3.
Preferably, the displacement means 18 comprise one or more wheels 19 mounted on the supporting structure 4.
In one or more versions, the displacement means 18 can be controlled remotely, e.g. by the animate actor entity 2.
Preferably, the first interaction means 5 are powered by connection to a power network.
Usefully, the interface device 1 may comprise an uninterruptible power supply unit, configured to power at least the first interaction means 5 when they are disconnected from the power network.
This expedient allows promoting the displacement of the interface device 1 within the exhibition area while keeping the first interaction means 5 in operation.
In addition, in the case of exhibition areas provided with several rooms, it is possible to move the interface device 1 between the different rooms to show the spectator entity 3 additional actor entities 2 occupying the different rooms.
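A minimal sketch of the power handling described above, assuming a simple fallback rule: the first interaction means 5 draw power from the power network when connected and from the uninterruptible power supply unit during displacement.

```python
# Illustrative sketch only: the first interaction means draw power from the power
# network when connected and fall back to the uninterruptible power supply unit
# while the supporting structure is being moved between rooms.

class PowerSupply:
    def __init__(self, ups_capacity_wh: float) -> None:
        self.mains_connected = True
        self.ups_capacity_wh = ups_capacity_wh

    def disconnect_mains(self) -> None:
        # e.g. just before displacing the interface device to another room
        self.mains_connected = False

    def reconnect_mains(self) -> None:
        self.mains_connected = True

    def active_source(self) -> str:
        if self.mains_connected:
            return "power network"
        return "uninterruptible power supply" if self.ups_capacity_wh > 0 else "none"


psu = PowerSupply(ups_capacity_wh=300.0)
psu.disconnect_mains()      # the first interaction means keep operating during the move
print(psu.active_source())  # -> uninterruptible power supply
```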
As shown in Figure 3, the first interaction means 5 comprise input means 20 configured to at least partly command the first interaction means 5.
Usefully, the interface device 1 comprises a table top 21 intended to support the input means 20.
In addition, the interface device 1 may comprise a desk 22.
In this way, the interface device 1 can be used in an upright configuration, in which the actor entity 2 is in a standing position, or in a sitting configuration, in which the desk 22 is arranged coplanar with the table top 21 and the actor entity 2 is in a sitting position.
For this purpose, the table top 21 is placed at a greater height than the desk 22, so that the desk 22 can be placed below the table top 21, with the latter resting on the desk 22.
One object of the present invention is an interface system 23 for trade show events and/or virtual exhibitions, comprising at least one interface device 1 and interface means 24 configured to:
- operatively link at least the first interaction means 5 to at least the second interaction means 6, and/or
- share at least one characteristic piece of data of the interface device 1 with at least the second interaction means 6.
Specifically, the characteristic piece of data comprises at least one piece of information selected from the list comprising at least one of:
- the number of acquisition groups 7 comprised in the interface device 1;
- which of either the dynamic acquisition groups 9a, 9b, the playback groups 15a, 15b or the displacement means 18 are present in the interface device 1;
- the personal data of the promoter entity;
- the data identifying the type of actor entity 2;
- a temporal piece of data defining the beginning of the sharing of audio and/or video contents by the first interaction means 5;
- the number of spectator entities 3 linked with the first interaction means 5.
In detail, in the event of the actor entity 2 being a product, the data identifying the type of actor entity 2 may refer to the product’s commodity class, the product type, whether the product belongs to a specific collection, such as e.g. in the case of clothing products, etc.
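For illustration, the characteristic piece of data listed above could be carried as a single record such as the one below; the field names and example values are assumptions, not part of the patent.

```python
# Illustrative sketch only: one possible record for the characteristic piece of data
# shared by the interface means (24). Field names and example values are assumed.

from dataclasses import dataclass
from datetime import datetime
from typing import Optional


@dataclass
class DeviceCharacteristics:
    acquisition_group_count: int
    has_dynamic_groups: bool
    has_playback_groups: bool
    has_displacement_means: bool
    promoter_data: str
    actor_entity_type: str                 # commodity class, product type, collection, ...
    sharing_start: Optional[datetime] = None
    linked_spectators: int = 0


booth = DeviceCharacteristics(
    acquisition_group_count=3,
    has_dynamic_groups=True,
    has_playback_groups=True,
    has_displacement_means=True,
    promoter_data="Example Exhibitor S.r.l.",
    actor_entity_type="clothing - autumn collection",
    sharing_start=datetime(2022, 12, 14, 10, 0),
)
print(booth)
```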
Preferably, the system 23 may also comprise the second interaction means 6 arranged in an environment different from the exhibition area and configured to acquire the image and/or audio of the at least one spectator entity 3.
In addition, the second interaction means 6 are configured to play back the image and/or audio acquired by the acquisition groups 7 of the first interaction means 5.
Conveniently, the second interaction means comprise at least one device for acquiring the audio and/or video of the at least one spectator entity 3.
In one or more versions, shown in Figure 4, the system 23 comprises a plurality of interface devices 1.
Specifically, the interface devices 1 are intended to be placed in different exhibition areas to allow different actor entities 2 to share at least audio and/or video contents with at least one spectator entity 3.
Usefully, the interface means 24 are configured to:
- share with the spectator entity 3 the characteristic data of each interface device 1 of the plurality of interface devices 1 to which the spectator entity 3 can link with the second interaction means 6; and/or
- share requests for participation in the audio and/or video content sharing by the first interaction means 5; and/or
- receive from the spectator entity 3 a response to the requests for participation; and/or
- on the basis of the response, selectively put the second interaction means 6 in communication with the first interaction means 5 of at least one of the plurality of interface devices 1.
In this way, the interface means 24 allow a spectator entity 3 to have the necessary information to choose which interface device 1 to connect to in order to enjoy the audio and/or video contents shared by the relevant first interaction means 5.
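The brokering role of the interface means 24, namely publishing each device's characteristic data, collecting the spectator's response to a participation request and then linking the chosen first interaction means 5 with the second interaction means 6, might be sketched as follows, with all identifiers assumed.

```python
# Illustrative sketch only: the interface means (24) publishing characteristic data,
# handling a spectator's response to a participation request and linking the chosen
# device's first interaction means with that spectator's second interaction means.

from typing import Dict


class InterfaceMeans:
    def __init__(self) -> None:
        self.devices: Dict[str, dict] = {}   # device_id -> characteristic data
        self.links: Dict[str, str] = {}      # spectator_id -> device_id

    def publish_device(self, device_id: str, characteristics: dict) -> None:
        self.devices[device_id] = characteristics

    def catalogue(self) -> Dict[str, dict]:
        # Shared with the spectator entity so it can choose a device to connect to.
        return dict(self.devices)

    def handle_participation_response(self, spectator_id: str, device_id: str) -> str:
        if device_id not in self.devices:
            raise KeyError(f"unknown interface device: {device_id}")
        self.links[spectator_id] = device_id
        return (f"second interaction means of {spectator_id} put in communication "
                f"with first interaction means of {device_id}")


broker = InterfaceMeans()
broker.publish_device("device-1", {"actor_entity_type": "furniture"})
broker.publish_device("device-2", {"actor_entity_type": "clothing"})
print(broker.catalogue())
print(broker.handle_participation_response("spectator-3", "device-2"))
```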
The animate actor entity 2 may thus share with the spectator entity 3 presentations and/or exhibitions regarding the inanimate actor entity 2 promoted by the promoter entity in order to advertise and/or sell it.
It has in practice been ascertained that the described invention achieves the intended objects, and in particular, the fact is emphasized that through the interface device, it is possible to stage virtual trade shows in a practical and fast manner, thus reducing the costs involved in setting up a trade show.
In addition, the interface device allows images of different actor entities to be selectively shown to an interlocutor in a clear manner and/or different views of the same entity and/or images provided by external multimedia sources, so that real-time exhibitions can be made.

Claims

1) Interface device (1) for trade show events and/or virtual exhibitions comprising:
- a supporting structure (4), which can be placed in an exhibition area which can be occupied by at least one actor entity (2) under a promoter entity;
- first interaction means (5), mounted on said supporting structure (4) and configured to share audio and/or video contents bi-directionally and in real time with at least second interaction means (6), used by at least one spectator entity (3), said first interaction means (5) comprising a plurality of acquisition groups (7), each configured to acquire the image and/or audio of said at least one actor entity (2) and/or of said at least one piece of multimedia data shared during said communication with said at least second interaction means (6);
- a direction and control device (8), mounted on said supporting structure (4) and configured to selectively activate the sharing of the image and/or audio acquired by each of said acquisition groups (7) by said first interaction means (5).
2) Device (1) according to claim 1, characterized by the fact that said first interaction means (5) comprise at least one dynamic audio and/or video acquisition group (9a, 9b) provided with at least one of:
- a dynamic video acquisition appliance (10a, 10b), configured to acquire the image of said at least one actor entity (2);
- a dynamic audio acquisition appliance (10a, 10b), configured to acquire the audio of said at least one actor entity (2);
- movement means (11a, 11b) of said dynamic acquisition appliances (10a, 10b), configured to move at least one of said dynamic acquisition appliances (10a, 10b) at least in rotation and/or along a direction of movement (a1, a2).
3) Device (1) according to claim 2, characterized by the fact that it comprises a plurality of dynamic audio and/or video acquisition groups (9a, 9b), at least one of said dynamic acquisition appliances (10a) of at least one of said acquisition groups (9a) being moved in a direction of movement (a1) transverse to the direction of movement (a2) of at least one of said dynamic acquisition appliances (10b) of at least another of said dynamic acquisition groups (9b).
4) Device (1) according to one or more of the preceding claims, characterized by the fact that it comprises at least one playback group (15a, 15b), provided with at least one audio/video playback device (16a, 16b) configured to play the audio and/or video shared by said first and/or said second interaction means (6) during said communication.
5) Device (1) according to claim 4, characterized by the fact that said playback group (15a, 15b) is provided with a static audio and/or video acquisition appliance (17a, 17b), built in said playback device (16a, 16b).
6) Device (1) according to claim 4 or 5, characterized by the fact that it comprises at least one pair of playback groups (15a, 15b), arranged one on top of the other.
7) Device (1) according to one or more of the preceding claims, characterized by the fact that it comprises displacement means (18) configured to move said supporting structure (4).
8) Device (1) according to one or more of the preceding claims, characterized by the fact that it comprises at least one uninterruptible power supply unit, configured to power at least said first interaction means (5) when they are disconnected from the power grid.
9) Interface system (23) for trade show events and/or virtual exhibitions, characterized by the fact that it comprises at least one interface device (1) according to one or more of the preceding claims and interface means (24) configured to:
- operatively link at least said first interaction means (5) to at least said second interaction means (6), and/or
- share at least one characteristic piece of data of said interface device (1) with at least said second interaction means (6), said characteristic piece of data comprising at least one piece of information selected from the list comprising at least one of:
- the number of said acquisition groups (7) comprised in said interface device (1);
- which of either said dynamic acquisition groups (9a, 9b), said playback groups (15a, 15b) or said displacement means (18) are present in said interface device (1);
- the personal data of said promoter entity;
- the data identifying the type of said actor entity (2);
- a temporal piece of data defining the beginning of said sharing of audio and/or video contents by said first interaction means (5);
- the number of said spectator entities (3) linked with said first interaction means (5).
10) System (23) according to claim 9, characterized by the fact that it comprises a plurality of said interface devices (1), said interface means (24) being configured to:
- share with said spectator entity (3) said characteristic data of each of said plurality of interface devices (1) to which said spectator entity (3) can link with said second interaction means (6);
- share requests for participation in said audio and/or video content sharing by said first interaction means (5);
- receive from said spectator entity (3) a response to said requests for participation;
- on the basis of said response, selectively put said second interaction means (6) in communication with said first interaction means (5) of at least one of said plurality of interface devices (1).
PCT/IB2022/062226 2021-12-14 2022-12-14 Interface device for trade show events and/or virtual exhibitions WO2023111902A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
IT102021000031355 2021-12-14
IT102021000031355A IT202100031355A1 (en) 2021-12-14 2021-12-14 INTERFACE DEVICE FOR TRADE EVENTS AND/OR VIRTUAL EXHIBITIONS

Publications (1)

Publication Number Publication Date
WO2023111902A1 true WO2023111902A1 (en) 2023-06-22

Family

ID=80462065

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2022/062226 WO2023111902A1 (en) 2021-12-14 2022-12-14 Interface device for trade show events and/or virtual exhibitions

Country Status (2)

Country Link
IT (1) IT202100031355A1 (en)
WO (1) WO2023111902A1 (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO1998051078A1 (en) * 1997-05-07 1998-11-12 Telbotics Inc. Teleconferencing robot with swiveling video monitor
ES2395376A1 (en) * 2011-03-15 2013-02-12 Eulen, S.A. System of an interactive communication between a user and a service center. (Machine-translation by Google Translate, not legally binding)
JP2017050018A (en) * 2010-12-30 2017-03-09 アイロボット コーポレイション Movable robot system
WO2017086815A1 (en) * 2015-11-17 2017-05-26 S.L.Invest Sebastian Łabanowicz Interactive sale stand, system of interactive sale stands, and interactive sale communication method
KR20210009415A (en) * 2021-01-19 2021-01-26 홍승경 A kiosk that displays a vr image by rotating 360 degrees horizontally

Also Published As

Publication number Publication date
IT202100031355A1 (en) 2023-06-14

Similar Documents

Publication Publication Date Title
US10178343B2 (en) Method and apparatus for interactive two-way visualization using simultaneously recorded and projected video streams
US20170223312A1 (en) Communication stage and related systems
US11356639B1 (en) System and method for performing immersive audio-visual communications
US9402051B2 (en) Apparatus and method for simultaneous live recording through and projecting live video images onto an interactive touch screen
CA2371501A1 (en) Simulation of attendance at a live event
US10015444B1 (en) Network architecture for immersive audio-visual communications by temporary communication structures
US20060238724A1 (en) Portable projection device
US9961301B1 (en) Modular communications systems and methods therefore
US20060100930A1 (en) Method and system for advertising
US20070166671A1 (en) Display device
JP2018022480A (en) Box type live camera net direct sales system
US8384757B2 (en) System and method for providing videoconferencing among a plurality of locations
WO2023111902A1 (en) Interface device for trade show events and/or virtual exhibitions
US11119721B1 (en) Visual display system
CN202584214U (en) An interactive displaying sale and control system
US20120154256A1 (en) Visual Display System
CA2853996A1 (en) Outdoor advertising display case
CN111182337B (en) Commodity video display method and system
CN202408033U (en) Suspension imaging display cabinet
US20140160296A1 (en) Distance mirror television (dmtv) apparatus and method thereof
RU2606638C2 (en) System for interactive video access of users to exposure in real time
US20140129263A1 (en) Providing a Virtual Tour
Teerds La Casa Telematica Milan (IT): Ugo La Pietra
CN213028323U (en) Virtual conferencing device
RO129491B1 (en) System and method for remote display of an exhibit in real time

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22839894

Country of ref document: EP

Kind code of ref document: A1