WO2021053605A1 - Computerised system to provide assistance to a person, in particular to a blind or visually impaired person - Google Patents


Info

Publication number: WO2021053605A1
Application number: PCT/IB2020/058719
Authority: WO - WIPO (PCT)
Other languages: French (fr)
Inventor: Alessandra PROTO
Original Assignee: Proto Alessandra
Application filed by Proto Alessandra
Publication of WO2021053605A1
Prior art keywords: user, computerized, computerized device, artificial intelligence, data

Classifications

    • G: PHYSICS
    • G16: INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H: HEALTHCARE INFORMATICS, i.e. ICT SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H 40/63: ICT specially adapted for the management or operation of medical equipment or devices, for local operation
    • G02B: OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B 27/01: Head-up displays
    • G02B 27/017: Head-up displays, head mounted
    • G02B 2027/0138: Head-up displays characterised by optical features comprising image capture systems, e.g. camera
    • G02B 2027/014: Head-up displays characterised by optical features comprising information/image processing systems
    • G02B 2027/0178: Head mounted displays of eyeglass type

Definitions

  • the present invention refers to a computerized system for providing assistance to a person, in particular to a blind or visually impaired person.
  • the computerized system comprises a wearable computerised device and a computerized platform equipped with an artificial intelligence unit configured to execute automatic learning algorithms and to interact, through the aforementioned computerized wearable device, with a person in order to assist them in carrying out a certain activity.
  • a blind or partially sighted person often needs help when they have to interact with the surrounding environment (for example, moving around the house or on the street). Often, the necessary assistance is provided by a companion or a guide dog. In other cases, the blind or visually impaired person uses appropriate hand tools, such as a cane for orientation or an electronic obstacle detector (e.g. of the vibrating type).
  • known wearable electronic devices are inconvenient to use, as they generally include several electronic modules (typically installed manually by the user) intended to communicate and interact with a mobile computerized device, for example a smartphone, which in turn is connectable to a remote computerized platform.
  • the main task of the present invention is to provide a computerized system able to overcome the limitations of the prior art, highlighted above.
  • one purpose of the present invention is to provide a computerized system able to give assistance to a person, for example to a blind or visually impaired person, in carrying out a vast range of activities, and in general, interact with the surrounding environment on a daily basis.
  • a further aim of the invention is to provide a computerized system able to give assistance to a person in dealing with potential unforeseen situations, whilst carrying out a certain activity.
  • a further aim of the invention is to provide a computerized system that includes a wearable computerized device that has a very compact overall structure and is of simple and immediate practical use.
  • a further aim of the present invention is to provide a computerized system which is easily achievable at an industrial level.
  • the computerized system comprises at least one computerized device provided with a support structure shaped or arranged in such a way as to be worn by a user (for example a blind or partially sighted person).
  • This support structure is configured to house or sustain one or more components of the aforementioned computerised device.
  • the support structure is shaped in such a way as to be wearable by a user at the face or head level.
  • such support structure is shaped like a spectacle frame.
  • the wearable computerized device comprises a control unit configured to control its functioning.
  • Such control unit is at least partially located within or fixed to the aforementioned support structure.
  • said control unit comprises a removable memory.
  • the wearable computerized device is provided with sensor means adapted to detect data or images relating to the user's environment.
  • sensor means are at least partially housed in or fixed to the aforementioned support structure and are operatively connected to the control unit.
  • sensor means comprise one or more cameras, for example infrared cameras with motion sensors.
  • Such cameras are arranged so as to cover a horizontal field of view of at least 240°.
  • the wearable computerized device is equipped with user interface means adapted to allow the user to send or receive sound signals, preferably voice signals.
  • Said user interface means are at least partially located within or fixed to the said support structure and are operationally connected to the control unit.
  • the aforementioned user interface means comprise one or more earphones with microphone.
  • the wearable computerized device is provided with wireless communication means at least partially located within or fixed to the aforementioned support structure and are operationally connected to said control unit.
  • the aforementioned communication means comprise at least one communication module for high-speed communications, preferably a communication module for high-speed communications through the telephone network.
  • said wireless communication means also comprise one or more communication modules for communications at a local level.
  • said wireless communication means also comprise one or more communication modules for communicating with a position detecting system at satellite level.
  • the wearable computerized device is provided with power supply means for one or more components of said computerized device.
  • the power supply means are at least partially housed in or fixed to the support structure.
  • said power supply means include a battery operatively connected to an electronic interface circuit, in turn operatively connected to one or more components of the said computerized device.
  • the wearable computerized device comprises one or more micro-displays for viewing images.
  • these micro-displays are fixed to the aforementioned support structure.
  • the wearable computerized device comprises activation means that can be operated by the user to activate (and possibly deactivate) said computerized device.
  • the computerized system includes a remote computerized platform configured to communicate with the wearable computerized device.
  • This computerized platform includes an artificial intelligence unit configured to execute automatic learning algorithms (by way of example, deep reinforcement learning algorithms, such as Deep Q-Learning algorithms), to interact with the user through said computerized device and to interact with one or more application units external or internal to said computerized platform.
  • the aforementioned computerized platform also includes a database configured to store historical data including data provided by at least one computerized wearable device or by other computerized platforms or data processed by said artificial intelligence unit.
  • the artificial intelligence unit of the aforementioned computerized platform is configured to execute an interaction procedure in real time with the user of the said computerized wearable device.
  • This interaction procedure includes a first sequence of steps which includes:
  • said first processing activity includes one or more of the following activities:
  • an application unit capable of executing said first data processing activity and acquiring the data provided by said application unit during the execution of said first data processing activity.
  • the selection of the most suitable application unit can be carried out based on performance data indicative of the performances of the application units.
  • the artificial intelligence unit can execute suitable algorithms for analysing the aforesaid performance data.
  • the aforementioned interaction procedure includes a continuous acquisition of additional input data indicative of a condition of the user or a condition of the environment surrounding the user, through said computerized device.
  • this continuous acquisition of further input data can take place during any step of the aforementioned interaction procedure, in particular during the step in which said artificial intelligence unit interacts with the user, through said computerized device.
  • said interaction procedure with the user of said computerized device includes at least a second sequence of steps which includes:
  • the second data processing activity comprises one or more of the following activities:
  • this further sequence of steps is performed in parallel with a step of the first sequence of steps of the interaction procedure, described previously.
  • said artificial intelligence unit is configured to execute a performance data acquisition procedure which includes the following steps:
  • said artificial intelligence unit is configured to execute an approach procedure with the user of said wearable computerized device which includes the following steps:
  • FIG. 1 schematically illustrates an embodiment of the computerized system according to the invention
  • FIG. 2 schematically illustrates a block diagram of the wearable computerized device included in the computerized system, according to the invention
  • FIG. 2A schematically illustrates an embodiment of the wearable computerized device included in the computerized system, according to the invention
  • FIG. 3 schematically illustrates a block diagram of the computerized platform included in the computerized system, according to the invention
  • FIG. 4-7 schematically illustrate the functioning of an artificial intelligence unit included in the computerized platform of figure 3 and interacting with the aforementioned wearable computerized device.
  • the present invention refers to a computerized system 500 adapted to provide assistance to a person in daily life and in carrying out certain activities.
  • the computerized system 500 is particularly suitable for providing assistance to blind or partially sighted people.
  • the computerized system 500 comprises at least one computerized device 1 wearable by a user.
  • the computerized device 1 comprises a support structure 2 shaped or predisposed so as to be wearable by a user (for example a blind or partially sighted person) at the face or head level.
  • the support structure 2 is shaped like a spectacle frame.
  • the support structure 2 comprises a pair of rods 2A (for example of the "golf" or "hedgehog" type) connected to a frontal portion 2B shaped, for example, as a mask.
  • Opaque or coloured protective shells, classic eyeglass lenses and/or micro-displays 11 for viewing images and/or information can be positioned in correspondence with the rims of the frame.
  • the support structure 2 is arranged to hold or support at least partially one or more components (for example electric or electronic components) of the computerized device.
  • the computerized device 1 comprises a control unit 6.
  • This control unit preferably includes means for digital data processing (for example one or more microprocessors) capable of executing suitable software instructions stored on a memory medium for implementing the functionalities of the control unit 6.
  • the control unit 6 is at least partially housed in or fixed to the support structure 2.
  • in addition to its own memory (for example of the RAM type), the control unit 6 also includes a removable memory 6A, for example a memory card which can be extracted through an appropriate opening (not shown) of the support structure 2.
  • the removable memory 6A can be temporarily removed from the computerized device 1 to be updated or replaced with another removable memory.
  • the computerized device 1 comprises sensor means 3 adapted to detect data or images relating to the environment surrounding the user.
  • the sensor means 3 comprise one or more cameras, for example a pair of lateral micro-cameras equipped with motion sensors and a central micro-camera (for example infrared) equipped with a microphone.
  • the cameras 3 are arranged so as to cover a horizontal field of view (i.e. measured along a plane parallel to the user's eyes) of 240° or larger to acquire the largest possible amount of information to assist the user.
  • the sensor means 3 are at least partially housed in or fixed to the support structure 2.
  • a lateral micro-camera is conveniently arranged in correspondence with each of the rods 2A while the central micro-camera is advantageously placed in correspondence with the front portion 2B, in practice in correspondence with the bridge of the frame 2.
  • the sensor means 3 are operatively connected to the control unit 6 by means of connection of a known type and they can exchange with this control unit suitable command signals and data signals (dashed lines in Figure 2).
  • the computerized device 1 comprises user interface means 4 adapted to allow the user to interact with the aforementioned computerized device, in particular with the control unit 6.
  • the user interface means 4 comprise one or more microphones and one or more earphones, arranged so that the user can send or receive voice signals.
  • the user interface means 4 are at least partially housed in or fixed to the support structure 2.
  • an earphone-microphone unit can be placed at the end portion of each of the rods 2A in a distal position with respect to the front portion 2B, so as to facilitate the sending and receiving of voice signals by the user.
  • the user interface means 4 are operationally connected to the control unit 6 by means of connection of a known type and can exchange with this control unit appropriate control signals and data signals (dashed lines in Figure 2).
  • the computerized device 1 comprises wireless communication means 9, 10A, 10B.
  • the aforementioned wireless communication means comprise at least one communication module 9 arranged to allow the computerized device 1, in particular the control unit 6, to communicate wirelessly with a remote computerized platform.
  • the communication module 9 is arranged to implement high-speed wireless communications.
  • high-speed communications refers to wireless communications compliant with the 4G or higher communication standards, for example with a data transmission speed (bit rate for each telephone cell) of at least 20 Gbps in download and at least 10 Gbps in upload and with a latency time of less than a few ms (for example, less than 4 ms).
  • the aforementioned term refers to wireless communications conforming to the 5G communication standard or higher.
  • the communication module 9 is predisposed to operate through the telephone network (data traffic via the Internet).
  • the aforementioned wireless communication means comprise one or more communication modules 10A for communications at local level.
  • each communication module 10A allows the computerized device 1, in particular the control unit 6, to communicate with a computerized device located nearby.
  • the communication modules 10A comprise at least a Bluetooth™ communication module and/or a WiFi™ communication module.
  • the aforementioned wireless communication means comprise one or more communication modules 10B for communicating with a position detection system of the computerized device 1 at satellite level.
  • the communication modules 10B comprise at least a communication module for communicating with a GPS (Global Positioning System) detection system.
  • the wireless communication means 9, 10A, 10B are at least partially housed in or fixed to the support structure 2.
  • a 5G communication module, a Bluetooth™ communication module, a WiFi™ communication module and a GPS communication module are advantageously predisposed in correspondence with the rods 2A.
  • the means of communication 9, 10A, 10B are operationally connected to the control unit 6 through connection means of a known type and they can exchange with the control unit suitable control signals and data signals (dashed lines in Figure 2).
  • the computerized device 1 comprises power supply means 7, 8 arranged to supply electrical power (continuous lines of Figure 2) to one or more components of said computerized device.
  • these power supply means include a battery 7 operatively connected to an electronic interface circuit 8, in turn connected to one or more components, for example to the control unit 6, to the sensor means 3, to the user interface means 4 and to the communication means 9, 10A, 10B.
  • the power supply means 7, 8 are at least partially housed in or fixed to the support structure 2.
  • a battery 7 and the relative electronic interface circuit 8 are advantageously placed in correspondence with one of the rods 2A.
  • the power supply means 7, 8 (in particular the electronic interface circuit 8) are operatively connected to the control unit 6 through connection means of known type and they can exchange with the control unit suitable control signals and data signals (dashed lines of figure 2).
  • the computerized device 1 comprises activation means 5 that can be operated by the user to activate the aforementioned computerized device.
  • the activation means 5 can comprise an ON/OFF activation key that can be operated manually by the user to activate or deactivate the computerized device 1, or a touch sensor arranged such that the computerized device 1 is activated or deactivated when worn or removed by the user.
  • the activation means 5 are at least partially housed in or fixed to the support structure 2.
  • the activation key or touch sensor 5 is advantageously placed in correspondence with one of the rods 2A.
  • the activation means 5 are operatively connected to the control unit 6 through connection means of a known type and they can exchange with this control unit suitable command signals and data signals (dashed lines in Figure 2).
  • the computerized device 1 could be achieved according to further embodiment variations. According to some embodiments, the computerized device 1 also includes one or more micro-displays 11 arranged to allow the user to view images.
  • the micro-displays 11 are predisposed to allow the visualisation of enlarged images of the surrounding environment or images of the surrounding environment according to processing methods such as "Augmented Reality" or "Virtual Reality".
  • the micro-displays 11 are fixed to the support structure 2 so that they can be observed by the user, once the computerized device 1 is worn.
  • the micro-displays 11 could be positioned in correspondence with the front portion 2B (in practice in correspondence with the rims) of the frame 2.
  • the micro-displays 11 are operatively connected to the control unit 6 through connection means of a known type and they can exchange with this control unit appropriate control signals and data signals (dashed lines in Figure 2).
  • the computerized device 1 could comprise further sensor means housed in or fixed to said support structure and operationally connected to said control unit.
  • further sensor means could include one or more position sensors or one or more sensors suitable for detecting physical quantities (for example temperature, humidity, presence of harmful substances, etc.) which characterize the environment surrounding the user.
  • many components or parts of the electronic device 1, for example the support structure 2, the control unit 6, the sensor means 3, the means for user interface 4, the communication means 9-10A-10B and optionally, the power supply means 7-8, the activation means 5 and the micro-displays 11, can be implemented at an industrial level according to known manufacturing techniques.
  • the computerized system 500 comprises a remote computerized platform 100.
  • This computerised platform is able to communicate in real time (normally through the Internet) with the wearable computerized device 1 (more precisely, with the related control unit 6).
  • the term “communicate in real time” refers to wireless communications with reduced latency time, for example less than a few ms.
  • the aforementioned term refers to wireless communications compliant with the 4G communication standard or higher.
  • the computerized platform 100 may include one or more computerized units 600, for example connected through the Internet and interacting with each other, for example in order to implement a cloud architecture.
  • the computerized platform 100 could include one or more computerized units equipped with an operating system to implement "server" type functions, for example Windows Server™, Windows Azure™, Mac OS Server™, etc.
  • the computerized platform 100 is able to interact with the computerized device 1 in a "server to client" type mode.
  • the computerized platform 100 includes an artificial intelligence unit 101.
  • the artificial intelligence unit 101 may include one or more computerized units (for example connected via LAN or Internet and interacting with each other) comprising means for digital data processing (for example one or more microprocessors) capable of carrying out suitable software instructions stored on a memory medium in order to implement the functionalities of the artificial intelligence unit 101.
  • the computerized platform 100 also includes a database 102 adapted to store data provided by the computerized device 1 or processed by the artificial intelligence unit 101.
  • the database 102 can include one or more memory units (for example connected through LAN or Internet and interacting with each other) managed by a corresponding computerized unit (not shown).
  • the artificial intelligence unit 101 is able to communicate and interact with the database 102 in known ways (for example through LAN or Internet).
  • the artificial intelligence unit 101 is also able to interact with one or more application units 150.
  • each application unit 150 is configured to execute a specific function, for example the calculation of a route or visual recognition.
  • Each application unit 150 can include one or more computerized units (for example connected through the Internet and interacting with each other) including means for digital data processing (for example one or more microprocessors) capable of executing appropriate software instructions stored on a memory support such as to implement the functionalities of the application unit 150.
  • the application units 150 could be part of the platform 100. In this case, they constitute data processing resources internal to the computerized platform 100 and usable by the artificial intelligence unit 101.
  • the application units 150 could be part of other computer platforms, for example computer platforms accessible from the computer platform 100 through LAN or the Internet.
  • the artificial intelligence unit 101 is able to communicate and interact with the application units 150 in known ways, usually through LAN or Internet through appropriate APIs (Application Programming Interfaces).
  • the artificial intelligence unit 101 is configured to also execute automatic learning algorithms (which can be of a known type, for example "Deep Q-Learning" algorithms).
  • Such automatic learning algorithms allow the artificial intelligence unit 101 to train and adapt to the requests of the user (blind or partially sighted person) of the computerized device 1 and to the environment with which the user interacts, progressively improving the choices of processing actions to be performed in order to assist the user.
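By way of illustration only, the following minimal Python sketch shows the tabular Q-learning update rule that underlies algorithms of the "Deep Q-Learning" family cited above; in Deep Q-Learning proper, a neural network approximates the Q table shown here. All sizes and hyperparameters are assumptions made for the example, not values from the patent.

    import numpy as np

    N_STATES, N_ACTIONS = 16, 4            # assumed toy problem sizes
    ALPHA, GAMMA, EPSILON = 0.1, 0.9, 0.2  # learning rate, discount, exploration

    Q = np.zeros((N_STATES, N_ACTIONS))    # expected return per (state, action)

    def choose_action(state: int) -> int:
        """Epsilon-greedy choice between exploring and exploiting."""
        if np.random.rand() < EPSILON:
            return int(np.random.randint(N_ACTIONS))
        return int(np.argmax(Q[state]))

    def q_update(state: int, action: int, reward: float, next_state: int) -> None:
        """One Q-learning step: move Q toward reward + discounted best future value."""
        target = reward + GAMMA * np.max(Q[next_state])
        Q[state, action] += ALPHA * (target - Q[state, action])

Each repetition of this update, driven by feedback on how well an assistance action served the user, is what lets such a unit improve its choices over time.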
  • the artificial intelligence unit 101 is therefore able to interact in real time with the user (through the computerized device 1) and to assist him in carrying out a certain activity or, more generally, in interacting with the surrounding environment.
  • the artificial intelligence unit 101 is able to assist the user by carrying out appropriate processing activities selected based on the user requests or other input data. Obviously, the execution of the aforementioned inferential tasks takes place based on the received training.
  • the artificial intelligence unit 101 is configured to execute in real time an interaction procedure 300 with the user (for example a blind or partially sighted person) in order to assist the latter in carrying out a certain activity.
  • the interaction procedure 300 is repeated cyclically by the artificial intelligence unit 101 every time the user intends to carry out a certain activity or in general, when it is necessary to respond to one of his requests.
  • the interaction procedure 300 is advantageously repeated by the artificial intelligence unit 101 also in case when the electronic device 1 automatically detects (for example through the sensor means 3) a certain condition of the user or a certain environmental condition, for example a dangerous situation.
  • the interaction procedure 300 includes a first sequence of steps which includes a step 301 in which the artificial intelligence unit 101 receives at least a first user request R1 or at least first input data D1 indicative of a user's condition or a condition of the environment surrounding the user.
  • the request R1 is advantageously sent by the user through a voice message that is collected by the user interface means 4 (microphone) and sent to the control unit 6.
  • the voice message is suitably processed by the control unit 6 and sent by the latter, through the remote communication module 9, to the computerized platform 100, more specifically to the artificial intelligence unit 101.
  • the first input data D1 may include data collected by the sensor means 3 (cameras) or by additional sensor means (for example position sensors) of the computerized device 1 and sent to the control unit 6.
  • the collected data are suitably processed by the control unit 6 and sent by the latter, through the remote communication module 9, to the computerized platform 100, more specifically to the artificial intelligence unit 101.
  • the first input data D1 may include data received from other computerized platforms (such as a GPS platform) in communication with the computerized platform 100 and then made available to the artificial intelligence unit 101.
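As a purely illustrative sketch of this flow, the Python fragment below shows how a control unit might package a voice request R1 or sensor data D1 before forwarding them to the platform; the message format and the send_to_platform() stand-in are assumptions, not details from the patent.

    import json
    import time
    from dataclasses import dataclass, asdict

    @dataclass
    class DeviceMessage:
        device_id: str
        kind: str        # "user_request" (R1) or "input_data" (D1)
        payload: dict
        timestamp: float

    def send_to_platform(message: DeviceMessage) -> None:
        # Stand-in for the high-speed communication module 9 (e.g. 5G uplink).
        print(json.dumps(asdict(message)))

    # A voice request picked up by the user interface means 4...
    send_to_platform(DeviceMessage("device-1", "user_request",
                                   {"text": "I want to go from place A to place B"},
                                   time.time()))
    # ...and input data from the cameras 3 or position sensors.
    send_to_platform(DeviceMessage("device-1", "input_data",
                                   {"gps": [45.46, 9.19], "frame_id": 1024},
                                   time.time()))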
  • the interaction procedure 300 comprises a step 302 in which the artificial intelligence unit 101 executes (preferably on the basis of the first user request R1 or the first input data D1 or both) a first data processing activity TASK1 to provide assistance to the user.
  • the artificial intelligence unit 101 determines the first processing activity TASK1 to be executed by executing suitable voice or visual recognition algorithms (which may be of a known type), interacting with an application unit 150 if necessary. This allows the artificial intelligence unit 101 to interpret the incoming request R1 and/or the input data D1 and determine the most suitable processing activity to assist the user based on this information.
  • the first data processing activity TASK1 comprises the execution, by the artificial intelligence unit 101, of suitable selection and processing algorithms (which may be of a known type) of historical data DS stored in the database 102 in order to provide the data D2 necessary to assist the user.
  • such historical data DS comprise training data generated and stored in the database during suitable training procedures executed by the artificial intelligence unit 101, performance data indicative of the functioning of the application units 150, data received from the computerized device 1 or data received from other computerized platforms 600.
  • the artificial intelligence unit 101 can retrieve and process all possible information already received and stored in order to optimise the results of the processing activity TASK1 to be executed.
  • the first data processing activity TASK1 comprises the selection, by the artificial intelligence unit, of an application unit 150 capable of executing the aforementioned first data processing activity TASK1 based on appropriate selection parameters.
  • selection parameters include performance data DP indicative of the performance of the application units 150.
  • the artificial intelligence unit 101 can execute appropriate analysis algorithms of the data performance DP (which may be of a known type).
  • these algorithms analyse the performance data DP based on predefined evaluation criteria (for example memorized and inserted in the programming step of the artificial intelligence unit 101) or dynamically generated by the artificial intelligence unit 101 (for example by means of further data processing algorithms of a known type) on the basis of the first request of the user R1 or the first input data D1.
  • the performance data DP are generated and stored in the database 102 during appropriate training procedures performed by the artificial intelligence unit 101.
  • the performance data DP can be generated and stored in the database 102 also during the normal interaction process between the artificial intelligence unit 101 and the user, for example during previous execution cycles of the interaction procedure with the user 300, described here.
  • the selected application unit 150 can be an application unit internal or external to the computerized platform 100, according to requirements.
  • the artificial intelligence unit 101 assigns the aforementioned first data processing activity TASK1 to the selected application unit 150.
  • the artificial intelligence unit 101 receives the data D20 provided by the application unit 150 selected during the execution of the first data processing activity TASK1 assigned to it.
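The selection step can be pictured with the following hedged Python sketch, in which the evaluation criterion (success rate weighed against latency) and all field names are assumptions chosen for illustration:

    # Stand-in for the performance data DP stored in the database 102.
    performance_db = {
        "nav-service-a": {"success_rate": 0.96, "avg_latency_ms": 120},
        "nav-service-b": {"success_rate": 0.91, "avg_latency_ms": 80},
    }

    def score(dp: dict) -> float:
        """Assumed criterion: favour accuracy, penalise latency."""
        return dp["success_rate"] - 0.001 * dp["avg_latency_ms"]

    def select_application_unit(candidates: dict) -> str:
        """Pick the application unit 150 with the best stored performance."""
        return max(candidates, key=lambda name: score(candidates[name]))

    best = select_application_unit(performance_db)
    print(f"TASK1 assigned to {best}")   # -> TASK1 assigned to nav-service-a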
  • the interaction procedure 300 therefore comprises a step 303 in which the artificial intelligence unit 101 interacts (preferably on the basis of the results D2, D20 of the first processing activity TASK1) with the user through the computerized device 1 in such a manner as to provide assistance to the user.
  • the artificial intelligence unit 101 processes the results D2, D20 obtained during the execution of the first data processing activity TASK1 and supplies suitable control signals and/or data signals to the computerized device 1.
  • the control unit 6 of the computerized device 1 receives these signals and, in turn, provides suitable voice or sound signals to the user, through the user interface means 4 (earphones).
  • the control unit 6 can advantageously make the information sent by the artificial intelligence unit 101 available in visual form.
  • the user can thus be guided by the artificial intelligence unit 101 in carrying out a requested activity or an activity to be carried out in response to a detected user or environmental condition.
  • the interaction procedure can advantageously comprise the continuous acquisition by the artificial intelligence unit 101 of further input data indicative of a condition of the user or of a condition of the environment surrounding the user of the computerized device 1.
  • the acquisition of the aforementioned further input data can take place during any step 301-303 of the above interaction procedure, in particular during the step 303 in which said artificial intelligence unit interacts with the user, through the computerized wearable device.
  • the aforementioned additional input data include data or images provided by the sensor means 3 of the computerized device 1, possibly processed, where necessary, by the control unit 6.
  • the real-time acquisition of the aforementioned input data by the artificial intelligence unit 101 makes it possible to considerably improve the level of assistance provided to the user in carrying out certain activities, for example in walking.
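To make the three steps concrete, here is a minimal, purely illustrative Python sketch of one cycle of the interaction procedure 300; every function name and data structure is a placeholder assumed for the example:

    def receive_request():                        # step 301: R1 or D1 arrives
        return {"kind": "user_request", "text": "guide me from place A to place B"}

    def execute_task1(request, historical_data):  # step 302: TASK1
        # Select/process historical data DS, possibly delegating to a unit 150.
        return {"route": ["place A", "crossing", "place B"]}

    def acquire_further_input():                  # continuous acquisition (cameras 3)
        return {"obstacle_ahead": False}

    def interact_with_user(result, sensed):       # step 303: voice guidance
        cue = "stop, obstacle ahead" if sensed["obstacle_ahead"] else "continue straight"
        print("voice signal:", cue)

    historical_data = []                          # stand-in for the database 102
    request = receive_request()                   # step 301
    result = execute_task1(request, historical_data)     # step 302
    for _ in range(3):                            # step 303, repeated while guiding
        interact_with_user(result, acquire_further_input())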
  • the user sends a voice request R1, for example of the type "I want to go from place A to place B", to the artificial intelligence unit 101.
  • the artificial intelligence unit 101 receives the request R1 and determines that the user intends to execute the following activity: "go from place A to place B".
  • the artificial intelligence unit 101 determines that, to carry out this activity, it is necessary to execute a data processing activity TASK1 which consists in "calculating the optimal path to go from place A to place B and interacting, on the basis of the observation of the environment, with the user".
  • the artificial intelligence unit 101 selects and processes the data necessary to calculate the optimal route. On the basis of the data thus obtained and of all the additional input data provided continuously in real time by the sensor means 3 (for example by the cameras), the artificial intelligence unit 101 sends to the user, through the computerized device 1, suitable voice signals to guide the user along the identified path to go from place A to place B, that is, to carry out the requested activity, indicating the presence of any obstacles identified by the artificial intelligence.
  • the artificial intelligence unit 101 receives input data D1 indicative of a dangerous weather situation.
  • the input data may have been detected directly from the computerized device 1 (through cameras 3) or learned from a specialized computerized platform available on the Internet.
  • the artificial intelligence unit 101 receives the D1 input data and determines that the user should perform the following activity: "go from place A to home".
  • the artificial intelligence unit 101 determines that, in order to carry out this activity, it is necessary to execute a data processing activity TASK1 which consists in "calculating the shortest path to go from place A to home and interacting with the user, on the basis of observation of the environment".
  • the artificial intelligence unit 101 is able to interact with multiple application units 150 capable of providing road navigation services.
  • the artificial intelligence unit 101 analyses the historical data DS stored in the database 102 and determines that it is necessary to resort to an application unit 150 to carry out the processing activity TASK1 required to assist the user.
  • the artificial intelligence unit 101 selects (for example on the basis of the performance data DP stored in the database 102) an application unit 150 able to provide road navigation services and it assigns to this application unit the data processing activity TASK1 indicated above, at the same time excluding the other available application units.
  • the application unit 150 performs the assigned data processing activity TASK1 and provides the results of this processing activity to the artificial intelligence unit 101.
  • On the basis of the data thus obtained and of all the further input data supplied continuously in real time by the sensor means 3 (for example by the cameras), the artificial intelligence unit 101 sends to the user, through the computerized device 1, suitable voice signals to guide the user along the identified path to go from place A to home (i.e. to react to a danger condition).
  • the artificial intelligence unit 101 can use the interaction procedure 300 in order to satisfy every need of the user of the computerized device 1, for example to respond to trivial requests (for example of the type "what time is it" or "tell me what the weather will be like today") or complex ones (for example of the type "tell me the name of a place in front of me" or "guide me home").
  • the artificial intelligence unit 101 can assist the user of the computerized device 1 by making the following functionalities available: reading of texts and numbers, recognition of coins and payment cards, recognition of commonly used objects, recognition of objects with favourite brands or logos, recognition of food or drinks, recognition of familiar faces, gender recognition (male or female face), recognition of animals, recognition of elements of the surrounding environment (buildings, natural landscape, etc.), typical functions of a smartphone, "live streaming" features, suggestions on additional opportunities and/or improvements in respect of the user requests, and so on.
  • the artificial intelligence unit 101 can assist the user by using directly suitable application units 150 and, where necessary, the aforementioned additional input data supplied in real time by the sensor means 3 to execute the necessary data processing activities.
  • an interaction procedure 300 in order to meet a certain user request can be interrupted should this request be completely satisfied or should there be a further request relating to another activity.
  • the interaction procedure 300 previously in progress can be restored if necessary, for example upon completion of the new interaction procedure aimed at satisfying the new user request.
  • the reactivation of the interaction procedure 300 previously in progress is carried out in response to a further request from the user.
  • the artificial intelligence unit 101 is able to provide assistance to the user (for example, a blind or partially sighted person) in dealing with any unforeseen situations while carrying out a certain activity.
  • the interaction procedure 300 to assist the user during that ongoing activity is interrupted and the execution of a new interaction procedure 300 is initiated in order to assist the user in dealing with this new activity.
  • the interaction procedure 300 can advantageously provide the execution of at least a second sequence of steps 3030. This further sequence of steps is intended to be carried out when the artificial intelligence unit 101 is already interacting with the user, in parallel with the execution of a step of the above described first sequence of steps of the interaction procedure 300, for example step 303, illustrated above.
  • the execution of the sequence of steps 3030 can take place in response to a user request or an automatic detection of particular conditions of the user or of the environment surrounding the user.
  • the sequence of steps 3030 comprises a step 303A in which the artificial intelligence unit 101 receives at least a second user request R2 or at least third input data D3 indicative of a user’s condition or a condition of the environment surrounding the user.
  • the request R2 is sent to the artificial intelligence unit 101, through the computerized device 1, according to the modalities already described (step 301 of the interaction procedure 300).
  • the third input data D3 may include data collected by the sensor means 3 (cameras) or by other sensor means (e.g. position sensors) of the computerized device 1. These data are sent to the artificial intelligence unit 101, through the computerized device 1, according to the modalities already described (step 301 of the interaction procedure 300).
  • the third input data D3 may include data received from other computerized platforms 600 in communication with the computerized platform 100. These data are also sent to the artificial intelligence unit 101 in the way already described (step 301 of interaction procedure 300).
  • the sequence of steps 3030 includes a step 303B in which the artificial intelligence unit 101 performs a second data processing activity TASK2 to provide assistance to the user.
  • the artificial intelligence unit 101 determines (preferably based on the user request R2 or the third input data D3 or both) the second processing activity TASK2 in the way described above.
  • the second data processing activity TASK2 includes the processing, by the artificial intelligence unit 101, of data acquired in real time from the sensor means, in addition to the execution of appropriate selection and processing algorithms (which may be of known type) of historical data DS stored in the database 102 to provide the data D4 necessary to assist the user.
  • the second data processing activity TASK2 includes the selection, by the artificial intelligence unit, of an application unit 150 able to execute the above mentioned second data processing activity TASK2 according to appropriate selection parameters.
  • the selection of the most suitable application unit 150 can be done according to the above described methods for the first data processing activity TASK1.
  • the artificial intelligence unit 101 assigns the aforementioned second data processing activity TASK2 to the selected application unit 150.
  • the sequence of steps 3030 therefore comprises a step 303C in which the artificial intelligence unit 101 interacts (preferably on the basis of the results D4, D40 of the second processing activity TASK2) with the user through the computerized device 1 in a manner such as to provide assistance to the user.
  • the interaction process of the artificial intelligence unit 101 with the user advantageously takes place according to the methods already described (step 303 of the interaction procedure 300). Thanks to the voice or sound signals received in this way, the user can thus be guided by the artificial intelligence unit 101 to carry out a further activity or to react to an unforeseen situation, while carrying out an activity already in progress.
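One way to picture this parallelism is the asyncio sketch below, where guidance for the ongoing activity (step 303) keeps running while the unforeseen request is handled as TASK2; the coroutine bodies are placeholders, not the patent's implementation:

    import asyncio

    async def step_303_guidance():
        # Ongoing interaction of the first sequence (step 303).
        for cue in ("straight ahead", "turn right", "you have arrived"):
            print("guidance:", cue)
            await asyncio.sleep(1.0)

    async def sequence_3030(request: str):
        # Steps 303A-303C: receive R2, execute TASK2, report back to the user.
        await asyncio.sleep(0.5)
        print(f"TASK2 result for {request!r}: kiosk 50 m ahead on the left")

    async def main():
        await asyncio.gather(step_303_guidance(), sequence_3030("I'm thirsty"))

    asyncio.run(main())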
  • the interaction procedure 300 can provide the execution of several second sequences of steps 3030 in parallel with the execution of a first sequence of steps of the interaction procedure 300, for example step 303, illustrated above.
  • the artificial intelligence unit 101 is interacting with the user, according to the modalities already described (step 303 of interaction procedure 300).
  • the computerized device 1 detects the failure of a traffic light by means of the sensor means 3 (cameras).
  • the artificial intelligence unit 101 receives new input data D3 indicative of this unforeseen dangerous situation.
  • the artificial intelligence unit 101 determines that the user must perform the following activity: "find help".
  • the artificial intelligence unit 101 selects and processes historical data DS stored in the database 102 to provide data that can respond to the user's need, for example to "find a person who can help".
  • the artificial intelligence unit 101 interrupts the assistance activity in progress and, on the basis of the data thus obtained and of all the data provided by the sensor means, for example the cameras, sends to the user, through the computerized device 1, suitable voice signals to guide the user to a person who can help (for example, a traffic warden).
  • the artificial intelligence unit 101 is interacting with the user, according to the modalities already described (step 303 of the interaction procedure 300).
  • the user sends a new voice request, for example of the type "I'm thirsty", to the artificial intelligence unit 101.
  • the artificial intelligence unit 101 receives the new request and determines that the user intends to execute the following (unexpected) activity: "find a place to buy a drink".
  • the artificial intelligence unit 101 also determines that, in order to carry out this activity, it is necessary to execute a data processing activity TASK2 which consists in "finding a place to buy a drink along the way from place A to place B".
  • the artificial intelligence unit analyses the historical data DS stored in the database 102 and determines that it is necessary to resort to application unit 150 to carry out the processing activity TASK2 required to assist the user.
  • the artificial intelligence unit 101 selects an application unit 150 capable of executing the data processing activity TASK2 and assigns this activity of data processing to the selected application unit.
  • the application unit 150 performs the assigned data processing activity TASK2 and supplies the results of this processing activity to the artificial intelligence unit 101.
  • the artificial intelligence unit 101 sends to the user, through the computerized device 1, suitable voice signals to guide the user to a place where to buy a drink, according to the new (unexpected) user request.
  • the artificial intelligence unit 101 is configured to execute an approach procedure 200 with the user of the computerized device 1 (for example, a blind or partially sighted person).
  • the approach procedure 200 can be performed by the artificial intelligence unit 101 to check if the user is using the computerized device 1 and to solicit requests from the user himself.
  • the approach procedure 200 includes a step 201 in which the artificial intelligence unit 101 checks if the computerized device 1 is in an activation state. This check can be carried out in a manner known in communications between computerized devices, for example by sending a "ping" signal to the control unit 6 and waiting for a response signal from the latter.
  • the procedure 200 includes a step 202 in which the artificial intelligence unit 101 sends a first invitation message Ml to the user, through the computerized device 1.
  • the invitation message M1 is a voice signal that can be sent by the artificial intelligence unit 101 in the manner already described (step 303 of the interaction procedure 300).
  • the procedure 200 includes a step 203 in which the artificial intelligence unit 101 receives a response message R0 from the user through the computerized device 1.
  • the response message R0 is sent to the artificial intelligence unit 101, through the computerized device 1, according to the methods already described (step 301 of the interaction procedure 300).
  • the procedure 200 therefore includes a step 204 in which the artificial intelligence unit 101 determines whether the user intends to carry out some activity or not.
  • the artificial intelligence unit 101 can execute appropriate voice recognition algorithms (which may be of a known type). If the user intends to carry out some activity, the procedure 200 therefore includes a step 205 in which the artificial intelligence unit 101 selects a class of data processing activity to provide assistance to the user in carrying out a certain type of activity.
  • the procedure 200 therefore includes a step 206 in which the artificial intelligence unit 101 sends a second invitation message M2 to the user, through the computerized device 1.
  • the invitation message M2 is also a voice signal that can be sent by the artificial intelligence unit 101 in the manner already described (step 303 of the interaction procedure 300).
  • the artificial intelligence unit 101 waits for a user request, so that it can possibly execute the interaction procedure 300 described above.
  • the artificial intelligence unit 101 can cyclically repeat the sending of the second invitation message M2.
  • the execution of the procedure 200 allows the artificial intelligence unit 101 to proactively inform the user that it is ready to assist him, if necessary. Furthermore, based on the limited information received by the user (reply message R0), the artificial intelligence unit 101 prepares itself in the most appropriate way (step 205 of procedure 200) to assist the user in carrying out an activity which, probably, will soon be requested by the user.
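The whole approach procedure can be summarised in the short, purely illustrative Python sketch below; the keyword-to-class mapping in select_activity_class() is an assumption used only to show what step 205 might do:

    def device_is_active() -> bool:
        # Step 201: "ping" the control unit 6 and wait for a response.
        return True

    def select_activity_class(reply: str) -> str:
        # Step 205: prepare for the kind of activity the user will likely request.
        keywords = {"go out": "outdoor", "read": "reading", "eat": "kitchen"}
        for kw, cls in keywords.items():
            if kw in reply.lower():
                return cls
        return "general"

    if device_is_active():                          # step 201
        print("M1: what do we do?")                 # step 202: first invitation
        r0 = "I would like to go out"               # step 203: user reply R0
        activity_class = select_activity_class(r0)  # steps 204-205
        print(f"prepared application units for class: {activity_class}")
        print("M2: where are we going?")            # step 206: second invitation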
  • the user wears the computerized device 1.
  • the user activates the computerized device 1 by means of the activation means 5 (ON/OFF key or touch sensor).
  • the artificial intelligence unit 101 detects that the computerized device 1 is in an activation state and sends a first invitation message M1 to the user (voice message), such as "what do we do?".
  • the user sends a response message R0 (voice message), for example of the type "I would like to go out", to the artificial intelligence unit 101.
  • the artificial intelligence unit 101 selects a data processing activity class in order to provide assistance to the user in carrying out outdoor activities. In this way, the artificial intelligence unit 101 can also identify a group of application units 150 with which to interact to assist the user.
  • the artificial intelligence unit 101 sends a second invitation message M2 to the user (voice message), for example a message such as "where are we going?".
  • the artificial intelligence unit 101 awaits a new request from the user (for example, the R1 request provided for by the interaction procedure 300 described above).
  • the artificial intelligence unit 101 can be configured to execute approach procedures different from the approach procedure 200 illustrated above.
  • the artificial intelligence unit 101 is configured to execute training procedures in which automatic learning algorithms are performed.
  • training procedures allow the artificial intelligence unit 101 to adapt to the requests of the user of the computerized device 1 and to the environment with which the user interacts, progressively improving the choices of the processing actions to be performed in order to assist the user.
  • the training procedures may be of a known type and will not be described in detail here for obvious reasons of brevity.
  • the artificial intelligence unit 101 can repeatedly execute the aforementioned training procedures, whenever necessary.
  • the artificial intelligence unit 101 is configured to train continuously, even during the normal user assistance activities, in particular during the execution of data processing activities by the application units 150.
  • the acquisition procedure 400 allows the artificial intelligence unit 101 to collect and store performance data DP indicative of the performance of the application units 150. In this way, the artificial intelligence unit 101 can make an optimal choice when it has to select an application unit 150 to execute a certain data processing activity.
  • the acquisition procedure 400 includes a step 401 in which the artificial intelligence unit 101 selects a group of application units 150 capable of executing a data processing activity TASK.
  • the acquisition procedure 400 includes a step 402 in which the artificial intelligence unit 101 assigns this data processing activity TASK to each application unit 150.
  • the application units 150 thus concurrently execute the assigned data processing activity TASK.
  • the acquisition procedure 400 comprises a step 403 in which the artificial intelligence unit 101 receives, from each application unit 150, the performance data DP indicative of the performance of this application unit in the execution of the assigned data processing activity TASK.
  • the acquisition procedure 400 includes a step 404 in which the artificial intelligence unit 101 stores the performance data DP of each application unit in the database 102.
  • the stored performance data DP thus become part of the historical data DS available to be used during the assistance to be provided to the user, in particular during the execution of the interaction procedure 300, described above.
  • the acquisition procedure 400 can be performed whenever necessary, even during the execution of the interaction procedure 300 or during a learning procedure.
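As a hedged sketch of steps 401-404, the Python fragment below assigns the same TASK to a group of hypothetical application units in parallel and stores the resulting performance data DP; the latency/success metrics are assumed criteria, not taken from the patent:

    import concurrent.futures
    import random
    import time

    def run_task_on_unit(unit: str) -> dict:
        """Execute the assigned TASK on one unit and measure its performance."""
        start = time.perf_counter()
        time.sleep(random.uniform(0.01, 0.05))   # the unit does its work
        return {"unit": unit,
                "latency_s": time.perf_counter() - start,
                "success": True}

    units = ["nav-service-a", "nav-service-b", "nav-service-c"]  # step 401
    database_dp = []                                      # stand-in for database 102
    with concurrent.futures.ThreadPoolExecutor() as pool: # step 402: concurrently
        for dp in pool.map(run_task_on_unit, units):      # step 403: collect DP
            database_dp.append(dp)                        # step 404: store DP
    print(database_dp)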
  • a further aspect of the present invention relates to a computerized system 500 comprising the computerized device 1 and the computerized platform 100, described above.
  • the computerized system 500 represents a real IoT (Internet of Things) system.
  • the computerized device 1 can be achieved in practice according to further variants falling within the scope of the present invention.
  • control unit 6 comprises an additional artificial intelligence unit 60.
  • this additional artificial intelligence unit 60 is pre-trained before being installed (and subsequently updated) on board the control unit 6. It can then interact with the user based on the training received.
  • this additional artificial intelligence unit can preferably be intended to execute only inferential tasks. This makes it possible to arrange it on board the control unit 6 without excessively increasing the computing power required of the control unit 6.
  • an additional artificial intelligence unit 60 on board of the computerized device 1 offers the advantage of simplifying the interaction with the user and providing assistance in carrying out simpler activities, without the need to communicate with the computerized platform 100.
  • an additional artificial intelligence unit 60 is also useful to provide a minimum level of assistance to the user if, for any reason, it is not possible to communicate with the computerized platform 100, for example due to a lack of network coverage.
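The fallback logic can be pictured with this minimal, assumption-laden Python sketch: the platform is tried first and the pre-trained on-board unit 60 answers when the platform cannot be reached. The exception type and both handlers are invented for the example.

    class PlatformUnavailable(Exception):
        pass

    def ask_platform(request: str) -> str:
        # Stand-in for the artificial intelligence unit 101, reached via module 9.
        raise PlatformUnavailable("no network coverage")

    def ask_onboard_unit(request: str) -> str:
        # Pre-trained additional unit 60: inference only, runs on the control unit 6.
        return "obstacle ahead, stop" if "obstacle" in request else "request noted"

    def assist(request: str) -> str:
        try:
            return ask_platform(request)
        except PlatformUnavailable:
            return ask_onboard_unit(request)   # minimum level of assistance

    print(assist("obstacle check"))   # -> obstacle ahead, stop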
  • the computerized system 500 has significant advantages over the prior art.
  • the computerized system 500 is able to provide assistance to blind or visually impaired people in carrying out a number of activities, even ones very different from each other.
  • the computerized system 500 is able to provide assistance to blind or partially sighted people in many aspects of daily life, effectively responding to their needs and improving their autonomy.
  • the computerized system 500 is able to provide assistance to blind or visually impaired people even in dealing with unexpected situations during the performance of a certain activity.
  • the computerized system 500 comprises a wearable computerized device 1 which has a very compact overall structure that makes it particularly suitable for being worn by the user, for example as a pair of glasses.
  • the computerized device 1 is of easy and intuitive practical use and does not require the use of additional devices in order to interact with the remote computerized platform 100.
  • the computerized system 500 (in particular as regards the wearable computerized device 1) is easily achievable at an industrial level with hardware/software (HW/SW) manufacturing methods of known type.

Abstract

Computerized system (500) for providing assistance to a user characterized in that it includes at least one computerized device (1) comprising: - a support structure (2) shaped or arranged in such a way as to be wearable by a user, said support structure being able to house or support one or more components of said computerized device; - a control unit (6) adapted to control the functioning of said computerized device, said control unit being at least partially housed in or fixed to said support structure; - sensor means (3) adapted to detect data or images relating to the environment surrounding the user, said sensor means being at least partially housed in or fixed to said support structure and being operationally connected to said control unit; - user interface means (4) adapted to allow the user to send or receive sound signals, said user interface means being at least partially housed in or fixed to said support structure and operatively connected to said control unit; - communication means (9, 10A, 10B) at least partially housed in or fixed to said support structure and operatively connected to said control unit, said communication means comprising at least one communication module (9) for high-speed communications. The computerized system includes a remote computerized platform (100) capable of communicating with the wearable computerized device. Such computerized platform includes an artificial intelligence unit (101) capable of executing machine learning algorithms to assist the user. The artificial intelligence unit is able to interact with the user of said computerized device and with one or more application units (150).

Description

COMPUTERISED SYSTEM TO PROVIDE ASSISTANCE TO A PERSON, IN PARTICULAR TO A BLIND OR VISUALLY IMPAIRED PERSON
DESCRIPTION
The present invention refers to a computerized system for providing assistance to a person, in particular to a blind or visually impaired person.
The computerized system, according to this invention, comprises a wearable computerized device and a computerized platform equipped with an artificial intelligence unit configured to execute automatic learning algorithms and to interact, through the aforementioned wearable computerized device, with a person in order to assist him or her in carrying out a certain activity.
It is widely known that a blind or partially sighted person often needs help when he or she has to interact with the surrounding environment (for example, moving around the house or on the street). Often, the necessary assistance is provided by a companion or a guide dog. In other cases, the blind or visually impaired person uses appropriate hand tools, such as a cane for orientation or an electronic obstacle detector (e.g. of vibrating type).
There are known computerised systems that use wearable electronic devices in order to provide assistance to a blind or partially sighted person in carrying out some activities, for example walking or reading. An example of such devices is described in the patent application EP2371339A1.
Currently available solutions of this type offer satisfactory performance only for some specific tasks such as, for example, the recognition of obstacles and/or the recognition of texts, objects or people.
It has also been observed that such systems are unable to offer adequate assistance to a blind or partially sighted person when inconveniences, unforeseen or abnormal situations arise.
Finally, often, the wearable electronic devices are inconvenient to use as they generally include several electronic modules (typically installed manually by the user) intended to communicate and interact with a mobile computerized device, for example a smartphone, that in turn is connectable with a remote computerized platform.
The main task of the present invention is to provide a computerized system able to overcome the limitations of the prior art, highlighted above.
Within this aim, one purpose of the present invention is to provide a computerized system able to give assistance to a person, for example to a blind or visually impaired person, in carrying out a vast range of activities, and in general, interact with the surrounding environment on a daily basis. A further aim of the invention is to provide a computerized system able to give assistance to a person in dealing with potential unforeseen situations, whilst carrying out a certain activity.
A further aim of the invention is to provide a computerized system that includes a wearable computerized device that has a very compact overall structure and is of simple and immediate practical use.
A further aim of the present invention is to provide a computerized system which is easily achievable at an industrial level.
This task and these aims, as well as other aims that will appear evident from the subsequent description and from the attached drawings, are achieved, according to the invention, by a computerized system, according to claim 1 and to the related dependent claims, proposed below.
In its general definition, the computerized system, according to the invention, comprises at least one computerized device provided with a support structure shaped or arranged in such a way as to be worn by a user (for example a blind or partially sighted person). This support structure is configured to house or support one or more components of the aforementioned computerized device.
Preferably, the support structure is shaped in such a way as to be wearable by a user at the face or head level.
More preferably, such support structure is shaped like a spectacle frame.
The wearable computerized device comprises a control unit configured to control its functioning. Such control unit is at least partially located within or fixed to the aforementioned support structure.
Preferably, said control unit comprises a removable memory.
The wearable computer device is provided with sensor means adapted to detect data or images relating to the user's environment. Such sensor means are at least partially housed in or fixed to the aforementioned support structure and are operatively connected to the control unit. Preferably, such sensor means comprise one or more cameras, for example infrared cameras with motion sensors.
Preferably, such cameras are arranged so as to cover a horizontal field of view of at least 240°. The wearable computerized device is equipped with user interface means adapted to allow the user to send or receive sound signals, preferably voice signals. Said user interface means are at least partially located within or fixed to the said support structure and are operationally connected to the control unit. Preferably, the aforementioned user interface means comprise one or more earphones with microphone.
The wearable computerized device is provided with wireless communication means at least partially located within or fixed to the aforementioned support structure and operationally connected to said control unit.
The aforementioned communication means comprise at least one communication module for high-speed communications, preferably a communication module for high-speed communications through the telephone network.
Preferably, said wireless communication means also comprise one or more communication modules for communications at a local level.
Preferably, said wireless communication means also comprise one or more communication modules for communicating with a position detecting system at satellite level.
Preferably, the wearable computerized device is provided with power supply means for one or more components of said computerized device. The power supply means are at least partially housed in or fixed to the support structure.
Preferably, said power supply means include a battery operatively connected to an electronic interface circuit, in turn operatively connected to one or more components of the said computerized device.
Preferably, the wearable computerized device comprises one or more micro-displays for viewing images. Advantageously, these micro-displays are fixed to the aforementioned support structure.
Preferably, the wearable computerized device comprises activation means that can be operated by the user to activate (and possibly deactivate) said computerized device.
The computerized system, according to the invention, includes a remote computerized platform configured to communicate with the wearable computerized device.
This computerized platform includes an artificial intelligence unit configured to execute automatic learning algorithms (by way of example, deep reinforcement learning algorithms, for example Deep Q-Learning algorithms), to interact with the user through said computerized device and to interact with one or more application units external or internal to said computerized platform.
The aforementioned computerized platform also includes a database configured to store historical data including data provided by at least one computerized wearable device or by other computerized platforms or data processed by said artificial intelligence unit. According to the invention, the artificial intelligence unit of the aforementioned computerized platform is configured to execute an interaction procedure in real time with the user of the said computerized wearable device.
This interaction procedure includes a first sequence of steps which includes:
- receiving at least a first user request or at least first input data indicative of a user's condition or a condition of the environment surrounding the user, through the said computerized device;
- executing a first data processing activity to provide assistance to the user;
- based on the results of the said first data processing activity, interacting with the user, through said computerized device, to provide assistance to the user.
According to the invention, said first processing activity includes one or more of the following activities:
- executing voice or visual recognition algorithms to interpret said first user request or said first input data and to determine said first data processing activity;
- executing algorithms for selecting and processing historical data stored in said database;
- selecting, where necessary, an application unit capable of executing said first data processing activity and acquiring the data provided by said application unit during the execution of said first data processing activity.
As better described in the following, the selection of the most suitable application unit can be carried out based on performance data indicative of the performance of the application units. Conveniently, for this selection process, the artificial intelligence unit can execute suitable algorithms for analysing the aforesaid performance data.
Preferably, the aforementioned interaction procedure includes a continuous acquisition of additional input data indicative of a condition of the user or a condition of the environment surrounding the user, through said computerized device.
Advantageously, this continuous acquisition of further input data can take place during any step of the aforementioned interaction procedure, in particular during the step in which said artificial intelligence unit interacts with the user, through said computerized device.
Preferably, said interaction procedure with the user of said computerized device includes at least a second sequence of steps which includes:
- receiving, through said computerized device, a second request from the user or third input data indicative of a condition of the user or a condition of the environment surrounding the user;
- executing a second data processing activity to provide assistance to the user;
- based on the results of said second data processing activity, interacting with the user, through said computerized device, to provide assistance to the user.
Preferably, the second data processing activity comprises one or more of the following activities:
- executing voice or visual recognition algorithms to interpret said second user request or said third input data and to determine said second data processing activity;
- executing algorithms for selecting and processing historical data stored in said database;
- selecting, where necessary, an application unit capable of executing said second data processing activity and acquiring the data provided by said application unit during the execution of said second data processing activity.
Preferably, this further sequence of steps is performed in parallel with a step of the first sequence of steps of the interaction procedure, described previously.
According to an aspect of the invention, said artificial intelligence unit is configured to execute a performance data acquisition procedure which includes the following steps:
- selecting a group of application units capable of executing a data processing activity;
- assigning to each application unit the selected data processing activity;
- receiving, from each application unit, performance data indicative of the performance of said application unit in the execution of said data processing activity;
- storing said performance data in said database.
According to an aspect of the invention, said artificial intelligence unit is configured to execute an approach procedure with the user of said wearable computerized device which includes the following steps:
- checking whether said computerized device is in an activated state;
- if said computerized device is in an activated state, sending an initial invitation message to the user, through said computerized device;
- receiving a reply message from the user, through said computerized device;
- determining if the user intends to carry out any activity;
- if the user intends to carry out some activity, selecting, based on said response message, a class of data processing activity to provide assistance to the user;
- sending a second invitation message to the user, through said computerized device.
Further characteristics and advantages of the present invention can be better perceived by referring to the description given below and to the attached figures, provided for purely illustrative and non-limiting purposes, in which:
- figure 1 schematically illustrates an embodiment of the computerized system according to the invention;
- figure 2 schematically illustrates a block diagram of the wearable computerized device included in the computerized system, according to the invention;
- figure 2A schematically illustrates an embodiment of the wearable computerized device included in the computerized system, according to the invention;
- figure 3 schematically illustrates a block diagram of the computerized platform included in the computerized system, according to the invention;
- figures 4-7 schematically illustrate the functioning of an artificial intelligence unit included in the computerized platform of figure 3 and interacting with the aforementioned wearable computerized device.
With reference to the quoted figures, the present invention refers to a computerized system 500 adapted to provide assistance to a person in daily life and in carrying out certain activities.
The computerized system 500 is particularly suitable for providing assistance to blind or partially sighted people.
The computerized system 500 comprises at least one computerized device 1 wearable by a user. The computerized device 1 comprises a support structure 2 shaped or predisposed so as to be wearable by a user (for example a blind or partially sighted person) at the face or head level. According to preferred embodiments of the invention (Figure 2A), the support structure 2 is shaped like a spectacle frame. In this case, it comprises a pair of rods 2A (for example of the "golf" or "hedgehog" type) connected to a frontal portion 2B shaped, for example, as a mask. Opaque or coloured protective shells, classic eyeglass lenses and/or micro-displays 11 for viewing images and/or information can be positioned in correspondence with the rims of the frame. Advantageously, the support structure 2 is arranged to hold or support at least partially one or more components (for example electric or electronic components) of the computerized device.
The computerized device 1 comprises a control unit 6. This control unit preferably includes means for digital data processing (for example one or more microprocessors) capable of executing suitable software instructions stored on a memory medium for implementing the functionalities of the control unit 6.
The control unit 6 is at least partially housed in or fixed to the support structure 2. For example, in the embodiment of figure 2A, it is advantageously located in the internal volume of one of the rods 2A. Preferably, in addition to its own memory (for example of the RAM type), the control unit 6 also includes a removable memory 6A, for example a memory card which can be extracted through an appropriate opening (not shown) of the support structure 2.
Conveniently, the removable memory 6A can be temporarily removed from the computerized device 1 to be updated or replaced with another removable memory.
The computerized device 1 comprises sensor means 3 adapted to detect data or images relating to the environment surrounding the user.
Preferably, the sensor means 3 comprise one or more cameras, for example a pair of lateral micro-cameras equipped with motion sensors and a central micro-camera (for example infrared) equipped with a microphone.
Preferably, the cameras 3 are arranged so as to cover a horizontal field of view (i.e. measured along a plane parallel to the user's eyes) of 240° or larger to acquire the largest possible amount of information to assist the user.
The sensor means 3 are at least partially housed in or fixed to the support structure 2. For example, in the embodiment of Figure 2A, a lateral micro-camera is conveniently arranged in correspondence with each of the rods 2A while the central micro-camera is advantageously placed in correspondence with the front portion 2B, in practice in correspondence with the bridge of the frame 2.
The sensor means 3 are operatively connected to the control unit 6 by means of connection of a known type and they can exchange with this control unit suitable command signals and data signals (dashed lines in Figure 2).
The computerized device 1 comprises user interface means 4 adapted to allow the user to interact with the aforementioned computerized device, in particular with the control unit 6. Advantageously, the user interface means 4 comprise one or more microphones and one or more earphones, arranged so that the user can send or receive voice signals.
The user interface means 4 are at least partially housed in or fixed to the support structure 2. For example, an earphone-microphone unit can be placed at the end portion of each of the rods 2A in a distal position with respect to the front portion 2B, so as to facilitate the sending and receiving of voice signals by the user.
The user interface means 4 are operationally connected to the control unit 6 by means of connection of a known type and can exchange with this control unit appropriate control signals and data signals (dashed lines in Figure 2).
The computerized device 1 comprises wireless communication means 9, 10A, 10B. The aforementioned wireless communication means comprise at least one communication module 9 arranged to allow the computerized device 1, in particular its control unit 6, to communicate wirelessly with a remote computerized platform.
In particular, the communication module 9 is arranged to implement high-speed wireless communications.
For the sake of clarity, it is specified that, in the context of the present invention, the term "high speed communications" refers to wireless communications compliant with the 4G or higher communication standards, for example with data transmission speed (bit rate per telephone cell) of at least 20 Gbps in download and at least 10 Gbps in upload and with a latency time of less than a few ms (for example, less than 4 ms).
According to some possible embodiments of the invention, the aforementioned term refers to wireless communications conforming to the 5G communication standard or higher.
Preferably, the communication module 9 is predisposed to operate through the telephone network (data traffic via the Internet).
Preferably, the aforementioned wireless communication means comprise one or more communication modules 10A for communications at local level.
Advantageously, each communication module 10A allows the computerized device 1, in particular its control unit 6, to communicate with a computerized device located nearby. Preferably, the communication modules 10A comprise at least a Bluetooth™ communication module and/or a WiFi™ communication module.
Preferably, the aforementioned wireless communication means comprise one or more communication modules 10B for communicating with a position detection system of the computerized device 1 at satellite level.
Preferably, the communication modules 10B comprise at least a communication module for communicating with a GPS (Global Positioning System) detection system.
The wireless communication means 9, 10A, 10B are at least partially housed in or fixed to the support structure 2. For example, in the embodiment of figure 2A, a 5G communication module, a Bluetooth™ communication module, a WiFi™ communication module and a GPS communication module are advantageously predisposed in correspondence with the rods 2A. The communication means 9, 10A, 10B are operationally connected to the control unit 6 through connection means of a known type and they can exchange with the control unit suitable control signals and data signals (dashed lines in Figure 2). Advantageously, the computerized device 1 comprises power supply means 7, 8 arranged to supply electrical power (solid lines in Figure 2) to one or more components of said computerized device.
Preferably, these power supply means include a battery 7 operatively connected to an electronic interface circuit 8, in turn connected to one or more components, for example to the control unit 6, to the sensor means 3, to the user interface means 4 and to the communication means 9, 10A, 10B.
The power supply means 7, 8 are at least partially housed in or fixed to the support structure 2. For example, in the embodiment of Figure 2A, a battery 7 and the relative electronic interface circuit 8 are advantageously placed in correspondence with one of the rods 2A.
The power supply means 7, 8 (in particular the electronic interface circuit 8) are operatively connected to the control unit 6 through connection means of known type and they can exchange with the control unit suitable control signals and data signals (dashed lines of figure 2). Preferably, the computerized device 1 comprises activation means 5 that can be operated by the user to activate the aforementioned computerized device.
The activation means 5 can comprise an ON/OFF activation key that can be operated manually by the user to activate or deactivate the computerized device 1 or a touch sensor arranged such that the computerized device 1 is activated or deactivated when worn or removed by the user. The activation means 5 are at least partially housed in or fixed to the support structure 2. For example, in the embodiment of Figure 2A, the activation key or touch sensor 5 is advantageously placed in correspondence with one of the rods 2A.
The activation means 5 are operatively connected to the control unit 6 through connection means of a known type and they can exchange with this control unit suitable command signals and data signals (dashed lines in Figure 2).
The computerized device 1 could be achieved according to further embodiment variations. According to some embodiments, the computerized device 1 also includes one or more micro-displays 11 arranged to allow the user to view images.
Preferably, the micro-displays 11 are predisposed to allow the visualisation of enlarged images of the surrounding environment or images of the surrounding environment according to processing methods such as "Augmented Reality" or "Virtual Reality".
The micro-displays 11 are fixed to the support structure 2 so that they can be observed by the user, once the computerized device 1 is worn. For example, in the embodiment of figure 2A, the micro-displays 11 could be positioned in correspondence with the front portion 2B (in practice in correspondence with the rims) of the frame 2. The micro-displays 11 are operatively connected to the control unit 6 through connection means of a known type and they can exchange with this control unit appropriate control signals and data signals (dashed lines in Figure 2).
According to additional embodiments (not shown), the computerized device 1 could comprise further sensor means housed in or fixed to said support structure and operationally connected to said control unit. These further sensor means could include one or more position sensors or one or more sensors suitable for detecting physical measurements (for example temperature, humidity, presence of harmful substances, etc.) which are the characteristics of the environment surrounding the user.
In general, many components or parts of the computerized device 1, for example the support structure 2, the control unit 6, the sensor means 3, the user interface means 4, the communication means 9-10A-10B and optionally, the power supply means 7-8, the activation means 5 and the micro-displays 11, can be implemented at an industrial level according to known manufacturing techniques.
A very important aspect of the present invention is that the computerized system 500 comprises a remote computerized platform 100.
This computerised platform is able to communicate in real time (normally through the Internet) with the wearable computerized device 1 (or better the related control unit 6).
For the sake of clarity, it is specified that, in the context of the present invention, the term "communicate in real time" refers to wireless communications with reduced latency time, for example less than a few ms. In particular, the aforementioned term refers to wireless communications compliant with the 4G communication standard or higher.
The computerized platform 100 may include one or more computerized units 600, for example connected through the Internet and interacting with each other, for example in order to implement a cloud architecture. For example, the computerized platform 100 could include one or more computerized units equipped with an operating system to implement "server" type functions, for example Windows Server™, Windows Azure™, Mac OS Server™, etc. In this case, the computerized platform 100 is able to interact with the computerized device 1 in a "server to client" type mode.
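By way of purely illustrative and non-limiting example, the following Python sketch shows one way the control unit 6 might submit a user request to the computerized platform 100 in such a "server to client" mode; the endpoint URL and the JSON field names are hypothetical and do not form part of the invention as such.

    import json
    import urllib.request

    PLATFORM_URL = "https://platform.example/api/assist"  # hypothetical endpoint of the platform 100

    def send_user_request(device_id: str, text: str) -> dict:
        # Package the voice request (already transcribed by the control unit 6) as JSON.
        payload = json.dumps({"device": device_id, "request": text}).encode("utf-8")
        req = urllib.request.Request(
            PLATFORM_URL, data=payload, headers={"Content-Type": "application/json"}
        )
        # A short timeout reflects the low-latency, high-speed link (4G/5G) assumed above.
        with urllib.request.urlopen(req, timeout=2.0) as response:
            return json.load(response)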
According to the invention, the computerized platform 100 includes an artificial intelligence unit 101.
The artificial intelligence unit 101 may include one or more computerized units (for example connected via LAN or Internet and interacting with each other) comprising means for digital data processing (for example one or more microprocessors) capable of carrying out suitable software instructions stored on a memory medium in order to implement the functionalities of the artificial intelligence unit 101.
According to the invention, the computerized platform 100 also includes a database 102 adapted to store data provided by the computerized device 1 or processed by the artificial intelligence unit 101.
The database 102 can include one or more memory units (for example connected through LAN or Internet and interacting with each other) managed by a corresponding computerized unit (not shown).
Advantageously, the artificial intelligence unit 101 is able to communicate and interact with the database 102 in known ways (for example through LAN or Internet).
The artificial intelligence unit 101 is also able to interact with one or more application units 150.
Preferably, each application unit 150 is configured to execute a specific function, for example the calculation of a route or visual recognition.
Each application unit 150 can include one or more computerized units (for example connected through the Internet and interacting with each other) including means for digital data processing (for example one or more microprocessors) capable of executing appropriate software instructions stored on a memory support such as to implement the functionalities of the application unit 150.
The application units 150 could be part of the platform 100. In this case, they constitute data processing resources internal to the computerized platform 100 and usable by the artificial intelligence unit 101.
The application units 150 could be part of other computer platforms, for example computer platforms accessible from the computer platform 100 through LAN or the Internet.
The artificial intelligence unit 101 is able to communicate and interact with the application units 150 in known ways, usually through LAN or Internet through appropriate APIs (Application Programming Interfaces).
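Purely for illustration, the interaction between the artificial intelligence unit 101 and the application units 150 could be abstracted behind a common programming interface, as in the following Python sketch; the class and method names are hypothetical stand-ins, not the actual APIs of any real service.

    from abc import ABC, abstractmethod

    class ApplicationUnit(ABC):
        """Common interface through which the artificial intelligence unit 101
        can assign a data processing activity to an application unit 150."""
        name: str

        @abstractmethod
        def execute(self, task: dict) -> dict:
            """Execute the assigned activity and return its results."""

    class RouteCalculationUnit(ApplicationUnit):
        name = "route-calculation"

        def execute(self, task: dict) -> dict:
            # A real unit would call an external navigation API here.
            return {"route": [task["origin"], task["destination"]]}

    # Registry of the available units, internal or external to the platform 100.
    UNITS = {unit.name: unit for unit in (RouteCalculationUnit(),)}
    print(UNITS["route-calculation"].execute({"origin": "A", "destination": "B"}))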
According to the invention, the artificial intelligence unit 101 is also configured to execute automatic learning algorithms (which can be of a known type, for example "Deep Q-Learning" algorithms).
Such automatic learning algorithms allow the artificial intelligence unit 101 to train and adapt to the requests of the user (blind or partially sighted person) of the computerized device 1 and to the environment with which the user interacts, improving, from time to time, the choices of processing actions to be performed in order to assist the user. The artificial intelligence unit 101 is therefore able to interact in real time with the user (through the computerized device 1) and to assist him in carrying out a certain activity or, more generally, in interacting with the surrounding environment.
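For purely illustrative purposes, the following minimal Python sketch shows the kind of value update at the heart of Q-learning, of which Deep Q-Learning is the neural-network variant; the states, actions and reward scheme are hypothetical simplifications and do not represent the actual training of the unit 101.

    import random
    from collections import defaultdict

    Q = defaultdict(float)                 # Q(state, action) value estimates
    ALPHA, GAMMA, EPSILON = 0.1, 0.9, 0.1  # learning rate, discount factor, exploration rate

    def choose_action(state: str, actions: list[str]) -> str:
        """Epsilon-greedy choice of the next processing action for the current user state."""
        if random.random() < EPSILON:
            return random.choice(actions)                 # explore
        return max(actions, key=lambda a: Q[(state, a)])  # exploit

    def update(state: str, action: str, reward: float, next_state: str, actions: list[str]) -> None:
        """One Q-learning step; the reward could encode whether the assistance satisfied the user."""
        best_next = max(Q[(next_state, a)] for a in actions)
        Q[(state, action)] += ALPHA * (reward + GAMMA * best_next - Q[(state, action)])

    # Hypothetical episode: reward 1.0 when the chosen assistance satisfied the user.
    a = choose_action("walking", ["guide", "warn"])
    update("walking", a, 1.0, "walking", ["guide", "warn"])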
The artificial intelligence unit 101 is able to assist the user by carrying out appropriate processing activities selected based on the user requests or other input data. Obviously, the execution of the aforementioned inferential tasks takes place based on the received training.
It should be noted that real-time interaction with the user is made possible (and effective) by the fact that the computerized device 1 is able to communicate wirelessly (advantageously at high speed) with the computerized platform 100.
According to the invention, the artificial intelligence unit 101 is configured to execute in real time an interaction procedure 300 with the user (for example a blind or partially sighted person) in order to assist the latter in carrying out a certain activity.
Advantageously, the interaction procedure 300 is repeated cyclically by the artificial intelligence unit 101 every time the user intends to carry out a certain activity or in general, when it is necessary to respond to one of his requests.
The interaction procedure 300 is advantageously repeated by the artificial intelligence unit 101 also in the case where the computerized device 1 automatically detects (for example through the sensor means 3) a certain condition of the user or a certain environmental condition, for example a dangerous situation.
With reference to Figure 4, the interaction procedure 300 includes a first sequence of steps which includes a step 301 in which the artificial intelligence unit 101 receives at least a first user request R1 or at least first input data D1 indicative of a user's condition or a condition of the environment surrounding the user.
The request R1 is advantageously sent by the user through a voice message that is collected by the user interface means 4 (microphone) and sent to the control unit 6.
The voice message is suitably processed by the control unit 6 and sent by the latter, through the remote communication module 9, to the computerized platform 100, more specifically to the artificial intelligence unit 101.
The first input data D1 may include data collected by the sensor means 3 (cameras) or by additional sensor means (for example position sensors) of the computerized device 1 and sent to the control unit 6.
In this case, the collected data are suitably processed by the control unit 6 and sent by the latter, through the remote communication module 9, to the computerized platform 100, more specifically to the artificial intelligence unit 101. The first input data D1 may include data received from other computerized platforms (such as a GPS platform) in communication with the computerized platform 100 and then made available to the artificial intelligence unit 101.
Subsequently to step 301, the interaction procedure 300 comprises a step 302 in which the artificial intelligence unit 101 executes (preferably on the basis of the first user request R1 or the first input data D1 or both) a first data processing activity TASK1 to provide assistance to the user.
Advantageously, the artificial intelligence unit 101 determines the first processing activity TASK1 to be executed by executing suitable voice or visual recognition algorithms (which may be of a known type), possibly interacting with an application unit 150, if necessary. This allows the artificial intelligence unit 101 to interpret the incoming request R1 and/or information D1 and determine the most suitable processing activity to assist the user based on this information. Preferably, the first data processing activity TASK1 comprises the execution, by the artificial intelligence unit 101, of suitable selection and processing algorithms (which may be of a known type) of historical data DS stored in the database 102 in order to provide the data D2 necessary to assist the user.
Preferably, such historical data DS comprise training data generated and stored in the database during suitable training procedures executed by the artificial intelligence unit 101, performance data indicative of the functioning of the application units 150, data received from the computerized device 1 or data received from other computerized platforms 600.
In practice, to meet the user's needs, the artificial intelligence unit 101 can retrieve and process all possible information already received and stored in order to optimise the results of the processing activity TASK1 to be executed.
Where necessary, the first data processing activity TASK1 comprises the selection, by the artificial intelligence unit, of an application unit 150 capable of executing the aforementioned first data processing activity TASK1 based on appropriate selection parameters.
Preferably, such selection parameters include performance data DP indicative of the performance of the application units 150.
Advantageously, for this selection process, the artificial intelligence unit 101 can execute appropriate analysis algorithms of the performance data DP (which may be of a known type). Advantageously, these algorithms analyse the performance data DP based on predefined evaluation criteria (for example stored and entered during the programming of the artificial intelligence unit 101) or dynamically generated by the artificial intelligence unit 101 (for example by means of further data processing algorithms of a known type) on the basis of the first request of the user R1 or the first input data D1.
Preferably, as it will become clearer subsequently, the performance data DP are generated and stored in the database 102 during appropriate training procedures performed by the artificial intelligence unit 101.
However, the performance data DP can be generated and stored in the database 102 also during the normal interaction process between the artificial intelligence unit 101 and the user, for example during previous execution cycles of the interaction procedure with the user 300, described here.
The selected application unit 150 can be an application unit internal or external to the computerized platform 100, according to requirements.
Once the selection is complete, the artificial intelligence unit 101 assigns the aforementioned first data processing activity TASK1 to the selected application unit 150.
Subsequently, the artificial intelligence unit 101 receives the data D20 provided by the application unit 150 selected during the execution of the first data processing activity TASK1 assigned to it.
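A minimal sketch, assuming the performance data DP are reduced to a single numeric score per unit and per type of activity, of how the selection of the most suitable application unit 150 could be implemented; the unit names and score keys shown are hypothetical.

    def select_application_unit(units: list[str], dp: dict, task_type: str) -> str:
        """Pick the application unit 150 with the best stored performance DP for this type of TASK."""
        return max(units, key=lambda unit: dp.get((unit, task_type), 0.0))

    # Hypothetical historical scores DP retrieved from the database 102:
    dp = {("nav-a", "route"): 0.92, ("nav-b", "route"): 0.87}
    print(select_application_unit(["nav-a", "nav-b"], dp, "route"))  # -> nav-a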
The interaction procedure 300 therefore comprises a step 303 in which the artificial intelligence unit 101 interacts (preferably on the basis of the results D2, D20 of the first processing activity TASK1) with the user through the computerized device 1 in such a manner as to provide assistance to the user.
Advantageously, to interact with the user, the artificial intelligence unit 101 processes the results D2, D20 obtained during the execution of the first data processing activity TASK1 and supplies suitable control signals and/or data signals to the computerized device 1.
The control unit 6 of the computerized device 1 receives these signals and, in turn, provides suitable voice or sound signals to the user, through the user interface means 4 (earphones). In the event that the computerized device 1 is equipped with a micro-display 11, the control unit 6 can advantageously make the information sent by the artificial intelligence unit 101 available in visual form.
Thanks to the voice, sound or visual signals thus received, the user can be guided by the artificial intelligence unit 101 in carrying out a requested activity or an activity to be carried out in response to a detected user or environmental condition.
If necessary, the interaction procedure can advantageously comprise the continuous acquisition by the artificial intelligence unit 101 of further input data indicative of a condition of the user or of a condition of the environment surrounding the user of the computerized device 1. Advantageously, the acquisition of the aforementioned further input data can take place during any step 301-303 of the above interaction procedure, in particular during the step 303 in which said artificial intelligence unit interacts with the user, through the computerized wearable device.
Preferably, the aforementioned additional input data include data or images provided by the sensor means 3 of the computerized device 1, possibly processed, where necessary, by the control unit 6.
The real-time acquisition of the aforementioned input data by the artificial intelligence unit 101 makes it possible to considerably improve the level of assistance provided to the user in carrying out certain activities, for example in walking.
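For illustration only, the first sequence of steps 301-303 can be summarised by the following Python sketch; the callables passed in are hypothetical stand-ins for the recognition, processing and voice-output stages described above.

    def interaction_procedure(receive, recognize, process, respond) -> None:
        request = receive()        # step 301: request R1 or input data D1 from the device 1
        task = recognize(request)  # voice/visual recognition determines the activity TASK1
        results = process(task)    # step 302: historical data DS and/or an application unit 150
        respond(results)           # step 303: voice, sound or visual feedback to the user

    interaction_procedure(
        receive=lambda: "I want to go from place A to place B",
        recognize=lambda request: {"task": "route", "text": request},
        process=lambda task: {"guidance": ["turn left", "cross at the lights"]},
        respond=lambda results: print("\n".join(results["guidance"])),
    )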
EXAMPLE 1
An example of implementation of the interaction procedure 300 presented above is now described.
It is assumed that the user of the computerized device 1 (blind or partially sighted person) intends to go from place A to place B.
Through the computerized device 1, the user sends a voice request R1, for example of the type "I want to go from place A to place B", to the artificial intelligence unit 101.
The artificial intelligence unit 101 receives the request R1 and determines that the user intends to execute the following activity: "go from place A to place B".
The artificial intelligence unit 101 determines that, to carry out this activity, it is necessary to execute a data processing activity TASK1 which consists in "calculating the optimal path to go from place A to place B and interact, on the basis of the observation of the environment, with the user”.
The artificial intelligence unit 101 selects and processes the data necessary to calculate the optimal route. Based on the data thus obtained and on all the additional input data provided continuously and in real time by the sensor means 3 (for example by the cameras), it generates the data necessary to guide the user. On the basis of the data thus obtained, the artificial intelligence unit 101 sends to the user, through the computerized device 1, suitable voice signals to guide the user along the identified path to go from place A to place B, that is, to carry out the requested activity, indicating the presence of any obstacles identified by the artificial intelligence.
EXAMPLE 2
A further example of implementation of the interaction procedure 300 is now described. It is assumed that the user of the computerized device 1 (blind or partially sighted person) is in place A.
The artificial intelligence unit 101 receives input data D1 indicative of a dangerous weather situation. The input data may have been detected directly from the computerized device 1 (through cameras 3) or learned from a specialized computerized platform available on the Internet.
The artificial intelligence unit 101 receives the input data D1 and determines that the user should perform the following activity: "go from place A to home".
The artificial intelligence unit 101 determines that, in order to carry out this activity, it is necessary to execute a data processing activity TASK1 which consists in "calculating the shortest path to go from place A to home and interacting with the user, on the basis of observation of the environment".
It is assumed that the artificial intelligence unit 101 is able to interact with multiple application units 150 capable of providing road navigation services.
The artificial intelligence unit 101 analyses the historical data DS stored in the database 102 and determines that it is necessary to resort to an application unit 150 to carry out the processing activity TASK1 required to assist the user.
The artificial intelligence unit 101 selects (for example on the basis of the performance data DP stored in the database 102) an application unit 150 able to provide road navigation services and it assigns to this application unit the data processing activity TASK1 indicated above, at the same time excluding the other available application units.
The application unit 150 performs the assigned data processing activity TASK1 and provides the results of this processing activity to the artificial intelligence unit 101.
Based on the data thus obtained and all the further input data supplied continuously by the sensor means 3, for example by the cameras in real time, the artificial intelligence unit 101 sends to the user, through the computerized device 1, suitable voice signals to guide the user along the identified path to go from place A to home (i.e. to react to a danger condition).
As the skilled person can easily understand, the artificial intelligence unit 101 can use the interaction procedure 300 in order to satisfy every need of the user of the computerized device 1, for example to respond to trivial requests (for example of the type "what time is it" or "tell me what the weather will be like today") or to complex ones (for example of the type "tell me the name of a place in front of me" or "guide me home").
In general, the artificial intelligence unit 101 can assist the user of the computerized device 1 by making the following functionalities available: reading of texts and numbers, recognition of coins and payment cards, recognition of commonly used objects, recognition of objects with favourite brands or logos, recognition of food or drinks, recognition of familiar faces, gender recognition (male or female face), recognition of animals, recognition of elements of the surrounding environment (buildings, natural landscape, etc.), typical functions of a smartphone, “live streaming” features, suggestions on additional opportunities and/or improvements in respect of the user requests, and so on.
As illustrated above, where necessary, the artificial intelligence unit 101 can assist the user by using directly suitable application units 150 and, where necessary, the aforementioned additional input data supplied in real time by the sensor means 3 to execute the necessary data processing activities.
Obviously, the execution of an interaction procedure 300 in order to meet a certain user request can be interrupted should this request be completely satisfied or should there be a further request relating to another activity.
In these cases, the execution of a new interaction procedure 300 begins, aimed at satisfying the new user request.
In any case, the interaction procedure 300 previously in progress can be restored if necessary, for example upon completion of the new interaction procedure aimed at satisfying the new user request.
Advantageously, the reactivation of the interaction procedure 300 previously in progress is carried out in response to a further request from the user.
According to an aspect of the invention, the artificial intelligence unit 101 is able to provide assistance to the user (for example, a blind or partially sighted person) in dealing with any unforeseen situations while carrying out a certain activity.
In the event of an unforeseen situation during the execution of an activity, the interaction procedure 300 to assist the user during that ongoing activity is interrupted and the execution of a new interaction procedure 300 initiates in order to assist the user in dealing with this situation. According to some embodiments of the invention, the interaction procedure 300, described above, can advantageously provide the execution of at least a second sequence of steps 3030. This further sequence of steps is intended to be carried out when the artificial intelligence unit 101 is already interacting with the user, in parallel with the execution of a step of the above described first sequence of steps of the interaction procedure 300, for example step 303, illustrated above. The execution of the sequence of steps 3030 can take place in response to a user request or an automatic detection of particular conditions of the user or of the environment surrounding the user.
With reference to Figure 5, the sequence of steps 3030 comprises a step 303A in which the artificial intelligence unit 101 receives at least a second user request R2 or at least third input data D3 indicative of a user’s condition or a condition of the environment surrounding the user. The request R2 is sent to the artificial intelligence unit 101, through the computerized device 1, according to the modalities already described (step 301 of the interaction procedure 300).
The third input data D3 may include data collected by the sensor means 3 (cameras) or by other sensor means (e.g. position sensors) of the computerized device 1. These data are sent to the artificial intelligence unit 101, through the computerized device 1, according to the modalities already described (step 301 of the interaction procedure 300).
The third input data D3 may include data received from other computerized platforms 600 in communication with the computerized platform 100. These data are also sent to the artificial intelligence unit 101 in the way already described (step 301 of interaction procedure 300).
The sequence of steps 3030 includes a step 303B in which the artificial intelligence unit 101 performs a second data processing activity TASK2 to provide assistance to the user. Advantageously, the artificial intelligence unit 101 determines (preferably based on the user request R2 or the third input data D3 or both) the second processing activity TASK2 in the way described above.
Preferably, the second data processing activity TASK2 includes the processing of data acquired in real time from the sensor means by the artificial intelligence unit 101 in addition to the execution of appropriate selection and processing algorithms (which may be of known type) of historical data DS stored in the database 102 to provide the data D4 necessary to assist the user. When necessary, the second data processing activity TASK2 includes the selection, by the artificial intelligence unit, of an application unit 150 able to execute the above mentioned second data processing activity TASK2 according to appropriate selection parameters.
The selection of the most suitable application unit 150 can be done according to the above described methods for the first data processing activity TASK1.
Once the selection is complete, the artificial intelligence unit 101 assigns the aforementioned second data processing activity TASK2 to the selected application unit 150.
Subsequently, the artificial intelligence unit 101 receives the data D40 provided by the application unit 150 selected during the execution of the second data processing activity TASK2 assigned to it. The sequence of steps 3030 therefore comprises a step 303C in which the artificial intelligence unit 101 interacts (preferably on the basis of the results D4, D40 of the second processing activity TASK2) with the user through the computerized device 1 in a manner such as to provide assistance to the user.
The interaction process of the artificial intelligence unit 101 with the user advantageously takes place according to the methods already described (step 303 of the interaction procedure 300). Thanks to the voice or sound signals received in this way, the user can thus be guided by the artificial intelligence unit 101 to carry out a further activity or to react to an unforeseen situation, while carrying out an activity already in progress.
Where necessary, the interaction procedure 300, described above, can provide the execution of several second sequences of steps 3030 in parallel with the execution of a first sequence of steps of the interaction procedure 300, for example step 303, illustrated above.
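For illustration only, the execution of a second sequence of steps 3030 in parallel with an ongoing step 303 could be sketched in Python with asyncio as follows; the messages and timings are hypothetical.

    import asyncio

    async def first_sequence() -> None:
        # Ongoing step 303: guiding the user along the path already being travelled.
        for cue in ("continue straight", "turn right at the corner"):
            print("guide:", cue)
            await asyncio.sleep(0.1)

    async def second_sequence_3030() -> None:
        # Steps 303A-303C: handling a new request R2 without interrupting the guidance.
        await asyncio.sleep(0.05)
        print("aside: there is a cafe 50 m ahead on your left")

    async def main() -> None:
        await asyncio.gather(first_sequence(), second_sequence_3030())

    asyncio.run(main())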
EXAMPLE 3
An example is now described of implementing the interaction procedure 300 outlined above in the event that an unexpected situation arises.
It is supposed that the user of the computerized device 1 (blind or partially sighted person) is going from place A to place B. In carrying out this activity, the artificial intelligence unit 101 is interacting with the user, according to the modalities already described (step 303 of interaction procedure 300).
It is supposed that the computerized device 1 detects the failure of a traffic light by means of the sensor means 3 (cameras).
Through the computerized device 1, the artificial intelligence unit 101 receives new input data D3 indicative of this unforeseen dangerous situation.
Based on the input data, the artificial intelligence unit 101 determines that the user must perform the following activity: "find help".
The artificial intelligence unit 101 selects and processes historical data DS stored in the database 102 to provide data that can respond to the user's need, for example to "find a person who can help".
The artificial intelligence unit 101 interrupts the assistance activity in progress and, on the basis of the data thus obtained and of all the data provided by the sensor means, for example the cameras, sends to the user, through the computerized device 1, suitable voice signals to guide the user to a person who can help (for example, a traffic warden).
Once the dangerous situation has been overcome, the artificial intelligence unit 101 can resume the activity previously in progress, upon request from the user.
EXAMPLE 4
A further example is now described of the implementation of the interaction procedure 300 outlined above in the event that an unexpected situation arises.
It is supposed that the user of the computerized device 1 (blind or partially sighted person) is going from place A to place B. In carrying out this activity, the artificial intelligence unit 101 is interacting with the user, according to the modalities already described (step 303 of the interaction procedure 300).
Using the computerized device 1, the user sends a new voice request, for example of the type "I'm thirsty", to the artificial intelligence unit 101.
The artificial intelligence unit 101 receives the new request and determines that the user intends to execute the following (unexpected) activity: "find a place to buy a drink".
The artificial intelligence unit 101 also determines that, in order to carry out this activity, it is necessary to execute a data processing activity TASK2 which consists in "finding a place to buy a drink along the way from place A to place B".
The artificial intelligence unit analyses the historical data DS stored in the database 102 and determines that it is necessary to resort to an application unit 150 to carry out the processing activity TASK2 required to assist the user.
On the basis of the performance data DP stored in the database 102 and/or other selection parameters, the artificial intelligence unit 101 selects an application unit 150 capable of executing the data processing activity TASK2 and assigns this activity of data processing to the selected application unit.
The application unit 150 performs the assigned data processing activity TASK2 and supplies the results of this processing activity to the artificial intelligence unit 101.
In parallel to the ongoing assistance activity, based on the input data thus received, the artificial intelligence unit 101 sends to the user, through the computerized device 1, suitable voice signals to guide the user to a place where a drink can be bought, according to the new (unexpected) user request.
According to an aspect of the invention, the artificial intelligence unit 101 is configured to execute an approach procedure 200 with the user of the computerized device 1 (for example, a blind or partially sighted person).
The approach procedure 200 can be performed by the artificial intelligence unit 101 to check if the user is using the computerized device 1 and to solicit requests from the user himself.
With reference to Figure 6, the approach procedure 200 includes a step 201 in which the artificial intelligence unit 101 checks if the computerized device 1 is in an activation state. This check can be carried out in a manner known in communications between computerized devices, for example by sending a "ping" signal to the control unit 6 and waiting for a response signal from the latter.
If the computerized device 1 is in an activated state, the procedure 200 includes a step 202 in which the artificial intelligence unit 101 sends a first invitation message Ml to the user, through the computerized device 1.
Preferably, the invitation message M1 is a voice signal that can be sent by the artificial intelligence unit 101 in the manner already described (step 303 of the interaction procedure 300). Preferably, the procedure 200 includes a step 203 in which the artificial intelligence unit 101 receives a response message R0 from the user through the computerized device 1.
The response message R0 is sent to the artificial intelligence unit 101, through the computerized device 1, according to the methods already described (step 301 of the interaction procedure 300).
The procedure 200 therefore includes a step 204 in which the artificial intelligence unit 101 determines whether the user intends to carry out some activity or not.
Advantageously, to implement the determination process of step 204, the artificial intelligence unit 101 can execute appropriate voice recognition algorithms (which may be of a known type). If the user intends to carry out some activity, the procedure 200 therefore includes a step 205 in which the artificial intelligence unit 101 selects a class of data processing activity to provide assistance to the user in carrying out a certain type of activity.
Advantageously, in order to implement the selection process of step 205, the artificial intelligence unit 101 can execute appropriate voice recognition algorithms (which may be of a known type).
The procedure 200 therefore includes a step 206 in which the artificial intelligence unit 101 sends a second invitation message M2 to the user, through the computerized device 1. Preferably, the invitation message M2 is also a voice signal that can be sent by the artificial intelligence unit 101 in the manner already described (step 303 of the interaction procedure 300). After sending the second invitation message M2, the artificial intelligence unit 101 waits for a user request, so that it can possibly execute the interaction procedure 300 described above.
If no request is received from the user, the artificial intelligence unit 101 can cyclically repeat the sending of the second invitation message M2.
The execution of the procedure 200 allows the artificial intelligence unit 101 to proactively inform the user that it is ready to assist him, if necessary. Furthermore, based on the limited information received from the user (reply message R0), the artificial intelligence unit 101 prepares itself in the most appropriate way (step 205 of procedure 200) to assist the user in carrying out an activity which, probably, will soon be requested by the user.
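Steps 201-206 of the approach procedure 200 can be summarised, for illustration only, by the following Python sketch; the Device class, the helper functions and the invitation messages are hypothetical stand-ins for the device 1 and the voice-recognition algorithms mentioned above.

    INVITATIONS = {"outdoor": "Where are we going?", "indoor": "What shall we do at home?"}

    class Device:
        """Hypothetical stand-in for the wearable computerized device 1."""
        def ping(self) -> bool:
            return True                    # step 201: the device reports an activated state
        def say(self, message: str) -> None:
            print("device:", message)      # voice output through the earphones 4

    def wants_activity(reply: str) -> bool:
        return "nothing" not in reply.lower()   # crude stand-in for the check of step 204

    def classify(reply: str) -> str:
        return "outdoor" if "out" in reply.lower() else "indoor"  # stand-in for step 205

    def approach_procedure(device: Device, get_reply) -> None:
        if not device.ping():              # step 201: activation check ("ping")
            return
        device.say("What do we do?")       # step 202: first invitation message M1
        reply = get_reply()                # step 203: response message R0 from the user
        if wants_activity(reply):          # step 204
            device.say(INVITATIONS[classify(reply)])  # steps 205-206: second invitation M2

    approach_procedure(Device(), get_reply=lambda: "I would like to go out")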
EXAMPLE 5
An example of implementation of the approach procedure 200 outlined above is now described.
Suppose that the user (for example, a blind or visually impaired person) wears the computerized device 1. In doing so, the user activates the computerized device 1 by means of the activation means 5 (ON/OFF button or touch sensor).
The artificial intelligence unit 101 detects that the computerized device 1 is in an activated state and sends a first invitation message M1 to the user (voice message), such as "what do we do?".
Through the computerized device 1, the user sends a response message R0 (voice message), for example of the type "I would like to go out", to the artificial intelligence unit 101.
Based on the user's response message R0, the artificial intelligence unit 101 selects a class of data processing activities in order to assist the user in carrying out outdoor activities. In this way, the artificial intelligence unit 101 can also identify a group of application units 150 with which to interact to assist the user.
Subsequently, the artificial intelligence unit 101 sends a second invitation message M2 to the user (voice message), for example a message such as "where are we going?".
At this point, the artificial intelligence unit 101 awaits a new request from the user (for example, the request R1 provided for by the interaction procedure 300 described above).
As the person skilled in the art can easily understand, the artificial intelligence unit 101 can be configured to execute approach procedures other than the approach procedure 200 illustrated above.
In general, such approach procedures may be of a known type and will not be described in detail here for reasons of brevity.
As illustrated above, the artificial intelligence unit 101 is configured to execute training procedures in which automatic learning algorithms are performed.
These training procedures allow the artificial intelligence unit 101 to adapt to the requests of the user of the computerized device 1 and to the environment with which the user interacts, progressively improving the choice of the processing actions to be performed to assist the user. In general, the training procedures may be of a known type and will not be described in detail here for reasons of brevity.
The artificial intelligence unit 101 can repeatedly execute the aforementioned training procedures, whenever necessary.
Preferably, however, the artificial intelligence unit 101 is configured to train continuously, even during normal user assistance activities, in particular during the execution of data processing activities by the application units 150.
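One simple form such continuous training could take, given here purely as an illustrative sketch, is an exponential moving average of the performance observed for each application unit 150, refined after every assisted task; the class name and the parameter alpha are hypothetical.

from typing import Dict

class UnitScoreModel:
    # Sketch of continuous training: a running score per application unit,
    # updated after every data processing activity, with no separate
    # offline training phase.
    def __init__(self, alpha: float = 0.2):
        self.alpha = alpha  # weight given to the newest observation
        self.scores: Dict[str, float] = {}

    def update(self, unit_id: str, performance: float) -> None:
        previous = self.scores.get(unit_id, performance)
        self.scores[unit_id] = (1 - self.alpha) * previous + self.alpha * performance

    def best_unit(self) -> str:
        # Preferred application unit 150 according to the scores learned so far.
        return max(self.scores, key=self.scores.get)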
With reference to Figure 7, a procedure 400 for acquiring performance data DP indicative of the performance of the application units 150, which can be performed by the artificial intelligence unit 101, is now described.
The acquisition procedure 400 allows the artificial intelligence unit 101 to collect and store performance data DP indicative of the performance of the application units 150. In this way, the artificial intelligence unit 101 can make an optimal choice when, where necessary, it selects an application unit 150 to execute a certain data processing activity.
The acquisition procedure 400 includes a step 401 in which the artificial intelligence unit 101 selects a group of application units 150 capable of executing a data processing activity TASK.
The acquisition procedure 400 includes a step 402 in which the artificial intelligence unit 101 assigns this data processing activity TASK to each application unit 150 of the group.
The application units 150 thus concurrently execute the assigned data processing activity TASK.
The acquisition procedure 400 comprises a step 403 in which the artificial intelligence unit 101 receives, from each application unit 150, the performance data DP indicative of the performance of that application unit in the execution of the assigned data processing activity TASK.
The acquisition procedure 400 includes a step 404 in which the artificial intelligence unit 101 stores the performance data DP of each application unit in the database 102.
The stored performance data DP thus become part of the historical data DS available for use during the assistance provided to the user, in particular during the execution of the interaction procedure 300 described above.
The acquisition procedure 400 can be performed whenever necessary, even during the execution of the interaction procedure 300 or during a training procedure.
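A minimal sketch of the acquisition procedure 400 is given below; the in-memory SQLite table stands in for the database 102, the application units are simulated by plain functions, and elapsed time is used as a stand-in for the performance data DP — all of these are assumptions made only for illustration.

import sqlite3
import time
from concurrent.futures import ThreadPoolExecutor

def run_unit(unit_name, task_fn, payload):
    # Execute the assigned TASK on one application unit and measure
    # a simple performance figure (elapsed time).
    start = time.perf_counter()
    task_fn(payload)
    return unit_name, time.perf_counter() - start

def acquisition_procedure(units, payload, db_path=":memory:"):
    con = sqlite3.connect(db_path)
    con.execute("CREATE TABLE IF NOT EXISTS dp (unit TEXT, elapsed REAL)")
    with ThreadPoolExecutor() as pool:  # step 402: concurrent execution of TASK
        futures = [pool.submit(run_unit, name, fn, payload)
                   for name, fn in units.items()]  # step 401: selected group
        for future in futures:
            unit, elapsed = future.result()  # step 403: receive DP
            con.execute("INSERT INTO dp VALUES (?, ?)", (unit, elapsed))  # step 404
    con.commit()
    return con

# Usage with two simulated application units executing the same TASK:
units = {"unit_a": lambda text: text.upper(), "unit_b": lambda text: text.title()}
acquisition_procedure(units, "describe the scene ahead")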
As the person skilled in the art can easily understand, a further aspect of the present invention relates to a computerized system 500 comprising the computerized device 1 and the computerized platform 100 described above. Given that the computerized device 1 and the computerized platform 100 are able to communicate wirelessly at high speed, the computerized system 500 represents a true IoT (Internet of Things) system.
The computerized device 1 can be embodied in practice according to further variants falling within the scope of the present invention.
According to further possible embodiments of the invention, the control unit 6 comprises an additional artificial intelligence unit 60.
Preferably, this additional artificial intelligence unit 60 is pre-trained before being installed on board the control unit 6 (and subsequently updated there). It can then interact with the user on the basis of the training received.
In practice, this additional artificial intelligence unit is preferably intended to execute only inferential tasks. This makes it possible to arrange it on board the control unit 6 without excessively increasing the computing power required of the control unit 6.
The use of an additional artificial intelligence unit 60 on board the computerized device 1 offers the advantage of simplifying interaction with the user and of providing assistance in carrying out simpler activities, without the need to communicate with the computerized platform 100.
The use of an additional artificial intelligence unit 60 is also useful for providing a minimum level of assistance to the user if, for any reason, it is not possible to communicate with the computerized platform 100, for example due to a lack of network coverage.
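The fallback behaviour described above could be sketched as follows; the platform URL, the canned local answers and the simple text exchange are hypothetical and serve only to illustrate the division of labour between the platform 100 and the on-board unit 60.

import urllib.request

# Hypothetical canned answers of the pre-trained, inference-only unit 60.
LOCAL_ANSWERS = {
    "what time is it": "It is three in the afternoon.",
    "read this label": "I can read simple labels even without connection.",
}

def answer(request_text: str,
           platform_url: str = "https://platform.example/assist") -> str:
    # Prefer the remote computerized platform 100; fall back to the
    # on-board unit 60 when the platform cannot be reached.
    try:
        with urllib.request.urlopen(platform_url, timeout=3) as response:
            return response.read().decode()
    except OSError:
        # Minimum level of assistance via the local inference-only unit 60.
        return LOCAL_ANSWERS.get(request_text.lower(),
                                 "I cannot reach the platform right now.")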
The computerized system 500, according to the invention, has significant advantages over the prior art.
Unlike the known solutions of the state of the art, the computerized system 500 is able to provide assistance to blind or visually impaired people in carrying out a number of activities, even ones very different from each other.
In practice, the computerized system 500 is able to provide assistance to blind or partially sighted people in many aspects of daily life, effectively responding to their needs and improving their autonomy.
The computerized system 500 is able to provide assistance to blind or visually impaired people even in dealing with unexpected situations during the performance of a certain activity.
The computerized system 500 comprises a wearable computerized device 1 which has a very compact overall structure that makes it particularly suitable for being worn by the user, for example as a pair of glasses. The computerized device 1 is easy and intuitive to use in practice and does not require additional devices in order to interact with the remote computerized platform 100.
The computerized system 500 (in particular the wearable computerized device 1) can easily be produced at an industrial level, with hardware/software (HW/SW) manufacturing methods of known type.

Claims

1. Computerized system (500) to provide assistance to a user, characterised in that it comprises:
at least one computerized device (1) comprising:
- a support structure (2) shaped or arranged to be wearable by the user;
- a control unit (6) at least partially housed in or fixed to said support structure;
- sensor means (3) adapted to detect data or images related to the environment surrounding the user, said sensor means being at least partially housed in or fixed to said support structure and being operationally connected to said control unit;
- user interface means (4) adapted to allow the user to send or receive sound signals, said user interface means being at least partially housed in or fixed to said support structure and operatively connected to said control unit;
- wireless communication means (9, 10A, 10B) at least partially housed in or fixed to said support structure and operatively connected to said control unit, said communication means comprising at least a communication module (9) for high-speed communications;
a computerized platform (100) configured to communicate with said computerized device, said computerised platform comprising:
- an artificial intelligence unit (101) configured to execute automatic learning algorithms and to interact with the user of said computerized device, said artificial intelligence unit being also configured to interact with one or more application units (150) internal or external to said computerised platform (100);
- a database (102) configured to store historical data (DS) including data provided by said computerized device (1) or by other computerized platforms (600) or data processed by said artificial intelligence unit (101);
wherein said artificial intelligence unit (101) is configured to execute an interaction procedure (300) in real time with the user of said computerized device (1), said interaction procedure comprising a first sequence of steps including:
- receiving (301) at least a first request (R1) from the user or at least first input data (D1) indicative of a condition of the user or a condition of the environment surrounding the user, through said computerised device;
- executing (302) a first data processing activity (TASK1) to provide assistance to the user, said first processing activity including one or more of the following activities:
- executing voice or visual recognition algorithms to interpret said first user request (R1) or said first input data (D1) and to determine said first data processing activity (TASK1);
- executing algorithms for selecting and processing historical data (DS) stored in said database (102);
- selecting, where necessary, an application unit (150) capable of executing said first data processing activity (TASK1) and acquiring the data provided by said application unit during the execution of said first data processing activity;
- based on the results (D2, D20) of said first data processing activity (TASK1), interacting (303) with the user, through said computerised device, to provide assistance to the user.
2. Computerized system, according to claim 1, characterised in that said interaction procedure (300) comprises a continuous acquisition of further input data indicative of a user’s condition or of a condition of the environment surrounding the user through said computerised device (1).
3. Computerized system, according to one or more of the preceding claims, characterised in that said interaction procedure (300) with the user of said computerised device (1) comprises a second sequence of steps (3030) including:
- receiving (303A), through said computerized device, at least a second request (R2) from the user or at least second input data (D3) indicative of a user's condition or a condition of the environment surrounding the user;
- executing (303B) a second data processing activity (TASK2) to provide assistance to the user, said second processing activity including one or more of the following activities:
- executing voice or visual recognition algorithms to interpret said second user request (R2) or said second input data (D3) and to determine said second data processing activity (TASK2);
- executing algorithms for selecting and processing historical data (DS) stored in said database (102);
- selecting, where necessary, an application unit (150) capable of executing said second data processing activity (TASK2) and acquiring the data provided by said application unit during the execution of said second data processing activity;
- based on the results (D4, D40) of said second data processing activity (TASK2), interacting (303C) with the user, through said computerized device (1), to provide assistance to the user;
said second sequence of steps (3030) being performed in parallel to one or more steps (301, 302, 303) of the first sequence of steps of the interaction procedure (300) with the user.
4. Computerized system, according to one or more of the preceding claims, characterised in that said artificial intelligence unit (101) is configured to execute an acquisition procedure (400) of performance data (DP) indicative of the performance of one or more application units (150), said acquisition procedure including the following steps:
- selecting (401) a group of application units (150) comprising one or more application units capable of executing a data processing activity (TASK);
- assigning (402) said data processing activity (TASK) to each application unit;
- receiving (403), from each application unit (150), performance data (DP) indicative of the performance of said application unit in the execution of said data processing activity (TASK);
- storing (404) said performance data (DP) in said database (102).
5. Computerized system, according to one or more of the preceding claims, characterised in that said artificial intelligence unit (101) is configured to execute an approach procedure (200) with the user of said computerized device (1), said approach procedure including the following steps:
- checking (201) whether said computerized device (1) is in an activated state;
- if said computerized device (1) is in an activated state, sending (202) a first invitation message (M1) to the user, through said computerised device;
- receiving (203) a response message (R0) from the user, through said computerized device;
- determining (204) if the user intends to carry out any activity;
- if the user intends to carry out some activity, selecting (205) a class of data processing activities to provide assistance to the user;
- sending (206) a second invitation message (M2) to the user, through said computerised device.
6. Computerized system, according to one or more of the preceding claims, characterised in that the support structure (2) of said computerised device (1) is shaped or arranged so as to be wearable by a user at the face or head level.
7. Computerized system, according to claim 6, characterised in that said support structure (2) is shaped or arranged like an eyeglass frame.
8. Computerized system, according to one or more of the preceding claims, characterised in that the control unit (6) of said computerised device (1) comprises a removable memory (6A).
9. Computerized system, according to one or more of the preceding claims, characterised in that the control unit (6) of said computerized device (1) comprises a further artificial intelligence unit (60) configured to interact with a user of said computerized device.
10. Computerized system, according to one or more of the preceding claims, characterised in that the sensor means (3) of said computerized device (1) comprise one or more cameras.
11. Computerized system, according to claim 10, characterised in that said cameras (3) are arranged so as to cover a horizontal field of view of at least 240°.
12. Computerized system, according to one or more of the preceding claims, characterised in that the user interface means (4) of said computerized device (1) comprise one or more earphones with microphone.
13. Computerized system, according to one or more of the preceding claims, characterised by the fact that the at least a communication module (9) of said computerized device (1) is a communication module for high-speed communications through the telephone network.
14. Computerized system, according to one or more of the preceding claims, characterised by the fact that the wireless communication means of said computerized device (1) comprise one or more communication modules (10A) for communications at local level.
15. Computerized system, according to one or more of the preceding claims, characterised in that the wireless communication means of said computerized device (1) comprise one or more communication modules (10B) for communicating with a GPS detection system.
16. Computerized system, according to one or more of the preceding claims, characterised by the fact that said computerized device (1) comprises power supply means (7, 8) at least partially housed in or fixed to said support structure.
17. Computerized system, according to one or more of the preceding claims, characterised by the fact that said computerized device (1) comprises one or more micro-displays (11) at least partially housed in or fixed to said support structure.
18. Computerized system, according to one or more of the preceding claims, characterised by the fact that said computerized device (1) comprises activation means (5) that can be operated by the user to activate said computerized device.