WO2023012215A1 - Interacting with intelligent objects in the environment of mobile devices


Info

Publication number: WO2023012215A1
Application number: PCT/EP2022/071813
Authority: WO, WIPO (PCT)
Prior art keywords: mobile device, sensor data, intelligent object, user, gesture
Other languages: German (de), English (en)
Inventor: Klaus David
Original assignee: Universität Kassel
Priority date: 2021-08-04
Filing date: 2022-08-03

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04L: TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L 67/00: Network arrangements or protocols for supporting network services or applications
    • H04L 67/01: Protocols
    • H04L 67/12: Protocols specially adapted for proprietary or special-purpose networking environments, e.g. medical networks, sensor networks, networks in vehicles or remote metering networks
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/016: Input arrangements with force or tactile feedback as computer generated output to the user
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/017: Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03: Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/033: Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F 3/0346: Pointing devices displaced or positioned by the user, with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04L: TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L 67/00: Network arrangements or protocols for supporting network services or applications
    • H04L 67/01: Protocols
    • H04L 67/1396: Protocols specially adapted for monitoring users' activity
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04L: TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L 67/00: Network arrangements or protocols for supporting network services or applications
    • H04L 67/50: Network services
    • H04L 67/52: Network services specially adapted for the location of the user terminal
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04W: WIRELESS COMMUNICATION NETWORKS
    • H04W 4/00: Services specially adapted for wireless communication networks; Facilities therefor
    • H04W 4/30: Services specially adapted for particular environments, situations or purposes
    • H04W 4/38: Services specially adapted for particular environments, situations or purposes for collecting sensor information

Definitions

  • The present invention relates to a device for interacting with an intelligent object in an environment of a mobile device.
  • The present invention further relates to a mobile device and a method for interacting with an intelligent object in an environment of the mobile device, and to a computer program product.
  • The term Smart City summarizes approaches for applying advancing digitization in cities and communities.
  • In particular, the equipping and upgrading of existing and new infrastructures with digital technology and the linking of previously separate infrastructures or their subsystems are to be promoted.
  • Smartphones in particular, but also tablet computers, smart glasses, smart watches and other mobile devices are omnipresent and offer extensive possibilities for use in a smart city.
  • Applications can be found, among other things, in the field of transport, for example displaying current timetables for public transport; information, for example guiding and informing tourists; and environmental protection, for example recording and specifically avoiding local pollution.
  • The Internet of Things and Services is often interpreted to mean that sensors are embedded in intelligent objects within a smart city, which collect data and make it available for applications.
  • An intelligent object is understood to mean in particular an object that is capable of communication in some way and with which an interaction in the sense of a reaction of the intelligent object to inputs, signals or queries is possible.
  • Intelligent objects can be, for example, timetable displays at tram stops, traffic lights, street lights, irrigation systems, advertising pillars, information boards, sensors, stumbling stones (Stolpersteine), restaurants (menu, reservation), etc.
  • US 2020/0073482 A1 discloses a method and a system for detecting tactile interactions in augmented reality.
  • Program actions may be initiated upon detection of a user's predefined gesture with a real-world object.
  • Users can interact with their environment in augmented reality, with interaction with real objects being detected using a combination of location and motion detection and identification using wearable sensors.
  • A predefined gesture may be identified and a program action associated with a corresponding target interaction with the real-world object performed.
  • The user's experience can be enhanced by providing haptic feedback on a tactile gesture and event.
  • A challenge in this context lies in simplifying the interaction of city dwellers or users of mobile devices with intelligent objects. For example, older people often have difficulty using modern technologies and intuitively grasping their user interfaces. Approaches to integrating digitization into everyday life in a smart city or in other contexts are only as good as their interface, that is, the possibility of interaction they offer users.
  • The object of the present invention is to enable or simplify interaction with intelligent objects.
  • In particular, an approach is to be created that enables an intuitively comprehensible interaction.
  • A simple and easy-to-use interaction option should be created.
  • In a first aspect, the present invention relates to an apparatus for interacting with an intelligent object in an environment of a mobile device, comprising: an input interface for receiving environment sensor data and orientation sensor data, the environment sensor data comprising information about a relative position of the intelligent object in relation to the mobile device and the orientation sensor data comprising information about an orientation of the mobile device in relation to the environment; an alignment unit for detecting whether the mobile device is aligned with the intelligent object based on the environment sensor data and the orientation sensor data; a communication unit for establishing a communication link with the intelligent object when the mobile device is aligned with the intelligent object; and a user interface for notifying a user of the mobile device when the communication link is established. A structural sketch of these components follows below.
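  • The following is a minimal structural sketch, in Python, of how these four claimed components could fit together. All class and method names, the 10° field-of-view value and the console notification are illustrative assumptions, not part of the patent.

```python
# Minimal structural sketch of the claimed device (16); names are illustrative.
from dataclasses import dataclass

@dataclass
class EnvironmentSensorData:
    object_bearing_deg: float   # direction from the mobile device to the object
    object_distance_m: float

@dataclass
class OrientationSensorData:
    device_heading_deg: float   # compass heading the device currently points at

class InteractionDevice:
    """Device (16): input interface (24), alignment unit (26),
    communication unit (28) and user interface (30)."""

    def __init__(self, fov_deg: float = 10.0):
        self.fov_deg = fov_deg      # assumed field-of-view width
        self.connected = False

    def receive(self, env: EnvironmentSensorData, ori: OrientationSensorData):
        """Input interface (24): receive both sensor data streams."""
        if self.is_aligned(env, ori):   # alignment unit (26)
            self.connect()              # communication unit (28)

    def is_aligned(self, env, ori) -> bool:
        """Alignment unit (26): compare device heading with the object bearing."""
        delta = (env.object_bearing_deg - ori.device_heading_deg + 180) % 360 - 180
        return abs(delta) <= self.fov_deg / 2

    def connect(self):
        """Communication unit (28): establish the link once, then notify."""
        if not self.connected:
            self.connected = True
            self.notify_user()          # user interface (30)

    def notify_user(self):
        print("Connection established, e.g. trigger a vibration signal.")

# Object at bearing 72°, device heading 70°: within the assumed field of view.
dev = InteractionDevice()
dev.receive(EnvironmentSensorData(72.0, 30.0), OrientationSensorData(70.0))
```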
  • In a further aspect, the present invention relates to a mobile device for interacting with an intelligent object in an environment of the mobile device, comprising: an environment sensor for detecting a relative position of the intelligent object; an orientation sensor for detecting an orientation of the mobile device; and an apparatus as described above.
  • Further aspects of the invention relate to a method corresponding to the device and to a computer program product with program code for performing the steps of the method when the program code is executed on a computer.
  • One aspect of the invention relates to a storage medium on which a computer program is stored. The computer program, when executed on a computer, causes execution of the method described herein.
  • Environment sensor data on the one hand and orientation sensor data on the other hand are processed in the device.
  • Information about a relative position of the intelligent object in relation to the mobile device is received from an environment sensor.
  • The aim is to determine where the intelligent object is located. For this purpose, in particular, an absolute or relative position of the intelligent object is received.
  • Information about how the mobile device is oriented in relation to the environment is received from an orientation sensor. In particular, therefore, a relative position of the mobile device in relation to the environment is received.
  • Based on the environment sensor data and the orientation sensor data, it can be detected whether the mobile device is aimed at the intelligent object. In particular, it can be determined whether a direction predefined for the mobile device corresponds to the direction in which the intelligent object is located.
  • The direction predefined for the mobile device can, for example, correspond to a direction orthogonal to a main display of the mobile device or to a direction along a longitudinal axis of the mobile device, depending on the mobile device or its orientation. In other words, the direction predefined for the mobile device can also be adapted or changed, if necessary, based on the orientation of the mobile device (see the sketch below).
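  • As an illustration of such a predefined pointing direction, the following sketch derives a pointing vector from Euler angles for the two device axes mentioned above. The axis conventions and rotation order are assumptions made for illustration, not specifications from the patent.

```python
# Sketch: deriving the world-frame pointing direction from the device attitude.
# Assumed device frame: x to the right edge, y out of the top edge, z out of
# the display.
import numpy as np

def rotation_matrix(yaw: float, pitch: float, roll: float) -> np.ndarray:
    """Intrinsic z-y'-x'' rotation from Euler angles in radians."""
    cy, sy = np.cos(yaw), np.sin(yaw)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cr, sr = np.cos(roll), np.sin(roll)
    rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
    ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
    return rz @ ry @ rx

def pointing_vector(yaw: float, pitch: float, roll: float,
                    mode: str = "longitudinal") -> np.ndarray:
    """World-frame direction of the device axis defining the pointing direction."""
    axis = {"longitudinal": np.array([0.0, 1.0, 0.0]),    # out of the top edge
            "display_normal": np.array([0.0, 0.0, 1.0])}  # out of the display (AR mode)
    return rotation_matrix(yaw, pitch, roll) @ axis[mode]

# Device held flat and yawed by 45°: the longitudinal pointing direction follows.
print(pointing_vector(np.radians(45.0), 0.0, 0.0))
```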
  • If this is the case, a communication link is established with the intelligent object using a communication unit.
  • In this way, a possibility of interacting with the intelligent object is established.
  • The user of the mobile device is informed about this via a user interface of the device or a corresponding output unit of the mobile device. The user can then interact with the intelligent object in his environment.
  • The detection of the orientation provided according to the invention, and the establishment of a communication connection depending on that orientation, bring about a simplification.
  • The initiation of communication becomes more intuitive and automatically matches the current context of the user.
  • Direct input by the user can be dispensed with, for example.
  • The user can directly interact with intelligent objects without any complex configuration. This simplifies the interaction between users and intelligent objects, which can lead to improved acceptance of smart city concepts.
  • Preferably, the device comprises a gesture unit for recognizing a gesture performed by a user of the mobile device based on the orientation sensor data.
  • The gesture unit is preferably designed to recognize the gesture based on a machine learning approach, in particular based on a pre-trained artificial neural network.
  • A gesture is understood to mean, in particular, a movement that the user carries out using the mobile device. For example, a user can swivel their smartphone, and this movement can be detected using a sensor. This gesture can then be recognized.
  • In particular, a pre-trained artificial neural network or another machine learning approach is used to recognize the gesture reliably and quickly.
  • Additionally or alternatively, a database can be used for the recognition (a minimal recognizer is sketched below).
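  • The sketch below illustrates one way such a recognizer could work on a window of gyroscope samples. A nearest-centroid classifier stands in for the pre-trained artificial neural network mentioned above; the feature set and the gesture templates are made-up assumptions.

```python
# Sketch of gesture recognition on a window of orientation sensor samples.
import numpy as np

def features(window: np.ndarray) -> np.ndarray:
    """window: (n_samples, 3) gyroscope rates; simple statistical features."""
    return np.concatenate([window.mean(axis=0), window.std(axis=0)])

# Hypothetical per-gesture feature centroids, as if learned offline from
# training data; in the patent's terms, the pre-trained model.
CENTROIDS = {
    "swipe":  np.array([0.0, 2.5, 0.0, 0.2, 1.0, 0.2]),
    "rotate": np.array([3.0, 0.0, 0.0, 1.2, 0.2, 0.2]),
    "none":   np.zeros(6),
}

def recognize(window: np.ndarray) -> str:
    """Classify the window as the gesture with the nearest feature centroid."""
    f = features(window)
    return min(CENTROIDS, key=lambda g: np.linalg.norm(f - CENTROIDS[g]))

# A burst of rotation around the device's x-axis is classified as "rotate".
rng = np.random.default_rng(0)
sample = rng.normal([3.0, 0.0, 0.0], [1.2, 0.2, 0.2], size=(50, 3))
print(recognize(sample))
```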
  • Preferably, the communication unit is designed to establish the communication connection on the condition that an activation gesture has previously been recognized. Additionally or alternatively, the communication unit is designed to end the communication connection when the communication connection is established and a deactivation gesture is recognized.
  • The establishment and termination of the communication connection can thus be made dependent on a gesture, in addition to the detection of the alignment of the mobile device with respect to the intelligent object. This ensures that a user of the mobile device only establishes a communication link consciously. Unintentional interaction with intelligent objects is avoided.
  • The activation and deactivation gestures can, for example, correspond to rotating and/or turning movements of a smartphone (see the state-machine sketch below). The possibility of interaction is further simplified and usability is improved.
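  • The gating described above can be pictured as a small state machine. The following sketch is one possible implementation under that assumption, with illustrative gesture names and API.

```python
# Sketch: an activation gesture is required before a connection may be
# established; a deactivation gesture tears an established connection down.
from enum import Enum, auto

class State(Enum):
    IDLE = auto()       # no activation gesture seen yet
    ARMED = auto()      # activation gesture recognized, may connect
    CONNECTED = auto()

class ConnectionGate:
    def __init__(self):
        self.state = State.IDLE

    def on_gesture(self, gesture: str):
        if self.state is State.IDLE and gesture == "activate":
            self.state = State.ARMED
        elif self.state is State.CONNECTED and gesture == "deactivate":
            self.state = State.IDLE         # terminate the connection

    def on_aligned(self):
        """Called when the alignment unit detects the device points at the object."""
        if self.state is State.ARMED:
            self.state = State.CONNECTED    # only a conscious connection is made

gate = ConnectionGate()
gate.on_aligned()                # ignored: no activation gesture yet
gate.on_gesture("activate")
gate.on_aligned()
print(gate.state)                # State.CONNECTED
gate.on_gesture("deactivate")
print(gate.state)                # State.IDLE
```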
  • Preferably, the communication unit is designed to transmit a control command to the intelligent object when the communication connection is established and a control gesture is recognized. It is thus also possible for the actual interaction with the intelligent object to take place based on gesture recognition.
  • A control gesture can be a gesture that causes a specific action or reaction of the intelligent object. For example, further information can be requested from the intelligent object by a swiping movement, or a display of the intelligent object can be changed.
  • An intuitive and easy-to-use interaction option is provided through the use of gesture recognition and detection of a control gesture. It becomes possible for users of the device according to the invention to interact with intelligent objects in an intuitive manner.
  • Preferably, the communication unit is designed to receive interaction data with information on the user's options for interacting with the intelligent object.
  • The user interface is then designed to inform the user about the interaction options based on the interaction data received, preferably by means of an indication on a display.
  • In this embodiment, the various options for interacting with the intelligent object (interaction options) are first received. The user is then informed about the interaction options and can start the desired interaction. For example, available information can be displayed so that the user of the mobile device can decide which information he needs and wants to query. It is also possible for control options to be displayed along with the appropriate user inputs, enabling the user to interact with and control the intelligent object. In this respect, the possibility of interaction is further improved and simplified.
  • Preferably, the user interface is designed to output an acoustic, optical and/or haptic signal, in particular a haptic vibration signal by means of a vibration actuator of the mobile device.
  • The user interface can thus announce the establishment of the communication connection to the user by a corresponding signal.
  • The use of a vibration actuator of the mobile device is particularly advantageous and intuitive.
  • The mobile device can thus indicate a kind of snap-in, or make it tangible for the user, when the communication connection has been established. Operation is further simplified and designed to be intuitive for the user.
  • Preferably, the user interface is designed to visualize the relative position of the intelligent object and the orientation of the mobile device based on the environment sensor data and the orientation sensor data, in particular by displaying the position of the intelligent object on a map of the area that is centered on the position of the user and oriented according to the orientation of the mobile device.
  • A display of the mobile device can be used for the visualization. It indicates that there is an intelligent object in the vicinity of the user or mobile device. In addition, it shows where, i.e. at which relative position, this intelligent object is located in relation to the position of the user. This enables the user to get an overview of available intelligent objects in his environment and to interact with them on this basis. The result is a simple and intuitive interaction option (the underlying projection is sketched below).
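  • The projection underlying such a map display can be sketched as follows; the pixel scale and screen center are illustrative assumptions.

```python
# Sketch: project an object's position into screen coordinates on a map that
# is centered on the user and rotates with the device heading, so that "up"
# on the display is always the pointing direction.
import math

def to_screen(obj_bearing_deg: float, obj_distance_m: float,
              device_heading_deg: float,
              px_per_m: float = 2.0, center=(200, 300)):
    """Returns (x, y) in pixels; y grows downwards as on a display."""
    rel = math.radians(obj_bearing_deg - device_heading_deg)  # 0 = straight ahead
    x = center[0] + obj_distance_m * px_per_m * math.sin(rel)
    y = center[1] - obj_distance_m * px_per_m * math.cos(rel)
    return round(x), round(y)

# Object 50 m away at bearing 115°; device heading 115°: drawn straight above
# the centered user marker, matching the compass-rose behaviour described here.
print(to_screen(115, 50, 115))   # (200, 200)
```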
  • Preferably, the communication unit is designed to establish the communication connection when the mobile device is aimed at the intelligent object for a period of time that exceeds a predefined threshold value.
  • A time threshold thus has to be exceeded before the communication connection is established.
  • The user points the mobile device at an intelligent object for a certain period of time, which indicates that a communication link should be established (see the dwell-timer sketch below). Operating errors are avoided and intuitiveness is further improved.
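  • A possible dwell timer, sketched under the assumption of periodic alignment samples; the 2-second threshold matches the example given further below.

```python
# Sketch: connect only after the device has been aligned with the object
# continuously for longer than a threshold.
import time

class DwellTrigger:
    def __init__(self, threshold_s: float = 2.0):
        self.threshold_s = threshold_s
        self.aligned_since = None

    def update(self, aligned: bool, now: float) -> bool:
        """Feed one alignment sample; returns True once the dwell time is reached."""
        if not aligned:
            self.aligned_since = None      # any break restarts the timer
            return False
        if self.aligned_since is None:
            self.aligned_since = now
        return now - self.aligned_since >= self.threshold_s

trigger = DwellTrigger()
t0 = time.monotonic()
print(trigger.update(True, t0))            # False: alignment just started
print(trigger.update(True, t0 + 2.5))      # True: held long enough, connect now
```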
  • Preferably, the communication unit is designed to establish the communication connection with the intelligent object via a direct short-distance communication connection, in particular via Bluetooth Low Energy (BLE).
  • Direct communication means that internet-based communication can be dispensed with. In this respect, use in an environment without a mobile data connection is also conceivable.
  • In addition, a direct connection is often more energy-efficient and less susceptible to manipulation.
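  • As an illustration of such a direct connection, the following sketch discovers nearby BLE advertisers using the cross-platform Python library bleak. The patent does not prescribe a specific BLE stack, and the advertised name prefix "smartcity-" is a made-up assumption.

```python
# Sketch: discover intelligent objects over a direct BLE link, no internet needed.
import asyncio
from bleak import BleakScanner

async def find_smart_objects(prefix: str = "smartcity-", timeout: float = 5.0):
    devices = await BleakScanner.discover(timeout=timeout)
    # Keep only advertisements that look like intelligent objects (assumed naming).
    return [d for d in devices if d.name and d.name.startswith(prefix)]

async def main():
    for dev in await find_smart_objects():
        print(dev.address, dev.name)

if __name__ == "__main__":
    asyncio.run(main())
```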
  • The alignment unit is designed to detect whether the mobile device is aligned with the intelligent object.
  • This detection can be based on a machine learning approach, in particular based on a pre-trained artificial neural network. Additionally or alternatively, this detection can be based on a pointing direction predefined for the mobile device.
  • The pointing direction corresponds to the direction specified for the mobile device, which represents a type of device property.
  • The pointing direction of a device is therefore defined. For example, a device is specified to have a pointing direction orthogonal to its display or along its longitudinal axis. This pointing direction is the basis for determining whether the mobile device is aimed at the intelligent object.
  • A machine learning approach can be used for the detection of the orientation, which enables a reliable and efficiently computable execution. In addition, a high degree of accuracy can be achieved.
  • Preferably, the environment sensor data include position information of intelligent objects that was determined based on a current position of the mobile device, preferably by a database query based on a position determined by a satellite navigation sensor of the mobile device.
  • Additionally or alternatively, the environment sensor data include image data from a camera sensor of the mobile device.
  • In particular, a satellite navigation system (GPS, GALILEO, GLONASS, BeiDou, etc.) is used to determine the current position of the mobile device. This position can then be displayed on a map, which also shows the positions of intelligent objects.
  • The positions of the intelligent objects can, for example, be queried from an internet database based on one's own position, taking into account a radius around that position (see the query sketch below). The user can then orient himself and, for example, target intelligent objects in the immediate vicinity in order to interact with them.
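  • Such a radius query could look as follows; the object list is a made-up stand-in for the internet database mentioned above.

```python
# Sketch: given the device's satellite-navigation fix, return all intelligent
# objects within a radius.
import math

SMART_OBJECTS = [  # (name, lat, lon) - illustrative example entries
    ("traffic_light_12", 51.3127, 9.4797),
    ("tram_stop_weigel", 51.3160, 9.4890),
]

def haversine_m(lat1, lon1, lat2, lon2) -> float:
    """Great-circle distance in metres."""
    r = 6_371_000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def nearby(lat: float, lon: float, radius_m: float = 250.0):
    hits = []
    for name, olat, olon in SMART_OBJECTS:
        d = haversine_m(lat, lon, olat, olon)
        if d <= radius_m:
            hits.append((name, round(d)))
    return hits

# Only the traffic light (a few tens of metres away) lies within the radius.
print(nearby(51.3130, 9.4800))
```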
  • Alternatively or additionally, the environment sensor data may include image data from a camera sensor. It is then also conceivable, for example, to use augmented reality approaches in order to superimpose the intelligent objects virtually on the image data.
  • Preferably, the orientation sensor data include position sensor data received from an acceleration sensor, yaw rate sensor and/or magnetic field sensor of the mobile device.
  • Additionally or alternatively, the orientation sensor data include image data from a camera sensor of the mobile device.
  • It is thus possible for the orientation to be determined based on position sensor data from an inertial sensor. This usually enables a reliable and efficiently computable estimate of the device attitude (a simplified heading computation is sketched below).
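  • As a simplified illustration, the heading of a device held flat can be estimated from a magnetic field sample alone. The axis convention is an assumption, and tilt compensation using the acceleration sensor is deliberately omitted.

```python
# Sketch: compass heading from a magnetometer sample, device held flat.
# Assumed device frame: x to the right edge, y out of the top edge.
import math

def heading_deg(mx: float, my: float) -> float:
    """Heading of the device's longitudinal (y) axis, 0 deg = magnetic north,
    increasing clockwise towards east."""
    return math.degrees(math.atan2(-mx, my)) % 360.0

# Horizontal field mostly along +y: the top edge points roughly north.
print(heading_deg(-0.05, 0.4))   # about 7 degrees
```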
  • Interaction of a device or a mobile device with an intelligent object is understood here to mean, in particular, an exchange of information.
  • In particular, information can be received from the intelligent object and/or information can be transmitted to the intelligent object.
  • In addition, an interaction is understood to mean triggering an action on the mobile device and/or on the intelligent object by the respective communication partner.
  • For example, a display can be effected and/or haptic, tactile, acoustic or visual feedback can be output.
  • The environment of a mobile device or of a user of a mobile device is understood to mean, in particular, the area surrounding them, for example within a certain radius.
  • This radius can be variable.
  • An intelligent object is understood to mean, in particular, a communication-capable artefact.
  • Such an artefact can be embedded in an infrastructure object or in the environment.
  • In other words, an intelligent object can be, in particular, a physical object that has a communication option or in which a communication option is embedded.
  • In particular, an intelligent object can be a physical object of the infrastructure. For example, traffic lights, tram stops, sensors in green areas or sensors embedded in a historic building represent intelligent objects. At a tram stop, for example, current departure times can be displayed or a timetable can be requested.
  • It is also possible for an intelligent object to exist only virtually and, in this respect, to be in the form of a digital object (digital artefact, digital item).
  • In this case, a digital or virtual intelligent object is provided at a specific position, with which a mobile device or a user of a mobile device can interact.
  • A position is therefore linked to a communication option, with the communication then taking place, for example, with an internet server.
  • For example, certain display and interaction options can also be linked, as a digital object, to a position in the vicinity of a tram stop.
  • Likewise, a digital intelligent object can be provided in the middle of a public square, through which information about a builder of the square, a historical background, etc. can be obtained. It is also possible for a digital intelligent object to be in the form of a digital noticeboard or a digital pinboard, so that a (virtual) note can be attached and shared, or a reaction to such a note can be made, using a mobile device.
  • An interaction with an intelligent object is understood to mean, in particular, an exchange of information. It is possible, for example, for a control signal to be transmitted or received in order to trigger an action or reaction on the part of the intelligent object and/or the mobile device. It is also possible for a link or a reference to a downloadable app to be displayed, for example to enable access to further information and control options. In addition, retrieving current information or also transmitting a request signal is conceivable.
  • FIG. 1 shows a schematic representation of a mobile device according to the invention in interaction with an intelligent object in the vicinity of the mobile device;
  • Figure 2 is a schematic representation of a device according to the invention;
  • FIGS. 3 and 4 show a schematic representation of the establishment of a communication connection with an intelligent object and the control of the intelligent object;
  • FIG. 5 shows a schematic representation of a gesture recognition based on a mobile device according to the invention.
  • FIG. 6 shows a schematic representation of a method according to the invention.
  • A mobile device 10 according to the invention for interacting with an intelligent object 14 in the environment of the mobile device 10 is shown schematically in FIG. 1.
  • The representation is to be understood as a schematic plan view. It shows a user 12 holding a mobile device 10, for example a smartphone or a tablet, in his hand. According to the invention, it is provided that a communication connection is established between the mobile device 10 and the intelligent object 14 when the mobile device 10 is aligned with the intelligent object 14.
  • Also shown in FIG. 1 is a field of view 22 of the mobile device 10, which visualizes an area in which, for an intelligent object 14 located therein, it is assumed that the mobile device 10 is aligned with it.
  • The left-hand side of FIG. 1 shows that the mobile device 10 is not aligned with the intelligent object 14.
  • The right-hand side shows that the user 12 has turned together with the mobile device 10 until the mobile device 10 is aligned with the intelligent object 14.
  • In this case, a communication link is established between the mobile device 10 and the intelligent object 14, and the user 12 is informed of this, for example by the mobile device 10 vibrating.
  • Intelligent objects 14 in smart cities can be made easier to use.
  • For example, a traffic light request for pedestrians can be implemented.
  • For this purpose, the user 12 of the mobile device 10 points the mobile device 10 at the traffic light.
  • In response, a communication link is established and the user 12 is notified that the communication link has been established.
  • The user can then request a green phase, for example.
  • The application in a smart city described above is to be understood as an example. The approach could also be used in buildings, e.g. to interact with kitchen appliances or other intelligent objects (artefacts) in a private apartment, or with intelligent objects in public areas (train station, office, etc.).
  • The mobile device 10 in the exemplary embodiment shown comprises a device 16 for interacting with the intelligent object 14, an environment sensor 18 for detecting a relative position of the intelligent object, and an orientation sensor 20 for detecting an orientation of the mobile device 10.
  • The environment sensor 18 can in particular include a position sensor of a satellite navigation system. For example, a current absolute position of the mobile device 10 can be determined, which is then compared with positions of intelligent objects 14 obtained by a database query, in order to determine their relative positions (relative to the user's or mobile device's own position) on this basis.
  • The orientation sensor 20 can include, for example, an inertial sensor (acceleration and/or yaw rate sensor) and/or a magnetic field sensor.
  • An orientation (for example in the form of Euler angles) can be detected by such a sensor. In particular, this orientation can then be used in conjunction with the relative position to determine the alignment of the mobile device in relation to the intelligent object 14.
  • The device 16 according to the invention is shown schematically in FIG. 2.
  • The device 16 includes an input interface 24, an alignment unit 26, a communication unit 28 and a user interface 30.
  • The device 16 also includes a gesture unit 32.
  • The various interfaces and units can be implemented in hardware and/or software and can be designed individually or in combination. In particular, it is possible for the device 16 to correspond, so to speak, to a smartphone or another device with a corresponding app.
  • The environment sensor data and the orientation sensor data are received via the input interface 24.
  • The input interface 24 can be connected to corresponding sensors of the mobile device for this purpose.
  • It is also possible for the input interface 24 to receive the information from remotely located sensors, for example via a short-range, low-energy radio link.
  • For example, it is possible for the device to be integrated into intelligent glasses (smart glasses) and for data from a position sensor of a smartphone to be received via the input interface 24 via Bluetooth.
  • The alignment unit 26 detects whether the mobile device is aligned with the intelligent object. For this purpose, the environment sensor data and the orientation sensor data are processed. In this case, the data can be pre-processed and/or further data can be accessed. In particular, it is advantageous if a machine learning approach is used in the alignment unit 26 in order to detect whether the mobile device is aligned with the intelligent object. In particular, a pre-trained artificial neural network can be used that was trained based on corresponding training data at a previous point in time.
  • Additionally or alternatively, a predefined pointing direction of the mobile device can be used to detect whether the mobile device is aimed at the intelligent object.
  • This predefined pointing direction can in particular correspond to a direction specified for the mobile device, for which a permissible deviation can then optionally be specified.
  • The communication unit 28 is designed to communicate with the intelligent object.
  • In particular, a low-power, short-distance communication link can be established with the intelligent object.
  • For example, a Bluetooth Low Energy (BLE) communication chip can be used, which is controlled in a corresponding manner via the communication unit 28.
  • It is also possible for the communication unit 28 to be designed to communicate via the internet, i.e. via a corresponding internet server and a mobile data connection. It goes without saying that the intelligent object itself is also capable of communication and, to this extent, can be addressed or reached.
  • The user interface 30 is used for informing the user of the mobile device when the communication link has been established.
  • In particular, a display or another actuator of the mobile device can be controlled for this purpose.
  • This snap-in can be made perceptible based on a vibration function or a vibration actuator of the mobile device 10.
  • Additionally or alternatively, a visual output can take place, for example by marking a traffic light on a visualization of a map representation on the display of the mobile device 10.
  • A signal tone can also be emitted.
  • The user 12 is thus shown with haptic, acoustic and/or visual feedback that the smartphone has locked on and that an action in the sense of an interaction with the intelligent object 14 can take place. For example, a green phase of the traffic light can then be requested by gesture control or by pressing a button on a touch screen.
  • The user can also make a preselection via the user interface 30 with regard to which intelligent objects are to be displayed or received. For example, the user can select or define categories of intelligent objects of interest in order to interact with the environment in a more targeted manner and to find the intelligent objects or functionalities relevant to him.
  • The gesture unit 32 is used for recognizing a gesture performed by a user of the mobile device. To do this, the orientation sensor data are processed. In particular, a gesture that the user of the mobile device performs with the mobile device can be recognized. For example, the user can nod, in the case of smart glasses, or perform a swipe or another movement with a smartphone or tablet.
  • On the one hand, the detection can then be used as an additional prerequisite for establishing the communication link.
  • For example, it may be necessary to first perform an activation gesture in order to avoid unintentional connection establishment.
  • On the other hand, it is possible for a control gesture to be recognized by means of the gesture unit 32.
  • In this way, the user of the mobile device can remotely control an intelligent object using gesture control after the communication connection has been established.
  • In FIGS. 3 and 4, one possibility of establishing a communication link with an intelligent object and of controlling the intelligent object is shown schematically.
  • A view of a display 34 of a smartphone, tablet or other mobile device is shown before (FIG. 3) and after (FIG. 4) the establishment of the communication connection.
  • Relative positions R1, R2, R3 of three intelligent objects in the area surrounding the mobile device are visualized, together with the position M of the mobile device, on a map of that area. For example, thumbnails of the intelligent objects can be displayed.
  • The map display is centered on the position M of the mobile device.
  • The alignment of the mobile device is also visualized, since the representation moves with the rotation of the mobile device. The user can therefore rotate in relation to the intelligent objects, with the map display also rotating like a compass rose and thus visualizing the orientation.
  • If the mobile device is aligned with one of the intelligent objects, a communication link is established with it. This may require a minimum time (for example 2 seconds) for which the mobile device must be aligned with the intelligent object.
  • Whether the mobile device is aligned with the intelligent object can be determined based on a pointing direction of the mobile device, for example.
  • This pointing direction corresponds to a parameter set for the mobile device.
  • This pointing direction can correspond to a longitudinal axis of the mobile device, for example.
  • The field of view of a mobile device can be defined, for example, based on an indication of a maximum deviation from this pointing direction, for example 5°.
  • The user in FIG. 4 has aligned the mobile device with the intelligent object at position R2, so that a communication connection can then be established with it.
  • In the example shown, the mobile device vibrates to indicate to the user that the connection has been established.
  • Interaction with the intelligent object can then take place using a button 36 that is displayed for this purpose (FIG. 4).
  • The map view rotates like a compass rose, so that the top of the mobile device display 34 shows the direction in which the mobile device is being held.
  • This direction can be specified, for example, as a number of degrees. In the example shown, the direction is at 70 degrees (FIG. 3). If the mobile device is then rotated, for example to 115 degrees (FIG. 4), the user has a pedestrian traffic light as an intelligent object in front of him. As soon as the mobile device is aligned with the traffic light, the snapping process described above takes place and the traffic light is marked. The smartphone vibrates, a signal tone sounds and the image of a request button appears in the form of button 36. This shows the user with haptic, acoustic and visual feedback that the mobile device has locked on and that an action or interaction with the intelligent object can be carried out.
  • A green phase can then be requested, for example, with a slight backward movement of the mobile device.
  • In this example, the mobile device is intended to be held approximately level until it points to an intelligent object. It can also be possible for the mobile device to be rotated through 90 degrees, so that the real environment can then be shown on the display 34 by means of a camera arranged on the back of the mobile device. The pointing direction would then correspond to a direction perpendicular to the display 34. It is possible for the establishment of the communication connection to be indicated by locking in the form of an augmented reality display. For example, the snapped-in intelligent object can be highlighted.
  • A gesture control provided according to the invention is shown schematically in FIG. 5.
  • In principle, the gesture control can also be used in connection with other applications.
  • For example, a corresponding gesture control can be made available in a general smart city application for a city with networked buses, trams, museums and other information.
  • The user 12 holds a mobile device 10 (in particular a smartphone or a tablet) in his hand and uses it to perform a movement, for example a swipe.
  • In one embodiment, an activation gesture must first be recognized.
  • For example, this can correspond to the swiping movement that is made.
  • The activation gesture can represent the prerequisite for establishing a communication connection.
  • In addition, a control gesture may be recognized in order to interact with the intelligent object. For example, a green phase can be requested at a traffic light or a timetable can be requested from an intelligent tram stop.
  • It can be advantageous to first initiate a type of gesture control mode in order then to be able to carry out gesture recognition. It can also be advantageous if provision is made for a return from gesture control to classic operation, for example by touching a touchscreen. This can reduce the risk of detecting an unintentional gesture.
  • A relevant question in this context is how the user can arrive at the use of gesture control, i.e. initiate this gesture control mode.
  • One possibility for this is activating a corresponding selection point on a touchscreen of the mobile device. A corresponding button is thus pressed, which puts the mobile device into a gesture control or recognition mode.
  • Additionally or alternatively, the gesture control mode can be activated by a typical gesture (e.g. a right turn, like unlocking a lock with a key, or a click gesture, or an individually learned gesture).
  • Reaching the gesture control mode can be indicated by a kind of snap-in of the mobile device, implemented by means of a vibration actuator or in some other way.
  • Once the gesture control mode is activated, the user's gestures are recognized and the user can use them, for example, to interact with the intelligent object. In a similar way, switching back from gesture control to classic operation can take place.
  • A method for interacting with an intelligent object in an environment of a mobile device is shown schematically in FIG. 6.
  • The method includes the steps of receiving S10 environment sensor data and orientation sensor data, detecting S12 whether the mobile device is aligned with the intelligent object, establishing S14 a communication link with the intelligent object, and notifying S16 a user of the mobile device when the communication link has been established.
  • The method can be implemented, for example, as a smartphone app or a tablet app and made available via a corresponding app store. A sketch of the method steps follows below.
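  • Tying the steps together, the following sketch runs S10 to S16 as one loop iteration per sensor update; the helper logic, the 10° field of view and the example headings are illustrative assumptions, not the patent's prescribed implementation.

```python
# Sketch of the method as an app loop: receive (S10), detect alignment (S12),
# establish the connection (S14), notify the user (S16).

def angular_delta(a: float, b: float) -> float:
    """Signed smallest angle between two headings in degrees."""
    return (a - b + 180) % 360 - 180

def step(obj_bearing_deg: float, device_heading_deg: float,
         connected: bool, fov_deg: float = 10.0):
    """One iteration: returns the new connection state and a user notification."""
    aligned = abs(angular_delta(obj_bearing_deg, device_heading_deg)) <= fov_deg / 2  # S12
    if aligned and not connected:
        return True, "vibrate: connection to intelligent object established"          # S14 + S16
    if not aligned and connected:
        return False, "connection closed"
    return connected, None

state, note = False, None
for heading in (70, 90, 113, 115):            # user turns towards the object at 115°
    state, note = step(115, heading, state)   # S10: new sensor data each iteration
    if note:
        print(f"heading {heading} deg: {note}")
```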

Abstract

The invention relates to a device (16) for interacting with an intelligent object (14) in the environment of a mobile device (10), comprising: an input interface (24) for receiving environment sensor data and orientation sensor data, wherein the environment sensor data comprise information about the position of the intelligent object relative to the mobile device, and the orientation sensor data comprise information about the orientation of the mobile device relative to the environment; an alignment unit (26) for detecting whether the mobile device is aligned with the intelligent object based on the environment sensor data and the orientation sensor data; a communication unit (28) for establishing a communication connection with the intelligent object when the mobile device is aligned with the intelligent object; and a user interface (30) for informing a user of the mobile device when the communication connection has been established. The invention further relates to a mobile device (10) and a method.
PCT/EP2022/071813 2021-08-04 2022-08-03 Interacting with intelligent objects in the environment of mobile devices WO2023012215A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
DE102021120315.5A DE102021120315A1 (de) 2021-08-04 2021-08-04 Interacting with intelligent objects in the environment of mobile devices
DE102021120315.5 2021-08-04

Publications (1)

Publication Number Publication Date
WO2023012215A1 (fr)

Family

ID=83115611

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2022/071813 WO2023012215A1 (fr) 2021-08-04 2022-08-03 Interacting with intelligent objects in the environment of mobile devices

Country Status (2)

Country Link
DE (1) DE102021120315A1 (fr)
WO (1) WO2023012215A1 (fr)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20180352070A1 (en) * 2017-06-05 2018-12-06 Bose Corporation Wireless pairing and control
US20200073482A1 (en) 2017-03-21 2020-03-05 Pcms Holdings, Inc. Method and system for the detection and augmentation of tactile interactions in augmented reality
WO2020170105A1 (fr) * 2019-02-18 2020-08-27 Purple Tambourine Limited Interaction avec un dispositif intelligent à l'aide d'un dispositif de commande de pointage

Also Published As

Publication number Publication date
DE102021120315A1 (de) 2023-02-09

Legal Events

121 (EP): The EPO has been informed by WIPO that EP was designated in this application. Ref document number: 22761141; Country of ref document: EP; Kind code of ref document: A1.

NENP: Non-entry into the national phase. Ref country code: DE.

ENP: Entry into the national phase. Ref document number: 2022761141; Country of ref document: EP; Effective date: 20240304.