EP2616909A1 - Arrangement and related method for digital signage - Google Patents

Arrangement and related method for digital signage

Info

Publication number
EP2616909A1
Authority
EP
European Patent Office
Prior art keywords
arrangement
view
data
entity
user
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
EP10857203.3A
Other languages
German (de)
French (fr)
Other versions
EP2616909A4 (en)
Inventor
Petteri Lappalainen
Markus Porvari
Timo Arnivuo
Mikko Ahtiainen
Heikki Uljas
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hyperin Inc
Original Assignee
Hyperin Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hyperin Inc filed Critical Hyperin Inc
Publication of EP2616909A1 publication Critical patent/EP2616909A1/en
Publication of EP2616909A4 publication Critical patent/EP2616909A4/en

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q30/00 Commerce
    • G06Q30/06 Buying, selling or leasing transactions
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/20 Instruments for performing navigational calculations
    • G01C21/206 Instruments for performing navigational calculations specially adapted for indoor navigation
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00 Administration; Management
    • G06Q10/04 Forecasting or optimisation specially adapted for administrative or management purposes, e.g. linear programming or "cutting stock problem"
    • G06Q10/047 Optimisation of routes or paths, e.g. travelling salesman problem
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q30/00 Commerce
    • G06Q30/06 Buying, selling or leasing transactions
    • G06Q30/0601 Electronic shopping [e-shopping]

Definitions

  • Digital marketing is an evolving technology used in a retail setting that brings together promotional, advertising and informational content to local out-of-home environments.
  • Various studies indicate that over 70% of the consumer decision making process takes place at the location of the purchased service or product, and that digital advertisements are 5-10 times more likely to be noticed than static media. It is therefore no wonder that digital marketing is one of the world's fastest growing marketing channels, with a projected volume of 11.4 b€ by the year 2011, yet it is still only picking up momentum due to heavy costs, complicated hardware or system projects and the lack of standardized services.
  • Digital signage generally refers to the provision of information such as wayfinding data or marketing data including advertisements to visual perception by way of digital signs, i.e. digital displays like flat screens, media kiosks or feasible projection means, for instance.
  • the content to be visualized via the digital signage displays may be thus flexibly controlled even remotely by administrative personnel through the access of the associated content management entity such as a content server and a network infrastructure, such as a wired or wireless network, connecting the server with a number of signage displays.
  • Data may be ultimately locally visualized by each display such that also various environmental factors such as ambient lighting conditions are catered for.
  • a typical solution for e.g. indoor navigation in connection with a digital signage display located in a mall, airport or other location merely shows the associated floor plans in a stripped-down 2D map format, which is, however, rather sub-optimal regarding e.g. the natural human sense of the corresponding space, and may result in annoying misjudgments relative to correct position, heading, and route in the space as taken by the users.
  • the objective of the present invention is to at least alleviate one or more of the afore-explained defects associated with the prior art solutions and provide an interactive and adaptive alternative for digital signage equipment such as media kiosks or other terminals.
  • the objective is met by a digital signage arrangement and a related method in accordance with the present invention.
  • the digital signage arrangement may be configured to display data such as product, service, and/or wayfinding information to a user thereof via a suitable visualization device, such as a display or a projector, and further be configured to receive user input on the basis of which the view, i.e. one or more elements thereof, produced by the visualization device may be adapted so as to elevate the usability thereof from the standpoint of the particular user.
  • the arrangement may be physically realized as a substantially integrated entity such as a media kiosk, e.g. a touch screen stand, or through co-operation of multiple at least functionally connected entities such as a (touch)screen or other type of input/visualization entity connected to a remote computer, or as a more comprehensive terminal device, such as a mobile terminal, a PDA (personal digital assistant), or a tablet computer running e.g. a web browser and/or other enabling application and being connected to a remote computer system providing data, for instance.
  • a computerized electronic arrangement for digital signage such as a media kiosk like a touch screen stand, comprises
  • -a memory entity, such as one or more memory chips, configured to store data, such as product, service, and/or wayfinding information, to be visualized via the arrangement,
  • -a visualization entity, such as a display, configured to visualize the data, and
  • -a processing entity, such as at least one microprocessor or microcontroller, configured to control the visualization of the data
  • the arrangement may further be configured to perform the adaptation through utilization of at least one option selected from the group consisting of: move the element lower, e.g. substantially to the bottom of the view, move the element upper, e.g. substantially to the top of the view, center the element, zoom in or out the element or at least part thereof, change the font type associated with the element, change the font size associated with the element, change the font emphasis (e.g. bold, underlined, and/or italics) associated with the element, change the element size, enlarge the element, divide the element into a number of sub-elements, change at least one color associated with the element, switch symbolic and/or graphical representation into a textual one or vice versa, reproduce an audio signal describing the element, such as a speech or music sample or synthesized speech, and reproduce a visual representation describing an audio signal such as reproduced speech.
  • the arrangement may be configured to visualize an element associated with the adaptation, such as an icon or symbol forming e.g. a part of the view, wherein the element is selectable by the user utilizing the user input entity for triggering the adaptation of the view.
  • the arrangement may include a data transfer interface, such as at least one interface for wired communication and/or at least one wireless transceiver, for communication with at least one external entity. Alternatively or additionally, as being potentially physically distributed, one or more entities of the arrangement may mutually communicate utilizing the data transfer interface. Data to be transferred may include control instructions and/or data to be visualized to the users via the display entity (or otherwise reproduced, e.g. audibly), for example. Likewise, data such as usage statistics, user input and/or related requests may be transferred via the data transfer interface.
  • the arrangement such as the user input entity thereof, may include a touch screen for data output and input.
  • the display element of the touch screen visualizes the data
  • the touch sensitive part e.g. optical, infrared, imaging, resistive, and/or capacitive part, registers the user's touch including one or more parameters such as location, pressure (force), duration, etc.
  • the touch screen includes a dual-touch or a multi-touch feature capable of simultaneously registering two or more touches and preferably also touch locations, respectively. Different functionalities such as element movement, zooming, and/or element activation may be controlled accordingly.
  • the user may indeed be provided with a possibility to change e.g. the location of a visualized element and/or maximize/minimize it utilizing, for instance, either of the aforesaid touch techniques.
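  • As a rough illustration of how the dual-touch input described above could drive element movement and zooming, the following Python sketch classifies registered touch points into a drag or a pinch-zoom gesture; the TouchPoint structure and the gesture mapping are illustrative assumptions rather than part of the disclosed arrangement.

```python
from dataclasses import dataclass
from math import hypot

@dataclass
class TouchPoint:
    x: float
    y: float
    pressure: float = 1.0   # optional parameters registered by the touch layer
    duration: float = 0.0

def interpret_touch(start: list, end: list):
    """Classify a (dual-)touch gesture into an element move or a zoom.

    A single touch that travels is treated as a drag (element relocation);
    two touches whose mutual distance changes are treated as a pinch zoom.
    """
    if len(start) == 1 and len(end) == 1:
        dx, dy = end[0].x - start[0].x, end[0].y - start[0].y
        return ("move", dx, dy)
    if len(start) == 2 and len(end) == 2:
        d0 = hypot(start[0].x - start[1].x, start[0].y - start[1].y)
        d1 = hypot(end[0].x - end[1].x, end[0].y - end[1].y)
        return ("zoom", d1 / d0 if d0 else 1.0)
    return ("tap",)

# e.g. two fingers moving apart -> ("zoom", 2.0), i.e. magnify the element
print(interpret_touch([TouchPoint(100, 100), TouchPoint(200, 100)],
                      [TouchPoint(50, 100), TouchPoint(250, 100)]))
```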
  • the arrangement may include at least one user input entity selected from the group consisting of: a switch, button, keypad, keyboard, and touchpad.
  • the aforementioned data interface such as a predetermined transceiver may be configured to receive user input utilizing wired or wireless data transfer technology.
  • suitable wireless technology, e.g. Bluetooth, RFID, NFC (Near Field Communication), WiFi/WLAN (Wireless LAN), and/or a selected cellular network technology (GSM, UMTS, CDMA2000, etc.), may be applied.
  • GSM Global System for Mobile communications
  • UMTS Universal Mobile Telecommunications System
  • CDMA2000 Code Division Multiple Access 2000
  • a user may carry a personal communications-enabled device such as a cellular phone, a PDA, or a tablet computer along and apply it for providing user input to the arrangement.
  • Input may be submitted using textual messages such as SMS (Short Message Service) messages or via a USSD (Unstructured Supplementary Service Data) application, for instance.
  • SMS Short Message Service
  • USSD Unstructured Supplementary Service Data
  • a call to a predetermined number optionally shown via or on the arrangement may be applied for input purposes.
  • DTMF Dual-tone Multi-Frequency
  • the arrangement may be configured to provide wayfinding aid to a user thereof. In order to obtain this, e.g. the visualization entity may be applied.
  • the aforementioned wayfinding information may refer to location, map, and/or route information, and related data, for example, available via the arrangement.
  • a 2D map, floor plan, or other information may be shown to the users with optional route guidance data such as graphical data including e.g. arrows. Audio guidance such as spoken route instructions may be provided.
  • a substantially 3D representation displayed via a 3D projection, i.e. utilizing a two-dimensional plane provided by a display, for example, such as an isometric view or other wayfinding-enabling view like a bird's eye view, may be generated regarding e.g. a predetermined direction from the location of the arrangement or of at least one element like the visualization entity thereof.
  • the predetermined direction is the initial direction to which the user has to travel in order to reach the destination from the location.
  • the view may be a first-person view, i.e. it imitates, resembles or substantially shows the natural view the user could see with his/her eyes, even though the view produced by the visualization entity may contain artificial elements.
  • the wayfinding view may thus generally refer to a computer-generated, i.e. artificial, view with artificial elements such as direction arrows, and/or it may comprise natural elements such as a static or real-time camera view augmented with artificial elements like computer-generated route guidance data, e.g. the arrow symbols, overlaid thereon and resulting in an augmented reality view. Route information may thus be cleverly shown also with the 3D view.
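  • As a minimal sketch of the route guidance overlay described above, the initial travel direction can be computed from the (assumed planar) coordinates of the signage display and the first route waypoint and then mapped to an arrow symbol superimposed on the 3D or camera view; the coordinate frame and arrow mapping below are illustrative assumptions.

```python
from math import atan2, degrees

def initial_bearing(display_xy, first_waypoint_xy):
    """Bearing (degrees clockwise from the floor plan's 'up' direction) of the
    first route leg, i.e. the direction the user should initially walk."""
    dx = first_waypoint_xy[0] - display_xy[0]
    dy = first_waypoint_xy[1] - display_xy[1]
    return degrees(atan2(dx, dy)) % 360

def guidance_arrow(bearing_deg, view_heading_deg=0.0):
    """Pick an arrow label relative to the heading the view is rendered in."""
    rel = (bearing_deg - view_heading_deg) % 360
    arrows = ["straight ahead", "ahead-right", "right", "back-right",
              "back", "back-left", "left", "ahead-left"]
    return arrows[int((rel + 22.5) // 45) % 8]

# Display at (0, 0), first waypoint 10 m to the east -> bearing 90, arrow "right"
b = initial_bearing((0.0, 0.0), (10.0, 0.0))
print(b, guidance_arrow(b))
```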
  • the wayfinding embodiment may also be implemented as a stand-alone solution potentially independent of the view adaptation described hereinbefore.
  • the arrangement may be configured to provide vehicle finding aid, which may be considered as a special case of wayfinding.
  • Vehicle finding aid: for example, in the context of a mall, airport or other big complex with large-scale parking facilities, such as an optionally multi-storey garage facility, finding one's own vehicle, such as a car, after having spent some time doing completely different things, such as shopping, may turn out challenging for various reasons: quite often all the garages and related structures, parking spaces, etc. look the same, and without carefully observing characteristic features such as the floor number, near-by structures, and the route walked upon leaving the vehicle in the first place, a considerable amount of time may be unnecessarily consumed to find the vehicle again.
  • the arrangement may be thus arranged to provide route information, optionally including e.g. direction information such as initial heading, for locating the vehicle on the basis of user input.
  • the user input may include at least one element selected from the group consisting of: coordinates of the location of the vehicle, identification data of the location of the vehicle, and identification data such as register number of the vehicle. Data included in the user input may be first manually acquired by the user. He/she may write it down or, upon parking the vehicle, type it in or scan (photograph, for example) it utilizing his/her electronic notebook device such as a cellular phone or PDA, or other available device.
  • the data visualized in the near-by display or surface may include floor number and/or other ID, parking space number and/or other ID.
  • a barcode or a tag, e.g. an RFID tag, may be applied for representing and/or carrying the necessary information.
  • Identification data relating to the vehicle itself may be utilized in route determination provided that the parking facilities include technology, such as imaging technology, to detect and recognize the vehicle accordingly and monitor its movements and/or trace its parking location.
  • a vehicle may contain a tag that can be externally read for tracking purposes.
  • the arrangement may be configured to determine a route from the location of the arrangement, an element such as the visualization entity thereof, or some other predetermined location, back to the vehicle.
  • the arrangement may utilize a number of predetermined route determination criteria such as route length and/or estimated route duration. Typically minimization of both criteria is sought after.
  • a database of ready-determined routes may be applied.
  • the vehicle finding aid may exploit the aforesaid more generic wayfinding feature for initial direction visualization, for instance.
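  • The route determination criteria mentioned above (route length and/or estimated route duration) lend themselves to a standard shortest-path search over a graph of walkways; the sketch below uses Dijkstra's algorithm on a hypothetical corridor graph and is only one possible realization of the dynamic calculation or "route database" approach.

```python
import heapq

def shortest_route(graph, start, goal):
    """Dijkstra's algorithm; graph maps node -> [(neighbour, cost), ...],
    where cost may be walking distance or estimated duration."""
    best = {start: 0.0}
    prev = {}
    queue = [(0.0, start)]
    while queue:
        cost, node = heapq.heappop(queue)
        if node == goal:
            break
        if cost > best.get(node, float("inf")):
            continue                      # stale queue entry
        for neighbour, edge in graph.get(node, []):
            new_cost = cost + edge
            if new_cost < best.get(neighbour, float("inf")):
                best[neighbour] = new_cost
                prev[neighbour] = node
                heapq.heappush(queue, (new_cost, neighbour))
    # Reconstruct the node sequence from goal back to start.
    route, node = [goal], goal
    while node != start:
        node = prev[node]
        route.append(node)
    return list(reversed(route)), best[goal]

# Hypothetical mall corridor graph (edge weights in metres): kiosk -> parking level P2
corridors = {
    "kiosk":     [("atrium", 40.0), ("east_wing", 70.0)],
    "atrium":    [("lift_bank", 25.0)],
    "east_wing": [("lift_bank", 15.0)],
    "lift_bank": [("parking_P2", 60.0)],
}
print(shortest_route(corridors, "kiosk", "parking_P2"))
```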
  • the arrangement may be configured to associate a number of metadata entities with a number of data entities, preferably images such as photographs of subject entities like products or services to be promoted or otherwise put forth via the arrangement.
  • a metadata entity describing an image may include at least one metadata element selected from the group consisting of: image identifier, identifier of the image's subject matter such as product name and/or product model name, subject class such as product or service class, dealer or other source of the subject, dealer location, manufacturer, price, and free word field.
  • An image may be shown via the visualization entity in a (view) position and/or at a time instant depending on the metadata. Images may be searched using the metadata and a related search facility offered by the arrangement.
  • Potential metadata values may be applied as search terms by the user of the arrangement.
  • the result group of the search may include images with metadata matching the query criteria.
  • Search results, i.e. images, matching the query may be visualized.
  • the wayfinding methods described herein may be utilized for determining and visualizing a route to the (nearest) location offering the product or service represented by an image.
  • a method for adapting digital signage views comprises
  • a digital signage data visualization entity such as a display
  • a user input entity such as a touch screen functionality associated with the data visualization entity
  • the user input being indicative of a request by the user of the arrangement for adaptation of the present view visualized via the visualization entity in terms of at least one usability issue, such as disability and/or other property of the user, e.g. physical property such as height, and
  • the utility of the present invention arises from a plurality of factors depending on each particular embodiment.
  • use experience associated with digital signage equipment may be enhanced by adapting data representation according to user preferences.
  • wayfinding may be made more illustrative and user-friendly through the proposed 3D modeling and/or augmented camera view solution.
  • Vehicle location finding embodiments may provide additional value to the users of digital signage gear when they are about to exit e.g. a mall provided with a huge parking facility with little or no memory of the location of their vehicles, for instance.
  • products purchasable at or near the location of the digital signage user equipment may facilitate implementation of a versatile and efficient product/dealer search-and-find feature in the context of digital signage, with optional hooks such as wayfinding.
  • the users may be provided with more targeted and complete information package concerning each marketed service or product.
  • data transfer may refer to transmitting data, receiving data, or both, depending on the role(s) of a particular entity under analysis relative a data transfer action, i.e. a role of a sender, a role of a recipient, or both.
  • the terms “a” and “an” do not denote a limitation of quantity, but denote the presence of at least one of the referenced item.
  • first and second do not denote any order, quantity, or importance, but rather are used to distinguish one element from another.
  • Fig. 1 illustrates the general concept of the present invention according to an embodiment thereof.
  • Fig. 2a illustrates an embodiment of the present invention as to the view adaptation feature suggested.
  • Fig. 2b illustrates the embodiment of Figure 2a at a second instant after acquisition of user input and related adaptation.
  • Fig. 3 illustrates an embodiment of wayfinding in accordance with the present invention.
  • Fig. 4 illustrates an embodiment of wayfinding, in particular vehicle finding, in accordance with the present invention.
  • Fig. 5 illustrates an embodiment of product/service promotion in accordance with the present invention.
  • Fig. 6 is a block diagram of one embodiment in accordance with the arrangement of the present invention.
  • Fig. 7 depicts a flow diagram of a method in accordance with the present invention to be performed by the server arrangement.
  • An arrangement according to an embodiment of the present invention may be configured to decide on the selected adaptation based on a predetermined logic and the nature of the user input.
  • the user input thus preferably indicates the usability issue to be tackled.
  • an icon, other symbol, graphical element and/or text which is associated technically, i.e. by the logic of the arrangement, and preferably also mentally with a certain usability issue, such as user disability and/or other property, may be visualized using the visualization entity, the selection or other activation measure of which by the user, e.g. via the touch screen of the arrangement, subsequently triggers the adaptation procedure.
  • a symbol of a wheelchair may be associated with a corresponding disability affecting e.g.
  • the user input may even be automatically provided to the arrangement via a personal terminal device of the user, such as a mobile terminal or communications-enabled tablet, a PDA, or e.g. a wristop computer.
  • the terminal device may be configured to store user-adjustable settings for digital signage, to be transferred advantageously automatically upon detection of external signage equipment (e.g. via an active scan procedure), or only once user confirmation/initiation has been obtained via the UI of the terminal.
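  • Purely as an illustration of such user-adjustable settings, the profile stored on the personal terminal could be a small declarative payload handed over once signage equipment is detected (or once the user confirms); the field names below are invented for the sketch and not defined by the disclosure.

```python
import json

# Hypothetical accessibility/usability profile kept on the user's terminal;
# field names and values are illustrative only.
profile = {
    "usability_issues": ["wheelchair_user", "low_vision"],
    "preferred_font_scale": 1.5,
    "max_view_height_cm": 130,       # upper edge the user can comfortably see
    "audio_captions": True,
    "auto_transfer": "on_detection"  # or "ask_confirmation"
}

def build_handover_message(profile: dict) -> bytes:
    """Serialize the profile for transfer over e.g. Bluetooth/NFC/WLAN once
    external signage equipment has been detected (or the user confirms)."""
    return json.dumps({"type": "signage_profile", "payload": profile}).encode("utf-8")

print(build_handover_message(profile))
```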
  • the view may be adapted by adapting at least one visualized element thereof, and/or the view may be adapted by adding thereto and/or removing therefrom a number of elements.
  • an already visualized element may be adapted to lessen the burden of consuming it. Consuming may refer to reading, for example, in the case of an element including text from the standpoint of a weak-eyed or visually impaired user.
  • a new element such as a textual and/or graphical element may be visualized; for example, in the case of a hearing-impaired user, a scarcely heard audio message such as an ad may be suitably converted into a text visually represented to the user.
  • a number of audio-related factors such as audio level and/or reproduction speed or tempo of speech or other audio may be adapted. For example, playback volume/audio level may be increased.
  • Elements visualized but considered, according to predetermined logic, as secondary may be removed, moved to background, minimized, or otherwise de-emphasized in favour of priority element(s).
  • commercial information such as an ad may be considered as secondary in contrast to e.g. geographical information such as a map including e.g. a floor plan.
  • priority element(s) may be highlighted and emphasized during the adaptation for elevated usability. Reverting to Figure 1, the overall architecture applicable in connection with the present invention is illustrated according to one potential embodiment 101.
  • the arrangement of the present invention may be implemented via a device 106, 108, such as a media kiosk or other display-containing device, best suiting each particular context in question 102, 104, which may refer to a mall, bus station, metro station, airport, commercial building, company premises, etc.
  • the device 106, 108 may also be formed of a plurality of at least functionally connected devices such that e.g. necessary power and/or communications connections are established in between to form the arrangement.
  • the arrangement may be provided with a connection or at least connectivity to a number of wired and/or wireless network infrastructures 112, which may refer to public or private network(s) both alike.
  • An Internet connection/access may be arranged, for example.
  • external entities such as a number of servers 114, e.g. a cloud computing infrastructure, functionally connected to the arrangement via the Internet and/or using a private network, for instance, may be applied.
  • the server 114 may host, e.g. in a commercial context, a web site for consumers, i.e. potential end users of the arrangement, and/or an administration portal, which may be used for creating the content for the user interface, such as shop information and ads, and/or for information sharing between product and service providers creating the consumer content.
  • User-related data such as user selections and other user input may be gathered for analysis, e.g. profiling and targeted marketing purposes.
  • the arrangement may be considered to at least logically contain a number of communications network(s) 112 and/or external entities such as servers 114. Yet, in some embodiments the arrangement may even substantially consist of or contain a wireless mobile terminal device 110 such as a cellular phone, a PDA (personal digital assistant), a tablet computer, or a laptop computer, which is discussed further below.
  • the arrangement may be provided with a display device, user input device (these two may be cleverly combined in a touch screen, for example), a computing device such as a number of processors, and a memory device such as a number of memory chips. Further, a number of optional elements such as a data transfer interface for communication with external entities may be provided therewith.
  • each element may have to be provided with a data transfer means sufficient for necessary mutual communication.
  • the elements of the arrangement and their higher-level configuration resemble those of a personal computer (PC), for example, which makes it functionally possible to implement a similar solution in a mobile personal terminal device, provided that the mobile device is configured to contain and/or obtain the necessary data, such as location data and location-related data, for adequately serving the user in terms of digital signage.
  • the mobility may also open up new possibilities relative to location-awareness and wayfinding features provided by some embodiments of the arrangement, as described in more detail hereinbelow.
  • the arrangement, such as a media kiosk, may thus include a PC running a web browser with the necessary add-on applications and features.
  • a web application framework such as SilverlightTM may be utilized for constructing the functionality of the arrangement.
  • suitable supplementary software, such as the Real Kiosk (R-kiosk) extension for the FirefoxTM browser, may further be applied to turn the browser more into a kiosk-style browser UI.
  • the arrangement may offer the user an easy and simple way to find products and services of interest in a defined local environment, and offer the product or service provider an efficient medium to reach the correct target group in a defined local environment, through e.g. Internet-connected touch screens and/or mobile devices, for example.
  • FIG. 2a an embodiment of the view adaptation feature is illustrated via two views relating to two time instants, respectively.
  • Figure 2a shows merely an exemplary view 202 of a digital signage solution prior to adapting
  • Figure 2b represents a corresponding adapted view 220.
  • a map/floorplan/wayfinding view 204, or generally a first element, or a "sub-view", of the overall view 202, may be provided with optional pointers, symbols, and/or text, describing entities such as shops and restaurants e.g. in the vicinity of the signage gear itself or at an alternative, e.g. user-selected, location such as a different floor, etc.
  • a second element 210 such as an advertisement space for associated text, images, and/or videos relating to a number of advertisements, etc., may be simultaneously visualized at a different location than the first element 204.
  • the view 202 may include further elements such as a clock/calendar element 206.
  • Some elements 208, 212 may relate to user input entity such as a touch screen and indicate actions triggered via selecting (touching the associated screen location in conjunction with a touch screen, for example, or via point-and-click, if e.g. a cursor or highlighting indicator and a relating control feature such as a joystick, trackball, or a touchpad is provided) or otherwise activating them, e.g. via a button press of a predetermined button.
  • an element 212 associated with the adaptation such as an icon or symbol forming a part of the view 202, may be provided.
  • the element 212 may, as described above, be selectable and/or otherwise activable, or at least the related adaptation function addressable, by the user via the user input entity for triggering the adaptation.
  • the user input entity may apply data interface for obtaining the input transferred utilizing e.g. a short-range wireless or cellular connection, for instance.
  • the element 212 may be configured to visually indicate the nature of the disability and/or of the usability-enhancing action associated therewith.
  • the element 212 includes the international symbol of access (ISA) through the selection of which the display view 202 is adapted from the standpoint of a predetermined disability such as physical disability.
  • ISA international symbol of access
  • Element 204b illustrates an enlarged, zoomed-in and relocated (moved closer to the screen bottom) element 204.
  • the usability has improved relative to e.g. physically disabled users using a wheelchair and having difficulty seeing the upper part of the display clearly, for instance.
  • Elements 206 and 210, considered to have lesser importance in the light of the usability adaptation associated with the element 212, have been removed. Alternatively, they could be at least reduced, hidden, minimized, put into the background, or otherwise de-emphasized, for example.
  • Elements 208 have been re-arranged, including re-positioning 208b, to facilitate access thereto.
  • Element 212 has been updated 212b regarding its visual appearance in this case to indicate the current state ("usability adaptation on") of the associated functionality to the user. Alternatively or additionally, e.g. future state after next activation could be indicated by the visual update. Re-selection of element 212b, or some other predetermined user input, may switch the view 220 back into 202, for instance, so that the views 202, 220 may be alternated.
  • New element 222 has also appeared.
  • the element 222 may be associated with a second usability adaptation measure, such as increase in text/font size to further facilitate reading the shown information by the visually impaired.
  • FIG. 3 illustrates an embodiment of wayfinding in accordance with the present invention.
  • a view 301 may be produced by a digital signage arrangement such as a media kiosk after the user thereof has input a destination location utilizing the user input entity, for example.
  • the equipment may utilize a dynamic route determination algorithm and/or a database of at least partially ready-determined routes. Even the latter may work fine in connection with static setting and e.g. fixed location of the arrangement (display).
  • the view 301 may include a 3D element 302, such as a camera-provided view including still photo image data and/or video image data, a computer-generated view, or a mixture of both, illustrating e.g. the surroundings of the arrangement in a fixed or dynamic, potentially user-controllable, direction.
  • the direction is aligned with the initial direction of the determined route.
  • predetermined or user-selected part(s) of the overall route may be visually and/or otherwise, potentially audibly, represented with route guidance data via the arrangement.
  • At least part of the route may be shown utilizing video data, a number of images, and/or computer graphics, for example.
  • the element 302 may include an isometric, a first-person, or a bird's eye view, for instance.
  • route guidance data 304, 306 such as a number of arrows 304, lines, other symbols, and/or textual/numerical information, e.g. distance indicator associated with direction arrow 304 and/or a (target) location pointer 306, may be shown simultaneously with the view as superimposed thereon, for instance.
  • other data, such as commercial data and/or various announcements, may be provided on the wayfinding view 301. Audio playback (route guidance and/or other audio data) is a further possibility.
  • the arrangement 404 may be configured to provide route information, optionally including e.g. directional information such as initial heading, for locating the target vehicle such as a parked car.
  • the user input may include coordinates of the location of the vehicle, other identification data of the location of the vehicle or near-by location, and/or vehicle-related identification data such as register number of the vehicle.
  • Route information may be provided via the data visualization entity and/or utilizing other techniques such as a printed route note and/or a printed route map. At least part of the route may be illustrated via a plane view 406, e.g. floor plan view, or applying 3D modeling techniques and/or camera views 408, for example.
  • computer-generated route guidance data in a form of arrows has been superimposed on a camera view, being a photo image or video image view.
  • route information and/or guidance data may in some implementations be supplied to a personal digital terminal device of a user, optionally in order to enable substantially continuous, real-time navigation service.
  • the terminal may send data such as location information to the arrangement 404.
  • Feasible indoor and/or outdoor positioning technologies may be utilized.
  • WLAN, Bluetooth, ZigBee, cellular (e.g. Cell-ID, TOA (Time of Arrival), TDOA (Time Difference of Arrival)), and/or satellite positioning, such as GPS (Global Positioning System) or GLONASS (Global Orbiting Navigation Satellite System) may be applied.
  • the terminal may, after initial communication with the arrangement 404, act autonomously (information regarding whole route received for independent utilization, for instance) or at least partially rely on information obtained from the arrangement 404 (real-time navigation instructions based on updated terminal location, for example).
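  • The different forms of user input listed for vehicle finding (coordinates, a parking-space identifier, or vehicle identification data resolved through a tracking facility) can all be reduced to a routing destination; the sketch below illustrates one such normalization step with hypothetical identifiers and lookup tables.

```python
def resolve_vehicle_destination(user_input: str,
                                space_locations: dict,
                                tracked_vehicles: dict):
    """Map user input to a destination for route determination.

    Accepted forms (all hypothetical): 'coords:x,y', a parking-space ID such
    as 'P2-117', or 'reg:ABC-123' if the garage tracks vehicles by plate/tag.
    """
    if user_input.startswith("coords:"):
        x, y = (float(v) for v in user_input[len("coords:"):].split(","))
        return ("coordinates", (x, y))
    if user_input.startswith("reg:"):
        plate = user_input[len("reg:"):]
        space = tracked_vehicles.get(plate)           # from imaging/RFID tracking
        return ("space", space_locations.get(space))
    return ("space", space_locations.get(user_input))  # treat input as a space ID

spaces = {"P2-117": ("parking_P2", 117)}
tracked = {"ABC-123": "P2-117"}
print(resolve_vehicle_destination("reg:ABC-123", spaces, tracked))
print(resolve_vehicle_destination("P2-117", spaces, tracked))
```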
  • Figure 5 illustrates an embodiment of product and/or service promotion in accordance with the present invention.
  • a number of metadata entities may be associated with a number of data entities, preferably images such as photographs of subject entities like products or services to be promoted or otherwise put forth via the arrangement.
  • a metadata entity may be or include a data entity identifier, identifier of the data entity's subject matter such as product/service name and/or model name, a subject class such as product or service class, an indication of the dealer or other source of the subject, an indication of dealer/source location, an indication of the manufacturer, a price indicator, and/or a free word field.
  • a search facility 504 applying the metadata may be provided to the users by the arrangement for finding interesting products and/or services. Potential metadata values may be applied as search terms.
  • the outcome of the search may include a number of images with metadata matching the search criteria.
  • the search results may be visualized 502.
  • the wayfinding methods described herein may be utilized for determining and visualizing a route to the (nearest) location offering the product or service represented by the image.
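  • As an illustration of the metadata-driven search, each promoted image may carry a small metadata record and the search facility may simply match user-supplied terms against those fields; the record layout and matching rule below are a sketch based on the metadata elements listed above, with invented example values.

```python
# Each data entity (here: a product image) carries a metadata record with the
# kinds of fields named above; values are invented examples.
images = [
    {"image_id": "img-001", "subject": "TrailRunner 3 shoe", "class": "footwear",
     "dealer": "SportShop", "dealer_location": "2nd floor, unit 214",
     "manufacturer": "Acme", "price": 89.90, "free_text": "waterproof running"},
    {"image_id": "img-002", "subject": "Espresso lungo", "class": "cafe",
     "dealer": "CoffeeCorner", "dealer_location": "1st floor, unit 102",
     "manufacturer": "", "price": 3.20, "free_text": "coffee beverage"},
]

def search_images(query: str, records: list) -> list:
    """Return images whose metadata contains every search term (case-insensitive)."""
    terms = query.lower().split()
    def matches(record):
        haystack = " ".join(str(v) for v in record.values()).lower()
        return all(term in haystack for term in terms)
    return [r for r in records if matches(r)]

# The matching images would then be visualized, e.g. as a slideshow.
print([r["image_id"] for r in search_images("running waterproof", images)])
```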
  • Metadata search terms applied by the users may be further monitored, analyzed and exploited. For example, an indication of most popular search criteria, and/or of related hits, i.e. search results meaning the data entities such as the images, may be listed or otherwise provided to the users and/or the operator or administrator of the arrangement. Further, the operator/administrator may be provided with a tool to more specifically manage the search terms, i.e. through censoring or "moderating".
  • other criteria such as commercial criteria may be additionally or alternatively applied for determining the order of the search terms in the search term list shown to the users. For example, advertisers such as product manufacturers or dealers may be willing to pay for a desired (e.g. early) position in the "most searched" or "most popular" listing.
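  • A minimal sketch of how the "most searched" listing could be produced and, as suggested above, re-ordered by a commercial criterion such as a paid boost; the weighting scheme below is an assumption.

```python
from collections import Counter

search_log = ["shoes", "coffee", "shoes", "pharmacy", "shoes", "coffee"]
paid_boost = {"coffee": 2.5}   # hypothetical advertiser-purchased weight

def popular_terms(log, boosts, top_n=3):
    """Rank search terms by hit count, optionally multiplied by a paid boost."""
    counts = Counter(log)
    scored = {term: counts[term] * boosts.get(term, 1.0) for term in counts}
    return sorted(scored, key=scored.get, reverse=True)[:top_n]

print(popular_terms(search_log, paid_boost))   # coffee (5.0) ranks ahead of shoes (3)
```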
  • instead of or in addition to image data, some other form of a visual representation such as video may be supplemented with related metadata.
  • still some other type of a data entity such as a multimedia element, audio element, or textual element may be supplemented with metadata. The similar reproduction and search procedure as described above may be correspondingly applied.
  • product and/or service information may be searched and visualized.
  • the images of the search result group may be visualized, e.g. in an automatically rotating or user-controllably rotatable slideshow format.
  • Metadata such as product/service dealer information (e.g. location of the nearest dealer) may be represented as well.
  • FIG. 6 is a block diagram of one embodiment in accordance with the arrangement 600 of the present invention.
  • the arrangement 600 may physically contain a number of at least functionally connected elements.
  • the arrangement 600 is typically provided with one or more processing devices capable of processing instructions and other data, such as one or more microprocessors, micro-controllers, DSP's (digital signal processor), programmable logic chips, etc.
  • the processing entity 602 may thus, as a functional entity, physically comprise a plurality of mutually co-operating processors and/or a number of sub-processors connected to a central processing unit, for instance.
  • the processing entity 602 may be configured to execute the code stored in a memory 604, which may refer to instructions and data relative to the digital signage arrangement software logic and software architecture 610 for controlling the arrangement 600.
  • the processing entity 602 may be configured to control data visualization and optionally also audio reproduction, for example.
  • the memory entity 604 may be divided between one or more physical memory chips or other memory elements.
  • the memory 604 may store program code and other data such as marketing data, map/wayfmding data, etc.
  • the memory 604 may further refer to and include other storage media such as a preferably detachable memory card, a floppy disc, a CD-ROM, or a fixed storage medium such as a hard drive.
  • the memory 604 may be non-volatile, e.g. ROM (Read Only Memory), and/or volatile, e.g. RAM (Random Access Memory), by nature.
  • Software (product) 610 may be provided on a carrier medium such as a memory card, a memory stick, an optical disc (e.g. CD-ROM or DVD), or some other memory carrier.
  • the UI (user interface) 612 may comprise a display or a data projector 612b, and keyboard/keypad or other applicable user (control) input entity 612a such as a touch screen and/or a voice control input, or a number of separate keys, buttons, knobs, switches, a tag reader such as RFID reader, a touchpad, a joystick, a mouse, and/or imaging device such as a barcode (1st, 2nd, and/or 3rd generation compatible, for example) reader configured to provide the user of the arrangement 600 with practicable data visualization and device control means, respectively.
  • the UI 612 may include one or more loudspeakers and associated circuitry such as D/A (digital- to-analogue) converter(s) for sound output, and optionally a microphone with A/D converter for sound input.
  • D/A digital-to-analogue
  • a printer may be included in the arrangement for providing more permanent output.
  • a mobile-executable application or other software may be downloaded preferably wirelessly to the terminal for data visualization, audio playback and/or communication such as user input provision relative to the (rest of the) arrangement 600.
  • browser and/or messages such as text or USSD messages may be applied for user input purposes and/or data transfer in the opposite, downlink direction.
  • the arrangement 600 may further comprise a data interface 608 such as a number of wired and/or wireless transmitters, receivers, and/or transceivers for communication with other devices such as terminals and/or network infrastructure(s).
  • Non-limiting examples of the generally applicable technologies include GSM (Global System for Mobile Communications), GPRS (General Packet Radio Service), EDGE (Enhanced Data rates for Global Evolution), UMTS (Universal Mobile Telecommunications System), WCDMA (wideband code division multiple access), CDMA2000, PDC (Personal Digital Cellular), PHS (Personal Handy-phone System), WLAN (Wireless LAN, wireless local area network), WiFi, Ethernet, USB (Universal Serial Bus), RFID, NFC (Near-Field Communication), and Firewire. Further, a Bluetooth adapter for peer-to-peer communication and piconet/scatternet use may be provided.
  • GSM Global System for Mobile Communications
  • GPRS General Packet Radio Service
  • EDGE Enhanced Data rates for Global Evolution
  • UMTS Universal Mobile Telecommunications System
  • WCDMA wideband code division multiple access
  • CDMA2000 Code Division Multiple Access 2000
  • PDC Personal Digital Cellular
  • PHS Personal Handy-phone System
  • WLAN Wireless LAN, wireless local area network
  • the arrangement 600 may comprise numerous additional functional and/or structural elements for providing advantageous communication, processing or other features, whereupon this disclosure is not to be construed as limiting the presence of the additional elements in any manner.
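  • Purely as an illustrative restatement of the block diagram of Figure 6 in code (names and structure invented for the sketch), the arrangement 600 can be pictured as a composition of co-operating entities:

```python
from dataclasses import dataclass, field
from typing import Callable, List

@dataclass
class Arrangement:
    """Loose code restatement of Fig. 6: memory 604, UI 612 (visualization 612b)
    and data interface 608, with the visualize() method standing in for the
    controlling role of the processing entity 602."""
    memory: dict = field(default_factory=dict)                               # 604: content, maps, stats
    ui_outputs: List[Callable[[str], None]] = field(default_factory=list)    # 612b: display/projector
    data_interfaces: List[str] = field(default_factory=list)                 # 608: e.g. "WLAN", "Ethernet"

    def visualize(self, element: str) -> None:
        for show in self.ui_outputs:
            show(element)

kiosk = Arrangement(memory={"floor_plan": "level-2.svg"},
                    ui_outputs=[print],
                    data_interfaces=["WLAN", "Ethernet"])
kiosk.visualize("Welcome - touch the screen to begin")
```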
  • FIG. 7 is a flow diagram of an embodiment of a method in accordance with the present invention.
  • an arrangement in accordance with an embodiment of the present invention is obtained and configured, for example via loading and execution of related software, for managing a digital signage installation.
  • a connection may be established to a network server and/or a mobile client device (terminal) carried by a user.
  • digital signage data, such as product, service and/or wayfinding-related data, is visualized via an applicable entity such as a separate digital signage display and/or a terminal device of the user.
  • the arrangement may be configured to receive user input via a number of options including a dedicated user interface like a touch screen.
  • interface software running in a compatible terminal device carried along by the user may be applied, for example.
  • the user input is captured, the user input being indicative of a request for adapting of the present view visualized via the visualization entity in terms of at least one usability issue, such as disability and/or other property of the user, e.g. physical property such as height.
  • the position, size and/or the way of representation, such as style, of at least one element of the view is adapted in response to the user input so as to improve the usability of the arrangement in the light of the aforesaid usability issue.
  • the method execution is ended. Broken lines depict the potentially repetitive nature of various method items.
  • a method flow diagram relating to an embodiment of wayfinding in connection with digital signage is illustrated.
  • Initial and final method items (not shown) with preparatory and concluding actions, respectively, may be similar to the ones 700, 708 of the adaptation method also in this case.
  • the method may actually be executed jointly with the adaptation method, in parallel therewith, or independently.
  • the route/navigation starting point is resolved. It may be, by default, associated with the location of the digital signage arrangement, or at least e.g. the display thereof, if the related elements are physically separate.
  • the user may determine the starting point utilizing the user input entity and further optionally e.g. his/her personal terminal device may be capable of preferably wirelessly providing route source location information to the signage gear.
  • an indication of the route target location is received, preferably again utilizing the user input entity.
  • the target location may imply the location of a particular store, for instance.
  • the target location may indicate the location of a predetermined vehicle.
  • at least part of the route, preferably at least e.g. the initial direction, is determined, which may refer to dynamic calculations and/or utilization of pre-defined routes ("route database"), for example.
  • route information such as a 3D view like a video camera or still (photo) image view preferably augmented with route navigation data such as instructive arrows, is provided utilizing the visualization entity and/or data interface for transmitting data to be visualized to an external device such as a terminal device of the user as mentioned hereinbefore.
  • the terminal location may be acquired using e.g. network-based or terminal-based positioning, whereupon the route information may be dynamically determined on the basis of the location by the arrangement and/or the terminal itself.
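  • The wayfinding flow just described (resolve the starting point, receive the target, determine at least part of the route, provide augmented route information, and optionally refresh it as the terminal moves) can be pictured as a simple loop; the positioning callback, stand-in functions and refresh interval below are assumptions for the sketch.

```python
import time

def wayfinding_session(get_terminal_position, determine_route, render_view,
                       target, refresh_s=2.0, max_updates=3):
    """Repeatedly re-determine and visualize the route as the user's terminal
    reports new positions (network- or terminal-based positioning)."""
    for _ in range(max_updates):
        start = get_terminal_position()          # e.g. WLAN/Bluetooth/GPS fix
        route = determine_route(start, target)   # dynamic calculation or route DB
        render_view(route)                       # e.g. camera view + arrow overlay
        time.sleep(refresh_s)

# Toy stand-ins for the entities involved:
positions = iter([(0, 0), (5, 0), (9, 0)])
wayfinding_session(lambda: next(positions),
                   lambda s, t: f"route {s} -> {t}",
                   print,
                   target=(10, 0),
                   refresh_s=0.0)
```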

Abstract

A computerized electronic arrangement (106, 108, 110, 114, 404, 600) for digital signage, such as a media kiosk like a touch screen stand, comprising a memory entity (604) configured to store data, such as product, service, and/or wayfinding information, to be visualized via the arrangement, a visualization entity (612b), such as a display, configured to visualize the data, a processing entity (602) configured to control the visualization of the data, and a user input entity (612a, 608), such as a touch screen functionality associated with the data visualization entity, configured to receive user input indicative of a request by the user of the arrangement for adaptation of the present view visualized via the visualization entity in terms of at least one usability issue, such as disability and/or other property of the user, e.g. physical property such as height, whereupon the processing entity is configured to adapt the position (208, 208b), size (204, 204b), and/or the way of representation (212, 212b) of at least one element of the view in response to the user input so as to improve the usability of the arrangement in the light of the aforesaid usability issue (202, 220). A related method is presented.

Description

ARRANGEMENT AND RELATED METHOD FOR DIGITAL SIGNAGE
FIELD OF THE INVENTION
The present invention generally relates to the fields of digital information storage, processing, transfer and representation. In particular, however not exclusively, the invention concerns digital signage and dynamic adaptation thereof in view of usability.
BACKGROUND OF THE INVENTION
Digital marketing is an evolving technology used in a retail setting that brings together promotional, advertising and informational content to local out-of-home environments. Various studies indicate that over 70% of the consumer decision making process takes place at the location of the purchased service or product, and digital advertisements are 5-10 times more likely to be noticed than static media. It is therefore no wonder that digital marketing is one of the world's fastest growing marketing channels, with a projected volume of 11.4 b€ by the year 2011, yet it is still only picking up momentum due to heavy costs, complicated hardware or system projects and the lack of standardized services.
Digital signage generally refers to the provision of information, such as wayfinding data or marketing data including advertisements, to visual perception by way of digital signs, i.e. digital displays like flat screens, media kiosks or feasible projection means, for instance. In contrast to traditional static signage, the content to be visualized via the digital signage displays may thus be flexibly controlled, even remotely, by administrative personnel through access to the associated content management entity, such as a content server, and a network infrastructure, such as a wired or wireless network, connecting the server with a number of signage displays. Data may be ultimately locally visualized by each display such that various environmental factors, such as ambient lighting conditions, are also catered for.
Notwithstanding the various benefits the contemporary (digital) signage products offer over more traditional solutions, a few defects still remain therewith.
On one hand, even a use-case-specifically tailored digital signage solution does not guarantee an adequate use experience for all members of the audience. In conjunction with information displays, interactive digital signage, etc., it may still happen that, despite the initial optimization of the positioning, alignment, and content (view) of the signage means from the standpoint of its average user as to his/her personal capabilities and preferences, some potential users cannot acquire the information provided by the signage at all, or the use experience is simply dissatisfying. This may easily happen to the elderly, disabled, or otherwise non-average users depending on the particular context in question, for example.
On the other hand, available wayfinding (navigation) and related search & guidance services are not perfect either. A typical solution for e.g. indoor navigation in connection with a digital signage display located in a mall, airport or other location merely shows the associated floor plans in a stripped-down 2D map format, which is, however, rather sub-optimal regarding e.g. the natural human sense of the corresponding space, and may result in annoying misjudgments by the users relative to the correct position, heading, and route in the space.
Further, although commercial data such as advertisements have traditionally been placed on digital signage media, merely displaying a visual advertisement with product information such as a product image and price, and optionally providing related auditory clues, does not provide much of a bonus in contrast to traditional non-digital commercial signage. Yet, many contemporary digital signage solutions that are configured to play back ads utilize automated, usually extremely simple and restricted, rotation logic for alternating between several ads, each thus being allocated a limited playback period within the overall playback cycle. If one misses some interesting information relative to a certain ad, seeing/hearing it again will take the rest of the whole cycle, for instance.
SUMMARY OF THE INVENTION
The objective of the present invention is to at least alleviate one or more of the afore-explained defects associated with the prior art solutions and to provide an interactive and adaptive alternative for digital signage equipment such as media kiosks or other terminals. The objective is met by a digital signage arrangement and a related method in accordance with the present invention. The digital signage arrangement may be configured to display data such as product, service, and/or wayfinding information to a user thereof via a suitable visualization device, such as a display or a projector, and further be configured to receive user input on the basis of which the view, i.e. one or more elements thereof, produced by the visualization device may be adapted so as to elevate the usability thereof from the standpoint of the particular user. The arrangement may be physically realized as a substantially integrated entity such as a media kiosk, e.g. a touch screen stand, or through co-operation of multiple at least functionally connected entities such as a (touch)screen or other type of input/visualization entity connected to a remote computer, or as a more comprehensive terminal device, such as a mobile terminal, a PDA (personal digital assistant), or a tablet computer running e.g. a web browser and/or other enabling application and being connected to a remote computer system providing data, for instance.
Accordingly, in an aspect of the present invention, a computerized electronic arrangement for digital signage, such as a media kiosk like a touch screen stand, comprises
-a memory entity, such as one or more memory chips, configured to store data, such as product, service, and/or wayfinding information, to be visualized via the arrangement,
-a visualization entity, such as a display, configured to visualize the data,
-a processing entity, such as at least one microprocessor or microcontroller, configured to control the visualization of the data, and
-a user input entity, such as a touch screen functionality associated with the data visualization entity, configured to receive user input indicative of a request by the user of the arrangement for adaptation of the present view visualized via the visualization entity in terms of at least one usability issue, such as disability and/or other property of the user, e.g. physical property such as height, whereupon the processing entity is configured to adapt the position, size and/or the way of representation of at least one element of the view in response to the user input so as to improve the usability of the arrangement in the light of the aforesaid usability issue. In one embodiment, the arrangement may further be configured to perform the adaptation through utilization of at least one option selected from the group consisting of: move the element lower, e.g. substantially to the bottom of the view, move the element upper, e.g. substantially to the top of the view, center the element, zoom in or out the element or at least part thereof, change the font type associated with the element, change the font size associated with the element, change the font emphasis (e.g. bold, underlined, and/or italics) associated with the element, change the element size, enlarge the element, divide the element into a number of sub-elements, change at least one color associated with the element, switch symbolic and/or graphical representation into a textual one or vice versa, reproduce audio signal describing the element, such as speech or music sample or synthesized speech, and reproduce visual representation describing audio signal such as reproduced speech. In another, either supplementary or alternative, embodiment the arrangement may be configured to visualize an element associated with the adaptation, such as an icon or symbol forming e.g. a part of the view, wherein the element is selectable by the user utilizing the user input entity for triggering the adaptation of the view. In a further, either supplementary or alternative, embodiment the arrangement may include a data transfer interface, such as at least one interface for wired communication and/or at least one wireless transceiver, for communication with at least one external entity. Alternatively or additionally, as being potentially physically distributed, one or more entities of the arrangement may mutually communicate utilizing the data transfer interface. Data to be transferred may include control instructions and/or data to be visualized to the users via the display entity (or otherwise reproduced, e.g. audibly), for example. Likewise, data such as usage statistics, user input and/or related requests may be transferred via the data transfer interface.
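To make the adaptation options above concrete, the following Python sketch applies a few of them (move the element lower, enlarge it, grow its font, and de-emphasize secondary elements) to a hypothetical set of view elements once the user has triggered the accessibility adaptation; the element structure, the usability-issue labels and the scaling factors are illustrative assumptions, not the claimed implementation.

```python
from dataclasses import dataclass

@dataclass
class ViewElement:
    name: str
    x: float          # position on the screen as a fraction of its width/height
    y: float
    scale: float = 1.0
    font_size: int = 16
    priority: bool = True
    visible: bool = True

def adapt_view(elements, issue="wheelchair_user"):
    """Adapt element position, size and representation for the usability issue."""
    for e in elements:
        if not e.priority:
            e.visible = False                     # remove/de-emphasize secondary content
            continue
        if issue == "wheelchair_user":
            e.y = min(e.y + 0.3, 0.8)             # move the element lower on the screen
            e.scale *= 1.4                        # zoom in / enlarge
        if issue in ("wheelchair_user", "low_vision"):
            e.font_size = int(e.font_size * 1.5)  # larger font for readability
    return [e for e in elements if e.visible]

view = [ViewElement("floor_plan", x=0.1, y=0.1),
        ViewElement("advertisement", x=0.7, y=0.1, priority=False),
        ViewElement("clock", x=0.9, y=0.0, priority=False)]
for element in adapt_view(view):
    print(element)
```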
Yet, in a further, either supplementary or alternative, embodiment the arrangement, such as the user input entity thereof, may include a touch screen for data output and input. The display element of the touch screen visualizes the data, whereas the touch sensitive part, e.g. optical, infrared, imaging, resistive, and/or capacitive part, registers the user's touch including one or more parameters such as location, pressure (force), duration, etc. Optionally, the touch screen includes a dual-touch or a multi-touch feature capable of simultaneously registering two or more touches and preferably also touch locations, respectively. Different functionalities such as element movement, zooming, and/or element activation may be controlled accordingly. In some embodiments, the user may indeed be provided with a possibility to change e.g. the location of a visualized element and/or maximize/minimize it utilizing, for instance, either of the aforesaid touch techniques. As a result, the usability of the view may be improved among other potential benefits. Additionally or alternatively, the arrangement may include at least one user input entity selected from the group consisting of: a switch, button, keypad, keyboard, and touchpad. The aforementioned data interface such as a predetermined transceiver may be configured to receive user input utilizing wired or wireless data transfer technology. For example, suitable wireless technology, e.g. Bluetooth, RFID, NFC (Near Field Communication), WiFi/WLAN (Wireless LAN), and/or a selected cellular network technology (GSM, UMTS, CDMA2000, etc.) may be applied. A user may carry a personal communications-enabled device such as a cellular phone, a PDA, or a tablet computer along and apply it for providing user input to the arrangement. Input may be submitted using textual messages such as SMS (Short Message Service) messages or via a USSD (Unstructured Supplementary Service Data) application, for instance. A call to a predetermined number optionally shown via or on the arrangement may be applied for input purposes. Further optionally, even DTMF (Dual-tone Multi-Frequency) tones may be exploited. Still, in a further, either supplementary or alternative, embodiment the arrangement may be configured to provide wayfinding aid to a user thereof. In order to obtain this, e.g. the visualization entity may be applied. The aforementioned wayfinding information may refer to location, map, and/or route information, and related data, for example, available via the arrangement. A 2D map, floor plan, or other information may be shown to the users with optional route guidance data such as graphical data including e.g. arrows. Audio guidance such as spoken route instructions may be provided.
More preferably, however, a substantially 3D representation displayed via a 3D projection (i.e. utilizing a two-dimensional plane provided by a display, for example), such as an isometric view, or other wayfinding-enabling view, such as a bird's eye view, may be generated regarding e.g. a predetermined direction from the location of the arrangement or of at least one element like the visualization entity thereof. Preferably the predetermined direction is the initial direction to which the user has to travel in order to reach the destination from the location. Advantageously, in some embodiments the view may be a first-person view, i.e. it imitates, resembles or substantially shows the natural view the user could see with his/her eyes, even though the screen view or other type of view produced by the visualization entity may contain artificial elements, potentially being solely artificial elements in terms of computer graphics, for instance. Again, audio guidance may be provided. The wayfinding view may thus generally refer to a computer-generated, i.e. artificial, view with artificial elements such as direction arrows, and/or it may comprise natural elements such as a static or real-time camera view augmented with artificial elements like computer-generated route guidance data, e.g. the arrow symbols, overlaid thereon and resulting in an augmented reality view. Route information may thus also be conveniently shown in the 3D view. The wayfinding embodiment may also be implemented as a stand-alone solution potentially independent of the view adaptation described hereinbefore.
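As a non-binding illustration of the initial direction mentioned above, the following sketch computes a heading from the arrangement towards the first route waypoint in an assumed planar floor-plan coordinate system; the coordinate convention (metres, 0 degrees pointing straight ahead of the display) is a hypothetical choice for this example only.

    # Illustrative sketch under simplifying assumptions: planar (x, y) floor-plan
    # coordinates in metres, 0 degrees along the positive y axis ("straight ahead").
    import math

    def initial_heading(kiosk_xy, first_waypoint_xy):
        """Clockwise heading in degrees from the kiosk towards the first waypoint."""
        dx = first_waypoint_xy[0] - kiosk_xy[0]
        dy = first_waypoint_xy[1] - kiosk_xy[1]
        return math.degrees(math.atan2(dx, dy)) % 360

    # e.g. a waypoint 10 m ahead and 10 m to the right -> 45 degrees
    print(initial_heading((0.0, 0.0), (10.0, 10.0)))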
In a further, either supplementary or alternative, embodiment the arrangement may be configured to provide vehicle finding aid, which may be considered as a special case of wayfinding. For example, in the context of a mall, airport or other big complex with large-scale parking facilities, such as an optionally multi-storey garage facility, finding one's own vehicle, such as a car, after having spent some time doing completely different things, such as shopping, may turn out to be challenging due to various facts: quite often all the garages and related structures, parking spaces, etc. look the same, and without carefully observing characteristic features such as the floor number, near-by structures, and the route walked upon leaving the vehicle in the first place, a considerable amount of time may be unnecessarily consumed in finding the vehicle again.
The arrangement may thus be arranged to provide route information, optionally including e.g. direction information such as initial heading, for locating the vehicle on the basis of user input. The user input may include at least one element selected from the group consisting of: coordinates of the location of the vehicle, identification data of the location of the vehicle, and identification data such as the registration number of the vehicle. Data included in the user input may be first manually acquired by the user. He/she may write it down or, upon parking the vehicle, type it in or scan (photograph, for example) it utilizing his/her electronic notebook device such as a cellular phone or PDA, or other available device. The data visualized on a near-by display or surface, such as a pillar or wall, may include a floor number and/or other ID, and a parking space number and/or other ID. A barcode or a tag, e.g. an RFID tag, may be applied for representing and/or carrying the necessary information. Identification data relating to the vehicle itself may be utilized in route determination provided that the parking facilities include technology, such as imaging technology, to detect and recognize the vehicle accordingly and monitor its movements and/or trace its parking location. A vehicle may contain a tag that can be externally read for tracking purposes.
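For illustration only, the sketch below parses a hypothetical parking-space code of the kind a user might photograph or type in upon parking; the code format P<level>-<zone>-<space> is an invented assumption and does not reflect any particular facility's scheme.

    # Illustrative sketch only: decoding a hypothetical parking-space code such
    # as "P3-B-117" (level, zone, space) into fields usable for route lookup.
    import re

    def parse_parking_code(code):
        match = re.fullmatch(r"P(\d+)-([A-Z])-(\d+)", code.strip().upper())
        if not match:
            raise ValueError("unrecognized parking code: %r" % code)
        level, zone, space = match.groups()
        return {"level": int(level), "zone": zone, "space": int(space)}

    # e.g. scanned from a pillar next to the parked vehicle
    print(parse_parking_code("p3-b-117"))   # {'level': 3, 'zone': 'B', 'space': 117}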
On the basis of the parking space location and/or vehicle identification data, the arrangement may be configured to determine a route from the location of the arrangement, an element such as the visualization entity thereof, or some other predetermined location, back to the vehicle. The arrangement may utilize a number of predetermined route determination criteria such as route length and/or estimated route duration. Typically minimization of both criteria is sought after. A database of ready-determined routes may be applied. The vehicle finding aid may exploit the aforesaid more generic wayfinding feature for initial direction visualization, for instance.
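One possible, non-authoritative way to realize such route determination is sketched below: a Dijkstra search over a hypothetical corridor graph whose edge cost blends walking distance and estimated duration. The graph data, node names and weights are invented for this example and are not part of the disclosed arrangement.

    # Illustrative sketch only: shortest-route search minimizing a weighted mix of
    # length (m) and estimated duration (s) over a hypothetical corridor graph.
    import heapq

    def best_route(graph, start, goal, w_len=0.5, w_dur=0.5):
        """graph: node -> list of (neighbour, length_m, duration_s)."""
        queue = [(0.0, start, [start])]
        visited = set()
        while queue:
            cost, node, path = heapq.heappop(queue)
            if node == goal:
                return cost, path
            if node in visited:
                continue
            visited.add(node)
            for nbr, length, duration in graph.get(node, []):
                if nbr not in visited:
                    heapq.heappush(queue, (cost + w_len * length + w_dur * duration, nbr, path + [nbr]))
        return None

    corridors = {
        "kiosk":  [("atrium", 40, 35), ("lift", 15, 60)],
        "atrium": [("garage_P3", 80, 70)],
        "lift":   [("garage_P3", 60, 90)],
    }
    print(best_route(corridors, "kiosk", "garage_P3"))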
In a further, either supplementary or alternative, embodiment the arrangement may be configured to associate a number of metadata entities with a number of data entities, preferably images such as photographs of subject entities like products or services to be promoted or otherwise put forth via the arrangement. A metadata entity describing an image may include at least one metadata element selected from the group consisting of: image identifier, identifier of the image's subject matter such as product name and/or product model name, subject class such as product or service class, dealer or other source of the subject, dealer location, manufacturer, price, and free word field. An image may be shown via the visualization entity in a (view) position and/or at a time instant depending on the metadata. Images may be searched using the metadata and a related search facility offered by the arrangement. Potential metadata values may be applied as search terms by the user of the arrangement. The result group of the search may include images whose metadata matches the query criteria. Search results, i.e. images, matching the query may be visualized. Optionally the wayfinding methods described herein may be utilized for determining and visualizing a route to the (nearest) location offering the product or service represented by an image. In another aspect, a method for adapting digital signage views comprises
-visualizing data, such as product, service, and/or wayfinding information, via a digital signage data visualization entity, such as a display,
-obtaining user input via a user input entity, such as a touch screen functionality associated with the data visualization entity, the user input being indicative of a request by the user of the arrangement for adaptation of the present view visualized via the visualization entity in terms of at least one usability issue, such as disability and/or other property of the user, e.g. physical property such as height, and
-adapting the position, size and/or the way of representation of at least one element of the view in response to the user input so as to improve the usability of the arrangement in the light of the aforesaid usability issue.
The utility of the present invention arises from a plurality of factors depending on each particular embodiment. First, the use experience associated with digital signage equipment may be enhanced by adapting data representation according to user preferences. Secondly, wayfinding may be made more illustrative and user-friendly through the proposed 3D modeling and/or augmented camera view solution. Vehicle location finding embodiments may provide additional value to the users of digital signage gear when they are about to exit e.g. a mall provided with a huge parking facility with little or no memory of the location of their vehicles, for instance. Further, provision of metadata-equipped images associated with e.g. products purchasable at or near the location of the digital signage user equipment, such as a display or a media kiosk, may facilitate implementation of a both versatile and efficient product/dealer search-and-find feature in the context of digital signage with optional hooks such as wayfinding. Altogether, the users (consumers) may be provided with a more targeted and complete information package concerning each marketed service or product.
The expression "a plurality of refers herein to any integer starting from two (2), e.g. two, three, or four. The expression "a number of refers herein to any integer starting from one (1), e.g. one, two, or three. The expression "data transfer" may refer to transmitting data, receiving data, or both, depending on the role(s) of a particular entity under analysis relative a data transfer action, i.e. a role of a sender, a role of a recipient, or both. The terms "a" and "an" do not denote a limitation of quantity, but denote the presence of at least one of the referenced item.
The terms "first" and "second" do not denote any order, quantity, or importance, but rather are used to distinguish one element from another.
Different embodiments of the present invention are disclosed in the dependent claims.
BRIEF DESCRIPTION OF THE DRAWINGS
In the following, the present invention is described in more detail by reference to the attached drawings, wherein
Fig. 1 illustrates the general concept of the present invention according to an embodiment thereof.
Fig. 2a illustrates an embodiment of the present invention as to the view adaptation feature suggested.
Fig. 2b illustrates the embodiment of Figure 2a at a second instant after acquisition of user input and related adaptation.
Fig. 3 illustrates an embodiment of wayfinding in accordance with the present invention.
Fig. 4 illustrates an embodiment of wayfinding, in particular vehicle finding, in accordance with the present invention.
Fig. 5 illustrates an embodiment of product/service promotion in accordance with the present invention.
Fig. 6 is a block diagram of one embodiment in accordance with the arrangement of the present invention.
Fig. 7 depicts a flow diagram of a method in accordance with the present invention to be performed by the server arrangement.
DETAILED DESCRIPTION OF THE EMBODIMENTS

An arrangement according to an embodiment of the present invention may be configured to decide on the selected adaptation based on a predetermined logic and the nature of the user input. The user input thus preferably indicates the usability issue to be tackled. For instance, an icon, other symbol, graphical element and/or text, which is associated technically, i.e. by the logic of the arrangement, and preferably also mentally with a certain usability issue, such as user disability and/or other property, may be visualized using the visualization entity, the selection or other activation measure of which by the user, e.g. via the touch screen of the arrangement, subsequently triggers the adaptation procedure. A symbol of a wheelchair may be associated with a corresponding disability affecting e.g. the effective height and thus the reach and viewing angle of the user in question, for example. In some embodiments, the user input may even be automatically provided to the arrangement via a personal terminal device of the user, such as a mobile terminal or communications-enabled tablet, a PDA, or e.g. a wristop computer. The terminal device may be configured to store user-adjustable settings for digital signage to be transferred advantageously automatically upon detection of external signage equipment (e.g. via an active scan procedure) or not until user confirmation/initiation is obtained via the UI of the terminal. Depending on the embodiment, the view may be adapted by adapting at least one visualized element thereof, and/or the view may be adapted by adding thereto and/or removing therefrom a number of elements. For example, an already visualized element may be adapted to lessen the burden of consuming it. Consuming may refer to reading, for example, in the case of an element including text from the standpoint of a user with weak eyesight or a visual impairment. Additionally or alternatively, a new element such as a textual and/or graphical element may be visualized; for example, in the case of a hearing-impaired user, a barely audible audio message such as an ad may be suitably converted into text visually represented to the user. Additionally or alternatively, a number of audio-related factors such as audio level and/or reproduction speed or tempo of speech or other audio may be adapted. For example, playback volume/audio level may be increased.
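Purely as an illustrative sketch of such predetermined logic, the snippet below binds hypothetical accessibility symbols to bundles of adaptation actions; the symbol identifiers and action names are assumptions introduced for this example, and apply_action could be, for instance, the adapt_element() sketch given earlier.

    # Illustrative sketch only: a "predetermined logic" table binding an activated
    # accessibility symbol to a bundle of adaptation actions.
    ADAPTATION_PROFILES = {
        "wheelchair":       ["move_lower", "enlarge"],            # lowered reach/viewing angle
        "low_vision":       ["increase_font", "high_contrast"],
        "hearing_impaired": ["audio_to_text", "disable_audio_ads"],
    }

    def on_symbol_activated(symbol, view_elements, apply_action):
        """Apply the action bundle associated with the activated symbol to each element."""
        for action in ADAPTATION_PROFILES.get(symbol, []):
            for element in view_elements:
                apply_action(element, action)

    # Trivial stand-in for demonstration purposes:
    def log_action(element, action):
        print("adapting %s: %s" % (element, action))

    on_symbol_activated("wheelchair", ["floor_plan", "shop_list"], log_action)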
Elements visualized but considered, according to predetermined logic, as secondary may be removed, moved to the background, minimized, or otherwise de-emphasized in favour of priority element(s). For example, commercial information such as an ad may be considered as secondary in contrast to e.g. geographical information such as a map including e.g. a floor plan. Correspondingly, priority element(s) may be highlighted and emphasized during the adaptation for elevated usability.

Reverting to Figure 1, the overall architecture applicable in connection with the present invention is illustrated according to one potential embodiment 101. The arrangement of the present invention may be implemented via a device 106, 108, such as a media kiosk or other display-containing device, best suiting each particular context in question 102, 104, which may refer to a mall, bus station, metro station, airport, commercial building, company premises, etc. However, the device 106, 108 may also be formed of a plurality of at least functionally connected devices such that e.g. necessary power and/or communications connections are established in between to form the arrangement. The arrangement may be provided with a connection or at least connectivity to a number of wired and/or wireless network infrastructures 112, which may refer to public and private network(s) alike. An Internet connection/access may be arranged, for example. Further, external entities such as a number of servers 114, e.g. a cloud computing infrastructure, functionally connected to the arrangement via the Internet and/or using a private network, for instance, may be applied.
The server 114, which may be locally or remotely accessed by the operator 116, may be utilized for managing the arrangement as to its operation, configuration and/or content. The server 114 may host, e.g. in a commercial context, a web site for consumers, i.e. potential end users of the arrangement, and/or an administration portal, which may be used for creating the content for the user interface, such as shop information and ads, and/or for information sharing between product and service providers creating the consumer content. User-related data such as user selections and other user input may be gathered for analysis, e.g. profiling and targeted marketing purposes.
Therefore, it is natural that in some embodiments the arrangement may be considered to at least logically contain a number of communications network(s) 112 and/or external entities such as servers 114. Yet, in some embodiments the arrangement may even substantially consist of or contain a wireless mobile terminal device 110 such as a cellular phone, a PDA (personal digital assistant), a tablet computer, or a laptop computer, which is discussed further below. The arrangement may be provided with a display device, a user input device (these two may be cleverly combined in a touch screen, for example), a computing device such as a number of processors, and a memory device such as a number of memory chips. Further, a number of optional elements such as a data transfer interface for communication with external entities may be provided therewith. In the case of physically separated elements of the arrangement, each element may also have to be provided with data transfer means sufficient for the necessary mutual communication. Thus, the elements of the arrangement and their higher level configuration resemble those of a personal computer (PC), for example, which makes it functionally possible to implement a similar solution in a mobile personal terminal device provided that the mobile device is configured to contain and/or obtain the necessary data such as location data and location-related data for adequately serving the user in terms of digital signage. Obviously, the mobility may also open up new possibilities relative to the location-awareness and wayfinding features provided by some embodiments of the arrangement as described in more detail hereinbelow.
The arrangement, such as a media kiosk, may thus include a PC running a web browser with the necessary add-on applications and features. For instance, a web application framework such as Silverlight™ may be utilized for constructing the functionality of the arrangement. Additionally, suitable supplementary software, such as the Real Kiosk (R-kiosk) extension for the Firefox™ browser, may further be applied to turn the browser into more of a kiosk-style browser UI. From the standpoint of the consumer-type end user (e.g. a mall visitor), the arrangement may offer an easy and simple way to find products and services of interest in a defined local environment, and offer the product or service provider an efficient medium to reach the correct target group in that environment through e.g. Internet-connected touch screens and/or mobile devices.
With reference to Figure 2a and Figure 2b, an embodiment of the view adaptation feature is illustrated via two views relating to two time instants, respectively.
Figure 2a merely shows an exemplary view 202 of a digital signage solution prior to adaptation, whereas Figure 2b represents a corresponding adapted view 220. A map/floor plan/wayfinding view 204, or generally a first element, or a "sub-view", of the overall view 202 may be provided with optional pointers, symbols, and/or text, describing entities such as shops and restaurants e.g. in the vicinity of the signage gear itself or at an alternative, e.g. user-selected, location such as a different floor, etc. A second element 210, such as an advertisement space for associated text, images, and/or videos relating to a number of advertisements, etc., may be simultaneously visualized at a different location than the first element 204. The view 202 may include further elements such as a clock/calendar element 206. Some elements 208, 212 may relate to the user input entity such as a touch screen and indicate actions triggered via selecting them (touching the associated screen location in conjunction with a touch screen, for example, or via point-and-click, if e.g. a cursor or highlighting indicator and a related control feature such as a joystick, trackball, or a touchpad is provided) or otherwise activating them, e.g. via a press of a predetermined button.
As mentioned hereinbefore, an element 212 associated with the adaptation, such as an icon or symbol forming a part of the view 202, may be provided. The element 212 may, as described above, be selectable and/or otherwise activatable, or at least the related adaptation function addressable, by the user via the user input entity for triggering the adaptation. As further deliberated earlier, alternatively or additionally the user input entity may apply the data interface for obtaining the input transferred utilizing e.g. a short-range wireless or cellular connection, for instance. The element 212 may be configured to visually indicate the nature of the disability and/or of the usability-enhancing action associated therewith. In this particular example, the element 212 includes the international symbol of access (ISA), through the selection of which the display view 202 is adapted from the standpoint of a predetermined disability such as a physical disability.
At 220, an example of adaptation results is depicted. Element 204b illustrates an enlarged, zoomed-in and relocated (moved closer to the screen bottom) element 204. As a result, the usability has improved relative to e.g. physically disabled users using a wheelchair etc. and having difficulty seeing the upper part of the display clearly, for instance.
Elements 206 and 210, considered as having lesser importance in the light of the usability adaptation associated with the element 212, have been removed. Alternatively, they could be at least reduced, hidden, minimized, put into the background, or otherwise de-emphasized, for example. Elements 208 have been re-arranged, including re-positioning 208b, to facilitate access thereto. Element 212 has been updated 212b regarding its visual appearance, in this case to indicate the current state ("usability adaptation on") of the associated functionality to the user. Alternatively or additionally, e.g. the future state after the next activation could be indicated by the visual update. Re-selection of element 212b, or some other predetermined user input, may switch the view 220 back into 202, for instance, so that the views 202, 220 may be alternated. A new element 222 has also appeared. For example, the element 222 may be associated with a second usability adaptation measure, such as an increase in text/font size to further facilitate reading the shown information by the visually impaired.
Figure 3 illustrates an embodiment of wayfinding in accordance with the present invention. A view 301 may be produced by a digital signage arrangement such as a media kiosk after the user thereof has input a destination location utilizing the user input entity, for example. The equipment may utilize a dynamic route determination algorithm and/or a database of at least partially ready-determined routes. Even the latter may work fine in connection with a static setting and e.g. a fixed location of the arrangement (display). The view 301 may include a 3D element 302, such as a camera-provided view including still photo image data and/or video image data, a computer-generated view, or a mixture of both, illustrating e.g. the surroundings of the arrangement in a fixed or dynamic, potentially user-controllable, direction. Preferably the direction is aligned with the initial direction of the determined route.
Optionally, predetermined or user-selected part(s) of the overall route may be visually and/or otherwise, potentially audibly, represented with route guidance data via the arrangement. At least part of the route may be shown utilizing video data, a number of images, and/or computer graphics, for example. The element 302 may include an isometric, a first-person, or a bird's eye view, for instance. Yet, route guidance data 304, 306 such as a number of arrows 304, lines, other symbols, and/or textual/numerical information, e.g. distance indicator associated with direction arrow 304 and/or a (target) location pointer 306, may be shown simultaneously with the view as superimposed thereon, for instance. In addition to or instead of route guidance data, other data such as commercial data and/or various announcements may be provided on the wayfinding view 301. Audio playback (route guidance and/or other audio data) is a further possibility.
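As an illustrative, non-authoritative sketch of selecting the superimposed guidance data, the snippet below maps the relative turn angle of the next route segment to an arrow symbol and a distance caption; the angle thresholds and the return structure are assumptions chosen for this example.

    # Illustrative sketch only: choosing the guidance symbol and distance label to
    # superimpose on the wayfinding view for the next route segment.
    def guidance_overlay(turn_angle_deg, segment_length_m):
        """Map a relative turn angle to an arrow glyph plus a distance caption."""
        if -30 <= turn_angle_deg <= 30:
            arrow = "straight"
        elif turn_angle_deg < -30:
            arrow = "left"
        else:
            arrow = "right"
        return {"symbol": arrow, "caption": "%d m" % round(segment_length_m)}

    # e.g. a right turn roughly 25 metres ahead
    print(guidance_overlay(48.0, 25.4))   # {'symbol': 'right', 'caption': '25 m'}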
With reference to Figure 4, a special embodiment of wayfinding, in particular vehicle finding, in accordance with the present invention is visualized. Based on the user input provided via a cell phone or other electronic terminal device 402, or directly via the UI of the digital signage arrangement 404, such as a touch screen, for example, the arrangement 404 may be configured to provide route information, optionally including e.g. directional information such as initial heading, for locating the target vehicle such as a parked car. The user input may include coordinates of the location of the vehicle, other identification data of the location of the vehicle or a near-by location, and/or vehicle-related identification data such as the registration number of the vehicle. Route information may be provided via the data visualization entity and/or utilizing other techniques such as a printed route note and/or a printed route map. At least part of the route may be illustrated via a plane view 406, e.g. a floor plan view, or applying 3D modeling techniques and/or camera views 408, for example. In the illustrated case 408, computer-generated route guidance data in the form of arrows has been superimposed on a camera view, being a photo image or video image view.
Regarding any afore-reviewed embodiment of wayfinding, route information and/or guidance data may in some implementations be supplied to a personal digital terminal device of a user, optionally in order to enable substantially continuous, real-time navigation service. Correspondingly, the terminal may send data such as location information to the arrangement 404. Feasible indoor and/or outdoor positioning technologies may be utilized. E.g. WLAN, Bluetooth, ZigBee, cellular (e.g. Cell-ID, TOA (Time of Arrival), TDOA (Time Difference of Arrival)), and/or satellite positioning, such as GPS (Global Positioning System) or GLONASS (Global Orbiting Navigation Satellite System), may be applied. The terminal may, after initial communication with the arrangement 404, act autonomously (information regarding the whole route received for independent utilization, for instance) or at least partially rely on information obtained from the arrangement 404 (real-time navigation instructions based on updated terminal location, for example).
Figure 5 illustrates an embodiment of product and/or service promotion in accordance with the present invention. A number of metadata entities may be associated with a number of data entities, preferably images such as photographs of subject entities like products or services to be promoted or otherwise put forth via the arrangement. A metadata entity may be or include a data entity identifier, an identifier of the data entity's subject matter such as product/service name and/or model name, a subject class such as product or service class, an indication of the dealer or other source of the subject, an indication of the dealer/source location, an indication of the manufacturer, a price indicator, and/or a free word field. A search facility 504 applying the metadata may be provided to the users by the arrangement for finding interesting products and/or services. Potential metadata values may be applied as search terms. The outcome of the search may include a number of images with metadata matching the search criteria. The search results may be visualized 502. Optionally the wayfinding methods described herein may be utilized for determining and visualizing a route to the (nearest) location offering the product or service represented by the image. Metadata search terms applied by the users may be further monitored, analyzed and exploited. For example, an indication of the most popular search criteria, and/or of related hits, i.e. search results meaning the data entities such as the images, may be listed or otherwise provided to the users and/or the operator or administrator of the arrangement. Further, the operator/administrator may be provided with a tool to more specifically manage the search terms, e.g. through censoring or "moderating".
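For illustration only, the following sketch matches user search terms against hypothetical per-image metadata records and counts term usage for a "most searched" listing; the catalogue entries, field names and matching rule are invented for this example and do not form part of the claimed search facility.

    # Illustrative sketch only: metadata search over hypothetical image records,
    # with term-usage counting for a popularity listing.
    from collections import Counter

    CATALOGUE = [
        {"image": "shoe_001.jpg", "product": "trail shoe", "class": "footwear",
         "dealer": "SportShop", "floor": 2, "price": 89.0},
        {"image": "phone_014.jpg", "product": "smartphone", "class": "electronics",
         "dealer": "PhoneWorld", "floor": 1, "price": 399.0},
    ]
    SEARCH_COUNTS = Counter()

    def search_images(term):
        term = term.lower()
        SEARCH_COUNTS[term] += 1          # kept for a "most searched" listing
        return [entry["image"] for entry in CATALOGUE
                if any(term in str(value).lower() for value in entry.values())]

    print(search_images("footwear"))      # ['shoe_001.jpg']
    print(SEARCH_COUNTS.most_common())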
Instead of utilizing merely a technical criterion, such as the total number of uses per search term or search string, or the search frequency, i.e. the number of uses per selected time period, other criteria such as commercial criteria may be additionally or alternatively applied for determining the order of the search terms in the search term list shown to the users. For example, advertisers such as product manufacturers or dealers may be willing to pay for a desired (e.g. early) position in the "most searched" or "most popular" listing.
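A minimal, non-authoritative sketch of such ordering is given below, assuming a hypothetical paid boost per term and a moderation (blocked-term) set; the scoring formula is an assumption made for this example only.

    # Illustrative sketch only: ordering the "most searched" listing by a blend of
    # use count and a hypothetical commercial weight, after moderation of blocked terms.
    def ranked_terms(use_counts, paid_boost=None, blocked=None, top_n=10):
        paid_boost = paid_boost or {}
        blocked = blocked or set()
        scored = [(count + paid_boost.get(term, 0), term)
                  for term, count in use_counts.items() if term not in blocked]
        return [term for _, term in sorted(scored, reverse=True)][:top_n]

    print(ranked_terms({"shoes": 120, "coffee": 95, "phones": 60},
                       paid_boost={"phones": 100}, blocked={"coffee"}))
    # ['phones', 'shoes']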
In addition to or instead of image data, some other form of visual representation such as video may be supplemented with related metadata. Additionally or alternatively, still some other type of data entity, such as a multimedia element, audio element, or textual element, may be supplemented with metadata. A similar reproduction and search procedure as described above may be applied correspondingly.
For example, product and/or service information may be searched and visualized. Naturally the solution is generally applicable also to other types of subject entities. The images of the search result group may be visualized, e.g. in an automatically rotating or user-controllably rotatable slideshow format. Metadata such as product/service dealer information (e.g. the location of the nearest dealer) may be represented as well.
Figure 6 is a block diagram of one embodiment in accordance with the arrangement 600 of the present invention. As contemplated hereinbefore, the arrangement 600 may physically contain a number of at least functionally connected elements. The arrangement 600 is typically provided with one or more processing devices capable of processing instructions and other data, such as one or more microprocessors, micro-controllers, DSPs (digital signal processors), programmable logic chips, etc. The processing entity 602 may thus, as a functional entity, physically comprise a plurality of mutually co-operating processors and/or a number of sub-processors connected to a central processing unit, for instance. The processing entity 602 may be configured to execute the code stored in a memory 604, which may refer to instructions and data relative to the digital signage arrangement software logic and software architecture 610 for controlling the arrangement 600. The processing entity 602 may be configured to control data visualization and optionally also audio reproduction, for example.
Similarly, the memory entity 604 may be divided between one or more physical memory chips or other memory elements. The memory 604 may store program code and other data such as marketing data, map/wayfinding data, etc. The memory 604 may further refer to and include other storage media such as a preferably detachable memory card, a floppy disc, a CD-ROM, or a fixed storage medium such as a hard drive. The memory 604 may be non-volatile, e.g. ROM (Read Only Memory), and/or volatile, e.g. RAM (Random Access Memory), by nature. Software (product) 610 may be provided on a carrier medium such as a memory card, a memory stick, an optical disc (e.g. CD-ROM or DVD), or some other memory carrier.
The UI (user interface) 612 may comprise a display or a data projector 612b, and a keyboard/keypad or other applicable user (control) input entity 612a such as a touch screen and/or a voice control input, or a number of separate keys, buttons, knobs, switches, a tag reader such as an RFID reader, a touchpad, a joystick, a mouse, and/or an imaging device such as a barcode (1st, 2nd, and/or 3rd generation compatible, for example) reader configured to provide the user of the arrangement 600 with practicable data visualization and device control means, respectively. The UI 612 may include one or more loudspeakers and associated circuitry such as D/A (digital-to-analogue) converter(s) for sound output, and optionally a microphone with an A/D converter for sound input. A printer may be included in the arrangement for providing more permanent output.
Part of the functionality of the UI 612 may be optionally arranged via a user terminal device. For example, a mobile-executable application or other software may be downloaded preferably wirelessly to the terminal for data visualization, audio playback and/or communication such as user input provision relative to the (rest of the) arrangement 600. Alternatively or additionally, browser and/or messages such as text or USSD messages may be applied for user input purposes and/or data transfer in the opposite, downlink direction.
Accordingly, the arrangement 600 may further comprise a data interface 608 such as a number of wired and/or wireless transmitters, receivers, and/or transceivers for communication with other devices such as terminals and/or network infrastructure(s).
Non-limiting examples of the generally applicable technologies include GSM (Global System for Mobile Communications), GPRS (General Packet Radio Service), EDGE (Enhanced Data rates for Global Evolution), UMTS (Universal Mobile Telecommunications System), WCDMA (wideband code division multiple access), CDMA2000, PDC (Personal Digital Cellular), PHS (Personal Handy-phone System), WLAN (Wireless LAN, wireless local area network), WiFi, Ethernet, USB (Universal Serial Bus), RFID, NFC (Near-Field Communication), and Firewire. Further, a Bluetooth adapter for peer-to-peer communication and piconet/scatternet use may be provided.
It is clear to a skilled person that the arrangement 600 may comprise numerous additional functional and/or structural elements for providing advantageous communication, processing or other features, whereupon this disclosure is not to be construed as limiting the presence of the additional elements in any manner.
Figure 7 is a flow diagram of an embodiment of a method in accordance with the present invention. At 700 an arrangement in accordance with an embodiment of the present invention is obtained and configured, for example via loading and execution of related software, for managing a digital signage installation. Optionally, a connection may be established to a network server and/or a mobile client device (terminal) carried by a user. At 702, digital signage data, such as product, service and/or wayfinding-related data, is visualized via an applicable entity such as a separate digital signage display and/or a terminal device of the user. The arrangement may be configured to receive user input via a number of options including a dedicated user interface like a touch screen. Alternatively or additionally, interface software running in a compatible terminal device carried along by the user may be applied, for example. At 704, the user input is captured, the user input being indicative of a request for adaptation of the present view visualized via the visualization entity in terms of at least one usability issue, such as disability and/or other property of the user, e.g. a physical property such as height. At 706, the position, size and/or the way of representation, such as style, of at least one element of the view is adapted in response to the user input so as to improve the usability of the arrangement in the light of the aforesaid usability issue. At 708, the method execution is ended. Broken lines depict the potentially repetitive nature of various method items.

At 710, a method flow diagram relating to an embodiment of wayfinding in connection with digital signage is illustrated. Initial and final method items (not shown) with preparatory and concluding actions, respectively, may be similar to the ones 700, 708 of the adaptation method also in this case. The method may actually be executed jointly with the adaptation method, in parallel therewith, or independently. At 712, the route/navigation starting point is resolved. It may be, by default, associated with the location of the digital signage arrangement, or at least e.g. the display thereof, if the related elements are physically separate. Optionally the user may determine the starting point utilizing the user input entity, and further optionally e.g. his/her personal terminal device may be capable of preferably wirelessly providing route source location information to the signage gear. At 704b, an indication of the route target location, or a plurality of locations such as a number of waypoints and the final destination, is received, preferably again utilizing the user input entity. The target location may imply the location of a particular store, for instance. Alternatively, as in the afore-explained special embodiment, the target location may indicate the location of a predetermined vehicle. At 714, at least part of the route, preferably at least e.g. the initial direction, is determined, which may refer to dynamic calculations and/or utilization of pre-defined routes ("route database"), for example. At 702b, route information, such as a 3D view like a video camera or still (photo) image view preferably augmented with route navigation data such as instructive arrows, is provided utilizing the visualization entity and/or the data interface for transmitting data to be visualized to an external device such as a terminal device of the user as mentioned hereinbefore. In the case of real-time navigation, the terminal location may be acquired using e.g.
network-based or terminal-based positioning, whereupon the route information may be dynamically determined on the basis of the location by the arrangement and/or the terminal itself.
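Purely as an illustration of method items 702-706, the sketch below expresses the visualize/obtain-input/adapt cycle as a plain control loop; all callables are hypothetical placeholders for the entities described above, and the repetition count merely mirrors the broken-line repetition indicated in Fig. 7.

    # Illustrative sketch only: the adaptation method items expressed as a loop.
    def run_signage(visualize_view, poll_user_input, adapt_view, max_cycles=3):
        view = {"elements": ["floor_plan", "ads", "clock"]}
        for _ in range(max_cycles):                 # repetition indicated by broken lines in Fig. 7
            visualize_view(view)                    # item 702: visualize digital signage data
            request = poll_user_input()             # item 704: capture adaptation request
            if request:
                view = adapt_view(view, request)    # item 706: adapt the view accordingly

    # Example wiring with trivial stand-ins:
    run_signage(lambda v: print("showing", v["elements"]),
                lambda: "wheelchair",
                lambda v, r: {"elements": [e for e in v["elements"] if e != "ads"]})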
The scope can be found in the following claims. Notwithstanding the various embodiments described hereinbefore in detail, a person skilled in the art will understand that different modifications may be introduced to the explicitly disclosed solutions without diverging from the essence of the present invention as set forth in this text and defined by the independent claims.

Claims

1. A computerized electronic arrangement (106, 108, 110, 114, 404, 600) for digital signage, such as a media kiosk like a touch screen stand, comprising
-a memory entity (604), such as one or more memory chips, configured to store data, such as product, service, and/or wayfmding information, to be visualized via the arrangement,
-a visualization entity (612b), such as a display, configured to visualize the data,
-a processing entity (602), such as at least one microprocessor or microcontroller, configured to control the visualization of the data, and
-a user input entity (612a, 608), such as a touch screen functionality associated with the data visualization entity, configured to receive user input indicative of a request by the user of the arrangement for adaptation of the present view visualized via the visualization entity in terms of at least one usability issue, such as disability and/or other property of the user, e.g. physical property such as height, whereupon the processing entity is configured to adapt the position (208, 208b), size (204, 204b), and/or the way of representation (212, 212b) of at least one element of the view in response to the user input so as to improve the usability of the arrangement in the light of the aforesaid usability issue (202, 220).
2. The arrangement of claim 1, configured to apply the adaptation utilizing at least one action selected from the group consisting of: move the element lower in the view, move the element upper in the view, center the element, zoom in or out the element or at least part thereof, change the font type associated with the element, change the font size associated with the element, change the font emphasis associated with the element, change the element size, enlarge the element, divide the element into a number of sub-elements, change at least one color associated with the element, switch symbolic and/or graphical representation into a textual one or vice versa, reproduce audio signal describing the element, such as speech or music sample, or synthesized speech, and reproduce visual representation describing an audio signal such as reproduced speech.
3. The arrangement of any preceding claim, configured to remove an element from the view (206) or add an element to the view (222) as a part of the adaptation.
4. The arrangement of any preceding claim, configured to visualize an element associated with the adaptation (212, 212b), such as an icon or symbol forming a part of the view, wherein the element and/or the related adaptation is addressable by the user preferably via the user input entity for triggering the adaptation.
5. The arrangement of any preceding claim, comprising a data transfer interface (608) for communication with a user terminal, a server, and/or a network infrastructure.
6. The arrangement of any preceding claim, configured to receive an indication of a target location as set by the user and to determine at least part of the route thereto such as at least initial direction.
7. The arrangement of claim 6, further configured to visualize at least part of the route (301, 304, 406, 408), such as initial direction, preferably via a 3D view (302), such as an isometric or a first-person view.
8. The arrangement of claim 7, configured to visualize at least part of the route utilizing at least one view selected from the group consisting of: computer-generated artificial view, natural photo view augmented with route guidance data, natural video view augmented with route guidance data, edited photo view augmented with route guidance data, and edited video view augmented with route guidance data, said route guidance data optionally including an arrow symbol.
9. The arrangement of any of claims 6-8, configured to obtain an indication of the location of a target vehicle, identity of a near-by location, and/or identity of the vehicle, and to determine at least part of the route substantially to the location of the vehicle based on the indication (406, 408).
10. The arrangement of any of claims 6-9, wherein the indication is provided as tag-, such as RFID (Radio Frequency Identification) tag, or barcode-based information, optionally utilizing a tag reader or a barcode reader, respectively.
11. The arrangement of any preceding claim, configured to store a number of data elements, such as images relating to products and/or services, associated with a number of metadata entities, and to provide a data element search facility to a user such that a number of user-determined search criteria is obtained and matching one or more data elements are retrieved and preferably visualized to the user.
12. The arrangement of claim 11, configured to maintain at least a use instance number or use frequency-based ranking relative to used search criteria, optionally with search criteria censoring and/or utilization of one or more additional ranking criteria.
13. The arrangement of any preceding claim, wherein the user input entity comprises a touch screen, optionally with a dual-touch or a multi-touch capability, further optionally wherein the arrangement is configured to convert a dual-touch or multi-touch input into a location change of an element of the view.
14. A method for adapting a digital signage view, said method comprising
-visualizing data (702), such as product, service, and/or wayfinding information, via a digital signage data visualization entity, such as a display,
-obtaining user input via a user input entity, such as a touch screen functionality associated with the data visualization entity, the user input being indicative of a request by the user of the arrangement for adaptation of the present view visualized via the visualization entity in terms of at least one usability issue (704), such as disability and/or other property of the user like a physical property such as height, and
-adapting the position, size and/or the way of representation of at least one element of the view in response to the user input so as to improve the usability of the arrangement in the light of the aforesaid usability issue (706).
15. The method of claim 14, comprising obtaining an indication of a target location defined by the user (704b) and determining at least part of the route thereto (714), such as at least initial direction.
16. The method of claim 15, comprising visualizing at least part of the route (702b), such as initial direction, preferably via a 3D view, such as an isometric or first-person view.
17. Computer software comprising code means adapted, when run on a computer, to execute the method items of any of claims 14-16.
18. A carrier medium comprising the computer software of claim 17.
EP10857203.3A 2010-09-13 2010-09-13 Arrangement and related method for digital signage Ceased EP2616909A4 (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/FI2010/050701 WO2012035191A1 (en) 2010-09-13 2010-09-13 Arrangement and related method for digital signage

Publications (2)

Publication Number Publication Date
EP2616909A1 true EP2616909A1 (en) 2013-07-24
EP2616909A4 EP2616909A4 (en) 2014-12-03

Family

ID=45831045

Family Applications (1)

Application Number Title Priority Date Filing Date
EP10857203.3A Ceased EP2616909A4 (en) 2010-09-13 2010-09-13 Arrangement and related method for digital signage

Country Status (2)

Country Link
EP (1) EP2616909A4 (en)
WO (1) WO2012035191A1 (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3264041A1 (en) * 2016-06-28 2018-01-03 Yim Mai Amy Lee Navigation system and method
JP7410690B2 (en) * 2019-11-01 2024-01-10 株式会社ファーストリテイリング Information processing device, program, information processing method, and signage system
CN111178579A (en) * 2019-11-26 2020-05-19 恒大智慧科技有限公司 Automatic navigation method in intelligent community, computer equipment and readable storage medium
NL1043806B1 (en) * 2020-10-05 2022-06-03 Atsence B V Inclusive personal wayfinding assistive method and system for electronic devices for all, in particular for visually impaired, dyslectics, the color blind, elderly and children.

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6049328A (en) * 1995-10-20 2000-04-11 Wisconsin Alumni Research Foundation Flexible access system for touch screen devices
US6061666A (en) * 1996-12-17 2000-05-09 Citicorp Development Center Automatic bank teller machine for the blind and visually impaired
US6907576B2 (en) * 2002-03-04 2005-06-14 Microsoft Corporation Legibility of selected content
WO2003096305A1 (en) * 2002-05-14 2003-11-20 Ascom Autelca Ag Method, system interface and apparatus for handicapped users
WO2004073512A1 (en) * 2003-02-21 2004-09-02 Harman/Becker Automotive Systems (Becker Division) Gmbh Method for obtaining a colour palette in a display to compensate for colour blindness

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
No further relevant documents disclosed *
See also references of WO2012035191A1 *

Also Published As

Publication number Publication date
WO2012035191A1 (en) 2012-03-22
EP2616909A4 (en) 2014-12-03

Similar Documents

Publication Publication Date Title
US10509477B2 (en) Data services based on gesture and location information of device
US10728706B2 (en) Predictive services for devices supporting dynamic direction information
JP5486680B2 (en) Portal service based on dialogue with points of interest detected via directional device information
US20170249748A1 (en) System and method for converting gestures into digital graffiti
JP5456799B2 (en) Device transaction model and service based on device direction information
CN108337907B (en) System and method for generating and displaying location entity information associated with a current geographic location of a mobile device
JP6580703B2 (en) System and method for disambiguating a location entity associated with a mobile device's current geographic location
US8769442B2 (en) System and method for allocating digital graffiti objects and canvasses
CN107167137B (en) Route recommendation method in indoor place and user terminal
US20080250337A1 (en) Identifying interesting locations based on commonalities in location based postings
US9739631B2 (en) Methods and systems for automatically providing point of interest information based on user interaction
US11790022B2 (en) User interfaces and methods for operating a mobile computing device for location-based transactions
WO2012035191A1 (en) Arrangement and related method for digital signage
Samuel et al. Smart indoor navigation and proximity advertising with android application using BLE technology
Krammer et al. Findings from a Location Aware Smartphone Application for a novel Retail Shopping Experience

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 20130415

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO SE SI SK SM TR

DAX Request for extension of the european patent (deleted)
A4 Supplementary search report drawn up and despatched

Effective date: 20141105

RIC1 Information provided on ipc code assigned before grant

Ipc: G06Q 30/06 20120101AFI20141030BHEP

Ipc: G06F 3/048 20130101ALI20141030BHEP

Ipc: G06Q 10/04 20120101ALI20141030BHEP

Ipc: G06F 3/033 20130101ALI20141030BHEP

Ipc: G06F 3/041 20060101ALI20141030BHEP

17Q First examination report despatched

Effective date: 20171117

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: EXAMINATION IS IN PROGRESS

REG Reference to a national code

Ref country code: DE

Ref legal event code: R003

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION HAS BEEN REFUSED

18R Application refused

Effective date: 20201101