WO2012035191A1 - Arrangement and related method for digital signage - Google Patents

Arrangement and related method for digital signage

Info

Publication number
WO2012035191A1
Authority
WO
WIPO (PCT)
Prior art keywords
arrangement
view
data
entity
user
Prior art date
Application number
PCT/FI2010/050701
Other languages
English (en)
Inventor
Petteri Lappalainen
Markus Porvari
Timo Arnivuo
Mikko Ahtiainen
Heikki Uljas
Original Assignee
Hyperin Inc.
Priority date
Filing date
Publication date
Application filed by Hyperin Inc.
Priority to PCT/FI2010/050701
Priority to EP10857203.3A
Publication of WO2012035191A1

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06Q - INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q 30/00 - Commerce
    • G06Q 30/06 - Buying, selling or leasing transactions
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01C - MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C 21/00 - Navigation; Navigational instruments not provided for in groups G01C 1/00 - G01C 19/00
    • G01C 21/20 - Instruments for performing navigational calculations
    • G01C 21/206 - Instruments for performing navigational calculations specially adapted for indoor navigation
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06Q - INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q 10/00 - Administration; Management
    • G06Q 10/04 - Forecasting or optimisation specially adapted for administrative or management purposes, e.g. linear programming or "cutting stock problem"
    • G06Q 10/047 - Optimisation of routes or paths, e.g. travelling salesman problem
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06Q - INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q 30/00 - Commerce
    • G06Q 30/06 - Buying, selling or leasing transactions
    • G06Q 30/0601 - Electronic shopping [e-shopping]

Definitions

  • the present invention generally relates to the fields of digital information storage, processing, transfer and representation.
  • the invention concerns digital signage and dynamic adaptation thereof in view of usability.
  • Digital marketing is an evolving technology, used in retail settings, that brings promotional, advertising and informational content to local out-of-home environments.
  • Various studies indicate that generally over 70% of consumer decision making takes place at the location of the purchased service or product, and that digital advertisements are 5-10 times more likely to be noticed than static media. It is accordingly no wonder that digital marketing is one of the world's fastest growing marketing channels, with a projected volume of 11.4 b€ by the year 2011, yet it is still only picking up momentum due to heavy costs, complicated hardware or system projects and the lack of standardized services.
  • Digital signage generally refers to the provision of information, such as wayfinding data or marketing data including advertisements, to visual perception by way of digital signs, i.e. digital displays like flat screens, media kiosks or feasible projection means, for instance.
  • the content to be visualized via the digital signage displays may thus be flexibly controlled, even remotely, by administrative personnel through access to the associated content management entity, such as a content server, and a network infrastructure, such as a wired or wireless network, connecting the server with a number of signage displays.
  • Data may be ultimately locally visualized by each display such that various environmental factors, such as ambient lighting conditions, are also catered for.
  • a typical solution for e.g. indoor navigation in connection with a digital signage display located in a mall, airport or other location merely shows the associated floor plans in a stripped-down 2D map format, which is, however, rather sub-optimal regarding e.g. the natural human sense of the corresponding space, and may result in annoying misjudgments by the users relative to the correct position, heading, and route in the space.
  • the objective of the present invention is to at least alleviate one or more of the afore-explained defects associated with the prior art solutions and provide an interactive and adaptive alternative for digital signage equipment such as media kiosks or other terminals.
  • the objective is met by a digital signage arrangement and a related method in accordance with the present invention.
  • the digital signage arrangement may be configured to display data such as product, service, and/or wayfinding information to a user thereof via a suitable visualization device, such as a display or a projector, and further be configured to receive user input on the basis of which the view, i.e. one or more elements thereof, produced by the visualization device may be adapted so as to elevate the usability thereof from the standpoint of the particular user.
  • the arrangement may be physically realized as a substantially integrated entity such as a media kiosk, e.g. a touch screen stand, or through co-operation of multiple at least functionally connected entities such as a (touch)screen or other type of input/visualization entity connected to a remote computer, or as a more comprehensive terminal device, such as a mobile terminal, a PDA (personal digital assistant), or a tablet computer running e.g. a web browser and/or other enabling application and being connected to a remote computer system providing data, for instance.
  • a computerized electronic arrangement for digital signage, such as a media kiosk like a touch screen stand, comprises
  • a memory entity, such as one or more memory chips, configured to store data, such as product, service, and/or wayfinding information, to be visualized via the arrangement,
  • a visualization entity, such as a display, configured to visualize the data,
  • a processing entity, such as at least one microprocessor or microcontroller, configured to control the visualization of the data, and
  • a user input entity, such as a touch screen functionality associated with the data visualization entity, configured to receive user input indicative of a request by the user of the arrangement for adaptation of the present view in terms of at least one usability issue.
  • the arrangement may further be configured to perform the adaptation through utilization of at least one option selected from the group consisting of: move the element lower, e.g.
  • the arrangement may be configured to visualize an element associated with the adaptation, such as an icon or symbol forming e.g. a part of the view, wherein the element is selectable by the user utilizing the user input entity for triggering the adaptation of the view.
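  • Purely as an illustrative sketch and not part of the original disclosure, the entities listed above and the selectable adaptation element could be modelled in a browser-based kiosk implementation roughly as follows; all type, property and function names are hypothetical assumptions.

```typescript
// Hypothetical view model for a browser-based signage kiosk (illustrative only).

type UsabilityIssue = "wheelchair" | "low-vision" | "hearing-impaired";

interface ViewElement {
  id: string;
  kind: "map" | "ad" | "clock" | "button" | "adaptation-icon";
  x: number;          // position within the view, in pixels
  y: number;
  width: number;
  height: number;
  priority: "primary" | "secondary";
}

interface SignageView {
  elements: ViewElement[];
}

// The adaptation icon (cf. element 212) simply carries the usability issue it triggers.
interface AdaptationTrigger extends ViewElement {
  kind: "adaptation-icon";
  issue: UsabilityIssue;
}

// Selecting the trigger via the user input entity would hand the issue to the
// processing entity, which then adapts the view (see the adaptation sketch further below).
function onElementSelected(selected: ViewElement): UsabilityIssue | null {
  return selected.kind === "adaptation-icon"
    ? (selected as AdaptationTrigger).issue
    : null;
}
```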
  • the arrangement may include a data transfer interface, such as at least one interface for wired communication and/or at least one wireless transceiver, for communication with at least one external entity. Alternatively or additionally, as being potentially physically distributed, one or more entities of the arrangement may mutually communicate utilizing the data transfer interface. Data to be transferred may include control instructions and/or data to be visualized to the users via the display entity (or otherwise reproduced, e.g. audibly), for example. Likewise, data such as usage statistics, user input and/or related requests may be transferred via the data transfer interface.
  • the arrangement, such as the user input entity thereof, may include a touch screen for data output and input.
  • the display element of the touch screen visualizes the data
  • the touch-sensitive part, e.g. an optical, infrared, imaging, resistive, and/or capacitive part, registers the user's touch, including one or more parameters such as location, pressure (force), duration, etc.
  • the touch screen includes a dual-touch or a multi-touch feature capable of simultaneously registering two or more touches and preferably also touch locations, respectively. Different functionalities such as element movement, zooming, and/or element activation may be controlled accordingly.
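  • As a hedged example only, a dual-touch zoom gesture of the kind mentioned above could be registered in a browser-based kiosk with standard Pointer Events roughly as sketched below; the element id "signage-view" and the scaling behaviour are assumptions, not taken from the publication.

```typescript
// Minimal dual-touch (pinch) zoom sketch using standard browser Pointer Events.
// Assumes an element with id "signage-view" exists; illustrative only.

const active = new Map<number, { x: number; y: number }>();
let lastDistance: number | null = null;
let scale = 1;

const view = document.getElementById("signage-view")!;

view.addEventListener("pointerdown", (e: PointerEvent) => {
  // Each touch is registered with its own id, location (and, where supported, e.pressure).
  active.set(e.pointerId, { x: e.clientX, y: e.clientY });
});

view.addEventListener("pointermove", (e: PointerEvent) => {
  if (!active.has(e.pointerId)) return;
  active.set(e.pointerId, { x: e.clientX, y: e.clientY });

  if (active.size === 2) {
    // Distance between the two registered touch locations.
    const [a, b] = [...active.values()];
    const distance = Math.hypot(a.x - b.x, a.y - b.y);
    if (lastDistance !== null) {
      scale *= distance / lastDistance;           // grow/shrink with the pinch
      view.style.transform = `scale(${scale})`;
    }
    lastDistance = distance;
  }
});

const clear = (e: PointerEvent) => {
  active.delete(e.pointerId);
  if (active.size < 2) lastDistance = null;
};
view.addEventListener("pointerup", clear);
view.addEventListener("pointercancel", clear);
```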
  • the user may indeed be provided with a possibility to change e.g.
  • the arrangement may include at least one user input entity selected from the group consisting of: a switch, button, keypad, keyboard, and touchpad.
  • the aforementioned data interface such as a predetermined transceiver may be configured to receive user input utilizing wired or wireless data transfer technology.
  • suitable wireless technology, e.g. Bluetooth, RFID, NFC (Near Field Communication), WiFi/WLAN (Wireless LAN), and/or a selected cellular network technology (GSM, UMTS, CDMA2000, etc.)
  • GSM Global System for Mobile Communications
  • UMTS Universal Mobile Telecommunications System
  • CDMA2000 Code Division Multiple Access 2000
  • a user may carry a personal communications-enabled device such as a cellular phone, a PDA, or a tablet computer along and apply it for providing user input to the arrangement.
  • Input may be submitted using textual messages such as SMS (Short Message Service) messages or via a USSD (Unstructured Supplementary Service Data) application, for instance.
  • SMS Short Message Service
  • USSD Unstructured Supplementary Service Data
  • a call to a predetermined number optionally shown via or on the arrangement may be applied for input purposes.
  • DTMF Dual-tone Multi-Frequency
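  • As one hedged illustration of how such message- or call-based input might be mapped to an adaptation request, a trivial parser could look like the following; the command vocabulary ("LOW", "ZOOM", "TEXT") is invented purely for the example.

```typescript
// Illustrative parser for a short textual command received e.g. as an SMS body or USSD string.
// The command words and their meanings are hypothetical, not defined by the publication.

type AdaptationRequest =
  | { kind: "move-lower" }
  | { kind: "zoom"; factor: number }
  | { kind: "audio-to-text" }
  | { kind: "unknown"; raw: string };

function parseTextCommand(body: string): AdaptationRequest {
  const [word, arg] = body.trim().toUpperCase().split(/\s+/);
  switch (word) {
    case "LOW":  return { kind: "move-lower" };
    case "ZOOM": return { kind: "zoom", factor: Number(arg) || 1.5 };
    case "TEXT": return { kind: "audio-to-text" };
    default:     return { kind: "unknown", raw: body };
  }
}

// Example: an SMS body "ZOOM 2" would request doubling the size of the view content.
console.log(parseTextCommand("ZOOM 2")); // { kind: "zoom", factor: 2 }
```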
  • the arrangement may be configured to provide wayfinding aid to a user thereof. In order to achieve this, e.g. the visualization entity may be applied.
  • the aforementioned wayfinding information may refer to location, map, and/or route information, and related data, for example, available via the arrangement.
  • a 2D map, floor plan, or other information may be shown to the users with optional route guidance data such as graphical data including e.g. arrows. Audio guidance such as spoken route instructions may be provided.
  • a substantially 3D representation displayed via a 3D projection, i.e. utilizing a two-dimensional plane provided by a display, for example
  • an isometric view or other wayfinding-enabling view, such as a bird's eye view
  • the predetermined direction is the initial direction in which the user has to travel in order to reach the destination from the location.
  • the view may be a first-person view, i.e.
  • the wayfinding view may thus generally refer to a computer-generated, i.e. artificial, view with artificial elements such as direction arrows, and/or it may comprise natural elements such as a static or real-time camera view augmented with artificial elements like computer-generated route guidance data, e.g. the arrow symbols, overlaid thereon and resulting in an augmented reality view. Route information may thus be cleverly shown also with the 3D view.
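  • To make the notion of the initial direction concrete, the bearing from the signage location to the first waypoint of a route could, under a flat floor-plan approximation, be computed as in the following sketch; the coordinate convention is an assumption.

```typescript
// Initial heading from the signage location to the first route waypoint,
// using a simple planar approximation suitable for an indoor floor plan.

interface Point { x: number; y: number } // e.g. metres in floor-plan coordinates

/** Returns the heading in degrees, 0 = "up"/north on the plan, clockwise positive. */
function initialHeadingDeg(from: Point, firstWaypoint: Point): number {
  const dx = firstWaypoint.x - from.x;
  const dy = firstWaypoint.y - from.y;
  // atan2 measured from the positive y axis ("up" on the plan), clockwise.
  const deg = (Math.atan2(dx, dy) * 180) / Math.PI;
  return (deg + 360) % 360;
}

// The 3D/camera view (cf. element 302) could then be rotated so that its centre
// axis matches this heading, with a guidance arrow drawn pointing straight ahead.
console.log(initialHeadingDeg({ x: 0, y: 0 }, { x: 3, y: 3 })); // 45
```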
  • the wayfinding embodiment may also be implemented as a stand-alone solution potentially independent of the view adaptation described hereinbefore.
  • the arrangement may be configured to provide vehicle finding aid, which may be considered as a special case of wayfinding.
  • For example, in the context of a mall, airport or other big complex with large-scale parking facilities, such as an optionally multi-storey garage facility, finding one's own vehicle, such as a car, after having spent some time doing completely different things, such as shopping, may turn out to be challenging for various reasons: quite often all the garages and related structures, parking spaces, etc. look the same, and without carefully observing characteristic features such as the floor number, near-by structures, and the route walked upon leaving the vehicle in the first place, a considerable amount of time may be unnecessarily consumed in finding the vehicle again.
  • the arrangement may thus be arranged to provide route information, optionally including e.g. direction information such as the initial heading, for locating the vehicle on the basis of user input.
  • the user input may include at least one element selected from the group consisting of: coordinates of the location of the vehicle, identification data of the location of the vehicle, and identification data such as the registration number of the vehicle. Data included in the user input may first be manually acquired by the user. He/she may write it down or, upon parking the vehicle, type it in or scan (photograph, for example) it utilizing his/her electronic notebook device, such as a cellular phone or PDA, or other available device.
  • the data visualized on a near-by display or surface may include a floor number and/or other ID, and a parking space number and/or other ID.
  • a barcode or a tag, e.g. an RFID tag, may be applied for representing and/or carrying the necessary information.
  • Identification data relating to the vehicle itself may be utilized in route determination provided that the parking facilities include technology, such as imaging technology, to detect and recognize the vehicle accordingly and monitor its movements and/or trace its parking location.
  • a vehicle may contain a tag that can be externally read for tracking purposes.
  • the arrangement may be configured to determine a route from the location of the arrangement, an element such as the visualization entity thereof, or some other predetermined location, back to the vehicle.
  • the arrangement may utilize a number of predetermined route determination criteria such as route length and/or estimated route duration. Typically minimization of both criteria is sought after.
  • a database of ready-determined routes may be applied.
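  • Purely as an illustration of such route determination, the sketch below first consults a table of ready-determined routes and otherwise falls back to a shortest-path search over a walkway graph, with an edge cost combining route length and estimated duration; the weighting and data layout are assumptions, not prescribed by the publication.

```typescript
// Illustrative route determination: precomputed routes first, Dijkstra fallback.
// Node ids, the cost weighting and the graph layout are hypothetical; the graph is
// assumed to contain an adjacency entry (possibly empty) for every node.

interface Edge { to: string; lengthM: number; durationS: number }
type Graph = Map<string, Edge[]>;

const precomputedRoutes = new Map<string, string[]>(); // key: "from->to", value: node path

function routeCost(e: Edge, wLength = 0.5, wDuration = 0.5): number {
  return wLength * e.lengthM + wDuration * e.durationS; // minimize length and duration together
}

function findRoute(graph: Graph, from: string, to: string): string[] | null {
  const ready = precomputedRoutes.get(`${from}->${to}`);
  if (ready) return ready;

  // Plain Dijkstra over the combined cost (no priority queue; fine for small graphs).
  const dist = new Map<string, number>([[from, 0]]);
  const prev = new Map<string, string>();
  const unvisited = new Set<string>(graph.keys());

  while (unvisited.size > 0) {
    let current: string | null = null;
    for (const n of unvisited) {
      if (dist.has(n) && (current === null || dist.get(n)! < dist.get(current)!)) current = n;
    }
    if (current === null) break;        // remaining nodes are unreachable
    unvisited.delete(current);
    if (current === to) break;

    for (const e of graph.get(current) ?? []) {
      const candidate = dist.get(current)! + routeCost(e);
      if (candidate < (dist.get(e.to) ?? Infinity)) {
        dist.set(e.to, candidate);
        prev.set(e.to, current);
      }
    }
  }

  if (!dist.has(to)) return null;
  const path = [to];
  while (path[0] !== from) path.unshift(prev.get(path[0])!);
  return path;
}
```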
  • the vehicle finding aid may exploit the aforesaid more generic wayfinding feature for initial direction visualization, for instance.
  • the arrangement may be configured to associate a number of metadata entities with a number of data entities, preferably images such as photographs of subject entities like products or services to be promoted or otherwise put forth via the arrangement.
  • a metadata entity describing an image may include at least one metadata element selected from the group consisting of: image identifier, identifier of the image's subject matter such as product name and/or product model name, subject class such as product or service class, dealer or other source of the subject, dealer location, manufacturer, price, and free word field.
  • An image may be shown via the visualization entity in a (view) position and/or at a time instant depending on the metadata. Images may be searched using the metadata and a related search facility offered by the arrangement.
  • Potential metadata values may be applied as search terms by the user of the arrangement.
  • the result group of the search may include images with metadata matching the query criteria.
  • Search results, i.e. images, matching the query may be visualized.
  • the wayfinding methods described herein may be utilized for determining and visualizing a route to the (nearest) location offering the product or service represented by an image.
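  • A minimal sketch of such a metadata record and search facility is given below; the field names mirror the metadata elements listed above but are otherwise invented for illustration.

```typescript
// Illustrative metadata record and free-text search over promoted images.
// Field names and the matching rule are hypothetical examples.

interface ImageMetadata {
  imageId: string;
  productName: string;
  productClass: string;   // e.g. "shoes", "restaurant"
  dealer: string;
  dealerLocation: string; // e.g. "2nd floor, unit 214"
  manufacturer: string;
  price: number;
  freeText: string;
}

interface PromotedImage { url: string; meta: ImageMetadata }

/** Returns images whose metadata contains every search term (case-insensitive). */
function searchImages(catalog: PromotedImage[], query: string): PromotedImage[] {
  const terms = query.toLowerCase().split(/\s+/).filter(Boolean);
  return catalog.filter(({ meta }) => {
    const haystack = Object.values(meta).join(" ").toLowerCase();
    return terms.every((t) => haystack.includes(t));
  });
}

// The resulting images could then be shown e.g. as a slideshow, and the dealerLocation
// field handed to the wayfinding feature to plot a route to the nearest dealer.
```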
  • a method for adapting digital signage views comprises
  • obtaining user input via a user input entity, such as a touch screen functionality associated with a digital signage data visualization entity, such as a display,
  • the user input being indicative of a request by the user of the arrangement for adaptation of the present view visualized via the visualization entity in terms of at least one usability issue, such as a disability and/or other property of the user, e.g. a physical property such as height, and
  • adapting the position, size and/or way of representation of at least one element of the view in response to the user input so as to improve the usability in the light of the aforesaid usability issue.
  • the utility of the present invention arises from a plurality of factors depending on each particular embodiment.
  • use experience associated with digital signage equipment may be enhanced by adapting data representation according to user preferences.
  • wayfinding may be made more illustrative and user-friendly through the proposed 3D modeling and/or augmented camera view solution.
  • Vehicle location finding embodiments may provide additional value to the users of digital signage gear when they are about to exit e.g. a mall provided with a huge parking facility with little or no memory of the location of their vehicles, for instance.
  • products purchasable at or near the location of the digital signage user equipment may facilitate the implementation of a both versatile and efficient product/dealer search-and-find feature in the context of digital signage, with optional hooks such as wayfinding.
  • the users may be provided with a more targeted and complete information package concerning each marketed service or product.
  • data transfer may refer to transmitting data, receiving data, or both, depending on the role(s) of a particular entity under analysis relative to a data transfer action, i.e. a role of a sender, a role of a recipient, or both.
  • the terms “a” and “an” do not denote a limitation of quantity, but denote the presence of at least one of the referenced item.
  • first and second do not denote any order, quantity, or importance, but rather are used to distinguish one element from another.
  • Fig. 1 illustrates the general concept of the present invention according to an embodiment thereof.
  • Fig. 2a illustrates an embodiment of the present invention as to the view adaptation feature suggested.
  • Fig. 2b illustrates the embodiment of Figure 2a at a second instant after acquisition of user input and related adaptation.
  • Fig. 3 illustrates an embodiment of wayfinding in accordance with the present invention.
  • Fig. 4 illustrates an embodiment of wayfinding, in particular vehicle finding, in accordance with the present invention.
  • Fig. 5 illustrates an embodiment of product/service promotion in accordance with the present invention.
  • Fig. 6 is a block diagram of one embodiment in accordance with the arrangement of the present invention.
  • Fig. 7 depicts a flow diagram of a method in accordance with the present invention to be performed by the server arrangement.
  • An arrangement according to an embodiment of the present invention may be configured to decide on the selected adaptation based on a predetermined logic and the nature of the user input.
  • the user input thus preferably indicates the usability issue to be tackled.
  • an icon, other symbol, graphical element and/or text which is associated technically, i.e. by the logic of the arrangement, and preferably also mentally with a certain usability issue, such as user disability and/or other property, may be visualized using the visualization entity, the selection or other activation measure of which by the user, e.g. via the touch screen of the arrangement, subsequently triggers the adaptation procedure.
  • a symbol of a wheelchair may be associated with a corresponding disability affecting e.g.
  • the user input may even be automatically provided to the arrangement via a personal terminal device of the user, such as a mobile terminal or communications-enabled tablet, a PDA, or e.g. a wristop computer.
  • the terminal device may be configured to store user-adjustable settings for digital signage to be transferred, advantageously automatically, upon detection of external signage equipment (e.g. via an active scan procedure), or not until user confirmation/initiation is obtained via the UI of the terminal.
  • the view may be adapted by adapting at least one visualized element thereof, and/or the view may be adapted by adding thereto and/or removing therefrom a number of elements.
  • an already visualized element may be adapted to lessen the burden of consuming it. Consuming may refer to reading, for example, in the case of an element including text from the standpoint of a weak-eyed or visually impaired user.
  • a new element such as a textual and/or graphical element may be visualized; for example, in the case of a hearing-impaired user, a scarcely heard audio message such as an ad may be suitably converted into a text visually represented to the user.
  • a number of audio-related factors such as audio level and/or reproduction speed or tempo of speech or other audio may be adapted. For example, playback volume/audio level may be increased.
  • Elements visualized but considered, according to predetermined logic, as secondary may be removed, moved to background, minimized, or otherwise de-emphasized in favour of priority element(s).
  • commercial information such as an ad may be considered as secondary in contrast to e.g. geographical information such as a map including e.g. a floor plan.
  • priority element(s) may be highlighted and emphasized during the adaptation for elevated usability.
  • Reverting to Figure 1, the overall architecture applicable in connection with the present invention is illustrated according to one potential embodiment 101.
  • the arrangement of the present invention may be implemented via a device 106, 108, such as a media kiosk or other display-containing device, best suiting each particular context in question 102, 104, which may refer to a mall, bus station, metro station, airport, commercial building, company premises, etc.
  • the device 106, 108 may also be formed of a plurality of at least functionally connected devices such that e.g. necessary power and/or communications connections are established in between to form the arrangement.
  • the arrangement may be provided with a connection or at least connectivity to a number of wired and/or wireless network infrastructures 112, which may refer to public or private network(s) both alike.
  • An Internet connection/access may be arranged, for example.
  • external entities such as a number of servers 114, e.g. a cloud computing infrastructure, functionally connected to the arrangement via the Internet and/or using a private network, for instance, may be applied.
  • the server 114 may host, e.g. in a commercial context, a web site for consumers, i.e. potential end users of the arrangement, and/or an administration portal, which may be used for creating the content for the user interface, such as shop information and ads, and/or for information sharing between product and service providers creating the consumer content.
  • User-related data such as user selections and other user input may be gathered for analysis, e.g. profiling and targeted marketing purposes.
  • the arrangement may be considered to at least logically contain a number of communications network(s) 112 and/or external entities such as servers 114. Yet, in some embodiments the arrangement may even substantially consist of or contain a wireless mobile terminal device 110 such as a cellular phone, a PDA (personal digital assistant), a tablet computer, or a laptop computer, which is discussed further below.
  • the arrangement may be provided with a display device, user input device (these two may be cleverly combined in a touch screen, for example), a computing device such as a number of processors, and a memory device such as a number of memory chips. Further, a number of optional elements such as a data transfer interface for communication with external entities may be provided therewith.
  • each element may have to be provided with a data transfer means sufficient for necessary mutual communication.
  • the elements of the arrangement and their higher-level configuration remind one of a personal computer (PC), for example, which makes it functionally possible to implement a similar solution in a mobile personal terminal device, provided that the mobile device is configured to contain and/or obtain necessary data, such as location data and location-related data, for adequately serving the user in terms of digital signage.
  • the mobility may also open up new possibilities relative to location-awareness and wayfmding features provided by some embodiments of the arrangement as described in more detail hereinbelow.
  • the arrangement, such as a media kiosk, may thus include a PC running a web browser with necessary add-on applications and features.
  • a web application framework such as Silverlight™ may be utilized for constructing the functionality of the arrangement.
  • suitable supplementary software, such as the Real Kiosk (R-kiosk) extension for the Firefox™ browser, may further be applied to turn the browser more into a kiosk-style browser UI.
  • the arrangement may offer an easy and simple way to find products and services of interest in a defined local environment, and offer the product or service provider an efficient medium to reach the correct target group in a defined local environment through e.g. Internet-connected touch screens and/or mobile devices, for example.
  • In Figs. 2a and 2b, an embodiment of the view adaptation feature is illustrated via two views relating to two time instants, respectively.
  • Figure 2a merely shows an exemplary view 202 of a digital signage solution prior to adaptation, whereas
  • Figure 2b represents a corresponding adapted view 220.
  • a map/floorplan/wayfinding view 204, or generally a first element, or a "sub-view", of the overall view 202, may be provided with optional pointers, symbols, and/or text, describing entities such as shops and restaurants e.g. in the vicinity of the signage gear itself or at an alternative, e.g. user-selected, location such as a different floor, etc.
  • a second element 210 such as an advertisement space for associated text, images, and/or videos relating to a number of advertisements, etc., may be simultaneously visualized at a different location than the first element 204.
  • the view 202 may include further elements such as a clock/calendar element 206.
  • Some elements 208, 212 may relate to the user input entity, such as a touch screen, and indicate actions triggered via selecting them (touching the associated screen location in conjunction with a touch screen, for example, or via point-and-click, if e.g. a cursor or highlighting indicator and a related control feature such as a joystick, trackball, or a touchpad is provided) or otherwise activating them, e.g. via a press of a predetermined button.
  • an element 212 associated with the adaptation such as an icon or symbol forming a part of the view 202, may be provided.
  • the element 212 may, as described above, be selectable and/or otherwise activable, or at least the related adaptation function addressable, by the user via the user input entity for triggering the adaptation.
  • the user input entity may apply data interface for obtaining the input transferred utilizing e.g. a short-range wireless or cellular connection, for instance.
  • the element 212 may be configured to visually indicate the nature of the disability and/or of the usability-enhancing action associated therewith.
  • the element 212 includes the international symbol of access (ISA) through the selection of which the display view 202 is adapted from the standpoint of a predetermined disability such as physical disability.
  • ISA international symbol of access
  • Element 204b illustrates an enlarged, zoomed-in and relocated (moved closer to the screen bottom) element 204.
  • the usability has improved relative to e.g. physically disabled users using a wheelchair etc. and having difficulty seeing the upper part of the display clearly, for instance.
  • Elements 206 and 210, considered to be of lesser importance in the light of the usability adaptation associated with the element 212, have been removed. Alternatively, they could be at least reduced in size, hidden, minimized, put into the background, or otherwise de-emphasized, for example.
  • Elements 208 have been re-arranged, including re-positioning 208b, to facilitate access thereto.
  • Element 212 has been updated 212b regarding its visual appearance in this case to indicate the current state ("usability adaptation on") of the associated functionality to the user. Alternatively or additionally, e.g. future state after next activation could be indicated by the visual update. Re-selection of element 212b, or some other predetermined user input, may switch the view 220 back into 202, for instance, so that the views 202, 220 may be alternated.
  • New element 222 has also appeared.
  • the element 222 may be associated with a second usability adaptation measure, such as increase in text/font size to further facilitate reading the shown information by the visually impaired.
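  • A hedged sketch of the adaptation step illustrated by Figures 2a and 2b is given below: secondary elements are dropped, priority elements are enlarged and moved towards the bottom of the display, and the caller keeps the original view 202 so that re-selecting the icon can restore it; the concrete scale factor and margins are invented example values.

```typescript
// Illustrative "accessibility adaptation" of a signage view (cf. Figures 2a/2b).
// The scale factor and margins are invented example values.

interface Element2D {
  id: string;
  x: number; y: number; width: number; height: number;
  priority: "primary" | "secondary";
  adaptationIcon?: boolean;                 // cf. element 212
}

interface View { width: number; height: number; elements: Element2D[] }

/**
 * Returns the adapted view 220 for a given view 202: secondary elements are dropped,
 * priority elements are enlarged and moved towards the bottom of the display.
 * The caller keeps the original view so that re-selecting the icon can restore it.
 */
function adaptForSeatedUser(view: View): View {
  const elements = view.elements
    .filter((e) => e.priority === "primary" || e.adaptationIcon)
    .map((e) => {
      if (e.adaptationIcon) return { ...e };                  // only its visual state changes (212 -> 212b)
      const width = e.width * 1.4;                            // enlarge (204 -> 204b)
      const height = e.height * 1.4;
      return {
        ...e,
        width,
        height,
        x: Math.min(Math.max(0, e.x), Math.max(0, view.width - width)),
        y: Math.max(0, view.height - height - 40),            // move closer to the screen bottom
      };
    });
  return { ...view, elements };
}
```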
  • FIG. 3 illustrates an embodiment of wayfinding in accordance with the present invention.
  • a view 301 may be produced by a digital signage arrangement such as a media kiosk after the user thereof has input a destination location utilizing the user input entity, for example.
  • the equipment may utilize a dynamic route determination algorithm and/or a database of at least partially ready-determined routes. Even the latter may work fine in connection with a static setting and e.g. a fixed location of the arrangement (display).
  • the view 301 may include a 3D element 302, such as a camera-provided view including still photo image data and/or video image data, a computer-generated view, or a mixture of both, illustrating e.g. the surroundings of the arrangement in a fixed or dynamic, potentially user-controllable, direction.
  • the direction is aligned with the initial direction of the determined route.
  • predetermined or user-selected part(s) of the overall route may be visually and/or otherwise, potentially audibly, represented with route guidance data via the arrangement.
  • At least part of the route may be shown utilizing video data, a number of images, and/or computer graphics, for example.
  • the element 302 may include an isometric, a first-person, or a bird's eye view, for instance.
  • route guidance data 304, 306 such as a number of arrows 304, lines, other symbols, and/or textual/numerical information, e.g. distance indicator associated with direction arrow 304 and/or a (target) location pointer 306, may be shown simultaneously with the view as superimposed thereon, for instance.
  • other data such as commercial data and/or various announcements may be provided on the wayfinding view 301. Audio playback (route guidance and/or other audio data) is a further possibility.
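  • As a non-authoritative sketch of how computer-generated guidance could be superimposed on a camera view in a browser-based kiosk, an HTML canvas overlay might be drawn as follows; the element ids and the arrow geometry are made up for the example.

```typescript
// Illustrative augmented-reality style overlay: draw a guidance arrow and a distance
// label on top of a camera frame rendered into a canvas. Element ids are hypothetical.

const video = document.getElementById("camera-feed") as HTMLVideoElement;
const canvas = document.getElementById("overlay") as HTMLCanvasElement;
const ctx = canvas.getContext("2d")!;

function drawGuidanceFrame(headingDeg: number, distanceM: number): void {
  // 1. Current camera frame as the background (cf. element 302).
  ctx.drawImage(video, 0, 0, canvas.width, canvas.height);

  // 2. A simple direction arrow (cf. 304) rotated to the route heading.
  const cx = canvas.width / 2;
  const cy = canvas.height * 0.75;
  ctx.save();
  ctx.translate(cx, cy);
  ctx.rotate((headingDeg * Math.PI) / 180);
  ctx.fillStyle = "rgba(0, 160, 255, 0.9)";
  ctx.beginPath();
  ctx.moveTo(0, -60);   // arrow tip
  ctx.lineTo(25, 10);
  ctx.lineTo(8, 10);
  ctx.lineTo(8, 50);
  ctx.lineTo(-8, 50);
  ctx.lineTo(-8, 10);
  ctx.lineTo(-25, 10);
  ctx.closePath();
  ctx.fill();
  ctx.restore();

  // 3. Distance indicator (cf. 306) next to the arrow.
  ctx.fillStyle = "white";
  ctx.font = "24px sans-serif";
  ctx.fillText(`${Math.round(distanceM)} m`, cx + 40, cy);

  // Redraw on every frame so the overlay follows a live video feed.
  requestAnimationFrame(() => drawGuidanceFrame(headingDeg, distanceM));
}

drawGuidanceFrame(45, 120); // e.g. head 45 degrees to the right, target 120 m away
```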
  • the arrangement 404 may be configured to provide route information, optionally including e.g. directional information such as initial heading, for locating the target vehicle such as a parked car.
  • the user input may include coordinates of the location of the vehicle, other identification data of the location of the vehicle or a near-by location, and/or vehicle-related identification data such as the registration number of the vehicle.
  • Route information may be provided via the data visualization entity and/or utilizing other techniques such as a printed route note and/or a printed route map. At least part of the route may be illustrated via a plane view 406, e.g. floor plan view, or applying 3D modeling techniques and/or camera views 408, for example.
  • a plane view 406 e.g. floor plan view, or applying 3D modeling techniques and/or camera views 408, for example.
  • computer-generated route guidance data in the form of arrows has been superimposed on a camera view, being a photo image or video image view.
  • route information and/or guidance data may in some implementations be supplied to a personal digital terminal device of a user, optionally in order to enable substantially continuous, real-time navigation service.
  • the terminal may send data such as location information to the arrangement 404.
  • Feasible indoor and/or outdoor positioning technologies may be utilized.
  • WLAN, Bluetooth, ZigBee, cellular (e.g. Cell-ID, TOA (Time of Arrival), TDOA (Time Difference of Arrival)), and/or satellite positioning, such as GPS (Global Positioning System) or GLONASS (Global Orbiting Navigation Satellite System) may be applied.
  • the terminal may, after initial communication with the arrangement 404, act autonomously (information regarding whole route received for independent utilization, for instance) or at least partially rely on information obtained from the arrangement 404 (real-time navigation instructions based on updated terminal location, for example).
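  • One possible way for the arrangement, or for a terminal acting autonomously, to turn an updated position into the next guidance instruction is sketched below; the distance threshold and the instruction texts are illustrative assumptions.

```typescript
// Illustrative real-time guidance step: given the updated terminal position, advance
// along the route and report the next instruction. Threshold values are invented.

interface Position { x: number; y: number }                 // floor-plan coordinates, metres
interface Waypoint extends Position { instruction: string } // e.g. "turn left at the escalator"

function nextInstruction(
  route: Waypoint[],
  current: Position,
  reachedThresholdM = 5
): string {
  // Drop waypoints the user has already reached.
  while (
    route.length > 1 &&
    Math.hypot(route[0].x - current.x, route[0].y - current.y) < reachedThresholdM
  ) {
    route.shift();
  }
  const next = route[0];
  const distance = Math.hypot(next.x - current.x, next.y - current.y);
  return `${next.instruction} (${Math.round(distance)} m)`;
}

// Example: positioning (WLAN, Bluetooth, GPS, ...) updates `current` periodically and
// the returned string is shown on the terminal or spoken out.
const route: Waypoint[] = [
  { x: 10, y: 0, instruction: "walk towards the escalator" },
  { x: 10, y: 30, instruction: "your car is on the right" },
];
console.log(nextInstruction(route, { x: 2, y: 1 })); // "walk towards the escalator (8 m)"
```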
  • Figure 5 illustrates an embodiment of product and/or service promotion in accordance with the present invention.
  • a number of metadata entities may be associated with a number of data entities, preferably images such as photographs of subject entities like products or services to be promoted or otherwise put forth via the arrangement.
  • a metadata entity may be or include a data entity identifier, identifier of the data entity's subject matter such as product/service name and/or model name, a subject class such as product or service class, an indication of the dealer or other source of the subject, an indication of dealer/source location, an indication of the manufacturer, a price indicator, and/or a free word field.
  • a search facility 504 applying the metadata may be provided to the users by the arrangement for finding interesting products and/or services. Potential metadata values may be applied as search terms.
  • the outcome of the search may include a number of images with metadata matching the search criteria.
  • the search results may be visualized 502.
  • the wayfinding methods described herein may be utilized for determining and visualizing a route to the (nearest) location offering the product or service represented by the image.
  • Metadata search terms applied by the users may be further monitored, analyzed and exploited. For example, an indication of the most popular search criteria, and/or of related hits, i.e. search results meaning the data entities such as the images, may be listed or otherwise provided to the users and/or the operator or administrator of the arrangement. Further, the operator/administrator may be provided with a tool to more specifically manage the search terms, e.g. through censoring or "moderating" the search term list shown to the users.
  • other criteria such as commercial criteria may be additionally or alternatively applied for determining the order of the search terms in the search term list shown to the users. For example, advertisers such as product manufacturers or dealers may be willing to pay for a desired (e.g. early) position in the "most searched" or "most popular" listing.
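  • The monitoring, moderation and ordering of search terms described above could be realized e.g. as in the following sketch, where a paid boost is simply added to the observed popularity; the scoring rule is an assumption rather than anything defined in the publication.

```typescript
// Illustrative "most searched" list: observed usage counts plus an optional paid boost.
// The scoring rule (count + boost) is an invented example.

const searchCounts = new Map<string, number>();
const sponsoredBoost = new Map<string, number>(); // e.g. a dealer pays for extra visibility
const blockedTerms = new Set<string>();           // operator/administrator "moderation"

function recordSearch(term: string): void {
  const key = term.trim().toLowerCase();
  if (!key) return;
  searchCounts.set(key, (searchCounts.get(key) ?? 0) + 1);
}

function popularSearchTerms(limit = 10): string[] {
  return [...searchCounts.entries()]
    .filter(([term]) => !blockedTerms.has(term))
    .map(([term, count]) => [term, count + (sponsoredBoost.get(term) ?? 0)] as const)
    .sort((a, b) => b[1] - a[1])
    .slice(0, limit)
    .map(([term]) => term);
}

// Usage: record every user query and show the resulting list in the search UI.
recordSearch("sneakers");
recordSearch("sneakers");
recordSearch("coffee");
sponsoredBoost.set("coffee", 5);
console.log(popularSearchTerms()); // ["coffee", "sneakers"]
```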
  • instead of or in addition to image data, some other form of visual representation, such as video, may be supplemented with related metadata.
  • still some other type of data entity, such as a multimedia element, audio element, or textual element, may be supplemented with metadata. A similar reproduction and search procedure as described above may be correspondingly applied.
  • product and/or service information may be searched and visualized.
  • the images of the search result group may be visualized, e.g. in an automatically rotating or user-controllably rotatable slideshow format.
  • Metadata such as product/service dealer information (e.g. location of the nearest dealer) may be represented as well.
  • FIG. 6 is a block diagram of one embodiment in accordance with the arrangement 600 of the present invention.
  • the arrangement 600 may physically contain a number of at least functionally connected elements.
  • the arrangement 600 is typically provided with one or more processing devices capable of processing instructions and other data, such as one or more microprocessors, micro-controllers, DSPs (digital signal processors), programmable logic chips, etc.
  • the processing entity 602 may thus, as a functional entity, physically comprise a plurality of mutually co-operating processors and/or a number of sub-processors connected to a central processing unit, for instance.
  • the processing entity 602 may be configured to execute the code stored in a memory 604, which may refer to instructions and data relative to the digital signage arrangement software logic and software architecture 610 for controlling the arrangement 600.
  • the processing entity 602 may be configured to control data visualization and optionally also audio reproduction, for example.
  • the memory entity 604 may be divided between one or more physical memory chips or other memory elements.
  • the memory 604 may store program code and other data such as marketing data, map/wayfinding data, etc.
  • the memory 604 may further refer to and include other storage media such as a preferably detachable memory card, a floppy disc, a CD-ROM, or a fixed storage medium such as a hard drive.
  • the memory 604 may be non-volatile, e.g. ROM (Read Only Memory), and/or volatile, e.g. RAM (Random Access Memory), by nature.
  • Software (product) 610 may be provided on a carrier medium such as a memory card, a memory stick, an optical disc (e.g. CD-ROM or DVD), or some other memory carrier.
  • the UI (user interface) 612 may comprise a display or a data projector 612b, and a keyboard/keypad or other applicable user (control) input entity 612a, such as a touch screen and/or a voice control input, or a number of separate keys, buttons, knobs, switches, a tag reader such as an RFID reader, a touchpad, a joystick, a mouse, and/or an imaging device such as a barcode (1st, 2nd, and/or 3rd generation compatible, for example) reader, configured to provide the user of the arrangement 600 with practicable data visualization and device control means, respectively.
  • the UI 612 may include one or more loudspeakers and associated circuitry such as D/A (digital- to-analogue) converter(s) for sound output, and optionally a microphone with A/D converter for sound input.
  • D/A digital-to-analogue
  • a printer may be included in the arrangement for providing more permanent output.
  • a mobile-executable application or other software may be downloaded preferably wirelessly to the terminal for data visualization, audio playback and/or communication such as user input provision relative to the (rest of the) arrangement 600.
  • browser and/or messages such as text or USSD messages may be applied for user input purposes and/or data transfer in the opposite, downlink direction.
  • the arrangement 600 may further comprise a data interface 608 such as a number of wired and/or wireless transmitters, receivers, and/or transceivers for communication with other devices such as terminals and/or network infrastructure(s).
  • Non-limiting examples of the generally applicable technologies include GSM (Global System for Mobile Communications), GPRS (General Packet Radio Service), EDGE (Enhanced Data rates for Global Evolution), UMTS (Universal Mobile Telecommunications System), WCDMA (wideband code division multiple access), CDMA2000, PDC (Personal Digital Cellular), PHS (Personal Handy-phone System), WLAN (Wireless LAN, wireless local area network), WiFi, Ethernet, USB (Universal Serial Bus), RFID, NFC (Near-Field Communication), and Firewire. Further, a Bluetooth adapter for peer-to-peer communication and piconet/scatternet use may be provided.
  • GSM Global System for Mobile Communications
  • GPRS General Packet Radio Service
  • EDGE Enhanced Data rates for Global Evolution
  • UMTS Universal Mobile Telecommunications System
  • WCDMA wideband code division multiple access
  • CDMA2000 Code Division Multiple Access 2000
  • PDC Personal Digital Cellular
  • PHS Personal Handy-phone System
  • WLAN Wireless LAN, wireless local area network
  • the arrangement 600 may comprise numerous additional functional and/or structural elements for providing advantageous communication, processing or other features, whereupon this disclosure is not to be construed as limiting the presence of the additional elements in any manner.
  • FIG. 7 is a flow diagram of an embodiment of a method in accordance with the present invention.
  • an arrangement in accordance with an embodiment of the present invention is obtained and configured, for example via loading and execution of related software, for managing a digital signage installation.
  • a connection may be established to a network server and/or a mobile client device (terminal) carried by a user.
  • digital signage data, such as product, service and/or wayfinding-related data, is visualized via an applicable entity such as a separate digital signage display and/or a terminal device of the user.
  • the arrangement may be configured to receive user input via a number of options including a dedicated user interface like a touch screen.
  • interface software running in a compatible terminal device carried along by the user may be applied, for example.
  • the user input is captured, the user input being indicative of a request for adapting of the present view visualized via the visualization entity in terms of at least one usability issue, such as disability and/or other property of the user, e.g. physical property such as height.
  • the position, size and/or the way of representation, such as style, of at least one element of the view is adapted in response to the user input so as to improve the usability of the arrangement in the light of the aforesaid usability issue.
  • the method execution is ended. Broken lines depict the potentially repetitive nature of various method items.
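  • Read as pseudocode, the flow of Figure 7 could be wired together roughly as below; render, waitForUserInput and adapt are placeholder stubs standing in for whatever visualization, user input and processing entities a concrete arrangement uses.

```typescript
// Skeleton of the adaptation method flow of Figure 7:
// configure -> visualize -> capture user input -> adapt -> repeat, until execution is ended.
// The callbacks are placeholders to be bound to the entities of a concrete arrangement.

async function runSignage<V>(
  initialView: V,
  render: (view: V) => void,
  waitForUserInput: () => Promise<"adapt" | "revert" | "shutdown">,
  adapt: (view: V) => V
): Promise<void> {
  let view = initialView;
  for (;;) {
    render(view);                            // visualize product/service/wayfinding data
    const input = await waitForUserInput();  // touch screen, terminal app, SMS, ...
    if (input === "shutdown") break;         // method execution is ended
    view = input === "adapt" ? adapt(view) : initialView; // adapt the view, or revert to it
  }
}
```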
  • a method flow diagram relating to an embodiment of wayfinding in connection with digital signage is illustrated.
  • Initial and final method items (not shown) with preparatory and concluding actions, respectively, may be similar to the ones 700, 708 of the adaptation method also in this case.
  • the method may actually be executed jointly with the adaptation method, in parallel therewith, or independently.
  • the route/navigation starting point is resolved. It may be, by default, associated with the location of the digital signage arrangement, or at least e.g. the display thereof, if the related elements are physically separate.
  • the user may determine the starting point utilizing the user input entity; further, optionally, e.g. his/her personal terminal device may be capable of providing, preferably wirelessly, route source location information to the signage gear.
  • an indication of the route target location is received, preferably again utilizing the user input entity.
  • the target location may imply the location of a particular store, for instance.
  • the target location may indicate the location of a predetermined vehicle.
  • at least part of the route, preferably at least e.g. the initial direction, is determined, which may refer to dynamic calculations and/or the utilization of pre-defined routes ("route database"), for example.
  • route information, such as a 3D view like a video camera or still (photo) image view, preferably augmented with route navigation data such as instructive arrows, is provided utilizing the visualization entity and/or the data interface for transmitting data to be visualized to an external device such as a terminal device of the user, as mentioned hereinbefore.
  • the terminal location may be acquired using e.g. network-based or terminal-based positioning, whereupon the route information may be dynamically determined on the basis of the location by the arrangement and/or the terminal itself.

Abstract

A computerized electronic arrangement (106, 108, 110, 114, 404, 600) for digital signage, such as a media kiosk like a touch screen stand, comprises a memory entity (604) configured to store data, such as product, service and/or wayfinding information, to be visualized via the arrangement, a visualization entity (612b), such as a display, configured to visualize the data, a processing entity (602) configured to control the visualization of the data, and a user input entity (612a, 608), such as a touch screen functionality associated with the data visualization entity, configured to receive user input indicative of a request by the user of the arrangement for adaptation of the present view visualized via the visualization entity in terms of at least one usability issue, such as a disability and/or other property of the user, e.g. a physical property such as height, whereupon the processing entity is configured to adapt the position (208, 208b), size (204, 204b) and/or way of representation (212, 212b) of at least one element of the view in response to the user input so as to improve the usability of the arrangement in the light of the aforesaid usability issue (202, 220). A related method is presented.
PCT/FI2010/050701 2010-09-13 2010-09-13 Arrangement and related method for digital signage WO2012035191A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/FI2010/050701 WO2012035191A1 (fr) 2010-09-13 2010-09-13 Arrangement and related method for digital signage
EP10857203.3A EP2616909A4 (fr) 2010-09-13 2010-09-13 Arrangement and related method for digital signage

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/FI2010/050701 WO2012035191A1 (fr) 2010-09-13 2010-09-13 Arrangement and related method for digital signage

Publications (1)

Publication Number Publication Date
WO2012035191A1 true WO2012035191A1 (fr) 2012-03-22

Family

ID=45831045

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/FI2010/050701 WO2012035191A1 (fr) 2010-09-13 2010-09-13 Arrangement and related method for digital signage

Country Status (2)

Country Link
EP (1) EP2616909A4 (fr)
WO (1) WO2012035191A1 (fr)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3264041A1 (fr) * 2016-06-28 2018-01-03 Yim Mai Amy Lee Navigation system and method
CN111178579A (zh) * 2019-11-26 2020-05-19 恒大智慧科技有限公司 Automatic navigation method in a smart community, computer device and readable storage medium
US20210133451A1 (en) * 2019-11-01 2021-05-06 Fast Retailing Co., Ltd. Information processing apparatus, non-transitory computer readable storage medium, information processing method, and signage system
NL1043806B1 (en) * 2020-10-05 2022-06-03 Atsence B V Inclusive personal wayfinding assistive method and system for electronic devices for all, in particular for visually impaired, dyslectics, the color blind, elderly and children.

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO1998027533A2 (fr) * 1996-12-17 1998-06-25 Citicorp Development Center Automated teller machine for the blind and visually impaired
US6049328A (en) * 1995-10-20 2000-04-11 Wisconsin Alumni Research Foundation Flexible access system for touch screen devices
US20030164861A1 (en) * 2002-03-04 2003-09-04 Monique Barbanson Legibility of selected content
WO2003096305A1 (fr) * 2002-05-14 2003-11-20 Ascom Autelca Ag Method, system interface and apparatus for disabled users
WO2004073512A1 (fr) * 2003-02-21 2004-09-02 Harman/Becker Automotive Systems (Becker Division) Gmbh Method for providing a colour palette in a display in order to compensate for colour blindness

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6049328A (en) * 1995-10-20 2000-04-11 Wisconsin Alumni Research Foundation Flexible access system for touch screen devices
WO1998027533A2 (fr) * 1996-12-17 1998-06-25 Citicorp Development Center Automated teller machine for the blind and visually impaired
US20030164861A1 (en) * 2002-03-04 2003-09-04 Monique Barbanson Legibility of selected content
WO2003096305A1 (fr) * 2002-05-14 2003-11-20 Ascom Autelca Ag Method, system interface and apparatus for disabled users
WO2004073512A1 (fr) * 2003-02-21 2004-09-02 Harman/Becker Automotive Systems (Becker Division) Gmbh Method for providing a colour palette in a display in order to compensate for colour blindness

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP2616909A4 *

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3264041A1 (fr) * 2016-06-28 2018-01-03 Yim Mai Amy Lee Navigation system and method
US20210133451A1 (en) * 2019-11-01 2021-05-06 Fast Retailing Co., Ltd. Information processing apparatus, non-transitory computer readable storage medium, information processing method, and signage system
US11620823B2 (en) * 2019-11-01 2023-04-04 Fast Retailing Co., Ltd. Information processing apparatus, non-transitory computer readable storage medium, information processing method, and signage system
CN111178579A (zh) * 2019-11-26 2020-05-19 恒大智慧科技有限公司 Automatic navigation method in a smart community, computer device and readable storage medium
NL1043806B1 (en) * 2020-10-05 2022-06-03 Atsence B V Inclusive personal wayfinding assistive method and system for electronic devices for all, in particular for visually impaired, dyslectics, the color blind, elderly and children.

Also Published As

Publication number Publication date
EP2616909A4 (fr) 2014-12-03
EP2616909A1 (fr) 2013-07-24

Similar Documents

Publication Publication Date Title
US10509477B2 (en) Data services based on gesture and location information of device
US10728706B2 (en) Predictive services for devices supporting dynamic direction information
JP5486680B2 (ja) Portal services based on interaction with points of interest detected via directional device information
US20170249748A1 (en) System and method for converting gestures into digital graffiti
JP5456799B2 (ja) Device transaction model and services based on device direction information
CN108337907B (zh) System and method for generating and displaying location entity information associated with the current geographic location of a mobile device
JP6580703B2 (ja) System and method for disambiguation of a location entity associated with the current geographic location of a mobile device
US20080250337A1 (en) Identifying interesting locations based on commonalities in location based postings
US9739631B2 (en) Methods and systems for automatically providing point of interest information based on user interaction
US11790022B2 (en) User interfaces and methods for operating a mobile computing device for location-based transactions
JP2014178724A (ja) Coupon providing method, coupon providing server and coupon providing system
EP2616909A1 (fr) Arrangement and related method for digital signage
Samuel et al. Smart indoor navigation and proximity advertising with android application using BLE technology
WO2021079829A1 (fr) Display device, event assistance system, display method, and method for producing an event assistance system
Krammer et al. Findings from a Location Aware Smartphone Application for a novel Retail Shopping Experience

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 10857203

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 2010857203

Country of ref document: EP

NENP Non-entry into the national phase

Ref country code: DE