US11276237B2 - Augmented reality campus assistant - Google Patents
- Publication number
- US11276237B2 (U.S. application Ser. No. 16/950,776)
- Authority
- US
- United States
- Prior art keywords
- user
- augmented reality
- mobile device
- media content
- information
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
- G06T19/006—Mixed reality
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T11/00—2D [Two Dimensional] image generation
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L67/00—Network arrangements or protocols for supporting network services or applications
- H04L67/01—Protocols
- H04L67/131—Protocols for games, networked simulations or virtual reality
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04W—WIRELESS COMMUNICATION NETWORKS
- H04W4/00—Services specially adapted for wireless communication networks; Facilities therefor
- H04W4/02—Services making use of location information
- H04W4/025—Services making use of location information using location based information parameters
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04W—WIRELESS COMMUNICATION NETWORKS
- H04W4/00—Services specially adapted for wireless communication networks; Facilities therefor
- H04W4/02—Services making use of location information
- H04W4/029—Location-based management or tracking services
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L67/00—Network arrangements or protocols for supporting network services or applications
- H04L67/2866—Architectures; Arrangements
- H04L67/30—Profiles
- H04L67/306—User profiles
Definitions
- This invention relates to information processing systems and methods in a workplace environment. More particularly, the invention relates to systems and methods for displaying information for use by human users in a workplace environment. These systems and methods may include an augmented reality mobile device application with voice interactive and other functions including user-selectable buttons.
- the present invention relates to methods, systems and data transmitted from intelligent networked mobile computer systems operating in an electronic environment that may be referred to as the “Internet of Things”.
- the “Internet of Things” describes the increasing levels of electronic interconnectedness, computing power and autonomy of behavior featured in everyday devices.
- Devices utilized in the workplace are more commonly called “intelligent” or “smart”, reflecting built-in computational abilities that allow them to control their own behavior in response to environmental changes as well as (or instead of) user controls.
- In a workplace environment, such devices typically log relatively large amounts of data and transmit that data elsewhere for processing, such as to mobile computer systems or external computer systems.
- smart mobile computers allow the employee to move freely about the workspace and retrieve information from computer networks accessible at their fingertips. Examples of these include retail operations where sales assistants or inventory control clerks carry hand-held computers with barcode scanners that can identify products by scanning the barcode and then displaying information associated with that product.
- Another example includes car rental return agents who key information into a smart mobile computer in the parking lot of the rental agency when the car is returned, and then print out a receipt from a mobile printer.
- Hand-held computers require the employee to devote one or both hands to the task of manually typing commands on a keyboard associated with the computer. Such computers also generally require the employee to focus his gaze and attention on the hand-held computer rather than on the external environment and/or the task before him. While these solutions represent an advance over stationary kiosks and strategically located catalogs, there is still much room for improvement in freeing up the hands and attention of the employee, thereby increasing the employee's productivity.
- “intelligent” and interconnected devices include: medical monitoring equipment in the home that receives data from medical devices, biological sensors and/or implants; wrist-worn activity trackers with the ability to transfer logged health data to a user's computer, and to the manufacturer's servers for analysis; the whole category of “wearable computing” including clothing made of “smart fabrics” with built-in sensors for health, sports and/or safety and the ability to alter their fabric's properties in response to feedback, as well as “smart watches” with built-in computers and Bluetooth connectivity.
- “Smart” appliances are rapidly emerging including enhanced functionalities such as augmented reality features.
- augmented reality eyewear devices exist, including “Google Glass”, which is reportedly able to continuously monitor the user's surroundings by video, apply face recognition, and provide real-time information to the user.
- An augmented reality, or “AR”, scenario typically involves presentation of digital or virtual image information as an augmentation to visualization of the actual world around the user.
- an augmented reality scene may allow a user of AR technology to see one or more virtual icons superimposed on or amidst real world images.
- portable electronics and/or mobile phone devices now generally include a large variety of sensing capabilities. These capabilities can be utilized to further enhance a user's augmented reality experiences.
- the preferred embodiment of the present invention provides a method for providing augmented reality information to a user in real time.
- the present invention is directed to systems and methods of providing an augmented reality experience to a user working on and navigating around a workplace campus.
- the present invention addresses a problem that persists in the field: employees and staff are overwhelmed with data and tasks and thus cannot carefully parse the data related to their jobs in a timely fashion.
- human resources personnel, employees, staff and building management personnel are often overwhelmed with information and tasks that may be easily automated with software and hardware systems.
- tasks such as navigating around a given workplace campus are often difficult for new hires, for employees of very large companies, and for temporary employees who make lateral career moves with great frequency.
- traditional key and lock systems are difficult and expensive to change, particularly in industries with high employee turnover. Difficulty managing building access also poses a threat to building security.
- augmented reality methods and systems with voice interactive and augmented reality functions including user-selectable buttons.
- Such methods and systems may be able to utilize facial recognition and provide rich real-time information to the user via composited media and acoustic speakers.
- Composited media may include interactive maps, calendaring functions, building access means, and tools to assist with management of assignment information.
- an augmented reality environment may facilitate access to various locations and resources on a given workplace campus.
- augmented reality methods and systems may offer improved security and a variety of cost-saving features.
- a first objective of the present invention is to provide a means to enhance efficiencies for job seekers, hiring managers, workplace visitors, employees, staff, and the like.
- a second objective of the present invention is to provide a means by which employees can attend to their duties and interact with smart devices without having to manually type out commands into a keyboard.
- a third objective of the present invention is to provide a means of focusing employee attention on the environment around him or her and the task at hand, rather than on a hand-held or wearable device that assists the employee in his or her duties.
- a fourth objective of the present invention is to provide a trigger image or logo-controlled means of facilitating navigation around a workplace campus and a trigger image-controlled means of viewing and interacting with additional active information.
- a fifth objective of the present invention is to provide a means of providing employees with real-time information related to their assignments.
- a sixth objective of the present invention is to provide a means for employees and management to track assignment progress, and assignment histories in real time.
- Still another objective of the present invention is to provide a logo-controlled and/or biometrically-controlled means of accessing a building or a door within a building.
- FIG. 1 shows a specific logo scanned thereby initiating an augmented reality experience according to an embodiment
- FIG. 2 shows an augmented reality system providing building information according to an embodiment
- FIG. 3 shows an augmented reality system utilizing voice recognition according to an embodiment
- FIG. 4 shows an augmented reality system integrating campus routes and travel time according to an embodiment
- FIG. 5 shows an augmented reality system providing building access and remotely unlocking doors according to an embodiment.
- The present invention comprises information processing systems and methods in a workplace environment.
- a workplace environment comprises a workplace campus, workplace interior, locations external to a workplace campus, and other locations associated with a workplace. More particularly, the invention relates to systems and methods for displaying active information for use by human users in a workplace environment. Active information may include information required by employees in the workplace, including user navigation history information, local user building access history information, user remote building access history information, user button selection history information, voice command history information, user calendar history information, user assignment history information, user current assignment information, user timecard history information, user video history information, and the like.
- local user building access history and user remote building access history refer to the recordation of building door locking and unlocking histories of a given user, and differ only in the physical location of the user when a building door was accessed.
- Talent Network history information refers to a user's history of selecting, investigating, or interviewing for a position identified via the Talent Network functionality of the present invention.
- a mobile device may be a wireless mobile device or any type of portable computer device, including a cellular telephone, a Personal Digital Assistant (PDA), smartphone, etc.
- smartphones contemplated by the present invention include Apple's iPhone series, Google's Droid and Nexus One series, Palm's Pre series, and RIM's Blackberry series of smartphones.
- Most, if not all, of these mobile devices include a built-in camera that can be controlled by software applications.
- mobile devices comprise a camera, a processor, a graphical user interface (GUI), and a memory.
- the memory is operatively coupled to the processor and stores program instructions that, when executed by the processor, cause the processor to receive an image from the camera. Said image may be displayed on the GUI.
- the GUI may also receive descriptive data for the image and store the descriptive data and image as a listing. Generally, said listing may be transmitted wirelessly to a host server.
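The listing flow above — an image from the camera, descriptive data entered through the GUI, and wireless transmission to a host server — can be sketched as follows. The `Listing` schema and its field names are illustrative assumptions, not the patent's actual data format.

```python
from dataclasses import dataclass, field, asdict
import json

@dataclass
class Listing:
    """A captured image paired with user-entered descriptive data (hypothetical schema)."""
    image_id: str
    descriptive_data: dict = field(default_factory=dict)

def build_listing(image_id: str, description: str, category: str) -> Listing:
    # The GUI would collect these fields from the user before transmission.
    return Listing(image_id, {"description": description, "category": category})

def serialize_for_upload(listing: Listing) -> str:
    # Serialized payload that would be transmitted wirelessly to the host server.
    return json.dumps(asdict(listing))

payload = serialize_for_upload(build_listing("img-001", "Reception Building logo", "trigger"))
```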
- the present augmented reality mobile device application may facilitate employment activities such as navigation, accounting, networking, and the like.
- the mobile device may comprise a display, a GPS module, a compass, a camera and various other input/output (I/O) components.
- the mobile device is capable of capturing media content such as general workplace imagery, badge imagery, logo imagery, sound information, location information, and/or similar media content external to a workplace environment.
- the mobile device or smartphone contemplated by the present invention is also capable of superimposing overlay imagery onto the captured media content.
- the present invention comprises an information processing system in a workplace environment.
- said information processing system may be summarized as a system comprising a server connected to a network, wherein the server receives requests from users via a network.
- This server may include a processor(s), a database for storing trigger image information, and a memory operatively coupled to the processor.
- the memory stores program instructions that, when executed by the processor, cause the processor to receive media content requests, such as a trigger image, from a user via the network. Overlay imagery is generated from the trigger image database based on such a request. Finally, overlay imagery and/or composited media content is transmitted to the user via the network.
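A rough sketch of this server-side flow follows; the database contents and handler name are invented for illustration and are not disclosed by the patent.

```python
# Trigger-image database mapping known triggers to overlay imagery (illustrative).
TRIGGER_DB = {
    "wand-campus-logo": {"overlay": "3d-campus-map", "audio": "welcome.wav"},
    "hr-building-door": {"overlay": "building-access-icons", "audio": None},
}

def handle_media_request(trigger_id: str) -> dict:
    """Receive a media content request, look up overlay imagery, and return it."""
    record = TRIGGER_DB.get(trigger_id)
    if record is None:
        return {"status": "unknown-trigger"}
    # Overlay imagery generated from the trigger-image database is transmitted
    # back to the user's mobile device for compositing.
    return {"status": "ok", "overlay": record["overlay"], "audio": record["audio"]}
```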
- the present invention also comprises information processing methods in a workplace environment.
- said information processing methods involve the following steps: 1) media content is captured at a workplace with a camera of a user's mobile device (i.e., a trigger image in the workplace), 2) the trigger image is decoded to determine a location and a position of the mobile device, 3) the trigger image is identified based on the location and the direction of the mobile device, 4) the user capturing the trigger image is identified, 5) overlay imagery is downloaded into the mobile device from a server, 6) the overlay imagery is overlaid onto the captured media content (i.e., an object, logo, badge, etc.) to create composited media content, and 7) the composited media content is displayed on the mobile device.
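The steps above can be outlined as a pipeline in which each stage is a pluggable callable. The stage implementations here are placeholders, since the patent does not specify the decoding or recognition components.

```python
def augmented_reality_pipeline(frame, decode, identify_trigger, identify_user,
                               download_overlay, composite, display):
    """Outline of steps 2-7; step 1 (capture) supplies `frame`."""
    location, position = decode(frame)                 # step 2: decode trigger image
    trigger = identify_trigger(location, position)     # step 3: identify trigger
    user = identify_user(frame)                        # step 4: identify user
    overlay = download_overlay(trigger, user)          # step 5: download overlay imagery
    composited = composite(frame, overlay)             # step 6: overlay onto captured media
    display(composited)                                # step 7: display composited media
    return composited

# Placeholder stages standing in for the real components:
shown = []
result = augmented_reality_pipeline(
    "frame-bytes",
    decode=lambda f: ((47.6205, -122.3493), "north"),
    identify_trigger=lambda loc, pos: "wand-campus-logo",
    identify_user=lambda f: "employee-42",
    download_overlay=lambda trig, user: f"overlay-for-{trig}",
    composite=lambda f, o: (f, o),
    display=shown.append,
)
```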
- the composited media content represents, in essence, the augmented reality experience of the user.
- determining the location and position of the mobile device may utilize a global positioning system module (also referred to herein as “GPS”) in the mobile device.
- determining the location and position of the mobile device may utilize a cellular infrastructure to triangulate the location of the mobile device.
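Triangulation from cellular infrastructure can be illustrated with plain two-dimensional trilateration: subtracting the first tower's range circle from the other two yields a linear system in the device coordinates. The tower positions and ranges below are made-up numbers; a real system would estimate ranges from timing advance or signal strength.

```python
def trilaterate(towers, distances):
    """Estimate a 2-D device position from three tower positions and measured ranges."""
    (x1, y1), (x2, y2), (x3, y3) = towers
    d1, d2, d3 = distances
    # Subtract circle 1 from circles 2 and 3 to get two linear equations in (x, y).
    a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
    c1 = d1**2 - d2**2 + x2**2 - x1**2 + y2**2 - y1**2
    a2, b2 = 2 * (x3 - x1), 2 * (y3 - y1)
    c2 = d1**2 - d3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a1 * b2 - a2 * b1
    x = (c1 * b2 - c2 * b1) / det
    y = (a1 * c2 - a2 * c1) / det
    return x, y

# Device at (1, 1) with towers at (0,0), (4,0), (0,4):
est = trilaterate([(0.0, 0.0), (4.0, 0.0), (0.0, 4.0)],
                  [2 ** 0.5, 10 ** 0.5, 10 ** 0.5])
```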
- identifying the trigger image may further involve an analysis of a trigger image database.
- the trigger image database includes various information including lists of trigger images, information regarding the workplace campus, structural aspects of the workplace environment, user access information, user past history information, and the like.
- identifying the trigger image may further involve transmitting the location and the direction of the trigger image to the server and thereafter receiving an identification of the trigger image from the server.
- identifying the workplace user may further involve analyzing the workplace user based on an identification of the trigger image and/or user assignment history information.
- identifying the workplace user may further involve extracting an image from media content comprising a trigger image and locating a distinctive feature in the trigger image.
- overlay imagery may be derived from workplace activities of users.
- Workplace activities may comprise current activities of several users, current activities of a single user, past workplace activities of several users, past workplace activities of a single user, and the like.
- Past workplace activities of a single user may include past navigation routes taken by a user, voice commands made by a user, assignment information related to a user, a user's expense history, and the like.
- overlay imagery may include text, icons, graphics, still images, motion video, augmented reality screens or any combination thereof.
- the overlay imagery may also comprise an interface (i.e., a Graphical User Interface) into which the user can input data (i.e., voice information, location information, etc.).
- the present invention further comprises a backend server that may include any number of servers (i.e., workplace server, account server, etc.).
- backend servers can retrieve the user's past workplace activities from a user tracking account provided by the workplace or a user account provided by a manufacturer of the one or more trigger images.
- Such an account can be a workplace-based account and/or an online account for the user.
- the mobile device of the present invention may communicate with such a server.
- the present invention comprises a mobile device including a camera capable of capturing media content.
- the user initiates an augmented reality experience by capturing media content comprising a trigger image (also referred to herein as a “trigger”).
- a trigger image may comprise various media content including an object, image, video or similar media content.
- Identifying the trigger image comprises transmitting the captured trigger image to a server. The server then identifies the trigger image based on media content transmitted to and stored in the server.
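One way such server-side matching could work — purely a hedged illustration, since the patent does not disclose its recognition algorithm — is to compare a compact signature of the captured image against signatures stored for each known trigger, choosing the closest match.

```python
def image_signature(pixels, threshold=128):
    """Toy binary signature: one bit per pixel, above/below a brightness threshold."""
    return tuple(1 if p >= threshold else 0 for p in pixels)

def identify_trigger(captured_pixels, stored_triggers):
    """Return the stored trigger whose signature differs in the fewest bits."""
    sig = image_signature(captured_pixels)
    def hamming(a, b):
        return sum(x != y for x, y in zip(a, b))
    return min(stored_triggers, key=lambda name: hamming(sig, stored_triggers[name]))

# Signatures for media content previously transmitted to and stored in the server:
stored = {
    "wand-logo": image_signature([200, 200, 50, 50]),
    "hr-door": image_signature([50, 50, 200, 200]),
}
match = identify_trigger([210, 190, 40, 60], stored)
```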
- media content such as overlay imagery may derive from past workplace activities of the user of the mobile device.
- a trigger image may comprise a Wand Campus App Logo that is scanned to begin an augmented reality experience.
- a user may also directly input requests into the augmented reality program. Such requests may be auditory, typed manually, etc. and may relate to any of the functionalities described herein, including calendaring, assignment management, managing job opportunities, and the like.
- Available content chosen by a user may be restricted or limited to a particular theme or category. For example, as shown in FIG. 4 , a user may restrict choices to those related to navigation information. In this way, an employee can call up overlay imagery restricted to specific categories of information by specifying particular input information.
- When a user provides input information to the augmented reality program, this information is transmitted to a workplace management system (also referred to herein as “WMS”) that identifies information relevant to the chosen content. The WMS makes a determination based on the input information and then outputs composited media content to an output of the mobile device. For example, a user may restrict choices to those related to navigation information, thereby transmitting a navigation request to the WMS and eliciting a determination related to the location, travel direction, travel speed, etc. of the user.
- An overlay module then outputs visual composited media content comprising, for example, a 3D campus map to the display and audible composited media content to the speaker.
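The WMS determination step can be sketched as a dispatch from the user's chosen content category to composited output for the overlay module. The handler table and return fields are assumptions, not the patent's actual interfaces.

```python
def wms_determine(request):
    """Map a category-restricted user request to composited media content."""
    handlers = {
        "navigation": lambda r: {"visual": "3d-campus-map", "audio": "route-guidance"},
        "calendar":   lambda r: {"visual": "schedule-screen", "audio": None},
    }
    handler = handlers.get(request.get("category"))
    if handler is None:
        # No recognized category: prompt the user, as the system does by voice.
        return {"visual": "category-picker", "audio": "How can I assist you?"}
    return handler(request)
```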
- the present invention provides various means for a user to input information, including voice interactive functions and user-selectable buttons.
- the present invention allows a user to view and manipulate active information.
- Active information may include information required by employees in the workplace, including user navigation history information, user maps integration history information, user building access history information, user remote building access history information, voice command history information, user calendar history information, user assignment history information, user timecard history information, user button selection history information, user video history information, and the like.
- the augmented reality environment facilitates access to various locations and resources on a given workplace campus.
- user video history information refers to the history of videos viewed by the user.
- the user provides input information in any of the manners or modes described herein, including voice commands and user-selectable buttons.
- using voice command and voice recognition functionalities, the user may call up a desired program or application.
- composited media content may be displayed on the augmented reality mobile device screen.
- composited media content may present the user with questions such as, “How can I assist you?”, in response to voice recognition by the augmented reality mobile device.
- the user may control the mobile device by indicating a choice on the virtual screen.
- the mobile device may be responsive to one or more motion or position sensors mounted on the electronic building, exterior structures, interior structures, or on the mobile device itself.
- the signals generated by the user inputs are initially sent to a microprocessor or microcontroller within the mobile device itself.
- the mobile device provides signal transducing and/or processing steps either at a remote server or locally in the mobile device. For example, some of the processing can occur at the remote server, while the remaining processing can occur at the mobile device.
- the user simply vocalizes “yes” or “no” in order to activate or deny the display of information.
- overlay imagery may be activated by questions asked by the user. For example, a user may ask, “What buildings do I have access to?”
- additional input means are contemplated by the present invention.
- the user provides a hand gesture to indicate “yes” or “no”.
- a trackpad depression may indicate “yes” or “no”. Such commands may trigger a variety of programs in the augmented reality system.
- buttons provide yet another means for a user to input information into the augmented reality system.
- Buttons are presented as overlay imagery and may display active information.
- active information may include information required by employees in the workplace, including user navigation history information, building access history information, user remote building access history information, voice command history information, user calendar history information, user assignment history information, user timecard history information, user video history information, and the like. Utilizing these features, the augmented reality environment facilitates access to various locations and resources on a given campus.
- buttons are presented to the user's mobile device as augmented reality screens or icons.
- buttons are presented to the user when the display of media content such as a building, trigger image or other object is roughly centered in the camera view.
- image analysis algorithms built into a capture module automate the activation of programs otherwise controlled by buttons. Such capture modules detect the presence of trigger images in the camera view and immediately begin capturing images and user information.
- the presence of a trigger image or a physical barcode sticker may also be used to automatically begin capturing media content.
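The “roughly centered” condition that gates button presentation and automatic capture might be checked as below; the 20% tolerance is an assumed value.

```python
def roughly_centered(bbox, frame_w, frame_h, tolerance=0.2):
    """True when a detected trigger's bounding-box centre lies within
    `tolerance` (as a fraction of frame size) of the frame centre."""
    x, y, w, h = bbox
    cx, cy = x + w / 2, y + h / 2
    return (abs(cx - frame_w / 2) <= tolerance * frame_w and
            abs(cy - frame_h / 2) <= tolerance * frame_h)
```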
- a Logo App utilizes trigger images comprising workplace logos (i.e., the Wand Campus App logo shown in FIG. 1 ).
- the final composited media content is assembled on augmented reality screens surrounding the logo (as opposed to superimposing the composited media content on the logo itself).
- Active information may include information required by employees in the workplace, including user navigation history information, building access history information, user remote building access history information, voice command history information, user calendar history information, user assignment history information, user timecard history information, user video history information, and the like.
- the logo App interfaces with mapping and travel time functionalities of the augmented reality system.
- mapping functionalities (maps integration) serve to assist navigation around a workplace campus. This functionality further serves to improve meeting management, security, building access, campus rendezvous with visitors, and the like.
- the mobile device provides 3D composited media campus maps for a user. Said mapping composited media may be accompanied by voice information, video information, and the like.
- the mapping overlay imagery may include text, pictures, buttons, or any combination thereof.
- voice command features and user-selectable features such as buttons allow a user to input navigation information into the augmented reality program.
- the user may input mapping information into the augmented reality program related to his or her desired destination and destination arrival time.
- the user's mobile device screen provides the primary means by which a user views and interacts with the augmented reality mapping program.
- the forward-facing camera of the mobile device takes an image of the trigger image and sends it for processing to the mobile device's processor.
- trigger image recognition software determines which building or trigger image a user is facing.
- GPS coordinates may be searched in a database to determine what building or trigger image a user is facing.
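A minimal sketch of the GPS database lookup follows, assuming an illustrative table of building coordinates; the equirectangular distance approximation is adequate at campus scale.

```python
import math

# Illustrative campus database; coordinates are made-up examples.
BUILDINGS = {
    "Reception Building": (47.6205, -122.3493),
    "Human Resources Building": (47.6212, -122.3480),
}

def nearest_building(lat, lon):
    """Determine which building the user is facing by GPS proximity."""
    def dist(name):
        blat, blon = BUILDINGS[name]
        dx = (blon - lon) * math.cos(math.radians(lat))
        dy = blat - lat
        return math.hypot(dx, dy)
    return min(BUILDINGS, key=dist)
```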
- information such as building information in the vicinity of the trigger image may then be displayed to the user's mobile device screen.
- composited media content with mapping details is superimposed onto a user's screen in order to assist the user in completing various navigation activities.
- composited media content assists a new employee visiting the Reception Building on his or her first day.
- the new employee may scan a trigger image located at the Reception Building entrance in order to initiate mapping assistance.
- said trigger image may comprise a logo imprinted on the glass door of the Reception Building front entrance.
- the user may hover over a Wand Campus App logo located at the front door, thereby causing the augmented reality application to present various augmented reality screens surrounding the logo on the mobile screen.
- This information may include campus maps, calendar information, expense information, assignment information, job opportunity information, building information, building access tools and the like as described herein.
- building access tools may comprise user-selectable icons enabling locking and unlocking of various doors, both in person and remotely.
- the mapping functionality provides composited media related to workplace events on campus, meeting locations, meeting times, campus routes, travel times, building information, and the like.
- FIG. 4 shows how maps integration may facilitate display of such information, including campus routes and travel times.
- campus route information may comprise cookie crumbs that illuminate a pathway across the campus leading to a given building.
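The cookie-crumb pathway could be generated by dropping evenly spaced waypoints between the user and the destination building. This straight-line sketch ignores actual walkways, which a real route would follow.

```python
import math

def breadcrumbs(start, end, spacing_m, meters_per_degree=111_320):
    """Evenly spaced waypoints from `start` to `end` (lat/lon pairs),
    roughly `spacing_m` metres apart, to be rendered as illuminated crumbs."""
    (lat1, lon1), (lat2, lon2) = start, end
    dist = math.hypot(lat2 - lat1, lon2 - lon1) * meters_per_degree
    steps = max(1, int(dist // spacing_m))
    return [(lat1 + (lat2 - lat1) * i / steps,
             lon1 + (lon2 - lon1) * i / steps) for i in range(steps + 1)]

# About 111 m due north, one crumb every ~25 m:
crumbs = breadcrumbs((0.0, 0.0), (0.001, 0.0), 25)
```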
- the mapping functionalities of the present invention may utilize mobile GPS hardware and may interface with calendaring functions provided by the augmented reality system.
- building access features are also provided.
- composited media and overlay imagery comprise building access information
- images or icons summarizing building entry details are provided.
- building entry icons are user-selectable and may be utilized on location or remotely.
- building access features may include trigger image-controlled and/or biometrically-controlled door locks. Similar to the mapping features described above, the user's mobile device initiates use of building access features by first capturing media content (i.e., a still image or motion video) of a trigger image.
- a trigger image may comprise a workplace door, door icon, badge, logo, and the like.
- the mobile device then superimposes composited media content surrounding the trigger image (i.e., a logo) in order to form a composited media image.
- the augmented reality program asks if the user would like access to the building, or alternately a door within the interior of the building.
- the augmented reality program presents color-coded selectable icons to represent locked and unlocked doors and may allow a user to remotely unlock doors.
- unlocked doors are identified with alphanumeric and color codes.
- the composited media is selectable to lock and unlock doors, either in person or remotely, and may be accompanied by audio information.
- the overlay imagery comprising the building access functionality may include text, pictures, buttons, video, or any combination thereof.
- the system unlocks the door.
- This user command may be combined with biometrically-controlled information, such as a photograph of the user's face.
- the user command and/or biometric information is then transmitted to the WMS, which then identifies the user as being authorized.
- a message is automatically sent to the relevant electronic building location to unlock the door.
- the building has a solenoid attached to the door lock, and unlocks the door under software control.
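The building-side logic — authorize the user, drive the solenoid under software control, and log the attempt — might look like this sketch; the authorization table and class interface are assumptions.

```python
class DoorController:
    """Sketch of a software-controlled door: the WMS authorizes a user and
    sends an unlock message, which energizes a solenoid attached to the lock."""

    def __init__(self, authorized_users):
        self.authorized = set(authorized_users)
        self.unlocked = False
        self.access_log = []  # local building access history

    def handle_unlock(self, user_id):
        granted = user_id in self.authorized
        if granted:
            self.unlocked = True  # a real system would energize the solenoid here
        self.access_log.append((user_id, granted))
        return granted

controller = DoorController({"employee-42"})
```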
- the building access features of the present invention may be linked to the calendaring, timecard, and other features of the invention that are herein disclosed. For example, in one embodiment, when accessing an assigned building, an employee's time on the job is automatically tracked. Relatedly, when an employee leaves the building, time tracking may stop.
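Linking building access to timecards amounts to summing the intervals between entry and exit events, for example (the event format is an illustrative assumption):

```python
def track_time(events):
    """Sum on-the-job seconds from building entry/exit events: the clock starts
    when an assigned building is accessed and stops when the employee leaves.
    `events` is a list of (timestamp_seconds, 'enter'|'exit') pairs."""
    total, entered_at = 0, None
    for ts, kind in sorted(events):
        if kind == "enter" and entered_at is None:
            entered_at = ts
        elif kind == "exit" and entered_at is not None:
            total += ts - entered_at
            entered_at = None
    return total
```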
- trigger image-controlled and/or biometrically-controlled systems remove the need for a workplace to issue master keys to users.
- traditional key and lock systems are difficult to change.
- the disclosed augmented reality system offers better security than either existing key or biometric systems.
- trigger image-controlled door locks are inexpensive to deploy, requiring only a software-controlled solenoid to be installed in each building.
- the present invention can be used to display real-time information to users regarding their daily schedules in addition to information about past activities.
- the mobile device superimposes composited media onto augmented reality screens surrounding the captured trigger image, as described above.
- Calendaring information may include project start and end dates and may also include a countdown providing the number of days remaining to complete a given project.
- the computing platform of the augmented reality system described herein also includes calendaring inputs that maintain user context information such as user preferences, user calendar history information, meeting durations, meeting outcomes, etc.
- the calendar functionality may provide a proxy for user preferences and user context by enabling preferences to be inferred from prior actions and the user's calendar, which indicates user context like time, place, related contact information and subject descriptor.
- timecard functionalities are also available to the user of the present augmented reality system.
- the augmented reality application displays daily working hours that may be tracked and compared on a weekly basis, in addition to a time entry breakdown showing approved time, time pending approval, and overtime hours.
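The displayed time-entry breakdown can be computed from per-day records, for example; the approved/pending flag and the eight-hour standard day are assumptions.

```python
def weekly_breakdown(daily_hours, standard_day=8.0):
    """Aggregate daily working hours into approved time, time pending approval,
    and overtime. Each entry is (hours, approved_flag); overtime is any time
    beyond the standard day."""
    approved = sum(h for h, ok in daily_hours if ok)
    pending = sum(h for h, ok in daily_hours if not ok)
    overtime = sum(max(0.0, h - standard_day) for h, _ in daily_hours)
    return {"approved": approved, "pending": pending, "overtime": overtime}
```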
- a backend server can retrieve the user's workplace timecard activity from a user tracking account maintained by the workplace. Such an account can be a workplace-based account and/or an online account for the user.
- the present augmented reality system can be used to display real-time information identifying key assignment contacts, skills required for a given position, and the like.
- the mobile device superimposes composited media content onto augmented reality screens surrounding the captured trigger image.
- Assignment information may include information related to assignment contacts including photographs, positions in the company, employee ID information, barcodes linked to employee information, and the like. Further, assignment completion status may be displayed as a color-coded image and a displayed percentage.
- a color-coded user-selectable list of required position skills may be listed.
- Required position skills may include proficiency with software (e.g., Microsoft Office), leadership experience, budgeting experience, reporting experience, risk management experience, analytics experience, and the like. Buttons may be presented to a user that permit the presentation of assignment details, timecard information, and the like.
- videos are identified via search by GPS location, search by trigger image recognition, search by vocal command, search by gesture command, and the like.
- a video database may be searched via the GPS coordinates of the Human Resources Building or by keystroke user input of the term “Human Resources Building”. Search results may include geo-tagged training videos for training of Human Resources personnel, navigation assistance to the Human Resources Building, and/or videos associated with maintenance of the Human Resources Building.
- the videos may be scrolled or flipped through using the control techniques described herein. Videos of interest may be played using the control techniques described herein.
- the video may take the form of composited media overlaid onto a real world scene or may be superimposed on the trigger image itself as described above.
- the display of the mobile device may also be darkened to enable higher-contrast viewing.
- the mobile device may also utilize its camera and network connectivity to provide the user with streaming video conferencing capabilities.
- users of the present augmented reality system may receive content from an abundance of sources and may limit their choice of composited media to particular themes or categories.
- a user may limit composited media to only one department or building or within certain geographical limits. In one embodiment, these limits are selected via GPS criteria or by manually indicating a geographic restriction.
- a user may require that sources of streaming content be limited to those within a certain radius (a set number of kilometers or miles) of the user. These limits may be set by voice command, button selection, hand motion, or any other mode of user input described herein.
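For illustration only, one way to realize such a radius limit is a great-circle distance filter over the geo-tagged sources; the source names and coordinates below are hypothetical, not from the disclosure:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in kilometers."""
    r = 6371.0  # mean Earth radius in km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def within_radius(user, sources, radius_km):
    """Keep only content sources inside the user-selected radius."""
    return [s for s in sources
            if haversine_km(user[0], user[1], s["lat"], s["lon"]) <= radius_km]

sources = [
    {"name": "HR Building feed", "lat": 40.7130, "lon": -74.0060},
    {"name": "Remote campus feed", "lat": 41.8781, "lon": -87.6298},
]
nearby = within_radius((40.7128, -74.0060), sources, 5.0)
print([s["name"] for s in nearby])  # ['HR Building feed']
```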
- FIG. 6 is a flowchart of operations for the augmented reality system in a workplace environment according to some example embodiments.
- a flow diagram 10 includes operations that, in some example embodiments, are performed by components of a mobile device. The operations of the flow diagram 10 begin at block 12.
- the camera of the mobile device captures media content of a trigger image that is local or remote to a workplace environment.
- the camera of the mobile device can capture still images, video, or a combination thereof.
- Examples of triggers captured at a workplace environment include a badge, a logo, a building, an icon, etc., as described above.
- Examples of triggers captured outside workplace environments include signage at an establishment outside of the workplace campus yet owned by or associated with the workplace (e.g., brick-and-mortar restaurants, stores, etc. associated with the workplace).
- trigger images may be captured by various users of the augmented reality system. Users may include employees, management, visitors, maintenance workers, security workers, and the like.
- a GPS module or cellular infrastructure is used to determine a location of the mobile device at a time when the media content is captured by the camera.
- the GPS module receives signals from a number of satellites orbiting the Earth. The signals include data indicating each satellite's position and the current time. Based on the positions of multiple satellites and the times at which their signals were sent, the GPS module can use trilateration to determine its location on the Earth. In some example embodiments, differential GPS is used, wherein the area has already been surveyed using GPS. The GPS module determines the location of the mobile device within that area, and the overlay module then adjusts the location captured by the GPS module using the location data from the previous survey. As noted above, the location can be determined using a cellular infrastructure to triangulate the location of the mobile device, alternatively or in addition to GPS locating. The overlay module of the mobile device stores this location in the storage unit of the mobile device for subsequent processing.
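The trilateration step can be illustrated in two dimensions (actual GPS solves in three dimensions plus receiver clock bias; this simplified sketch is for exposition and is not from the disclosure). Subtracting the distance-circle equations pairwise yields a linear system in the receiver position:

```python
def trilaterate(anchors, dists):
    """Solve for a 2-D position from three known anchor points and
    measured distances to each (planar analogue of GPS trilateration)."""
    (x1, y1), (x2, y2), (x3, y3) = anchors
    r1, r2, r3 = dists
    # Subtracting circle equations pairwise eliminates the quadratic terms.
    a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
    c1 = r1**2 - r2**2 + x2**2 - x1**2 + y2**2 - y1**2
    a2, b2 = 2 * (x3 - x2), 2 * (y3 - y2)
    c2 = r2**2 - r3**2 + x3**2 - x2**2 + y3**2 - y2**2
    det = a1 * b2 - a2 * b1  # nonzero when anchors are not collinear
    return ((c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det)

# A receiver at (3, 4) measured from anchors at (0,0), (10,0), (0,10):
x, y = trilaterate([(0, 0), (10, 0), (0, 10)], (5.0, 65 ** 0.5, 45 ** 0.5))
print(round(x, 6), round(y, 6))  # 3.0 4.0
```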
- the compass of the mobile device determines a direction that a lens of the camera of the mobile device is facing at a time when the media content is captured by the camera.
- the overlay module stores this direction in the storage unit and/or main memory for subsequent processing.
- the overlay module can make this determination based on embedded trigger images, such as logos, badges, and the like in various workplace displays (as described above).
- the overlay module identifies the trigger(s) based on the location of the mobile device and the direction that the lens of the camera is facing at the time when the media content is captured by the camera.
- the overlay module can determine the location of triggers in view of the lens of the camera based on the location of the mobile device and the direction of the lens.
- the overlay module can transmit its location and direction to a backend server.
- the backend server can then return the identification of the viewable triggers to the overlay module.
- the backend server stores the locations of triggers in the area (e.g., the workplace environment). For example, the locations of the triggers, structural aspects of the workplace environment (e.g., structural posts, walls, etc.), and the like are stored by the backend server.
- the backend server can return the identification of the triggers in the viewable area.
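As a hedged sketch of the viewable-trigger determination described above (assuming an area small enough that latitude/longitude differences can be treated as planar; the function and trigger names are hypothetical, not from the disclosure), a backend could keep triggers whose bearing from the device falls within the camera's field of view:

```python
import math

def bearing_deg(origin, target):
    """Compass bearing from origin to target, degrees clockwise from north
    (flat-earth approximation, valid over a small campus area)."""
    dlat = target[0] - origin[0]
    dlon = target[1] - origin[1]
    return math.degrees(math.atan2(dlon, dlat)) % 360.0

def viewable_triggers(device_pos, heading_deg, triggers, fov_deg=60.0):
    """Return ids of triggers within +/- half the field of view of the lens heading."""
    visible = []
    for t in triggers:
        # Signed angular difference in (-180, 180].
        diff = (bearing_deg(device_pos, t["pos"]) - heading_deg + 180.0) % 360.0 - 180.0
        if abs(diff) <= fov_deg / 2:
            visible.append(t["id"])
    return visible

triggers = [
    {"id": "lobby-logo", "pos": (40.7138, -74.0060)},        # north of the device
    {"id": "rear-badge-reader", "pos": (40.7118, -74.0060)}, # south of the device
]
print(viewable_triggers((40.7128, -74.0060), 0.0, triggers))  # ['lobby-logo']
```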
- identifying a workplace user involves an analysis of a trigger image database.
- the trigger image database includes trigger image information.
- Trigger image information may include a list of trigger images, information regarding the workplace environment, the locations of the triggers, structural aspects of the workplace environment, user access information, user past history information, and the like.
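One hypothetical shape for a trigger image database record covering the information listed above (all field names are assumptions for illustration, not part of the disclosure):

```python
from dataclasses import dataclass, field

@dataclass
class TriggerRecord:
    """Illustrative trigger-image database entry."""
    image_id: str                 # identifier of the trigger image
    location: tuple               # (lat, lon) where the trigger is installed
    environment: str              # workplace environment description
    allowed_users: set = field(default_factory=set)  # user access information
    history: list = field(default_factory=list)      # user past history entries

rec = TriggerRecord("badge-logo-01", (40.7128, -74.0060), "HR Building lobby")
rec.allowed_users.add("employee-1138")
rec.history.append("2020-11-17: navigation overlay served")
```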
- the overlay module downloads, into the mobile device from a backend server, overlay imagery derived from workplace activity.
- Various overlay imagery can be downloaded (as described above).
- the captured image comprises a trigger image.
- the overlay imagery may include data regarding past workplace activity for the particular trigger image.
- the overlay imagery can also identify other similar types of programs to execute based on the specific trigger image.
- the overlay imagery can provide media content related to a past workplace activity of the user associated with the mobile device.
- the past workplace activity of the user may comprise past campus navigation routes, past assignment information, past interviews, past meetings, etc.
- the overlay module composites the overlay imagery onto the captured media content to create a composited media content.
- the composited media content can be various combinations of media content. For example, still imagery (e.g., text, graphics, etc.) can be composited onto a video or a still image. In another example, video imagery can be composited onto a video or still image. In another example, a graphical user interface can be composited onto a video or still image to allow the user to enter information. While the media content has been described relative to visual media content, in some other embodiments, audio media content can be included as part of the captured media content, the overlay imagery, or both.
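For illustration, the per-pixel "over" alpha blend commonly used for this kind of compositing might be sketched as follows (a simplified model for exposition, not the claimed implementation):

```python
def composite_pixel(overlay_rgba, base_rgb):
    """Blend one RGBA overlay pixel over an RGB pixel of the captured frame
    using the standard 'over' operator."""
    r, g, b, a = overlay_rgba
    alpha = a / 255.0  # 0.0 = fully transparent, 1.0 = fully opaque
    return tuple(round(alpha * o + (1 - alpha) * c)
                 for o, c in zip((r, g, b), base_rgb))

# A half-transparent red overlay pixel over a blue captured pixel:
print(composite_pixel((255, 0, 0, 128), (0, 0, 255)))  # (128, 0, 127)
```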
- the overlay module outputs the overlaid media content to an output of the mobile device.
- the overlay module can output the visual overlaid media content to the display and audible overlaid media content to the speaker.
- the overlay module can output the overlaid media content to other devices. This output can occur through wired or wireless communications between the mobile device and the other device.
- Said systems may include an augmented reality mobile device system with voice interactive and other augmented reality functions including user-selectable buttons.
- Said system can take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment containing both hardware and software elements.
- the system is implemented in software, which includes but is not limited to firmware, resident software, microcode, etc.
- the system can take the form of a computer program product accessible from a computer-usable or computer-readable medium providing program code for use by or in connection with a computer or any instruction execution system.
- a computer-usable or computer readable medium can be any apparatus that can contain, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device.
- the medium can be an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system (or apparatus or device) or a propagation medium.
- Examples of a computer-readable medium comprise a semiconductor or solid-state memory, magnetic tape, a removable computer diskette, a random-access memory (RAM), a read-only memory (ROM), a rigid magnetic disk and an optical disk.
- Current examples of optical disks comprise compact disk-read only memory (CD-ROM), compact disk-read/write (CD-R/W) and DVD.
- a data processing system suitable for storing and/or executing program code comprises at least one processor coupled directly or indirectly to memory elements through a system bus.
- the memory elements can include local memory employed during actual execution of the program code, bulk storage, and cache memories that provide temporary storage of at least some program code in order to reduce the number of times code is retrieved from bulk storage during execution.
- I/O devices (including but not limited to keyboards, displays, pointing devices, etc.) can be coupled to the system either directly or through intervening I/O controllers.
- Network adapters may also be coupled to the system to enable the data processing system to become coupled to other data processing systems or remote printers or storage devices through intervening private or public networks.
- Modems, cable modem and Ethernet cards are just a few of the currently available types of network adapters.
- the Internet refers to the collection of networks and routers that use the Transmission Control Protocol/Internet Protocol (“TCP/IP”) to communicate with one another.
- the Internet 20 can include a plurality of local area networks (“LANs”) and a wide area network (“WAN”) that are interconnected by routers.
- the routers are special purpose computers used to interface one LAN or WAN to another.
- Communication links within the LANs may be wireless, twisted wire pair, coaxial cable, or optical fiber, while communication links between networks may utilize 56 Kbps analog telephone lines, 1 Mbps digital T-1 lines, 45 Mbps T-3 lines or other communications links known to those skilled in the art.
- computers and other related electronic devices can be remotely connected to either the LANs or the WAN via a digital communications device, modem and temporary telephone, or a wireless link.
- the Internet comprises a vast number of such interconnected networks, computers, and routers.
- the Internet has recently seen explosive growth by virtue of its ability to link computers located throughout the world. As the Internet has grown, so has the WWW.
- the WWW is a vast collection of interconnected or “hypertext” documents written in HTML, or other markup languages, that are electronically stored at or dynamically generated by “WWW sites” or “Web sites” throughout the Internet.
- client-side software programs that communicate over the Web using the TCP/IP protocol are part of the WWW, such as JAVA® applets, instant messaging, e-mail, browser plug-ins, Macromedia Flash, chat and others.
- Interactive hypertext environments may include proprietary environments such as those provided in America Online or other online service providers, as well as the “wireless Web” provided by various wireless networking providers, especially those in the cellular phone industry. It will be appreciated that the present application could apply in any such interactive communication environments, however, for purposes of discussion, the Web is used as an exemplary interactive hypertext environment with regard to the present application.
- a web site is a server/computer connected to the Internet that has massive storage capabilities for storing hypertext documents and that runs administrative software for handling requests for those stored hypertext documents as well as dynamically generating hypertext documents.
- Embedded within a hypertext document are a number of hyperlinks, i.e., highlighted portions of text which link the document to another hypertext document possibly stored at a web site elsewhere on the Internet.
- Each hyperlink is assigned a URL that provides the name of the linked document on a server connected to the Internet.
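The structure of such a URL can be illustrated with Python's standard library (an illustrative aside using an example address, not part of the disclosure):

```python
from urllib.parse import urlparse

# A hypothetical hyperlink target: scheme, server name, document path, fragment.
parts = urlparse("http://www.example.com/docs/handbook.html#benefits")
print(parts.scheme)    # http
print(parts.netloc)    # www.example.com  (the server connected to the Internet)
print(parts.path)      # /docs/handbook.html  (the name of the linked document)
print(parts.fragment)  # benefits
```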
- a web server may also include facilities for storing and transmitting application programs, such as application programs written in the JAVA® programming language from Sun Microsystems, for execution on a remote computer.
- a web server may also include facilities for executing scripts and other application programs on the web server itself.
- a remote access user may retrieve hypertext documents from the World Wide Web via a web browser program.
- Upon request from the remote access user via the web browser, the web browser requests the desired hypertext document from the appropriate web server using the URL for the document and the hypertext transport protocol (“HTTP”).
- HTTP is a higher-level protocol than TCP/IP and is designed specifically for the requirements of the WWW.
- HTTP runs on top of TCP/IP to transfer hypertext documents and user-supplied form data between server and client computers.
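As an illustrative aside (not part of the disclosure), the request a browser sends on top of TCP/IP is plain text; a minimal HTTP/1.1 GET for a hypertext document might look like:

```python
def http_get_request(host, path):
    """Build the plain-text HTTP/1.1 GET request a browser would send,
    over a TCP/IP connection, to fetch a hypertext document."""
    return (f"GET {path} HTTP/1.1\r\n"
            f"Host: {host}\r\n"        # required in HTTP/1.1
            f"Connection: close\r\n"
            f"\r\n")                   # blank line ends the header section

request = http_get_request("www.example.com", "/docs/handbook.html")
print(request.splitlines()[0])  # GET /docs/handbook.html HTTP/1.1
```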
- the WWW browser may also retrieve programs from the web server, such as JAVA applets, for execution on the client computer.
- the WWW browser may include optional software components, called plug-ins, that run specialized functionality within the browser.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US16/950,776 US11276237B2 (en) | 2018-04-12 | 2020-11-17 | Augmented reality campus assistant |
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201862656885P | 2018-04-12 | 2018-04-12 | |
US16/382,122 US10846935B2 (en) | 2018-04-12 | 2019-04-11 | Augmented reality campus assistant |
US16/950,776 US11276237B2 (en) | 2018-04-12 | 2020-11-17 | Augmented reality campus assistant |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/382,122 Continuation US10846935B2 (en) | 2018-04-12 | 2019-04-11 | Augmented reality campus assistant |
Publications (2)
Publication Number | Publication Date |
---|---|
US20210065453A1 US20210065453A1 (en) | 2021-03-04 |
US11276237B2 true US11276237B2 (en) | 2022-03-15 |
Family
ID=68162070
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/382,122 Active US10846935B2 (en) | 2018-04-12 | 2019-04-11 | Augmented reality campus assistant |
US16/950,776 Active US11276237B2 (en) | 2018-04-12 | 2020-11-17 | Augmented reality campus assistant |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/382,122 Active US10846935B2 (en) | 2018-04-12 | 2019-04-11 | Augmented reality campus assistant |
Country Status (1)
Country | Link |
---|---|
US (2) | US10846935B2 (en) |
Families Citing this family (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20200065771A1 (en) * | 2018-08-24 | 2020-02-27 | CareerBuilder, LLC | Location-based augmented reality for job seekers |
JP7456094B2 (en) * | 2019-04-05 | 2024-03-27 | 村田機械株式会社 | Maintenance method and maintenance server |
US11763558B1 (en) * | 2019-04-19 | 2023-09-19 | Apple Inc. | Visualization of existing photo or video content |
US11188755B2 (en) * | 2019-11-01 | 2021-11-30 | Pinfinity, Llc | Augmented reality systems and methods incorporating wearable pin badges |
US11176751B2 (en) * | 2020-03-17 | 2021-11-16 | Snap Inc. | Geospatial image surfacing and selection |
KR20210123198A (en) * | 2020-04-02 | 2021-10-13 | 주식회사 제이렙 | Argumented reality based simulation apparatus for integrated electrical and architectural acoustics |
US20220076514A1 (en) * | 2020-09-09 | 2022-03-10 | Carrier Corporation | System and method of device identification |
CN112468970A (en) * | 2020-11-03 | 2021-03-09 | 广州理工学院 | Campus navigation method based on augmented reality technology |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20090010496A1 (en) * | 2006-03-20 | 2009-01-08 | Olympus Corporation | Image information processing apparatus, judging method, and computer program |
US20110032109A1 (en) * | 2009-01-28 | 2011-02-10 | Fox Rodney W | Premises Monitoring System |
US20130083066A1 (en) * | 2011-09-30 | 2013-04-04 | Wms Gaming, Inc. | Augmented reality for table games |
US8509488B1 (en) * | 2010-02-24 | 2013-08-13 | Qualcomm Incorporated | Image-aided positioning and navigation system |
US9235913B2 (en) * | 2011-04-13 | 2016-01-12 | Aurasma Limited | Methods and systems for generating and joining shared experience |
US20160267808A1 (en) * | 2015-03-09 | 2016-09-15 | Alchemy Systems, L.P. | Augmented Reality |
US10467230B2 (en) * | 2017-02-24 | 2019-11-05 | Microsoft Technology Licensing, Llc | Collection and control of user activity information and activity user interface |
Also Published As
Publication number | Publication date |
---|---|
US20190318541A1 (en) | 2019-10-17 |
US10846935B2 (en) | 2020-11-24 |
US20210065453A1 (en) | 2021-03-04 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11276237B2 (en) | Augmented reality campus assistant | |
US11042769B2 (en) | Augmented reality badge system | |
US10373616B2 (en) | Interaction with a portion of a content item through a virtual assistant | |
US9754295B2 (en) | Providing navigation functionality in a retail location using local positioning technology | |
US20210092581A1 (en) | Situational awareness systems and methods | |
US20140245140A1 (en) | Virtual Assistant Transfer between Smart Devices | |
CN110134806B (en) | Contextual user profile photo selection | |
WO2008061146A2 (en) | Remote time and attendance system and method | |
US10410303B1 (en) | Method and system for a mobile computerized multiple function real estate users assistant | |
US20200319775A1 (en) | System and method for analyzing electronic communications and a collaborative electronic communications user interface | |
Nunes et al. | Augmented reality in support of disaster response | |
US20200175609A1 (en) | Augmented reality badges | |
US11734033B1 (en) | Virtual automated real-time assistant | |
US11699269B2 (en) | User interface with augmented work environments | |
Schiliro et al. | the role of mobile devices in enhancing the policing system to improve efficiency and effectiveness: A practitioner’s perspective | |
Manes | The tetherless tourist: ambient intelligence in travel & tourism | |
Oppl | Subject-oriented elicitation of distributed business process knowledge | |
CN101382936A (en) | Active interaction navigation method and system | |
KR20180129374A (en) | Method and system for managing attendance using messenger | |
US20230177776A1 (en) | Systems and methods for enhanced augmented reality emulation for user interaction | |
US20230186247A1 (en) | Method and system for facilitating convergence | |
Kilby et al. | Designing a mobile augmented reality tool for the locative visualisation of biomedical knowledge | |
JP6892174B1 (en) | Virtual exhibition display control device, virtual exhibition system, virtual exhibition display control program and virtual exhibition display control method | |
US10085131B2 (en) | Systems and methods for communicating with a unique identifier | |
CN116420172A (en) | Systems and methods for integrating and using augmented reality technology |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
FEPP | Fee payment procedure |
Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: APPLICATION DISPATCHED FROM PREEXAM, NOT YET DOCKETED |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
|
AS | Assignment |
Owner name: CITIZENS BANK, N.A., AS COLLATERAL AGENT, CONNECTICUT Free format text: SECURITY INTEREST;ASSIGNOR:PRO UNLIMITED GLOBAL SOLUTIONS, INC.;REEL/FRAME:057356/0817 Effective date: 20210901 |
|
AS | Assignment |
Owner name: U.S. BANK, NATIONAL ASSOCIATION, AS COLLATERAL AGENT, MINNESOTA Free format text: SECURITY INTEREST;ASSIGNORS:PRO CORPORATION;PRO UNLIMITED GLOBAL SOLUTIONS, INC.;PRO UNLIMITED, INC.;REEL/FRAME:057392/0076 Effective date: 20210901 |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
|
STCF | Information on status: patent grant |
Free format text: PATENTED CASE |
|
AS | Assignment |
Owner name: MAGNIT, LLC, CALIFORNIA Free format text: MERGER;ASSIGNOR:PRO UNLIMITED CONVERSION NY, LLC;REEL/FRAME:063851/0349 Effective date: 20220915 Owner name: PRO UNLIMITED CONVERSION NY, LLC, CALIFORNIA Free format text: MERGER;ASSIGNOR:PRO UNLIMITED, INC.;REEL/FRAME:063851/0254 Effective date: 20220914 |