US20190141282A1 - Information processing apparatus, information processing method, and non-transitory computer-readable medium storing program - Google Patents

Information processing apparatus, information processing method, and non-transitory computer-readable medium storing program

Info

Publication number
US20190141282A1
Authority
US
United States
Prior art keywords
user
predetermined position
image
information processing
processing apparatus
Prior art date
Legal status
Abandoned
Application number
US16/170,835
Other languages
English (en)
Inventor
Xin Jin
Miharu HANAI
Current Assignee
Toyota Motor Corp
Original Assignee
Toyota Motor Corp
Priority date
Filing date
Publication date
Application filed by Toyota Motor Corp filed Critical Toyota Motor Corp
Assigned to TOYOTA JIDOSHA KABUSHIKI KAISHA. Assignment of assignors interest (see document for details). Assignors: HANAI, Miharu; JIN, Xin
Publication of US20190141282A1

Classifications

    • H04N5/76 Television signal recording
    • B60R1/00 Optical viewing arrangements; real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • G01C21/3647 Guidance involving output of stored or live camera images or video streams
    • G01C21/3697 Output of additional, non-guidance related information, e.g. low fuel level
    • G06F3/04817 Interaction techniques based on graphical user interfaces [GUI] using icons
    • G06F3/16 Sound input; Sound output
    • G06F3/167 Audio in a user interface, e.g. using voice commands for navigating, audio feedback
    • G06Q50/01 Social networking
    • G08G1/137 Indicating the position of scheduled vehicles within the vehicle, the indicator being in the form of a map
    • H04L51/10 User-to-user messaging characterised by the inclusion of multimedia information
    • H04L51/52 User-to-user messaging for supporting social networking services
    • H04L67/12 Protocols specially adapted for proprietary or special-purpose networking environments, e.g. networks in vehicles
    • H04L67/52 Network services specially adapted for the location of the user terminal
    • H04N23/60 Control of cameras or camera modules
    • H04N23/66 Remote control of cameras or camera parts, e.g. by remote control devices
    • H04N5/445 Receiver circuitry for displaying additional information
    • H04N5/772 Interface circuits between a recording apparatus and a television camera placed in the same enclosure
    • H04N9/8205 Transformation of the television signal for recording involving the multiplexing of an additional signal and the colour video signal
    • H04N5/23203
    • H04W4/02 Services making use of location information
    • G01S19/13 Receivers (satellite radio beacon positioning systems, e.g. GPS)

Definitions

  • the present disclosure relates to an information processing apparatus, an information processing method, and a non-transitory computer-readable medium storing a program.
  • JP 2014-096632 A Japanese Unexamined Patent Application Publication No. 2014-096632
  • JP 2008-082822 A Japanese Unexamined Patent Application Publication No. 2008-082822
  • In a case where a user goes sightseeing, takes a trip, or the like while driving a vehicle, capturing and storing images outside or inside the vehicle at a predetermined position, such as a landscape spot, in order to look back on a memory of the sightseeing or trip has been examined.
  • In the related art, the user manually captures and stores an image, or manually selects an image to be stored from among images automatically captured by a drive recorder.
  • The user therefore needs a lot of effort to capture an image or to select an image to be stored.
  • the disclosure provides an information processing apparatus, an information processing method, and a non-transitory computer-readable medium storing a program capable of allowing a user to comparatively easily look back on a memory of a trip, or the like.
  • a first aspect of the disclosure relates to an information processing apparatus.
  • the information processing apparatus includes a processor.
  • the processor is configured to acquire a current position of a user, acquire an image captured when the user passes through a predetermined position, and make the image be displayed on a map screen in association with the predetermined position.
  • the image captured when the user passes through the predetermined position is made to be displayed on the map screen in association with the predetermined position. Therefore, it is possible to allow the user to comparatively easily look back on a memory of a trip, or the like.
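As a rough illustration only (not the patent's actual implementation), the first aspect can be sketched in Python. The class and method names, the position-matching tolerance, and the capture callback are all assumptions introduced here for clarity:

```python
from dataclasses import dataclass, field

@dataclass
class MapMarker:
    """An image associated with a predetermined position on the map screen."""
    latitude: float
    longitude: float
    image_path: str

@dataclass
class InformationProcessingApparatus:
    """Sketch: acquire the current position, acquire an image captured when
    the user passes through a predetermined position, and associate the
    image with that position on a map screen."""
    markers: list = field(default_factory=list)

    def on_position_update(self, lat, lon, predetermined, capture_image):
        # When the current position matches a predetermined position,
        # capture an image and attach it to that position on the map.
        for p_lat, p_lon in predetermined:
            if abs(lat - p_lat) < 1e-3 and abs(lon - p_lon) < 1e-3:
                self.markers.append(MapMarker(p_lat, p_lon, capture_image()))
        return self.markers
```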
  • the processor may be configured to make a route from a departure place to a destination in a history of movement of the user be displayed on the map screen and make the image be displayed on the map screen in association with the predetermined position on the route.
  • the image captured when the user passes through the predetermined position is made to be displayed on the map screen in association with the predetermined position on the moving route of the user. Therefore, it is possible to allow the user to comparatively easily look back on a memory of a trip, or the like.
  • the predetermined position may include at least one of a position set by the user, a position designated in a route shared by another user, and a position with a comment or an image in the route shared by the other user.
  • the image captured when the user passes through the position set by the user, the position designated in the route shared by another user, or the position with the comment or the image in the route shared by another user is made to be displayed on the map screen in association with the predetermined position. Therefore, it is possible to allow the user to comparatively easily look back on a memory of a trip, or the like.
  • the processor may be configured to acquire a voice recorded when the user passes through the predetermined position and make the image and an operation icon for reproducing the voice be displayed on the map screen in association with the predetermined position.
  • the image captured when the user passes through the predetermined position, and the voice are made to be displayed on the map screen in association with the predetermined position. Therefore, it is possible to allow the user to comparatively easily look back on a memory of a trip, or the like.
  • the processor may be configured to acquire a voice collected when the user passes through the predetermined position, and in a case where a volume level of the acquired voice is equal to or higher than a threshold, acquire the image.
  • the image captured in a case where the volume level of the voice is equal to or higher than the threshold when the user passes through the predetermined position is made to be displayed on the map screen in association with the predetermined position. Therefore, it is possible to allow the user to comparatively easily look back on a memory of a trip at a time when the user became lively.
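A minimal sketch of this volume-triggered capture, assuming root-mean-square amplitude as the volume level (the patent does not specify the metric) and a hypothetical capture callback:

```python
import math

def volume_exceeds_threshold(samples, threshold_rms):
    """Return True when the collected voice is at least as loud as the
    threshold, using root-mean-square amplitude as the volume level."""
    if not samples:
        return False
    rms = math.sqrt(sum(s * s for s in samples) / len(samples))
    return rms >= threshold_rms

def maybe_capture(samples, threshold_rms, capture):
    """Acquire an image only when the voice volume reaches the threshold."""
    if volume_exceeds_threshold(samples, threshold_rms):
        return capture()
    return None
```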
  • the processor may be configured to acquire a comment input by the user to the predetermined position and make the image and the comment be displayed on the map screen in association with the predetermined position.
  • the image captured when the user passes through the predetermined position, and the comment are made to be displayed on the map screen in association with the predetermined position. Therefore, it is possible to allow the user to comparatively easily look back on a memory of a trip, or the like.
  • the processor may be configured to, in a case where a volume level of a voice collected when the user passes through the predetermined position is equal to or higher than a threshold, share the predetermined position and the image with terminals of users other than the user.
  • the image captured in a case where the volume level of the voice is equal to or higher than the threshold when the user passes through the predetermined position is made to be shared. Therefore, it is possible to comparatively easily share with other users a record of a trip at a time when the user became lively.
  • the processor may be configured to, in a case where a comment to the predetermined position is input by the user, share the predetermined position, the image, and the comment with terminals of users other than the user.
  • the image with the comment among the images captured when the user passes through the predetermined position is made to be shared. Therefore, it is possible to comparatively easily share a memory having a comparatively high degree of interest with other users.
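For illustration, the sharing variants above could assemble a payload along the following lines. The field names and the JSON format are assumptions; the patent leaves the format transmitted to the SNS server unspecified:

```python
import json

def build_share_payload(position, image_path, comment):
    """Sketch of sharing the predetermined position, image, and (optionally)
    a comment with terminals of users other than the user."""
    payload = {
        "position": {"latitude": position[0], "longitude": position[1]},
        "image": image_path,
    }
    # The comment-triggered sharing variant also attaches the user's comment.
    if comment is not None:
        payload["comment"] = comment
    return json.dumps(payload)
```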
  • the processor may be configured to acquire an image captured with an imaging device mounted in a vehicle when the user passes through the predetermined position. For this reason, the image captured with a drive recorder or the like when the user passes through the predetermined position with the vehicle is made to be displayed on the map screen in association with the predetermined position. Therefore, it is possible to allow the user to comparatively easily look back on a memory of a trip, or the like with a vehicle.
  • the processor may be configured to acquire an image outside the vehicle captured with the imaging device mounted in the vehicle when the user passes through the predetermined position.
  • the processor may be configured to acquire an image of the inside of a vehicle captured with an imaging device mounted in the vehicle when the user passes through the predetermined position. For this reason, the image of the inside of the vehicle captured with a drive recorder or the like when the user passes through the predetermined position with the vehicle is made to be displayed on the map screen in association with the predetermined position. Therefore, it is possible to allow the user to comparatively easily look back on a memory of a trip, or the like with a vehicle.
  • the processor may be configured to acquire an image of the user who is in a vehicle captured with an imaging device mounted in the vehicle when the user passes through the predetermined position. For this reason, the image of the occupants captured with a drive recorder or the like when the user passes through the predetermined position with the vehicle is made to be displayed on the map screen in association with the predetermined position. Therefore, it is possible to allow the user to comparatively easily look back on a memory of a trip, or the like with a vehicle from the image of the user, a friend, or the like at a memorable place or the like.
  • a second aspect of the disclosure relates to an information processing method.
  • the information processing method includes, with an information processing apparatus, acquiring a current position of a user, acquiring an image captured when the user passes through a predetermined position, and making the image be displayed on a map screen in association with the predetermined position.
  • a third aspect of the disclosure relates to a non-transitory computer-readable medium storing a program causing an information processing apparatus to perform a process.
  • the process includes, acquiring a current position of a user, acquiring an image captured when the user passes through a predetermined position, and making the image be displayed on a map screen in association with the predetermined position.
  • FIG. 1 is a diagram showing a configuration example of a communication system according to an embodiment
  • FIG. 2 is a diagram showing a hardware configuration example of a server according to the embodiment
  • FIG. 3 is a diagram showing an example of a functional block diagram of a terminal and a server according to the embodiment
  • FIG. 4A is a sequence diagram showing an example of a process of the communication system according to the embodiment.
  • FIG. 4B is a sequence diagram showing an example of a process of the communication system according to the embodiment.
  • FIG. 5 is a table showing an example of trip information
  • FIG. 6 is a diagram illustrating an example of a display screen of a terminal.
  • FIG. 1 is a diagram showing a configuration example of a communication system 1 according to an embodiment.
  • the communication system 1 has terminals 10-1, 10-2 (hereinafter simply referred to as "terminals 10" in a case where there is no need for distinction from each other), a server 20, a social networking service (SNS) server 30, and a content providing server 40.
  • the number of terminals 10 is not limited to two.
  • the terminals 10 and the server 20, and the terminals 10 and the SNS server 30, are connected in a communicable state, for example, through a network 50, such as the Internet, a mobile phone network, a wireless local area network (LAN), or a LAN.
  • the server 20 and the content providing server 40 are connected in a communicable state through the network 50 .
  • Each terminal 10 is, for example, an information processing apparatus (computer), such as a smartphone, a tablet personal computer (PC), a notebook PC, an in-vehicle camera (drive recorder) with a communication function, or a navigation device with a communication function.
  • the in-vehicle camera is an example of “an imaging device mounted in a vehicle”.
  • Each terminal 10 may have an internal camera (“imaging device”) that captures an image.
  • Each terminal 10 may alternatively be connected to an external camera (also an "imaging device") in a wired or wireless manner.
  • Each terminal 10 may be constituted of a plurality of devices.
  • each terminal 10 may be constituted of a smartphone, an in-vehicle navigation device, and an in-vehicle camera.
  • the smartphone and the in-vehicle navigation device may be connected to perform communication in a short-distance wireless manner, and the in-vehicle navigation device and the in-vehicle camera may be connected to perform communication through a cable or the like.
  • An image (still image or moving image) captured by the in-vehicle camera may be transmitted to the server 20 through the in-vehicle navigation device and the smartphone.
  • each terminal 10 may be constituted of an in-vehicle navigation device with a communication function and an in-vehicle camera.
  • the in-vehicle navigation device and the in-vehicle camera may be connected to perform communication through a cable or the like.
  • An image captured by the in-vehicle camera may be transmitted to the server 20 through the in-vehicle navigation device with the communication function.
  • Each terminal 10 automatically captures an image at a predetermined position, such as a landscape spot, and stores the image as trip information in the server 20 .
  • each terminal 10 may capture, for example, an image outside a vehicle, such as ahead of the vehicle.
  • Each terminal 10 may capture, for example, an image inside the vehicle, such as a user (occupant) who is in the vehicle.
  • Each terminal 10 may capture an image outside the vehicle and an image inside the vehicle simultaneously, for example.
  • Each terminal 10 displays the images included in the trip information stored in the server 20 on a map screen in association with the predetermined position.
  • Each terminal 10 may store the images in the terminal 10 , and may display the images on the map screen in association with the predetermined position.
  • the trip information may include, for example, information regarding a moving route from a departure place to a destination, or the like.
  • the trip information may include a via-point, a departure time, an arrival time, and a pin, a comment, an image captured by a user, or the like associated with a position on the route.
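For illustration, the items listed above suggest a trip-information record along these lines. The field names and types are assumptions, not the patent's actual schema:

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class PointRecord:
    """A pin, comment, or image associated with a position on the route."""
    latitude: float
    longitude: float
    comment: Optional[str] = None
    image_path: Optional[str] = None

@dataclass
class TripInformation:
    """Illustrative shape of one trip-information record: a moving route
    from a departure place to a destination, times, via-points, and
    position-associated pins, comments, and images."""
    departure_place: str
    destination: str
    departure_time: str
    arrival_time: str
    via_points: list = field(default_factory=list)
    point_records: list = field(default_factory=list)
```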
  • Each terminal 10 performs chat, image sharing, and the like with a group of friends or the like using a social networking service provided by the SNS server 30 .
  • the server 20 is, for example, an information processing apparatus for a server, and manages the trip information uploaded from the terminals 10 for each of the users of the terminals 10 .
  • the server 20 may provide a service, such as navigation, to the terminals 10 .
  • the server 20 shares the trip information uploaded from each terminal 10 with users of other terminals 10 through the SNS server 30 .
  • the server 20 may share the trip information uploaded from each terminal 10 with users of other terminals 10 not through the SNS server 30 .
  • the SNS server 30 is, for example, an information processing apparatus for a server, and provides a social networking service, such as group chat, to the terminals 10 .
  • the SNS server 30 transmits the trip information uploaded from each terminal 10 through the server 20 to other terminals 10 using the social networking service.
  • the SNS server 30 may transmit the trip information to other terminals 10 using social media including Wiki, SNS, a blog, an image sharing site, and the like.
  • the content providing server 40 is, for example, an information processing apparatus for a server, and provides data, such as business hours of facilities, to the server 20 .
  • An information processing program that realizes a process in the server 20 is provided, for example, by a storage medium 101 .
  • the information processing program is installed from the storage medium 101 to the auxiliary storage device 102 through the drive device 100 .
  • The information processing program does not necessarily have to be installed from the storage medium 101, and may be downloaded from another computer through the network.
  • the auxiliary storage device 102 stores the installed information processing program, and stores needed files, data, and the like.
  • the memory device 103 is, for example, a random access memory (RAM), and in a case where there is a start instruction of the program, reads and stores the program from the auxiliary storage device 102 .
  • the CPU 104 realizes functions related to the server 20 according to the program stored in the memory device 103 .
  • the interface device 105 is used as an interface for connection to the network.
  • As an example of the storage medium 101, a portable storage medium, such as a compact disk-read only memory (CD-ROM), a digital versatile disc (DVD), or a universal serial bus (USB) memory, is exemplified.
  • As an example of the auxiliary storage device 102, a hard disk drive (HDD), a flash memory, or the like is exemplified. Both of the storage medium 101 and the auxiliary storage device 102 correspond to a computer-readable storage medium.
  • the hardware configurations of the terminal 10 , the SNS server 30 , and the content providing server 40 may be the same as that of the server 20 .
  • FIG. 3 is a diagram showing an example of a functional block diagram of the terminal 10 and the server 20 according to the embodiment.
  • the server 20 has a storage unit 21 .
  • the storage unit 21 is realized using, for example, an auxiliary storage device or the like.
  • the storage unit 21 stores a trip information database (DB) 211 and the like. Data stored in the trip information DB 211 will be described below.
  • the server 20 has a position acquisition unit 22 , a route search unit 23 , a trip information acquisition unit 24 , a generation unit 25 , a display controller 26 , a sharing unit 27 , and a communication unit 28 .
  • the position acquisition unit 22 , the route search unit 23 , the trip information acquisition unit 24 , the generation unit 25 , the display controller 26 , the sharing unit 27 , and the communication unit 28 represent functions that are realized by a process executed on the CPU of the server 20 by one or more programs installed on the server 20 .
  • the position acquisition unit 22 acquires a current position of the terminal 10 .
  • the position acquisition unit 22 may acquire current position information acquired by a global positioning system (GPS) of the terminal 10 from the terminal 10 , for example, while a navigation function or the like of the terminal 10 is being executed.
  • the route search unit 23 searches for a route from a departure place to a destination.
  • the trip information acquisition unit 24 acquires, from the terminal 10, an image captured and a voice recorded when the user of the terminal 10 passes through a predetermined position.
  • the trip information acquisition unit 24 acquires a comment or the like input by the user of the terminal 10 .
  • the generation unit 25 generates trip information based on information acquired by the trip information acquisition unit 24 , and stores the generated trip information in the trip information DB 211 .
  • the display controller 26 makes the terminal 10 display the trip information stored in the trip information DB 211 . Specifically, the display controller 26 makes the image, the comment, and the like be displayed on a screen (map screen) of the terminal 10 , on which a map is displayed, in association with the predetermined position on the map based on the trip information.
  • the sharing unit 27 allows trip information of the user of the terminal 10 to be shared with other users.
  • the sharing unit 27 makes the trip information be shared with other users using the social networking service provided by the SNS server 30 , for example.
  • the communication unit 28 performs communication with the terminal 10 , the SNS server 30 , and the content providing server 40 .
  • the communication unit 28 receives a route search request, a route sharing request, and the like from the terminal 10 .
  • the communication unit 28 transmits the trip information to the SNS server 30 according to an instruction from the sharing unit 27 .
  • the reception unit 11 receives an input operation from the user.
  • the reception unit 11 receives, for example, an operation for designating a route, or the like.
  • the position acquisition unit 12 acquires the current position of the terminal 10 .
  • the trip information acquisition unit 14 controls a camera and a microphone when the user of the terminal 10 passes through a predetermined position, and acquires an image captured with the camera and a voice collected with the microphone.
  • the predetermined position may be a region having a predetermined area.
  • the predetermined position may be, for example, a region including a predetermined spot (hereinafter, referred to as a “predetermined place”) represented by a latitude, a longitude, and the like associated with a landscape spot or a facility, such as a commercial facility.
  • the trip information acquisition unit 14 acquires a comment or the like input by the user of the terminal 10 .
  • in a case where the distance between the current position and the predetermined place is equal to or less than a predetermined threshold, the trip information acquisition unit 14 may determine that the user of the terminal 10 passes through the predetermined position.
  • the predetermined threshold may be set in advance according to the predetermined place.
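The pass-through determination above can be sketched as follows. This is a minimal illustration only: the disclosure says the predetermined position is a region including a predetermined place and that a threshold may be set per place, but the use of the haversine distance and the concrete threshold values are assumptions introduced here.

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two (latitude, longitude) points, in meters."""
    r = 6371000.0  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def passes_through(current, place, threshold_m):
    """True when the current position falls inside the region around the
    predetermined place, i.e. within the per-place threshold distance."""
    lat, lon = current
    plat, plon = place
    return haversine_m(lat, lon, plat, plon) <= threshold_m
```

For example, with a 100 m threshold a position 0.01 degrees of latitude away (roughly 1.1 km) would not count as passing through, while the place itself would.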
  • the display controller 15 causes data of the image, the comment, and the like included in the trip information to be displayed on the map screen in association with the predetermined position on the map, according to an instruction from the server 20 .
  • the display controller 15 may cause the data to be displayed in association with the predetermined place included in the predetermined position.
  • the trip information sharing unit 16 shares the trip information of the user of the terminal 10 with other terminals 10 through the server 20 .
  • the trip information sharing unit 16 acquires the trip information from the server 20 or the SNS server 30 .
  • the trip information sharing unit 16 uploads the trip information to the server 20 .
  • Step S 1 the reception unit 11 of the terminal 10 - 1 receives setting of a destination, a departure place, and the like from the user.
  • a current position of the terminal 10 or a position set in advance may be set as a departure place.
  • the trip information acquisition unit 14 of the terminal 10 - 1 acquires an image captured at the predetermined position and a voice recorded at the predetermined position in a case where a predetermined condition is satisfied (Step S 7 ).
  • the trip information acquisition unit 14 makes the in-vehicle camera, the camera embedded in the terminal 10 - 1 , or the like capture an image, and makes the in-vehicle microphone, the microphone embedded in the terminal 10 - 1 , or the like collect a voice for a predetermined time.
  • the predetermined condition may be, for example, that the volume level of the voice collected by the microphone or the like at the predetermined position is equal to or higher than a predetermined threshold. With this, an image and a voice are acquired only when a user inside the vehicle becomes lively or shouts for joy at a landscape spot or the like.
  • the predetermined condition may be, for example, that the current date and time satisfy a condition of date and time associated with the predetermined position in the content providing server 40 or the like. With this, for example, when the current position is a spot for autumn leaves, an image and a voice are acquired only during the autumn-leaves season. Similarly, when the current position is a spot where the user can view the sunset, an image and a voice are acquired only in the time slot when the sun sets.
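The two example conditions above (volume threshold, and a date/time window associated with the spot) can be combined in a small sketch. The `recommended` tuple format and the decibel figures are illustrative assumptions; the disclosure does not specify how the date/time condition is encoded in the content providing server 40 .

```python
from datetime import datetime, time

def condition_satisfied(volume_db, volume_threshold_db, now, recommended=None):
    """Decide whether to acquire an image and a voice at the predetermined
    position: either the cabin is lively (volume at or above the threshold),
    or the current date and time fall inside the window registered for the
    spot (e.g. the autumn-leaves season, or the sunset time slot).

    recommended: assumed (start_month, end_month, start_time, end_time) tuple;
    None means no date/time condition is registered for the position."""
    if volume_db >= volume_threshold_db:
        return True
    if recommended is None:
        return False
    start_month, end_month, start_t, end_t = recommended
    in_season = start_month <= now.month <= end_month
    in_slot = start_t <= now.time() <= end_t
    return in_season and in_slot
```

For an autumn-leaves spot registered for October-November, 9:00-17:00, a quiet pass in early November would still trigger acquisition, while the same quiet pass in July would not.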
  • the process for determining whether or not the predetermined condition is satisfied when the image and the voice are acquired may be executed in Step S 9 by the trip information acquisition unit 24 of the server 20 , instead of being executed in Step S 7 by the trip information acquisition unit 14 of the terminal 10 - 1 .
  • the generation unit 25 of the server 20 generates trip information including information having the image, the voice, and the like associated with the predetermined position acquired by the trip information acquisition unit 24 , and information of the route from the departure place to the destination (Step S 9 ).
  • the generation unit 25 of the server 20 may replace the image, the voice, and the like, associated with the predetermined position, included in the shared trip information with the image, the voice, and the like transmitted from the terminal 10 - 1 through the process of Step S 8 .
  • the generation unit 25 of the server 20 may automatically generate a comment for the predetermined position and include the generated comment in the trip information.
  • the generation unit 25 of the server 20 may generate a comment including position information of the predetermined position, the date and time when the user passed through the predetermined position, the degree of liveliness inside the vehicle, a facility name or a description of a facility at the predetermined position, and the like.
  • the degree of liveliness inside the vehicle may be calculated based on the volume level of the voice when the user passes through the predetermined position. For example, in a case where the volume level is equal to or higher than a predetermined threshold, the degree of liveliness inside the vehicle may be expressed on a five-level scale.
  • the facility name or the description of the facility at the predetermined position may be acquired from the content providing server 40 or the like.
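The automatic comment generation above can be sketched as follows. The mapping from volume to the five-level scale (a base threshold plus fixed steps) and all concrete values are assumptions for illustration; the disclosure only says the degree may be calculated from the volume level on a five-step scale.

```python
from datetime import datetime

def liveliness_level(volume_db, threshold_db=60.0, step_db=5.0):
    """Five-step liveliness: level 1 at (or below) the assumed threshold,
    one more level per step_db above it, clamped to 5."""
    level = 1 + int(max(0.0, volume_db - threshold_db) // step_db)
    return min(level, 5)

def generate_comment(place_name, passed_at, volume_db, description=""):
    """Assemble an auto-generated comment from the facility name, the date
    and time of passing, the liveliness level, and a facility description
    (the last two fetched from the content providing server in the text)."""
    stars = "*" * liveliness_level(volume_db)
    return (f"Passed {place_name} at {passed_at:%Y-%m-%d %H:%M} "
            f"(liveliness {stars}). {description}").strip()
```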
  • the generation unit 25 of the server 20 stores the trip information in the trip information DB 211 in association with an identification (ID) of the user of the terminal 10 - 1 , departure date and time, and the like (Step S 10 ).
  • Information of the route from the departure place to the destination may be generated based on information of a history of positions where the user of the terminal 10 has actually moved, acquired by the position acquisition unit 22 , or may be the route searched by the route search unit 23 .
  • Alternatively, the information of the route from the departure place to the destination may be the route included in shared trip information.
  • the trip information DB 211 stores, in association with a user ID and departure date and time, items of a route and an image, a voice, a comment, and the like for each position.
  • the user ID is the ID of the user of the terminal 10 .
  • the departure date and time is the date and time when the user departs from the departure place.
  • the route is the route from the departure place to the destination, and may be represented by, for example, each node indicating an intersection or the like on a road to pass through from the departure place to the destination, and an order of passing through the nodes.
  • Each image is the image captured at the predetermined position in Step S 7 or an image captured by a user's operation.
  • Each voice is a voice recorded at the predetermined position in Step S 7 or a voice recorded by a user's operation.
  • Each comment is the comment automatically generated for the predetermined position in Step S 9 or a comment input by a user's operation in Step S 14 described below.
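One record of the trip information DB 211, as itemized above, can be sketched as a pair of data structures. The field names and the choice of node IDs as strings are assumptions; the disclosure only says the route may be represented by nodes (intersections and the like) and the order of passing through them.

```python
from dataclasses import dataclass, field
from datetime import datetime
from typing import List, Tuple

@dataclass
class PositionEntry:
    """Image, voice, and comment tied to one position on the route."""
    position: Tuple[float, float]   # (latitude, longitude)
    image_path: str = ""
    voice_path: str = ""
    comment: str = ""

@dataclass
class TripInformation:
    """One trip information record, keyed by user ID and departure date and
    time; the route is an ordered list of node IDs from departure place
    to destination."""
    user_id: str
    departure: datetime
    route: List[str]
    entries: List[PositionEntry] = field(default_factory=list)
```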
  • the reception unit 11 of the terminal 10 - 1 receives an input of a comment (character information) at a position selected by the user on the map of the screen according to a user's operation (Step S 14 ).
  • the trip information acquisition unit 14 of the terminal 10 - 1 transmits the input comment and the input position of the comment on the map to the server 20 (Step S 15 ).
  • the generation unit 25 of the server 20 includes the comment and the input position of the comment acquired by the trip information acquisition unit 24 in the trip information, and stores the trip information in the trip information DB 211 (Step S 16 ). With this, when the terminal 10 - 1 acquires the trip information, the comment is also displayed in association with the position on the map where the comment is input.
  • the trip information sharing unit 16 of the terminal 10 - 1 receives an operation to designate trip information to be shared and a range of users sharing the trip information from the user (Step S 17 ).
  • the trip information sharing unit 16 of the terminal 10 - 1 transmits a route sharing request including data of the trip information and the range to the server 20 (Step S 18 ).
  • the predetermined condition may be, for example, a case where a volume level of a voice collected by the microphone or the like at the predetermined position is equal to or higher than a predetermined threshold.
  • the predetermined condition may be, for example, a case where the current date and time satisfies a condition of recommended date and time associated with the predetermined position.
  • the predetermined condition may be, for example, a case where a comment to the predetermined position is input by the user. With this, it is possible to share only images and voices of places with comments, that is, places in which the user has a comparatively high degree of interest.
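Filtering which positions are shared under the example conditions above can be sketched as follows; the entry dictionary keys and the decibel threshold are assumptions for illustration.

```python
def entries_to_share(entries, volume_threshold_db=60.0):
    """Keep an entry when one of the example sharing conditions holds:
    the recorded volume level reached the threshold, or the user entered
    a comment for the position (a sign of comparatively high interest)."""
    return [
        e for e in entries
        if e.get("volume_db", 0.0) >= volume_threshold_db or e.get("comment")
    ]
```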
  • the SNS server 30 transmits the received trip information to the terminal 10 - 2 of each user included in the group sharing the route (Step S 20 ).
  • the SNS use unit 17 of the terminal 10 - 2 of the user as a sharing destination displays the trip information received from the SNS server 30 on a talk room screen of the group designated by the user of the terminal 10 - 1 (Step S 21 ).
  • the current position of the user is acquired, an image captured when the user passes through the predetermined position is acquired, and the image is displayed on the map screen in association with the predetermined position.
  • a position such as a landscape spot or a sightseeing spot
  • the functional units of the terminal 10 and the server 20 may be realized by, for example, cloud computing constituted of one or more computers.
  • a process of at least a part of the functions of the terminal 10 and the SNS server 30 may be executed in the server 20 .
  • a process of at least a part of the functional units of the server 20 may be executed in the terminal 10 .
  • the terminal 10 and the server 20 are an example of an information processing apparatus.
  • the position acquisition unit 12 or the position acquisition unit 22 is an example of a “first acquisition unit”.
  • the trip information acquisition unit 14 or the trip information acquisition unit 24 is an example of a “second acquisition unit”.

Landscapes

  • Engineering & Computer Science (AREA)
  • Signal Processing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Multimedia (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • General Engineering & Computer Science (AREA)
  • Automation & Control Theory (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Computing Systems (AREA)
  • Business, Economics & Management (AREA)
  • Human Computer Interaction (AREA)
  • Audiology, Speech & Language Pathology (AREA)
  • Economics (AREA)
  • Human Resources & Organizations (AREA)
  • Marketing (AREA)
  • Primary Health Care (AREA)
  • Strategic Management (AREA)
  • Tourism & Hospitality (AREA)
  • General Business, Economics & Management (AREA)
  • Navigation (AREA)
  • Instructional Devices (AREA)
  • Traffic Control Systems (AREA)
  • Mechanical Engineering (AREA)
US16/170,835 2017-11-06 2018-10-25 Information processing apparatus, information processing method, and non-transitory computer-readable medium storing program Abandoned US20190141282A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2017-214207 2017-11-06
JP2017214207A JP7028608B2 (ja) 2017-11-06 2017-11-06 情報処理装置、情報処理方法、及びプログラム

Publications (1)

Publication Number Publication Date
US20190141282A1 true US20190141282A1 (en) 2019-05-09

Family

ID=66329032

Family Applications (1)

Application Number Title Priority Date Filing Date
US16/170,835 Abandoned US20190141282A1 (en) 2017-11-06 2018-10-25 Information processing apparatus, information processing method, and non-transitory computer-readable medium storing program

Country Status (3)

Country Link
US (1) US20190141282A1 (ja)
JP (1) JP7028608B2 (ja)
CN (1) CN109756838A (ja)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20190295030A1 (en) * 2018-03-22 2019-09-26 Holman Strategic Investments, LLC Method and system for vehicle management
US11993271B2 (en) 2021-04-14 2024-05-28 Toyota Jidosha Kabushiki Kaisha Information processing apparatus, non-transitory storage medium, and information processing method

Families Citing this family (4)

Publication number Priority date Publication date Assignee Title
CN110113540B (zh) * 2019-06-13 2021-06-04 广州小鹏汽车科技有限公司 一种车辆拍摄方法、装置、车辆和可读介质
JP7297300B2 (ja) * 2019-08-06 2023-06-26 株式会社Agama-X 情報処理装置及びプログラム
JP7117282B2 (ja) * 2019-11-08 2022-08-12 本田技研工業株式会社 出力システムおよびその制御方法、並びにプログラム
CN110992514B (zh) * 2019-11-21 2022-01-18 贵州电网有限责任公司 远程拍摄方法及设备

Citations (5)

Publication number Priority date Publication date Assignee Title
US20060142942A1 (en) * 2004-12-29 2006-06-29 Samsung Electronics Co., Ltd. Device and method for creating electronic album using map data
US20110106434A1 (en) * 2008-09-03 2011-05-05 Masamitsu Ishihara Image capturing system for vehicle
JP2014096632A (ja) * 2012-11-07 2014-05-22 Denso Corp 撮像システム
US9235750B1 (en) * 2011-09-16 2016-01-12 Lytx, Inc. Using passive driver identification and other input for providing real-time alerts or actions
US20180249056A1 (en) * 2015-08-18 2018-08-30 Lg Electronics Inc. Mobile terminal and method for controlling same

Family Cites Families (9)

Publication number Priority date Publication date Assignee Title
JP2003198904A (ja) * 2001-12-25 2003-07-11 Mazda Motor Corp 撮像方法、撮像システム、撮像装置、撮像制御サーバ、並びに撮像プログラム
JP2012226646A (ja) * 2011-04-21 2012-11-15 Sony Corp 情報提供装置および情報提供方法、並びにプログラム
JP2015230519A (ja) * 2014-06-03 2015-12-21 株式会社デンソー 車両用思い出共有システム、車載機及び思い出共有プログラム
JP2017040551A (ja) * 2015-08-19 2017-02-23 株式会社ユピテル システム及びプログラム
CN106550202A (zh) * 2015-09-16 2017-03-29 深圳市凯立德科技股份有限公司 一种行车记录影像显示方法及装置
CN105469461A (zh) * 2015-11-19 2016-04-06 莆田市云驰新能源汽车研究院有限公司 一种路景分享方法及装置
CN107305561B (zh) * 2016-04-21 2021-02-02 斑马网络技术有限公司 图像的处理方法、装置、设备及用户界面系统
CN106652102A (zh) * 2017-01-04 2017-05-10 江西沃可视发展有限公司 一种基于地图轨迹的行车记录回放方法及其终端设备
CN107067498A (zh) * 2017-05-26 2017-08-18 北京小米移动软件有限公司 一种行车记录方法及装置


Also Published As

Publication number Publication date
JP2019086372A (ja) 2019-06-06
CN109756838A (zh) 2019-05-14
JP7028608B2 (ja) 2022-03-02

Similar Documents

Publication Publication Date Title
US20190141282A1 (en) Information processing apparatus, information processing method, and non-transitory computer-readable medium storing program
US9990585B2 (en) Information processing apparatus, information processing method and computer-readable storage medium for generating course information
US20120268620A1 (en) Information providing apparatus, information providing method, and program
CN110033632B (zh) 车辆摄影支援装置、方法以及存储介质
JP2014112302A (ja) 所定領域管理システム、通信方法、及びプログラム
US8570424B2 (en) Display control apparatus and display control method
KR102121327B1 (ko) 이미지 획득 방법, 피제어 기기 및 서버
US20190107414A1 (en) Information processing apparatus, information processing method, and non-transitory storage medium storing program
JP2012247841A (ja) 近隣人物特定装置、近隣人物特定方法、近隣人物特定プログラム及び近隣人物特定システム
JP2013134228A (ja) ナビゲーションシステム、方法及びコンピュータプログラム
JP2006338553A (ja) コンテンツ再生装置
CN111680238A (zh) 信息分享方法、装置和存储介质
US20210116260A1 (en) Information processing apparatus, information processing method, information processing system, and non-transitory storage medium storing program
JP2016050895A (ja) ランドマーク表示装置、方法、およびプログラム
US20210334307A1 (en) Methods and systems for generating picture set from video
KR102111758B1 (ko) 차량용 영상 처리 장치 및 이를 이용한 데이터 공유 방법
US12111172B2 (en) Apparatus and method of providing contextual-information-based service
JP2016103049A (ja) 情報処理装置、システム、制御方法、及びプログラム
EP3134861B1 (en) Use of wireless connection loss to facilitate identifying and recording video capture location
KR20210109759A (ko) 차량용 컨텐츠 기반 경로 정보 제공 방법 및 그를 수행하기 위한 장치
US9222790B2 (en) Method and apparatus for crowdsourced tour creation and provision
KR20170027020A (ko) 관광 사진, 영상 콘텐츠를 활용한 관광 시스템
KR20170025732A (ko) 여행 기록을 제공하기 위한 장치, 이를 위한 방법 및 이 방법이 기록된 컴퓨터 판독 가능한 기록매체
JP2023132130A (ja) 情報処理装置、情報処理方法、およびプログラム
WO2023021759A1 (ja) 情報処理装置、情報処理方法

Legal Events

Date Code Title Description
AS Assignment

Owner name: TOYOTA JIDOSHA KABUSHIKI KAISHA, JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:JIN, XIN;HANAI, MIHARU;SIGNING DATES FROM 20180830 TO 20180905;REEL/FRAME:047315/0965

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: FINAL REJECTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: ADVISORY ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: FINAL REJECTION MAILED

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION